Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-22
..., evaluate, and enforce fishery regulations. Framework Adjustment 1 (FW1) to the Atlantic Surf Clam and Ocean... DEPARTMENT OF COMMERCE National Oceanic and Atmospheric Administration Proposed Information Collection; Comment Request; Atlantic Surfclam and Ocean Quahog Framework Adjustment I AGENCY: National...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-14
..., evaluate, and enforce fishery regulations. Framework Adjustment 1 (FW1) to the Atlantic Surf Clam and Ocean... DEPARTMENT OF COMMERCE National Oceanic and Atmospheric Administration Proposed Information Collection; Comment Request; Atlantic Surfclam and Ocean Quahog Framework Adjustment I AGENCY: National...
78 FR 55252 - Proposed Information Collection Request; Comment Request; State Review Framework
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-10
... ENVIRONMENTAL PROTECTION AGENCY [EPA-HQ-OECA-2010-0291; FRL-9900-88-OECA] Proposed Information Collection Request; Comment Request; State Review Framework AGENCY: Environmental Protection Agency (EPA). ACTION: Notice. SUMMARY: The Environmental Protection Agency is planning to submit a request to renew an...
Information Literacy for Archives and Special Collections: Defining Outcomes
ERIC Educational Resources Information Center
Carini, Peter
2016-01-01
This article provides the framework for a set of standards and outcomes that would constitute information literacy with primary sources. Based on a working model used at Dartmouth College's Rauner Special Collections Library in Hanover, New Hampshire, these concepts create a framework for teaching with primary source materials intended to produce…
Integrating medical and research information: a big data approach.
Tilve Álvarez, Carlos M; Ayora Pais, Alberto; Ruíz Romero, Cristina; Llamas Gómez, Daniel; Carrajo García, Lino; Blanco García, Francisco J; Vázquez González, Guillermo
2015-01-01
Most of the information collected in different fields by Instituto de Investigación Biomédica de A Coruña (INIBIC) is classified as unstructured due to its high volume and heterogeneity. This situation, linked to the recent requirement of integrating it with medical information, makes it necessary to implement specific architectures to collect and organize it before it can be analysed. The purpose of this article is to present the Hadoop framework as a solution to the problem of integrating research information in the Business Intelligence field. This framework can collect, explore, process and structure the aforementioned information, which allows us to develop a function equivalent to a data mart in a Business Intelligence system.
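The abstract stays at the architectural level; as a rough illustration of the kind of consolidation step Hadoop supports, the sketch below is a Hadoop Streaming-style mapper and reducer written as plain Python filters (read stdin, write tab-separated key/value pairs) that group heterogeneous research records under a record identifier. The field layout, identifiers, and file names are assumptions for the example, not details of the INIBIC system.

```python
#!/usr/bin/env python3
"""Hadoop Streaming-style mapper/reducer sketch (illustrative only).

Assumes tab-separated input lines: <record_id> TAB <source> TAB <free_text>.
The layout is an assumption, not the INIBIC data model.
"""
import sys
from itertools import groupby


def mapper(lines):
    # Emit (record_id, "source|text") so records about the same subject,
    # coming from different sources, shuffle to the same reducer key.
    for line in lines:
        parts = line.rstrip("\n").split("\t", 2)
        if len(parts) == 3:
            record_id, source, text = parts
            yield record_id, f"{source}|{text}"


def reducer(pairs):
    # Merge all values for one key into a single consolidated row,
    # a crude stand-in for the data-mart-like structure described above.
    for record_id, group in groupby(pairs, key=lambda kv: kv[0]):
        yield record_id, "; ".join(value for _, value in group)


if __name__ == "__main__":
    role = sys.argv[1] if len(sys.argv) > 1 else "map"
    if role == "map":
        stage = mapper(sys.stdin)
    else:
        # Reducer input arrives as key-sorted "key\tvalue" lines from the shuffle.
        pairs = (line.rstrip("\n").split("\t", 1) for line in sys.stdin if "\t" in line)
        stage = reducer(pairs)
    for key, value in stage:
        print(f"{key}\t{value}")
```

For a local smoke test, this can be run as `cat records.tsv | python3 integrate.py map | sort | python3 integrate.py reduce`, or the two roles can be passed as the mapper and reducer of a Hadoop Streaming job; `records.tsv` and `integrate.py` are placeholder names.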
Moloczij, Natasha; Gough, Karla; Solomon, Benjamin; Ball, David; Mileshkin, Linda; Duffy, Mary; Krishnasamy, Mei
2018-01-11
Patient-reported outcome (PRO) data is central to the delivery of quality health care. Establishing sustainable, reliable and cost-efficient methods for routine collection and integration of PRO data into health information systems is challenging. This protocol paper describes the design and structure of a study to develop and pilot test a PRO framework to systematically and longitudinally collect PRO data from a cohort of lung cancer patients at a comprehensive cancer centre in Australia. Best-practice guidelines for developing registries aimed at collecting PROs informed the development of this PRO framework. Framework components included: achieving consensus on determining the purpose of the framework, the PRO measures to be included, the data collection time points and collection methods (electronic and paper), establishing processes to safeguard the quality of the data collected and to link the PRO framework to an existing hospital-based lung cancer clinical registry. Lung cancer patients will be invited to give feedback on the PRO measures (PROMs) chosen and the data collection time points and methods. Implementation of the framework will be piloted for 12 months. A mixed-methods approach will then be used to explore patient and multidisciplinary perspectives on the feasibility of implementing the framework and linking it to the lung cancer clinical registry, its clinical utility, perceptions of data collection burden, and preliminary assessment of resource costs to integrate, implement and sustain the PRO framework. The PRO data set will include: a quality of life questionnaire (EORTC-QLQ-C30) and the EORTC lung cancer specific module (QLQ-LC13). These will be collected pre-treatment (baseline), 2, 6 and 12 months post-baseline. Also, four social isolation questions (PROMIS) will be collected at baseline. Identifying and deciding on the overall purpose, clinical utility of data and which PROs to collect from patients requires careful consideration. Our study will explore how PRO data collection processes that link to a clinical data set can be developed and integrated; how PRO systems that are easy for patients to complete and professionals to use in practice can be achieved, and will provide indicative costs of developing and integrating a longitudinal PRO framework into routine hospital data collection systems. This study is not a clinical trial and is therefore not registered in any trial registry. However, it has received human research ethics approval (LNR/16/PMCC/45).
Davis, Jenny; Morgans, Amee; Burgess, Stephen
2016-04-01
Efficient information systems support the provision of multi-disciplinary aged care and a variety of organisational purposes, including quality, funding, communication and continuity of care. Agreed minimum data sets enable accurate communication across multiple care settings. However, in aged care multiple and poorly integrated data collection frameworks are commonly used for client assessment, government reporting and funding purposes. The aim was to determine key information needs in aged care settings to improve information quality, information transfer, safety, quality and continuity of care to meet the complex needs of aged care clients. Modified Delphi methods involving five stages were employed by one aged care provider in Victoria, Australia, to establish stakeholder consensus for a derived minimum data set and address barriers to data quality. Eleven different aged care programs were identified, with five related data dictionaries, three minimum data sets, and five program standards or quality frameworks. The remaining data collection frameworks related to disease classification, funding, service activity reporting, and statistical standards and classifications. A total of 170 different data items collected across seven internal information systems were consolidated to a derived set of 60 core data items and aligned with nationally consistent data collection frameworks. Barriers to data quality related to inconsistencies in data items, staff knowledge, workflow, system access and configuration. The development of an internal aged care minimum data set highlighted the critical role of primary data quality in the upstream and downstream use of client information and presents a platform to build national consistency across the sector.
Knowledge Management for School Leaders: An Ecological Framework for Thinking Schools.
ERIC Educational Resources Information Center
Petrides, Lisa A.; Guiney, Susan Zahra
2002-01-01
Using examples from schools, this paper illustrates how knowledge management can enable schools to examine the plethora of data they collect and how an ecological framework can be used to transform these data into meaningful information. The paper highlights: the history of management information systems; shifts from information management to…
Collaborative Information Retrieval Method among Personal Repositories
NASA Astrophysics Data System (ADS)
Kamei, Koji; Yukawa, Takashi; Yoshida, Sen; Kuwabara, Kazuhiro
In this paper, we describe a collaborative information retrieval method among personal repositories and an implementation of the method on a personal agent framework. We propose a framework for personal agents that aims to enable the sharing and exchange of information resources that are distributed unevenly among individuals. The kernel of a personal agent framework is an RDF (resource description framework)-based information repository for storing, retrieving and manipulating privately collected information, such as documents the user read and/or wrote, email he/she exchanged, web pages he/she browsed, etc. The repository also collects annotations to information resources that describe relationships among information resources and records of interaction between the user and information resources. Since the information resources in a personal repository and their structure are personalized, information retrieval from other users' repositories is an important application of the personal agent. A vector space model with a personalized concept-base is employed as an information retrieval mechanism in a personal repository. Since a personalized concept-base is constructed from information resources in a personal repository, it reflects its user's knowledge and interests. On the other hand, it leads to another problem while querying other users' personal repositories; that is, simply transferring query requests does not provide desirable results. To solve this problem, we propose a query equalization scheme based on a relevance feedback method for collaborative information retrieval between personalized concept-bases. In this paper, we describe an implementation of the collaborative information retrieval method and its user interface on the personal agent framework.
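No code accompanies the abstract; the following minimal Python sketch illustrates the two generic ingredients it names, a vector-space match over a concept base and a Rocchio-style relevance-feedback adjustment of the query vector, which is one standard way a query could be "equalized" before being forwarded to a differently personalized repository. The documents, terms, and weights are invented for illustration and are not from the paper.

```python
from collections import Counter
from math import sqrt


def cosine(u: Counter, v: Counter) -> float:
    """Cosine similarity between two sparse term-weight vectors."""
    dot = sum(u[t] * v[t] for t in u.keys() & v.keys())
    norm = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0


def rocchio(query: Counter, relevant: list, alpha: float = 1.0, beta: float = 0.75) -> Counter:
    """Rocchio-style relevance feedback: pull the query toward documents
    judged relevant in the local repository before forwarding it elsewhere."""
    adjusted = Counter({t: alpha * w for t, w in query.items()})
    for doc in relevant:
        for t, w in doc.items():
            adjusted[t] += beta * w / len(relevant)
    return adjusted


# Toy personal repository: documents represented as term-weight vectors.
repo = {
    "mail-042": Counter({"agent": 2, "rdf": 1, "repository": 3}),
    "web-117": Counter({"concept": 2, "retrieval": 2, "feedback": 1}),
}
query = Counter({"retrieval": 1, "repository": 1})
ranked = sorted(repo, key=lambda d: cosine(query, repo[d]), reverse=True)
expanded = rocchio(query, [repo[ranked[0]]])
print(ranked, dict(expanded))
```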
RPD-based Hypothesis Reasoning for Cyber Situation Awareness
NASA Astrophysics Data System (ADS)
Yen, John; McNeese, Michael; Mullen, Tracy; Hall, David; Fan, Xiaocong; Liu, Peng
Intelligence workers such as analysts, commanders, and soldiers often need a hypothesis reasoning framework to gain improved situation awareness of the highly dynamic cyber space. The development of such a framework requires the integration of interdisciplinary techniques, including supports for distributed cognition (human-in-the-loop hypothesis generation), supports for team collaboration (identification of information for hypothesis evaluation), and supports for resource-constrained information collection (hypotheses competing for information collection resources). We here describe a cognitively-inspired framework that is built upon Klein’s recognition-primed decision model and integrates the three components of Endsley’s situation awareness model. The framework naturally connects the logic world of tools for cyber situation awareness with the mental world of human analysts, enabling the perception, comprehension, and prediction of cyber situations for better prevention, survival, and response to cyber attacks by adapting missions at the operational, tactical, and strategic levels.
Experience API: Flexible, Decentralized and Activity-Centric Data Collection
ERIC Educational Resources Information Center
Kevan, Jonathan M.; Ryan, Paul R.
2016-01-01
This emerging technology report describes the Experience API (xAPI), a new e-learning specification designed to support the learning community in standardizing and collecting both formal and informal distributed learning activities. Informed by Activity Theory, a framework aligned with constructivism, data is collected in the form of activity…
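For readers new to the specification, an xAPI learning record is a JSON "statement" built around an actor, a verb, and an object. The Python sketch below assembles one such statement and prepares a POST to a hypothetical Learning Record Store; the endpoint URL, credentials, and activity identifiers are placeholders, not values from the article.

```python
import json
import urllib.request

# A minimal actor-verb-object statement as defined by the xAPI specification.
statement = {
    "actor": {"mbox": "mailto:learner@example.org", "name": "Example Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.org/activities/intro-module",  # placeholder activity
        "definition": {"name": {"en-US": "Introductory module"}},
    },
}

# Hypothetical Learning Record Store (LRS) endpoint; replace with a real one.
request = urllib.request.Request(
    "https://lrs.example.org/xapi/statements",
    data=json.dumps(statement).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "X-Experience-API-Version": "1.0.3",
        "Authorization": "Basic <credentials>",  # placeholder
    },
    method="POST",
)
# urllib.request.urlopen(request)  # uncomment to send against a live LRS
print(json.dumps(statement, indent=2))
```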
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-14
... licensing plan, the FCC's goal is to establish a flexible regulatory framework that allows for efficient... requirement. Obligation to Respond: Required to obtain or retain benefits. Statutory authority for this... operational plan for a radio service, a certain number of regulatory and information collection requirements...
Bridging gaps in health information systems: a case study from Somaliland, Somalia.
Askar, Ahmed; Ardakani, Malekafzali; Majdzade, Reza
2018-01-02
Reliable and timely health information is fundamental for health information systems (HIS) to work effectively. This case study aims to assess the Somaliland HIS in terms of its contextual situation and major weaknesses, and to propose key evidence-based recommendations. Data were collected through national level key informants' interviews, observations, group discussion and scoring using the HIS framework and assessment tool developed by the World Health Organization Health Metrics Network (WHO/HMN). The study found major weaknesses including: no policy, strategic plan or legal framework in place; fragmented sub-information systems; poor information and communications technology (ICT) infrastructure; poorly motivated and under-skilled personnel; dependence on unsustainable external funds; no census or civil registration in place; data from the private health sector not captured; insufficient technical capacity to analyse data collected by the HIS; and information that is not widely shared, disseminated or utilized for decision-making. We recommend developing a national HIS strategic plan that harmonizes and directs collective efforts to become a more integrated, cost-effective and sustainable HIS.
Agent And Component Object Framework For Concept Design Modeling Of Mobile Cyber Physical Systems
2018-03-01
burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing...data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this...burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to Washington headquarters
Addressing location uncertainties in GPS-based activity monitoring: A methodological framework
Wan, Neng; Lin, Ge; Wilson, Gaines J.
2016-01-01
Location uncertainty has been a major barrier in information mining from location data. Although the development of electronic and telecommunication equipment has led to an increased amount and refined resolution of data about individuals’ spatio-temporal trajectories, the potential of such data, especially in the context of environmental health studies, has not been fully realized due to the lack of methodology that addresses location uncertainties. This article describes a methodological framework for deriving information about people’s continuous activities from individual-collected Global Positioning System (GPS) data, which is vital for a variety of environmental health studies. This framework is composed of two major methods that address critical issues at different stages of GPS data processing: (1) a fuzzy classification method for distinguishing activity patterns; and (2) a scale-adaptive method for refining activity locations and outdoor/indoor environments. Evaluation of this framework based on smartphone-collected GPS data indicates that it is robust to location errors and is able to generate useful information about individuals’ life trajectories. PMID:28943777
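As a purely illustrative sketch of the first component (fuzzy classification of activity patterns), the Python below assigns fuzzy memberships to "stationary", "walking", and "vehicle" classes from point speeds; the membership breakpoints are invented for the example and are not the thresholds or features used in the study.

```python
def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)


def classify_speed(speed_ms: float) -> dict:
    """Fuzzy memberships for a GPS point based on speed (m/s).
    Breakpoints are illustrative placeholders, not values from the study."""
    memberships = {
        "stationary": triangular(speed_ms, -0.5, 0.0, 1.0),
        "walking": triangular(speed_ms, 0.5, 1.4, 3.0),
        "vehicle": triangular(speed_ms, 2.5, 15.0, 60.0),
    }
    total = sum(memberships.values()) or 1.0
    return {label: round(m / total, 3) for label, m in memberships.items()}


track = [0.2, 1.3, 1.6, 8.0, 22.0]  # toy speeds along a trajectory
for speed in track:
    print(speed, classify_speed(speed))
```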
Nano-Enriched and Autonomous Sensing Framework for Dissolved Oxygen.
Shehata, Nader; Azab, Mohammed; Kandas, Ishac; Meehan, Kathleen
2015-08-14
This paper investigates a nano-enhanced wireless sensing framework for dissolved oxygen (DO). The system integrates a nanosensor that employs cerium oxide (ceria) nanoparticles to monitor the concentration of DO in aqueous media via optical fluorescence quenching. We propose a comprehensive sensing framework with the nanosensor equipped with a digital interface where the sensor output is digitized and dispatched wirelessly to a trustworthy data collection and analysis framework for consolidation and information extraction. The proposed system collects and processes the sensor readings to provide clear indications about the current or the anticipated dissolved oxygen levels in the aqueous media.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oesterling, Patrick; Scheuermann, Gerik; Teresniak, Sven
During the last decades, electronic textual information has become the world's largest and most important information source available. People have added a variety of daily newspapers, books, scientific and governmental publications, blogs and private messages to this wellspring of endless information and knowledge. Since neither the existing nor the new information can be read in its entirety, computers are used to extract and visualize meaningful or interesting topics and documents from this huge information clutter. In this paper, we extend, improve and combine existing individual approaches into an overall framework that supports topological analysis of high dimensional document point clouds given by the well-known tf-idf document-term weighting method. We show that traditional distance-based approaches fail in very high dimensional spaces, and we describe an improved two-stage method for topology-based projections from the original high dimensional information space to both two dimensional (2-D) and three dimensional (3-D) visualizations. To show the accuracy and usability of this framework, we compare it to methods introduced recently and apply it to complex document and patent collections.
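The tf-idf weighting the authors take as their starting point is standard; a minimal Python sketch of building such document vectors from a toy collection follows (the topological analysis and two-stage projection described in the paper are not reproduced here).

```python
from collections import Counter
from math import log


def tf_idf(documents: list) -> list:
    """Return one sparse tf-idf vector per document (toy implementation)."""
    tokenized = [doc.lower().split() for doc in documents]
    doc_freq = Counter(term for tokens in tokenized for term in set(tokens))
    n_docs = len(documents)
    vectors = []
    for tokens in tokenized:
        counts = Counter(tokens)
        vectors.append({
            term: (count / len(tokens)) * log(n_docs / doc_freq[term])
            for term, count in counts.items()
        })
    return vectors


corpus = [
    "topology preserves structure of document collections",
    "patent collections are large document collections",
    "distance measures fail in high dimensional spaces",
]
for vec in tf_idf(corpus):
    # Terms occurring in every document get an idf of zero and are dropped here.
    print({t: round(w, 3) for t, w in sorted(vec.items()) if w > 0})
```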
A Transparent and Transferable Framework for Tracking Quality Information in Large Datasets
Smith, Derek E.; Metzger, Stefan; Taylor, Jeffrey R.
2014-01-01
The ability to evaluate the validity of data is essential to any investigation, and manual “eyes on” assessments of data quality have dominated in the past. Yet, as the size of collected data continues to increase, so does the effort required to assess their quality. This challenge is of particular concern for networks that automate their data collection, and has resulted in the automation of many quality assurance and quality control analyses. Unfortunately, the interpretation of the resulting data quality flags can become quite challenging with large data sets. We have developed a framework to summarize data quality information and facilitate interpretation by the user. Our framework consists of first compiling data quality information and then presenting it through 2 separate mechanisms: a quality report and a quality summary. The quality report presents the results of specific quality analyses as they relate to individual observations, while the quality summary takes a spatial or temporal aggregate of each quality analysis and provides a summary of the results. Included in the quality summary is a final quality flag, which further condenses data quality information to assess whether a data product is valid or not. This framework has the added flexibility to allow “eyes on” information on data quality to be incorporated for many data types. Furthermore, this framework can aid problem tracking and resolution, should sensor or system malfunctions arise. PMID:25379884
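A minimal sketch, assuming boolean pass/fail flags per observation and an arbitrary failure-rate threshold (the test names and threshold are placeholders, not the article's), of how per-test quality flags might be aggregated into a quality summary and a final quality flag:

```python
from statistics import mean

# Each observation carries one boolean flag per quality test
# (True = the test failed). Names and threshold are illustrative only.
observations = [
    {"range_test": False, "spike_test": False, "gap_test": False},
    {"range_test": True,  "spike_test": False, "gap_test": False},
    {"range_test": False, "spike_test": True,  "gap_test": True},
]


def quality_summary(obs: list, fail_threshold: float = 0.2):
    """Aggregate per-observation flags into failure rates plus a final flag."""
    tests = obs[0].keys()
    rates = {test: mean(o[test] for o in obs) for test in tests}
    final_flag = "fail" if any(rate > fail_threshold for rate in rates.values()) else "pass"
    return rates, final_flag


rates, final_flag = quality_summary(observations)
print(rates, final_flag)
```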
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-06
... ENVIRONMENTAL PROTECTION AGENCY [EPA-HQ-OECA-2010-0291; FRL- 9903-87-OEI] Agency Information Collection Activities: Submission to OMB for Review and Approval; State Review Framework; EPA ICR Number 2185.05 AGENCY: Environmental Protection Agency (EPA). ACTION: Notice. SUMMARY: In compliance with the...
77 FR 49060 - Proposed Information Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-15
... Public Law 111-47, section 501(a). Section 1471 is part of the new Foreign Account Tax Compliance Act... of the new Foreign Account Tax Compliance Act (FATCA) legislative framework to obtain reporting from... proposed and/or continuing information collections, as required by the Paperwork Reduction Act of 1995...
Site-based data curation based on hot spring geobiology
Palmer, Carole L.; Thomer, Andrea K.; Baker, Karen S.; Wickett, Karen M.; Hendrix, Christie L.; Rodman, Ann; Sigler, Stacey; Fouke, Bruce W.
2017-01-01
Site-Based Data Curation (SBDC) is an approach to managing research data that prioritizes sharing and reuse of data collected at scientifically significant sites. The SBDC framework is based on geobiology research at natural hot spring sites in Yellowstone National Park as an exemplar case of high value field data in contemporary, cross-disciplinary earth systems science. Through stakeholder analysis and investigation of data artifacts, we determined that meaningful and valid reuse of digital hot spring data requires systematic documentation of sampling processes and particular contextual information about the site of data collection. We propose a Minimum Information Framework for recording the necessary metadata on sampling locations, with anchor measurements and description of the hot spring vent distinct from the outflow system, and multi-scale field photography to capture vital information about hot spring structures. The SBDC framework can serve as a global model for the collection and description of hot spring systems field data that can be readily adapted for application to the curation of data from other kinds of scientifically significant sites. PMID:28253269
Framework for Integrating Science Data Processing Algorithms Into Process Control Systems
NASA Technical Reports Server (NTRS)
Mattmann, Chris A.; Crichton, Daniel J.; Chang, Albert Y.; Foster, Brian M.; Freeborn, Dana J.; Woollard, David M.; Ramirez, Paul M.
2011-01-01
A software framework called PCS Task Wrapper is responsible for standardizing the setup, process initiation, execution, and file management tasks surrounding the execution of science data algorithms, which are referred to by NASA as Product Generation Executives (PGEs). PGEs codify a scientific algorithm, some step in the overall scientific process involved in a mission science workflow. The PCS Task Wrapper provides a stable operating environment to the underlying PGE during its execution lifecycle. If the PGE requires a file, or metadata regarding the file, the PCS Task Wrapper is responsible for delivering that information to the PGE in a manner that meets its requirements. If the PGE requires knowledge of upstream or downstream PGEs in a sequence of executions, that information is also made available. Finally, if information regarding disk space, or node information such as CPU availability, etc., is required, the PCS Task Wrapper provides this information to the underlying PGE. After this information is collected, the PGE is executed, and its output Product file and Metadata generation is managed via the PCS Task Wrapper framework. The innovation is responsible for marshalling output Products and Metadata back to a PCS File Management component for use in downstream data processing and pedigree. In support of this, the PCS Task Wrapper leverages the PCS Crawler Framework to ingest (during pipeline processing) the output Product files and Metadata produced by the PGE. The architectural components of the PCS Task Wrapper framework include PGE Task Instance, PGE Config File Builder, Config File Property Adder, Science PGE Config File Writer, and PCS Met file Writer. This innovative framework is really the unifying bridge between the execution of a step in the overall processing pipeline, and the available PCS component services as well as the information that they collectively manage.
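As a loose, hypothetical sketch of the wrapper pattern described above (not the actual PCS Task Wrapper API), the Python below stages the configuration a science executable needs, runs it in its own working directory, and gathers the products it wrote for hand-off to a file-management step. The command, paths, and file patterns are placeholders.

```python
import json
import subprocess
from pathlib import Path


def run_pge(pge_command: list, workdir: Path, metadata: dict) -> list:
    """Stage inputs, execute a science step, and collect its outputs.
    A simplified stand-in for a task-wrapper component; not the PCS API."""
    workdir.mkdir(parents=True, exist_ok=True)

    # 1. Stage the configuration/metadata the executable expects.
    config_path = workdir / "pge_config.json"
    config_path.write_text(json.dumps(metadata, indent=2))

    # 2. Execute the step in its own working directory.
    subprocess.run(pge_command + [str(config_path)], cwd=workdir, check=True)

    # 3. Collect products for downstream ingestion (pattern is a placeholder).
    return sorted(workdir.glob("*.product"))


if __name__ == "__main__":
    products = run_pge(
        pge_command=["echo", "running science step with"],  # placeholder command
        workdir=Path("/tmp/pge_run"),
        metadata={"upstream_products": [], "cpu_hint": 1},
    )
    print("collected products:", products)
```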
NASA Technical Reports Server (NTRS)
1976-01-01
The framework within which the Applications Systems Verification Tests (ASVTs) are performed and the economic consequences of improved meteorological information are demonstrated is described. This framework considers the impact of improved information on decision processes, the data needs to demonstrate the economic impact of the improved information, the data availability, the methodology for determining and analyzing the collected data and demonstrating the economic impact of the improved information, and the possible methods of data collection. Three ASVTs are considered and program outlines and plans are developed for performing experiments to demonstrate the economic consequences of improved meteorological information. The ASVTs are concerned with the citrus crop in Florida, the cotton crop in Mississippi and a group of diverse crops in Oregon. The program outlines and plans include schedules, manpower estimates and funding requirements.
EPA Plan for Reducing Personally Identifiable Information, January 2013
The EPA Privacy Policy, issued in 2007, establishes the framework and accountability for reducing Agency personally identifiable information (PII). Learn about the Agency's plan to reduce the collection of PII.
The Use of the Data-to-Action Framework in the Evaluation of CDC's DELTA FOCUS Program.
Armstead, Theresa L; Kearns, Megan; Rambo, Kirsten; Estefan, Lianne Fuino; Dills, Jenny; Rivera, Moira S; El-Beshti, Rasha
The Centers for Disease Control and Prevention's (CDC's) Domestic Violence Prevention Enhancements and Leadership Through Alliances, Focusing on Outcomes for Communities United with States (DELTA FOCUS) program is a 5-year cooperative agreement (2013-2018) funding 10 state domestic violence coalitions and local coordinated community response teams to engage in primary prevention of intimate partner violence. Grantees' prevention strategies were often developmental and emergent; therefore, CDC's approach to program oversight, administration, and support to grantees required a flexible approach. CDC staff adopted a Data-to-Action Framework for the DELTA FOCUS program evaluation that supported a culture of learning to meet dynamic and unexpected information needs. Briefly, a Data-to-Action Framework involves the collection and use of information in real time for program improvement. Utilizing this framework, the DELTA FOCUS data-to-action process yielded important insights into CDC's ongoing technical assistance, improved program accountability by providing useful materials and information for internal agency leadership, and helped build a learning community among grantees. CDC and other funders, as decision makers, can promote program improvements that are data-informed by incorporating internal processes supportive of ongoing data collection and review.
Standardized reporting of functioning information on ICF-based common metrics.
Prodinger, Birgit; Tennant, Alan; Stucki, Gerold
2018-02-01
In clinical practice and research, a variety of clinical data collection tools are used to collect information on people's functioning for clinical practice, research, and national health information systems. Reporting on ICF-based common metrics enables standardized documentation of functioning information in national health information systems. The objective of this methodological note on applying the ICF in rehabilitation is to demonstrate how to report functioning information collected with a data collection tool on ICF-based common metrics. We first specify the requirements for the standardized reporting of functioning information. Secondly, we introduce the methods needed for transforming functioning data to ICF-based common metrics. Finally, we provide an example. The requirements for standardized reporting are as follows: 1) having a common conceptual framework to enable content comparability between any health information; and 2) a measurement framework so that scores between two or more clinical data collection tools can be directly compared. The methods needed to achieve these requirements are the ICF Linking Rules and the Rasch measurement model. Using data collected incorporating the 36-item Short Form Health Survey (SF-36), the World Health Organization Disability Assessment Schedule 2.0 (WHODAS 2.0), and the Stroke Impact Scale 3.0 (SIS 3.0), the application of the standardized reporting based on common metrics is demonstrated. A subset of items from the three tools linked to common chapters of the ICF (d4 Mobility, d5 Self-care and d6 Domestic life) were entered as "super items" into the Rasch model. Good fit was achieved with no residual local dependency and a unidimensional metric. A transformation table allows for comparison between scales, and between a scale and the reporting common metric. Being able to report functioning information collected with commonly used clinical data collection tools with ICF-based common metrics enables clinicians and researchers to continue using their tools while still being able to compare and aggregate the information within and across tools.
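The end product the authors describe is a transformation table that maps an instrument's raw score onto the common metric; the sketch below shows how such a table could be applied once it exists. The numeric values are invented placeholders, not published Rasch calibrations for the SF-36, WHODAS 2.0, or SIS 3.0.

```python
# Hypothetical transformation tables: raw instrument score -> common 0-100 metric.
# Values are placeholders for illustration, not published Rasch calibrations.
TRANSFORMATION_TABLES = {
    "instrument_a": {0: 0.0, 1: 18.0, 2: 34.0, 3: 55.0, 4: 78.0, 5: 100.0},
    "instrument_b": {0: 0.0, 1: 10.0, 2: 22.0, 3: 39.0, 4: 58.0, 5: 81.0, 6: 100.0},
}


def to_common_metric(instrument: str, raw_score: int) -> float:
    """Look up the common-metric value for a raw score on a given instrument."""
    return TRANSFORMATION_TABLES[instrument][raw_score]


# Two patients assessed with different tools become directly comparable.
print(to_common_metric("instrument_a", 3))  # 55.0
print(to_common_metric("instrument_b", 4))  # 58.0
```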
Electronic immunization data collection systems: application of an evaluation framework.
Heidebrecht, Christine L; Kwong, Jeffrey C; Finkelstein, Michael; Quan, Sherman D; Pereira, Jennifer A; Quach, Susan; Deeks, Shelley L
2014-01-14
Evaluating the features and performance of health information systems can serve to strengthen the systems themselves as well as to guide other organizations in the process of designing and implementing surveillance tools. We adapted an evaluation framework in order to assess electronic immunization data collection systems, and applied it in two Ontario public health units. The Centers for Disease Control and Prevention's Guidelines for Evaluating Public Health Surveillance Systems are broad in nature and serve as an organizational tool to guide the development of comprehensive evaluation materials. Based on these Guidelines, and informed by other evaluation resources and input from stakeholders in the public health community, we applied an evaluation framework to two examples of immunization data collection and examined several system attributes: simplicity, flexibility, data quality, timeliness, and acceptability. Data collection approaches included key informant interviews, logic and completeness assessments, client surveys, and on-site observations. Both evaluated systems allow high-quality immunization data to be collected, analyzed, and applied in a rapid fashion. However, neither system is currently able to link to other providers' immunization data or provincial data sources, limiting the comprehensiveness of coverage assessments. We recommended that both organizations explore possibilities for external data linkage and collaborate with other jurisdictions to promote a provincial immunization repository or data sharing platform. Electronic systems such as the ones described in this paper allow immunization data to be collected, analyzed, and applied in a rapid fashion, and represent the infostructure required to establish a population-based immunization registry, critical for comprehensively assessing vaccine coverage.
Atlanta congestion reduction demonstration. National evaluation : content analysis test plan.
DOT National Transportation Integrated Search
2000-05-30
Commercial Vehicle Information Systems and Networks (CVISN) is the collection of information systems and communication networks that support commercial vehicle operations (CVO.) The National ITS Architecture provides a technical framework that descri...
2017-09-01
unique characteristics of reported anomalies in the collected traffic signals to build a classification framework. Other cyber events, such as a... [2]. The applications build flow rules using network topology information provided by the control plane [1]. Since the control plane is able to
A conceptual framework for the collection of food products in a Total Diet Study.
Turrini, Aida; Lombardi-Boccia, Ginevra; Aureli, Federica; Cubadda, Francesco; D'Addezio, Laura; D'Amato, Marilena; D'Evoli, Laura; Darnerud, PerOla; Devlin, Niamh; Dias, Maria Graça; Jurković, Marina; Kelleher, Cecily; Le Donne, Cinzia; López Esteban, Maite; Lucarini, Massimo; Martinez Burgos, Maria Alba; Martínez-Victoria, Emilio; McNulty, Breige; Mistura, Lorenza; Nugent, Anne; Oktay Basegmez, Hatice Imge; Oliveira, Luisa; Ozer, Hayrettin; Perelló, Gemma; Pite, Marina; Presser, Karl; Sokolić, Darja; Vasco, Elsa; Volatier, Jean-Luc
2018-02-01
A total diet study (TDS) provides representative and realistic data for assessing the dietary intake of chemicals, such as contaminants and residues, and nutrients, at a population level. Reproducing the diet through collection of customarily consumed foods and their preparation as habitually eaten is crucial to ensure representativeness, i.e., all relevant foods are included and all potential dietary sources of the substances investigated are captured. Having this in mind, a conceptual framework for building a relevant food-shopping list was developed as a research task in the European Union's 7th Framework Program project, 'Total Diet Study Exposure' (TDS-Exposure), aimed at standardising methods for food sampling, analyses, exposure assessment calculations and modelling, priority foods, and selection of chemical contaminants. A stepwise approach following the knowledge translation (KT) model for concept analysis is proposed to set up a general protocol for the collection of food products in a TDS in terms of steps (characterisation of the food list, development of the food-shopping list, food products collection) and pillars (background documentation, procedures, and tools). A simple model for structuring the information in a way to support the implementation of the process, by presenting relevant datasets, forms to store inherent information, and folders to record the results is also proposed. Reproducibility of the process and possibility to exploit the gathered information are two main features of such a system for future applications.
Bahşi, Hayretdin; Levi, Albert
2010-01-01
Wireless sensor networks (WSNs) generally have a many-to-one structure so that event information flows from sensors to a unique sink. In recent WSN applications, many-to-many structures evolved due to the need for conveying collected event information to multiple sinks. Privacy-preserving data collection models in the literature do not solve the problems of WSN applications in which the network has multiple untrusted sinks with different levels of privacy requirements. This study proposes a data collection framework based on k-anonymity for preventing record disclosure of collected event information in WSNs. The proposed method takes the anonymity requirements of multiple sinks into consideration by providing different levels of privacy for each destination sink. Attributes that may identify an event owner are generalized or encrypted in order to meet the different anonymity requirements of sinks at the same anonymized output. If the same output is formed, it can be multicasted to all sinks. The other trivial solution is to produce different anonymized outputs for each sink and send them to related sinks. Multicasting is an energy efficient data sending alternative for some sensor nodes. Since minimization of energy consumption is an important design criterion for WSNs, multicasting the same event information to multiple sinks reduces the energy consumption of the overall network.
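A minimal sketch of the generalization idea the abstract describes: quasi-identifying attributes of event records are coarsened until every record shares its generalized attributes with at least k-1 others, with a different k per destination sink. The attribute names, generalization hierarchy, and k values are illustrative assumptions, not the paper's protocol.

```python
from collections import Counter


def generalize_age(age: int, level: int) -> str:
    """Coarsen an age value: level 0 = exact, 1 = 10-year band, 2 = suppressed."""
    if level == 0:
        return str(age)
    if level == 1:
        low = (age // 10) * 10
        return f"{low}-{low + 9}"
    return "*"


def anonymize(records: list, k: int) -> list:
    """Raise the generalization level until every (age, zone) group has >= k rows."""
    for level in range(3):
        generalized = [
            {"age": generalize_age(r["age"], level), "zone": r["zone"], "event": r["event"]}
            for r in records
        ]
        groups = Counter((g["age"], g["zone"]) for g in generalized)
        if all(count >= k for count in groups.values()):
            return generalized
    return generalized  # fully suppressed ages as a fallback


events = [
    {"age": 34, "zone": "A", "event": "motion"},
    {"age": 36, "zone": "A", "event": "motion"},
    {"age": 31, "zone": "A", "event": "heat"},
    {"age": 52, "zone": "A", "event": "motion"},
]
# Different sinks may demand different anonymity levels (k values are made up).
for sink, k in {"sink-1": 2, "sink-2": 4}.items():
    print(sink, anonymize(events, k))
```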
NASA Astrophysics Data System (ADS)
Koorehdavoudi, Hana; Bogdan, Paul
2016-06-01
Biological systems are frequently categorized as complex systems due to their capabilities of generating spatio-temporal structures from apparent random decisions. In spite of research on analyzing biological systems, we lack a quantifiable framework for measuring their complexity. To fill this gap, in this paper, we develop a new paradigm to study a collective group of N agents moving and interacting in a three-dimensional space. Our paradigm helps to identify the spatio-temporal states of the motion of the group and their associated transition probabilities. This framework enables the estimation of the free energy landscape corresponding to the identified states. Based on the energy landscape, we quantify missing information, emergence, self-organization and complexity for a collective motion. We show that the collective motion of the group of agents evolves to reach the most probable state with relatively lowest energy level and lowest missing information compared to other possible states. Our analysis demonstrates that the natural group of animals exhibits a higher degree of emergence, self-organization and complexity over time. Consequently, this algorithm can be integrated into new frameworks to engineer collective motions to achieve certain degrees of emergence, self-organization and complexity.
Reducing data friction through site-based data curation
NASA Astrophysics Data System (ADS)
Thomer, A.; Palmer, C. L.
2017-12-01
Much of geoscience research takes place at "scientifically significant sites": localities which have attracted a critical mass of scientific interest, and thereby merit protection by government bodies, as well as the preservation of specimen and data collections and the development of site-specific permitting requirements for access to the site and its associated collections. However, many data standards and knowledge organization schemas do not adequately describe key characteristics of the sites, despite their centrality to research projects. Through work conducted as part of the IMLS-funded Site-Based Data Curation (SBDC) project, we developed a Minimum Information Framework (MIF) for site-based science, in which "information about a site's structure" is considered a core class of information. Here we present our empirically-derived information framework, as well as the methods used to create it. We believe these approaches will lead to the development of more effective data repositories and tools, and thereby will reduce "data friction" in interdisciplinary, yet site-based, geoscience workflows. The Minimum Information Framework for Site-based Research was developed through work at two scientifically significant sites: the hot springs at Yellowstone National Park, which are key to geobiology research; and the La Brea Tar Pits, an important paleontology locality in Southern California. We employed diverse methods of participatory engagement, in which key stakeholders at our sites (e.g. curators, collections managers, researchers, permit officers) were consulted through workshops, focus groups, interviews, action research methods, and collaborative information modeling and systems analysis. These participatory approaches were highly effective in fostering on-going partnership among a diverse team of domain scientists, information scientists, and software developers. The MIF developed in this work may be viewed as a "proto-standard" that can inform future repository development and data standards. Further, the approaches used to develop the MIF represent an important step toward systematic methods of developing geoscience data standards. Finally, we argue that organizing data around aspects of a site makes data collections more accessible to a range of scientific communities.
The Regulatory Framework for Privacy and Security
NASA Astrophysics Data System (ADS)
Hiller, Janine S.
The internet enables the easy collection of massive amounts of personally identifiable information. Unregulated data collection causes distrust and conflicts with widely accepted principles of privacy. The regulatory framework in the United States for ensuring privacy and security in the online environment consists of federal, state, and self-regulatory elements. New laws have been passed to address technological and internet practices that conflict with privacy protecting policies. The United States and the European Union approaches to privacy differ significantly, and the global internet environment will likely cause regulators to face the challenge of balancing privacy interests with data collection for many years to come.
A framework for evaluating proposals for scientific activities in wilderness
Peter Landres
2000-01-01
This paper presents a structured framework for evaluating proposals for scientific activities in wilderness. Wilderness managers receive proposals for scientific activities ranging from unobtrusive inventorying of plants and animals to the use of chainsaws and helicopters for collecting information. Currently, there is no consistent process for evaluating proposals,...
Linux Incident Response Volatile Data Analysis Framework
ERIC Educational Resources Information Center
McFadden, Matthew
2013-01-01
Cyber incident response is an emphasized subject area in cybersecurity in information technology with increased need for the protection of data. Due to ongoing threats, cybersecurity imposes many challenges and requires new investigative response techniques. In this study a Linux Incident Response Framework is designed for collecting volatile data…
Prediction of compression-induced image interpretability degradation
NASA Astrophysics Data System (ADS)
Blasch, Erik; Chen, Hua-Mei; Irvine, John M.; Wang, Zhonghai; Chen, Genshe; Nagy, James; Scott, Stephen
2018-04-01
Image compression is an important component in modern imaging systems as the volume of the raw data collected is increasing. To reduce the volume of data while collecting imagery useful for analysis, choosing the appropriate image compression method is desired. Lossless compression is able to preserve all the information, but it has limited reduction power. On the other hand, lossy compression, which may result in very high compression ratios, suffers from information loss. We model the compression-induced information loss in terms of the National Imagery Interpretability Rating Scale or NIIRS. NIIRS is a user-based quantification of image interpretability widely adopted by the Geographic Information System community. Specifically, we present the Compression Degradation Image Function Index (CoDIFI) framework that predicts the NIIRS degradation (i.e., a decrease of NIIRS level) for a given compression setting. The CoDIFI-NIIRS framework enables a user to broker the maximum compression setting while maintaining a specified NIIRS rating.
ERIC Educational Resources Information Center
Tzoc, Elias
2011-01-01
According to the "Framework of Guidance for Building Good Digital Collections," a good collection is broadly available and avoids unnecessary impediments to use. Two challenges, however, are the constant change in users' expectations and the increasing volume of information in local repositories. Therefore, as academic and research…
NASA Astrophysics Data System (ADS)
Sorensen, A. E.; Dauer, J. M.; Corral, L.; Fontaine, J. J.
2017-12-01
A core component of public scientific literacy, and thereby informed decision-making, is the ability of individuals to reason about complex systems. In response to students having difficulty learning about complex systems, educational research suggests that conceptual representations, or mental models, may help orient student thinking. Mental models provide a framework to support students in organizing and developing ideas. The PMC-2E model is a productive tool in teaching ideas of modeling complex systems in the classroom because the conceptual representation framework allows for self-directed learning where students can externalize systems thinking. Beyond mental models, recent work emphasizes the importance of facilitating integration of authentic science into the formal classroom. To align these ideas, a university class was developed around the theme of carnivore ecology, founded on the PMC-2E framework and authentic scientific data collection. Students were asked to develop a protocol, collect, and analyze data around a scientific question in partnership with a scientist, and then use data to inform their own learning about the system through the mental model process. We identified two beneficial outcomes: (1) scientific data is collected to address real scientific questions at a larger scale, and (2) students show positive outcomes in learning and views of science. After participating in the class, students report enjoying the class structure, increased support for public understanding of science, and shifts in nature of science and interest in pursuing science metrics on post-assessments. Further work is ongoing to investigate the linkages between engaging in authentic scientific practices that inform student mental models, and how it might promote students' systems-thinking skills, implications for student views of nature of science, and development of student epistemic practices.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-12
... Request; Information for Self-Certification Under FAQ 6 of the United States--European Union Safe Harbor.... SUPPLEMENTARY INFORMATION: I. Abstract In response to the European Union Directive on Data Protection that... framework bridges the differences between the European Union (EU) and U.S. approaches to privacy protection...
BioEve Search: A Novel Framework to Facilitate Interactive Literature Search
Ahmed, Syed Toufeeq; Davulcu, Hasan; Tikves, Sukru; Nair, Radhika; Zhao, Zhongming
2012-01-01
Background. Recent advances in computational and biological methods in last two decades have remarkably changed the scale of biomedical research and with it began the unprecedented growth in both the production of biomedical data and amount of published literature discussing it. An automated extraction system coupled with a cognitive search and navigation service over these document collections would not only save time and effort, but also pave the way to discover hitherto unknown information implicitly conveyed in the texts. Results. We developed a novel framework (named “BioEve”) that seamlessly integrates Faceted Search (Information Retrieval) with Information Extraction module to provide an interactive search experience for the researchers in life sciences. It enables guided step-by-step search query refinement, by suggesting concepts and entities (like genes, drugs, and diseases) to quickly filter and modify search direction, and thereby facilitating an enriched paradigm where user can discover related concepts and keywords to search while information seeking. Conclusions. The BioEve Search framework makes it easier to enable scalable interactive search over large collection of textual articles and to discover knowledge hidden in thousands of biomedical literature articles with ease. PMID:22693501
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-29
... Traceability; Tribal Nations Using Systems for Location Identification AGENCY: Animal and Plant Health... using systems for location identification for the animal disease traceability framework and to request....aphis.usda.gov ). FOR FURTHER INFORMATION CONTACT: For information on Tribal Nations using location...
Understanding Collective Activities of People from Videos.
Wongun Choi; Savarese, Silvio
2014-06-01
This paper presents a principled framework for analyzing collective activities at different levels of semantic granularity from videos. Our framework is capable of jointly tracking multiple individuals, recognizing activities performed by individuals in isolation (i.e., atomic activities such as walking or standing), recognizing the interactions between pairs of individuals (i.e., interaction activities) as well as understanding the activities of groups of individuals (i.e., collective activities). A key property of our work is that it can coherently combine bottom-up information stemming from detections or fragments of tracks (or tracklets) with top-down evidence. Top-down evidence is provided by a newly proposed descriptor that captures the coherent behavior of groups of individuals in a spatial-temporal neighborhood of the sequence. Top-down evidence provides contextual information for establishing accurate associations between detections or tracklets across frames and, thus, for obtaining more robust tracking results. Bottom-up evidence percolates upwards so as to automatically infer collective activity labels. Experimental results on two challenging data sets demonstrate our theoretical claims and indicate that our model achieves enhanced tracking results and the best collective classification results to date.
NASA Astrophysics Data System (ADS)
Zha, Yuanyuan; Yeh, Tian-Chyi J.; Illman, Walter A.; Onoe, Hironori; Mok, Chin Man W.; Wen, Jet-Chau; Huang, Shao-Yang; Wang, Wenke
2017-04-01
Hydraulic tomography (HT) has become a mature aquifer test technology over the last two decades. It collects nonredundant information of aquifer heterogeneity by sequentially stressing the aquifer at different wells and collecting aquifer responses at other wells during each stress. The collected information is then interpreted by inverse models. Among these models, the geostatistical approaches, built upon the Bayesian framework, first conceptualize hydraulic properties to be estimated as random fields, which are characterized by means and covariance functions. They then use the spatial statistics as prior information with the aquifer response data to estimate the spatial distribution of the hydraulic properties at a site. Since the spatial statistics describe the generic spatial structures of the geologic media at the site rather than site-specific ones (e.g., known spatial distributions of facies, faults, or paleochannels), the estimates are often not optimal. To improve the estimates, we introduce a general statistical framework, which allows the inclusion of site-specific spatial patterns of geologic features. Subsequently, we test this approach with synthetic numerical experiments. Results show that this approach, using conditional mean and covariance that reflect site-specific large-scale geologic features, indeed improves the HT estimates. Afterward, this approach is applied to HT surveys at a kilometer-scale-fractured granite field site with a distinct fault zone. We find that by including fault information from outcrops and boreholes for HT analysis, the estimated hydraulic properties are improved. The improved estimates subsequently lead to better prediction of flow during a different pumping test at the site.
Supporting Young Learners 3: Ideas for Child Care Providers and Teachers.
ERIC Educational Resources Information Center
Brickman, Nancy Altman, Ed.
The High/Scope Curriculum is a developmentally based approach to early childhood education. This curriculum's "Extensions" newsletter, in which the articles in this collection first appeared, informs curriculum users about new developments relating to the High/Scope "open framework" curriculum. This collection divides the…
78 FR 52918 - Agency Information Collection Activities; Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-27
... the following industries: transportation services; communication; electric, gas, and sanitary services... substantially changed the federal legal framework for financial services providers. Among the changes, the Dodd... estimates of burden are based on its knowledge of the consumer credit industries and knowledge of the...
Supporting Young Learners 2: Ideas for Child Care Providers and Teachers.
ERIC Educational Resources Information Center
Brickman, Nancy Altman, Ed.
The High/Scope Curriculum is a developmentally based approach to early childhood education. The curriculum's "Extensions" newsletter, in which the articles in this collection first appeared, informs curriculum users about new developments relating to the High/Scope "open framework" curriculum. This collection divides the…
A Handbook for School District Financial Management.
ERIC Educational Resources Information Center
Dembowski, Frederick L.
Designed for school business officials, this handbook provides research information and guidelines on school district banking and cash management systems. Section 1 gives an overview of district financial management operations, discussing the administrative framework, cash budgeting, information and control systems, collection and disbursement…
Riccardo, Flavia; Dente, Maria Grazia; Kärki, Tommi; Fabiani, Massimo; Napoli, Christian; Chiarenza, Antonio; Giorgi Rossi, Paolo; Munoz, Cesar Velasco; Noori, Teymur; Declich, Silvia
2015-09-17
There are limitations in our capacity to interpret point estimates and trends of infectious diseases occurring among diverse migrant populations living in the European Union/European Economic Area (EU/EEA). The aim of this study was to design a data collection framework that could capture information on factors associated with increased risk of infectious diseases in migrant populations in the EU/EEA. The authors defined factors associated with increased risk according to a multi-dimensional framework and performed a systematic literature review in order to identify whether those factors well reflected the reported risk factors for infectious disease in these populations. Following this, the feasibility of applying this framework to relevant available EU/EEA data sources was assessed. The proposed multidimensional framework is well suited to capture the complexity and concurrence of these risk factors and is in principle applicable in the EU/EEA. The authors conclude that adopting a multi-dimensional framework to monitor infectious diseases could favor the disaggregated collection and analysis of migrant health data.
Advancing the use of performance evaluation in health care.
Traberg, Andreas; Jacobsen, Peter; Duthiers, Nadia Monique
2014-01-01
The purpose of this paper is to develop a framework for health care performance evaluation that enables decision makers to identify areas indicative of corrective actions. The framework should provide information on strategic progress or regress in an operational context that justifies the need for organizational adjustments. The study adopts qualitative methods for constructing the framework, subsequently implementing the framework in a Danish magnetic resonance imaging (MRI) unit. Workshops and interviews form the basis of the qualitative construction phase, and two internal and five external databases are used for quantitative data collection. By aggregating performance outcomes, collective measures of performance are achieved. This enables easy and intuitive identification of areas that are not strategically aligned. In general, the framework has proven helpful in an MRI unit, where operational decision makers have been struggling with extensive amounts of performance information. The implementation of the framework in a single case in a public and highly political environment restricts the generalizing potential. The authors acknowledge that there may be more suitable approaches in organizations with different settings. The strength of the framework lies in the identification of performance problems prior to decision making. The quality of decisions is directly related to the individual decision maker; the only function of the framework is to support these decisions. The study demonstrates a more refined and transparent use of performance reporting by combining strategic weight assignment and performance aggregation in hierarchies. In this way, the framework accentuates performance as a function of strategic progress or regress, thus assisting decision makers in exerting operational effort in pursuit of strategic alignment.
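The weighted, hierarchical aggregation idea described above can be pictured with a short sketch. This is a minimal illustration, not the authors' implementation: normalized indicator scores are rolled up through strategy-weighted groups into a single collective measure. All indicator names, weights and scores are hypothetical.

```python
# Minimal sketch of strategic weight assignment and performance aggregation in a hierarchy.
def aggregate(node):
    """node is either a leaf {'weight': w, 'score': s} or a group {'weight': w, 'children': [...]}."""
    if "score" in node:
        return node["score"]
    total_w = sum(child["weight"] for child in node["children"])
    return sum(child["weight"] * aggregate(child) for child in node["children"]) / total_w

mri_unit = {"children": [
    {"weight": 0.5, "children": [                      # strategic goal: patient flow
        {"weight": 0.6, "score": 0.82},                # scan throughput vs. target
        {"weight": 0.4, "score": 0.70}]},              # waiting time vs. target
    {"weight": 0.5, "children": [                      # strategic goal: quality
        {"weight": 0.7, "score": 0.91},                # re-scan rate vs. target
        {"weight": 0.3, "score": 0.65}]},              # reporting delay vs. target
]}
print(round(aggregate(mri_unit), 2))   # low aggregated scores flag areas needing corrective action
```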
NASA Astrophysics Data System (ADS)
Nurmaini, Siti; Firsandaya Malik, Reza; Stiawan, Deris; Firdaus; Saparudin; Tutuko, Bambang
2017-04-01
The information framework aims to holistically address the problems and issues posed by unwanted peat and land fires within the context of the natural environment and socio-economic systems. Informed decisions on planning and allocation of resources can only be made by understanding the landscape. Therefore, information on fire history and air quality impacts must be collected for future analysis. This paper proposes a strategic framework, based on a technology approach with a data fusion strategy, to produce data analyses of peat land fires and air quality management in South Sumatera. The research framework should use the knowledge, experience and data from previous fire seasons to review, improve and refine the strategies and monitor their effectiveness for the next fire season. Communicating effectively with communities and the public and private sectors in remote and rural landscapes is important, for example by using smartphones and mobile applications. Tools such as one-stop information services based on web applications, providing early warnings and the ability to send and receive fire alerts, could be developed and promoted so that all stakeholders can share important information with each other.
Virtual shelves in a digital library: a framework for access to networked information sources.
Patrick, T B; Springer, G K; Mitchell, J A; Sievert, M E
1995-01-01
OBJECTIVE: Develop a framework for collections-based access to networked information sources that addresses the problem of location-dependent access to information sources. DESIGN: This framework uses the metaphor of a virtual shelf. A virtual shelf is a general-purpose server that is dedicated to a particular information subject class. The identifier of one of these servers identifies its subject class. Location-independent call numbers are assigned to information sources. Call numbers are based on standard vocabulary codes. The call numbers are first mapped to the location-independent identifiers of virtual shelves. When access to an information resource is required, a location directory provides a second mapping of these location-independent server identifiers to actual network locations. RESULTS: The framework has been implemented in two different systems. One system is based on the Open Software Foundation/Distributed Computing Environment and the other is based on the World Wide Web. CONCLUSIONS: This framework applies traditional methods of library classification and cataloging in new ways. It is compatible with two traditional styles of selecting information: searching and browsing. Traditional methods may be combined with new paradigms of information searching that will be able to take advantage of the special properties of digital information. Cooperation between the library and information science community and the informatics community can provide a means for a continuing application of the knowledge and techniques of library science to the new problems of networked information sources. PMID:8581554
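As a rough illustration of the two-stage lookup the abstract describes, the sketch below maps a call number to a location-independent virtual-shelf identifier and then resolves that identifier through a location directory at access time. The vocabulary codes, shelf identifiers and URLs are hypothetical placeholders, not the systems built by the authors.

```python
# Minimal sketch of location-independent access via virtual shelves.
call_number_to_shelf = {
    "VOCAB:0001": "shelf:endocrine-disorders",        # illustrative standard vocabulary code
    "VOCAB:0002": "shelf:cardiovascular",
}

location_directory = {                                # updated whenever a server moves
    "shelf:endocrine-disorders": "http://shelf1.example.org/",
    "shelf:cardiovascular": "http://shelf2.example.org/",
}

def resolve(call_number):
    shelf = call_number_to_shelf[call_number]         # first mapping: location-independent identifier
    return location_directory[shelf]                  # second mapping: actual network location

print(resolve("VOCAB:0001"))
```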
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-14
... Request; Survey of Participating Companies in the U.S.-European Union and U.S.-Swiss Safe Harbor... Trade Administration (ITA) administers the U.S.-European Union (EU) and U.S.- Swiss Safe Harbor Frameworks. These Frameworks allow U.S. companies to meet the requirements of the European Union's Data...
Young, Meredith E; Thomas, Aliki; Varpio, Lara; Razack, Saleem I; Hanson, Mark D; Slade, Steve; Dayem, Katharine L; McKnight, David J
2017-04-01
Several national-level calls have encouraged reconsideration of diversity issues in medical education. Particular interest has been placed on admissions, as decisions made here shape the nature of the future physician workforce. Critical analysis of current practices, paired with evidence-informed policies, may counter some of the barriers impeding access for underrepresented groups. We present a framework for diversity-related program development and evaluation grounded within a knowledge translation framework and supported by the initiation of longitudinal collection of diversity-related data. We provide an illustrative case study for each component of the framework. Descriptive analyses of pre/post-intervention diversity metrics are presented where applicable and available. The framework's focal points are: 1) data-driven identification of underrepresented groups, 2) pipeline development and targeted recruitment, 3) ensuring an inclusive process, 4) ensuring inclusive assessment, 5) ensuring inclusive selection, and 6) iterative use of diversity-related data. Case studies ranged from wording changes on admissions websites to the establishment of educational and administrative offices addressing the needs of underrepresented populations. We propose that diversity-related data must be collected on a variety of markers, developed in partnership with stakeholders who are most likely to facilitate implementation of best practices and new policies. These data can facilitate the design, implementation, and evaluation of evidence-informed diversity initiatives and provide a structure for continued investigation into 'interventions' supporting diversity-related initiatives.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-14
... Collection; Comment Request; Coral Reef Conservation Program Administration AGENCY: National Oceanic and... The Coral Reef Conservation Act of 2000 (Act) was enacted to provide a framework for conserving coral reefs. The Coral Reef Conservation Grant Program, under the Act, provides funds to broad- based...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-01
... the survey, a choice experiment framework is used with statistically designed tradeoff questions... reason for the survey is public value research. All survey responses will be kept confidential. Form... Collection Request; Comment Request; Willingness to Pay Survey for Salmon Recovery in the Willamette...
Extending information retrieval methods to personalized genomic-based studies of disease.
Ye, Shuyun; Dawson, John A; Kendziorski, Christina
2014-01-01
Genomic-based studies of disease now involve diverse types of data collected on large groups of patients. A major challenge facing statistical scientists is how best to combine the data, extract important features, and comprehensively characterize the ways in which they affect an individual's disease course and likelihood of response to treatment. We have developed a survival-supervised latent Dirichlet allocation (survLDA) modeling framework to address these challenges. Latent Dirichlet allocation (LDA) models have proven extremely effective at identifying themes common across large collections of text, but applications to genomics have been limited. Our framework extends LDA to the genome by considering each patient as a "document" with "text" detailing his/her clinical events and genomic state. We then further extend the framework to allow for supervision by a time-to-event response. The model enables the efficient identification of collections of clinical and genomic features that co-occur within patient subgroups, and then characterizes each patient by those features. An application of survLDA to The Cancer Genome Atlas ovarian project identifies informative patient subgroups showing differential response to treatment, and validation in an independent cohort demonstrates the potential for patient-specific inference.
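The "patient as document" representation described above can be sketched with an ordinary, unsupervised LDA as a stand-in; the actual survLDA model additionally supervises the topics with a time-to-event response, which this sketch omits. Feature tokens and counts are hypothetical.

```python
# Minimal sketch: each patient is a "document" whose "words" are discretized clinical/genomic events.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

patients = [
    "TP53_mutated stage_III platinum_resistant CCNE1_amplified",
    "BRCA1_mutated stage_III platinum_sensitive",
    "BRCA2_mutated stage_II platinum_sensitive",
    "TP53_mutated CCNE1_amplified stage_IV platinum_resistant",
]

counts = CountVectorizer(token_pattern=r"\S+").fit_transform(patients)   # patient-by-feature counts
lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(counts)        # per-patient topic mixtures (rows sum to 1)
print(theta.round(2))                    # candidate patient subgroups appear as dominant topics
```

In the published model, the topic estimation is coupled to the survival outcome rather than run unsupervised as above.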
Airborne electromagnetic mapping of the base of aquifer in areas of western Nebraska
Abraham, Jared D.; Cannia, James C.; Bedrosian, Paul A.; Johnson, Michaela R.; Ball, Lyndsay B.; Sibray, Steven S.
2012-01-01
Airborne geophysical surveys of selected areas of the North and South Platte River valleys of Nebraska, including Lodgepole Creek valley, collected data to map aquifers and bedrock topography and thus improve the understanding of groundwater - surface-water relationships to be used in water-management decisions. Frequency-domain helicopter electromagnetic surveys, using a unique survey flight-line design, collected resistivity data that can be related to lithologic information for refinement of groundwater model inputs. To make the geophysical data useful to multidimensional groundwater models, numerical inversion converted measured data into a depth-dependent subsurface resistivity model. The inverted resistivity model, along with sensitivity analyses and test-hole information, is used to identify hydrogeologic features such as bedrock highs and paleochannels, to improve estimates of groundwater storage. The two- and three-dimensional interpretations provide the groundwater modeler with a high-resolution hydrogeologic framework and a quantitative estimate of framework uncertainty. The new hydrogeologic frameworks improve understanding of the flow-path orientation by refining the location of paleochannels and associated base of aquifer highs. These interpretations provide resource managers high-resolution hydrogeologic frameworks and quantitative estimates of framework uncertainty. The improved base of aquifer configuration represents the hydrogeology at a level of detail not achievable with previously available data.
GIS Application System Design Applied to Information Monitoring
NASA Astrophysics Data System (ADS)
Qun, Zhou; Yujin, Yuan; Yuena, Kang
A natural environment information management system involves on-line instrument monitoring, data communications, database establishment, information management software development and so on. Its core lies in collecting effective and reliable environmental information, increasing the utilization rate and degree of sharing of environmental information through advanced information technology, and providing a timely and scientific foundation for environmental monitoring and management. This thesis adopts C# plug-in application development and uses a set of embedded GIS component and tool libraries provided by GIS Engine to build the core of a plug-in GIS application framework, namely the design and implementation of the framework host program and each functional plug-in, as well as the design and implementation of the plug-in GIS application framework platform. The thesis takes advantage of dynamic plug-in loading and configuration, quickly establishes GIS applications through visual, component-based collaborative modeling, and realizes GIS application integration. The developed platform is applicable to any integration effort related to GIS applications (on the ESRI platform) and can serve as a base development platform for GIS application development.
ERIC Educational Resources Information Center
Hyman, Harvey
2012-01-01
This dissertation examines the impact of exploration and learning upon eDiscovery information retrieval; it is written in three parts. Part I contains foundational concepts and background on the topics of information retrieval and eDiscovery. This part informs the reader about the research frameworks, methodologies, data collection, and…
Metrics for comparing neuronal tree shapes based on persistent homology.
Li, Yanjie; Wang, Dingkang; Ascoli, Giorgio A; Mitra, Partha; Wang, Yusu
2017-01-01
As more and more neuroanatomical data are made available through efforts such as NeuroMorpho.Org and FlyCircuit.org, the need to develop computational tools to facilitate automatic knowledge discovery from such large datasets becomes more urgent. One fundamental question is how best to compare neuron structures, for instance to organize and classify large collections of neurons. We aim to develop a flexible yet powerful framework to support comparison and classification of large collections of neuron structures efficiently. Specifically, we propose to use a topological persistence-based feature vectorization framework. Existing methods to vectorize a neuron (i.e., convert a neuron to a feature vector so as to support efficient comparison and/or searching) typically rely on statistics or summaries of morphometric information, such as the average or maximum local torque angle or partition asymmetry. These simple summaries have limited power in encoding global tree structures. Based on the concept of topological persistence recently developed in the field of computational topology, we vectorize each neuron structure into a simple yet informative summary. In particular, each type of information of interest can be represented as a descriptor function defined on the neuron tree, which is then mapped to a simple persistence signature. Our framework can encode both local and global tree structure, as well as other information of interest (electrophysiological or dynamical measures), by considering multiple descriptor functions on the neuron. The resulting persistence-based signature is potentially more informative than simple statistical summaries (such as average/mean/max) of morphometric quantities. Indeed, we show that using a certain descriptor function will give a persistence-based signature containing strictly more information than the classical Sholl analysis. At the same time, our framework retains the efficiency associated with treating neurons as points in a simple Euclidean feature space, which would be important for constructing efficient searching or indexing structures over them. We present preliminary experimental results to demonstrate the effectiveness of our persistence-based neuronal feature vectorization framework.
Metrics for comparing neuronal tree shapes based on persistent homology
Li, Yanjie; Wang, Dingkang; Ascoli, Giorgio A.; Mitra, Partha
2017-01-01
As more and more neuroanatomical data are made available through efforts such as NeuroMorpho.Org and FlyCircuit.org, the need to develop computational tools to facilitate automatic knowledge discovery from such large datasets becomes more urgent. One fundamental question is how best to compare neuron structures, for instance to organize and classify large collections of neurons. We aim to develop a flexible yet powerful framework to support comparison and classification of large collections of neuron structures efficiently. Specifically, we propose to use a topological persistence-based feature vectorization framework. Existing methods to vectorize a neuron (i.e., convert a neuron to a feature vector so as to support efficient comparison and/or searching) typically rely on statistics or summaries of morphometric information, such as the average or maximum local torque angle or partition asymmetry. These simple summaries have limited power in encoding global tree structures. Based on the concept of topological persistence recently developed in the field of computational topology, we vectorize each neuron structure into a simple yet informative summary. In particular, each type of information of interest can be represented as a descriptor function defined on the neuron tree, which is then mapped to a simple persistence signature. Our framework can encode both local and global tree structure, as well as other information of interest (electrophysiological or dynamical measures), by considering multiple descriptor functions on the neuron. The resulting persistence-based signature is potentially more informative than simple statistical summaries (such as average/mean/max) of morphometric quantities. Indeed, we show that using a certain descriptor function will give a persistence-based signature containing strictly more information than the classical Sholl analysis. At the same time, our framework retains the efficiency associated with treating neurons as points in a simple Euclidean feature space, which would be important for constructing efficient searching or indexing structures over them. We present preliminary experimental results to demonstrate the effectiveness of our persistence-based neuronal feature vectorization framework. PMID:28809960
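A minimal sketch of the core idea, not the authors' code: given a descriptor function on the nodes of a neuron tree, compute its 0-dimensional persistence pairs with a superlevel-set sweep and union-find. The example tree and descriptor values are hypothetical.

```python
# Minimal sketch: 0-dimensional persistence of a descriptor function on a tree.
def tree_persistence(nodes, edges, f):
    """nodes: node ids; edges: (u, v) pairs forming a tree; f: dict node -> descriptor value.
    Returns a list of (birth, death) pairs for the superlevel-set filtration."""
    parent = {v: v for v in nodes}
    comp_max = {}                         # component representative -> birth (highest f in component)

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    adj = {v: [] for v in nodes}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    order = sorted(nodes, key=lambda v: f[v], reverse=True)   # sweep from high to low values
    added, pairs = set(), []
    for v in order:
        added.add(v)
        comp_max[v] = f[v]                # v starts its own component, born at f[v]
        for u in adj[v]:
            if u in added:
                ru, rv = find(u), find(v)
                if ru == rv:
                    continue
                # the component with the lower birth value dies at the current level f[v]
                young, old = (ru, rv) if comp_max[ru] < comp_max[rv] else (rv, ru)
                pairs.append((comp_max[young], f[v]))
                parent[young] = old
    root = find(order[0])
    pairs.append((comp_max[root], f[order[-1]]))   # the surviving component pairs global max with global min
    return pairs

# Hypothetical 5-node neuron tree with a descriptor (e.g. path distance from the soma).
nodes = ["soma", "a", "b", "c", "d"]
edges = [("soma", "a"), ("a", "b"), ("a", "c"), ("soma", "d")]
f = {"soma": 0.0, "a": 1.0, "b": 2.5, "c": 1.8, "d": 0.7}
print(tree_persistence(nodes, edges, f))
```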
A Data Scheduling and Management Infrastructure for the TEAM Network
NASA Astrophysics Data System (ADS)
Andelman, S.; Baru, C.; Chandra, S.; Fegraus, E.; Lin, K.; Unwin, R.
2009-04-01
The objective of the Tropical Ecology Assessment and Monitoring Network (www.teamnetwork.org) is "To generate real time data for monitoring long-term trends in tropical biodiversity through a global network of TEAM sites (i.e. field stations in tropical forests), providing an early warning system on the status of biodiversity to effectively guide conservation action". To achieve this, the TEAM Network operates by collecting data via standardized protocols at TEAM Sites. The standardized TEAM protocols include the Climate, Vegetation and Terrestrial Vertebrate Protocols. Some sites also implement additional protocols. There are currently 7 TEAM Sites with plans to grow the network to 15 by June 30, 2009 and 50 TEAM Sites by the end of 2010. Climate Protocol The Climate Protocol entails the collection of climate data via meteorological stations located at the TEAM Sites. This includes information such as precipitation, temperature, wind direction and strength, and various solar radiation measurements. Vegetation Protocol The Vegetation Protocol collects standardized information on tropical forest trees and lianas. A TEAM Site will have between 6-9 1 ha plots where trees and lianas larger than a pre-specified size are mapped, identified and measured. This results in each TEAM Site repeatedly measuring between 3000-5000 trees annually. Terrestrial Vertebrate Protocol The Terrestrial Vertebrate Protocol collects standardized information on mid-sized tropical forest fauna (i.e. birds and mammals). This information is collected via camera traps (i.e. digital cameras with motion sensors housed in weatherproof casings). The images taken by the camera trap are reviewed to identify what species are captured in each image. The image and the interpretation of what is in the image are the data for the Terrestrial Vertebrate Protocol. The amount of data collected through the TEAM protocols presents a significant yet exciting IT challenge. The TEAM Network is currently partnering with the San Diego Supercomputer Center to build the data management infrastructure. Data collected from the three core protocols as well as others are currently made available through the TEAM Network portal, which provides the content management framework, the data scheduling and management framework, an administrative framework to implement and manage TEAM sites, collaborative tools and a number of tools and applications utilizing Google Maps and Google Earth products. A critical element of the TEAM Network data management infrastructure is to make the data publicly available in as close to real time as possible (the TEAM Network Data Use Policy: http://www.teamnetwork.org/en/data/policy). This requires two essential tasks to be accomplished: 1) a data collection schedule has to be planned, proposed and approved for a given TEAM site. This is a challenging process since TEAM sites are geographically distributed across the tropics and hence have different seasons in which they schedule field sampling for the different TEAM protocols. Capturing this information and ensuring that TEAM sites follow the outlined legal contract is key to the data collection process; and 2) a streamlined and efficient information management system is needed to ensure data collected from the field meet the minimum data standards (i.e. are of the highest scientific quality) and are securely transferred, archived, processed and rapidly made publicly available, as a finished consumable product, via the TEAM Network portal.
The TEAM Network is achieving these goals by implementing an end-to-end framework consisting of the Sampling Scheduler application and the Data Management Framework. Sampling Scheduler The Sampling Scheduler is a project management, calendar-based portal application that will allow scientists at a TEAM site to schedule field sampling for each of the TEAM protocols implemented at that site. The Sampling Scheduler addresses the specific requirements established in the TEAM protocols together with the logistical scheduling needs of each TEAM Site. For example, each TEAM protocol defines when data must be collected (e.g. time of day, number of times per year, during which seasons, etc.) as well as where data must be collected (from which sampling units, which trees, etc.). Each TEAM Site has a limited number of resources and must create plans that both satisfy the requirements of the protocols and are logistically feasible for that TEAM Site. With 15 TEAM Sites (and many more coming soon), the schedules of each TEAM Site must be communicated to the Network Office to ensure data are being collected as scheduled and to address the many problems that arise when working in difficult environments like tropical forests. The Sampling Scheduler provides built-in proposal and approval functionality to ensure that the TEAM Sites and the Network Office are in sync, as well as the capability to modify schedules when needed. The Data Management Framework The Data Management Framework is a three-tier data ingestion, editing and review application for protocols defined in the TEAM Network. The data ingestion framework provides online web forms for field personnel to submit and edit data collected at TEAM Sites. These web forms will be accessible from the TEAM content management site. Once the data are securely uploaded, curated, processed and approved, they will be made publicly available for consumption by the scientific community. The Data Management Framework, when combined with the Sampling Scheduler, provides a closed-loop Data Scheduling and Management infrastructure. Everything is included, from the data collection plan to the tools to input, modify and curate data, review and run QA/QC tests, and verify that data are collected as planned. Finally, TEAM Network data are available for download via the Data Query and Download Application. This application utilizes a Google Maps custom interface to search, visualize, and download TEAM Network data. References • TEAM Network, http://www.teamnetwork.org • Center for Applied Biodiversity Science, Conservation International. http://science.conservation.org/portal/server.pt • TEAM Data Query and Download Application, http://www.teamnetwork.org/en/data/query
An Adaptive Sensor Mining Framework for Pervasive Computing Applications
NASA Astrophysics Data System (ADS)
Rashidi, Parisa; Cook, Diane J.
Analyzing sensor data in pervasive computing applications brings unique challenges to the KDD community. The challenge is heightened when the underlying data source is dynamic and the patterns change. We introduce a new adaptive mining framework that detects patterns in sensor data, and more importantly, adapts to the changes in the underlying model. In our framework, the frequent and periodic patterns of data are first discovered by the Frequent and Periodic Pattern Miner (FPPM) algorithm; and then any changes in the discovered patterns over the lifetime of the system are discovered by the Pattern Adaptation Miner (PAM) algorithm, in order to adapt to the changing environment. This framework also captures vital context information present in pervasive computing applications, such as the startup triggers and temporal information. In this paper, we present a description of our mining framework and validate the approach using data collected in the CASAS smart home testbed.
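The frequent-pattern step can be illustrated with a much simpler stand-in than the published FPPM/PAM algorithms: count contiguous event sub-sequences in a sensor stream and keep those above a support threshold. Sensor event names and the threshold are hypothetical; the real framework also handles periodicity, startup triggers and pattern adaptation, which this sketch does not.

```python
# Minimal sketch of frequent contiguous pattern discovery in a sensor event stream.
from collections import Counter

def frequent_patterns(events, max_len=3, min_support=2):
    """Count contiguous sub-sequences of length 2..max_len and keep the frequent ones."""
    counts = Counter()
    for n in range(2, max_len + 1):
        for i in range(len(events) - n + 1):
            counts[tuple(events[i:i + n])] += 1
    return {pattern: c for pattern, c in counts.items() if c >= min_support}

stream = ["kitchen_light_on", "fridge_open", "fridge_close", "kitchen_light_off",
          "kitchen_light_on", "fridge_open", "fridge_close", "kitchen_light_off"]
print(frequent_patterns(stream))
```

An adaptation step in the spirit of PAM could then compare these pattern counts between successive time windows and flag patterns whose frequency changes.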
77 FR 22200 - Rescission of Rules
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-13
... ``Identity Theft Rules,'' 16 CFR part 681, and its rules governing ``Disposal of Consumer Report Information...; Duties of Creditors Regarding Risk-Based Pricing, 16 CFR part 640; Duties of Users of Consumer Reports... collection, assembly, and use of consumer report information and provides the framework for the credit...
An Information Needs Profile of Israeli Older Adults, regarding the Law and Services
ERIC Educational Resources Information Center
Getz, Irith; Weissman, Gabriella
2010-01-01
Based on Nicholas' framework for assessing information needs, this research aims to construct a profile of both Israeli older adults and their information needs regarding laws and social services. Data were collected by questionnaires answered by 200 older adults, born in Europe, Asia and Africa, who attended social clubs for older adults. The…
A Conceptual Framework for Analyzing Terrorist Groups,
1985-06-01
level of future terrorist operations, the impact of arrests, etc. The identification of these gaps should provide the impetus to improved collection in...act. The principal advantage, however, of the conceptual framework is its ability to absorb new information as it becomes available, providing the...and indicate how long they have been targets. (Types: Diplomatic, business, military, police, airlines, private citizens, utilities, energy facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iyer, Maithili; Kumar, Satish; Mathew, Sangeeta
Enhancing energy efficiency of the commercial building stock is an important aspect of any national energy policy. Understanding how buildings use energy is critical to formulating any new policy that may impact energy use, underscoring the importance of credible data. Data enables informed decision making and good quality data is essential for policy makers to prioritize energy saving strategies and track implementation. Given the uniqueness of the buildings sector and challenges to collecting relevant energy data, this study characterizes various elements involved in pertinent data collection and management, with the specific focus on well-defined data requirements, appropriate methodologies and processes, feasible data collection mechanisms, and approaches to institutionalizing the collection process. This report starts with a comprehensive review of available examples of energy data collection frameworks for buildings across different countries. The review covers the U.S. experience in the commercial buildings sector, the European experience in the buildings sector and other data collection initiatives in Singapore and China to capture the more systematic efforts in Asia in the commercial sector. To provide context, the review includes a summary and status of disparate efforts in India to collect and use commercial building energy data. Using this review as a key input, the study developed a data collection framework for India with specific consideration to relevant use cases. Continuing with the framework for data collection, this study outlines the key performance indicators applicable to the use cases and their collection feasibility, as well as immediate priorities of the participating stakeholders. It also discusses potential considerations for data collection and the possible approaches for survey design. With the specific purpose of laying out the possible ways to structure and organize data collection institutionally, the study collates existing mechanisms to analyze building energy performance in India and opportunities for standardizing data collection. This report describes the existing capacities and resources for establishing an institutional framework for data collection, the legislation and mandates that support such activity, and identifies roles and responsibilities of the relevant ministries and organizations. Finally, the study presents conclusions and identifies two major data collection strategies within the existing legal framework.
76 FR 59741 - Proposed Collection, Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-27
... conceptual framework for the State and area estimates of employment and unemployment, specifies the procedures to be used, provides input information, and discusses the theoretical and empirical basis for each...
Hotchkiss, David R; Aqil, Anwer; Lippeveld, Theo; Mukooyo, Edward
2010-07-03
Sound policy, resource allocation and day-to-day management decisions in the health sector require timely information from routine health information systems (RHIS). In most low- and middle-income countries, the RHIS is viewed as being inadequate in providing quality data and continuous information that can be used to help improve health system performance. In addition, there is limited evidence on the effectiveness of RHIS strengthening interventions in improving data quality and use. The purpose of this study is to evaluate the usefulness of the newly developed Performance of Routine Information System Management (PRISM) framework, which consists of a conceptual framework and associated data collection and analysis tools to assess, design, strengthen and evaluate RHIS. The specific objectives of the study are: a) to assess the reliability and validity of the PRISM instruments and b) to assess the validity of the PRISM conceptual framework. Facility- and worker-level data were collected from 110 health care facilities in twelve districts in Uganda in 2004 and 2007 using records reviews, structured interviews and self-administered questionnaires. The analysis procedures include Cronbach's alpha to assess internal consistency of selected instruments, test-retest analysis to assess the reliability and sensitivity of the instruments, and bivariate and multivariate statistical techniques to assess validity of the PRISM instruments and conceptual framework. Cronbach's alpha analysis suggests high reliability (0.7 or greater) for the indices measuring a promotion of a culture of information, RHIS tasks self-efficacy and motivation. The study results also suggest that a promotion of a culture of information influences RHIS tasks self-efficacy, RHIS tasks competence and motivation, and that self-efficacy and the presence of RHIS staff have a direct influence on the use of RHIS information, a key aspect of RHIS performance. The study results provide some empirical support for the reliability and validity of the PRISM instruments and the validity of the PRISM conceptual framework, suggesting that the PRISM approach can be effectively used by RHIS policy makers and practitioners to assess the RHIS and evaluate RHIS strengthening interventions. However, additional studies with larger sample sizes are needed to further investigate the value of the PRISM instruments in exploring the linkages between RHIS data quality and use, and health systems performance.
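The internal-consistency check mentioned above follows the standard Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch with hypothetical questionnaire responses:

```python
# Minimal sketch of Cronbach's alpha for a set of questionnaire items.
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)           # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses to a 4-item "culture of information" index (1-5 scale)
responses = [[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4], [2, 3, 2, 3], [4, 4, 5, 5]]
print(round(cronbach_alpha(responses), 2))   # values of 0.7 or greater suggest acceptable reliability
```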
Taiwan Regulation of Biobanks.
Fan, Chien-Te; Hung, Tzu-Hsun; Yeh, Chan-Kun
2015-01-01
This paper introduces the legal framework and governance structure relating to the management and development of biobanks in Taiwan. First, we briefly describe Taiwan's population, political system and health care system. Second, this research introduces the biobanking framework of Taiwan, including 25 biobanks established with the approval of the Ministry of Health and Welfare. Among those biobanks, the "Taiwan Biobank" is the first and largest government-supported biobank, comprising a population-based cohort study and a disease-oriented study. Since the collection of information, data, and biological specimens by biobanks often involves highly sensitive personal information, the legal framework of Taiwan includes a specific regulation, the "Human Biobank Management Act" (HBMA), which plays an important role in regulating biobanks in Taiwan. The HBMA, the Personal Information Act and other regulations constitute a comprehensive legal and regulatory privacy framework for biobanks. Through the introduction and analysis of the current legal framework applicable to biobanks, we found several challenges that need to be addressed appropriately, involving duplicate review systems, obstacles to international collaboration, and data sharing between biobanks in Taiwan. © 2015 American Society of Law, Medicine & Ethics, Inc.
King, Raymond J; Garrett, Nedra; Kriseman, Jeffrey; Crum, Melvin; Rafalski, Edward M; Sweat, David; Frazier, Renee; Schearer, Sue; Cutts, Teresa
2016-09-08
We present a framework for developing a community health record to bring stakeholders, information, and technology together to collectively improve the health of a community. It is both social and technical in nature and presents an iterative and participatory process for achieving multisector collaboration and information sharing. It proposes a methodology and infrastructure for bringing multisector stakeholders and their information together to inform, target, monitor, and evaluate community health initiatives. The community health record is defined as both the proposed framework and a tool or system for integrating and transforming multisector data into actionable information. It is informed by the electronic health record, personal health record, and County Health Ranking systems but differs in its social complexity, communal ownership, and provision of information to multisector partners at scales ranging from address to zip code.
Garrett, Nedra; Kriseman, Jeffrey; Crum, Melvin; Rafalski, Edward M.; Sweat, David; Frazier, Renee; Schearer, Sue; Cutts, Teresa
2016-01-01
We present a framework for developing a community health record to bring stakeholders, information, and technology together to collectively improve the health of a community. It is both social and technical in nature and presents an iterative and participatory process for achieving multisector collaboration and information sharing. It proposes a methodology and infrastructure for bringing multisector stakeholders and their information together to inform, target, monitor, and evaluate community health initiatives. The community health record is defined as both the proposed framework and a tool or system for integrating and transforming multisector data into actionable information. It is informed by the electronic health record, personal health record, and County Health Ranking systems but differs in its social complexity, communal ownership, and provision of information to multisector partners at scales ranging from address to zip code. PMID:27609300
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-21
... development of standardized metadata in hundreds of organizations, and funded numerous implementations of OGC... of emphasis include: Metadata documentation, clearinghouse establishment, framework development...
Structure preserving clustering-object tracking via subgroup motion pattern segmentation
NASA Astrophysics Data System (ADS)
Fan, Zheyi; Zhu, Yixuan; Jiang, Jiao; Weng, Shuqin; Liu, Zhiwen
2018-01-01
Tracking clustering objects with similar appearances simultaneously in collective scenes is a challenging task in the field of collective motion analysis. Recent work on clustering-object tracking often suffers from poor tracking accuracy and poor real-time performance because it neglects or misjudges the motion differences among objects. To address this problem, we propose a subgroup motion pattern segmentation framework based on a multilayer clustering structure and establish spatial constraints only among objects in the same subgroup, whose members have consistent motion directions and close spatial positions. In addition, the subgroup segmentation results are updated dynamically because crowd motion patterns are changeable and affected by objects' destinations and scene structures. The spatial structure information, combined with the appearance similarity information, is used in the structure-preserving object tracking framework to track objects. Extensive experiments conducted on several datasets containing multiple real-world crowd scenes validate the accuracy and the robustness of the presented algorithm for tracking objects in collective scenes.
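A rough sketch of the subgroup idea, not the paper's multilayer method: cluster tracked objects on spatial position and motion direction, so that spatial constraints can later be enforced only within each subgroup. Coordinates, scaling and the cluster count are hypothetical.

```python
# Minimal sketch: segment a crowd into subgroups by position and motion direction.
import numpy as np
from sklearn.cluster import KMeans

# columns: x, y, direction dx, dy
objects = np.array([
    [10, 12,  1.0, 0.0], [12, 11,  1.0, 0.1], [11, 13,  0.9, 0.1],   # moving right
    [40, 42, -1.0, 0.0], [42, 41, -0.9, 0.1],                        # moving left
])
# scale positions down and weight directions up so direction dominates the grouping
features = np.hstack([objects[:, :2] / 50.0, objects[:, 2:] * 2.0])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(labels)   # spatial constraints would then apply only within each subgroup label
```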
Jordan, Rebecca; Gray, Steven; Sorensen, Amanda; Newman, Greg; Mellor, David; Newman, Greg; Hmelo-Silver, Cindy; LaDeau, Shannon; Biehler, Dawn; Crall, Alycia
2016-06-01
Citizen science has generated growing interest among scientists and community groups, and citizen science programs have been created specifically for conservation. We examined collaborative science, a highly interactive form of citizen science, which we developed within a theoretically informed framework. In this essay, we focused on 2 aspects of our framework: social learning and adaptive management. Social learning, in contrast to individual-based learning, stresses collaborative and generative insight making and is well suited for adaptive management. Adaptive management integrates feedback loops that are informed by what is learned and guided by iterative decision making. Participants engaged in citizen science are able to add to what they are learning through primary data collection, which can result in the real-time information that is often necessary for conservation. Our work is particularly timely because research publications consistently report a lack of established frameworks and evaluation plans to address the extent of conservation outcomes in citizen science. To illustrate how our framework supports conservation through citizen science, we examined how 2 programs enacted our collaborative science framework. Further, we inspected preliminary conservation outcomes of our case-study programs. These programs, despite their recent implementation, are demonstrating promise with regard to positive conservation outcomes. To date, they are independently earning funds to support research, earning buy-in from local partners to engage in experimentation, and, in the absence of leading scientists, collecting data to test ideas. We argue that this success is due to citizen scientists being organized around local issues and engaging in iterative, collaborative, and adaptive learning. © 2016 Society for Conservation Biology.
Gray, Alistair; Veale, Jaimie F.; Binson, Diane; Sell, Randell L.
2013-01-01
Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand's Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens. PMID:23840231
Internet Civil Defense: Feasibility Study
2002-12-09
quell spread of false rumors that induce fear. Summary: Internet Civil Defense can help to manage fear in various ways. We can continuously poll...timely information to minimize rumors and accelerate recovery from disasters. ICD will create an “umbrella” framework to raise the overall...information to minimize rumors and accelerate recovery from disasters. Overall, ICD will collect and deliver actionable information to enhance public
ERIC Educational Resources Information Center
Choi, Woojae
2010-01-01
The purpose of this study was to investigate the influences of formal learning, personal characteristics, and work environment characteristics on informal learning among middle managers in the Korean banking sector. The conceptual framework identified three factors influencing informal learning. For this study, data collection was conducted in the…
2013-11-01
Stochastic Radiative Transfer Model for Contaminated Rough Surfaces: A Framework for... (report period covered: Jan 2013 - Sep 2013)
Rycroft-Malone, Jo; Seers, Kate; Chandler, Jackie; Hawkes, Claire A; Crichton, Nicola; Allen, Claire; Bullock, Ian; Strunin, Leo
2013-03-09
The case has been made for more and better theory-informed process evaluations within trials in an effort to facilitate insightful understandings of how interventions work. In this paper, we provide an explanation of implementation processes from one of the first national implementation research randomized controlled trials with embedded process evaluation conducted within acute care, and a proposed extension to the Promoting Action on Research Implementation in Health Services (PARIHS) framework. The PARIHS framework was prospectively applied to guide decisions about intervention design, data collection, and analysis processes in a trial focussed on reducing peri-operative fasting times. In order to capture a holistic picture of implementation processes, the same data were collected across 19 participating hospitals irrespective of allocation to intervention. This paper reports on findings from data collected from a purposive sample of 151 staff and patients pre- and post-intervention. Data were analysed using content analysis within, and then across data sets. A robust and uncontested evidence base was a necessary, but not sufficient condition for practice change, in that individual staff and patient responses such as caution influenced decision making. The implementation context was challenging, in which individuals and teams were bounded by professional issues, communication challenges, power and a lack of clarity about the authority and responsibility for practice change. Progress was made in sites where processes were aligned with existing initiatives. Additionally, facilitators reported engaging in many intervention implementation activities, some of which resulted in practice changes, but not in significant improvements to outcomes. This study provided an opportunity for reflection on the comprehensiveness of the PARIHS framework. Consistent with the underlying tenet of PARIHS, a multi-faceted and dynamic story of implementation was evident. However, the prominent role that individuals played as part of the interaction between evidence and context is not currently explicit within the framework. We propose that successful implementation of evidence into practice is a planned, facilitated process involving an interplay between individuals, evidence, and context to promote evidence-informed practice. This proposal will enhance the potential of the PARIHS framework for explanation, and ensure theoretical development both informs and responds to the evidence base for implementation.
Advanced Performance Modeling with Combined Passive and Active Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dovrolis, Constantine; Sim, Alex
2015-04-15
To improve the efficiency of resource utilization and scheduling of scientific data transfers on high-speed networks, the "Advanced Performance Modeling with combined passive and active monitoring" (APM) project investigates and models a general-purpose, reusable and expandable network performance estimation framework. The predictive estimation model and the framework will be helpful in optimizing the performance and utilization of networks as well as sharing resources with predictable performance for scientific collaborations, especially in data intensive applications. Our prediction model utilizes historical network performance information from various network activity logs as well as live streaming measurements from network peering devices. Historical network performance information is used without putting extra load on the resources by active measurement collection. Performance measurements collected by active probing are used judiciously to improve the accuracy of predictions.
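One simple way to picture the blending of historical logs with live measurements, not the APM project's actual model, is a weighted combination of the long-run average and recent probe samples. All numbers and the blending weight are hypothetical.

```python
# Minimal sketch: blend historical transfer logs with live monitoring samples.
def predict_throughput(historical_mbps, live_samples_mbps, live_weight=0.7):
    """Weight recent live probes against the long-run historical average."""
    hist = sum(historical_mbps) / len(historical_mbps)
    live = sum(live_samples_mbps) / len(live_samples_mbps)
    return live_weight * live + (1 - live_weight) * hist

history = [820, 790, 805, 760, 840]      # Mbps from past transfer logs
live = [610, 650]                        # Mbps from current passive/active monitoring
print(round(predict_throughput(history, live), 1))
```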
An integrated healthcare enterprise information portal and healthcare information system framework.
Hsieh, S L; Lai, Feipei; Cheng, P H; Chen, J L; Lee, H H; Tsai, W N; Weng, Y C; Hsieh, S H; Hsu, K P; Ko, L F; Yang, T H; Chen, C H
2006-01-01
The paper presents an integrated, distributed Healthcare Enterprise Information Portal (HEIP) and Hospital Information Systems (HIS) framework over wireless/wired infrastructure at National Taiwan University Hospital (NTUH). A single sign-on solution for the hospital customer relationship management (CRM) in HEIP has been established. The outcomes of the newly developed Outpatient Information Systems (OIS) in HIS are discussed. The future HEIP blueprints with CRM oriented features: e-Learning, Remote Consultation and Diagnosis (RCD), as well as on-Line Vaccination Services are addressed. Finally, the integrated HEIP and HIS architectures based on the middleware technologies are proposed along with the feasible approaches. The preliminary performance of multi-media, time-based data exchanges over the wireless HEIP side is collected to evaluate the efficiency of the architecture.
ERIC Educational Resources Information Center
Harris, Kendra E.
2010-01-01
This single-case qualitative study examines leadership in an institution of higher education using the Responsible Leadership for Performance (RLP) model (Lynham & Chermack, 2006) as a framework. The study explores how using a paradigm of collective leadership as an alternative to models of individual leadership could inform understanding of…
Collective decision dynamics in the presence of external drivers
NASA Astrophysics Data System (ADS)
Bassett, Danielle S.; Alderson, David L.; Carlson, Jean M.
2012-09-01
We develop a sequence of models describing information transmission and decision dynamics for a network of individual agents subject to multiple sources of influence. Our general framework is set in the context of an impending natural disaster, where individuals, represented by nodes on the network, must decide whether or not to evacuate. Sources of influence include a one-to-many externally driven global broadcast as well as pairwise interactions, across links in the network, in which agents transmit either continuous opinions or binary actions. We consider both uniform and variable threshold rules on the individual opinion as baseline models for decision making. Our results indicate that (1) social networks lead to clustering and cohesive action among individuals, (2) binary information introduces high temporal variability and stagnation, and (3) information transmission over the network can either facilitate or hinder action adoption, depending on the influence of the global broadcast relative to the social network. Our framework highlights the essential role of local interactions between agents in predicting collective behavior of the population as a whole.
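The decision dynamics described above can be sketched as a threshold model in which each agent weighs a ramping global broadcast against the fraction of evacuated neighbours. This is a minimal illustration, not the authors' full model family; the toy network, weights and thresholds are hypothetical.

```python
# Minimal sketch: threshold-based evacuation decisions under broadcast and neighbor influence.
import random

def simulate(adjacency, thresholds, broadcast_weight=0.4, neighbor_weight=0.6, steps=20):
    n = len(adjacency)
    evacuated = [False] * n
    for t in range(steps):
        broadcast = min(1.0, 0.05 * t)                      # global signal ramps up over time
        for i in range(n):
            if evacuated[i]:
                continue
            neigh = adjacency[i]
            frac = sum(evacuated[j] for j in neigh) / len(neigh) if neigh else 0.0
            influence = broadcast_weight * broadcast + neighbor_weight * frac
            if influence >= thresholds[i]:                  # threshold rule on total influence
                evacuated[i] = True
    return evacuated

random.seed(1)
adjacency = [[1, 2], [0, 2, 3], [0, 1], [1, 4], [3]]        # small toy social network
thresholds = [random.uniform(0.2, 0.8) for _ in adjacency]  # variable threshold rule
print(simulate(adjacency, thresholds))
```

Raising broadcast_weight relative to neighbor_weight in this toy model mimics the paper's question of whether the global broadcast facilitates or hinders adoption relative to the social network.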
Concepts of Management Information Systems.
ERIC Educational Resources Information Center
Emery, J.C.
The paper attempts to provide a general framework for dealing with management information systems (MIS). An MIS is defined to have the following characteristics: (1) related to ongoing activities of an organization, (2) a man-machine system, (3) composed of a collection of subsystems, and (4) oriented around a large data base. An MIS places a…
Toxicogenomics and the Regulatory Framework
Toxicogenomics presents regulatory agencies with the opportunity to revolutionize their analyses by enabling the collection of information on a broader range of responses than currently considered in traditional regulatory decision making. Analyses of genomic responses are expec...
D'Ambruoso, Lucia; Byass, Peter; Nurul Qomariyah, Siti
2008-09-09
Maternal mortality remains unacceptably high in developing countries despite international advocacy, development targets, and simple, affordable and effective interventions. In recent years, regard for maternal mortality as a human rights issue, as well as one that pertains to health, has emerged. We study a case of maternal death using a theoretical framework derived from the right to health to examine access to and quality of maternal healthcare. Our objective was to explore the potential of rights-based frameworks to inform public health planning from a human rights perspective. Information was elicited as part of a verbal autopsy survey investigating maternal deaths in rural settings in Indonesia. The deceased's relatives were interviewed to collect information on medical signs, symptoms and the social, cultural and health systems circumstances surrounding the death. In this case, a prolonged, severe fever and a complicated series of referrals culminated in the death of a 19-year-old primigravida at 7 months gestation. The cause of death was acute infection. The woman encountered a range of barriers to access: behavioural, socio-cultural, geographic and economic. Several serious health system failures were also apparent. The theoretical framework derived from the right to health identified that none of the essential elements of the right were upheld. The rights-based approach could identify how and where to improve services. However, there are fundamental and inherent conflicts between the public health tradition (collective and preventative) and the right to health (individualistic and curative). As a result, and in practice, the right to health is likely to be ineffective for public health planning from a human rights perspective. Collective rights such as the right to development may provide a more suitable means to achieve equity and social justice in health planning.
Data List - Specifying and Acquiring Earth Science Data Measurements All at Once
NASA Astrophysics Data System (ADS)
Shie, C. L.; Teng, W. L.; Liu, Z.; Hearty, T. J., III; Shen, S.; Li, A.; Hegde, M.; Bryant, K.; Seiler, E.; Kempler, S. J.
2016-12-01
Natural phenomena, such as tropical storms (e.g., hurricanes/typhoons), winter storms (e.g., blizzards), volcanic eruptions, floods, and drought, have the potential to cause immense property damage, great socioeconomic impact, and tragic losses of human life. In order to investigate and assess these natural hazards in a timely manner, there needs to be efficient searching and accessing of massive amounts of heterogeneous scientific data from, particularly, satellite and model products. This is a daunting task for most application users, decision makers, and science researchers. The NASA Goddard Earth Sciences Data and Information Service Center (GES DISC) has, for many years, archived and served massive amounts of Earth science data, along with value-added information and services. In order to facilitate GES DISC users in acquiring their data of interest "all at once," with minimum effort, the GES DISC has started developing a value-added and knowledge-based data service framework. This framework allows the preparation and presentation to users of collections of data and their related resources for natural disaster events or other scientific themes. These collections of data, initially termed "Data Bundles" and then "Virtual Collections" and finally "Data Lists," contain suites of annotated Web addresses (URLs) that point to their respective data and resource addresses, "all at once" and "virtually." Because these collections of data are virtual, there is no need to duplicate the data. Currently available "Data Lists" for several natural disaster phenomena and the architecture of the data service framework will be presented.
Hinchcliff, Reece; Senserrick, Teresa; Travaglia, Joanne; Greenfield, David; Ivers, Rebecca
2017-04-01
Knowledge translation and exchange (KTE) can enable evidence-informed road safety policy and practice by reducing the gap between what is known to be effective and what actually occurs. A quality improvement project, undertaken within a government policy frame, was implemented in 2015 to produce an enhanced KTE framework for road safety (the framework). Information was collected from 35 road safety stakeholders in the UK, the Netherlands, Norway and Sweden. Thirteen KTE facilitators were identified that covered research funding and production, the expertise of knowledge users and dissemination practices. The framework was subsequently developed, which separated facilitators seen as essential for a KTE system, from others perceived as aspirational due to their lesser influence and the considerable time and resources required for their implementation. The framework provides a heuristic device to enable policy agencies to holistically assess and improve current KTE systems for road safety, to encourage evidence-informed policy and practice. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Framework for asset management study results : research report.
DOT National Transportation Integrated Search
2012-04-27
Dye Management Group, Inc. (DMG) collected and analyzed local agency inventory, cost, and condition assessment information in order to provide the Michigan Transportation Asset Management Council (TAMC) with (a) the costs expended to maintain its roa...
The dynamics of information-driven coordination phenomena: A transfer entropy analysis
Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro
2016-01-01
Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social subunits. In the absence of clear exogenous driving, social collective phenomena can be represented as endogenously driven structural transitions of the information transfer network. This study provides results that can help define models and predictive algorithms for the analysis of societal events based on open source data. PMID:27051875
The dynamics of information-driven coordination phenomena: A transfer entropy analysis.
Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro
2016-04-01
Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social subunits. In the absence of clear exogenous driving, social collective phenomena can be represented as endogenously driven structural transitions of the information transfer network. This study provides results that can help define models and predictive algorithms for the analysis of societal events based on open source data.
Challenges in leveraging existing human performance data for quantifying the IDHEAS HRA method
Liao, Huafei N.; Groth, Katrina; Stevens-Adams, Susan
2015-07-29
Our article documents an exploratory study for collecting and using human performance data to inform human error probability (HEP) estimates for a new human reliability analysis (HRA) method, the Integrated Human Event Analysis System (IDHEAS). The method was based on cognitive models and mechanisms underlying human behaviour and employs a framework of 14 crew failure modes (CFMs) to represent human failures typical for human performance in nuclear power plant (NPP) internal, at-power events [1]. A decision tree (DT) was constructed for each CFM to assess the probability of the CFM occurring in different contexts. Data needs for IDHEAS quantification are discussed. Then, the data collection framework and process are described, and how the collected data were used to inform HEP estimation is illustrated with two examples. Next, five major technical challenges are identified for leveraging human performance data for IDHEAS quantification. These challenges reflect the data needs specific to IDHEAS. More importantly, they also represent the general issues with current human performance data and can provide insight for a path forward to support HRA data collection, use, and exchange for HRA method development, implementation, and validation.
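To illustrate the decision-tree quantification step in a concrete (if highly simplified) way, the sketch below walks a small tree of context factors down to a human error probability for one crew failure mode. The tree structure, factor names, and probability values are hypothetical placeholders, not IDHEAS data.

```python
# Hypothetical sketch of a decision-tree lookup for one crew failure mode (CFM).
# Branch questions and HEP values are illustrative placeholders, not IDHEAS values.
CFM_TREE = {
    "question": "adequate_time",
    "yes": {
        "question": "clear_procedures",
        "yes": {"hep": 1e-3},
        "no": {"hep": 5e-3},
    },
    "no": {
        "question": "clear_procedures",
        "yes": {"hep": 1e-2},
        "no": {"hep": 5e-2},
    },
}

def estimate_hep(tree, context):
    """Walk the decision tree using boolean context factors for this event."""
    node = tree
    while "hep" not in node:
        node = node["yes" if context[node["question"]] else "no"]
    return node["hep"]

print(estimate_hep(CFM_TREE, {"adequate_time": False, "clear_procedures": True}))  # 0.01
```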
A statistical metadata model for clinical trials' data management.
Vardaki, Maria; Papageorgiou, Haralambos; Pentaris, Fragkiskos
2009-08-01
We introduce a statistical, process-oriented metadata model to describe the process of medical research data collection, management, results analysis and dissemination. Our approach explicitly provides a structure for pieces of information used in Clinical Study Data Management Systems, enabling a more active role for any associated metadata. Using the object-oriented paradigm, we describe the classes of our model that participate during the design of a clinical trial and the subsequent collection and management of the relevant data. The advantage of our approach is that we focus on presenting the structural inter-relation of these classes when used during dataset manipulation, by proposing certain transformations that model the simultaneous processing of both data and metadata. Our solution reduces the possibility of human error and allows for the tracking of all changes made during the dataset lifecycle. The explicit modeling of processing steps improves data quality and assists in the problem of handling data collected in different clinical trials. The case study illustrates the applicability of the proposed framework, demonstrating conceptually the simultaneous handling of datasets collected during two randomized clinical studies. Finally, we provide the main considerations for implementing the proposed framework in a modern metadata-enabled information system.
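One way to picture the simultaneous processing of data and metadata described above is the minimal sketch below, where every transformation applied to a dataset also appends a provenance entry to its metadata. The class and field names are assumptions for illustration, not the authors' schema.

```python
# Minimal sketch of simultaneous data/metadata processing; names are illustrative.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Metadata:
    variables: List[str]
    provenance: List[str] = field(default_factory=list)   # processing-step audit trail

@dataclass
class Dataset:
    rows: List[dict]
    meta: Metadata

    def transform(self, step_name: str, fn: Callable[[List[dict]], List[dict]]) -> "Dataset":
        """Apply a transformation and record it in the metadata in the same operation."""
        new_rows = fn(self.rows)
        new_meta = Metadata(self.meta.variables, self.meta.provenance + [step_name])
        return Dataset(new_rows, new_meta)

trial = Dataset([{"id": 1, "sbp": 142}, {"id": 2, "sbp": None}],
                Metadata(["id", "sbp"]))
cleaned = trial.transform("drop_missing_sbp",
                          lambda rows: [r for r in rows if r["sbp"] is not None])
print(cleaned.rows, cleaned.meta.provenance)
```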
ARCTOS: a relational database relating specimens, specimen-based science, and archival documentation
Jarrell, Gordon H.; Ramotnik, Cindy A.; McDonald, D.L.
2010-01-01
Data are preserved when they are perpetually discoverable, but even in the Information Age, discovery of legacy data appropriate to particular investigations is uncertain. Secure Internet storage is necessary but insufficient. Data can be discovered only when they are adequately described, and visibility increases markedly if the data are related to other data that are receiving usage. Such relationships can be built (1) within the framework of a relational database, or (2) among separate resources, within the framework of the Internet. Evolving primarily around biological collections, Arctos is a database that does both of these tasks. It includes data structures for a diversity of specimen attributes, essentially all collection-management tasks, plus literature citations, project descriptions, etc. As a centralized collaboration of several university museums, Arctos is an ideal environment for capitalizing on the many relationships that often exist between items in separate collections. Arctos is related to NIH’s DNA-sequence repository (GenBank) with record-to-record reciprocal linkages, and it serves data to several discipline-specific web portals, including the Global Biodiversity Information Facility (GBIF). The University of Alaska Museum’s paleontological collection is Arctos’s recent extension beyond the constraints of neontology. The database holds about 1.3 million cataloged items, and additional collections are added each year.
Outside the Framework of Thinkable Thought: The Modern Orchestration Project
ERIC Educational Resources Information Center
Gattegno, Eliot Aron
2010-01-01
In today's world of too much information, context--not content--is king. This proposal is for the development of an unparalleled sonic analysis tool that converts audio files into musical score notation, and a Web site (API) to collect, manage, and preserve information about the musical sounds analyzed, as well as music scores, videos, and articles…
ERIC Educational Resources Information Center
Badway, Norena Norton; Somerville, Jerry
2011-01-01
The purpose of this study was to analyze what leaders of Advanced Technological Education (ATE) programs funded by the National Science Foundation believe are their most important needs for research information. Data was collected through a Delphi process, and results were analyzed through frameworks associated with program improvement initiatives…
Forecasting the wellness of elderly through SNMS
NASA Astrophysics Data System (ADS)
Wu, Yuan; Li, Lingling; Ma, Chao; Li, Lian; Huang, Bingqing; Liu, Li
2017-03-01
Accurate and timely information collection is important for physicians to provide prompt and appropriate treatment for patients. In this paper, a smart nursing home monitoring system that can predict the health conditions of the elderly people who live in the nursing home is presented. A framework integrating temporal and spatial contextual information for evaluating the wellness of an elderly person has been modeled. A novel activity detection process, based on location information collected by RFID technology while residents perform essential daily activities, has been designed and developed. A BP neural network is trained using the activity information of the elderly residents of the nursing home; wellness models are tested and the results are encouraging.
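A minimal sketch of the prediction step described above, using a generic multilayer-perceptron (backpropagation) classifier over daily activity features; the feature columns, labels, and data are hypothetical and this is not the authors' system.

```python
# Minimal sketch: train a backpropagation (MLP) classifier on daily activity features.
# Feature columns and wellness labels are hypothetical placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
# columns: [hours_in_bed, kitchen_visits, bathroom_visits, time_outside_room]
X = rng.uniform(0, 10, size=(200, 4))
y = (X[:, 0] + rng.normal(0, 1, 200) > 8).astype(int)   # 1 = "needs attention"

model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:150], y[:150])
print("held-out accuracy:", model.score(X[150:], y[150:]))
```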
De Lusignan, Simon; Liyanage, Harshana; Di Iorio, Concetta Tania; Chan, Tom; Liaw, Siaw-Teng
2016-01-19
The use of health data for public health, surveillance, quality improvement and research is crucial to improve health systems and health care. However, bodies responsible for privacy and ethics often limit access to routinely collected health data. Ethical approvals, issues around protecting privacy, and data access are often dealt with by different layers of regulation, making approval processes appear disjointed. The objective was to create a comprehensive framework for defining the ethical and privacy status of a project and for providing guidance on data access. The framework comprises principles and related questions. The core of the framework will be built using standard terminology definitions, such as ethics-related controlled vocabularies and regional directives; it is built in this way to reduce ambiguity between different definitions. The framework is extensible: principles can be retired or added to, as can their related questions. Responses to these questions should allow data processors to define ethical issues, privacy risk and other unintended consequences. The framework contains three steps: (1) identifying possible ethical and privacy principles relevant to the project; (2) providing ethics and privacy guidance questions that inform the type of approval needed; and (3) assessing case-specific ethics and privacy issues. The outputs from this process should inform whether the balance between public interests, privacy breach and any ethical considerations is tipped in favour of societal benefits. If it is, then this should be the basis on which data access is permitted. Tightly linking ethical principles to governance and data access may help maintain public trust.
Modeling Urban Scenarios & Experiments: Fort Indiantown Gap Data Collections Summary and Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Archer, Daniel E.; Bandstra, Mark S.; Davidson, Gregory G.
This report summarizes experimental radiation detector, contextual sensor, weather, and global positioning system (GPS) data collected to inform and validate a comprehensive, operational radiation transport modeling framework to evaluate radiation detector system and algorithm performance. This framework will be used to study the influence of systematic effects (such as geometry, background activity, background variability, environmental shielding, etc.) on detector responses and algorithm performance using synthetic time series data. This work consists of performing data collection campaigns in a canonical, controlled environment for complete radiological characterization to help construct and benchmark a high-fidelity model with quantified system geometries, detector response functions, and source terms for background and threat objects. These data also provide an archival, benchmark dataset that can be used by the radiation detection community. The data reported here span four data collection campaigns conducted between May 2015 and September 2016.
Serrona, Kevin Roy B; Yu, Jeongsoo; Aguinaldo, Emelita; Florece, Leonardo M
2014-09-01
The Philippines has been making inroads in solid waste management with the enactment and implementation of the Republic Act 9003 or the Ecological Waste Management Act of 2000. Said legislation has had tremendous influence in terms of how the national and local government units confront the challenges of waste management in urban and rural areas using the reduce, reuse, recycle and recovery framework or 4Rs. One of the sectors needing assistance is the informal waste sector whose aspiration is legal recognition of their rank and integration of their waste recovery activities in mainstream waste management. To realize this, the Philippine National Solid Waste Management Commission initiated the formulation of the National Framework Plan for the Informal Waste Sector, which stipulates approaches, strategies and methodologies to concretely involve the said sector in different spheres of local waste management, such as collection, recycling and disposal. What needs to be fleshed out is the monitoring and evaluation component in order to gauge qualitative and quantitative achievements vis-a-vis the Framework Plan. In the process of providing an enabling environment for the informal waste sector, progress has to be monitored and verified qualitatively and quantitatively and measured against activities, outputs, objectives and goals. Using the Framework Plan as the reference, this article developed monitoring and evaluation indicators using the logical framework approach in project management. The primary objective is to institutionalize monitoring and evaluation, not just in informal waste sector plans, but in any waste management initiatives to ensure that envisaged goals are achieved. © The Author(s) 2014.
A distributed cloud-based cyberinfrastructure framework for integrated bridge monitoring
NASA Astrophysics Data System (ADS)
Jeong, Seongwoon; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.
2017-04-01
This paper describes a cloud-based cyberinfrastructure framework for the management of the diverse data involved in bridge monitoring. Bridge monitoring involves various hardware systems, software tools and laborious activities that include, for example, a structural health monitoring (SHM) sensor network, engineering analysis programs and visual inspection. Very often, these monitoring systems, tools and activities are not coordinated, and the collected information is not shared. A well-designed integrated data management framework can support the effective use of the data and, thereby, enhance bridge management and maintenance operations. The cloud-based cyberinfrastructure framework presented herein is designed to manage not only sensor measurement data acquired from the SHM system, but also other relevant information, such as the bridge engineering model and traffic videos, in an integrated manner. For scalability and flexibility, cloud computing services and distributed database systems are employed. The information stored can be accessed through standard web interfaces. For demonstration, the cyberinfrastructure system is implemented for the monitoring of the bridges located along the I-275 Corridor in the state of Michigan.
A Decision Support Framework for Science-Based, Multi-Stakeholder Deliberation: A Coral Reef Example
NASA Astrophysics Data System (ADS)
Rehr, Amanda P.; Small, Mitchell J.; Bradley, Patricia; Fisher, William S.; Vega, Ann; Black, Kelly; Stockton, Tom
2012-12-01
We present a decision support framework for science-based assessment and multi-stakeholder deliberation. The framework consists of two parts: a DPSIR (Drivers-Pressures-States-Impacts-Responses) analysis to identify the important causal relationships among anthropogenic environmental stressors, processes, and outcomes; and a Decision Landscape analysis to depict the legal, social, and institutional dimensions of environmental decisions. The Decision Landscape incorporates interactions among government agencies, regulated businesses, non-government organizations, and other stakeholders. It also identifies where scientific information regarding environmental processes is collected and transmitted to improve knowledge about elements of the DPSIR and to improve the scientific basis for decisions. Our application of the decision support framework to coral reef protection and restoration in the Florida Keys, focusing on anthropogenic stressors such as wastewater, proved to be successful and offered several insights. Using information from a management plan, it was possible to capture the current state of the science with a DPSIR analysis, as well as important decision options, decision makers and applicable laws with the Decision Landscape analysis. A structured elicitation of values and beliefs conducted at a coral reef management workshop held in Key West, Florida provided a diversity of opinion and also indicated a prioritization of several environmental stressors affecting coral reef health. The integrated DPSIR/Decision Landscape framework for the Florida Keys, developed based on the elicited opinion and the DPSIR analysis, can be used to inform management decisions, to reveal the role that further scientific information and research might play in populating the framework, and to facilitate better-informed agreement among participants.
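For readers who want to see the DPSIR structure in code-like form, the sketch below records a toy causal chain as nested data; the entries are hypothetical coral-reef examples, not the workshop's elicited content.

```python
# Hypothetical sketch of a DPSIR causal chain as simple records.
dpsir = {
    "drivers":   ["coastal development", "tourism"],
    "pressures": ["wastewater discharge", "vessel groundings"],
    "states":    ["nutrient concentration", "coral cover"],
    "impacts":   ["loss of reef habitat", "reduced fishery yield"],
    "responses": ["wastewater treatment upgrades", "mooring buoy program"],
}

links = [  # (pressure, state) pairs judged causally important
    ("wastewater discharge", "nutrient concentration"),
    ("vessel groundings", "coral cover"),
]

for pressure, state in links:
    print(f"{pressure} -> {state}")
```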
Vertical and Horizontal Forces: A Framework for Understanding Airpower Command and Control
2014-05-22
The Air Force has long maintained the tenet of "centralized control, decentralized execution." Changes in the contextual environment and ... help commanders understand how command and control (C2) systems work best today. The proposed cognitive framework moves beyond centralization ...
Valdivieso Caraguay, Ángel Leonardo; García Villalba, Luis Javier
2017-01-01
This paper presents the Monitoring and Discovery Framework of the Self-Organized Network Management in Virtualized and Software Defined Networks SELFNET project. This design takes into account the scalability and flexibility requirements needed by 5G infrastructures. In this context, the present framework focuses on gathering and storing the information (low-level metrics) related to physical and virtual devices, cloud environments, flow metrics, SDN traffic and sensors. Similarly, it provides the monitoring data as a generic information source in order to allow the correlation and aggregation tasks. Our design enables the collection and storing of information provided by all the underlying SELFNET sublayers, including the dynamically onboarded and instantiated SDN/NFV Apps, also known as SELFNET sensors. PMID:28362346
Caraguay, Ángel Leonardo Valdivieso; Villalba, Luis Javier García
2017-03-31
This paper presents the Monitoring and Discovery Framework of the Self-Organized Network Management in Virtualized and Software Defined Networks SELFNET project. This design takes into account the scalability and flexibility requirements needed by 5G infrastructures. In this context, the present framework focuses on gathering and storing the information (low-level metrics) related to physical and virtual devices, cloud environments, flow metrics, SDN traffic and sensors. Similarly, it provides the monitoring data as a generic information source in order to allow the correlation and aggregation tasks. Our design enables the collection and storing of information provided by all the underlying SELFNET sublayers, including the dynamically onboarded and instantiated SDN/NFV Apps, also known as SELFNET sensors.
Wootton, Richard; Vladzymyrskyy, Anton; Zolfo, Maria; Bonnardot, Laurent
2011-01-01
Telemedicine has been used for many years to support doctors in the developing world. Several networks provide services in different settings and in different ways. However, to draw conclusions about which telemedicine networks are successful requires a method of evaluating them. No general consensus or validated framework exists for this purpose. To define a basic method of performance measurement that can be used to improve and compare teleconsultation networks; to employ the proposed framework in an evaluation of three existing networks; to make recommendations about the future implementation and follow-up of such networks. Analysis based on the experience of three telemedicine networks (in operation for 7-10 years) that provide services to doctors in low-resource settings and which employ the same basic design. Although there are many possible indicators and metrics that might be relevant, five measures for each of the three user groups appear to be sufficient for the proposed framework. In addition, from the societal perspective, information about clinical- and cost-effectiveness is also required. The proposed performance measurement framework was applied to three mature telemedicine networks. Despite their differences in terms of activity, size and objectives, their performance in certain respects is very similar. For example, the time to first reply from an expert is about 24 hours for each network. Although all three networks had systems in place to collect data from the user perspective, none of them collected information about the coordinator's time required or about ease of system usage. They had only limited information about quality and cost. Measuring the performance of a telemedicine network is essential in understanding whether the network is working as intended and what effect it is having. Based on long-term field experience, the suggested framework is a practical tool that will permit organisations to assess the performance of their own networks and to improve them by comparison with others. All telemedicine systems should provide information about setup and running costs because cost-effectiveness is crucial for sustainability.
Wootton, Richard; Vladzymyrskyy, Anton; Zolfo, Maria; Bonnardot, Laurent
2011-01-01
Background Telemedicine has been used for many years to support doctors in the developing world. Several networks provide services in different settings and in different ways. However, to draw conclusions about which telemedicine networks are successful requires a method of evaluating them. No general consensus or validated framework exists for this purpose. Objective To define a basic method of performance measurement that can be used to improve and compare teleconsultation networks; to employ the proposed framework in an evaluation of three existing networks; to make recommendations about the future implementation and follow-up of such networks. Methods Analysis based on the experience of three telemedicine networks (in operation for 7–10 years) that provide services to doctors in low-resource settings and which employ the same basic design. Findings Although there are many possible indicators and metrics that might be relevant, five measures for each of the three user groups appear to be sufficient for the proposed framework. In addition, from the societal perspective, information about clinical- and cost-effectiveness is also required. The proposed performance measurement framework was applied to three mature telemedicine networks. Despite their differences in terms of activity, size and objectives, their performance in certain respects is very similar. For example, the time to first reply from an expert is about 24 hours for each network. Although all three networks had systems in place to collect data from the user perspective, none of them collected information about the coordinator's time required or about ease of system usage. They had only limited information about quality and cost. Conclusion Measuring the performance of a telemedicine network is essential in understanding whether the network is working as intended and what effect it is having. Based on long-term field experience, the suggested framework is a practical tool that will permit organisations to assess the performance of their own networks and to improve them by comparison with others. All telemedicine systems should provide information about setup and running costs because cost-effectiveness is crucial for sustainability. PMID:22162965
A Test Generation Framework for Distributed Fault-Tolerant Algorithms
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn; Bushnell, David; Miner, Paul; Pasareanu, Corina S.
2009-01-01
Heavyweight formal methods such as theorem proving have been successfully applied to the analysis of safety critical fault-tolerant systems. Typically, the models and proofs performed during such analysis do not inform the testing process of actual implementations. We propose a framework for generating test vectors from specifications written in the Prototype Verification System (PVS). The methodology uses a translator to produce a Java prototype from a PVS specification. Symbolic (Java) PathFinder is then employed to generate a collection of test cases. A small example is employed to illustrate how the framework can be used in practice.
ERIC Educational Resources Information Center
Lindeman, Karen Wise
2012-01-01
This study investigated how a child with early cochlear implantation interacted with peers in his inclusive preschool setting. A qualitative case-study framed in a socio-linguistic framework guided the data collection and analysis. Data collection included detailed field notes, classroom free play observations, informal student interviews, teacher…
A Knowledge Discovery framework for Planetary Defense
NASA Astrophysics Data System (ADS)
Jiang, Y.; Yang, C. P.; Li, Y.; Yu, M.; Bambacus, M.; Seery, B.; Barbee, B.
2016-12-01
Planetary Defense, a project funded by NASA Goddard and the NSF, is a multi-faceted effort focused on the mitigation of Near Earth Object (NEO) threats to our planet. Currently, information concerning NEOs is dispersed among different organizations and scientists, leading to a lack of a coherent system of information to be used for efficient NEO mitigation. In this paper, a planetary defense knowledge discovery engine is proposed to better assist the development and integration of a NEO responding system. Specifically, we have implemented an organized information framework by two means: (1) the development of a semantic knowledge base, which provides a structure for relevant information; it has been developed using web crawling and natural language processing techniques, which allow us to collect and store the most relevant structured information on a regular basis; and (2) the development of a knowledge discovery engine, which allows for the efficient retrieval of information from our knowledge base. The knowledge discovery engine has been built on top of Elasticsearch, an open source full-text search engine, as well as cutting-edge machine learning ranking and recommendation algorithms. This proposed framework is expected to advance knowledge discovery and innovation in the planetary science domain.
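A minimal sketch of the retrieval layer described above, using the official Elasticsearch Python client; the index name, fields, and document are assumptions for illustration, not the project's actual schema, and a running Elasticsearch node with a recent client version is assumed.

```python
# Minimal sketch: index a scraped NEO document and run a full-text query.
# Index and field names are hypothetical; requires a running Elasticsearch node.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

doc = {
    "title": "Kinetic impactor mission concepts",
    "body": "Options for deflecting a near-Earth object ...",
    "source_url": "https://example.org/neo-report",  # hypothetical source
}
es.index(index="neo_knowledge", document=doc)
es.indices.refresh(index="neo_knowledge")

hits = es.search(index="neo_knowledge",
                 query={"match": {"body": "deflection of near-Earth objects"}})
for hit in hits["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])
```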
2013-01-01
Background The case has been made for more and better theory-informed process evaluations within trials in an effort to facilitate insightful understandings of how interventions work. In this paper, we provide an explanation of implementation processes from one of the first national implementation research randomized controlled trials with embedded process evaluation conducted within acute care, and a proposed extension to the Promoting Action on Research Implementation in Health Services (PARIHS) framework. Methods The PARIHS framework was prospectively applied to guide decisions about intervention design, data collection, and analysis processes in a trial focussed on reducing peri-operative fasting times. In order to capture a holistic picture of implementation processes, the same data were collected across 19 participating hospitals irrespective of allocation to intervention. This paper reports on findings from data collected from a purposive sample of 151 staff and patients pre- and post-intervention. Data were analysed using content analysis within, and then across, data sets. Results A robust and uncontested evidence base was a necessary, but not sufficient, condition for practice change, in that individual staff and patient responses such as caution influenced decision making. The implementation context was challenging, in which individuals and teams were bounded by professional issues, communication challenges, power and a lack of clarity for the authority and responsibility for practice change. Progress was made in sites where processes were aligned with existing initiatives. Additionally, facilitators reported engaging in many intervention implementation activities, some of which resulted in practice changes, but not significant improvements to outcomes. Conclusions This study provided an opportunity for reflection on the comprehensiveness of the PARIHS framework. Consistent with the underlying tenet of PARIHS, a multi-faceted and dynamic story of implementation was evident. However, the prominent role that individuals played as part of the interaction between evidence and context is not currently explicit within the framework. We propose that successful implementation of evidence into practice is a planned facilitated process involving an interplay between individuals, evidence, and context to promote evidence-informed practice. This proposal will enhance the potential of the PARIHS framework for explanation, and ensure theoretical development both informs and responds to the evidence base for implementation. Trial registration ISRCTN18046709 - Peri-operative Implementation Study Evaluation (PoISE). PMID:23497438
Complete integrability of information processing by biochemical reactions
Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio
2016-01-01
Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such framework far-reaching analogies are established among (anti-) cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics and operational amplifiers/flip-flops in cybernetics. The underlying modeling – based on spin systems – has been proved to be accurate for a wide class of systems matching classical (e.g. Michaelis–Menten, Hill, Adair) scenarios in the infinite-size approximation. However, the current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy – based on completely integrable hydrodynamic-type systems of PDEs – which provides explicit finite-size solutions, matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions. PMID:27812018
Complete integrability of information processing by biochemical reactions
NASA Astrophysics Data System (ADS)
Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio
2016-11-01
Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such framework far-reaching analogies are established among (anti-) cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics and operational amplifiers/flip-flops in cybernetics. The underlying modeling - based on spin systems - has been proved to be accurate for a wide class of systems matching classical (e.g. Michaelis-Menten, Hill, Adair) scenarios in the infinite-size approximation. However, the current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy - based on completely integrable hydrodynamic-type systems of PDEs - which provides explicit finite-size solutions, matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions.
Complete integrability of information processing by biochemical reactions.
Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio
2016-11-04
Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such framework far-reaching analogies are established among (anti-) cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics and operational amplifiers/flip-flops in cybernetics. The underlying modeling - based on spin systems - has been proved to be accurate for a wide class of systems matching classical (e.g. Michaelis-Menten, Hill, Adair) scenarios in the infinite-size approximation. However, the current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy - based on completely integrable hydrodynamic-type systems of PDEs - which provides explicit finite-size solutions, matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions.
NASA Astrophysics Data System (ADS)
Bouty, A. A.; Koniyo, M. H.; Novian, D.
2018-02-01
This study aims to determine the maturity level of information technology governance in the Gorontalo city government by applying the COBIT 4.1 framework. The research method is a case study, conducting surveys and data collection at 25 institutions in Gorontalo City. The result of this study is an analysis of information technology needs based on the measurement of maturity levels. The measurements of the maturity level of information technology governance show that many business processes still run at low maturity levels: of the 9 business processes examined, 4 are at level 2 (Repeatable but Intuitive) and 3 are at level 1 (Initial/Ad hoc). Based on these results, the government of Gorontalo City is expected to make immediate improvements to its information technology governance so that it can run more effectively and efficiently.
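The maturity-level roll-up implied above can be sketched as a simple average of survey scores per COBIT process mapped onto the 0-5 maturity scale; the process names and scores below are hypothetical placeholders, not the study's data.

```python
# Hypothetical sketch: average maturity scores per COBIT process across institutions.
# Process IDs and scores are illustrative placeholders.
scores = {
    "PO1 Define a strategic IT plan": [1, 2, 1, 2],
    "DS5 Ensure systems security":    [2, 2, 3, 2],
    "ME1 Monitor and evaluate IT":    [1, 1, 2, 1],
}

LEVELS = ["0 Non-existent", "1 Initial/Ad hoc", "2 Repeatable but Intuitive",
          "3 Defined", "4 Managed and measurable", "5 Optimised"]

for process, values in scores.items():
    avg = sum(values) / len(values)
    print(f"{process}: mean={avg:.2f} -> level {LEVELS[round(avg)]}")
```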
A Query Expansion Framework in Image Retrieval Domain Based on Local and Global Analysis
Rahman, M. M.; Antani, S. K.; Thoma, G. R.
2011-01-01
We present an image retrieval framework based on automatic query expansion in a concept feature space by generalizing the vector space model of information retrieval. In this framework, images are represented by vectors of weighted concepts similar to the keyword-based representation used in text retrieval. To generate the concept vocabularies, a statistical model is built by utilizing Support Vector Machine (SVM)-based classification techniques. The images are represented as “bag of concepts” that comprise perceptually and/or semantically distinguishable color and texture patches from local image regions in a multi-dimensional feature space. To explore the correlation between the concepts and overcome the assumption of feature independence in this model, we propose query expansion techniques in the image domain from a new perspective based on both local and global analysis. For the local analysis, the correlations between the concepts based on the co-occurrence pattern, and the metrical constraints based on the neighborhood proximity between the concepts in encoded images, are analyzed by considering local feedback information. We also analyze the concept similarities in the collection as a whole in the form of a similarity thesaurus and propose an efficient query expansion based on the global analysis. The experimental results on a photographic collection of natural scenes and a biomedical database of different imaging modalities demonstrate the effectiveness of the proposed framework in terms of precision and recall. PMID:21822350
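To make the local-analysis idea concrete, the sketch below expands a concept-weighted query using a co-occurrence matrix computed from top-ranked feedback images; the tiny vocabulary, weights, and damping factor are hypothetical and this is not the authors' implementation.

```python
# Minimal sketch of co-occurrence-based query expansion over a small concept vocabulary.
# "Images" are bags of concept indices; all data are illustrative placeholders.
import numpy as np

concepts = ["sky_blue", "sand_texture", "water_ripple", "foliage_green"]
feedback_images = [        # concept indices present in top-ranked (feedback) images
    [0, 2], [0, 1, 2], [1, 2], [0, 2, 3],
]

# Co-occurrence matrix over the feedback set
C = np.zeros((len(concepts), len(concepts)))
for img in feedback_images:
    for i in img:
        for j in img:
            if i != j:
                C[i, j] += 1
C = C / max(C.max(), 1)            # normalize to [0, 1]

query = np.array([1.0, 0.0, 0.0, 0.0])     # original query: "sky_blue" only
expanded = query + 0.5 * C.T @ query       # add correlated concepts with damping 0.5
print(dict(zip(concepts, np.round(expanded, 2))))
```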
ERIC Educational Resources Information Center
Buntrock, H.
The purpose of the survey was: (1) to evaluate existing agricultural information services and (2) to propose possible frameworks for an improved world-wide agricultural information service. The principal statistical results of the survey are summarized in the following figures which are based on data collected in nearly all instances for the year…
Information needs and behaviors of geoscience educators: A grounded theory study
NASA Astrophysics Data System (ADS)
Aber, Susan Ward
2005-12-01
Geoscience educators use a variety of resources and resource formats in their classroom teaching to facilitate student understanding of concepts and processes that define subject areas considered in the realm of geoscience. In this study of information needs and behaviors of geoscience educators, the researcher found that participants preferred visual media such as personal photographic and digital images, as well as published figures, animations, and cartoons, and that participants bypassed their academic libraries to meet these information needs. In order to investigate the role of information in developing introductory geoscience course and instruction, a grounded theory study was conducted through a qualitative paradigm with an interpretive approach and naturalistic inquiry. The theoretical and methodological framework was constructivism and sense-making. Research questions were posited on the nature of geoscience subject areas and the resources and resource formats used in conveying geoscience topics to science and non-science majors, as well as educators' preferences and concerns with curriculum and instruction. The underlying framework was to investigate the place of the academic library and librarian in the sense-making, constructivist approach of geoscience educators. A purposive sample of seven geoscience educators from four universities located in mid-western United States was identified as exemplary teachers by department chairpersons. A triangulation of data collection methods included semi-structured interviews, document reviews, and classroom observations. Data were analyzed using the constant comparative method, which included coding, categorizing, and interpreting for patterns and relationships. Contextual factors were identified and a simple model resulted showing the role of information in teaching for these participants. While participants developed lectures and demonstrations using intrapersonal knowledge and personal collections, one barrier was a lack of time and funding for converting photographic prints and slides to digital images. Findings have implications for academic librarians to provide more visual media or assistance with organizing and formatting existing outdated media formats and to create collaborative collection development through repackaging personal collections of geoscience participants to enhance teaching. Implications for library school educators include providing curriculum on information needs and behaviors from a user's perspective, subject specialty librarianship, and internal collaborative collection development to complement external collection development.
O'Neil, Adrienne; Jacka, Felice N; Quirk, Shae E; Cocker, Fiona; Taylor, C Barr; Oldenburg, Brian; Berk, Michael
2015-02-05
Historically, the focus of Non Communicable Disease (NCD) prevention and control has been cardiovascular disease (CVD), type 2 diabetes mellitus (T2DM), cancer and chronic respiratory diseases. Collectively, these account for more deaths than any other NCDs. Despite recent calls to include the common mental disorders (CMDs) of depression and anxiety under the NCD umbrella, prevention and control of these CMDs remain largely separate and independent. In order to address this gap, we apply a framework recently proposed by the Centers for Disease Control with three overarching objectives: (1) to obtain better scientific information through surveillance, epidemiology, and prevention research; (2) to disseminate this information to appropriate audiences through communication and education; and (3) to translate this information into action through programs, policies, and systems. We conclude that a shared framework of this type is warranted, but also identify opportunities within each objective to advance this agenda and consider the potential benefits of this approach that may exist beyond the health care system.
Thinking graphically: Connecting vision and cognition during graph comprehension.
Ratwani, Raj M; Trafton, J Gregory; Boehm-Davis, Deborah A
2008-03-01
Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive integration. During visual integration, pattern recognition processes are used to form visual clusters of information; these visual clusters are then used to reason about the graph during cognitive integration. In 3 experiments, the processes required to extract specific information and to integrate information were examined by collecting verbal protocol and eye movement data. Results supported the task analytic theories for specific information extraction and the processes of visual and cognitive integration for integrative questions. Further, the integrative processes scaled up as graph complexity increased, highlighting the importance of these processes for integration in more complex graphs. Finally, based on this framework, design principles to improve both visual and cognitive integration are described.
Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique.
Guzys, Diana; Dickson-Swift, Virginia; Kenny, Amanda; Threlkeld, Guinever
2015-01-01
In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method.
Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique
Guzys, Diana; Dickson-Swift, Virginia; Kenny, Amanda; Threlkeld, Guinever
2015-01-01
In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method. PMID:25948132
Fishing out collective memory of migratory schools
De Luca, Giancarlo; Mariani, Patrizio; MacKenzie, Brian R.; Marsili, Matteo
2014-01-01
Animals form groups for many reasons, but there are costs and benefits associated with group formation. One of the benefits is collective memory. In groups on the move, social interactions play a crucial role in the cohesion and the ability to make consensus decisions. When migrating from spawning to feeding areas, fish schools need to retain a collective memory of the destination site over thousands of kilometres, and changes in group formation or individual preference can produce sudden changes in migration pathways. We propose a modelling framework, based on stochastic adaptive networks, that can reproduce this collective behaviour. We assume that three factors control group formation and school migration behaviour: the intensity of social interaction, the relative number of informed individuals and the strength of preference that informed individuals have for a particular migration area. We treat these factors independently and relate the individuals’ preferences to the experience and memory for certain migration sites. We demonstrate that removal of knowledgeable individuals or alteration of individual preference can produce rapid changes in group formation and collective behaviour. For example, intensive fishing targeting the migratory species and also their preferred prey can reduce both terms to a point at which migration to the destination sites is suddenly stopped. The conceptual approaches represented by our modelling framework may therefore be able to explain large-scale changes in fish migration and spatial distribution. PMID:24647905
Fishing out collective memory of migratory schools.
De Luca, Giancarlo; Mariani, Patrizio; MacKenzie, Brian R; Marsili, Matteo
2014-06-06
Animals form groups for many reasons, but there are costs and benefits associated with group formation. One of the benefits is collective memory. In groups on the move, social interactions play a crucial role in the cohesion and the ability to make consensus decisions. When migrating from spawning to feeding areas, fish schools need to retain a collective memory of the destination site over thousands of kilometres, and changes in group formation or individual preference can produce sudden changes in migration pathways. We propose a modelling framework, based on stochastic adaptive networks, that can reproduce this collective behaviour. We assume that three factors control group formation and school migration behaviour: the intensity of social interaction, the relative number of informed individuals and the strength of preference that informed individuals have for a particular migration area. We treat these factors independently and relate the individuals' preferences to the experience and memory for certain migration sites. We demonstrate that removal of knowledgeable individuals or alteration of individual preference can produce rapid changes in group formation and collective behaviour. For example, intensive fishing targeting the migratory species and also their preferred prey can reduce both terms to a point at which migration to the destination sites is suddenly stopped. The conceptual approaches represented by our modelling framework may therefore be able to explain large-scale changes in fish migration and spatial distribution.
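A highly simplified caricature of the stochastic adaptive-network idea described above: a school of agents, a small informed minority with a preferred heading, and repeated copying over random interactions. All parameters are illustrative and this is not the authors' model.

```python
# Toy sketch of heading consensus with a minority of informed individuals.
# All parameters are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(2)
n, informed_frac, preference, social_strength = 200, 0.1, 0.8, 0.6
target = 0.0                                   # preferred migration heading (radians)

heading = rng.uniform(-np.pi, np.pi, n)
informed = rng.random(n) < informed_frac

for _ in range(300):
    # each agent copies a random partner's heading with probability social_strength
    partners = rng.integers(0, n, n)
    copy = rng.random(n) < social_strength
    heading = np.where(copy, heading[partners], heading)
    # informed agents relax toward the remembered destination
    heading[informed] = (1 - preference) * heading[informed] + preference * target
    heading += rng.normal(0, 0.05, n)          # individual noise

print("mean heading:", np.mean(heading), "spread:", np.std(heading))
```

Removing the informed agents (setting informed_frac to zero) or weakening their preference lets the spread grow again, which is the qualitative effect the abstract describes.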
Big Data Discovery and Access Services through NOAA OneStop
NASA Astrophysics Data System (ADS)
Casey, K. S.; Neufeld, D.; Ritchey, N. A.; Relph, J.; Fischman, D.; Baldwin, R.
2017-12-01
The NOAA OneStop Project was created as a pathfinder effort to improve the discovery of, access to, and usability of NOAA's vast and diverse collection of big data. OneStop is led by the NOAA/NESDIS National Centers for Environmental Information (NCEI), and is seen as a key NESDIS contribution to NOAA's open data and data stewardship efforts. OneStop consists of an entire framework of services, from storage and interoperable access services at the base, through metadata and catalog services in the middle, to a modern user interface experience at the top. Importantly, it is an open framework where external tools and services can connect at whichever level is most appropriate. Since the beta release of the OneStop user interface at the 2016 Fall AGU meeting, significant progress has been made in improving and modernizing many NOAA data collections to optimize their use within the framework. In addition, OneStop has made progress implementing robust metadata management and catalog systems at the collection and granule level and improving the user experience with the web interface. This progress will be summarized, and the results of extensive user testing, including professional usability studies, will be reviewed. Key big data technologies supporting the framework will be presented, and community input will be sought on the future directions of the OneStop Project.
An entropic framework for modeling economies
NASA Astrophysics Data System (ADS)
Caticha, Ariel; Golan, Amos
2014-08-01
We develop an information-theoretic framework for economic modeling. This framework is based on principles of entropic inference that are designed for reasoning on the basis of incomplete information. We take the point of view of an external observer who has access to limited information about broad macroscopic economic features. We view this framework as complementary to more traditional methods. The economy is modeled as a collection of agents about whom we make no assumptions of rationality (in the sense of maximizing utility or profit). States of statistical equilibrium are introduced as those macrostates that maximize entropy subject to the relevant information codified into constraints. The basic assumption is that this information refers to supply and demand and is expressed in the form of the expected values of certain quantities (such as inputs, resources, goods, production functions, utility functions and budgets). The notion of economic entropy is introduced. It provides a measure of the uniformity of the distribution of goods and resources. It captures both the welfare state of the economy as well as the characteristics of the market (say, monopolistic, concentrated or competitive). Prices, which turn out to be the Lagrange multipliers, are endogenously generated by the economy. Further studies include the equilibrium between two economies and the conditions for stability. As an example, the case of the nonlinear economy that arises from linear production and utility functions is treated in some detail.
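A minimal sketch of the entropic-inference step described above: find the maximum-entropy distribution over discrete quantities subject to an expected-demand constraint by minimizing the dual, where the Lagrange multiplier plays the price-like role the authors describe. The quantities are illustrative placeholders.

```python
# Minimal sketch: maximum-entropy distribution over discrete consumption levels q
# subject to a constraint on expected demand; the Lagrange multiplier acts as a "price".
import numpy as np
from scipy.optimize import minimize_scalar

q = np.arange(0, 11)          # possible quantities of a good
target_demand = 3.0           # information: expected quantity demanded

def dual(lam):
    # log-partition function plus the constraint term (convex; minimized over lam)
    return np.log(np.sum(np.exp(-lam * q))) + lam * target_demand

lam = minimize_scalar(dual).x
p = np.exp(-lam * q)
p /= p.sum()
print("price-like multiplier:", lam)
print("check E[q]:", (p * q).sum())           # ~ target_demand
print("entropy:", -(p * np.log(p)).sum())
```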
Baltussen, Rob; Jansen, Maarten Paul Maria; Bijlmakers, Leon; Grutters, Janneke; Kluytmans, Anouck; Reuzel, Rob P; Tummers, Marcia; der Wilt, Gert Jan van
2017-02-01
Priority setting in health care has been long recognized as an intrinsically complex and value-laden process. Yet, health technology assessment agencies (HTAs) presently employ value assessment frameworks that are ill fitted to capture the range and diversity of stakeholder values and thereby risk compromising the legitimacy of their recommendations. We propose "evidence-informed deliberative processes" as an alternative framework with the aim to enhance this legitimacy. This framework integrates two increasingly popular and complementary frameworks for priority setting: multicriteria decision analysis and accountability for reasonableness. Evidence-informed deliberative processes are, on one hand, based on early, continued stakeholder deliberation to learn about the importance of relevant social values. On the other hand, they are based on rational decision-making through evidence-informed evaluation of the identified values. The framework has important implications for how HTA agencies should ideally organize their processes. First, HTA agencies should take the responsibility of organizing stakeholder involvement. Second, agencies are advised to integrate their assessment and appraisal phases, allowing for the timely collection of evidence on values that are considered relevant. Third, HTA agencies should subject their decision-making criteria to public scrutiny. Fourth, agencies are advised to use a checklist of potentially relevant criteria and to provide argumentation for how each criterion affected the recommendation. Fifth, HTA agencies must publish their argumentation and install options for appeal. The framework should not be considered a blueprint for HTA agencies but rather an aspirational goal-agencies can take incremental steps toward achieving this goal. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
A conceptual framework for patient-centered fertility treatment.
Duthie, Elizabeth A; Cooper, Alexandra; Davis, Joseph B; Schoyer, Katherine D; Sandlow, Jay; Strawn, Estil Y; Flynn, Kathryn E
2017-09-07
Patient-centered care is a pillar of quality health care and is important to patients experiencing infertility. In this study we used empirical, in-depth data on couples' experiences of infertility treatment decision making to inform and revise a conceptual framework for patient-centered fertility treatment that was developed based on health care professionals' conceptualizations of fertility treatment, covering effectiveness, burden, safety, and costs. In this prospective, longitudinal mixed methods study, we collected data from both members (separately) of 37 couples who scheduled an initial consult with a reproductive specialist. Data collection occurred 1 week before the initial consultation, 1 week after the initial consultation, and then roughly 2, 4, 8, and 12 months later. Data collection included semi-structured qualitative interviews, self-reported questionnaires, and medical record review. Interviews were recorded, transcribed, and content analyzed in NVivo. A single coder analyzed all transcripts, with > 25% of transcripts coded by a second coder to ensure quality control and consistency. Content analysis of the interview transcripts revealed 6 treatment dimensions: effectiveness, physical and emotional burden, time, cost, potential risks, and genetic parentage. Thus, the revised framework for patient-centered fertility treatment retains much from the original framework, with modification to one dimension (from safety to potential risks) and the addition of two dimensions (time and genetic parentage). For patients and their partners making fertility treatment decisions, tradeoffs are explicitly considered across dimensions as opposed to each dimension being considered on its own. Patient-centered fertility treatment should account for the dimensions of treatment that patients and their partners weigh when making decisions about how to add a child to their family. Based on the lived experiences of couples seeking specialist medical care for infertility, this revised conceptual framework can be used to inform patient-centered treatment and research on infertility and to develop decision support tools for patients and providers.
A data management infrastructure for bridge monitoring
NASA Astrophysics Data System (ADS)
Jeong, Seongwoon; Byun, Jaewook; Kim, Daeyoung; Sohn, Hoon; Bae, In Hwan; Law, Kincho H.
2015-04-01
This paper discusses a data management infrastructure framework for bridge monitoring applications. As sensor technologies mature and become economically affordable, their deployment for bridge monitoring will continue to grow. Data management becomes a critical issue not only for storing the sensor data but also for integrating it with the bridge model to support other functions, such as management, maintenance and inspection. The focus of this study is on the effective data management of bridge information and sensor data, which is crucial to structural health monitoring and life cycle management of bridge structures. We review the state of the art of bridge information modeling and sensor data management, and propose a data management framework for bridge monitoring based on NoSQL database technologies, which have been shown to be useful for handling high-volume, time-series data and for dealing flexibly with unstructured data schemas. Specifically, Apache Cassandra and MongoDB are deployed for the prototype implementation of the framework. This paper describes the database design for an XML-based Bridge Information Modeling (BrIM) schema, and the representation of sensor data using Sensor Model Language (SensorML). The proposed prototype data management framework is validated using data collected from the Yeongjong Bridge in Incheon, Korea.
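To make the storage pattern concrete, the following minimal sketch (not the authors' implementation) shows how time-stamped sensor readings might be written to and queried from MongoDB with pymongo; the database, collection, and field names (bridge_id, sensor_id, timestamp, value) are illustrative rather than the BrIM/SensorML schema described in the abstract.

```python
# Minimal sketch: persisting time-series sensor readings in MongoDB with pymongo.
# Field names are illustrative, not the paper's BrIM/SensorML schema.
from datetime import datetime, timezone

from pymongo import ASCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")   # assumed local instance
readings = client["bridge_monitoring"]["sensor_readings"]

# Compound index so per-sensor range queries over time stay efficient.
readings.create_index([("sensor_id", ASCENDING), ("timestamp", ASCENDING)])

readings.insert_one({
    "bridge_id": "yeongjong-demo",                   # hypothetical identifier
    "sensor_id": "ACC-017",
    "timestamp": datetime.now(timezone.utc),
    "quantity": "acceleration",
    "unit": "m/s^2",
    "value": 0.0042,
})

# Retrieve the most recent readings for one sensor.
recent = readings.find({"sensor_id": "ACC-017"}).sort("timestamp", -1).limit(10)
for doc in recent:
    print(doc["timestamp"], doc["value"])
```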
Practice-Based Knowledge Discovery for Comparative Effectiveness Research: An Organizing Framework
Lucero, Robert J.; Bakken, Suzanne
2014-01-01
Electronic health information systems can increase the ability of health-care organizations to investigate the effects of clinical interventions. The authors present an organizing framework that integrates outcomes and informatics research paradigms to guide knowledge discovery in electronic clinical databases. They illustrate its application using the example of hospital-acquired pressure ulcers (HAPU). The Knowledge Discovery through Informatics for Comparative Effectiveness Research (KDI-CER) framework was conceived as a heuristic to conceptualize study designs and address potential methodological limitations imposed by using a single research perspective. Advances in informatics research can play a complementary role in advancing the field of outcomes research, including CER. The KDI-CER framework can be used to facilitate knowledge discovery from routinely collected electronic clinical data. PMID:25278645
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veeramany, Arun; Coles, Garill A.; Unwin, Stephen D.
2017-08-25
The Pacific Northwest National Laboratory developed a risk framework for modeling high-impact, low-frequency power grid events to support risk-informed decisions. In this paper, we briefly recap the framework and demonstrate its implementation for seismic and geomagnetic hazards using a benchmark reliability test system. We describe integration of a collection of models implemented to perform hazard analysis, fragility evaluation, consequence estimation, and post-event restoration. We demonstrate the value of the framework as a multihazard power grid risk assessment and management tool. As a result, the research will benefit transmission planners and emergency planners by improving their ability to maintain a resilient grid infrastructure against impacts from major events.
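As a rough illustration of the hazard, fragility, and consequence stages that such a framework composes, the toy Monte Carlo sketch below samples a hazard intensity, applies logistic fragility curves to a few hypothetical components, and accumulates lost load. Every distribution, component name, and loss figure is invented for the sketch; the actual PNNL framework (including restoration modelling and the benchmark test system) is far richer.

```python
# Toy sketch of a hazard -> fragility -> consequence Monte Carlo chain.
# All parameters are hypothetical and chosen only for illustration.
import math
import random

COMPONENTS = {
    "substation_A": {"midpoint": 0.6, "slope": 8.0, "loss_mw": 120.0},
    "substation_B": {"midpoint": 0.8, "slope": 6.0, "loss_mw": 200.0},
    "hv_line_1": {"midpoint": 0.5, "slope": 10.0, "loss_mw": 80.0},
}

def p_fail(intensity: float, midpoint: float, slope: float) -> float:
    """Logistic fragility curve: probability of component failure at a given hazard intensity."""
    return 1.0 / (1.0 + math.exp(-slope * (intensity - midpoint)))

def simulate_event() -> float:
    """One Monte Carlo event: sample a hazard intensity, then component failures and lost load."""
    intensity = random.lognormvariate(-0.7, 0.4)   # hazard analysis (hypothetical distribution)
    lost = 0.0
    for params in COMPONENTS.values():
        if random.random() < p_fail(intensity, params["midpoint"], params["slope"]):
            lost += params["loss_mw"]              # consequence estimation (hypothetical figures)
    return lost

losses = [simulate_event() for _ in range(20_000)]
print(f"mean load lost per event: {sum(losses) / len(losses):.1f} MW")
print(f"P(loss > 150 MW): {sum(l > 150 for l in losses) / len(losses):.3f}")
```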
2015-03-13
From MetroII to Metronomy: designing a contract-based function-architecture co-simulation framework for timing verification of cyber-physical systems. Cited reference: A. Lee, "A Programming Model for Time-Synchronized Distributed Real-Time Systems," in Proceedings of the Real-Time and Embedded Technology and Applications Symposium, 2007, pp. 259-268.
The Pacific Northwest Hydrologic Landscapes (PNW HL) at the assessment unit scale has provided a solid conceptual classification framework to relate and transfer hydrologically meaningful information between watersheds without access to streamflow time series. A collection of tec...
DOT National Transportation Integrated Search
1999-09-01
We have scanned the country and brought together the collective wisdom and expertise of transportation professionals implementing Intelligent Transportation Systems (ITS) projects across the United States. This information will prove helpful as you s...
Child injury surveillance capabilities in NSW: informing policy and practice.
Mitchell, Rebecca; Testa, Luke
2017-10-11
Injury is one of the most common reasons why a child is hospitalised. Information gained from injury surveillance activities provides an estimate of the injury burden, describes injury event circumstances, can be used to monitor injury trends over time, and is used to design and evaluate injury prevention activities. This perspective article provides an overview of child injury surveillance capabilities within New South Wales (NSW), Australia, following a stocktake of population-based injury-related data collections using the Evaluation Framework for Injury Surveillance Systems. Information about childhood injury in NSW is obtained from multiple administrative data collections that were not specifically designed to conduct injury surveillance. Obtaining good information for child injury surveillance in NSW will involve better coordination of information from agencies that record information about childhood injury. Regular reporting about childhood injury to provide a comprehensive profile of injuries of children and young people in the state should be considered, along with the provision and/or linkage of child injury information from multiple data collections. This could support the development of a suite of injury performance indicators to monitor childhood injury reduction strategies across NSW.
Andrew, Simon A.
2017-01-01
Following the 2015 Middle East Respiratory Syndrome (MERS) outbreak in South Korea, this research examines the structural effect of the public health network in explaining collaboration effectiveness, defined as joint efforts to improve quality of service provision, cost savings, and coordination. We tested the bonding and bridging effects on collaboration effectiveness during the MERS outbreak response by using an institutional collective action framework. Analysis of 114 organizations that responded during the crisis shows a significant association between the bonding effect and the effectiveness of collaboration, as well as a positive association between risk communication in disseminating public health information and the effectiveness of collaboration. PMID:28914780
Ajunwa, Ifeoma; Crawford, Kate; Ford, Joel S
2016-09-01
This essay details the resurgence of wellness programs as employed by large corporations with the aim of reducing healthcare costs. It focuses on how Big Data collection practices are being used in wellness programs and the potential negative impact on workers with regard to privacy and employment discrimination. The essay offers an ethical framework to be adopted by wellness program vendors in order to conduct wellness programs that achieve cost-saving goals without placing undue burdens on workers. It also offers some innovative approaches to wellness that may better serve the goal of healthcare cost reduction. © 2016 American Society of Law, Medicine & Ethics.
D'Ambruoso, Lucia; Byass, Peter; Nurul Qomariyah, Siti
2008-01-01
Background Maternal mortality remains unacceptably high in developing countries despite international advocacy, development targets, and simple, affordable and effective interventions. In recent years, regard for maternal mortality as a human rights issue, as well as one that pertains to health, has emerged. Objective We study a case of maternal death using a theoretical framework derived from the right to health to examine access to and quality of maternal healthcare. Our objective was to explore the potential of rights-based frameworks to inform public health planning from a human rights perspective. Design Information was elicited as part of a verbal autopsy survey investigating maternal deaths in rural settings in Indonesia. The deceased's relatives were interviewed to collect information on medical signs, symptoms and the social, cultural and health systems circumstances surrounding the death. Results In this case, a prolonged, severe fever and a complicated series of referrals culminated in the death of a 19-year-old primigravida at 7 months' gestation. The cause of death was acute infection. The woman encountered a range of barriers to access: behavioural, socio-cultural, geographic and economic. Several serious health system failures were also apparent. The theoretical framework derived from the right to health identified that none of the essential elements of the right were upheld. Conclusion The rights-based approach could identify how and where to improve services. However, there are fundamental and inherent conflicts between the public health tradition (collective and preventative) and the right to health (individualistic and curative). As a result, and in practice, the right to health is likely to be ineffective for public health planning from a human rights perspective. Collective rights such as the right to development may provide a more suitable means to achieve equity and social justice in health planning. PMID:20027244
Koene, Paul; de Mol, Rudi M.; Ipema, Bert
2016-01-01
Which mammal species are suitable to be kept as pets? Answering this question requires considering many factors. Animals have many adaptations to the natural environment in which they evolved, and these may cause adaptation problems and/or risks in captivity. Problems may be visible in behavior, welfare, health, and/or human–animal interaction, resulting, for example, in stereotypies, disease, and fear. A framework is developed in which bibliographic information on mammal species from the wild and the captive environment is collected and assessed by three teams of animal scientists. One-liners from the literature about the behavioral ecology, health, welfare and human–animal relationship of 90 mammal species are collected by team 1 in a database, and the strength of behavioral needs and risks is assessed by team 2. Based on summaries of those strengths, the suitability of the mammal species is assessed by team 3. Stakeholder involvement in supplying bibliographic information and assessments was promoted. Combining the individual and subjective assessments of the scientists using statistical methods makes the final rank ordering of the species' suitability as pets less biased and more objective. The framework is dynamic and produces an initial rank-ordered list of the pet suitability of 90 mammal species, methods to add new mammal species to the list or remove animals from it, and a method to incorporate stakeholder assessments. A model is developed that allows for provisional classification of pet suitability. Periodic updating of the pet suitability framework is expected to produce an updated list with increased reliability and accuracy. Furthermore, the framework could be extended to assess the pet suitability of species from other animal groups, e.g., birds, reptiles, and amphibians. PMID:27243023
Hwabamungu, Boroto; Brown, Irwin; Williams, Quentin
2018-01-01
Recent literature on organisational strategy has called for greater emphasis on individuals (stakeholders) and what they do in the process of strategizing. Public sector organisations have to engage with an array of heterogeneous stakeholders in fulfilling their mandate. The public health sector in particular needs to engage with a diversity of stakeholders at local, regional and national levels when strategising. The purpose of this study is to investigate the influence of stakeholder relations on the implementation of Information Systems (IS) strategy in public hospitals in South Africa. An interpretive approach using two provinces was employed. The Activity Analysis and Development (ActAD) framework, an enhanced form of activity theory, was used as the theoretical framework. Data was collected using semi-structured interviews, meetings, documents analysis, physical artefacts and observation. The collected data was analysed using thematic analysis. Findings reveal that IS strategy implementation in public hospitals involves a large and complex network of stakeholder groups at different levels, and over different time periods. These stakeholder groups act in accordance with formal and informal roles, rules and modalities. Various contextual conditions together with the actions of, and interactions between stakeholder groups give rise to the situationality of stakeholder relations dynamics and strategy implementation. The multiple actions and interactions over time lead to the realisation of some aspects of the IS strategy in public hospitals. Given the complexity and dynamism of the context there are also certain unplanned implementations as well. These relationships are captured in a Stakeholder Relations Influence (SRI) framework. The SRI framework can be assistive in the assessment and mapping of stakeholders and stakeholder relations, and the assessment of the implications of these relations for effective IS strategy implementation in public hospitals. The framework can also provide the basis for the development of appropriate corrective measures in the implementation of strategies and policies in public institutions such as public hospitals. Copyright © 2017 Elsevier B.V. All rights reserved.
Cooling the Collective Motion of Trapped Ions to Initialize a Quantum Register
2016-09-13
Quantum computation [1] provides a general framework for fundamental investigations into subjects such as entanglement, quantum measurement, and quantum information theory. Since quantum computation relies on entanglement between qubits, any implementation of a quantum computer must offer isolation from the ... for realizing a quantum computer, which is scalable to an arbitrary number of qubits. Their scheme is based on a collection of trapped atomic ions.
Supporting Young Learners: Ideas for Preschool and Day Care Providers.
ERIC Educational Resources Information Center
Brickman, Nancy Altman, Ed.; Taylor, Lynn Spencer, Ed.
The High/Scope Curriculum is a developmentally based approach to early childhood education. The curriculum's "Extensions" newsletter, in which the articles in this collection first appeared, informs curriculum users about new developments relating to the High/Scope "open framework" curriculum. The articles are presented in…
Evaluating data worth for ground-water management under uncertainty
Wagner, B.J.
1999-01-01
A decision framework is presented for assessing the value of ground-water sampling within the context of ground-water management under uncertainty. The framework couples two optimization models - a chance-constrained ground-water management model and an integer-programming sampling network design model - to identify optimal pumping and sampling strategies. The methodology consists of four steps: (1) The optimal ground-water management strategy for the present level of model uncertainty is determined using the chance-constrained management model; (2) for a specified data collection budget, the monitoring network design model identifies, prior to data collection, the sampling strategy that will minimize model uncertainty; (3) the optimal ground-water management strategy is recalculated on the basis of the projected model uncertainty after sampling; and (4) the worth of the monitoring strategy is assessed by comparing the value of the sample information - i.e., the projected reduction in management costs - with the cost of data collection. Steps 2-4 are repeated for a series of data collection budgets, producing a suite of management/monitoring alternatives, from which the best alternative can be selected. A hypothetical example demonstrates the methodology's ability to identify the ground-water sampling strategy with greatest net economic benefit for ground-water management.
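The four-step value-of-information loop described above can be expressed compactly in code. The sketch below stubs out the two optimization models with placeholder functions (all costs and uncertainty reductions are invented) and keeps only the bookkeeping: net benefit equals the projected reduction in management cost minus the sampling cost, evaluated over a series of budgets.

```python
# Schematic of the four-step data-worth loop; the two optimisation models are stubs.
from dataclasses import dataclass

@dataclass
class Alternative:
    budget: float
    sampling_cost: float
    projected_savings: float

    @property
    def net_benefit(self) -> float:
        return self.projected_savings - self.sampling_cost

def optimal_management_cost(uncertainty: float) -> float:
    """Steps 1 and 3 stand-in: chance-constrained management model (cost grows with uncertainty)."""
    return 1_000_000.0 * (1.0 + uncertainty)

def design_network(budget: float, uncertainty: float) -> tuple[float, float]:
    """Step 2 stand-in: sampling network design returning (sampling cost, reduced uncertainty)."""
    cost = min(budget, 150_000.0)
    return cost, uncertainty * max(0.3, 1.0 - cost / 200_000.0)

def evaluate(budgets: list[float], uncertainty: float = 0.4) -> Alternative:
    base_cost = optimal_management_cost(uncertainty)
    alternatives = []
    for b in budgets:
        sampling_cost, reduced = design_network(b, uncertainty)   # step 2
        savings = base_cost - optimal_management_cost(reduced)    # steps 3-4
        alternatives.append(Alternative(b, sampling_cost, savings))
    return max(alternatives, key=lambda a: a.net_benefit)         # pick the best alternative

best = evaluate([25_000, 50_000, 100_000, 150_000])
print(f"best budget: {best.budget:,.0f}, net benefit: {best.net_benefit:,.0f}")
```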
Studying ocean acidification in the Arctic Ocean
Robbins, Lisa
2012-01-01
The U.S. Geological Survey (USGS) partnership with the U.S. Coast Guard icebreaker Healy and its United Nations Convention on the Law of the Sea (UNCLOS) cruises has produced new synoptic data from samples collected in the Arctic Ocean and insights into the patterns and extent of ocean acidification. This framework of foundational geochemical information will help inform our understanding of potential risks to Arctic resources due to ocean acidification.
Mnemonic convergence in social networks: The emergent properties of cognition at a collective level.
Coman, Alin; Momennejad, Ida; Drach, Rae D; Geana, Andra
2016-07-19
The development of shared memories, beliefs, and norms is a fundamental characteristic of human communities. These emergent outcomes are thought to occur owing to a dynamic system of information sharing and memory updating, which fundamentally depends on communication. Here we report results on the formation of collective memories in laboratory-created communities. We manipulated conversational network structure in a series of real-time, computer-mediated interactions in fourteen 10-member communities. The results show that mnemonic convergence, measured as the degree of overlap among community members' memories, is influenced by both individual-level information-processing phenomena and by the conversational social network structure created during conversational recall. By studying laboratory-created social networks, we show how large-scale social phenomena (i.e., collective memory) can emerge out of microlevel local dynamics (i.e., mnemonic reinforcement and suppression effects). The social-interactionist approach proposed herein points to optimal strategies for spreading information in social networks and provides a framework for measuring and forging collective memories in communities of individuals.
On uncertainty in information and ignorance in knowledge
NASA Astrophysics Data System (ADS)
Ayyub, Bilal M.
2010-05-01
This paper provides an overview of working definitions of knowledge, ignorance, information and uncertainty and summarises a formalised philosophical and mathematical framework for their analysis. It provides a comparative examination of generalised information theory and the generalised theory of uncertainty. It summarises the foundational bases for assessing the reliability of knowledge constructed as a collective set of justified true beliefs. It discusses system complexity for ancestor simulation potentials. It offers value-driven means of communicating knowledge and contrarian knowledge using memes and memetics.
Brueton, Valerie C; Vale, Claire L; Choodari-Oskooei, Babak; Jinks, Rachel; Tierney, Jayne F
2014-11-27
Providing evidence of impact highlights the benefits of medical research to society. Such evidence is increasingly requested by research funders and commonly relies on citation analysis. However, other indicators may be more informative. Although frameworks to demonstrate the impact of clinical research have been reported, no complementary framework exists for methodological research. Therefore, we assessed the impact of methodological research projects conducted or completed between 2009 and 2012 at the UK Medical Research Council Clinical Trials Unit Hub for Trials Methodology Research, with a view to developing an appropriate framework. Various approaches to the collection of data on research impact were employed. Citation rates were obtained using Web of Science (http://www.webofknowledge.com/) and analyzed descriptively. Semi-structured interviews were conducted to obtain information on the rates of different types of research output that indicated impact for each project. Results were then pooled across all projects. Finally, email queries pertaining to methodology projects were collected retrospectively and their content analyzed. Simple citation analysis established the citation rates per year since publication for 74 methodological publications; however, further detailed analysis revealed more about the potential influence of these citations. Interviews that spanned 20 individual research projects demonstrated a variety of types of impact not otherwise collated, for example, applications and further developments of the research; release of software and provision of guidance materials to facilitate uptake; formation of new collaborations and broad dissemination. Finally, 194 email queries relating to 6 methodological projects were received from 170 individuals across 23 countries. They provided further evidence that the methodologies were having an impact on research and research practice, both nationally and internationally. We have used the information gathered in this study to adapt an existing framework for the impact of clinical research for use in methodological research. Gathering evidence on the research impact of methodological research from a variety of sources has enabled us to obtain multiple indicators and thus to demonstrate broad impacts of methodological research. The adapted framework can be applied to future methodological research and thus provides a tool for methodologists to better assess and report research impacts.
Maxwell, Annette E; Stewart, Susan L; Glenn, Beth A; Wong, Weng Kee; Yasui, Yutaka; Chang, L Cindy; Taylor, Victoria M; Nguyen, Tung T; Chen, Moon S; Bastani, Roshan
2012-01-01
Few studies have examined theoretically informed constructs related to hepatitis B (HBV) testing, and comparisons across studies are challenging due to lack of uniformity in constructs assessed. The present analysis examined relationships among Health Behavior Framework factors across four Asian American groups to advance the development of theory-based interventions for HBV testing in at-risk populations. Data were collected from 2007-2010 as part of baseline surveys during four intervention trials promoting HBV testing among Vietnamese-, Hmong-, Korean- and Cambodian-Americans (n = 1,735). Health Behavior Framework constructs assessed included: awareness of HBV, knowledge of transmission routes, perceived susceptibility, perceived severity, doctor recommendation, stigma of HBV infection, and perceived efficacy of testing. Within each group we assessed associations between our intermediate outcome of knowledge of HBV transmission and other constructs, to assess the concurrent validity of our model and instruments. While the absolute levels for Health Behavior Framework factors varied across groups, relationships between knowledge and other factors were generally consistent. This suggests similarities rather than differences with respect to posited drivers of HBV-related behavior. Our findings indicate that Health Behavior Framework constructs are applicable to diverse ethnic groups and provide preliminary evidence for the construct validity of the Health Behavior Framework.
Maxwell, AE; Stewart, SL; Glenn, BA; Wong, WK; Yasui, Y; Chang, LC; Taylor, VM; Nguyen, TT; Chen, MS; Bastani, R
2012-01-01
Background Few studies have examined theoretically informed constructs related to hepatitis B (HBV) testing, and comparisons across studies are challenging due to lack of uniformity in constructs assessed. This analysis examines relationships among Health Behavior Framework factors across four Asian American groups to advance the development of theory-based interventions for HBV testing in at-risk populations. Methods Data were collected from 2007–2010 as part of baseline surveys during four intervention trials promoting HBV testing among Vietnamese-, Hmong-, Korean- and Cambodian-Americans (n = 1,735). Health Behavior Framework constructs assessed included: awareness of HBV, knowledge of transmission routes, perceived susceptibility, perceived severity, doctor recommendation, stigma of HBV infection, and perceived efficacy of testing. Within each group we assessed associations between our intermediate outcome of knowledge of HBV transmission and other constructs, to assess the concurrent validity of our model and instruments. Results While the absolute levels for Health Behavior Framework factors varied across groups, relationships between knowledge and other factors were generally consistent. This suggests similarities rather than differences with respect to posited drivers of HBV-related behavior. Discussion Our findings indicate that Health Behavior Framework constructs are applicable to diverse ethnic groups and provide preliminary evidence for the construct validity of the Health Behavior Framework. PMID:22799389
NASA Astrophysics Data System (ADS)
Graham, E.; Schindel, D. E.
2014-12-01
The Global Registry of Scientific Collections (GRSciColl) is an online information resource developed to gather and disseminate basic information on scientific collections. Building on initiatives started for biological collections, GRSciColl expands this framework to encompass all scientific disciplines including earth and space sciences, anthropology, archaeology, biomedicine, and applied fields such as agriculture and technology. The goals of this registry are to (1) provide a single source of synoptic information about the repositories, their component collections, access and use policies, and staff contact information; and (2) facilitate the assignment of identifiers for repositories and their collections that are globally unique across all disciplines. As digitization efforts continue, the importance of globally unique identifiers is paramount to ensuring interoperability across datasets. Search capabilities and web services will significantly increase the web visibility and accessibility of these collections. Institutional records include categorization by governance (e.g., national, state or local governmental, private non-profit) and by scientific discipline (e.g., earth science, biomedical, agricultural). Collection-level metadata categorize the types of contained specimens/samples and modes of preservation. In selecting the level of granularity for these categories, designers sought a compromise that would capture enough information to be useful in searches and inquiries and would complement the detailed, increasingly digital archives in specimen-level databases hosted by discipline-specific groups (e.g., SESAR) or by the repositories themselves (e.g., KE EMu).
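To illustrate the kind of two-level record the registry describes (institution-level metadata with nested collection-level metadata, each carrying a globally unique identifier), here is a hypothetical sketch; the field names and values are invented and do not reflect GRSciColl's actual schema.

```python
# Hypothetical registry entry: institution metadata with nested collection metadata,
# each keyed by a globally unique identifier. Field names are invented for the sketch.
import json
import uuid

institution = {
    "identifier": f"urn:uuid:{uuid.uuid4()}",        # globally unique across disciplines
    "name": "Example Museum of Natural History",
    "governance": "private non-profit",
    "disciplines": ["earth science", "biology"],
    "contact": {"email": "collections@example.org"},
    "access_policy_url": "https://example.org/access",
    "collections": [
        {
            "identifier": f"urn:uuid:{uuid.uuid4()}",
            "title": "Holocene mollusc collection",
            "object_types": ["dry specimens", "tissue samples"],
            "preservation_modes": ["dried", "frozen"],
            "estimated_size": 42_000,
        }
    ],
}

print(json.dumps(institution, indent=2))
```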
Yoo, Min-Jung; Grozel, Clément; Kiritsis, Dimitris
2016-07-08
This paper describes our conceptual framework of closed-loop lifecycle information sharing for product-service in the Internet of Things (IoT). The framework is based on the ontology model of product-service and a type of IoT message standard, Open Messaging Interface (O-MI) and Open Data Format (O-DF), which ensures data communication. (1) BACKGROUND: Based on an existing product lifecycle management (PLM) methodology, we enhanced the ontology model for the purpose of integrating efficiently the product-service ontology model that was newly developed; (2) METHODS: The IoT message transfer layer is vertically integrated into a semantic knowledge framework inside which a Semantic Info-Node Agent (SINA) uses the message format as a common protocol of product-service lifecycle data transfer; (3) RESULTS: The product-service ontology model facilitates information retrieval and knowledge extraction during the product lifecycle, while making more information available for the sake of service business creation. The vertical integration of IoT message transfer, encompassing all semantic layers, helps achieve a more flexible and modular approach to knowledge sharing in an IoT environment; (4) Contribution: A semantic data annotation applied to IoT can contribute to enhancing collected data types, which entails a richer knowledge extraction. The ontology-based PLM model enables as well the horizontal integration of heterogeneous PLM data while breaking traditional vertical information silos; (5) CONCLUSION: The framework was applied to a fictive case study with an electric car service for the purpose of demonstration. For the purpose of demonstrating the feasibility of the approach, the semantic model is implemented in Sesame APIs, which play the role of an Internet-connected Resource Description Framework (RDF) database.
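The paper's semantic layer is implemented on Sesame; as a loose analogue (not the authors' code), the sketch below uses Python's rdflib to show what annotating a single lifecycle observation with RDF triples looks like. The namespace and the class and property names (LifecycleObservation, observedProperty, hasValue) are invented for illustration.

```python
# Rough analogue of semantic annotation of one lifecycle data point using rdflib.
# The ontology namespace and term names are invented for this sketch.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/product-service#")

g = Graph()
g.bind("ex", EX)

reading = EX["reading/battery-temp-001"]
g.add((reading, RDF.type, EX.LifecycleObservation))
g.add((reading, EX.observedProperty, EX.BatteryTemperature))
g.add((reading, EX.relatesToProduct, EX["product/electric-car-42"]))
g.add((reading, EX.hasValue, Literal(31.5, datatype=XSD.double)))
g.add((reading, EX.unit, Literal("Cel")))

print(g.serialize(format="turtle"))
```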
Yoo, Min-Jung; Grozel, Clément; Kiritsis, Dimitris
2016-01-01
This paper describes our conceptual framework of closed-loop lifecycle information sharing for product-service in the Internet of Things (IoT). The framework is based on the ontology model of product-service and a type of IoT message standard, Open Messaging Interface (O-MI) and Open Data Format (O-DF), which ensures data communication. (1) Background: Based on an existing product lifecycle management (PLM) methodology, we enhanced the ontology model for the purpose of integrating efficiently the product-service ontology model that was newly developed; (2) Methods: The IoT message transfer layer is vertically integrated into a semantic knowledge framework inside which a Semantic Info-Node Agent (SINA) uses the message format as a common protocol of product-service lifecycle data transfer; (3) Results: The product-service ontology model facilitates information retrieval and knowledge extraction during the product lifecycle, while making more information available for the sake of service business creation. The vertical integration of IoT message transfer, encompassing all semantic layers, helps achieve a more flexible and modular approach to knowledge sharing in an IoT environment; (4) Contribution: A semantic data annotation applied to IoT can contribute to enhancing collected data types, which entails a richer knowledge extraction. The ontology-based PLM model enables as well the horizontal integration of heterogeneous PLM data while breaking traditional vertical information silos; (5) Conclusion: The framework was applied to a fictive case study with an electric car service for the purpose of demonstration. For the purpose of demonstrating the feasibility of the approach, the semantic model is implemented in Sesame APIs, which play the role of an Internet-connected Resource Description Framework (RDF) database. PMID:27399717
Chuckwalla Valley multiple-well monitoring site, Chuckwalla Valley, Riverside County
Everett, Rhett
2013-01-01
The U.S. Geological Survey (USGS), in cooperation with the Bureau of Land Management, is evaluating the geohydrology and water availability of the Chuckwalla Valley, California. As part of this evaluation, the USGS installed the Chuckwalla Valley multiple-well monitoring site (CWV1) in the southeastern portion of the Chuckwalla Basin. Data collected at this site provide information about the geology, hydrology, geophysics, and geochemistry of the local aquifer system, thus enhancing the understanding of the geohydrologic framework of the Chuckwalla Valley. This report presents construction information for the CWV1 multiple-well monitoring site and initial geohydrologic data collected from the site.
Validating for Use and Interpretation: A Mixed Methods Contribution Illustrated
ERIC Educational Resources Information Center
Morell, Linda; Tan, Rachael Jin Bee
2009-01-01
Researchers in the areas of psychology and education strive to understand the intersections among validity, educational measurement, and cognitive theory. Guided by a mixed model conceptual framework, this study investigates how respondents' opinions inform the validation argument. Validity evidence for a science assessment was collected through…
Pieces of Civic Intelligence: Towards a Capacities Framework
ERIC Educational Resources Information Center
Schuler, Douglas
2014-01-01
Civic intelligence is the capacity of collectivities--from small informal groups to humanity as a whole--to equitably and effectively address important shared problems such as poverty, bioterrorism, and natural disasters. It's an abstract concept that can be expressed in policy, art, demonstrations, or conversation. In this article, civic…
Complementary and Alternative Therapies: An Evidence-Based Framework
ERIC Educational Resources Information Center
Shaw, Steven R.
2008-01-01
Complementary and alternative medicine (CAM) has experienced a dramatic growth in use and acceptability over the last 20 years. CAM is a diverse collection of medical and healthcare systems, practices, and products that are not presently considered a component of conventional medicine. CAM traditionally has been practiced by informally educated…
Civics Framework for the 2010 National Assessment of Educational Progress
ERIC Educational Resources Information Center
National Assessment Governing Board, 2009
2009-01-01
The National Assessment of Educational Progress (NAEP) is a survey mandated by the U.S. Congress to collect and report information about student achievement in various academic subjects, such as mathematics, science, reading, writing, history, geography, and civics. The National Assessment Governing Board sets policy and the overall dimensions…
A Generic Privacy Quantification Framework for Privacy-Preserving Data Publishing
ERIC Educational Resources Information Center
Zhu, Zutao
2010-01-01
In recent years, the concerns about the privacy for the electronic data collected by government agencies, organizations, and industries are increasing. They include individual privacy and knowledge privacy. Privacy-preserving data publishing is a research branch that preserves the privacy while, at the same time, withholding useful information in…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-26
... collection, assembly, and use of consumer report information and provides the framework for the credit... increased the obligations of users of consumer reports, particularly employers. Most significantly, the 1996... transferred authority to issue interpretive guidance under the FCRA to the Consumer Financial Protection...
Mueller, Michael; Morgan, David
2017-07-01
International comparisons of health spending and financing are most frequently carried out using datasets of international organisations based on the System of Health Accounts (SHA). This accounting framework has recently been updated and 2016 saw the first international data collection under the new SHA 2011 guidelines. In addition to reaching better comparability of health spending figures and greater country coverage, the updated framework has seen changes in the dimension of health financing leading to important consequences when analysing health financing data. This article presents the first results of health spending and financing data collected under this new framework and highlights the areas where SHA 2011 has become a more useful tool for policy analysis, by complementing data on expenditure of health financing schemes with information about their revenue streams. It describes the major conceptual changes in the scope of health financing and highlights why comprehensive analyses based on SHA 2011 can provide for a more complete description and comparison of health financing across countries, facilitate a more meaningful discussion of fiscal sustainability of health spending by also analysing the revenues of compulsory public schemes and help to clarify the role of governments in financing health care - which is generally much bigger than previously documented. Copyright © 2017 Elsevier B.V. All rights reserved.
Developing a framework for assessment of the environmental determinants of walking and cycling.
Pikora, Terri; Giles-Corti, Billie; Bull, Fiona; Jamrozik, Konrad; Donovan, Rob
2003-04-01
The focus for interventions and research on physical activity has moved away from vigorous activity to moderate-intensity activities, such as walking. In addition, a social ecological approach to physical activity research and practice is recommended. This approach considers the influence of the environment and policies on physical activity. Although there is limited empirical published evidence related to the features of the physical environment that influence physical activity, urban planning and transport agencies have developed policies and strategies that have the potential to influence whether people walk or cycle in their neighbourhood. This paper presents the development of a framework of the potential environmental influences on walking and cycling based on published evidence and policy literature, interviews with experts and a Delphi study. The framework includes four features: functional, safety, aesthetic and destination; as well as the hypothesised factors that contribute to each of these features of the environment. In addition, the Delphi experts determined the perceived relative importance of these factors. Based on these factors, a data collection tool will be developed and the frameworks will be tested through the collection of environmental information on neighbourhoods, where data on the walking and cycling patterns have been collected previously. Identifying the environmental factors that influence walking and cycling will allow the inclusion of a public health perspective as well as those of urban planning and transport in the design of built environments.
Lee, Minhyun; Koo, Choongwan; Hong, Taehoon; Park, Hyo Seon
2014-04-15
For an effective photovoltaic (PV) system, it is necessary to accurately determine the monthly average daily solar radiation (MADSR) and to develop an accurate MADSR map, which can simplify the decision-making process for selecting a suitable location for PV system installation. Therefore, this study aimed to develop a framework for mapping the MADSR using an advanced case-based reasoning (CBR) model and a geostatistical technique. The proposed framework consists of the following procedures: (i) the geographic scope for the mapping of the MADSR is set, and the measured MADSR and meteorological data within that scope are collected; (ii) using the collected data, the advanced CBR model is developed; (iii) using the advanced CBR model, the MADSR at unmeasured locations is estimated; and (iv) by applying the measured and estimated MADSR data to a geographic information system, the MADSR map is developed. A practical validation was conducted by applying the proposed framework to South Korea. The MADSR map developed through the proposed framework was shown to be improved in terms of accuracy. The developed map can be used for estimating the MADSR at unmeasured locations and for determining the optimal location for PV system installation.
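Step (iii), estimating the MADSR at unmeasured locations with a case-based reasoning model, can be illustrated with a deliberately simplified sketch: retrieve the most similar measured cases in a normalised meteorological feature space and reuse their MADSR values with inverse-distance weights. The features, cases, and weighting scheme below are hypothetical and far simpler than the advanced CBR model in the study.

```python
# Minimal case-based reasoning sketch: similarity-weighted reuse of measured MADSR values.
# Features and cases are hypothetical and already normalised to [0, 1].
import math

cases = [
    ({"sunshine_hours": 0.72, "cloud_cover": 0.30, "humidity": 0.55, "latitude": 0.40}, 4.91),
    ({"sunshine_hours": 0.61, "cloud_cover": 0.45, "humidity": 0.63, "latitude": 0.42}, 4.35),
    ({"sunshine_hours": 0.80, "cloud_cover": 0.22, "humidity": 0.48, "latitude": 0.38}, 5.20),
    ({"sunshine_hours": 0.55, "cloud_cover": 0.52, "humidity": 0.70, "latitude": 0.45}, 4.05),
]

def distance(a: dict, b: dict) -> float:
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

def estimate_madsr(target: dict, k: int = 3) -> float:
    """Retrieve the k most similar cases and reuse their MADSR values with inverse-distance weights."""
    nearest = sorted(cases, key=lambda c: distance(target, c[0]))[:k]
    weights = [1.0 / (distance(target, attrs) + 1e-6) for attrs, _ in nearest]
    return sum(w * madsr for w, (_, madsr) in zip(weights, nearest)) / sum(weights)

unmeasured_site = {"sunshine_hours": 0.68, "cloud_cover": 0.35, "humidity": 0.58, "latitude": 0.41}
print(f"estimated MADSR: {estimate_madsr(unmeasured_site):.2f} kWh/m^2/day")
```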
Thomas, Jonathan V.; Stanton, Gregory P.; Bumgarner, Johnathan R.; Pearson, Daniel K.; Teeple, Andrew; Houston, Natalie A.; Payne, Jason; Musgrove, MaryLynn
2013-01-01
Several previous studies have been done to compile or collect physical and chemical data, describe the hydrogeologic processes, and develop conceptual and numerical groundwater-flow models of the Edwards-Trinity aquifer in the Trans-Pecos region. Documented methods were used to compile and collect groundwater, surface-water, geochemical, geophysical, and geologic information that subsequently were used to develop this conceptual model.
Framework for integration of informal waste management sector with the formal sector in Pakistan.
Masood, Maryam; Barlow, Claire Y
2013-10-01
Historically, waste pickers around the globe have utilised urban solid waste as a principal source of livelihood. Formal waste management sectors usually perceive the informal waste collection/recycling networks as backward, unhygienic and generally incompatible with modern waste management systems. It is proposed here that through careful planning and administration, these seemingly troublesome informal networks can be integrated into formal waste management systems in developing countries, providing mutual benefits. A theoretical framework for integration based on a case study in Lahore, Pakistan, is presented. The proposed solution suggests that the municipal authority should draw up and agree on a formal work contract with the group of waste pickers already operating in the area. The proposed system is assessed using the integration radar framework to classify and analyse possible intervention points between the sectors. The integration of the informal waste workers with the formal waste management sector is not a one dimensional or single step process. An ideal solution might aim for a balanced focus on all four categories of intervention, although this may be influenced by local conditions. Not all the positive benefits will be immediately apparent, but it is expected that as the acceptance of such projects increases over time, the informal recycling economy will financially supplement the formal system in many ways.
Carle, Adam C; Riley, William; Hays, Ron D; Cella, David
2015-10-01
To guide measure development, National Institutes of Health-supported Patient-Reported Outcomes Measurement Information System (PROMIS) investigators developed a hierarchical domain framework. The framework specifies health domains at multiple levels. The initial PROMIS domain framework specified that physical function and symptoms such as Pain and Fatigue indicate Physical Health (PH); Depression, Anxiety, and Anger indicate Mental Health (MH); and Social Role Performance and Social Satisfaction indicate Social Health (SH). We used confirmatory factor analyses to evaluate the fit of the hypothesized framework to data collected from a large sample. We used data (n=14,098) from PROMIS's wave 1 field test and estimated domain scores using the PROMIS item response theory parameters. We then used confirmatory factor analyses to test whether the domains corresponded to the PROMIS domain framework as expected. A model corresponding to the domain framework did not provide ideal fit [root mean square error of approximation (RMSEA)=0.13; comparative fit index (CFI)=0.92; Tucker-Lewis index (TLI)=0.88; standardized root mean square residual (SRMR)=0.09]. On the basis of modification indices and exploratory factor analyses, we allowed Fatigue to load on both PH and MH. This model fit the data acceptably (RMSEA=0.08; CFI=0.97; TLI=0.96; SRMR=0.03). Our findings generally support the PROMIS domain framework. Allowing Fatigue to load on both PH and MH improved fit considerably.
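A hedged sketch of how the modified three-factor structure (with Fatigue cross-loading on Physical and Mental Health) could be specified follows, using lavaan-style syntax with the third-party semopy package. The column names stand in for PROMIS domain scores and the input file is hypothetical, so this illustrates the model form rather than the authors' analysis code.

```python
# Sketch of the modified three-factor CFA (Fatigue loads on both PH and MH), fitted
# with the semopy package (assumed available). Column names and the CSV are placeholders.
import pandas as pd
import semopy

model_desc = """
PH =~ physical_function + pain + fatigue
MH =~ depression + anxiety + anger + fatigue
SH =~ social_role + social_satisfaction
"""

data = pd.read_csv("promis_domain_scores.csv")   # hypothetical file of domain scores
model = semopy.Model(model_desc)
model.fit(data)

print(semopy.calc_stats(model))                  # fit indices such as CFI, TLI, RMSEA
print(model.inspect())                           # factor loadings and covariances
```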
Development of user-friendly and interactive data collection system for cerebral palsy.
Raharjo, I; Burns, T G; Venugopalan, J; Wang, M D
2016-02-01
Cerebral palsy (CP) is a permanent motor disorder that appears at an early age and requires multiple tests to assess the physical and mental capabilities of patients. Current medical record data collection systems employed for CP, e.g., EPIC, are very general, difficult to navigate, and prone to errors. The data cannot easily be extracted, which limits data analysis on this rich source of information. To overcome these limitations, we designed and prototyped a database with a graphical user interface geared towards clinical research specifically in CP. The platform, built with MySQL and a Java framework, is reliable, secure, and can be easily integrated with other programming languages for data analysis such as MATLAB. This database with GUI design is a promising tool for data collection and can be applied in many different fields aside from CP to infer useful information out of the vast amount of data being collected.
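As a self-contained illustration of the kind of normalised schema such a CP-specific collection tool might define, the sketch below uses Python's built-in sqlite3 in place of the paper's MySQL/Java stack; all table and column names are invented for the example.

```python
# Illustrative schema for a CP research data collection tool (sqlite3 stand-in for MySQL).
# Table and column names are invented; GMFCS and GMFM-66 are standard CP measures used as examples.
import sqlite3

conn = sqlite3.connect("cp_research.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS patient (
    patient_id   INTEGER PRIMARY KEY,
    study_code   TEXT UNIQUE NOT NULL,        -- de-identified code, no direct identifiers
    birth_year   INTEGER,
    gmfcs_level  INTEGER CHECK (gmfcs_level BETWEEN 1 AND 5)
);
CREATE TABLE IF NOT EXISTS assessment (
    assessment_id INTEGER PRIMARY KEY,
    patient_id    INTEGER NOT NULL REFERENCES patient(patient_id),
    assessed_on   TEXT NOT NULL,              -- ISO date
    instrument    TEXT NOT NULL,              -- e.g. gait analysis, cognitive test
    score         REAL
);
""")

with conn:
    conn.execute("INSERT OR IGNORE INTO patient (study_code, birth_year, gmfcs_level) VALUES (?, ?, ?)",
                 ("CP-0001", 2012, 3))
    conn.execute("""INSERT INTO assessment (patient_id, assessed_on, instrument, score)
                    SELECT patient_id, '2016-02-01', 'GMFM-66', 54.2 FROM patient WHERE study_code = 'CP-0001'""")

for row in conn.execute("""SELECT p.study_code, a.instrument, a.score
                           FROM assessment a JOIN patient p USING (patient_id)"""):
    print(row)
```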
Development of user-friendly and interactive data collection system for cerebral palsy
Raharjo, I.; Burns, T. G.; Venugopalan, J.; Wang., M. D.
2016-01-01
Cerebral palsy (CP) is a permanent motor disorder that appears at an early age and requires multiple tests to assess the physical and mental capabilities of patients. Current medical record data collection systems employed for CP, e.g., EPIC, are very general, difficult to navigate, and prone to errors. The data cannot easily be extracted, which limits data analysis on this rich source of information. To overcome these limitations, we designed and prototyped a database with a graphical user interface geared towards clinical research specifically in CP. The platform, built with MySQL and a Java framework, is reliable, secure, and can be easily integrated with other programming languages for data analysis such as MATLAB. This database with GUI design is a promising tool for data collection and can be applied in many different fields aside from CP to infer useful information out of the vast amount of data being collected. PMID:28133638
Hu, Peter F; Yang, Shiming; Li, Hsiao-Chi; Stansbury, Lynn G; Yang, Fan; Hagegeorge, George; Miller, Catriona; Rock, Peter; Stein, Deborah M; Mackenzie, Colin F
2017-01-01
Research and practice based on automated electronic patient monitoring and data collection systems is significantly limited by system downtime. We asked whether a triple-redundant Monitor of Monitors System (MoMs), collecting and summarizing key information from system-wide data sources, could achieve high fault tolerance, early diagnosis of system failure, and improved data collection rates. In our Level I trauma center, patient vital sign (VS) monitors were networked to collect real-time patient physiologic data streams from 94 bed units in our various resuscitation, operating, and critical care units. To minimize the impact of server collection failure, three BedMaster® VS servers were used in parallel to collect data from all bed units. To locate and diagnose system failures, we summarized critical information from high-throughput data streams in real time in a dashboard viewer and compared the pre- and post-MoMs phases to evaluate data collection performance in terms of availability time, active collection rates, and gap duration, occurrence, and categories. Single-server collection rates in the 3-month period before MoMs deployment ranged from 27.8% to 40.5%, with a combined collection rate of 79.1%. Reasons for gaps included collection server failure, software instability, inconsistent individual bed settings, and monitor servicing. In the 6-month period after MoMs deployment, average collection rates were 99.9%. A triple-redundant patient data collection system with real-time diagnostic information summarization and representation improved the reliability of massive clinical data collection to nearly 100% in a Level I trauma center. Such a data collection framework may also increase the level of automation of hospital-wide information aggregation for optimal allocation of health care resources.
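The core merge-and-monitor idea behind a triple-redundant collector can be sketched in a few lines: readings from the redundant servers are unioned per (bed, timestamp) key, and the combined collection rate and any gaps are reported. The data, cadence, and structures below are hypothetical, not the MoMs implementation.

```python
# Sketch of merging three redundant collection streams and reporting rate and gaps.
# All data and thresholds are hypothetical.
from datetime import datetime, timedelta

EXPECTED_INTERVAL = timedelta(seconds=6)      # assumed cadence of vital-sign samples

def merge(streams: list[dict]) -> dict:
    """Union of (bed, timestamp) -> value across redundant collectors (first value wins)."""
    merged: dict = {}
    for stream in streams:
        for key, value in stream.items():
            merged.setdefault(key, value)
    return merged

def gaps_for_bed(timestamps: list[datetime]) -> list[tuple[datetime, datetime]]:
    """Return (start, end) pairs where consecutive samples are further apart than expected."""
    ordered = sorted(timestamps)
    return [(a, b) for a, b in zip(ordered, ordered[1:]) if b - a > EXPECTED_INTERVAL]

t0 = datetime(2016, 5, 1, 12, 0, 0)
server_a = {("bed-07", t0 + i * EXPECTED_INTERVAL): 72 + i for i in range(0, 10, 2)}   # partial stream
server_b = {("bed-07", t0 + i * EXPECTED_INTERVAL): 72 + i for i in range(1, 10, 2)}   # partial stream
server_c = {}                                                                          # server down

merged = merge([server_a, server_b, server_c])
expected_samples = 10
print(f"combined collection rate: {len(merged) / expected_samples:.0%}")
print("gaps:", gaps_for_bed([ts for (_, ts) in merged]))
```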
A framework to measure the value of public health services.
Jacobson, Peter D; Neumann, Peter J
2009-10-01
To develop a framework that public health practitioners could use to measure the value of public health services. Primary data were collected from August 2006 through March 2007. We interviewed (n=46) public health practitioners in four states, leaders of national public health organizations, and academic researchers. Using a semi-structured interview protocol, we conducted a series of qualitative interviews to define the component parts of value for public health services and identify methodologies used to measure value and data collected. The primary form of analysis is descriptive, synthesizing information across respondents as to how they measure the value of their services. Our interviews did not reveal a consensus on how to measure value or a specific framework for doing so. Nonetheless, the interviews identified some potential strategies, such as cost accounting and performance-based contracting mechanisms. The interviews noted implementation barriers, including limits to staff capacity and data availability. We developed a framework that considers four component elements to measure value: external factors that must be taken into account (i.e., mandates); key internal actions that a local health department must take (i.e., staff assessment); using appropriate quantitative measures; and communicating value to elected officials and the public.
Jull, J; Whitehead, M; Petticrew, M; Kristjansson, E; Gough, D; Petkovic, J; Volmink, J; Weijer, C; Taljaard, M; Edwards, S; Mbuagbaw, L; Cookson, R; McGowan, J; Lyddiatt, A; Boyer, Y; Cuervo, L G; Armstrong, R; White, H; Yoganathan, M; Pantoja, T; Shea, B; Pottie, K; Norheim, O; Baird, S; Robberstad, B; Sommerfelt, H; Asada, Y; Wells, G; Tugwell, P; Welch, V
2017-01-01
Background Randomised controlled trials can provide evidence relevant to assessing the equity impact of an intervention, but such information is often poorly reported. We describe a conceptual framework to identify health equity-relevant randomised trials with the aim of improving the design and reporting of such trials. Methods An interdisciplinary and international research team engaged in an iterative consensus building process to develop and refine the conceptual framework via face-to-face meetings, teleconferences and email correspondence, including findings from a validation exercise whereby two independent reviewers used the emerging framework to classify a sample of randomised trials. Results A randomised trial can usefully be classified as ‘health equity relevant’ if it assesses the effects of an intervention on the health or its determinants of either individuals or a population who experience ill health due to disadvantage defined across one or more social determinants of health. Health equity-relevant randomised trials can either exclusively focus on a single population or collect data potentially useful for assessing differential effects of the intervention across multiple populations experiencing different levels or types of social disadvantage. Trials that are not classified as ‘health equity relevant’ may nevertheless provide information that is indirectly relevant to assessing equity impact, including information about individual level variation unrelated to social disadvantage and potentially useful in secondary modelling studies. Conclusion The conceptual framework may be used to design and report randomised trials. The framework could also be used for other study designs to contribute to the evidence base for improved health equity. PMID:28951402
District health information system assessment: a case study in iran.
Raeisi, Ahmad Reza; Saghaeiannejad, Sakineh; Karimi, Saeed; Ehteshami, Asghar; Kasaei, Mahtab
2013-03-01
Health care managers and personnel should be aware of and literate in health information systems in order to increase efficiency and effectiveness in their organization, since accurate, appropriate, precise, timely and valid information, and its interpretation, are the basis for policy planning and decision making at various levels of the organization. This study was conducted to assess the evolution of the district health information system in Iran according to the WHO framework. This research is an applied, descriptive cross-sectional study, in which a total of twelve urban and eight rural facilities, and the district health center of the Falavarjan region, were surveyed using a questionnaire with 334 items. Content and construct validity and reliability of the questionnaire were confirmed, with a correlation coefficient of 0.99. Obtained data were analyzed with SPSS 16 software, and descriptive statistics were used to examine measures of WHO compliance. The analysis of data revealed that the mean score of compliance with the district health information system framework was 35.75 percent. The maximum score of compliance belonged to the data collection process (70 percent). The minimum score of compliance belonged to the information-based decision-making process, with a score of 10 percent. District health information system criteria in Isfahan province do not completely comply with the WHO framework. Consequently, it seems that health system managers engaged with the underlying policy and decision-making processes at the district health level should try to restructure and decentralize the district health information system and develop management training programs for their managers.
NASA Astrophysics Data System (ADS)
Elias, E.; Reyes, J. J.; Steele, C. M.; Rango, A.
2017-12-01
Assessing vulnerability of agricultural systems to climate variability and change is vital in securing food systems and sustaining rural livelihoods. Farmers, ranchers, and forest landowners rely on science-based, decision-relevant, and localized information to maintain production, ecological viability, and economic returns. This contribution synthesizes a collection of research on the future of agricultural production in the American Southwest (SW). Research was based on a variety of geospatial methodologies and datasets to assess the vulnerability of rangelands and livestock, field crops, specialty crops, and forests in the SW to climate-risk and change. This collection emerged from the development of regional vulnerability assessments for agricultural climate-risk by the U.S. Department of Agriculture (USDA) Climate Hub Network, established to deliver science-based information and technologies to enable climate-informed decision-making. Authors defined vulnerability differently based on their agricultural system of interest, although each primarily focuses on biophysical systems. We found that an inconsistent framework for vulnerability and climate risk was necessary to adequately capture the diversity, variability, and heterogeneity of SW landscapes, peoples, and agriculture. Through the diversity of research questions and methodologies, this collection of articles provides valuable information on various aspects of SW vulnerability. All articles relied on geographic information systems technology, with highly variable levels of complexity. Agricultural articles used National Agricultural Statistics Service data, either as tabular county level summaries or through the CropScape cropland raster datasets. Most relied on modeled historic and future climate information, but with differing assumptions regarding spatial resolution and temporal framework. We assert that it is essential to evaluate climate risk using a variety of complementary methodologies and perspectives. In addition, we found that spatial analysis supports informed adaptation, within and outside the SW United States. The persistence and adaptive capacity of agriculture in the water-limited Southwest serves as an instructive example and may offer solutions to reduce future climate risk.
Strategic Management of the Information Technology Resource: A Framework for Retention
ERIC Educational Resources Information Center
Sanchez, Cesar O.
2010-01-01
The qualitative phenomenological study focused on the exploration and identification of factors that might trigger the turnover of IT professional employees. Interviews with 20 IT professionals employed by the State of New York supported the collection of the study participants' perceptions and lived turnover experiences. Five themes…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-14
... disclosure requirement. Obligation to Respond: Required to obtain or retain benefits. Statutory authority for...-222 MHz and (220 MHz service). In establishing this licensing plan, FCC's goal is to establish a flexible regulatory framework that allows for efficient licensing of the 220 MHz service, eliminates...
ERIC Educational Resources Information Center
Grabowski, Barbara L.
2011-01-01
After discussing the state of both misaligned and informative online and distance education research, the authors in this special issue (hereafter called the collective) extract evidence-based principles about strategies that work; both topics are addressed in this article. First, their criticisms centered on the value of comparative research. Those…
Evaluation Blueprint for School-Wide Positive Behavior Support
ERIC Educational Resources Information Center
Algozzine, Bob; Horner, Robert H.; Sugai, George; Barrett, Susan; Dickey, Celeste Rossetto; Eber, Lucille; Kincaid, Donald; Lewis, Timothy; Tobin, Tary
2010-01-01
Evaluation is the process of collecting and using information for decision-making. A hallmark of School-wide Positive Behavior Support (SWPBS) is a commitment to formal evaluation. The purpose of this SWPBS Evaluation Blueprint is to provide those involved in developing Evaluation Plans and Evaluation Reports with a framework for (a) addressing…
Student Vulnerability, Agency, and Learning Analytics: An Exploration
ERIC Educational Resources Information Center
Prinsloo, Paul; Slade, Sharon
2016-01-01
In light of increasing concerns about surveillance, higher education institutions (HEIs) cannot afford a simple paternalistic approach to student data. Very few HEIs have regulatory frameworks in place and/or share information with students regarding the scope of data that may be collected, analyzed, used, and shared. It is clear from literature…
Radiation therapy for people with cancer: what do written information materials tell them?
Smith, S K; Yan, B; Milross, C; Dhillon, H M
2016-07-01
This study aimed to compare and contrast the contents of different types of written patient information about radiotherapy, namely (1) hospital radiotherapy departments vs. cancer control organisations and (2) generic vs. tumour-specific materials. A coding framework, informed by existing patients' information needs literature, was developed and applied to 54 radiotherapy information resources. The framework comprised 12 broad themes: cancer diagnosis, general information about radiotherapy, treatment planning, daily treatment, side effects, self-care management, external radiotherapy, internal radiotherapy, impact on daily activities, post-treatment, psychosocial health and other content, such as a glossary. Materials produced by cancer organisations contained significantly more information than hospital resources on diagnosis, general radiotherapy information, internal radiotherapy and psychosocial health. However, hospital materials provided more information about treatment planning, daily treatment and the impact on daily activities. Compared to generic materials, tumour-specific resources were superior in providing information about diagnosis, daily treatment, side effects, post-treatment and psychosocial health. Information about internal radiotherapy, prognosis and chronic side effects was poorly covered by most resources. Collectively, hospital and cancer organisation resources complement each other in meeting patients' information needs. Identifying ways to consolidate different information sources could help comprehensively address patients' medical and psychosocial information needs about radiotherapy. © 2015 John Wiley & Sons Ltd.
A cognitive information processing framework for distributed sensor networks
NASA Astrophysics Data System (ADS)
Wang, Feiyi; Qi, Hairong
2004-09-01
In this paper, we present a cognitive agent framework (CAF) based on swarm intelligence and self-organization principles, and demonstrate it through collaborative processing for target classification in sensor networks. The framework involves integrated designs to provide both cognitive behavior at the organization level to conquer complexity and reactive behavior at the individual agent level to retain simplicity. The design tackles various problems in current information processing systems, including overly complex systems, maintenance difficulties, increasing vulnerability to attack, lack of capability to tolerate faults, and inability to identify and cope with low-frequency patterns. An important and distinguishing point of the presented work from classical AI research is that the acquired intelligence does not pertain to distinct individuals but to groups. It also deviates from multi-agent systems (MAS) due to the sheer quantity of extremely simple agents we are able to accommodate, to the degree that the loss of some coordination messages and the behavior of faulty or compromised agents will not affect the collective decision made by the group.
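The robustness claim at the end of the abstract can be made concrete with a small simulation. The sketch below is not the paper's CAF; it is a minimal illustration, under assumed accuracy, fault and message-loss rates, of how a simple majority vote over many weak agents tolerates dropped messages and faulty members.

```python
# Minimal sketch (not the paper's CAF): a group of simple agents classifies a
# target by majority vote, tolerating dropped messages and faulty agents.
import random

random.seed(1)

def agent_vote(true_label, accuracy=0.8, faulty=False):
    """A single agent's local classification; faulty agents answer at random."""
    if faulty:
        return random.choice([0, 1])
    return true_label if random.random() < accuracy else 1 - true_label

def collective_decision(true_label, n_agents=100, p_faulty=0.1, p_msg_loss=0.2):
    """Fuse whatever votes actually arrive at the fusion point."""
    received = []
    for _ in range(n_agents):
        vote = agent_vote(true_label, faulty=random.random() < p_faulty)
        if random.random() > p_msg_loss:          # message survives
            received.append(vote)
    # Majority vote over the received messages
    return int(sum(received) > len(received) / 2)

trials = 1000
correct = sum(collective_decision(true_label=1) == 1 for _ in range(trials))
print(f"collective accuracy: {correct / trials:.3f}")
```

With these assumed parameters the collective accuracy typically exceeds the 0.8 accuracy of any individual agent, which is the point of aggregating many simple, partially unreliable decisions.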
The role of data fusion in predictive maintenance using digital twin
NASA Astrophysics Data System (ADS)
Liu, Zheng; Meyendorf, Norbert; Mrad, Nezih
2018-04-01
Modern aerospace industry is migrating from reactive to proactive and predictive maintenance to increase platform operational availability and efficiency, extend its useful life cycle and reduce its life cycle cost. Multiphysics modeling together with data-driven analytics generates a new paradigm called the "Digital Twin." The digital twin is a living model of the physical asset or system, which continually adapts to operational changes based on the collected online data and information, and can forecast the future of the corresponding physical counterpart. This paper reviews the overall framework to develop a digital twin coupled with the industrial Internet of Things technology to advance the autonomy of aerospace platforms. Data fusion techniques play a particularly significant role in the digital twin framework. The flow of information from raw data to high-level decision making is propelled by sensor-to-sensor, sensor-to-model, and model-to-model fusion. This paper further discusses and identifies the role of data fusion in the digital twin framework for aircraft predictive maintenance.
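As a concrete illustration of the fusion levels mentioned, the following minimal sketch applies inverse-variance weighting, one common fusion rule that the review itself does not prescribe, first to two sensor readings and then to the fused reading plus a model prediction; all values and uncertainties are invented for illustration.

```python
# Minimal sketch of one common fusion step (inverse-variance weighting), used
# here to illustrate sensor-to-sensor and sensor-to-model fusion.
import numpy as np

def fuse(estimates, variances):
    """Combine independent estimates of the same quantity, weighting each by
    the inverse of its variance; returns the fused estimate and its variance."""
    w = 1.0 / np.asarray(variances, dtype=float)
    fused = np.sum(w * np.asarray(estimates, dtype=float)) / np.sum(w)
    return fused, 1.0 / np.sum(w)

# Sensor-to-sensor fusion: two strain gauges reading the same structural load
s_est, s_var = fuse([10.2, 9.7], [0.40, 0.25])

# Sensor-to-model fusion: fused sensor value combined with a physics-model
# prediction of the same quantity (model uncertainty treated like a variance)
dt_est, dt_var = fuse([s_est, 9.9], [s_var, 0.10])

print(f"sensor fusion: {s_est:.2f} (var {s_var:.3f})")
print(f"digital-twin estimate after model fusion: {dt_est:.2f} (var {dt_var:.3f})")
```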
Jawad, Mohammed; Darzi, Andrea; Lotfi, Tamara; Nakkash, Rima; Hawkins, Ben; Akl, Elie A
2017-08-01
We assessed compliance of waterpipe product packaging and labelling with the Framework Convention on Tobacco Control's Article 11. We evaluated samples collected at a trade fair against ten domains: health warning location, size, use of pictorials, use of colour, and packaging information on constituents and emissions. We also evaluated waterpipe accessories (e.g., charcoal) for misleading claims. Ten of 15 tobacco products had health warnings on their principal display areas, covering a median of 22.4 per cent (interquartile range 19.4-27.4 per cent) of those areas. Three had pictorial, in-colour health warnings. We judged all packaging information on constituents and emissions to be misleading. Eight of 13 charcoal products displayed environmentally friendly descriptors and/or claims of reduced harm that we judged to be misleading. Increased compliance with waterpipe tobacco regulation is warranted. An improved policy framework for waterpipe tobacco should also consider regulation of accessories such as charcoal products.
Six methodological steps to build medical data warehouses for research.
Szirbik, N B; Pelletier, C; Chaussalet, T
2006-09-01
We propose a simple methodology for heterogeneous data collection and central repository-style database design in healthcare. Our method can be used with or without other software development frameworks, and we argue that its application can save a considerable amount of implementation effort. We also believe that the method can be used in other fields of research, especially those that have a strong interdisciplinary nature. The idea emerged during a healthcare research project, which involved, among other tasks, grouping information from heterogeneous and distributed information sources. We developed this methodology from the lessons learned while building a data repository containing information about the flows of elderly patients in the UK's long-term care (LTC) system. We explain in detail the aspects that shaped the methodology. The methodology is defined by six steps, which can be aligned with various iterative development frameworks, and we describe here its alignment with the RUP (rational unified process) framework. The methodology emphasizes current trends such as early identification of critical requirements, data modelling, close and timely interaction with users and stakeholders, ontology building, quality management, and exception handling. Of special interest is the ontological engineering aspect, which had the highest-impact effects after the project: it helped stakeholders to perform better collaborative negotiations that brought better solutions for the overall system investigated. An insight into the problems faced by others helps to lead the negotiators to win-win situations. We consider that this should be the social result of any project that collects data for better decision making that leads finally to enhanced global outcomes.
Sensor-based architecture for medical imaging workflow analysis.
Silva, Luís A Bastião; Campos, Samuel; Costa, Carlos; Oliveira, José Luis
2014-08-01
The growing use of computer systems in medical institutions has been generating a tremendous quantity of data. While these data have a critical role in assisting physicians in clinical practice, the information that can be extracted goes far beyond this utilization. This article proposes a platform capable of assembling multiple data sources within a medical imaging laboratory, through a network of intelligent sensors. The proposed integration framework follows a hybrid SOA architecture based on an information sensor network, capable of collecting information from several sources in medical imaging laboratories. Currently, the system supports three types of sensors: DICOM repository meta-data, network workflows and examination reports. Each sensor is responsible for converting unstructured information from data sources into a common format that will then be semantically indexed in the framework engine. The platform was deployed in the Cardiology department of a central hospital, allowing the identification of process characteristics and user behaviours that were unknown before the solution was introduced.
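The normalization step that each sensor performs can be pictured with a small sketch. The record structure, field names and sensor functions below are hypothetical stand-ins, not the platform's actual schema; they only illustrate the idea of mapping heterogeneous sources onto one indexable format.

```python
# Minimal sketch (names hypothetical): each "sensor" converts source-specific
# records into one common structure that a framework engine could then index.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CommonRecord:
    source: str        # e.g. "dicom-metadata", "network-workflow", "report"
    timestamp: datetime
    patient_id: str
    attributes: dict   # free-form key/value payload for semantic indexing

def dicom_sensor(meta: dict) -> CommonRecord:
    """Normalize a DICOM metadata entry (illustrative tag names)."""
    return CommonRecord("dicom-metadata",
                        datetime.fromisoformat(meta["StudyDate"]),
                        meta["PatientID"],
                        {"Modality": meta.get("Modality", "")})

def report_sensor(report: dict) -> CommonRecord:
    """Normalize an examination report entry."""
    return CommonRecord("report",
                        datetime.fromisoformat(report["signed_at"]),
                        report["patient"],
                        {"text": report["body"]})

records = [
    dicom_sensor({"StudyDate": "2014-03-01", "PatientID": "P001", "Modality": "US"}),
    report_sensor({"signed_at": "2014-03-02", "patient": "P001", "body": "Normal study."}),
]
for r in records:
    print(r.source, r.patient_id, r.timestamp.date())
```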
2007-04-30
Camm, RAND Corporation. 4th Annual Acquisition Research Symposium of the Naval Postgraduate School: Acquisition Research: Creating Synergy for Informed Change, May 16-17, 2007. Approved for public release, distribution unlimited. Prepared for: Naval Postgraduate School, Monterey, California 93943.
Expert decision-making strategies
NASA Technical Reports Server (NTRS)
Mosier, Kathleen L.
1991-01-01
A recognition-primed decision (RPD) model is employed as a framework to investigate crew decision-making processes. The quality of information transfer, a critical component of the team RPD model and an indicator of the team's 'collective consciousness', is measured and analyzed with respect to crew performance. As indicated by the RPD model, the timing and patterns of information search and transfer were expected to reflect extensive and continual situation assessment, and serial evaluation of alternative states of the world or decision response options.
Finding the Words: Medical Students' Reflections on Communication Challenges in Clinic.
Braverman, Genna; Bereknyei Merrell, Sylvia; Bruce, Janine S; Makoul, Gregory; Schillinger, Erika
2016-11-01
Interpersonal communication is essential to providing excellent patient care and requires ongoing development. Although aspects of medical student interpersonal communication may degrade throughout career progression, it is unknown what specific elements pose challenges. We aimed to characterize clerkship students' perspectives on communication challenges in the outpatient setting to help inform curricular development. Third-year medical students in a required family medicine clerkship were asked to describe a communication challenge they encountered. Open-ended written responses were collected through a mandatory post-clerkship survey. Responses were qualitatively coded using an a priori framework for teaching and assessing communication skills (The SEGUE Framework for Teaching and Assessing Communication Skills) with data-derived additions to the framework, followed by a team-based thematic analysis. We collected 799 reflections written by 518 students from 2007-2014. Three dominant themes emerged from the analysis: challenges with (1) effectively exchanging information with patients, (2) managing emotional aspects of the patient encounter, and (3) negotiating terms of the encounter. Communication curricula focus on content and process of the medical interview, but insufficient time and energy are devoted to psychosocial factors, including aspects of the encounter that are emotionally charged or conflicting. While gaps in students' communication skillsets may be anticipated or observed by educators, this study offers an analysis of students' own perceptions of the challenges they face.
Dafalla, Tarig Dafalla Mohamed; Kushniruk, Andre W; Borycki, Elizabeth M
2015-01-01
A pragmatic evaluation framework for evaluating the usability and usefulness of an e-learning intervention for a patient clinical information scheduling system is presented in this paper. The framework was conceptualized based on two different but related concepts (usability and usefulness) and the selection of appropriate and valid methods of data collection and analysis, which included: (1) Low-Cost Rapid Usability Engineering (LCRUE), (2) Cognitive Task Analysis (CTA), (3) Heuristic Evaluation (HE) criteria for web-based learning, and (4) the Software Usability Measurement Inventory (SUMI). The results of the analysis showed that some areas of usability related to General Interface Usability (GIU), instructional design and content were problematic, some of which might account for the poorly rated aspects of usability when measured subjectively. This paper shows that using a pragmatic framework can be a useful way not only to measure usability and usefulness, but also to provide practical, objective evidence for learning and the continuous quality improvement of e-learning systems. The findings should be of interest to educators, developers, designers, researchers, and usability practitioners involved in the development of e-learning systems in healthcare. This framework could be an appropriate method for assessing the usability, usefulness and safety of health information systems both in the laboratory and in the clinical context.
NASA Astrophysics Data System (ADS)
El-Gafy, Mohamed Anwar
Transportation projects have an impact on the environment. The general environmental pollution and damage caused by roads is closely associated with the level of economic activity. Although Environmental Impact Assessments (EIAs) depend on geo-spatial information in order to make an assessment, there are no rules per se on how to conduct an environmental assessment, and the particular objective of each assessment is dictated case by case, based on what information and analyses are required. The conventional approach to an Environmental Impact Assessment (EIA) study is time consuming because a large number of dependent and independent variables, each with different consequences, have to be taken into account. With the emergence of satellite remote sensing technology and Geographic Information Systems (GIS), this research presents a new framework for the analysis phase of the Environmental Impact Assessment (EIA) for transportation projects based on the integration of remote sensing technology, geographic information systems, and spatial modeling. By integrating the merits of the map overlay method and the matrix method, the framework comprehensively analyzes the environmental vulnerability around the road and its impact on the environment. This framework is expected to: (1) improve the quality of the decision-making process, (2) be applicable both to urban and inter-urban projects, regardless of transport mode, and (3) present the data and make the appropriate analysis to support decision-makers and allow them to present these data at public hearings in a simple manner. Case studies of transportation projects in the State of Florida were analyzed to illustrate the use of the decision support framework and demonstrate its capabilities. This cohesive and integrated system will facilitate rational decisions through cost-effective coordination of environmental information and data management that can be tailored to specific projects. The framework would facilitate collecting, organizing, analyzing, archiving, and coordinating the information and data necessary to support technical and policy transportation decisions.
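The map-overlay element of the framework lends itself to a small numerical sketch. The layers, weights and class breaks below are illustrative assumptions rather than anything taken from the study; the sketch only shows how weighted, normalized criterion rasters combine into a vulnerability surface that a matrix-style summary can then tabulate.

```python
# Minimal sketch of the map-overlay idea: several normalized raster criteria
# are combined with weights into a vulnerability surface along a corridor.
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)                       # toy raster grid

# Normalized criterion layers in [0, 1] (e.g., from remote sensing / GIS)
land_cover_sensitivity = rng.random(shape)
proximity_to_wetlands = rng.random(shape)
slope_erosion_risk = rng.random(shape)

weights = {"land_cover": 0.5, "wetlands": 0.3, "slope": 0.2}

vulnerability = (weights["land_cover"] * land_cover_sensitivity
                 + weights["wetlands"] * proximity_to_wetlands
                 + weights["slope"] * slope_erosion_risk)

# A simple matrix-style summary: share of corridor cells in each class
classes = np.digitize(vulnerability, bins=[0.33, 0.66])   # low / medium / high
for label, k in zip(["low", "medium", "high"], range(3)):
    print(f"{label:>6}: {np.mean(classes == k):.1%} of cells")
```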
The PO.DAAC Portal and its use of the Drupal Framework
NASA Astrophysics Data System (ADS)
Alarcon, C.; Huang, T.; Bingham, A.; Cosic, S.
2011-12-01
The Physical Oceanography Distributed Active Archive Center portal (http://podaac.jpl.nasa.gov) is the primary interface for discovering and accessing oceanographic datasets collected from the vantage point of space. In addition, it provides information about NASA's satellite missions and operational activities at the data center. Recently the portal underwent a major redesign and deployment utilizing the Drupal framework. The Drupal framework was chosen as the platform for the portal due to its flexibility, open source community, and modular infrastructure. The portal features efficient content addition and management, mailing lists, forums, role based access control, and a faceted dataset browse capability. The dataset browsing was built as a custom Drupal module and integrates with a SOLR search engine.
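For readers unfamiliar with how a faceted browse is typically backed by Solr, the sketch below issues a standard faceted query over HTTP. The core URL and facet field names are placeholders, not PO.DAAC's actual schema; only the Solr request and response conventions shown here are standard.

```python
# Minimal sketch of a faceted dataset query against a Solr search engine.
# The endpoint and field names are hypothetical placeholders.
import requests

SOLR_SELECT = "http://localhost:8983/solr/datasets/select"   # hypothetical core

params = {
    "q": "sea surface temperature",
    "wt": "json",
    "rows": 10,
    "facet": "true",
    "facet.field": ["processing_level", "spatial_resolution"],  # hypothetical fields
}

resp = requests.get(SOLR_SELECT, params=params, timeout=10)
resp.raise_for_status()
data = resp.json()

print("hits:", data["response"]["numFound"])
for field, counts in data["facet_counts"]["facet_fields"].items():
    # Solr returns facets as a flat [value, count, value, count, ...] list
    pairs = list(zip(counts[::2], counts[1::2]))
    print(field, pairs[:5])
```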
A vision framework for the localization of soccer players and ball on the pitch using Handycams
NASA Astrophysics Data System (ADS)
Vilas, Tiago; Rodrigues, J. M. F.; Cardoso, P. J. S.; Silva, Bruno
2015-03-01
The current performance requirements in soccer make imperative the use of new technologies for game observation and analysis, such that detailed information about the teams' actions is provided. This paper summarizes a framework to collect the soccer players' and ball positions using one or more Full HD Handycams, placed no more than 20cm apart in the stands, as well as how this framework connects to the FootData project. The system is based on four main modules: detection and delimitation of the soccer pitch; detection of the ball and players and their assignment to teams; tracking of the players and ball; and computation of their localization (in meters) on the pitch.
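The final module, computing positions in meters, is usually done by mapping image coordinates onto the pitch plane with a homography. The abstract does not give the authors' calibration details, so the sketch below shows the standard OpenCV route with made-up landmark correspondences and detections.

```python
# Minimal sketch of image-to-pitch localization with a planar homography.
# Landmark correspondences and detections are illustrative values only.
import numpy as np
import cv2

# Four (or more) pitch landmarks: image pixels -> pitch position in meters
image_pts = np.array([[102, 540], [1820, 560], [1450, 180], [390, 170]], dtype=np.float32)
pitch_pts = np.array([[0, 0], [105, 0], [105, 68], [0, 68]], dtype=np.float32)

H, _ = cv2.findHomography(image_pts, pitch_pts)

# Detected player/ball centroids in the image (illustrative values)
detections = np.array([[960, 400], [700, 350]], dtype=np.float32).reshape(-1, 1, 2)
positions_m = cv2.perspectiveTransform(detections, H).reshape(-1, 2)

for (x, y) in positions_m:
    print(f"pitch position: ({x:.1f} m, {y:.1f} m)")
```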
TECHNOLOGY ASSESSMENT IN HOSPITALS: LESSONS LEARNED FROM AN EMPIRICAL EXPERIMENT.
Foglia, Emanuela; Lettieri, Emanuele; Ferrario, Lucrezia; Porazzi, Emanuele; Garagiola, Elisabetta; Pagani, Roberta; Bonfanti, Marzia; Lazzarotti, Valentina; Manzini, Raffaella; Masella, Cristina; Croce, Davide
2017-01-01
Hospital Based Health Technology Assessment (HBHTA) practices, to inform decision making at the hospital level, have emerged as an urgent priority for policy makers, hospital managers, and professionals. The present study consolidates the results achieved in testing an original framework for HBHTA developed within the Lombardy Region: the IMPlementation of A Quick hospital-based HTA (IMPAQHTA) framework. The study tested (i) the efficiency of the HBHTA framework, (ii) its feasibility, and (iii) the utility and completeness of the tool, considering its dimensions and sub-dimensions. The IMPAQHTA framework deployed the Regional HTA program, activated in 2008 in Lombardy, at the hospital level. The relevance and feasibility of the framework were tested over a 3-year period through a large-scale empirical experiment involving seventy-four healthcare professionals organized in different HBHTA teams, assessing thirty-two different technologies within twenty-two different hospitals. Semi-structured interviews and self-reported questionnaires were used to collect data regarding the relevance and feasibility of the IMPAQHTA framework. The proposed HBHTA framework proved to be suitable for application at the hospital level in the Italian context, permitting a quick assessment (11 working days) and providing hospital decision makers with relevant and quantitative information. Performance in terms of feasibility, utility, completeness, and ease of use proved to be satisfactory. The IMPAQHTA was considered to be a complete and feasible HBHTA framework, as well as being replicable across different technologies and hospital settings, thus demonstrating the capability of a hospital to develop a complete HTA if supported by adequate, well-defined tools and quantitative metrics.
A sustainable dietetics bridging program: development and implementation in Atlantic Canada.
Lordly, Daphne; Guy, Jennifer; Barry, Paula; Garus, Jennifer
2014-01-01
A provincial focus on immigration and improved foreign credential recognition has led to an investigation of best practices and subsequent recommendations for the development and implementation of a sustainable university-based bridging program for internationally educated dietitians in Atlantic Canada. Data were collected from various sources and used to inform program decisions and direction. An advisory framework was established through a core group representing dietetics education and regulation and internationalization. Subsequently, a key stakeholder group was formed. As a result of this collaboration and research, a dietetics bridging framework was developed and a program pilot tested. Lessons learned may inform similar endeavours and highlight the importance of collaborative leadership and collaboration among multiple stakeholders, and of creatively addressing program sustainability issues while keeping learners (internationally educated dietitians) at the centre.
NASA Astrophysics Data System (ADS)
Nicosia, Vincenzo; Skardal, Per Sebastian; Arenas, Alex; Latora, Vito
2017-03-01
We introduce a framework to intertwine dynamical processes of different nature, each with its own distinct network topology, using a multilayer network approach. As an example of collective phenomena emerging from the interactions of multiple dynamical processes, we study a model where neural dynamics and nutrient transport are bidirectionally coupled in such a way that the allocation of the transport process at one layer depends on the degree of synchronization at the other layer, and vice versa. We show numerically, and we prove analytically, that the multilayer coupling induces a spontaneous explosive synchronization and a heterogeneous distribution of allocations, otherwise not present in the two systems considered separately. Our framework can find application to other cases where two or more dynamical processes such as synchronization, opinion formation, information diffusion, or disease spreading, are interacting with each other.
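The coupled model can be illustrated numerically. The sketch below is not the paper's exact system of equations; it is a minimal stand-in in which a Kuramoto layer's node-level coupling is scaled by a conserved resource that, on a second network, drifts toward locally synchronized nodes. Topologies, rates, the coupling normalization and the exchange rule are all illustrative assumptions.

```python
# Minimal numerical sketch (not the paper's exact equations) of two coupled
# layers: Kuramoto phases whose coupling is scaled by a resource, and a
# transport layer that moves the resource toward locally synchronized nodes.
import numpy as np

rng = np.random.default_rng(0)
N = 50

def random_graph(p):                        # symmetric 0/1 adjacency
    A = np.triu((rng.random((N, N)) < p).astype(float), 1)
    return A + A.T

A1, A2 = random_graph(0.1), random_graph(0.1)   # two distinct topologies
omega = rng.normal(0.0, 1.0, N)                 # natural frequencies
theta = rng.uniform(0.0, 2 * np.pi, N)          # phases
r = np.full(N, 1.0 / N)                         # conserved resource
lam, alpha, dt = 4.0, 0.5, 0.01

def local_sync(theta, A):
    """Modulus of the local order parameter at each node."""
    z = (A * np.exp(1j * theta)[None, :]).sum(axis=1)
    return np.abs(z) / np.maximum(A.sum(axis=1), 1.0)

for _ in range(20000):
    # Layer 1: Kuramoto dynamics, coupling strength proportional to resource
    coupling = (A1 * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta += dt * (omega + lam * N * r * coupling)
    # Layer 2: resource exchange biased toward locally synchronized nodes
    s = local_sync(theta, A1)
    exchange = (A2 * (r[None, :] * s[:, None] - r[:, None] * s[None, :])).sum(axis=1)
    r += dt * alpha * exchange              # total resource is conserved

R = np.abs(np.mean(np.exp(1j * theta)))     # global synchronization
print(f"global order parameter R = {R:.2f}, resource spread = {r.std():.4f}")
```

Setting alpha to zero removes the cross-layer feedback, which makes it easy to compare the synchronization level and the spread of resource allocations with and without the coupling.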
Mnemonic convergence in social networks: The emergent properties of cognition at a collective level
Coman, Alin; Momennejad, Ida; Drach, Rae D.; Geana, Andra
2016-01-01
The development of shared memories, beliefs, and norms is a fundamental characteristic of human communities. These emergent outcomes are thought to occur owing to a dynamic system of information sharing and memory updating, which fundamentally depends on communication. Here we report results on the formation of collective memories in laboratory-created communities. We manipulated conversational network structure in a series of real-time, computer-mediated interactions in fourteen 10-member communities. The results show that mnemonic convergence, measured as the degree of overlap among community members’ memories, is influenced by both individual-level information-processing phenomena and by the conversational social network structure created during conversational recall. By studying laboratory-created social networks, we show how large-scale social phenomena (i.e., collective memory) can emerge out of microlevel local dynamics (i.e., mnemonic reinforcement and suppression effects). The social-interactionist approach proposed herein points to optimal strategies for spreading information in social networks and provides a framework for measuring and forging collective memories in communities of individuals. PMID:27357678
Regional Land Use Mapping: the Phoenix Pilot Project
NASA Technical Reports Server (NTRS)
Anderson, J. R.; Place, J. L.
1971-01-01
The Phoenix Pilot Program has been designed to make effective use of past experience in making land use maps and collecting land use information. Conclusions reached from the project are: (1) Land use maps and accompanying statistical information of reasonable accuracy and quality can be compiled at a scale of 1:250,000 from orbital imagery. (2) Orbital imagery used in conjunction with other sources of information when available can significantly enhance the collection and analysis of land use information. (3) Orbital imagery combined with modern computer technology will help resolve the problem of obtaining land use data quickly and on a regular basis, which will greatly enhance the usefulness of such data in regional planning, land management, and other applied programs. (4) Agreement on a framework or scheme of land use classification for use with orbital imagery will be necessary for effective use of land use data.
Exploiting salient semantic analysis for information retrieval
NASA Astrophysics Data System (ADS)
Luo, Jing; Meng, Bo; Quan, Changqin; Tu, Xinhui
2016-11-01
Recently, many Wikipedia-based methods have been proposed to improve the performance of different natural language processing (NLP) tasks, such as semantic relatedness computation, text classification and information retrieval. Among these methods, salient semantic analysis (SSA) has been proven to be an effective way to generate conceptual representations for words or documents. However, its feasibility and effectiveness in information retrieval are largely unknown. In this paper, we study how to use SSA efficiently to improve information retrieval performance, and propose an SSA-based retrieval method under the language model framework. First, the SSA model is adopted to build conceptual representations for documents and queries. Then, these conceptual representations and the bag-of-words (BOW) representations can be used in combination to estimate the language models of queries and documents. The proposed method is evaluated on several standard Text REtrieval Conference (TREC) collections, and the experimental results show that the proposed models consistently outperform existing Wikipedia-based retrieval methods.
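The combination step can be pictured with a toy example. The sketch below interpolates a word-based document language model with a concept-based one; the hand-written word-to-concept table is only a stand-in for what SSA would induce from Wikipedia, and the smoothing constants and interpolation weight are arbitrary choices, not the paper's.

```python
# Toy sketch of combining a bag-of-words language model with a "concept"
# language model; the concept mapping is a hand-made stand-in for SSA output.
import math
from collections import Counter

docs = {
    "d1": "the jaguar is a large cat native to the americas".split(),
    "d2": "the jaguar car factory reported record sales".split(),
}
# Hypothetical word -> salient-concept mapping (SSA would learn this)
concepts = {"jaguar": "Jaguar", "cat": "Felidae", "americas": "Americas",
            "car": "Automobile", "factory": "Automobile", "sales": "Commerce"}

def lm(tokens):
    """Additively smoothed unigram model (assumed vocabulary size 1000)."""
    c = Counter(tokens)
    total = sum(c.values())
    return lambda t: (c[t] + 0.5) / (total + 0.5 * 1000)

def score(query, doc_tokens, beta=0.6):
    word_lm = lm(doc_tokens)
    concept_lm = lm([concepts.get(t, t) for t in doc_tokens])
    s = 0.0
    for t in query.split():
        p = beta * word_lm(t) + (1 - beta) * concept_lm(concepts.get(t, t))
        s += math.log(p)
    return s

for d, toks in docs.items():
    print(d, round(score("wild cat jaguar", toks), 3))
```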
Mirzoev, Tolib N; Green, Andrew; Van Kalliecharan, Ricky
2015-01-01
An adequate capacity of ministries of health (MOH) to develop and implement policies is essential. However, no frameworks were found for assessing MOH capacity to conduct health policy processes within developing countries. This paper presents a conceptual framework for assessing MOH capacity to conduct policy processes, based on a study from Tajikistan, a former Soviet republic where independence highlighted capacity challenges. The data collection for this qualitative study included in-depth interviews, document reviews and observations of policy events. The framework approach was used for analysis. The conceptual framework was informed by the existing literature, guided the data collection and analysis, and was subsequently refined following insights from the study. The Tajik MOH capacity, while gradually improving, remains weak. There is poor recognition of wider contextual influences, ineffective leadership and governance as reflected in centralised decision-making, limited use of evidence, inadequate participation of actors and ineffective use of resources to conduct policy processes. However, the question is whether this is a reflection of a lack of MOH ability, evidence of a constraining environment, or both. The conceptual framework identifies five determinants of robust policy processes, each with specific capacity needs: policy context, MOH leadership and governance, involvement of policy actors, the role of evidence and effective resource use for policy processes. Three underlying considerations are important for applying the capacity to policy processes: the need for clear focus, recognition of capacity levels and elements, and both ability and an enabling environment. The proposed framework can be used in assessing and strengthening the capacity of different policy actors. Copyright © 2013 John Wiley & Sons, Ltd.
Mehmood, Irfan; Sajjad, Muhammad; Baik, Sung Wook
2014-01-01
Wireless capsule endoscopy (WCE) has great advantages over traditional endoscopy because it is portable and easy to use, especially in remote monitoring health-services. However, during the WCE process, the large amount of captured video data demands a significant amount of computation to analyze and retrieve informative video frames. In order to facilitate efficient WCE data collection and browsing tasks, we present a resource- and bandwidth-aware WCE video summarization framework that extracts the representative keyframes of the WCE video contents by removing redundant and non-informative frames. For redundancy elimination, we use Jeffrey-divergence between color histograms and inter-frame Boolean series-based correlation of color channels. To remove non-informative frames, multi-fractal texture features are extracted to assist the classification using an ensemble-based classifier. Owing to the limited WCE resources, it is impossible for the WCE system to perform computationally intensive video summarization tasks. To resolve computational challenges, mobile-cloud architecture is incorporated, which provides resizable computing capacities by adaptively offloading video summarization tasks between the client and the cloud server. The qualitative and quantitative results are encouraging and show that the proposed framework saves information transmission cost and bandwidth, as well as the valuable time of data analysts in browsing remote sensing data. PMID:25225874
Mehmood, Irfan; Sajjad, Muhammad; Baik, Sung Wook
2014-09-15
Wireless capsule endoscopy (WCE) has great advantages over traditional endoscopy because it is portable and easy to use, especially in remote monitoring health-services. However, during the WCE process, the large amount of captured video data demands a significant amount of computation to analyze and retrieve informative video frames. In order to facilitate efficient WCE data collection and browsing tasks, we present a resource- and bandwidth-aware WCE video summarization framework that extracts the representative keyframes of the WCE video contents by removing redundant and non-informative frames. For redundancy elimination, we use Jeffrey-divergence between color histograms and inter-frame Boolean series-based correlation of color channels. To remove non-informative frames, multi-fractal texture features are extracted to assist the classification using an ensemble-based classifier. Owing to the limited WCE resources, it is impossible for the WCE system to perform computationally intensive video summarization tasks. To resolve computational challenges, mobile-cloud architecture is incorporated, which provides resizable computing capacities by adaptively offloading video summarization tasks between the client and the cloud server. The qualitative and quantitative results are encouraging and show that the proposed framework saves information transmission cost and bandwidth, as well as the valuable time of data analysts in browsing remote sensing data.
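The color-histogram redundancy check described in the two records above can be sketched in a few lines. One common formulation of the Jeffrey divergence, the symmetrised Kullback-Leibler divergence, is used below; the bin count, pseudo-counts, synthetic frames and drop threshold are all illustrative assumptions rather than the authors' settings.

```python
# Minimal sketch of redundancy elimination: consecutive frames are compared
# via the Jeffrey divergence of their color histograms; near-duplicates fall
# below a threshold and can be dropped. All settings are illustrative.
import numpy as np

def color_histogram(frame, bins=8):
    """Normalized joint RGB histogram of an HxWx3 uint8 frame."""
    hist, _ = np.histogramdd(frame.reshape(-1, 3), bins=(bins,) * 3,
                             range=((0, 256),) * 3)
    hist = hist.ravel() + 0.5              # pseudo-counts avoid empty bins
    return hist / hist.sum()

def jeffrey_divergence(p, q):
    """Symmetrised Kullback-Leibler divergence (one common formulation)."""
    return float(np.sum((p - q) * np.log(p / q)))

rng = np.random.default_rng(0)
frame_a = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
frame_b = np.clip(frame_a.astype(int) + rng.integers(-2, 3, frame_a.shape),
                  0, 255).astype(np.uint8)          # nearly identical frame
frame_c = np.zeros_like(frame_a); frame_c[..., 0] = 200   # very different frame

for name, f in [("near-duplicate", frame_b), ("different", frame_c)]:
    d = jeffrey_divergence(color_histogram(frame_a), color_histogram(f))
    print(f"{name}: divergence = {d:.3f} -> {'drop' if d < 0.5 else 'keep'}")
```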
Low-Beer, Daniel; Bergeri, Isabel; Hess, Sarah; Garcia-Calleja, Jesus Maria; Hayashi, Chika; Mozalevskis, Antons; Rinder Stengaard, Annemarie; Sabin, Keith; Harmanci, Hande; Bulterys, Marc
2017-01-01
Evidence documenting the global burden of disease from viral hepatitis was essential for the World Health Assembly to endorse the first Global Health Sector Strategy (GHSS) on viral hepatitis in May 2016. The GHSS on viral hepatitis proposes to eliminate viral hepatitis as a public health threat by 2030. The GHSS on viral hepatitis is in line with targets for HIV infection and tuberculosis as part of the Sustainable Development Goals. As coordination between hepatitis and HIV programs aims to optimize the use of resources, guidance is also needed to align the strategic information components of the two programs. The World Health Organization monitoring and evaluation framework for viral hepatitis B and C follows an approach similar to that of HIV, including components on the following: (1) context (prevalence of infection), (2) input, (3) output and outcome, including the cascade of prevention and treatment, and (4) impact (incidence and mortality). Data systems that are needed to inform this framework include (1) surveillance for acute hepatitis, chronic infections, and sequelae and (2) program data documenting prevention and treatment, which for the latter includes a database of patients. Overall, the commonalities between HIV and hepatitis at the strategic, policy, technical, and implementation levels justify coordination, strategic linkage, or integration, depending on the type of HIV and viral hepatitis epidemics. Strategic information is a critical area of this alignment, under the principle that what gets measured gets done. It is facilitated because the monitoring and evaluation frameworks for HIV and viral hepatitis were constructed using a similar approach. However, for areas where elimination of viral hepatitis requires data that cannot be collected through the HIV program, collaborations are needed with immunization, communicable disease control, tuberculosis, and hepatology centers to ensure collection of information for the remaining indicators. PMID:29246882
A comprehensive health service evaluation and monitoring framework.
Reeve, Carole; Humphreys, John; Wakerman, John
2015-12-01
To develop a framework for evaluating and monitoring a primary health care service, integrating hospital and community services. A targeted literature review of primary health service evaluation frameworks was performed to inform the development of the framework specifically for remote communities. Key principles underlying primary health care evaluation were determined and sentinel indicators developed to operationalise the evaluation framework. This framework was then validated with key stakeholders. The framework includes Donabedian's three seminal domains of structure, process and outcomes to determine health service performance. These in turn are dependent on sustainability, quality of patient care and the determinants of health to provide a comprehensive health service evaluation framework. The principles underpinning primary health service evaluation were pertinent to health services in remote contexts. Sentinel indicators were developed to fit the demographic characteristics and health needs of the population. Consultation with key stakeholders confirmed that the evaluation framework was applicable. Data collected routinely by health services can be used to operationalise the proposed health service evaluation framework. Use of an evaluation framework which links policy and health service performance to health outcomes will assist health services to improve performance as part of a continuous quality improvement cycle. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.
Creating the sustainable conditions for knowledge information sharing in virtual community.
Wang, Jiangtao; Yang, Jianmei; Chen, Quan; Tsai, Sang-Bing
2016-01-01
Encyclopedias are not a new platform for the distribution of knowledge, but they have recently drawn a great deal of attention in their online iteration. Peer production in particular has emerged as a new mode of providing information with value and offering competitive advantage in information production. Large numbers of volunteers actively share their knowledge by continuously editing articles in Baidu encyclopedias. Most articles in these online communities are the cumulative and integrated products of the contributions of many coauthors. Email-based surveys and objective data mining were used to collect analytical data. Critical mass theory is used to analyze the characteristics of these collective actions and to explain the emergence and sustainability of these actions in the Baidu Encyclopedia communities. The results show that, within the collective action framework, the contributor group satisfied the two key characteristics that ensure the collective action of knowledge contribution will both take place and become self-sustaining. This analysis not only facilitates the identification of collective actions related to individuals sharing knowledge in virtual communities, but also provides insight for the management and development of other, similar virtual communities.
Integrated and implicit: how residents learn CanMEDS roles by participating in practice.
Renting, Nienke; Raat, A N Janet; Dornan, Tim; Wenger-Trayner, Etienne; van der Wal, Martha A; Borleffs, Jan C C; Gans, Rijk O B; Jaarsma, A Debbie C
2017-09-01
Learning outcomes for residency training are defined in competency frameworks such as the CanMEDS framework, which ultimately aim to better prepare residents for their future tasks. Although residents' training relies heavily on learning through participation in the workplace under the supervision of a specialist, it remains unclear how the CanMEDS framework informs practice-based learning and daily interactions between residents and supervisors. This study aimed to explore how the CanMEDS framework informs residents' practice-based training and interactions with supervisors. Constructivist grounded theory guided iterative data collection and analyses. Data were collected by direct observations of residents and supervisors, combined with formal and field interviews. We progressively arrived at an explanatory theory by coding and interpreting the data, building provisional theories and through continuous conversations. Data analysis drew on sensitising insights from communities of practice theory, which provided this study with a social learning perspective. CanMEDS roles occurred in an integrated fashion and usually remained implicit during interactions. The language of CanMEDS was not adopted in clinical practice, which seemed to impede explicit learning interactions. The CanMEDS framework seemed only one of many factors of influence in practice-based training: patient records and other documents were highly influential in daily activities and did not always correspond with CanMEDS roles. Additionally, the position of residents seemed too peripheral to allow them to learn certain aspects of the Health Advocate and Leader roles. The CanMEDS framework did not really guide supervisors' and residents' practice or interactions. It was not explicitly used as a common language in which to talk about resident performance and roles. Therefore, the extent to which CanMEDS actually helps improve residents' learning trajectories and conversations between residents and supervisors about residents' progress remains questionable. This study highlights the fact that the reification of competency frameworks into the complexity of practice-based learning is not a straightforward exercise. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
Harper, Stacey L; Hutchison, James E; Baker, Nathan; Ostraat, Michele; Tinkle, Sally; Steevens, Jeffrey; Hoover, Mark D; Adamick, Jessica; Rajan, Krishna; Gaheen, Sharon; Cohen, Yoram; Nel, Andre; Cachau, Raul E; Tuominen, Mark
2013-01-01
The quantity of information on nanomaterial properties and behavior continues to grow rapidly. Without a concerted effort to collect, organize and mine disparate information coming out of current research efforts, the value and effective use of this information will be limited at best. Data will not be translated to knowledge. At worst, erroneous conclusions will be drawn and future research may be misdirected. Nanoinformatics can be a powerful approach to enhance the value of global information in nanoscience and nanotechnology. Much progress has been made through grassroots efforts in nanoinformatics resulting in a multitude of resources and tools for nanoscience researchers. In 2012, the nanoinformatics community believed it was important to critically evaluate and refine currently available nanoinformatics approaches in order to best inform the science and support the future of predictive nanotechnology. The Greener Nano 2012: Nanoinformatics Tools and Resources Workshop brought together informatics groups with materials scientists active in nanoscience research to evaluate and reflect on the tools and resources that have recently emerged in support of predictive nanotechnology. The workshop goals were to establish a better understanding of current nanoinformatics approaches and to clearly define immediate and projected informatics infrastructure needs of the nanotechnology community. The theme of nanotechnology environmental health and safety (nanoEHS) was used to provide real-world, concrete examples on how informatics can be utilized to advance our knowledge and guide nanoscience. The benefit here is that the same properties that impact the performance of products could also be the properties that inform EHS. From a decision management standpoint, the dual use of such data should be considered a priority. Key outcomes include a proposed collaborative framework for data collection, data sharing and information integration.
Harper, Stacey L; Hutchison, James E; Baker, Nathan; Ostraat, Michele; Tinkle, Sally; Steevens, Jeffrey; Hoover, Mark D; Adamick, Jessica; Rajan, Krishna; Gaheen, Sharon; Cohen, Yoram; Nel, Andre; Cachau, Raul E; Tuominen, Mark
2014-01-01
The quantity of information on nanomaterial properties and behavior continues to grow rapidly. Without a concerted effort to collect, organize and mine disparate information coming out of current research efforts, the value and effective use of this information will be limited at best. Data will not be translated to knowledge. At worst, erroneous conclusions will be drawn and future research may be misdirected. Nanoinformatics can be a powerful approach to enhance the value of global information in nanoscience and nanotechnology. Much progress has been made through grassroots efforts in nanoinformatics resulting in a multitude of resources and tools for nanoscience researchers. In 2012, the nanoinformatics community believed it was important to critically evaluate and refine currently available nanoinformatics approaches in order to best inform the science and support the future of predictive nanotechnology. The Greener Nano 2012: Nanoinformatics Tools and Resources Workshop brought together informatics groups with materials scientists active in nanoscience research to evaluate and reflect on the tools and resources that have recently emerged in support of predictive nanotechnology. The workshop goals were to establish a better understanding of current nanoinformatics approaches and to clearly define immediate and projected informatics infrastructure needs of the nanotechnology community. The theme of nanotechnology environmental health and safety (nanoEHS) was used to provide real-world, concrete examples on how informatics can be utilized to advance our knowledge and guide nanoscience. The benefit here is that the same properties that impact the performance of products could also be the properties that inform EHS. From a decision management standpoint, the dual use of such data should be considered a priority. Key outcomes include a proposed collaborative framework for data collection, data sharing and information integration. PMID:24454543
NASA Astrophysics Data System (ADS)
Feyen, Luc; Gorelick, Steven M.
2005-03-01
We propose a framework that combines simulation optimization with Bayesian decision analysis to evaluate the worth of hydraulic conductivity data for optimal groundwater resources management in ecologically sensitive areas. A stochastic simulation optimization management model is employed to plan regionally distributed groundwater pumping while preserving the hydroecological balance in wetland areas. Because predictions made by an aquifer model are uncertain, groundwater supply systems operate below maximum yield. Collecting data from the groundwater system can potentially reduce predictive uncertainty and increase safe water production. The price paid for improvement in water management is the cost of collecting the additional data. Efficient data collection using Bayesian decision analysis proceeds in three stages: (1) The prior analysis determines the optimal pumping scheme and profit from water sales on the basis of known information. (2) The preposterior analysis estimates the optimal measurement locations and evaluates whether each sequential measurement will be cost-effective before it is taken. (3) The posterior analysis then revises the prior optimal pumping scheme and consequent profit, given the new information. Stochastic simulation optimization employing a multiple-realization approach is used to determine the optimal pumping scheme in each of the three stages. The cost of new data must not exceed the expected increase in benefit obtained in optimal groundwater exploitation. An example based on groundwater management practices in Florida aimed at wetland protection showed that the cost of data collection more than paid for itself by enabling a safe and reliable increase in production.
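The prior versus preposterior logic can be reduced to a deliberately stylized, single-parameter sketch. Nothing below reproduces the paper's multiple-realization simulation optimization; it only illustrates how the expected profit gain from one hypothetical measurement is weighed against its cost, with every number invented for illustration.

```python
# Stylized sketch of prior vs. preposterior data-worth analysis for a single
# uncertain conductivity value; all quantities are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
logK = rng.normal(0.0, 0.5, n)                  # prior realizations of ln(K)
K = np.exp(logK)

price, reliability, drawdown_limit = 1000.0, 0.95, 1.0
meas_cost, meas_sigma = 200.0, 0.1              # cost and noise of one new datum

def safe_pumping(K_samples):
    """Largest rate whose drawdown (rate / K) meets the limit with the target
    reliability across realizations."""
    return drawdown_limit * np.quantile(K_samples, 1 - reliability)

# (1) Prior analysis: optimal rate and profit with existing information only
prior_profit = price * safe_pumping(K)

# (2) Preposterior analysis: expected profit if one noisy ln(K) measurement
#     were collected, averaged over what that measurement might turn out to be
posterior_profits = []
for true_logK in rng.choice(logK, 200):
    y = true_logK + rng.normal(0.0, meas_sigma)           # hypothetical datum
    w = np.exp(-0.5 * ((y - logK) / meas_sigma) ** 2)     # likelihood weights
    post = rng.choice(K, n, p=w / w.sum())                # weighted resample
    posterior_profits.append(price * safe_pumping(post))
expected_gain = np.mean(posterior_profits) - prior_profit

print(f"prior profit: {prior_profit:.1f}")
print(f"expected gain from one measurement: {expected_gain:.1f} vs cost {meas_cost}")
print("collect the datum" if expected_gain > meas_cost else "not worth collecting")
```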
Validation of educational assessments: a primer for simulation and beyond.
Cook, David A; Hatala, Rose
2016-01-01
Simulation plays a vital role in health professions assessment. This review provides a primer on assessment validation for educators and education researchers. We focus on simulation-based assessment of health professionals, but the principles apply broadly to other assessment approaches and topics. Validation refers to the process of collecting validity evidence to evaluate the appropriateness of the interpretations, uses, and decisions based on assessment results. Contemporary frameworks view validity as a hypothesis, and validity evidence is collected to support or refute the validity hypothesis (i.e., that the proposed interpretations and decisions are defensible). In validation, the educator or researcher defines the proposed interpretations and decisions, identifies and prioritizes the most questionable assumptions in making these interpretations and decisions (the "interpretation-use argument"), empirically tests those assumptions using existing or newly-collected evidence, and then summarizes the evidence as a coherent "validity argument." A framework proposed by Messick identifies potential evidence sources: content, response process, internal structure, relationships with other variables, and consequences. Another framework proposed by Kane identifies key inferences in generating useful interpretations: scoring, generalization, extrapolation, and implications/decision. We propose an eight-step approach to validation that applies to either framework: Define the construct and proposed interpretation, make explicit the intended decision(s), define the interpretation-use argument and prioritize needed validity evidence, identify candidate instruments and/or create/adapt a new instrument, appraise existing evidence and collect new evidence as needed, keep track of practical issues, formulate the validity argument, and make a judgment: does the evidence support the intended use? Rigorous validation first prioritizes and then empirically evaluates key assumptions in the interpretation and use of assessment scores. Validation science would be improved by more explicit articulation and prioritization of the interpretation-use argument, greater use of formal validation frameworks, and more evidence informing the consequences and implications of assessment.
Conflicts of interest improve collective computation of adaptive social structures
Brush, Eleanor R.; Krakauer, David C.; Flack, Jessica C.
2018-01-01
In many biological systems, the functional behavior of a group is collectively computed by the system’s individual components. An example is the brain’s ability to make decisions via the activity of billions of neurons. A long-standing puzzle is how the components’ decisions combine to produce beneficial group-level outputs, despite conflicts of interest and imperfect information. We derive a theoretical model of collective computation from mechanistic first principles, using results from previous work on the computation of power structure in a primate model system. Collective computation has two phases: an information accumulation phase, in which (in this study) pairs of individuals gather information about their fighting abilities and make decisions about their dominance relationships, and an information aggregation phase, in which these decisions are combined to produce a collective computation. To model information accumulation, we extend a stochastic decision-making model—the leaky integrator model used to study neural decision-making—to a multiagent game-theoretic framework. We then test alternative algorithms for aggregating information—in this study, decisions about dominance resulting from the stochastic model—and measure the mutual information between the resultant power structure and the “true” fighting abilities. We find that conflicts of interest can improve accuracy to the benefit of all agents. We also find that the computation can be tuned to produce different power structures by changing the cost of waiting for a decision. The successful application of a similar stochastic decision-making model in neural and social contexts suggests general principles of collective computation across substrates and scales. PMID:29376116
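The information-accumulation phase rests on a leaky-integrator (drift-diffusion style) process, and the trade-off between waiting cost and accuracy can be shown with a one-accumulator sketch. The parameters and the single-integrator simplification are assumptions for illustration, not the authors' multiagent game-theoretic model.

```python
# Minimal sketch: a leaky integrator accumulates noisy evidence and decides
# when a threshold is crossed; higher thresholds trade waiting time for accuracy.
import numpy as np

rng = np.random.default_rng(0)

def leaky_integrator_decision(drift=0.3, leak=0.2, noise=1.0,
                              threshold=2.0, dt=0.02, max_steps=5000):
    """Return (+1/-1 decision, number of steps waited)."""
    x = 0.0
    for step in range(1, max_steps + 1):
        x += dt * (drift - leak * x) + noise * np.sqrt(dt) * rng.normal()
        if abs(x) >= threshold:
            return np.sign(x), step
    return np.sign(x), max_steps

for thr in (1.0, 2.0, 3.0):
    outcomes = [leaky_integrator_decision(threshold=thr) for _ in range(200)]
    accuracy = np.mean([d > 0 for d, _ in outcomes])    # drift > 0 is "correct"
    wait = np.mean([s for _, s in outcomes])
    print(f"threshold {thr}: accuracy {accuracy:.2f}, mean wait {wait:.0f} steps")
```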
The development of an evaluation framework for injury surveillance systems
Mitchell, Rebecca J; Williamson, Ann M; O'Connor, Rod
2009-01-01
Background Access to good quality information from injury surveillance is essential to develop and monitor injury prevention activities. To determine if information obtained from surveillance is of high quality, the limitations and strengths of a surveillance system are often examined. Guidelines have been developed to assist in evaluating certain types of surveillance systems. However, to date, no standard guidelines have been developed to specifically evaluate an injury surveillance system. The aim of this research is to develop a framework to guide the evaluation of injury surveillance systems. Methods The development of an Evaluation Framework for Injury Surveillance Systems (EFISS) involved a four-stage process. First, a literature review was conducted to identify an initial set of characteristics that were recognised as important and/or had been recommended to be assessed in an evaluation of a surveillance system. Second, this set of characteristics was assessed using SMART criteria. Third, those surviving were presented to an expert panel using a two-round modified Delphi study to gain an alternative perspective on characteristic definitions, practicality of assessment, and characteristic importance. Finally, a rating system was created for the EFISS characteristics. Results The resulting EFISS consisted of 18 characteristics that assess three areas of an injury surveillance system – five characteristics assess data quality, nine characteristics assess the system's operation, and four characteristics assess the practical capability of an injury surveillance system. A rating system assesses the performance of each characteristic. Conclusion The development of the EFISS builds upon existing evaluation guidelines for surveillance systems and provides a framework tailored to evaluate an injury surveillance system. Ultimately, information obtained through an evaluation of an injury data collection using the EFISS would be useful for agencies to recommend how a collection could be improved to increase its usefulness for injury surveillance and, in the long term, injury prevention. PMID:19627617
Event-Driven Technology to Generate Relevant Collections of Near-Realtime Data
NASA Astrophysics Data System (ADS)
Graves, S. J.; Keiser, K.; Nair, U. S.; Beck, J. M.; Ebersole, S.
2017-12-01
Getting the right data when it is needed continues to be a challenge for researchers and decision makers. Event-Driven Data Delivery (ED3), funded by the NASA Applied Science program, is a technology that allows researchers and decision makers to pre-plan what data, information and processes they need to have collected or executed in response to future events. The Information Technology and Systems Center at the University of Alabama in Huntsville (UAH) has developed the ED3 framework in collaboration with atmospheric scientists at UAH, scientists at the Geological Survey of Alabama, and other federal, state and local stakeholders to meet the data preparedness needs for research, decisions and situational awareness. The ED3 framework provides an API that supports the addition of loosely coupled, distributed event handlers and data processes. This approach allows the easy addition of new events and data processes, so the system can scale to support virtually any type of event or data process. Using ED3's underlying services, applications have been developed that monitor for alerts of registered event types and automatically trigger subscriptions that match new events, providing users with a living "album" of results that can continue to be curated as more information for an event becomes available. This capability allows users to improve their capacity for the collection, creation and use of data and real-time processes (data access, model execution, product generation, sensor tasking, social media filtering, etc.) in response to disasters and other events by preparing in advance for the data and information needs of future events. This presentation will provide an update on the ED3 developments and deployments, and further explain the applicability of utilizing near-realtime data in hazards research, response and situational awareness.
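The subscription-matching behavior described for ED3 can be pictured with a small, purely conceptual sketch; the classes, fields and handlers below are hypothetical and do not reflect the actual ED3 API.

```python
# Conceptual sketch (not the actual ED3 API): users register subscriptions for
# future events, and each incoming alert triggers every matching subscription,
# building an "album" of results for that event.
from dataclasses import dataclass, field

@dataclass
class Subscription:
    event_type: str                 # e.g. "wildfire", "flood"
    region: str                     # e.g. "AL", "CA"
    handler: callable               # data process to run when triggered

@dataclass
class Event:
    event_type: str
    region: str
    album: list = field(default_factory=list)

subscriptions = [
    Subscription("flood", "AL", lambda e: f"order satellite imagery for {e.region}"),
    Subscription("flood", "AL", lambda e: f"run inundation model for {e.region}"),
    Subscription("wildfire", "CA", lambda e: f"filter social media for {e.region}"),
]

def on_alert(event):
    """Trigger every subscription matching the new event and curate the album."""
    for sub in subscriptions:
        if sub.event_type == event.event_type and sub.region == event.region:
            event.album.append(sub.handler(event))
    return event

flood = on_alert(Event("flood", "AL"))
print(flood.album)   # two data processes triggered for the Alabama flood
```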
Causal assessment of surrogacy in a meta-analysis of colorectal cancer trials
Li, Yun; Taylor, Jeremy M.G.; Elliott, Michael R.; Sargent, Daniel J.
2011-01-01
When the true end points (T) are difficult or costly to measure, surrogate markers (S) are often collected in clinical trials to help predict the effect of the treatment (Z). There is great interest in understanding the relationship among S, T, and Z. A principal stratification (PS) framework has been proposed by Frangakis and Rubin (2002) to study their causal associations. In this paper, we extend the framework to a multiple trial setting and propose a Bayesian hierarchical PS model to assess surrogacy. We apply the method to data from a large collection of colon cancer trials in which S and T are binary. We obtain the trial-specific causal measures among S, T, and Z, as well as their overall population-level counterparts that are invariant across trials. The method allows for information sharing across trials and reduces the nonidentifiability problem. We examine the frequentist properties of our model estimates and the impact of the monotonicity assumption using simulations. We also illustrate the challenges in evaluating surrogacy in the counterfactual framework that result from nonidentifiability. PMID:21252079
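As a much simpler companion to the Bayesian principal stratification approach described above, the sketch below simulates trial-level treatment effects on a binary surrogate and a binary true endpoint and reports their correlation, a basic meta-analytic surrogacy summary. It is not the authors' model; the simulated data, effect sizes and continuity correction are illustrative only.

```python
# Illustrative (not the paper's PS model): correlate trial-level treatment
# effects on a binary surrogate S and true endpoint T across simulated trials.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_arm = 12, 500                    # trials and patients per arm

def log_odds_ratio(events_trt, events_ctl, n_arm):
    a, b = events_trt + 0.5, n_arm - events_trt + 0.5   # continuity correction
    c, d = events_ctl + 0.5, n_arm - events_ctl + 0.5
    return np.log((a * d) / (b * c))

expit = lambda x: 1.0 / (1.0 + np.exp(-x))

lor_S, lor_T = [], []
for _ in range(n_trials):
    effect = rng.normal(0.5, 0.4)            # trial-specific log-odds effect on S
    s_ctl = rng.binomial(n_arm, expit(-0.5)); s_trt = rng.binomial(n_arm, expit(-0.5 + effect))
    t_ctl = rng.binomial(n_arm, expit(-1.0)); t_trt = rng.binomial(n_arm, expit(-1.0 + 0.8 * effect))
    lor_S.append(log_odds_ratio(s_trt, s_ctl, n_arm))
    lor_T.append(log_odds_ratio(t_trt, t_ctl, n_arm))

r = np.corrcoef(lor_S, lor_T)[0, 1]
print(f"trial-level correlation between effects on S and T: {r:.2f}")
```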
Fillo, Jennifer; Staplefoote-Boynton, B Lynette; Martinez, Angel; Sontag-Padilla, Lisa; Shadel, William G; Martino, Steven C; Setodji, Claude M; Meeker, Daniella; Scharf, Deborah
2016-12-01
Advances in mobile technology and mobile applications (apps) have opened up an exciting new frontier for behavioral health researchers, with a "second generation" of apps allowing for the simultaneous collection of multiple streams of data in real time. With this comes a host of technical decisions and ethical considerations unique to this evolving approach to research. Drawing on our experience developing a second-generation app for the simultaneous collection of text message, voice, and self-report data, we provide a framework for researchers interested in developing and using second-generation mobile apps to study health behaviors. Our Simplified Novel Application (SNApp) framework breaks the app development process into four phases: (1) information and resource gathering, (2) software and hardware decisions, (3) software development and testing, and (4) study start-up and implementation. At each phase, we address common challenges and ethical issues and make suggestions for effective and efficient app development. Our goal is to help researchers effectively balance priorities related to the function of the app with the realities of app development, human subjects issues, and project resource constraints.
Cloud Privacy Audit Framework: A Value-Based Design
ERIC Educational Resources Information Center
Coss, David Lewis
2013-01-01
The rapid expansion of cloud technology provides enormous capacity, which allows for the collection, dissemination and re-identification of personal information. It is the cloud's resource capabilities such as these that fuel the concern for privacy. The impetus of these concerns is not too far removed from those expressed by Mason in 1986…
ERIC Educational Resources Information Center
Yamagata-Lynch, Lisa C.
2007-01-01
Understanding human activity in real-world situations often involves complicated data collection, analysis, and presentation methods. This article discusses how Cultural-Historical Activity Theory (CHAT) can inform design-based research practices that focus on understanding activity in real-world situations. I provide a sample data set with…
The Couzens Machine. A Computerized Learning Exchange. Final Report, 1973-74.
ERIC Educational Resources Information Center
Davis, Ken, Comp.; Libengood, Richard, Comp.
The Couzens Machine is a computerized learning exchange and information service developed for the residents of Couzens Hall, a dormitory at the University of Michigan. Organized as a collective within the framework of a course and supported by an instructional development grant from the Center for Research on Learning and Teaching, the Couzens…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-08
... lengths, are derived from natural science research. For the survey, a choice experiment framework is used... survey is public value research. The Santa Cruz River is a case study of a waterway highly impacted by... Request; Comment Request; Willingness To Pay Survey for Santa Cruz River Management Options in Southern...
Models of the Behavior of People Searching the Internet: A Petri Net Approach.
ERIC Educational Resources Information Center
Kantor, Paul B.; Nordlie, Ragnar
1999-01-01
Illustrates how various key abstractions of information finding, such as document relevance, a desired number of relevant documents, discouragement, exhaustion, and satisfaction can be modeled using the Petri Net framework. Shows that this model leads naturally to a new approach to collection of user data, and to analysis of transaction logs.…
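As an illustration of the modeling idea, a toy place/transition net in Python is sketched below; the places, transitions, and the "desired number of relevant documents" threshold are invented for illustration and are not taken from the article.

```python
# A tiny place/transition Petri net: tokens move a searcher between states such as
# "searching", accumulating relevant documents until a satisfaction threshold fires.
marking = {"searching": 1, "relevant_docs": 0, "satisfied": 0, "discouraged": 0}

TRANSITIONS = {
    # name: (places consumed, places produced)
    "find_relevant_doc": ({"searching": 1}, {"searching": 1, "relevant_docs": 1}),
    "satisfy":           ({"searching": 1, "relevant_docs": 3}, {"satisfied": 1}),
    "give_up":           ({"searching": 1}, {"discouraged": 1}),
}


def enabled(name: str) -> bool:
    consumed, _ = TRANSITIONS[name]
    return all(marking.get(place, 0) >= n for place, n in consumed.items())


def fire(name: str) -> None:
    consumed, produced = TRANSITIONS[name]
    for place, n in consumed.items():
        marking[place] -= n
    for place, n in produced.items():
        marking[place] = marking.get(place, 0) + n


for _ in range(3):          # the searcher keeps finding relevant documents...
    fire("find_relevant_doc")
if enabled("satisfy"):      # ...until the desired number is reached
    fire("satisfy")
print(marking)              # {'searching': 0, 'relevant_docs': 0, 'satisfied': 1, 'discouraged': 0}
```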
Kim, Yoonsang; Huang, Jidong; Emery, Sherry
2016-02-26
Social media have transformed the communications landscape. People increasingly obtain news and health information online and via social media. Social media platforms also serve as novel sources of rich observational data for health research (including infodemiology, infoveillance, and digital disease detection). While the number of studies using social data is growing rapidly, very few of these studies transparently outline their methods for collecting, filtering, and reporting those data. Keywords and search filters applied to social data form the lens through which researchers may observe what and how people communicate about a given topic. Without a properly focused lens, research conclusions may be biased or misleading. Standards of reporting data sources and quality are needed so that data scientists and consumers of social media research can evaluate and compare methods and findings across studies. We aimed to develop and apply a framework of social media data collection and quality assessment and to propose a reporting standard, which researchers and reviewers may use to evaluate and compare the quality of social data across studies. We propose a conceptual framework consisting of three major steps in collecting social media data: develop, apply, and validate search filters. This framework is based on two criteria: retrieval precision (how much of retrieved data is relevant) and retrieval recall (how much of the relevant data is retrieved). We then discuss two conditions that estimation of retrieval precision and recall rely on--accurate human coding and full data collection--and how to calculate these statistics in cases that deviate from the two ideal conditions. We then apply the framework to a real-world example using approximately 4 million tobacco-related tweets collected from the Twitter firehose. We developed and applied a search filter to retrieve e-cigarette-related tweets from the archive based on three keyword categories: devices, brands, and behavior. The search filter retrieved 82,205 e-cigarette-related tweets from the archive and was validated. Retrieval precision was calculated above 95% in all cases. Retrieval recall was 86% assuming ideal conditions (no human coding errors and full data collection), 75% when unretrieved messages could not be archived, 86% assuming no false negative errors by coders, and 93% allowing both false negative and false positive errors by human coders. This paper sets forth a conceptual framework for the filtering and quality evaluation of social data that addresses several common challenges and moves toward establishing a standard of reporting social data. Researchers should clearly delineate data sources, how data were accessed and collected, and the search filter building process and how retrieval precision and recall were calculated. The proposed framework can be adapted to other public social media platforms.
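The two criteria are straightforward to compute once a sample of retrieved and archived messages has been human-coded; the sketch below shows the calculation under the ideal conditions (error-free coding, full data collection), using invented counts rather than the study's figures.

```python
# Minimal sketch of the two quality statistics described above, under ideal conditions.
def retrieval_precision(retrieved_relevant: int, retrieved_total: int) -> float:
    """Share of retrieved messages that are actually on-topic."""
    return retrieved_relevant / retrieved_total


def retrieval_recall(retrieved_relevant: int, relevant_total: int) -> float:
    """Share of all on-topic messages in the archive that the search filter retrieved."""
    return retrieved_relevant / relevant_total


# Purely illustrative counts from coding a hypothetical archive sample:
retrieved_total, retrieved_relevant, relevant_in_archive = 10_000, 9_600, 11_200
print(f"precision = {retrieval_precision(retrieved_relevant, retrieved_total):.2f}")
print(f"recall    = {retrieval_recall(retrieved_relevant, relevant_in_archive):.2f}")
```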
A Conceptual Framework and Principles for Trusted Pervasive Health
Blobel, Bernd Gerhard; Seppälä, Antto Veikko; Sorvari, Hannu Olavi; Nykänen, Pirkko Anneli
2012-01-01
Background Ubiquitous computing technology, sensor networks, wireless communication and the latest developments of the Internet have enabled the rise of a new concept—pervasive health—which takes place in an open, unsecure, and highly dynamic environment (ie, in the information space). To be successful, pervasive health requires implementable principles for privacy and trustworthiness. Objective This research has two interconnected objectives. The first is to define pervasive health as a system and to understand its trust and privacy challenges. The second goal is to build a conceptual model for pervasive health and use it to develop principles and polices which can make pervasive health trustworthy. Methods In this study, a five-step system analysis method is used. Pervasive health is defined using a metaphor of digital bubbles. A conceptual framework model focused on trustworthiness and privacy is then developed for pervasive health. On that model, principles and rules for trusted information management in pervasive health are defined. Results In the first phase of this study, a new definition of pervasive health was created. Using this model, differences between pervasive health and health care are stated. Reviewed publications demonstrate that the widely used principles of predefined and static trust cannot guarantee trustworthiness and privacy in pervasive health. Instead, such an environment requires personal dynamic and context-aware policies, awareness, and transparency. A conceptual framework model focused on information processing in pervasive health is developed. Using features of pervasive health and relations from the framework model, new principles for trusted pervasive health have been developed. The principles propose that personal health data should be under control of the data subject. The person shall have the right to verify the level of trust of any system which collects or processes his or her health information. Principles require that any stakeholder or system collecting or processing health data must support transparency and shall publish its trust and privacy attributes and even its domain specific policies. Conclusions The developed principles enable trustworthiness and guarantee privacy in pervasive health. The implementation of principles requires new infrastructural services such as trust verification and policy conflict resolution. After implementation, the accuracy and usability of principles should be analyzed. PMID:22481297
A conceptual framework and principles for trusted pervasive health.
Ruotsalainen, Pekka Sakari; Blobel, Bernd Gerhard; Seppälä, Antto Veikko; Sorvari, Hannu Olavi; Nykänen, Pirkko Anneli
2012-04-06
Ubiquitous computing technology, sensor networks, wireless communication and the latest developments of the Internet have enabled the rise of a new concept-pervasive health-which takes place in an open, unsecure, and highly dynamic environment (ie, in the information space). To be successful, pervasive health requires implementable principles for privacy and trustworthiness. This research has two interconnected objectives. The first is to define pervasive health as a system and to understand its trust and privacy challenges. The second goal is to build a conceptual model for pervasive health and use it to develop principles and policies which can make pervasive health trustworthy. In this study, a five-step system analysis method is used. Pervasive health is defined using a metaphor of digital bubbles. A conceptual framework model focused on trustworthiness and privacy is then developed for pervasive health. On that model, principles and rules for trusted information management in pervasive health are defined. In the first phase of this study, a new definition of pervasive health was created. Using this model, differences between pervasive health and health care are stated. Reviewed publications demonstrate that the widely used principles of predefined and static trust cannot guarantee trustworthiness and privacy in pervasive health. Instead, such an environment requires personal dynamic and context-aware policies, awareness, and transparency. A conceptual framework model focused on information processing in pervasive health is developed. Using features of pervasive health and relations from the framework model, new principles for trusted pervasive health have been developed. The principles propose that personal health data should be under control of the data subject. The person shall have the right to verify the level of trust of any system which collects or processes his or her health information. Principles require that any stakeholder or system collecting or processing health data must support transparency and shall publish its trust and privacy attributes and even its domain specific policies. The developed principles enable trustworthiness and guarantee privacy in pervasive health. The implementation of principles requires new infrastructural services such as trust verification and policy conflict resolution. After implementation, the accuracy and usability of principles should be analyzed.
Assessing the impact of healthcare research: A systematic review of methodological frameworks.
Cruz Rivera, Samantha; Kyte, Derek G; Aiyegbusi, Olalekan Lee; Keeley, Thomas J; Calvert, Melanie J
2017-08-01
Increasingly, researchers need to demonstrate the impact of their research to their sponsors, funders, and fellow academics. However, the most appropriate way of measuring the impact of healthcare research is subject to debate. We aimed to identify the existing methodological frameworks used to measure healthcare research impact and to summarise the common themes and metrics in an impact matrix. Two independent investigators systematically searched the Medical Literature Analysis and Retrieval System Online (MEDLINE), the Excerpta Medica Database (EMBASE), the Cumulative Index to Nursing and Allied Health Literature (CINAHL+), the Health Management Information Consortium, and the Journal of Research Evaluation from inception until May 2017 for publications that presented a methodological framework for research impact. We then summarised the common concepts and themes across methodological frameworks and identified the metrics used to evaluate differing forms of impact. Twenty-four unique methodological frameworks were identified, addressing 5 broad categories of impact: (1) 'primary research-related impact', (2) 'influence on policy making', (3) 'health and health systems impact', (4) 'health-related and societal impact', and (5) 'broader economic impact'. These categories were subdivided into 16 common impact subgroups. Authors of the included publications proposed 80 different metrics aimed at measuring impact in these areas. The main limitation of the study was the potential exclusion of relevant articles, as a consequence of the poor indexing of the databases searched. The measurement of research impact is an essential exercise to help direct the allocation of limited research resources, to maximise research benefit, and to help minimise research waste. This review provides a collective summary of existing methodological frameworks for research impact, which funders may use to inform the measurement of research impact and researchers may use to inform study design decisions aimed at maximising the short-, medium-, and long-term impact of their research.
Jull, J; Whitehead, M; Petticrew, M; Kristjansson, E; Gough, D; Petkovic, J; Volmink, J; Weijer, C; Taljaard, M; Edwards, S; Mbuagbaw, L; Cookson, R; McGowan, J; Lyddiatt, A; Boyer, Y; Cuervo, L G; Armstrong, R; White, H; Yoganathan, M; Pantoja, T; Shea, B; Pottie, K; Norheim, O; Baird, S; Robberstad, B; Sommerfelt, H; Asada, Y; Wells, G; Tugwell, P; Welch, V
2017-09-25
Randomised controlled trials can provide evidence relevant to assessing the equity impact of an intervention, but such information is often poorly reported. We describe a conceptual framework to identify health equity-relevant randomised trials with the aim of improving the design and reporting of such trials. An interdisciplinary and international research team engaged in an iterative consensus building process to develop and refine the conceptual framework via face-to-face meetings, teleconferences and email correspondence, including findings from a validation exercise whereby two independent reviewers used the emerging framework to classify a sample of randomised trials. A randomised trial can usefully be classified as 'health equity relevant' if it assesses the effects of an intervention on the health or its determinants of either individuals or a population who experience ill health due to disadvantage defined across one or more social determinants of health. Health equity-relevant randomised trials can either exclusively focus on a single population or collect data potentially useful for assessing differential effects of the intervention across multiple populations experiencing different levels or types of social disadvantage. Trials that are not classified as 'health equity relevant' may nevertheless provide information that is indirectly relevant to assessing equity impact, including information about individual level variation unrelated to social disadvantage and potentially useful in secondary modelling studies. The conceptual framework may be used to design and report randomised trials. The framework could also be used for other study designs to contribute to the evidence base for improved health equity. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Clinical Knowledge Governance Framework for Nationwide Data Infrastructure Projects.
Wulff, Antje; Haarbrandt, Birger; Marschollek, Michael
2018-01-01
The availability of semantically enriched and interoperable clinical information models is crucial for reusing once-collected data across institutions, as envisaged in the German HiGHmed project. Funded by the Federal Ministry of Education and Research, this nationwide data infrastructure project adopts the openEHR approach for semantic modelling. Here, strong governance is required to define high-quality and reusable models. The objective was the design of a clinical knowledge governance framework for openEHR modelling in cross-institutional settings like HiGHmed. Methods included the analysis of successful practices from international projects, published ideas on archetype governance and our own modelling experiences, as well as the modelling of BPMN processes. We designed a framework by presenting archetype variations, roles and responsibilities, IT support and modelling workflows. Our framework has great potential to make the openEHR modelling efforts manageable. Because practical experiences are rare, our work will prospectively serve to evaluate the benefits of such structured governance approaches.
Plamondon, Katrina M; Bottorff, Joan L; Cole, Donald C
2015-11-01
Deliberative dialogue (DD) is a knowledge translation strategy that can serve to generate rich data and bridge health research with action. An intriguing alternative to other modes of generating data, the purposeful and evidence-informed conversations characteristic of DD generate data inclusive of collective interpretations. These data are thus dialogic, presenting complex challenges for qualitative analysis. In this article, we discuss the nature of data generated through DD, orienting ourselves toward a theoretically grounded approach to analysis. We offer an integrated framework for analysis, balancing analytical strategies of categorizing and connecting with the use of empathetic and suspicious interpretive lenses. In this framework, data generation and analysis occur in concert, alongside engaging participants and synthesizing evidence. An example of application is provided, demonstrating nuances of the framework. We conclude with reflections on the strengths and limitations of the framework, suggesting how it may be relevant in other qualitative health approaches. © The Author(s) 2015.
Using architectures for semantic interoperability to create journal clubs for emergency response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, James E; Collins, Linn M; Martinez, Mark L B
2009-01-01
In certain types of 'slow burn' emergencies, careful accumulation and evaluation of information can offer a crucial advantage. The SARS outbreak in the first decade of the 21st century was such an event, and ad hoc journal clubs played a critical role in assisting scientific and technical responders in identifying and developing various strategies for halting what could have become a dangerous pandemic. This research-in-progress paper describes a process for leveraging emerging semantic web and digital library architectures and standards to (1) create a focused collection of bibliographic metadata, (2) extract semantic information, (3) convert it to the Resource Description Framework/Extensible Markup Language (RDF/XML), and (4) integrate it so that scientific and technical responders can share and explore critical information in the collections.
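A minimal sketch of step (3), assuming rdflib and Dublin Core terms; the article identifier and metadata values are hypothetical, and this is not the paper's actual pipeline.

```python
# Express extracted bibliographic metadata as RDF/XML using rdflib and Dublin Core terms.
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DCTERMS

g = Graph()
article = URIRef("http://example.org/articles/sars-treatment-01")  # hypothetical identifier
g.add((article, DCTERMS.title, Literal("Treatment strategies for SARS")))
g.add((article, DCTERMS.subject, Literal("SARS coronavirus")))
g.add((article, DCTERMS.date, Literal("2003")))

# RDF/XML serialization, ready to integrate into a shared, explorable collection.
print(g.serialize(format="xml"))
```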
Health information management in the home: a human factors assessment.
Zayas-Cabán, Teresa
2012-01-01
Achieving optimal health outcomes requires that consumers maintain myriad health data and understand how to utilize appropriate health information management applications. This case study investigated four families' health information management tasks in their homes. Four different families participated in the study: a single parent household; two nuclear family households; and an extended family household. A work system model known as the balance model was used as a guiding framework for data collection. Data collection consisted of three stages: (1) primary health information manager interviews; (2) family interviews; and (3) task observations. Overall, families reported 69 unique health information management tasks that took place in nine different locations, using 22 different information storage artifacts. Frequently occurring tasks related to health management or health coordination were conducted in public spaces. Less frequent or more time-consuming tasks, such as researching a health concern or storing medical history, were performed in private spaces such as bedrooms or studies. Similarities across households suggest potential foundational design elements that consumer health information technology application designers need to balance with tailored interventions to successfully support variations in individuals' health information management needs.
Information Service System For Small Forestowners
NASA Astrophysics Data System (ADS)
Zhang, Shaochen; Li, Yun
Individually owned forests have boomed in China over the last decade. Hundreds of millions of private forest owners have emerged after years of afforestation practice and collective forest ownership reform. Most of these private forest owners are former peasants living in afforestation areas. They thirst for forestry information, such as technical knowledge, forestry policies, finance, and marketing. Unfortunately, the channels through which they can obtain such information are very limited. Before the internet era, local governments were the main channel through which they sought useful information and technical support. State and local governments have paid much attention to providing the necessary forestry technical support to these small forest owners, offering various training projects and issuing official forestry information through their websites. Meanwhile, as the state government expands the household contract system in the management of collective forestry land, the number of individual forest owners will jump in the coming five years. There is still a gap between the supply of and the demand for forestry information. Constructing an effective forestry information service system in the next three to five years could bridge this gap. This paper discusses the framework of such an information service system.
Wilson, Caroline; Rooshenas, Leila; Paramasivan, Sangeetha; Elliott, Daisy; Jepson, Marcus; Strong, Sean; Birtle, Alison; Beard, David J; Halliday, Alison; Hamdy, Freddie C; Lewis, Rebecca; Metcalfe, Chris; Rogers, Chris A; Stein, Robert C; Blazeby, Jane M; Donovan, Jenny L
2018-01-19
Research has shown that recruitment to trials is a process that stretches from identifying potentially eligible patients, through eligibility assessment, to obtaining informed consent. The length and complexity of this pathway means that many patients do not have the opportunity to consider participation. This article presents the development of a simple framework to document, understand and improve the process of trial recruitment. Eight RCTs integrated a QuinteT Recruitment Intervention (QRI) into the main trial, feasibility or pilot study. Part of the QRI required mapping the patient recruitment pathway using trial-specific screening and recruitment logs. A content analysis compared the logs to identify aspects of the recruitment pathway and process that were useful in monitoring and improving recruitment. Findings were synthesised to develop an optimised simple framework that can be used in a wide range of RCTs. The eight trials recorded basic information about patients screened for trial participation and randomisation outcome. Three trials systematically recorded reasons why an individual was not enrolled in the trial, and further details why they were not eligible or approached, or declined randomisation. A framework to facilitate clearer recording of the recruitment process and reasons for non-participation was developed: SEAR - Screening, to identify potentially eligible trial participants; Eligibility, assessed against the trial protocol inclusion/exclusion criteria; Approach, the provision of oral and written information and invitation to participate in the trial, and Randomised or not, with the outcome of randomisation or treatment received. The SEAR framework encourages the collection of information to identify recruitment obstacles and facilitate improvements to the recruitment process. SEAR can be adapted to monitor recruitment to most RCTs, but is likely to add most value in trials where recruitment problems are anticipated or evident. Further work to test it more widely is recommended.
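An illustrative sketch of what a SEAR-style screening log entry might look like as a data structure; the field names and example reasons are assumptions rather than the published framework's data dictionary.

```python
# One recruitment-log record per patient, following the SEAR stages described above.
from dataclasses import dataclass
from typing import Optional


@dataclass
class SEARRecord:
    patient_id: str
    screened: bool                      # S: identified as potentially eligible
    eligible: Optional[bool] = None     # E: met protocol inclusion/exclusion criteria
    approached: Optional[bool] = None   # A: given oral/written information and invited
    randomised: Optional[bool] = None   # R: randomised (or treatment received)
    reason_not_progressed: Optional[str] = None  # why the patient left the pathway


log = [
    SEARRecord("P001", screened=True, eligible=False,
               reason_not_progressed="exclusion criterion: prior surgery"),
    SEARRecord("P002", screened=True, eligible=True, approached=True, randomised=True),
]

# Simple query of the log to spot a recruitment obstacle: approached but not randomised.
declined = [r for r in log if r.approached and not r.randomised]
print(len(declined))
```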
Keeping All the PIECES: Phylogenetically Informed Ex Situ Conservation of Endangered Species.
Larkin, Daniel J; Jacobi, Sarah K; Hipp, Andrew L; Kramer, Andrea T
2016-01-01
Ex situ conservation in germplasm and living collections is a major focus of global plant conservation strategies. Prioritizing species for ex situ collection is a necessary component of this effort for which sound strategies are needed. Phylogenetic considerations can play an important role in prioritization. Collections that are more phylogenetically diverse are likely to encompass more ecological and trait variation, and thus provide stronger conservation insurance and richer resources for future restoration efforts. However, phylogenetic criteria need to be weighed against other, potentially competing objectives. We used ex situ collection and threat rank data for North American angiosperms to investigate gaps in ex situ coverage and phylogenetic diversity of collections and to develop a flexible framework for prioritizing species across multiple objectives. We found that ex situ coverage of 18,766 North American angiosperm taxa was low with respect to the most vulnerable taxa: just 43% of vulnerable to critically imperiled taxa were in ex situ collections, far short of a year-2020 goal of 75%. In addition, species held in ex situ collections were phylogenetically clustered (P < 0.001), i.e., collections comprised less phylogenetic diversity than would be expected had species been drawn at random. These patterns support incorporating phylogenetic considerations into ex situ prioritization in a manner balanced with other criteria, such as vulnerability. To meet this need, we present the 'PIECES' index (Phylogenetically Informed Ex situ Conservation of Endangered Species). PIECES integrates phylogenetic considerations into a flexible framework for prioritizing species across competing objectives using multi-criteria decision analysis. Applying PIECES to prioritizing ex situ conservation of North American angiosperms, we show strong return on investment across multiple objectives, some of which are negatively correlated with each other. A spreadsheet-based decision support tool for North American angiosperms is provided; this tool can be customized to align with different conservation objectives.
Keeping All the PIECES: Phylogenetically Informed Ex Situ Conservation of Endangered Species
Larkin, Daniel J.; Jacobi, Sarah K.; Hipp, Andrew L.; Kramer, Andrea T.
2016-01-01
Ex situ conservation in germplasm and living collections is a major focus of global plant conservation strategies. Prioritizing species for ex situ collection is a necessary component of this effort for which sound strategies are needed. Phylogenetic considerations can play an important role in prioritization. Collections that are more phylogenetically diverse are likely to encompass more ecological and trait variation, and thus provide stronger conservation insurance and richer resources for future restoration efforts. However, phylogenetic criteria need to be weighed against other, potentially competing objectives. We used ex situ collection and threat rank data for North American angiosperms to investigate gaps in ex situ coverage and phylogenetic diversity of collections and to develop a flexible framework for prioritizing species across multiple objectives. We found that ex situ coverage of 18,766 North American angiosperm taxa was low with respect to the most vulnerable taxa: just 43% of vulnerable to critically imperiled taxa were in ex situ collections, far short of a year-2020 goal of 75%. In addition, species held in ex situ collections were phylogenetically clustered (P < 0.001), i.e., collections comprised less phylogenetic diversity than would be expected had species been drawn at random. These patterns support incorporating phylogenetic considerations into ex situ prioritization in a manner balanced with other criteria, such as vulnerability. To meet this need, we present the ‘PIECES’ index (Phylogenetically Informed Ex situ Conservation of Endangered Species). PIECES integrates phylogenetic considerations into a flexible framework for prioritizing species across competing objectives using multi-criteria decision analysis. Applying PIECES to prioritizing ex situ conservation of North American angiosperms, we show strong return on investment across multiple objectives, some of which are negatively correlated with each other. A spreadsheet-based decision support tool for North American angiosperms is provided; this tool can be customized to align with different conservation objectives. PMID:27257671
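A hedged sketch of the general multi-criteria scoring idea behind an index of this kind: normalised criteria are combined with user-chosen weights and species are ranked by the result. The criteria, weights, and values below are illustrative and do not reproduce the published PIECES formulation.

```python
# Rank species for ex situ collection by a weighted sum of normalised criteria.
def priority_score(criteria: dict, weights: dict) -> float:
    """Weighted sum of criteria already scaled to [0, 1]."""
    return sum(weights[name] * value for name, value in criteria.items())


species = {
    "Species A": {"vulnerability": 0.9, "phylo_distinctiveness": 0.3, "ex_situ_gap": 1.0},
    "Species B": {"vulnerability": 0.5, "phylo_distinctiveness": 0.8, "ex_situ_gap": 0.6},
}
weights = {"vulnerability": 0.5, "phylo_distinctiveness": 0.3, "ex_situ_gap": 0.2}

ranked = sorted(species, key=lambda s: priority_score(species[s], weights), reverse=True)
print(ranked)  # species in descending priority for ex situ collection
```

Adjusting the weights is how competing objectives (e.g. vulnerability versus phylogenetic distinctiveness) are traded off against one another.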
NASA Astrophysics Data System (ADS)
Shapiro, C. D.
2014-12-01
Data democracy is a concept that has great relevance to the use and value of geospatial data and scientific information. Data democracy describes a world in which data and information are widely and broadly accessible, understandable, and useable. The concept operationalizes the public good nature of scientific information and provides a framework for increasing benefits from its use. Data democracy encompasses efforts to increase accessibility to geospatial data and to expand participation in its collection, analysis, and application. These two pillars are analogous to demand and supply relationships. Improved accessibility, or demand, includes increased knowledge about geospatial data and low barriers to retrieval and use. Expanded participation, or supply, encompasses a broader community involved in developing geospatial data and scientific information. This pillar of data democracy is characterized by methods such as citizen science or crowd sourcing.A framework is developed for advancing the use of data democracy. This includes efforts to assess the societal benefits (economic and social) of scientific information. This knowledge is critical to continued monitoring of the effectiveness of data democracy implementation and of potential impact on the use and value of scientific information. The framework also includes an assessment of opportunities for advancing data democracy both on the supply and demand sides. These opportunities include relatively inexpensive efforts to reduce barriers to use as well as the identification of situations in which participation can be expanded in scientific efforts to enhance the breadth of involvement as well as expanding participation to non-traditional communities. This framework provides an initial perspective on ways to expand the "scientific community" of data users and providers. It also describes a way forward for enhancing the societal benefits from geospatial data and scientific information. As a result, data democracy not only provides benefits to a greater population, it enhances the value of science.
Environmental exposure assessment framework for nanoparticles in solid waste.
Boldrin, Alessio; Hansen, Steffen Foss; Baun, Anders; Hartmann, Nanna Isabella Bloch; Astrup, Thomas Fruergaard
2014-01-01
Information related to the potential environmental exposure of engineered nanomaterials (ENMs) in the solid waste management phase is extremely scarce. In this paper, we define nanowaste as separately collected or collectable waste materials which are or contain ENMs, and we present a five-step framework for the systematic assessment of ENM exposure during nanowaste management. The framework includes deriving EOL nanoproducts and evaluating the physicochemical properties of the nanostructure, matrix properties and nanowaste treatment processes as well as transformation processes and environment releases, eventually leading to a final assessment of potential ENM exposure. The proposed framework was applied to three selected nanoproducts: nanosilver polyester textile, nanoTiO2 sunscreen lotion and carbon nanotube tennis racquets. We found that the potential global environmental exposure of ENMs associated with these three products was an estimated 0.5-143 Mg/year, which can also be characterised qualitatively as medium, medium, low, respectively. Specific challenges remain and should be subject to further research: (1) analytical techniques for the characterisation of nanowaste and its transformation during waste treatment processes, (2) mechanisms for the release of ENMs, (3) the quantification of nanowaste amounts at the regional scale, (4) a definition of acceptable limit values for exposure to ENMs from nanowaste and (5) the reporting of nanowaste generation data.
Environmental exposure assessment framework for nanoparticles in solid waste
NASA Astrophysics Data System (ADS)
Boldrin, Alessio; Hansen, Steffen Foss; Baun, Anders; Hartmann, Nanna Isabella Bloch; Astrup, Thomas Fruergaard
2014-06-01
Information related to the potential environmental exposure of engineered nanomaterials (ENMs) in the solid waste management phase is extremely scarce. In this paper, we define nanowaste as separately collected or collectable waste materials which are or contain ENMs, and we present a five-step framework for the systematic assessment of ENM exposure during nanowaste management. The framework includes deriving EOL nanoproducts and evaluating the physicochemical properties of the nanostructure, matrix properties and nanowaste treatment processes as well as transformation processes and environment releases, eventually leading to a final assessment of potential ENM exposure. The proposed framework was applied to three selected nanoproducts: nanosilver polyester textile, nanoTiO2 sunscreen lotion and carbon nanotube tennis racquets. We found that the potential global environmental exposure of ENMs associated with these three products was an estimated 0.5-143 Mg/year, which can also be characterised qualitatively as medium, medium, low, respectively. Specific challenges remain and should be subject to further research: (1) analytical techniques for the characterisation of nanowaste and its transformation during waste treatment processes, (2) mechanisms for the release of ENMs, (3) the quantification of nanowaste amounts at the regional scale, (4) a definition of acceptable limit values for exposure to ENMs from nanowaste and (5) the reporting of nanowaste generation data.
Real-time tracking of visually attended objects in virtual environments and its application to LOD.
Lee, Sungkil; Kim, Gerard Jounghyun; Choi, Seungmoon
2009-01-01
This paper presents a real-time framework for computationally tracking objects visually attended by the user while navigating in interactive virtual environments. In addition to the conventional bottom-up (stimulus-driven) saliency map, the proposed framework uses top-down (goal-directed) contexts inferred from the user's spatial and temporal behaviors, and identifies the most plausibly attended objects among candidates in the object saliency map. The computational framework was implemented using GPU, exhibiting high computational performance adequate for interactive virtual environments. A user experiment was also conducted to evaluate the prediction accuracy of the tracking framework by comparing objects regarded as visually attended by the framework to actual human gaze collected with an eye tracker. The results indicated that the accuracy was in the level well supported by the theory of human cognition for visually identifying single and multiple attentive targets, especially owing to the addition of top-down contextual information. Finally, we demonstrate how the visual attention tracking framework can be applied to managing the level of details in virtual environments, without any hardware for head or eye tracking.
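A simplified sketch of the combination step described above, in which per-object bottom-up saliency is modulated by top-down weights inferred from user behaviour; the multiplicative weighting and the numbers are assumptions for illustration, not the paper's GPU implementation.

```python
# Identify the most plausibly attended object by combining stimulus-driven saliency
# with goal-directed (top-down) context weights.
def attended_object(bottom_up: dict, top_down: dict) -> str:
    """Return the object with the highest combined attention score."""
    scores = {obj: bottom_up[obj] * top_down.get(obj, 1.0) for obj in bottom_up}
    return max(scores, key=scores.get)


bottom_up_saliency = {"lamp": 0.7, "door": 0.4, "avatar": 0.6}   # stimulus-driven
top_down_context = {"door": 2.0, "avatar": 1.5}                  # goal-directed (e.g. navigation target)
print(attended_object(bottom_up_saliency, top_down_context))     # -> "avatar"
```

The predicted object could then drive a level-of-detail manager, rendering the attended object at higher fidelity than its neighbours.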
Towse, Adrian; Garrison, Louis P
2010-01-01
This article examines performance-based risk-sharing agreements for pharmaceuticals from a theoretical economic perspective. We position these agreements as a form of coverage with evidence development. New performance-based risk sharing could produce a more efficient market equilibrium, achieved by adjustment of the price post-launch to reflect outcomes combined with a new approach to the post-launch costs of evidence collection. For this to happen, the party best able to manage or to bear specific risks must do so. Willingness to bear risk will depend not only on ability to manage it, but on the degree of risk aversion. We identify three related frameworks that provide relevant insights: value of information, real option theory and money-back guarantees. We identify four categories of risk sharing: budget impact, price discounting, outcomes uncertainty and subgroup uncertainty. We conclude that a value of information/real option framework is likely to be the most helpful approach for understanding the costs and benefits of risk sharing. There are a number of factors that are likely to be crucial in determining if performance-based or risk-sharing agreements are efficient and likely to become more important in the future: (i) the cost and practicality of post-launch evidence collection relative to pre-launch; (ii) the feasibility of coverage with evidence development without a pre-agreed contract as to how the evidence will be used to adjust price, revenues or use, in which uncertainty around the pay-off to additional research will reduce the incentive for the manufacturer to collect the information; (iii) the difficulty of writing and policing risk-sharing agreements; (iv) the degree of risk aversion (and therefore opportunity to trade) on the part of payers and manufacturers; and (v) the extent of transferability of data from one country setting to another to support coverage with evidence development in a risk-sharing framework. There is no doubt that--in principle--risk sharing can provide manufacturers and payers additional real options that increase overall efficiency. Given the lack of empirical evidence on the success of schemes already agreed and on the issues we set out above, it is too early to tell if the recent surge of interest in these arrangements is likely to be a trend or only a fad.
Evaluating the use of key performance indicators to evidence the patient experience.
McCance, Tanya; Hastings, Jack; Dowler, Hilda
2015-11-01
To test eight person-centred key performance indicators and the feasibility of an appropriate measurement framework as an approach to evidencing the patient experience. The value of measuring the quality of patient care is undisputed in the international literature, however, the type of measures that can be used to generate data that is meaningful for practice continues to be debated. This paper offers a different perspective to the 'measurement' of the nursing and midwifery contribution to the patient experience. Fourth generation evaluation was the methodological approach used to evaluate the implementation of the key performance indicators and measurement framework across three participating organisations involving nine practice settings. Data were collected by repeated use of claims, concerns and issues with staff working across nine participating sites (n = 18) and the senior executives from the three partner organisations (n = 12). Data were collected during the facilitated sessions with stakeholders and analysed in conjunction with the data generated from the measurement framework. The data reveal the inherent value placed on the evidence generated from the implementation of the key performance indicators as reflected in the following themes: measuring what matters; evidencing the patient experience; engaging staff; a focus for improving practice; and articulating and demonstrating the positive contribution of nursing and midwifery. The implementation of the key performance indicators and the measurement framework has been effective in generating evidence that demonstrates the patient experience. The nature of the data generated not only privileges the patient voice but also offers feedback to nurses and midwives that can inform the development of person-centred cultures. The use of these indicators will produce evidence of patient experience that can be used by nurse and midwives to celebrate and further inform person-centred practice. © 2015 John Wiley & Sons Ltd.
Flexible patient information search and retrieval framework: pilot implementation
NASA Astrophysics Data System (ADS)
Erdal, Selnur; Catalyurek, Umit V.; Saltz, Joel; Kamal, Jyoti; Gurcan, Metin N.
2007-03-01
Medical centers collect and store significant amount of valuable data pertaining to patients' visit in the form of medical free-text. In addition, standardized diagnosis codes (International Classification of Diseases, Ninth Revision, Clinical Modification: ICD9-CM) related to those dictated reports are usually available. In this work, we have created a framework where image searches could be initiated through a combination of free-text reports as well as ICD9 codes. This framework enables more comprehensive search on existing large sets of patient data in a systematic way. The free text search is enriched by computer-aided inclusion of additional search terms enhanced by a thesaurus. This combination of enriched search allows users to access to a larger set of relevant results from a patient-centric PACS in a simpler way. Therefore, such framework is of particular use in tasks such as gathering images for desired patient populations, building disease models, and so on. As the motivating application of our framework, we implemented a search engine. This search engine processed two years of patient data from the OSU Medical Center's Information Warehouse and identified lung nodule location information using a combination of UMLS Meta-Thesaurus enhanced text report searches along with ICD9 code searches on patients that have been discharged. Five different queries with various ICD9 codes involving lung cancer were carried out on 172552 cases. Each search was completed under a minute on average per ICD9 code and the inclusion of UMLS thesaurus increased the number of relevant cases by 45% on average.
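A minimal sketch of the combined query idea, filtering reports by ICD-9 code and by thesaurus-expanded free-text terms; the report structures, synonym list, and example code are illustrative and not the OSU Information Warehouse implementation.

```python
# Combine an ICD-9 code filter with a thesaurus-enriched free-text search over reports.
reports = [
    {"id": 1, "icd9": ["162.9"], "text": "Spiculated nodule in the right upper lobe."},
    {"id": 2, "icd9": ["486"],   "text": "Consolidation consistent with pneumonia."},
]

synonyms = {"nodule": ["nodule", "mass", "lesion"]}  # thesaurus-style term expansion


def search(reports: list, icd9_code: str, term: str) -> list:
    expanded = synonyms.get(term, [term])
    return [r for r in reports
            if icd9_code in r["icd9"]
            and any(word in r["text"].lower() for word in expanded)]


print(search(reports, "162.9", "nodule"))  # reports matching both the code and enriched text
```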
Gibson, Amelia N.
2016-01-01
This grounded theory study used in-depth, semi-structured interview to examine the information-seeking behaviors of 35 parents of children with Down syndrome. Emergent themes include a progressive pattern of behavior including information overload and avoidance, passive attention, and active information seeking; varying preferences between tacit and explicit information at different stages; and selection of information channels and sources that varied based on personal and situational constraints. Based on the findings, the author proposes a progressive model of health information seeking and a framework for using this model to collect data in practice. The author also discusses the practical and theoretical implications of a responsive, progressive approach to understanding parents’ health information–seeking behavior. PMID:28462351
The Role of Conceptual Frameworks in Collecting Multisite Qualitative Data.
ERIC Educational Resources Information Center
Lotto, Linda S.
1983-01-01
Examines the use of conceptual frameworks in collecting qualitative data from multiple sites. Presents strategies for devising frameworks that are flexible and general without sacrificing specificity. (JOW)
Whittaker, Maxine; Hodge, Nicola; Mares, Renata E; Rodney, Anna
2015-04-01
Health information is required for a variety of purposes at all levels of a health system, and a workforce skilled in collecting, analysing, presenting, and disseminating such information is essential to fulfil these demands. While it is established that low- and middle-income countries (LMICs) are facing shortages in human resources for health (HRH), there has been little systematic attention focussed on non-clinical competencies. In response, we developed a framework that defines the minimum health information competencies required by health workers at various levels of a health system. Using the Delphi method, we consulted with leading global health information system (HIS) experts. An initial list of competencies and draft framework were developed based on results of a systematic literature review. During the second half of 2012, we sampled 38 experts with broad-based HIS knowledge and extensive development experience. Two rounds of consultation were carried out with the same group to establish validity of the framework and gain feedback on the draft competencies. Responses from consultations were analysed using Qualtrics® software and content analysis. In round one, 17 experts agreed to participate in the consultation and 11 (65%) completed the survey. In the second round, 11 experts agreed to participate and eight (73%) completed the survey. Overall, respondents agreed that there is a need for all health workers to have basic HIS competencies and that the concept of a minimum HIS competency framework is valid. Consensus was reached around the inclusion of 68 competencies across four levels of a health system. This consultation is one of the first to identify the HIS competencies required among general health workers, as opposed to specialist HIS roles. It is also one of the first attempts to develop a framework on minimum HIS competencies needed in LMICs, highlighting the skills needed at each level of the system, and identifying potential gaps in current training to allow a more systematic approach to HIS capacity-building.
Meeting report: advancing practical applications of biodiversity ontologies
2014-01-01
We describe the outcomes of three recent workshops aimed at advancing development of the Biological Collections Ontology (BCO), the Population and Community Ontology (PCO), and tools to annotate data using those and other ontologies. The first workshop gathered use cases to help grow the PCO, agreed upon a format for modeling challenging concepts such as ecological niche, and developed ontology design patterns for defining collections of organisms and population-level phenotypes. The second focused on mapping datasets to ontology terms and converting them to Resource Description Framework (RDF), using the BCO. To follow up, a BCO hackathon was held concurrently with the 16th Genomics Standards Consortium Meeting, during which we converted additional datasets to RDF, developed a Material Sample Core for the Global Biodiversity Information Facility, created a Web Ontology Language (OWL) file for importing Darwin Core classes and properties into BCO, and developed a workflow for converting biodiversity data among formats.
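A small hedged sketch of the dataset-to-RDF mapping step, expressing one specimen row with Darwin Core terms via rdflib; the subject IRI and values are invented, and the workshops' actual mappings additionally drew on BCO classes.

```python
# Map a single biodiversity record to RDF using the Darwin Core terms namespace.
from rdflib import Graph, Literal, Namespace, URIRef

DWC = Namespace("http://rs.tdwg.org/dwc/terms/")
g = Graph()
g.bind("dwc", DWC)

sample = URIRef("http://example.org/material-sample/42")   # hypothetical identifier
g.add((sample, DWC.scientificName, Literal("Quercus alba")))
g.add((sample, DWC.eventDate, Literal("2013-06-15")))
g.add((sample, DWC.basisOfRecord, Literal("MaterialSample")))

print(g.serialize(format="turtle"))
```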
INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorensek, M.; Hamm, L.; Garcia, H.
2011-07-18
Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.
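As a toy illustration of the inverse problem mentioned above (not the paper's method), the sketch below updates beliefs over hypothesised facility operating modes from a single observation using a simple Bayesian update; the modes, observation, and likelihood values are invented.

```python
# Infer which hypothesised operating mode best explains an observation.
prior = {"declared_operation": 0.8, "undeclared_reprocessing": 0.2}

# P(observation | facility mode) for one hypothetical observation stream.
likelihood = {
    ("elevated_heat_signature", "declared_operation"): 0.3,
    ("elevated_heat_signature", "undeclared_reprocessing"): 0.7,
}


def update(prior: dict, observation: str) -> dict:
    """Bayes' rule over a discrete set of facility modes."""
    unnorm = {mode: prior[mode] * likelihood[(observation, mode)] for mode in prior}
    total = sum(unnorm.values())
    return {mode: p / total for mode, p in unnorm.items()}


print(update(prior, "elevated_heat_signature"))  # posterior over facility modes
```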
Kirschenmann Road multi-well monitoring site, Cuyama Valley, Santa Barbara County, California
Everett, R.R.; Hanson, R.T.; Sweetkind, D.S.
2011-01-01
The U.S. Geological Survey (USGS), in cooperation with the Water Agency Division of the Santa Barbara County Department of Public Works, is evaluating the geohydrology and water availability of the Cuyama Valley, California (fig. 1). As part of this evaluation, the USGS installed the Cuyama Valley Kirschenmann Road multiple-well monitoring site (CVKR) in the South-Main subregion of the Cuyama Valley (fig. 1). The CVKR well site is designed to allow for the collection of depth-specific water-level and water-quality data. Data collected at this site provides information about the geology, hydrology, geophysics, and geochemistry of the local aquifer system, thus, enhancing the understanding of the geohydrologic framework of the Cuyama Valley. This report presents the construction information and initial geohydrologic data collected from the CVKR monitoring site, along with a brief comparison to selected supply and irrigation wells from the major subregions of the Cuyama Valley (fig. 1).
Resources monitoring and automatic management system for multi-VO distributed computing system
NASA Astrophysics Data System (ADS)
Chen, J.; Pelevanyuk, I.; Sun, Y.; Zhemchugov, A.; Yan, T.; Zhao, X. H.; Zhang, X. M.
2017-10-01
Multi-VO support based on DIRAC has been set up to provide workload and data management for several high energy physics experiments at IHEP. To monitor and manage the heterogeneous resources which belong to different Virtual Organizations in a uniform way, a resources monitoring and automatic management system based on the Resource Status System (RSS) of DIRAC is presented in this paper. The system is composed of three parts: information collection, status decision and automatic control, and information display. The information collection includes active and passive ways of gathering status from different sources and stores the results in databases. The status decision and automatic control part is used to evaluate the resources status and take control actions on resources automatically through some pre-defined policies and actions. The monitoring information is displayed on a web portal. Both real-time and historical information can be obtained from the web portal. All the implementations are based on the DIRAC framework. The information and control, including sites, policies, and the web portal for different VOs, can be well defined and distinguished within the DIRAC user and group management infrastructure.
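A simplified sketch of the status-decision-and-control part described above: pre-defined policies evaluate collected resource information and map each resource to a status and a control action. The policy names, thresholds, and site name are illustrative and do not reflect DIRAC's actual RSS interfaces.

```python
# Evaluate collected resource information against an ordered list of policies.
POLICIES = [
    # (predicate over collected info, resulting status, control action)
    (lambda info: info["failed_jobs_ratio"] > 0.5, "Banned", "remove from brokering"),
    (lambda info: info["failed_jobs_ratio"] > 0.2, "Degraded", "reduce pilot submission"),
    (lambda info: True, "Active", "no action"),
]


def evaluate(resource_name: str, info: dict) -> dict:
    """Return the first matching policy's status and action for a resource."""
    for predicate, status, action in POLICIES:
        if predicate(info):
            return {"resource": resource_name, "status": status, "action": action}


collected = {"SITE.Example.cn": {"failed_jobs_ratio": 0.35}}
for name, info in collected.items():
    print(evaluate(name, info))   # feeds both automatic control and the web portal display
```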
Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M
2018-03-05
The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize response to treatments and (2) in-depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and the access of information related to interactions, mechanisms of action, functions, etc., which ultimately provides a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. Our integrative analytical approach provides novel means for a systematic integrative analysis of heterogeneous data types in the development of complex botanicals such as polyphenols for eventual clinical and translational applications.
A general framework for a collaborative water quality knowledge and information network.
Dalcanale, Fernanda; Fontane, Darrell; Csapo, Jorge
2011-03-01
Increasing knowledge about the environment has brought about a better understanding of the complexity of the issues, and more information publicly available has resulted in a steady shift from centralized decision making to increasing levels of participatory processes. The management of that information, in turn, is becoming more complex. One of the ways to deal with the complexity is the development of tools that would allow all players, including managers, researchers, educators, stakeholders and the civil society, to be able to contribute to the information system, at any level they are inclined to do so. In this project, a search for the available technology for collaboration, methods of community filtering, and community-based review was performed and the possible implementation of these tools to create a general framework for a collaborative "Water Quality Knowledge and Information Network" was evaluated. The main goals of the network are to advance water quality education and knowledge; encourage distribution and access to data; provide networking opportunities; allow public perceptions and concerns to be collected; promote exchange of ideas; and give general, open, and free access to information. A reference implementation was made available online and received positive feedback from the community, which also suggested some possible improvements.
A General Framework for a Collaborative Water Quality Knowledge and Information Network
NASA Astrophysics Data System (ADS)
Dalcanale, Fernanda; Fontane, Darrell; Csapo, Jorge
2011-03-01
Increasing knowledge about the environment has brought about a better understanding of the complexity of the issues, and more information publicly available has resulted in a steady shift from centralized decision making to increasing levels of participatory processes. The management of that information, in turn, is becoming more complex. One of the ways to deal with the complexity is the development of tools that would allow all players, including managers, researchers, educators, stakeholders and the civil society, to be able to contribute to the information system, at any level they are inclined to do so. In this project, a search for the available technology for collaboration, methods of community filtering, and community-based review was performed and the possible implementation of these tools to create a general framework for a collaborative "Water Quality Knowledge and Information Network" was evaluated. The main goals of the network are to advance water quality education and knowledge; encourage distribution and access to data; provide networking opportunities; allow public perceptions and concerns to be collected; promote exchange of ideas; and give general, open, and free access to information. A reference implementation was made available online and received positive feedback from the community, which also suggested some possible improvements.
Marine and Hydrokinetic Technology Development Risk Management Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snowberg, David; Weber, Jochem
2015-09-01
Over the past decade, the global marine and hydrokinetic (MHK) industry has suffered a number of serious technological and commercial setbacks. To help reduce the risks of industry failures and advance the development of new technologies, the U.S. Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL) developed an MHK Risk Management Framework. By addressing uncertainties, the MHK Risk Management Framework increases the likelihood of successful development of an MHK technology. It covers projects of any technical readiness level (TRL) or technical performance level (TPL) and all risk types (e.g. technological risk, regulatory risk, commercial risk) over the development cycle. This framework is intended for the development and deployment of a single MHK technology, not for multiple device deployments within a plant. This risk framework is intended to meet DOE's risk management expectations for the MHK technology research and development efforts of the Water Power Program (see Appendix A). It also provides an overview of other relevant risk management tools and documentation. This framework emphasizes design and risk reviews as formal gates to ensure risks are managed throughout the technology development cycle. Section 1 presents the recommended technology development cycle; Sections 2 and 3 present tools to assess the TRL and TPL of the project, respectively. Section 4 presents a risk management process with design and risk reviews for actively managing risk within the project, and Section 5 presents a detailed description of a risk registry to collect the risk management information into one living document. Section 6 presents recommendations for collecting and using lessons learned throughout the development process.
On the coding and reporting of race and ethnicity in New Hampshire for purposes of cancer reporting.
Riddle, Bruce L
2005-01-01
The objective was to investigate how data on race and ethnicity are collected by hospitals reporting to the New Hampshire State Cancer Registry (NHSCR). NHSCR surveyed hospitals asking how information on race and ethnicity was collected. A review of relevant legal mandates and national guidelines was undertaken. Many hospitals lack policies on collection, computer systems fail to support national guidelines, and staff rely on visual inspection. Hospital staff are not currently culturally equipped to collect race and ethnicity data in a meaningful way. The numerator in cancer incidence rates is most likely inaccurate and, for some smaller populations, very biased. A new framework is needed that takes into account the needs of the democracy.
Unleashing Our Untapped Domestic Collection is the Key to Prevention
2007-09-01
Information Center (NCIC), Uniform Crime Reporting (UCR), and Integrated Automated Fingerprint Identification System (IAFIS) fingerprint... The Blue Ocean Strategy Canvas, as described by Kim and Mauborgne, is an analytical framework that is both diagnostic and action oriented... The authors argue the value of a strategy canvas is its ability to capture the current state, provide an understanding of various factors impacting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-29
... ID No. EPA-HQ-OEI- 2011-0096, to (1) EPA online using http://www.regulations.gov (our preferred... a public docket for this ICR under Docket ID No. EPA-HQ-OEI-2011-0096, which is available for online..., technology-neutral framework for electronic reporting across all EPA programs; allow EPA programs to offer...
ERIC Educational Resources Information Center
Russell, Heather Gordy
2010-01-01
The mixed-method study focused on increasing blood donations from staff who work in a blood collecting organization and relied on Gilbert's Behavior Engineering Model as a framework. The qualitative phase of the study involved focus groups. Information from the focus groups and the literature review was used to create hypotheses. A survey was…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-04
... their days-at-sea (DAS) declaration through their VMS from a Northeast (NE) multispecies Category A DAS to a monkfish DAS while at sea, i.e., before crossing the VMS demarcation line upon the vessel's... DAS in the NFMA enables NMFS to monitor the overall fishing effort, in the form of monkfish DAS usage...
ERIC Educational Resources Information Center
Bien, Barbara; Wojszel, Beata; Sikorska-Simmons, Elzbieta
2007-01-01
This study examines rural-urban differences in informal caregivers' perceptions of caregiving. The study's theoretical framework is based on the two-factor model of caregiving, which views caregiving as having both positive and negative impact. Data were collected in personal interviews with 126 rural and 127 urban caregivers in the Bialystok…
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon; Benjamin, Allan; Everett, Christopher; Feather, Martin; Rutledge, Peter; Sen, Dev; Youngblood, Robert
2015-01-01
This is the second of two volumes that collectively comprise the NASA System Safety Handbook. Volume 1 (NASA/SP-2010-580) was prepared for the purpose of presenting the overall framework for System Safety and for providing the general concepts needed to implement the framework. Volume 2 provides guidance for implementing these concepts as an integral part of systems engineering and risk management. This guidance addresses the following functional areas: 1. The development of objectives that collectively define adequate safety for a system, and the safety requirements derived from these objectives that are levied on the system. 2. The conduct of system safety activities, performed to meet the safety requirements, with specific emphasis on the conduct of integrated safety analysis (ISA) as a fundamental means by which systems engineering and risk management decisions are risk-informed. 3. The development of a risk-informed safety case (RISC) at major milestone reviews to argue that the system safety objectives are satisfied (and therefore that the system is adequately safe). 4. The evaluation of the RISC (including supporting evidence) using a defined set of evaluation criteria, to assess the veracity of the claims made therein in order to support risk acceptance decisions.
Adversity magnifies the importance of social information in decision-making.
Pérez-Escudero, Alfonso; de Polavieja, Gonzalo G
2017-11-01
Decision-making theories explain animal behaviour, including human behaviour, as a response to estimations about the environment. In the case of collective behaviour, they have given quantitative predictions of how animals follow the majority option. However, they have so far failed to explain that in some species and contexts social cohesion increases when conditions become more adverse (i.e. individuals choose the majority option with higher probability when the estimated quality of all available options decreases). We have found that this failure is due to modelling simplifications that aided analysis, like low levels of stochasticity or the assumption that only one choice is the correct one. We provide a more general but simple geometric framework to describe optimal or suboptimal decisions in collectives that gives insight into three different mechanisms behind this effect. The three mechanisms have in common that the private information acts as a gain factor to social information: a decrease in the privately estimated quality of all available options increases the impact of social information, even when social information itself remains unchanged. This increase in the importance of social information makes it more likely that agents will follow the majority option. We show that these results quantitatively explain collective behaviour in fish and experiments of social influence in humans. © 2017 The Authors.
Noonan, Vanessa K; Thorogood, Nancy P; Joshi, Phalgun B; Fehlings, Michael G; Craven, B Catharine; Linassi, Gary; Fourney, Daryl R; Kwon, Brian K; Bailey, Christopher S; Tsai, Eve C; Drew, Brian M; Ahn, Henry; Tsui, Deborah; Dvorak, Marcel F
2013-05-01
Privacy legislation addresses concerns regarding the privacy of personal information; however, its interpretation by research ethics boards has resulted in significant challenges to the collection, management, use and disclosure of personal health information for multi-centre research studies. This paper describes the strategy used to develop the national Rick Hansen Spinal Cord Injury Registry (RHSCIR) in accordance with privacy statutes and benchmarked against best practices. An analysis of the regional and national privacy legislation was conducted to determine the requirements for each of the 31 local RHSCIR sites and the national RHSCIR office. A national privacy and security framework was created for RHSCIR that includes a governance structure, standard operating procedures, training processes, physical and technical security and privacy impact assessments. The framework meets a high-water mark in ensuring privacy and security of personal health information nationally and may assist in the development of other national or international research initiatives. Copyright © 2013 Longwoods Publishing.
Imam, Fahim T.; Larson, Stephen D.; Bandrowski, Anita; Grethe, Jeffery S.; Gupta, Amarnath; Martone, Maryann E.
2012-01-01
An initiative of the NIH Blueprint for neuroscience research, the Neuroscience Information Framework (NIF) project advances neuroscience by enabling discovery and access to public research data and tools worldwide through an open source, semantically enhanced search portal. One of the critical components for the overall NIF system, the NIF Standardized Ontologies (NIFSTD), provides an extensive collection of standard neuroscience concepts along with their synonyms and relationships. The knowledge models defined in the NIFSTD ontologies enable an effective concept-based search over heterogeneous types of web-accessible information entities in NIF's production system. NIFSTD covers major domains in neuroscience, including diseases, brain anatomy, cell types, sub-cellular anatomy, small molecules, techniques, and resource descriptors. Since the first production release in 2008, NIF has grown significantly in content and functionality, particularly with respect to the ontologies and ontology-based services that drive the NIF system. We present here the structure, design principles, community engagement, and current state of the NIFSTD ontologies. PMID:22737162
Noonan, Vanessa K.; Thorogood, Nancy P.; Joshi, Phalgun B.; Fehlings, Michael G.; Craven, B. Catharine; Linassi, Gary; Fourney, Daryl R.; Kwon, Brian K.; Bailey, Christopher S.; Tsai, Eve C.; Drew, Brian M.; Ahn, Henry; Tsui, Deborah; Dvorak, Marcel F.
2013-01-01
Privacy legislation addresses concerns regarding the privacy of personal information; however, its interpretation by research ethics boards has resulted in significant challenges to the collection, management, use and disclosure of personal health information for multi-centre research studies. This paper describes the strategy used to develop the national Rick Hansen Spinal Cord Injury Registry (RHSCIR) in accordance with privacy statutes and benchmarked against best practices. An analysis of the regional and national privacy legislation was conducted to determine the requirements for each of the 31 local RHSCIR sites and the national RHSCIR office. A national privacy and security framework was created for RHSCIR that includes a governance structure, standard operating procedures, training processes, physical and technical security and privacy impact assessments. The framework meets a high-water mark in ensuring privacy and security of personal health information nationally and may assist in the development of other national or international research initiatives. PMID:23968640
Judging nursing information on the WWW: a theoretical understanding.
Cader, Raffik; Campbell, Steve; Watson, Don
2009-09-01
This paper is a report of a study of the judgement processes nurses use when evaluating World Wide Web information related to nursing practice. The World Wide Web has increased the global accessibility of online health information. However, the variable nature of the quality of World Wide Web information and its perceived level of reliability may lead to misinformation. This makes demands on healthcare professionals, and on nurses in particular, to ensure that health information of reliable quality is selected for use in practice. A grounded theory approach was adopted. Semi-structured interviews and focus groups were used to collect data, between 2004 and 2005, from 20 nurses undertaking a postqualification graduate course at a university and 13 nurses from a local hospital in the United Kingdom. A theoretical framework emerged that gave insight into the judgement process nurses use when evaluating World Wide Web information. Participants broke the judgement process down into specific tasks. In addition, they used tacit, process and propositional knowledge and intuition, quasi-rational cognition and analysis to undertake these tasks. World Wide Web information cues, time available and nurses' critical skills were influencing factors in their judgement process. Addressing the issue of quality and reliability associated with World Wide Web information is a global challenge. This theoretical framework could contribute towards meeting this challenge.
Composing the theme of city to be diverse and sustainable
NASA Astrophysics Data System (ADS)
Wiranegara, H. W.
2018-01-01
Giving a city a path for development requires a theme. City goals stated in spatial plan documents are too broad and insufficiently detailed to give direction. To be more detailed and precise, every city has to compose a city theme. It is developed based on the potential, the uniqueness, the excellence, and the sustainability of its human resources, natural resources, and man-made resources. An integration of the three resources with the highest scores becomes the theme of the city. The aim of this research was to formulate a conceptual framework for composing a city theme. The research design was an interview survey in Banda Aceh, Banjarmasin, and Kupang. Informants were government officials, academics, public figures, the private sector and members of the public considered relevant to the information being collected. Having set the conceptual framework, the interviews were directed at checking its implementation in practice. The result was that the conceptual framework could accommodate the phenomenon of composing the theme of the city. Yet, it was preliminary in nature, and more research is needed to obtain a complete result.
Javadian, Sanas; Stigler-Granados, Paula; Curtis, Clifton; Thompson, Francis; Huber, Laurent; Novotny, Thomas E
2015-08-18
Cigarette butts (tobacco product waste (TPW)) are the single most collected item in environmental trash cleanups worldwide. This brief descriptive study used an online survey tool (Survey Monkey) to assess knowledge, attitudes, and beliefs among individuals representing the Framework Convention Alliance (FCA) about this issue. The FCA has about 350 members, including mainly non-governmental tobacco control advocacy groups that support implementation of the World Health Organization's (WHO) Framework Convention on Tobacco Control (FCTC). Although the response rate (28%) was low, respondents represented countries from all six WHO regions. The majority (62%) have heard the term TPW, and nearly all (99%) considered TPW as an environmental harm. Most (77%) indicated that the tobacco industry should be responsible for TPW mitigation, and 72% felt that smokers should also be held responsible. This baseline information may inform future international discussions by the FCTC Conference of the Parties (COP) regarding environmental policies that may be addressed within FCTC obligations. Additional research is planned regarding the entire lifecycle of tobacco's impact on the environment.
NASA Astrophysics Data System (ADS)
Scheele, C. J.; Huang, Q.
2016-12-01
In the past decade, the rise in social media has led to the development of a vast number of social media services and applications. Disaster management represents one such application leveraging massive data generated for event detection, response, and recovery. In order to find disaster relevant social media data, current approaches utilize natural language processing (NLP) methods based on keywords, or machine learning algorithms relying on text only. However, these approaches cannot be perfectly accurate due to the variability and uncertainty in language used on social media. To improve current methods, the enhanced text-mining framework is proposed to incorporate location information from social media and authoritative remote sensing datasets for detecting disaster relevant social media posts, which are determined by assessing the textual content using common text mining methods and how the post relates spatiotemporally to the disaster event. To assess the framework, geo-tagged Tweets were collected for three different spatial and temporal disaster events: hurricane, flood, and tornado. Remote sensing data and products for each event were then collected using RealEarth™. Both Naive Bayes and Logistic Regression classifiers were used to compare the accuracy within the enhanced text-mining framework. Finally, the accuracies from the enhanced text-mining framework were compared to the current text-only methods for each of the case study disaster events. The results from this study address the need for more authoritative data when using social media in disaster management applications.
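For concreteness, the classifier-comparison step described above could look roughly like the following sketch (the tweet texts, labels, and use of scikit-learn are placeholders and assumptions, not the authors' actual data or tooling):

    # Sketch of the Naive Bayes vs. Logistic Regression comparison on tweet
    # text; the tweets, labels, and library choice are placeholders.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    tweets = ["flood water rising near the river", "great concert tonight",
              "tornado warning issued for the county", "lunch with friends",
              "hurricane winds knocked out power", "new phone arrived today"] * 10
    labels = [1, 0, 1, 0, 1, 0] * 10  # 1 = disaster relevant, 0 = not

    X = TfidfVectorizer().fit_transform(tweets)
    X_train, X_test, y_train, y_test = train_test_split(
        X, labels, test_size=0.3, random_state=0)

    for model in (MultinomialNB(), LogisticRegression(max_iter=1000)):
        model.fit(X_train, y_train)
        accuracy = accuracy_score(y_test, model.predict(X_test))
        print(type(model).__name__, accuracy)

The framework described above would additionally weight or filter these text-based predictions by how the post's location and time relate to the remotely sensed extent of the event.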
A Framework for Enhancing Real-time Social Media Data to Improve Disaster Management Process
NASA Astrophysics Data System (ADS)
Attique Shah, Syed; Zafer Şeker, Dursun; Demirel, Hande
2018-05-01
Social media datasets are playing a vital role in providing information that can support decision making in nearly all domains of technology. This is because social media is a quick and economical approach for collecting data from the public through methods like crowdsourcing. Existing research has already shown that, in case of any disaster (natural or man-made), the information extracted from social media sites is very critical to disaster management systems for response and reconstruction. This study comprises two components: the first part proposes a framework that provides updated and filtered real-time input data for the disaster management system through social media, and the second part consists of a web user API designed for a structured and defined real-time data input process. This study contributes to the discipline of design science for the information systems domain. The aim of this study is to propose a framework that can filter and organize data from unstructured social media sources through recognized methods and bring this retrieved data to the same level as that collected through a structured and predefined mechanism of a web API. Both components are designed so that they can potentially collaborate and produce updated information for a disaster management system to carry out an accurate and effective response.
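The abstract does not show the web user API itself; a minimal hypothetical endpoint of this kind (the route, field names, and use of Flask are all assumptions) might accept a structured report as follows:

    # Hypothetical minimal web API for structured, real-time disaster reports
    # (the endpoint, field names, and use of Flask are assumptions, not the
    # authors' actual API).
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    reports = []  # in-memory store, sufficient for the sketch

    @app.route("/reports", methods=["POST"])
    def submit_report():
        payload = request.get_json(force=True)
        report = {
            "text": payload.get("text", ""),
            "lat": payload.get("lat"),
            "lon": payload.get("lon"),
            "timestamp": payload.get("timestamp"),
            "source": payload.get("source", "social_media"),
        }
        reports.append(report)
        return jsonify({"status": "accepted", "count": len(reports)}), 201

    if __name__ == "__main__":
        app.run()

A structured schema of this sort is what would let filtered social media posts and deliberately submitted public reports arrive at the disaster management system in the same form.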
An information theory account of cognitive control.
Fan, Jin
2014-01-01
Our ability to efficiently process information and generate appropriate responses depends on the processes collectively called cognitive control. Despite a considerable focus in the literature on the cognitive control of information processing, neural mechanisms underlying control are still unclear, and have not been characterized by considering the quantity of information to be processed. A novel and comprehensive account of cognitive control is proposed using concepts from information theory, which is concerned with communication system analysis and the quantification of information. This account treats the brain as an information-processing entity where cognitive control and its underlying brain networks play a pivotal role in dealing with conditions of uncertainty. This hypothesis and theory article justifies the validity and properties of such an account and relates experimental findings to the frontoparietal network under the framework of information theory.
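As a concrete anchor for "the quantity of information to be processed", Shannon entropy is the standard measure; the short sketch below (with invented condition probabilities) computes the uncertainty, in bits, over a set of task conditions:

    # Shannon entropy as a standard way to quantify the uncertainty (in bits)
    # to be resolved across a set of task conditions; the probabilities are
    # invented for illustration.
    from math import log2

    def entropy(probabilities):
        """H = -sum(p * log2(p)) over outcomes with nonzero probability."""
        return -sum(p * log2(p) for p in probabilities if p > 0)

    uniform = [0.25, 0.25, 0.25, 0.25]  # four equally likely conditions
    skewed = [0.7, 0.1, 0.1, 0.1]       # one condition dominates

    print(entropy(uniform))  # 2.0 bits: maximal uncertainty, high control demand
    print(entropy(skewed))   # ~1.36 bits: lower uncertainty, lower demand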
Personal photograph enhancement using internet photo collections.
Zhang, Chenxi; Gao, Jizhou; Wang, Oliver; Georgel, Pierre; Yang, Ruigang; Davis, James; Frahm, Jan-Michael; Pollefeys, Marc
2014-02-01
Given the growth of Internet photo collections, we now have a visual index of all major cities and tourist sites in the world. However, it is still a difficult task to capture that perfect shot with your own camera when visiting these places, especially when your camera itself has limitations, such as a limited field of view. In this paper, we propose a framework to overcome the imperfections of personal photographs of tourist sites using the rich information provided by large-scale Internet photo collections. Our method deploys state-of-the-art techniques for constructing initial 3D models from photo collections. The same techniques are then used to register personal photographs to these models, allowing us to augment personal 2D images with 3D information. This strong available scene prior allows us to address a number of traditionally challenging image enhancement techniques and achieve high-quality results using simple and robust algorithms. Specifically, we demonstrate automatic foreground segmentation, mono-to-stereo conversion, field-of-view expansion, photometric enhancement, and additionally automatic annotation with geolocation and tags. Our method clearly demonstrates some possible benefits of employing the rich information contained in online photo databases to efficiently enhance and augment one's own personal photographs.
NASA Astrophysics Data System (ADS)
Bezawada, Rajesh; Uijt de Haag, Maarten
2010-04-01
This paper discusses the results of an initial evaluation study of hazard and integrity monitor functions for use with integrated alerting and notification. The Hazard and Integrity Monitor (HIM) (i) allocates information sources within the Integrated Intelligent Flight Deck (IIFD) to required functionality (like conflict detection and avoidance) and determines required performance of these information sources as part of that function; (ii) monitors or evaluates the required performance of the individual information sources and performs consistency checks among various information sources; (iii) integrates the information to establish tracks of potential hazards that can be used for the conflict probes or conflict prediction for various time horizons including the 10, 5, 3, and <3 minutes used in our scenario; (iv) detects and assesses the class of the hazard and provides possible resolutions. The HIM monitors the operation-dependent performance parameters related to the potential hazards in a manner similar to the Required Navigation Performance (RNP). Various HIM concepts have been implemented and evaluated using a previously developed sensor simulator/synthesizer. Within the simulation framework, various inputs to the IIFD and its subsystems are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. The framework and HIM functions are implemented in Simulink®, a modeling language developed by The MathWorks™. This modeling language allows for test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft.
Defining a risk-informed framework for whole-of-government lessons learned: A Canadian perspective.
Friesen, Shaye K; Kelsey, Shelley; Legere, J A Jim
Lessons learned play an important role in emergency management (EM) and organizational agility. Virtually all aspects of EM can derive benefit from a lessons learned program. From major security events to exercises, exploiting and applying lessons learned and "best practices" is critical to organizational resilience and adaptiveness. A robust lessons learned process and methodology provides an evidence base with which to inform decisions, guide plans, strengthen mitigation strategies, and assist in developing tools for operations. The Canadian Safety and Security Program recently supported a project to define a comprehensive framework that would allow public safety and security partners to regularly share event response best practices, and prioritize recommendations originating from after action reviews. This framework consists of several inter-locking elements: a comprehensive literature review/environmental scan of international programs; a survey to collect data from end users and management; the development of a taxonomy for organizing and structuring information; a risk-informed methodology for selecting, prioritizing, and following through on recommendations; and standardized templates and tools for tracking recommendations and ensuring implementation. This article discusses the efforts of the project team, which provided "best practice" advice and analytical support to ensure that a systematic approach to lessons learned was taken by the federal community to improve prevention, preparedness, and response activities. It posits an approach by which one might design a systematic process for information sharing and event response coordination, an approach that will assist federal departments to institutionalize a cross-government lessons learned program.
Research on Crowdsourcing Emergency Information Extraction Based on Events' Frame
NASA Astrophysics Data System (ADS)
Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi
2018-01-01
At present, common information extraction methods cannot accurately extract structured emergency event information, general information retrieval tools cannot completely identify emergency geographic information, and neither approach provides an accurate assessment of the extracted results. This paper therefore proposes an emergency information collection technology based on an event framework, designed to solve the problem of emergency information extraction. It mainly includes an emergency information extraction model (EIEM), a complete address recognition method (CARM) and an accuracy evaluation model of emergency information (AEMEI). EIEM extracts emergency information in a structured form and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and a shortest-path algorithm, allowing toponym pieces to be joined into a full address. AEMEI analyzes the results of the emergency event and summarizes the advantages and disadvantages of the event framework. Experiments show that the event frame technology can solve the problem of emergency information extraction and provides reference cases for other applications. When an emergency is about to occur, the relevant departments can query data on emergencies that have occurred in the past and make arrangements ahead of schedule for defence and disaster reduction. The technology reduces casualties and property damage in the country and the world, which is of great significance to the state and society.
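The abstract gives CARM only in outline; the sketch below (place names, edge weights, and the use of networkx are assumptions, not the CARM implementation) illustrates the general idea of joining toponym pieces into a full address via a shortest path over a hierarchical place graph:

    # Illustrative only: join extracted toponym pieces into a full address by
    # finding a shortest path through a hierarchical place graph (names,
    # weights, and the use of networkx are assumptions, not CARM itself).
    import networkx as nx

    G = nx.DiGraph()
    # edges run from coarser to finer administrative units
    G.add_edge("Sichuan Province", "Chengdu City", weight=1)
    G.add_edge("Chengdu City", "Wuhou District", weight=1)
    G.add_edge("Wuhou District", "Renmin South Road", weight=1)
    G.add_edge("Sichuan Province", "Mianyang City", weight=1)

    coarse, fine = "Sichuan Province", "Renmin South Road"  # extracted fragments
    path = nx.shortest_path(G, source=coarse, target=fine, weight="weight")
    print(" ".join(path))  # full address assembled from the hierarchy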
Framework model and principles for trusted information sharing in pervasive health.
Ruotsalainen, Pekka; Blobel, Bernd; Nykänen, Pirkko; Seppälä, Antto; Sorvari, Hannu
2011-01-01
Trustfulness (i.e. health and wellness information is processed ethically, and privacy is guaranteed) is one of the cornerstones for future Personal Health Systems, ubiquitous healthcare and pervasive health. Trust in today's healthcare is organizational, static and predefined. Pervasive health takes place in an open and untrusted information space where person's lifelong health and wellness information together with contextual data are dynamically collected and used by many stakeholders. This generates new threats that do not exist in today's eHealth systems. Our analysis shows that the way security and trust are implemented in today's healthcare cannot guarantee information autonomy and trustfulness in pervasive health. Based on a framework model of pervasive health and risks analysis of ubiquitous information space, we have formulated principles which enable trusted information sharing in pervasive health. Principles imply that the data subject should have the right to dynamically verify trust and to control the use of her health information, as well as the right to set situation based context-aware personal policies. Data collectors and processors have responsibilities including transparency of information processing, and openness of interests, policies and environmental features. Our principles create a base for successful management of privacy and information autonomy in pervasive health. They also imply that it is necessary to create new data models for personal health information and new architectures which support situation depending trust and privacy management.
2009-01-01
Infodemiology can be defined as the science of distribution and determinants of information in an electronic medium, specifically the Internet, or in a population, with the ultimate aim to inform public health and public policy. Infodemiology data can be collected and analyzed in near real time. Examples for infodemiology applications include: the analysis of queries from Internet search engines to predict disease outbreaks (eg. influenza); monitoring peoples' status updates on microblogs such as Twitter for syndromic surveillance; detecting and quantifying disparities in health information availability; identifying and monitoring of public health relevant publications on the Internet (eg. anti-vaccination sites, but also news articles or expert-curated outbreak reports); automated tools to measure information diffusion and knowledge translation, and tracking the effectiveness of health marketing campaigns. Moreover, analyzing how people search and navigate the Internet for health-related information, as well as how they communicate and share this information, can provide valuable insights into health-related behavior of populations. Seven years after the infodemiology concept was first introduced, this paper revisits the emerging fields of infodemiology and infoveillance and proposes an expanded framework, introducing some basic metrics such as information prevalence, concept occurrence ratios, and information incidence. The framework distinguishes supply-based applications (analyzing what is being published on the Internet, eg. on Web sites, newsgroups, blogs, microblogs and social media) from demand-based methods (search and navigation behavior), and further distinguishes passive from active infoveillance methods. Infodemiology metrics follow population health relevant events or predict them. Thus, these metrics and methods are potentially useful for public health practice and research, and should be further developed and standardized. PMID:19329408
Eysenbach, Gunther
2009-03-27
Infodemiology can be defined as the science of distribution and determinants of information in an electronic medium, specifically the Internet, or in a population, with the ultimate aim to inform public health and public policy. Infodemiology data can be collected and analyzed in near real time. Examples for infodemiology applications include the analysis of queries from Internet search engines to predict disease outbreaks (eg. influenza), monitoring peoples' status updates on microblogs such as Twitter for syndromic surveillance, detecting and quantifying disparities in health information availability, identifying and monitoring of public health relevant publications on the Internet (eg. anti-vaccination sites, but also news articles or expert-curated outbreak reports), automated tools to measure information diffusion and knowledge translation, and tracking the effectiveness of health marketing campaigns. Moreover, analyzing how people search and navigate the Internet for health-related information, as well as how they communicate and share this information, can provide valuable insights into health-related behavior of populations. Seven years after the infodemiology concept was first introduced, this paper revisits the emerging fields of infodemiology and infoveillance and proposes an expanded framework, introducing some basic metrics such as information prevalence, concept occurrence ratios, and information incidence. The framework distinguishes supply-based applications (analyzing what is being published on the Internet, eg. on Web sites, newsgroups, blogs, microblogs and social media) from demand-based methods (search and navigation behavior), and further distinguishes passive from active infoveillance methods. Infodemiology metrics follow population health relevant events or predict them. Thus, these metrics and methods are potentially useful for public health practice and research, and should be further developed and standardized.
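The paper's formal metric definitions are not reproduced in the abstract; as a rough illustration (the documents and the simple keyword match are placeholders), a supply-based concept occurrence ratio can be computed as the share of sampled documents mentioning a concept:

    # Rough illustration of a supply-based infodemiology metric: the share of
    # sampled documents mentioning a concept (the documents and the keyword
    # match are placeholders, not the paper's formal definitions).
    documents = [
        "new influenza vaccination clinic opens downtown",
        "local team wins championship",
        "officials report a rise in influenza-like illness",
        "recipe: hearty winter soup",
    ]

    def occurrence_ratio(docs, concept):
        hits = sum(1 for doc in docs if concept.lower() in doc.lower())
        return hits / len(docs)

    print(occurrence_ratio(documents, "influenza"))  # 0.5 in this toy sample

Tracked over time, a ratio of this kind follows or anticipates population health relevant events, which is the sense in which such metrics support infoveillance.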
Gareen, Ilana F; Sicks, JoRean D; Jain, Amanda Adams; Moline, Denise; Coffman-Kadish, Nancy
2013-01-01
In clinical trials and epidemiologic studies, information on medical care utilization and health outcomes is often obtained from medical records. For multi-center studies, this information may be gathered by personnel at individual sites or by staff at a central coordinating center. We describe the process used to develop a HIPAA-compliant centralized process to collect medical record information for a large multi-center cancer screening trial. The framework used to select, request, and track medical records incorporated a participant questionnaire with unique identifiers for each medical provider. De-identified information from the questionnaires was sent to the coordinating center indexed by these identifiers. The central coordinating center selected specific medical providers for abstraction and notified sites using these identifiers. The site personnel then linked the identifiers with medical provider information. Staff at the sites collected medical records and provided them for central abstraction. Medical records were successfully obtained and abstracted to ascertain information on outcomes and health care utilization in a study with over 18,000 study participants. Collection of records required for outcomes related to positive screening examinations and lung cancer diagnosis exceeded 90%. Collection of records for all aims was 87.32%. We designed a successful centralized medical record abstraction process that may be generalized to other research settings, including observational studies. The coordinating center received no identifying data. The process satisfied requirements imposed by the Health Insurance Portability and Accountability Act and concerns of site institutional review boards with respect to protected health information. Copyright © 2012 Elsevier Inc. All rights reserved.
Gareen, Ilana F.; Sicks, JoRean; Adams, Amanda; Moline, Denise; Coffman-Kadish, Nancy
2012-01-01
Background In clinical trials and epidemiologic studies, information on medical care utilization and health outcomes is often obtained from medical records. For multi-center studies, this information may be gathered by personnel at individual sites or by staff at a central coordinating center. We describe the process used to develop a HIPAA-compliant centralized process to collect medical record information for a large multi-center cancer screening trial. Methods The framework used to select, request, and track medical records incorporated a participant questionnaire with unique identifiers for each medical provider. De-identified information from the questionnaires was sent to the coordinating center indexed by these identifiers. The central coordinating center selected specific medical providers for abstraction and notified sites using these identifiers. The site personnel then linked the identifiers with medical provider information. Staff at the sites collected medical records and provided them for central abstraction. Results Medical records were successfully obtained and abstracted to ascertain information on outcomes and health care utilization in a study with over 18,000 study participants. Collection of records required for outcomes related to positive screening examinations and lung cancer diagnosis exceeded 90%. Collection of records for all aims was 87.32%. Conclusions We designed a successful centralized medical record abstraction process that may be generalized to other research settings, including observational studies. The coordinating center received no identifying data. The process satisfied requirements imposed by the Health Insurance Portability and Accountability Act and concerns of site institutional review boards with respect to protected health information. PMID:22982342
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKinnon, Archibald D.; Thompson, Seth R.; Doroshchuk, Ruslan A.
Smart grid technologies are transforming the electric power grid into a grid with bi-directional flows of both power and information. Operating millions of new smart meters and smart appliances will significantly impact electric distribution systems, resulting in greater efficiency. However, the scale of the grid and the new types of information transmitted will potentially introduce several security risks that cannot be addressed by traditional, centralized security techniques. We propose a new bio-inspired cyber security approach. Social insects, such as ants and bees, have developed complex-adaptive systems that emerge from the collective application of simple, light-weight behaviors. The Digital Ants framework is a bio-inspired framework that uses mobile light-weight agents. Sensors within the framework use digital pheromones to communicate with each other and to alert each other of possible cyber security issues. All communication and coordination is both localized and decentralized, thereby allowing the framework to scale across the large numbers of devices that will exist in the smart grid. Furthermore, the sensors are light-weight and therefore suitable for implementation on devices with limited computational resources. This paper will provide a brief overview of the Digital Ants framework and then present results from test bed-based demonstrations that show that Digital Ants can identify a cyber attack scenario against smart meter deployments.
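The abstract describes the behavior only at a high level; the toy sketch below (topology, decay rate, and threshold are invented, and this is not the Digital Ants code) shows the general pattern of sensors depositing and diffusing decaying digital pheromones so that neighbors converge on a suspicious node:

    # Toy sketch of pheromone-style alerting between neighboring sensors
    # (topology, decay rate, and threshold are invented; this is not the
    # Digital Ants implementation).
    DECAY = 0.5
    pheromone = {"meter_1": 0.0, "meter_2": 0.0, "meter_3": 0.0}
    neighbors = {"meter_1": ["meter_2"],
                 "meter_2": ["meter_1", "meter_3"],
                 "meter_3": ["meter_2"]}

    def deposit(node, amount=1.0):
        """A sensor that observes something suspicious marks its own node."""
        pheromone[node] += amount

    def step():
        """Each tick the pheromone decays and spreads weakly to neighbors."""
        updated = {node: level * DECAY for node, level in pheromone.items()}
        for node, level in pheromone.items():
            for nb in neighbors[node]:
                updated[nb] += 0.25 * level
        pheromone.update(updated)

    deposit("meter_2")  # anomaly observed at one smart meter
    for _ in range(3):
        step()

    suspicious = [node for node, level in pheromone.items() if level > 0.1]
    print(suspicious)  # nodes whose trail is strong enough to warrant inspection

The point of the pattern is that no central coordinator is needed: each node updates only its own state from local neighbors, which is what lets the approach scale to very large device counts.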
Thermal Profiles for Selected River Reaches in the Yakima River Basin, Washington
Vaccaro, J.J.; Keys, M.E.; Julich, R.J.; Welch, W.B.
2008-01-01
Thermal profiles (data sets of longitudinal near-streambed temperature) that provide information on areas of potential ground-water discharge and salmonid habitat for 11 river reaches in the Yakima River basin, Washington, are available as Microsoft Excel® files that can be downloaded from the Internet. Two reaches were profiled twice, resulting in 13 profiles. Data were collected for all but one thermal profile during 2001. Data consist of date and time (Pacific Daylight Time), near-streambed water temperature, and latitude and longitude collected concurrently using a temperature probe and a Global Positioning System. The data were collected from a watercraft towing the probe with an internal datalogger while moving downstream in a Lagrangian framework.
openBIS: a flexible framework for managing and analyzing complex data in biology research
2011-01-01
Background Modern data generation techniques used in distributed systems biology research projects often create datasets of enormous size and diversity. We argue that in order to overcome the challenge of managing those large quantitative datasets and maximise the biological information extracted from them, a sound information system is required. Ease of integration with data analysis pipelines and other computational tools is a key requirement for it. Results We have developed openBIS, an open source software framework for constructing user-friendly, scalable and powerful information systems for data and metadata acquired in biological experiments. openBIS enables users to collect, integrate, share, publish data and to connect to data processing pipelines. This framework can be extended and has been customized for different data types acquired by a range of technologies. Conclusions openBIS is currently being used by several SystemsX.ch and EU projects applying mass spectrometric measurements of metabolites and proteins, High Content Screening, or Next Generation Sequencing technologies. The attributes that make it interesting to a large research community involved in systems biology projects include versatility, simplicity in deployment, scalability to very large data, flexibility to handle any biological data type and extensibility to the needs of any research domain. PMID:22151573
A Conceptual Framework for SAHRA Integrated Multi-resolution Modeling in the Rio Grande Basin
NASA Astrophysics Data System (ADS)
Liu, Y.; Gupta, H.; Springer, E.; Wagener, T.; Brookshire, D.; Duffy, C.
2004-12-01
The sustainable management of water resources in a river basin requires an integrated analysis of the social, economic, environmental and institutional dimensions of the problem. Numerical models are commonly used for integration of these dimensions and for communication of the analysis results to stakeholders and policy makers. The National Science Foundation Science and Technology Center for Sustainability of semi-Arid Hydrology and Riparian Areas (SAHRA) has been developing integrated multi-resolution models to assess impacts of climate variability and land use change on water resources in the Rio Grande Basin. These models not only couple natural systems such as surface and ground waters, but will also include engineering, economic and social components that may be involved in water resources decision-making processes. This presentation will describe the conceptual framework being developed by SAHRA to guide and focus the multiple modeling efforts and to assist the modeling team in planning, data collection and interpretation, communication, evaluation, etc. One of the major components of this conceptual framework is a Conceptual Site Model (CSM), which describes the basin and its environment based on existing knowledge and identifies what additional information must be collected to develop technically sound models at various resolutions. The initial CSM is based on analyses of basin profile information that has been collected, including a physical profile (e.g., topographic and vegetative features), a man-made facility profile (e.g., dams, diversions, and pumping stations), and a land use and ecological profile (e.g., demographics, natural habitats, and endangered species). Based on the initial CSM, a Conceptual Physical Model (CPM) is developed to guide and evaluate the selection of a model code (or numerical model) for each resolution to conduct simulations and predictions. A CPM identifies, conceptually, all the physical processes and engineering and socio-economic activities occurring (or to occur) in the real system that the corresponding numerical models are required to address, such as riparian evapotranspiration responses to vegetation change and groundwater pumping impacts on soil moisture contents. Simulation results from different resolution models and observations of the real system will then be compared to evaluate the consistency among the CSM, the CPMs, and the numerical models, and feedbacks will be used to update the models. In a broad sense, the evaluation of the models (conceptual or numerical), as well as the linkages between them, can be viewed as a part of the overall conceptual framework. As new data are generated and understanding improves, the models will evolve, and the overall conceptual framework is refined. The development of the conceptual framework becomes an on-going process. We will describe the current state of this framework and the open questions that have to be addressed in the future.
A Method for the Study of Human Factors in Aircraft Operations
NASA Technical Reports Server (NTRS)
Barnhart, W.; Billings, C.; Cooper, G.; Gilstrap, R.; Lauber, J.; Orlady, H.; Puskas, B.; Stephens, W.
1975-01-01
A method for the study of human factors in the aviation environment is described. A conceptual framework is provided within which pilot and other human errors in aircraft operations may be studied with the intent of finding out how, and why, they occurred. An information processing model of human behavior serves as the basis for the acquisition and interpretation of information relating to occurrences which involve human error. A systematic method of collecting such data is presented and discussed. The classification of the data is outlined.
Some legal concerns with the use of crowd-sourced Geospatial Information
NASA Astrophysics Data System (ADS)
Cho, George
2014-06-01
Volunteered Geographic Information (VGI), citizens as sensors, crowd-sourcing and 'Wikipedia' of maps have been used to describe activity facilitated by the Internet and the dynamic Web 2.0 environment to collect geographic information (GI). Legal concerns raised in the creation, assembly and dissemination of GI by produsers include quality, ownership and liability. In detail, accuracy and authoritativeness of the crowd-sourced GI; the ownership and moral rights to the information; and contractual and tort liability are key concerns. A legal framework and governance structure may be necessary whereby technology, networked governance and provision of legal protections may be combined to mitigate geo-liability as a 'chilling' factor in VGI development.
Mental health system and services in Albania.
Keste, Dévora; Lazeri, Ledia; Demi, Neli; Severoni, Santino; Lora, Antonio; Saxena, Shekhar
2006-01-01
To describe the mental health system in Albania. Data were gathered in 2003 and in 2004 using a new WHO instrument, World Health Organization Assessment Instrument for Mental health Systems (WHO-AIMS), designed for collecting essential information on the mental health system of low and middle income countries. It consists of 6 domains, 28 facets and 156 items. The information collected through WHO AIMS covered the key aspects of mental health system in Albania: the mental health policy and the legislative framework, the network of mental health services and the characteristics of the users, the role of the primary health care, the human resources, the public education and the links with other governmental sectors, monitoring and research. The data collection through WHO AIMS represented a needed step for a better in-depth knowledge of the system and for implementing actions to strengthen the system. Examples of planned actions were the improvement of the mental health component in primary care, a clear shift of resources from mental hospitals to community facilities, an increase of the outpatient care and an expansion of the mental health information system.
Mialon, Melissa; Mialon, Jonathan
2017-09-01
In the present study, we used a structured approach based on publicly available information to identify the corporate political activity (CPA) strategies of three major actors in the dairy industry in France. We collected publicly available information from the industry, government and other sources over a 6-month period, from March to August 2015. Data collection and analysis were informed by an existing framework for classifying the CPA of the food industry. Setting/Subjects: Our study included three major actors in the dairy industry in France: Danone, Lactalis and the Centre National Interprofessionnel de l'Economie Laitière (CNIEL), a trade association. During the period of data collection, the dairy industry employed CPA practices on numerous occasions by using three strategies: the 'information and messaging', the 'constituency building' and the 'policy substitution' strategies. The most common practice was the shaping of evidence in ways that suited the industry. The industry also sought involvement in the community, establishing relationships with public health professionals, academics and the government. Our study shows that the dairy industry used several CPA practices, even during periods when there was no specific policy debate on the role of dairy products in dietary guidelines. The information provided here could inform public health advocates and policy makers and help them ensure that commercial interests of industry do not impede public health policies and programmes.
Information on black-footed ferret biology collected within the framework of ferret conservation
Biggins, Dean E.
2012-01-01
Once feared to be extinct, black-footed ferrets (Mustela nigripes) were rediscovered near Meeteetse, Wyoming, in 1981, resulting in renewed conservation and research efforts for this highly endangered species. A need for information directly useful to recovery has motivated much monitoring of ferrets since that time, but field activities have enabled collection of data relevant to broader biological themes. This special feature is placed in a context of similar books and proceedings devoted to ferret biology and conservation. Articles include general observations on ferrets, modeling of potential impacts of ferrets on prairie dogs (Cynomys spp.), discussions on relationships of ferrets to prairie dog habitats at several spatial scales (from individual burrows to patches of burrow systems) and a general treatise on the status of black-footed ferret recovery.
Hierarchical species distribution models
Hefley, Trevor J.; Hooten, Mevin B.
2016-01-01
Determining the distribution pattern of a species is important to increase scientific knowledge, inform management decisions, and conserve biodiversity. To infer spatial and temporal patterns, species distribution models have been developed for use with many sampling designs and types of data. Recently, it has been shown that count, presence-absence, and presence-only data can be conceptualized as arising from a point process distribution. Therefore, it is important to understand properties of the point process distribution. We examine how the hierarchical species distribution modeling framework has been used to incorporate a wide array of regression and theory-based components while accounting for the data collection process and making use of auxiliary information. The hierarchical modeling framework allows us to demonstrate how several commonly used species distribution models can be derived from the point process distribution, highlight areas of potential overlap between different models, and suggest areas where further research is needed.
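For reference, the point-process view mentioned above is most often written as an inhomogeneous Poisson process; the following is the standard textbook formulation (not notation specific to this article):

    % Standard inhomogeneous Poisson point process used as the common basis
    % for species distribution models (textbook form, not article-specific).
    \[
      \lambda(\mathbf{s}) = \exp\!\bigl(\mathbf{x}(\mathbf{s})^{\top}\boldsymbol{\beta}\bigr),
      \qquad \mathbf{s} \in \mathcal{S},
    \]
    \[
      \log L(\boldsymbol{\beta} \mid \mathbf{s}_1,\dots,\mathbf{s}_n)
        = \sum_{i=1}^{n} \log \lambda(\mathbf{s}_i)
        - \int_{\mathcal{S}} \lambda(\mathbf{s})\, d\mathbf{s},
    \]

where \(\lambda(\mathbf{s})\) is the intensity of occurrence at location \(\mathbf{s}\), \(\mathbf{x}(\mathbf{s})\) holds environmental covariates, and count or presence-absence data arise by aggregating or thresholding the process over sampling units, which is the sense in which the different data types share one underlying model.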
Gray, Kathleen M.
2018-01-01
Environmental health literacy (EHL) is a relatively new framework for conceptualizing how people understand and use information about potentially harmful environmental exposures and their influence on health. As such, information on the characterization and measurement of EHL is limited. This review provides an overview of EHL as presented in peer-reviewed literature and aggregates studies based on whether they represent individual level EHL or community level EHL or both. A range of assessment tools has been used to measure EHL, with many studies relying on pre-/post-assessment; however, a broader suite of assessment tools may be needed to capture community-wide outcomes. This review also suggests that the definition of EHL should explicitly include community change or collective action as an important longer-term outcome and proposes a refinement of previous representations of EHL as a theoretical framework, to include self-efficacy. PMID:29518955
Martiniuk, Alexandra L C; Millar, Heather C; Malefoasi, George; Vergeer, Petra; Garland, Trevor; Knight, Simon
2008-01-01
The Solomon Islands is experiencing instability and insecurity and also a concomitant increase in aid. This article aims to address the need for theoretical coordination frameworks to be further informed by the actual experiences, requirements, and views of the recipients of aid. Qualitative research techniques were used to better understand governmental and nongovernmental leaders' views of health sector aid in the Solomon Islands. Data were collected using previously published literature, government and nongovernmental documents, and in-person interviews. Two key themes emerged from the interviews: the need for coordination and integration of aid and the need for this integration to occur over the long-term. These themes are presented using quotations from key informants. Themes and quotations arising from the analyses may assist in understanding theoretical frameworks for coordination, particularly in postconflict states. Future needs regarding mechanisms of collaboration in the Solomons are also discussed.
Conceptual framework for nutrition surveillance systems.
Mock, N B; Bertrand, W E
1993-01-01
This article describes the evolution of nutrition surveillance as an intervention strategy and presents a framework for improving the usefulness of nutrition surveillance programs. It seems clear that such programs' impact on nutritional well-being will depend increasingly on their ability to reach and influence decision-makers. Therefore, it is important to consider political and social forces, and also to realize that if a program is too decentralized or too far removed from key decision-makers, its ability to influence resource flows may be limited. It is of course important that the surveillance information provided be appropriate and of good quality. Therefore, the data collected should be analyzed to ensure they are accurate and representative. Once that has been done, relevant findings should be presented in a readily understandable form designed to meet the intended recipients' information needs. Such findings should also be disseminated to all important decision-maker constituencies, including external donors of nutrition assistance and the general public.
2015-12-01
DOD, joint, or armed service component's manuals, and other publications. Obviously, JCETs fall under the broader spectrum of security cooperation... Naval Postgraduate School, Monterey, California, thesis; approved for public release, distribution is unlimited. JOINT COMBINED...
Interagency Modeling Atmospheric Assessment Center Local Jurisdiction: IMAAC Operations Framework
2010-03-01
Richard Bergin; Robert Josefek; Edward J. Dadosky. Naval Postgraduate School, Monterey, CA 93943-5000.
Creativity: Creativity in Complex Military Systems
2017-05-25
generation later in the problem-solving process. The design process is an alternative problem-solving framework individuals or groups use to orient... the potential of their formations. Subject terms: Creativity, Divergent Thinking, Design, Systems Thinking, Operational Art.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-25
...; Division B of this law is the HEARTH Act. As amended by the HEARTH Act, Subpart C of the McKinney-Vento... McKinney-Vento Homeless Assistance Act (42 U.S.C. 11371 et seq.). The HEARTH Act was designed to improve... implementation of the HEARTH Act. This rule establishes the regulatory framework for the Continuum of Care...
Developing Simulated Cyber Attack Scenarios Against Virtualized Adversary Networks
2017-03-01
MAST is a custom software framework originally designed to facilitate the training of network administrators on live networks using SimWare. The MAST... scenario development and testing in a virtual test environment. Commercial and custom software tools that provide the ability to conduct network
2014-04-01
must be done to determine current infrastructure and capabilities so that necessary updates and changes can be addressed up front. Mobile biometric... with existing satellite communications infrastructure... State of Mobile Biometric Device Market... Fingerprint... is a wireless information system highlighted by real-time wireless data collection, mobile device independence, wireless infrastructure independence
Understanding the distributed cognitive processes of intensive care patient discharge.
Lin, Frances; Chaboyer, Wendy; Wallis, Marianne
2014-03-01
To better understand and identify vulnerabilities and risks in the ICU patient discharge process, which provides evidence for service improvement. Previous studies have identified that 'after hours' discharge and 'premature' discharge from ICU are associated with increased mortality. However, some of these studies have largely been retrospective reviews of various administrative databases, while others have focused on specific aspects of the process, which may miss crucial components of the discharge process. This is an ethnographic exploratory study. Distributed cognition and activity theory were used as theoretical frameworks. Ethnographic data collection techniques including informal interviews, direct observations and collecting existing documents were used. A total of 56 one-to-one interviews were conducted with 46 participants; 28 discharges were observed; and numerous documents were collected during a five-month period. A triangulated technique was used in both data collection and data analysis to ensure the research rigour. Under the guidance of activity theory and distributed cognition theoretical frameworks, five themes emerged: hierarchical power and authority, competing priorities, ineffective communication, failing to enact the organisational processes and working collaboratively to optimise the discharge process. Issues with teamwork, cognitive processes and team members' interaction with cognitive artefacts influenced the discharge process. Strategies to improve shared situational awareness are needed to improve teamwork, patient flow and resource efficiency. Tools need to be evaluated regularly to ensure their continuous usefulness. Health care professionals need to be aware of the impact of their competing priorities and ensure discharges occur in a timely manner. Activity theory and distributed cognition are useful theoretical frameworks to support healthcare organisational research. © 2013 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Militello, F.; Farley, T.; Mukhi, K.; Walkden, N.; Omotani, J. T.
2018-05-01
A statistical framework was introduced in Militello and Omotani [Nucl. Fusion 56, 104004 (2016)] to correlate the dynamics and statistics of L-mode and inter-ELM plasma filaments with the radial profiles of thermodynamic quantities they generate in the Scrape Off Layer. This paper extends the framework to cases in which the filaments are emitted from the separatrix at different toroidal positions and with a finite toroidal velocity. It is found that the toroidal velocity does not affect the profiles, while the toroidal distribution of filament emission renormalises the waiting time between two events. Experimental data collected by visual camera imaging are used to evaluate the statistics of the fluctuations, to inform the choice of the probability distribution functions used in the application of the framework. It is found that the toroidal separation of the filaments is exponentially distributed, thus suggesting the lack of a toroidal modal structure. Finally, using these measurements, the framework is applied to an experimental case and good agreement is found.
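The exponential toroidal separation and the waiting-time renormalisation described above can be sketched numerically. The short Python example below (an illustration under assumed parameters, not values or code from the paper) draws exponentially distributed separations between emission events and shows how observing only a narrow toroidal window rescales the effective waiting time between detected events.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed parameters (illustrative only, not taken from the paper)
n_events = 100_000          # number of filament emission events
mean_separation = 0.5       # mean toroidal separation between events (radians)
window = 0.1                # toroidal extent of the local observation window (radians)
circumference = 2 * np.pi   # toroidal circumference in radians

# Exponentially distributed toroidal separations imply no preferred
# toroidal modal structure for the emission positions.
separations = rng.exponential(mean_separation, size=n_events)
positions = np.cumsum(separations) % circumference

# Only events emitted inside the window are "seen" locally; the effective
# waiting interval between seen events is renormalised by the observed fraction.
seen = positions < window
fraction_seen = seen.mean()
print(f"fraction of events seen locally: {fraction_seen:.3f}")
print(f"expected fraction (window / circumference): {window / circumference:.3f}")
print(f"effective mean separation between seen events: "
      f"{mean_separation / fraction_seen:.3f}")
```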
Muleme, James; Kankya, Clovice; Ssempebwa, John C.; Mazeri, Stella; Muwonge, Adrian
2017-01-01
Knowledge, attitude, and practice (KAP) studies guide the implementation of public health interventions (PHIs), and they are important tools for political persuasion. The design and implementation of PHIs assumes a linear KAP relationship, i.e., an awareness campaign results in the desirable societal behavioral change. However, there is no robust framework for testing this relationship before and after PHIs. Here, we use qualitative and quantitative data on pesticide usage to test this linear relationship, identify associated context specific factors as well as assemble a framework that could be used to guide and evaluate PHIs. We used data from a cross-sectional mixed methods study on pesticide usage. Quantitative data were collected using a structured questionnaire from 167 households representing 1,002 individuals. Qualitative data were collected from key informants and focus group discussions. Quantitative and qualitative data analysis was done in R 3.2.0 as well as qualitative thematic analysis, respectively. Our framework shows that a KAP linear relationship only existed for households with a low knowledge score, suggesting that an awareness campaign would only be effective for ~37% of the households. Context specific socioeconomic factors explain why this relationship does not hold for households with high knowledge scores. These findings are essential for developing targeted cost-effective and sustainable interventions on pesticide usage and other PHIs with context specific modifications. PMID:29276703
Muleme, James; Kankya, Clovice; Ssempebwa, John C; Mazeri, Stella; Muwonge, Adrian
2017-01-01
Knowledge, attitude, and practice (KAP) studies guide the implementation of public health interventions (PHIs), and they are important tools for political persuasion. The design and implementation of PHIs assumes a linear KAP relationship, i.e., an awareness campaign results in the desirable societal behavioral change. However, there is no robust framework for testing this relationship before and after PHIs. Here, we use qualitative and quantitative data on pesticide usage to test this linear relationship, identify associated context specific factors as well as assemble a framework that could be used to guide and evaluate PHIs. We used data from a cross-sectional mixed methods study on pesticide usage. Quantitative data were collected using a structured questionnaire from 167 households representing 1,002 individuals. Qualitative data were collected from key informants and focus group discussions. Quantitative and qualitative data analysis was done in R 3.2.0 as well as qualitative thematic analysis, respectively. Our framework shows that a KAP linear relationship only existed for households with a low knowledge score, suggesting that an awareness campaign would only be effective for ~37% of the households. Context specific socioeconomic factors explain why this relationship does not hold for households with high knowledge scores. These findings are essential for developing targeted cost-effective and sustainable interventions on pesticide usage and other PHIs with context specific modifications.
PLOCAN glider portal: a gateway for useful data management and visualization system
NASA Astrophysics Data System (ADS)
Morales, Tania; Lorenzo, Alvaro; Viera, Josue; Barrera, Carlos; José Rueda, María
2014-05-01
Monitoring ocean behavior and its characteristics now involves a wide range of sources able to gather and provide vast amounts of data across spatio-temporal scales. Multiplatform infrastructures like PLOCAN operate a variety of autonomous Lagrangian and Eulerian devices that collect information which is then transferred to land in near-real time. Managing all of this data collection efficiently is a major issue. Advances in ocean observation technologies, in which underwater autonomous gliders play a key role, have improved spatio-temporal resolution, offering a deeper understanding of the ocean but requiring greater effort in data management. General data management requirements in such environments include processing raw data at different levels to obtain valuable information, storing data coherently, and providing accurate products to end users according to their specific needs. Managing large amounts of data can be tedious and complex without the right tools and operational procedures; automating these tasks through software applications therefore saves time and reduces errors. Moreover, data distribution is highly relevant, since scientists tend to combine different sources for comparison and validation, and the use of web applications has boosted the necessary scientific dissemination. In this context, PLOCAN has implemented a set of independent but compatible applications to process, store and disseminate information gathered through different oceanographic platforms. These applications have been implemented using open standards, such as HTML and CSS, and open-source software, with Python as the programming language and Django as the web framework. More specifically, a glider application has been developed within the framework of the FP7-GROOM project. Regarding data management, this project focuses on collecting and making available consistent, quality-controlled datasets, as well as fostering open access to glider data.
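Since the abstract names Python and Django as the building blocks of the glider application, a minimal sketch of what such a data model might look like is given below; the model and field names (GliderMission, Profile, and so on) are hypothetical illustrations, not the actual PLOCAN schema.

```python
# models.py -- hypothetical sketch of a glider data model in Django
from django.db import models


class GliderMission(models.Model):
    """A single glider deployment, identified by platform and start date."""
    platform_name = models.CharField(max_length=100)
    start_date = models.DateTimeField()
    end_date = models.DateTimeField(null=True, blank=True)

    def __str__(self):
        return f"{self.platform_name} ({self.start_date:%Y-%m-%d})"


class Profile(models.Model):
    """One vertical profile of measurements collected during a mission."""
    mission = models.ForeignKey(
        GliderMission, related_name="profiles", on_delete=models.CASCADE
    )
    timestamp = models.DateTimeField()
    latitude = models.FloatField()
    longitude = models.FloatField()
    depth_m = models.FloatField()
    temperature_c = models.FloatField(null=True)
    salinity_psu = models.FloatField(null=True)
```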
Journal selection decisions: a biomedical library operations research model. I. The framework.
Kraft, D H; Polacsek, R A; Soergel, L; Burns, K; Klair, A
1976-01-01
The problem of deciding which journal titles to select for acquisition in a biomedical library is modeled. The approach taken is based on cost/benefit ratios. Measures of journal worth, methods of data collection, and journal cost data are considered. The emphasis is on the development of a practical process for selecting journal titles, based on the objectivity and rationality of the model, and on the collection of the appropriate data and library statistics in a reasonable manner. The implications of this process for an overall management information system (MIS) for biomedical serials handling are discussed. PMID:820391
Orpana, H.; Vachon, J.; Dykxhoorn, J.; McRae, L.; Jayaraman, G.
2016-01-01
Abstract Introduction: The Mental Health Strategy for Canada identified a need to enhance the collection of data on mental health in Canada. While surveillance systems on mental illness have been established, a data gap for monitoring positive mental health and its determinants was identified. The goal of this project was to develop a Positive Mental Health Surveillance Indicator Framework, to provide a picture of the state of positive mental health and its determinants in Canada. Data from this surveillance framework will be used to inform programs and policies to improve the mental health of Canadians. Methods: A literature review and environmental scan were conducted to provide the theoretical base for the framework, and to identify potential positive mental health outcomes and risk and protective factors. The Public Health Agency of Canada’s definition of positive mental health was adopted as the conceptual basis for the outcomes of this framework. After identifying a comprehensive list of risk and protective factors, mental health experts, other governmental partners and non-governmental stakeholders were consulted to prioritize these indicators. Subsequently, these groups were consulted to identify the most promising measurement approaches for each indicator. Results: A conceptual framework for surveillance of positive mental health and its determinants has been developed to contain 5 outcome indicators and 25 determinant indicators organized within 4 domains at the individual, family, community and societal level. This indicator framework addresses a data gap identified in Canada’s strategy for mental health and will be used to inform programs and policies to improve the mental health status of Canadians throughout the life course. PMID:26789022
NASA Technical Reports Server (NTRS)
Roychoudhury, Indranil; Daigle, Matthew; Goebel, Kai; Spirkovska, Lilly; Sankararaman, Shankar; Ossenfort, John; Kulkarni, Chetan; McDermott, William; Poll, Scott
2016-01-01
As new operational paradigms and additional aircraft are being introduced into the National Airspace System (NAS), maintaining safety in such a rapidly growing environment becomes more challenging. It is therefore desirable to have an automated framework to provide an overview of the current safety of the airspace at different levels of granularity, as well as an understanding of how the state of safety will evolve into the future given the anticipated flight plans, weather forecast, predicted health of assets in the airspace, and so on. Towards this end, as part of our earlier work, we formulated the Real-Time Safety Monitoring (RTSM) framework for monitoring and predicting the state of safety and for predicting unsafe events. In our previous work, the RTSM framework was demonstrated in simulation on three different constructed scenarios. In this paper, we further develop the framework and demonstrate it on real flight data from multiple data sources. Specifically, the flight data is obtained through the Shadow Mode Assessment using Realistic Technologies for the National Airspace System (SMART-NAS) Testbed that serves as a central point of collection, integration, and access of information from these different data sources. By testing and evaluating using real-world scenarios, we may accelerate the acceptance of the RTSM framework towards deployment. In this paper we demonstrate the framework's capability not only to estimate the state of safety in the NAS, but also to predict the time and location of unsafe events such as a loss of separation between two aircraft, or an aircraft encountering convective weather. The experimental results highlight the capability of the approach, and the kind of information that can be provided to operators to improve their situational awareness in the context of safety.
Systems and methods for an extensible business application framework
NASA Technical Reports Server (NTRS)
Bell, David G. (Inventor); Crawford, Michael (Inventor)
2012-01-01
Method and systems for editing data from a query result include requesting a query result using a unique collection identifier for a collection of individual files and a unique identifier for a configuration file that specifies a data structure for the query result. A query result is generated that contains a plurality of fields as specified by the configuration file, by combining each of the individual files associated with a unique identifier for a collection of individual files. The query result data is displayed with a plurality of labels as specified in the configuration file. Edits can be performed by querying a collection of individual files using the configuration file, editing a portion of the query result, and transmitting only the edited information for storage back into a data repository.
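The patented method assembles a query result from a collection of individual files according to a configuration file that specifies fields and display labels, and writes back only the edited portion. A minimal Python sketch of that idea follows; the JSON layout, field names, and helper functions are assumptions for illustration, not the patented implementation.

```python
import json
from pathlib import Path

def build_query_result(collection_dir: str, config_path: str) -> list[dict]:
    """Combine individual JSON files into a query result whose fields
    (and display labels) are dictated by a configuration file."""
    config = json.loads(Path(config_path).read_text())
    fields = config["fields"]            # e.g. ["id", "title", "status"]
    labels = config["labels"]            # e.g. {"id": "ID", "title": "Title"}
    result = []
    for path in sorted(Path(collection_dir).glob("*.json")):
        record = json.loads(path.read_text())
        row = {labels.get(f, f): record.get(f) for f in fields}
        row["_source_file"] = path.name  # kept so edits can be written back
        result.append(row)
    return result

def save_edit(collection_dir: str, source_file: str, edited_fields: dict) -> None:
    """Transmit only the edited fields back into the data repository."""
    path = Path(collection_dir) / source_file
    record = json.loads(path.read_text())
    record.update(edited_fields)
    path.write_text(json.dumps(record, indent=2))
```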
A memory learning framework for effective image retrieval.
Han, Junwei; Ngan, King N; Li, Mingjing; Zhang, Hong-Jiang
2005-04-01
Most current content-based image retrieval systems are still incapable of providing users with their desired results. The major difficulty lies in the gap between low-level image features and high-level image semantics. To address the problem, this study reports a framework for effective image retrieval by employing a novel idea of memory learning. It forms a knowledge memory model to store the semantic information by simply accumulating user-provided interactions. A learning strategy is then applied to predict the semantic relationships among images according to the memorized knowledge. Image queries are finally performed based on a seamless combination of low-level features and learned semantics. One important advantage of our framework is its ability to efficiently annotate images and also propagate the keyword annotation from the labeled images to unlabeled images. The presented algorithm has been integrated into a practical image retrieval system. Experiments on a collection of 10,000 general-purpose images demonstrate the effectiveness of the proposed framework.
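The final retrieval step, a seamless combination of low-level features and learned semantics, can be pictured as a weighted similarity score. The sketch below is a simplified illustration under assumed weights and function names, not the algorithm from the paper.

```python
import numpy as np

def combined_similarity(query_feat, db_feats, semantic_sim, alpha=0.5):
    """Rank database images by a blend of low-level feature similarity
    and learned (memorized) semantic similarity.

    query_feat   : (d,) low-level feature vector of the query image
    db_feats     : (n, d) feature vectors of database images
    semantic_sim : (n,) learned semantic similarity of each database image
                   to the query, accumulated from user interactions
    alpha        : weight on the low-level component (assumed value)
    """
    # Cosine similarity between query and database feature vectors
    q = query_feat / np.linalg.norm(query_feat)
    d = db_feats / np.linalg.norm(db_feats, axis=1, keepdims=True)
    low_level = d @ q
    return alpha * low_level + (1 - alpha) * semantic_sim

# Example: rank 5 random database images for a random query
rng = np.random.default_rng(1)
scores = combined_similarity(rng.random(8), rng.random((5, 8)), rng.random(5))
print(np.argsort(scores)[::-1])   # indices of images, best match first
```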
Erraguntla, Madhav; Zapletal, Josef; Lawley, Mark
2017-12-01
The impact of infectious disease on human populations is a function of many factors including environmental conditions, vector dynamics, transmission mechanics, social and cultural behaviors, and public policy. A comprehensive framework for disease management must fully connect the complete disease lifecycle, including emergence from reservoir populations, zoonotic vector transmission, and impact on human societies. The Framework for Infectious Disease Analysis is a software environment and conceptual architecture for data integration, situational awareness, visualization, prediction, and intervention assessment. Framework for Infectious Disease Analysis automatically collects biosurveillance data using natural language processing, integrates structured and unstructured data from multiple sources, applies advanced machine learning, and uses multi-modeling for analyzing disease dynamics and testing interventions in complex, heterogeneous populations. In the illustrative case studies, natural language processing from social media, news feeds, and websites was used for information extraction, biosurveillance, and situation awareness. Classification machine learning algorithms (support vector machines, random forests, and boosting) were used for disease predictions.
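The abstract names three families of classifiers (support vector machines, random forests, and boosting). As a hedged illustration of how such models might be compared on surveillance-derived features, the scikit-learn sketch below runs them on synthetic data; the features and any scores it prints are stand-ins, not outputs of the Framework for Infectious Disease Analysis.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for engineered biosurveillance features
# (e.g., case counts, vector indices, climate covariates).
X, y = make_classification(n_samples=500, n_features=12, n_informative=6,
                           random_state=0)

models = {
    "support vector machine": SVC(kernel="rbf", C=1.0),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "boosting": GradientBoostingClassifier(random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name:>24s}: mean CV accuracy = {scores.mean():.3f}")
```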
Hutin, Yvan; Low-Beer, Daniel; Bergeri, Isabel; Hess, Sarah; Garcia-Calleja, Jesus Maria; Hayashi, Chika; Mozalevskis, Antons; Rinder Stengaard, Annemarie; Sabin, Keith; Harmanci, Hande; Bulterys, Marc
2017-12-15
Evidence documenting the global burden of disease from viral hepatitis was essential for the World Health Assembly to endorse the first Global Health Sector Strategy (GHSS) on viral hepatitis in May 2016. The GHSS on viral hepatitis proposes to eliminate viral hepatitis as a public health threat by 2030. The GHSS on viral hepatitis is in line with targets for HIV infection and tuberculosis as part of the Sustainable Development Goals. As coordination between hepatitis and HIV programs aims to optimize the use of resources, guidance is also needed to align the strategic information components of the 2 programs. The World Health Organization monitoring and evaluation framework for viral hepatitis B and C follows an approach similar to the one of HIV, including components on the following: (1) context (prevalence of infection), (2) input, (3) output and outcome, including the cascade of prevention and treatment, and (4) impact (incidence and mortality). Data systems that are needed to inform this framework include (1) surveillance for acute hepatitis, chronic infections, and sequelae and (2) program data documenting prevention and treatment, which for the latter includes a database of patients. Overall, the commonalities between HIV and hepatitis at the strategic, policy, technical, and implementation levels justify coordination, strategic linkage, or integration, depending on the type of HIV and viral hepatitis epidemics. Strategic information is a critical area of this alignment under the principle of what gets measured gets done. It is facilitated because the monitoring and evaluation frameworks for HIV and viral hepatitis were constructed using a similar approach. However, for areas where elimination of viral hepatitis requires data that cannot be collected through the HIV program, collaborations are needed with immunization, communicable disease control, tuberculosis, and hepatology centers to ensure collection of information for the remaining indicators. ©Yvan Hutin, Daniel Low-Beer, Isabel Bergeri, Sarah Hess, Jesus Maria Garcia-Calleja, Chika Hayashi, Antons Mozalevskis, Annemarie Rinder Stengaard, Keith Sabin, Hande Harmanci, Marc Bulterys. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 15.12.2017.
A Framework for Learning Analytics Using Commodity Wearable Devices
Lu, Yu; Zhang, Sen; Zhang, Zhiqiang; Xiao, Wendong; Yu, Shengquan
2017-01-01
We advocate for and introduce LEARNSense, a framework for learning analytics using commodity wearable devices to capture learner’s physical actions and accordingly infer learner context (e.g., student activities and engagement status in class). Our work is motivated by the observations that: (a) the fine-grained individual-specific learner actions are crucial to understand learners and their context information; (b) sensor data available on the latest wearable devices (e.g., wrist-worn and eye wear devices) can effectively recognize learner actions and help to infer learner context information; (c) the commodity wearable devices that are widely available on the market can provide a hassle-free and non-intrusive solution. Following the above observations and under the proposed framework, we design and implement a sensor-based learner context collector running on the wearable devices. The latest data mining and sensor data processing techniques are employed to detect different types of learner actions and context information. Furthermore, we detail all of the above efforts by offering a novel and exemplary use case: it successfully provides the accurate detection of student actions and infers the student engagement states in class. The specifically designed learner context collector has been implemented on the commodity wrist-worn device. Based on the collected and inferred learner information, the novel intervention and incentivizing feedback are introduced into the system service. Finally, a comprehensive evaluation with the real-world experiments, surveys and interviews demonstrates the effectiveness and impact of the proposed framework and this use case. The F1 score for the student action classification tasks achieve 0.9, and the system can effectively differentiate the defined three learner states. Finally, the survey results show that the learners are satisfied with the use of our system (mean score of 3.7 with a standard deviation of 0.55). PMID:28613236
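A rough sense of the action-classification step can be conveyed with a scikit-learn sketch over synthetic wrist-sensor features; the feature set, the three action classes, and the resulting score are assumptions for illustration and will not reproduce the F1 of 0.9 reported above.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for windowed accelerometer/gyroscope features from a
# wrist-worn device (e.g., mean, variance, and energy per axis per window).
rng = np.random.default_rng(42)
n_windows, n_features = 1200, 18
X = rng.normal(size=(n_windows, n_features))
y = rng.integers(0, 3, size=n_windows)   # 3 hypothetical learner actions

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
pred = clf.predict(X_test)

# Macro-averaged F1 across the action classes (random features like these
# will score near chance; real sensor features are needed for useful results).
print(f"macro F1: {f1_score(y_test, pred, average='macro'):.2f}")
```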
Investigating nurse practitioners in the private sector: a theoretically informed research protocol.
Adams, Margaret; Gardner, Glenn; Yates, Patsy
2017-06-01
To report a study protocol and the theoretical framework normalisation process theory that informs this protocol for a case study investigation of private sector nurse practitioners. Most research evaluating nurse practitioner service is focused on public, mainly acute care environments where nurse practitioner service is well established with strong structures for governance and sustainability. Conversely, there is lack of clarity in governance for emerging models in the private sector. In a climate of healthcare reform, nurse practitioner service is extending beyond the familiar public health sector. Further research is required to inform knowledge of the practice, operational framework and governance of new nurse practitioner models. The proposed research will use a multiple exploratory case study design to examine private sector nurse practitioner service. Data collection includes interviews, surveys and audits. A sequential mixed method approach to analysis of each case will be conducted. Findings from within-case analysis will lead to a meta-synthesis across all four cases to gain a holistic understanding of the cases under study, private sector nurse practitioner service. Normalisation process theory will be used to guide the research process, specifically coding and analysis of data using theory constructs and the relevant components associated with those constructs. This article provides a blueprint for the research and describes a theoretical framework, normalisation process theory in terms of its flexibility as an analytical framework. Consistent with the goals of best research practice, this study protocol will inform the research community in the field of primary health care about emerging research in this field. Publishing a study protocol ensures researcher fidelity to the analysis plan and supports research collaboration across teams. © 2016 John Wiley & Sons Ltd.
A Framework for Learning Analytics Using Commodity Wearable Devices.
Lu, Yu; Zhang, Sen; Zhang, Zhiqiang; Xiao, Wendong; Yu, Shengquan
2017-06-14
We advocate for and introduce LEARNSense, a framework for learning analytics using commodity wearable devices to capture learner's physical actions and accordingly infer learner context (e.g., student activities and engagement status in class). Our work is motivated by the observations that: (a) the fine-grained individual-specific learner actions are crucial to understand learners and their context information; (b) sensor data available on the latest wearable devices (e.g., wrist-worn and eye wear devices) can effectively recognize learner actions and help to infer learner context information; (c) the commodity wearable devices that are widely available on the market can provide a hassle-free and non-intrusive solution. Following the above observations and under the proposed framework, we design and implement a sensor-based learner context collector running on the wearable devices. The latest data mining and sensor data processing techniques are employed to detect different types of learner actions and context information. Furthermore, we detail all of the above efforts by offering a novel and exemplary use case: it successfully provides the accurate detection of student actions and infers the student engagement states in class. The specifically designed learner context collector has been implemented on the commodity wrist-worn device. Based on the collected and inferred learner information, the novel intervention and incentivizing feedback are introduced into the system service. Finally, a comprehensive evaluation with the real-world experiments, surveys and interviews demonstrates the effectiveness and impact of the proposed framework and this use case. The F1 score for the student action classification tasks achieve 0.9, and the system can effectively differentiate the defined three learner states. Finally, the survey results show that the learners are satisfied with the use of our system (mean score of 3.7 with a standard deviation of 0.55).
Health Risk Information Engagement and Amplification on Social Media.
Strekalova, Yulia A
2017-04-01
Emerging pandemics call for unique health communication and education strategies in which public health agencies need to satisfy the public's information needs about possible risks while preventing risk exaggeration and dramatization. As a route to providing a framework for understanding public information behaviors in response to an emerging pandemic, this study examined the characteristics of communicative behaviors of social media audiences in response to Ebola outbreak news. Grounded in the social amplification of risks framework, this study adds to an understanding of information behaviors of online audiences by showing empirical differences in audience engagement with online health information. The data were collected from the Centers for Disease Control and Prevention (CDC) Facebook channel. The final data set included 809 CDC posts and 35,916 audience comments. The analysis identified the differences in audience information behaviors in response to an emerging pandemic, Ebola, and health promotion posts. While the CDC had fewer posts on Ebola than health promotion topics, the former received more attention from active page users. Furthermore, audience members who actively engaged with Ebola news had a small overlap with those who engaged with non-Ebola information during the same period. Overall, this study demonstrated that information behavior and audience engagement is topic dependent. Furthermore, audiences who commented on news about an emerging pandemic were homogenous and varied in their degree of information amplification.
Creating a Framework of Guidance for Building Good Digital Collections.
ERIC Educational Resources Information Center
Cole, Timothy W.
2002-01-01
Presents the Framework of Guidance for Building Good Digital Collections that was developed by the Institute of Museum and Library Services with other organizations to guide museums and libraries in digitization collection practices. Highlights digital collections, digital objects, and metadata, and discusses reusability, persistence,…
Canaway, Rachel; Bismark, Marie; Dunt, David; Prang, Khic-Houy; Kelaher, Margaret
2018-04-01
Public reporting of hospital performance data is a developing area that is gaining increased attention. This is the first study to explore a range of stakeholder opinions on how such public reporting could be strengthened in Australia. Thirty-four semi-structured interviews were conducted with a purposive sample of expert healthcare consumer, provider and purchaser informants who worked in a variety of senior roles and had knowledge of or involvement in public reporting of hospital data within the public or private healthcare sectors. Informants from all Australian states, territory and national jurisdictions participated. Thematic analysis was used to gain an overview of experts' opinions to inform policy and systems-development for strengthening foundational frameworks for public reporting of health services performance. Themes arising were synthesised to generate explanatory figures to highlight key areas for strengthening public reporting. Our findings suggest that in Australia there is a lack of agreement on what the objectives and who the audience are for public reporting of hospital performance data. Without this shared understanding it is difficult to strengthen frameworks and impacts of public reporting. When developing frameworks for public reporting of hospital data in Australia, more explicit definition of what or who are the 'public' is needed along with identification of barriers, desired impacts, data needs, and data collection/reporting/feedback mechanisms. All relevant stakeholders should be involved in design of public reporting frameworks. Offering multiple systems of public reporting, each tailored to particular audiences, might enable greater impact of reporting towards improved hospital quality and safety, and consumer knowledge to inform treatment decisions. This study provides an overview of perspectives, but further research is warranted to develop PR frameworks that can generate greatest impacts for the needs of various audiences. Copyright © 2018 Elsevier Ltd. All rights reserved.
Salmon, P; Williamson, A; Lenné, M; Mitsopoulos-Rubens, E; Rudin-Brown, C M
2010-08-01
Safety-compromising accidents occur regularly in the led outdoor activity domain. Formal accident analysis is an accepted means of understanding such events and improving safety. Despite this, there remains no universally accepted framework for collecting and analysing accident data in the led outdoor activity domain. This article presents an application of Rasmussen's risk management framework to the analysis of the Lyme Bay sea canoeing incident. This involved the development of an Accimap, the outputs of which were used to evaluate seven predictions made by the framework. The Accimap output was also compared to an analysis using an existing model from the led outdoor activity domain. In conclusion, the Accimap output was found to be more comprehensive and supported all seven of the risk management framework's predictions, suggesting that it shows promise as a theoretically underpinned approach for analysing, and learning from, accidents in the led outdoor activity domain. STATEMENT OF RELEVANCE: Accidents represent a significant problem within the led outdoor activity domain. This article presents an evaluation of a risk management framework that can be used to understand such accidents and to inform the development of accident countermeasures and mitigation strategies for the led outdoor activity domain.
The Common Framework for Earth Observation Data
NASA Astrophysics Data System (ADS)
Gallo, J.; Stryker, T. S.; Sherman, R.
2016-12-01
Each year, the Federal government records petabytes of data about our home planet. That massive amount of data in turn provides enormous benefits to society through weather reports, agricultural forecasts, air and water quality warnings, and countless other applications. To maximize the ease of transforming the data into useful information for research and for public services, the U.S. Group on Earth Observations released the first Common Framework for Earth Observation Data in March 2016. The Common Framework recommends practices for Federal agencies to adopt in order to improve the ability of all users to discover, access, and use Federal Earth observations data. The U.S. Government is committed to making data from civil Earth observation assets freely available to all users. Building on the Administration's commitment to promoting open data, open science, and open government, the Common Framework goes beyond removing financial barriers to data access, and attempts to minimize the technical impediments that limit data utility. While Earth observation systems typically collect data for a specific purpose, these data are often also useful in applications unforeseen during development of the systems. Managing and preserving these data with a common approach makes it easier for a wide range of users to find, evaluate, understand, and utilize the data, which in turn leads to the development of a wide range of innovative applications. The Common Framework provides Federal agencies with a recommended set of standards and practices to follow in order to achieve this goal. Federal agencies can follow these best practices as they develop new observing systems or modernize their existing collections of data. This presentation will give a brief on the context and content of the Common Framework, along with future directions for implementation and keeping its recommendations up-to-date with developing technology.
An information theory account of cognitive control
Fan, Jin
2014-01-01
Our ability to efficiently process information and generate appropriate responses depends on the processes collectively called cognitive control. Despite a considerable focus in the literature on the cognitive control of information processing, neural mechanisms underlying control are still unclear, and have not been characterized by considering the quantity of information to be processed. A novel and comprehensive account of cognitive control is proposed using concepts from information theory, which is concerned with communication system analysis and the quantification of information. This account treats the brain as an information-processing entity where cognitive control and its underlying brain networks play a pivotal role in dealing with conditions of uncertainty. This hypothesis and theory article justifies the validity and properties of such an account and relates experimental findings to the frontoparietal network under the framework of information theory. PMID:25228875
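Because the account rests on the quantification of information, the standard information-theoretic quantities it builds on can be written out for reference. The LaTeX block below states Shannon entropy and surprise in conventional notation; it is background material, not an equation taken from the article.

```latex
% Shannon entropy of a discrete stimulus/response distribution, and the
% surprise (self-information) of a single event -- the standard quantities
% an information-theoretic account of cognitive control builds on.
\begin{align}
  H(X)   &= -\sum_{i=1}^{n} p(x_i)\,\log_2 p(x_i)
            && \text{(average uncertainty, in bits)} \\
  I(x_i) &= -\log_2 p(x_i)
            && \text{(surprise of observing } x_i\text{)}
\end{align}
```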
The electronic transfer of information and aerospace knowledge diffusion
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Bishop, Ann P.; Barclay, Rebecca O.; Kennedy, John M.
1992-01-01
Increasing reliance on and investment in information technology and electronic networking systems presupposes that computing and information technology will play a major role in the diffusion of aerospace knowledge. Little is known, however, about actual information technology needs, uses, and problems within the aerospace knowledge diffusion process. The authors state that the potential contributions of information technology to increased productivity and competitiveness will be diminished unless empirically derived knowledge regarding the information-seeking behavior of the members of the social system - those who are producing, transferring, and using scientific and technical information - is incorporated into a new technology policy framework. Research into the use of information technology and electronic networks by U.S. aerospace engineers and scientists, collected as part of a research project designed to study aerospace knowledge diffusion, is presented in support of this assertion.
Evolutionary concepts in biobanking - the BC BioLibrary
2009-01-01
Background Medical research to improve health care faces a major problem in the relatively limited availability of adequately annotated and collected biospecimens. This limitation is creating a growing gap between the pace of scientific advances and successful exploitation of this knowledge. Biobanks are an important conduit for transfer of biospecimens (tissues, blood, body fluids) and related health data to research. They have evolved outside of the historical source of tissue biospecimens, clinical pathology archives. Research biobanks have developed advanced standards, protocols, databases, and mechanisms to interface with researchers seeking biospecimens. However, biobanks are often limited in their capacity and ability to ensure quality in the face of increasing demand. Our strategy to enhance both capacity and quality in research biobanking is to create a new framework that repatriates the activity of biospecimen accrual for biobanks to clinical pathology. Methods The British Columbia (BC) BioLibrary is a framework to maximize the accrual of high-quality, annotated biospecimens into biobanks. The BC BioLibrary design primarily encompasses: 1) specialized biospecimen collection units embedded within clinical pathology and linked to a biospecimen distribution system that serves biobanks; 2) a systematic process to connect potential donors with biobanks, and to connect biobanks with consented biospecimens; and 3) interdisciplinary governance and oversight informed by public opinion. Results The BC BioLibrary has been embraced by biobanking leaders and translational researchers throughout BC, across multiple health authorities, institutions, and disciplines. An initial pilot network of three Biospecimen Collection Units has been successfully established. In addition, two public deliberation events have been held to obtain input from the public on the BioLibrary and on issues including consent, collection of biospecimens and governance. Conclusion The BC BioLibrary framework addresses common issues for clinical pathology, biobanking, and translational research across multiple institutions and clinical and research domains. We anticipate that our framework will lead to enhanced biospecimen accrual capacity and quality, reduced competition between biobanks, and a transparent process for donors that enhances public trust in biobanking. PMID:19909513
A service brokering and recommendation mechanism for better selecting cloud services.
Gui, Zhipeng; Yang, Chaowei; Xia, Jizhe; Huang, Qunying; Liu, Kai; Li, Zhenlong; Yu, Manzhu; Sun, Min; Zhou, Nanyin; Jin, Baoxuan
2014-01-01
Cloud computing is becoming the new generation computing infrastructure, and many cloud vendors provide different types of cloud services. How to choose the best cloud services for specific applications is very challenging. Addressing this challenge requires balancing multiple factors, such as business demands, technologies, policies and preferences in addition to the computing requirements. This paper recommends a mechanism for selecting the best public cloud service at the levels of Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). A systematic framework and associated workflow include cloud service filtration, solution generation, evaluation, and selection of public cloud services. Specifically, we propose the following: a hierarchical information model for integrating heterogeneous cloud information from different providers and a corresponding cloud information collecting mechanism; a cloud service classification model for categorizing and filtering cloud services and an application requirement schema for providing rules for creating application-specific configuration solutions; and a preference-aware solution evaluation mode for evaluating and recommending solutions according to the preferences of application providers. To test the proposed framework and methodologies, a cloud service advisory tool prototype was developed after which relevant experiments were conducted. The results show that the proposed system collects/updates/records the cloud information from multiple mainstream public cloud services in real-time, generates feasible cloud configuration solutions according to user specifications and acceptable cost predication, assesses solutions from multiple aspects (e.g., computing capability, potential cost and Service Level Agreement, SLA) and offers rational recommendations based on user preferences and practical cloud provisioning; and visually presents and compares solutions through an interactive web Graphical User Interface (GUI).
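The preference-aware evaluation step can be illustrated as a weighted-sum score over normalized criteria. The criteria, weights, and candidate values in the Python sketch below are invented for illustration and are not taken from the cloud service advisory tool.

```python
# Hypothetical preference-aware scoring of candidate cloud configuration
# solutions: each criterion is normalized to [0, 1] and weighted by the
# application provider's stated preferences.
candidates = {
    "provider_A_small": {"compute": 0.60, "cost": 0.90, "sla": 0.95},
    "provider_B_medium": {"compute": 0.80, "cost": 0.70, "sla": 0.99},
    "provider_C_large": {"compute": 0.95, "cost": 0.40, "sla": 0.999},
}

# User preferences (weights sum to 1); "cost" is expressed as cheapness,
# so higher is better for every criterion.
weights = {"compute": 0.3, "cost": 0.5, "sla": 0.2}

def score(solution: dict) -> float:
    return sum(weights[k] * solution[k] for k in weights)

ranked = sorted(candidates.items(), key=lambda kv: score(kv[1]), reverse=True)
for name, solution in ranked:
    print(f"{name:>20s}: score = {score(solution):.3f}")
```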
Weiler, Gabriele; Schröder, Christina; Schera, Fatima; Dobkowicz, Matthias; Kiefer, Stephan; Heidtke, Karsten R; Hänold, Stefanie; Nwankwo, Iheanyi; Forgó, Nikolaus; Stanulla, Martin; Eckert, Cornelia; Graf, Norbert
2014-01-01
Biobanks represent key resources for clinico-genomic research and are needed to pave the way to personalised medicine. To achieve this goal, it is crucial that scientists can securely access and share high-quality biomaterial and related data. Therefore, there is a growing interest in integrating biobanks into larger biomedical information and communication technology (ICT) infrastructures. The European project p-medicine is currently building an innovative ICT infrastructure to meet this need. This platform provides tools and services for conducting research and clinical trials in personalised medicine. In this paper, we describe one of its main components, the biobank access framework p-BioSPRE (p-medicine Biospecimen Search and Project Request Engine). This generic framework enables and simplifies access to existing biobanks, but also to offer own biomaterial collections to research communities, and to manage biobank specimens and related clinical data over the ObTiMA Trial Biomaterial Manager. p-BioSPRE takes into consideration all relevant ethical and legal standards, e.g., safeguarding donors’ personal rights and enabling biobanks to keep control over the donated material and related data. The framework thus enables secure sharing of biomaterial within open and closed research communities, while flexibly integrating related clinical and omics data. Although the development of the framework is mainly driven by user scenarios from the cancer domain, in this case, acute lymphoblastic leukaemia and Wilms tumour, it can be extended to further disease entities. PMID:24567758
Assessing the impact of healthcare research: A systematic review of methodological frameworks
Keeley, Thomas J.; Calvert, Melanie J.
2017-01-01
Background Increasingly, researchers need to demonstrate the impact of their research to their sponsors, funders, and fellow academics. However, the most appropriate way of measuring the impact of healthcare research is subject to debate. We aimed to identify the existing methodological frameworks used to measure healthcare research impact and to summarise the common themes and metrics in an impact matrix. Methods and findings Two independent investigators systematically searched the Medical Literature Analysis and Retrieval System Online (MEDLINE), the Excerpta Medica Database (EMBASE), the Cumulative Index to Nursing and Allied Health Literature (CINAHL+), the Health Management Information Consortium, and the Journal of Research Evaluation from inception until May 2017 for publications that presented a methodological framework for research impact. We then summarised the common concepts and themes across methodological frameworks and identified the metrics used to evaluate differing forms of impact. Twenty-four unique methodological frameworks were identified, addressing 5 broad categories of impact: (1) ‘primary research-related impact’, (2) ‘influence on policy making’, (3) ‘health and health systems impact’, (4) ‘health-related and societal impact’, and (5) ‘broader economic impact’. These categories were subdivided into 16 common impact subgroups. Authors of the included publications proposed 80 different metrics aimed at measuring impact in these areas. The main limitation of the study was the potential exclusion of relevant articles, as a consequence of the poor indexing of the databases searched. Conclusions The measurement of research impact is an essential exercise to help direct the allocation of limited research resources, to maximise research benefit, and to help minimise research waste. This review provides a collective summary of existing methodological frameworks for research impact, which funders may use to inform the measurement of research impact and researchers may use to inform study design decisions aimed at maximising the short-, medium-, and long-term impact of their research. PMID:28792957
Parallel Molecular Distributed Detection With Brownian Motion.
Rogers, Uri; Koh, Min-Sung
2016-12-01
This paper explores the in vivo distributed detection of an undesired biological agent's (BA's) biomarkers by a group of biologically sized nanomachines in an aqueous medium under drift. The term distributed indicates that the system information relative to the BA's presence is dispersed across the collection of nanomachines, where each nanomachine possesses limited communication, computation, and movement capabilities. Using Brownian motion with drift, a probabilistic detection and optimal data fusion framework, coined molecular distributed detection, is introduced that combines theory from both molecular communication and distributed detection. Using the optimal data fusion framework as a guide, simulation indicates that a sub-optimal fusion method exists, allowing for a significant reduction in implementation complexity while retaining BA detection accuracy.
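Brownian motion with drift, the transport model underlying the detection framework, is straightforward to simulate. The Python sketch below counts how many particles drift past a detector plane within an observation window; the parameter values are assumptions for illustration, not those used in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Assumed parameters (illustrative only)
n_particles = 1000      # biomarkers released by the biological agent
drift = 2.0e-6          # drift velocity toward the detectors, m/s
diffusion = 1.0e-10     # diffusion coefficient, m^2/s
dt = 0.01               # time step, s
t_max = 60.0            # observation window, s
detector_x = 1.0e-4     # detector plane location, m

steps = int(t_max / dt)
x = np.zeros(n_particles)
arrived = np.zeros(n_particles, dtype=bool)

for _ in range(steps):
    # 1-D Brownian motion with drift: dx = v*dt + sqrt(2*D*dt) * N(0, 1)
    x += drift * dt + np.sqrt(2 * diffusion * dt) * rng.normal(size=n_particles)
    arrived |= x >= detector_x

print(f"fraction of biomarkers reaching the detector plane: {arrived.mean():.2f}")
```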
Huang, Jidong; Emery, Sherry
2016-01-01
Background Social media have transformed the communications landscape. People increasingly obtain news and health information online and via social media. Social media platforms also serve as novel sources of rich observational data for health research (including infodemiology, infoveillance, and digital disease detection). While the number of studies using social data is growing rapidly, very few of these studies transparently outline their methods for collecting, filtering, and reporting those data. Keywords and search filters applied to social data form the lens through which researchers may observe what and how people communicate about a given topic. Without a properly focused lens, research conclusions may be biased or misleading. Standards of reporting data sources and quality are needed so that data scientists and consumers of social media research can evaluate and compare methods and findings across studies. Objective We aimed to develop and apply a framework of social media data collection and quality assessment and to propose a reporting standard, which researchers and reviewers may use to evaluate and compare the quality of social data across studies. Methods We propose a conceptual framework consisting of three major steps in collecting social media data: develop, apply, and validate search filters. This framework is based on two criteria: retrieval precision (how much of retrieved data is relevant) and retrieval recall (how much of the relevant data is retrieved). We then discuss two conditions that estimation of retrieval precision and recall rely on—accurate human coding and full data collection—and how to calculate these statistics in cases that deviate from the two ideal conditions. We then apply the framework to a real-world example using approximately 4 million tobacco-related tweets collected from the Twitter firehose. Results We developed and applied a search filter to retrieve e-cigarette–related tweets from the archive based on three keyword categories: devices, brands, and behavior. The search filter retrieved 82,205 e-cigarette–related tweets from the archive and was validated. Retrieval precision was calculated above 95% in all cases. Retrieval recall was 86% assuming ideal conditions (no human coding errors and full data collection), 75% when unretrieved messages could not be archived, 86% assuming no false negative errors by coders, and 93% allowing both false negative and false positive errors by human coders. Conclusions This paper sets forth a conceptual framework for the filtering and quality evaluation of social data that addresses several common challenges and moves toward establishing a standard of reporting social data. Researchers should clearly delineate data sources, how data were accessed and collected, the search filter building process, and how retrieval precision and recall were calculated. The proposed framework can be adapted to other public social media platforms. PMID:26920122
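The two quality criteria defined above reduce to simple ratios over a human-coded sample. The Python helpers below compute retrieval precision and recall from hypothetical coding counts; the example numbers are placeholders, not the tobacco-tweet counts reported in the study.

```python
def retrieval_precision(relevant_retrieved: int, total_retrieved: int) -> float:
    """Share of retrieved messages that are actually relevant."""
    return relevant_retrieved / total_retrieved

def retrieval_recall(relevant_retrieved: int, relevant_total: int) -> float:
    """Share of all relevant messages that the search filter retrieved."""
    return relevant_retrieved / relevant_total

# Hypothetical human-coded counts for a keyword filter (not the study's data)
relevant_retrieved = 950      # retrieved by the filter and coded relevant
total_retrieved = 1000        # everything the filter retrieved
relevant_missed = 150         # coded relevant but not retrieved by the filter

precision = retrieval_precision(relevant_retrieved, total_retrieved)
recall = retrieval_recall(relevant_retrieved, relevant_retrieved + relevant_missed)
print(f"retrieval precision: {precision:.2%}")
print(f"retrieval recall:    {recall:.2%}")
```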
Veinot, Tiffany C; Campbell, Terrance R; Kruger, Daniel J; Grodzinski, Alison
2013-01-01
We investigated the user requirements of African-American youth (aged 14-24 years) to inform the design of a culturally appropriate, network-based informatics intervention for the prevention of HIV and other sexually transmitted infections (STI). We conducted 10 focus groups with 75 African-American youth from a city with high HIV/STI prevalence. Data analyses involved coding using qualitative content analysis procedures and memo writing. Unexpectedly, the majority of participants' design recommendations concerned trust. Youth expressed distrust towards people and groups, which was amplified within the context of information technology-mediated interactions about HIV/STI. Participants expressed distrust in the reliability of condoms and the accuracy of HIV tests. They questioned the benevolence of many institutions, and some rejected authoritative HIV/STI information. Therefore, reputational information, including rumor, influenced HIV/STI-related decision making. Participants' design requirements also focused on trust-related concerns. Accordingly, we developed a novel trust-centered design framework to guide intervention design. Current approaches to online trust for health informatics do not consider group-level trusting patterns. Yet, trust was the central intervention-relevant issue among African-American youth, suggesting an important focus for culturally informed design. Our design framework incorporates: intervention objectives (eg, network embeddedness, participation); functional specifications (eg, decision support, collective action, credible question and answer services); and interaction design (eg, member control, offline network linkages, optional anonymity). Trust is a critical focus for HIV/STI informatics interventions for young African Americans. Our design framework offers practical, culturally relevant, and systematic guidance to designers to reach this underserved group better.
2015-01-01
...for IC fault detection. This section provides background information on inversion methods. Conventional inversion techniques and their shortcomings are... physical techniques, electron beam imaging/analysis, ion beam techniques, scanning probe techniques. Electrical tests are used to detect faults in an... hand, there is also the second harmonic technique, through which duty cycle degradation faults are detected by collecting the magnitude and the phase of...
2015-06-01
...Stockholm International Peace Research Institute (SIPRI) yearbooks, published documents of EU governments, annual reports and studies, and others. The main... the framework of impact assessment. In this context, five studies were commissioned, in particular to collect more quantitative information... military spending, in particular in the defense procurement and research and development areas, has been negatively affecting defense companies in...
Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Worley, Charles R.
2011-01-01
In July of 2008, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on island framework from Ship Island to Horn Island, Mississippi, for the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazard Susceptibility project. Funding was provided through the Geologic Framework and Holocene Coastal Evolution of the Mississippi-Alabama Region Subtask (http://ngom.er.usgs.gov/task2_2/index.php); this project is also part of a broader USGS study on Coastal Change and Transport (CCT). This report serves as an archive of unprocessed digital Chirp seismic reflection data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, observer's logbook, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.
NASA Astrophysics Data System (ADS)
Engelman, Jonathan
Changing student conceptions in physics is a difficult process and has been a topic of research for many years. The purpose of this study was to understand what prompted students to change or not change their incorrect conceptions of Newton's Second or Third Laws in response to an intervention, Interactive Video Vignettes (IVVs), designed to overcome them. This study is based on prior research reported in the literature which has found that a curricular framework of elicit, confront, resolve, and reflect (ECRR) is important for changing student conceptions (McDermott, 2001). This framework includes four essential parts such that during an instructional event student conceptions should be elicited, incorrect conceptions confronted, these conflicts resolved, and then students should be prompted to reflect on their learning. Twenty-two undergraduate student participants who completed either or both IVVs were studied to determine whether or not they experienced components of the ECRR framework at multiple points within the IVVs. A fully integrated, mixed methods design was used to address the study purpose. Both quantitative and qualitative data were collected iteratively for each participant. Successive data collections were informed by previous data collections. All data were analyzed concurrently. The quantitative strand included a pre/post test that participants took before and after completing a given IVV and was used to measure the effect of each IVV on learning. The qualitative strand included video of each participant completing the IVV as well as an audio-recorded video elicitation interview after the post-test. The qualitative data collection was designed to describe student experiences with each IVV as well as to observe how the ECRR framework was experienced. Collecting and analyzing data using this mixed methods approach helped develop a more complete understanding of how student conceptions of Newton's Second and Third Laws changed through completion of IVVs and how the ECRR framework was experienced. In answering the research questions, two major conclusions were reached: (1) while the ECRR framework was experienced in both the Newton's 2nd Law and Newton's 3rd Law IVVs, these experiences were qualitatively different from each other, and these differences help explain the differences in gain scores on the post-tests for the participants; and (2) both IVVs were able to change certain misconceptions associated with either Newton's 2nd or 3rd laws more than others. Therefore, in researching student experiences while completing the Newton's 2nd Law and Newton's 3rd Law IVVs, I determined that a complete, sequential experience of the elicit, confront, resolve, reflect framework led to the greatest change in student conceptions. This dissertation adds to the field of physics education through finding the positive impact of the ECRR framework, as IVVs are still being created and disseminated. Physics educators and researchers interested in conceptual change can use these findings to provide evidence on what students think when interacting with videos designed to change their conceptions. Finally, this dissertation supports the conceptual change literature in that the full, sequential experience involving each component of the ECRR framework led to a change in student conceptions.
Bolivia 1998: results from the Demographic and Health Survey.
2000-09-01
This document presents the results of the Bolivia Demographic and Health Survey (DHS), or Encuesta Nacional de Demografia y Salud 1998, conducted by the Instituto Nacional de Estadistica, La Paz, Bolivia, within the framework of the DHS Program of Macro International. Data were collected from 12,109 households and complete interviews were conducted with 11,187 women aged 15-49. A male survey was also conducted, which collected data from 3780 men aged 15-64. The information collected includes the following: 1) general characteristics of the population, 2) fertility, 3) fertility preferences, 4) current contraceptive use, 5) contraception, 6) marital and contraceptive status, 7) postpartum variables, 8) infant mortality, 9) health: disease prevention and treatment, and 10) nutritional status: anthropometric measures.
Deep Borehole Disposal Safety Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freeze, Geoffrey A.; Stein, Emily; Price, Laura L.
This report presents a preliminary safety analysis for the deep borehole disposal (DBD) concept, using a safety case framework. A safety case is an integrated collection of qualitative and quantitative arguments, evidence, and analyses that substantiate the safety, and the level of confidence in the safety, of a geologic repository. This safety case framework for DBD follows the outline of the elements of a safety case, and identifies the types of information that will be required to satisfy these elements. At this very preliminary phase of development, the DBD safety case focuses on the generic feasibility of the DBD concept. It is based on potential system designs, waste forms, engineering, and geologic conditions; however, no specific site or regulatory framework exists. It will progress to a site-specific safety case as the DBD concept advances into a site-specific phase, progressing through consent-based site selection and site investigation and characterization.
Methods for Evaluating Respondent Attrition in Web-Based Surveys
Sabo, Roy T; Krist, Alex H; Day, Teresa; Cyrus, John; Woolf, Steven H
2016-01-01
Background Electronic surveys are convenient, cost-effective, and increasingly popular tools for collecting information. While the online platform allows researchers to recruit and enroll more participants, there is an increased risk of participant dropout in Web-based research. Often, these dropout trends are simply reported, adjusted for, or ignored altogether. Objective To propose a conceptual framework that analyzes respondent attrition and demonstrates the utility of these methods with existing survey data. Methods First, we suggest visualization of attrition trends using bar charts and survival curves. Next, we propose a generalized linear mixed model (GLMM) to detect or confirm significant attrition points. Finally, we suggest applications of existing statistical methods to investigate the effect of internal survey characteristics and patient characteristics on dropout. To apply this framework, we conducted a case study using a seventeen-item Informed Decision-Making (IDM) module addressing how and why patients make decisions about cancer screening. Results Using the framework, we were able to find significant attrition points at Questions 4, 6, 7, and 9, and were also able to identify participant responses and characteristics associated with dropout at these points and overall. Conclusions When these methods were applied to survey data, significant attrition trends were revealed, both visually and empirically, that can inspire researchers to investigate the factors associated with survey dropout, address whether survey completion is associated with health outcomes, and compare attrition patterns between groups. The framework can be used to extract information beyond simple responses, can be useful during survey development, and can help determine the external validity of survey results. PMID:27876687
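A minimal sketch in Python of the descriptive side of such an attrition analysis, assuming a toy table with hypothetical columns participant_id and last_question: it builds a survival-style retention curve across items and runs a simple logistic check of where dropout risk rises. A full analysis along the lines described above would add participant-level random effects (the GLMM); this is only an illustration, not the authors' code.

import pandas as pd
import statsmodels.formula.api as smf

# Toy data: last item each respondent answered on a 17-item module.
df = pd.DataFrame({
    "participant_id": range(1, 11),
    "last_question": [17, 4, 17, 6, 9, 17, 7, 17, 9, 17],
})
n_items = 17

# Survival-style retention curve: share of the sample still responding at each item.
retention = pd.DataFrame({
    "question": range(1, n_items + 1),
    "retained": [(df["last_question"] >= q).mean() for q in range(1, n_items + 1)],
})
print(retention)

# Long format with a per-item dropout indicator, then a simple logistic
# regression of dropout on item position (a GLMM would add a random
# intercept per participant).
rows = []
for r in df.itertuples():
    for q in range(1, int(r.last_question) + 1):
        dropped = int(q == r.last_question and r.last_question < n_items)
        rows.append({"participant_id": r.participant_id, "question": q, "dropped": dropped})
long = pd.DataFrame(rows)
fit = smf.logit("dropped ~ question", data=long).fit(disp=False)
print(fit.summary())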
Personal Photo Enhancement Using Internet Photo Collections.
Zhang, Chenxi; Gao, Jizhou; Wang, Oliver; Georgel, Pierre; Yang, Ruigang; Davis, James; Frahm, Jan-Michael; Pollefeys, Marc
2013-04-26
Given the growth of Internet photo collections, we now have a visual index of all major cities and tourist sites in the world. However, it is still a difficult task to capture that perfect shot with your own camera when visiting these places, especially when your camera itself has limitations, such as a limited field of view. In this paper, we propose a framework to overcome the imperfections of personal photos of tourist sites using the rich information provided by large-scale Internet photo collections. Our method deploys state-of-the-art techniques for constructing initial 3D models from photo collections. The same techniques are then used to register personal photos to these models, allowing us to augment personal 2D images with 3D information. This strong available scene prior allows us to address a number of traditionally challenging image enhancement techniques, and achieve high-quality results using simple and robust algorithms. Specifically, we demonstrate automatic foreground segmentation, mono-to-stereo conversion, field-of-view expansion, photometric enhancement, and additionally automatic annotation with geo-location and tags. Our method clearly demonstrates some possible benefits of employing the rich information contained in online photo databases to efficiently enhance and augment one's own personal photos.
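As a simplified illustration of the registration idea above, the Python sketch below projects reconstructed 3D points into a personal photo once a camera pose (rotation R, translation t) and intrinsics K have been estimated; all numbers are hypothetical, and this pinhole projection is only a stand-in for the paper's full pipeline.

import numpy as np

def project_points(points_3d, K, R, t):
    """Project Nx3 world points into pixel coordinates for a pinhole camera."""
    cam = R @ points_3d.T + t.reshape(3, 1)   # world frame -> camera frame
    uv = K @ cam                              # camera frame -> image plane
    return (uv[:2] / uv[2]).T                 # perspective divide

# Hypothetical registered pose and intrinsics for a personal photo.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 4.0])                 # scene roughly 4 units in front
landmarks = np.array([[0.0, 0.0, 0.0],        # reconstructed 3D points
                      [1.0, 0.5, 0.2]])
print(project_points(landmarks, K, R, t))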
Ingle, Kapilkumar N; Harada, Koichi; Wei, Chang Nian; Minamoto, Keiko; Ueda, Atsushi
2011-03-01
The leather industry is one of the main examples of an industry that plays an important role in the Indian economy in terms of exports and employment opportunities, while being blamed for environmental pollution. The objective of this study was to find the advances or improvements in the Japanese leather industry which are not found in typical leather industries in developing countries. We examined the Japanese leather industry in this context because Japan is a developed country in which tanning has been a traditional business from ancient times, and because the leather industry has played an important role in the process of Japan's economic development. The study was based both on information collected from various areas related to the leather industry or from leather industry stakeholders, and on a review of published information. Information was collected through site visits, interviews, questionnaires, and detailed discussions with these stakeholders, as well as from their websites. The framework of a typical leather industry is discussed in three sections: pollution prevention, pollution control, and pollution mitigation, related to sources, processes, and impact possibilities, respectively. Eleven basic differences were noted between the Japanese and Indian leather industries. The availability of melting centers is the most important distinguishing feature of the Japanese leather sector. Guidelines are suggested which focus on changes that are expected to lead to both environmental and economic benefits through better pollution management, which should lead to continuous improvement of the environmental performance of the industry and, finally, to sustainable development.
Information surfing with the JHU/APL coherent imager
NASA Astrophysics Data System (ADS)
Ratto, Christopher R.; Shipley, Kara R.; Beagley, Nathaniel; Wolfe, Kevin C.
2015-05-01
The ability to perform remote forensics in situ is an important application of autonomous undersea vehicles (AUVs). Forensics objectives may include remediation of mines and/or unexploded ordnance, as well as monitoring of seafloor infrastructure. At JHU/APL, digital holography is being explored for the potential application to underwater imaging and integration with an AUV. In previous work, a feature-based approach was developed for processing the holographic imagery and performing object recognition. In this work, the results of the image processing method were incorporated into a Bayesian framework for autonomous path planning referred to as information surfing. The framework was derived assuming that the location of the object of interest is known a priori, but the type of object and its pose are unknown. The path-planning algorithm adaptively modifies the trajectory of the sensing platform based on historical performance of object and pose classification. The algorithm is called information surfing because the direction of motion is governed by the local information gradient. Simulation experiments were carried out using holographic imagery collected from submerged objects. The autonomous sensing algorithm was compared to a deterministic sensing CONOPS, and demonstrated improved accuracy and faster convergence in several cases.
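The "information surfing" idea, moving the platform along the local information gradient, can be caricatured in a few lines of Python; the expected-information-gain map and the greedy step rule below are invented for illustration and are not the JHU/APL algorithm.

import numpy as np

# Hypothetical map of expected information gain around a known object
# location; the platform greedily steps toward locally higher gain.
x, y = np.meshgrid(np.linspace(-1, 1, 50), np.linspace(-1, 1, 50))
gain = np.exp(-((x - 0.4) ** 2 + (y + 0.2) ** 2) / 0.1)   # peak near the best viewpoint

gy, gx = np.gradient(gain)                                 # local information gradient
pos = np.array([5, 5])                                     # start cell (row, col)
path = [tuple(pos)]
for _ in range(200):
    step = np.sign([gy[pos[0], pos[1]], gx[pos[0], pos[1]]]).astype(int)
    if not step.any():                                     # flat gradient: stop
        break
    pos = np.clip(pos + step, 0, 49)
    path.append(tuple(pos))
print(path[-1], "gain:", gain[path[-1]])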
Sensor assignment to mission in AI-TECD
NASA Astrophysics Data System (ADS)
Ganger, Robert; de Mel, Geeth; Pham, Tien; Rudnicki, Ronald; Schreiber, Yonatan
2016-05-01
Sensor-mission assignment involves the allocation of sensors and other information-providing resources to missions in order to cover the information needs of the individual tasks within each mission. The need for efficient and effective means to find appropriate resources for tasks is heightened in the coalition context, where the operational environment is dynamic and a multitude of critically important tasks need to achieve their collective goals to meet the objectives of the coalition. The Sensor Assignment to Mission (SAM) framework—a research product of the International Technology Alliance in Network and Information Sciences (NIS-ITA) program—provided the first knowledge-intensive resource selection approach for the sensor network domain, so that contextual information could be used to effectively select resources for tasks in coalition environments. Recently, CUBRC, Inc. was tasked with operationalizing the SAM framework through the use of the I2WD Common Core Ontologies for the Communications-Electronics Research, Development and Engineering Center (CERDEC) sponsored Actionable Intelligence Technology Enabled Capabilities Demonstration (AI-TECD). The demonstration event took place at Fort Dix, New Jersey during July 2015, and this paper discusses the integration and successful demonstration of the SAM framework within the AI-TECD, lessons learned, and its potential impact on future operations.
Berntsen, Gro; Høyem, Audhild; Lettrem, Idar; Ruland, Cornelia; Rumpsfeld, Markus; Gammon, Deede
2018-06-20
Person-Centered Integrated Care (PC-IC) is believed to improve outcomes and experience for persons with multiple long-term and complex conditions. No broad consensus exists regarding how to capture the patient-experienced quality of PC-IC. Most PC-IC evaluation tools focus on care events or care in general. Building on others' and our previous work, we outlined a 4-stage goal-oriented PC-IC process ideal: 1) personalized goal setting, 2) care planning aligned with goals, 3) care delivery according to plan, and 4) evaluation of goal attainment. We aimed to explore, apply, refine and operationalize this quality of care framework. This paper is a qualitative evaluative review of the individual Patient Pathways (iPP) experiences of 19 strategically chosen persons with multimorbidity in light of ideals for chronic care. The iPP includes all care events, addressing the person's collected health issues, organized by time. We constructed iPPs based on the electronic health record (from general practice, nursing services, and hospital) with patient follow-up interviews. The application of the framework and its refinement were parallel processes. Both were based on analysis of salient themes in the empirical material in light of the PC-IC process ideal and progressively more informed applications of themes and questions. The informants consistently reviewed care quality by how care supported or threatened their long-term goals. Personal goals were either implicit or identified by "What matters to you?" Informants expected care to address their long-term goals and placed responsibility for care quality and delivery at the system level. The PC-IC process framework exposed system failure in identifying long-term goals, provision of shared long-term multimorbidity care plans, monitoring of care delivery and goal evaluation. The PC-IC framework includes descriptions of ideal care, key questions and literature references for each stage of the PC-IC process. This first version of a PC-IC process framework needs further validation in other settings. Gaps in care that are invisible with event-based quality of care frameworks become apparent when evaluated by a long-term goal-driven PC-IC process framework. The framework appears meaningful to persons with multimorbidity.
Tsoka-Gwegweni, Joyce M; Wassenaar, Douglas R
2014-12-01
The Emanuel, Wendler, and Grady framework was designed as a universal tool for use in many settings, including developing countries. However, it is not known whether the work of African health research ethics committees (RECs) is compatible with this framework. The absence of any normative or empirical weighting of the eight principles within this framework suggests that different health RECs may raise some ethical issues more frequently than others when reviewing protocols. We used the Emanuel et al. framework to assess, code, and rank the most frequent ethical issues considered by a biomedical REC during review of research protocols for the years 2008 to 2012. We extracted data from the recorded minutes of a South African biomedical REC for the years 2008 to 2012, designed the data collection sheet according to the Emanuel et al. framework, and removed all identifiers during data processing and analysis. From the 98 protocols that we assessed, the most frequent issues that emerged were informed consent, scientific validity, fair participant selection, and ongoing respect for participants. This study represents the first known attempt to analyze REC responses/minutes using the Emanuel et al. framework, and suggests that this framework may be useful in describing and categorizing the core activities of an REC. © The Author(s) 2014.
VISTILES: Coordinating and Combining Co-located Mobile Devices for Visual Data Exploration.
Langner, Ricardo; Horak, Tom; Dachselt, Raimund
2017-08-29
We present VISTILES, a conceptual framework that uses a set of mobile devices to distribute and coordinate visualization views for the exploration of multivariate data. In contrast to desktop-based interfaces for information visualization, mobile devices offer the potential to provide a dynamic and user-defined interface supporting co-located collaborative data exploration with different individual workflows. As part of our framework, we contribute concepts that enable users to interact with coordinated & multiple views (CMV) that are distributed across several mobile devices. The major components of the framework are: (i) dynamic and flexible layouts for CMV focusing on the distribution of views and (ii) an interaction concept for smart adaptations and combinations of visualizations utilizing explicit side-by-side arrangements of devices. As a result, users can benefit from the possibility to combine devices and organize them in meaningful spatial layouts. Furthermore, we present a web-based prototype implementation as a specific instance of our concepts. This implementation provides a practical application case enabling users to explore a multivariate data collection. We also illustrate the design process including feedback from a preliminary user study, which informed the design of both the concepts and the final prototype.
Murphy, Dennis D; Weiland, Paul S
2011-02-01
The Endangered Species Act is intended to conserve at-risk species and the ecosystems upon which they depend, and it is premised on the notion that if the wildlife agencies that are charged with implementing the statute use the best available scientific information, they can successfully carry out this intention. We assess effects analysis as a tool for using best science to guide agency decisions under the Act. After introducing effects analysis, we propose a framework that facilitates identification and use of the best available information in the development of agency determinations. The framework includes three essential steps--the collection of reliable scientific information, the critical assessment and synthesis of available data and analyses derived from those data, and the analysis of the effects of actions on listed species and their habitats. We warn of likely obstacles to rigorous, structured effects analyses and describe the extent to which independent scientific review may assist in overcoming these obstacles. We conclude by describing eight essential elements that are required for a successful effects analysis.
Smartphone and GPS technology for free-roaming dog population surveillance - a methodological study.
Barnard, Shanis; Ippoliti, Carla; Di Flaviano, Daniele; De Ruvo, Andrea; Messori, Stefano; Giovannini, Armando; Dalla Villa, Paolo
2015-01-01
Free-roaming dogs (FRD) represent a potential threat to the quality of life in cities from an ecological, social and public health point of view. One of the most urgent concerns is the role of uncontrolled dogs as reservoirs of infectious diseases transmittable to humans and, above all, rabies. An estimate of the FRD population size and characteristics in a given area is the first step for any relevant intervention programme. Direct count methods are still prominent because of their non-invasive approach; information technologies can support such methods, facilitating data collection and allowing for more efficient data handling. This paper presents a new framework for data collection using a topological algorithm implemented as an ArcScript in ESRI® ArcGIS software, which allows for a random selection of the sampling areas. It also supplies a mobile phone application for Android® operating system devices which integrates the Global Positioning System (GPS) and Google Maps™. The potential of such a framework was tested in two Italian regions. Coupling technological and innovative solutions with common counting methods facilitates data collection and transcription. It also paves the way to future applications, which could support dog population management systems.
Information at the edge of chaos in fluid neural networks
NASA Astrophysics Data System (ADS)
Solé, Ricard V.; Miramontes, Octavio
1995-01-01
Fluid neural networks, defined as neural nets of mobile elements with random activation, are studied by means of several approaches. They are proposed as a theoretical framework for a wide class of systems such as insect societies, collectives of robots or the immune system. The critical properties of this model are also analysed, showing the existence of a critical boundary in parameter space where maximum information transfer occurs. In this sense, this boundary is in fact an example of the “edge of chaos” in systems like those described in our approach. Recent experiments with ant colonies seem to confirm our result.
Physiology-based face recognition in the thermal infrared spectrum.
Buddharaju, Pradeep; Pavlidis, Ioannis T; Tsiamyrtzis, Panagiotis; Bazakos, Mike
2007-04-01
The current dominant approaches to face recognition rely on facial characteristics that are on or over the skin. Some of these characteristics have low permanency, can be altered, and their phenomenology varies significantly with environmental factors (e.g., lighting). Many methodologies have been developed to address these problems to various degrees. However, the current framework of face recognition research has a potential weakness due to its very nature. We present a novel framework for face recognition based on physiological information. The motivation behind this effort is to capitalize on the permanency of innate characteristics that are under the skin. To establish feasibility, we propose a specific methodology to capture facial physiological patterns using the bioheat information contained in thermal imagery. First, the algorithm delineates the human face from the background using the Bayesian framework. Then, it localizes the superficial blood vessel network using image morphology. The extracted vascular network produces contour shapes that are characteristic of each individual. The branching points of the skeletonized vascular network are referred to as Thermal Minutia Points (TMPs) and constitute the feature database. To render the method robust to facial pose variations, we collect five different pose images (center, midleft profile, left profile, midright profile, and right profile) for each subject to be stored in the database. During the classification stage, the algorithm first estimates the pose of the test image. Then, it matches the local and global TMP structures extracted from the test image with those of the corresponding pose images in the database. We have conducted experiments on a multipose database of thermal facial images collected in our laboratory, as well as on the time-gap database of the University of Notre Dame. The good experimental results show that the proposed methodology has merit, especially with respect to the problem of low permanence over time. More importantly, the results demonstrate the feasibility of the physiological framework in face recognition and open the way for further methodological and experimental research in the area.
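One step of the described pipeline, skeletonising a binary vessel mask and locating branching points (the Thermal Minutia Points), can be sketched in Python with standard image-morphology tools; the toy mask and the neighbour-counting rule below are illustrative assumptions rather than the authors' exact implementation.

import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def branch_points(vessel_mask):
    """Return (row, col) coordinates of skeleton pixels with 3 or more neighbours."""
    skeleton = skeletonize(vessel_mask.astype(bool))
    # Count 8-connected skeleton neighbours of each pixel.
    kernel = np.array([[1, 1, 1],
                       [1, 0, 1],
                       [1, 1, 1]])
    neighbour_count = convolve(skeleton.astype(int), kernel, mode="constant")
    return np.argwhere(skeleton & (neighbour_count >= 3))

# Tiny toy mask: a Y-shaped vessel should produce one branching point.
mask = np.zeros((9, 9), dtype=bool)
mask[0:5, 4] = True                  # vertical stem
for step in range(1, 4):             # two diagonal branches leaving row 4
    mask[4 + step, 4 - step] = True
    mask[4 + step, 4 + step] = True
print(branch_points(mask))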
Garriga, Ricard Giné; de Palencia, Alejandro Jiménez Fdez; Foguet, Agustí Pérez
2015-09-01
Today, a vast proportion of people still lack a simple pit latrine and a source of safe drinking water. To help end this appalling state of affairs, there is a pressing need to provide policymakers with evidence that may be the basis of effective planning, targeting and prioritization. Two major challenges often hinder this process: i) lack of reliable data to identify which areas are most in need; and ii) inadequate instruments for decision-making support. To tackle these shortcomings, this paper proposes a monitoring framework to compile, analyze, interpret and disseminate water, sanitation and hygiene information. In an era of decentralization, where decision-making moves to local governments, we apply this framework at the local level. The ultimate goal is to develop appropriate tools for decentralized planning support. To this end, the study first implements a methodology for primary data collection which combines the household and the waterpoint as information sources. In doing so, we provide a complete picture of the context in which domestic WASH services are delivered. Second, the collected data are analyzed to underline the emerging development challenges. The use of simple planning indicators serves as the basis to i) reveal which areas require policy attention, and ii) identify the neediest. Third, a classification process is proposed to prioritize among various populations. Three different case studies from East and Southern African countries are presented. Results indicate that accurate and comprehensive data, if adequately exploited through simple instruments, may be the basis of effective targeting and prioritization, which are central to sector planning. The application of the proposed framework in the real world, however, remains to a certain extent elusive; we conclude by pointing out two specific challenges that remain unaddressed, namely the upgrading of existing decision-making processes to enhance transparency and inclusiveness, and the development of data-updating mechanisms. Copyright © 2015 Elsevier B.V. All rights reserved.
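As a rough illustration of the targeting step, the Python sketch below aggregates hypothetical household records into simple coverage indicators per area and ranks areas so the least served come first; the column names and data are invented.

import pandas as pd

# Hypothetical household survey records.
households = pd.DataFrame({
    "area": ["A", "A", "A", "B", "B", "C", "C", "C", "C"],
    "improved_water": [1, 1, 0, 0, 0, 1, 1, 1, 0],
    "improved_sanitation": [1, 0, 0, 0, 0, 1, 1, 0, 0],
})

# Simple planning indicators: coverage per area, then a composite used for ranking.
coverage = households.groupby("area")[["improved_water", "improved_sanitation"]].mean()
coverage["composite"] = coverage.mean(axis=1)
priority = coverage.sort_values("composite")   # lowest coverage first = neediest areas
print(priority)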
Ecological Feasibility Studies in Restoration Decision Making
NASA Astrophysics Data System (ADS)
Hopfensperger, Kristine N.; Engelhardt, Katharina A. M.; Seagle, Steven W.
2007-06-01
The restoration of degraded systems is essential for maintaining the provision of valuable ecosystem services, including the maintenance of aesthetic values. However, restoration projects often fail to reach desired goals for a variety of ecologic, financial, and social reasons. Feasibility studies that evaluate whether a restoration effort should even be attempted can enhance restoration success by highlighting potential pitfalls and gaps in knowledge before the design phase of a restoration. Feasibility studies also can bring stakeholders together before a restoration project is designed to discuss potential disagreements. For these reasons, a feasibility study was conducted to evaluate the efficacy of restoring a tidal freshwater marsh in the Potomac River near Alexandria, Virginia. The study focused on science rather than engineering questions, and thus differed in approach from other feasibility studies that are mostly engineering driven. The authors report the framework they used to conduct a feasibility study to inform other potential restoration projects with similar goals. The seven steps of the framework encompass (1) initiation of a feasibility study, (2) compilation of existing data, (3) collection of current site information, (4) examination of case studies, (5) synthesis of information in a handbook, (6) meeting with selected stakeholders, and (7) evaluation of meeting outcomes. By conducting a feasibility study using the seven-step framework, the authors set the stage for conducting future compliance studies and enhancing the chance of a successful restoration.
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Wyborn, L. A.; Druken, K. A.; Richards, C. J.; Trenham, C. E.; Wang, J.
2016-12-01
The Australian National Computational Infrastructure (NCI) manages a large geospatial repository (10+ PBytes) of Earth systems, environmental, water management and geophysics research data, co-located with a petascale supercomputer and an integrated research cloud. NCI has applied the principles of the "Common Framework for Earth-Observation Data" (the Framework) to the organisation of these collections, enabling a diverse range of researchers to explore different aspects of the data and, in particular, enabling seamless programmatic data analysis, both in situ and via data services. NCI provides access to the collections through the National Environmental Research Data Interoperability Platform (NERDIP) - a comprehensive and integrated data platform with both common and emerging services designed to enable data accessibility and citability. Applying the Framework across the range of datasets ensures that programmatic access, whether in situ or via network methods, works as uniformly as possible for any dataset, using both APIs and data services. NCI has also created a comprehensive quality assurance framework to regularise compliance checks across the data, library APIs and data services, and to establish a comprehensive set of benchmarks that quantify the Framework from both functionality and performance perspectives. The quality assurance includes organisation of datasets through a data management plan, which anchors the data directory structure, version controls and data information services so that they are kept aligned with operational changes over time. Specific attention has been placed on the way data are packed inside the files. Our experience has shown that complying with standards such as CF and ACDD is still not enough to ensure that all data services or software packages correctly read the data. Further, data may not be optimally organised for the different access patterns, which causes poor CPU and bandwidth utilisation. We will also discuss some gaps in the Framework that have emerged and our approach to resolving these.
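A very small flavour of the kind of compliance check mentioned above, verifying that a file carries a minimal set of CF/ACDD-style global attributes, might look like the Python sketch below; the attribute list and file name are assumptions, and real community compliance checkers go much deeper.

import netCDF4

REQUIRED_GLOBAL_ATTRS = ["title", "summary", "Conventions", "license"]  # illustrative subset

def check_global_attrs(path):
    """Report which of a minimal ACDD-style attribute set are missing from a file."""
    with netCDF4.Dataset(path) as ds:
        present = set(ds.ncattrs())
    return [a for a in REQUIRED_GLOBAL_ATTRS if a not in present]

# Example usage (hypothetical file name):
# print(check_global_attrs("example_collection_file.nc"))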
The utility of information collected by occupational disease surveillance systems.
Money, A; Carder, M; Hussey, L; Agius, R M
2015-11-01
The Health and Occupation Research (THOR) network in the UK and the Republic of Ireland (ROI) is an integrated system of surveillance schemes that has been collecting work-related ill-health (WRIH) data since 1989. In addition to providing information about disease incidence, trends in incidence and the identification of new hazards, THOR also operates an ad hoc data enquiry service enabling interested parties to request information about cases of WRIH reported to THOR. To examine requests for information made to a network of surveillance schemes for WRIH in the UK. Analysis via SPSS of data requests received by THOR between 2002 and 2014. A total of 631 requests were received by THOR between 2002 and 2014. Requests were predominantly submitted by participating THOR physicians (34%) and the main THOR funder, the UK Health & Safety Executive (HSE) (31%). The majority (67%) of requests were for information about work-related respiratory or skin disease, with relatively few requests for other diagnoses, such as musculoskeletal or mental ill-health. Requests frequently related to a specific industry and/or occupation (42%) and/or a specific causal agent (58%). Data collected by occupational disease surveillance systems such as THOR are an extremely useful source of information, the use of which extends beyond informing government on disease incidence and trends in incidence. The data collected provide a framework that can assist a wide range of enquirers with clinical diagnoses, the identification of suspected causative agents/exposures, and the highlighting of growing risks in particular industrial and occupational sectors. © The Author 2015. Published by Oxford University Press on behalf of the Society of Occupational Medicine. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Bishop, Ann P.; Barclay, Rebecca O.; Kennedy, John M.
1992-01-01
Increasing reliance on and investment in information technology and electronic networking systems presupposes that computing and information technology will play a major role in the diffusion of aerospace knowledge. Little is known, however, about actual information technology needs, uses, and problems within the aerospace knowledge diffusion process. The authors state that the potential contributions of information technology to increased productivity and competitiveness will be diminished unless empirically derived knowledge regarding the information-seeking behavior of the members of the social system - those who are producing, transferring, and using scientific and technical information - is incorporated into a new technology policy framework. Research into the use of information technology and electronic networks by U.S. aerospace engineers and scientists, collected as part of a research project designed to study aerospace knowledge diffusion, is presented in support of this assertion.
Improving information recognition and performance of recycling chimneys.
Durugbo, Christopher
2013-01-01
The aim of this study was to assess and improve how recyclers (individuals carrying out the task of recycling) make use of visual cues to carry out recycling tasks in relation to 'recycling chimneys' (repositories for recycled waste). An initial task analysis was conducted through an activity sampling study and an eye tracking experiment using a mobile eye tracker to capture fixations of recyclers during recycling tasks. Following data collection using the eye tracker, a set of recommendations for improving information representation was identified using the widely researched skills, rules, knowledge framework, and a comparative study was conducted to assess the performance of improved interfaces for recycling chimneys based on Ecological Interface Design principles. Information representation on recycling chimneys determines how we recycle waste. This study describes an eco-ergonomics-based approach to improve the design of interfaces for recycling chimneys. The results are valuable for improving the performance of waste collection processes in terms of minimising contamination and increasing the quantity of recyclables.
A Conceptual Model of the Information Requirements of Nursing Organizations
Miller, Emmy
1989-01-01
Three related issues play a role in the identification of the information requirements of nursing organizations. These issues are the current state of computer systems in health care organizations, the lack of a well-defined data set for nursing, and the absence of models representing data and information relevant to clinical and administrative nursing practice. This paper will examine current methods of data collection, processing, and storage in clinical and administrative nursing practice for the purpose of identifying the information requirements of nursing organizations. To satisfy these information requirements, database technology can be used; however, a model for database design is needed that reflects the conceptual framework of nursing and the professional concerns of nurses. A conceptual model of the types of data necessary to produce the desired information will be presented and the relationships among data will be delineated.
Ahanhanzo, Yolaine Glèlè; Kpozehouen, Alphonse; Sopoh, Ghislain; Sossa-Jérôme, Charles; Ouedraogo, Laurent; Wilmet-Dramaix, Michèle
2016-01-01
The management of health information is a key pillar both in emergency reception and handling facilities, given the strategic position and the potential of these facilities within hospitals, and in the monitoring of public health and epidemiology. With the technological revolution, computerization has made information systems evolve in emergency departments, especially in developed countries, with improved performance in terms of care quality, productivity and patient satisfaction. This study analyses the situation of Benin in this field, through the case of the Academic Clinic of Emergency Department of the National University Teaching Hospital of Cotonou, the national reference hospital. The study is cross-sectional and evaluative. The collection techniques were literature review and structured interviews. The components rated were resources, indicators, data sources, data management and the use-dissemination of the information, through a model adapted from the Health Metrics Network framework. We used quantitative and qualitative analysis. The absence of a regulatory framework restricts the operation of the system in all components and accounts for the lack and inadequacy of the dedicated resources. Dedicating more resources to this system for crucial needs such as computerization requires sensitization and greater awareness among the administrative authorities of the fact that an effective health information management system is of prime importance in this type of facility.
Portraits of self-organization in fish schools interacting with robots
NASA Astrophysics Data System (ADS)
Aureli, M.; Fiorilli, F.; Porfiri, M.
2012-05-01
In this paper, we propose an enabling computational and theoretical framework for the analysis of experimental instances of collective behavior in response to external stimuli. In particular, this work addresses the characterization of aggregation and interaction phenomena in robot-animal groups through the exemplary analysis of fish schooling in the vicinity of a biomimetic robot. We adapt global observables from statistical mechanics to capture the main features of the shoal collective motion and its response to the robot from experimental observations. We investigate the shoal behavior by using a diffusion mapping analysis performed on these global observables that also informs the definition of relevant portraits of self-organization.
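Global observables of the kind referred to above commonly include group polarization and rotation (milling); the Python sketch below computes both for one frame of made-up positions and velocities, as an illustration rather than the authors' exact observables.

import numpy as np

def polarization(velocities):
    """Alignment order parameter: norm of the mean unit velocity vector, in [0, 1]."""
    unit = velocities / np.linalg.norm(velocities, axis=1, keepdims=True)
    return np.linalg.norm(unit.mean(axis=0))

def rotation(positions, velocities):
    """Milling/rotation order parameter about the group centroid, in [0, 1]."""
    r = positions - positions.mean(axis=0)
    r_unit = r / np.linalg.norm(r, axis=1, keepdims=True)
    v_unit = velocities / np.linalg.norm(velocities, axis=1, keepdims=True)
    cross = r_unit[:, 0] * v_unit[:, 1] - r_unit[:, 1] * v_unit[:, 0]
    return abs(cross.mean())

rng = np.random.default_rng(1)
pos = rng.uniform(0, 1, size=(20, 2))    # hypothetical fish positions (one video frame)
vel = rng.normal(size=(20, 2))           # hypothetical fish velocities
print(polarization(vel), rotation(pos, vel))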
Using airborne geophysical surveys to improve groundwater resource management models
Abraham, Jared D.; Cannia, James C.; Peterson, Steven M.; Smith, Bruce D.; Minsley, Burke J.; Bedrosian, Paul A.
2010-01-01
Increasingly, groundwater management requires more accurate hydrogeologic frameworks for groundwater models. These complex issues have created the demand for innovative approaches to data collection. In complicated terrains, groundwater modelers benefit from continuous high‐resolution geologic maps and their related hydrogeologic‐parameter estimates. The USGS and its partners have collaborated to use airborne geophysical surveys for near‐continuous coverage of areas of the North Platte River valley in western Nebraska. The survey objectives were to map the aquifers and bedrock topography of the area to help improve the understanding of groundwater‐surface‐water relationships, leading to improved water management decisions. Frequency‐domain heliborne electromagnetic surveys were completed, using a unique survey design to collect resistivity data that can be related to lithologic information to refine groundwater model inputs. To render the geophysical data useful to multidimensional groundwater models, numerical inversion is necessary to convert the measured data into a depth‐dependent subsurface resistivity model. This inverted model, in conjunction with sensitivity analysis, geological ground truth (boreholes and surface geology maps), and geological interpretation, is used to characterize hydrogeologic features. Interpreted two‐ and three‐dimensional data coverage provides the groundwater modeler with a high‐resolution hydrogeologic framework and a quantitative estimate of framework uncertainty. This method of creating hydrogeologic frameworks improved the understanding of flow path orientation by redefining the location of the paleochannels and associated bedrock highs. The improved models reflect actual hydrogeology at a level of accuracy not achievable using previous data sets.
Park, Soojin; Park, Sungyong; Park, Young B
2018-02-12
With the emergence of various forms of smart devices and new paradigms such as the Internet of Things (IoT) concept, the IT (Information Technology) service areas are expanding explosively compared to the provision of services by single systems. A new system operation concept that has emerged in accordance with such technical trends is the IT ecosystem. The IT ecosystem can be considered a special type of system of systems in which multiple systems with various degrees of autonomy achieve common goals while adapting to the given environment. The single systems that participate in the IT ecosystem adapt autonomously to the current situation based on collected data from sensors. Furthermore, to maintain the services supported by the whole IT ecosystem sustainably, the configuration of single systems that participate in the IT ecosystem also changes appropriately in accordance with the changed situation. In order to support the IT ecosystem, this paper proposes an architecture framework that supports dynamic configuration changes to achieve the goal of the whole IT ecosystem, while ensuring the autonomy of single systems through the collection of data from sensors so as to recognize the situational context of individual participating systems. For the feasibility evaluation of the proposed framework, a simulated example of an IT ecosystem for unmanned forest management was constructed, and the quantitative evaluation results are discussed in terms of the extent to which the proposed architecture framework can continuously provide sustainable services in response to diverse environmental context changes.
Park, Young B.
2018-01-01
With the emergence of various forms of smart devices and new paradigms such as the Internet of Things (IoT) concept, the IT (Information Technology) service areas are expanding explosively compared to the provision of services by single systems. A new system operation concept that has emerged in accordance with such technical trends is the IT ecosystem. The IT ecosystem can be considered a special type of system of systems in which multiple systems with various degrees of autonomy achieve common goals while adapting to the given environment. The single systems that participate in the IT ecosystem adapt autonomously to the current situation based on collected data from sensors. Furthermore, to maintain the services supported by the whole IT ecosystem sustainably, the configuration of single systems that participate in the IT ecosystem also changes appropriately in accordance with the changed situation. In order to support the IT ecosystem, this paper proposes an architecture framework that supports dynamic configuration changes to achieve the goal of the whole IT ecosystem, while ensuring the autonomy of single systems through the collection of data from sensors so as to recognize the situational context of individual participating systems. For the feasibility evaluation of the proposed framework, a simulated example of an IT ecosystem for unmanned forest management was constructed, and the quantitative evaluation results are discussed in terms of the extent to which the proposed architecture framework can continuously provide sustainable services in response to diverse environmental context changes. PMID:29439540
Ilott, Irene; Gerrish, Kate; Booth, Andrew; Field, Becky
2013-10-01
There is an international imperative to implement research into clinical practice to improve health care. Understanding the dynamics of change requires knowledge from theoretical and empirical studies. This paper presents a novel approach to testing a new meta-theoretical framework: the Consolidated Framework for Implementation Research. The utility of the Framework was evaluated using a post hoc, deductive analysis of 11 narrative accounts of innovation in health care services and practice from England, collected in 2010. A matrix comprising the five domains and 39 constructs of the Framework was developed to examine the coherence of the terminology, to compare results across contexts and to identify new theoretical developments. The Framework captured the complexity of implementation across 11 diverse examples, offering theoretically informed, comprehensive coverage. The Framework drew attention to relevant points in individual cases together with patterns across cases; for example, all were internally developed innovations that brought direct or indirect patient advantage. In 10 cases, the change was led by clinicians. Most initiatives had been maintained for several years and there was evidence of spread in six examples. Areas for further development within the Framework include sustainability and patient/public engagement in implementation. Our analysis suggests that this conceptual framework has the potential to offer useful insights, whether as part of a situational analysis or by developing context-specific propositions for hypothesis testing. Such studies are vital now that innovation is being promoted as core business for health care. © 2012 John Wiley & Sons Ltd.
Dual Coding Theory Explains Biphasic Collective Computation in Neural Decision-Making.
Daniels, Bryan C; Flack, Jessica C; Krakauer, David C
2017-01-01
A central question in cognitive neuroscience is how unitary, coherent decisions at the whole organism level can arise from the distributed behavior of a large population of neurons with only partially overlapping information. We address this issue by studying neural spiking behavior recorded from a multielectrode array with 169 channels during a visual motion direction discrimination task. It is well known that in this task there are two distinct phases in neural spiking behavior. Here we show Phase I is a distributed or incompressible phase in which uncertainty about the decision is substantially reduced by pooling information from many cells. Phase II is a redundant or compressible phase in which numerous single cells contain all the information present at the population level in Phase I, such that the firing behavior of a single cell is enough to predict the subject's decision. Using an empirically grounded dynamical modeling framework, we show that in Phase I large cell populations with low redundancy produce a slow timescale of information aggregation through critical slowing down near a symmetry-breaking transition. Our model indicates that increasing collective amplification in Phase II leads naturally to a faster timescale of information pooling and consensus formation. Based on our results and others in the literature, we propose that a general feature of collective computation is a "coding duality" in which there are accumulation and consensus formation processes distinguished by different timescales.
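The Phase I intuition, that pooling many weakly informative cells sharply reduces uncertainty about the decision, can be illustrated with a toy Python simulation; all parameters below are invented.

import numpy as np

rng = np.random.default_rng(0)
n_trials, n_cells = 2000, 169
decision = rng.integers(0, 2, n_trials)                          # true choice per trial
signal = 0.1 * (2 * decision - 1)                                # weak per-cell signal
rates = signal[:, None] + rng.normal(size=(n_trials, n_cells))   # noisy firing per cell

for n in (1, 10, 169):
    pooled = rates[:, :n].mean(axis=1)                           # pool the first n cells
    accuracy = ((pooled > 0).astype(int) == decision).mean()
    print(f"cells pooled: {n:3d}  decode accuracy: {accuracy:.2f}")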
Dual Coding Theory Explains Biphasic Collective Computation in Neural Decision-Making
Daniels, Bryan C.; Flack, Jessica C.; Krakauer, David C.
2017-01-01
A central question in cognitive neuroscience is how unitary, coherent decisions at the whole organism level can arise from the distributed behavior of a large population of neurons with only partially overlapping information. We address this issue by studying neural spiking behavior recorded from a multielectrode array with 169 channels during a visual motion direction discrimination task. It is well known that in this task there are two distinct phases in neural spiking behavior. Here we show Phase I is a distributed or incompressible phase in which uncertainty about the decision is substantially reduced by pooling information from many cells. Phase II is a redundant or compressible phase in which numerous single cells contain all the information present at the population level in Phase I, such that the firing behavior of a single cell is enough to predict the subject's decision. Using an empirically grounded dynamical modeling framework, we show that in Phase I large cell populations with low redundancy produce a slow timescale of information aggregation through critical slowing down near a symmetry-breaking transition. Our model indicates that increasing collective amplification in Phase II leads naturally to a faster timescale of information pooling and consensus formation. Based on our results and others in the literature, we propose that a general feature of collective computation is a “coding duality” in which there are accumulation and consensus formation processes distinguished by different timescales. PMID:28634436
Fatima, Iram; Fahim, Muhammad; Lee, Young-Koo; Lee, Sungyoung
2013-01-01
In recent years, activity recognition in smart homes has been an active research area due to its applicability in many applications, such as assistive living and healthcare. Besides activity recognition, the information collected from smart homes has great potential for other application domains like lifestyle analysis, security and surveillance, and interaction monitoring. Therefore, discovery of users' common behaviors and prediction of future actions from past behaviors become an important step towards allowing an environment to provide personalized service. In this paper, we develop a unified framework for activity recognition-based behavior analysis and action prediction. For this purpose, we first propose a kernel fusion method for accurate activity recognition and then identify the significant sequential behaviors of inhabitants from the recognized activities of their daily routines. Moreover, behavior patterns are further utilized to predict future actions from past activities. To evaluate the proposed framework, we performed experiments on two real datasets. The results show a remarkable improvement of 13.82% on average in the accuracy of recognized activities, along with the extraction of significant behavioral patterns and precise activity predictions with a 6.76% increase in F-measure. All this collectively helps in understanding users' actions and gaining knowledge about their habits and preferences. PMID:23435057
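Kernel fusion for activity recognition is often set up as a weighted sum of base kernels fed to an SVM with a precomputed kernel; the Python sketch below shows that generic construction with invented features, labels and weights, and the authors' exact fusion method may differ.

import numpy as np
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_train = rng.normal(size=(60, 12))     # hypothetical sensor-derived features
y_train = rng.integers(0, 3, 60)        # hypothetical activity labels
X_test = rng.normal(size=(10, 12))

w_rbf, w_lin = 0.7, 0.3                 # fusion weights (assumed)
K_train = w_rbf * rbf_kernel(X_train) + w_lin * linear_kernel(X_train)
K_test = w_rbf * rbf_kernel(X_test, X_train) + w_lin * linear_kernel(X_test, X_train)

clf = SVC(kernel="precomputed").fit(K_train, y_train)
print(clf.predict(K_test))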
Opportunities and Challenges in Supply-Side Simulation: Physician-Based Models
Gresenz, Carole Roan; Auerbach, David I; Duarte, Fabian
2013-01-01
Objective To provide a conceptual framework and to assess the availability of empirical data for supply-side microsimulation modeling in the context of health care. Data Sources Multiple secondary data sources, including the American Community Survey, Health Tracking Physician Survey, and SK&A physician database. Study Design We apply our conceptual framework to one entity in the health care market—physicians—and identify, assess, and compare data available for physician-based simulation models. Principal Findings Our conceptual framework describes three broad types of data required for supply-side microsimulation modeling. Our assessment of available data for modeling physician behavior suggests broad comparability across various sources on several dimensions and highlights the need for significant integration of data across multiple sources to provide a platform adequate for modeling. A growing literature provides potential estimates for use as behavioral parameters that could serve as the models' engines. Sources of data for simulation modeling that account for the complex organizational and financial relationships among physicians and other supply-side entities are limited. Conclusions A key challenge for supply-side microsimulation modeling is optimally combining available data to harness their collective power. Several possibilities also exist for novel data collection. These have the potential to serve as catalysts for the next generation of supply-side-focused simulation models to inform health policy. PMID:23347041
Della Seta, Maurella; Sellitri, Cinzia
2004-01-01
The research project "Collection and dissemination of bioethical information through an integrated electronic system", started in 2001 by the Istituto Superiore di Sanità (ISS), had among its objectives, the realization of an integrated system for data collection and exchange of documents related to bioethics. The system should act as a reference tool for those research activities impacting on citizens' health and welfare. This paper aims at presenting some initiatives, developed in the project framework, in order to establish an Italian documentation network, among which: a) exchange of ISS publications with Italian institutions active in this field; b) survey through a questionnaire aimed at assessing Italian informative resources, state-of-the-art and holdings of documentation centres and ethical committees; c) Italian Internet resources analysis. The results of the survey, together with the analysis of web sites, show that at present in Italy there are many interesting initiatives for collecting and spreading of documentation in the bioethical fields, but there is an urgent need for an integration of such resources. Ethical committees generally speaking need a larger availability of documents, while there are good potentialities for the establishment of an electronic network for document retrieval and delivery.
Cookies for the Real World: Assessing the Potential of RFID for Contractor Monitoring
2006-05-30
Utilizing a Value of Information Framework to Improve Ore Collection and Classification Procedures
2006-05-01
account for uncertainty in revenues or costs. Studies that utilize this type of deterministic modeling are: Boshkov & Wright (1973); Laubscher (1981... Disney & Peters, 2003). Disney & Peters (2003) reference a number of applications in both the veterinary and agricultural sectors. Agricultural studies...covered by revenue made from selling the end product. Because the cost data are aggregated for the BI and D3 mills at Kiruna, we have to allocate the
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bes, D. R.; Civitarese, O.
The experimental information on isospin-spin excitations around ⁵⁸Ni is analyzed by using isoscalar and isovector pairing vibrations, Gamow-Teller (GT) modes, and their couplings. It is found that the proposed coupling scheme accounts for a sizable amount of the strength associated with isospin-spin excitations, which include transitions to both one- and two-phonon states. The calculations are performed within the framework of perturbation theory, accounting for the renormalization of the charge by the collective GT excitations.
2009-01-01
Background There are few studies that examine the processes that interdisciplinary teams engage in and how we can design health information systems (HIS) to support those team processes. This was an exploratory study with two purposes: (1) To develop a framework for interdisciplinary team communication based on structures, processes and outcomes that were identified as having occurred during weekly team meetings. (2) To use the framework to guide 'e-teams' HIS design to support interdisciplinary team meeting communication. Methods An ethnographic approach was used to collect data on two interdisciplinary teams. Qualitative content analysis was used to analyze the data according to structures, processes and outcomes. Results We present details for team meta-concepts of structures, processes and outcomes and the concepts and sub concepts within each meta-concept. We also provide an exploratory framework for interdisciplinary team communication and describe how the framework can guide HIS design to support 'e-teams'. Conclusion The structures, processes and outcomes that describe interdisciplinary teams are complex and often occur in a non-linear fashion. Electronic data support, process facilitation and team video conferencing are three HIS tools that can enhance team function. PMID:19754966
Kuziemsky, Craig E; Borycki, Elizabeth M; Purkis, Mary Ellen; Black, Fraser; Boyle, Michael; Cloutier-Fisher, Denise; Fox, Lee Ann; MacKenzie, Patricia; Syme, Ann; Tschanz, Coby; Wainwright, Wendy; Wong, Helen
2009-09-15
There are few studies that examine the processes that interdisciplinary teams engage in and how we can design health information systems (HIS) to support those team processes. This was an exploratory study with two purposes: (1) To develop a framework for interdisciplinary team communication based on structures, processes and outcomes that were identified as having occurred during weekly team meetings. (2) To use the framework to guide 'e-teams' HIS design to support interdisciplinary team meeting communication. An ethnographic approach was used to collect data on two interdisciplinary teams. Qualitative content analysis was used to analyze the data according to structures, processes and outcomes. We present details for team meta-concepts of structures, processes and outcomes and the concepts and sub concepts within each meta-concept. We also provide an exploratory framework for interdisciplinary team communication and describe how the framework can guide HIS design to support 'e-teams'. The structures, processes and outcomes that describe interdisciplinary teams are complex and often occur in a non-linear fashion. Electronic data support, process facilitation and team video conferencing are three HIS tools that can enhance team function.
Gardiner, Bruce S.; Wong, Kelvin K. L.; Joldes, Grand R.; Rich, Addison J.; Tan, Chin Wee; Burgess, Antony W.; Smith, David W.
2015-01-01
This paper presents a framework for modelling biological tissues based on discrete particles. Cell components (e.g. cell membranes, cell cytoskeleton, cell nucleus) and extracellular matrix (e.g. collagen) are represented using collections of particles. Simple particle to particle interaction laws are used to simulate and control complex physical interaction types (e.g. cell-cell adhesion via cadherins, integrin basement membrane attachment, cytoskeletal mechanical properties). Particles may be given the capacity to change their properties and behaviours in response to changes in the cellular microenvironment (e.g., in response to cell-cell signalling or mechanical loadings). Each particle is in effect an ‘agent’, meaning that the agent can sense local environmental information and respond according to pre-determined or stochastic events. The behaviour of the proposed framework is exemplified through several biological problems of ongoing interest. These examples illustrate how the modelling framework allows enormous flexibility for representing the mechanical behaviour of different tissues, and we argue this is a more intuitive approach than perhaps offered by traditional continuum methods. Because of this flexibility, we believe the discrete modelling framework provides an avenue for biologists and bioengineers to explore the behaviour of tissue systems in a computational laboratory. PMID:26452000
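A minimal example of the kind of simple particle-to-particle interaction law such a framework builds on is a pairwise spring-like force that repels overlapping particles and weakly attracts near neighbours (a crude stand-in for adhesion); the functional form and constants in the Python sketch below are illustrative assumptions, not the authors' model.

import numpy as np

def pairwise_forces(positions, rest_length=1.0, k_repel=5.0, k_attract=1.0, cutoff=1.5):
    """Spring-like pair forces: repel inside rest_length, attract up to a cutoff."""
    n = len(positions)
    forces = np.zeros_like(positions)
    for i in range(n):
        for j in range(i + 1, n):
            d = positions[j] - positions[i]
            r = np.linalg.norm(d)
            if r == 0 or r > cutoff:
                continue
            k = k_repel if r < rest_length else k_attract
            f = k * (r - rest_length) * d / r     # pushes the pair toward rest spacing
            forces[i] += f
            forces[j] -= f
    return forces

# Three "membrane" particles: two too close (repulsion) and one slightly separated (attraction).
pts = np.array([[0.0, 0.0], [0.6, 0.0], [1.8, 0.0]])
print(pairwise_forces(pts))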
Gaze-contingent perceptually enabled interactions in the operating theatre.
Kogkas, Alexandros A; Darzi, Ara; Mylonas, George P
2017-07-01
Improved surgical outcome and patient safety in the operating theatre are constant challenges. We hypothesise that a framework that collects and utilises information (especially perceptually enabled information) from multiple sources could help to meet the above goals. This paper presents some core functionalities of a wider low-cost framework under development that allows perceptually enabled interaction within the surgical environment. The synergy of wearable eye-tracking and advanced computer vision methodologies, such as SLAM, is exploited. As a demonstration of one of the framework's possible functionalities, an articulated collaborative robotic arm and laser pointer are integrated and the set-up is used to project the surgeon's fixation point in 3D space. The implementation is evaluated over 60 fixations on predefined targets, with distances between the subject and the targets of 92-212 cm and between the robot and the targets of 42-193 cm. The median overall system error is currently 3.98 cm. Its real-time potential is also highlighted. The work presented here represents an introduction and preliminary experimental validation of core functionalities of a larger framework under development. The proposed framework is geared towards a safer and more efficient surgical theatre.
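The core geometric step implied above, turning a gaze direction from the wearable tracker plus a SLAM camera pose into a 3D fixation point, can be sketched in Python as a ray-plane intersection; the pose, plane and gaze values below are made up, and the actual system fuses richer scene geometry.

import numpy as np

def fixation_on_plane(cam_pos, R_world_cam, gaze_cam, plane_point, plane_normal):
    """Intersect the world-frame gaze ray with a plane; returns the 3D fixation point."""
    direction = R_world_cam @ gaze_cam                    # gaze ray expressed in world frame
    denom = plane_normal @ direction
    if abs(denom) < 1e-9:
        raise ValueError("gaze ray is parallel to the plane")
    s = plane_normal @ (plane_point - cam_pos) / denom
    return cam_pos + s * direction

cam_pos = np.array([0.0, 0.0, 1.5])                       # headset position from SLAM
R_world_cam = np.eye(3)                                   # headset orientation from SLAM
gaze_cam = np.array([0.1, 0.0, -1.0])                     # gaze direction from the eye tracker
table_point = np.array([0.0, 0.0, 0.75])                  # a point on a known scene plane
table_normal = np.array([0.0, 0.0, 1.0])
print(fixation_on_plane(cam_pos, R_world_cam, gaze_cam, table_point, table_normal))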
Sridharan, Sanjeev; Jones, Bobby; Caudill, Barry; Nakaima, April
2016-10-01
This paper describes a framework that can help refine program theory through data explorations and stakeholder dialogue. The framework incorporates the following steps: a recognition that program implementation might need to be multi-phased for a number of interventions, the need to take stock of program theory, the application of pattern recognition methods to help identify heterogeneous program mechanisms, and stakeholder dialogue to refine the program. As part of the data exploration, a method known as developmental trajectories is implemented to learn about heterogeneous trajectories of outcomes in longitudinal evaluations. This method identifies trajectory clusters and also can estimate different treatment impacts for the various groups. The framework is highlighted with data collected in an evaluation of an alcohol risk-reduction program delivered in a college fraternity setting. The framework discussed in the paper is informed by a realist focus on "what works for whom under what contexts." The utility of the framework in contributing to a dialogue on heterogeneous mechanisms and subsequent implementation is described. The connection of the ideas in the paper to a 'learning through principled discovery' approach is also described. Copyright © 2016. Published by Elsevier Ltd.
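A simplified, simulated stand-in for the trajectory-cluster step (the paper uses group-based developmental trajectory modelling; k-means on longitudinal outcome vectors is used here only to show the idea of heterogeneous outcome groups and group-specific treatment contrasts; all data and labels are synthetic):

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    n, waves = 300, 4
    base = rng.choice([0.0, 1.0, 2.0], size=n)              # latent group intercepts
    slope = rng.choice([-0.5, 0.0, 0.5], size=n)             # latent group slopes
    t = np.arange(waves)
    y = base[:, None] + slope[:, None] * t + rng.normal(0, 0.3, (n, waves))  # outcome over waves

    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(y)
    treated = rng.integers(0, 2, size=n)                      # hypothetical treatment flag
    for g in range(3):
        in_g = km.labels_ == g
        # group-specific contrast: mean final-wave outcome, treated vs untreated
        diff = y[in_g & (treated == 1), -1].mean() - y[in_g & (treated == 0), -1].mean()
        print(f"trajectory group {g}: n={in_g.sum()}, treated-untreated diff={diff:.2f}")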
A Service Brokering and Recommendation Mechanism for Better Selecting Cloud Services
Gui, Zhipeng; Yang, Chaowei; Xia, Jizhe; Huang, Qunying; Liu, Kai; Li, Zhenlong; Yu, Manzhu; Sun, Min; Zhou, Nanyin; Jin, Baoxuan
2014-01-01
Cloud computing is becoming the new generation computing infrastructure, and many cloud vendors provide different types of cloud services. How to choose the best cloud services for specific applications is very challenging. Addressing this challenge requires balancing multiple factors, such as business demands, technologies, policies and preferences in addition to the computing requirements. This paper recommends a mechanism for selecting the best public cloud service at the levels of Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). A systematic framework and associated workflow include cloud service filtration, solution generation, evaluation, and selection of public cloud services. Specifically, we propose the following: a hierarchical information model for integrating heterogeneous cloud information from different providers and a corresponding cloud information collecting mechanism; a cloud service classification model for categorizing and filtering cloud services and an application requirement schema for providing rules for creating application-specific configuration solutions; and a preference-aware solution evaluation model for evaluating and recommending solutions according to the preferences of application providers. To test the proposed framework and methodologies, a cloud service advisory tool prototype was developed after which relevant experiments were conducted. The results show that the proposed system collects/updates/records the cloud information from multiple mainstream public cloud services in real-time, generates feasible cloud configuration solutions according to user specifications and acceptable cost prediction, assesses solutions from multiple aspects (e.g., computing capability, potential cost and Service Level Agreement, SLA) and offers rational recommendations based on user preferences and practical cloud provisioning; and visually presents and compares solutions through an interactive web Graphical User Interface (GUI). PMID:25170937
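A toy sketch of the preference-aware evaluation idea (criteria names, values and weights are assumptions, not the paper's schema): each criterion is normalised and combined with user preference weights to rank candidate cloud solutions.

    candidates = {
        "provider_A": {"compute_score": 0.9, "monthly_cost": 420.0, "sla_uptime": 0.999},
        "provider_B": {"compute_score": 0.7, "monthly_cost": 260.0, "sla_uptime": 0.995},
        "provider_C": {"compute_score": 0.8, "monthly_cost": 310.0, "sla_uptime": 0.999},
    }
    weights = {"compute_score": 0.4, "monthly_cost": 0.4, "sla_uptime": 0.2}  # user preferences
    higher_is_better = {"compute_score": True, "monthly_cost": False, "sla_uptime": True}

    def normalise(metric):
        # scale each criterion to [0, 1], flipping cost-like criteria
        vals = [c[metric] for c in candidates.values()]
        lo, hi = min(vals), max(vals)
        span = (hi - lo) or 1.0
        return {name: (c[metric] - lo) / span if higher_is_better[metric]
                else (hi - c[metric]) / span
                for name, c in candidates.items()}

    norm = {m: normalise(m) for m in weights}
    scores = {name: sum(weights[m] * norm[m][name] for m in weights) for name in candidates}
    print(max(scores, key=scores.get), scores)   # recommended solution and all scores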
European initiatives to develop information systems in oceanography
NASA Astrophysics Data System (ADS)
Le Grand, P.
2009-04-01
Various initiatives are currently in preparation or ongoing at the European level to improve information systems in Earth Sciences, and oceanographic systems are at the forefront of these efforts. Europe is playing a leading role in the Group on Earth Observation (GEO) that aims to implement the Global Earth Observation System of Systems (GEOSS). The GEO Architecture and Data Committee oversees the development of the GEOSS Common Infrastructure (GCI) which consists of a web-based portal, a clearinghouse for searching data, information and services, registries containing information about GEOSS components and associated standards and best practices. This development is detailed in the various tasks of the GEO Work Plan. Several European projects in the marine domain funded under the research framework program participate in the development of the GEOSS. EMODNET is another initiative to develop a system that will allow a better identification and access to marine data that are being collected, that will permit the identification of data gaps and that will shape a data collection and monitoring infrastructure directly suited to multiple applications. A number of measures have already been taken at EU level - the INSPIRE Directive obliges Member States to facilitate discovery of data holdings, the Environmental Information Directive requires them to release the data when asked, the Public Sector Information Directive facilitates the re-use of public data and the revised Data Collection Regulation has improved the availability of fisheries data. Moreover, prototype marine data catalogues and quality procedures for measurement laboratories have been developed through successive EU research programmes. EMODNET is complementary to other EU initiatives in the marine domain. Parameters made available through EMODNET will facilitate the GMES marine core service which aims to deliver both short term and seasonal forecasts, hindcasts, nowcasts, and time series and climate change scenario simulations. EMODNET will provide access to raw and processed data necessary to calculate the indicators that Member States are obliged to provide through WISE-Marine to meet the requirements of the Marine Strategy Framework Directive. Moving to the definitive EMODNET will require significant funding. Given that EMODNET is very much focused on a sea-basin scale and given the impetus accorded to territorial cohesion by the EU maritime policy, discussions will begin to determine whether cohesion funding could support the initiative. At the same time, moves will begin to integrate EMODNET with initiatives under the EU's research infrastructure programmes and the Common Fisheries Policy Data Collection Regulation. The objective is to achieve by 2014 an operational and sustainable EMODNET with earmarked funding and an agreed governance structure.
Gondek, John C; Gensemer, Robert W; Claytor, Carrie A; Canton, Steven P; Gorsuch, Joseph W
2018-06-01
Acceptance of the Biotic Ligand Model (BLM) to derive aquatic life criteria, for metals in general and copper in particular, is growing amongst regulatory agencies worldwide. Thus, it is important to ensure that water quality data are used appropriately and consistently in deriving such criteria. Here we present a suggested BLM implementation framework (hereafter referred to as "the Framework") to help guide the decision-making process when designing sampling and analysis programs for use of the BLM to derive water quality criteria applied on a site-specific basis. Such a framework will help inform stakeholders on the requirements needed to derive BLM-based criteria, and thus, ensure the appropriate types and amount of data are being collected and interpreted. The Framework was developed for calculating BLM-based criteria when data are available from multiple sampling locations on a stream. The Framework aspires to promote consistency when applying the BLM across datasets of disparate water quality, data quantity, and spatial and temporal representativeness, and is meant to be flexible to maximize applicability over a wide range of scenarios. Therefore, the Framework allows for a certain level of interpretation and adjustment to address the issues unique to each dataset. This article is protected by copyright. All rights reserved.
Andrei, Victor; Arandjelović, Ognjen
2016-12-01
The rapidly expanding corpus of medical research literature presents major challenges in the understanding of previous work, the extraction of maximum information from collected data, and the identification of promising research directions. We present a case for the use of advanced machine learning techniques as an aid in this task and introduce a novel methodology that is shown to be capable of extracting meaningful information from large longitudinal corpora and of tracking complex temporal changes within them. Our framework is based on (i) the discretization of time into epochs, (ii) epoch-wise topic discovery using a hierarchical Dirichlet process-based model, and (iii) a temporal similarity graph which allows for the modelling of complex topic changes. More specifically, this is the first work that discusses and distinguishes between two groups of particularly challenging topic evolution phenomena: topic splitting and speciation, and topic convergence and merging, in addition to the more widely recognized emergence and disappearance and gradual evolution. The proposed framework is evaluated on a public medical literature corpus.
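A minimal sketch of the temporal similarity graph idea (topics here are synthetic word distributions and the similarity threshold is an arbitrary choice): topics from consecutive epochs are linked when their distributions are similar, so one-to-many links suggest splitting and zero outgoing links suggest disappearance.

    import numpy as np

    def cosine(p, q):
        return float(np.dot(p, q) / (np.linalg.norm(p) * np.linalg.norm(q)))

    rng = np.random.default_rng(2)
    vocab = 200
    epoch_1 = rng.dirichlet(np.ones(vocab), size=3)       # 3 topics discovered in epoch t
    epoch_2 = np.vstack([                                  # topics discovered in epoch t+1
        rng.dirichlet(epoch_1[0] * 2000),                  # topic 0 continues (noisy copy)
        rng.dirichlet(epoch_1[1] * 2000),                  # topic 1 splits ...
        rng.dirichlet(epoch_1[1] * 2000),                  # ... into two variants
        rng.dirichlet(np.ones(vocab)),                     # a newly emerged topic
    ])
    threshold = 0.8

    edges = [(i, j, cosine(p, q))
             for i, p in enumerate(epoch_1)
             for j, q in enumerate(epoch_2)
             if cosine(p, q) >= threshold]
    out_degree = {i: sum(1 for a, _, _ in edges if a == i) for i in range(len(epoch_1))}
    print({i: "splitting" if d > 1 else "continues" if d == 1 else "disappears"
           for i, d in out_degree.items()})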
Reed, Maureen G; Godmaire, Hélène; Abernethy, Paivi; Guertin, Marc-André
2014-12-01
Deliberation, dialogue and systematic learning are now considered attributes of good practice for organizations seeking to advance sustainability. Yet we do not know whether organizations that span spatial scales and governance responsibilities can establish effective communities of practice to facilitate learning and action. The purpose of this paper is to generate a framework that specifies actions and processes of a community of practice designed to instill collective learning and action strategies across a multi-level, multi-partner network. The framework is then used to describe and analyze a partnership among practitioners of Canada's 16 UNESCO biosphere reserves, and additional researchers and government representatives from across Canada. The framework is a cycle of seven action steps, beginning and ending with reflecting on and evaluating present practice. It is supported by seven characteristics of collaborative environmental management that are used to gauge the success of the partnership. Our results show that the partnership successfully built trust, established shared norms and common interest, created incentives to participate, generated value in information sharing and willingness to engage, demonstrated effective flow of information, and provided leadership and facilitation. Key to success was the presence of a multi-lingual facilitator who could bridge cultural differences across regions and academia-practitioner expectations. The project succeeded in establishing common goals, setting mutual expectations and building relations of trust and respect, and co-creating knowledge. It is too soon to determine whether changes in practices that support sustainability will be maintained over the long term and without the help of an outside facilitator. Copyright © 2014 Elsevier Ltd. All rights reserved.
Vigi4Med Scraper: A Framework for Web Forum Structured Data Extraction and Semantic Representation
Audeh, Bissan; Beigbeder, Michel; Zimmermann, Antoine; Jaillon, Philippe; Bousquet, Cédric
2017-01-01
The extraction of information from social media is an essential yet complicated step for data analysis in multiple domains. In this paper, we present Vigi4Med Scraper, a generic open source framework for extracting structured data from web forums. Our framework is highly configurable; using a configuration file, the user can freely choose the data to extract from any web forum. The extracted data are anonymized and represented in a semantic structure using Resource Description Framework (RDF) graphs. This representation enables efficient manipulation by data analysis algorithms and allows the collected data to be directly linked to any existing semantic resource. To avoid server overload, an integrated proxy with caching functionality imposes a minimal delay between sequential requests. Vigi4Med Scraper represents the first step of Vigi4Med, a project to detect adverse drug reactions (ADRs) from social networks funded by the French drug safety agency Agence Nationale de Sécurité du Médicament (ANSM). Vigi4Med Scraper has successfully extracted more than 200 gigabytes of data from the web forums of over 20 different websites. PMID:28122056
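A small sketch of the anonymisation-and-triples idea only (the predicate names and hashing scheme are assumptions; the real framework emits RDF graphs driven by a configuration file): already-extracted posts are turned into pseudonymised subject-predicate-object triples.

    import hashlib

    posts = [  # what a configurable scraper might have extracted from a forum thread
        {"author": "user123", "date": "2015-06-01", "text": "headache after drug X"},
        {"author": "user456", "date": "2015-06-02", "text": "same here, plus nausea"},
    ]

    def pseudonym(name):
        # one-way hash so authors cannot be re-identified from the triples
        return "person_" + hashlib.sha256(name.encode()).hexdigest()[:10]

    triples = []
    for i, post in enumerate(posts):
        post_id = f"post_{i}"
        triples += [
            (post_id, "hasAuthor", pseudonym(post["author"])),
            (post_id, "postedOn", post["date"]),
            (post_id, "hasContent", post["text"]),
        ]
    for t in triples:
        print(t)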
INFORM Lab: a testbed for high-level information fusion and resource management
NASA Astrophysics Data System (ADS)
Valin, Pierre; Guitouni, Adel; Bossé, Eloi; Wehn, Hans; Happe, Jens
2011-05-01
DRDC Valcartier and MDA have created an advanced simulation testbed for the purpose of evaluating the effectiveness of Network Enabled Operations in a Coastal Wide Area Surveillance situation, with algorithms provided by several universities. This INFORM Lab testbed allows experimenting with high-level distributed information fusion, dynamic resource management and configuration management, given multiple constraints on the resources and their communications networks. This paper describes the architecture of INFORM Lab, the essential concepts of goals and situation evidence, a selected set of algorithms for distributed information fusion and dynamic resource management, as well as auto-configurable information fusion architectures. The testbed provides general services which include a multilayer plug-and-play architecture, and a general multi-agent framework based on John Boyd's OODA loop. The testbed's performance is demonstrated on 2 types of scenarios/vignettes for 1) cooperative search-and-rescue efforts, and 2) a noncooperative smuggling scenario involving many target ships and various methods of deceit. For each mission, an appropriate subset of Canadian airborne and naval platforms are dispatched to collect situation evidence, which is fused, and then used to modify the platform trajectories for the most efficient collection of further situation evidence. These platforms are fusion nodes which obey a Command and Control node hierarchy.
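A highly simplified OODA-style agent loop, included only to illustrate the observe-orient-decide-act cycle mentioned above (the fusion rule and scoring are invented; the actual testbed is a multi-layer, multi-agent architecture with distributed fusion and resource management):

    import random

    class OODAAgent:
        def __init__(self):
            self.picture = {}                       # fused situation picture: track -> score

        def observe(self):
            # stand-in for sensor reports arriving from an airborne or naval platform
            return {f"track_{random.randint(1, 3)}": random.random()}

        def orient(self, reports):
            for track, score in reports.items():    # naive fusion: keep the strongest evidence
                self.picture[track] = max(self.picture.get(track, 0.0), score)

        def decide(self):
            return max(self.picture, key=self.picture.get)  # pursue the strongest evidence

        def act(self, target):
            print(f"retasking platform toward {target} (evidence={self.picture[target]:.2f})")

    agent = OODAAgent()
    for _ in range(3):
        agent.orient(agent.observe())
        agent.act(agent.decide())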
Emergence of consensus as a modular-to-nested transition in communication dynamics
NASA Astrophysics Data System (ADS)
Borge-Holthoefer, Javier; Baños, Raquel A.; Gracia-Lázaro, Carlos; Moreno, Yamir
2017-01-01
Online social networks have transformed the way in which humans communicate and interact, leading to a new information ecosystem where people send and receive information through multiple channels, including traditional communication media. Despite many attempts to characterize the structure and dynamics of these techno-social systems, little is known about fundamental aspects such as how collective attention arises and what determines the information life-cycle. Current approaches to these problems either focus on human temporal dynamics or on semiotic dynamics. In addition, as recently shown, information ecosystems are highly competitive, with humans and memes striving for scarce resources (visibility and attention, respectively). Inspired by similar problems in ecology, here we develop a methodology that allows all the previous aspects to be cast into a compact framework and to characterize, using microblogging data, information-driven systems as mutualistic networks. Our results show that collective attention around a topic is reached when the user-meme network self-adapts from a modular to a nested structure, which ultimately allows minimizing competition and attaining consensus. Beyond a sociological interpretation, we explore such resemblance to natural mutualistic communities via well-known dynamics of ecological systems.
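A toy nestedness check for a bipartite user-meme incidence matrix (a simplified row-wise overlap in the spirit of NODF, for illustration only; the paper's analysis is far more complete): high overlap of low-degree users' meme sets with those of high-degree users is the nested signature associated with emerging consensus.

    import numpy as np
    from itertools import combinations

    A = np.array([            # rows = users, columns = memes (1 = user adopted meme)
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 0],
        [1, 1, 1, 0, 0],
        [1, 1, 0, 0, 0],
        [1, 0, 0, 0, 0],
    ])

    def row_nestedness(M):
        scores = []
        for i, j in combinations(range(M.shape[0]), 2):
            ki, kj = M[i].sum(), M[j].sum()
            if ki == kj or min(ki, kj) == 0:
                scores.append(0.0)                   # equal or empty rows contribute nothing
                continue
            rich, poor = (i, j) if ki > kj else (j, i)
            overlap = np.logical_and(M[rich], M[poor]).sum() / M[poor].sum()
            scores.append(overlap)
        return float(np.mean(scores))

    print(row_nestedness(A))   # 1.0 for this perfectly nested example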
Monitoring surface water quality using social media in the context of citizen science
NASA Astrophysics Data System (ADS)
Zheng, Hang; Hong, Yang; Long, Di; Jing, Hua
2017-02-01
Surface water quality monitoring (SWQM) provides essential information for water environmental protection. However, SWQM is costly and limited in terms of equipment and sites. The global popularity of social media and intelligent mobile devices with GPS and photography functions allows citizens to monitor surface water quality. This study aims to propose a method for SWQM using social media platforms. Specifically, a WeChat-based application platform is built to collect water quality reports from volunteers, which have been proven valuable for water quality monitoring. The methods for data screening and volunteer recruitment are discussed based on the collected reports. The proposed methods provide a framework for collecting water quality data from citizens and offer a primary foundation for big data analysis in future research.
Veinot, Tiffany C; Campbell, Terrance R; Kruger, Daniel J; Grodzinski, Alison
2013-01-01
Objective We investigated the user requirements of African-American youth (aged 14–24 years) to inform the design of a culturally appropriate, network-based informatics intervention for the prevention of HIV and other sexually transmitted infections (STI). Materials and Methods We conducted 10 focus groups with 75 African-American youth from a city with high HIV/STI prevalence. Data analyses involved coding using qualitative content analysis procedures and memo writing. Results Unexpectedly, the majority of participants’ design recommendations concerned trust. Youth expressed distrust towards people and groups, which was amplified within the context of information technology-mediated interactions about HIV/STI. Participants expressed distrust in the reliability of condoms and the accuracy of HIV tests. They questioned the benevolence of many institutions, and some rejected authoritative HIV/STI information. Therefore, reputational information, including rumor, influenced HIV/STI-related decision making. Participants’ design requirements also focused on trust-related concerns. Accordingly, we developed a novel trust-centered design framework to guide intervention design. Discussion Current approaches to online trust for health informatics do not consider group-level trusting patterns. Yet, trust was the central intervention-relevant issue among African-American youth, suggesting an important focus for culturally informed design. Our design framework incorporates: intervention objectives (eg, network embeddedness, participation); functional specifications (eg, decision support, collective action, credible question and answer services); and interaction design (eg, member control, offline network linkages, optional anonymity). Conclusions Trust is a critical focus for HIV/STI informatics interventions for young African Americans. Our design framework offers practical, culturally relevant, and systematic guidance to designers to reach this underserved group better. PMID:23512830
MIPS: a database for protein sequences and complete genomes.
Mewes, H W; Hani, J; Pfeiffer, F; Frishman, D
1998-01-01
The MIPS group [Munich Information Center for Protein Sequences of the German National Center for Environment and Health (GSF)] at the Max-Planck-Institute for Biochemistry, Martinsried near Munich, Germany, is involved in a number of data collection activities, including a comprehensive database of the yeast genome, a database reflecting the progress in sequencing the Arabidopsis thaliana genome, the systematic analysis of other small genomes and the collection of protein sequence data within the framework of the PIR-International Protein Sequence Database (described elsewhere in this volume). Through its WWW server (http://www.mips.biochem.mpg.de) MIPS provides access to a variety of generic databases, including a database of protein families as well as automatically generated data by the systematic application of sequence analysis algorithms. The yeast genome sequence and its related information was also compiled on CD-ROM to provide dynamic interactive access to the 16 chromosomes of the first eukaryotic genome unraveled. PMID:9399795
Freehafer, Douglas A.; Pierson, Oliver
2004-01-01
In the fall of 2002, the Onondaga Lake Partnership (OLP) formed a Geographic Information System (GIS) Planning Committee to begin the process of developing a comprehensive watershed geographic information system for Onondaga Lake. The goal of the Onondaga Lake Partnership geographic information system is to integrate the various types of spatial data used for scientific investigations, resource management, and planning and design of improvement projects in the Onondaga Lake Watershed. A needs-assessment survey was conducted and a spatial data framework developed to support the Onondaga Lake Partnership use of geographic information system technology. The design focused on the collection, management, and distribution of spatial data, maps, and internet mapping applications. A geographic information system library of over 100 spatial datasets and metadata links was assembled on the basis of the results of the needs assessment survey. Implementation options were presented, and the Geographic Information System Planning Committee offered recommendations for the management and distribution of spatial data belonging to Onondaga Lake Partnership members. The Onondaga Lake Partnership now has a strong foundation for building a comprehensive geographic information system for the Onondaga Lake watershed. The successful implementation of a geographic information system depends on the Onondaga Lake Partnership’s determination of: (1) the design and plan for a geographic information system, including the applications and spatial data that will be provided and to whom, (2) the level of geographic information system technology to be utilized and funded, and (3) the institutional issues of operation and maintenance of the system.
Mehmood, Amber; Chan, Edward; Allen, Katharine; Al-Kashmiri, Ammar; Al-Busaidi, Ali; Al-Abri, Jehan; Al-Yazidi, Mohamed; Al-Maniri, Abdullah; Hyder, Adnan A.
2017-01-01
Background: Trauma registries (TRs) play a vital role in the assessment of trauma care, but are often underutilized in countries with a high burden of injuries. Objectives: We investigated whether information and communications technology (ICT) such as mobile health (mHealth) could enable the design of a tablet-based application for healthcare professionals. This would be used to inform trauma care and acquire surveillance data for injury control and prevention in Oman. This paper focuses on documenting the implementation process in a healthcare setting. Methods: The study was conducted using an ICT implementation framework consisting of multistep assessment, development and pilot testing of an electronic tablet-based TR. The pilot study was conducted at two large hospitals in Oman, followed by detailed evaluation of the process, system and impact of implementation. Results: The registry was designed to provide comprehensive information on each trauma case from the location of injury until hospital discharge, with variables organized to cover 11 domains of demographic and clinical information. The pilot study demonstrated that the registry was user friendly and reliable, and the implementation framework was useful in planning for the Omani hospital setting. Data collection by trained and dedicated nurses proved to be more feasible, efficient and reliable than real-time data entry by care providers. Conclusions: The initial results show the promising potential of a user-friendly, comprehensive electronic TR through the use of mHealth tools. The pilot test in two hospitals indicates that the registry can be used to create a multicenter trauma database. PMID:29027507
NASA Astrophysics Data System (ADS)
Stocker, M.; Mokrane, M.; Burton, A.; Koers, H.
2016-12-01
The Scholix framework—Scholarly Link Exchange—is a set of aspirational principles and practical guidelines developed under the umbrella of a joint Working Group of the Research Data Alliance (RDA) and the World Data System (WDS). It supports a global open information ecosystem unveiling the links between scholarly literature and underpinning research data. The core objectives of the framework are to (1) increase visibility and discoverability of data and articles, (2) place data in context to enable re-use, and (3) support credit attribution mechanisms, thus facilitating reproducibility and the transparent evaluation of science. Scholix provides an evolving lightweight set of Guidelines to increase interoperability rather than a normative standard. It initially consists of conceptual and information models, information standards and encoding guidelines, and options for encoding and exchange protocols. An essential prerequisite to enable the proposed framework is the use of global, unique and persistent identifiers for research objects (such as data and literature). Scholix provides incentives and encourages best practice in the use of such identifiers and standardised referencing. The Data and Literature Interlinking Service (DLI: dliservice.research-infrastructures.eu) is the first exemplar of an aggregation and query service supported by the Scholix framework which will allow the emergence of third party services such as domain-specific aggregations, integrations with other global services, discovery tools, impact assessments, etc. Scholix is already implemented by existing hubs or global aggregators of data-literature link information such as DataCite, CrossRef, OpenAIRE, and EMBL-EBI building on the capacities of existing Persistent Identifier Systems (PIDs) such as Digital Object Identifiers (DOI) and Accession Numbers. These hubs in turn work with their natural communities of data centres or literature publishers to collect the information through existing community-specific workflows and standards. Scholix as a technical solution to wholesale information aggregation will need to be complemented by other policy, practice and cultural change advocacy initiatives. This approach could be extended over time to other types of research objects in and beyond research.
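For orientation only, a minimal data-literature link object in the spirit of the framework; the field names and identifiers below are illustrative assumptions, not the official Scholix schema.

    import json

    # hypothetical link record: a dataset referenced by a journal article
    link = {
        "source": {"identifier": "10.5061/dryad.example", "type": "dataset",
                   "idScheme": "doi"},
        "target": {"identifier": "10.1000/journal.article.example", "type": "literature",
                   "idScheme": "doi"},
        "relationship": "IsReferencedBy",
        "linkProvider": "ExampleDataCentre",
        "linkPublicationDate": "2016-12-01",
    }
    print(json.dumps(link, indent=2))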
Methods for Evaluating Respondent Attrition in Web-Based Surveys.
Hochheimer, Camille J; Sabo, Roy T; Krist, Alex H; Day, Teresa; Cyrus, John; Woolf, Steven H
2016-11-22
Electronic surveys are convenient, cost-effective, and increasingly popular tools for collecting information. While the online platform allows researchers to recruit and enroll more participants, there is an increased risk of participant dropout in Web-based research. Often, these dropout trends are simply reported, adjusted for, or ignored altogether. Our objective was to propose a conceptual framework that analyzes respondent attrition and demonstrates the utility of these methods with existing survey data. First, we suggest visualization of attrition trends using bar charts and survival curves. Next, we propose a generalized linear mixed model (GLMM) to detect or confirm significant attrition points. Finally, we suggest applications of existing statistical methods to investigate the effect of internal survey characteristics and patient characteristics on dropout. In order to apply this framework, we conducted a case study: a seventeen-item Informed Decision-Making (IDM) module addressing how and why patients make decisions about cancer screening. Using the framework, we were able to find significant attrition points at Questions 4, 6, 7, and 9, and were also able to identify participant responses and characteristics associated with dropout at these points and overall. When these methods were applied to survey data, significant attrition trends were revealed, both visually and empirically, that can inspire researchers to investigate the factors associated with survey dropout, address whether survey completion is associated with health outcomes, and compare attrition patterns between groups. The framework can be used to extract information beyond simple responses, can be useful during survey development, and can help determine the external validity of survey results. ©Camille J Hochheimer, Roy T Sabo, Alex H Krist, Teresa Day, John Cyrus, Steven H Woolf. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 22.11.2016.
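An illustrative computation on simulated answers (not the study's data): the proportion of respondents still answering at each question and the largest single drop, a crude stand-in for the bar-chart/survival-curve and GLMM checks described above.

    import numpy as np

    rng = np.random.default_rng(3)
    n_respondents, n_items = 500, 17
    drop_prob = np.full(n_items, 0.02)
    drop_prob[[3, 5, 6, 8]] = 0.10                  # hypothetical hard questions (0-indexed)

    last_answered = np.full(n_respondents, n_items)
    for r in range(n_respondents):
        for q in range(n_items):
            if rng.random() < drop_prob[q]:
                last_answered[r] = q                 # respondent drops out before question q+1
                break

    remaining = [(last_answered >= q + 1).mean() for q in range(n_items)]
    drops = np.diff([1.0] + remaining) * -1
    for q, (rem, d) in enumerate(zip(remaining, drops), start=1):
        print(f"Q{q:2d}: {rem:5.1%} remaining (drop {d:4.1%})")
    print("largest attrition at Q", int(np.argmax(drops)) + 1)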
Gibbs, Jo; Sutcliffe, Lorna J; Gkatzidou, Voula; Hone, Kate; Ashcroft, Richard E; Harding-Esch, Emma M; Lowndes, Catherine M; Sadiq, S Tariq; Sonnenberg, Pam; Estcourt, Claudia S
2016-07-22
Despite considerable international eHealth impetus, there is no guidance on the development of online clinical care pathways. Advances in diagnostics now enable self-testing with home diagnosis, to which comprehensive online clinical care could be linked, facilitating completely self-directed, remote care. We describe a new framework for developing complex online clinical care pathways and its application to clinical management of people with genital chlamydia infection, the commonest sexually transmitted infection (STI) in England. Using the existing evidence-base, guidelines and examples from contemporary clinical practice, we developed the eClinical Care Pathway Framework, a nine-step iterative process. Step 1: define the aims of the online pathway; Step 2: define the functional units; Step 3: draft the clinical consultation; Step 4: expert review; Step 5: cognitive testing; Step 6: user-centred interface testing; Step 7: specification development; Step 8: software testing, usability testing and further comprehension testing; Step 9: piloting. We then applied the Framework to create a chlamydia online clinical care pathway (Online Chlamydia Pathway). Use of the Framework elucidated content and structure of the care pathway and identified the need for significant changes in sequences of care (Traditional: history, diagnosis, information versus Online: diagnosis, information, history) and prescribing safety assessment. The Framework met the needs of complex STI management and enabled development of a multi-faceted, fully-automated consultation. The Framework provides a comprehensive structure on which complex online care pathways such as those needed for STI management, which involve clinical services, public health surveillance functions and third party (sexual partner) management, can be developed to meet national clinical and public health standards. The Online Chlamydia Pathway's standardised method of collecting data on demographics and sexual behaviour, with potential for interoperability with surveillance systems, could be a powerful tool for public health and clinical management.
NASA Astrophysics Data System (ADS)
Katz, Phyllis; McGinnis, J. Randy; Hestness, Emily; Riedinger, Kelly; Marbach-Ad, Gili; Dai, Amy; Pease, Rebecca
2011-06-01
This study investigated the professional identity development of teacher candidates participating in an informal afterschool science internship in a formal science teacher preparation programme. We used a qualitative research methodology. Data were collected from the teacher candidates, their informal internship mentors, and the researchers. The data were analysed through an identity development theoretical framework, informed by participants' mental models of science teaching and learning. We learned that the experience in an afterschool informal internship encouraged the teacher candidates to see themselves, and to be seen by others, as enacting key recommendations by science education standards documents, including exhibiting: positive attitudes, sensitivity to diversity, and increasing confidence in facilitating hands-on science participation, inquiry, and collaborative work. Our study provided evidence that the infusion of an informal science education internship in a formal science teacher education programme influenced positively participants' professional identity development as science teachers.
2014-08-01
Madan Vunnam, Sudhakar Arepally, Dave Bednarz, Ph.D. (System Engineering-Analytics, TARDEC, Warren, MI); Ching Hsieh, Ph.D. (Altair Engineering)... blast events were estimated to be responsible for 60% of coalition deaths in Iraq [1, 2] and 75% of casualties in Afghanistan [3]. However, other...
A WebGIS Framework for Disseminating Processed Remotely Sensed Data on Land Cover Transformations
NASA Astrophysics Data System (ADS)
Caradonna, Grazia; Novelli, Antonio; Tarantino, Eufemia; Cefalo, Raffaela; Fratino, Umberto
2016-06-01
Mediterranean regions have experienced significant soil degradation over the past decades. In this context, careful land observation using satellite data is crucial for understanding the long-term usage patterns of natural resources and facilitating their sustainable management to monitor and evaluate the potential degradation. Given the environmental and political interest in this problem, there is an urgent need for a centralized repository and mechanism to share geospatial data, information and maps of land change. Geospatial data collecting is one of the most important tasks for many users because there are significant barriers to accessing and using data. This limit could be overcome by implementing a WebGIS through a combination of existing free and open source software for geographic information systems (FOSS4G). In this paper we first discuss methods for collecting raster data in a geodatabase by processing open multi-temporal and multi-scale satellite data aimed at retrieving indicators for the land degradation phenomenon (i.e. land cover/land use analysis, vegetation indices, trend analysis, etc.). Then we describe a methodology for designing a WebGIS framework in order to disseminate information through maps for territory monitoring. Basic WebGIS functions were extended with the help of the PostGIS database and OpenLayers libraries. GeoServer was customized to set up and enhance the website functions, developing various advanced queries using PostgreSQL and innovative tools to carry out multi-layer overlay analysis efficiently. The end-product is a simple system that provides the opportunity not only to interactively consult but also to download processed remote sensing data.
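A small numerical sketch of the kind of indicator pre-processing mentioned above (synthetic band values, not the project's data): NDVI per year and a per-pixel linear trend over a multi-temporal stack, the sort of raster product that would be loaded into the geodatabase and served through the WebGIS.

    import numpy as np

    rng = np.random.default_rng(4)
    years, h, w = 10, 4, 4
    red = rng.uniform(0.05, 0.2, size=(years, h, w))     # red band reflectance per year
    nir = rng.uniform(0.3, 0.6, size=(years, h, w))      # near-infrared band per year

    ndvi = (nir - red) / (nir + red)                     # vegetation index per year
    t = np.arange(years)
    flat = ndvi.reshape(years, -1)                       # one column per pixel
    # slope of NDVI against time for every pixel: negative slopes flag possible degradation
    slopes = np.polyfit(t, flat, deg=1)[0].reshape(h, w)
    print(np.round(slopes, 4))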
Demirkus, Meltem; Precup, Doina; Clark, James J; Arbel, Tal
2016-06-01
Recent literature shows that facial attributes, i.e., contextual facial information, can be beneficial for improving the performance of real-world applications, such as face verification, face recognition, and image search. Examples of face attributes include gender, skin color, facial hair, etc. How to robustly obtain these facial attributes (traits) is still an open problem, especially in the presence of the challenges of real-world environments: non-uniform illumination conditions, arbitrary occlusions, motion blur and background clutter. What makes this problem even more difficult is the enormous variability presented by the same subject, due to arbitrary face scales, head poses, and facial expressions. In this paper, we focus on the problem of facial trait classification in real-world face videos. We have developed a fully automatic hierarchical and probabilistic framework that models the collective set of frame class distributions and feature spatial information over a video sequence. The experiments are conducted on a large real-world face video database that we have collected, labelled and made publicly available. The proposed method is flexible enough to be applied to any facial classification problem. Experiments on a large, real-world video database McGillFaces [1] of 18,000 video frames reveal that the proposed framework outperforms alternative approaches, by up to 16.96% and 10.13%, for the facial attributes of gender and facial hair, respectively.
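A much-simplified stand-in for pooling frame-level evidence over a video (the paper's model is hierarchical and probabilistic; the posteriors and confidence weights below are synthetic): per-frame class posteriors are combined with confidence weights into a single video-level decision.

    import numpy as np

    rng = np.random.default_rng(5)
    n_frames = 40
    # per-frame posterior for a binary trait (e.g. facial hair: [absent, present])
    frame_post = rng.dirichlet([2.0, 3.0], size=n_frames)
    # blurred or occluded frames would receive low confidence weights
    confidence = rng.uniform(0.2, 1.0, size=n_frames)

    video_post = (confidence[:, None] * frame_post).sum(axis=0) / confidence.sum()
    label = ["absent", "present"][int(np.argmax(video_post))]
    print(np.round(video_post, 3), "->", label)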
mlCAF: Multi-Level Cross-Domain Semantic Context Fusioning for Behavior Identification.
Razzaq, Muhammad Asif; Villalonga, Claudia; Lee, Sungyoung; Akhtar, Usman; Ali, Maqbool; Kim, Eun-Soo; Khattak, Asad Masood; Seung, Hyonwoo; Hur, Taeho; Bang, Jaehun; Kim, Dohyeong; Ali Khan, Wajahat
2017-10-24
The emerging research on automatic identification of user's contexts from the cross-domain environment in ubiquitous and pervasive computing systems has proved to be successful. Monitoring the diversified user's contexts and behaviors can help in controlling lifestyle associated to chronic diseases using context-aware applications. However, availability of cross-domain heterogeneous contexts provides a challenging opportunity for their fusion to obtain abstract information for further analysis. This work demonstrates extension of our previous work from a single domain (i.e., physical activity) to multiple domains (physical activity, nutrition and clinical) for context-awareness. We propose multi-level Context-aware Framework (mlCAF), which fuses the multi-level cross-domain contexts in order to arbitrate richer behavioral contexts. This work explicitly focuses on key challenges linked to multi-level context modeling, reasoning and fusioning based on the mlCAF open-source ontology. More specifically, it addresses the interpretation of contexts from three different domains, their fusioning conforming to richer contextual information. This paper contributes in terms of ontology evolution with additional domains, context definitions, rules and inclusion of semantic queries. For the framework evaluation, multi-level cross-domain contexts collected from 20 users were used to ascertain abstract contexts, which served as basis for behavior modeling and lifestyle identification. The experimental results indicate a context recognition average accuracy of around 92.65% for the collected cross-domain contexts.
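A toy rule-based fusion step in the spirit of multi-level context fusion (the actual framework is ontology-driven with semantic queries; the domains, values and rules here are invented for illustration): contexts from three domains are combined into a higher-level behaviour label.

    activity_ctx = {"domain": "physical_activity", "value": "sedentary", "hours": 9}
    nutrition_ctx = {"domain": "nutrition", "value": "high_calorie_intake"}
    clinical_ctx = {"domain": "clinical", "value": "elevated_blood_glucose"}

    def fuse(contexts):
        values = {c["domain"]: c["value"] for c in contexts}
        # cross-domain rule: sedentary behaviour plus high calorie intake
        if (values.get("physical_activity") == "sedentary"
                and values.get("nutrition") == "high_calorie_intake"):
            behaviour = "unhealthy_lifestyle_pattern"
        else:
            behaviour = "no_risk_pattern_detected"
        # clinical context refines the abstract behaviour further
        if values.get("clinical") == "elevated_blood_glucose":
            behaviour += "_with_clinical_risk"
        return behaviour

    print(fuse([activity_ctx, nutrition_ctx, clinical_ctx]))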
Public Perception on Disaster Management Using Volunteered Geographic Information (VGI): Case of UAE
NASA Astrophysics Data System (ADS)
Yagoub, M. M.
2015-10-01
The number of smartphones that are equipped with location features such as the Global Positioning System (GPS) and a camera, and that are connected to the internet, has increased sharply in the UAE during the last five years. This increase offers a chance to capitalize on using these devices as resources for data collection, thereby reducing cost. In many cases, specific events may happen in areas or at times where no governmental department is present to record such one-off events. The current research showcases various studies that have been conducted on Volunteered Geographic Information (VGI), debating aspects such as accuracy, legal issues, and privacy. This research also integrates Geographic Information Systems (GIS), VGI, social media tools, data mining, and mobile technology to design a conceptual framework for promoting public participation in the UAE. The data gathered through a survey will be helpful in correlating various aspects of VGI. Since there are diverse views about these aspects, policy makers in many countries remain undecided about how to deal with VGI. The assessment of the UAE case will contribute to this long-running debate by examining the willingness of the public to participate. The results will show the public's perception of acting as sensors for data collection. Additionally, the paper also explores the potential for citizen involvement in the risk and disaster management process through voluntary data collected for VGI applications.
Stokes, Emma J
2010-12-01
Wild tigers are in a critical state with an estimated population decline of more than 95% over the past century. Improving the capacity and effectiveness of law enforcement in reducing poaching of tigers is an immediate priority to secure remaining wild populations in source sites. From 2008-2010, standardized patrol-based law enforcement monitoring (LEM) was established under the Tigers Forever Program across 8 key tiger sites in order to improve and evaluate law enforcement interventions. Patrol-based monitoring has the distinct advantage of providing regular and rapid information on illegal activities and ranger performance, although, until recently, it has received relatively little scrutiny from the conservation community. The present paper outlines a framework for implementation of LEM in tiger source sites using MIST, a computerized management information system for ranger-based data collection. The framework addresses many of the technical, practical and institutional challenges involved in the design, implementation, sustainability and evaluation of LEM. Adoption of such a framework for LEM is a cost-effective strategy to improve the efficiency of law enforcement efforts, to increase the motivation of enforcement staff and to promote the accountability of law enforcement agencies in addressing threats to tigers. When combined with independent, systematic and science-based monitoring of tigers and their prey, LEM has great potential for evaluating the effectiveness of protection-based conservation investments. © 2010 ISZS, Blackwell Publishing and IOZ/CAS.
Gale, Nicola K; Shapiro, Jonathan; McLeod, Hugh S T; Redwood, Sabi; Hewison, Alistair
2014-08-20
Organizational culture is considered by policy-makers, clinicians, health service managers and researchers to be a crucial mediator in the success of implementing health service redesign. It is a challenge to find a method to capture cultural issues that is both theoretically robust and meaningful to those working in the organizations concerned. As part of a comparative study of service redesign in three acute hospital organizations in England, UK, a framework for collecting data reflective of culture was developed that was informed by previous work in the field and social and cultural theory. As part of a larger mixed method comparative case study of hospital service redesign, informed by realist evaluation, the authors developed a framework for researching organisational culture during health service redesign and change. This article documents the development of the model, which involved an iterative process of data analysis, critical interdisciplinary discussion in the research team, and feedback from staff in the partner organisations. Data from semi-structured interviews with 77 key informants are used to illustrate the model. In workshops with NHS partners to share and debate the early findings of the study, organizational culture was identified as a key concept to explore because it was perceived to underpin the whole redesign process. The Patients-People-Place framework for studying culture focuses on three thematic areas ('domains') and three levels of culture in which the data could be organised. The framework can be used to help explain the relationship between observable behaviours and cultural artefacts, the values and habits of social actors and the basic assumptions underpinning an organization's culture in each domain. This paper makes a methodological contribution to the study of culture in health care organizations. It offers guidance and a practical approach to investigating the inherently complex phenomenon of culture in hospital organizations. The Patients-People-Place framework could be applied in other settings as a means of ensuring the three domains and three levels that are important to an organization's culture are addressed in future health service research.
Lyon, Aaron R; Lewis, Cara C; Melvin, Abigail; Boyd, Meredith; Nicodimos, Semret; Liu, Freda F; Jungbluth, Nathaniel
2016-09-22
Health information technologies (HIT) have become nearly ubiquitous in the contemporary healthcare landscape, but information about HIT development, functionality, and implementation readiness is frequently siloed. Theory-driven methods of compiling, evaluating, and integrating information from the academic and commercial sectors are necessary to guide stakeholder decision-making surrounding HIT adoption and to develop pragmatic HIT research agendas. This article presents the Health Information Technologies-Academic and Commercial Evaluation (HIT-ACE) methodology, a structured, theory-driven method for compiling and evaluating information from multiple sectors. As an example demonstration of the methodology, we apply HIT-ACE to mental and behavioral health measurement feedback systems (MFS). MFS are a specific class of HIT that support the implementation of routine outcome monitoring, an evidence-based practice. HIT-ACE is guided by theories and frameworks related to user-centered design and implementation science. The methodology involves four phases: (1) coding academic and commercial materials, (2) developer/purveyor interviews, (3) linking putative implementation mechanisms to HIT capabilities, and (4) experimental testing of capabilities and mechanisms. In the current demonstration, phase 1 included a systematic process to identify MFS in mental and behavioral health using academic literature and commercial websites. Using user-centered design, implementation science, and feedback frameworks, the HIT-ACE coding system was developed, piloted, and used to review each identified system for the presence of 38 capabilities and 18 additional characteristics via a consensus coding process. Bibliometric data were also collected to examine the representation of the systems in the scientific literature. As an example, results are presented for the application of HIT-ACE phase 1 to MFS wherein 49 separate MFS were identified, reflecting a diverse array of characteristics and capabilities. Preliminary findings demonstrate the utility of HIT-ACE to represent the scope and diversity of a given class of HIT beyond what can be identified in the academic literature. Phase 2 data collection is expected to confirm and expand the information presented, and phases 3 and 4 will provide more nuanced information about the impact of specific HIT capabilities. In all, HIT-ACE is expected to support adoption decisions and additional HIT development and implementation research.
Koutkias, Vassilis G; Lillo-Le Louët, Agnès; Jaulent, Marie-Christine
2017-02-01
Driven by the need of pharmacovigilance centres and companies to routinely collect and review all available data about adverse drug reactions (ADRs) and adverse events of interest, we introduce and validate a computational framework exploiting dominant as well as emerging publicly available data sources for drug safety surveillance. Our approach relies on appropriate query formulation for data acquisition and subsequent filtering, transformation and joint visualization of the obtained data. We acquired data from the FDA Adverse Event Reporting System (FAERS), PubMed and Twitter. In order to assess the validity and the robustness of the approach, we elaborated on two important case studies, namely, clozapine-induced cardiomyopathy/myocarditis versus haloperidol-induced cardiomyopathy/myocarditis, and apixaban-induced cerebral hemorrhage. The analysis of the obtained data provided interesting insights (identification of potential patient and health-care professional experiences regarding ADRs in Twitter, information/arguments against an ADR existence across all sources), while illustrating the benefits (complementing data from multiple sources to strengthen/confirm evidence) and the underlying challenges (selecting search terms, data presentation) of exploiting heterogeneous information sources, thereby advocating the need for the proposed framework. This work contributes in establishing a continuous learning system for drug safety surveillance by exploiting heterogeneous publicly available data sources via appropriate support tools.
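A sketch of the data-acquisition step for one of the public sources described above (PubMed via the NCBI E-utilities esearch endpoint; the query term echoes one of the case studies, the response layout assumed below is the usual esearch JSON shape, and network access is required to run it):

    import json
    import urllib.parse
    import urllib.request

    term = "clozapine AND myocarditis"
    url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
           + urllib.parse.urlencode({"db": "pubmed", "term": term,
                                     "retmode": "json", "retmax": 20}))
    with urllib.request.urlopen(url) as resp:
        result = json.load(resp)["esearchresult"]

    # record count and the first few PMIDs, ready for filtering and joint visualization
    print(result["count"], "records;", "first PMIDs:", result["idlist"][:5])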
Dynamic social networks based on movement
Scharf, Henry; Hooten, Mevin B.; Fosdick, Bailey K.; Johnson, Devin S.; London, Joshua M.; Durban, John W.
2016-01-01
Network modeling techniques provide a means for quantifying social structure in populations of individuals. Data used to define social connectivity are often expensive to collect and based on case-specific, ad hoc criteria. Moreover, in applications involving animal social networks, collection of these data is often opportunistic and can be invasive. Frequently, the social network of interest for a given population is closely related to the way individuals move. Thus, telemetry data, which are minimally invasive and relatively inexpensive to collect, present an alternative source of information. We develop a framework for using telemetry data to infer social relationships among animals. To achieve this, we propose a Bayesian hierarchical model with an underlying dynamic social network controlling movement of individuals via two mechanisms: an attractive effect and an aligning effect. We demonstrate the model and its ability to accurately identify complex social behavior in simulation, and apply our model to telemetry data arising from killer whales. Using auxiliary information about the study population, we investigate model validity and find the inferred dynamic social network is consistent with killer whale ecology and expert knowledge.
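A simplified forward simulation of the two movement mechanisms described above, attraction toward socially connected individuals and alignment with their headings (the network, weights and step sizes are invented; the paper instead fits a Bayesian hierarchical model to telemetry data rather than simulating):

    import numpy as np

    rng = np.random.default_rng(6)
    n, steps, dt = 5, 50, 1.0
    W = (rng.random((n, n)) < 0.4).astype(float)      # social network (held fixed here)
    np.fill_diagonal(W, 0.0)
    pos = rng.normal(0, 5, size=(n, 2))
    vel = rng.normal(0, 1, size=(n, 2))
    attract, align, noise = 0.05, 0.3, 0.2

    for _ in range(steps):
        new_vel = vel.copy()
        for i in range(n):
            nbrs = np.flatnonzero(W[i])
            if nbrs.size:
                pull = (pos[nbrs] - pos[i]).mean(axis=0)      # attraction toward neighbours
                match = vel[nbrs].mean(axis=0)                # alignment with their headings
                new_vel[i] = (1 - align) * vel[i] + align * match + attract * pull
        vel = new_vel + rng.normal(0, noise, size=(n, 2))
        pos = pos + dt * vel

    print(np.round(pos, 2))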
A novel approach for estimating ingested dose associated with paracetamol overdose.
Zurlinden, Todd J; Heard, Kennon; Reisfeld, Brad
2016-04-01
In cases of paracetamol (acetaminophen, APAP) overdose, an accurate estimate of tissue-specific paracetamol pharmacokinetics (PK) and ingested dose can offer health care providers important information for the individualized treatment and follow-up of affected patients. Here a novel methodology is presented to make such estimates using a standard serum paracetamol measurement and a computational framework. The core component of the computational framework was a physiologically-based pharmacokinetic (PBPK) model developed and evaluated using an extensive set of human PK data. Bayesian inference was used for parameter and dose estimation, allowing the incorporation of inter-study variability, and facilitating the calculation of uncertainty in model outputs. Simulations of paracetamol time course concentrations in the blood were in close agreement with experimental data under a wide range of dosing conditions. Also, predictions of administered dose showed good agreement with a large collection of clinical and emergency setting PK data over a broad dose range. In addition to dose estimation, the platform was applied for the determination of optimal blood sampling times for dose reconstruction and quantitation of the potential role of paracetamol conjugate measurement on dose estimation. Current therapies for paracetamol overdose rely on a generic methodology involving the use of a clinical nomogram. By using the computational framework developed in this study, serum sample data, and the individual patient's anthropometric and physiological information, personalized serum and liver pharmacokinetic profiles and dose estimate could be generated to help inform an individualized overdose treatment and follow-up plan. © 2015 The British Pharmacological Society.
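As a toy illustration of dose reconstruction from a single serum measurement, the sketch below replaces the study's PBPK model with a one-compartment model with first-order absorption and a grid-based Bayesian posterior over the ingested dose. All parameter values (absorption and elimination rates, volume of distribution, error model) are illustrative assumptions, not those of the published model, and the output is not for clinical use.

```python
import numpy as np

def conc_1cpt(dose_mg, t_h, ka=1.5, ke=0.3, vd_l=50.0):
    """Serum concentration (mg/L) from a one-compartment model with first-order
    absorption; a crude stand-in for the study's PBPK model."""
    return dose_mg * ka / (vd_l * (ka - ke)) * (np.exp(-ke * t_h) - np.exp(-ka * t_h))

def dose_posterior(c_obs, t_obs, doses=np.linspace(500, 50000, 400), cv=0.3):
    """Grid-based posterior over ingested dose (mg) given one serum sample,
    assuming lognormal measurement/model error with coefficient of variation cv."""
    pred = conc_1cpt(doses, t_obs)
    sigma = np.sqrt(np.log(1 + cv**2))
    loglik = -0.5 * ((np.log(c_obs) - np.log(pred)) / sigma) ** 2
    weights = np.exp(loglik - loglik.max())       # flat prior over the dose grid
    return doses, weights / weights.sum()

if __name__ == "__main__":
    # Hypothetical sample: 120 mg/L measured 4 h after ingestion.
    doses, post = dose_posterior(c_obs=120.0, t_obs=4.0)
    mean_dose = (doses * post).sum()
    print(f"posterior mean ingested dose ~ {mean_dose / 1000:.1f} g")
```

The published framework additionally propagates inter-study variability and patient anthropometry through the PBPK model, which a one-compartment toy cannot capture.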
Kuziemsky, C E; Randell, R; Borycki, E M
2016-11-10
No framework exists to identify and study unintended consequences (UICs) with a focus on organizational and social issues (OSIs). To address this shortcoming, we conducted a literature review to develop a framework for considering UICs and health information technology (HIT) from the perspective of OSIs. A literature review was conducted for the period 2000- 2015 using the search terms "unintended consequences" and "health information technology". 67 papers were screened, of which 18 met inclusion criteria. Data extraction was focused on the types of technologies studied, types of UICs identified, and methods of data collection and analysis used. A thematic analysis was used to identify themes related to UICs. We identified two overarching themes. One was the definition and terminology of how people classify and discuss UICs. Second was OSIs and UICs. For the OSI theme, we also identified four sub-themes: process change and evolution, individual-collaborative interchange, context of use, and approaches to model, study, and understand UICs. While there is a wide body of research on UICs, there is a lack of overall consensus on how they should be classified and reported, limiting our ability to understand the implications of UICs and how to manage them. More mixed-methods research and better proactive identification of UICs remain priorities. Our findings and framework of OSI considerations for studying UICs and HIT extend existing work on HIT and UICs by focusing on organizational and social issues.
Understanding Unintended Consequences and Health Information Technology:
Randell, R.; Borycki, E. M.
2016-01-01
Summary Objective No framework exists to identify and study unintended consequences (UICs) with a focus on organizational and social issues (OSIs). To address this shortcoming, we conducted a literature review to develop a framework for considering UICs and health information technology (HIT) from the perspective of OSIs. Methods A literature review was conducted for the period 2000-2015 using the search terms “unintended consequences” and “health information technology”. 67 papers were screened, of which 18 met inclusion criteria. Data extraction was focused on the types of technologies studied, types of UICs identified, and methods of data collection and analysis used. A thematic analysis was used to identify themes related to UICs. Results We identified two overarching themes. One was the definition and terminology of how people classify and discuss UICs. Second was OSIs and UICs. For the OSI theme, we also identified four sub-themes: process change and evolution, individual-collaborative interchange, context of use, and approaches to model, study, and understand UICs. Conclusions While there is a wide body of research on UICs, there is a lack of overall consensus on how they should be classified and reported, limiting our ability to understand the implications of UICs and how to manage them. More mixed-methods research and better proactive identification of UICs remain priorities. Our findings and framework of OSI considerations for studying UICs and HIT extend existing work on HIT and UICs by focusing on organizational and social issues. PMID:27830231
Mialon, M; Swinburn, B; Sacks, G
2015-07-01
Unhealthy diets represent one of the major risk factors for non-communicable diseases. There is currently a risk that the political influence of the food industry results in public health policies that do not adequately balance public and commercial interests. This paper aims to develop a framework for categorizing the corporate political activity of the food industry with respect to public health and proposes an approach to systematically identify and monitor it. The proposed framework includes six strategies used by the food industry to influence public health policies and outcomes: information and messaging; financial incentive; constituency building; legal; policy substitution; opposition fragmentation and destabilization. The corporate political activity of the food industry could be identified and monitored through publicly available data sourced from the industry itself, governments, the media and other sources. Steps for country-level monitoring include identification of key food industry actors and related sources of information, followed by systematic data collection and analysis of relevant documents, using the proposed framework as a basis for classification of results. The proposed monitoring approach should be pilot tested in different countries as part of efforts to increase the transparency and accountability of the food industry. This approach has the potential to help redress any imbalance of interests and thereby contribute to the prevention and control of non-communicable diseases. © 2015 World Obesity.
Application of quantum Darwinism to a structured environment
NASA Astrophysics Data System (ADS)
Pleasance, Graeme; Garraway, Barry M.
2017-12-01
Quantum Darwinism extends the traditional formalism of decoherence to explain the emergence of classicality in a quantum universe. A classical description emerges when the environment tends to redundantly acquire information about the pointer states of an open system. In light of recent interest, we apply the theoretical tools of the framework to a qubit coupled with many bosonic subenvironments. We examine the degree to which the same classical information is encoded across collections of (i) complete subenvironments and (ii) residual "pseudomode" components of each subenvironment, the conception of which provides a dynamic representation of the reservoir memory. Overall, significant redundancy of information is found as a typical result of the decoherence process. However, by examining its decomposition in terms of classical and quantum correlations, we discover classical information to be nonredundant in both cases i and ii. Moreover, with the full collection of pseudomodes, certain dynamical regimes realize opposite effects, where either the total classical or quantum correlations predominantly decay over time. Finally, when the dynamics are non-Markovian, we find that redundant information is suppressed in line with information backflow to the qubit. By quantifying redundancy, we concretely show it to act as a witness to non-Markovianity in the same way as the trace distance does for nondivisible dynamical maps.
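The redundancy plateau discussed above can be illustrated numerically with a GHZ-like branching state of a system qubit and N environment qubits (a textbook construction, not the paper's bosonic pseudomode environment). The partial-trace helper and the state are generic; fragment sizes and N are arbitrary choices.

```python
import numpy as np

def ptrace(rho, keep, dims):
    """Partial trace of a density matrix over all subsystems not in `keep`."""
    n = len(dims)
    rho = rho.reshape(dims + dims)            # ket axes 0..n-1, bra axes n..2n-1
    traced = [i for i in range(n) if i not in keep]
    for count, i in enumerate(sorted(traced)):
        axis = i - count                      # axes shift as subsystems are traced out
        rho = np.trace(rho, axis1=axis, axis2=axis + rho.ndim // 2)
    d = int(np.prod([dims[i] for i in keep]))
    return rho.reshape(d, d)

def entropy(rho):
    """Von Neumann entropy in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log2(w)).sum())

n_env = 6
dims = [2] * (1 + n_env)                      # system qubit + environment qubits
d = 2 ** (1 + n_env)
psi = np.zeros(d)
psi[0] = psi[-1] = 1 / np.sqrt(2)             # (|00...0> + |11...1>)/sqrt(2): branching state
rho = np.outer(psi, psi)

rho_S = ptrace(rho, [0], dims)
for f in range(1, n_env + 1):
    frag = list(range(1, 1 + f))              # fragment = first f environment qubits
    rho_F = ptrace(rho, frag, dims)
    rho_SF = ptrace(rho, [0] + frag, dims)
    mi = entropy(rho_S) + entropy(rho_F) - entropy(rho_SF)
    print(f"fragment size {f}: I(S:F) = {mi:.2f} bits")
# The mutual information plateaus at 1 bit (= H(S)) for partial fragments, the
# classically redundant record, and reaches 2 bits only for the full environment.
```

The plateau at H(S) for small fragments is the signature of redundantly stored classical information; the pseudomode decomposition in the paper probes where within each subenvironment that record actually resides.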
Wicasa Was'aka: restoring the traditional strength of American Indian boys and men.
Brave Heart, Maria Yellow Horse; Elkins, Jennifer; Tafoya, Greg; Bird, Doreen; Salvador, Melina
2012-05-01
We examined health disparities among American Indian men and boys within the framework of historical trauma, which incorporates the historical context of collective massive group trauma across generations. We reviewed the impact of collective traumatic experiences among Lakota men, who have faced cross-generational challenges to enacting traditional tribal roles. We describe historical trauma-informed interventions used with two tribal groups: Lakota men and Southwestern American Indian boys. These two interventions represent novel approaches to addressing historical trauma and the health disparities that American Indians face. We offer public health implications and recommendations for strategies to use in the planning and implementation of policy, research, and program development with American Indian boys and men.
Statistical physics of language dynamics
NASA Astrophysics Data System (ADS)
Loreto, Vittorio; Baronchelli, Andrea; Mukherjee, Animesh; Puglisi, Andrea; Tria, Francesca
2011-04-01
Language dynamics is a rapidly growing field that focuses on all processes related to the emergence, evolution, change and extinction of languages. Recently, the study of self-organization and evolution of language and meaning has led to the idea that a community of language users can be seen as a complex dynamical system, which collectively solves the problem of developing a shared communication framework through the back-and-forth signaling between individuals. We shall review some of the progress made in the past few years and highlight potential future directions of research in this area. In particular, the emergence of a common lexicon and of a shared set of linguistic categories will be discussed, as examples corresponding to the early stages of a language. The extent to which synthetic modeling is nowadays contributing to the ongoing debate in cognitive science will be pointed out. In addition, the burst of growth of the web is providing new experimental frameworks. It makes available a huge amount of resources, both as novel tools and data to be analyzed, allowing quantitative and large-scale analysis of the processes underlying the emergence of a collective information and language dynamics.
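The emergence of a shared lexicon mentioned above is commonly studied with the minimal naming game; the sketch below is that standard textbook model, not a model taken from the review itself, and the population size and interaction rule are illustrative choices.

```python
import random

def naming_game(n_agents=50, max_steps=200000, seed=0):
    """Minimal naming game: random speaker-hearer pairs negotiate names for one
    object until the whole population converges on a single shared name."""
    random.seed(seed)
    vocab = [set() for _ in range(n_agents)]      # each agent's inventory of names
    next_name = 0
    for step in range(max_steps):
        speaker, hearer = random.sample(range(n_agents), 2)
        if not vocab[speaker]:                    # invent a new name if inventory is empty
            vocab[speaker].add(next_name)
            next_name += 1
        name = random.choice(tuple(vocab[speaker]))
        if name in vocab[hearer]:                 # success: both collapse to that name
            vocab[speaker] = {name}
            vocab[hearer] = {name}
        else:                                     # failure: hearer learns the name
            vocab[hearer].add(name)
        if all(len(v) == 1 and v == vocab[0] for v in vocab):
            return step + 1                       # interactions until global consensus
    return None

print("consensus reached after", naming_game(), "interactions")
```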
Fisher, Michael B.; Mann, Benjamin H.; Cronk, Ryan D.; Shields, Katherine F.; Klug, Tori L.; Ramaswamy, Rohit
2016-01-01
Information and communications technologies (ICTs) such as mobile survey tools (MSTs) can facilitate field-level data collection to drive improvements in national and international development programs. MSTs allow users to gather and transmit field data in real time, standardize data storage and management, automate routine analyses, and visualize data. Dozens of diverse MST options are available, and users may struggle to select suitable options. We developed a systematic MST Evaluation Framework (EF), based on International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) software quality modeling standards, to objectively assess MSTs and assist program implementers in identifying suitable MST options. The EF is applicable to MSTs for a broad variety of applications. We also conducted an MST user survey to elucidate needs and priorities of current MST users. Finally, the EF was used to assess seven MSTs currently used for water and sanitation monitoring, as a validation exercise. The results suggest that the EF is a promising method for evaluating MSTs. PMID:27563916
Fisher, Michael B; Mann, Benjamin H; Cronk, Ryan D; Shields, Katherine F; Klug, Tori L; Ramaswamy, Rohit
2016-08-23
Information and communications technologies (ICTs) such as mobile survey tools (MSTs) can facilitate field-level data collection to drive improvements in national and international development programs. MSTs allow users to gather and transmit field data in real time, standardize data storage and management, automate routine analyses, and visualize data. Dozens of diverse MST options are available, and users may struggle to select suitable options. We developed a systematic MST Evaluation Framework (EF), based on International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) software quality modeling standards, to objectively assess MSTs and assist program implementers in identifying suitable MST options. The EF is applicable to MSTs for a broad variety of applications. We also conducted an MST user survey to elucidate needs and priorities of current MST users. Finally, the EF was used to assess seven MSTs currently used for water and sanitation monitoring, as a validation exercise. The results suggest that the EF is a promising method for evaluating MSTs.
Revised Methods for Characterizing Stream Habitat in the National Water-Quality Assessment Program
Fitzpatrick, Faith A.; Waite, Ian R.; D'Arconte, Patricia J.; Meador, Michael R.; Maupin, Molly A.; Gurtz, Martin E.
1998-01-01
Stream habitat is characterized in the U.S. Geological Survey's National Water-Quality Assessment (NAWQA) Program as part of an integrated physical, chemical, and biological assessment of the Nation's water quality. The goal of stream habitat characterization is to relate habitat to other physical, chemical, and biological factors that describe water-quality conditions. To accomplish this goal, environmental settings are described at sites selected for water-quality assessment. In addition, spatial and temporal patterns in habitat are examined at local, regional, and national scales. This habitat protocol contains updated methods for evaluating habitat in NAWQA Study Units. Revisions are based on lessons learned after 6 years of applying the original NAWQA habitat protocol to NAWQA Study Unit ecological surveys. Similar to the original protocol, these revised methods for evaluating stream habitat are based on a spatially hierarchical framework that incorporates habitat data at basin, segment, reach, and microhabitat scales. This framework provides a basis for national consistency in collection techniques while allowing flexibility in habitat assessment within individual Study Units. Procedures are described for collecting habitat data at basin and segment scales; these procedures include use of geographic information system data bases, topographic maps, and aerial photographs. Data collected at the reach scale include channel, bank, and riparian characteristics.
On effective temperature in network models of collective behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
Porfiri, Maurizio, E-mail: mporfiri@nyu.edu; Ariel, Gil, E-mail: arielg@math.biu.ac.il
Collective behavior of self-propelled units is studied analytically within the Vectorial Network Model (VNM), a mean-field approximation of the well-known Vicsek model. We propose a dynamical systems framework to study the stochastic dynamics of the VNM in the presence of general additive noise. We establish that a single parameter, which is a linear function of the circular mean of the noise, controls the macroscopic phase of the system—ordered or disordered. By establishing a fluctuation–dissipation relation, we posit that this parameter can be regarded as an effective temperature of collective behavior. The exact critical temperature is obtained analytically for systems with small connectivity, equivalent to low-density ensembles of self-propelled units. Numerical simulations are conducted to demonstrate the applicability of this new notion of effective temperature to the Vicsek model. The identification of an effective temperature of collective behavior is an important step toward understanding order–disorder phase transitions, informing consistent coarse-graining techniques and explaining the physics underlying the emergence of collective phenomena.
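A minimal sketch of the order-disorder transition discussed above: a Vicsek-style heading update on a fixed random interaction network, standing in for the Vectorial Network Model, with a polarization order parameter that drops as the additive heading noise grows. Network size, connectivity and noise values are illustrative assumptions.

```python
import numpy as np

def polarization(theta):
    """Magnitude of the mean heading vector (1 = fully ordered, 0 = disordered)."""
    return np.abs(np.mean(np.exp(1j * theta)))

def run_vnm(n=200, k=4, eta=0.5, steps=500, seed=0):
    """Headings on a fixed random network: each unit adopts the circular mean of
    its neighbours' headings (plus its own) with uniform noise in [-eta/2, eta/2]."""
    rng = np.random.default_rng(seed)
    nbrs = np.array([rng.choice(n, size=k, replace=False) for _ in range(n)])
    theta = rng.uniform(-np.pi, np.pi, size=n)
    for _ in range(steps):
        mean_vec = np.exp(1j * theta)[nbrs].mean(axis=1) + np.exp(1j * theta)
        theta = np.angle(mean_vec) + rng.uniform(-eta / 2, eta / 2, size=n)
    return polarization(theta)

for eta in (0.5, 2.0, 4.0, 6.0):
    print(f"noise eta = {eta}: polarization ~ {run_vnm(eta=eta):.2f}")
# Increasing noise drives the system across the order-disorder transition,
# which the paper above recasts in terms of an effective temperature.
```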
Understanding visualization: a formal approach using category theory and semiotics.
Vickers, Paul; Faith, Joe; Rossiter, Nick
2013-06-01
This paper combines the vocabulary of semiotics and category theory to provide a formal analysis of visualization. It shows how familiar processes of visualization fit the semiotic frameworks of both Saussure and Peirce, and extends these structures using the tools of category theory to provide a general framework for understanding visualization in practice, including: Relationships between systems, data collected from those systems, renderings of those data in the form of representations, the reading of those representations to create visualizations, and the use of those visualizations to create knowledge and understanding of the system under inspection. The resulting framework is validated by demonstrating how familiar information visualization concepts (such as literalness, sensitivity, redundancy, ambiguity, generalizability, and chart junk) arise naturally from it and can be defined formally and precisely. This paper generalizes previous work on the formal characterization of visualization by, inter alia, Ziemkiewicz and Kosara and allows us to formally distinguish properties of the visualization process that previous work does not.
Quality and content of dental practice websites.
Nichols, L C; Hassall, D
2011-04-09
To investigate the quality and content of dental practice websites by constructing an audit framework based on regulations, guidance and expert advice, and applying this framework to a random sample of UK dental practices' websites. An audit framework was constructed and in-depth data collected from a random sample of 150 UK dental practices. Thirty-five percent of dental practices in this study were found to have websites. Compliance with rules and regulations regarding dental practice websites was generally poor. Use of advised content for practice promotion was variable. Many websites were poorly optimised. Eighty-nine percent of the websites advertised tooth whitening, despite the issues surrounding its legality; 25% of the websites advertised Botox even though advertising of prescription only medicines is illegal. Some websites gave misleading information about the specialist status of their dentists. Those responsible for dental practice websites need to be aware of a wide range of regulations and guidance, and are advised to follow expert advice on content and optimisation in order to maximise the potential of their websites.
Abramson, David M.; Grattan, Lynn M.; Mayer, Brian; Colten, Craig E.; Arosemena, Farah A.; Rung, Ariane; Lichtveld, Maureen
2014-01-01
A number of governmental agencies have called for enhancing citizens’ resilience as a means of preparing populations in advance of disasters, and as a counter-balance to social and individual vulnerabilities. This increasing scholarly, policy and programmatic interest in promoting individual and communal resilience presents a challenge to the research and practice communities: to develop a translational framework that can accommodate multi-disciplinary scientific perspectives into a single, applied model. The Resilience Activation Framework provides a basis for testing how access to social resources, such as formal and informal social support and help, promotes positive adaptation or reduced psychopathology among individuals and communities exposed to the acute collective stressors associated with disasters, whether manmade, natural, or technological in origin. Articulating the mechanisms by which access to social resources activate and sustain resilience capacities for optimal mental health outcomes post-disaster can lead to the development of effective preventive and early intervention programs. PMID:24870399
Abramson, David M; Grattan, Lynn M; Mayer, Brian; Colten, Craig E; Arosemena, Farah A; Bedimo-Rung, Ariane; Lichtveld, Maureen
2015-01-01
A number of governmental agencies have called for enhancing citizens' resilience as a means of preparing populations in advance of disasters, and as a counterbalance to social and individual vulnerabilities. This increasing scholarly, policy, and programmatic interest in promoting individual and communal resilience presents a challenge to the research and practice communities: to develop a translational framework that can accommodate multidisciplinary scientific perspectives into a single, applied model. The Resilience Activation Framework provides a basis for testing how access to social resources, such as formal and informal social support and help, promotes positive adaptation or reduced psychopathology among individuals and communities exposed to the acute collective stressors associated with disasters, whether human-made, natural, or technological in origin. Articulating the mechanisms by which access to social resources activate and sustain resilience capacities for optimal mental health outcomes post-disaster can lead to the development of effective preventive and early intervention programs.
Web-based GIS for spatial pattern detection: application to malaria incidence in Vietnam.
Bui, Thanh Quang; Pham, Hai Minh
2016-01-01
There is great concern about how to build an interoperable health information system spanning public health and health information technology within the development of public information and health surveillance programmes. Technically, some major issues remain regarding health data visualization, spatial processing of health data, health information dissemination, data sharing and the access of local communities to health information. In combination with GIS, we propose a technical framework for web-based health data visualization and spatial analysis. Data were collected from open map servers and geocoded using the Open Data Kit package and data geocoding tools. The web-based system is designed on open-source frameworks and libraries. The system provides a web-based analysis tool for pattern detection through three spatial tests: nearest neighbour, K function, and spatial autocorrelation. The result is a web-based GIS through which end users can detect disease patterns by selecting an area and spatial test parameters, and contribute information to managers and decision makers. The end users can be health practitioners, educators, local communities, health sector authorities and decision makers. This web-based system allows for the improvement of health-related services to public sector users as well as citizens in a secure manner. The combination of spatial statistics and web-based GIS can be a solution that helps empower health practitioners in direct and specific intersectoral actions, thus providing for better analysis, control and decision-making.
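One of the three spatial tests mentioned above, spatial autocorrelation, is commonly quantified with global Moran's I. The sketch below computes it for point incidence data using an assumed inverse-distance weight matrix; the weighting scheme, coordinates and incidence values are illustrative, not the system's actual data or configuration.

```python
import numpy as np

def morans_i(values, coords):
    """Global Moran's I for observations at point locations, using
    inverse-distance spatial weights with a zero diagonal."""
    x = np.asarray(values, dtype=float)
    coords = np.asarray(coords, dtype=float)
    n = len(x)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    w = np.zeros((n, n))
    off = ~np.eye(n, dtype=bool)
    w[off] = 1.0 / d[off]
    z = x - x.mean()
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# Toy example: malaria incidence per commune at five locations.
coords = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5)]
incidence = [12, 14, 13, 40, 38]          # clustered high values in the south-east
print(f"Moran's I = {morans_i(incidence, coords):.2f}")   # positive => spatial clustering
```

A value near +1 indicates clustered incidence (neighbouring communes have similar rates), near 0 indicates spatial randomness, and negative values indicate dispersion.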
ERIC Educational Resources Information Center
Normore, Lorraine
2011-01-01
Introduction: The perceived information needs of teachers who specialize in reading instruction for at-risk first graders were studied and related to frameworks for the role of social context in information needs, seeking and use. The frameworks considered were: disciplinarity, role theory in work settings, small worlds and information grounds and…
Description of the U.S. Geological Survey Geo Data Portal data integration framework
Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Lucido, Jessica M.
2012-01-01
The U.S. Geological Survey has developed an open-standard data integration framework for working efficiently and effectively with large collections of climate and other geoscience data. A web interface accesses catalog datasets to find data services. Data resources can then be rendered for mapping and dataset metadata are derived directly from these web services. Algorithm configuration and information needed to retrieve data for processing are passed to a server where all large-volume data access and manipulation takes place. The data integration strategy described here was implemented by leveraging existing free and open source software. Details of the software used are omitted; rather, emphasis is placed on how open-standard web services and data encodings can be used in an architecture that integrates common geographic and atmospheric data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vernon, Christopher R.; Arntzen, Evan V.; Richmond, Marshall C.
Assessing the environmental benefits of proposed flow modification to large rivers provides invaluable insight into future hydropower project operations and relicensing activities. Providing a means to quantitatively define flow-ecology relationships is integral in establishing flow regimes that are mutually beneficial to power production and ecological needs. To complement this effort, an opportunity to create versatile tools that can be applied to broad geographic areas has been presented. In particular, integration with efforts standardized within the ecological limits of hydrologic alteration (ELOHA) is highly advantageous (Poff et al. 2010). This paper presents a geographic information system (GIS) framework for large river classification that houses a base geomorphic classification that is both flexible and accurate, allowing for full integration with other hydrologic models focused on addressing ELOHA efforts. A case study is also provided that integrates publicly available National Hydrography Dataset Plus Version 2 (NHDPlusV2) data, Modular Aquatic Simulation System two-dimensional (MASS2) hydraulic data, and field collected data into the framework to produce a suite of flow-ecology related outputs. The case study objective was to establish areas of optimal juvenile salmonid rearing habitat under varying flow regimes throughout an impounded portion of the lower Snake River, USA, as an indicator to determine sites where the potential exists to create additional shallow water habitat. Additionally, an alternative hydrologic classification usable throughout the contiguous United States, which can be coupled with the geomorphic aspect of this framework, is also presented. This framework provides the user with the ability to integrate hydrologic and ecologic data into its base geomorphic classification within a GIS to output spatiotemporally variable flow-ecology relationship scenarios.
Addressing practical challenges in utility optimization of mobile wireless sensor networks
NASA Astrophysics Data System (ADS)
Eswaran, Sharanya; Misra, Archan; La Porta, Thomas; Leung, Kin
2008-04-01
This paper examines the practical challenges in the application of the distributed network utility maximization (NUM) framework to the problem of resource allocation and sensor device adaptation in a mission-centric wireless sensor network (WSN) environment. By providing rich (multi-modal), real-time information about a variety of (often inaccessible or hostile) operating environments, sensors such as video, acoustic and short-aperture radar enhance the situational awareness of many battlefield missions. Prior work on the applicability of the NUM framework to mission-centric WSNs has focused on tackling the challenges introduced by i) the definition of an individual mission's utility as a collective function of multiple sensor flows and ii) the dissemination of an individual sensor's data via a multicast tree to multiple consuming missions. However, the practical application and performance of this framework is influenced by several parameters internal to the framework and also by implementation-specific decisions. This is made further complex due to mobile nodes. In this paper, we use discrete-event simulations to study the effects of these parameters on the performance of the protocol in terms of speed of convergence, packet loss, and signaling overhead thereby addressing the challenges posed by wireless interference and node mobility in ad-hoc battlefield scenarios. This study provides better understanding of the issues involved in the practical adaptation of the NUM framework. It also helps identify potential avenues of improvement within the framework and protocol.
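For context, the core NUM machinery referred to above can be reduced to a very small dual-decomposition example: each source updates its rate from a shared link "price" and the link updates the price from excess demand. The log utilities, capacity, weights and step size below are illustrative assumptions; none of the mission-utility, multicast or mobility aspects discussed in the paper are modeled.

```python
import numpy as np

def dual_num(capacity=10.0, weights=(1.0, 2.0, 3.0), step=0.05, iters=2000):
    """Distributed solution of max sum_i w_i*log(x_i) s.t. sum_i x_i <= capacity
    via dual decomposition: sources react to a common congestion price."""
    w = np.asarray(weights, dtype=float)
    price = 1.0
    for _ in range(iters):
        x = w / price                                      # each source's best response
        price = max(1e-6, price + step * (x.sum() - capacity))   # subgradient price update
    return x, price

rates, price = dual_num()
print("rates:", np.round(rates, 2), "price:", round(price, 2))
# Rates converge to capacity * w_i / sum(w): here [1.67, 3.33, 5.0] at price 0.6.
```

Convergence speed, signaling overhead and robustness of exactly this kind of iterative price exchange are the practical parameters the paper studies under wireless interference and mobility.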
NASA Astrophysics Data System (ADS)
Arkhipkin, D.; Lauret, J.
2017-10-01
One of the integration goals of the STAR experiment's modular Messaging Interface and Reliable Architecture framework (MIRA) is to provide seamless and automatic connections with the existing control systems. After an initial proof of concept and operation of the MIRA system as a parallel data collection system for online use and real-time monitoring, the STAR Software and Computing group is now working on the integration of the Experimental Physics and Industrial Control System (EPICS) with MIRA's interfaces. The goals of this integration are to allow functional interoperability and, later on, to replace the existing/legacy Detector Control System components at the service level. In this report, we describe the evolutionary integration process and, as an example, discuss the EPICS Alarm Handler conversion. We review the complete upgrade procedure, starting with the propagation of EPICS-originated alarm signals into MIRA, followed by the replacement of the existing operator interface based on the Motif Editor and Display Manager (MEDM) with a modern, portable, web-based Alarm Handler interface. To achieve this aim, we have built an EPICS-to-MQTT [8] bridging service, and recreated the functionality of the original Alarm Handler using low-latency web messaging technologies. The integration of EPICS alarm handling into our messaging framework allowed STAR to improve the DCS alarm awareness of existing STAR DAQ and RTS services, which use MIRA as a primary source of experiment control information.
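The bridging idea described above can be sketched in a few lines: EPICS process-variable updates are forwarded as MQTT messages. The sketch assumes the pyepics and paho-mqtt Python packages; the broker host, topic prefix and PV names are placeholders, and this is not STAR's actual service.

```python
import json
import time

import epics                      # pyepics Channel Access client (assumed installed)
import paho.mqtt.client as mqtt   # paho-mqtt client (assumed installed)

BROKER = "mqtt.example.org"       # placeholder broker
TOPIC_PREFIX = "dcs/alarms"       # placeholder topic prefix
PVS = ["DET:TPC:anodeCurrent", "DET:TOF:gasPressure"]   # placeholder PV names

client = mqtt.Client()            # paho-mqtt 1.x-style constructor; 2.x also takes a CallbackAPIVersion
client.connect(BROKER, 1883)
client.loop_start()

def forward(pvname=None, value=None, timestamp=None, **kws):
    """pyepics camonitor callback: republish each PV update as a JSON MQTT message."""
    payload = json.dumps({"pv": pvname, "value": value, "ts": timestamp})
    client.publish(f"{TOPIC_PREFIX}/{pvname}", payload, qos=1)

for pv in PVS:
    epics.camonitor(pv, callback=forward)     # subscribe to value changes

while True:                                    # keep the bridge process alive
    time.sleep(1.0)
```

Any web client subscribed to the broker (for example over MQTT-over-WebSockets) can then render these updates with low latency, which is the role the web-based Alarm Handler plays in the report.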
Learning and Collective Knowledge Construction With Social Media: A Process-Oriented Perspective
Kimmerle, Joachim; Moskaliuk, Johannes; Oeberst, Aileen; Cress, Ulrike
2015-01-01
Social media are increasingly being used for educational purposes. The first part of this article briefly reviews literature that reports on educational applications of social media tools. The second part discusses theories that may provide a basis for analyzing the processes that are relevant for individual learning and collective knowledge construction. We argue that a systems-theoretical constructivist approach is appropriate to examine the processes of educational social media use, namely, self-organization, the internalization of information, the externalization of knowledge, and the interplay of externalization and internalization providing the basis of a co-evolution of cognitive and social systems. In the third part we present research findings that illustrate and support this systems-theoretical framework. Concluding, we discuss the implications for educational design and for future research on learning and collective knowledge construction with social media. PMID:26246643
NASA Astrophysics Data System (ADS)
Zama, Shinsaku; Endo, Makoto; Takanashi, Ken'ichi; Araiba, Kiminori; Sekizawa, Ai; Hosokawa, Masafumi; Jeong, Byeong-Pyo; Hisada, Yoshiaki; Murakami, Masahiro
Based on the earlier finding that damage information can be gathered quickly in a municipality with a smaller population, it is proposed that damage information be gathered and analyzed using an area roughly equivalent to a primary school district as the basic unit. The introduction of this type of decentralized system is expected to quickly gather important information on each area. The information gathered by these communal disaster prevention bases is sent to the disaster prevention headquarters, which in turn feeds back more extensive information over a wider area to the communal disaster prevention bases. Concrete systems have been developed according to the above-mentioned framework, and we performed large-scale experiments simulating disaster information collection, transmission and utilization for smooth responses to an earthquake disaster, in collaboration with Toyohashi City, Aichi Prefecture, which is considered very likely to suffer extensive damage from the Tokai and Tonankai Earthquakes. Using disaster information collection/transmission equipment composed of a long-distance wireless LAN, a notebook computer, a Web camera and an IP telephone, city staff could easily input and transmit information such as fires, collapsed houses and impassable roads, collected by the inhabitants who participated in the experiment. Headquarters could confirm such information on an automatically plotted map, as well as the state of each disaster-prevention facility, by means of Web cameras and IP telephones. Based on the damage information, fire-spreading, evaluation, and traffic simulations were automatically executed at the disaster countermeasure office and their results were displayed on a large screen to support decisions such as residents' evacuation. These simulated results were simultaneously displayed at each disaster-prevention facility and served to help people understand the situation of the damage across the whole city and the necessity of evacuation with optimum timing and routes. According to the evaluation by city staff through the experiments, information technology can support rational initial responses immediately after a large earthquake, although the systems used in the experiments still need some improvement.
NASA Astrophysics Data System (ADS)
Pásztor, László; Dobos, Endre; Szabó, József; Bakacsi, Zsófia; Laborczi, Annamária
2013-04-01
There is ample evidence that demand for soil-related information is significant worldwide and still increasing. Soil maps were typically used for a long time to satisfy these demands. With the spread of GI technology, spatial soil information systems (SSIS) and digital soil mapping (DSM) took over the role of traditional soil maps. Due to the relatively high costs of data collection, new conventional soil surveys and inventories are becoming less and less frequent, which increases the value of legacy soil information and of the systems serving its digitally processed version. The existing data contain a wealth of information that can be exploited by proper methodology. Not only the degree of current needs for soil information has changed but also its nature. Traditionally the agricultural functions of soils were focussed on, which was also reflected in the methodology of data collection and mapping. Recently the multifunctionality of soils has been gaining more and more ground; consequently, information related to additional functions of soils is becoming equally important. The new types of information requirements, however, cannot generally be fulfilled with new data collections, at least not on such a level as was done in the frame of traditional soil surveys. Soil monitoring systems have been established for the collection of recent information on the various elements of the DPSIR (Driving Forces-Pressures-State-Impacts-Responses) framework, but the primary goal of these systems has not necessarily been mapping; this is definitely the case for the two currently operating Hungarian soil monitoring systems. In Hungary, soil data requirements are presently fulfilled with the recently available datasets, either by their direct usage or after certain specific and generally fortuitous thematic and/or spatial inference. Due to the more and more frequently emerging discrepancies between the available and the expected data, there may be notable imperfections in the accuracy and reliability of the delivered products. Since, similarly to the great majority of the world, large-scale, comprehensive new surveys cannot be expected in the near future, the actually available legacy data should be relied on. With a recently started project we would like to significantly extend the ways in which countrywide soil information requirements can be satisfied. In the frame of our project we plan the execution of spatial and thematic data mining of a significant amount of soil-related information available in the form of legacy soil data as well as digital databases and spatial soil information systems. In the course of the analyses we will rely on auxiliary spatial data themes related to environmental elements. Based on the established relationships we will convert and integrate the specific data sets for the regionalization of the various derived soil parameters. With the aid of GIS and geostatistical tools we will carry out the spatial extension of certain pedological variables featuring the state (including degradation), processes or functions of soils. We plan to compile digital soil maps which optimally fulfil the national and international demands in terms of thematic, spatial and temporal accuracy. The targeted spatial resolution of the proposed countrywide, digital, thematic soil property and function maps is at least 1:50,000 (approximately 50-100 meter raster). Our ambitious objective is the definitive solution of the regionalization of the information collected in the frame of two recent, contemporary, national, systematic soil data collections (not designed for mapping purposes) on the recent state of soils, in order to produce countrywide maps providing a spatial inventory of certain soil properties, processes and functions with sufficient accuracy and reliability.
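As a minimal sketch of the regionalization step described above, the snippet below predicts a soil property at unsampled grid cells from environmental covariates with a random-forest regressor, one common digital-soil-mapping approach, though not necessarily the method the project will adopt. The covariate names, data values and target property are placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Placeholder legacy soil observations; covariates are elevation, slope and NDVI.
n_obs = 300
X = np.column_stack([
    rng.uniform(80, 400, n_obs),      # elevation (m)
    rng.uniform(0, 15, n_obs),        # slope (degrees)
    rng.uniform(0.1, 0.9, n_obs),     # NDVI
])
# Synthetic soil organic matter (%) loosely driven by the covariates, plus noise.
som = 1.5 + 0.004 * X[:, 0] - 0.05 * X[:, 1] + 3.0 * X[:, 2] + rng.normal(0, 0.3, n_obs)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, som)

# Predict onto a (placeholder) covariate raster covering the mapping area.
grid = np.column_stack([
    rng.uniform(80, 400, 5), rng.uniform(0, 15, 5), rng.uniform(0.1, 0.9, 5)])
print(np.round(model.predict(grid), 2))   # predicted SOM (%) for five grid cells
```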
eRegistries: Electronic registries for maternal and child health.
Frøen, J Frederik; Myhre, Sonja L; Frost, Michael J; Chou, Doris; Mehl, Garrett; Say, Lale; Cheng, Socheat; Fjeldheim, Ingvild; Friberg, Ingrid K; French, Steve; Jani, Jagrati V; Kaye, Jane; Lewis, John; Lunde, Ane; Mørkrid, Kjersti; Nankabirwa, Victoria; Nyanchoka, Linda; Stone, Hollie; Venkateswaran, Mahima; Wojcieszek, Aleena M; Temmerman, Marleen; Flenady, Vicki J
2016-01-19
The Global Roadmap for Health Measurement and Accountability sees integrated systems for health information as key to obtaining seamless, sustainable, and secure information exchanges at all levels of health systems. The Global Strategy for Women's, Children's and Adolescent's Health aims to achieve a continuum of quality of care with effective coverage of interventions. The WHO and World Bank recommend that countries focus on intervention coverage to monitor programs and progress for universal health coverage. Electronic health registries - eRegistries - represent integrated systems that secure a triple return on investments: First, effective single data collection for health workers to seamlessly follow individuals along the continuum of care and across disconnected cadres of care providers. Second, real-time public health surveillance and monitoring of intervention coverage, and third, feedback of information to individuals, care providers and the public for transparent accountability. This series on eRegistries presents frameworks and tools to facilitate the development and secure operation of eRegistries for maternal and child health. In this first paper of the eRegistries Series we have used WHO frameworks and taxonomy to map how eRegistries can support commonly used electronic and mobile applications to alleviate health systems constraints in maternal and child health. A web-based survey of public health officials in 64 low- and middle-income countries, and a systematic search of literature from 2005-2015, aimed to assess country capacities by the current status, quality and use of data in reproductive health registries. eRegistries can offer support for the 12 most commonly used electronic and mobile applications for health. Countries are implementing health registries in various forms, the majority in transition from paper-based data collection to electronic systems, but very few have eRegistries that can act as an integrating backbone for health information. More mature country capacity reflected by published health registry based research is emerging in settings reaching regional or national scale, increasingly with electronic solutions. 66 scientific publications were identified based on 32 registry systems in 23 countries over a period of 10 years; this reflects a challenging experience and capacity gap for delivering sustainable high quality registries. Registries are being developed and used in many high burden countries, but their potential benefits are far from realized as few countries have fully transitioned from paper-based health information to integrated electronic backbone systems. Free tools and frameworks exist to facilitate progress in health information for women and children.
Heslop, Carl William; Burns, Sharyn; Lobo, Roanna; McConigley, Ruth
2017-01-01
Introduction There is limited research examining community-based or multilevel interventions that address the sexual health of young people in the rural Australian context. This paper describes the Participatory Action Research (PAR) project that will develop and validate a framework that is effective for planning, implementing and evaluating multilevel community-based sexual health interventions for young people aged 16–24 years in the Australian rural setting. Methods and analysis To develop a framework for sexual health interventions with stakeholders, PAR will be used. Three PAR cycles will be conducted, using semistructured one-on-one interviews, focus groups, community mapping and photovoice to inform the development of a draft framework. Cycle 2 and Cycle 3 will use targeted Delphi studies to gather evaluation and feedback on the developed draft framework. All data collected will be reviewed and analysed in detail and coded as concepts become apparent at each stage of the process. Ethics and dissemination This protocol describes a supervised doctoral research project. This project seeks to contribute to the literature regarding PAR in the rural setting and the use of the Delphi technique within PAR projects. The developed framework as a result of the project will provide a foundation for further research testing the application of the framework in other settings and health areas. This research has received ethics approval from the Curtin University Human Research and Ethics Committee (HR96/2015). PMID:28559453
Park, Hyeone; Higgs, Eric
2018-02-02
Food forestry is a burgeoning practice in North America, representing a strong multifunctional approach that combines agriculture, forestry, and ecological restoration. The Galiano Conservancy Association (GCA), a community conservation, restoration, and educational organization on Galiano Island, British Columbia, Canada, has recently created two food forests on their protected forested lands: one with primarily non-native species and the other comprising native species. These projects, aimed at food production, education, and promotion of local food security and sustainability, are also intended to contribute to the overall ecological integrity of the landscape. Monitoring is essential for assessing how effectively a project is meeting its goal and thus informing its adaptive management. Yet, presently, there are no comprehensive monitoring frameworks for food forestry available. To fill this need, this study developed a generic Criteria and Indicators (C&I) monitoring framework for food forestry, embedded in ecological restoration principles, by employing qualitative content analysis of 61 literature resources and semi-structured interviews with 16 experts in the fields of food forestry and ecological restoration. The generic C&I framework comprises 14 criteria, 39 indicators, and 109 measures and is intended to guide a comprehensive and systematic assessment of food forest projects. The GCA adapted the generic C&I framework to develop a customized monitoring framework. The Galiano C&I monitoring framework has a comprehensive suite of monitoring parameters, which collectively address multiple values and goals.
Advanced Computational Framework for Environmental Management ZEM, Version 1.x
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vesselinov, Velimir V.; O'Malley, Daniel; Pandey, Sachin
2016-11-04
Typically, environmental management problems require analysis of large and complex data sets originating from concurrent data streams with different data collection frequencies and pedigree. These big data sets require on-the-fly integration into a series of models with different complexity for various types of model analyses where the data are applied as soft and hard model constraints. This is needed to provide fast iterative model analyses based on the latest available data to guide decision-making. Furthermore, the data and model are associated with uncertainties. The uncertainties are probabilistic (e.g. measurement errors) and non-probabilistic (unknowns, e.g. alternative conceptual models characterizing site conditions). To address all of these issues, we have developed an integrated framework for real-time data and model analyses for environmental decision-making called ZEM. The framework allows for seamless and on-the-fly integration of data and modeling results for robust and scientifically-defensible decision-making, applying advanced decision analysis tools such as Bayesian Information-Gap Decision Theory (BIG-DT). The framework also includes advanced methods for optimization that are capable of dealing with a large number of unknown model parameters, and surrogate (reduced order) modeling capabilities based on support vector regression techniques. The framework is coded in Julia, a state-of-the-art high-performance programming language (http://julialang.org). The ZEM framework is open source, released under the GPL v3 license, and can be applied to any environmental management site.
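ZEM itself is written in Julia; as a language-agnostic illustration of the surrogate-modeling idea mentioned above (support vector regression standing in for an expensive simulation), here is a small Python/scikit-learn sketch. The "expensive model" and all parameter choices are made up for illustration.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def expensive_model(x):
    """Stand-in for a costly environmental simulation (illustrative only)."""
    return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1]) + 0.1 * x[:, 1]

rng = np.random.default_rng(0)
X_train = rng.uniform(0, 2, size=(60, 2))          # a modest number of expensive runs
y_train = expensive_model(X_train)

# Fit the reduced-order surrogate on the expensive runs.
surrogate = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
surrogate.fit(X_train, y_train)

# Cheap surrogate evaluations replace further expensive runs, e.g. inside optimization.
X_test = rng.uniform(0, 2, size=(1000, 2))
err = np.abs(surrogate.predict(X_test) - expensive_model(X_test))
print(f"mean absolute surrogate error: {err.mean():.3f}")
```

In a decision-analysis loop, the surrogate is what gets queried thousands of times, with the full model reserved for refining the surrogate where its error matters.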
Optimization-Based Management of Energy Systems
2011-05-11
Only fragments of this report are recoverable from the extracted text: a fuel consumption versus power figure (gal/h versus kW), an energy management framework for dealing with uncertainties and its test cases, 'customizable' power quality and reliability, and seamless transition between islanded and off-grid operation; the remainder is standard report-documentation-page boilerplate.
2012-01-01
Only fragments of this report are recoverable from the extracted text: a reprint titled "Graphene Sheets Stabilized on Genetically Engineered M13 Viral Templates as Conducting" (title truncated) and reference-list entries (e.g., Phage Display: A Laboratory Manual); the remainder is standard report-documentation-page boilerplate.
2016-12-06
Only fragments of this report are recoverable from the extracted text: ship routing direction and speed based on cost minimization and best estimated time of arrival (ETA), and a case study investigating the optimal routing of aged… (text truncated); the remainder is standard report-documentation-page boilerplate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, Troy Michael; Kress, Joel David; Bhat, Kabekode Ghanasham
Year 1 Objectives (August 2016 – December 2016) – The original Independence model is a sequentially regressed set of parameters from numerous data sets in the Aspen Plus modeling framework. The immediate goal with the basic data model is to collect and evaluate those data sets relevant to the thermodynamic submodels (pure substance heat capacity, solvent mixture heat capacity, loaded solvent heat capacities, and volatility data). These data are informative for the thermodynamic parameters involved in both vapor-liquid equilibrium, and in the chemical equilibrium of the liquid phase.
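A minimal sketch of the kind of thermodynamic submodel regression described above: fitting a polynomial heat-capacity correlation, Cp(T) = a + bT + cT^2, to pooled data sets by least squares. The temperature and heat-capacity values are synthetic placeholders, not the Independence-model data.

```python
import numpy as np

# Synthetic (T in K, Cp in J/mol-K) points standing in for pooled literature data sets.
T = np.array([300, 320, 340, 360, 380, 400, 420], dtype=float)
Cp = np.array([82.1, 84.0, 86.2, 88.1, 90.3, 92.2, 94.5])

# Least-squares fit of Cp(T) = a + b*T + c*T^2.
A = np.column_stack([np.ones_like(T), T, T**2])
coef, *_ = np.linalg.lstsq(A, Cp, rcond=None)
a, b, c = coef
print(f"Cp(T) ~ {a:.2f} + {b:.4f}*T + {c:.2e}*T^2")
print("Cp(350 K) ~", round(a + b * 350 + c * 350**2, 1), "J/mol-K")
```

In the actual workflow these fitted coefficients would feed the vapor-liquid and chemical equilibrium submodels rather than being used in isolation.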
A Bayesian Machine Learning Model for Estimating Building Occupancy from Open Source Data
Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.; ...
2016-01-01
Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft2 for over 50 building types at the national and sub-national level with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, carrying out Bayesian analytics as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.
Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft2 for over 50 building types at the national and sub-national level with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, carrying out Bayesian analytics as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
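A toy illustration of the Bayesian harmonization idea behind PDT: a prior over ambient occupancy density for one building type is updated with (synthetic) survey observations while retaining posterior uncertainty. The conjugate normal-normal formulation and all numbers are assumptions for illustration, not the PDT model itself.

```python
import numpy as np

def update_occupancy(prior_mean, prior_sd, observations, obs_sd):
    """Conjugate normal-normal update for occupancy density (people per 1000 ft^2)."""
    obs = np.asarray(observations, dtype=float)
    prec = 1.0 / prior_sd**2 + len(obs) / obs_sd**2
    post_var = 1.0 / prec
    post_mean = post_var * (prior_mean / prior_sd**2 + obs.sum() / obs_sd**2)
    return post_mean, np.sqrt(post_var)

# Expert-judgment prior for, say, office buildings, plus a few survey observations.
prior_mean, prior_sd = 3.0, 1.5          # people / 1000 ft^2
surveys = [2.2, 2.8, 3.4, 2.5]           # synthetic field observations
post_mean, post_sd = update_occupancy(prior_mean, prior_sd, surveys, obs_sd=1.0)
print(f"posterior occupancy: {post_mean:.2f} +/- {post_sd:.2f} people / 1000 ft^2")
```

The posterior standard deviation is what carries the "retained uncertainty" described in the abstract forward into downstream loss or population models.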
Kushniruk, A. W.; Patel, V. L.; Cimino, J. J.
1997-01-01
This paper describes an approach to the evaluation of health care information technologies based on usability engineering and a methodological framework from the study of medical cognition. The approach involves collection of a rich set of data including video recording of health care workers as they interact with systems, such as computerized patient records and decision support tools. The methodology can be applied in the laboratory setting, typically involving subjects "thinking aloud" as they interact with a system. A similar approach to data collection and analysis can also be extended to study of computer systems in the "live" environment of hospital clinics. Our approach is also influenced from work in the area of cognitive task analysis, which aims to characterize the decision making and reasoning of subjects of varied levels of expertise as they interact with information technology in carrying out representative tasks. The stages involved in conducting cognitively-based usability analyses are detailed and the application of such analysis in the iterative process of system and interface development is discussed. PMID:9357620
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crossno, Patricia Joyce; Dunlavy, Daniel M.; Stanton, Eric T.
This report is a summary of the accomplishments of the 'Scalable Solutions for Processing and Searching Very Large Document Collections' LDRD, which ran from FY08 through FY10. Our goal was to investigate scalable text analysis; specifically, methods for information retrieval and visualization that could scale to extremely large document collections. Towards that end, we designed, implemented, and demonstrated a scalable framework for text analysis - ParaText - as a major project deliverable. Further, we demonstrated the benefits of using visual analysis in text analysis algorithm development, improved performance of heterogeneous ensemble models in data classification problems, and the advantages of information theoretic methods in user analysis and interpretation in cross language information retrieval. The project involved 5 members of the technical staff and 3 summer interns (including one who worked two summers). It resulted in a total of 14 publications, 3 new software libraries (2 open source and 1 internal to Sandia), several new end-user software applications, and over 20 presentations. Several follow-on projects have already begun or will start in FY11, with additional projects currently in proposal.
Holden, Richard J.; McDougald Scott, Amanda M.; Hoonakker, Peter L.T.; Hundt, Ann S.; Carayon, Pascale
2014-01-01
Purpose Collecting information about health and disease directly from patients can be fruitfully accomplished using contextual approaches, ones that combine more and less structured methods in home and community settings. This paper's purpose is to describe and illustrate a framework of the challenges of contextual data collection. Methods A framework is presented based on prior work in community-based participatory research and organizational science, comprised of ten types of challenges across four broader categories. Illustrations of challenges and suggestions for addressing them are drawn from two mixed-method, contextual studies of patients with chronic disease in two regions of the US. Results The first major category of challenges was concerned with the researcher-participant partnership, for example, the initial lack of mutual trust and understanding between researchers, patients, and family members. The second category concerned patient characteristics such as cognitive limitations and a busy personal schedule that created barriers to successful data collection. The third concerned research logistics and procedures such as recruitment, travel distances, and compensation. The fourth concerned scientific quality and interpretation, including issues of validity, reliability, and combining data from multiple sources. The two illustrative studies faced both common and diverse research challenges and used many different strategies to address them. Conclusion Collecting less structured data from patients and others in the community is potentially very productive but requires the anticipation, avoidance, or negotiation of various challenges. Future work is necessary to better understand these challenges across different methods and settings, as well as to test and identify strategies to address them. PMID:25154464
Review of Building Data Frameworks across Countries: Lessons for India
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iyer, Maithili; Stratton, Hannah; Mathew, Sangeeta
The report outlines the initial explorations carried out by LBNL on available examples of energy data collection frameworks for buildings. Specifically, this monograph deals with European experience in the buildings sector, the US experience in the commercial buildings sector, and examples of data collection effort in Singapore and China to capture the Asian experience in the commercial sector. The review also provides a summary of the past efforts in India to collect and use commercial building energy data and its strengths and weaknesses. The overall aim of this activity is to help understand the use cases that drive the granularity of data being collected and the range of methodologies adopted for the data collection effort. This review is a key input and reference for developing a data collection framework for India, and also clarifies general thinking on the institutional structure that may be amenable for a data collection effort to match the needs and requirements of the commercial building sector in India.
NASA Astrophysics Data System (ADS)
Bernknopf, R.; Kuwayama, Y.; Brookshire, D.; Macauley, M.; Zaitchik, B.; Pesko, S.; Vail, P.
2014-12-01
Determining how much to invest in earth observation technology depends in part on the value of information (VOI) that can be derived from the observations. We design a framework and then evaluate the value-in-use of the NASA Gravity Recovery and Climate Experiment (GRACE) for regional water use and reliability in the presence of drought. As a technology that allows measurement of water storage, the GRACE Data Assimilation System (DAS) provides information that is qualitatively different from that generated by other water data sources. It provides a global, reproducible grid of changes in surface and subsurface water resources on a frequent and regular basis. Major damages from recent events such as the 2012 Midwest drought and the ongoing drought in California motivate the need to understand the VOI from remotely sensed data such as that derived from GRACE DAS. Our conceptual framework models a dynamic risk management problem in agriculture. We base the framework on information from stakeholders and subject experts. The economic case for GRACE DAS involves providing better water availability information. In the model, individuals have a "willingness to pay" (wtp) for GRACE DAS - essentially, wtp is an expression of savings in reduced agricultural input costs and for costs that are influenced by regional policy decisions. Our hypothesis is that improvements in decision making can be achieved with GRACE DAS measurements of water storage relative to data collected from groundwater monitoring wells and soil moisture monitors that would be relied on in the absence of GRACE DAS. The VOI is estimated as a comparison of outcomes. The California wine grape industry has features that allow it to be a good case study and a basis for extrapolation to other economic sectors. We model water use in this sector as a sequential decision highlighting the attributes of GRACE DAS input as information for within-season production decisions as well as for longer-term water reliability.
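To make the value-of-information logic concrete, here is a toy decision-analytic calculation in the spirit of the framework described above: a grower chooses an irrigation strategy with and without a water-storage signal, and VOI is the difference in expected payoff. The states, payoffs, and signal accuracy are invented placeholders, not figures from the study.

```python
"""Toy VOI calculation: compare the best decision under the prior with the
best decision conditioned on an imperfect signal. All numbers are invented."""

p_drought = 0.3                              # prior belief about the season
payoff = {                                   # profit by (action, state), arbitrary units
    ("irrigate_heavy", "drought"): 80, ("irrigate_heavy", "normal"): 60,
    ("irrigate_light", "drought"): 20, ("irrigate_light", "normal"): 90,
}
signal_accuracy = 0.85                       # P(signal matches the true state)
actions = ("irrigate_heavy", "irrigate_light")

def expected_payoff(action, p_d):
    return p_d * payoff[(action, "drought")] + (1 - p_d) * payoff[(action, "normal")]

# Without the information source: pick the single best action under the prior.
ev_without = max(expected_payoff(a, p_drought) for a in actions)

# With the information source: condition the decision on the observed signal.
ev_with = 0.0
for signal in ("drought", "normal"):
    if signal == "drought":
        p_signal = signal_accuracy * p_drought + (1 - signal_accuracy) * (1 - p_drought)
        p_d_given_s = signal_accuracy * p_drought / p_signal      # Bayes update
    else:
        p_signal = signal_accuracy * (1 - p_drought) + (1 - signal_accuracy) * p_drought
        p_d_given_s = (1 - signal_accuracy) * p_drought / p_signal
    ev_with += p_signal * max(expected_payoff(a, p_d_given_s) for a in actions)

print(f"VOI = {ev_with - ev_without:.2f} (expected payoff gain per season)")
```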
Building Assured Systems Framework
2010-09-01
of standards such as ISO 27001 as frameworks [NASCIO 2009]. In this context, a framework is a standard intended to assist in auditing and compliance... Information Security; ISO/IEC 27004, Information technology – Security techniques – Information security management measurement; ISO/IEC 15939, System and
Hierarchical Representation Learning for Kinship Verification.
Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul
2017-01-01
Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship-cues. The visual stimuli presented to the participants determine their ability to recognize kin relationship using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
Valseth, Kristen J.; Delzer, Gregory C.; Price, Curtis V.
2018-03-21
The U.S. Geological Survey, in cooperation with the City of Sioux Falls, South Dakota, began developing a groundwater-flow model of the Big Sioux aquifer in 2014 that will enable the City to make more informed water management decisions, such as delineation of areas of the greatest specific yield, which is crucial for locating municipal wells. Innovative tools are being evaluated as part of this study that can improve the delineation of the hydrogeologic framework of the aquifer for use in development of a groundwater-flow model, and the approach could have transfer value for similar hydrogeologic settings. The first step in developing a groundwater-flow model is determining the hydrogeologic framework (vertical and horizontal extents of the aquifer), which typically is determined by interpreting geologic information from drillers’ logs and surficial geology maps. However, well and borehole data only provide hydrogeologic information for a single location; conversely, nearly continuous geophysical data are collected along flight lines using airborne electromagnetic (AEM) surveys. These electromagnetic data are collected every 3 meters along a flight line (on average) and subsequently can be related to hydrogeologic properties. AEM data, coupled with and constrained by well and borehole data, can substantially improve the accuracy of aquifer hydrogeologic framework delineations and result in better groundwater-flow models. AEM data were acquired using the Resolve frequency-domain AEM system to map the Big Sioux aquifer in the region of the city of Sioux Falls. The survey acquired more than 870 line-kilometers of AEM data over a total area of about 145 square kilometers, primarily over the flood plain of the Big Sioux River between the cities of Dell Rapids and Sioux Falls. The U.S. Geological Survey inverted the survey data to generate resistivity-depth sections that were used in two-dimensional maps and in three-dimensional volumetric visualizations of the Earth resistivity distribution. Contact lines were drawn using a geographic information system to delineate interpreted geologic stratigraphy. The contact lines were converted to points and then interpolated into a raster surface. The methods used to develop elevation and depth maps of the hydrogeologic framework of the Big Sioux aquifer are described herein. The final AEM interpreted aquifer thickness ranged from 0 to 31 meters with an average thickness of 12.8 meters. The estimated total volume of the aquifer was 1,060,000,000 cubic meters based on the assumption that the top of the aquifer is the land-surface elevation. A simple calculation of the volume (length times width times height) of a previous delineation of the aquifer estimated the aquifer volume at 378,000,000 cubic meters; thus, the estimation based on AEM data is more than twice the previous estimate. The depth to top of Sioux Quartzite, which ranged in depth from 0 to 90 meters, also was delineated from the AEM data.
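The volume arithmetic behind these estimates can be pictured with a short sketch: integrating an interpreted thickness raster over its cell footprints versus the simple length × width × height box calculation. The grid values and cell size below are synthetic stand-ins, not the actual AEM products.

```python
"""Compare a raster-integrated aquifer volume with a single-box approximation.
The thickness grid, cell size, and box dimensions are synthetic examples."""
import numpy as np

cell_size_m = 100.0                                      # raster cell edge length (assumed)
rng = np.random.default_rng(0)
thickness_m = rng.uniform(0.0, 31.0, size=(120, 100))    # interpreted thickness grid

# Raster-based volume: thickness integrated over every cell's footprint.
volume_raster_m3 = float(thickness_m.sum() * cell_size_m ** 2)

# Simple box approximation: one average thickness over one rectangular footprint.
volume_box_m3 = 12000.0 * 10000.0 * 12.8                 # length x width x mean thickness (m)

print(f"raster-integrated volume: {volume_raster_m3:,.0f} m^3")
print(f"simple box estimate:      {volume_box_m3:,.0f} m^3")
```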
NASA Astrophysics Data System (ADS)
Paudyal, D. R.; McDougall, K.; Apan, A.
2012-07-01
The participation and engagement of grass-root level community groups and citizens for natural resource management (NRM) has a long history. With recent developments in ICT tools and spatial technology, these groups are seeking new opportunities to manage natural resource data. There is a lot of spatial information collected/generated by landcare groups, land holders and other community groups at the grass-root level through their volunteer initiatives. State government organisations are also interested in gaining access to this spatial data/information and engaging these groups to collect spatial information under their mapping programs. The aim of this paper is to explore the possible utilisation of volunteered geographic information (VGI) for catchment management activities. This research paper discusses the importance of spatial information and spatial data infrastructure (SDI) for catchment management and the emergence of VGI. A conceptual framework has been developed to illustrate how these emerging spatial information applications and various community volunteer activities can contribute to more inclusive SDI development at the local level. A survey of 56 regional NRM bodies in Australia was utilised to explore the current community-driven volunteer initiatives for NRM activities and the potential utilisation of VGI initiatives in the NRM decision-making process. This research paper concludes that VGI activities have great potential to contribute to SDI development at the community level to achieve better NRM outcomes.
A framework for automatic information quality ranking of diabetes websites.
Belen Sağlam, Rahime; Taskaya Temizel, Tugba
2015-01-01
Objective: When searching for particular medical information on the internet, the challenge lies in distinguishing the websites that are relevant to the topic and contain accurate information. In this article, we propose a framework that automatically identifies and ranks diabetes websites according to their relevance and information quality based on the website content. Design: The proposed framework ranks diabetes websites according to their content quality, relevance and evidence based medicine. The framework combines information retrieval techniques with a lexical resource based on SentiWordNet, making it possible to work with biased and untrusted websites while, at the same time, ensuring the content relevance. Measurement: The evaluation measurements used were Pearson correlation, true positives, false positives and accuracy. We tested the framework with a benchmark data set consisting of 55 websites with varying degrees of information quality problems. Results: The proposed framework gives good results that are comparable with the non-automated information quality measuring approaches in the literature. The correlation between the results of the proposed automated framework and ground truth is 0.68 on average with p < 0.001, which is greater than that of the other automated methods proposed in the literature (average r score of 0.33).
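A hedged sketch of the general idea follows: a relevance score and a lexicon-derived quality score are combined into one ranking and checked against ground-truth ratings with Pearson correlation. The scores, the 0.5 weighting, and the five-site list are invented; the real framework derives its signals from retrieval models and SentiWordNet.

```python
"""Illustrative ranking of health websites by a combined relevance/quality score,
evaluated against ground-truth ratings with Pearson correlation. All values are
synthetic placeholders."""
import numpy as np

relevance = np.array([0.91, 0.45, 0.78, 0.60, 0.30])      # content relevance per site
quality = np.array([0.70, 0.80, 0.40, 0.65, 0.20])        # lexicon-derived quality per site
ground_truth = np.array([0.85, 0.55, 0.50, 0.60, 0.15])   # hypothetical expert ratings

w = 0.5                                      # assumed weighting between the two signals
combined = w * relevance + (1 - w) * quality
ranking = np.argsort(-combined)              # best site first

r = np.corrcoef(combined, ground_truth)[0, 1]
print("ranking (site indices):", ranking.tolist())
print(f"Pearson r against ground truth: {r:.2f}")
```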
Ouimet, Mathieu; Lavis, John N; Léon, Grégory; Ellen, Moriah E; Bédard, Pierre-Olivier; Grimshaw, Jeremy M; Gagnon, Marie-Pierre
2014-10-09
This protocol builds on the development of a) a framework that identified the various supports (i.e. positions, activities, interventions) that a healthcare organisation or health system can implement for evidence-informed decision-making (EIDM) and b) a qualitative study that showed the current mix of supports that some Canadian healthcare organisations have in place and the ones that are perceived to facilitate the use of research evidence in decision-making. Based on these findings, we developed a web survey to collect cross-sectional data about the specific supports that regional health authorities and hospitals in two Canadian provinces (Ontario and Quebec) have in place to facilitate EIDM. This paper describes the methods for a cross-sectional web survey among 32 regional health authorities and 253 hospitals in the provinces of Quebec and Ontario (Canada) to collect data on the current mix of organisational supports that these organisations have in place to facilitate evidence-informed decision-making. The data will be obtained through a two-step survey design: a 10-min survey among CEOs to identify key units and individuals in regard to our objectives (step 1) and a 20-min survey among managers of the key units identified in step 1 to collect information about the activities performed by their unit regarding the acquisition, assessment, adaptation and/or dissemination of research evidence in decision-making (step 2). The study will target three types of informants: CEOs, library/documentation centre managers and all other key managers whose unit is involved in the acquisition, assessment, adaptation/packaging and/or dissemination of research evidence in decision-making. We developed an innovative data collection system to increase the likelihood that only the best-informed respondent available answers each survey question. The reporting of the results will be done using descriptive statistics of supports by organisation type and by province. This study will be the first to collect and report large-scale cross-sectional data on the current mix of supports health system organisations in the two most populous Canadian provinces have in place for evidence-informed decision-making. The study will also provide useful information to researchers on how to collect organisation-level data with reduced risk of self-reporting bias.
The international spinal cord injury endocrine and metabolic function basic data set.
Bauman, W A; Biering-Sørensen, F; Krassioukov, A
2011-10-01
To develop the International Spinal Cord Injury (SCI) Endocrine and Metabolic Function Basic Data Set within the framework of the International SCI Data Sets that would facilitate consistent collection and reporting of basic endocrine and metabolic findings in the SCI population. International. The International SCI Endocrine and Metabolic Function Data Set was developed by a working group. The initial data set document was revised on the basis of suggestions from members of the Executive Committee of the International SCI Standards and Data Sets, the International Spinal Cord Society (ISCoS) Executive and Scientific Committees, American Spinal Injury Association (ASIA) Board, other interested organizations and societies, and individual reviewers. In addition, the data set was posted for 2 months on ISCoS and ASIA websites for comments. The final International SCI Endocrine and Metabolic Function Data Set contains questions on the endocrine and metabolic conditions diagnosed before and after spinal cord lesion. If available, information collected before injury is to be obtained only once, whereas information after injury may be collected at any time. These data include information on diabetes mellitus, lipid disorders, osteoporosis, thyroid disease, adrenal disease, gonadal disease and pituitary disease. The question of gonadal status includes stage of sexual development and that for females also includes menopausal status. Data will be collected for body mass index and for the fasting serum lipid profile. The complete instructions for data collection and the data sheet itself are freely available on the websites of ISCoS (http://www.iscos.org.uk) and ASIA (http://www.asia-spinalinjury.org).
Ecosystem Services and Climate Change Considerations for ...
Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and landuse change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework “iemWatersheds” has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES) that manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) that provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water
Knowledge Management in Taxonomy and Biostratigraphy using TaxonConcept Software
NASA Astrophysics Data System (ADS)
Klump, J.; Huber, R.; Goetz, S.
2005-12-01
The use of fossils to constrain age models for geological samples is not as straightforward as it might seem. Even though index fossils have been defined as biostratigraphic time markers, ambiguity arises from the synonymous use of taxonomic names. Progress in our understanding of the origin of certain species has sometimes led to substantial changes in the taxonomic classification of these organisms. TaxonConcept was created as part of the Stratigraphy.net initiative as a tool to manage taxonomic information and complex knowledge networks to help resolve taxonomic ambiguities in biostratigraphy. Its workflow is based on the principles of open nomenclature. Open nomenclature allows researchers to comment on the identification of a specimen which cannot be exactly determined and is frequently used in synonymy lists. The use of such synonymy lists in TaxonConcept makes it possible to work with taxonomic classifications that are uncertain, or where several versions exist. Every single taxonomic entity in TaxonConcept is recorded with its relevant citations in the literature. This enables the management of information on taxonomy. The members of working groups using TaxonConcept can record their opinion on the taxonomic classification of each taxon in the framework of open nomenclature and annotate it in free text. This managed and structured collection of taxonomic opinions is an example of knowledge management. Taxonomic opinions are otherwise dispersed throughout the literature, if recorded at all, and are only available to the specialist. Assembled as a collection, they represent our knowledge on the taxonomy of a certain group of organisms. In the terminology of computer science, the semantic relationships between taxonomic terms form an ontology. Open nomenclature offers a formal framework that lends itself very well to describing the nature of the relations between taxonomic terms. The use of such synonymy lists in a taxonomic information system allows interesting search options, ranging from tracking name changes to the investigation of complex taxonomic topologies. In addition to its synonymy and literature management, TaxonConcept can store many other information categories such as textual descriptions (e.g. diagnoses and comments), images, bioevents, and specimen and collection data. Ecological information is scheduled for a later stage of the project. TaxonConcept is already linked to taxon names in paleoenvironmental data of the World Data Center for Marine Environmental Sciences (WDC-MARE); interfaces to other databases are planned. WDC-MARE stores environmental, marine and geological research data and frequently uses taxon names in its parameters. By linking TaxonConcept and WDC-MARE, synonymous names can be included in queries, e.g. when searching for stable isotope data measured on microfossils. TaxonConcept is not a project on authoritative taxonomic information, but a tool that taxonomic projects can use to find a taxonomic consensus, e.g. to define a taxonomic framework for biostratigraphic studies. Both the project-specific hierarchical classification of selected taxa and a project-specific selection of any other information categories are supported by TaxonConcept. The results of such a taxonomic consensus can be used to create Fossilium Catalogus style summaries in various output formats, which can later be used to create online or print publications.
Magny, J; Reveillère, C
2011-09-01
With the objective of coordinating the actions of the different partners whose mission involves childhood protection measures, and to allow convergence of preoccupying information toward a centralized unit, law no. 2007-293 of 5 March 2007 reforming child protection requires the creation of a departmental cell for the collection, processing, and assessment of preoccupying information (cellule départementale de recueil, de traitement et d'évaluation des informations préoccupantes, CRIP) on the circumstances of a minor in danger or at risk of being so. The CRIP 75 is a multidisciplinary cell comprising an administrative pole, a socio-educational pole, and a medical health officer. Its mission is to participate in assessing preoccupying information and directing it appropriately, with a preference for treating situations within an administrative framework and with the parents' agreement. The public prosecutor is only called in when the recommended measures have not provided an adequate response to the danger. Situations that are a matter for prosecution as a criminal offence are transmitted directly to the public prosecutor's office, as are situations for which the social or medico-social services are unable to make an assessment. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
Foley, Margaret M; Glenn, Regina M; Meli, Peggy L; Scichilone, Rita A
2009-01-01
Introduction Health information management (HIM) professionals' involvement with disease classification and nomenclature in the United States can be traced back to the early 20th century. In 1914, Grace Whiting Myers, the founder of the association known today as the American Health Information Management Association (AHIMA), served on the Committee on Uniform Nomenclature, which developed a disease classification system based upon etiological groupings. The profession's expertise and leadership in the collection, classification, and reporting of health data have continued since then. For example, in the early 1960s, another HIM professional (a medical record librarian) served as the associate editor of the fifth edition of the Standard Nomenclature of Disease (SNDO), a forerunner of the widely used clinical terminology, Systematized Nomenclature of Medicine Clinical Terms (SNOMED-CT). During the same period in history, the medical record professionals working in hospitals throughout the country were responsible for manually collecting and reporting disease and procedure information from medical records using SNDO. Because coded data have played a pivotal role in the ability to record and share health information through the years, creating the appropriate policy framework for the graceful evolution and harmonization of classification systems and clinical terminologies is essential. PMID:20169015
Evaluating Health Information Systems Using Ontologies
Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan
2016-01-01
Background There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. Objectives The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems—whether similar or heterogeneous—by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. Methods On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. Results The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. Conclusions The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems. PMID:27311735
Evaluating Health Information Systems Using Ontologies.
Eivazzadeh, Shahryar; Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan
2016-06-16
There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems-whether similar or heterogeneous-by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems.
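The ontology-style aggregation at the heart of UVON can be pictured with a small sketch: quality attributes gathered from several systems sit under broader nodes in a tree, and evaluation aspects are read off at a chosen level of specificity and capped at a maximum count. The tree and attribute names below are invented examples, not the FI-STAR ontology.

```python
"""Toy tree-style aggregation of quality attributes into evaluation aspects,
read off at a requested specificity (depth) and capped at a maximum count."""

ontology = {
    "quality": {
        "usability": {"learnability": {}, "satisfaction": {}},
        "reliability": {"availability": {}, "fault tolerance": {}},
        "privacy": {"consent handling": {}, "data minimisation": {}},
    }
}

def aspects_at_depth(tree, depth, max_aspects):
    """Collect node names at the requested specificity level (tree depth)."""
    current = [("quality", tree["quality"])]
    for _ in range(depth):
        current = [(name, sub) for _, children in current for name, sub in children.items()]
    return [name for name, _ in current][:max_aspects]

print(aspects_at_depth(ontology, depth=1, max_aspects=5))   # broad evaluation aspects
print(aspects_at_depth(ontology, depth=2, max_aspects=4))   # more specific, capped at 4
```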
Judging nursing information on the world wide web.
Cader, Raffik
2013-02-01
The World Wide Web is increasingly becoming an important source of information for healthcare professionals. However, finding reliable information from unauthoritative Web sites to inform healthcare can pose a challenge to nurses. A study, using grounded theory, was undertaken in two phases to understand how qualified nurses judge the quality of Web nursing information. Data were collected using semistructured interviews and focus groups. An explanatory framework that emerged from the data showed that the judgment process involved the application of forms of knowing and modes of cognition to a range of evaluative tasks and depended on the nurses' critical skills, the time available, and the level of Web information cues. This article mainly focuses on the six evaluative tasks relating to assessing user-friendliness, outlook and authority of Web pages, and relationship to nursing practice; appraising the nature of evidence; and applying cross-checking strategies. The implications of these findings to nurse practitioners and publishers of nursing information are significant.
Jang, Kyungeun; Baek, Young Min
2018-03-20
Public health officials (PHOs) are responsible for providing trustworthy information during a public health crisis; however, there is little research on how the public behaves when their expectations for such information are violated. Drawing on media dependency theory and source credibility research as our primary theoretical framework, we tested how credibility of information from PHOs is associated with people's reliance on a particular communication channel in the context of the 2015 Middle East Respiratory Syndrome (MERS) outbreak in South Korea. Using nationally representative data (N = 1036) collected during the MERS outbreak, we found that less credible information from PHOs led to more frequent use of online news, interpersonal networks, and social media for acquiring MERS-related information. However, credibility of information from PHOs was not associated with the use of television news or print newspapers. The theoretical and practical implications of our results on communication channels usage are discussed.
ERIC Educational Resources Information Center
Klebansky, Anna; Fraser, Sharon P.
2013-01-01
This paper details a conceptual framework that situates curriculum design for information literacy and lifelong learning, through a cohesive developmental information literacy based model for learning, at the core of teacher education courses at UTAS. The implementation of the framework facilitates curriculum design that systematically,…
Supporting shared hypothesis testing in the biomedical domain.
Agibetov, Asan; Jiménez-Ruiz, Ernesto; Ondrésik, Marta; Solimando, Alessandro; Banerjee, Imon; Guerrini, Giovanna; Catalano, Chiara E; Oliveira, Joaquim M; Patanè, Giuseppe; Reis, Rui L; Spagnuolo, Michela
2018-02-08
Pathogenesis of inflammatory diseases can be tracked by studying the causality relationships among the factors contributing to its development. We could, for instance, hypothesize on the connections of the pathogenesis outcomes to the observed conditions. And to prove such causal hypotheses we would need to have the full understanding of the causal relationships, and we would have to provide all the necessary evidences to support our claims. In practice, however, we might not possess all the background knowledge on the causality relationships, and we might be unable to collect all the evidence to prove our hypotheses. In this work we propose a methodology for the translation of biological knowledge on causality relationships of biological processes and their effects on conditions to a computational framework for hypothesis testing. The methodology consists of two main points: hypothesis graph construction from the formalization of the background knowledge on causality relationships, and confidence measurement in a causality hypothesis as a normalized weighted path computation in the hypothesis graph. In this framework, we can simulate collection of evidences and assess confidence in a causality hypothesis by measuring it proportionally to the amount of available knowledge and collected evidences. We evaluate our methodology on a hypothesis graph that represents both contributing factors which may cause cartilage degradation and the factors which might be caused by the cartilage degradation during osteoarthritis. Hypothesis graph construction has proven to be robust to the addition of potentially contradictory information on the simultaneously positive and negative effects. The obtained confidence measures for the specific causality hypotheses have been validated by our domain experts, and, correspond closely to their subjective assessments of confidences in investigated hypotheses. Overall, our methodology for a shared hypothesis testing framework exhibits important properties that researchers will find useful in literature review for their experimental studies, planning and prioritizing evidence collection acquisition procedures, and testing their hypotheses with different depths of knowledge on causal dependencies of biological processes and their effects on the observed conditions.
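As a rough picture of how confidence might be scored over a hypothesis graph, the sketch below enumerates directed paths between a cause and an effect and reports the best mean edge weight, where edge weights stand for the strength of collected evidence. The graph, the weights, and this particular normalisation are illustrative choices, not the exact measure defined in the paper.

```python
"""Path-based confidence over a toy causality hypothesis graph. Edge weights
represent evidence strength; confidence is the best length-normalised path."""

edges = {
    ("inflammation", "enzyme_release"): 0.8,
    ("enzyme_release", "cartilage_degradation"): 0.6,
    ("inflammation", "cartilage_degradation"): 0.3,
    ("cartilage_degradation", "joint_pain"): 0.9,
}

def paths(src, dst, visited=()):
    """Enumerate simple directed paths from src to dst."""
    if src == dst:
        yield visited + (src,)
        return
    for a, b in edges:
        if a == src and b not in visited + (src,):
            yield from paths(b, dst, visited + (src,))

def confidence(src, dst):
    best = 0.0
    for p in paths(src, dst):
        weights = [edges[(a, b)] for a, b in zip(p, p[1:])]
        best = max(best, sum(weights) / len(weights))   # normalise by path length
    return best

print(f"confidence(inflammation -> joint_pain) = {confidence('inflammation', 'joint_pain'):.2f}")
```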
Ziraba, Abdhalah K; Haregu, Tilahun Nigatu; Mberu, Blessing
2016-01-01
The increase in solid waste generated per capita in Africa has not been accompanied by a commensurate growth in the capacity and funding to manage it. It is reported that less than 30% of urban waste in developing countries is collected and disposed of appropriately. The implications of poorly managed waste on health are numerous and depend on the nature of the waste, individuals exposed, duration of exposure and availability of interventions for those exposed. To present a framework for understanding the linkages between poor solid waste management, exposure and associated adverse health outcomes. The framework will aid understanding of the relationships, interlinkages and identification of the potential points for intervention. Development of the framework was informed by a review of literature on solid waste management policies, practices and their impact on health in developing countries. A configurative synthesis of literature was applied to develop the framework. Several iterations of the framework were reviewed by experts in the field. Each linkage and outcome is described in detail as an output of this study. The resulting framework identifies groups of people at a heightened risk of exposure and the potential health consequences. Using the iceberg metaphor, the framework illustrates the pathways and potential burden of ill-health related to solid waste that is hidden but rapidly unfolding with our inaction. The existing evidence on the linkage between poor solid waste management and adverse health outcomes calls for action by all stakeholders in understanding, prioritizing, and addressing the issue of solid waste in our midst to ensure that our environment and health are preserved. The framework developed in this study presents a clearer picture of the linkages between poor solid waste management and adverse health outcomes and could guide research, policy and action.
Modeling crowdsourcing as collective problem solving
NASA Astrophysics Data System (ADS)
Guazzini, Andrea; Vilone, Daniele; Donati, Camillo; Nardi, Annalisa; Levnajić, Zoran
2015-11-01
Crowdsourcing is a process of accumulating the ideas, thoughts or information from many independent participants, with the aim of finding the best solution for a given challenge. Modern information technologies allow a massive number of subjects to be involved in a more or less spontaneous way. Still, the full potential of crowdsourcing is yet to be reached. We introduce a modeling framework through which we study the effectiveness of crowdsourcing in relation to the level of collectivism in facing the problem. Our findings reveal an intricate relationship between the number of participants and the difficulty of the problem, indicating the optimal size of the crowdsourced group. We discuss our results in the context of modern utilization of crowdsourcing.
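A deliberately crude toy model can illustrate the size/difficulty trade-off the authors study: each participant independently solves a problem of given difficulty with some probability, coordination overhead grows with group size, and the best crowd size emerges from the balance. The functional forms and constants below are invented and are not the model from the paper.

```python
"""Toy crowd-size model: benefit of more solvers versus coordination overhead."""

def crowd_utility(n, difficulty, skill=0.6, overhead=0.01):
    p_individual = (1 - difficulty) * skill            # chance one person solves it
    p_crowd = 1 - (1 - p_individual) ** n              # someone in the crowd solves it
    return p_crowd - overhead * n                      # penalise coordination cost

for difficulty in (0.3, 0.7, 0.9):
    best_n = max(range(1, 201), key=lambda n: crowd_utility(n, difficulty))
    print(f"difficulty {difficulty}: best crowd size ~ {best_n}")
```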
Establishment of Low Energy Building materials and Equipment Database Based on Property Information
NASA Astrophysics Data System (ADS)
Kim, Yumin; Shin, Hyery; Lee, Seungeon
2018-03-01
The purpose of this study is to provide a reliable materials information portal service through the establishment of public big data, by collecting and integrating scattered low energy building materials and equipment data. Few existing low energy building material databases in Korea have provided material properties as factors influencing material pricing. The framework of the database was defined with reference to the Korea On-line E-procurement system. More than 45,000 records were gathered according to the specified entities, and price prediction models for chillers were proposed using the gathered data. To improve the usability of the prediction models, detailed properties should be analysed for each item.
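As an illustration of the price-prediction idea, the sketch below fits an ordinary least-squares model to a few hypothetical chiller properties. The feature names, units, and numbers are fabricated stand-ins for the database fields, not the study's actual model.

```python
"""Minimal property-based price model: ordinary least squares on synthetic
chiller attributes (capacity, COP, footprint) versus price."""
import numpy as np

# Columns: cooling capacity (kW), COP, footprint (m^2); target: price (million KRW).
X = np.array([[350, 5.1, 6.0], [500, 5.4, 7.5], [700, 5.8, 9.0],
              [900, 6.0, 11.0], [1200, 6.3, 13.5]], dtype=float)
y = np.array([48.0, 66.0, 88.0, 109.0, 140.0])

X1 = np.hstack([X, np.ones((X.shape[0], 1))])          # add intercept term
coeffs, *_ = np.linalg.lstsq(X1, y, rcond=None)

new_chiller = np.array([800, 5.9, 10.0, 1.0])          # hypothetical new item
print("coefficients:", np.round(coeffs, 3))
print(f"predicted price: {new_chiller @ coeffs:.1f} million KRW")
```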
Tuti, Timothy; Bitok, Michael; Paton, Chris; Makone, Boniface; Malla, Lucas; Muinga, Naomi; Gathara, David; English, Mike
2016-01-01
Objective To share approaches and innovations adopted to deliver a relatively inexpensive clinical data management (CDM) framework within a low-income setting that aims to deliver quality pediatric data useful for supporting research, strengthening the information culture and informing improvement efforts in local clinical practice. Materials and methods The authors implemented a CDM framework to support a Clinical Information Network (CIN) using Research Electronic Data Capture (REDCap), a noncommercial software solution designed for rapid development and deployment of electronic data capture tools. It was used for collection of standardized data from case records of multiple hospitals’ pediatric wards. R, an open-source statistical language, was used for data quality enhancement, analysis, and report generation for the hospitals. Results In the first year of CIN, the authors have developed innovative solutions to support the implementation of a secure, rapid pediatric data collection system spanning 14 hospital sites with stringent data quality checks. Data have been collated on over 37 000 admission episodes, with considerable improvement in clinical documentation of admissions observed. Using meta-programming techniques in R, coupled with branching logic, randomization, data lookup, and Application Programming Interface (API) features offered by REDCap, CDM tasks were configured and automated to ensure quality data was delivered for clinical improvement and research use. Conclusion A low-cost clinically focused but geographically dispersed quality CDM (Clinical Data Management) in a long-term, multi-site, and real world context can be achieved and sustained and challenges can be overcome through thoughtful design and implementation of open-source tools for handling data and supporting research. PMID:26063746
Tuti, Timothy; Bitok, Michael; Paton, Chris; Makone, Boniface; Malla, Lucas; Muinga, Naomi; Gathara, David; English, Mike
2016-01-01
To share approaches and innovations adopted to deliver a relatively inexpensive clinical data management (CDM) framework within a low-income setting that aims to deliver quality pediatric data useful for supporting research, strengthening the information culture and informing improvement efforts in local clinical practice. The authors implemented a CDM framework to support a Clinical Information Network (CIN) using Research Electronic Data Capture (REDCap), a noncommercial software solution designed for rapid development and deployment of electronic data capture tools. It was used for collection of standardized data from case records of multiple hospitals' pediatric wards. R, an open-source statistical language, was used for data quality enhancement, analysis, and report generation for the hospitals. In the first year of CIN, the authors have developed innovative solutions to support the implementation of a secure, rapid pediatric data collection system spanning 14 hospital sites with stringent data quality checks. Data have been collated on over 37 000 admission episodes, with considerable improvement in clinical documentation of admissions observed. Using meta-programming techniques in R, coupled with branching logic, randomization, data lookup, and Application Programming Interface (API) features offered by REDCap, CDM tasks were configured and automated to ensure quality data was delivered for clinical improvement and research use. A low-cost clinically focused but geographically dispersed quality CDM (Clinical Data Management) in a long-term, multi-site, and real world context can be achieved and sustained and challenges can be overcome through thoughtful design and implementation of open-source tools for handling data and supporting research. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
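The CIN work used REDCap together with R for quality checks and reporting; purely as an illustration of the data-export pattern, the Python sketch below posts a standard record-export request to a REDCap API endpoint and runs a trivial completeness check. The URL, token, and the admission_weight field are placeholders, not details of the actual network.

```python
"""Hedged sketch of exporting records from a REDCap project via its API and
running a simple completeness check. Replace the URL and token with your own
instance's values; the field name checked here is hypothetical."""
import requests

REDCAP_URL = "https://redcap.example.org/api/"          # placeholder instance URL
API_TOKEN = "REPLACE_WITH_PROJECT_TOKEN"                # placeholder project token

payload = {
    "token": API_TOKEN,
    "content": "record",          # export records
    "format": "json",
    "type": "flat",
}
response = requests.post(REDCAP_URL, data=payload, timeout=30)
response.raise_for_status()
records = response.json()

# A trivial quality check: flag admissions missing a hypothetical field.
missing = [r for r in records if not r.get("admission_weight")]
print(f"{len(records)} records exported, {len(missing)} missing admission_weight")
```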
Explanation Capabilities for Behavior-Based Robot Control
NASA Technical Reports Server (NTRS)
Huntsberger, Terrance L.
2012-01-01
A recent study that evaluated issues associated with remote interaction with an autonomous vehicle within the framework of grounding found that missing contextual information led to uncertainty in the interpretation of collected data, and so introduced errors into the command logic of the vehicle. As the vehicles became more autonomous through the activation of additional capabilities, more errors were made. This is an inefficient use of the platform, since the behavior of remotely located autonomous vehicles didn't coincide with the "mental models" of human operators. One of the conclusions of the study was that there should be a way for the autonomous vehicles to describe what action they choose and why. Robotic agents with enough self-awareness to dynamically adjust the information conveyed back to the Operations Center based on a detail level component analysis of requests could provide this description capability. One way to accomplish this is to map the behavior base of the robot into a formal mathematical framework called a cost-calculus. A cost-calculus uses composition operators to build up sequences of behaviors that can then be compared to what is observed using well-known inference mechanisms.
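One way to picture the cost-calculus idea is a toy composition of behaviors in which each behavior has a base cost, switching between behaviors adds a transition cost, and the cheapest ordering is reported together with the reason it was chosen. The behaviors, costs, and explanation format below are invented for the example, not the formalism used in the work described above.

```python
"""Toy cost-calculus over behavior sequences: base costs plus transition costs,
with a human-readable explanation of the chosen composition."""
from itertools import permutations

base_cost = {"avoid_obstacle": 2.0, "follow_waypoint": 1.0, "conserve_power": 0.5}
transition_cost = {                       # cost of switching from one behavior to the next
    ("avoid_obstacle", "follow_waypoint"): 0.2,
    ("follow_waypoint", "avoid_obstacle"): 0.8,
    ("follow_waypoint", "conserve_power"): 0.1,
    ("conserve_power", "follow_waypoint"): 0.6,
    ("avoid_obstacle", "conserve_power"): 0.3,
    ("conserve_power", "avoid_obstacle"): 0.9,
}

def sequence_cost(seq):
    return (sum(base_cost[b] for b in seq)
            + sum(transition_cost[(a, b)] for a, b in zip(seq, seq[1:])))

candidates = list(permutations(base_cost))
best = min(candidates, key=sequence_cost)
print("chosen composition:", " -> ".join(best))
print(f"explanation: lowest total cost {sequence_cost(best):.1f} "
      f"among {len(candidates)} candidate orderings")
```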
Human rights barriers for displaced persons in southern Sudan.
Pavlish, Carol; Ho, Anita
2009-01-01
This community-based research explores community perspectives on human rights barriers that women encounter in a postconflict setting of southern Sudan. An ethnographic design was used to guide data collection in five focus groups with community members and during in-depth interviews with nine key informants. A constant comparison method of data analysis was used. Atlas.ti data management software facilitated the inductive coding and sorting of data. Participants identified three formal and one set of informal community structures for human rights. Human rights barriers included shifting legal frameworks, doubt about human rights, weak government infrastructure, and poverty. The evolving government infrastructure cannot currently provide adequate human rights protection, especially for women. The nature of living in poverty without development opportunities includes human rights abuses. Good governance, protection, and human development opportunities were emphasized as priority human rights concerns. A human rights framework could serve as a powerful integrator of health and development work with community-based organizations. Results help nurses understand the intersection between health and human rights as well as approaches to advancing rights in a culturally attuned manner.
Engaging the Voice of Patients Affected by Gender-Based Violence: Informing Practice and Policy.
Lewis-O'Connor, Annie; Chadwick, Mardi
2015-01-01
Evidence regarding the benefits, opportunities, and risks associated with providing health care to patients experiencing gender-based violence (GBV) and, moreover, their satisfaction with health care services is sparse. Using a patient- and trauma-informed relationship-based framework, survivors of GBV who were referred for follow-up care were asked to participate in a quality improvement (QI) initiative in an effort to understand their perspectives of receiving healthcare services. Patients were asked to answer three open-ended questions in regard to their healthcare experience. Individuals who were eligible for evidence collection after sexual assault (<5 days) were asked two additional questions. Of the 353 women and six men (359) referred to the C.A.R.E. (Coordinated Approach to Recovery and Empowerment) Clinic, 327 patients were contacted. Of the participants, 24% (86) had a mental health diagnosis; 41% (145) reported their incident to the police; 8% (28) had comorbidities of substance abuse, mental health, and/or homelessness; and 33% (118) of the incidents involved alcohol or drugs. Most of the patients stated that they were well cared for and felt safe during their visit. However, many reported "long waits," "disjointed," "chaotic," "too many" providers, "conflicting" and "miss-information," and "confusion" about what to do after their acute care visit. Over half (59%) did not report incident to the police. Some reported regrets with reporting to the police (16%) and regrets in having evidence collection (15%). Of the patients who did not have evidence collected (47), none expressed regret over choosing not to have evidence collected. Five patients with mental health problems were hospitalized within 5 days of their emergency department visit for suicidal thoughts. A number of opportunities to improve the healthcare response were identified. Patients affected by GBV require an improved coordinated and trauma-informed approach. Explicit consent related to evidence collection is needed. Not all patients who have been sexually assaulted should have evidence collected. More extensive research and program evaluation including outcomes research are warranted.
García Lozano, Alejandro J; Heinen, Joel T
2016-04-01
Small-scale fisheries are important for preventing poverty, sustaining local economies, and rural livelihoods, but tend to be negatively impacted by traditional forms of management and overexploitation among other factors. Marine Areas for Responsible Fishing (Áreas Marinas de Pesca Responsable, AMPR) have emerged as a new model for the co-management of small-scale fisheries in Costa Rica, one that involves collaboration between fishers, government agencies, and NGOs. The primary objective of this paper is to elucidate some of the key variables that influence collective action among small-scale fishers in Tárcoles, a community in the Gulf of Nicoya. We examined collective action for the formation of a local marketing cooperative and participation in management through the AMPR. We apply the social-ecological framework as a diagnostic and organizational tool in the analysis of several types of qualitative data, including interviews with key informants, informal interviews, legal documents, and gray literature. Findings illustrate the importance of socio-economic community attributes (e.g., group size, homogeneity, previous cooperation), as well as that of social (e.g., equity) and ecological (e.g., improved stocks) outcomes perceived as favorable by actors. In addition, our work demonstrates the importance of certain kinds of external NGOs for facilitating and sustaining collective action.
NASA Astrophysics Data System (ADS)
García Lozano, Alejandro J.; Heinen, Joel T.
2016-04-01
Small-scale fisheries are important for preventing poverty, sustaining local economies, and rural livelihoods, but tend to be negatively impacted by traditional forms of management and overexploitation among other factors. Marine Areas for Responsible Fishing (Áreas Marinas de Pesca Responsable, AMPR) have emerged as a new model for the co-management of small-scale fisheries in Costa Rica, one that involves collaboration between fishers, government agencies, and NGOs. The primary objective of this paper is to elucidate some of the key variables that influence collective action among small-scale fishers in Tárcoles, a community in the Gulf of Nicoya. We examined collective action for the formation of a local marketing cooperative and participation in management through the AMPR. We apply the social-ecological framework as a diagnostic and organizational tool in the analysis of several types of qualitative data, including interviews with key informants, informal interviews, legal documents, and gray literature. Findings illustrate the importance of socio-economic community attributes (e.g., group size, homogeneity, previous cooperation), as well as that of social (e.g., equity) and ecological (e.g., improved stocks) outcomes perceived as favorable by actors. In addition, our work demonstrates the importance of certain kinds of external NGOs for facilitating and sustaining collective action.
Interactive Information Seeking and Retrieving: A Third Feedback Framework.
ERIC Educational Resources Information Center
Spink, Amanda
1996-01-01
Presents an overview of feedback within the cybernetics and social frameworks. These feedback concepts are then compared with the interactive feedback concept evolving within the framework of information seeking and retrieving, based on their conceptualization of the feedback loop and notion of information. (Author/AEF)
Protecting children's rights in the collection of health and welfare data.
Schenk, Katie; Murove, Tapfuma; Williamson, Jan
2006-01-01
Program managers and researchers promoting children's rights to health, education, and an adequate standard of living often gather data directly from children to assess their needs and develop responsive services. Gathering information within a participatory framework recognizing children's views contributes to protection of their rights. Extra precautions, however, are needed to protect children because of the vulnerabilities associated with their developmental needs. Using case studies of ethical challenges faced by program implementers and sociobehavioral researchers, this article explores ways in which data collection activities among children may affect their rights. We suggest ways in which rights-based principles may be used to derive safeguards to protect against unintentional harm and abuses, based on a multidisciplinary consultation with researchers and service providers.
Wicasa Was'aka: Restoring the Traditional Strength of American Indian Boys and Men
Elkins, Jennifer; Tafoya, Greg; Bird, Doreen; Salvador, Melina
2012-01-01
We examined health disparities among American Indian men and boys within the framework of historical trauma, which incorporates the historical context of collective massive group trauma across generations. We reviewed the impact of collective traumatic experiences among Lakota men, who have faced cross-generational challenges to enacting traditional tribal roles. We describe historical trauma–informed interventions used with two tribal groups: Lakota men and Southwestern American Indian boys. These two interventions represent novel approaches to addressing historical trauma and the health disparities that American Indians face. We offer public health implications and recommendations for strategies to use in the planning and implementation of policy, research, and program development with American Indian boys and men. PMID:22401529
Carron, Maud; Alarcon, Pablo; Karani, Maurice; Muinde, Patrick; Akoko, James; Onono, Joshua; Fèvre, Eric M; Häsler, Barbara; Rushton, Jonathan
2017-11-01
Livestock food systems play key subsistence and income generation roles in low- to middle-income countries and are important networks for zoonotic disease transmission. The aim of this study was to use a value chain framework to characterize the broiler chicken meat system of Nairobi, its governance and its sanitary risks. A total of 4 focus groups and 8 key informant interviews were used to collect cross-sectional data from: small-scale broiler farmers in selected Nairobi peri-urban and informal settlement areas; medium to large integrated broiler production companies; traders and meat inspectors in live chicken and chicken meat markets in Nairobi. Qualitative data were collected on the types of people operating in the system, their interactions, the sanitary measures in place, and the sourcing and selling of broiler chickens and products. Framework analysis was used to identify governance themes and risky sanitary practices present in the system. One large company was identified to supply 60% of Nairobi's day-old chicks to farmers, mainly through agrovet shops. Broiler meat products from integrated companies were sold in high-end retailers, whereas their low-value products were channelled through independent traders to consumers in informal settlements. Peri-urban small-scale farmers reported slaughtering broilers on the farm and selling carcasses to retailers (mainly hotels and butcheries) through brokers (80%), while farmers in the informal settlements reported selling their broilers live to retailers (mainly butcheries, hotels and hawkers) directly. Broiler heads and legs were sold in informal settlements via roadside vendors. Sanitary risks identified were related to lack of biosecurity, cold chain and access to water, poor hygiene practices, lack of inspection at farm slaughter and limited health inspection in markets. Large companies dominated the governance of the broiler system through the control of day-old chick production. Overall government control was described as relatively weak, leading to minimal official regulatory enforcement. Large companies and brokers were identified as dominant groups in market information dissemination and price setting. Lack of farmer association was found to be system-wide and to limit market access. Other system barriers included lack of space and expertise, leading to poor infrastructure and limited ability to implement effective hygienic measures. This study highlights significant structural differences between broiler chains and inequalities in product quality and market access across the system. It provides a foundation for food safety assessments and disease control programmes, and informs policy-making for the inclusive growth of this fast-evolving sector. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
From field data collection to earth sciences dissemination: mobile examples in the digital era
NASA Astrophysics Data System (ADS)
Giardino, Marco; Ghiraldi, Luca; Palomba, Mauro; Perotti, Luigi
2015-04-01
In the framework of the technological and cultural revolution related to the massive diffusion of mobile devices such as smartphones and tablets, information management and accessibility are changing, and many software houses and developer communities have produced applications that meet a wide range of user needs. The collection, storage and sharing of data have changed radically, and advances in ICT increasingly involve field-based activities. Progress in this research and its applications depends on three main components: hardware, software and the web system. Since 2008 the geoSITLab multidisciplinary group (Earth Sciences Department and NatRisk Centre of the University of Torino and the Natural Sciences Museum of the Piemonte Region) has been active in defining and testing methods for collecting, managing and sharing field information using mobile devices. Key issues include: Geomorphological Digital Mapping, Natural Hazards monitoring, Geoheritage assessment and applications for the teaching of Earth Sciences. An overview of the application studies is offered here, including the use of mobile tools for data collection, the construction of relational databases for inventory activities and the testing of web-mapping tools and mobile apps for data dissemination. The common thread is a standardized digital approach allowing the use of mobile devices in each step of the process, which is analysed within the different projects set up by the research group (Geonathaz, EgeoFieldwork, Progeo Piemonte, GeomediaWeb). The hardware component mainly consists of the availability of handheld mobile devices (e.g., smartphones, PDAs and tablets). The software component corresponds to applications for spatial data visualization on mobile devices, such as composite mobile GIS or simple location-based apps. The web component allows the integration of collected data into a geodatabase based on a client-server architecture, where the information can be easily loaded, uploaded and shared between field staff and the data management team, in order to disseminate collected information to the media or to inform decision makers. Results demonstrated that field observations can be recorded in a fast and reliable way, using standardized formats that improve the precision of collected information and reduce the likelihood of errors and data omissions. Dedicated forms have been set up for gathering different thematic data (geologic/geomorphologic, faunal and floristic, path systems, etc.). Field data made it possible to produce maps and SDIs useful for many purposes: from country planning to disaster risk management, and from Geoheritage management to the dissemination of Earth Science concepts.
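The client-server flow sketched in this abstract, a standardized record captured on a mobile device and pushed to a central geodatabase service, can be illustrated roughly as follows; the field names and the upload endpoint are hypothetical placeholders, not the geoSITLab schema or API.

```python
# Minimal sketch: a standardized field record built on a mobile client and
# posted to a central geodatabase service. Field names and the endpoint URL
# are hypothetical placeholders.
import json
from datetime import datetime, timezone
from urllib import request

def build_record(lat, lon, theme, attributes, surveyor):
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "location": {"lat": lat, "lon": lon},
        "theme": theme,                      # e.g. "geomorphology", "flora"
        "attributes": attributes,            # thematic form fields
        "surveyor": surveyor,
    }

def upload(record, url="https://example.org/api/observations"):
    req = request.Request(
        url,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:       # server stores it in the geodatabase
        return resp.status

# Example:
# upload(build_record(45.07, 7.69, "geomorphology",
#                     {"landform": "alluvial fan"}, "field team A"))
```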
Using OpenStreetMap Data to Generate Building Models with Their Inner Structures for 3D Maps
NASA Astrophysics Data System (ADS)
Wang, Z.; Zipf, A.
2017-09-01
With the development of Web 2.0, more and more data related to indoor environments have been collected within the volunteered geographic information (VGI) framework, which creates a need for constructing indoor environments from VGI. In this study, we focus on generating 3D building models from OpenStreetMap (OSM) data, and provide an approach to support the construction and visualization of indoor environments on 3D maps. In this paper, we present an algorithm which can extract building information from OSM data and construct building structures as well as inner building components (e.g., doors, rooms, and windows). A web application is built to support the processing and visualization of the building models on a 3D map. We test our approach with an indoor dataset collected from the field. The results show the feasibility of our approach and its potential to support a wide range of applications, such as indoor and outdoor navigation, urban planning, and incident management.
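The extraction step the authors describe can be pictured with a minimal sketch that pulls indoor-related elements out of a plain OSM XML export; the tag keys follow common OSM indoor-tagging conventions (indoor=*, door=*, window=*), and the function is illustrative, not the paper's actual algorithm.

```python
# Minimal sketch: pull indoor-related ways/nodes out of an OSM XML export.
# Assumes common indoor-tagging keys (indoor=*, door=*, window=*); this is an
# illustration, not the authors' implementation.
import xml.etree.ElementTree as ET
from collections import defaultdict

def extract_indoor_features(osm_path):
    root = ET.parse(osm_path).getroot()

    # Node id -> (lat, lon) lookup for resolving way geometries later.
    nodes = {n.get("id"): (float(n.get("lat")), float(n.get("lon")))
             for n in root.findall("node") if n.get("lat")}

    features = defaultdict(list)
    for elem in root.iter():
        if elem.tag not in ("node", "way"):
            continue
        tags = {t.get("k"): t.get("v") for t in elem.findall("tag")}
        if "indoor" in tags:                     # rooms, corridors, levels, ...
            features[tags["indoor"]].append(elem.get("id"))
        if tags.get("door") or tags.get("window"):
            features["opening"].append(elem.get("id"))
    return nodes, features

# Example: nodes, feats = extract_indoor_features("building.osm")
```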
Delsignore, Ann Marie; Petrova, Elena; Harper, Amney; Stowe, Angela M; Mu'min, Ameena S; Middleton, Renée A
2010-07-01
An exploratory qualitative analysis of the critical incidents and assistance-seeking behaviors of White mental health psychologists and professional counselors was performed in an effort to examine a theoretical supposition presented within a Person(al)-as-Profession(al) transtheoretical framework (P-A-P). A concurrent nested strategy was used in which both quantitative and qualitative data were collected simultaneously (Creswell, 2003). In this nested strategy, qualitative data was embedded in a predominant (quantitative) method of analysis from an earlier study (see Middleton et al., 2005). Critical incidents categorized as informal (i.e., personal) experiences were cited more often than those characterized as formal (i.e., professional) experiences as influencing the professional perspectives of White mental health practitioners regarding multicultural diversity. Implications for the counseling and psychology professions are discussed.
Health information systems: a survey of frameworks for developing countries.
Marcelo, A B
2010-01-01
The objective of this paper is to perform a survey of excellent research on health information systems (HIS) analysis and design, and their underlying theoretical frameworks. It classifies these frameworks along major themes, and analyzes the different approaches to HIS development that are practical in resource-constrained environments. Literature review based on PubMed citations and conference proceedings, as well as Internet searches on information systems in general, and health information systems in particular. The field of health information systems development has been studied extensively. Despite this, failed implementations are still common. Theoretical frameworks for HIS development are available that can guide implementers. As awareness, acceptance, and demand for health information systems increase globally, the variety of approaches and strategies will also follow. For developing countries with scarce resources, a trial-and-error approach can be very costly. Lessons from the successes and failures of initial HIS implementations have been abstracted into theoretical frameworks. These frameworks organize complex HIS concepts into methodologies that standardize techniques in implementation. As globalization continues to impact healthcare in the developing world, demand for more responsive health systems will become urgent. More comprehensive frameworks and practical tools to guide HIS implementers will be imperative.
Evaluating Academic Journals without Impact Factors for Collection Management Decisions.
ERIC Educational Resources Information Center
Dilevko, Juris; Atkinson, Esther
2002-01-01
Discussion of evaluating academic journals for collection management decisions focuses on a methodological framework for evaluating journals not ranked by impact factors in Journal Citation Reports. Compares nonranked journals with ranked journals and then applies this framework to a case study in the field of medical science. (LRW)
NASA Astrophysics Data System (ADS)
Eddy Spicer, David Henning
Teacher collaboration and joint reflective inquiry have been viewed as central elements of progressive educational reform for more than two decades. More recently, researchers, policy-makers, and practitioners have heralded "blended" or "hybrid" approaches that combine online and on-site environments for collaborative learning as especially promising for "scaling up" instructional improvement. Yet, relatively little is known about how teachers working together navigate organizational and interpersonal constraints to develop and sustain conditions essential to collective inquiry. This in-depth study of meaning making about curriculum and instruction among a group of 11 physics teachers in a public, urban secondary school in the U.S. is an effort to explore collective inquiry as a resource for teacher learning and innovations in teaching practice. Through extended observations, multiple interviews, and close analyses of interaction, the study followed teachers for 7 months as they worked together across 3 settings organized in fundamentally different ways to promote joint inquiry into teaching practice. The explanatory framework of the study rests on the mutually-reinforcing conceptual underpinnings of sociocultural theory and systemic functional linguistics to establish connections between micro-social interactions and macro-social processes. Drawing on systemic functional linguistics, the study explores interpersonal meaning making through close analyses of speech function and speech role in 6 extended sequences of generative interaction. Concepts from activity theory elucidate those features of settings and school that directly impinged on or advanced teachers' collaborative work. Findings run counter to prevailing congenial views of teacher collegiality by identifying ways in which collective inquiry is inherently unstable. That instability makes itself apparent at two levels: (a) the dynamics of authority within the group, and (b) middle-level features of setting and school that favored preserving solidarity above developing a critical stance towards practice. The study offers a theoretically-informed description of collective inquiry and an analytic framework to trace its development in naturally-occurring interaction. The analytic framework extends the tools of functional analysis into a new realm, that of teachers' collaborative work, and offers means to understand better the complex array of forces shaping and shaped by teachers' everyday interactions around their practice.
Munroe, Belinda; Curtis, Kate; Murphy, Margaret; Strachan, Luke; Considine, Julie; Hardy, Jennifer; Wilson, Mark; Ruperto, Kate; Fethney, Judith; Buckley, Thomas
2016-08-01
The aim of this study was to evaluate the effect of the new evidence-informed nursing assessment framework HIRAID (History, Identify Red flags, Assessment, Interventions, Diagnostics, reassessment and communication) on the quality of patient assessment and fundamental nontechnical skills including communication, decision making, task management and situational awareness. Assessment is a core component of nursing practice and underpins clinical decisions and the safe delivery of patient care. Yet there is no universal or validated system used to teach emergency nurses how to comprehensively assess and care for patients. A pre-post design was used. The performance of thirty-eight emergency nurses from five Australian hospitals was evaluated before and after undertaking education in the application of the HIRAID assessment framework. Video recordings of participant performance in immersive simulations of common presentations to the emergency department were evaluated, as well as participant documentation during the simulations. Paired parametric and nonparametric tests were used to compare changes from pre- to post-intervention. From pre- to post-intervention, increases in participant performance were observed in the percentage of patient history elements collected, critical indicators of urgency collected and reported to medical officers, and patient reassessments performed. Participants also demonstrated improvement in each of the four nontechnical skills categories: communication, decision making, task management and situational awareness. The HIRAID assessment framework improves clinical patient assessments performed by emergency nurses and has the potential to enhance patient care. HIRAID should be considered for integration into clinical practice to provide nurses with a systematic approach to patient assessment and potentially improve the delivery of safe patient care. © 2016 John Wiley & Sons Ltd.
Booth, Andrew; Carroll, Christopher
2015-01-01
Increasing recognition of the role and value of theory in improvement work in healthcare offers the prospect of capitalising upon, and consolidating, actionable lessons from synthesis of improvement projects and initiatives. We propose that informed use of theory can (i) provide a mechanism by which to collect and organise data from a body of improvement work, (ii) offer a framework for analysis and identification of lessons learnt and (iii) facilitate an evaluation of the feasibility, effectiveness and acceptability of improvement programmes. Improvement practitioners can benefit from using an underpinning external structure as a lens by which to examine the specific achievements of their own projects alongside comparable initiatives led by others. We demonstrate the utility of a method known as ‘best fit framework synthesis’ (BFFS) in offering a ubiquitous and versatile means by which to collect, analyse and evaluate improvement work in healthcare. First reported in 2011, BFFS represents a pragmatic, flexible approach to integrating theory with findings from practice. A deductive phase, where a review team seeks to accommodate a substantial part of the data, is followed by an inductive phase, in which the team explores data not accommodated by the framework. We explore the potential for BFFS within improvement work by drawing upon the evidence synthesis methodology literature and practical examples of improvement work reported in BMJ Quality and Safety (2011–2015). We suggest four variants of BFFS that may have particular value in synthesising a body of improvement work. We conclude that BFFS, alongside other approaches that seek to optimise the contribution of theory to improvement work, represents one important enabling mechanism by which to establish the rigour and scientific credentials of the emerging discipline of ‘improvement science’. PMID:26306609
Espinoza, Manuel A; Manca, Andrea; Claxton, Karl; Sculpher, Mark J
2014-11-01
This article develops a general framework to guide the use of subgroup cost-effectiveness analysis for decision making in a collectively funded health system. In doing so, it addresses 2 key policy questions, namely, the identification and selection of subgroups, while distinguishing 2 sources of potential value associated with heterogeneity. These are 1) the value of revealing the factors associated with heterogeneity in costs and outcomes using existing evidence (static value) and 2) the value of acquiring further subgroup-related evidence to resolve the uncertainty given the current understanding of heterogeneity (dynamic value). Consideration of these 2 sources of value can guide subgroup-specific treatment decisions and inform whether further research should be conducted to resolve uncertainty to explain variability in costs and outcomes. We apply the proposed methods to a cost-effectiveness analysis for the management of patients with acute coronary syndrome. This study presents the expected net benefits under current and perfect information when subgroups are defined based on the use and combination of 6 binary covariates. The results of the case study confirm the theoretical expectations. As more subgroups are considered, the marginal net benefit gains obtained under the current information show diminishing marginal returns, and the expected value of perfect information shows a decreasing trend. We present a suggested algorithm that synthesizes the results to guide policy. © The Author(s) 2014.
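The "dynamic value" described here is conventionally quantified with the expected value of perfect information; a standard textbook formulation (not reproduced from the article) is:

```latex
% Net benefit of decision d under parameters \theta, with willingness to pay \lambda:
% NB(d,\theta) = \lambda\,E(d,\theta) - C(d,\theta)
\[
\mathrm{EVPI} \;=\; \mathbb{E}_{\theta}\!\left[\max_{d} NB(d,\theta)\right]
\;-\; \max_{d}\, \mathbb{E}_{\theta}\!\left[NB(d,\theta)\right]
\]
% The "static" value of acting on heterogeneity, with subgroups g of weight w_g, is
\[
\sum_{g} w_{g}\, \max_{d}\, \mathbb{E}_{\theta}\!\left[NB(d,\theta \mid g)\right]
\;-\; \max_{d} \sum_{g} w_{g}\, \mathbb{E}_{\theta}\!\left[NB(d,\theta \mid g)\right]
\]
```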
NASA Astrophysics Data System (ADS)
Řezník, T.; Kepka, M.; Charvát, K.; Charvát, K., Jr.; Horáková, S.; Lukas, V.
2016-04-01
From a global perspective, agriculture is the single largest user of freshwater resources, accounting on average for 70% of each country's surface water supplies. A substantial proportion of agricultural water is recycled back to surface water and/or groundwater. Agriculture and water pollution is therefore the subject of (inter)national legislation, such as the Clean Water Act in the United States of America, the European Water Framework Directive, and the Law of the People's Republic of China on the Prevention and Control of Water Pollution. Regular monitoring by means of sensor networks is needed in order to provide evidence of water pollution in agriculture. This paper describes the benefits of, and open issues stemming from, regular sensor monitoring provided by an Open Farm Management Information System. Emphasis is placed on descriptions of the processes and functionalities available to users, the underlying open data model, and definitions of open and lightweight application programming interfaces for the efficient management of collected (spatial) data. The presented Open Farm Management Information System has already been successfully registered under Phase 8 of the Global Earth Observation System of Systems (GEOSS) Architecture Implementation Pilot in order to support a wide variety of demands aimed primarily at agricultural pollution monitoring. The final part of the paper deals with the integration of the Open Farm Management Information System into the Digital Earth framework.
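A client for the kind of lightweight observation API the paper refers to might look like the sketch below; the endpoint path, JSON layout, and the nitrate limit used for the exceedance check are invented for illustration and are not the system's actual interface.

```python
# Sketch of a client for a lightweight sensor-observation API: fetch recent
# readings for one sensor and flag exceedances against a pollution threshold.
# Endpoint, JSON layout, and the nitrate limit are hypothetical.
import json
from urllib import request, parse

def fetch_observations(base_url, sensor_id, prop="nitrate", hours=24):
    query = parse.urlencode({"sensor": sensor_id, "property": prop, "last": hours})
    with request.urlopen(f"{base_url}/observations?{query}") as resp:
        # Expected shape: [{"time": "...", "value": 12.3}, ...]
        return json.load(resp)

def exceedances(observations, limit_mg_l=50.0):
    return [o for o in observations if o["value"] > limit_mg_l]

# Example:
# obs = fetch_observations("https://example.org/ofmis", "field-12-probe-3")
# if exceedances(obs): print("possible pollution event - notify farm manager")
```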
DEVA: An extensible ontology-based annotation model for visual document collections
NASA Astrophysics Data System (ADS)
Jelmini, Carlo; Marchand-Maillet, Stephane
2003-01-01
The description of visual documents is a fundamental aspect of any efficient information management system, but the process of manually annotating large collections of documents is tedious and far from perfect. The need for a generic and extensible annotation model therefore arises. In this paper, we present DEVA, an open, generic and expressive multimedia annotation framework. DEVA is an extension of the Dublin Core specification. The model can represent the semantic content of any visual document. It is described in the ontology language DAML+OIL and can easily be extended with external specialized ontologies, adapting the vocabulary to the given application domain. In parallel, we present the Magritte annotation tool, an early prototype that validates the DEVA features. Magritte allows users to manually annotate image collections. It is designed with a modular and extensible architecture, which enables the user to dynamically adapt the user interface to specialized ontologies merged into DEVA.
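The pattern of extending Dublin Core with a specialized domain vocabulary can be sketched with rdflib; the image URI and the "art" namespace below are invented placeholders, and DEVA itself is defined in DAML+OIL rather than built this way.

```python
# Sketch of a Dublin Core based annotation extended with a domain vocabulary,
# in the spirit of DEVA. The image URI and the "art" namespace are invented
# placeholders; the original model is expressed in DAML+OIL, not with rdflib.
from rdflib import Graph, Namespace, URIRef, Literal
from rdflib.namespace import DC, RDF

ART = Namespace("http://example.org/art-ontology#")   # hypothetical domain ontology

g = Graph()
img = URIRef("http://example.org/collection/img042.jpg")

g.add((img, DC.title, Literal("Ceci n'est pas une pipe")))   # core Dublin Core fields
g.add((img, DC.creator, Literal("Rene Magritte")))
g.add((img, DC.type, Literal("StillImage")))
g.add((img, RDF.type, ART.SurrealistPainting))               # domain-specific extension
g.add((img, ART.depicts, ART.Pipe))

print(g.serialize(format="turtle"))
```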
Energy disclosure, market behavior, and the building data ecosystem.
Kontokosta, Constantine E
2013-08-01
Energy disclosure laws represent one of the most promising public policy tools to accelerate market transformation around building energy efficiency. For this type of information to have an impact on market behavior, it must be collected, analyzed, and disseminated to support the decision-making processes of each end user and influence both the producers and consumers of building performance data. This paper explores the significance of energy disclosure requirements and outlines a framework for utilizing these new sources of transparent, publicly available information. It presents the mechanisms by which information can alter market behavior in the commercial real estate sector and develops a wiring diagram for the flows of information through the building data ecosystem. It concludes with a discussion of the motivations, metrics, and constraints faced by the various stakeholders in the ecosystem and how these factors influence investment decision models. © 2013 New York Academy of Sciences.
Building research capacity for evidence-informed tobacco control in Canada: a case description
McDonald, Paul W; Viehbeck, Sarah; Robinson, Sarah J; Leatherdale, Scott T; Nykiforuk, Candace IJ; Jolin, Mari Alice
2009-01-01
Tobacco use remains the leading cause of death and disability in Canada. Insufficient research capacity can inhibit evidence-informed decision making for tobacco control. This paper outlines a Canadian project to build research capacity, defined as a community's ability to produce research that adequately informs practice, policy, and future research in a timely, practical manner. A key component is that individuals and teams within the community must mutually engage around common, collectively negotiated goals to address specific practices, policies or programs of research. An organizing framework, a set of activities to build strategic recruitment, productivity tools, and procedures for enhancing social capital are described. Actions are intended to facilitate better alignment between research and the priorities of policy developers and service providers, enhance the external validity of the work performed, and reduce the time required to inform policy and practice. PMID:19664224
Pantea, Michael P.; Cole, James C.; Smith, Bruce D.; Faith, Jason R.; Blome, Charles D.; Smith, David V.
2008-01-01
This multimedia report shows and describes digital three-dimensional faulted geologic surfaces and volumes of the lithologic units of the Edwards aquifer in the upper Seco Creek area of Medina and Uvalde Counties in south-central Texas. This geologic framework model was produced using (1) geologic maps and interpretations of depositional environments and paleogeography; (2) lithologic descriptions, interpretations, and geophysical logs from 31 drill holes; (3) rock core and detailed lithologic descriptions from one drill hole; (4) helicopter electromagnetic geophysical data; and (5) known major and minor faults in the study area. These faults were used because of their individual and collective effects on the continuity of the aquifer-forming units in the Edwards Group. Data and information were compared and validated with each other and reflect the complex relationships of structures in the Seco Creek area of the Balcones fault zone. This geologic framework model can be used as a tool to visually explore and study geologic structures within the Seco Creek area of the Balcones fault zone and to show the connectivity of hydrologic units of high and low permeability between and across faults. The software can be used to display other data and information, such as drill-hole data, on this geologic framework model in three-dimensional space.
Federal and state nursing facility websites: just what the consumer needs?
Harrington, Charlene; Collier, Eric; O'Meara, Janis; Kitchener, Martin; Simon, Lisa Payne; Schnelle, John F
2003-01-01
Since the introduction of the Medicare Nursing Home Compare website in 1999, some states have begun to develop their own websites to help consumers compare nursing facilities (NFs). This article presents a brief conceptual framework for the type of information needed for an Internet-based information system and analyzes existing federal and state NF websites, using data collected from a survey completed in 2002. Twenty-four states and the District of Columbia have a variety of information on NFs, similar to the information on the Medicare website. Information on characteristics and deficiencies of a facility is the most commonly available, but a few states have data on ownership, staffing indicators, quality indicators, complaints, and enforcement actions. Other types of data, such as resident characteristics, staff turnover rates, and financial indicators, are generally not available. Although many states are making progress toward providing consumers with information, there are gaps that exist, which if filled, could provide consumers with a better tool for facility selection and monitoring the quality of care.
Collaborative Metaliteracy: Putting the New Information Literacy Framework into (Digital) Practice
ERIC Educational Resources Information Center
Gersch, Beate; Lampner, Wendy; Turner, Dudley
2016-01-01
This article describes a course-integrated collaborative project between a subject librarian, a communication professor, and an instructional designer that illustrates how the TPACK (Technological Pedagogical Content Knowledge) framework, developed by Mishra and Koehler (2006), and the new ACRL Framework for Information Literacy (Framework)…
Leung, Leanne; de Lemos, Mário L; Kovacic, Laurel
2017-01-01
Background With the rising cost of new oncology treatments, it is no longer sustainable to base initial drug funding decisions primarily on prospective clinical trials, as their performance in real-life populations is often difficult to determine. In British Columbia, one approach to evidence building is to retrospectively analyse patient outcomes using observational research on an ad hoc basis. Methods The deliberative framework was constructed in three stages: framework design, framework validation and treatment programme characterization, and key informant interviews. Framework design was informed through a literature review and analyses of provincial and national decision-making processes. Treatment programmes funded between 2010 and 2013 were used for framework validation. A selection concordance rate of 80% amongst three reviewers was considered to validate the framework. Key informant interviews were conducted to determine the utility of this deliberative framework. Results A multi-domain deliberative framework with 15 assessment parameters was developed. A selection concordance rate of 84.2% was achieved for content validation of the framework. Nine treatment programmes from five different tumour groups were selected for retrospective outcomes analysis. Five contributory factors to funding uncertainties were identified. Key informants agreed that the framework is a comprehensive tool that targets the key areas involved in the funding decision-making process. Conclusions The oncology-based deliberative framework can be routinely used to assess treatment programmes from the major tumour sites for retrospective outcomes analysis. Key informants indicated that this is a value-added tool that will provide insight into the current prospective funding model.
An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web
NASA Astrophysics Data System (ADS)
Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.
2013-09-01
Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real time and extracting valuable information from them are challenging issues in these systems from a technical and scientific point of view. Ever-increasing population growth in urban areas has caused problems in developing countries that have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality in megacities is to use Sensor Web technology to develop monitoring and early warning systems based on real-time, up-to-date air quality information gathered by spatially distributed sensors. Urban air quality monitoring systems use the functionality of geospatial information systems as a platform for analysing, processing and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services that access sensor data and discover events in sensor data streams, as well as a set of standards for describing sensors and encoding measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data and to disseminate air quality status in real time. Interoperability challenges can be overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where they are analysed and processed. The extracted air quality status is then examined to detect emergency situations and, if necessary, air quality reports are sent to the authorities. This research proposes an architecture showing how to integrate air quality sensor data streams into a geospatial data infrastructure in order to present an interoperable air quality monitoring system that supports disaster management with real-time information. The developed system was tested on Tehran's air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and notifying registered users in emergency cases by sending warning e-mails. An air quality monitoring portal is used to retrieve and visualize sensor observations through the interoperable framework. The system also provides capabilities to retrieve SOS observations using WPS in a cascaded service-chaining pattern for monitoring trends in timely sensor observations.
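The AQI computation mentioned for the CO pollutant is, in the common US-EPA convention, a piecewise linear interpolation between published breakpoints; the sketch below uses the widely published 8-hour CO table, which may differ from the paper's exact configuration.

```python
# Minimal sketch of the piecewise-linear AQI formula for CO (8-hour mean, ppm).
# Breakpoints follow the widely published US-EPA table; the paper's exact
# configuration may differ.
CO_BREAKPOINTS = [  # (C_low, C_high, I_low, I_high)
    (0.0,   4.4,   0,  50),
    (4.5,   9.4,  51, 100),
    (9.5,  12.4, 101, 150),
    (12.5, 15.4, 151, 200),
    (15.5, 30.4, 201, 300),
    (30.5, 40.4, 301, 400),
    (40.5, 50.4, 401, 500),
]

def co_aqi(concentration_ppm):
    """Linear interpolation: I = (I_hi - I_lo)/(C_hi - C_lo) * (C - C_lo) + I_lo."""
    c = round(concentration_ppm, 1)
    for c_lo, c_hi, i_lo, i_hi in CO_BREAKPOINTS:
        if c_lo <= c <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo)
    raise ValueError("concentration outside AQI breakpoint table")

# Example: co_aqi(6.3) -> 69; an alert e-mail could be triggered above a chosen index.
```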
A contemporary approach to validity arguments: a practical guide to Kane's framework.
Cook, David A; Brydges, Ryan; Ginsburg, Shiphra; Hatala, Rose
2015-06-01
Assessment is central to medical education and the validation of assessments is vital to their use. Earlier validity frameworks suffer from a multiplicity of types of validity or failure to prioritise among sources of validity evidence. Kane's framework addresses both concerns by emphasising key inferences as the assessment progresses from a single observation to a final decision. Evidence evaluating these inferences is planned and presented as a validity argument. We aim to offer a practical introduction to the key concepts of Kane's framework that educators will find accessible and applicable to a wide range of assessment tools and activities. All assessments are ultimately intended to facilitate a defensible decision about the person being assessed. Validation is the process of collecting and interpreting evidence to support that decision. Rigorous validation involves articulating the claims and assumptions associated with the proposed decision (the interpretation/use argument), empirically testing these assumptions, and organising evidence into a coherent validity argument. Kane identifies four inferences in the validity argument: Scoring (translating an observation into one or more scores); Generalisation (using the score[s] as a reflection of performance in a test setting); Extrapolation (using the score[s] as a reflection of real-world performance), and Implications (applying the score[s] to inform a decision or action). Evidence should be collected to support each of these inferences and should focus on the most questionable assumptions in the chain of inference. Key assumptions (and needed evidence) vary depending on the assessment's intended use or associated decision. Kane's framework applies to quantitative and qualitative assessments, and to individual tests and programmes of assessment. Validation focuses on evaluating the key claims, assumptions and inferences that link assessment scores with their intended interpretations and uses. The Implications and associated decisions are the most important inferences in the validity argument. © 2015 John Wiley & Sons Ltd.
Large-scale Cross-modality Search via Collective Matrix Factorization Hashing.
Ding, Guiguang; Guo, Yuchen; Zhou, Jile; Gao, Yue
2016-09-08
By transforming data into binary representation, i.e., Hashing, we can perform high-speed search with low storage cost, and thus Hashing has attracted increasing research interest in recent years. Recently, how to generate Hashcodes for multimodal data (e.g., images with textual tags, documents with photos, etc.) for large-scale cross-modality search (e.g., searching semantically related images in a database for a document query) has become an important research issue because of the fast growth of multimodal data on the Web. To address this issue, a novel framework for multimodal Hashing is proposed, termed Collective Matrix Factorization Hashing (CMFH). The key idea of CMFH is to learn unified Hashcodes for the different modalities of one multimodal instance in a shared latent semantic space in which different modalities can be effectively connected. Therefore, accurate cross-modality search is supported. Based on the general framework, we extend it in the unsupervised scenario, where it tries to preserve the Euclidean structure, and in the supervised scenario, where it fully exploits the label information of data. The corresponding theoretical analysis and the optimization algorithms are given. We conducted comprehensive experiments on three benchmark datasets for cross-modality search. The experimental results demonstrate that CMFH can significantly outperform several state-of-the-art cross-modality Hashing methods, which validates the effectiveness of the proposed CMFH.
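The core idea, factorizing the modality matrices against one shared latent factor and binarizing it, can be sketched with a plain alternating-least-squares variant; this simplified, unsupervised version omits the regularisers and out-of-sample projections of the full CMFH objective.

```python
# Simplified sketch of collective matrix factorization for cross-modal hashing:
# factorize X1 ~ U1 V and X2 ~ U2 V with a shared latent factor V, then take
# the sign of V as unified hash codes. Not the full CMFH objective.
import numpy as np

def cmf_hash(X1, X2, n_bits=16, n_iter=50, lam=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    n = X1.shape[1]                        # both matrices: features x samples
    V = rng.standard_normal((n_bits, n))
    I = lam * np.eye(n_bits)
    for _ in range(n_iter):
        # Ridge-regularised least-squares updates for the modality factors.
        U1 = X1 @ V.T @ np.linalg.inv(V @ V.T + I)
        U2 = X2 @ V.T @ np.linalg.inv(V @ V.T + I)
        # Shared latent factor combines evidence from both modalities.
        V = np.linalg.inv(U1.T @ U1 + U2.T @ U2 + I) @ (U1.T @ X1 + U2.T @ X2)
    codes = (V > 0).astype(np.uint8)       # n_bits x n unified binary codes
    return U1, U2, codes

# Example: image features (d1 x n) and text features (d2 x n) for the same n
# items share one code matrix, so a text query can retrieve related images.
```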
Communal range defence in primates as a public goods dilemma.
Willems, Erik P; Arseneau, T Jean M; Schleuning, Xenia; van Schaik, Carel P
2015-12-05
Classic socio-ecological theory holds that the occurrence of aggressive range defence is primarily driven by ecological incentives, most notably by the economic defendability of an area or the resources it contains. While this ecological cost-benefit framework has great explanatory power in solitary or pair-living species, comparative work on group-living primates has always found economic defendability to be a necessary, but not sufficient condition to account for the distribution of effective range defence across the taxon. This mismatch between theory and observation has recently been ascribed to a collective action problem among group members in, what is more informatively viewed as, a public goods dilemma: mounting effective defence of a communal range against intrusions by outgroup conspecifics. We here further develop this framework, and report on analyses at three levels of biological organization: across species, across populations within a single lineage and across groups and individuals within a single population. We find that communal range defence in primates very rarely involves collective action sensu stricto and that it is best interpreted as the outcome of opportunistic and strategic individual-level decisions. Whether the public good of a defended communal range is produced by solitary, joint or collective action is thus the outcome of the interplay between the unique characteristics of each individual, local and current socio-ecological conditions, and fundamental life-history traits of the species. © 2015 The Author(s).
Pervasive Sound Sensing: A Weakly Supervised Training Approach.
Kelly, Daniel; Caulfield, Brian
2016-01-01
Modern smartphones present an ideal device for pervasive sensing of human behavior. Microphones have the potential to reveal key information about a person's behavior. However, they have been utilized to a significantly lesser extent than other smartphone sensors in the context of human behavior sensing. We postulate that, in order for microphones to be useful in behavior sensing applications, the analysis techniques must be flexible and allow easy modification of the types of sounds to be sensed. A simplification of the training data collection process could allow a more flexible sound classification framework. We hypothesize that detailed training, a prerequisite for the majority of sound sensing techniques, is not necessary and that a significantly less detailed and less time-consuming data collection process can be carried out, allowing even a nonexpert to conduct the collection, labeling, and training process. To test this hypothesis, we implement a diverse density-based multiple instance learning framework to identify a target sound, and a bag trimming algorithm which, using the target sound, automatically segments weakly labeled sound clips to construct an accurate training set. Experiments reveal that our hypothesis is a valid one, and results show that classifiers trained using the automatically segmented training sets were able to accurately classify unseen sound samples with accuracies comparable to supervised classifiers, achieving average F-measures of 0.969 and 0.87 for two weakly supervised datasets.
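Under the multiple-instance view, the bag-trimming step can be pictured roughly as below; the nearest-prototype score stands in for the paper's diverse-density target concept, so this is a simplification rather than the authors' method.

```python
# Rough sketch of "bag trimming" for weakly labelled audio clips. Each clip
# (bag) is an array of per-frame feature vectors; frames scored as close to a
# target concept are kept as positive training instances. The centroid-based
# concept here is a stand-in for the diverse-density target of the paper.
import numpy as np

def trim_bags(positive_bags, negative_bags, keep_ratio=0.3):
    pos = np.vstack(positive_bags)
    neg = np.vstack(negative_bags)
    # Crude target concept: direction separating positive from negative frames.
    concept = pos.mean(axis=0) - neg.mean(axis=0)

    trimmed = []
    for bag in positive_bags:
        scores = bag @ concept                 # higher = more target-like
        k = max(1, int(len(bag) * keep_ratio))
        keep = np.argsort(scores)[-k:]         # retain the top-scoring frames
        trimmed.append(bag[keep])
    return np.vstack(trimmed)                  # frames for supervised training

# The trimmed frames can then train any standard classifier without
# hand-labelling individual sound events.
```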
Sandhu, Kiran; Burton, Paul; Dedekorkut-Howes, Aysin
2017-01-01
The informal waste recycling sector has been an indispensable but ironically invisible part of the waste management systems of developing countries such as India, often completely disregarded and overlooked by decision makers and policy frameworks. The turn towards economic liberalization in India since 1991 opened the doors for the privatization of urban services, and the waste sector found favor with private companies facilitated by local governments. In joining the privatization bandwagon, local governments aim to create the image of a progressive city, demonstrated most visibly through apt management of municipal solid waste. As a result, a long-important stakeholder, the informal sector, has been sidelined and left to face the adverse impacts of privatization. There is hardly any recognition of its contributions or any attempt to integrate it within formal waste management systems. The study investigates the impacts of privatization on waste pickers in waste recycling operations. Highlighting this other dimension of waste collection and management in urban India, the study focuses on waste pickers and small-time informal scrap dealers through a case study of Amritsar city, an important historic centre and a metropolitan city in the state of Punjab, India. The paper develops an analytical framework, drawing on a literature review, to analyze these impacts. In conclusion, it supports the case for involving the informal waste sector in achieving sustainable waste management in the city. Copyright © 2016 Elsevier Ltd. All rights reserved.
Representation of potential information gain to measure the price of anarchy on ISR activities
NASA Astrophysics Data System (ADS)
Ortiz-Peña, Hector J.; Hirsch, Michael; Karwan, Mark; Nagi, Rakesh; Sudit, Moises
2013-05-01
One of the main technical challenges facing intelligence analysts today is effectively determining information gaps from huge amounts of collected data. Moreover, getting the right information to/from the right person (e.g., analyst, warfighter on the edge) at the right time in a distributed environment has been elusive for our military forces. Synchronization of Intelligence, Surveillance, and Reconnaissance (ISR) activities to maximize the efficient utilization of limited resources (both in quantity and capabilities) has become critically important to increase the accuracy and timeliness of overall information gain. Given this reality, we are interested in quantifying the degradation of solution quality (i.e., information gain) as a centralized system synchronizing ISR activities (from information gap identification to information collection and dissemination) moves to a more decentralized framework. This evaluation extends the concept of the price of anarchy, a measure of the inefficiency of a system when agents maximize their decisions without coordination, by considering different levels of decentralization. Our initial research on representing potential information gain in geospatially and temporally discretized spaces is presented. This potential information gain map can represent a consolidation of Intelligence Preparation of the Battlefield products as input to automated ISR synchronization tools. Using the coordination of unmanned vehicles (UxVs) as an example, we developed a mathematical programming model for multi-perspective optimization in which each UxV develops its own flight plan to support mission objectives based only on its perspective of the environment (i.e., the potential information gain map). Information is only exchanged when UxVs are part of the same communication network.
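The degradation being quantified follows the usual price-of-anarchy definition, written here for an information-gain (maximisation) objective; this is the standard formulation, not an equation taken from the paper.

```latex
\[
\mathrm{PoA} \;=\;
\frac{\displaystyle \max_{s \in S}\, G(s)}
     {\displaystyle \min_{s \in \mathcal{E}}\, G(s)}
\;\ge\; 1
\]
% G(s): total information gain of a joint ISR plan s.
% S: feasible plans under full central coordination.
% \mathcal{E}: outcomes when each UxV plans from its own potential-information-gain
% map only; intermediate levels of decentralisation replace \mathcal{E} with the
% outcomes of partial coordination.
```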