Metrics and Mappings: A Framework for Understanding Real-World Quantitative Estimation.
ERIC Educational Resources Information Center
Brown, Norman R.; Siegler, Robert S.
1993-01-01
A metrics and mapping framework is proposed to account for how heuristics, domain-specific reasoning, and intuitive statistical induction processes are integrated to generate estimates. Results of 4 experiments involving 188 undergraduates illustrate framework usefulness and suggest when people use heuristics and when they emphasize…
An Integrated Tone Mapping for High Dynamic Range Image Visualization
NASA Astrophysics Data System (ADS)
Liang, Lei; Pan, Jeng-Shyang; Zhuang, Yongjun
2018-01-01
There are two types of tone mapping operators for high dynamic range (HDR) image visualization. HDR images mapped by perceptual operators have a strong sense of realism but lose local details. Empirical operators can maximize the local detail information of an HDR image, but their realism is weaker. A common tone mapping operator suitable for all applications is not available. This paper proposes a novel integrated tone mapping framework which can convert between empirical operators and perceptual operators. In this framework, the empirical operator is rendered based on an improved saliency map, which simulates the visual attention mechanism of the human eye for natural scenes. The results of objective evaluation prove the effectiveness of the proposed solution.
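As a generic, hedged illustration of the perceptual-versus-empirical distinction described above (the paper's own operator and saliency-map rendering are not reproduced here), the Python sketch below applies a Reinhard-style global compression followed by an unsharp-mask detail boost to a synthetic HDR luminance map; all function names and parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def reinhard_global(lum, a=0.18, eps=1e-6):
    """Perceptual-style global compression (Reinhard key-value form)."""
    lw_bar = np.exp(np.mean(np.log(lum + eps)))   # log-average luminance
    scaled = a * lum / lw_bar
    return scaled / (1.0 + scaled)                # compress to [0, 1)

def detail_boost(lum_mapped, sigma=3.0, gain=0.5):
    """Empirical-style local detail enhancement via unsharp masking."""
    base = gaussian_filter(lum_mapped, sigma)
    detail = lum_mapped - base
    return np.clip(base + (1.0 + gain) * detail, 0.0, 1.0)

# Synthetic HDR luminance spanning ~4 orders of magnitude
rng = np.random.default_rng(0)
hdr = np.exp(rng.uniform(np.log(1e-2), np.log(1e2), size=(256, 256)))

ldr_perceptual = reinhard_global(hdr)          # realistic but smooth
ldr_empirical = detail_boost(ldr_perceptual)   # boosted local detail
```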
Relativistic collisions as Yang-Baxter maps
NASA Astrophysics Data System (ADS)
Kouloukas, Theodoros E.
2017-10-01
We prove that one-dimensional elastic relativistic collisions satisfy the set-theoretical Yang-Baxter equation. The corresponding collision maps are symplectic and admit a Lax representation. Furthermore, they can be considered as reductions of a higher dimensional integrable Yang-Baxter map on an invariant manifold. In this framework, we study the integrability of transfer maps that represent particular periodic sequences of collisions.
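For reference, the set-theoretical Yang-Baxter equation mentioned in the abstract has the standard form below, where R maps X x X to itself and R_ij denotes R acting on the i-th and j-th factors of X x X x X (notation assumed, not quoted from the paper):

```latex
R_{12}\circ R_{13}\circ R_{23} \;=\; R_{23}\circ R_{13}\circ R_{12},
\qquad R:\, X\times X \longrightarrow X\times X .
```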
Self-contained image mapping of placental vasculature in 3D ultrasound-guided fetoscopy.
Yang, Liangjing; Wang, Junchen; Ando, Takehiro; Kubota, Akihiro; Yamashita, Hiromasa; Sakuma, Ichiro; Chiba, Toshio; Kobayashi, Etsuko
2016-09-01
Surgical navigation technology directed at fetoscopic procedures is relatively underdeveloped compared with other forms of endoscopy. The narrow fetoscopic field of view and the vast vascular network on the placenta make examination and photocoagulation treatment of twin-to-twin transfusion syndrome challenging. Though ultrasonography is used for intraoperative guidance, its navigational ability is not fully exploited. This work aims to integrate 3D ultrasound imaging and endoscopic vision seamlessly for placental vasculature mapping through a self-contained framework without external navigational devices. This is achieved through the development, integration, and experimental evaluation of novel navigational modules. First, a framework design that addresses the current limitations based on identified gaps is conceptualized. Second, navigational modules including (1) ultrasound-based localization, (2) image alignment, and (3) vision-based tracking to update the scene texture map are integrated. This updated texture map is projected onto an ultrasound-constructed 3D model for photorealistic texturing of the 3D scene, creating a panoramic view from the moving fetoscope. In addition, a collaborative scheme for the integration of the modular workflow system is proposed to schedule updates in a systematic fashion. Finally, experiments are carried out to evaluate each modular variation and an integrated collaborative scheme of the framework. The modules and the collaborative scheme are evaluated through a series of phantom experiments with controlled trajectories for repeatability. The collaborative framework demonstrated the best accuracy (5.2% RMS error) compared with all three single-module variations during the experiments. Validation on an ex vivo monkey placenta shows visual continuity of the freehand fetoscopic panorama. The proposed collaborative framework and the evaluation study of its variations provide analytical insights for effective integration of ultrasonography and endoscopy. This contributes to the development of navigation techniques for fetoscopic procedures and can potentially be extended to other applications in intraoperative imaging.
Corvin, Jaime A; DeBate, Rita; Wolfe-Quintero, Kate; Petersen, Donna J
2017-01-01
In the twenty-first century, the dynamics of health and health care are changing, necessitating a commitment to revising traditional public health curricula to better meet present day challenges. This article describes how the College of Public Health at the University of South Florida utilized the Intervention Mapping framework to translate revised core competencies into an integrated, theory-driven core curriculum to meet the training needs of the twenty-first century public health scholar and practitioner. This process resulted in the development of four sequenced courses: History and Systems of Public Health and Population Assessment I delivered in the first semester and Population Assessment II and Translation to Practice delivered in the second semester. While the transformation process, moving from traditional public health core content to an integrated and innovative curriculum, is a challenging and daunting task, Intervention Mapping provides the ideal framework for guiding this process. Intervention mapping walks the curriculum developers from the broad goals and objectives to the finite details of a lesson plan. Throughout this process, critical lessons were learned, including the importance of being open to new ideologies and frameworks and the critical need to involve key-stakeholders in every step of the decision-making process to ensure the sustainability of the resulting integrated and theory-based curriculum. Ultimately, as a stronger curriculum emerged, the developers and instructors themselves were changed, fostering a stronger public health workforce from within.
Satellites, tweets, forecasts: the future of flood disaster management?
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Kalas, Milan; Lorini, Valerio; Wania, Annett; Pappenberger, Florian; Salamon, Peter; Ramos, Maria Helena; Cloke, Hannah; Castillo, Carlos
2017-04-01
Floods have devastating effects on lives and livelihoods around the world. Structural flood defence measures such as dikes and dams can help protect people. However, it is the emerging science and technologies for flood disaster management and preparedness, such as increasingly accurate flood forecasting systems, high-resolution satellite monitoring, rapid risk mapping, and the unique strength of social media information and crowdsourcing, that are most promising for reducing the impacts of flooding. Here, we describe an innovative framework which integrates in real-time two components of the Copernicus Emergency mapping services, namely the European Flood Awareness System and the satellite-based Rapid Mapping, with new procedures for rapid risk assessment and social media and news monitoring. The integrated framework enables improved flood impact forecast, thanks to the real-time integration of forecasting and monitoring components, and increases the timeliness and efficiency of satellite mapping, with the aim of capturing flood peaks and following the evolution of flooding processes. Thanks to the proposed framework, emergency responders will have access to a broad range of timely and accurate information for more effective and robust planning, decision-making, and resource allocation.
Determination Of Slope Instability Using Spatially Integrated Mapping Framework
NASA Astrophysics Data System (ADS)
Baharuddin, I. N. Z.; Omar, R. C.; Roslan, R.; Khalid, N. H. N.; Hanifah, M. I. M.
2016-11-01
The determination and identification of slope instability often rely on data obtained from in-situ soil investigation work, which involves the logistics of machinery and manpower; these aspects may increase costs, especially for remote locations. Therefore, a method that can identify possible slope instability without frequent walkabout ground surveys is needed. This paper presents a method for predicting slope instability using a spatially integrated mapping framework applicable to remote areas such as tropical forest and natural hilly terrain. Spatial data such as geology, topography, land use, slope angle and elevation were used in a regional analysis during the desktop study. Through this framework, occurrences of slope instability were identified and validated using a confirmatory site-specific analysis.
The Conceptual Framework of Thematic Mapping in Case Conceptualization.
Ridley, Charles R; Jeffrey, Christina E
2017-04-01
This article, the 3rd in a series of 5, introduces the conceptual framework for thematic mapping, a novel approach to case conceptualization. The framework is transtheoretical in that it is not constrained by the tenets or concepts of any one therapeutic orientation and transdiagnostic in that it conceptualizes clients outside the constraints of diagnostic criteria. Thematic mapping comprises 4 components: a definition, foundational principles, defining features, and core concepts. These components of the framework, deemed building blocks, are explained in this article. Like the foundation of any structure, the heuristic value of the method requires that the building blocks have integrity, coherence, and sound anchoring. We assert that the conceptual framework provides a solid foundation, making thematic mapping a potential asset in mental health treatment. © 2017 Wiley Periodicals, Inc.
Empowering Provenance in Data Integration
NASA Astrophysics Data System (ADS)
Kondylakis, Haridimos; Doerr, Martin; Plexousakis, Dimitris
The provenance of data has recently been recognized as central to the trust one places in data. This paper presents a novel framework to empower provenance in a mediator-based data integration system. We use a simple mapping language for mapping schema constructs between an ontology and relational sources, capable of carrying provenance information. This language extends the traditional data exchange setting by translating our mapping specifications into source-to-target tuple-generating dependencies (s-t tgds). Then we formally define the provenance information we want to retrieve, i.e. annotation, source and tuple provenance. We provide three algorithms to retrieve provenance information using information stored in the mappings and the sources. We show the feasibility of our solution and the advantages of our framework.
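For context, a source-to-target tuple-generating dependency of the kind referred to above has the standard data-exchange form shown first; the provenance-annotated instance that follows is a hypothetical illustration of how a mapping could carry an annotation, not the paper's actual syntax.

```latex
% Generic s-t tgd (standard data exchange form)
\forall \bar{x}\,\bigl(\varphi_S(\bar{x}) \;\rightarrow\; \exists \bar{y}\,\psi_T(\bar{x},\bar{y})\bigr)

% Hypothetical provenance-annotated instance
\forall a,t\,\bigl(\mathit{Book}(a,t) \;\rightarrow\;
  \exists p\,\bigl(\mathit{Work}(p,t)\wedge \mathit{prov}(p,\text{``src:Book''})\bigr)\bigr)
```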
Topological Schemas of Cognitive Maps and Spatial Learning.
Babichev, Andrey; Cheng, Sen; Dabaghian, Yuri A
2016-01-01
Spatial navigation in mammals is based on building a mental representation of their environment, a cognitive map. However, both the nature of this cognitive map and its underpinning in neural structures and activity remain vague. A key difficulty is that these maps are collective, emergent phenomena that cannot be reduced to a simple combination of inputs provided by individual neurons. In this paper we suggest computational frameworks, which we call schemas, for integrating the spiking signals of individual cells into a spatial map. We provide examples of four schemas defined by different types of topological relations that may be neurophysiologically encoded in the brain and demonstrate that each schema provides its own large-scale characteristics of the environment, the schema integrals. Moreover, we find that, in all cases, these integrals are learned at a rate faster than the rate of complete training of neural networks. Thus, the proposed schema framework differentiates between the cognitive aspect of spatial learning and the physiological aspect at the neural network level.
Accurate Mobile Urban Mapping via Digital Map-Based SLAM
Roh, Hyunchul; Jeong, Jinyong; Cho, Younggun; Kim, Ayoung
2016-01-01
This paper presents accurate urban map generation using digital map-based Simultaneous Localization and Mapping (SLAM). Throughout this work, our main objective is to generate a 3D and lane map aiming for sub-meter accuracy. In conventional mapping approaches, extremely high accuracy was achieved either by (i) exploiting costly airborne sensors or (ii) surveying with a static mapping system on a stationary platform. Mobile scanning systems have recently gained popularity but are mostly limited by the availability of the Global Positioning System (GPS). We focus on the fact that the availability of GPS and urban structures are both sporadic but complementary. By modeling both GPS and digital map data as measurements and integrating them with other sensor measurements, we leverage SLAM for an accurate mobile mapping system. Our proposed algorithm builds an efficient graph SLAM framework that runs in real time and targets sub-meter accuracy on a mobile platform. Integrated with the SLAM framework, we implement a motion-adaptive model for Inverse Perspective Mapping (IPM). Using motion estimation derived from SLAM, the experimental results show that the proposed approaches provide stable bird’s-eye view images, even with significant motion during the drive. Our real-time map generation framework is validated via a long-distance urban test and evaluated at randomly sampled points using Real-Time Kinematic (RTK)-GPS. PMID:27548175
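The abstract's key idea, treating sporadic GPS and digital-map data as ordinary measurements inside graph SLAM, can be illustrated with a toy least-squares pose chain. The sketch below (plain Python/NumPy/SciPy, positions only, hypothetical noise values, no digital-map factors) shows the general pattern and is not the authors' system.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy 2D pose chain: states are (x, y) positions only, for brevity.
N = 50
odo = np.tile([1.0, 0.0], (N - 1, 1))            # odometry: ~1 m steps in x
gps_idx = np.arange(0, N, 10)                    # sporadic GPS fixes
truth = np.cumsum(np.vstack([[0, 0], odo]), axis=0)
gps = truth[gps_idx] + np.random.default_rng(1).normal(0, 0.5, (len(gps_idx), 2))

def residuals(flat):
    p = flat.reshape(N, 2)
    r_odo = (p[1:] - p[:-1] - odo) / 0.1         # relative (odometry) factors
    r_gps = (p[gps_idx] - gps) / 0.5             # unary (GPS) factors
    return np.concatenate([r_odo.ravel(), r_gps.ravel()])

x0 = np.zeros(2 * N)                             # poor initial guess
sol = least_squares(residuals, x0)
est = sol.x.reshape(N, 2)
print("RMS position error:", np.sqrt(np.mean((est - truth) ** 2)))
```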
Integration of electro-anatomical and imaging data of the left ventricle: An evaluation framework.
Soto-Iglesias, David; Butakoff, Constantine; Andreu, David; Fernández-Armenta, Juan; Berruezo, Antonio; Camara, Oscar
2016-08-01
Integration of electrical and structural information for scar characterization in the left ventricle (LV) is a crucial step to better guide radio-frequency ablation therapies, which are usually performed in complex ventricular tachycardia (VT) cases. This integration requires finding a common representation where to map the electrical information from the electro-anatomical map (EAM) surfaces and tissue viability information from delay-enhancement magnetic resonance images (DE-MRI). However, the development of a consistent integration method is still an open problem due to the lack of a proper evaluation framework to assess its accuracy. In this paper we present both: (i) an evaluation framework to assess the accuracy of EAM and imaging integration strategies with simulated EAM data and a set of global and local measures; and (ii) a new integration methodology based on a planar disk representation where the LV surface meshes are quasi-conformally mapped (QCM) by flattening, allowing for simultaneous visualization and joint analysis of the multi-modal data. The developed evaluation framework was applied to estimate the accuracy of the QCM-based integration strategy on a benchmark dataset of 128 synthetically generated ground-truth cases presenting different scar configurations and EAM characteristics. The obtained results demonstrate a significant reduction in global overlap errors (50-100%) with respect to state-of-the-art integration techniques, also better preserving the local topology of small structures such as conduction channels in scars. Data from seventeen VT patients were also used to study the feasibility of the QCM technique in a clinical setting, consistently outperforming the alternative integration techniques in the presence of sparse and noisy clinical data. The proposed evaluation framework has allowed a rigorous comparison of different EAM and imaging data integration strategies, providing useful information to better guide clinical practice in complex cardiac interventions. Copyright © 2016 Elsevier B.V. All rights reserved.
Theobroma cacao: A genetically integrated physical map and genome-scale comparative synteny analysis
USDA-ARS?s Scientific Manuscript database
A comprehensive integrated genomic framework is considered a centerpiece of genomic research. In collaboration with the USDA-ARS (SHRS) and Mars Inc., the Clemson University Genomics Institute (CUGI) has developed a genetically anchored physical map of the T. cacao genome. Three BAC libraries contai...
Ergonomics action research II: a framework for integrating HF into work system design.
Neumann, W P; Village, J
2012-01-01
This paper presents a conceptual framework that can support efforts to integrate human factors (HF) into the work system design process, where improved and cost-effective application of HF is possible. The framework advocates strategies of broad stakeholder participation, linking of performance and health goals, and process focussed change tools that can help practitioners engage in improvements to embed HF into a firm's work system design process. Recommended tools include business process mapping of the design process, implementing design criteria, using cognitive mapping to connect to managers' strategic goals, tactical use of training and adopting virtual HF (VHF) tools to support the integration effort. Consistent with organisational change research, the framework provides guidance but does not suggest a strict set of steps. This allows more adaptability for the practitioner who must navigate within a particular organisational context to secure support for embedding HF into the design process for improved operator wellbeing and system performance. There has been little scientific literature about how a practitioner might integrate HF into a company's work system design process. This paper proposes a framework for this effort by presenting a coherent conceptual framework, process tools, design tools and procedural advice that can be adapted for a target organisation.
Changing Consciousness: Autoethnographic Mapping in a Dialog Group
ERIC Educational Resources Information Center
Hager, Tamar; Mazali, Rela
2013-01-01
This article introduces a pedagogical tool for raising critical consciousness and nurturing resistance to discrimination. "Autoethnographic mapping," integrating guided cognitive mapping and autoethnographies, has been implemented for a decade now within the framework of a college course occasioning dialogue between Palestinian Arab and…
Bioenergy Knowledge Discovery Framework Fact Sheet
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
The Bioenergy Knowledge Discovery Framework (KDF) supports the development of a sustainable bioenergy industry by providing access to a variety of data sets, publications, and collaboration and mapping tools that support bioenergy research, analysis, and decision making. In the KDF, users can search for information, contribute data, and use the tools and map interface to synthesize, analyze, and visualize information in a spatially integrated manner.
Integrated Modeling, Mapping, and Simulation (IMMS) Framework for Exercise and Response Planning
NASA Technical Reports Server (NTRS)
Mapar, Jalal; Hoette, Trisha; Mahrous, Karim; Pancerella, Carmen M.; Plantenga, Todd; Yang, Christine; Yang, Lynn; Hopmeier, Michael
2011-01-01
Emergency management personnel at federal, state, and local levels can benefit from the increased situational awareness and operational efficiency afforded by simulation and modeling for emergency preparedness, including planning, training and exercises. To support this goal, the Department of Homeland Security's Science & Technology Directorate is funding the Integrated Modeling, Mapping, and Simulation (IMMS) program to create an integrating framework that brings together diverse models for use by the emergency response community. SUMMIT, one piece of the IMMS program, is the initial software framework that connects users such as emergency planners and exercise developers with modeling resources, bridging the gap in expertise and technical skills between these two communities. SUMMIT was recently deployed to support exercise planning for National Level Exercise 2010. Threat, casualty, infrastructure, and medical surge models were combined within SUMMIT to estimate health care resource requirements for the exercise ground truth.
Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita
2013-01-01
Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed a uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data sources personnel for implementing and managing the integration.
Testing an innovative framework for flood forecasting, monitoring and mapping in Europe
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Kalas, Milan; Lorini, Valerio; Wania, Annett; Pappenberger, Florian; Salamon, Peter; Ramos, Maria Helena; Cloke, Hannah; Castillo, Carlos
2017-04-01
Between May and June 2016, France was hit by severe floods, particularly in the Loire and Seine river basins. In this work, we use this case study to test an innovative framework for flood forecasting, mapping and monitoring. More specifically, the system integrates in real time two components of the Copernicus Emergency mapping services, namely the European Flood Awareness System and the satellite-based Rapid Mapping, with new procedures for rapid risk assessment and social media and news monitoring. We explore in detail the performance of each component of the system, demonstrating the improvements with respect to stand-alone flood forecasting and monitoring systems. We show how the performance of the forecasting component can be refined using real-time feedback from social media monitoring to identify which areas were flooded, to evaluate the flood intensity, and therefore to correct impact estimations. Moreover, we show how the integration with impact forecasts and social media monitoring can improve the timeliness and efficiency of satellite-based emergency mapping, and reduce the chances of missing areas where flooding is already happening. These results illustrate how the new integrated approach leads to better and earlier decision-making and a timely evaluation of impacts.
A Hadoop-Based Distributed Framework for Efficient Managing and Processing Big Remote Sensing Images
NASA Astrophysics Data System (ADS)
Wang, C.; Hu, F.; Hu, X.; Zhao, S.; Wen, W.; Yang, C.
2015-07-01
Various sensors from airborne and satellite platforms are producing large volumes of remote sensing images for mapping, environmental monitoring, disaster management, military intelligence, and other applications. However, it is challenging to efficiently store, query and process such big data due to data- and computing-intensive issues. In this paper, a Hadoop-based framework is proposed to manage and process big remote sensing data in a distributed and parallel manner. In particular, remote sensing data can be directly fetched from other data platforms into the Hadoop Distributed File System (HDFS). The Orfeo Toolbox, a ready-to-use tool for large image processing, is integrated into MapReduce to provide a rich set of image processing operations. With the integration of HDFS, the Orfeo Toolbox and MapReduce, these remote sensing images can be processed directly and in parallel in a scalable computing environment. The experimental results show that the proposed framework can efficiently manage and process such big remote sensing data.
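The framework itself relies on HDFS, MapReduce and the Orfeo Toolbox; as a minimal illustration of the Hadoop Streaming style of per-tile processing, the hypothetical Python mapper below reads tile paths from stdin and emits one key/value pair per scene. The .npy band layout and the NDVI statistic are placeholder assumptions standing in for an Orfeo Toolbox operation; a companion reducer would average the values per key.

```python
#!/usr/bin/env python3
"""Hadoop Streaming-style mapper: each input line is the path of an image tile."""
import sys
import numpy as np

for line in sys.stdin:
    tile_path = line.strip()
    if not tile_path:
        continue
    bands = np.load(tile_path)                     # assumed layout: (2, H, W) = [red, nir]
    red, nir = bands[0].astype(float), bands[1].astype(float)
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
    scene_id = tile_path.split("/")[-1].split("_")[0]
    # Hadoop Streaming convention: key<TAB>value on stdout
    print(f"{scene_id}\t{float(ndvi.mean())}")
```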
Rajarathinam, Vetrickarthick; Chellappa, Swarnalatha; Nagarajan, Asha
2015-01-01
This study on a component framework reveals the importance of management process and technology mapping in a business environment. We define ERP as a software tool which has to provide a business solution but not necessarily an integration of all departments. Any business process can be classified as a management process, an operational process or a supportive process. We went through the entire management process and were able to identify the influencing components to be mapped with a technology for a business solution. Governance, strategic management, and decision making are thoroughly discussed, and the need to map these components with the ERP is clearly explained. We also suggest that implementation of this framework might reduce ERP failures and, in particular, rectify ERP misfit.
Ecoregions and ecodistricts: Ecological regionalizations for the Netherlands' environmental policy
NASA Astrophysics Data System (ADS)
Klijn, Frans; de Waal, Rein W.; Oude Voshaar, Jan H.
1995-11-01
For communicating data on the state of the environment to policy makers, various integrative frameworks are used, including regional integration. For this kind of integration we have developed two related ecological regionalizations, ecoregions and ecodistricts, which are two levels in a series of classifications for hierarchically nested ecosystems at different spatial scale levels. We explain the compilation of the maps from existing geographical data, demonstrating the relatively holistic, a priori integrated approach. The resulting maps are submitted to discriminant analysis to test the consistency of the use of mapping characteristics, using data on individual abiotic ecosystem components from a national database on a 1-km2 grid. This reveals that the spatial patterns of soil, groundwater, and geomorphology correspond with the ecoregion and ecodistrict maps. Differences between the original maps and maps formed by automatically reclassifying 1-km2 cells with these discriminant components are found to be few. These differences are discussed against the background of the principal dilemma between deductive, a priori integrated, and inductive, a posteriori, classification.
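As a sketch of the kind of discriminant check described above (on synthetic data, not the Dutch 1-km2 database), scikit-learn's linear discriminant analysis can reclassify grid cells from their abiotic attributes and report how many cells keep their original map class.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical 1-km2 grid: three abiotic attributes per cell (stand-ins for
# soil, groundwater and geomorphology codes) and an a-priori map class.
rng = np.random.default_rng(42)
n_cells, n_classes = 5000, 6
y_map = rng.integers(0, n_classes, n_cells)              # class from the original map
X = rng.normal(scale=0.8, size=(n_cells, 3)) + y_map[:, None]  # attributes correlate with class

lda = LinearDiscriminantAnalysis()
y_reclass = lda.fit(X, y_map).predict(X)

# Agreement between the original map and the discriminant reclassification
agreement = np.mean(y_reclass == y_map)
print(f"Cells unchanged by reclassification: {agreement:.1%}")
```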
Quantifying and Mapping Habitat-Based Biodiversity Metrics Within an Ecosystem Services Framework
Ecosystem services have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with econom...
Soh, Jung; Turinsky, Andrei L; Trinh, Quang M; Chang, Jasmine; Sabhaney, Ajay; Dong, Xiaoli; Gordon, Paul Mk; Janzen, Ryan Pw; Hau, David; Xia, Jianguo; Wishart, David S; Sensen, Christoph W
2009-01-01
We have developed a computational framework for spatiotemporal integration of molecular and anatomical datasets in a virtual reality environment. Using two case studies involving gene expression data and pharmacokinetic data, respectively, we demonstrate how existing knowledge bases for molecular data can be semantically mapped onto a standardized anatomical context of the human body. Our data mapping methodology uses ontological representations of heterogeneous biomedical datasets and an ontology reasoner to create complex semantic descriptions of biomedical processes. This framework provides a means to systematically combine an increasing amount of biomedical imaging and numerical data into spatiotemporally coherent graphical representations. Our work enables medical researchers with different expertise to simulate complex phenomena visually and to develop insights through the use of shared data, thus paving the way for pathological inference, developmental pattern discovery and biomedical hypothesis testing.
ERIC Educational Resources Information Center
Bui, Yvonne N.; Fagan, Yvette M.
2013-01-01
The study evaluated the effects of the Integrated Reading Comprehension Strategy on two levels. The Integrated Reading Comprehension Strategy integrated story grammar instruction and story maps, prior knowledge and prediction method, and word webs through a culturally responsive teaching framework; the Integrated Reading Comprehension Strategy…
The use of concept mapping for scale development and validation in evaluation.
Rosas, Scott R; Camphausen, Lauren C
2007-05-01
Evaluators often make key decisions about what content to include when designing new scales. However, without clear conceptual grounding, there is a risk these decisions may compromise the scale's validity. Techniques such as concept mapping are available to evaluators for the specification of conceptual frameworks, but have not been used as a fully integrated part of scale development. As part of a multi-site evaluation of family support programs, we integrated concept mapping with traditional scale-development processes to strengthen the creation of a scale for inclusion in an evaluation instrument. Using concept mapping, we engaged staff and managers in the development of a framework of intended benefits of program participation and used the information to systematically select the scale's content. The psychometric characteristics of the scale were then formally assessed using a sample of program participants. The implications of the approach for supporting construct validity, inclusion of staff and managers, and theory-driven evaluation are discussed.
NASA Astrophysics Data System (ADS)
Emter, Thomas; Petereit, Janko
2014-05-01
An integrated multi-sensor fusion framework for localization and mapping for autonomous navigation in unstructured outdoor environments, based on extended Kalman filters (EKF), is presented. The sensors for localization include an inertial measurement unit, a GPS, a fiber optic gyroscope, and wheel odometry. Additionally, a 3D LIDAR is used for simultaneous localization and mapping (SLAM). A 3D map is built while, concurrently, a localization in the 2D map established so far is estimated from the current scan of the LIDAR. Despite the longer run-time of the SLAM algorithm compared to the EKF update, a high update rate is still guaranteed by sophisticated joining and synchronization of two parallel localization estimators.
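As a minimal, hedged sketch of the EKF machinery mentioned above (a linear constant-velocity model, so the EKF reduces to the standard Kalman update; the paper's actual IMU/GPS/gyro/odometry models are not reproduced), in Python/NumPy:

```python
import numpy as np

# Toy filter, state = (x, y, vx, vy); sporadic GPS position fixes only.
dt = 0.1
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], float)
Q = 0.01 * np.eye(4)                         # process noise (assumed)
H_gps = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], float)      # GPS measures position only
R_gps = 0.5 ** 2 * np.eye(2)

x, P = np.zeros(4), np.eye(4)

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, H, R):
    y = z - H @ x                            # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P

for t in range(100):
    x, P = predict(x, P)
    if t % 10 == 0:                          # sporadic GPS fix
        z = np.array([0.1 * t, 0.0]) + np.random.normal(0, 0.5, 2)
        x, P = update(x, P, z, H_gps, R_gps)
```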
3 Steps to Developing a Tribal Integrated Waste Management Plan (IWMP)
An Integrated Waste Management Plan (IWMP) is the blueprint of a comprehensive waste management program. The steps to developing an IWMP are collect background data, map out the tribal IWMP framework, and write and implement the tribal IWMP.
CARHTA GENE: multipopulation integrated genetic and radiation hybrid mapping.
de Givry, Simon; Bouchez, Martin; Chabrier, Patrick; Milan, Denis; Schiex, Thomas
2005-04-15
CarthaGene is an integrated genetic and radiation hybrid (RH) mapping tool which can deal with multiple populations, including mixtures of genetic and RH data. CarthaGene performs multipoint maximum likelihood estimations with accelerated expectation-maximization algorithms for some pedigrees and has sophisticated algorithms for marker ordering. Dedicated heuristics for framework mapping are also included. CarthaGene can be used as a C++ library, through a shell command and through a graphical interface. XML output for companion tools is integrated. The program is available free of charge from www.inra.fr/bia/T/CarthaGene for Linux, Windows and Solaris machines (with Open Source). tschiex@toulouse.inra.fr.
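CarthaGene's multipoint EM estimation is beyond a short sketch, but the standard mapping functions that convert estimated recombination fractions into the map distances (cM) such tools report are simple; a minimal Python version:

```python
import numpy as np

def haldane_cM(r):
    """Haldane map distance (cM) from recombination fraction r (no interference)."""
    return -50.0 * np.log(1.0 - 2.0 * np.asarray(r))

def kosambi_cM(r):
    """Kosambi map distance (cM), allowing for crossover interference."""
    r = np.asarray(r)
    return 25.0 * np.log((1.0 + 2.0 * r) / (1.0 - 2.0 * r))

print(haldane_cM(0.10))   # ~11.16 cM
print(kosambi_cM(0.10))   # ~10.14 cM
```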
Mapping SOA Artefacts onto an Enterprise Reference Architecture Framework
NASA Astrophysics Data System (ADS)
Noran, Ovidiu
Currently, there is still no common agreement on the service-oriented architecture (SOA) definition, or on the types and meaning of the artefacts involved in the creation and maintenance of an SOA. Furthermore, the SOA image shift from an infrastructure solution to a business-wide change project may have promoted a perception that SOA is a parallel initiative, a competitor and perhaps a successor of enterprise architecture (EA). This chapter attempts to map several typical SOA artefacts onto an enterprise reference framework commonly used in EA. This is done in order to show that the EA framework can express and structure most of the SOA artefacts and that, therefore, a framework for SOA could in fact be derived from an EA framework, with the ensuing SOA-EA integration benefits.
Ecosystem services (ESS) represent an ecosystems capacity for satisfying essential human needs, directly or indirectly, above that required to maintain ecosystem integrity (structure, function and processes). The spatial characterization and mapping of ESS is an essential first s...
Concept mapping as a promising method to bring practice into science.
van Bon-Martens, M J H; van de Goor, L A M; Holsappel, J C; Kuunders, T J M; Jacobs-van der Bruggen, M A M; te Brake, J H M; van Oers, J A M
2014-06-01
Concept mapping is a method for developing a conceptual framework of a complex topic for use as a guide to evaluation or planning. In concept mapping, thoughts and ideas are represented in the form of a picture or map, the content of which is determined by a group of stakeholders. This study aimed to explore the suitability of this method as a tool to integrate practical knowledge with scientific knowledge in order to improve theory development as a sound basis for practical decision-making. Following a short introduction to the method of concept mapping, five Dutch studies, serving different purposes and fields in public health, will be described. The aim of these studies was: to construct a theoretical framework for good regional public health reporting; to design an implementation strategy for a guideline for integral local health policy; to guide the evaluation of a local integral approach of overweight and obesity in youth; to guide the construction of a questionnaire to measure the quality of postdisaster psychosocial care; and to conceptualize an integral base for formulation of ambitions and targets for the new youth healthcare programme of a regional health service. The studies showed that concept mapping is a way to integrate practical and scientific knowledge with careful selection of participants that represent the different perspectives. Theory development can be improved through concept mapping; not by formulating new theories, but by highlighting the key issues and defining perceived relationships between topics. In four of the five studies, the resulting concept map was received as a sound basis for practical decision-making. Concept mapping is a valuable method for evidence-based public health policy, and a powerful instrument for facilitating dialogue, coherence and collaboration between researchers, practitioners, policy makers and the public. Development of public health theory was realized by a step-by-step approach, considering both scientific and practical knowledge. However, the external validity of the concept maps in place and time is of importance. Copyright © 2014 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
An Ontology-Based Reasoning Framework for Querying Satellite Images for Disaster Monitoring.
Alirezaie, Marjan; Kiselev, Andrey; Längkvist, Martin; Klügl, Franziska; Loutfi, Amy
2017-11-05
This paper presents a framework in which satellite images are classified and augmented with additional semantic information to enable queries about what can be found on the map at a particular location, but also about paths that can be taken. This is achieved by a reasoning framework based on qualitative spatial reasoning that is able to find answers to high-level queries that may vary with the current situation. This framework, called SemCityMap, provides the full pipeline from enriching the raw image data with rudimentary labels to the integration of knowledge representation and reasoning methods and user interfaces for high-level querying. To illustrate the utility of SemCityMap in a disaster scenario, we use an urban environment (central Stockholm) in combination with a flood simulation. We show that the system provides useful answers to high-level queries also with respect to the current flood status. Examples of such queries concern path planning for vehicles or retrieval of safe regions such as "find all regions close to schools and far from the flooded area". The particular advantage of our approach lies in the fact that ontological information and reasoning are explicitly integrated so that queries can be formulated in a natural way using concepts at an appropriate level of abstraction, including additional constraints.
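The abstract's example query ("find all regions close to schools and far from the flooded area") could, in an RDF-based setting, be expressed roughly as the SPARQL sketch below; the namespace, predicates and toy facts are hypothetical and not taken from SemCityMap.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/semcity#")   # hypothetical ontology namespace
g = Graph()

# Toy facts: r1 is near a school, r2 overlaps the flooded area.
g.add((EX.r1, RDF.type, EX.Region))
g.add((EX.r1, EX.near, EX.school1))
g.add((EX.school1, RDF.type, EX.School))
g.add((EX.r2, RDF.type, EX.Region))
g.add((EX.r2, EX.overlaps, EX.floodZone))

# "Find all regions close to schools and not in the flooded area"
q = """
SELECT ?r WHERE {
  ?r a ex:Region ;
     ex:near ?s .
  ?s a ex:School .
  FILTER NOT EXISTS { ?r ex:overlaps ex:floodZone }
}
"""
for row in g.query(q, initNs={"ex": EX}):
    print(row.r)   # -> http://example.org/semcity#r1
```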
Development of a landscape integrity model framework to support regional conservation planning.
Walston, Leroy J; Hartmann, Heidi M
2018-01-01
Land managers increasingly rely upon landscape assessments to understand the status of natural resources and identify conservation priorities. Many of these landscape planning efforts rely on geospatial models that characterize the ecological integrity of the landscape. These general models utilize measures of habitat disturbance and human activity to map indices of ecological integrity. We built upon these modeling frameworks by developing a Landscape Integrity Index (LII) model using geospatial datasets of the human footprint, as well as incorporation of other indicators of ecological integrity such as biodiversity and vegetation departure. Our LII model serves as a general indicator of ecological integrity in a regional context of human activity, biodiversity, and change in habitat composition. We also discuss the application of the LII framework in two related coarse-filter landscape conservation approaches to expand the size and connectedness of protected areas as regional mitigation for anticipated land-use changes.
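A minimal sketch of a landscape-integrity style index, combining normalized raster layers with an assumed weighting, is shown below in Python/NumPy; the LII model's actual inputs, weights and scaling are not reproduced here.

```python
import numpy as np

# Hypothetical raster layers on a common grid, each scaled to [0, 1]
rng = np.random.default_rng(7)
human_footprint = rng.random((100, 100))       # higher = more disturbed
biodiversity    = rng.random((100, 100))       # higher = more intact
veg_departure   = rng.random((100, 100))       # higher = more departed from reference

weights = {"footprint": 0.5, "biodiversity": 0.3, "departure": 0.2}  # assumed weights

lii = (weights["footprint"]    * (1.0 - human_footprint)
       + weights["biodiversity"] * biodiversity
       + weights["departure"]    * (1.0 - veg_departure))

# lii is a 0-1 index; higher values indicate greater landscape integrity
print(lii.min(), lii.max())
```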
2014-01-01
Background Modern watermelon (Citrullus lanatus L.) cultivars share a narrow genetic base due to many years of selection for desirable horticultural qualities. Wild subspecies within C. lanatus are important potential sources of novel alleles for watermelon breeding, but successful trait introgression into elite cultivars has had limited success. The application of marker assisted selection (MAS) in watermelon is yet to be realized, mainly due to the past lack of high quality genetic maps. Recently, a number of useful maps have become available, however these maps have few common markers, and were constructed using different marker sets, thus, making integration and comparative analysis among maps difficult. The objective of this research was to use single-nucleotide polymorphism (SNP) anchor markers to construct an integrated genetic map for C. lanatus. Results Under the framework of the high density genetic map, an integrated genetic map was constructed by merging data from four independent mapping experiments using a genetically diverse array of parental lines, which included three subspecies of watermelon. The 698 simple sequence repeat (SSR), 219 insertion-deletion (InDel), 36 structure variation (SV) and 386 SNP markers from the four maps were used to construct an integrated map. This integrated map contained 1339 markers, spanning 798 cM with an average marker interval of 0.6 cM. Fifty-eight previously reported quantitative trait loci (QTL) for 12 traits in these populations were also integrated into the map. In addition, new QTL identified for brix, fructose, glucose and sucrose were added. Some QTL associated with economically important traits detected in different genetic backgrounds mapped to similar genomic regions of the integrated map, suggesting that such QTL are responsible for the phenotypic variability observed in a broad array of watermelon germplasm. Conclusions The integrated map described herein enhances the utility of genomic tools over previous watermelon genetic maps. A large proportion of the markers in the integrated map are SSRs, InDels and SNPs, which are easily transferable across laboratories. Moreover, the populations used to construct the integrated map include all three watermelon subspecies, making this integrated map useful for the selection of breeding traits, identification of QTL, MAS, analysis of germplasm and commercial hybrid seed detection. PMID:24443961
Genetic map of artichoke × wild cardoon: toward a consensus map for Cynara cardunculus.
Sonnante, Gabriella; Gatto, Angela; Morgese, Anita; Montemurro, Francesco; Sarli, Giulio; Blanco, Emanuela; Pignone, Domenico
2011-11-01
An integrated consensus linkage map is proposed for globe artichoke. Maternal and paternal genetic maps were constructed on the basis of an F(1) progeny derived from crossing an artichoke genotype (Mola) with its progenitor, the wild cardoon (Tolfa), using EST-derived SSRs, genomic SSRs, AFLPs, ten genes, and two morphological traits. For most genes, mainly belonging to the chlorogenic acid pathway, new markers were developed. Five of these were SNP markers analyzed through high-resolution melt technology. From the maternal (Mola) and paternal (Tolfa) maps, an integrated map was obtained, containing 337 molecular and one morphological markers ordered in 17 linkage groups (LGs), linked between Mola and Tolfa. The integrated map covers 1,488.8 cM, with an average distance of 4.4 cM between markers. The map was aligned with already existing maps for artichoke, and 12 LGs were linked via 31 bridge markers. LG numbering has been proposed. A total of 124 EST-SSRs and two genes were mapped here for the first time, providing a framework for the construction of a functional map in artichoke. The establishment of a consensus map represents a necessary condition to plan a complete sequencing of the globe artichoke genome.
An Integrated Approach for Urban Earthquake Vulnerability Analyses
NASA Astrophysics Data System (ADS)
Düzgün, H. S.; Yücemen, M. S.; Kalaycioglu, H. S.
2009-04-01
The earthquake risk for an urban area has increased over the years due to the increasing complexities of urban environments. The main reasons are the location of major cities in hazard-prone areas, growth in urbanization and population, and rising wealth. In recent years, the physical consequences of these factors have been observed in the growing costs of major disasters in urban areas, which have stimulated a demand for in-depth evaluation of possible strategies to manage the large-scale damaging effects of earthquakes. Understanding and formulating urban earthquake risk requires consideration of a wide range of risk aspects, which can be handled by developing an integrated approach. In such an integrated approach, an interdisciplinary view should be incorporated into the risk assessment. Risk assessment for an urban area requires prediction of vulnerabilities related to elements at risk in the urban area and integration of individual vulnerability assessments. However, due to the complex nature of an urban environment, estimating vulnerabilities and integrating them necessitates the development of integrated approaches in which vulnerabilities of social, economic, structural (building stock and infrastructure), cultural and historical heritage are estimated for a given urban area over a given time period. In this study an integrated urban earthquake vulnerability assessment framework, which considers the vulnerability of the urban environment in a holistic manner and performs the vulnerability assessment for the smallest administrative unit, namely at the neighborhood scale, is proposed. The main motivation behind this approach is the inability to implement existing vulnerability assessment methodologies for countries like Turkey, where the required data are usually missing or inadequate and decision makers seek prioritization of their limited resources for risk reduction in the administrative districts for which they are responsible. The methodology integrates socio-economic, structural, coastal, ground-condition and organizational vulnerabilities, as well as accessibility to critical services, within the framework. The proposed framework has the following eight components: seismic hazard analysis, soil response analysis, tsunami inundation analysis, structural vulnerability analysis, socio-economic vulnerability analysis, accessibility to critical services, GIS-based integrated vulnerability assessment, and visualization of vulnerabilities in a 3D virtual city model. The integrated model for the various vulnerabilities obtained for the urban area is developed in a GIS environment by using the individual vulnerability assessments for the considered elements at risk and serves to establish the backbone of the spatial decision support system. The stages followed in the model are: determination of a common mapping unit for each aspect of urban earthquake vulnerability, formation of a geo-database for the vulnerabilities, evaluation of urban vulnerability based on multi-attribute utility theory with various weighting algorithms, and mapping of the evaluated integrated earthquake risk in geographic information systems (GIS) at the neighborhood scale. The framework is also applicable to larger geographical mapping scales, for example, the building scale. When illustrating the results at the building scale, 3-D visualizations with remote sensing data are used so that decision-makers can easily interpret the outputs.
The proposed vulnerability assessment framework is flexible and can easily be applied to urban environments at various geographical scales with different mapping units. The total vulnerability maps obtained for the urban area provide a baseline for the development of risk reduction strategies by decision makers. Moreover, as several aspects of the elements at risk for an urban area are considered through the vulnerability analyses, the effect of changes in vulnerability conditions on the total can easily be determined. The developed approach also enables decision makers to monitor temporal and spatial changes in the urban environment due to the implementation of risk reduction strategies.
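The GIS aggregation step described above, evaluating urban vulnerability with multi-attribute utility theory and weighting algorithms, can be sketched as a simple additive weighted sum over normalized neighborhood indicators; the indicator names, values and weights below are hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical neighborhood-level vulnerability indicators, each rescaled to [0, 1]
df = pd.DataFrame({
    "structural":    [0.8, 0.4, 0.6],
    "socioeconomic": [0.5, 0.7, 0.2],
    "ground":        [0.9, 0.3, 0.5],
    "accessibility": [0.4, 0.6, 0.8],
}, index=["N1", "N2", "N3"])

weights = pd.Series({"structural": 0.4, "socioeconomic": 0.3,
                     "ground": 0.2, "accessibility": 0.1})   # assumed weights, sum to 1

# Additive multi-attribute utility: weighted sum of normalized indicators
df["total_vulnerability"] = df.mul(weights, axis=1).sum(axis=1)
print(df.sort_values("total_vulnerability", ascending=False))
```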
A framework for simulating map error in ecosystem models
Sean P. Healey; Shawn P. Urbanski; Paul L. Patterson; Chris Garrard
2014-01-01
The temporal depth and spatial breadth of observations from platforms such as Landsat provide unique perspective on ecosystem dynamics, but the integration of these observations into formal decision support will rely upon improved uncertainty accounting. Monte Carlo (MC) simulations offer a practical, empirical method of accounting for potential map errors in broader...
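A minimal sketch of the Monte Carlo idea, perturbing a classified map according to an assumed per-class error profile and propagating the draws into class-area uncertainty (the confusion matrix and map here are synthetic, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)
n_pixels, n_classes, n_draws = 10_000, 3, 500

mapped = rng.integers(0, n_classes, n_pixels)        # hypothetical classified map

# Assumed confusion matrix: P(true class | mapped class), rows sum to 1
conf = np.array([[0.90, 0.07, 0.03],
                 [0.10, 0.85, 0.05],
                 [0.05, 0.05, 0.90]])

areas = np.empty((n_draws, n_classes))
for d in range(n_draws):
    # Draw a "true" class for every pixel from its mapped class's error profile
    u = rng.random(n_pixels)
    cum = conf[mapped].cumsum(axis=1)
    truth = (u[:, None] < cum).argmax(axis=1)
    areas[d] = np.bincount(truth, minlength=n_classes) / n_pixels

print("mean class proportions:", areas.mean(axis=0))
print("MC standard errors:    ", areas.std(axis=0))
```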
The fun integration theory: toward sustaining children and adolescents sport participation.
Visek, Amanda J; Achrati, Sara M; Mannix, Heather; McDonnell, Karen; Harris, Brandonn S; DiPietro, Loretta
2015-03-01
Children cite "fun" as the primary reason for participation in organized sport and its absence as the number-one reason for youth sport attrition. Therefore, the purpose of this study was to develop a theoretical framework of fun using a novel mixed-method assessment of participants in sport (FUN MAPS) via concept mapping. Youth soccer players (n = 142), coaches (n = 37), and parents (n = 57) were stratified by age, sex, and competition level and contributed their ideas through (a) qualitative brainstorming, identifying all of the things that make playing sports fun for players; (b) sorting of ideas; and (c) rating each idea on its importance, frequency, and feasibility. The FUN MAPS identify the 4 fundamental tenets of fun in youth sport within 11 fun-dimensions composed of 81 specific fun-determinants, while also establishing the youth sport ethos. The FUN MAPS provide pictorial evidence-based blueprints for the fun integration theory (FIT), which is a multitheoretical, multidimensional, and stakeholder derived framework that can be used to maximize fun for children and adolescents to promote and sustain an active and healthy lifestyle through sport.
Ecological subregion codes by county, coterminous United States
Victor A. Rudis
1999-01-01
This publication presents the National Hierarchical Framework of Ecological Units (ECOMAP 1993) by county for the coterminous United States. Assignment of the framework to individual counties is based on the predominant area by province and section to facilitate integration of county-referenced information with areas of uniform ecological potential. Included are maps...
Integrated Modeling, Mapping, and Simulation (IMMS) framework for planning exercises.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman-Hill, Ernest J.; Plantenga, Todd D.
2010-06-01
The Integrated Modeling, Mapping, and Simulation (IMMS) program is designing and prototyping a simulation and collaboration environment for linking together existing and future modeling and simulation tools to enable analysts, emergency planners, and incident managers to more effectively, economically, and rapidly prepare, analyze, train, and respond to real or potential incidents. When complete, the IMMS program will demonstrate an integrated modeling and simulation capability that supports emergency managers and responders with (1) conducting 'what-if' analyses and exercises to address preparedness, analysis, training, operations, and lessons learned, and (2) effectively, economically, and rapidly verifying response tactics, plans and procedures.
The knowledge-value chain: A conceptual framework for knowledge translation in health.
Landry, Réjean; Amara, Nabil; Pablos-Mendes, Ariel; Shademani, Ramesh; Gold, Irving
2006-08-01
This article briefly discusses knowledge translation and lists the problems associated with it. Then it uses knowledge-management literature to develop and propose a knowledge-value chain framework in order to provide an integrated conceptual model of knowledge management and application in public health organizations. The knowledge-value chain is a non-linear concept and is based on the management of five dyadic capabilities: mapping and acquisition, creation and destruction, integration and sharing/transfer, replication and protection, and performance and innovation.
NASA Astrophysics Data System (ADS)
Werts, Joshua D.; Mikhailova, Elena A.; Post, Christopher J.; Sharp, Julia L.
2012-04-01
Volunteered geographic information and social networking in a WebGIS has the potential to increase public participation in soil and water conservation, promote environmental awareness and change, and provide timely data that may be otherwise unavailable to policymakers in soil and water conservation management. The objectives of this study were: (1) to develop a framework for combining current technologies, computing advances, data sources, and social media; and (2) develop and test an online web mapping interface. The mapping interface integrates Microsoft Silverlight, Bing Maps, ArcGIS Server, Google Picasa Web Albums Data API, RSS, Google Analytics, and Facebook to create a rich user experience. The website allows the public to upload photos and attributes of their own subdivisions or sites they have identified and explore other submissions. The website was made available to the public in early February 2011 at http://www.AbandonedDevelopments.com and evaluated for its potential long-term success in a pilot study.
Atlas Basemaps in Web 2.0 Epoch
NASA Astrophysics Data System (ADS)
Chabaniuk, V.; Dyshlyk, O.
2016-06-01
The authors analyze their experience producing various Electronic Atlases (EA) and Atlas Information Systems (AtIS) of the so-called "classical type". These EA/AtIS were implemented over the past decade in the Web 1.0 architecture (e.g., the National Atlas of Ukraine, the Atlas of radioactive contamination of Ukraine, and others). One of their main distinguishing features was their static nature: the end user could not change the content of the EA/AtIS. Basemaps are a very important element of any EA/AtIS. In classical-type EA/AtIS they were static datasets consisting of two parts: topographic data at a fixed scale and data on the administrative-territorial division of Ukraine. It is important to note that the topographic data production technique was based on direct channels of topographic observation (such as aerial photography) for the selected scale. Changes in information technology over the past half-decade are characterized by the advent of the "Web 2.0 epoch". As a result, phenomena such as "neo-cartography" and mapping platforms like OpenStreetMap have appeared in cartography. These changes have forced developers of EA/AtIS to use new atlas basemaps; our approach is described in this article. The phenomena of neo-cartography and Web 2.0 cartography are analyzed using the previously developed Conceptual framework of EA/AtIS. This framework logically explains the relations of cartographic phenomena across three formations: Web 1.0, Web 1.0x1.0 and Web 2.0. Atlas basemaps of the Web 2.0 epoch are integrated information systems. We use several ways to integrate separate atlas basemaps into an information system, by building a weakly integrated information system, a structured system, or a meta-system. The resulting integrated information system consists of several basemaps and falls under the definition of "big data". In real projects, basemaps of three strata are already used: Conceptual, Application and Operational, and several variants of the basemap are possible for each stratum. Furthermore, the developed integration methods allow the application of different types of basemaps within a specific EA/AtIS to be coordinated logically. For example, variants of the Conceptual-stratum basemap, such as the National map of Ukraine of our own production and external resources such as OpenStreetMap, are used with the help of meta-system replacement procedures. The authors propose a Conceptual framework of the basemap, which consists of a Conceptual solutions framework and several Application solutions frameworks; the Conceptual framework is intended to be reused in many projects and thus significantly reduce resources. Application frameworks are differentiated for mobile and non-mobile environments. The results of the research have been applied in several EA produced in 2014-2015 at the Institute of Geography of the National Academy of Sciences of Ukraine. One of them is the Atlas of emergency situations, which includes elements that work on mobile devices; at its core is a "ubiquitous" subset of the Atlas.
Indoor Map Aided Wi-Fi Integrated Lbs on Smartphone Platforms
NASA Astrophysics Data System (ADS)
Yu, C.; El-Sheimy, N.
2017-09-01
In this research, an indoor-map-aided INS/Wi-Fi integrated location-based services (LBS) application is proposed and implemented on smartphone platforms. Indoor map information, together with measurements from an inertial measurement unit (IMU) and Received Signal Strength Indicator (RSSI) values from Wi-Fi, is collected to obtain an accurate, continuous, and low-cost position solution. The main challenge of this research is to make effective use of various measurements that complement each other without increasing the computational burden of the system. The integrated system in this paper includes three modules: INS, Wi-Fi (if a signal is available), and indoor maps. A cascaded Particle/Kalman filter framework is applied to combine the different modules. First, the INS position and the Wi-Fi fingerprint position are integrated through a Kalman filter to estimate the positioning information. Then, indoor map information is applied to correct the error of the INS/Wi-Fi estimated position through a particle filter. Indoor tests show that the proposed method can effectively reduce the accumulated positioning errors of stand-alone INS systems and provide a stable, continuous, and reliable indoor location service.
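The cascade idea can be illustrated with a deliberately simplified one-dimensional Python sketch: a scalar Kalman update fuses the INS-propagated position with a Wi-Fi fingerprint fix, and a crude map constraint then clamps the estimate to a walkable corridor. The noise values and the corridor test are assumptions, not the paper's parameters, and the real system works in 2D with a particle filter for the map stage.

    # Simplified 1-D illustration of cascaded fusion: Kalman update of an INS
    # prediction with a Wi-Fi position fix, followed by a crude indoor-map check.
    # Noise values and the walkable-space test are illustrative assumptions.

    def kalman_update(x_pred, p_pred, z, r):
        """Scalar Kalman update: state x, variance p, measurement z, noise r."""
        k = p_pred / (p_pred + r)          # Kalman gain
        x = x_pred + k * (z - x_pred)
        p = (1.0 - k) * p_pred
        return x, p

    def snap_to_corridor(x, corridor=(0.0, 50.0)):
        """Map constraint: clamp the estimate to the walkable corridor extent."""
        return min(max(x, corridor[0]), corridor[1])

    x_ins, p_ins = 12.4, 4.0               # INS-propagated position and variance
    z_wifi, r_wifi = 10.1, 9.0             # Wi-Fi fingerprint fix and its variance
    x_fused, p_fused = kalman_update(x_ins, p_ins, z_wifi, r_wifi)
    print(snap_to_corridor(x_fused), round(p_fused, 2))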
The National Map - Orthoimagery Layer
,
2007-01-01
Many Federal, State, and local agencies use a common set of framework geographic information databases as a tool for economic and community development, land and natural resource management, and health and safety services. Emergency management and homeland security applications rely on this information. Private industry, nongovernmental organizations, and individual citizens use the same geographic data. Geographic information underpins an increasingly large part of the Nation's economy. The U.S. Geological Survey (USGS) is developing The National Map to be a seamless, continually maintained, and nationally consistent set of online, public domain, framework geographic information databases. The National Map will serve as a foundation for integrating, sharing, and using data easily and consistently. The data will be the source of revised paper topographic maps. The National Map includes digital orthorectified imagery; elevation data; vector data for hydrography, transportation, boundary, and structure features; geographic names; and land cover information.
Integrated health monitoring and controls for rocket engines
NASA Technical Reports Server (NTRS)
Merrill, W. C.; Musgrave, J. L.; Guo, T. H.
1992-01-01
Current research in intelligent control systems at the Lewis Research Center is described in the context of a functional framework. The framework is applicable to a variety of reusable space propulsion systems for existing and future launch vehicles. It provides a 'road map' for technology development to enable enhanced engine performance with increased reliability, durability, and maintainability. The framework hierarchy consists of a mission coordination level, a propulsion system coordination level, and an engine control level. Each level is described in the context of the Space Shuttle Main Engine. The concept of integrating diagnostics with control is discussed within the context of the functional framework. A distributed real-time simulation testbed is used to realize and evaluate the functionalities in closed loop.
A Comprehensive Linkage Map of the Dog Genome
Wong, Aaron K.; Ruhe, Alison L.; Dumont, Beth L.; Robertson, Kathryn R.; Guerrero, Giovanna; Shull, Sheila M.; Ziegle, Janet S.; Millon, Lee V.; Broman, Karl W.; Payseur, Bret A.; Neff, Mark W.
2010-01-01
We have leveraged the reference sequence of a boxer to construct the first complete linkage map for the domestic dog. The new map improves access to the dog's unique biology, from human disease counterparts to fascinating evolutionary adaptations. The map was constructed with ∼3000 microsatellite markers developed from the reference sequence. Familial resources afforded 450 mostly phase-known meioses for map assembly. The genotype data supported a framework map with ∼1500 loci. An additional ∼1500 markers served as map validators, contributing modestly to estimates of recombination rate but supporting the framework content. Data from ∼22,000 SNPs informing on a subset of meioses supported map integrity. The sex-averaged map extended 21 M and revealed marked region- and sex-specific differences in recombination rate. The map will enable empiric coverage estimates and multipoint linkage analysis. Knowledge of the variation in recombination rate will also inform on genomewide patterns of linkage disequilibrium (LD), and thus benefit association, selective sweep, and phylogenetic mapping approaches. The computational and wet-bench strategies can be applied to the reference genome of any nonmodel organism to assemble a de novo linkage map. PMID:19966068
Knowledge mapping as a technique to support knowledge translation.
Ebener, S.; Khan, A.; Shademani, R.; Compernolle, L.; Beltran, M.; Lansang, Ma; Lippman, M.
2006-01-01
This paper explores the possibility of integrating knowledge mapping into a conceptual framework that could serve as a tool for understanding the many complex processes, resources and people involved in a health system, and for identifying potential gaps within knowledge translation processes in order to address them. After defining knowledge mapping, this paper presents various examples of the application of this process in health, before looking at the steps that need to be taken to identify potential gaps, to determine to what extent these gaps affect the knowledge translation process and to establish their cause. This is followed by proposals for interventions aimed at strengthening the overall process. Finally, potential limitations on the application of this framework at the country level are addressed. PMID:16917651
ERIC Educational Resources Information Center
Gulyaev, Sergei A.; Stonyer, Heather R.
2002-01-01
Develops an integrated approach based on the use of general systems theory (GST) and the concept of 'mapping' scientific knowledge to provide students with tools for a more holistic understanding of science. Uses GST as the core methodology for understanding science and its complexity. Discusses the role of scientific community in producing…
Organization and integration of biomedical knowledge with concept maps for key peroxisomal pathways.
Willemsen, A M; Jansen, G A; Komen, J C; van Hooff, S; Waterham, H R; Brites, P M T; Wanders, R J A; van Kampen, A H C
2008-08-15
One important area of clinical genomics research involves the elucidation of molecular mechanisms underlying (complex) disorders which eventually may lead to new diagnostic or drug targets. To further advance this area of clinical genomics one of the main challenges is the acquisition and integration of data, information and expert knowledge for specific biomedical domains and diseases. Currently the required information is not very well organized but scattered over biological and biomedical databases, basic text books, scientific literature and experts' minds and may be highly specific, heterogeneous, complex and voluminous. We present a new framework to construct knowledge bases with concept maps for presentation of information and the web ontology language OWL for the representation of information. We demonstrate this framework through the construction of a peroxisomal knowledge base, which focuses on four key peroxisomal pathways and several related genetic disorders. All 155 concept maps in our knowledge base are linked to at least one other concept map, which allows the visualization of one big network of related pieces of information. The peroxisome knowledge base is available from www.bioinformaticslaboratory.nl (Support-->Web applications). Supplementary data is available from www.bioinformaticslaboratory.nl (Research-->Output--> Publications--> KB_SuppInfo)
The Fun Integration Theory: Towards Sustaining Children and Adolescents Sport Participation
Visek, Amanda J.; Achrati, Sara M.; Manning, Heather; McDonnell, Karen; Harris, Brandonn S.; DiPietro, Loretta
2014-01-01
BACKGROUND Children cite ‘fun’ as the primary reason for participation in organized sport and its absence as the number one reason for youth sport attrition. Therefore, the purpose of this study was to develop a theoretical framework of fun using a novel mixed-method assessment of participants in sport (FUN MAPS) via concept mapping. METHODS Youth soccer players (n = 142), coaches (n = 37), and parents (n = 57) were stratified by age, sex, and competition level and contributed their “fun” ideas through: (a) qualitative brainstorming, identifying all of the things that make playing sports fun for players; (b) sorting of ideas; and (c) rating each idea on its importance, frequency, and feasibility. RESULTS The FUN MAPS identify the four fundamental tenets of fun in youth sport within 11 fun-dimensions composed of 81 specific fun-determinants, while also establishing the youth sport ethos. CONCLUSION The FUN MAPS provide pictorial evidence-based blueprints for the fun integration theory (FIT), which is a multi-theoretical, multidimensional, and stakeholder derived framework that can be used to maximize fun for children and adolescents in order to promote and sustain an active and healthy lifestyle through sport. PMID:24770788
Infrared and visible image fusion method based on saliency detection in sparse domain
NASA Astrophysics Data System (ADS)
Liu, C. H.; Qi, Y.; Ding, W. R.
2017-06-01
Infrared and visible image fusion is a key problem in the field of multi-sensor image fusion. To better preserve the significant information of the infrared and visible images in the final fused image, the saliency maps of the source images are introduced into the fusion procedure. Firstly, under the framework of the joint sparse representation (JSR) model, the global and local saliency maps of the source images are obtained based on the sparse coefficients. Then, a saliency detection model is proposed, which combines the global and local saliency maps to generate an integrated saliency map. Finally, a weighted fusion algorithm based on the integrated saliency map is developed to achieve the fusion process. The experimental results show that our method is superior to the state-of-the-art methods in terms of several universal quality evaluation indexes, as well as in visual quality.
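A minimal numpy sketch of the final weighted-fusion step, assuming the integrated saliency map is already available; here a crude intensity-based stand-in replaces the JSR-derived saliency described above.

    # Illustrative pixel-wise weighted fusion of an infrared and a visible image
    # using a normalized saliency map as the weight; the arrays and the saliency
    # map are stand-ins, not the JSR-based maps described above.
    import numpy as np

    def saliency_weighted_fusion(ir, vis, saliency):
        w = (saliency - saliency.min()) / (saliency.max() - saliency.min() + 1e-12)
        return w * ir + (1.0 - w) * vis        # salient pixels favor the IR image

    ir  = np.random.rand(64, 64)        # placeholder infrared image
    vis = np.random.rand(64, 64)        # placeholder visible image
    sal = np.abs(ir - ir.mean())        # crude stand-in for the integrated saliency map
    fused = saliency_weighted_fusion(ir, vis, sal)
    print(fused.shape, float(fused.mean()))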
Warren B. Cohen; Hans-Erik Andersen; Sean P. Healey; Gretchen G. Moisen; Todd A. Schroeder; Christopher W. Woodall; Grant M. Domke; Zhiqiang Yang; Robert E. Kennedy; Stephen V. Stehman; Curtis Woodcock; Jim Vogelmann; Zhe Zhu; Chengquan Huang
2015-01-01
We are developing a system that provides temporally consistent biomass estimates for national greenhouse gas inventory reporting to the United Nations Framework Convention on Climate Change. Our model-assisted estimation framework relies on remote sensing to scale from plot measurements to lidar strip samples, to Landsat time series-based maps. As a demonstration, new...
Exploration and implementation of ontology-based cultural relic knowledge map integration platform
NASA Astrophysics Data System (ADS)
Yang, Weiqiang; Dong, Yiqiang
2018-05-01
To help designers better carry out creative design and to improve the ability to search traditional cultural relic information, an ontology-based knowledge map construction method was explored and an integrated platform for a cultural relic knowledge map was developed. First, a construction method for the cultural relic ontology was put forward, and the knowledge map of cultural relics was built on the constructed ontology. Then, a personalized semantic retrieval framework for creative design was proposed. Finally, the integrated platform for the cultural relic knowledge map was designed and realized. The platform is divided into two parts: a foreground display system used by designers to search and browse cultural relics, and a background management system used by cultural-heritage experts to manage knowledge about the relics. The research results showed that the designed platform can improve the retrieval of cultural relic information. In summary, the platform provides good support for designers' creative design.
Astronomical Data Integration Beyond the Virtual Observatory
NASA Astrophysics Data System (ADS)
Lemson, G.; Laurino, O.
2015-09-01
"Data integration" generally refers to the process of combining data from different source data bases into a unified view. Much work has been devoted in this area by the International Virtual Observatory Alliance (IVOA), allowing users to discover and access databases through standard protocols. However, different archives present their data through their own schemas and users must still select, filter, and combine data for each archive individually. An important reason for this is that the creation of common data models that satisfy all sub-disciplines is fraught with difficulties. Furthermore it requires a substantial amount of work for data providers to present their data according to some standard representation. We will argue that existing standards allow us to build a data integration framework that works around these problems. The particular framework requires the implementation of the IVOA Table Access Protocol (TAP) only. It uses the newly developed VO data modelling language (VO-DML) specification, which allows one to define extensible object-oriented data models using a subset of UML concepts through a simple XML serialization language. A rich mapping language allows one to describe how instances of VO-DML data models are represented by the TAP service, bridging the possible mismatch between a local archive's schema and some agreed-upon representation of the astronomical domain. In this so called local-as-view approach to data integration, “mediators" use the mapping prescriptions to translate queries phrased in terms of the common schema to the underlying TAP service. This mapping language has a graphical representation, which we expose through a web based graphical “drag-and-drop-and-connect" interface. This service allows any user to map the holdings of any TAP service to the data model(s) of choice. The mappings are defined and stored outside of the data sources themselves, which allows the interface to be used in a kind of crowd-sourcing effort to annotate any remote database of interest. This reduces the burden of publishing one's data and allows a great flexibility in the definition of the views through which particular communities might wish to access remote archives. At the same time, the framework easies the user's effort to select, filter, and combine data from many different archives, so as to build knowledge bases for their analysis. We will present the framework and demonstrate a prototype implementation. We will discuss ideas for producing the missing elements, in particular the query language and the implementation of mediator tools to translate object queries to ADQL
NASA Astrophysics Data System (ADS)
Lin, T.; Lin, Z.; Lim, S.
2017-12-01
We present an integrated modeling framework to simulate groundwater level change under the dramatic increase of hydraulic fracturing water use in the Bakken Shale oil production area. The framework combines an agent-based model (ABM) with the Fox Hills-Hell Creek (FH-HC) groundwater model. In the development of the ABM, institution theory is used to model the regulation policies of the North Dakota State Water Commission, while evolutionary programming and cognitive maps are used to model the social structure that emerges from the behavior of competing individual water businesses. Evolutionary programming allows individuals to select an appropriate strategy when annually applying for potential water use permits, whereas cognitive maps endow agents with the ability and willingness to compete for more water sales. All agents have their own influence boundaries that inhibit their competitive behavior toward their neighbors but not toward non-neighbors. The decision-making process is constructed and parameterized with both quantitative and qualitative information, i.e., empirical water use data and knowledge gained from surveys with stakeholders. By linking institution theory, evolutionary programming, and cognitive maps, our approach addresses the higher complexity of the real decision-making process. Furthermore, this approach is a new exploration for modeling the dynamics of Coupled Human and Natural Systems. After integrating the ABM with the FH-HC model, drought and limited water accessibility scenarios are simulated to predict future FH-HC groundwater level changes. The integrated modeling framework of the ABM and FH-HC model can be used to support making scientifically sound policies in water allocation and management.
Ultra-Structure database design methodology for managing systems biology data and analyses
Maier, Christopher W; Long, Jeffrey G; Hemminger, Bradley M; Giddings, Morgan C
2009-01-01
Background Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogeneous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research. Conclusion We find Ultra-Structure offers substantial benefits for biological information systems, the largest being the integration of diverse information sources into a common framework. This facilitates systems biology research by integrating data from disparate high-throughput techniques. It also enables us to readily incorporate new data types, sources, and domain knowledge with no change to the database structure or associated computer code. Ultra-Structure may be a significant step towards solving the hard problem of data management and integration in the systems biology era. PMID:19691849
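A toy Python/sqlite3 sketch of the rules-as-data idea, where behavior changes by editing rows rather than code; the table layout and rule interpreter are invented for illustration and are far simpler than Ultra-Structure's actual ruleforms.

    # Toy illustration of rules-as-data: rules live in an ordinary table and a small
    # generic procedure interprets them, so behavior changes by editing rows, not code.
    # The schema is invented for illustration and is much simpler than Ultra-Structure.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE rules (factor TEXT, condition TEXT, action TEXT)")
    db.executemany("INSERT INTO rules VALUES (?, ?, ?)", [
        ("peptide", "length < 6",  "discard"),
        ("peptide", "length >= 6", "map_to_genome"),
    ])

    def apply_rules(factor, length):
        for cond, action in db.execute(
                "SELECT condition, action FROM rules WHERE factor = ?", (factor,)):
            if eval(cond, {"length": length}):   # toy interpreter; demo use only
                return action

    print(apply_rules("peptide", 4), apply_rules("peptide", 9))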
NASA Astrophysics Data System (ADS)
Pierce, M. E.; Aktas, M. S.; Aydin, G.; Fox, G. C.; Gadgil, H.; Sayar, A.
2005-12-01
We examine the application of Web Service Architectures and Grid-based distributed computing technologies to geophysics and geo-informatics. We are particularly interested in the integration of Geographical Information System (GIS) services with distributed data mining applications. GIS services provide the general purpose framework for building archival data services, real time streaming data services, and map-based visualization services that may be integrated with data mining and other applications through the use of distributed messaging systems and Web Service orchestration tools. Building upon our previous work in these areas, we present our current research efforts. These include fundamental investigations into increasing XML-based Web service performance, supporting real time data streams, and integrating GIS mapping tools with audio/video collaboration systems for shared display and annotation.
Description of the U.S. Geological Survey Geo Data Portal data integration framework
Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Lucido, Jessica M.
2012-01-01
The U.S. Geological Survey has developed an open-standard data integration framework for working efficiently and effectively with large collections of climate and other geoscience data. A web interface accesses catalog datasets to find data services. Data resources can then be rendered for mapping and dataset metadata are derived directly from these web services. Algorithm configuration and information needed to retrieve data for processing are passed to a server where all large-volume data access and manipulation takes place. The data integration strategy described here was implemented by leveraging existing free and open source software. Details of the software used are omitted; rather, emphasis is placed on how open-standard web services and data encodings can be used in an architecture that integrates common geographic and atmospheric data.
A Framework for Hierarchical Perception-Action Learning Utilizing Fuzzy Reasoning.
Windridge, David; Felsberg, Michael; Shaukat, Affan
2013-02-01
Perception-action (P-A) learning is an approach to cognitive system building that seeks to reduce the complexity associated with conventional environment-representation/action-planning approaches. Instead, actions are directly mapped onto the perceptual transitions that they bring about, eliminating the need for intermediate representation and significantly reducing training requirements. We here set out a very general learning framework for cognitive systems in which online learning of the P-A mapping may be conducted within a symbolic processing context, so that complex contextual reasoning can influence the P-A mapping. In utilizing a variational calculus approach to define a suitable objective function, the P-A mapping can be treated as an online learning problem via gradient descent using partial derivatives. Our central theoretical result is to demonstrate top-down modulation of low-level perceptual confidences via the Jacobian of the higher levels of a subsumptive P-A hierarchy. Thus, the separation of the Jacobian as a multiplying factor between levels within the objective function naturally enables the integration of abstract symbolic manipulation in the form of fuzzy deductive logic into the P-A mapping learning. We experimentally demonstrate that the resulting framework achieves significantly better accuracy than using P-A learning without top-down modulation. We also demonstrate that it permits novel forms of context-dependent multilevel P-A mapping, applying the mechanism in the context of an intelligent driver assistance system.
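The modulation mechanism can be illustrated numerically: in a two-level mapping trained by gradient descent, the derivative (Jacobian) of the higher level multiplies the lower-level update through the chain rule. The linear maps and data below are stand-ins, not the paper's perception-action hierarchy.

    # Minimal numerical sketch of chain-rule (Jacobian) modulation in a two-level
    # mapping trained by gradient descent; the linear maps and data are stand-ins.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=(100, 3))                  # low-level perceptual features
    target = x @ np.array([1.0, -2.0, 0.5])        # desired high-level output

    w_low = rng.normal(size=3)                     # lower-level mapping parameters
    w_high = 1.0                                   # higher-level mapping parameter
    lr = 0.05
    for _ in range(200):
        z = x @ w_low                              # low-level output
        y = w_high * z                             # high-level output
        err = y - target
        jac_high = w_high                          # dy/dz: higher level modulates lower
        w_low -= lr * (x.T @ (err * jac_high)) / len(x)
        w_high -= lr * float(err @ z) / len(x)
    print(np.round(w_high * w_low, 2))             # end-to-end weights drift toward [1, -2, 0.5]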
Ma, Liyan; Qiu, Bo; Cui, Mingyue; Ding, Jianwei
2017-01-01
Depth image-based rendering (DIBR), which is used to render virtual views with a color image and the corresponding depth map, is one of the key techniques in the 2D-to-3D conversion process. Due to the absence of knowledge about the 3D structure of a scene and its corresponding texture, DIBR in the 2D-to-3D conversion process inevitably leads to holes in the resulting 3D image as a result of newly exposed areas. In this paper, we propose a structure-aided depth map preprocessing framework in the transformed domain, inspired by the recently proposed domain transform for its low complexity and high efficiency. Firstly, our framework integrates hybrid constraints including scene structure, edge consistency, and visual saliency information in the transformed domain to improve the performance of depth map preprocessing in an implicit way. Then, adaptive smoothing localization is incorporated into the proposed framework to further reduce over-smoothing and enhance optimization in the non-hole regions. Different from other similar methods, the proposed method can simultaneously achieve the effects of hole filling, edge correction, and local smoothing for typical depth maps in a unified framework. Thanks to these advantages, it can yield visually satisfactory results with less computational complexity for high-quality 2D-to-3D conversion. Numerical experimental results demonstrate the excellent performance of the proposed method. PMID:28407027
The NIH Common Fund Human Biomolecular Atlas Program (HuBMAP) aims to develop a framework for functionally mapping the human body at cellular resolution to enhance our understanding of cellular organization and function. HuBMAP will accelerate the development of the next generation of tools and techniques to generate 3D tissue maps using validated high-content, high-throughput imaging and omics assays, and establish an open data platform for integrating and visualizing data to build multi-dimensional maps.
Barrett, Lisa Feldman; Satpute, Ajay
2013-01-01
Understanding how a human brain creates a human mind ultimately depends on mapping psychological categories and concepts to physical measurements of neural response. Although it has long been assumed that emotional, social, and cognitive phenomena are realized in the operations of separate brain regions or brain networks, we demonstrate that it is possible to understand the body of neuroimaging evidence using a framework that relies on domain general, distributed structure-function mappings. We review current research in affective and social neuroscience and argue that the emerging science of large-scale intrinsic brain networks provides a coherent framework for a domain-general functional architecture of the human brain. PMID:23352202
Bristol, R. Sky; Euliss, Ned H.; Booth, Nathaniel L.; Burkardt, Nina; Diffendorfer, Jay E.; Gesch, Dean B.; McCallum, Brian E.; Miller, David M.; Morman, Suzette A.; Poore, Barbara S.; Signell, Richard P.; Viger, Roland J.
2013-01-01
Core Science Systems is a new mission of the U.S. Geological Survey (USGS) that resulted from the 2007 Science Strategy, "Facing Tomorrow's Challenges: U.S. Geological Survey Science in the Decade 2007-2017." This report describes the Core Science Systems vision and outlines a strategy to facilitate integrated characterization and understanding of the complex Earth system. The vision and suggested actions are bold and far-reaching, describing a conceptual model and framework to enhance the ability of the USGS to bring its core strengths to bear on pressing societal problems through data integration and scientific synthesis across the breadth of science. The context of this report is inspired by a direction set forth in the 2007 Science Strategy. Specifically, ecosystem-based approaches provide the underpinnings for essentially all science themes that define the USGS. Every point on Earth falls within a specific ecosystem where data, other information assets, and the expertise of USGS and its many partners can be employed to quantitatively understand how that ecosystem functions and how it responds to natural and anthropogenic disturbances. Every benefit society obtains from the planet (food, water, raw materials to build infrastructure, homes, and automobiles, fuel to heat homes and cities, and many others) is derived from or affects ecosystems. The vision for Core Science Systems builds on core strengths of the USGS in characterizing and understanding complex Earth and biological systems through research, modeling, mapping, and the production of high quality data on the Nation's natural resource infrastructure. Together, these research activities provide a foundation for ecosystem-based approaches through geologic mapping, topographic mapping, and biodiversity mapping. The vision describes a framework founded on these core mapping strengths that makes it easier for USGS scientists to discover critical information, share and publish results, and identify potential collaborations that transcend all USGS missions. The framework is designed to improve the efficiency of scientific work within USGS by establishing a means to preserve and recall data for future applications, organizing existing scientific knowledge and data to facilitate new use of older information, and establishing a future workflow that naturally integrates new data, applications, and other science products to make interdisciplinary research easier and more efficient. Given the increasing need for integrated data and interdisciplinary approaches to solve modern problems, leadership by the Core Science Systems mission will facilitate problem solving by all USGS missions in ways not formerly possible. The report lays out a strategy to achieve this vision through three goals with accompanying objectives and actions. The first goal builds on and enhances the strengths of the Core Science Systems mission in characterizing and understanding the Earth system from the geologic framework to the topographic characteristics of the land surface and biodiversity across the Nation. The second goal enhances and develops new strengths in computer and information science to make it easier for USGS scientists to discover data and models, share and publish results, and discover connections between scientific information and knowledge. The third goal brings additional focus to research and development methods to address complex issues affecting society that require integration of knowledge and new methods for synthesizing scientific information.
Collectively, the report lays out a strategy to create a seamless connection between all USGS activities to accelerate and make USGS science more efficient by fully integrating disciplinary expertise within a new and evolving science paradigm for a changing world in the 21st century.
An Integrated Framework for Parameter-based Optimization of Scientific Workflows.
Kumar, Vijay S; Sadayappan, P; Mehta, Gaurang; Vahi, Karan; Deelman, Ewa; Ratnakar, Varun; Kim, Jihie; Gil, Yolanda; Hall, Mary; Kurc, Tahsin; Saltz, Joel
2009-01-01
Data analysis processes in scientific applications can be expressed as coarse-grain workflows of complex data processing operations with data flow dependencies between them. Performance optimization of these workflows can be viewed as a search for a set of optimal values in a multi-dimensional parameter space. While some performance parameters such as grouping of workflow components and their mapping to machines do not affect the accuracy of the output, others may dictate trading the output quality of individual components (and of the whole workflow) for performance. This paper describes an integrated framework which is capable of supporting performance optimizations along multiple dimensions of the parameter space. Using two real-world applications in the spatial data analysis domain, we present an experimental evaluation of the proposed framework.
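A minimal sketch of the kind of parameter-space search such a framework performs, assuming an output-quality threshold constrains which configurations are acceptable; the parameters, cost model, and quality model are invented stand-ins for measured workflow behavior.

    # Sketch of searching a small workflow parameter space for the cheapest
    # configuration that still meets an output-quality threshold; the cost and
    # quality functions are invented stand-ins for real workflow measurements.
    from itertools import product

    chunk_sizes = [64, 128, 256]          # grouping of workflow components (illustrative)
    resolutions = [0.25, 0.5, 1.0]        # quality-affecting parameter (illustrative)

    def runtime(chunk, res):              # pretend cost model
        return 1000.0 * res / chunk

    def quality(res):                     # pretend accuracy model
        return res

    candidates = [(runtime(c, r), c, r) for c, r in product(chunk_sizes, resolutions)
                  if quality(r) >= 0.5]   # keep only configurations meeting the threshold
    print(min(candidates))                # fastest acceptable configuration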
DistMap: a toolkit for distributed short read mapping on a Hadoop cluster.
Pandey, Ram Vinay; Schlötterer, Christian
2013-01-01
With the rapid and steady increase of next generation sequencing data output, the mapping of short reads has become a major data analysis bottleneck. On a single computer, it can take several days to map the vast quantity of reads produced from a single Illumina HiSeq lane. In an attempt to ameliorate this bottleneck we present a new tool, DistMap - a modular, scalable and integrated workflow to map reads in the Hadoop distributed computing framework. DistMap is easy to use, currently supports nine different short read mapping tools and can be run on all Unix-based operating systems. It accepts reads in FASTQ format as input and provides mapped reads in a SAM/BAM format. DistMap supports both paired-end and single-end reads thereby allowing the mapping of read data produced by different sequencing platforms. DistMap is available from http://code.google.com/p/distmap/
Adverse outcome pathway (AOP) provides a conceptual framework to evaluate and integrate chemical toxicity and its effects across the levels of biological organization. As such, it is essential to develop a resource-efficient and effective approach to extend molecular initiating ...
Progressive simplification and transmission of building polygons based on triangle meshes
NASA Astrophysics Data System (ADS)
Li, Hongsheng; Wang, Yingjie; Guo, Qingsheng; Han, Jiafu
2010-11-01
Digital Earth is a virtual representation of our planet and a data integration platform which aims at harnessing multi-source, multi-resolution, multi-format spatial data. This paper introduces a research framework integrating progressive cartographic generalization and transmission of vector data. Progressive cartographic generalization provides multi-resolution data from coarse to fine, as key scales plus the increments between them, which is not available in the traditional generalization framework. Based on the progressive simplification algorithm, the building polygons are triangulated into meshes and encoded according to the simplification sequence of two basic operations, edge collapse and vertex split. The map data at key scales and the encoded increments between them are stored in a multi-resolution file. As the client submits requests to the server, the coarsest map is transmitted first and then the increments. After data decoding and mesh refinement, the building polygons are visualized with more detail. Progressive generalization and transmission of building polygons is demonstrated in the paper.
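The record-and-replay idea behind progressive transmission can be sketched as follows: repeatedly remove the polygon vertex spanning the smallest triangle while logging each removal, so a client can receive the coarse polygon first and replay the log in reverse as vertex splits. This is a simplified stand-in for the triangle-mesh edge-collapse encoding used in the paper.

    # Sketch of progressive simplification by smallest-triangle vertex removal, with
    # each removal logged so a client can replay it in reverse as vertex splits.
    # This is a simplified stand-in for the edge-collapse/vertex-split encoding above.

    def tri_area(a, b, c):
        return abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

    def simplify(poly, keep=4):
        pts, removals = list(poly), []          # removals form the progressive increments
        while len(pts) > keep:
            areas = [(tri_area(pts[i - 1], pts[i], pts[(i + 1) % len(pts)]), i)
                     for i in range(len(pts))]
            area, i = min(areas)
            removals.append((i, pts[i], area))  # enough to replay as a vertex split
            del pts[i]
        return pts, removals

    building = [(0, 0), (4, 0), (4, 3), (2.1, 3.05), (2, 5), (0, 5)]
    coarse, increments = simplify(building)
    print(coarse)
    print(increments)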
Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework
NASA Astrophysics Data System (ADS)
Wang, C.; Hu, F.; Sha, D.; Han, X.
2017-10-01
Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering, and other fields. However, it is challenging to efficiently store, query, and analyze high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, which takes advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experiment results show that the proposed framework can efficiently manage and process big LiDAR data.
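The partition-then-process pattern can be illustrated without a Hadoop cluster: the Python sketch below bins synthetic points into spatial tiles and processes the tiles in parallel with multiprocessing, standing in for HDFS storage and MapReduce jobs invoking PCL.

    # Minimal illustration of the partition-then-process pattern used for big LiDAR
    # data: bin points into spatial tiles and process tiles in parallel. The real
    # framework above uses HDFS/MapReduce with PCL; this stand-in uses multiprocessing.
    import random
    from collections import defaultdict
    from multiprocessing import Pool

    def tile_key(x, y, size=100.0):
        return (int(x // size), int(y // size))

    def process_tile(args):
        key, pts = args
        zs = [p[2] for p in pts]
        return key, len(pts), max(zs)            # e.g., point count and max elevation

    if __name__ == "__main__":
        points = [(random.uniform(0, 500), random.uniform(0, 500), random.uniform(0, 60))
                  for _ in range(10000)]         # synthetic stand-in for a LiDAR cloud
        tiles = defaultdict(list)
        for x, y, z in points:
            tiles[tile_key(x, y)].append((x, y, z))
        with Pool(4) as pool:
            for key, n, zmax in pool.map(process_tile, tiles.items()):
                print(key, n, round(zmax, 1))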
Cyberinfrastructure for the digital brain: spatial standards for integrating rodent brain atlases
Zaslavsky, Ilya; Baldock, Richard A.; Boline, Jyl
2014-01-01
Biomedical research entails capture and analysis of massive data volumes and new discoveries arise from data-integration and mining. This is only possible if data can be mapped onto a common framework such as the genome for genomic data. In neuroscience, the framework is intrinsically spatial and based on a number of paper atlases. This cannot meet today's data-intensive analysis and integration challenges. A scalable and extensible software infrastructure that is standards based but open for novel data and resources, is required for integrating information such as signal distributions, gene-expression, neuronal connectivity, electrophysiology, anatomy, and developmental processes. Therefore, the International Neuroinformatics Coordinating Facility (INCF) initiated the development of a spatial framework for neuroscience data integration with an associated Digital Atlasing Infrastructure (DAI). A prototype implementation of this infrastructure for the rodent brain is reported here. The infrastructure is based on a collection of reference spaces to which data is mapped at the required resolution, such as the Waxholm Space (WHS), a 3D reconstruction of the brain generated using high-resolution, multi-channel microMRI. The core standards of the digital atlasing service-oriented infrastructure include Waxholm Markup Language (WaxML): XML schema expressing a uniform information model for key elements such as coordinate systems, transformations, points of interest (POI)s, labels, and annotations; and Atlas Web Services: interfaces for querying and updating atlas data. The services return WaxML-encoded documents with information about capabilities, spatial reference systems (SRSs) and structures, and execute coordinate transformations and POI-based requests. Key elements of INCF-DAI cyberinfrastructure have been prototyped for both mouse and rat brain atlas sources, including the Allen Mouse Brain Atlas, UCSD Cell-Centered Database, and Edinburgh Mouse Atlas Project. PMID:25309417
Integrating biodiversity distribution knowledge: toward a global map of life.
Jetz, Walter; McPherson, Jana M; Guralnick, Robert P
2012-03-01
Global knowledge about the spatial distribution of species is orders of magnitude coarser in resolution than other geographically-structured environmental datasets such as topography or land cover. Yet such knowledge is crucial in deciphering ecological and evolutionary processes and in managing global change. In this review, we propose a conceptual and cyber-infrastructure framework for refining species distributional knowledge that is novel in its ability to mobilize and integrate diverse types of data such that their collective strengths overcome individual weaknesses. The ultimate aim is a public, online, quality-vetted 'Map of Life' that for every species integrates and visualizes available distributional knowledge, while also facilitating user feedback and dynamic biodiversity analyses. First milestones toward such an infrastructure have now been implemented. Copyright © 2011 Elsevier Ltd. All rights reserved.
2014-12-11
Cassava (Manihot esculenta Crantz) is a major staple crop in Africa, Asia, and South America, and its starchy roots provide nourishment for 800 million people worldwide. Although native to South America, cassava was brought to Africa 400-500 years ago and is now widely cultivated across sub-Saharan Africa, but it is subject to biotic and abiotic stresses. To assist in the rapid identification of markers for pathogen resistance and crop traits, and to accelerate breeding programs, we generated a framework map for M. esculenta Crantz from reduced representation sequencing [genotyping-by-sequencing (GBS)]. The composite 2412-cM map integrates 10 biparental maps (comprising 3480 meioses) and organizes 22,403 genetic markers on 18 chromosomes, in agreement with the observed karyotype. We used the map to anchor 71.9% of the draft genome assembly and 90.7% of the predicted protein-coding genes. The chromosome-anchored genome sequence will be useful for breeding improvement by assisting in the rapid identification of markers linked to important traits, and in providing a framework for genomic selection-enhanced breeding of this important crop. Copyright © 2015 International Cassava Genetic Map Consortium (ICGMC).
An integrated study for mapping the moisture distribution in an ancient damaged wall painting.
Capitani, Donatella; Proietti, Noemi; Gobbino, Marco; Soroldoni, Luigi; Casellato, Umberto; Valentini, Massimo; Rosina, Elisabetta
2009-12-01
An integrated study of microclimate monitoring, IR thermography (IRT), gravimetric tests and portable unilateral nuclear magnetic resonance (NMR) was applied in the framework of planning an emergency intervention on a very deteriorated wall painting in San Rocco church, Cornaredo (Milan, Italy). The IRT investigation, supported by gravimetric tests, showed that the worst damage, due to water infiltration, was localized on the wall painting of the northern wall. Unilateral NMR, a new non-destructive technique which measures the hydrogen signal of the moisture and which was applied directly to the wall, allowed a detailed map of the distribution of the moisture in the plaster underlying the wall painting to be obtained. With a proper calibration of the integral of the recorded signal against suitable specimens, each area of the map corresponded to an accurate amount of moisture. IRT, gravimetric tests and unilateral NMR applied to the northern wall painting showed the presence of two wet areas separated by a dry area. The moisture found in the lower area was ascribed to rising damp at the bottom of the wall due to the slope of the garden soil towards the northern exterior. The moisture found in the upper area was ascribed to condensation phenomena associated with the presence of a considerable amount of soluble, hygroscopic salts. In the framework of this integrated study, the IRT investigation and gravimetric methods validated portable unilateral NMR as a new analytical tool for measuring in situ, without any sampling, the distribution and amount of moisture in wall paintings.
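The calibration step can be sketched as a simple linear fit of the NMR signal integral measured on reference specimens against their known moisture content, which is then used to convert mapped signal integrals into moisture percentages; all numbers below are invented.

    # Sketch of the calibration step: fit a line relating the NMR signal integral
    # measured on reference specimens to their known moisture content, then convert
    # mapped signal integrals to moisture percentages. All values are illustrative.
    import numpy as np

    signal_integral = np.array([0.8, 1.6, 2.5, 3.3, 4.1])   # reference specimens
    moisture_pct    = np.array([2.0, 4.1, 6.2, 8.0, 10.1])  # gravimetrically measured

    slope, intercept = np.polyfit(signal_integral, moisture_pct, 1)

    wall_measurements = np.array([1.2, 2.9, 3.8])            # points mapped on the wall
    print(np.round(slope * wall_measurements + intercept, 1))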
Project outputs will include: 1) the sustainability network and associated web pages; 2) sustainability indicators and associated maps representing the current values of the metrics; 3) an integrated assessment model of the impacts of electricity generation alternatives on a ...
Gao, Tian; Qiu, Ling; Chen, Cun-gen
2010-09-01
Based on the biotope classification system with vegetation structure as the framework, a modified biotope mapping model integrated with vegetation cover continuity attributes was developed, and applied to the study of the greenbelts in Helsingborg in southern Sweden. An evaluation of the vegetation cover continuity in the greenbelts was carried out by the comparisons of the vascular plant species richness in long- and short-continuity forests, based on the identification of woodland continuity by using ancient woodland indicator species (AWIS). In the test greenbelts, long-continuity woodlands had more AWIS. Among the forests where the dominant trees were more than 30-year-old, the long-continuity ones had a higher biodiversity of vascular plants, compared with the short-continuity ones with the similar vegetation structure. The modified biotope mapping model integrated with the continuity features of vegetation cover could be an important tool in investigating urban biodiversity, and provide corresponding strategies for future urban biodiversity conservation.
Salient object detection based on discriminative boundary and multiple cues integration
NASA Astrophysics Data System (ADS)
Jiang, Qingzhu; Wu, Zemin; Tian, Chang; Liu, Tao; Zeng, Mingyong; Hu, Lei
2016-01-01
In recent years, many saliency models have achieved good performance by taking the image boundary as the background prior. However, if all boundaries of an image are equally and artificially selected as background, misjudgment may happen when the object touches the boundary. We propose an algorithm called weighted contrast optimization based on discriminative boundary (wCODB). First, a background estimation model is reliably constructed by discriminating each boundary via the Hausdorff distance. Second, the background-only weighted contrast is improved by fore-background weighted contrast, which is optimized through a weight-adjustable optimization framework. Then, to objectively estimate the quality of a saliency map, a simple but effective metric combining the spatial distribution of the saliency map and the mean saliency in the covered window ratio (MSR) is designed. Finally, in order to further improve the detection result using MSR as the weight, we propose a saliency fusion framework to integrate three other cues (uniqueness, distribution, and coherence) from three representative methods into our wCODB model. Extensive experiments on six public datasets demonstrate that our wCODB performs favorably against most boundary-based methods, and the integrated result outperforms all state-of-the-art methods.
Neural and Cognitive Plasticity: From Maps to Minds
ERIC Educational Resources Information Center
Mercado, Eduardo, III
2008-01-01
Some species and individuals are able to learn cognitive skills more flexibly than others. Learning experiences and cortical function are known to contribute to such differences, but the specific factors that determine an organism's intellectual capacities remain unclear. Here, an integrative framework is presented suggesting that variability in…
Mapping and monitoring potato cropping systems in Maine: geospatial methods and land use assessments
USDA-ARS?s Scientific Manuscript database
Geospatial frameworks and GIS-based approaches were used to assess current cropping practices in potato production systems in Maine. Results from the geospatial integration of remotely-sensed cropland layers (2008-2011) and soil datasets for Maine revealed a four-year potato systems footprint estima...
Geologic setting of the low-level burial grounds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lindsey, K.A.; Jaeger, G.K.; Slate, J.L.
1994-10-13
This report describes the regional and site-specific geology of the Hanford Site's low-level burial grounds in the 200 East and West Areas. The report incorporates data from boreholes across the entire 200 Areas, integrating the geology of this area into a single framework. Geologic cross-sections, isopach maps, and structure contour maps of all major geological units from the top of the Columbia River Basalt Group to the surface are included. The physical properties and characteristics of the major suprabasalt sedimentary units are also discussed.
Putting people on the map through an approach that integrates social data in conservation planning.
Stephanson, Sheri L; Mascia, Michael B
2014-10-01
Conservation planning is integral to strategic and effective operations of conservation organizations. Drawing upon biological sciences, conservation planning has historically made limited use of social data. We offer an approach for integrating data on social well-being into conservation planning that captures and places into context the spatial patterns and trends in human needs and capacities. This hierarchical approach provides a nested framework for characterizing and mapping data on social well-being in 5 domains: economic well-being, health, political empowerment, education, and culture. These 5 domains each have multiple attributes; each attribute may be characterized by one or more indicators. Through existing or novel data that display spatial and temporal heterogeneity in social well-being, conservation scientists, planners, and decision makers may measure, benchmark, map, and integrate these data within conservation planning processes. Selecting indicators and integrating these data into conservation planning is an iterative, participatory process tailored to the local context and planning goals. Social well-being data complement biophysical and threat-oriented social data within conservation planning processes to inform decisions regarding where and how to conserve biodiversity, to provide a structure for exploring socioecological relationships, and to foster adaptive management. Building upon existing conservation planning methods and insights from multiple disciplines, this approach to putting people on the map can readily merge with current planning practices to facilitate more rigorous decision making. © 2014 Society for Conservation Biology.
Okamura-Oho, Yuko; Shimokawa, Kazuro; Nishimura, Masaomi; Takemoto, Satoko; Sato, Akira; Furuichi, Teiichi; Yokota, Hideo
2014-01-01
Using a recently invented technique for gene expression mapping in the whole-anatomy context, termed transcriptome tomography, we have generated a dataset of 36,000 maps of overall gene expression in the adult-mouse brain. Here, using an informatics approach, we identified a broad co-expression network that follows an inverse power law and is rich in functional interaction and gene-ontology terms. Our framework for the integrated analysis of expression maps and graphs of co-expression networks revealed that groups of combinatorially expressed genes, which regulate cell differentiation during development, were present in the adult brain and that each of these groups was associated with a discrete cell type. These groups included non-coding genes of unknown function. We found that these genes specifically linked developmentally conserved groups in the network. A previously unrecognized robust expression pattern covering the whole brain was related to the molecular anatomy of key biological processes occurring in particular areas. PMID:25382412
NASA Astrophysics Data System (ADS)
Qin, Rufu; Lin, Liangzhao
2017-06-01
Coastal seiches have become an increasingly important issue in coastal science and present many challenges, particularly when attempting to provide warning services. This paper presents the methodologies, techniques and integrated services adopted for the design and implementation of a Seiches Monitoring and Forecasting Integration Framework (SMAF-IF). The SMAF-IF is an integrated system with different types of sensors and numerical models and incorporates Geographic Information System (GIS) and web techniques, focusing on coastal seiche event detection and early warning in the North Jiangsu shoal, China. The in situ sensors perform automatic and continuous monitoring of the marine environment status and the numerical models provide the meteorological and physical oceanographic parameter estimates. Model output processing software was developed in C# using ArcGIS Engine functions, which provides the capabilities of automatically generating visualization maps and warning information. Leveraging the ArcGIS Flex API and ASP.NET web services, a web-based GIS framework was designed to facilitate quasi real-time data access, interactive visualization and analysis, and the provision of early warning services for end users. The integrated framework proposed in this study enables decision-makers and the public to respond quickly to emergency coastal seiche events and allows easy adaptation to other regional and scientific domains related to real-time monitoring and forecasting.
A Map/INS/Wi-Fi Integrated System for Indoor Location-Based Service Applications
Yu, Chunyang; Lan, Haiyu; Gu, Fuqiang; Yu, Fei; El-Sheimy, Naser
2017-01-01
In this research, a new Map/INS/Wi-Fi integrated system for indoor location-based service (LBS) applications based on a cascaded Particle/Kalman filter framework structure is proposed. Two-dimensional indoor map information, together with measurements from an inertial measurement unit (IMU) and Received Signal Strength Indicator (RSSI) values, are integrated for estimating positioning information. The main challenge of this research is how to make effective use of various measurements that complement each other in order to obtain an accurate, continuous, and low-cost position solution without increasing the computational burden of the system. Therefore, to eliminate the cumulative drift caused by low-cost IMU sensor errors, the ubiquitous Wi-Fi signal and non-holonomic constraints are rationally used to correct the IMU-derived navigation solution through the extended Kalman Filter (EKF). Moreover, the map-aiding method and map-matching method are innovatively combined to constrain the primary Wi-Fi/IMU-derived position through an Auxiliary Value Particle Filter (AVPF). Different sources of information are incorporated through a cascaded EKF/AVPF filter algorithm. Indoor tests show that the proposed method can effectively reduce the accumulation of positioning errors of a stand-alone Inertial Navigation System (INS), and provide a stable, continuous and reliable indoor location service. PMID:28574471
A Map/INS/Wi-Fi Integrated System for Indoor Location-Based Service Applications.
Yu, Chunyang; Lan, Haiyu; Gu, Fuqiang; Yu, Fei; El-Sheimy, Naser
2017-06-02
In this research, a new Map/INS/Wi-Fi integrated system for indoor location-based service (LBS) applications based on a cascaded Particle/Kalman filter framework structure is proposed. Two-dimensional indoor map information, together with measurements from an inertial measurement unit (IMU) and Received Signal Strength Indicator (RSSI) values, are integrated for estimating positioning information. The main challenge of this research is how to make effective use of various measurements that complement each other in order to obtain an accurate, continuous, and low-cost position solution without increasing the computational burden of the system. Therefore, to eliminate the cumulative drift caused by low-cost IMU sensor errors, the ubiquitous Wi-Fi signal and non-holonomic constraints are rationally used to correct the IMU-derived navigation solution through the extended Kalman Filter (EKF). Moreover, the map-aiding method and map-matching method are innovatively combined to constrain the primary Wi-Fi/IMU-derived position through an Auxiliary Value Particle Filter (AVPF). Different sources of information are incorporated through a cascaded EKF/AVPF filter algorithm. Indoor tests show that the proposed method can effectively reduce the accumulation of positioning errors of a stand-alone Inertial Navigation System (INS), and provide a stable, continuous and reliable indoor location service.
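The cascaded filter described above is easiest to picture at the level of a single EKF position update, where a Wi-Fi-derived fix corrects the drifting INS solution. The sketch below is a generic, hedged illustration with a hypothetical state layout; the AVPF map-matching stage and the non-holonomic constraints are not reproduced.

```python
# Minimal EKF position-update sketch (hypothetical state layout: [x, y, vx, vy]);
# the Wi-Fi fix corrects the drifting INS prediction. The paper's cascaded AVPF
# map-matching stage is not shown here.
import numpy as np

def ekf_update(x, P, z_wifi, R_wifi):
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)   # we observe position only
    y = z_wifi - H @ x                           # innovation
    S = H @ P @ H.T + R_wifi                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# toy usage: INS says (10.2, 4.9), Wi-Fi fix says (9.5, 5.4) with ~2 m noise
x = np.array([10.2, 4.9, 0.3, 0.1])
P = np.diag([4.0, 4.0, 1.0, 1.0])
x, P = ekf_update(x, P, np.array([9.5, 5.4]), np.diag([2.0, 2.0]))
```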
Mapping intra-urban transmission risk of dengue fever with big hourly cellphone data.
Mao, Liang; Yin, Ling; Song, Xiaoqing; Mei, Shujiang
2016-10-01
Cellphone tracking has been recently integrated into risk assessment of disease transmission, because the travel behavior of disease carriers can be depicted in unprecedented detail. Still in its infancy, such integration has been limited to: 1) risk assessment only at national and provincial scales, where intra-urban human movements are neglected, and 2) using irregularly logged cellphone data that miss numerous user movements. Furthermore, few risk assessments have considered the positional uncertainty of cellphone data. This study proposed a new framework for mapping intra-urban disease risk with regularly logged cellphone tracking data, taking dengue fever in Shenzhen city as an example. Hourly tracking records of 5.85 million cellphone users, combined with random forest classification and mosquito activities, were utilized to estimate the local transmission risk of dengue fever and the importation risk through travels. Stochastic simulations were further employed to quantify the uncertainty of risk. The resultant maps suggest targeted interventions to maximally reduce dengue cases exported to other places, as well as appropriate interventions to contain risk in places that import them. Given the popularity of cellphone use in urbanized areas, this framework can be adopted by other cities to design spatio-temporally resolved programs for disease control. Copyright © 2016 Elsevier B.V. All rights reserved.
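The study combines user mobility, mosquito activity and travel-based importation into risk maps. The toy sketch below only illustrates that kind of aggregation under assumed inputs; it is not the study's model, and the variable names and random data are purely illustrative.

```python
# Illustrative aggregation only (not the study's actual model): relative local
# risk grows with time people spend in an area while mosquitoes are active,
# and importation risk with trips arriving from areas of higher local risk.
import numpy as np

def local_risk(stay_hours, mosquito_activity):
    """stay_hours: (areas, 24) person-hours; mosquito_activity: (24,) relative biting rate."""
    r = stay_hours @ mosquito_activity
    return r / r.max()                      # normalise to [0, 1]

def importation_risk(trips, local):
    """trips[i, j]: trips from area i to area j; local: (areas,) local risk."""
    imported = trips.T @ local              # risk carried into each destination
    return imported / imported.max()

stay = np.random.poisson(50, size=(10, 24)).astype(float)
bites = np.exp(-((np.arange(24) - 19) ** 2) / 8.0)   # evening-biased activity
trips = np.random.poisson(5, size=(10, 10)).astype(float)
print(local_risk(stay, bites), importation_risk(trips, local_risk(stay, bites)))
```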
Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site
NASA Astrophysics Data System (ADS)
Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.
2012-04-01
The aim of this work is to develop a methodology for integrating climate change scenarios, and especially their precipitation component, into quantitative hazard assessment. The effects of climate change will be different depending on both the location of the site and the type of landslide considered. Indeed, mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application to an alpine test site. Mechanical approaches represent a solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and the quality of data are generally very heterogeneous at a regional scale, it is necessary to take into account the uncertainty in the analysis. In this perspective, a new hazard modeling method is developed and integrated into a program named ALICE. This program integrates mechanical stability analysis within GIS software while taking into account data uncertainty. This method proposes a quantitative classification of landslide hazard and offers a useful tool to gain time and efficiency in hazard mapping. However, an expert-based approach is still necessary to finalize the maps. Indeed, it is the only way to take into account some influential factors in slope stability, such as heterogeneity of the geological formations or effects of anthropic interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios into the ALICE program, and especially their precipitation component, with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, land-cover map, geology, geotechnical data, and so forth, the program classifies hazard zones according to geotechnical properties and to hydrological contexts that vary in time. This communication, realized within the framework of the Safeland project, is supported by the European Commission under the 7th Framework Programme for Research and Technological Development, Area "Environment", Activity 1.3.3.1 "Prediction of triggering and risk assessment for landslides".
NASA Astrophysics Data System (ADS)
Kolkman, M. J.; Kok, M.; van der Veen, A.
The solution of complex, unstructured problems is faced with policy controversy and dispute, unused and misused knowledge, project delay and failure, and decline of public trust in governmental decisions. Mental model mapping (also called concept mapping) is a technique to analyse these difficulties on a fundamental cognitive level, which can reveal experiences, perceptions, assumptions, knowledge and subjective beliefs of stakeholders, experts and other actors, and can stimulate communication and learning. This article presents the theoretical framework from which the use of mental model mapping techniques to analyse this type of problem emerges as a promising approach. The framework consists of the problem solving or policy design cycle, the knowledge production or modelling cycle, and the (computer) model as interface between the cycles. Literature attributes difficulties in the decision-making process to communication gaps between decision makers, stakeholders and scientists, and to the construction of knowledge within different paradigm groups that leads to different interpretations of the problem situation. Analysis of the decision-making process literature indicates that choices, which are made in all steps of the problem solving cycle, are based on an individual decision maker’s frame of perception. This frame, in turn, depends on the mental model residing in the mind of the individual. Thus we identify three levels of awareness on which the decision process can be analysed. This research focuses on the third level. Mental models can be elicited using mapping techniques. In this way, analysing an individual’s mental model can shed light on decision-making problems. The steps of the knowledge production cycle are, in the same manner, ultimately driven by the mental models of the scientist in a specific discipline. Remnants of this mental model can be found in the resulting computer model. The characteristics of unstructured problems (complexity, uncertainty and disagreement) can be positioned in the framework, as can the communities of knowledge construction and valuation involved in the solution of these problems (core science, applied science, professional consultancy, and “post-normal” science). Mental model maps, this research hypothesises, are suitable to analyse the above aspects of the problem. This hypothesis is tested for the case of the Zwolle storm surge barrier. Analysis can aid integration between disciplines and participation of public stakeholders, and can stimulate learning processes. Mental model mapping is recommended to visualise the use of knowledge, to analyse difficulties in the problem-solving process, and to aid information transfer and communication. Mental model mapping helps scientists to shape their new, post-normal responsibilities in a manner that complies with integrity when dealing with unstructured problems in complex, multifunctional systems.
2011-01-01
Background: A number of molecular marker linkage maps have been developed for melon (Cucumis melo L.) over the last two decades. However, these maps were constructed using different marker sets, thus making comparative analysis among maps difficult. In order to solve this problem, a consensus genetic map in melon was constructed using primarily highly transferable anchor markers that have broad potential use for mapping, synteny, and comparative quantitative trait loci (QTL) analysis, increasing breeding effectiveness and efficiency via marker-assisted selection (MAS). Results: Under the framework of the International Cucurbit Genomics Initiative (ICuGI, http://www.icugi.org), an integrated genetic map has been constructed by merging data from eight independent mapping experiments using a genetically diverse array of parental lines. The consensus map spans 1150 cM across the 12 melon linkage groups and is composed of 1592 markers (640 SSRs, 330 SNPs, 252 AFLPs, 239 RFLPs, 89 RAPDs, 15 IMAs, 16 indels and 11 morphological traits) with a mean marker density of 0.72 cM/marker. One hundred and ninety-six of these markers (157 SSRs, 32 SNPs, 6 indels and 1 RAPD) were newly developed, mapped or provided by industry representatives as released markers, including 27 SNPs and 5 indels from genes involved in organic acid metabolism and transport, and 58 EST-SSRs. Additionally, 85 of 822 SSR markers contributed by Syngenta Seeds were included in the integrated map. In addition, 370 QTL controlling 62 traits from 18 previously reported mapping experiments using genetically diverse parental genotypes were also integrated into the consensus map. Some QTL associated with economically important traits detected in separate studies mapped to similar genomic positions. For example, independently identified QTL controlling fruit shape were mapped on similar genomic positions, suggesting that such QTL are possibly responsible for the phenotypic variability observed for this trait in a broad array of melon germplasm. Conclusions: Even though relatively unsaturated genetic maps in a diverse set of melon market types have been published, the integrated saturated map presented herein should be considered the initial reference map for melon. Most of the mapped markers contained in the reference map are polymorphic in a diverse collection of germplasm, and thus are potentially transferable to a broad array of genetic experimentation (e.g., integration of physical and genetic maps, colinearity analysis, map-based gene cloning, epistasis dissection, and marker-assisted selection). PMID:21797998
Diaz, Aurora; Fergany, Mohamed; Formisano, Gelsomina; Ziarsolo, Peio; Blanca, José; Fei, Zhanjun; Staub, Jack E; Zalapa, Juan E; Cuevas, Hugo E; Dace, Gayle; Oliver, Marc; Boissot, Nathalie; Dogimont, Catherine; Pitrat, Michel; Hofstede, René; van Koert, Paul; Harel-Beja, Rotem; Tzuri, Galil; Portnoy, Vitaly; Cohen, Shahar; Schaffer, Arthur; Katzir, Nurit; Xu, Yong; Zhang, Haiying; Fukino, Nobuko; Matsumoto, Satoru; Garcia-Mas, Jordi; Monforte, Antonio J
2011-07-28
A number of molecular marker linkage maps have been developed for melon (Cucumis melo L.) over the last two decades. However, these maps were constructed using different marker sets, thus making comparative analysis among maps difficult. In order to solve this problem, a consensus genetic map in melon was constructed using primarily highly transferable anchor markers that have broad potential use for mapping, synteny, and comparative quantitative trait loci (QTL) analysis, increasing breeding effectiveness and efficiency via marker-assisted selection (MAS). Under the framework of the International Cucurbit Genomics Initiative (ICuGI, http://www.icugi.org), an integrated genetic map has been constructed by merging data from eight independent mapping experiments using a genetically diverse array of parental lines. The consensus map spans 1150 cM across the 12 melon linkage groups and is composed of 1592 markers (640 SSRs, 330 SNPs, 252 AFLPs, 239 RFLPs, 89 RAPDs, 15 IMAs, 16 indels and 11 morphological traits) with a mean marker density of 0.72 cM/marker. One hundred and ninety-six of these markers (157 SSRs, 32 SNPs, 6 indels and 1 RAPD) were newly developed, mapped or provided by industry representatives as released markers, including 27 SNPs and 5 indels from genes involved in organic acid metabolism and transport, and 58 EST-SSRs. Additionally, 85 of 822 SSR markers contributed by Syngenta Seeds were included in the integrated map. In addition, 370 QTL controlling 62 traits from 18 previously reported mapping experiments using genetically diverse parental genotypes were also integrated into the consensus map. Some QTL associated with economically important traits detected in separate studies mapped to similar genomic positions. For example, independently identified QTL controlling fruit shape were mapped on similar genomic positions, suggesting that such QTL are possibly responsible for the phenotypic variability observed for this trait in a broad array of melon germplasm. Even though relatively unsaturated genetic maps in a diverse set of melon market types have been published, the integrated saturated map presented herein should be considered the initial reference map for melon. Most of the mapped markers contained in the reference map are polymorphic in a diverse collection of germplasm, and thus are potentially transferable to a broad array of genetic experimentation (e.g., integration of physical and genetic maps, colinearity analysis, map-based gene cloning, epistasis dissection, and marker-assisted selection).
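The reported marker density follows directly from the map length and marker count; a quick arithmetic check:

```python
# Quick check of the reported map density: 1150 cM spanned by 1592 markers.
span_cm, n_markers = 1150, 1592
print(round(span_cm / n_markers, 2), "cM/marker")   # -> 0.72
```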
Lyons, Jessica
2014-12-11
Cassava (Manihot esculenta Crantz) is a major staple crop in Africa, Asia, and South America, and its starchy roots provide nourishment for 800 million people worldwide. Although native to South America, cassava was brought to Africa 400–500 years ago and is now widely cultivated across sub-Saharan Africa, but it is subject to biotic and abiotic stresses. To assist in the rapid identification of markers for pathogen resistance and crop traits, and to accelerate breeding programs, we generated a framework map for M. esculenta Crantz from reduced representation sequencing [genotyping-by-sequencing (GBS)]. The composite 2412-cM map integrates 10 biparental maps (comprising 3480 meioses) and organizes 22,403 genetic markers on 18 chromosomes, in agreement with the observed karyotype. Here, we used the map to anchor 71.9% of the draft genome assembly and 90.7% of the predicted protein-coding genes. The chromosome-anchored genome sequence will be useful for breeding improvement by assisting in the rapid identification of markers linked to important traits, and in providing a framework for genomic selection-enhanced breeding of this important crop.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyons, Jessica
Cassava (Manihot esculenta Crantz) is a major staple crop in Africa, Asia, and South America, and its starchy roots provide nourishment for 800 million people worldwide. Although native to South America, cassava was brought to Africa 400–500 years ago and is now widely cultivated across sub-Saharan Africa, but it is subject to biotic and abiotic stresses. To assist in the rapid identification of markers for pathogen resistance and crop traits, and to accelerate breeding programs, we generated a framework map for M. esculenta Crantz from reduced representation sequencing [genotyping-by-sequencing (GBS)]. The composite 2412-cM map integrates 10 biparental maps (comprising 3480 meioses) and organizes 22,403 genetic markers on 18 chromosomes, in agreement with the observed karyotype. Here, we used the map to anchor 71.9% of the draft genome assembly and 90.7% of the predicted protein-coding genes. The chromosome-anchored genome sequence will be useful for breeding improvement by assisting in the rapid identification of markers linked to important traits, and in providing a framework for genomic selection-enhanced breeding of this important crop.
Ecosystem services of boreal forests - Carbon budget mapping at high resolution.
Akujärvi, Anu; Lehtonen, Aleksi; Liski, Jari
2016-10-01
The carbon (C) cycle of forests produces ecosystem services (ES) such as climate regulation and timber production. Mapping these ES using simple land-cover-based proxies might add considerable inaccuracy to the estimates. A framework to map the current status of the C budget of boreal forested landscapes was developed. The C stocks of biomass and soil and the annual change in these stocks were quantified at a 20 × 20 m resolution at the regional level on mineral soils in southern Finland. The fine-scale variation of the estimates was analyzed geo-statistically. The reliability of the estimates was evaluated by comparing them to measurements from the national multi-source forest inventory. The C stocks of forests increased slightly from the south coast inland, whereas the changes in these stocks were more uniform. The spatial patches of C stocks were larger than those of C stock changes. The patch size of the C stocks reflected the spatial variation in the environmental conditions, and that of the C stock changes reflected the typical area of forest management compartments. The simulated estimates agreed well with the measurements, indicating good performance of the mapping framework. The mapping framework is the basis for evaluating the effects of forest management alternatives on the C budget at high resolution across large spatial scales. It will be coupled with the assessment of other ES and biodiversity to study their relationships. The framework integrated a wide suite of simulation models and extensive inventory data. It provided reliable estimates of the human influence on the C cycle in forested landscapes. Copyright © 2016 Elsevier Ltd. All rights reserved.
A case study for the integration of predictive mineral potential maps
NASA Astrophysics Data System (ADS)
Lee, Saro; Oh, Hyun-Joo; Heo, Chul-Ho; Park, Inhye
2014-09-01
This study aims to generate mineral potential maps using various models and to verify their accuracy for the epithermal gold (Au)-silver (Ag) deposits in a Geographic Information System (GIS) environment, assuming that all deposits shared a common genesis. The maps of potential Au and Ag deposits were produced from geological data in the Taebaeksan mineralized area, Korea. The methodological framework consists of three main steps: 1) identification of spatial relationships, 2) quantification of such relationships, and 3) combination of multiple quantified relationships. A spatial database containing 46 Au-Ag deposits was constructed using GIS. The spatial association between training deposits and 26 related factors was identified and quantified by probabilistic and statistical modelling. The mineral potential maps were generated by integrating all factors using the overlay method and were recombined afterwards using the likelihood ratio model. They were verified by comparison with test mineral deposit locations. The verification revealed that the combined mineral potential map had the greatest accuracy (83.97%), whereas it was 72.24%, 65.85%, 72.23% and 71.02% for the likelihood ratio, weight of evidence, logistic regression and artificial neural network models, respectively. The mineral potential map can provide useful information for mineral resource development.
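The likelihood (frequency) ratio model referred to above can be illustrated with a small, hedged sketch: the ratio of deposit density within each class of an evidential factor to the deposit density over the whole area, summed across factors, gives a simple potential index. The code below is a generic illustration, not the study's implementation.

```python
# Sketch of a frequency/likelihood-ratio layer (illustrative only): for each class
# of an evidential factor, the ratio of deposit density inside the class to the
# deposit density over the whole area; summing per-factor ratios at each cell
# gives a simple potential index.
import numpy as np

def likelihood_ratio(factor_map, deposit_mask):
    """factor_map: (rows, cols) int class labels; deposit_mask: boolean deposit cells."""
    total_area = factor_map.size
    total_dep = deposit_mask.sum()
    lr = np.zeros_like(factor_map, dtype=float)
    for c in np.unique(factor_map):
        in_class = factor_map == c
        dep_rate = deposit_mask[in_class].sum() / max(in_class.sum(), 1)
        lr[in_class] = dep_rate / (total_dep / total_area + 1e-12)
    return lr

# potential index = sum of ratios over all evidential factors, e.g.
# potential = sum(likelihood_ratio(f, deposits) for f in factor_maps)
```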
Science strategy for Core Science Systems in the U.S. Geological Survey, 2013-2023
Bristol, R. Sky; Euliss, Ned H.; Booth, Nathaniel L.; Burkardt, Nina; Diffendorfer, Jay E.; Gesch, Dean B.; McCallum, Brian E.; Miller, David M.; Morman, Suzette A.; Poore, Barbara S.; Signell, Richard P.; Viger, Roland J.
2012-01-01
Core Science Systems is a new mission of the U.S. Geological Survey (USGS) that grew out of the 2007 Science Strategy, “Facing Tomorrow’s Challenges: U.S. Geological Survey Science in the Decade 2007–2017.” This report describes the vision for this USGS mission and outlines a strategy for Core Science Systems to facilitate integrated characterization and understanding of the complex earth system. The vision and suggested actions are bold and far-reaching, describing a conceptual model and framework to enhance the ability of USGS to bring its core strengths to bear on pressing societal problems through data integration and scientific synthesis across the breadth of science. The context of this report is inspired by a direction set forth in the 2007 Science Strategy. Specifically, ecosystem-based approaches provide the underpinnings for essentially all science themes that define the USGS. Every point on earth falls within a specific ecosystem where data, other information assets, and the expertise of USGS and its many partners can be employed to quantitatively understand how that ecosystem functions and how it responds to natural and anthropogenic disturbances. Every benefit society obtains from the planet—food, water, raw materials to build infrastructure, homes and automobiles, fuel to heat homes and cities, and many others—is derived from or affects ecosystems. The vision for Core Science Systems builds on core strengths of the USGS in characterizing and understanding complex earth and biological systems through research, modeling, mapping, and the production of high quality data on the nation’s natural resource infrastructure. Together, these research activities provide a foundation for ecosystem-based approaches through geologic mapping, topographic mapping, and biodiversity mapping. The vision describes a framework founded on these core mapping strengths that makes it easier for USGS scientists to discover critical information, share and publish results, and identify potential collaborations that transcend all USGS missions. The framework is designed to improve the efficiency of scientific work within USGS by establishing a means to preserve and recall data for future applications, organizing existing scientific knowledge and data to facilitate new use of older information, and establishing a future workflow that naturally integrates new data, applications, and other science products to make it easier and more efficient to conduct interdisciplinary research over time. Given the increasing need for integrated data and interdisciplinary approaches to solve modern problems, leadership by the Core Science Systems mission will facilitate problem solving by all USGS missions in ways not formerly possible. The report lays out a strategy to achieve this vision through three goals with accompanying objectives and actions. The first goal builds on and enhances the strengths of the Core Science Systems mission in characterizing and understanding the earth system from the geologic framework to the topographic characteristics of the land surface and biodiversity across the nation. The second goal enhances and develops new strengths in computer and information science to make it easier for USGS scientists to discover data and models, share and publish results, and discover connections between scientific information and knowledge.
The third goal brings additional focus to research and development methods to address complex issues affecting society that require integration of knowledge and new methods for synthesizing scientific information. Collectively, the report lays out a strategy to create a seamless connection between all USGS activities to accelerate and make USGS science more efficient by fully integrating disciplinary expertise within a new and evolving science paradigm for a changing world in the 21st century.
Geological mapping goes 3-D in response to societal needs
Thorleifson, H.; Berg, R.C.; Russell, H.A.J.
2010-01-01
The transition to 3-D mapping has been made possible by technological advances in digital cartography, GIS, data storage, analysis, and visualization. Despite various challenges, technological advancements have facilitated a gradual transition from 2-D maps to 2.5-D draped maps to 3-D geological mapping, supported by digital spatial and relational databases that can be interrogated horizontally or vertically and viewed interactively. Challenges associated with data collection, human resources, and information management are daunting due to their resource and training requirements. The exchange of strategies at the workshops has highlighted the use of basin analysis to develop a process-based predictive knowledge framework that facilitates data integration. Three-dimensional geological information meets a public demand by filling in the blanks left by conventional 2-D mapping. Two-dimensional mapping will, however, remain the standard method for extensive areas of complex geology, particularly where deformed igneous and metamorphic rocks defy attempts at 3-D depiction.
The Multiple Abilities Paradigm: Integrated General and Special Education Teacher Preparation.
ERIC Educational Resources Information Center
Ellis, Edwin S.; And Others
1995-01-01
The Multiple Abilities Program (MAP) at the University of Alabama is a five-semester, competency-based preservice program preparing teachers to teach all students regardless of settings or disability labels. This article outlines the program rationale, organizational framework, and the program feature in which undergraduates spend over 50 percent…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com; Khamehchi, Ehsan
Conditioning of stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.
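One hedged way to picture the "intermediate space" idea is a small regression network that maps a secondary attribute to a facies probability usable as soft data in MPS; the sketch below is illustrative only, with hypothetical variables, and does not reproduce the authors' fuzzy-inference variant.

```python
# Hedged sketch of the intermediate-space idea: learn a mapping from secondary
# data (e.g. a well-log attribute) to a facies probability that can then serve
# as soft conditioning data in an MPS simulation. Variable names are hypothetical.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
secondary = rng.normal(size=(500, 1))                  # log-derived attribute at wells
facies = (secondary[:, 0] + 0.3 * rng.normal(size=500) > 0).astype(float)

bridge = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
bridge.fit(secondary, facies)                          # intermediate mapping

grid_attribute = rng.normal(size=(1000, 1))            # secondary data away from wells
soft_probability = np.clip(bridge.predict(grid_attribute), 0, 1)
# soft_probability would then weight the MPS pattern sampling at each grid node
```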
Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero
2011-03-24
High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out here a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of the future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.
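A generic way to illustrate matrix-based scoring of genetic interactions (not the authors' exact matrix approximation procedure) is to compare the observed double-mutant fitness with a low-rank expectation reconstructed from the fitness matrix, as sketched below.

```python
# Generic sketch: score a genetic interaction as the deviation of the observed
# double-mutant fitness from a low-rank (roughly multiplicative) expectation
# reconstructed from the fitness matrix.
import numpy as np

def interaction_scores(W, rank=2):
    """W: (genes, genes) double-mutant fitness matrix (NaN-free for simplicity)."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    expected = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # low-rank expectation
    return W - expected                               # positive/negative epistasis

W = np.outer(np.random.rand(50), np.random.rand(50))  # toy multiplicative fitness
W[3, 7] -= 0.3                                         # planted negative interaction
print(interaction_scores(W, rank=1)[3, 7])             # close to the planted -0.3
```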
To the National Map and beyond
Kelmelis, J.
2003-01-01
Scientific understanding, technology, and social, economic, and environmental conditions have driven a rapidly changing demand for geographic information, both digital and analog. For more than a decade, the U.S. Geological Survey (USGS) has been developing innovative partnerships with other government agencies and private industry to produce and distribute geographic information efficiently; increase activities in remote sensing to ensure ongoing monitoring of the land surface; and develop new understanding of the causes and consequences of land surface change. These activities are now contributing to a more robust set of geographic information called The National Map (TNM). The National Map is designed to provide an up-to-date, seamless, horizontally and vertically integrated set of basic digital geographic data, a frequent monitoring of changes on the land surface, and an understanding of the condition of the Earth's surface and many of the processes that shape it. The USGS has reorganized its National Mapping Program into three programs to address the continuum of scientific activities: describing (mapping), monitoring, understanding, modeling, and predicting. The Cooperative Topographic Mapping Program focuses primarily on the mapping and revision aspects of TNM. The National Map also includes results from the Land Remote Sensing and Geographic Analysis and Monitoring Programs that provide continual updates, new insights, and analytical tools. The National Map is valuable as a framework for current research, management, and operational activities. It also provides a critical framework for the development of distributed, spatially enabled decision support systems.
Interoperable cross-domain semantic and geospatial framework for automatic change detection
NASA Astrophysics Data System (ADS)
Kuo, Chiao-Ling; Hong, Jung-Hong
2016-01-01
With the increasingly diverse types of geospatial data established over the last few decades, semantic interoperability in integrated applications has attracted much interest in the field of Geographic Information System (GIS). This paper proposes a new strategy and framework to process cross-domain geodata at the semantic level. This framework leverages the semantic equivalence of concepts between domains through a bridge ontology and facilitates the integrated use of different domain data, which has long been considered an essential strength of GIS but is impeded by the lack of understanding of the semantics implicitly hidden in the data. We choose the task of change detection to demonstrate how the introduction of the ontology concept can effectively make the integration possible. We analyze the common properties of geodata and change detection factors, then construct rules and summarize possible change scenarios for making final decisions. The use of topographic map data to detect changes in land use shows promising results as far as the improvement of efficiency and the level of automation is concerned. We believe the ontology-oriented approach will enable a new way of data integration across different domains from the perspective of semantic interoperability, and may even open a new dimension for future GIS.
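The bridge-ontology idea can be pictured with a deliberately tiny, hypothetical example: classes from two domains are mapped to shared concepts, and a change is flagged when the concept sets of the old and new observations do not overlap.

```python
# Toy sketch of bridge-ontology matching (illustrative only): land-use classes
# from two domains are mapped to shared concepts; a change is flagged when the
# shared concepts of the old and new observations are disjoint.
topo_to_concept = {"building": {"built-up"}, "paddy": {"agriculture", "wet"}}
landuse_to_concept = {"residential": {"built-up"}, "cropland": {"agriculture"}}

def changed(old_topo_class, new_landuse_class):
    return topo_to_concept[old_topo_class].isdisjoint(
        landuse_to_concept[new_landuse_class])

print(changed("paddy", "residential"))   # True -> candidate change
```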
NASA Astrophysics Data System (ADS)
Hinsby, Klaus; Broers, Hans Peter
2014-05-01
The EU Water Framework and Groundwater Directives stipulate that EU member states (MS) should ensure good groundwater chemical and quantitative status by 2015. For the assessment of good chemical status, the MS have to establish Natural Background Levels (NBLs) and Threshold Values (TVs) for groundwater bodies at risk and compare current concentration levels to these. In addition, the MS shall ensure trend reversals in cases where contaminants or water levels show critical increasing or decreasing trends. The EU MS have to demonstrate that the quantitative and chemical status of their groundwater bodies does not put drinking water, ecosystems or other legitimate uses at risk. Easy on-line access to relevant visualizations of groundwater quality and quantity data of e.g. nitrate, chloride, arsenic and water tables in Europe's major aquifer types compiled from national databases would be of great importance for managers, authorities and scientists conducting risk and status assessments. The Water Resources Expert Group of the EuroGeoSurveys proposes to develop Pan-European interactive on-line digital maps and visualizations of concentration levels and trends, as well as calculated natural background levels and threshold values for the most important aquifer types of Europe, derived mainly from principles established in the former EU project "BRIDGE" - Background cRiteria for the IDentification of Groundwater Thresholds. Further, we propose to develop, in close collaboration with ecologists, Pan-European digital and dynamic maps and cross sections that delineate dependent or associated terrestrial and aquatic ecosystems across Europe where groundwater quantity and quality play a significant role in sustaining good ecological status of the ecosystem, and where the water resources and ecosystems are most vulnerable to climate change. Finally, integrated water resources management requires integrated consideration of both deep and shallow groundwater, surface water issues and their interaction. It is therefore proposed to map regions of Europe that use coupled groundwater-surface water models in integrated water resources and river basin management. In the presentation we will show selected examples of data visualizations of importance to integrated water resources and river basin management and the implementation of the Water Framework Directive.
Matching methods evaluation framework for stereoscopic breast x-ray images.
Rousson, Johanna; Naudin, Mathieu; Marchessoux, Cédric
2016-01-01
Three-dimensional (3-D) imaging has been intensively studied in the past few decades. Depth information is an important added value of 3-D systems over two-dimensional systems. Special focus was devoted to the development of stereo matching methods for the generation of disparity maps (i.e., depth information within a 3-D scene). Dedicated frameworks were designed to evaluate and rank the performance of different stereo matching methods, but never considering x-ray medical images. Yet, 3-D x-ray acquisition systems and 3-D medical displays have already been introduced into the diagnostic market. To access the depth information within x-ray stereoscopic images, computing accurate disparity maps is essential. We aimed at developing a framework dedicated to x-ray stereoscopic breast images used to evaluate and rank several stereo matching methods. A multiresolution pyramid optimization approach was integrated into the framework to increase the accuracy and the efficiency of the stereo matching techniques. Finally, a metric was designed to score the results of the stereo matching compared with the ground truth. Eight methods were evaluated and four of them [locally scaled sum of absolute differences (LSAD), zero mean sum of absolute differences, zero mean sum of squared differences, and locally scaled mean sum of squared differences] appeared to perform equally well, with an average error score of 0.04 (0 being a perfect match). LSAD was selected for generating the disparity maps.
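LSAD, the best-performing measure above, scales the candidate patch by the ratio of local means before accumulating absolute differences. The block-matching sketch below is one common formulation, not the paper's code, and omits the multiresolution pyramid.

```python
# Minimal locally scaled SAD (LSAD) block-matching sketch: for each candidate
# disparity, the right-image patch is scaled by the ratio of local means before
# the absolute-difference cost is accumulated; the lowest cost wins.
import numpy as np

def lsad_disparity(left, right, max_disp=16, win=5):
    h, w = left.shape
    r = win // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(r, h - r):
        for x in range(r + max_disp, w - r):
            best, best_d = np.inf, 0
            pl = left[y - r:y + r + 1, x - r:x + r + 1].astype(float)
            for d in range(max_disp):
                pr = right[y - r:y + r + 1, x - d - r:x - d + r + 1].astype(float)
                scale = pl.mean() / (pr.mean() + 1e-6)        # local scaling
                cost = np.abs(pl - scale * pr).sum()
                if cost < best:
                    best, best_d = cost, d
            disp[y, x] = best_d
    return disp
```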
NASA Astrophysics Data System (ADS)
Sacchetti, F.; Benetti, S.; Fitzpatrick, F.
2006-12-01
During the last six years, the Geological Survey of Ireland and the Marine Institute of Ireland worked together on the multimillion Irish National Seabed Survey project with the purpose of mapping the Irish marine territory using a suite of remote sensing equipment, from multibeam to seismic, achieving 87% coverage of the marine zone. Ireland was the first country in the world to carry out an extensive mapping project of its extended Exclusive Economic Zone. The Irish National Seabed Survey has now been succeeded by the multiyear INFOMAR Programme. INFOMAR will concentrate initially on mapping twenty-six selected priority bays, three sea areas and the fisheries-protection "Biologically Sensitive Area", and then will complete 100% mapping of the remainder of the EEZ. Designed to incorporate all elements of an integrated mapping programme, the key data acquisition will include hydrographic, oceanographic, geological and heritage data. These data sets discharge Ireland's obligations under international treaties to which it is a signatory, and the uses of these data are vast and multipurpose: from management plans for inshore fishing, aquaculture, coastal protection and engineering works, to environmental impact assessments related to licensing activity and support to the evolving needs of integrated coastal zone management. INFOMAR also includes a data management, exchange and integration programme for the establishment of a National Marine Data Discovery and Exchange Service; providing improved dissemination of information to researchers, policy makers, the public and private sector and the adoption of standard operating procedures in data management to facilitate inter-agency data integration. During the first year of activity, INFOMAR carried out an integrated survey from the national research vessel, the RV Celtic Explorer, acquiring hydrographic, geophysical and groundtruthing data from Bantry and Dunmanus Bays, located off the South West coast of Ireland. Airborne LiDAR (Light Detection And Ranging) and small-vessel mapping surveys have also been carried out, giving detailed bathymetric, topographic and habitat information for the shoaler waters and inshore areas. This presentation will focus on both the general framework and scope of INFOMAR and the initial results and experiences of this year's survey.
Partnerships - Working Together to Build The National Map
2004-01-01
Through The National Map, the U.S. Geological Survey (USGS) is working with partners to ensure that current, accurate, and complete base geographic information is available for the Nation. Designed as a network of online digital databases, it provides a consistent geographic data framework for the country and serves as a foundation for integrating, sharing, and using data easily and reliably. It provides public access to high quality geospatial data and information from multiple partners to help inform decisionmaking by resource managers and the public, and to support intergovernmental homeland security and emergency management requirements.
NASA Astrophysics Data System (ADS)
Lorek, Dariusz
2016-12-01
The article presents a framework for integrating historical sources with elements of the geographical space recorded in unique cartographic materials. The aim of the project was to develop a method of integrating spatial data sources that would facilitate studying and presenting the phenomena of economic history. The proposed methodology for multimedia integration of old materials made it possible to demonstrate the successive stages of the transformation that was characteristic of 19th-century space. The point of reference for this process of integrating information was provided by topographic maps from the first half of the 19th century, while the research area comprised the castle complex in Kórnik together with the small town - the pre-industrial landscape in Wielkopolska (Greater Poland). On the basis of map and plan transformation, graphic processing of the scans of old drawings, texture mapping of the facades of historic buildings, and a 360° panorama, the source material collected was integrated. The final product was a few-minute-long video, composed of nine sequences. It captures the changing form of the castle building together with its facades, the castle park, and its further topographic and urban surroundings, from the beginning of the 19th century to the present day. For a topographic map sheet dating back to the first half of the 19th century, in which the hachuring method had been used to present land relief, a terrain model was generated. The transition from parallel to bird's-eye-view perspective served to demonstrate the distinctive character of the pre-industrial landscape.
Knowledge integration: conceptualizing communications in cancer control systems.
Best, Allan; Hiatt, Robert A; Norman, Cameron D
2008-06-01
This paper was prepared by the National Cancer Institute of Canada (NCIC) Working Group on Translational Research and Knowledge Transfer. The goal was to nurture common ground upon which to build a platform for translating what we know about cancer into what we do in practice and policy. Methods included expert panels, literature review, and concept mapping to develop a framework that built on earlier cancer control conceptualizations of communications that have guided researchers and end users. The concept of 'knowledge integration' is used to describe the resulting refinement and the nature of evidence necessary for decision-making at the systems level. Current evidence for knowledge integration in cancer control is presented across individual-, organizational- and systems-level interventions and across basic, clinical and population science knowledge bases. A systems-oriented approach to integrating evidence into action assists organizations in conducting research and in informing policy and practice. Practitioners can use this framework to understand the challenges of implementing and evaluating cancer control strategies.
Using the Knowledge to Action Framework in practice: a citation analysis and systematic review.
Field, Becky; Booth, Andrew; Ilott, Irene; Gerrish, Kate
2014-11-23
Conceptual frameworks are recommended as a way of applying theory to enhance implementation efforts. The Knowledge to Action (KTA) Framework was developed in Canada by Graham and colleagues in the 2000s, following a review of 31 planned action theories. The framework has two components: Knowledge Creation and an Action Cycle, each of which comprises multiple phases. This review sought to answer two questions: 'Is the KTA Framework used in practice? And if so, how?' This study is a citation analysis and systematic review. The index citation for the original paper was identified on three databases (Web of Science, Scopus and Google Scholar) with the facility for citation searching. Limitations of English language and year of publication 2006-June 2013 were set. A taxonomy categorising the continuum of usage was developed. Only studies applying the framework to implementation projects were included. Data were extracted and mapped against each phase of the framework for studies where it was integral to the implementation project. The citation search yielded 1,787 records. A total of 1,057 titles and abstracts were screened. One hundred and forty-six studies described usage to varying degrees, ranging from referenced to integrated. In ten studies, the KTA Framework was integral to the design, delivery and evaluation of the implementation activities. All ten described using the Action Cycle and seven referred to Knowledge Creation. The KTA Framework was enacted in different health care and academic settings with projects targeted at patients, the public, and nursing and allied health professionals. The KTA Framework is being used in practice with varying degrees of completeness. It is frequently cited, with usage ranging from simple attribution via a reference, through informing planning, to making an intellectual contribution. When the framework was integral to knowledge translation, it guided action in idiosyncratic ways and there was theory fidelity. Prevailing wisdom encourages the use of theories, models and conceptual frameworks, yet their application is less evident in practice. This may be an artefact of reporting, indicating that prospective, primary research is needed to explore the real value of the KTA Framework and similar tools.
Bainbridge, Daryl; Brazil, Kevin; Ploeg, Jenny; Krueger, Paul; Taniguchi, Alan
2016-06-01
Healthcare integration is a priority in many countries, yet there remains little direction on how to systematically evaluate this construct to inform further development. The examination of community-based palliative care networks provides an ideal opportunity for the advancement of integration measures, in consideration of how fundamental provider cohesion is to effective care at end of life. This article presents a variable-oriented analysis from a theory-based case study of a palliative care network to help bridge the knowledge gap in integration measurement. Data from a mixed-methods case study were mapped to a conceptual framework for evaluating integrated palliative care and a visual array depicting the extent of key factors in the represented palliative care network was formulated. The study included data from 21 palliative care network administrators, 86 healthcare professionals, and 111 family caregivers, all from an established palliative care network in Ontario, Canada. The framework used to guide this research proved useful in assessing qualities of integration and functioning in the palliative care network. The resulting visual array of elements illustrates that while this network performed relatively well at the multiple levels considered, room for improvement exists, particularly in terms of interventions that could facilitate the sharing of information. This study, along with the other evaluative examples mentioned, represents important initial attempts at empirically and comprehensively examining network-integrated palliative care and healthcare integration in general. © The Author(s) 2016.
Collaborative Learning Utilizing a Domain-Based Shared Data Repository to Enhance Learning Outcomes
ERIC Educational Resources Information Center
Lubliner, David; Widmeyer, George; Deek, Fadi P.
2009-01-01
The objective of this study was to determine whether there was a quantifiable improvement in learning outcomes by integrating course materials in a 4-year baccalaureate program, utilizing a knowledge repository with a conceptual map that spans a discipline. Two new models were developed to provide the framework for this knowledge repository. A…
A systematic framework for Monte Carlo simulation of remote sensing errors map in carbon assessments
S. Healey; P. Patterson; S. Urbanski
2014-01-01
Remotely sensed observations can provide a unique perspective on how management and natural disturbance affect carbon stocks in forests. However, integration of these observations into formal decision support will rely upon improved uncertainty accounting. Monte Carlo (MC) simulations offer a practical, empirical method of accounting for potential remote sensing errors...
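A hedged sketch of the Monte Carlo idea: perturb per-pixel carbon estimates with an assumed error model many times and report the spread of the landscape total. All numbers below are illustrative.

```python
# Hedged Monte Carlo error-accounting sketch: the error magnitudes and carbon
# values are invented, not taken from the cited work.
import numpy as np

rng = np.random.default_rng(1)
pixel_carbon = rng.uniform(40, 120, size=10_000)      # Mg C per pixel (toy values)

def mc_total(pixel_carbon, rel_error=0.25, n_sims=1000):
    totals = np.empty(n_sims)
    for i in range(n_sims):
        noise = rng.normal(1.0, rel_error, size=pixel_carbon.size)
        totals[i] = (pixel_carbon * noise).sum()
    return totals

totals = mc_total(pixel_carbon)
print(np.percentile(totals, [2.5, 50, 97.5]))          # uncertainty interval
```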
NASA Astrophysics Data System (ADS)
Saleh, F.; Ramaswamy, V.; Georgas, N.; Blumberg, A. F.; Wang, Y.
2016-12-01
Advances in computational resources and modeling techniques are opening the path to effectively integrate existing complex models. In the context of flood prediction, recent extreme events have demonstrated the importance of integrating components of the hydrosystem to better represent the interactions amongst different physical processes and phenomena. As such, there is a pressing need to develop holistic and cross-disciplinary modeling frameworks that effectively integrate existing models and better represent the operative dynamics. This work presents a novel Hydrologic-Hydraulic-Hydrodynamic Ensemble (H3E) flood prediction framework that operationally integrates existing predictive models representing coastal (New York Harbor Observing and Prediction System, NYHOPS), hydrologic (US Army Corps of Engineers Hydrologic Modeling System, HEC-HMS) and hydraulic (2-dimensional River Analysis System, HEC-RAS) components. The state-of-the-art framework is forced with 125 ensemble meteorological inputs from numerical weather prediction models including the Global Ensemble Forecast System, the European Centre for Medium-Range Weather Forecasts (ECMWF), the Canadian Meteorological Centre (CMC), the Short Range Ensemble Forecast (SREF) and the North American Mesoscale Forecast System (NAM). The framework produces, within a 96-hour forecast horizon, on-the-fly Google Earth flood maps that provide critical information for decision makers and emergency preparedness managers. The utility of the framework was demonstrated by retrospectively forecasting an extreme flood event, Hurricane Sandy, in the Passaic and Hackensack watersheds (New Jersey, USA). Hurricane Sandy caused significant damage to a number of critical facilities in this area including New Jersey Transit's main storage and maintenance facility. The results of this work demonstrate that ensemble-based frameworks provide improved flood predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared to a deterministic forecast. The work offers perspectives for short-term flood forecasts, flood mitigation strategies and best management practices for climate change scenarios.
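One simple way to turn such an ensemble into a flood map is a per-cell exceedance probability across members; the sketch below is a generic illustration with invented depths and thresholds, not part of the H3E framework.

```python
# Hedged sketch: convert an ensemble of simulated water depths into exceedance
# probabilities for mapping (the threshold and random depths are illustrative).
import numpy as np

def exceedance_probability(depths, threshold=0.5):
    """depths: (n_members, rows, cols) simulated flood depth per ensemble member."""
    return (depths > threshold).mean(axis=0)   # fraction of members flooding each cell

depths = np.random.gamma(shape=1.5, scale=0.4, size=(125, 50, 50))
prob_map = exceedance_probability(depths)      # values in [0, 1] for the flood map
```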
A Conceptual Framework for Indoor Mapping by Using Grammars
NASA Astrophysics Data System (ADS)
Hu, X.; Fan, H.; Zipf, A.; Shang, J.; Gu, F.
2017-09-01
Maps are the foundation of indoor location-based services. Many automatic indoor mapping approaches have been proposed, but they rely heavily on sensor data, such as point clouds and users' location traces. To address this issue, this paper presents a conceptual framework to represent the layout principle of research buildings by using grammars. This framework can benefit the indoor mapping process by improving the accuracy of generated maps and by dramatically reducing the volume of the sensor data required by traditional reconstruction approaches. In addition, we present further details of several core modules of the framework. An example using the proposed framework is given to show the generation process of a semantic map. This framework is part of ongoing research on the development of an approach for reconstructing semantic maps.
NASA Astrophysics Data System (ADS)
Hassanzadeh, Elmira; Elshorbagy, Amin; Wheater, Howard; Gober, Patricia
2015-04-01
Climate uncertainty can affect water resources availability and management decisions. Sustainable water resources management therefore requires evaluation of policy and management decisions under a wide range of possible future water supply conditions. This study proposes a risk-based framework to integrate water supply uncertainty into a forward-looking decision making context. To apply this framework, a stochastic reconstruction scheme is used to generate a large ensemble of flow series. For the Rocky Mountain basins considered here, two key characteristics of the annual hydrograph are its annual flow volume and the timing of the seasonal flood peak. These are perturbed to represent natural randomness and potential changes due to future climate. Thirty-year series of perturbed flows are used as input to the SWAMP model - an integrated water resources model that simulates the regional water supply-demand system and estimates economic productivity of water and other sustainability indicators, including system vulnerability and resilience. The simulation results are used to construct 2D maps of net revenue of a particular water sector, e.g., hydropower, or for all sectors combined. Each map cell represents a risk scenario of net revenue based on a particular annual flow volume, timing of the peak flow, and 200 stochastic realizations of flow series. This framework is demonstrated for a water resources system in the Saskatchewan River Basin (SaskRB) in Saskatchewan, Canada. Critical historical drought sequences, derived from tree-ring reconstructions of several hundred years of annual river flows, are used to evaluate the system's performance (net revenue risk) under extremely low flow conditions and also to locate them on the previously produced 2D risk maps. This simulation and analysis framework is repeated under various reservoir operation strategies (e.g., maximizing flood protection or maximizing water supply security); development proposals, such as irrigation expansion; and changes in energy prices. Such risk-based analysis demonstrates the relative reduction or increase of risk associated with management and policy decisions and allows decision makers to explore the relative importance of policy versus natural water supply change in a water resources system.
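A minimal sketch of how such a 2D risk map could be assembled is given below: each cell of a (flow volume, peak timing) grid is filled with a low-percentile net revenue computed from stochastic realizations. The revenue function is a simple placeholder, not the SWAMP model, and all numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Axes of the 2D map: perturbed annual flow volume (fraction of historical mean)
# and peak-flow timing shift (weeks); values are illustrative assumptions.
volume_factors = np.linspace(0.5, 1.5, 11)
timing_shifts  = np.arange(-4, 5)            # weeks earlier (-) or later (+)
n_realizations = 200                         # stochastic flow series per cell

def net_revenue(volume, shift, noise):
    """Placeholder for a SWAMP-style revenue response of one flow realization."""
    return 100.0 * volume - 3.0 * abs(shift) + noise

risk_map = np.empty((volume_factors.size, timing_shifts.size))
for i, v in enumerate(volume_factors):
    for j, s in enumerate(timing_shifts):
        noise = rng.normal(0.0, 10.0, n_realizations)
        revenues = net_revenue(v, s, noise)
        # Risk metric per cell: e.g., the 10th-percentile ("bad year") net revenue.
        risk_map[i, j] = np.percentile(revenues, 10)

print(risk_map.shape)        # (11, 9) grid of risk scenarios
print(risk_map.round(1)[0])  # lowest-volume row
```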
Cloudgene: A graphical execution platform for MapReduce programs on private and public clouds
2012-01-01
Background The MapReduce framework enables scalable processing and analysis of large datasets by distributing the computational load on connected computer nodes, referred to as a cluster. In Bioinformatics, MapReduce has already been applied in various scenarios such as mapping next generation sequencing data to a reference genome, finding SNPs from short read data or matching strings in genotype files. Nevertheless, tasks like installing and maintaining MapReduce on a cluster system, importing data into its distributed file system or executing MapReduce programs require advanced knowledge in computer science and could thus prevent scientists from using currently available and useful software solutions. Results Here we present Cloudgene, a freely available platform to improve the usability of MapReduce programs in Bioinformatics by providing a graphical user interface for the execution, the import and export of data and the reproducibility of workflows on in-house (private clouds) and rented clusters (public clouds). The aim of Cloudgene is to build a standardized graphical execution environment for currently available and future MapReduce programs, which can all be integrated by using its plug-in interface. Since Cloudgene can be executed on private clusters, sensitive datasets can be kept in house at all times and data transfer times are therefore minimized. Conclusions Our results show that MapReduce programs can be integrated into Cloudgene with little effort and without adding any computational overhead to existing programs. This platform gives developers the opportunity to focus on the actual implementation task and provides scientists with a platform that hides the complexity of MapReduce. In addition to MapReduce programs, Cloudgene can also be used to launch predefined systems (e.g. Cloud BioLinux, RStudio) in public clouds. Currently, five different bioinformatic programs using MapReduce and two systems are integrated and have been successfully deployed. Cloudgene is freely available at http://cloudgene.uibk.ac.at. PMID:22888776
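For readers unfamiliar with the programming model Cloudgene wraps, the sketch below shows the map/reduce pattern in plain Python on toy sequencing reads (counting 3-mers). It is not Cloudgene or Hadoop code; in a real cluster the map calls would run in parallel across nodes and the framework would shuffle the emitted pairs before reducing.

```python
from collections import defaultdict
from itertools import chain

reads = ["ACGTAC", "GTACGT", "ACGTTT"]   # toy "short read" inputs

def mapper(read, k=3):
    """Map step: emit (k-mer, 1) pairs for one read."""
    return [(read[i:i + k], 1) for i in range(len(read) - k + 1)]

def reducer(pairs):
    """Reduce step: sum counts per key, as a MapReduce framework does after shuffling."""
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

# Sequential stand-in for the distributed execution.
mapped = chain.from_iterable(mapper(r) for r in reads)
print(reducer(mapped))
# {'ACG': 3, 'CGT': 3, 'GTA': 2, 'TAC': 2, 'GTT': 1, 'TTT': 1}
```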
Local adaptive tone mapping for video enhancement
NASA Astrophysics Data System (ADS)
Lachine, Vladimir; Dai, Min
2015-03-01
As new technologies like High Dynamic Range cameras, AMOLED and high resolution displays emerge on the consumer electronics market, it becomes very important to deliver the best picture quality for mobile devices. Tone Mapping (TM) is a popular technique to enhance visual quality. However, the traditional implementation of the Tone Mapping procedure is limited to pixel value-to-value mapping, and the performance is restricted in terms of local sharpness and colorfulness. To overcome the drawbacks of traditional TM, we propose a spatial-frequency based framework in this paper. In the proposed solution, the intensity component of an input video/image signal is split into low-pass filtered (LPF) and high-pass filtered (HPF) bands. A TM function is applied to the LPF band to improve the global contrast/brightness, and the HPF band is added back afterwards to keep the local contrast. The HPF band may be adjusted by a coring function to avoid noise boosting and signal overshooting. The colorfulness of the original image may be preserved or enhanced by correcting the chroma components by means of a saturation function. Localized content adaptation is further improved by dividing the image into a set of non-overlapping regions and modifying each region individually. The suggested framework allows users to implement a wide range of tone mapping applications with perceptual local sharpness and colorfulness preserved or enhanced. The corresponding hardware circuit may be integrated into a camera, video or display pipeline with a minimal hardware budget.
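A compact NumPy sketch of the described spatial-frequency split is given below: the low-pass band is tone-mapped with a global curve and a cored high-pass band is added back to preserve local detail. The filter choice, gamma curve, and coring threshold are assumptions for illustration, not the authors' hardware design.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_tone_map(y, gamma=0.6, kernel=15, coring=0.01):
    """Tone-map an intensity channel y in [0, 1] using an LPF/HPF decomposition.

    The global curve (here a simple gamma) is applied to the low-pass band only,
    and the high-pass detail is added back after a coring step that suppresses
    low-amplitude noise. All parameter values are illustrative assumptions.
    """
    lpf = uniform_filter(y, size=kernel)            # low-pass band (local mean)
    hpf = y - lpf                                   # high-pass band (local detail)
    hpf = np.where(np.abs(hpf) < coring, 0.0, hpf)  # coring: drop tiny fluctuations
    mapped = lpf ** gamma                           # global brightness/contrast curve
    return np.clip(mapped + hpf, 0.0, 1.0)

# Toy example: a dark gradient with a bright detail spot.
y = np.tile(np.linspace(0.05, 0.4, 64), (64, 1))
y[30:34, 30:34] = 0.8
out = local_tone_map(y)
print(y.mean().round(3), out.mean().round(3))  # overall brightness increases
```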
Cross-Population Joint Analysis of eQTLs: Fine Mapping and Functional Annotation
Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger
2015-01-01
Mapping expression quantitative trait loci (eQTLs) has been shown to be a powerful tool to uncover the genetic underpinnings of many complex traits at the molecular level. In this paper, we present an integrative analysis approach that leverages eQTL data collected from multiple population groups. In particular, our approach effectively identifies multiple independent cis-eQTL signals that are consistent across populations, accounting for population heterogeneity in allele frequencies and linkage disequilibrium patterns. Furthermore, by integrating genomic annotations, our analysis framework enables high-resolution functional analysis of eQTLs. We applied our statistical approach to analyze the GEUVADIS data consisting of samples from five population groups. From this analysis, we concluded that (i) joint analysis across population groups greatly improves the power of eQTL discovery and the resolution of fine mapping of causal eQTLs; (ii) many genes harbor multiple independent eQTLs in their cis regions; and (iii) genetic variants that disrupt transcription factor binding are significantly enriched in eQTLs (p-value = 4.93 × 10⁻²²). PMID:25906321
USDA-ARS?s Scientific Manuscript database
Verticillium wilt (VW) of alfalfa is a soilborne disease that causes severe yield loss in alfalfa. To identify molecular markers associated with VW resistance, an integrated framework of genome-wide association study (GWAS) with high-throughput genotyping by sequencing (GBS) was used for mapping lo...
ERIC Educational Resources Information Center
Charles, Leslin H.
2017-01-01
Many academic librarians in the state of New Jersey (NJ) have successfully integrated information literacy (IL) into the curriculum using the ACRL IL Competency Standards for Higher Education ("Standards"). These "Standards" formed the underpinnings of IL curriculum mapping and assessment plans, and have been adopted by…
HOTEX: An Approach for Global Mapping of Human Built-Up and Settlement Extent
NASA Technical Reports Server (NTRS)
Wang, Panshi; Huang, Chengquan; Tilton, James C.; Tan, Bin; Brown De Colstoun, Eric C.
2017-01-01
Understanding the impacts of urbanization requires accurate and updatable urban extent maps. Here we present an algorithm for mapping urban extent at global scale using Landsat data. An innovative hierarchical object-based texture (HOTEX) classification approach was designed to overcome spectral confusion between urban and nonurban land cover types. VIIRS nightlights data and MODIS vegetation index datasets are integrated as high-level features under an object-based framework. We applied the HOTEX method to the GLS-2010 Landsat images to produce a global map of human built-up and settlement extent. As shown by visual assessments, our method could effectively map urban extent and generate consistent results using images with inconsistent acquisition time and vegetation phenology. Using scene-level cross validation for results in Europe, we assessed the performance of HOTEX and achieved a kappa coefficient of 0.91, compared to 0.74 for a baseline per-pixel classification based on spectral information alone.
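The kappa coefficient reported above can be computed from a validation confusion matrix as kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance. The short sketch below uses invented counts, not the paper's validation data.

```python
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix of validation counts."""
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    p_o = np.trace(confusion) / n                              # observed agreement
    p_e = (confusion.sum(0) * confusion.sum(1)).sum() / n**2   # chance agreement
    return (p_o - p_e) / (1.0 - p_e)

# Hypothetical 2x2 validation table: rows = reference (built-up, not built-up),
# columns = classified; the counts are invented for illustration.
table = [[90, 10],
         [ 5, 95]]
print(round(cohens_kappa(table), 3))  # 0.85 for these made-up counts
```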
Jairin, Jirapong; Kobayashi, Tetsuya; Yamagata, Yoshiyuki; Sanada-Morimura, Sachiyo; Mori, Kazuki; Tashiro, Kosuke; Kuhara, Satoru; Kuwazaki, Seigo; Urio, Masahiro; Suetsugu, Yoshitaka; Yamamoto, Kimiko; Matsumura, Masaya; Yasui, Hideshi
2013-01-01
In this study, we developed the first genetic linkage map for the major rice insect pest, the brown planthopper (BPH, Nilaparvata lugens). The linkage map was constructed by integrating linkage data from two backcross populations derived from three inbred BPH strains. The consensus map consists of 474 simple sequence repeats, 43 single-nucleotide polymorphisms, and 1 sequence-tagged site, for a total of 518 markers at 472 unique positions in 17 linkage groups. The linkage groups cover 1093.9 cM, with an average distance of 2.3 cM between loci. The average number of marker loci per linkage group was 27.8. The sex-linkage group was identified by exploiting X-linked and Y-specific markers. Our linkage map and the newly developed markers used to create it constitute an essential resource and a useful framework for future genetic analyses in BPH. PMID:23204257
A service-based framework for pharmacogenomics data integration
NASA Astrophysics Data System (ADS)
Wang, Kun; Bai, Xiaoying; Li, Jing; Ding, Cong
2010-08-01
Data are central to scientific research and practices. The advance of experimental methods and information retrieval technologies has led to explosive growth of scientific data and databases. However, due to heterogeneity in data formats, structures and semantics, it is hard to integrate these rapidly growing, diverse data and analyse them comprehensively. As more and more public databases are accessible through standard protocols like programmable interfaces and Web portals, Web-based data integration becomes a major trend to manage and synthesise data that are stored in distributed locations. Mashup, a Web 2.0 technique, presents a new way to compose content and software from multiple resources. The paper proposes a layered framework for integrating pharmacogenomics data in a service-oriented approach using the mashup technology. The framework separates the integration concerns from three perspectives including data, process and Web-based user interface. Each layer encapsulates the heterogeneous issues of one aspect. To facilitate the mapping and convergence of data, the ontology mechanism is introduced to provide consistent conceptual models across different databases and experiment platforms. To support user-interactive and iterative service orchestration, a context model is defined to capture information of users, tasks and services, which can be used for service selection and recommendation during a dynamic service composition process. A prototype system is implemented and case studies are presented to illustrate the promising capabilities of the proposed approach.
Semantic Integration for Marine Science Interoperability Using Web Technologies
NASA Astrophysics Data System (ADS)
Rueda, C.; Bermudez, L.; Graybeal, J.; Isenor, A. W.
2008-12-01
The Marine Metadata Interoperability Project, MMI (http://marinemetadata.org) promotes the exchange, integration, and use of marine data through enhanced data publishing, discovery, documentation, and accessibility. A key effort is the definition of an Architectural Framework and Operational Concept for Semantic Interoperability (http://marinemetadata.org/sfc), which is complemented with the development of tools that realize critical use cases in semantic interoperability. In this presentation, we describe a set of such Semantic Web tools that allow performing important interoperability tasks, ranging from the creation of controlled vocabularies and the mapping of terms across multiple ontologies, to the online registration, storage, and search services needed to work with the ontologies (http://mmisw.org). This set of services uses Web standards and technologies, including the Resource Description Framework (RDF), the Web Ontology Language (OWL), Web services, and toolkits for Rich Internet Application development. We will describe the following components: MMI Ontology Registry: The MMI Ontology Registry and Repository provides registry and storage services for ontologies. Entries in the registry are associated with projects defined by the registered users. Also, sophisticated search functions, for example according to metadata items and vocabulary terms, are provided. Client applications can submit search requests using the W3C SPARQL Query Language for RDF. Voc2RDF: This component converts an ASCII comma-delimited set of terms and definitions into an RDF file. Voc2RDF facilitates the creation of controlled vocabularies by using a simple form-based user interface. Created vocabularies and their descriptive metadata can be submitted to the MMI Ontology Registry for versioning and community access. VINE: The Vocabulary Integration Environment component allows the user to map vocabulary terms across multiple ontologies. Various relationships can be established, for example exactMatch, narrowerThan, and subClassOf. VINE can compute inferred mappings based on the given associations. Attributes about each mapping, like comments and a confidence level, can also be included. VINE also supports registering and storing resulting mapping files in the Ontology Registry. The presentation will describe the application of semantic technologies in general, and our planned applications in particular, to solve data management problems in the marine and environmental sciences.
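As an illustration of the kind of term-list-to-RDF conversion Voc2RDF performs, the sketch below builds a tiny controlled vocabulary with the rdflib library; the namespace URI and the choice of RDFS predicates are illustrative assumptions rather than the MMI schema.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS, URIRef

# Illustrative namespace; the real MMI registry assigns its own ontology URIs.
VOCAB = Namespace("http://example.org/vocab/")

# A comma-delimited term list of the sort Voc2RDF accepts: (term, definition).
terms = [
    ("SeaSurfaceTemperature", "Temperature of the ocean surface layer."),
    ("Salinity", "Mass of dissolved salts per unit mass of seawater."),
]

g = Graph()
g.bind("vocab", VOCAB)
for name, definition in terms:
    uri = URIRef(VOCAB[name])
    g.add((uri, RDF.type, RDFS.Class))          # each term becomes a class/concept
    g.add((uri, RDFS.label, Literal(name)))
    g.add((uri, RDFS.comment, Literal(definition)))

print(g.serialize(format="turtle"))
```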
Merlin, Jessica S; Young, Sarah R; Johnson, Mallory O; Saag, Michael; Demonte, William; Kerns, Robert; Bair, Matthew J; Kertesz, Stefan; Turan, Janet M; Kilgore, Meredith; Clay, Olivio J; Pekmezi, Dorothy; Davies, Susan
2018-06-01
Chronic pain is an important comorbidity among individuals with HIV. Behavioral interventions are widely regarded as evidence-based, efficacious non-pharmacologic interventions for chronic pain in the general population. An accepted principle in behavioral science is that theory-based, systematically-developed behavioral interventions tailored to the unique needs of a target population are most likely to be efficacious. Our aim was to use Intervention Mapping to systematically develop a Social Cognitive Theory (SCT)-based intervention for chronic pain tailored to individuals with HIV that will improve pain intensity and pain-related functional impairment. Our Intervention Mapping process was informed by qualitative inquiry of 24 patients and seven providers in an HIV primary care clinic. The resulting intervention includes group and one-on-one sessions and peer and staff interventionists. We also developed a conceptual framework that integrates our qualitative findings with SCT-based theoretical constructs. Using this conceptual framework as a guide, our future work will investigate the intervention's impact on chronic pain outcomes, as well as our hypothesized proximal mediators of the intervention's effect.
Lamort-Bouché, Marion; Sarnin, Philippe; Kok, Gerjo; Rouat, Sabrina; Péron, Julien; Letrilliart, Laurent; Fassier, Jean-Baptiste
2018-04-01
The Intervention Mapping (IM) protocol provides a structured framework to develop, implement, and evaluate complex interventions. The main objective of this review was to identify and describe the content of the interventions developed in the field of cancer with the IM protocol. Secondary objectives were to assess their fidelity to the IM protocol and to review their theoretical frameworks. Medline, Web of Science, PsycINFO, PASCAL, FRANCIS, and BDSP databases were searched. All titles and abstracts were reviewed. A standardized extraction form was developed. All included studies were reviewed by 2 reviewers blinded to each other. Sixteen studies were identified, and these reported 15 interventions. The objectives were to increase cancer screening participation (n = 7), early consultation (n = 1), and aftercare/quality of life among cancer survivors (n = 7). Six reported a complete participatory planning group, and 7 described a complete logic model of the problem. Ten studies described a complete logic model of change. The main theoretical frameworks used were the theory of planned behaviour (n = 8), the transtheoretical model (n = 6), the health belief model (n = 6), and the social cognitive theory (n = 6). The environment was rarely integrated into the interventions (n = 4). Five interventions were reported as effective. Culturally relevant interventions developed with the IM protocol were effective in increasing cancer screening and reducing social disparities, particularly when they were developed through a participative approach and integrated the environment. Stakeholders' involvement and the role of the environment were heterogeneously integrated into the interventions. Copyright © 2017 John Wiley & Sons, Ltd.
Glusman, Gustavo; Rose, Peter W; Prlić, Andreas; Dougherty, Jennifer; Duarte, José M; Hoffman, Andrew S; Barton, Geoffrey J; Bendixen, Emøke; Bergquist, Timothy; Bock, Christian; Brunk, Elizabeth; Buljan, Marija; Burley, Stephen K; Cai, Binghuang; Carter, Hannah; Gao, JianJiong; Godzik, Adam; Heuer, Michael; Hicks, Michael; Hrabe, Thomas; Karchin, Rachel; Leman, Julia Koehler; Lane, Lydie; Masica, David L; Mooney, Sean D; Moult, John; Omenn, Gilbert S; Pearl, Frances; Pejaver, Vikas; Reynolds, Sheila M; Rokem, Ariel; Schwede, Torsten; Song, Sicheng; Tilgner, Hagen; Valasatava, Yana; Zhang, Yang; Deutsch, Eric W
2017-12-18
The translation of personal genomics to precision medicine depends on the accurate interpretation of the multitude of genetic variants observed for each individual. However, even when genetic variants are predicted to modify a protein, their functional implications may be unclear. Many diseases are caused by genetic variants affecting important protein features, such as enzyme active sites or interaction interfaces. The scientific community has catalogued millions of genetic variants in genomic databases and thousands of protein structures in the Protein Data Bank. Mapping mutations onto three-dimensional (3D) structures enables atomic-level analyses of protein positions that may be important for the stability or formation of interactions; these may explain the effect of mutations and in some cases even open a path for targeted drug development. To accelerate progress in the integration of these data types, we held a two-day Gene Variation to 3D (GVto3D) workshop to report on the latest advances and to discuss unmet needs. The overarching goal of the workshop was to address the question: what can be done together as a community to advance the integration of genetic variants and 3D protein structures that could not be done by a single investigator or laboratory? Here we describe the workshop outcomes, review the state of the field, and propose the development of a framework with which to promote progress in this arena. The framework will include a set of standard formats, common ontologies, a common application programming interface to enable interoperation of the resources, and a Tool Registry to make it easy to find and apply the tools to specific analysis problems. Interoperability will enable integration of diverse data sources and tools and collaborative development of variant effect prediction methods.
Efficient in-situ visualization of unsteady flows in climate simulation
NASA Astrophysics Data System (ADS)
Vetter, Michael; Olbrich, Stephan
2017-04-01
The simulation of climate data tends to produce very large data sets, which can hardly be processed in classical post-processing visualization applications. Typically, the visualization pipeline, consisting of the data generation, visualization mapping and rendering processes, is distributed into two parts over the network or separated via file transfer. Within most traditional post-processing scenarios the simulation is done on a supercomputer whereas the data analysis and visualization are done on a graphics workstation. In this way, temporary data sets of huge volume have to be transferred over the network, which leads to bandwidth bottlenecks and volume limitations. The solution to this issue is the avoidance of temporary storage, or at least a significant reduction of data complexity. Within the Climate Visualization Lab - as part of the Cluster of Excellence "Integrated Climate System Analysis and Prediction" (CliSAP) at the University of Hamburg, in cooperation with the German Climate Computing Center (DKRZ) - we develop and integrate an in-situ approach. Our software framework DSVR is based on the separation of the process chain between the mapping and the rendering processes. It couples the mapping process directly to the simulation by calling methods of a parallelized data extraction library, which creates a time-based sequence of geometric 3D scenes. This sequence is stored on a special streaming server with an interactive post-filtering option and then played out asynchronously in a separate 3D viewer application. Since the rendering is part of this viewer application, the scenes can be navigated interactively. In contrast to other in-situ approaches where 2D images are created as part of the simulation or synchronous co-visualization takes place, our method supports interaction in 3D space and in time, as well as fixed frame rates. To integrate in-situ processing based on our DSVR framework and methods in the ICON climate model, we are continuously evolving the data structures and mapping algorithms of the framework to support the ICON model's native grid structures, since DSVR was originally designed for rectilinear grids only. We have now implemented a new output module for ICON to take advantage of the DSVR visualization. The visualization can be configured, like most output modules, by using a specific namelist and is exemplarily integrated within the non-hydrostatic atmospheric model time loop. With the integration of a DSVR-based in-situ pathline extraction within ICON, a further milestone is reached. The pathline algorithm as well as the grid data structures have been optimized for the domain decomposition used for the parallelization of ICON based on MPI and OpenMP. The software implementation and evaluation are done on the supercomputers at DKRZ. In principle, the data complexity is reduced from O(n³) to O(m), where n is the grid resolution and m the number of supporting points of all pathlines. The stability and scalability evaluation is done using Atmospheric Model Intercomparison Project (AMIP) runs. We will give a short introduction to our software framework, as well as a short overview of the implementation and usage of DSVR within ICON. Furthermore, we will present visualization and evaluation results of sample applications.
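The pathline idea behind the in-situ extraction can be sketched as follows: seed points are advected through a time-dependent velocity field and only their supporting points (O(m)) are kept, instead of the full gridded field (O(n³)) at every time step. The analytic velocity field and the explicit Euler integrator below are assumptions for illustration; the DSVR implementation works on ICON's native grids in parallel.

```python
import numpy as np

def velocity(points, t):
    """Placeholder time-dependent 2D velocity field (a slowly accelerating swirl)."""
    x, y = points[:, 0], points[:, 1]
    return np.column_stack((-y * (1 + 0.1 * t), x * (1 + 0.1 * t)))

def integrate_pathlines(seeds, t0=0.0, dt=0.05, n_steps=100):
    """Explicit-Euler pathline integration; returns (n_steps+1, n_seeds, 2) positions."""
    positions = [np.asarray(seeds, dtype=float)]
    t = t0
    for _ in range(n_steps):
        p = positions[-1]
        positions.append(p + dt * velocity(p, t))
        t += dt
    return np.stack(positions)

seeds = np.array([[1.0, 0.0], [0.5, 0.5]])
paths = integrate_pathlines(seeds)
# Only these supporting points (O(m)) would be streamed to the viewer,
# not the full 3D field at every time step (O(n^3)).
print(paths.shape)  # (101, 2, 2)
```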
Framework for National Flood Risk Assessment for Canada
NASA Astrophysics Data System (ADS)
Elshorbagy, A. A.; Raja, B.; Lakhanpal, A.; Razavi, S.; Ceola, S.; Montanari, A.
2016-12-01
Worldwide, floods are among the most common and widely recognized catastrophic events, resulting in the loss of life and property. These natural hazards cannot be avoided, but their consequences can certainly be reduced by having prior knowledge of their occurrence and impact. In the context of floods, the terms occurrence and impact are substituted by flood hazard and flood vulnerability, respectively, which collectively define the flood risk. There is a pressing need to identify flood-prone areas and to quantify the risk associated with them. The present study aims at delivering flood risk maps, which prioritize the potential flood risk areas in Canada. The methodology adopted in this study involves integrating various available spatial datasets such as nightlights satellite imagery, land use, population and the digital elevation model, to build a flexible framework for national flood risk assessment for Canada. The flood risk framework assists in identifying the flood-prone areas and evaluating the associated risk. All these spatial datasets were brought to a common GIS platform for flood risk analysis. The spatial datasets deliver the socioeconomic and topographical information that is required for evaluating the flood vulnerability and flood hazard, respectively. Nightlights have been investigated as a proxy for human activities, in order to identify areas of economic investment. In addition, other datasets, including existing flood protection measures, were added to arrive at a realistic flood assessment framework. Furthermore, the city of Calgary was used as an example to investigate the effect of using Digital Elevation Models (DEMs) of varying resolutions on risk maps. Along with this, the risk map for the city was further enhanced by including population data to add a social dimension. Flood protection measures play a major role by significantly reducing the flood risk of events with a specific return period. An analysis to update the risk maps when information on protection measures is available was carried out for the city of Winnipeg, Canada. The proposed framework is a promising approach to identify and prioritize flood-prone areas, which are in need of intervention or detailed studies.
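A minimal raster-overlay sketch of the framework's core idea is given below: a hazard proxy and a vulnerability proxy built from nightlights and population layers are normalised and combined into a relative risk surface. The layers, weights, and normalisation are assumptions for illustration only, not the study's actual datasets or weighting.

```python
import numpy as np

# Toy co-registered raster layers on a common GIS grid (values are invented).
elevation   = np.array([[2., 5., 9.], [1., 4., 8.], [0.5, 3., 7.]])   # m above river
nightlights = np.array([[10., 60., 5.], [80., 40., 2.], [90., 20., 1.]])
population  = np.array([[50., 300., 10.], [400., 200., 5.], [500., 100., 2.]])

def normalise(layer):
    """Rescale a layer to [0, 1] so different units can be combined."""
    return (layer - layer.min()) / (layer.max() - layer.min())

# Hazard: low-lying cells are more flood-prone (simple inverse-elevation proxy).
hazard = 1.0 - normalise(elevation)
# Vulnerability: equal-weight blend of economic (nightlights) and social (population) proxies.
vulnerability = 0.5 * normalise(nightlights) + 0.5 * normalise(population)

risk = hazard * vulnerability
print(np.round(risk, 2))   # highest values flag priority cells for detailed study
```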
Kawakami, Eiryo; Singh, Vivek K; Matsubara, Kazuko; Ishii, Takashi; Matsuoka, Yukiko; Hase, Takeshi; Kulkarni, Priya; Siddiqui, Kenaz; Kodilkar, Janhavi; Danve, Nitisha; Subramanian, Indhupriya; Katoh, Manami; Shimizu-Yoshida, Yuki; Ghosh, Samik; Jere, Abhay; Kitano, Hiroaki
2016-01-01
Cellular stress responses require exquisite coordination between intracellular signaling molecules to integrate multiple stimuli and actuate specific cellular behaviors. Deciphering the web of complex interactions underlying stress responses is a key challenge in understanding robust biological systems and has the potential to lead to the discovery of targeted therapeutics for diseases triggered by dysregulation of stress response pathways. We constructed large-scale molecular interaction maps of six major stress response pathways in Saccharomyces cerevisiae (baker’s or budding yeast). Biological findings from over 900 publications were converted into standardized graphical formats and integrated into a common framework. The maps are posted at http://www.yeast-maps.org/yeast-stress-response/ for browse and curation by the research community. On the basis of these maps, we undertook systematic analyses to unravel the underlying architecture of the networks. A series of network analyses revealed that yeast stress response pathways are organized in bow–tie structures, which have been proposed as universal sub-systems for robust biological regulation. Furthermore, we demonstrated a potential role for complexes in stabilizing the conserved core molecules of bow–tie structures. Specifically, complex-mediated reversible reactions, identified by network motif analyses, appeared to have an important role in buffering the concentration and activity of these core molecules. We propose complex-mediated reactions as a key mechanism mediating robust regulation of the yeast stress response. Thus, our comprehensive molecular interaction maps provide not only an integrated knowledge base, but also a platform for systematic network analyses to elucidate the underlying architecture in complex biological systems. PMID:28725465
Planetary mapping—The datamodel's perspective and GIS framework
NASA Astrophysics Data System (ADS)
van Gasselt, S.; Nass, A.
2011-09-01
Demands for a broad range of integrated geospatial data-analysis tools and methods for planetary data organization have been growing considerably since the late 1990s when a plethora of missions equipped with new instruments entered planetary orbits or landed on the surface. They sent back terabytes of new data which soon became accessible for the scientific community and public and which needed to be organized. On the terrestrial side, issues of data access, organization and utilization for scientific and economic analyses are handled by using a range of well-established geographic information systems (GIS) that also found their way into the field of planetary sciences in the late 1990s. We here address key issues concerning the field of planetary mapping by making use of established GIS environments and discuss methods of addressing data organization and mapping requirements by using an easily integrable datamodel that is - for the time being - designed as a file-geodatabase (FileGDB) environment in ESRI's ArcGIS. A major design-driving requirement for this datamodel is its extensibility and scalability for growing scientific as well as technical needs, e.g., the utilization of such a datamodel for surface mapping of different planetary objects as defined by their respective reference system and by using different instrument data. Furthermore, it is a major goal to construct a generic model which allows combined geologic as well as geomorphologic mapping tasks to be performed, making use of international standards, without loss of information and while maintaining topologic integrity. An integration of such a datamodel within a geospatial DBMS context can practically be performed by individuals as well as groups without having to deal with the details of administrative tasks and data ingestion issues. Besides the actual mapping, key components of such a mapping datamodel deal with the organization of and search for image-sensor data and previous mapping efforts, as well as the proper organization of cartographic representations and assignments of geologic/geomorphologic units within their stratigraphic context.
ResearchMaps.org for integrating and planning research.
Matiasz, Nicholas J; Wood, Justin; Doshi, Pranay; Speier, William; Beckemeyer, Barry; Wang, Wei; Hsu, William; Silva, Alcino J
2018-01-01
To plan experiments, a biologist needs to evaluate a growing set of empirical findings and hypothetical assertions from diverse fields that use increasingly complex techniques. To address this problem, we operationalized principles (e.g., convergence and consistency) that biologists use to test causal relations and evaluate experimental evidence. With the framework we derived, we then created a free, open-source web application that allows biologists to create research maps, graph-based representations of empirical evidence and hypothetical assertions found in research articles, reviews, and other sources. With our ResearchMaps web application, biologists can systematically reason through the research that is most important to them, as well as evaluate and plan experiments with a breadth and precision that are unlikely without such a tool.
ResearchMaps.org for integrating and planning research
Speier, William; Beckemeyer, Barry; Wang, Wei; Hsu, William; Silva, Alcino J.
2018-01-01
To plan experiments, a biologist needs to evaluate a growing set of empirical findings and hypothetical assertions from diverse fields that use increasingly complex techniques. To address this problem, we operationalized principles (e.g., convergence and consistency) that biologists use to test causal relations and evaluate experimental evidence. With the framework we derived, we then created a free, open-source web application that allows biologists to create research maps, graph-based representations of empirical evidence and hypothetical assertions found in research articles, reviews, and other sources. With our ResearchMaps web application, biologists can systematically reason through the research that is most important to them, as well as evaluate and plan experiments with a breadth and precision that are unlikely without such a tool. PMID:29723213
Mapping a research agenda for the science of team science
Falk-Krzesinski, Holly J; Contractor, Noshir; Fiore, Stephen M; Hall, Kara L; Kane, Cathleen; Keyton, Joann; Klein, Julie Thompson; Spring, Bonnie; Stokols, Daniel; Trochim, William
2012-01-01
An increase in cross-disciplinary, collaborative team science initiatives over the last few decades has spurred interest by multiple stakeholder groups in empirical research on scientific teams, giving rise to an emergent field referred to as the science of team science (SciTS). This study employed a collaborative team science concept-mapping evaluation methodology to develop a comprehensive research agenda for the SciTS field. Its integrative mixed-methods approach combined group process with statistical analysis to derive a conceptual framework that identifies research areas of team science and their relative importance to the emerging SciTS field. The findings from this concept-mapping project constitute a lever for moving SciTS forward at theoretical, empirical, and translational levels. PMID:23223093
Granger-causality maps of diffusion processes.
Wahl, Benjamin; Feudel, Ulrike; Hlinka, Jaroslav; Wächter, Matthias; Peinke, Joachim; Freund, Jan A
2016-02-01
Granger causality is a statistical concept devised to reconstruct and quantify predictive information flow between stochastic processes. Although the general concept can be formulated model-free, it is often considered in the framework of linear stochastic processes. Here we show how local linear model descriptions can be employed to extend Granger causality into the realm of nonlinear systems. This novel treatment results in maps that resolve Granger causality in regions of state space. Through examples we provide a proof of concept and illustrate the utility of these maps. Moreover, by integration we convert the local Granger causality into a global measure that yields a consistent picture for a global Ornstein-Uhlenbeck process. Finally, we recover invariance transformations known from the theory of autoregressive processes.
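For the standard linear case that the local treatment builds on, Granger causality can be estimated by comparing residual variances of a restricted and a full one-lag regression, GC = ln(var_restricted / var_full). The sketch below does this globally for a synthetic bivariate process; the paper's contribution is to fit such models locally in regions of state space, which is not reproduced here.

```python
import numpy as np

def granger_xy(x, y, eps=1e-12):
    """Linear Granger causality y -> x for one lag: ln(var_restricted / var_full)."""
    X_t, X_1, Y_1 = x[1:], x[:-1], y[:-1]
    # Restricted model: x_t ~ x_{t-1}
    A_r = np.column_stack((np.ones_like(X_1), X_1))
    res_r = X_t - A_r @ np.linalg.lstsq(A_r, X_t, rcond=None)[0]
    # Full model: x_t ~ x_{t-1} + y_{t-1}
    A_f = np.column_stack((np.ones_like(X_1), X_1, Y_1))
    res_f = X_t - A_f @ np.linalg.lstsq(A_f, X_t, rcond=None)[0]
    return np.log((res_r.var() + eps) / (res_f.var() + eps))

rng = np.random.default_rng(3)
n = 5000
x = np.zeros(n); y = np.zeros(n)
for t in range(1, n):                    # y drives x, but not vice versa
    y[t] = 0.8 * y[t - 1] + rng.normal()
    x[t] = 0.5 * x[t - 1] + 0.6 * y[t - 1] + rng.normal()

print(round(granger_xy(x, y), 3))        # clearly positive: y Granger-causes x
print(round(granger_xy(y, x), 3))        # near zero: x does not Granger-cause y
```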
Introduction to a special issue on concept mapping.
Trochim, William M; McLinden, Daniel
2017-02-01
Concept mapping was developed in the 1980s as a unique integration of qualitative (group process, brainstorming, unstructured sorting, interpretation) and quantitative (multidimensional scaling, hierarchical cluster analysis) methods designed to enable a group of people to articulate and depict graphically a coherent conceptual framework or model of any topic or issue of interest. This introduction provides the basic definition and description of the methodology for the newcomer and describes the steps typically followed in its most standard canonical form (preparation, generation, structuring, representation, interpretation and utilization). It also introduces this special issue which reviews the history of the methodology, describes its use in a variety of contexts, shows the latest ways it can be integrated with other methodologies, considers methodological advances and developments, and sketches a vision of the future of the method's evolution. Copyright © 2016 Elsevier Ltd. All rights reserved.
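The quantitative core of the method described above (a co-sorting similarity matrix projected with multidimensional scaling and grouped by hierarchical clustering) can be sketched in a few lines; the statements and sort data below are invented for illustration and the library defaults stand in for the method's canonical settings.

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster

statements = ["fund teams", "train mentors", "share data", "open software", "reward collaboration"]

# Toy sorting data: each participant groups statement indices into piles.
sorts = [
    [[0, 4], [1], [2, 3]],
    [[0, 1, 4], [2, 3]],
    [[0, 4], [1, 2], [3]],
]

n = len(statements)
co_sort = np.zeros((n, n))
for piles in sorts:
    for pile in piles:
        for i in pile:
            for j in pile:
                co_sort[i, j] += 1          # times each pair was sorted together

distance = 1.0 - co_sort / len(sorts)       # dissimilarity in [0, 1]
coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(distance)
clusters = fcluster(linkage(coords, method="ward"), t=2, criterion="maxclust")

for s, c in zip(statements, clusters):
    print(f"cluster {c}: {s}")
```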
NASA Astrophysics Data System (ADS)
Maity, Debotyam
This study is aimed at an improved understanding of unconventional reservoirs, which include tight reservoirs (such as shale oil and gas plays), geothermal developments, etc. We provide a framework for improved fracture zone identification and mapping of the subsurface for a geothermal system by integrating data from different sources. The proposed ideas and methods were tested primarily on data obtained from the North Brawley geothermal field and the Geysers geothermal field, in addition to synthetic datasets that were used to test new algorithms before application to the real datasets. The study has resulted in novel or improved algorithms for use at specific stages of data acquisition and analysis, including an improved phase detection technique for passive seismic (and teleseismic) data as well as optimization of passive seismic surveys for best possible processing results. The proposed workflow makes use of novel integration methods as a means of making best use of the available geophysical data for fracture characterization. The methodology incorporates soft computing tools such as hybrid neural networks (neuro-evolutionary algorithms) as well as geostatistical simulation techniques to improve the property estimates as well as overall characterization efficacy. The basic elements of the proposed characterization workflow involve using seismic and microseismic data to characterize structural and geomechanical features within the subsurface. We use passive seismic data to model geomechanical properties which are combined with other properties evaluated from seismic and well logs to derive both qualitative and quantitative fracture zone identifiers. The study has resulted in a broad framework highlighting a new technique for utilizing geophysical data (seismic and microseismic) for unconventional reservoir characterization. It provides an opportunity to optimally develop the resources in question by incorporating data from different sources and using their temporal and spatial variability as a means to better understand the reservoir behavior. As part of this study, we have developed the following elements, which are discussed in the subsequent chapters:
1. An integrated characterization framework for unconventional settings with adaptable workflows for all stages of data processing, interpretation and analysis.
2. A novel autopicking workflow for noisy passive seismic data used for improved accuracy in event picking as well as for improved velocity model building.
3. An improved passive seismic survey design optimization framework for better data collection and improved property estimation.
4. Extensive post-stack seismic attribute studies incorporating robust schemes applicable in complex reservoir settings.
5. Uncertainty quantification and analysis to better quantify property estimates over and above the qualitative interpretations made, and to validate observations independently with quantified uncertainties to prevent erroneous interpretations.
6. Property mapping from microseismic data including stress and anisotropic weakness estimates for integrated reservoir characterization and analysis.
7. Integration of results (seismic, microseismic and well logs) from analysis of individual data sets for integrated interpretation using a predefined integration framework and soft computing tools.
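The abstract mentions an improved phase-detection technique for noisy passive seismic data (item 2 in the list above). The sketch below implements only the classical STA/LTA trigger that such pickers are commonly benchmarked against, with assumed window lengths and threshold; it is not the author's improved algorithm.

```python
import numpy as np

def sta_lta(trace, n_sta=20, n_lta=200):
    """Classic STA/LTA characteristic function on a 1-D seismic trace."""
    energy = trace.astype(float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta    # short-term average
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta    # long-term average
    # Align the two so both trailing windows end at the same sample.
    m = min(sta.size, lta.size)
    return sta[-m:] / (lta[-m:] + 1e-12)

rng = np.random.default_rng(11)
signal = rng.normal(0, 1, 2000)
signal[1200:1300] += 6 * np.sin(np.linspace(0, 20 * np.pi, 100))   # synthetic arrival

ratio = sta_lta(signal)
trigger = np.argmax(ratio > 4.0)   # first sample above an assumed trigger threshold
print("trigger index (within the aligned ratio, offset by the LTA length):", trigger)
```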
The brain, self and society: a social-neuroscience model of predictive processing.
Kelly, Michael P; Kriznik, Natasha M; Kinmonth, Ann Louise; Fletcher, Paul C
2018-05-10
This paper presents a hypothesis about how social interactions shape and influence predictive processing in the brain. The paper integrates concepts from neuroscience and sociology, fields between which a gulf presently exists in the ways each describes the same phenomenon - how thinking humans engage with the social world. We combine the concepts of predictive processing models (also called predictive coding models in the neuroscience literature) with ideal types, typifications and social practice - concepts from the sociological literature. This generates a unified hypothetical framework integrating the social world and hypothesised brain processes. The hypothesis combines aspects of neuroscience and psychology with social theory to show how social behaviors may be "mapped" onto brain processes. It outlines a conceptual framework that connects the two disciplines and that may enable creative dialogue and potential future research.
ActionMap: A web-based software that automates loci assignments to framework maps.
Albini, Guillaume; Falque, Matthieu; Joets, Johann
2003-07-01
Genetic linkage computation may be a repetitive and time consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).
ActionMap: a web-based software that automates loci assignments to framework maps
Albini, Guillaume; Falque, Matthieu; Joets, Johann
2003-01-01
Genetic linkage computation may be a repetitive and time consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/). PMID:12824426
The Seismotectonic Map of Africa
NASA Astrophysics Data System (ADS)
Meghraoui, Mustapha
2015-04-01
We present the Seismotectonic Map of Africa based on a geological, geophysical and geodetic database including the instrumental seismicity and re-appraisal of large historical events with harmonization and homogenization of earthquake parameters in catalogues. Although the seismotectonic framework and mapping of the African continent is a difficult task, several previous and ongoing projects provide a wealth of data and outstanding results. The database of large and moderate earthquakes in different geological domains includes the coseismic and Quaternary faulting that reveals the complex nature of the active tectonics in Africa. The map also benefits from previous works on local and regional seismotectonic maps that needed to be integrated with the lithospheric and upper mantle structures from tomographic anisotropy and gravity anomaly into a continental framework. The synthesis of earthquake and volcanic studies with the analysis of long-term (late Quaternary) and short-term (last decades and centuries) active deformation observed with geodetic and other approaches presented along with the seismotectonic map serves as a basis for hazard calculations and the reduction of seismic risks. The map may also be very useful in the assessment of seismic hazard and mitigation of earthquake risk for significant infrastructures and their implications in the socio-economic impact in Africa. In addition, the constant population increase and infrastructure growth in the continent that exacerbate the earthquake risk justify the necessity for a continuous updating of the seismotectonic map. The database and related map are prepared in the framework of the IGC Project-601 "Seismotectonics and Seismic Hazards in Africa" of UNESCO-IUGS, funded by the Swedish International Development Agency and UNESCO-Nairobi for a period of 4 years (2011 - 2014), extended to 2016. * Mustapha Meghraoui (Coordinator) EOST - IPG Strasbourg CNRS-UMR 7516 m.meghraoui@unistra.fr corresponding author. Paulina Amponsah (AECG, Accra), Abdelhakim Ayadi (CRAAG, Algiers), Atalay Ayele (Univ. Addis Ababa), Ateba Bekoa (Bueah Univ. Yaounde), Abdunnur Bensuleman (Tripoli Univ.), Damien Delvaux (MRAC-Tervuren); Mohamed El Gabry (NRIAG, Cairo), Rui-Manuel Fernandes (Beira Univ.) ; Vunganai Midzi & Magda Roos (CGS, Pretoria), Youssef Timoulali (Univ. Mohamed V, Rabat). Website: http://eost.u-strasbg.fr/igcp601/index.html
Mapping copy number variation by population-scale genome sequencing.
Mills, Ryan E; Walter, Klaudia; Stewart, Chip; Handsaker, Robert E; Chen, Ken; Alkan, Can; Abyzov, Alexej; Yoon, Seungtai Chris; Ye, Kai; Cheetham, R Keira; Chinwalla, Asif; Conrad, Donald F; Fu, Yutao; Grubert, Fabian; Hajirasouliha, Iman; Hormozdiari, Fereydoun; Iakoucheva, Lilia M; Iqbal, Zamin; Kang, Shuli; Kidd, Jeffrey M; Konkel, Miriam K; Korn, Joshua; Khurana, Ekta; Kural, Deniz; Lam, Hugo Y K; Leng, Jing; Li, Ruiqiang; Li, Yingrui; Lin, Chang-Yun; Luo, Ruibang; Mu, Xinmeng Jasmine; Nemesh, James; Peckham, Heather E; Rausch, Tobias; Scally, Aylwyn; Shi, Xinghua; Stromberg, Michael P; Stütz, Adrian M; Urban, Alexander Eckehart; Walker, Jerilyn A; Wu, Jiantao; Zhang, Yujun; Zhang, Zhengdong D; Batzer, Mark A; Ding, Li; Marth, Gabor T; McVean, Gil; Sebat, Jonathan; Snyder, Michael; Wang, Jun; Ye, Kenny; Eichler, Evan E; Gerstein, Mark B; Hurles, Matthew E; Lee, Charles; McCarroll, Steven A; Korbel, Jan O
2011-02-03
Genomic structural variants (SVs) are abundant in humans, differing from other forms of variation in extent, origin and functional impact. Despite progress in SV characterization, the nucleotide resolution architecture of most SVs remains unknown. We constructed a map of unbalanced SVs (that is, copy number variants) based on whole genome DNA sequencing data from 185 human genomes, integrating evidence from complementary SV discovery approaches with extensive experimental validations. Our map encompassed 22,025 deletions and 6,000 additional SVs, including insertions and tandem duplications. Most SVs (53%) were mapped to nucleotide resolution, which facilitated analysing their origin and functional impact. We examined numerous whole and partial gene deletions with a genotyping approach and observed a depletion of gene disruptions amongst high frequency deletions. Furthermore, we observed differences in the size spectra of SVs originating from distinct formation mechanisms, and constructed a map of SV hotspots formed by common mechanisms. Our analytical framework and SV map serves as a resource for sequencing-based association studies.
EMDS 3.0: A modeling framework for coping with complexity in environmental assessment and planning.
K.M. Reynolds
2006-01-01
EMDS 3.0 is implemented as an ArcMap® extension and integrates the logic engine of NetWeaver® to perform landscape evaluations, and the decision modeling engine of Criterium DecisionPlus® for evaluating management priorities. Key features of the system's evaluation component include abilities to (1) reason about large, abstract, multifaceted ecosystem management...
Davis, Brian W; Raudsepp, Terje; Pearks Wilkerson, Alison J; Agarwala, Richa; Schäffer, Alejandro A; Houck, Marlys; Chowdhary, Bhanu P; Murphy, William J
2009-04-01
We describe the construction of a high-resolution radiation hybrid (RH) map of the domestic cat genome, which includes 2662 markers, translating to an estimated average intermarker distance of 939 kilobases (kb). Targeted marker selection utilized the recent feline 1.9x genome assembly, concentrating on regions of low marker density on feline autosomes and the X chromosome, in addition to regions flanking interspecies chromosomal breakpoints. Average gap (breakpoint) size between cat-human ordered conserved segments is less than 900 kb. The map was used for a fine-scale comparison of conserved syntenic blocks with the human and canine genomes. Corroborative fluorescence in situ hybridization (FISH) data were generated using 129 domestic cat BAC clones as probes, providing independent confirmation of the long-range correctness of the map. Cross-species hybridization of BAC probes on divergent felids from the genera Profelis (serval) and Panthera (snow leopard) provides further evidence for karyotypic conservation within felids, and demonstrates the utility of such probes for future studies of chromosome evolution within the cat family and in related carnivores. The integrated map constitutes a comprehensive framework for identifying genes controlling feline phenotypes of interest, and to aid in assembly of a higher coverage feline genome sequence.
Davis, Brian W.; Raudsepp, Terje; Wilkerson, Alison J. Pearks; Agarwala, Richa; Schäffer, Alejandro A.; Houck, Marlys; Ryder, Oliver A.; Chowdhdary, Bhanu P.; Murphy, William J.
2008-01-01
We describe the construction of a high-resolution radiation hybrid (RH) map of the domestic cat genome, which includes 2,662 markers, translating to an estimated average intermarker distance of 939 kilobases (Kb). Targeted marker selection utilized the recent feline 1.9x genome assembly, concentrating on regions of low marker density on feline autosomes and the X chromosome, in addition to regions flanking interspecies chromosomal breakpoints. Average gap (breakpoint) size between cat-human ordered conserved segments is less than 900 Kb. The map was used for a fine-scale comparison of conserved syntenic blocks with the human and canine genomes. Corroborative fluorescence in situ hybridization (FISH) data were generated using 129 domestic cat BAC-clones as probes, providing independent confirmation of the long-range correctness of the map. Cross-species hybridization of BAC probes on divergent felids from the genera Profelis (serval) and Panthera (snow leopard) provides further evidence for karyotypic conservation within felids, and demonstrates the utility of such probes for future studies of chromosome evolution within the cat family and in related carnivores. The integrated map constitutes a comprehensive framework for identifying genes controlling feline phenotypes of interest, and to aid in assembly of a higher coverage feline genome sequence. PMID:18951970
Ecoregions and ecoregionalization: geographical and ecological perspectives
Loveland, Thomas R.; Merchant, James W.
2005-01-01
Ecoregions, i.e., areas exhibiting relative homogeneity of ecosystems, are units of analysis that are increasingly important in environmental assessment and management. Ecoregions provide a holistic framework for flexible, comparative analysis of complex environmental problems. Ecoregions mapping has intellectual foundations in both geography and ecology. However, a hallmark of ecoregions mapping is that it is a truly interdisciplinary endeavor that demands the integration of knowledge from a multitude of sciences. Geographers emphasize the role of place, scale, and both natural and social elements when delineating and characterizing regions. Ecologists tend to focus on environmental processes with special attention given to energy flows and nutrient cycling. Integration of disparate knowledge from the many key sciences has been one of the great challenges of ecoregions mapping, and may lie at the heart of the lack of consensus on the “optimal” approach and methods to use in such work. Through a review of the principal existing US ecoregion maps, issues that should be addressed in order to advance the state of the art are identified. Research related to needs, methods, data sources, data delivery, and validation is needed. It is also important that the academic system foster education so that there is an infusion of new expertise in ecoregion mapping and use.
Holdsworth, Michelle; Nicolaou, Mary; Langøien, Lars Jørun; Osei-Kwasi, Hibbah Araba; Chastin, Sebastien F M; Stok, F Marijn; Capranica, Laura; Lien, Nanna; Terragni, Laura; Monsivais, Pablo; Mazzocchi, Mario; Maes, Lea; Roos, Gun; Mejean, Caroline; Powell, Katie; Stronks, Karien
2017-11-07
Some ethnic minority populations have a higher risk of non-communicable diseases than the majority European population. Diet and physical activity behaviours contribute to this risk, shaped by a system of inter-related factors. This study mapped a systems-based framework of the factors influencing dietary and physical activity behaviours in ethnic minority populations living in Europe, to inform research prioritisation and intervention development. A concept mapping approach guided by systems thinking was used: i. Preparation (protocol and terminology); ii. Generating a list of factors influencing dietary and physical activity behaviours in ethnic minority populations living in Europe from evidence (systematic mapping reviews) and 'eminence' (89 participants from 24 academic disciplines via brainstorming, an international symposium and expert review) and; iii. Seeking consensus on structuring, rating and clustering factors, based on how they relate to each other; and iv. Interpreting/utilising the framework for research and interventions. Similar steps were undertaken for frameworks developed for the majority European population. Seven distinct clusters emerged for dietary behaviour (containing 85 factors) and 8 for physical activity behaviours (containing 183 factors). Four clusters were similar across behaviours: Social and cultural environment; Social and material resources; Psychosocial; and Migration context. Similar clusters of factors emerged in the frameworks for diet and physical activity behaviours of the majority European population, except for 'migration context'. The importance of factors across all clusters was acknowledged, but their relative importance differed for ethnic minority populations compared with the majority population. This systems-based framework integrates evidence from both expert opinion and published literature, to map the factors influencing dietary and physical activity behaviours in ethnic minority groups. Our findings illustrate that innovative research and complex interventions need to be developed that are sensitive to the needs of ethnic minority populations. A systems approach that encompasses the complexity of the inter-related factors that drive behaviours may inform a more holistic public health paradigm to more effectively reach ethnic minorities living in Europe, as well as the majority host population.
Gar, Oron; Sargent, Daniel J.; Tsai, Ching-Jung; Pleban, Tzili; Shalev, Gil; Byrne, David H.; Zamir, Dani
2011-01-01
Polyploidy is a pivotal process in plant evolution as it increases gene redundancy and morphological intricacy, but due to the complexity of polysomic inheritance we have only a few genetic maps of autopolyploid organisms. A robust mapping framework is particularly important in polyploid crop species, rose included (2n = 4x = 28), where the objective is to study multiallelic interactions that control traits of value for plant breeding. From a cross between the garden, peach red and fragrant cultivar Fragrant Cloud (FC) and a cut-rose yellow cultivar Golden Gate (GG), we generated an autotetraploid GGFC mapping population consisting of 132 individuals. For the map we used 128 sequence-based markers, 141 AFLP, 86 SSR and three morphological markers. Seven linkage groups were resolved for FC (total 632 cM) and GG (616 cM), which were validated by markers that segregated in both parents as well as by the diploid integrated consensus map. The release of the Fragaria vesca genome, which also belongs to the Rosoideae, allowed us to place 70 rose sequenced markers on the seven strawberry pseudo-chromosomes. Synteny between Rosa and Fragaria was high, with an estimated four major translocations and six inversions required to place the 17 non-collinear markers in the same order. Based on a verified linear order of the rose markers, we could further partition each of the parents into its four homologous groups, thus providing an essential framework to aid the sequencing of an autotetraploid genome. PMID:21647382
Theeboom, Tim; Van Vianen, Annelies E. M.; Beersma, Bianca
2017-01-01
Economic pressures on companies, technological developments, and less stable career paths pose potential threats to the well-being of employees (e.g., stress, burn-out) and require constant adaptation. In the light of these challenges, it is not surprising that employees often seek the support of a coach. The role of a coach is to foster change by facilitating coachees' movement through a self-regulatory cycle, with the ultimate aim of stimulating sustained well-being and functioning. While meta-analytic research indicates that coaching interventions can be effectively applied to assist employees in dealing with change, the current literature on coaching lacks the solid theoretical frameworks needed to build a cumulative knowledge base and to inspire evidence-based practice. In this conceptual analysis, we examine the coaching process through a temporal lens. By doing so, we provide an integrated theoretical framework: a temporal map of coaching. In this framework, we link seminal concepts in psychology to the coaching process, and describe which competencies of coachees are crucial in the different stages of change that coaching aims to bring about. During the preparatory contemplation stage, targeting coachees' awareness by enhancing their mindfulness and environmental receptiveness is important. During the contemplation stage, coachees' willingness and perceived ability to change are central competencies. We propose that coaches should therefore foster intrinsic goal orientation and self-efficacy during this stage. During the planning stage, coaches should focus on goal-setting and implementation intentions. Finally, during the maintenance/termination stage, stimulating coachees' reflection is especially important in order to help them integrate their learning experiences. The framework delineated in this paper contributes to the understanding of coaching as a tool to assist employees in dealing with the challenges of an increasingly dynamic work environment and yields concrete suggestions for future theory development and research on coaching. PMID:28848470
NASA Astrophysics Data System (ADS)
Lev, S. M.; Gallo, J.
2017-12-01
The international Arctic scientific community has identified the need for a sustained and integrated portfolio of pan-Arctic Earth-observing systems. In 2017, an international effort was undertaken to develop the first-ever Value Tree framework for identifying common research and operational objectives that rely on Earth observation data derived from Earth-observing systems, sensors, surveys, networks, models, and databases to deliver societal benefits in the Arctic. A Value Tree Analysis is a common tool used to support decision-making processes and is useful for defining concepts, identifying objectives, and creating a hierarchical framework of objectives. A multi-level societal benefit area value tree establishes the connection from societal benefits to the set of observation inputs that contribute to delivering those benefits. We will present a Value Tree that relies on expert domain knowledge from Arctic and non-Arctic nations, international researchers, Indigenous knowledge holders, and other experts, and that serves as a logical and interdependent decision support tool. Value tree examples that map the contribution of Earth observations in the Arctic to achieving societal benefits will be presented in the context of the 2017 International Arctic Observations Assessment Framework. These case studies will highlight specific observing products and capability groups where investment is needed to contribute to the development of a sustained portfolio of Arctic observing systems.
Draye, Xavier; Lin, Yann-Rong; Qian, Xiao-yin; Bowers, John E.; Burow, Gloria B.; Morrell, Peter L.; Peterson, Daniel G.; Presting, Gernot G.; Ren, Shu-xin; Wing, Rod A.; Paterson, Andrew H.
2001-01-01
The small genome of sorghum (Sorghum bicolor L. Moench.) provides an important template for study of closely related large-genome crops such as maize (Zea mays) and sugarcane (Saccharum spp.), and is a logical complement to distantly related rice (Oryza sativa) as a “grass genome model.” Using a high-density RFLP map as a framework, a robust physical map of sorghum is being assembled by integrating hybridization and fingerprint data with comparative data from related taxa such as rice and using new methods to resolve genomic duplications into locus-specific groups. By taking advantage of allelic variation revealed by heterologous probes, the positions of corresponding loci on the wheat (Triticum aestivum), rice, maize, sugarcane, and Arabidopsis genomes are being interpolated on the sorghum physical map. Bacterial artificial chromosomes for the small genome of rice are shown to close several gaps in the sorghum contigs; the emerging rice physical map and assembled sequence will further accelerate progress. An important motivation for developing genomic tools is to relate molecular level variation to phenotypic diversity. “Diversity maps,” which depict the levels and patterns of variation in different gene pools, shed light on relationships of allelic diversity with chromosome organization, and suggest possible locations of genomic regions that are under selection due to major gene effects (some of which may be revealed by quantitative trait locus mapping). Both physical maps and diversity maps suggest interesting features that may be integrally related to the chromosomal context of DNA—progress in cytology promises to provide a means to elucidate such relationships. We seek to provide a detailed picture of the structure, function, and evolution of the genome of sorghum and its relatives, together with molecular tools such as locus-specific sequence-tagged site DNA markers and bacterial artificial chromosome contigs that will have enduring value for many aspects of genome analysis. PMID:11244113
A Scalable Data Integration and Analysis Architecture for Sensor Data of Pediatric Asthma.
Stripelis, Dimitris; Ambite, José Luis; Chiang, Yao-Yi; Eckel, Sandrah P; Habre, Rima
2017-04-01
According to the Centers for Disease Control, in the United States there are 6.8 million children living with asthma. Despite the importance of the disease, the available prognostic tools are not sufficient for biomedical researchers to thoroughly investigate the potential risks of the disease at scale. To overcome these challenges we present a big data integration and analysis infrastructure developed by our Data and Software Coordination and Integration Center (DSCIC) of the NIBIB-funded Pediatric Research using Integrated Sensor Monitoring Systems (PRISMS) program. Our goal is to help biomedical researchers to efficiently predict and prevent asthma attacks. The PRISMS-DSCIC is responsible for collecting, integrating, storing, and analyzing real-time environmental, physiological and behavioral data obtained from heterogeneous sensor and traditional data sources. Our architecture is based on the Apache Kafka, Spark and Hadoop frameworks and PostgreSQL DBMS. A main contribution of this work is extending the Spark framework with a mediation layer, based on logical schema mappings and query rewriting, to facilitate data analysis over a consistent harmonized schema. The system provides both batch and stream analytic capabilities over the massive data generated by wearable and fixed sensors.
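The harmonized-schema idea behind the mediation layer can be sketched in a few lines of PySpark: two hypothetical sensor feeds with different column names and units are mapped onto one common schema before a query runs against their union. The column names, unit conversion and toy records below are assumptions for illustration; the actual PRISMS-DSCIC layer performs logical schema mapping and query rewriting rather than simple renaming.

# Minimal sketch: harmonize two hypothetical sensor feeds into one schema
# before analysis, loosely illustrating a mediation layer over Spark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("prisms-mediation-sketch").getOrCreate()

# Hypothetical source A: wearable sensor reporting ozone in parts per billion.
source_a = spark.createDataFrame(
    [("child01", "2017-04-01 10:00:00", 31.0)],
    ["subject", "ts", "o3_ppb"],
)

# Hypothetical source B: fixed monitor reporting micrograms per cubic metre.
source_b = spark.createDataFrame(
    [("child01", "2017-04-01 10:00:00", 62.0)],
    ["patient_id", "timestamp", "ozone_ugm3"],
)

# Logical mapping of each source onto the harmonized schema
# (subject_id, observed_at, ozone_ugm3). 1 ppb O3 is roughly 1.96 ug/m3 at 25 C.
harmonized_a = source_a.select(
    F.col("subject").alias("subject_id"),
    F.to_timestamp("ts").alias("observed_at"),
    (F.col("o3_ppb") * 1.96).alias("ozone_ugm3"),
)
harmonized_b = source_b.select(
    F.col("patient_id").alias("subject_id"),
    F.to_timestamp("timestamp").alias("observed_at"),
    F.col("ozone_ugm3"),
)

# Analyses are then written once, against the harmonized view.
harmonized_a.unionByName(harmonized_b) \
    .groupBy("subject_id") \
    .agg(F.avg("ozone_ugm3").alias("mean_ozone_ugm3")) \
    .show()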
a Virtual Hub Brokering Approach for Integration of Historical and Modern Maps
NASA Astrophysics Data System (ADS)
Bruno, N.; Previtali, M.; Barazzetti, L.; Brumana, R.; Roncella, R.
2016-06-01
Geospatial data are today more and more widespread. Many different institutions, such as Geographical Institutes, Public Administrations, collaborative communities (e.g., OSM) and web companies, nowadays make available a large number of maps. Besides this cartography, projects for digitizing, georeferencing and web publication of historical maps have increasingly spread in recent years. In spite of this variety and availability of data, information overload makes their discovery and management difficult: without knowing the specific repository where the data are stored, it is difficult to find the information required, and problems of interconnection between different data sources and their restricted interoperability limit a wide utilization of the available geo-data. This paper describes some actions performed to assure interoperability between data, in particular spatial and geographic data, gathered from different data providers, with different features and referring to different historical periods. The article summarizes and exemplifies how, starting from projects of historical map digitizing and historical GIS implementation for Lombardy and for the city of Parma, respectively, interoperability is made possible in the framework of the ENERGIC OD project. The European project ENERGIC OD, thanks to a specific component - the virtual hub - based on a brokering framework, copes with the problems listed above and allows interoperability between different data sources.
Poynton, Clare B; Chen, Kevin T; Chonde, Daniel B; Izquierdo-Garcia, David; Gollub, Randy L; Gerstner, Elizabeth R; Batchelor, Tracy T; Catana, Ciprian
2014-01-01
We present a new MRI-based attenuation correction (AC) approach for integrated PET/MRI systems that combines both segmentation- and atlas-based methods by incorporating dual-echo ultra-short echo-time (DUTE) and T1-weighted (T1w) MRI data and a probabilistic atlas. Segmented atlases were constructed from CT training data using a leave-one-out framework and combined with T1w, DUTE, and CT data to train a classifier that computes the probability of air/soft tissue/bone at each voxel. This classifier was applied to segment the MRI of the subject of interest and attenuation maps (μ-maps) were generated by assigning specific linear attenuation coefficients (LACs) to each tissue class. The μ-maps generated with this "Atlas-T1w-DUTE" approach were compared to those obtained from DUTE data using a previously proposed method. For validation of the segmentation results, segmented CT μ-maps were considered to be the "silver standard"; the segmentation accuracy was assessed qualitatively and quantitatively through calculation of the Dice similarity coefficient (DSC). Relative change (RC) maps between the CT- and MRI-based attenuation-corrected PET volumes were also calculated for a global voxel-wise assessment of the reconstruction results. The μ-maps obtained using the Atlas-T1w-DUTE classifier agreed well with those derived from CT; the mean DSCs for the Atlas-T1w-DUTE-based μ-maps across all subjects were higher than those for DUTE-based μ-maps; the atlas-based μ-maps also showed a lower percentage of misclassified voxels across all subjects. RC maps from the atlas-based technique also demonstrated improvement in the PET data compared to the DUTE method, both globally as well as regionally.
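The Dice similarity coefficient used for the segmentation comparison is 2|A∩B| / (|A| + |B|), computed per tissue class. A minimal NumPy sketch on synthetic label volumes (not the study's data) might look like this:

import numpy as np

def dice_coefficient(seg_a, seg_b, label):
    """Dice similarity 2|A∩B| / (|A| + |B|) for one tissue label."""
    a = (seg_a == label)
    b = (seg_b == label)
    denom = a.sum() + b.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(a, b).sum() / denom

# Synthetic 3-class volumes (0 = air, 1 = soft tissue, 2 = bone).
rng = np.random.default_rng(0)
ct_seg = rng.integers(0, 3, size=(32, 32, 32))
mr_seg = ct_seg.copy()
mr_seg[rng.random(mr_seg.shape) < 0.1] = 2   # perturb 10% of voxels

for label, name in [(0, "air"), (1, "soft tissue"), (2, "bone")]:
    print(name, round(dice_coefficient(ct_seg, mr_seg, label), 3))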
A framework for evaluating and utilizing medical terminology mappings.
Hussain, Sajjad; Sun, Hong; Sinaci, Anil; Erturkmen, Gokce Banu Laleci; Mead, Charles; Gray, Alasdair J G; McGuinness, Deborah L; Prud'Hommeaux, Eric; Daniel, Christel; Forsberg, Kerstin
2014-01-01
Use of medical terminologies and mappings across them is considered to be a crucial prerequisite for achieving interoperable eHealth applications. Built upon the outcomes of several research projects, we introduce a framework for evaluating and utilizing terminology mappings that offers a platform for i) performing various mapping strategies, ii) representing terminology mappings together with their provenance information, and iii) enabling terminology reasoning for inferring both new and erroneous mappings. We present the results of the introduced framework from the SALUS project, where we evaluated the quality of both existing and inferred terminology mappings among standard terminologies.
Processing Solutions for Big Data in Astronomy
NASA Astrophysics Data System (ADS)
Fillatre, L.; Lepiller, D.
2016-09-01
This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing scheme) of a cluster of machines. Finally, more recent processing solutions such as Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
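The MapReduce scheme mentioned above splits a job into a map phase that emits key-value pairs, a shuffle that groups pairs by key, and a reduce phase that aggregates each group. The pure-Python sketch below imitates that flow on a single machine for the classic word-count example; a real Hadoop or Spark job distributes the same three phases across a cluster.

from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Emit (word, 1) for every word in one input record.
    return [(word.lower(), 1) for word in document.split()]

def shuffle(pairs):
    # Group values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    return key, sum(values)

documents = ["big data needs big clusters", "spark builds on the mapreduce idea"]
mapped = chain.from_iterable(map_phase(doc) for doc in documents)
counts = dict(reduce_phase(k, v) for k, v in shuffle(mapped).items())
print(counts["big"])   # 2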
Map Resource Packet: Course Models for the History-Social Science Framework, Grade Seven.
ERIC Educational Resources Information Center
California State Dept. of Education, Sacramento.
This packet of maps is an auxiliary resource to the "World History and Geography: Medieval and Early Modern Times. Course Models for the History-Social Science Framework, Grade Seven." The set includes: outline, precipitation, and elevation maps; maps for locating key places; landform maps; and historical maps. The list of maps are…
Bassett, Danielle S; Sporns, Olaf
2017-01-01
Despite substantial recent progress, our understanding of the principles and mechanisms underlying complex brain function and cognition remains incomplete. Network neuroscience proposes to tackle these enduring challenges. Approaching brain structure and function from an explicitly integrative perspective, network neuroscience pursues new ways to map, record, analyze and model the elements and interactions of neurobiological systems. Two parallel trends drive the approach: the availability of new empirical tools to create comprehensive maps and record dynamic patterns among molecules, neurons, brain areas and social systems; and the theoretical framework and computational tools of modern network science. The convergence of empirical and computational advances opens new frontiers of scientific inquiry, including network dynamics, manipulation and control of brain networks, and integration of network processes across spatiotemporal domains. We review emerging trends in network neuroscience and attempt to chart a path toward a better understanding of the brain as a multiscale networked system. PMID:28230844
INFOMAR, Ireland's National Seabed Mapping Programme; Sharing Valuable Insights.
NASA Astrophysics Data System (ADS)
Judge, M. T.; McGrath, F.; Cullen, S.; Verbruggen, K.
2017-12-01
Following the successful high-resolution deep-sea mapping carried out as part of the Irish National Seabed Survey (INSS), a strategic, long-term programme was established: INtegrated mapping FOr the sustainable development of Ireland's MArine Resources (INFOMAR). Funded by Ireland's Department of Communication, Climate Action and Environment, INFOMAR comprises a multi-platform approach to completing Ireland's marine mapping, and is a key action in the integrated marine plan, Harnessing Our Ocean Wealth. Co-managed by Geological Survey Ireland and the Marine Institute, the programme has three work strands: Data Acquisition; Data Exchange and Integration; and Value Added Exploitation. The Data Acquisition strand includes collection of geological, hydrographic, oceanographic, habitat and heritage datasets that underpin sustainable development and management of Ireland's marine resources. INFOMAR operates a free data policy; data and outputs are delivered online through the Data Exchange and Integration strand. Uses of data and outputs are wide-ranging and multipurpose. In order to address the evolution and diversification of user requirements, further data product development is facilitated through the Value Added Exploitation strand. Ninety percent of Ireland's territory lies offshore. Therefore, strategic national seabed mapping continues to provide critical, high-resolution baseline datasets for numerous economic sectors and societal needs. From these we can glean important geodynamic knowledge of Ireland's vast maritime territory. INFOMAR remains aligned with national and European policies and directives, exemplified by our commitment to EMODnet, a European Commission funded project that supports the collection, standardisation and sharing of available marine information, data and data products across all European Seas. As EMODnet Geology Minerals leaders, we have developed a framework for mapping marine minerals. Furthermore, collaboration with the international research project NAGTEC has unlocked the value of Irish marine data as an important jigsaw piece in the new atlas detailing the tectonostratigraphic evolution of the North-East Atlantic, with emphasis on conjugate margin comparisons.
Efficiency Analysis of Integrated Public Hospital Networks in Outpatient Internal Medicine.
Ortíz-Barrios, Miguel Angel; Escorcia-Caballero, Juan P; Sánchez-Sánchez, Fabián; De Felice, Fabio; Petrillo, Antonella
2017-09-07
Healthcare systems are evolving towards a complex network of interconnected services due to the increasing costs and the increasing expectations for high service levels. The literature evidences the importance of implementing management techniques and sophisticated methods to improve the efficiency of healthcare systems, especially in emerging economies. This paper proposes an integrated collaboration model between two public hospitals to reduce the weighted average lead time in the outpatient internal medicine department. A strategic framework based on value stream mapping and collaborative practices has been developed in a real case study set in Colombia.
The Emergence of Compositional Communication in a Synthetic Ethology Framework
2005-08-12
"Integrating Language and Cognition: A Cognitive Robotics Approach", invited contribution to IEEE Computational Intelligence Magazine. The first two...papers address the main topic of investigation of the research proposal. In particular, we have introduced a simple structured meaning-signal mapping...Cavalli-Sforza (1982) to investigate analytically the evolution of structured communication codes. Let x ∈ [0,1] be the proportion of individuals in a
A Systems Approach to Biometrics in the Military Domain.
Wilson, Lauren; Gahan, Michelle; Lennard, Chris; Robertson, James
2018-02-21
Forensic biometrics is the application of forensic science principles to physical and behavioral characteristics. Forensic biometrics is a secondary sub-system in the forensic science "system of systems," which describes forensic science as a sub-system in the larger criminal justice, law enforcement, intelligence, and military system. The purpose of this paper is to discuss biometrics in the military domain and integration into the wider forensic science system of systems. The holistic system thinking methodology was applied to the U.S. biometric system to map it to the system of systems framework. The U.S. biometric system is used as a case study to help guide other countries to develop military biometric systems that are integrated and interoperable at the whole-of-government level. The aim is to provide the system of systems framework for agencies to consider for proactive design of biometric systems. © 2018 American Academy of Forensic Sciences.
Hierarchical semantic structures for medical NLP.
Taira, Ricky K; Arnold, Corey W
2013-01-01
We present a framework for building a medical natural language processing (NLP) system capable of deep understanding of clinical text reports. The framework helps developers understand how various NLP-related efforts and knowledge sources can be integrated. The aspects considered include: 1) computational issues, dealing with defining layers of intermediate semantic structures to reduce the dimensionality of the NLP problem; 2) algorithmic issues, in which we survey the NLP literature and discuss state-of-the-art procedures used to map between the various levels of the hierarchy; and 3) implementation issues, pointing software developers to available resources. The objective of this poster is to educate readers about the various levels of semantic representation (e.g., word-level concepts, ontological concepts, logical relations, logical frames, discourse structures, etc.). The poster presents an architecture in which diverse efforts and resources in medical NLP can be integrated in a principled way.
Semantic Web repositories for genomics data using the eXframe platform.
Merrill, Emily; Corlosquet, Stéphane; Ciccarese, Paolo; Clark, Tim; Das, Sudeshna
2014-01-01
With the advent of inexpensive assay technologies, there has been an unprecedented growth in genomics data as well as the number of databases in which it is stored. In these databases, sample annotation using ontologies and controlled vocabularies is becoming more common. However, the annotation is rarely available as Linked Data, in a machine-readable format, or for standardized queries using SPARQL. This makes large-scale reuse, or integration with other knowledge bases, very difficult. To address this challenge, we have developed the second generation of our eXframe platform, a reusable framework for creating online repositories of genomics experiments. This second-generation model now publishes Semantic Web data. To accomplish this, we created an experiment model that covers provenance, citations, external links, assays, biomaterials used in the experiment, and the data collected during the process. The elements of our model are mapped to classes and properties from various established biomedical ontologies. Resource Description Framework (RDF) data is automatically produced using these mappings and indexed in an RDF store with a built-in SPARQL Protocol and RDF Query Language (SPARQL) endpoint. Using the open-source eXframe software, institutions and laboratories can create Semantic Web repositories of their experiments, integrate them with heterogeneous resources, and make them interoperable with the vast Semantic Web of biomedical knowledge.
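The pattern the platform automates, mapping experiment metadata to ontology terms, serializing it as RDF and exposing it to SPARQL, can be sketched with rdflib as below. The namespace, properties and sample assay are placeholders invented for illustration, not eXframe's actual model.

from rdflib import Graph, Literal, Namespace, RDF, URIRef

EX = Namespace("http://example.org/exframe/")   # placeholder namespace
g = Graph()

experiment = URIRef(EX["experiment/42"])
assay = URIRef(EX["assay/42-1"])

# Annotate a hypothetical experiment and one of its assays.
g.add((experiment, RDF.type, EX.GenomicsExperiment))
g.add((experiment, EX.title, Literal("Stem cell differentiation time course")))
g.add((experiment, EX.hasAssay, assay))
g.add((assay, RDF.type, EX.ExpressionAssay))
g.add((assay, EX.biomaterial, Literal("induced pluripotent stem cells")))

# A SPARQL query of the kind an endpoint over such a store would answer.
query = """
PREFIX ex: <http://example.org/exframe/>
SELECT ?title ?material WHERE {
    ?exp a ex:GenomicsExperiment ;
         ex:title ?title ;
         ex:hasAssay ?assay .
    ?assay ex:biomaterial ?material .
}
"""
for title, material in g.query(query):
    print(title, "|", material)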
Leverage hadoop framework for large scale clinical informatics applications.
Dong, Xiao; Bahroos, Neil; Sadhu, Eugene; Jackson, Tommie; Chukhman, Morris; Johnson, Robert; Boyd, Andrew; Hynes, Denise
2013-01-01
In this manuscript, we present our experiences using the Apache Hadoop framework for high-data-volume and computationally intensive applications, and discuss some best-practice guidelines in a clinical informatics setting. There are three main aspects to our approach: (a) process and integrate diverse, heterogeneous data sources using standard Hadoop programming tools and customized MapReduce programs; (b) after fine-grained aggregate results are obtained, perform data analysis using the Mahout data mining library; (c) leverage the column-oriented features in HBase for patient-centric modeling and complex temporal reasoning. This framework provides a scalable solution to meet the rapidly increasing, imperative "Big Data" needs of clinical and translational research. The intrinsic advantages of fault tolerance, high availability and scalability of the Hadoop platform make these applications readily deployable in an enterprise-level cluster environment.
Sawkins, M C; Farmer, A D; Hoisington, D; Sullivan, J; Tolopko, A; Jiang, Z; Ribaut, J-M
2004-10-01
In the past few decades, a wealth of genomic data has been produced in a wide variety of species using a diverse array of functional and molecular marker approaches. In order to unlock the full potential of the information contained in these independent experiments, researchers need efficient and intuitive means to identify common genomic regions and genes involved in the expression of target phenotypic traits across diverse conditions. To address this need, we have developed a Comparative Map and Trait Viewer (CMTV) tool that can be used to construct dynamic aggregations of a variety of types of genomic datasets. By algorithmically determining correspondences between sets of objects on multiple genomic maps, the CMTV can display syntenic regions across taxa, combine maps from separate experiments into a consensus map, or project data from different maps into a common coordinate framework using dynamic coordinate translations between source and target maps. We present a case study that illustrates the utility of the tool for managing large and varied datasets by integrating data collected by CIMMYT in maize drought tolerance research with data from public sources. This example will focus on one of the visualization features for Quantitative Trait Locus (QTL) data, using likelihood ratio (LR) files produced by generic QTL analysis software and displaying the data in a unique visual manner across different combinations of traits, environments and crosses. Once a genomic region of interest has been identified, the CMTV can search and display additional QTLs meeting a particular threshold for that region, or other functional data such as sets of differentially expressed genes located in the region; it thus provides an easily used means for organizing and manipulating data sets that have been dynamically integrated under the focus of the researcher's specific hypothesis.
DyKOSMap: A framework for mapping adaptation between biomedical knowledge organization systems.
Dos Reis, Julio Cesar; Pruski, Cédric; Da Silveira, Marcos; Reynaud-Delaître, Chantal
2015-06-01
Knowledge Organization Systems (KOS) and their associated mappings play a central role in several decision support systems. However, by virtue of knowledge evolution, KOS entities are modified over time, impacting mappings and potentially rendering them invalid. This requires semi-automatic methods to keep such semantic correspondences up-to-date as the KOS evolve. We define a complete and original framework based on formal heuristics that drives the adaptation of KOS mappings. Our approach takes into account the definition of established mappings, the evolution of KOS and the possible changes that can be applied to mappings. This study experimentally evaluates the proposed heuristics and the entire framework on realistic case studies borrowed from the biomedical domain, using official mappings between several biomedical KOSs. We demonstrate the overall performance of the approach over biomedical datasets of different characteristics and sizes. Our findings reveal the effectiveness, in terms of precision, recall and F-measure, of the suggested heuristics and methods defining the framework to adapt mappings affected by KOS evolution. The obtained results contribute to improving the quality of mappings over time. The proposed framework can adapt mappings largely automatically, thus facilitating the maintenance task. The implemented algorithms and tools support and minimize the work of users in charge of KOS mapping maintenance. Copyright © 2015 Elsevier Inc. All rights reserved.
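A toy version of the kind of adaptation the framework performs is sketched below: when a new KOS release removes or replaces a concept, mappings referencing it are retired or revised. The data structure, the two rules and the concept identifiers are hypothetical simplifications; the paper's formal heuristics also consider the nature of each change and the semantic relation carried by the mapping.

from dataclasses import dataclass

@dataclass
class Mapping:
    source: str          # concept ID in the source KOS
    target: str          # concept ID in the target KOS
    status: str = "valid"

def adapt_mappings(mappings, removed, renamed):
    """Very simplified adaptation pass after a KOS release.

    removed: set of concept IDs deleted in the new release.
    renamed: dict old_id -> new_id for concepts that were replaced.
    """
    for m in mappings:
        if m.source in removed:
            m.status = "retired"              # no longer resolvable
        elif m.source in renamed:
            m.source = renamed[m.source]      # follow the replacement
            m.status = "revised"
    return mappings

# Fake concept identifiers, for illustration only.
maps = [Mapping("KOS-A:0001", "KOS-B:9001"),
        Mapping("KOS-A:0002", "KOS-B:9002")]
print(adapt_mappings(maps,
                     removed={"KOS-A:0002"},
                     renamed={"KOS-A:0001": "KOS-A:0010"}))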
NASA Astrophysics Data System (ADS)
Gopalakrishnan, G.
2013-12-01
In the aftermath of man-made disasters such as oil spills or natural disasters such as hurricanes and floods, city planners and residents of affected areas are often concerned about future vulnerabilities and rebuilding the area to increase resilience. However, identifying the locations in the affected area that are most impacted by the disaster, the associated human health risks and potential vulnerabilities often requires a monitoring effort that is expensive, time-consuming and difficult to implement in disaster-hit areas using traditional monitoring techniques. This project presents a framework for identifying areas that are most likely to be impacted by disasters by integrating remote sensing data and information from social media networks, including Twitter streams. The framework was tested for New York, coastal New Jersey and Staten Island in the aftermath of Hurricane Sandy. Vulnerable areas were identified using anomaly detection and the results were mapped against measurements collected on the ground. A correlation coefficient of 0.78 was obtained. Uncertainty in model predictions was evaluated using Monte Carlo simulations.
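One simple way to operationalize the anomaly-detection step is a z-score of post-event activity against a pre-event baseline on a grid, as sketched below with synthetic counts; the grid size, threshold and simulated surge are assumptions, and the study additionally fused remote sensing observations with the social media signal.

import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 20x20 grid over the study area: mean daily message counts
# before the storm, and counts observed immediately after landfall.
baseline_mean = rng.poisson(20, size=(20, 20)).astype(float)
baseline_std = np.sqrt(baseline_mean)          # Poisson-like variability
post_event = rng.poisson(20, size=(20, 20)).astype(float)
post_event[5:9, 11:15] += 120                  # simulated surge in one district

# Flag cells whose post-event activity is anomalously high.
z = (post_event - baseline_mean) / np.maximum(baseline_std, 1.0)
impacted = z > 3.0
print("flagged cells:", int(impacted.sum()))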
Page, William R.; Berry, Margaret E.; VanSistine, D. Paco; Snyders, Scott R.
2009-01-01
The purpose of this map is to provide an integrated, bi-national geologic map dataset for display and analyses on an Arc Internet Map Service (IMS) dedicated to environmental health studies in the United States-Mexico border region. The IMS web site was designed by the US-Mexico Border Environmental Health Initiative project and collaborators, and the IMS and project web site address is http://borderhealth.cr.usgs.gov/. The objective of the project is to acquire, evaluate, analyze, and provide earth, biologic, and human health resources data within a GIS framework (IMS) to further our understanding of possible linkages between the physical environment and public health issues. The geologic map dataset is just one of many datasets included in the web site; other datasets include biologic, hydrologic, geographic, and human health themes.
Direito, Artur; Walsh, Deirdre; Hinbarji, Moohamad; Albatal, Rami; Tooley, Mark; Whittaker, Robyn; Maddison, Ralph
2018-06-01
Few interventions to promote physical activity (PA) adapt dynamically to changes in individuals' behavior. Interventions targeting determinants of behavior are linked with increased effectiveness and should reflect changes in behavior over time. This article describes the application of two frameworks to assist the development of an adaptive evidence-based smartphone-delivered intervention aimed at influencing PA and sedentary behaviors (SB). Intervention mapping was used to identify the determinants influencing uptake of PA and optimal behavior change techniques (BCTs). Behavioral intervention technology was used to translate and operationalize the BCTs and its modes of delivery. The intervention was based on the integrated behavior change model, focused on nine determinants, consisted of 33 BCTs, and included three main components: (1) automated capture of daily PA and SB via an existing smartphone application, (2) classification of the individual into an activity profile according to their PA and SB, and (3) behavior change content delivery in a dynamic fashion via a proof-of-concept application. This article illustrates how two complementary frameworks can be used to guide the development of a mobile health behavior change program. This approach can guide the development of future mHealth programs.
Clark, Phillip G; Cott, Cheryl; Drinka, Theresa J K
2007-12-01
Interprofessional teamwork is an essential and expanding form of health care practice. While moral issues arising in teamwork relative to the patient have been explored, the analysis of ethical issues regarding the function of the team itself is limited. This paper develops a conceptual framework for organizing and analyzing the different types of ethical issues in interprofessional teamwork. This framework is a matrix that maps the elements of principles, structures, and processes against individual, team, and organizational levels. A case study is presented that illustrates different dimensions of these topics, based on the application of this framework. Finally, a set of conclusions and recommendations is presented to summarize the integration of theory and practice in interprofessional ethics, including: (i) importance of a framework, (ii) interprofessional ethics discourse, and (iii) interprofessional ethics as an emerging field. The goal of this paper is to begin a dialogue and discussion on the ethical issues confronting interprofessional teams and to lay the foundation for an expanding discourse on interprofessional ethics.
Harnagea, Hermina; Lamothe, Lise; Couturier, Yves; Esfandiari, Shahrokh; Voyer, René; Charbonneau, Anne; Emami, Elham
2018-02-15
Despite its importance, the integration of oral health into primary care is still an emerging practice in the field of health care services. This scoping review aims to map the literature and provide a summary of the conceptual frameworks, policies and programs related to this concept. Using the Levac et al. six-stage framework, we performed a systematic search of electronic databases, organizational websites and grey literature from 1978 to April 2016. All relevant original publications with a focus on the integration of oral health into primary care were retrieved. Content analyses were performed to synthesize the results. From a total of 1619 citations, 67 publications were included in the review. Two conceptual frameworks were identified. Policies regarding oral health integration into primary care were mostly oriented toward a common risk factors approach and care coordination processes. In general, oral health integrated care programs were designed in the public health sector and based on partnerships with various private and public health organizations, governmental bodies and academic institutions. These programmes used various strategies to empower oral health integrated care, including building interdisciplinary networks, training non-dental care providers, oral health champion modelling, enabling care linkages and care coordination processes, as well as the use of e-health technologies. The majority of studies on the programmes' outcomes were descriptive in nature without reporting long-term outcomes. This scoping review provided a comprehensive overview of the concept of integration of oral health into primary care. The findings identified major gaps in reported program outcomes, mainly because of the lack of related research. However, the results could be considered a first step in the development of health care policies that support collaborative practices and patient-centred care in the primary care sector.
Mapping population-based structural connectomes.
Zhang, Zhengwu; Descoteaux, Maxime; Zhang, Jingwen; Girard, Gabriel; Chamberland, Maxime; Dunson, David; Srivastava, Anuj; Zhu, Hongtu
2018-05-15
Advances in understanding the structural connectomes of the human brain require improved approaches for the construction, comparison and integration of high-dimensional whole-brain tractography data from a large number of individuals. This article develops a population-based structural connectome (PSC) mapping framework to address these challenges. PSC simultaneously characterizes a large number of white matter bundles within and across different subjects by registering different subjects' brains based on coarse cortical parcellations, compressing the bundles of each connection, and extracting novel connection weights. A robust tractography algorithm and streamline post-processing techniques, including dilation of gray matter regions, streamline cutting, and outlier streamline removal, are applied to improve the robustness of the extracted structural connectomes. The developed PSC framework can be used to reproducibly extract binary networks, weighted networks and streamline-based brain connectomes. We apply the PSC to Human Connectome Project data to illustrate its application in characterizing normal variations and heritability of structural connectomes in healthy subjects. Copyright © 2018 Elsevier Inc. All rights reserved.
A Spatial Framework to Map Heat Health Risks at Multiple Scales.
Ho, Hung Chak; Knudby, Anders; Huang, Wei
2015-12-18
In the last few decades extreme heat events have led to substantial excess mortality, most dramatically in Central Europe in 2003, in Russia in 2010, and even in typically cool locations such as Vancouver, Canada, in 2009. Heat-related morbidity and mortality is expected to increase over the coming centuries as the result of climate-driven global increases in the severity and frequency of extreme heat events. Spatial information on heat exposure and population vulnerability may be combined to map the areas of highest risk and focus mitigation efforts there. However, a mismatch in spatial resolution between heat exposure and vulnerability data can cause spatial scale issues such as the Modifiable Areal Unit Problem (MAUP). We used a raster-based model to integrate heat exposure and vulnerability data in a multi-criteria decision analysis, and compared it to the traditional vector-based model. We then used the Getis-Ord G(i) index to generate spatially smoothed heat risk hotspot maps from fine to coarse spatial scales. The raster-based model allowed production of maps at finer spatial resolution, a richer description of local-scale heat risk variability, and identification of heat-risk areas not identified with the vector-based approach. Spatial smoothing with the Getis-Ord G(i) index produced heat risk hotspots from local to regional spatial scales. The approach is a framework for reducing spatial scale issues in future heat risk mapping, and for identifying heat risk hotspots at spatial scales ranging from the block level to the municipality level.
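A rough sketch of the raster-based workflow is given below: normalize the exposure and vulnerability layers, combine them with chosen weights, then smooth the result with a neighborhood statistic to highlight hotspots. The weights, grid and window size are arbitrary, and a simple local z-score stands in for the Getis-Ord G(i) statistic used in the study.

import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(1)

def minmax(layer):
    return (layer - layer.min()) / (layer.max() - layer.min())

# Hypothetical 100x100 rasters resampled to a common grid.
heat_exposure = minmax(rng.normal(30, 4, (100, 100)))      # e.g. surface temperature
vulnerability = minmax(rng.random((100, 100)))             # e.g. social vulnerability index

# Weighted multi-criteria combination (weights are illustrative).
risk = 0.6 * heat_exposure + 0.4 * vulnerability

# Simple hotspot surface: local mean in a 9x9 window expressed as a z-score
# relative to the whole map (a stand-in for the Getis-Ord Gi* statistic).
local_mean = uniform_filter(risk, size=9)
hotspot_z = (local_mean - risk.mean()) / risk.std()
print("cells with hotspot z > 2:", int((hotspot_z > 2).sum()))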
Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon
NASA Astrophysics Data System (ADS)
Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.
2015-12-01
Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology to communicate site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismically induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g. 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources including: recent landslide inventories, LIDAR and photogrammetric topographic data, geology maps, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers and perform a rigid sliding-block analysis to determine the amount and associated probabilities of displacement for each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils on steep slopes. Such conditions present a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds were generated for the study area. These output maps were then utilized in a performance-based design framework, enabling them to be analyzed in conjunction with other hazards for fully probabilistic hazard evaluation and risk assessment.
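A sketch of the rigid sliding-block calculation for one pixel is shown below: for each peak-ground-acceleration bin of a hazard curve, estimate the Newmark displacement from the ratio of critical to peak acceleration with a Jibson (2007)-style regression, then sum the probabilities of the bins whose displacement exceeds a threshold. The hazard-curve values, critical acceleration and coefficients are illustrative and are not those used for the Oregon maps.

import numpy as np

def newmark_displacement_cm(a_c, a_max):
    """Rigid-block displacement (cm) from the critical/peak acceleration
    ratio, using a Jibson (2007)-style regression; coefficients illustrative."""
    if a_c >= a_max:            # shaking never exceeds the critical acceleration
        return 0.0
    ratio = a_c / a_max
    log_d = 0.215 + np.log10((1.0 - ratio) ** 2.341 * ratio ** -1.438)
    return 10.0 ** log_d

# Hypothetical hazard curve for one pixel: PGA bins (g) and the annualized
# probability mass assigned to each bin (values invented for illustration).
pga_bins = np.array([0.1, 0.2, 0.3, 0.4, 0.6])
bin_probability = np.array([0.02, 0.008, 0.003, 0.001, 0.0003])

a_c = 0.05          # critical acceleration of the slope, in g (illustrative)
threshold_m = 0.1   # displacement threshold of interest, in metres

displacements_m = np.array(
    [newmark_displacement_cm(a_c, pga) / 100.0 for pga in pga_bins])
p_exceed = bin_probability[displacements_m > threshold_m].sum()
print("P(displacement > %.1f m) ~ %.4f" % (threshold_m, p_exceed))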
Kendall, Tamil; Langer, Ana; Bärnighausen, Till
2014-01-01
Objective: Both sexual and reproductive health (SRH) services and HIV programs in sub-Saharan Africa are typically delivered vertically, operating parallel to national health systems. The objective of this study was to map the evidence on national and international strategies for integration of SRH and HIV services in sub-Saharan Africa and to develop a research agenda for future health systems integration. Methods: We examined the literature on national and international strategies to integrate SRH and HIV services using a scoping study methodology. Current policy frameworks, national HIV strategies and research, and gray literature on integration were mapped. Five countries in sub-Saharan Africa with experience of integrating SRH and HIV services were purposively sampled for detailed thematic analysis, according to the health systems functions of governance, policy and planning, financing, health workforce organization, service organization, and monitoring and evaluation. Results: The major international health policies and donor guidance now support integration. Most integration research has focused on linkages of SRH and HIV front-line services. Yet, the common problems with implementation are related to delayed or incomplete integration of higher level health systems functions: lack of coordinated leadership and unified national integration policies; separate financing streams for SRH and HIV services and inadequate health worker training, supervision and retention. Conclusions: Rigorous health systems research on the integration of SRH and HIV services is urgently needed. Priority research areas include integration impact, performance, and economic evaluation to inform the planning, financing, and coordination of integrated service delivery. PMID:25436826
NASA Astrophysics Data System (ADS)
Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.
2005-12-01
As a partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) Specification to enable the sharing of transportation data at the federal level with data from BTS/DOT, at the state level through VDOT, and with industry through Intergraph. CEOSR develops WFS solutions using Intergraph software. Relevant technical documents are also developed and disseminated through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. CEOSR works with Intergraph on developing WFS solutions and technical documents. The GeoMedia WebMap WFS toolkit is used with software and technical support from Intergraph. The ESRI ArcIMS WFS connector is used with GMU's campus license of ESRI products. Tested solutions are integrated with framework data service operational systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal, and WMS services, 2) VDOT's state transportation data and GIS infrastructure, and 3) BTS/DOT's national transportation data. The project: 1) develops and deploys an operational OGC WFS 1.1 interface at CEOSR for registering with the FGDC/GOS Portal and responding to Web "POST" requests for transportation framework data as listed in Table 1; 2) builds a WFS service that can return data conforming to the drafted ANSI/INCITS L1 Standard (when available) for each identified theme, in the format given by the OGC Geography Markup Language (GML) Version 3.0 or higher; 3) integrates the OGC WFS with CEOSR's clearinghouse nodes; 4) establishes a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS/DOT, and Intergraph; and 5) develops WFS-based solutions and technical documents using the GeoMedia WebMap WFS toolkit. The geospatial Web Feature Service is demonstrated to be more efficient in sharing vector data and supports direct Internet access to transportation data. The developed WFS solutions also enhanced the interoperable services provided by CEOSR through the FGDC clearinghouse node and the GOS Portal.
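A WFS GetFeature interaction of the kind described above can be exercised with a few lines of Python. The abstract refers to XML "POST" requests; the simpler key/value GET form of WFS 1.1.0 is shown here, and the endpoint URL and feature type name are placeholders rather than the project's actual service.

import requests

# Placeholder endpoint and layer name; a real deployment would expose its
# own service URL and namespaced transportation feature types.
WFS_URL = "https://example.org/geoserver/ows"
params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typename": "transport:Roads",
    "maxFeatures": "10",
    "outputFormat": "text/xml; subtype=gml/3.1.1",   # GML 3, as in the project
}

response = requests.get(WFS_URL, params=params, timeout=30)
response.raise_for_status()
print(response.headers.get("Content-Type"))
print(response.text[:500])   # first part of the GML feature collection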
Chan, Louis K H; Hayward, William G
2009-02-01
In feature integration theory (FIT; A. Treisman & S. Sato, 1990), feature detection is driven by independent dimensional modules, and other searches are driven by a master map of locations that integrates dimensional information into salience signals. Although recent theoretical models have largely abandoned this distinction, some observed results are difficult to explain in its absence. The present study measured dimension-specific performance during detection and localization, tasks that require operation of dimensional modules and the master map, respectively. Results showed a dissociation between tasks in terms of both dimension-switching costs and cross-dimension attentional capture, reflecting a dimension-specific nature for detection tasks and a dimension-general nature for localization tasks. In a feature-discrimination task, results precluded an explanation based on response mode. These results are interpreted to support FIT's postulation that different mechanisms are involved in parallel and focal attention searches. This indicates that the FIT architecture should be adopted to explain the current results and that a variety of visual attention findings can be addressed within this framework. Copyright 2009 APA, all rights reserved.
The MMI Semantic Framework: Rosetta Stones for Earth Sciences
NASA Astrophysics Data System (ADS)
Rueda, C.; Bermudez, L. E.; Graybeal, J.; Alexander, P.
2009-12-01
Semantic interoperability, the exchange of meaning among computer systems, is needed to successfully share data in Ocean Science and across all Earth sciences. The best approach toward semantic interoperability requires a designed framework, and operationally tested tools and infrastructure within that framework. Currently available technologies make a scientific semantic framework feasible, but its development requires sustainable architectural vision and development processes. This presentation outlines the MMI Semantic Framework, including recent progress on it and its client applications. The MMI Semantic Framework consists of tools, infrastructure, and operational and community procedures and best practices, to meet short-term and long-term semantic interoperability goals. The design and prioritization of the semantic framework capabilities are based on real-world scenarios in Earth observation systems. We describe some key use cases, as well as the associated requirements for building the overall infrastructure, which is realized through the MMI Ontology Registry and Repository. This system includes support for community creation and sharing of semantic content, ontology registration, version management, and seamless integration of user-friendly tools and application programming interfaces. The presentation describes the architectural components for semantic mediation, and the registry and repository for vocabularies, ontologies, and term mappings. We show how the technologies and approaches in the framework can address community needs for managing and exchanging semantic information. We will demonstrate how different types of users and client applications exploit the tools and services for data aggregation, visualization, archiving, and integration. Specific examples from OOSTethys (http://www.oostethys.org) and the Ocean Observatories Initiative Cyberinfrastructure (http://www.oceanobservatories.org) will be cited. Finally, we show how semantic augmentation of web services standards could be performed using framework tools.
Structured feedback on students' concept maps: the proverbial path to learning?
Joseph, Conran; Conradsson, David; Nilsson Wikmar, Lena; Rowe, Michael
2017-05-25
Good conceptual knowledge is an essential requirement for health professions students, in that they are required to apply concepts learned in the classroom to a variety of different contexts. However, the use of traditional methods of assessment limits the educator's ability to correct students' conceptual knowledge prior to altering the educational context. Concept mapping (CM) is an educational tool for evaluating conceptual knowledge, but little is known about its use in facilitating the development of richer knowledge frameworks. In addition, structured feedback has the potential to develop good conceptual knowledge. The purpose of this study was to use Kinchin's criteria to assess the impact of structured feedback on the graphical complexity of CMs by observing the development of richer knowledge frameworks. Fifty-eight physiotherapy students created CMs targeting the integration of two knowledge domains within a case-based teaching paradigm. Each student received one round of structured feedback that addressed correction, reinforcement, forensic diagnosis, benchmarking, and longitudinal development on their CMs prior to the final submission. The concept maps were categorized according to Kinchin's criteria as either Spoke, Chain or Net representations, and then evaluated against defined traits of meaningful learning. The inter-rater reliability of categorizing CMs was good. Pre-feedback CMs were predominantly Chain structures (57%), with Net structures appearing least often. There was a significant reduction of the basic Spoke-structured CMs (P = 0.002) and a significant increase of Net-structured maps (P < 0.001) at the final evaluation (post-feedback). Changes in the structural complexity of CMs appeared to be indicative of broader knowledge frameworks as assessed against the meaningful learning traits. Feedback on CMs seemed to have contributed towards improving conceptual knowledge and correcting naive conceptions of related knowledge. Educators in medical education could therefore consider using CMs to target individual student development.
DeBate, Rita; Corvin, Jaime A.; Wolfe-Quintero, Kate; Petersen, Donna J.
2017-01-01
Twenty-first century health challenges have significantly altered the expanding role and functions of public health professionals. Guided by a call from the Association of Schools and Programs of Public Health (ASPPH) and its Framing the Future: The Second 100 Years of Education for Public Health report to adopt new and innovative approaches to prepare public health leaders, the University of South Florida College of Public Health aimed to self-assess the current Masters of Public Health (MPH) core curriculum with regard to preparing students to meet twenty-first century public health challenges. This paper describes how Intervention Mapping was employed as a framework to increase readiness and mobilize the COPH community for curricular change. Intervention Mapping provides an ideal framework, allowing organizations to assess capacity, specify goals, and guide the change process from curriculum development to implementation and evaluation of competency-driven programs. The steps outlined in this paper resulted in a final set of revised MPH core competencies that are interdisciplinary in nature and fulfill the emergent need to address changing trends in both public health education and challenges in population health approaches. Ultimately, the competencies developed through this process were agreed upon by the entire College of Public Health faculty, signaling one college's readiness for change, while providing the impetus to revolutionize the delivery of public health education at the University of South Florida. PMID:29164095
Geologic Map of the Stafford Quadrangle, Stafford County, Virginia
Mixon, Robert B.; Pavlides, Louis; Horton, J. Wright; Powars, David S.; Schindler, J. Stephen
2005-01-01
Introduction The Stafford 7.5-minute quadrangle, comprising approximately 55 square miles (142.5 square kilometers) of northeastern Virginia, is about 40 miles (mi) south of Washington, D.C. The region's main north-south transportation corridor, which connects Washington, D.C., and Richmond, Va., consists of Interstate 95, U.S. Highway 1, and the heavily used CSX and Amtrak railroads. Although the northern and eastern parts of the Stafford quadrangle have undergone extensive suburban development, the remainder of the area is still dominantly rural in character. The town of Stafford is the county seat. The Stafford 7.5-minute quadrangle is located in the Fredericksburg 30'x60' quadrangle, where information on the regional stratigraphy and structure is available from Mixon and others' (2000) geologic map and multichapter explanatory text. In addition to straddling the 'Fall Zone' boundary between the Appalachian Piedmont and the Atlantic Coastal Plain provinces, this quadrangle contains the best preserved and best studied segment of the Stafford fault system, an important example of late Cenozoic faulting in eastern North America (Mixon and Newell, 1977). This 1:24,000-scale geologic map provides a detailed framework for interpreting and integrating topical studies of that fault system. Our geologic map integrates more than two decades of intermittent geologic mapping and related investigations by the authors in this part of the Virginia Coastal Plain. Earlier mapping in the Piedmont by Pavlides (1995) has been revised by additional detailed mapping in selected areas, particularly near Abel Lake and Smith Lake, and by field evaluation of selected contact relations.
Oliveira, Sandra; Félix, Fernando; Nunes, Adélia; Lourenço, Luciano; Laneve, Giovanni; Sebastián-López, Ana
2018-01-15
Vulnerability assessment is a vital component of wildfire management. This research focused on the development of a framework to measure and map vulnerability levels in several areas within Mediterranean Europe, where wildfires are a major concern. The framework followed a stepwise approach to evaluate its main components, expressed by exposure, sensitivity and coping capacity. Data on population density, fuel types, protected areas location, roads infrastructure and surveillance activities, among others, were integrated to create composite indices representing each component and articulated in multiple dimensions. Maps were created for several test areas, in northwest Portugal, southwest Sardinia in Italy and northeast Corsica in France, with the contribution of local participants from civil protection institutions and forest services. Results showed the influence of fuel sensitivity levels, population distribution and protected-area coverage on the overall vulnerability classes. The validation procedure showed reasonable levels of accuracy for the maps produced, with an overall match above 72% across the test sites. The systematic and flexible approach allowed for adjustments to local circumstances with regard to data availability and fire management procedures, without compromising its consistency, and offers substantial operational capabilities. The results obtained and the positive feedback of end-users encourage its further application as a means to improve wildfire management strategies at multiple levels with the latest scientific outputs. Copyright © 2017 Elsevier Ltd. All rights reserved.
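The abstract above describes composite indices built from normalized indicators of exposure, sensitivity and coping capacity, but does not give its aggregation formula. The following is only a minimal illustrative sketch of that general idea: min-max normalization of a few invented indicators and an equal-weight aggregation into a vulnerability score. The indicator names, values and weights are assumptions, not the authors' scheme.

```python
import numpy as np

def min_max(x):
    """Rescale an indicator to the 0-1 range (min-max normalization)."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

# Hypothetical per-cell indicators for a small test grid (values are invented).
population_density = min_max([12, 340, 85, 5, 210])      # exposure
fuel_hazard        = min_max([0.2, 0.9, 0.6, 0.4, 0.8])  # sensitivity
road_access        = min_max([0.9, 0.3, 0.5, 0.7, 0.2])  # coping capacity (higher = better)

# One common convention: vulnerability rises with exposure and sensitivity
# and falls with coping capacity; equal weights are an assumption here.
vulnerability = (population_density + fuel_hazard + (1.0 - road_access)) / 3.0
print(np.round(vulnerability, 2))
```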
A road map for integrating eco-evolutionary processes into biodiversity models.
Thuiller, Wilfried; Münkemüller, Tamara; Lavergne, Sébastien; Mouillot, David; Mouquet, Nicolas; Schiffers, Katja; Gravel, Dominique
2013-05-01
The demand for projections of the future distribution of biodiversity has triggered an upsurge in modelling at the crossroads between ecology and evolution. Despite the enthusiasm around these so-called biodiversity models, most approaches are still criticised for not integrating key processes known to shape species ranges and community structure. Developing an integrative modelling framework for biodiversity distribution promises to improve the reliability of predictions and to give a better understanding of the eco-evolutionary dynamics of species and communities under changing environments. In this article, we briefly review some eco-evolutionary processes and interplays among them, which are essential to provide reliable projections of species distributions and community structure. We identify gaps in theory, quantitative knowledge and data availability hampering the development of an integrated modelling framework. We argue that model development relying on a strong theoretical foundation is essential to inspire new models, manage complexity and maintain tractability. We support our argument with an example of a novel integrated model for species distribution modelling, derived from metapopulation theory, which accounts for abiotic constraints, dispersal, biotic interactions and evolution under changing environmental conditions. We hope such a perspective will motivate exciting and novel research, and challenge others to improve on our proposed approach. © 2013 John Wiley & Sons Ltd/CNRS.
The mineralogy of global magnetic anomalies
NASA Technical Reports Server (NTRS)
Haggerty, S. E. (Principal Investigator)
1984-01-01
Experimental and analytical data on magnetic mineralogy were provided as an aid to the interpretation of magnetic anomaly maps. An integrated program, ranging from the chemistry of materials from 100 or more km depth within the Earth, to an examination of the MAGSAT anomaly maps at about 400 km above the Earth's surface, was undertaken. Within this framework, a detailed picture of the pertinent mineralogical and magnetic relationships for the region of West Africa was provided. Efforts were directed toward: (1) examining the geochemistry, mineralogy, magnetic properties, and phase relations of magnetic oxides and metal alloys in rocks demonstrated to have originated in the lower crust or upper mantle; (2) examining the assumption that these rocks portray the nature of their source regions; and (3) examining the regional geology, tectonics, gravity field and the MAGSAT anomaly maps for West Africa.
Using hydrogeologic data to evaluate geothermal potential in the eastern Great Basin
Masbruch, Melissa D.; Heilweil, Victor M.; Brooks, Lynette E.
2012-01-01
In support of a larger study to evaluate geothermal resource development of high-permeability stratigraphic units in sedimentary basins, this paper integrates groundwater and thermal data to evaluate heat and fluid flow within the eastern Great Basin. Previously published information from a hydrogeologic framework, a potentiometric-surface map, and groundwater budgets was compared to a surficial heat-flow map. Comparisons between regional groundwater flow patterns and surficial heat flow indicate a strong spatial relation between regional groundwater movement and surficial heat distribution. By combining aquifer geometry and heat-flow maps, a group of subareas within the eastern Great Basin is identified that have high surficial heat flow and are underlain by a sequence of thick basin-fill deposits and permeable carbonate aquifers. These regions may have potential for future geothermal resource development.
Prototype development of a web-based participative decision support platform in risk management
NASA Astrophysics Data System (ADS)
Aye, Zar Chi; Olyazadeh, Roya; Jaboyedoff, Michel; Derron, Marc-Henri
2014-05-01
This paper discusses the proposed background architecture and prototype development of an internet-based decision support system (DSS) in the field of natural hazards and risk management using open-source geospatial software and web technologies. It is based on a three-tier, client-server architecture with the support of the Boundless (OpenGeo) framework and its client-side SDK application environment using customized gxp components and data utility classes. The main purpose of the system is to integrate the workflow of risk management systematically with the diverse involvement of stakeholders from different organizations dealing with natural hazards and risk, for evaluation of management measures through an active online participation approach. It aims to develop an adaptive, user-friendly, web-based environment that allows the users to set up risk management strategies based on actual context and data by integrating web-GIS and DSS functionality associated with process flow and other visualization tools. A web-GIS interface has been integrated within the DSS to deliver maps and provide certain geo-processing capabilities on the web, which can be easily accessed and shared by different organizations located in the case study sites of the project. This platform is envisaged not only as a common web-based platform for the centralized sharing of data such as hazard maps, elements-at-risk maps and additional information, but also as an integrated platform for risk management where the users can upload data, analyze risk and identify possible alternative scenarios for risk reduction, especially for floods and landslides, either quantitatively or qualitatively depending on the risk information provided by the stakeholders in the case study regions. The level of involvement, access to and interaction with the provided functionality of the system varies depending on the roles and responsibilities of the stakeholders; for example, only the experts (planners, geological services, etc.) have access to the alternative-definition component to formulate the risk reduction measures. The development of such a participative platform would ultimately lead to an integrated risk management approach, highlighting the need to involve experts and civil society in the decision-making process for the evaluation of risk management measures through active participation. The system will be applied and evaluated in four case study areas of the CHANGES project in Europe: Romania, northeastern Italy, the French Alps and Poland. However, the framework of the system is designed in a generic way so as to be applicable in other regions, ensuring high adaptability and flexibility. The research has been undertaken as a part of the CHANGES project funded by the European Commission's Seventh Framework Programme.
NASA Astrophysics Data System (ADS)
Condit, C. D.; Mninch, M.
2012-12-01
The Dynamic Digital Map (DDM) is an ideal vehicle for the professional geologist to use to describe the geologic setting of key sites to the public in a format that integrates and presents maps and associated analytical data and multimedia without the need for an ArcGIS interface. Ideally, DDMs include maps with field trip guide stops that contain photographs, movies, figures and animations showing, for example, how the features seen in the field formed, or how data might best be visualized in "time-frame" sequences. DDMs distribute geologic maps, images, movies, analytical data, and text such as field guides, in an integrated, cross-platform, web-enabled format that is intuitive to use, easily and quickly searchable, and requires no additional proprietary software to operate. Maps, photos, movies and animations are stored outside the program, which acts as an organizational framework and index to present these data. Once created, the DDM can be downloaded from the web site hosting it in the flavor matching the user's operating system (e.g. Linux, Windows and Macintosh) as zip, dmg or tar files (and soon as iOS and Android tablet apps). When decompressed, the DDM can then access its associated data directly from that site with no browser needed. Alternatively, the entire package can be distributed and used from CD, DVD, or flash-memory storage. The intent of this presentation is to introduce the variety of geology that can be accessed from the over 25 DDMs created to date, concentrating on the DDM of the Springerville Volcanic Field. We will highlight selected features of some of them, introduce a simplified interface to the original DDM (which we renamed DDMC, for Classic), and give a brief look at the recently (2010-2011) completed geologic maps of the Springerville Volcanic Field to see examples of each of the features discussed above, and a display of the integrated analytical data set. We will also highlight the differences between the classic DDMCs and the new Dynamic Digital Map Extended (DDME), designed from the ground up to take advantage of the expanded connectedness this redesigned program will accommodate.
Advancing the integration of spatial data to map human and natural drivers on coral reefs
Gove, Jamison M.; Walecka, Hilary R.; Donovan, Mary K.; Williams, Gareth J.; Jouffray, Jean-Baptiste; Crowder, Larry B.; Erickson, Ashley; Falinski, Kim; Friedlander, Alan M.; Kappel, Carrie V.; Kittinger, John N.; McCoy, Kaylyn; Norström, Albert; Nyström, Magnus; Oleson, Kirsten L. L.; Stamoulis, Kostantinos A.; White, Crow; Selkoe, Kimberly A.
2018-01-01
A major challenge for coral reef conservation and management is understanding how a wide range of interacting human and natural drivers cumulatively impact and shape these ecosystems. Despite the importance of understanding these interactions, a methodological framework to synthesize spatially explicit data of such drivers is lacking. To fill this gap, we established a transferable data synthesis methodology to integrate spatial data on environmental and anthropogenic drivers of coral reefs, and applied this methodology to a case study location–the Main Hawaiian Islands (MHI). Environmental drivers were derived from time series (2002–2013) of climatological ranges and anomalies of remotely sensed sea surface temperature, chlorophyll-a, irradiance, and wave power. Anthropogenic drivers were characterized using empirically derived and modeled datasets of spatial fisheries catch, sedimentation, nutrient input, new development, habitat modification, and invasive species. Within our case study system, resulting driver maps showed high spatial heterogeneity across the MHI, with anthropogenic drivers generally greatest and most widespread on O‘ahu, where 70% of the state’s population resides, while sedimentation and nutrients were dominant in less populated islands. Together, the spatial integration of environmental and anthropogenic driver data described here provides a first-ever synthetic approach to visualize how the drivers of coral reef state vary in space and demonstrates a methodological framework for implementation of this approach in other regions of the world. By quantifying and synthesizing spatial drivers of change on coral reefs, we provide an avenue for further research to understand how drivers determine reef diversity and resilience, which can ultimately inform policies to protect coral reefs. PMID:29494613
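The environmental drivers above are derived from climatological ranges and anomalies of remotely sensed time series. As a hedged illustration of that general idea, and not the authors' actual processing chain, the sketch below computes a monthly climatology, anomaly series and climatological range for a synthetic sea surface temperature record; the series itself is invented.

```python
import numpy as np

rng = np.random.default_rng(0)
years, months = 12, 12          # a 2002-2013 style monthly series (synthetic data)
sst = 25 + 2 * np.sin(2 * np.pi * np.arange(years * months) / 12) \
        + rng.normal(0, 0.3, years * months)
sst = sst.reshape(years, months)

climatology = sst.mean(axis=0)                   # long-term mean for each calendar month
anomaly = sst - climatology                      # departure from the climatology
clim_range = sst.max(axis=0) - sst.min(axis=0)   # climatological range per month

print("climatology:", np.round(climatology, 2))
print("largest anomaly:", np.round(np.abs(anomaly).max(), 2))
```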
A unified approach for debugging is-a structure and mappings in networked taxonomies
2013-01-01
Background With the increased use of ontologies and ontology mappings in semantically-enabled applications such as ontology-based search and data integration, the issue of detecting and repairing defects in ontologies and ontology mappings has become increasingly important. These defects can lead to wrong or incomplete results for the applications. Results We propose a unified framework for debugging the is-a structure of and mappings between taxonomies, the most used kind of ontologies. We present theory and algorithms as well as an implemented system RepOSE, that supports a domain expert in detecting and repairing missing and wrong is-a relations and mappings. We also discuss two experiments performed by domain experts: an experiment on the Anatomy ontologies from the Ontology Alignment Evaluation Initiative, and a debugging session for the Swedish National Food Agency. Conclusions Semantically-enabled applications need high quality ontologies and ontology mappings. One key aspect is the detection and removal of defects in the ontologies and ontology mappings. Our system RepOSE provides an environment that supports domain experts to deal with this issue. We have shown the usefulness of the approach in two experiments by detecting and repairing circa 200 and 30 defects, respectively. PMID:23548155
Database integration in a multimedia-modeling environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dorow, Kevin E.
2002-09-02
Integration of data from disparate remote sources has direct applicability to modeling, which can support Brownfield assessments. To accomplish this task, a data integration framework needs to be established. A key element in this framework is the metadata that creates the relationship between the pieces of information that are important in the multimedia modeling environment and the information that is stored in the remote data source. The design philosophy is to allow modelers and database owners to collaborate by defining this metadata in such a way that allows interaction between their components. The main parts of this framework include tools to facilitate metadata definition, database extraction plan creation, automated extraction plan execution / data retrieval, and a central clearing house for metadata and modeling / database resources. Cross-platform compatibility (using Java) and standard communications protocols (http / https) allow these parts to run in a wide variety of computing environments (Local Area Networks, Internet, etc.), and, therefore, this framework provides many benefits. Because of the specific data relationships described in the metadata, the amount of data that have to be transferred is kept to a minimum (only the data that fulfill a specific request are provided as opposed to transferring the complete contents of a data source). This allows for real-time data extraction from the actual source. Also, the framework sets up collaborative responsibilities such that the different types of participants have control over the areas in which they have domain knowledge: the modelers are responsible for defining the data relevant to their models, while the database owners are responsible for mapping the contents of the database using the metadata definitions. Finally, the data extraction mechanism allows for the ability to control access to the data and what data are made available.
Neuhaus, Philipp; Doods, Justin; Dugas, Martin
2015-01-01
Automatic coding of medical terms is an important, but highly complicated and laborious task. To compare and evaluate different strategies, a framework with a standardized web interface was created. Two UMLS mapping strategies are compared to demonstrate the interface. The framework is a Java Spring application running on a Tomcat application server. It accepts different parameters and returns results in JSON format. To demonstrate the framework, a list of medical data items was mapped by two different methods: similarity search in a large table of terminology codes versus search in a manually curated repository. These mappings were reviewed by a specialist. The evaluation shows that the framework is flexible (due to standardized interfaces like HTTP and JSON), performant and reliable. The accuracy of automatically assigned codes is limited (up to 40%). Combining different semantic mappers into a standardized web API is feasible. This framework can be easily enhanced due to its modular design.
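One of the two strategies compared above is a similarity search over a large table of terminology codes. The framework itself is a Java Spring service, so the sketch below is only a language-neutral, toy illustration of that lookup step in Python; the mini code table, item names and threshold are invented.

```python
from difflib import SequenceMatcher

# Invented mini "terminology table" of code/term pairs, for illustration only.
code_table = {
    "C0020538": "hypertension",
    "C0011849": "diabetes mellitus",
    "C0004096": "asthma",
}

def best_match(item, table, threshold=0.6):
    """Return the code whose term is most similar to the data item, if any."""
    scored = [(SequenceMatcher(None, item.lower(), term.lower()).ratio(), code)
              for code, term in table.items()]
    score, code = max(scored)
    return (code, score) if score >= threshold else (None, score)

print(best_match("hypertensive disorder", code_table))
```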
Saliency detection using mutual consistency-guided spatial cues combination
NASA Astrophysics Data System (ADS)
Wang, Xin; Ning, Chen; Xu, Lizhong
2015-09-01
Saliency detection has received extensive interest due to its remarkable contribution to a wide range of computer vision and pattern recognition applications. However, most existing computational models are designed for detecting saliency in visible images or videos. When applied to infrared images, they may suffer from limitations in saliency detection accuracy and robustness. In this paper, we propose a novel algorithm to detect visual saliency in infrared images by mutual consistency-guided spatial cues combination. First, based on the luminance contrast and contour characteristics of infrared images, two effective saliency maps, i.e., the luminance contrast saliency map and the contour saliency map, are constructed. Afterwards, an adaptive combination scheme guided by mutual consistency is exploited to integrate these two maps to generate the spatial saliency map. This idea is motivated by the observation that different maps are actually related to each other and the fusion scheme should present a logically consistent view of these maps. Finally, an enhancement technique is adopted to incorporate spatial saliency maps at various scales into a unified multi-scale framework to improve the reliability of the final saliency map. Comprehensive evaluations on real-life infrared images and comparisons with many state-of-the-art saliency models demonstrate the effectiveness and superiority of the proposed method for saliency detection in infrared images.
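The abstract combines a luminance-contrast saliency map and a contour saliency map with weights guided by their mutual consistency, but the exact weighting rule is not given. The sketch below is therefore only a hedged stand-in: consistency is approximated by agreement with the element-wise consensus, and that agreement balances a pixel-wise fusion of two random toy maps.

```python
import numpy as np

def fuse_saliency(lum_map, contour_map, eps=1e-6):
    """Fuse two saliency maps; weights favor the map that agrees more with
    the element-wise consensus (a stand-in for 'mutual consistency')."""
    lum = (lum_map - lum_map.min()) / (np.ptp(lum_map) + eps)
    con = (contour_map - contour_map.min()) / (np.ptp(contour_map) + eps)
    consensus = (lum + con) / 2.0
    w_lum = 1.0 / (np.abs(lum - consensus) + eps)
    w_con = 1.0 / (np.abs(con - consensus) + eps)
    return (w_lum * lum + w_con * con) / (w_lum + w_con)

rng = np.random.default_rng(1)
fused = fuse_saliency(rng.random((64, 64)), rng.random((64, 64)))
print(fused.shape, round(float(fused.mean()), 3))
```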
NASA Astrophysics Data System (ADS)
El-Gafy, Mohamed Anwar
Transportation projects have an impact on the environment. The general environmental pollution and damage caused by roads are closely associated with the level of economic activity. Although Environmental Impact Assessments (EIAs) depend on geo-spatial information in order to make an assessment, there are no rules per se for how to conduct an environmental assessment. Also, the particular objective of each assessment is dictated case by case, based on what information and analyses are required. The conventional approach to an EIA study is a time-consuming process because it involves a large number of dependent and independent variables that have to be taken into account, each with different consequences. With the emergence of satellite remote sensing technology and Geographic Information Systems (GIS), this research presents a new framework for the analysis phase of the EIA for transportation projects based on the integration of remote sensing technology, geographic information systems, and spatial modeling. By integrating the merits of the map overlay method and the matrix method, the framework comprehensively analyzes the environmental vulnerability around the road and its impact on the environment. This framework is expected to: (1) improve the quality of the decision-making process, (2) be applicable both to urban and inter-urban projects, regardless of transport mode, and (3) present the data and provide the appropriate analysis to support decision-makers and allow them to present these data at public hearings in a simple manner. Case studies of transportation projects in the State of Florida were analyzed to illustrate the use of the decision support framework and demonstrate its capabilities. This cohesive and integrated system will facilitate rational decisions through cost-effective coordination of environmental information and data management that can be tailored to specific projects. The framework would facilitate collecting, organizing, analyzing, archiving, and coordinating the information and data necessary to support technical and policy transportation decisions.
Semantic Web repositories for genomics data using the eXframe platform
2014-01-01
Background With the advent of inexpensive assay technologies, there has been an unprecedented growth in genomics data as well as the number of databases in which it is stored. In these databases, sample annotation using ontologies and controlled vocabularies is becoming more common. However, the annotation is rarely available as Linked Data, in a machine-readable format, or for standardized queries using SPARQL. This makes large-scale reuse, or integration with other knowledge bases, very difficult. Methods To address this challenge, we have developed the second generation of our eXframe platform, a reusable framework for creating online repositories of genomics experiments. This second generation model now publishes Semantic Web data. To accomplish this, we created an experiment model that covers provenance, citations, external links, assays, biomaterials used in the experiment, and the data collected during the process. The elements of our model are mapped to classes and properties from various established biomedical ontologies. Resource Description Framework (RDF) data is automatically produced using these mappings and indexed in an RDF store with a built-in SPARQL Protocol and RDF Query Language (SPARQL) endpoint. Conclusions Using the open-source eXframe software, institutions and laboratories can create Semantic Web repositories of their experiments, integrate them with heterogeneous resources and make them interoperable with the vast Semantic Web of biomedical knowledge. PMID:25093072
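eXframe maps experiment metadata to ontology classes and publishes RDF behind a SPARQL endpoint. As a generic illustration of that pattern, not the eXframe code base itself, the sketch below builds a tiny RDF graph with rdflib and queries it with SPARQL; the namespace, class and property names are placeholders.

```python
from rdflib import Graph, Namespace, Literal, RDF

EX = Namespace("http://example.org/experiment/")  # placeholder namespace

g = Graph()
g.add((EX.exp1, RDF.type, EX.GenomicsExperiment))
g.add((EX.exp1, EX.usesAssay, Literal("RNA-seq")))
g.add((EX.exp1, EX.hasBiomaterial, Literal("induced pluripotent stem cells")))

# Query the graph through SPARQL, as a semantic repository would expose it.
results = g.query("""
    PREFIX ex: <http://example.org/experiment/>
    SELECT ?assay WHERE { ?e a ex:GenomicsExperiment ; ex:usesAssay ?assay . }
""")
for row in results:
    print(row.assay)
```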
COEUS: “semantic web in a box” for biomedical applications
2012-01-01
Background As the “omics” revolution unfolds, the growth in data quantity and diversity is bringing about the need for pioneering bioinformatics software, capable of significantly improving the research workflow. To cope with these computer science demands, biomedical software engineers are adopting emerging semantic web technologies that better suit the life sciences domain. The latter’s complex relationships are easily mapped into semantic web graphs, enabling a superior understanding of collected knowledge. Despite increased awareness of semantic web technologies in bioinformatics, their use is still limited. Results COEUS is a new semantic web framework, aiming at a streamlined application development cycle and following a “semantic web in a box” approach. The framework provides a single package including advanced data integration and triplification tools, base ontologies, a web-oriented engine and a flexible exploration API. Resources can be integrated from heterogeneous sources, including CSV and XML files or SQL and SPARQL query results, and mapped directly to one or more ontologies. Advanced interoperability features include REST services, a SPARQL endpoint and LinkedData publication. These enable the creation of multiple applications for web, desktop or mobile environments, and empower a new knowledge federation layer. Conclusions The platform, targeted at biomedical application developers, provides a complete skeleton ready for rapid application deployment, enhancing the creation of new semantic information systems. COEUS is available as open source at http://bioinformatics.ua.pt/coeus/. PMID:23244467
COEUS: "semantic web in a box" for biomedical applications.
Lopes, Pedro; Oliveira, José Luís
2012-12-17
As the "omics" revolution unfolds, the growth in data quantity and diversity is bringing about the need for pioneering bioinformatics software, capable of significantly improving the research workflow. To cope with these computer science demands, biomedical software engineers are adopting emerging semantic web technologies that better suit the life sciences domain. The latter's complex relationships are easily mapped into semantic web graphs, enabling a superior understanding of collected knowledge. Despite increased awareness of semantic web technologies in bioinformatics, their use is still limited. COEUS is a new semantic web framework, aiming at a streamlined application development cycle and following a "semantic web in a box" approach. The framework provides a single package including advanced data integration and triplification tools, base ontologies, a web-oriented engine and a flexible exploration API. Resources can be integrated from heterogeneous sources, including CSV and XML files or SQL and SPARQL query results, and mapped directly to one or more ontologies. Advanced interoperability features include REST services, a SPARQL endpoint and LinkedData publication. These enable the creation of multiple applications for web, desktop or mobile environments, and empower a new knowledge federation layer. The platform, targeted at biomedical application developers, provides a complete skeleton ready for rapid application deployment, enhancing the creation of new semantic information systems. COEUS is available as open source at http://bioinformatics.ua.pt/coeus/.
Poynton, Clare B; Chen, Kevin T; Chonde, Daniel B; Izquierdo-Garcia, David; Gollub, Randy L; Gerstner, Elizabeth R; Batchelor, Tracy T; Catana, Ciprian
2014-01-01
We present a new MRI-based attenuation correction (AC) approach for integrated PET/MRI systems that combines both segmentation- and atlas-based methods by incorporating dual-echo ultra-short echo-time (DUTE) and T1-weighted (T1w) MRI data and a probabilistic atlas. Segmented atlases were constructed from CT training data using a leave-one-out framework and combined with T1w, DUTE, and CT data to train a classifier that computes the probability of air/soft tissue/bone at each voxel. This classifier was applied to segment the MRI of the subject of interest, and attenuation maps (μ-maps) were generated by assigning specific linear attenuation coefficients (LACs) to each tissue class. The μ-maps generated with this “Atlas-T1w-DUTE” approach were compared to those obtained from DUTE data using a previously proposed method. For validation of the segmentation results, segmented CT μ-maps were considered to be the “silver standard”; the segmentation accuracy was assessed qualitatively and quantitatively through calculation of the Dice similarity coefficient (DSC). Relative change (RC) maps between the CT- and MRI-based attenuation corrected PET volumes were also calculated for a global voxel-wise assessment of the reconstruction results. The μ-maps obtained using the Atlas-T1w-DUTE classifier agreed well with those derived from CT; the mean DSCs for the Atlas-T1w-DUTE-based μ-maps across all subjects were higher than those for the DUTE-based μ-maps; the atlas-based μ-maps also showed a lower percentage of misclassified voxels across all subjects. RC maps from the atlas-based technique also demonstrated improvement in the PET data compared to the DUTE method, both globally and regionally. PMID:24753982
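Segmentation accuracy above is quantified with the Dice similarity coefficient, DSC = 2|A∩B| / (|A| + |B|). A minimal numpy implementation for binary masks is sketched below; the masks are toy arrays, not the study's μ-maps.

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

seg_ct  = np.array([[0, 1, 1], [0, 1, 0]])    # toy "CT-derived" bone mask
seg_mri = np.array([[0, 1, 0], [0, 1, 1]])    # toy "MRI-derived" bone mask
print(round(dice(seg_ct, seg_mri), 3))        # 0.667 for these toy masks
```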
Figueroa, Debbie M; Bass, Hank W
2012-05-01
Integrated cytogenetic pachytene fluorescence in situ hybridization (FISH) maps were developed for chromosomes 1, 3, 4, 5, 6, and 8 of maize using restriction fragment length polymorphism marker-selected Sorghum propinquum bacterial artificial chromosomes (BACs) for 19 core bin markers and 4 additional genetic framework loci. Using transgenomic BAC FISH mapping on maize chromosome addition lines of oats, we found that the relative locus position along the pachytene chromosome did not change as a function of total arm length, indicative of uniform axial contraction along the fibers during mid-prophase for tested loci on chromosomes 4 and 5. Additionally, we cytogenetically FISH mapped six loci from chromosome 9 onto their duplicated syntenic regions on chromosomes 1 and 6, which have varying amounts of sequence divergence, using sorghum BACs homologous to the chromosome 9 loci. We found that successful FISH mapping was possible even when the chromosome 9 selective marker had no counterpart in the syntenic block. In total, these 29 FISH-mapped loci were used to create the most extensive pachytene FISH maps to date for these six maize chromosomes. The FISH-mapped loci were then merged into one composite karyotype for direct comparative analysis with the recombination nodule-predicted cytogenetic, genetic linkage, and genomic physical maps using the relative marker positions of the loci on all the maps. Marker colinearity was observed between all pair-wise map comparisons, although marker distribution patterns varied widely in some cases. As expected, we found that the recombination nodule-based predictions most closely resembled the cytogenetic map positions overall. Cytogenetic and linkage map comparisons agreed with previous studies showing a decrease in marker spacing in the peri-centromeric heterochromatin region on the genetic linkage maps. In fact, there was a general trend with most loci mapping closer towards the telomere on the linkage maps than on the cytogenetic maps, regardless of chromosome number or maize inbred line source, with just some of the telomeric loci exempted. Finally and somewhat surprisingly, we observed considerable variation between the relative arm positions of loci when comparing our cytogenetic FISH map to the B73 genomic physical maps, even where comparisons were to a B73-derived cytogenetic map. This variation is more evident between different chromosome arms, but less so within a given arm, ruling out any type of inbred-line dependent global features of linear deoxyribonucleic acid compared with the meiotic fiber organization. This study provides a means for analyzing the maize genome structure by producing new connections for integrating the cytogenetic, linkage, and physical maps of maize.
QUADrATiC: scalable gene expression connectivity mapping for repurposing FDA-approved therapeutics.
O'Reilly, Paul G; Wen, Qing; Bankhead, Peter; Dunne, Philip D; McArt, Darragh G; McPherson, Suzanne; Hamilton, Peter W; Mills, Ken I; Zhang, Shu-Dong
2016-05-04
Gene expression connectivity mapping has proven to be a powerful and flexible tool for research. Its application has been shown in a broad range of research topics, most commonly as a means of identifying potential small molecule compounds, which may be further investigated as candidates for repurposing to treat diseases. The public release of voluminous data from the Library of Integrated Network-Based Cellular Signatures (LINCS) programme further enhanced the utilities and potentials of gene expression connectivity mapping in biomedicine. We describe QUADrATiC ( http://go.qub.ac.uk/QUADrATiC ), a user-friendly tool for the exploration of gene expression connectivity on the subset of the LINCS data set corresponding to FDA-approved small molecule compounds. It enables the identification of compounds with repurposing therapeutic potential. The software is designed to cope with the increased volume of data over existing tools by taking advantage of multicore computing architectures to provide a scalable solution, which may be installed and operated on a range of computers, from laptops to servers. This scalability is provided by the use of the modern concurrent programming paradigm provided by the Akka framework. The QUADrATiC Graphical User Interface (GUI) has been developed using advanced Javascript frameworks, providing novel visualization capabilities for further analysis of connections. There is also a web services interface, allowing integration with other programs or scripts. QUADrATiC has been shown to provide an improvement over existing connectivity map software, in terms of scope (based on the LINCS data set), applicability (using FDA-approved compounds), usability and speed. It offers biological researchers the potential to analyze transcriptional data and generate candidate therapeutics for focused study in the lab. QUADrATiC represents a step change in the process of investigating gene expression connectivity and provides more biologically relevant results than previous alternative solutions.
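Connectivity mapping scores how well a query gene signature "connects" to each reference compound profile. QUADrATiC's actual scoring internals are not reproduced here; the sketch below only illustrates the generic idea of a signed, rank-based connection score for an up/down query signature against a ranked reference profile, with invented gene names and a simplified scoring rule.

```python
def connection_score(ranked_genes, up_genes, down_genes):
    """Toy signed score: up-genes near the top and down-genes near the bottom
    of the reference ranking give a positive connection (range roughly -1..1)."""
    n = len(ranked_genes)
    pos = {g: i for i, g in enumerate(ranked_genes)}        # 0 = most up-regulated
    up = [1 - 2 * pos[g] / (n - 1) for g in up_genes if g in pos]
    down = [2 * pos[g] / (n - 1) - 1 for g in down_genes if g in pos]
    return (sum(up) + sum(down)) / (len(up) + len(down))

reference = ["GENE%d" % i for i in range(1, 11)]             # ranked reference profile
print(round(connection_score(reference, ["GENE1", "GENE2"], ["GENE9", "GENE10"]), 3))
```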
Digital Field Mapping with the British Geological Survey
NASA Astrophysics Data System (ADS)
Leslie, Graham; Smith, Nichola; Jordan, Colm
2014-05-01
The BGS•SIGMA project was initiated in 2001 in response to a major stakeholder review of onshore mapping within the British Geological Survey (BGS). That review proposed a significant change for BGS with the recommendation that digital methods should be implemented for field mapping and data compilation. The BGS•SIGMA project (System for Integrated Geoscience MApping) is an integrated workflow for geoscientific surveying that uses digital methods for geological data visualisation, recording and interpretation, in both 2D and 3D. The project has defined and documented an underpinning framework of best practice for survey and information management, best practice that has then informed the design brief and specification for a toolkit to support this new methodology. The project has now delivered BGS•SIGMA2012. BGS•SIGMA2012 is an integrated toolkit which enables assembly and interrogation/visualisation of existing geological information; capture of, and integration with, new data and geological interpretations; and delivery of 3D digital products and services. From its early days as a system which used PocketGIS run on Husky Fex21 hardware, to the present-day system which runs on ruggedized tablet PCs with integrated GPS units, it has evolved into a complete digital mapping and compilation system. BGS•SIGMA2012 uses a highly customised version of ESRI's ArcGIS 10 and 10.1 with a fully relational Access 2007/2010 geodatabase. BGS•SIGMA2012 is the third external release of our award-winning digital field mapping toolkit. The first free external release was in 2009, with the third version (BGS-SIGMAmobile2012 v1.01) released on our website (http://www.bgs.ac.uk/research/sigma/home.html) in 2013. The BGS•SIGMAmobile toolkit formed the major part of the first two releases, but this new version integrates the BGS•SIGMAdesktop functionality that BGS routinely uses to transform our field data into corporate-standard geological models and derivative map outputs. BGS•SIGMA2012 is the default toolkit within BGS for bedrock and superficial geological mapping and other data acquisition projects across the UK, both onshore and offshore. It is used in mapping projects in Africa, the Middle East and the USA, and has been taken to Japan as part of the Tohoku tsunami damage assessment project. It is also successfully being used worldwide by other geological surveys (e.g. Norway and Tanzania), by universities including Leicester, Keele and Kyoto, and by organisations such as Vale Mining in Brazil and the Montana Bureau of Mines and Geology. Over 2000 licenses have been downloaded to date, and the toolkit is in use on all seven continents. Development of the system is still ongoing as a result of both user feedback and the changing face of technology. Investigations into the development of a BGS•SIGMA smartphone app are currently taking place alongside system developments such as a new and more streamlined data entry system.
Developing ShakeCast statistical fragility analysis framework for rapid post-earthquake assessment
Lin, K.-W.; Wald, D.J.
2012-01-01
When an earthquake occurs, the U.S. Geological Survey (USGS) ShakeMap estimates the extent of potentially damaging shaking and provides overall information regarding the affected areas. The USGS ShakeCast system is a freely available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users’ facilities, sends notifications of potential damage to responsible parties, and generates facility damage assessment maps and other web-based products for emergency managers and responders. We describe notable improvements to the ShakeMap and ShakeCast applications. We present a design for comprehensive fragility implementation, integrating spatially-varying ground-motion uncertainties into fragility curves for ShakeCast operations. For each facility, an overall inspection priority (or damage assessment) is assigned on the basis of combined component-based fragility curves using pre-defined logic. While regular ShakeCast users receive overall inspection priority designations for each facility, engineers can access the full fragility analyses for further evaluation.
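Facility fragility is commonly expressed as a lognormal curve giving the probability of reaching a damage state at a given shaking intensity, P(DS | IM) = Φ(ln(IM/θ)/β). The sketch below implements that standard form, not necessarily the exact ShakeCast parameterization; folding ground-motion uncertainty into the dispersion term in quadrature is an assumption for illustration, and the parameter values are invented.

```python
from math import log, sqrt, erf

def fragility(im, theta, beta, gm_sigma=0.0):
    """P(damage state reached | intensity measure im), lognormal fragility curve.
    theta: median capacity; beta: dispersion; gm_sigma: optional extra
    ground-motion uncertainty combined in quadrature (an assumption here)."""
    beta_total = sqrt(beta**2 + gm_sigma**2)
    z = log(im / theta) / beta_total
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))   # standard normal CDF

# Example: median capacity 0.4 g, dispersion 0.5, shaking of 0.3 g PGA.
print(round(fragility(0.3, theta=0.4, beta=0.5, gm_sigma=0.2), 3))
```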
Automated Database Mediation Using Ontological Metadata Mappings
Marenco, Luis; Wang, Rixin; Nadkarni, Prakash
2009-01-01
Objective To devise an automated approach for integrating federated database information using database ontologies constructed from their extended metadata. Background One challenge of database federation is that the granularity of representation of equivalent data varies across systems. Dealing effectively with this problem is analogous to dealing with precoordinated vs. postcoordinated concepts in biomedical ontologies. Model Description The authors describe an approach based on ontological metadata mapping rules defined with elements of a global vocabulary, which allows a query specified at one granularity level to fetch data, where possible, from databases within the federation that use different granularities. This is implemented in OntoMediator, a newly developed production component of our previously described Query Integrator System. OntoMediator's operation is illustrated with a query that accesses three geographically separate, interoperating databases. An example based on SNOMED also illustrates the applicability of high-level rules to support the enforcement of constraints that can prevent inappropriate curator or power-user actions. Summary A rule-based framework simplifies the design and maintenance of systems where categories of data must be mapped to each other, for the purpose of either cross-database query or for curation of the contents of compositional controlled vocabularies. PMID:19567801
Luo, Jake; Chen, Weiheng; Wu, Min; Weng, Chunhua
2018-01-01
Background Prior studies of clinical trial planning indicate that it is crucial to search and screen recruitment sites before starting to enroll participants. However, there is currently no systematic method to support clinical investigators in searching for candidate recruitment sites according to the clinical trial factors of interest. Objective In this study, we aim to develop a new approach to integrating the location data of over one million heterogeneous recruitment sites that are stored in clinical trial documents. The integrated recruitment location data can be searched and visualized using a map-based information retrieval method. The method enables systematic search and analysis of recruitment sites across a large number of clinical trials. Methods The location data of more than 1.4 million recruitment sites of over 183,000 clinical trials was normalized and integrated using a geocoding method. The integrated data can be used to support geographic information retrieval of recruitment sites. Additionally, the information of over 6000 clinical trial target disease conditions and close to 4000 interventions was also integrated into the system and linked to the recruitment locations. Such data integration enabled the construction of a novel map-based query system. The system will allow clinical investigators to search and visualize candidate recruitment sites for clinical trials based on target conditions and interventions. Results The evaluation results showed that the coverage of the geographic location mapping for the 1.4 million recruitment sites was 99.8%. The evaluation of 200 randomly retrieved recruitment sites showed that the correctness of geographic information mapping was 96.5%. The recruitment intensities of the top 30 countries were also retrieved and analyzed. The data analysis results indicated that the recruitment intensity varied significantly across different countries and geographic areas. Conclusion This study contributed a new data processing framework to extract and integrate the location data of heterogeneous recruitment sites from clinical trial documents. The developed system can support effective retrieval and analysis of potential recruitment sites using target clinical trial factors. PMID:29132636
Luo, Jake; Chen, Weiheng; Wu, Min; Weng, Chunhua
2017-12-01
Prior studies of clinical trial planning indicate that it is crucial to search and screen recruitment sites before starting to enroll participants. However, there is currently no systematic method to support clinical investigators in searching for candidate recruitment sites according to the clinical trial factors of interest. In this study, we aim to develop a new approach to integrating the location data of over one million heterogeneous recruitment sites that are stored in clinical trial documents. The integrated recruitment location data can be searched and visualized using a map-based information retrieval method. The method enables systematic search and analysis of recruitment sites across a large number of clinical trials. The location data of more than 1.4 million recruitment sites of over 183,000 clinical trials was normalized and integrated using a geocoding method. The integrated data can be used to support geographic information retrieval of recruitment sites. Additionally, the information of over 6000 clinical trial target disease conditions and close to 4000 interventions was also integrated into the system and linked to the recruitment locations. Such data integration enabled the construction of a novel map-based query system. The system will allow clinical investigators to search and visualize candidate recruitment sites for clinical trials based on target conditions and interventions. The evaluation results showed that the coverage of the geographic location mapping for the 1.4 million recruitment sites was 99.8%. The evaluation of 200 randomly retrieved recruitment sites showed that the correctness of geographic information mapping was 96.5%. The recruitment intensities of the top 30 countries were also retrieved and analyzed. The data analysis results indicated that the recruitment intensity varied significantly across different countries and geographic areas. This study contributed a new data processing framework to extract and integrate the location data of heterogeneous recruitment sites from clinical trial documents. The developed system can support effective retrieval and analysis of potential recruitment sites using target clinical trial factors. Copyright © 2017 Elsevier B.V. All rights reserved.
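Once recruitment sites have been geocoded to latitude/longitude, map-based retrieval reduces to spatial queries over those coordinates. The sketch below is a minimal, self-contained illustration (a haversine distance plus a radius filter) with invented site coordinates; it is not the study's geocoding pipeline or query system.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# Invented, already-geocoded recruitment sites: (name, lat, lon).
sites = [("Site A", 40.71, -74.01), ("Site B", 41.88, -87.63), ("Site C", 34.05, -118.24)]

def sites_within(lat, lon, radius_km):
    return [name for name, slat, slon in sites
            if haversine_km(lat, lon, slat, slon) <= radius_km]

print(sites_within(40.75, -73.99, radius_km=50))   # sites near New York City
```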
Assessing Local Knowledge Use in Agroforestry Management with Cognitive Maps
NASA Astrophysics Data System (ADS)
Isaac, Marney E.; Dawoe, Evans; Sieciechowicz, Krystyna
2009-06-01
Small-holder farmers often develop adaptable agroforestry management techniques to improve and diversify crop production. In the cocoa growing region of Ghana, local knowledge on such farm management holds a noteworthy role in the overall farm development. The documentation and analysis of such knowledge use in cocoa agroforests may afford an applicable framework to determine mechanisms driving farmer preference and indicators in farm management. This study employed 12 in-depth farmer interviews regarding variables in farm management as a unit of analysis and utilized cognitive mapping as a qualitative method of analysis. Our objectives were (1) to illustrate and describe agroforestry management variables and associated farm practices, (2) to determine the scope of decision making of individual farmers, and (3) to investigate the suitability of cognitive mapping as a tool for assessing local knowledge use. Results from the cognitive maps revealed an average of 16 ± 3 variables and 19 ± 3 links between management variables in the farmer cognitive maps. Farmer use of advantageous ecological processes was highly central to farm management (48% of all variables), particularly manipulation of organic matter, shade and food crop establishment, and maintenance of a tree stratum as the most common, highly linked variables. Over 85% of variables included bidirectional arrows, interpreted as farm management practices dominated by controllable factors, insofar as farmers indicated an ability to alter most farm characteristics. Local knowledge use on cocoa production revealed detailed indicators for site evaluation, thus affecting farm preparation and management. Our findings suggest that amid multisourced information under conditions of uncertainty, strategies for adaptable agroforestry management should integrate existing and localized management frameworks and that cognitive mapping provides a tool-based approach to advance such a management support system.
Assessing local knowledge use in agroforestry management with cognitive maps.
Isaac, Marney E; Dawoe, Evans; Sieciechowicz, Krystyna
2009-06-01
Small-holder farmers often develop adaptable agroforestry management techniques to improve and diversify crop production. In the cocoa growing region of Ghana, local knowledge on such farm management holds a noteworthy role in the overall farm development. The documentation and analysis of such knowledge use in cocoa agroforests may afford an applicable framework to determine mechanisms driving farmer preference and indicators in farm management. This study employed 12 in-depth farmer interviews regarding variables in farm management as a unit of analysis and utilized cognitive mapping as a qualitative method of analysis. Our objectives were (1) to illustrate and describe agroforestry management variables and associated farm practices, (2) to determine the scope of decision making of individual farmers, and (3) to investigate the suitability of cognitive mapping as a tool for assessing local knowledge use. Results from the cognitive maps revealed an average of 16 +/- 3 variables and 19 +/- 3 links between management variables in the farmer cognitive maps. Farmer use of advantageous ecological processes was highly central to farm management (48% of all variables), particularly manipulation of organic matter, shade and food crop establishment, and maintenance of a tree stratum as the most common, highly linked variables. Over 85% of variables included bidirectional arrows, interpreted as farm management practices dominated by controllable factors, insofar as farmers indicated an ability to alter most farm characteristics. Local knowledge use on cocoa production revealed detailed indicators for site evaluation, thus affecting farm preparation and management. Our findings suggest that amid multisourced information under conditions of uncertainty, strategies for adaptable agroforestry management should integrate existing and localized management frameworks and that cognitive mapping provides a tool-based approach to advance such a management support system.
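A cognitive map of the kind described above can be treated as a directed graph of management variables, with centrality summarizing how connected a variable is. The sketch below uses networkx on a few invented variables purely to illustrate the representation and a degree-centrality calculation; it does not reproduce the farmers' actual maps or the study's analysis.

```python
import networkx as nx

# Invented management variables and links, for illustration only.
G = nx.DiGraph()
G.add_edges_from([
    ("organic matter", "soil fertility"),
    ("soil fertility", "cocoa yield"),
    ("shade trees", "organic matter"),
    ("shade trees", "cocoa yield"),
    ("food crops", "household income"),
    ("cocoa yield", "household income"),
])

# Degree centrality as a simple proxy for how central a variable is to management.
centrality = nx.degree_centrality(G)
for var, c in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{var:18s} {c:.2f}")
```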
NASA Astrophysics Data System (ADS)
Hong, Haibo; Yin, Yuehong; Chen, Xing
2016-11-01
Despite the rapid development of computer science and information technology, an efficient human-machine integrated enterprise information system for designing complex mechatronic products has still not been fully realized, partly because of inharmonious communication among collaborators. Therefore, one challenge in human-machine integration is how to establish an appropriate knowledge management (KM) model to support the integration and sharing of heterogeneous product knowledge. To address the diversity of design knowledge, this article proposes an ontology-based model that provides an unambiguous and normative representation of knowledge. First, an ontology-based human-machine integrated design framework is described, and corresponding ontologies and sub-ontologies are established according to different purposes and scopes. Second, a similarity calculation-based ontology integration method composed of ontology mapping and ontology merging is introduced. An ontology search-based knowledge sharing method is then developed. Finally, a case of human-machine integrated design of a large ultra-precision grinding machine is used to demonstrate the effectiveness of the method.
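The ontology integration step above rests on a similarity calculation between concepts from different sub-ontologies, but the abstract does not specify the measure. The sketch below therefore uses only a simple lexical similarity (difflib ratio) to propose candidate mappings between two invented concept lists; a real system would combine lexical, structural and semantic evidence, and the concept names and threshold here are assumptions.

```python
from difflib import SequenceMatcher

design_onto  = ["grinding wheel", "spindle", "machine bed"]      # invented concepts
process_onto = ["wheel for grinding", "spindle unit", "control cabinet"]

def propose_mappings(src, dst, threshold=0.5):
    """Suggest (source, target, score) pairs whose lexical similarity clears the threshold."""
    mappings = []
    for s in src:
        best = max(dst, key=lambda d: SequenceMatcher(None, s, d).ratio())
        score = SequenceMatcher(None, s, best).ratio()
        if score >= threshold:
            mappings.append((s, best, round(score, 2)))
    return mappings

print(propose_mappings(design_onto, process_onto))
```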
NASA Astrophysics Data System (ADS)
Yang, Z. L.; Wu, W. Y.; Lin, P.; Maidment, D. R.
2017-12-01
Extreme water events such as catastrophic floods and severe droughts have increased in recent decades. Mitigating the risk to lives, food security, infrastructure, energy supplies, and numerous other industries posed by these extreme events requires informed decision-making and planning based on sound science. We are developing a global water modeling capability by building models that will provide total operational water predictions (evapotranspiration, soil moisture, groundwater, channel flow, inundation, snow) at unprecedented spatial resolutions and update frequencies. Toward this goal, this talk presents an integrated global hydrological modeling framework that takes advantage of gridded meteorological forcing, land surface modeling, channel flow modeling, ground observations, and satellite remote sensing. Launched in August 2016, the National Water Model successfully incorporates weather forecasts to predict river flows for more than 2.7 million rivers across the continental United States, operationally turning a "synoptic weather map" into a "synoptic river flow map". In this study, we apply a similar framework that couples a high-resolution global river network database, developed with a hierarchical Dominant River Tracing (DRT) algorithm, and runoff output from the Global Land Data Assimilation System (GLDAS) with a vector-based river routing model (the Routing Application for Parallel Computation of Discharge, RAPID) to produce river flows from 2001 to 2016 using the Message Passing Interface (MPI) on the Texas Advanced Computing Center's Stampede system. In this simulation, global river discharges for more than 177,000 rivers are computed every 30 minutes. The modeling framework's performance is evaluated with various observations, including river flows at more than 400 gauge stations globally. Overall, the model exhibits reasonably good performance in simulating the averaged patterns of terrestrial water storage, evapotranspiration and runoff. The system is appropriate for monitoring and studying floods and droughts. Directions for future research will be outlined and discussed.
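RAPID routes runoff through a river network using a matrix form of the Muskingum method. As a hedged, single-reach illustration of the underlying scheme, not RAPID's parallel matrix implementation, the sketch below applies the classic Muskingum recursion with invented parameter values and a synthetic inflow hydrograph.

```python
def muskingum(inflow, k=2.0, x=0.2, dt=1.0, q0=0.0):
    """Route an inflow hydrograph through one reach with the Muskingum method.
    k: storage constant (h), x: weighting factor, dt: time step (h)."""
    denom = k - k * x + 0.5 * dt
    c1 = (0.5 * dt - k * x) / denom
    c2 = (0.5 * dt + k * x) / denom
    c3 = (k - k * x - 0.5 * dt) / denom       # c1 + c2 + c3 == 1
    out, q_prev_in, q_prev_out = [q0], inflow[0], q0
    for q_in in inflow[1:]:
        q_out = c1 * q_in + c2 * q_prev_in + c3 * q_prev_out
        out.append(q_out)
        q_prev_in, q_prev_out = q_in, q_out
    return out

hydrograph = [0, 10, 40, 60, 40, 20, 10, 5, 0]    # synthetic inflow (m^3/s)
print([round(q, 1) for q in muskingum(hydrograph)])
```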
Towards a Cloud Based Smart Traffic Management Framework
NASA Astrophysics Data System (ADS)
Rahimi, M. M.; Hakimpour, F.
2017-09-01
Traffic big data has brought many opportunities for traffic management applications. However, several challenges, such as the heterogeneity, storage, management, processing and analysis of traffic big data, may hinder its efficient and real-time application. All these challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud computing based framework can answer the technical challenges of efficient and real-time storage, management, processing and analysis of traffic big data. For evaluation of the framework, we have used OpenStreetMap (OSM) real trajectories and road network on a distributed environment. Our evaluation results indicate that the data import speed of this framework exceeds 8000 records per second when the dataset size is near 5 million records. We also evaluate the performance of data retrieval in our proposed framework. The data retrieval speed exceeds 15000 records per second when the dataset size is near 5 million records. We have also evaluated the scalability and performance of our proposed framework using parallelisation of a critical pre-analysis in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.
Novel Models of Visual Topographic Map Alignment in the Superior Colliculus
El-Ghazawi, Tarek A.; Triplett, Jason W.
2016-01-01
The establishment of precise neuronal connectivity during development is critical for sensing the external environment and informing appropriate behavioral responses. In the visual system, many connections are organized topographically, which preserves the spatial order of the visual scene. The superior colliculus (SC) is a midbrain nucleus that integrates visual inputs from the retina and primary visual cortex (V1) to regulate goal-directed eye movements. In the SC, topographically organized inputs from the retina and V1 must be aligned to facilitate integration. Previously, we showed that retinal input instructs the alignment of V1 inputs in the SC in a manner dependent on spontaneous neuronal activity; however, the mechanism of activity-dependent instruction remains unclear. To begin to address this gap, we developed two novel computational models of visual map alignment in the SC that incorporate distinct activity-dependent components. First, a Correlational Model assumes that V1 inputs achieve alignment with established retinal inputs through simple correlative firing mechanisms. A second Integrational Model assumes that V1 inputs contribute to the firing of SC neurons during alignment. Both models accurately replicate in vivo findings in wild type, transgenic and combination mutant mouse models, suggesting either activity-dependent mechanism is plausible. In silico experiments reveal distinct behaviors in response to weakening retinal drive, providing insight into the nature of the system governing map alignment depending on the activity-dependent strategy utilized. Overall, we describe novel computational frameworks of visual map alignment that accurately model many aspects of the in vivo process and propose experiments to test them. PMID:28027309
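The Correlational Model above assumes V1 inputs align to established retinal inputs through correlative (Hebbian-like) firing. The published models are not reproduced here; the sketch below only illustrates the core ingredient, a normalized Hebbian update that strengthens V1-to-SC weights when pre- and post-synaptic activity driven by retinotopically matched input co-occurs. All sizes, learning rates and activity profiles are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_v1, n_sc, eta, steps = 20, 20, 0.05, 2000
W = rng.random((n_sc, n_v1)) * 0.1            # V1 -> SC weights, random start

for _ in range(steps):
    center = rng.integers(n_v1)               # a localized "wave" of activity
    v1 = np.exp(-0.5 * ((np.arange(n_v1) - center) / 2.0) ** 2)
    retina = np.exp(-0.5 * ((np.arange(n_sc) - center) / 2.0) ** 2)
    sc = retina + W @ v1                      # SC driven by retina (plus weak V1 input)
    W += eta * np.outer(sc, v1)               # Hebbian co-activity update
    W /= np.linalg.norm(W, axis=1, keepdims=True)  # normalization keeps weights bounded

# After training, the strongest V1 input to each SC neuron should be roughly topographic.
print(np.argmax(W, axis=1)[:10])
```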
A Hybrid Neuro-Fuzzy Model For Integrating Large Earth-Science Datasets
NASA Astrophysics Data System (ADS)
Porwal, A.; Carranza, J.; Hale, M.
2004-12-01
A GIS-based hybrid neuro-fuzzy approach to integration of large earth-science datasets for mineral prospectivity mapping is described. It implements a Takagi-Sugeno type fuzzy inference system in the framework of a four-layered feed-forward adaptive neural network. Each unique combination of the datasets is considered a feature vector whose components are derived by knowledge-based ordinal encoding of the constituent datasets. A subset of feature vectors with a known output target vector (i.e., unique conditions known to be associated with either a mineralized or a barren location) is used for the training of an adaptive neuro-fuzzy inference system. Training involves iterative adjustment of parameters of the adaptive neuro-fuzzy inference system using a hybrid learning procedure for mapping each training vector to its output target vector with minimum sum of squared error. The trained adaptive neuro-fuzzy inference system is used to process all feature vectors. The output for each feature vector is a value that indicates the extent to which a feature vector belongs to the mineralized class or the barren class. These values are used to generate a prospectivity map. The procedure is demonstrated by an application to regional-scale base metal prospectivity mapping in a study area located in the Aravalli metallogenic province (western India). A comparison of the hybrid neuro-fuzzy approach with pure knowledge-driven fuzzy and pure data-driven neural network approaches indicates that the former offers a superior method for integrating large earth-science datasets for predictive spatial mathematical modelling.
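The prospectivity model above is a Takagi-Sugeno fuzzy inference system trained as an adaptive network. The trained parameters are of course not available from the abstract, so the sketch below only shows how a first-order Takagi-Sugeno system evaluates an input: Gaussian memberships give rule firing strengths, and the output is the firing-strength-weighted average of linear consequents. The two rules and all parameter values are invented.

```python
import numpy as np

def gauss(x, c, s):
    """Gaussian membership function with center c and width s."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def ts_inference(x1, x2, rules):
    """First-order Takagi-Sugeno: each rule = ((c1, c2), (s1, s2), (p, q, r))."""
    weights, outputs = [], []
    for (c1, c2), (s1, s2), (p, q, r) in rules:
        w = gauss(x1, c1, s1) * gauss(x2, c2, s2)    # firing strength (product t-norm)
        weights.append(w)
        outputs.append(p * x1 + q * x2 + r)          # linear consequent
    weights = np.array(weights)
    return float(np.dot(weights, outputs) / weights.sum())

# Two illustrative rules over two evidence layers (all parameters invented).
rules = [((0.2, 0.3), (0.3, 0.3), (0.1, 0.2, 0.0)),   # "low evidence" rule
         ((0.8, 0.9), (0.3, 0.3), (0.6, 0.7, 0.1))]   # "high evidence" rule
print(round(ts_inference(0.7, 0.8, rules), 3))
```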
Key subsurface data help to refine Trinity aquifer hydrostratigraphic units, south-central Texas
Blome, Charles D.; Clark, Allan K.
2014-01-01
The geologic framework and hydrologic characteristics of aquifers are important components for studying the nation's subsurface heterogeneity and predicting its hydraulic budgets. Detailed study of an aquifer's subsurface hydrostratigraphy is needed to understand both its geologic and hydrologic frameworks. Surface hydrostratigraphic mapping can also help characterize the spatial distribution and hydraulic connectivity of an aquifer's permeable zones. Advances in three-dimensional (3-D) mapping and modeling have also enabled geoscientists to visualize the spatial relations between saturated and unsaturated lithologies. This detailed study of two borehole cores, collected in 2001 in the Camp Stanley Storage Activity (CSSA) area, provided the foundation for revising a number of hydrostratigraphic units representing the middle zone of the Trinity aquifer. The CSSA area is a restricted military facility that encompasses approximately 4,000 acres and is located in Boerne, Texas, northwest of the city of San Antonio. Study of both the surface and subsurface geology of the CSSA area is an integral part of a U.S. Geological Survey project funded through the National Cooperative Geologic Mapping Program. This revision of hydrostratigraphic units is being applied to all subsurface data used to construct a proposed 3-D EarthVision model of the CSSA area and areas to the south and west.
Mapping morphological shape as a high-dimensional functional curve
Fu, Guifang; Huang, Mian; Bo, Wenhao; Hao, Han; Wu, Rongling
2018-01-01
Abstract Detecting how genes regulate biological shape has become a multidisciplinary research interest because of its wide application in many disciplines. Despite its fundamental importance, the challenges of accurately extracting information from an image, statistically modeling the high-dimensional shape and meticulously locating shape quantitative trait loci (QTL) affect the progress of this research. In this article, we propose a novel integrated framework that incorporates shape analysis, statistical curve modeling and genetic mapping to detect significant QTLs regulating variation of biological shape traits. After quantifying morphological shape via a radius centroid contour approach, each shape, as a phenotype, was characterized as a high-dimensional curve, varying as angle θ runs clockwise with the first point starting from angle zero. We then modeled the dynamic trajectories of three mean curves and variation patterns as functions of θ. Our framework led to the detection of a few significant QTLs regulating the variation of leaf shape collected from a natural population of poplar, Populus szechuanica var tibetica. This population, distributed at altitudes 2000–4500 m above sea level, is an evolutionarily important plant species. This is the first work in the quantitative genetic shape mapping area that emphasizes a sense of ‘function’ instead of decomposing the shape into a few discrete principal components, as the majority of shape studies do. PMID:28062411
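The radius centroid contour representation can be sketched as follows; the contour handling, interpolation grid, and the toy elliptical "leaf" are illustrative simplifications, not the pipeline used in the study.

```python
import numpy as np

def radius_centroid_contour(xy, n_angles=360):
    """Represent a closed 2-D boundary (n x 2 array of points) as a
    radius-versus-angle curve r(theta) about the shape centroid, so that
    every shape becomes a curve of the same high dimension."""
    centroid = xy.mean(axis=0)
    d = xy - centroid
    theta = np.mod(np.arctan2(d[:, 1], d[:, 0]), 2 * np.pi)
    r = np.hypot(d[:, 0], d[:, 1])
    order = np.argsort(theta)
    grid = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    return grid, np.interp(grid, theta[order], r[order], period=2 * np.pi)

# Example: an ellipse-shaped "leaf" boundary.
t = np.linspace(0, 2 * np.pi, 500, endpoint=False)
boundary = np.c_[3.0 * np.cos(t), 1.5 * np.sin(t)]
grid, r_curve = radius_centroid_contour(boundary)
print(r_curve[:5])
```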
Optimality in mono- and multisensory map formation.
Bürck, Moritz; Friedel, Paul; Sichert, Andreas B; Vossen, Christine; van Hemmen, J Leo
2010-07-01
In the struggle for survival in a complex and dynamic environment, nature has developed a multitude of sophisticated sensory systems. In order to exploit the information provided by these sensory systems, higher vertebrates reconstruct the spatio-temporal environment from each of the sensory systems they have at their disposal. That is, for each modality the animal computes a neuronal representation of the outside world, a monosensory neuronal map. Here we present a universal framework that allows one to calculate the specific layout of the involved neuronal network by means of a general mathematical principle, viz., stochastic optimality. In order to illustrate the use of this theoretical framework, we provide a step-by-step tutorial of how to apply our model. In so doing, we present a spatial and a temporal example of optimal stimulus reconstruction which underline the advantages of our approach. That is, given a known physical signal transmission and rudimentary knowledge of the detection process, our approach allows one to estimate the possible performance and to predict neuronal properties of biological sensory systems. Finally, information from different sensory modalities has to be integrated so as to gain a unified perception of reality for further processing, e.g., for distinct motor commands. We briefly discuss concepts of multimodal interaction and how a multimodal space can evolve by alignment of monosensory maps.
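The optimality principle can be illustrated with a toy reconstruction problem; the Gaussian tuning curves, noise model, and grid-search readout below are assumptions chosen for illustration, not the paper's derivation.

```python
import numpy as np

rng = np.random.default_rng(1)
centers = np.linspace(-90, 90, 19)            # preferred stimulus positions (deg)
sigma_tc, noise_sd, gain = 20.0, 0.5, 5.0

def mean_response(s):
    """Mean responses of the detector array to a stimulus at position s."""
    return gain * np.exp(-0.5 * ((s - centers) / sigma_tc) ** 2)

true_s = 32.0
responses = mean_response(true_s) + noise_sd * rng.normal(size=centers.size)

# Optimal (maximum-likelihood; least squares under Gaussian noise) readout:
# choose the stimulus whose predicted responses best explain the data.
grid = np.linspace(-90, 90, 1801)
sse = [np.sum((responses - mean_response(s)) ** 2) for s in grid]
print("true:", true_s, "estimate:", grid[int(np.argmin(sse))])
```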
The International Association for Danube Research (IAD)-portrait of a transboundary scientific NGO.
Bloesch, Jürg
2009-08-01
The International Association for Danube Research (IAD), a legal association (Verein) according to Austrian law, presently consists of 13 member countries and 12 expert groups covering all water-relevant scientific disciplines. IAD, founded in 1956, represents a traditional and significant stakeholder in the Danube River Basin, fulfilling an important role in the integrative water and river basin management called for by the EU Water Framework Directive. IAD, which spans basic and applied research, adapted its strategy after the major political changes in 1989. IAD fosters transdisciplinary and transboundary projects to support integrative Danube River protection in line with the governmental International Commission for the Protection of the Danube River (ICPDR), in which IAD has had observer status since 1998. Recent scientific outputs of IAD encompass, amongst others, a water quality map of the Danube and major tributaries, the Sturgeon Action Plan, hydromorphological mapping of the Drava, a macrophyte inventory, and a Mures River study. Further information about IAD can be found on our website http://www.iad.gs.
Hung, Sheng-Hui; Wang, Pa-Chun; Lin, Hung-Chun; Chen, Hung-Ying; Su, Chao-Ton
2015-01-01
Specimen handling is a critical patient safety issue. Problematic handling processes, such as misidentification (of patients, surgical sites, or specimen counts), specimen loss, or improper specimen preparation, can lead to serious patient harm and lawsuits. The value stream map (VSM) is a tool used to identify non-value-added work, enhance quality, and reduce the cost of the studied process. Healthcare failure mode and effect analysis (HFMEA), on the other hand, is now frequently employed to avoid possible medication errors in healthcare processes. Both share a goal similar to that of the Six Sigma methodology for process improvement. This study proposes a model that integrates VSM and HFMEA into the define, measure, analyze, improve, and control (DMAIC) framework of Six Sigma. A Six Sigma project for improving the process of surgical specimen handling in a hospital was conducted to demonstrate the effectiveness of the proposed model.
Integrating Machine Learning into a Crowdsourced Model for Earthquake-Induced Damage Assessment
NASA Technical Reports Server (NTRS)
Rebbapragada, Umaa; Oommen, Thomas
2011-01-01
On January 12th, 2010, a catastrophic magnitude 7.0 earthquake devastated the country of Haiti. In the aftermath of an earthquake, it is important to rapidly assess damaged areas in order to mobilize the appropriate resources. The Haiti damage assessment effort introduced a promising model that uses crowdsourcing to map damaged areas in freely available remotely-sensed data. This paper proposes the application of machine learning methods to improve this model. Specifically, we apply work on learning from multiple, imperfect experts to the assessment of volunteer reliability, and propose the use of image segmentation to automate the detection of damaged areas. We wrap both tasks in an active learning framework in order to shift volunteer effort from mapping a full catalog of images to the generation of high-quality training data. We hypothesize that the integration of machine learning into this model improves its reliability, maintains the speed of damage assessment, and allows the model to scale to higher data volumes.
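One simple way to fold volunteer reliability into the damage map, sketched under the assumption that reliability weights are already available (e.g. from agreement with expert-verified tiles); the aggregation rule is illustrative, not the authors' specific method.

```python
import numpy as np

def weighted_damage_vote(labels, reliability):
    """Aggregate binary damage labels (tiles x volunteers; 1 = damaged,
    0 = intact, NaN = not labelled) using per-volunteer reliability weights.
    Returns, per tile, the reliability-weighted fraction of 'damaged' votes."""
    labels = np.asarray(labels, dtype=float)
    w = np.asarray(reliability, dtype=float)
    labelled = ~np.isnan(labels)
    return np.nansum(labels * w, axis=1) / (labelled * w).sum(axis=1)

labels = [[1, 1, 0], [0, np.nan, 0], [1, 0, np.nan]]
reliability = [0.9, 0.4, 0.7]
print(weighted_damage_vote(labels, reliability))
```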
Comparative physical mapping between wheat chromosome arm 2BL and rice chromosome 4.
Lee, Tong Geon; Lee, Yong Jin; Kim, Dae Yeon; Seo, Yong Weon
2010-12-01
Physical maps of chromosomes provide a framework for organizing and integrating diverse genetic information. DNA microarrays are a valuable technique for physical mapping and can also be used to facilitate the discovery of single feature polymorphisms (SFPs). Wheat chromosome arm 2BL was physically mapped onto near-isogenic lines (NILs) using a Wheat Genome Array, with the aid of wheat-rice synteny and mapped wheat EST information. Using high variance probe set (HVP) analysis, 314 HVPs constituting genes present on 2BL were identified. The 314 HVPs were grouped into 3 categories: HVPs that match only rice chromosome 4 (298 HVPs), those that match only wheat ESTs mapped on 2BL (1), and those that match both rice chromosome 4 and wheat ESTs mapped on 2BL (15). All HVPs were converted into gene sets, which represented either unique rice gene models or mapped wheat ESTs that matched the identified HVPs. Comparative physical maps were constructed for 16 wheat gene sets and 271 rice gene sets. Of the 271 rice gene sets, 257 were mapped to the 18-35 Mb region on rice chromosome 4. Based on HVP analysis and sequence similarity between the gene models in the rice chromosomes and mapped wheat ESTs, the outermost rice gene model that limits the translocation breakpoint to orthologous regions was identified.
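High variance probe set (HVP) selection can be sketched as a variance filter across hybridizations; the expression values, top-n cut-off, and probe identifiers below are synthetic and purely illustrative.

```python
import numpy as np

def high_variance_probesets(expr, probeset_ids, top_n=314):
    """Rank probe sets by their variance across samples (e.g. NILs that do or
    do not carry the 2BL segment) and return the most variable ones, which are
    candidates for single feature polymorphisms on the target arm."""
    expr = np.asarray(expr, dtype=float)          # probe sets x samples
    variances = expr.var(axis=1, ddof=1)
    order = np.argsort(variances)[::-1][:top_n]
    return [(probeset_ids[i], round(float(variances[i]), 3)) for i in order]

rng = np.random.default_rng(2)
expr = rng.normal(8.0, 0.2, size=(1000, 6))       # synthetic signal matrix
expr[:5] += rng.normal(0.0, 2.0, size=(5, 6))     # a few genuinely variable probes
ids = [f"probe_{i}" for i in range(1000)]
print(high_variance_probesets(expr, ids, top_n=5))
```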
Kampmann, Martin; Bassik, Michael C.; Weissman, Jonathan S.
2013-01-01
A major challenge of the postgenomic era is to understand how human genes function together in normal and disease states. In microorganisms, high-density genetic interaction (GI) maps are a powerful tool to elucidate gene functions and pathways. We have developed an integrated methodology based on pooled shRNA screening in mammalian cells for genome-wide identification of genes with relevant phenotypes and systematic mapping of all GIs among them. We recently demonstrated the potential of this approach in an application to pathways controlling the susceptibility of human cells to the toxin ricin. Here we present the complete quantitative framework underlying our strategy, including experimental design, derivation of quantitative phenotypes from pooled screens, robust identification of hit genes using ultra-complex shRNA libraries, parallel measurement of tens of thousands of GIs from a single double-shRNA experiment, and construction of GI maps. We describe the general applicability of our strategy. Our pooled approach enables rapid screening of the same shRNA library in different cell lines and under different conditions to determine a range of different phenotypes. We illustrate this strategy here for single- and double-shRNA libraries. We compare the roles of genes for susceptibility to ricin and Shiga toxin in different human cell lines and reveal both toxin-specific and cell line-specific pathways. We also present GI maps based on growth and ricin-resistance phenotypes, and we demonstrate how such a comparative GI mapping strategy enables functional dissection of physical complexes and context-dependent pathways. PMID:23739767
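The core of GI scoring can be sketched as the deviation of the observed double-perturbation phenotype from an additive expectation; the additive null model and the toy gene names below are illustrative assumptions, and the paper's exact scoring may differ.

```python
def gi_scores(single_pheno, double_pheno):
    """Genetic-interaction scores from pooled screen phenotypes.

    single_pheno : dict gene -> quantitative phenotype of the single knockdown
                   (e.g. growth or ricin-resistance)
    double_pheno : dict (geneA, geneB) -> phenotype of the double knockdown
    Score = observed double phenotype minus the additive expectation;
    deviations in either direction flag buffering or synergistic interactions.
    """
    return {
        pair: obs - (single_pheno[pair[0]] + single_pheno[pair[1]])
        for pair, obs in double_pheno.items()
    }

singles = {"geneA": -0.30, "geneB": -0.10, "geneC": 0.05}
doubles = {("geneA", "geneB"): -0.70, ("geneA", "geneC"): -0.25}
print(gi_scores(singles, doubles))
```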
Fishman, L; Willis, J H; Wu, C A; Lee, Y-W
2014-05-01
Changes in chromosome number and structure are important contributors to adaptation, speciation and macroevolution. In flowering plants, polyploidy and subsequent reductions in chromosome number by fusion are major sources of chromosomal evolution, but chromosome number increase by fission has been relatively unexplored. Here, we use comparative linkage mapping with gene-based markers to reconstruct chromosomal synteny within the model flowering plant genus Mimulus (monkeyflowers). Two sections of the genus with haploid numbers ≥ 14 have been inferred to be relatively recent polyploids because they are phylogenetically nested within numerous taxa with low base numbers (n=8-10). We combined multiple data sets to build integrated genetic maps of the M. guttatus species complex (section Simiolus, n=14) and the M. lewisii group (section Erythranthe; n=8), and then aligned the two integrated maps using >100 shared markers. We observed strong segmental synteny between M. lewisii and M. guttatus maps, with essentially 1-to-1 correspondence across each of 16 chromosomal blocks. Assuming that the M. lewisii (and widespread) base number of 8 is ancestral, reconstruction of 14 M. guttatus chromosomes requires at least eight fission events (likely shared by Simiolus and sister section Paradanthus (n=16)), plus two fusion events. This apparent burst of fission in the yellow monkeyflower lineages raises new questions about mechanisms and consequences of chromosomal fission in plants. Our comparative maps also provide insight into the origins of a chromosome exhibiting centromere-associated female meiotic drive and create a framework for transferring M. guttatus genome resources across the entire genus.
Cumulative biological impacts framework for solar energy projects in the California Desert
Davis, Frank W.; Kreitler, Jason R.; Soong, Oliver; Stoms, David M.; Dashiell, Stephanie; Hannah, Lee; Wilkinson, Whitney; Dingman, John
2013-01-01
This project developed analytical approaches, tools and geospatial data to support conservation planning for renewable energy development in the California deserts. Research focused on geographical analysis to avoid, minimize and mitigate the cumulative biological effects of utility-scale solar energy development. A hierarchical logic model was created to map the compatibility of new solar energy projects with current biological conservation values. The research indicated that the extent of compatible areas is much greater than the estimated land area required to achieve 2040 greenhouse gas reduction goals. Species distribution models were produced for 65 animal and plant species that were of potential conservation significance to the Desert Renewable Energy Conservation Plan process. These models mapped historical and projected future habitat suitability using 270 meter resolution climate grids. The results were integrated into analytical frameworks to locate potential sites for offsetting project impacts and evaluating the cumulative effects of multiple solar energy projects. Examples applying these frameworks in the Western Mojave Desert ecoregion show the potential of these publicly-available tools to assist regional planning efforts. Results also highlight the necessity to explicitly consider projected land use change and climate change when prioritizing areas for conservation and mitigation offsets. Project data, software and model results are all available online.
Integrating Entropy-Based Naïve Bayes and GIS for Spatial Evaluation of Flood Hazard.
Liu, Rui; Chen, Yun; Wu, Jianping; Gao, Lei; Barrett, Damian; Xu, Tingbao; Li, Xiaojuan; Li, Linyi; Huang, Chang; Yu, Jia
2017-04-01
Regional flood risk caused by intensive rainfall under extreme climate conditions has increasingly attracted global attention. Mapping and evaluation of flood hazard are vital parts of flood risk assessment. This study develops an integrated framework for estimating the spatial likelihood of flood hazard by coupling weighted naïve Bayes (WNB), geographic information system, and remote sensing. The north part of the Fitzroy River Basin in Queensland, Australia, was selected as a case study site. The environmental indices, including extreme rainfall, evapotranspiration, net-water index, soil water retention, elevation, slope, drainage proximity, and density, were generated from spatial data representing climate, soil, vegetation, hydrology, and topography. These indices were weighted using the statistics-based entropy method. The weighted indices were input into the WNB-based model to delineate a regional flood risk map that indicates the likelihood of flood occurrence. The resultant map was validated by the maximum inundation extent extracted from moderate resolution imaging spectroradiometer (MODIS) imagery. The evaluation results, including mapping and evaluation of the distribution of flood hazard, are helpful in guiding flood inundation disaster responses for the region. The novel approach presented consists of weighted grid data, image-based sampling and validation, cell-by-cell probability inference, and spatial mapping. It is superior to an existing spatial naïve Bayes (NB) method for regional flood hazard assessment. It can also be extended to other likelihood-related environmental hazard studies. © 2016 Society for Risk Analysis.
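The entropy weighting step can be sketched as follows; the min-max normalization and the synthetic indices matrix are illustrative assumptions, not the study's exact preprocessing.

```python
import numpy as np

def entropy_weights(X):
    """Entropy-based objective weights for an index matrix X (cells x indices).
    Indices whose values are spread more evenly across cells carry less
    information (higher entropy) and therefore receive lower weights."""
    X = np.asarray(X, dtype=float)
    span = X.max(axis=0) - X.min(axis=0)
    P = (X - X.min(axis=0)) / (span + 1e-12)      # min-max normalize to [0, 1]
    P = (P + 1e-12) / (P + 1e-12).sum(axis=0)     # column-wise proportions
    entropy = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
    redundancy = 1.0 - entropy
    return redundancy / redundancy.sum()

rng = np.random.default_rng(3)
indices = rng.random((500, 8))   # e.g. rainfall, slope, elevation, drainage, ...
w = entropy_weights(indices)
print(np.round(w, 3), float(w.sum()))
```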
Hammer, Monica; Balfors, Berit; Mörtberg, Ulla; Petersson, Mona; Quin, Andrew
2011-03-01
In this article, focusing on the ongoing implementation of the EU Water Framework Directive, we analyze some of the opportunities and challenges for a sustainable governance of water resources from an ecosystem management perspective. In the face of uncertainty and change, the ecosystem approach as a holistic and integrated management framework is increasingly recognized. The ongoing implementation of the Water Framework Directive (WFD) could be viewed as a reorganization phase in the process of change in institutional arrangements and ecosystems. In this case study from the Northern Baltic Sea River Basin District, Sweden, we focus in particular on data and information management from a multi-level governance perspective from the local stakeholder to the River Basin level. We apply a document analysis, hydrological mapping, and GIS models to analyze some of the institutional framework created for the implementation of the WFD. The study underlines the importance of institutional arrangements that can handle variability of local situations and trade-offs between solutions and priorities on different hierarchical levels.
2011-01-01
Background A robust bacterial artificial chromosome (BAC)-based physical map is essential for many aspects of genomics research, including an understanding of chromosome evolution, high-resolution genome mapping, marker-assisted breeding, positional cloning of genes, and quantitative trait analysis. To facilitate turkey genetics research and better understand avian genome evolution, a BAC-based integrated physical, genetic, and comparative map was developed for this important agricultural species. Results The turkey genome physical map was constructed based on 74,013 BAC fingerprints (11.9 × coverage) from two independent libraries, and it was integrated with the turkey genetic map and chicken genome sequence using over 41,400 BAC assignments identified by 3,499 overgo hybridization probes along with > 43,000 BAC end sequences. The physical-comparative map consists of 74 BAC contigs, with an average contig size of 13.6 Mb. All but four of the turkey chromosomes were spanned on this map by three or fewer contigs, with 14 chromosomes spanned by a single contig and nine chromosomes spanned by two contigs. This map predicts 20 to 27 major rearrangements distinguishing turkey and chicken chromosomes, despite up to 40 million years of separate evolution between the two species. These data elucidate the chromosomal evolutionary pattern within the Phasianidae that led to the modern turkey and chicken karyotypes. The predominant rearrangement mode involves intra-chromosomal inversions, and there is a clear bias for these to result in centromere locations at or near telomeres in turkey chromosomes, in comparison to interstitial centromeres in the orthologous chicken chromosomes. Conclusion The BAC-based turkey-chicken comparative map provides novel insights into the evolution of avian genomes, a framework for assembly of turkey whole genome shotgun sequencing data, and tools for enhanced genetic improvement of these important agricultural and model species. PMID:21906286
People with Disability in Vocational High Schools: between School and Work
NASA Astrophysics Data System (ADS)
Haryanti, R. H.
2018-02-01
Vocational education is positioned within the framework of Vocational Education for All. The alignment between the world of education and the world of work is therefore a perennial issue in vocational education, including for people with disabilities. This article aims to map how the state frames disability and vocational education issues within the framework of public policy. The research used a qualitative method in which data were obtained from a study of documentation and analyzed using content analysis. The results show that state policy has not yet fully framed the issue of vocational education for people with disabilities in dedicated policies; vocational education policy for people with disabilities is still embedded in the broader policies of certain institutions, and no policy innovations have yet provided a significant, dedicated place for them.
A social-cognitive framework of multidisciplinary team innovation.
Paletz, Susannah B F; Schunn, Christian D
2010-01-01
The psychology of science typically lacks integration between cognitive and social variables. We present a new framework of team innovation in multidisciplinary science and engineering groups that ties factors from both literatures together. We focus on the effects of a particularly challenging social factor, knowledge diversity, which has a history of mixed effects on creativity, most likely because those effects are mediated and moderated by cognitive and additional social variables. In addition, we highlight the distinction between team innovative processes that are primarily divergent versus convergent; we propose that the social and cognitive implications are different for each, providing a possible explanation for knowledge diversity's mixed results on team outcomes. Social variables mapped out include formal roles, communication norms, sufficient participation and information sharing, and task conflict; cognitive variables include analogy, information search, and evaluation. This framework provides a roadmap for research that aims to harness the power of multidisciplinary teams. Copyright © 2009 Cognitive Science Society, Inc.
Forestry timber typing. Tanana demonstration project, Alaska ASVT. [Alaska
NASA Technical Reports Server (NTRS)
Morrissey, L. A.; Ambrosia, V. G.
1982-01-01
The feasibility of using LANDSAT digital data in conjunction with topographic data to delineate commercial forests by stand size and crown closure in the Tanana River basin of Alaska was tested. A modified clustering approach using two LANDSAT dates generated an initial forest type classification, which was then refined with topographic data. To further demonstrate the utility of remotely sensed data within a fire protection planning framework, the timber type data were subsequently integrated with terrain information to generate a fire hazard map of the study area. This map provides valuable assistance in initial attack planning, determining equipment accessibility, and fire growth modeling. The resulting data sets were incorporated into the Alaska Department of Natural Resources geographic information system for subsequent utilization.
Lee, Minhyun; Koo, Choongwan; Hong, Taehoon; Park, Hyo Seon
2014-04-15
For an effective photovoltaic (PV) system, it is necessary to accurately determine the monthly average daily solar radiation (MADSR) and to develop an accurate MADSR map, which can simplify the decision-making process for selecting a suitable location for PV system installation. Therefore, this study aimed to develop a framework for mapping the MADSR using advanced case-based reasoning (CBR) and a geostatistical technique. The proposed framework consists of the following procedures: (i) the geographic scope for the mapping of the MADSR is set, and the measured MADSR and meteorological data in the geographic scope are collected; (ii) using the collected data, the advanced CBR model is developed; (iii) using the advanced CBR model, the MADSR at unmeasured locations is estimated; and (iv) by applying the measured and estimated MADSR data to the geographic information system, the MADSR map is developed. A practical validation was conducted by applying the proposed framework to South Korea. It was determined that the MADSR map developed through the proposed framework offers improved accuracy. The developed MADSR map can be used for estimating the MADSR at unmeasured locations and for determining the optimal location for PV system installation.
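The CBR estimation step can be reduced to a nearest-case retrieval with similarity-weighted averaging; the attributes, weighting scheme, and MADSR values below are illustrative assumptions, not the study's advanced CBR model.

```python
import numpy as np

def cbr_madsr(query_attrs, case_attrs, case_madsr, k=3):
    """Estimate monthly average daily solar radiation (MADSR) at an unmeasured
    location by retrieving the k most similar measured cases (here described
    by latitude, sunshine hours, and mean temperature) and averaging their
    MADSR weighted by similarity."""
    case_attrs = np.asarray(case_attrs, dtype=float)
    query = np.asarray(query_attrs, dtype=float)
    mu, sd = case_attrs.mean(axis=0), case_attrs.std(axis=0) + 1e-12
    d = np.linalg.norm((case_attrs - mu) / sd - (query - mu) / sd, axis=1)
    nearest = np.argsort(d)[:k]
    sim = 1.0 / (d[nearest] + 1e-6)
    return float(np.average(np.asarray(case_madsr)[nearest], weights=sim))

# Cases: (latitude, monthly sunshine hours, mean temperature); MADSR in kWh/m^2/day.
cases = [(37.6, 180, 12.1), (35.2, 205, 14.0), (36.4, 190, 13.2), (33.5, 220, 15.5)]
madsr = [3.9, 4.4, 4.1, 4.7]                      # illustrative values
print(cbr_madsr((36.0, 195, 13.5), cases, madsr))
```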
Conceptual framework for drought phenotyping during molecular breeding.
Salekdeh, Ghasem Hosseini; Reynolds, Matthew; Bennett, John; Boyer, John
2009-09-01
Drought is a major threat to agricultural production and drought tolerance is a prime target for molecular approaches to crop improvement. To achieve meaningful results, these approaches must be linked with suitable phenotyping protocols at all stages, such as the screening of germplasm collections, mutant libraries, mapping populations, transgenic lines and breeding materials and the design of OMICS and quantitative trait loci (QTLs) experiments. Here we present a conceptual framework for molecular breeding for drought tolerance based on the Passioura equation of expressing yield as the product of water use (WU), water use efficiency (WUE) and harvest index (HI). We identify phenotyping protocols that address each of these factors, describe their key features and illustrate their integration with different molecular approaches.
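The Passioura decomposition that anchors the framework can be written out explicitly; the rendering below simply restates the identity referred to in the abstract.

```latex
% Yield expressed as the product of water use, water-use efficiency,
% and harvest index (Passioura framework):
\[
  Y \;=\; \mathrm{WU} \times \mathrm{WUE} \times \mathrm{HI}
\]
% WU  : total water transpired by the crop
% WUE : biomass produced per unit of water used
% HI  : fraction of biomass partitioned to grain
```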
Towards a framework for geospatial tangible user interfaces in collaborative urban planning
NASA Astrophysics Data System (ADS)
Maquil, Valérie; Leopold, Ulrich; De Sousa, Luís Moreira; Schwartz, Lou; Tobias, Eric
2018-04-01
The increasing complexity of urban planning projects today requires new approaches to better integrate stakeholders with different professional backgrounds throughout a city. Traditional tools used in urban planning are designed for experts and offer little opportunity for participation and collaborative design. This paper introduces the concept of geospatial tangible user interfaces (GTUI) and reports on the design and implementation as well as the usability of such a GTUI to support stakeholder participation in collaborative urban planning. The proposed system uses physical objects to interact with large digital maps and geospatial data projected onto a tabletop. It is implemented using a PostGIS database, a web map server providing OGC web services, the computer vision framework reacTIVision, a Java-based TUIO client, and GeoTools. We describe how a GTUI has been instantiated and evaluated within the scope of two case studies related to real-world collaborative urban planning scenarios. Our results confirm the feasibility of our proposed GTUI solutions to (a) instantiate different urban planning scenarios, (b) support collaboration, and (c) ensure acceptable usability.
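The central geospatial step of such a GTUI, converting a tracked tangible's normalized tabletop position (as reported by a TUIO tracker such as reacTIVision) into projected map coordinates, can be sketched as a pure function; the coordinate conventions and the example extent are assumptions, and no actual TUIO client API is shown.

```python
def table_to_map(x_norm, y_norm, map_extent):
    """Convert normalized tabletop coordinates (0-1, origin at the top-left,
    as commonly reported for tracked fiducial objects) into the projected map
    coordinates of the extent currently displayed on the tabletop.

    map_extent = (min_x, min_y, max_x, max_y) in the map projection,
    e.g. metres in the national grid served by the web map server."""
    min_x, min_y, max_x, max_y = map_extent
    map_x = min_x + x_norm * (max_x - min_x)
    map_y = max_y - y_norm * (max_y - min_y)   # flip: screen y grows downward
    return map_x, map_y

# A tangible placed near the centre of a (hypothetical) 4 km x 4 km city extent.
print(table_to_map(0.52, 0.48, (48000.0, 95000.0, 52000.0, 99000.0)))
```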
Restoring 2D content from distorted documents.
Brown, Michael S; Sun, Mingxuan; Yang, Ruigang; Yun, Lin; Seales, W Brent
2007-11-01
This paper presents a framework to restore the 2D content printed on documents in the presence of geometric distortion and non-uniform illumination. Compared with text-based document imaging approaches that correct distortion to a level necessary to obtain sufficiently readable text or to facilitate optical character recognition (OCR), our work targets non-textual documents where the original printed content is desired. To achieve this goal, our framework acquires a 3D scan of the document's surface together with a high-resolution image. Conformal mapping is used to rectify geometric distortion by mapping the 3D surface back to a plane while minimizing angular distortion. This conformal "deskewing" assumes no parametric model of the document's surface and is suitable for arbitrary distortions. Illumination correction is performed by using the 3D shape to distinguish content gradient edges from illumination gradient edges in the high-resolution image. Integration is performed using only the content edges to obtain a reflectance image with significantly fewer illumination artifacts. This approach makes no assumptions about light sources and their positions. The results from the geometric and photometric correction are combined to produce the final output.
Burt, Kate Gardner; Koch, Pamela; Contento, Isobel
2017-10-01
Researchers have established the benefits of school gardens for students' academic achievement, dietary outcomes, physical activity, and psychosocial skills, yet limited research has been conducted about how school gardens become institutionalized and sustained. Our aim was to develop a tool that captures how gardens are effectively established, integrated, and sustained in schools. We conducted a sequential, exploratory, mixed-methods study. Participants were identified with the help of Grow To Learn, the organization coordinating the New York City school garden initiative, and recruited via e-mail. A stratified, purposeful sample of 21 New York City elementary and middle schools participated in this study throughout the 2013/2014 school year. The sample was stratified by garden budget and purposeful in that each school's garden was determined to be well integrated and sustained. The processes and strategies used by school gardeners to establish well-integrated school gardens were assessed via data collected from surveys, interviews, observations, and concept mapping. Descriptive statistics as well as multidimensional scaling and hierarchical cluster analysis were used to examine the survey and concept mapping data. Qualitative data analysis consisted of thematic coding, pattern matching, explanation building and cross-case synthesis. Nineteen components within four domains of school garden integration were found through the mixed-methods concept mapping analysis. When the analyses of other data were combined, relationships between domains and components emerged. These data resulted in the development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool. When schools with integrated and sustained gardens were studied, patterns emerged about how gardeners achieve institutionalization through different combinations of critical components. These patterns are best described by the GREEN Tool, the first framework to identify how to operationalize school gardening components and describe an evidence-based strategy of successful school garden integration. Copyright © 2017 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
Omics Data Complementarity Underlines Functional Cross-Communication in Yeast.
Malod-Dognin, Noël; Pržulj, Nataša
2017-06-10
Mapping the complete functional layout of a cell and understanding the cross-talk between different processes are fundamental challenges. They elude us because of the incompleteness and noisiness of molecular data and because of the computational intractability of finding the exact answer. We perform a simple integration of three types of baker's yeast omics data to elucidate the functional organization and lines of cross-functional communication. We examine protein-protein interaction (PPI), co-expression (COEX) and genetic interaction (GI) data, and explore their relationship with the gold standard of functional organization, the Gene Ontology (GO). We utilize a simple framework that identifies functional cross-communication lines in each of the three data types, in GO, and collectively in the integrated model of the three omics data types; we present each of them in our new Functional Organization Map (FOM) model. We compare the FOMs of the three omics datasets with the FOM of GO and find that GI is in best agreement with GO, followed by COEX and PPI. We integrate the three FOMs into a unified FOM and find that it is in better agreement with the FOM of GO than those of any omics dataset alone, demonstrating the functional complementarity of different omics data.
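Agreement between two Functional Organization Maps can be summarized, for instance, as the overlap of their cross-communication lines; the Jaccard statistic and the toy module names below are illustrative and not necessarily the comparison used in the paper.

```python
def fom_edge_agreement(fom_a, fom_b):
    """Jaccard overlap between the cross-communication lines (edges between
    functional modules) of two Functional Organization Maps, each given as a
    set of frozenset({module1, module2}) pairs."""
    a, b = set(fom_a), set(fom_b)
    return len(a & b) / len(a | b) if (a | b) else 1.0

gi_fom = {frozenset({"translation", "ribosome biogenesis"}),
          frozenset({"secretion", "glycosylation"})}
coex_fom = {frozenset({"translation", "ribosome biogenesis"}),
            frozenset({"mitochondrion", "translation"})}
print(fom_edge_agreement(gi_fom, coex_fom))   # 1 shared edge of 3 distinct edges
```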
Colom, Roberto; Solomon, Jeffrey; Krueger, Frank; Forbes, Chad; Grafman, Jordan
2012-01-01
Although cognitive neuroscience has made remarkable progress in understanding the involvement of the prefrontal cortex in executive control, the broader functional networks that support high-level cognition and give rise to general intelligence remain to be well characterized. Here, we investigated the neural substrates of the general factor of intelligence (g) and executive function in 182 patients with focal brain damage using voxel-based lesion–symptom mapping. The Wechsler Adult Intelligence Scale and Delis–Kaplan Executive Function System were used to derive measures of g and executive function, respectively. Impaired performance on these measures was associated with damage to a distributed network of left lateralized brain areas, including regions of frontal and parietal cortex and white matter association tracts, which bind these areas into a coordinated system. The observed findings support an integrative framework for understanding the architecture of general intelligence and executive function, supporting their reliance upon a shared fronto-parietal network for the integration and control of cognitive representations and making specific recommendations for the application of the Wechsler Adult Intelligence Scale and Delis–Kaplan Executive Function System to the study of high-level cognition in health and disease. PMID:22396393
Common Data Servers as a Foundation for Specialized Services
NASA Astrophysics Data System (ADS)
Burger, E. F.; Schweitzer, R.; O'Brien, K.; Manke, A. B.; Smith, K. M.
2017-12-01
NOAA's Pacific Marine Environmental Laboratory (PMEL) hosts a broad range of research efforts that span many scientific and environmental research disciplines. Many of these research projects have their own data streams that are as diverse as the research. Data are collected using various platforms, including innovative new platforms such as Saildrones and autonomous profilers. With its requirements for public access to federally funded research results and data, the 2013 White House Office of Science and Technology Policy memo on Public Access to Research Results (PARR) changed the data landscape for Federal agencies. In 2015, with support from the PMEL Director, the PMEL Science Data Integration Group (SDIG) initiated a multi-year effort to formulate and implement an integrated data-management strategy for PMEL research efforts. The PMEL integrated data management strategy will provide data access, visualization and some archive services for PMEL data, using existing and proven frameworks for this capability. In addition to these foundational data services, these data access and visualization frameworks are also leveraged to provide enhanced services to scientists. One enhanced service developed is a data management "dashboard". This application provides scientists with a snapshot of their data assets, access to these data, a map view of data locations, and information on the archival status. Ideally, information on the dashboard continually updates to accurately reflect the project's data asset status. This poster explains how frameworks such as ERDDAP and LAS were used as a foundation for the development of custom services and describes the PMEL data management dashboard functionality. We will also highlight accomplishments of the implementation of the PMEL integrated data management strategy.
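ERDDAP's value as a foundation comes largely from its URL-based data access (tabledap/griddap endpoints that return subsets in the format implied by the file extension). The sketch below only assembles such a request URL; the server address, dataset ID, and variable names are placeholders, not actual PMEL datasets, and operators and values would normally be percent-encoded before issuing the request.

```python
# Hypothetical ERDDAP tabledap request: variables plus time constraints,
# returned as CSV because of the ".csv" extension.
server = "https://example-erddap.noaa.gov/erddap/tabledap"
dataset_id = "example_dataset_id"
variables = "time,latitude,longitude,sea_water_temperature"
constraints = ["&time>=2017-01-01T00:00:00Z", "&time<=2017-01-31T23:59:59Z"]

url = f"{server}/{dataset_id}.csv?{variables}" + "".join(constraints)
print(url)
# The URL can then be fetched with any HTTP client (urllib, requests, curl)
# after percent-encoding the constraint operators.
```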
NASA Astrophysics Data System (ADS)
Boldrini, E.; Brumana, R.; Previtali, M., Jr.; Mazzetti, P., Sr.; Cuca, B., Sr.; Barazzetti, L., Sr.; Camagni, R.; Santoro, M.
2016-12-01
The Built Environment (BE) is understood here as the sum of natural processes and human activities in dynamic transformation in the past, the present and the future. It calls for more informed decisions to face challenging threats (climate change, natural hazards, anthropic pressures) by building resilience and sustainable intervention, and to seize societal opportunities such as heritage valorization and tourism; it therefore also calls for awareness raising within a circular, reflective society. In the framework of the ENERGIC OD project (EU Network for Redistributing Geographic Information - Open Data), this paper describes the implementation of an application (the GeoPAN Atl@s app) designed to support a circular, multi-temporal generation of knowledge, able to integrate historic and current maps as well as products of satellite image processing in a diachronic approach, so that ongoing and upcoming phenomena can be understood and related to those that occurred in the recent and more distant past. The app focuses on riverbeds in the BE and on knowledge generation for the detection of their changes, involving the geologist community as producers and providing the retrieved information to other users (architects and urban planners, tourists and citizens). We describe the implementation of the app interfaced with the ENERGIC OD Virtual Hub component, based on a brokering framework for open-data discovery and access, which assures the interoperability and integration of different datasets: widespread cartographic products of varying granularity (national and regional environmental risk maps, e.g. PAI), on-site local data (e.g. UAV data), and results of Copernicus Programme satellite data processing (e.g. object-based and time-series image analysis for riverbed monitoring using Sentinel2). These sources differ in origin, scale and format, and include historical maps requiring metadata generation as well as the SHP data used by geologists in their daily hydrogeological analyses, all to be made usable as open data through the Virtual Hub. "The research leading to these results has received funding from the European Union ICT Policy Support Programme (ICT PSP) under the Competitiveness and Innovation Framework Programme (CIP), grant agreement n° 620400."
NASA Astrophysics Data System (ADS)
Yeo, I. Y.; Lang, M.; Lee, S.; Huang, C.; Jin, H.; McCarty, G.; Sadeghi, A.
2017-12-01
Wetland ecosystems play crucial roles in improving hydrological function and ecological integrity for downstream waters and the surrounding landscape. However, the changing behaviour and functioning of wetland ecosystems are poorly understood and extremely difficult to characterize. Improved understanding of the hydrological behaviour of wetlands, considering their interaction with surrounding landscapes and their impacts on downstream waters, is an essential first step toward closing the knowledge gap. We present an integrated wetland-catchment modelling study that capitalizes on recently developed inundation maps and other geospatial data. The aim of the data-model integration is to improve spatial prediction of wetland inundation and evaluate cumulative hydrological benefits at the catchment scale. In this paper, we highlight problems arising from data preparation, parameterization, and process representation in simulating wetlands within a distributed catchment model, and report recent progress on mapping wetland dynamics (i.e., inundation) using multiple remotely sensed data. We demonstrate the value of spatially explicit inundation information for developing site-specific wetland parameters and for evaluating model predictions at multiple spatial and temporal scales. This spatial data-model integrated framework is tested using the Soil and Water Assessment Tool (SWAT) with an improved wetland extension, and applied to an agricultural watershed in the Mid-Atlantic Coastal Plain, USA. This study illustrates the necessity of spatially distributed information and a data-integrated modelling approach for predicting wetland inundation and hydrologic function at the local landscape scale, where monitoring and conservation decision making take place.
Integrative Medicine in Preventive Medicine Education
Jani, Asim A.; Trask, Jennifer; Ali, Ather
2016-01-01
During 2012, the USDHHS’s Health Resources and Services Administration funded 12 accredited preventive medicine residencies to incorporate an evidence-based integrative medicine curriculum into their training programs. It also funded a national coordinating center at the American College of Preventive Medicine, known as the Integrative Medicine in Preventive Medicine Education (IMPriME) Center, to provide technical assistance to the 12 grantees. To help with this task, the IMPriME Center established a multidisciplinary steering committee, versed in integrative medicine, whose primary aim was to develop integrative medicine core competencies for incorporation into preventive medicine graduate medical education training. The competency development process was informed by central integrative medicine definitions and principles, preventive medicine’s dual role in clinical and population-based prevention, and the burgeoning evidence base of integrative medicine. The steering committee considered an interdisciplinary integrative medicine contextual framework guided by several themes related to workforce development and population health. A list of nine competencies, mapped to the six general domains of competence approved by the Accreditation Council of Graduate Medical Education, was operationalized through an iterative exercise with the 12 grantees in a process that included mapping each site’s competency and curriculum products to the core competencies. The competencies, along with central curricular components informed by grantees’ work presented elsewhere in this supplement, are outlined as a roadmap for residency programs aiming to incorporate integrative medicine content into their curricula. This set of competencies adds to the larger efforts of the IMPriME initiative to facilitate and enhance further curriculum development and implementation by not only the current grantees but other stakeholders in graduate medical education around integrative medicine training. PMID:26477897
NASA Astrophysics Data System (ADS)
Pascual-Aguilar, J. A.; Rubio, J. L.; Domínguez, J.; Andreu, V.
2012-04-01
New information technologies make it possible to disseminate spatial information widely, from continental to local scales, by means of Spatial Data Infrastructures. Growing administrative awareness of the need for open-access information services has also given citizens access to this spatial information through legal instruments such as the INSPIRE Directive of the European Union, transposed into national law as in the case of Spain. Translating the general criteria of generic Spatial Data Infrastructures (SDI) into thematic ones is a crucial step if these instruments are to become broad tools for the dissemination of information. In that case, the intrinsic criteria of digital information, such as harmonization and the disclosure of metadata, must be complemented by the specific characteristics of environmental information and of the techniques used to obtain it. For soil inventories and maps, existing information obtained by traditional means, prior to digital technologies, is considered a valid, and indeed unique, source for the development of a thematic SDI. In this work, we evaluate the existing and accessible information that would form the basis for building a thematic SDI of soils in Spain, an information framework with features common to other European Union states. From a set of more than 1,500 publications covering the national territory of Spain, the study was carried out on the 94 documents found for five autonomous regions of the northern Iberian Peninsula (Asturias, Cantabria, Basque Country, Navarra and La Rioja). The analysis was performed using criteria relevant to soil mapping and inventories. The results show wide variation in almost all criteria: geographic representation (projections, scales) and geo-referencing of profile locations, map location of profiles integrated with edaphic units, soil description and taxonomic classification systems (FAO, Soil Taxonomy, etc.), the number and type of soil analysis parameters, and the dates of the inventories. In conclusion, the construction of a thematic SDI on soils should include, prior to the integration of all maps and inventories, a series of harmonization processes that allow spatial continuity between existing information sources as well as temporal identification of the inventories and maps. This requires the development of at least two types of integration tools: (1) tools enabling spatial continuity, without contradictions, between maps made at different times and with different criteria, and (2) information systems for metadata that document the characteristics of the information and its possible connections with other sources within the Spatial Data Infrastructure. Acknowledgements This research has been financed by the European Union within the framework of the GS Soil project (eContentplus Programme ECP-2008-GEO-318004).
A hierarchical framework of aquatic ecological units in North America (Nearctic Zone).
James R. Maxwell; Clayton J. Edwards; Mark E. Jensen; Steven J. Paustian; Harry Parrott; Donley M. Hill
1995-01-01
Proposes a framework for classifying and mapping aquatic systems at various scales using ecologically significant physical and biological criteria. Classification and mapping concepts follow tenets of hierarchical theory, pattern recognition, and driving variables. Criteria are provided for the hierarchical classification and mapping of aquatic ecological units of...
Forest Resource Information System (FRIS)
NASA Technical Reports Server (NTRS)
1983-01-01
The technological and economic feasibility of using multispectral digital image data acquired from the LANDSAT satellites in an ongoing operational forest information system was evaluated. Computer-compatible multispectral scanner data from the LANDSAT satellites were demonstrated to be a significant contributor to ongoing information systems by providing the added dimensions of synoptic and repeat coverage of the Earth's surface. The major forest cover types of conifer, deciduous, mixed conifer-deciduous, and non-forest were classified well within the bounds of the statistical accuracy of the ground sample. Further, when overlaid with existing maps, the cover-type acreage retained a high level of positional integrity. Maps were digitized by a graphics design system, then overlaid and registered onto LANDSAT imagery such that the map data with associated attributes were displayed on the image. Once classified, the analysis results were converted back to map form as cover-type information. Existing tabular information, as represented by inventory, is registered geographically to the map base through a vendor-provided data management system. The notion of a geographical reference base (map) providing the framework to which imagery and tabular data bases are registered, and where each of the three functions of imagery, maps and inventory can be accessed singly or in combination, is the very essence of the forest resource information system design.
Geosites and geoheritage representations - a cartographic approach
NASA Astrophysics Data System (ADS)
Rocha, Joao; Brilha, José
2016-04-01
In recent years, the increasing awareness of the importance of nature conservation, particularly towards the protection, conservation and promotion of geological sites, has resulted in a wide range of scientific studies. Most geodiversity studies, geoconservation strategies, geosite inventories and geoheritage assessment projects will, at some stage, use a cartographic representation (a map) of the most relevant geological and geomorphological features within the area of analysis. A wide range of geosite maps and geological heritage maps have been produced but, so far, a widely accepted conceptual cartographic framework with a specific symbology for cartographic representation has not been created. In this work we discuss this lack of a systematic conceptual framework and argue for a widely accepted symbology to be used in maps dedicated to geoheritage and geosites. We propose a cartographic approach aimed at the conceptualization and definition of a nomenclature and symbology system to be used on both geosite and geoheritage maps. We define a symbology framework for geosite and geoheritage mapping addressed to the general public and to secondary school students, to be used as geotouristic and didactic tools, respectively. Three different approaches were developed to support the definition of the symbology framework: i) symbols to correlate geosites with the geological time scale; ii) symbols related to each one of the 27 geological frameworks defined in the Portuguese geoheritage inventory; iii) symbols to represent groups of geosites that share common geological and geomorphological features. The use of these different symbols in a map allows a quick understanding of a set of relevant information, in addition to the usual geographical distribution of geosites in a certain area.
NASA Astrophysics Data System (ADS)
Leonard, Graham S.; Stewart, Carol; Wilson, Thomas M.; Procter, Jonathan N.; Scott, Bradley J.; Keys, Harry J.; Jolly, Gill E.; Wardman, Johnny B.; Cronin, Shane J.; McBride, Sara K.
2014-10-01
New Zealand's Tongariro National Park volcanoes produce hazardous eruptions every few years to decades. On 6 August 2012 the Te Maari vent of Tongariro Volcano erupted, producing a series of explosions and a fine ash of minor volume which was dispersed rapidly to the east. This manuscript presents a summary of the eruption impacts and the way these supported science communication during the crisis, particularly in terms of hazard map development. The most significant proximal impact was damage from pyroclastic surges and ballistics to the popular and economically-important Tongariro Alpine Crossing track. The only hazard to affect the medial impact zone was a few mms of ashfall with minor impacts. Field testing indicated that the Te Maari ash had extremely low resistivity when wetted, implying a very high potential to cause disruption to nationally-important power transmission networks via the mechanism of insulator flashover. This was not observed, presumably due to insufficient ash accumulation on insulators. Virtually no impacts from distal ashfall were reported. Post-event analysis of PM10 data demonstrates the additional value of regional air quality monitoring networks in quantifying population exposure to airborne respirable ash. While the eruption was minor, it generated a high level of public interest and a demand for information on volcanic hazards and impacts from emergency managers, the public, critical infrastructure managers, health officials, and the agriculture sector. Meeting this demand fully taxed available resources. We present here aspects of the New Zealand experience which may have wider applicability in moving towards improved integration of hazard impact information, mapping, and communication. These include wide use of a wiki technical clearinghouse and email listservs, a focus on multi-agency consistent messages, and a recently developed environment of collaboration and alignment of both research funding and technical science advice. Hazard maps were integral to science communication during the crisis, but there is limited international best practice information available on hazard maps as communication devices, as most volcanic hazard mapping literature is concerned with defining hazard zones. We propose that hazard maps are only as good as the communications framework and inter-agency relationships in which they are embedded, and we document in detail the crisis hazard map development process. We distinguish crisis hazard maps from background hazard maps and ashfall prediction maps, illustrating the complementary nature of these three distinct communication mechanisms. We highlight issues that arose and implications for the development of future maps.
A genetic linkage map of grape, utilizing Vitis rupestris and Vitis arizonica.
Doucleff, M; Jin, Y; Gao, F; Riaz, S; Krivanek, A F; Walker, M A
2004-10-01
A genetic linkage map of grape was constructed, utilizing 116 progeny derived from a cross of two Vitis rupestris x V. arizonica interspecific hybrids, using the pseudo-testcross strategy. A total of 475 DNA markers (410 amplified fragment length polymorphism, 24 inter-simple sequence repeat, 32 random amplified polymorphic DNA, and nine simple sequence repeat markers) were used to construct the parental maps. Markers segregating 1:1 were used to construct parental framework maps with confidence levels >90% with the Plant Genome Research Initiative mapping program. In the maternal (D8909-15) map, 105 framework markers and 55 accessory markers were ordered in 17 linkage groups (756 cM). The paternal (F8909-17) map had 111 framework markers and 33 accessory markers ordered in 19 linkage groups (1,082 cM). One hundred eighty-one markers segregating 3:1 were used to connect the two parental maps. This moderately dense map will be useful for the initial mapping of genes and/or QTL for resistance to the dagger nematode, Xiphinema index, and Xylella fastidiosa, the bacterial causal agent of Pierce's disease.
The Geographic Climate Information System Project (GEOCLIMA): Overview and preliminary results
NASA Astrophysics Data System (ADS)
Feidas, H.; Zanis, P.; Melas, D.; Vaitis, M.; Anadranistakis, E.; Symeonidis, P.; Pantelopoulos, S.
2012-04-01
The project GEOCLIMA aims at developing an integrated Geographic Information System (GIS) allowing the user to manage, analyze and visualize the information which is directly or indirectly related to climate and its future projections in Greece. The main components of the project are: a) collection and homogenization of climate and environmental related information, b) estimation of future climate change based on existing regional climate model (RCM) simulations as well as a supplementary high resolution (10 km x 10 km) simulation over the period 1961-2100 using RegCM3, c) compilation of an integrated uniform geographic database, and d) mapping of climate data, creation of digital thematic maps, and development of the integrated web GIS application. This paper provides an overview of the ongoing research efforts and preliminary results of the project. First, the trends in the annual and seasonal time series of precipitation and air temperature observations for all available stations in Greece are assessed. Then the set-up of the high resolution RCM simulation (10 km x 10 km) is discussed with respect to the selected convective scheme. Finally, the relationship of climatic variables with geophysical features over Greece (such as altitude, location, distance from the sea, slope, aspect, distance from climatic barriers, and land cover) is investigated, to support climate mapping. The research has been co-financed by the European Union (European Regional Development Fund) and Greek national funds through the Operational Program "Competitiveness and Entrepreneurship" of the National Strategic Reference Framework (NSRF) - Research Funding Program COOPERATION 2009.
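The station trend assessment can be sketched as an ordinary least-squares slope per decade; the synthetic series and the choice of OLS (rather than, say, a Mann-Kendall test) are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def decadal_trend(years, values):
    """Ordinary least-squares trend of an annual climate series,
    returned as change per decade together with its p-value."""
    res = stats.linregress(years, values)
    return 10.0 * res.slope, res.pvalue

rng = np.random.default_rng(4)
years = np.arange(1961, 2011)
temps = 14.0 + 0.02 * (years - 1961) + rng.normal(0, 0.3, years.size)  # synthetic
trend, p = decadal_trend(years, temps)
print(f"{trend:.2f} degC per decade (p = {p:.3f})")
```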
NASA Astrophysics Data System (ADS)
French, N. H.; Erickson, T.; McKenzie, D.
2008-12-01
A major goal of the North American Carbon Program is to resolve uncertainties in understanding and managing the carbon cycle of North America. As carbon modeling tools become more comprehensive and spatially oriented, accurate datasets to spatially quantify carbon emissions from fire are needed, and these data resources need to be accessible to users for decision-making. Under a new NASA Carbon Cycle Science project, Drs. Nancy French and Tyler Erickson, of the Michigan Technological University, Michigan Tech Research Institute (MTRI), are teaming with specialists from the USDA Forest Service Fire and Environmental Research Applications (FERA) team to provide information for mapping fire-derived carbon emissions to users. The project focus includes development of a web-based system to provide spatially resolved fire emissions estimates for North America in a user-friendly environment. The web-based Decision Support System will be based on a variety of open source technologies. The Fuel Characteristic Classification System (FCCS) raster map of fuels and MODIS-derived burned area vector maps will be processed using the Geographic Data Abstraction Library (GDAL) and OGR Simple Features Library. Tabular and spatial project data will be stored in PostgreSQL/PostGIS, a spatially enabled relational database server. The browser-based user interface will be created using the Django web page framework to allow user input for the decision support system. The OpenLayers mapping framework will be used to provide users with interactive maps within the browser. In addition, the data products will be made available in standard open data formats such as KML, to allow for easy integration into other spatial models and data systems.
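A minimal sketch of the kind of geoprocessing step described above, reading a fuels raster with GDAL and iterating over burned-area polygons with OGR; the file names are hypothetical placeholders, not project data.

```python
# Minimal sketch: read an FCCS-style fuels raster with GDAL and loop over
# MODIS-derived burned-area polygons with OGR. File names are hypothetical.
from osgeo import gdal, ogr

gdal.UseExceptions()

fuels_ds = gdal.Open("fccs_fuelbeds.tif")            # hypothetical fuels raster
fuels = fuels_ds.GetRasterBand(1).ReadAsArray()       # fuelbed codes as a 2-D array
print("fuels grid:", fuels.shape)

burned_ds = ogr.Open("modis_burned_area.shp")         # hypothetical burned-area vectors
layer = burned_ds.GetLayer(0)
for feature in layer:                                 # one polygon per burned patch
    geom = feature.GetGeometryRef()
    print(feature.GetFID(), geom.GetGeometryName(), geom.GetArea())
```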
A Practical Framework for Cartographic Design
NASA Astrophysics Data System (ADS)
Denil, Mark
2018-05-01
Creation of a map artifact that can be recognized, accepted, read, and absorbed is the cartographer's chief responsibility. This involves bringing coherence and order out of chaos and randomness through the construction of map artifacts that mediate processes of social communication. Maps are artifacts, first and foremost: they are artifacts with particular formal attributes. It is the formal aspects of the map artifact that allow it to invoke and sustain a reading as a map. This paper examines Cartographic Design as the sole means at the cartographer's disposal for constructing the meaning-bearing artifacts we know as maps, by placing it at the center of a practical analytic framework. The framework draws together the Theoretic and Craft aspects of map making, and examines how Style and Taste operate through the rubric of a schema of Mapicity to produce high quality maps. The role of the Cartographic Canon, and the role of Critique, are also explored, and a few design resources are identified.
Development and Evaluation of a Cloud-Gap-Filled MODIS Daily Snow-Cover Product
NASA Technical Reports Server (NTRS)
Hall, Dorothy K.; Riggs, George A.; Foster, James L.; Kumar, Sujay V.
2010-01-01
The utility of the Moderate Resolution Imaging Spectroradiometer (MODIS) snow-cover products is limited by cloud cover, which causes gaps in the daily snow-cover map products. We describe a cloud-gap-filled (CGF) daily snow-cover map using a simple algorithm to track cloud persistence, to account for the uncertainty created by the age of the snow observation. Developed from the 0.05° resolution climate-modeling grid daily snow-cover product, MOD10C1, each grid cell of the CGF map provides a cloud-persistence count (CPC) that tells whether the current or a prior day was used to make the snow decision. The percentage of grid cells "observable" is shown to increase dramatically when prior days are considered. The effectiveness of the CGF product is evaluated by conducting a suite of data assimilation experiments using the community Noah land surface model in the NASA Land Information System (LIS) framework. The Noah model forecasts of snow conditions, such as snow-water equivalent (SWE), are updated based on the observations of snow cover which are obtained either from the MOD10C1 standard product or the new CGF product. The assimilation integrations using the CGF maps provide a domain-averaged bias improvement of -11%, whereas such improvement using the standard MOD10C1 maps is -3%. These improvements suggest that the Noah model underestimates SWE and snow depth fields, and that the assimilation integrations contribute to correcting this systematic error. We conclude that the gap-filling strategy is an effective approach for increasing cloud-free observations of snow cover.
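The gap-filling rule can be sketched as follows; the code carries forward the most recent cloud-free value under cloud and increments a cloud-persistence count, using illustrative codes rather than the actual MOD10C1 flag values.

```python
# Minimal sketch of the cloud-gap-filling idea: where today's grid cell is
# cloud-obscured, reuse the most recent cloud-free snow observation and
# increment the cloud-persistence count (CPC). Codes are illustrative.
import numpy as np

CLOUD = -1  # illustrative "cloud" code; snow fraction is otherwise 0-100

def update_cgf(today, prev_filled, prev_cpc):
    """Return (gap-filled map, cloud-persistence count) for the current day."""
    cloudy = today == CLOUD
    filled = np.where(cloudy, prev_filled, today)   # reuse last clear value under cloud
    cpc = np.where(cloudy, prev_cpc + 1, 0)         # age (in days) of the value used
    return filled, cpc

today = np.array([[40, CLOUD], [CLOUD, 0]])
prev_filled = np.array([[35, 80], [20, 10]])
prev_cpc = np.zeros_like(today)
filled, cpc = update_cgf(today, prev_filled, prev_cpc)
print(filled)   # [[40 80] [20  0]]
print(cpc)      # [[0 1] [1 0]]
```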
Wollbrett, Julien; Larmande, Pierre; de Lamotte, Frédéric; Ruiz, Manuel
2013-04-15
In recent years, a large amount of "-omics" data have been produced. However, these data are stored in many different species-specific databases that are managed by different institutes and laboratories. Biologists often need to find and assemble data from disparate sources to perform certain analyses. Searching for these data and assembling them is a time-consuming task. The Semantic Web helps to facilitate interoperability across databases. A common approach involves the development of wrapper systems that map a relational database schema onto existing domain ontologies. However, few attempts have been made to automate the creation of such wrappers. We developed a framework, named BioSemantic, for the creation of Semantic Web Services that are applicable to relational biological databases. This framework makes use of both Semantic Web and Web Services technologies and can be divided into two main parts: (i) the generation and semi-automatic annotation of an RDF view; and (ii) the automatic generation of SPARQL queries and their integration into Semantic Web Services backbones. We have used our framework to integrate genomic data from different plant databases. BioSemantic is a framework that was designed to speed integration of relational databases. We present how it can be used to speed the development of Semantic Web Services for existing relational biological databases. Currently, it creates and annotates RDF views that enable the automatic generation of SPARQL queries. Web Services are also created and deployed automatically, and the semantic annotations of our Web Services are added automatically using SAWSDL attributes. BioSemantic is downloadable at http://southgreen.cirad.fr/?q=content/Biosemantic.
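A minimal sketch of how a client might consume one of the generated services, assuming a hypothetical SPARQL endpoint and a Sequence Ontology "gene" class; the endpoint URL and the query vocabulary are assumptions, not part of BioSemantic itself.

```python
# Minimal sketch: send a SPARQL query to a (hypothetical) endpoint exposing an
# RDF view of a relational plant-genomics database.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("http://example.org/sparql")   # hypothetical endpoint
endpoint.setQuery("""
    PREFIX obo:  <http://purl.obolibrary.org/obo/>
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?gene ?name WHERE {
        ?gene a obo:SO_0000704 ;        # 'gene' in the Sequence Ontology
              rdfs:label ?name .
    } LIMIT 10
""")
endpoint.setReturnFormat(JSON)

for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["gene"]["value"], row["name"]["value"])
```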
NASA Astrophysics Data System (ADS)
Nass, Andrea; van Gasselt, Stephan; Hargitai, Hendrik; Hare, Trent; Manaud, Nicolas; Karachevtseva, Irina; Kersten, Elke; Roatsch, Thomas; Wählisch, Marita; Kereszturi, Akos
2016-04-01
Cartography is one of the most important communication channels between users of spatial information, laymen, and the general public alike. This applies to all known real-world objects located either here on Earth or on any other object in our Solar System. In planetary sciences, however, the main use of cartography resides in a concept called planetary mapping with all its various attached meanings: it can be (1) systematic spacecraft observation from orbit, i.e. the retrieval of physical information, (2) the interpretation of discrete planetary surface units and their abstraction, or it can be (3) planetary cartography sensu stricto, i.e., the technical and artistic creation of map products. As the concept of planetary mapping covers a wide range of different information and knowledge levels, aims associated with the concept of mapping consequently range from a technical and engineering focus to a scientific distillation process. Among others, scientific centers focusing on planetary cartography are the United States Geological Survey (USGS, Flagstaff), the Moscow State University of Geodesy and Cartography (MIIGAiK, Moscow), Eötvös Loránd University (ELTE, Hungary), and the German Aerospace Center (DLR, Berlin). The International Astronomical Union (IAU), the Commission on Planetary Cartography within the International Cartographic Association (ICA), the Open Geospatial Consortium (OGC), the WG IV/8 Planetary Mapping and Spatial Databases within the International Society for Photogrammetry and Remote Sensing (ISPRS) and a range of other institutions contribute to definition frameworks in planetary cartography. Classical cartography is nowadays often (mis)understood mainly as a tool rather than as a scientific discipline and an art of communication. Consequently, concepts of information systems, mapping tools and cartographic frameworks are used interchangeably, and cartographic workflows and visualization of spatial information in thematic maps have often been neglected or were left to software systems to decide by some arbitrary default values. The diversity of cartography as a research discipline and its different contributions in geospatial sciences and communication of information and knowledge will be highlighted in this contribution. We invite colleagues from this and other disciplines to discuss concepts and topics for joint future collaboration and research.
Bedrock geologic map of Vermont
Ratcliffe, Nicholas M.; Stanley, Rolfe S.; Gale, Marjorie H.; Thompson, Peter J.; Walsh, Gregory J.; With contributions by Hatch, Norman L.; Rankin, Douglas W.; Doolan, Barry L.; Kim, Jonathan; Mehrtens, Charlotte J.; Aleinikoff, John N.; McHone, J. Gregory; Cartography by Masonic, Linda M.
2011-01-01
The Bedrock Geologic Map of Vermont is the result of a cooperative agreement between the U.S. Geological Survey (USGS) and the State of Vermont. The State's complex geology spans 1.4 billion years of Earth's history. The new map comes 50 years after the most recent map of the State by Charles G. Doll and others in 1961 and a full 150 years since the publication of the first geologic map of Vermont by Edward Hitchcock and others in 1861. At a scale of 1:100,000, the map shows an uncommon level of detail for State geologic maps. Mapped rock units are primarily based on lithology, or rock type, to facilitate derivative studies in multiple disciplines. The 1961 map was compiled from 1:62,500-scale or smaller maps. The current map was created to integrate more detailed (1:12,000- to 1:24,000-scale) modern and older (1:62,500-scale) mapping with the theory of plate tectonics to provide a framework for geologic, tectonic, economic, hydrogeologic, and environmental characterization of the bedrock of Vermont. The printed map consists of three oversize sheets (52 x 76 inches). Sheets 1 and 2 show the southern and northern halves of Vermont, respectively, and can be trimmed and joined so that the entire State can be displayed as a single entity. These sheets also include 10 cross sections and a geologic structure map. Sheet 3 on the front consists of descriptions of 486 map units, a correlation of map units, and references cited. Sheet 3 on the back features a list of the 195 sources of geologic map data keyed to an index map of 7.5-minute quadrangles in Vermont, as well as a table identifying ages of rocks dated by uranium-lead zircon geochronology.
Integration of Landsat-based disturbance maps in the Landscape Change Monitoring System (LCMS)
NASA Astrophysics Data System (ADS)
Healey, S. P.; Cohen, W. B.; Eidenshink, J. C.; Hernandez, A. J.; Huang, C.; Kennedy, R. E.; Moisen, G. G.; Schroeder, T. A.; Stehman, S.; Steinwand, D.; Vogelmann, J. E.; Woodcock, C.; Yang, L.; Yang, Z.; Zhu, Z.
2013-12-01
Land cover change can have a profound effect upon an area's natural resources and its role in biogeochemical and hydrological cycles. Many land cover change processes are sensitive to climate, including fire, storm damage, and insect activity. Monitoring of both past and ongoing land cover change is critical, particularly as we try to understand the impact of a changing climate on the natural systems we manage. The Landsat series of satellites, the first of which launched in 1972, has allowed land observation at spatial and spectral resolutions appropriate for identification of many types of land cover change. Over the years, and particularly since the opening of the Landsat archive in 2008, many approaches have been developed to meet individual monitoring needs. Algorithms vary by the cover type targeted, the rate of change sought, and the period between observations. The Landscape Change Monitoring System (LCMS) is envisioned as a sustained, inter-agency monitoring program that brings together and operationally provides the best available land cover change maps over the United States. Expanding upon the successful USGS/Forest Service Monitoring Trends in Burn Severity project, LCMS is designed to serve a variety of research and management communities. The LCMS Science Team is currently assessing the relative strengths of a variety of leading change detection approaches, primarily emphasizing Landsat observations. Using standardized image pre-processing methods, maps produced by these algorithms have been compared at intensive validation sites across the country. Additionally, LCMS has taken steps toward a data-mining framework, in which ensembles of algorithm outputs are used with non-parametric models to create integrated predictions of change across a variety of scenarios and change dynamics. We present initial findings from the LCMS Science Team, including validation results from individual algorithms and assessment of initial 'integrated' products from the data-mining framework. It is anticipated that these results will directly impact land change information that will in the future be routinely available across the country through LCMS. With a baseline observation period of more than 40 years and a national scope, these data should shed light upon how trends in disturbance may be linked to climatic changes.
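As a sketch of the ensemble idea, the code below treats per-pixel outputs of several change-detection algorithms as predictors for a non-parametric (random forest) model trained against reference labels; the arrays are random stand-ins, not LCMS data.

```python
# Minimal sketch: integrate several change-detection algorithm outputs with a
# non-parametric model fitted to reference labels at validation pixels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
n_pixels, n_algorithms = 5000, 4
algo_outputs = rng.random((n_pixels, n_algorithms))        # per-algorithm change scores (synthetic)
reference = (algo_outputs.mean(axis=1)                     # synthetic "interpreted" labels
             + rng.normal(0, 0.1, n_pixels) > 0.5).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(algo_outputs[:4000], reference[:4000])           # train on reference pixels
print("holdout accuracy:", model.score(algo_outputs[4000:], reference[4000:]))
```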
Non-integrability vs. integrability in pentagram maps
NASA Astrophysics Data System (ADS)
Khesin, Boris; Soloviev, Fedor
2015-01-01
We revisit recent results on integrable cases for higher-dimensional generalizations of the 2D pentagram map: short-diagonal, dented, deep-dented, and corrugated versions, and define a universal class of pentagram maps, which are proved to possess projective duality. We show that in many cases the pentagram map cannot be included into integrable flows as a time-one map, and discuss how the corresponding notion of discrete integrability can be extended to include jumps between invariant tori. We also present numerical evidence that certain generalizations of the integrable 2D pentagram map are non-integrable and present a conjecture for a necessary condition of their discrete integrability.
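For orientation, the standard definition of the 2D pentagram map that these generalizations extend can be written as follows.

```latex
% Standard definition of the 2D pentagram map (Schwartz's construction),
% given only as orientation for the generalizations discussed above.
% For a closed n-gon $P=(P_1,\dots,P_n)$ in $\mathbb{RP}^2$, the image
% polygon's vertices are the intersections of consecutive short diagonals:
\[
  T(P)_i \;=\; \overline{P_{i-1}P_{i+1}} \,\cap\, \overline{P_{i}P_{i+2}},
  \qquad i \in \mathbb{Z}/n\mathbb{Z},
\]
% where $\overline{AB}$ denotes the line through $A$ and $B$; the map $T$
% commutes with projective transformations.
```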
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foster, J.W.; Schafer, A.J.; Critcher, R.
1996-04-15
We have constructed a whole genome radiation hybrid (WG-RH) map across a region of human chromosome 17q, from growth hormone (GH) to thymidine kinase (TK). A panel of 128 WG-RH hybrid cell lines generated by X-irradiation and fusion has been tested for the retention of 39 sequence-tagged site (STS) markers by the polymerase chain reaction. This genome mapping technique has allowed the integration of existing VNTR and microsatellite markers with additional new markers and existing STS markers previously mapped to this region by other means. The WG-RH map includes eight expressed sequence tag (EST) and three anonymous markers developed for this study, together with 23 anonymous microsatellites and five existing ESTs. Analysis of these data resulted in a high-density comprehensive map across this region of the genome. A subset of these markers has been used to produce a framework map consisting of 20 loci ordered with odds greater than 1000:1. The markers are of sufficient density to build a YAC contig across this region based on marker content. We have developed sequence tags for both ends of a 2.1-Mb YAC and mapped these using the WG-RH panel, allowing a direct comparison of cR6000 to physical distance. 31 refs., 3 figs., 2 tabs.
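For orientation, the standard centiRay conversion used in radiation hybrid mapping (following Cox et al.) is summarized below; theta denotes the breakage probability between two markers, estimated from their joint retention pattern across the hybrid panel, and the subscript in cR6000 conventionally refers to the irradiation dose in rads.

```latex
% Standard two-point distance measure in radiation hybrid mapping, given only
% for orientation; \theta is the estimated breakage probability between two
% markers across the 128 hybrids.
\[
  D \;=\; -100\,\ln\!\left(1-\theta\right)\ \text{centiRays (cR)},
\]
% so that, for small \theta, cR$_{6000}$ distances grow approximately linearly
% and can be compared against physical distance, as done for the 2.1-Mb YAC
% described above.
```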
Integrated Systems Health Management (ISHM) Toolkit
NASA Technical Reports Server (NTRS)
Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim
2013-01-01
A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.
Improving TOGAF ADM 9.1 Migration Planning Phase by ITIL V3 Service Transition
NASA Astrophysics Data System (ADS)
Hanum Harani, Nisa; Akhmad Arman, Arry; Maulana Awangga, Rolly
2018-04-01
Planning business transformations that involve technology requires a systematic transition and migration planning process, and planning the system migration activities is the most important part. The migration process includes complex elements such as business re-engineering, transition scheme mapping, data transformation, application development, individual involvement by computer, and trial interaction. TOGAF ADM is a framework and method for enterprise architecture implementation, and it provides guidance on architecture and migration planning. That planning includes an implementation solution, in this case an IT solution, but when the solution has to be turned into IT operational planning, TOGAF does not cover it. This paper presents a new framework model that details the transition process by integrating TOGAF and ITIL. We evaluated our model in a field study at a private university.
Applying the AcciMap methodology to investigate the tragic Sewol Ferry accident in South Korea.
Lee, Samuel; Moh, Young Bo; Tabibzadeh, Maryam; Meshkati, Najmedin
2017-03-01
This study applies the AcciMap methodology, which was originally proposed by Professor Jens Rasmussen (1997), to the analysis of the tragic Sewol Ferry accident in South Korea on April 16, 2014, which killed 304 mostly young people and is considered a national disaster in that country. This graphical representation, by incorporating the associated socio-technical factors into an integrated framework, provides a big picture illustrating the context in which an accident occurred as well as the interactions between different levels of the studied system that resulted in that event. In general, analysis of past accidents within the stated framework can define the patterns of hazards within an industrial sector. Such analysis can lead to the definition of preconditions for safe operations, which is a main focus of proactive risk management systems. In the case of the Sewol Ferry accident, much of the blame has been placed on the Sewol's captain and its crewmembers. However, according to this study, which relied on analyzing all available sources published in English and Korean, the disaster was the result of a series of lapses and disregards for safety across different levels of government and regulatory bodies, Chonghaejin Company, and the Sewol's crewmembers. The primary layers of the AcciMap framework (the political environment and a non-proactive governmental body; inadequate regulations and their lax oversight and enforcement; poor safety culture; inattention to human factors issues; and a lack of and/or outdated standard operating and emergency procedures) are not limited to the maritime industry in South Korea or to the Sewol Ferry accident; they could also affect any safety-sensitive industry anywhere in the world. Copyright © 2016 Elsevier Ltd. All rights reserved.
Hammoudeh, Mohammad; Newman, Robert; Dennett, Christopher; Mount, Sarah; Aldabbas, Omar
2015-01-01
This paper presents a distributed information extraction and visualisation service, called the mapping service, for maximising information return from large-scale wireless sensor networks. Such a service would greatly simplify the production of higher-level, information-rich representations suitable for informing other network services and the delivery of field information visualisations. The mapping service utilises a blend of inductive and deductive models to map sense data accurately using externally available knowledge. It utilises the special characteristics of the application domain to render visualisations in a map format that are a precise reflection of the concrete reality. This service is suitable for visualising an arbitrary number of sense modalities. It is capable of visualising multiple independent types of sense data, overcoming the limitations of generating visualisations from a single sense modality. Furthermore, the mapping service responds dynamically to changes in the environmental conditions, which may affect the visualisation performance, by continuously updating the application domain model in a distributed manner. Finally, a distributed self-adaptation function is proposed with the goal of saving more power and generating more accurate data visualisation. We conduct comprehensive experimentation to evaluate the performance of our mapping service and show that it achieves low communication overhead, produces maps of high fidelity, and further minimises the mapping predictive error dynamically through integrating the application domain model in the mapping service. PMID:26378539
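A minimal sketch of the map-rendering step, assuming inverse-distance weighting over hypothetical node positions; the actual service blends inductive and deductive models and performs the computation in-network.

```python
# Minimal sketch: turn scattered sensor readings into a gridded field map with
# inverse-distance weighting. This illustrates only the map rendering step.
import numpy as np

def idw_grid(xy, values, grid_x, grid_y, power=2.0, eps=1e-12):
    """Interpolate point readings onto a regular grid with IDW."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    dist = np.sqrt((gx[..., None] - xy[:, 0])**2 + (gy[..., None] - xy[:, 1])**2)
    weights = 1.0 / (dist**power + eps)
    return (weights * values).sum(axis=-1) / weights.sum(axis=-1)

sensors = np.array([[0.1, 0.2], [0.8, 0.3], [0.5, 0.9]])   # hypothetical node positions
readings = np.array([21.0, 24.5, 19.8])                    # e.g. temperature readings
field = idw_grid(sensors, readings, np.linspace(0, 1, 50), np.linspace(0, 1, 50))
print(field.shape)   # (50, 50) grid ready for visualisation
```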
Mapping biological ideas: Concept maps as knowledge integration tools for evolution education
NASA Astrophysics Data System (ADS)
Schwendimann, Beat Adrian
Many students leave school with a fragmented understanding of biology that does not allow them to connect their ideas to their everyday lives (Wandersee, 1989; Mintzes, Wandersee, & Novak, 1998; Mintzes, Wandersee, & Novak, 2000a). Understanding evolution ideas is seen as central to building an integrated knowledge of biology (Blackwell, Powell, & Dukes, 2003; Thagard & Findlay, 2010). However, the theory of evolution has been found difficult to understand as it incorporates a wide range of ideas from different areas (Bahar et al., 1999; Tsui & Treagust, 2003) and multiple interacting levels (Wilensky & Resnick, 1999; Duncan & Reiser, 2007; Hmelo-Silver et al., 2007). Research suggests that learners can hold a rich repertoire of co-existing alternative ideas of evolution (for example, Bishop & Anderson, 1990; Demastes, Good, & Peebles, 1996; Evans, 2008), especially of human evolution (for example, Nelson, 1986; Sinatra et al., 2003; Poling & Evans, 2004). Evolution ideas are difficult to understand because they often contradict existing alternative ideas (Mayr, 1982; Wolpert, 1994; Evans, 2008). Research suggests that understanding human evolution is a key to evolution education (for example, Blackwell et al., 2003; Besterman & Baggott la Velle, 2007). This dissertation research investigates how different concept mapping forms embedded in a collaborative technology-enhanced learning environment can support students' integration of evolution ideas using case studies of human evolution. Knowledge Integration (KI) (Linn et al., 2000; Linn et al., 2004) is used as the operational framework to explore concept maps as knowledge integration tools to elicit, add, critically distinguish, group, connect, and sort out alternative evolution ideas. Concept maps are a form of node-link diagram for organizing and representing connections between ideas as a semantic network (Novak & Gowin, 1984). This dissertation research describes the iterative development of a novel biology-specific form of concept map, called Knowledge Integration Map (KIM), which aims to help learners connect ideas across levels (for example, genotype and phenotype levels) towards an integrated understanding of evolution. Using a design-based research approach (Brown, 1992; Cobb et al., 2003), three iterative studies were implemented in ethnically and economically diverse public high school classrooms using the web-based inquiry science environment (WISE) (Linn et al., 2003; Linn et al., 2004). Study 1 investigates concept maps as generative assessment tools. Study 1A compares the concept map generation and critique process of biology novices and experts. Findings suggest that concept maps are sensitive to different levels of knowledge integration but require scaffolding and revision. Study 1B investigates the implementation of concept maps as summative assessment tools in a WISE evolution module. Results indicate that concept maps can reveal connections between students' alternative ideas of evolution. Study 2 introduces KIMs as embedded collaborative learning tools. After generating KIMs, student dyads revise KIMs through two different critique activities (comparison against an expert or peer generated KIM). Findings indicate that different critique activities can promote the use of different criteria for critique. Results suggest that the combination of generating and critiquing KIMs can support integrating evolution ideas but can be time-consuming.
As time in biology classrooms is limited, study 3 distinguishes the learning effects from either generating or critiquing KIMs as more time efficient embedded learning tools. Findings suggest that critiquing KIMs can be more time efficient than generating KIMs. Using KIMs that include common alternative ideas for critique activities can create genuine opportunities for students to critically reflect on new and existing ideas. Critiquing KIMs can encourage knowledge integration by fostering self-monitoring of students' learning progress, identifying knowledge gaps, and distinguishing alternative evolution ideas. This dissertation research demonstrates that science instruction of complex topics, such as human evolution, can succeed through a combination of scaffolded inquiry activities using dynamic visualizations, explanation activities, and collaborative KIM activities. This research contributes to educational research and practice by describing ways to make KIMs effective and time efficient learning tools for evolution education. Supporting students' building of a more coherent understanding of core ideas of biology can foster their life-long interest and learning of science.
Crossover physics in the nonequilibrium dynamics of quenched quantum impurity systems.
Vasseur, Romain; Trinh, Kien; Haas, Stephan; Saleur, Hubert
2013-06-14
A general framework is proposed to tackle analytically local quantum quenches in integrable impurity systems, combining a mapping onto a boundary problem with the form factor approach to boundary-condition-changing operators introduced by Lesage and Saleur [Phys. Rev. Lett. 80, 4370 (1998)]. We discuss how to compute exactly the following two central quantities of interest: the Loschmidt echo and the distribution of the work done during the quantum quench. Our results display an interesting crossover physics characterized by the energy scale T(b) of the impurity corresponding to the Kondo temperature. We discuss in detail the noninteracting case as a paradigm and benchmark for more complicated integrable impurity models and check our results using numerical methods.
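For reference, the two central quantities named above have the following standard definitions, written in generic notation.

```latex
% Standard definitions of the two quantities computed in this framework.
% |\psi_0\rangle is the pre-quench state, E_0 its energy before the quench,
% and H the post-quench Hamiltonian with eigenstates |n\rangle, energies E_n.
\[
  \mathcal{L}(t) \;=\; \bigl|\langle \psi_0 | e^{-iHt} | \psi_0 \rangle\bigr|^2,
  \qquad
  P(W) \;=\; \sum_n \bigl|\langle n | \psi_0 \rangle\bigr|^2\,
             \delta\!\bigl(W - (E_n - E_0)\bigr),
\]
% i.e. P(W) is the Fourier transform of the Loschmidt amplitude
% $G(t)=\langle \psi_0| e^{-iHt}|\psi_0\rangle$ up to a phase set by $E_0$.
```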
Tangprasertchai, Narin S; Zhang, Xiaojun; Ding, Yuan; Tham, Kenneth; Rohs, Remo; Haworth, Ian S; Qin, Peter Z
2015-01-01
The technique of site-directed spin labeling (SDSL) provides unique information on biomolecules by monitoring the behavior of a stable radical tag (i.e., spin label) using electron paramagnetic resonance (EPR) spectroscopy. In this chapter, we describe an approach in which SDSL is integrated with computational modeling to map conformations of nucleic acids. This approach builds upon a SDSL tool kit previously developed and validated, which includes three components: (i) a nucleotide-independent nitroxide probe, designated as R5, which can be efficiently attached at defined sites within arbitrary nucleic acid sequences; (ii) inter-R5 distances in the nanometer range, measured via pulsed EPR; and (iii) an efficient program, called NASNOX, that computes inter-R5 distances on given nucleic acid structures. Following a general framework of data mining, our approach uses multiple sets of measured inter-R5 distances to retrieve "correct" all-atom models from a large ensemble of models. The pool of models can be generated independently without relying on the inter-R5 distances, thus allowing a large degree of flexibility in integrating the SDSL-measured distances with a modeling approach best suited for the specific system under investigation. As such, the integrative experimental/computational approach described here represents a hybrid method for determining all-atom models based on experimentally-derived distance measurements. © 2015 Elsevier Inc. All rights reserved.
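A minimal sketch of the model-screening idea under assumed numbers: candidate models are retained only if all of their computed inter-R5 distances fall within a tolerance of the measured values; in practice NASNOX would supply the computed distances.

```python
# Minimal sketch: screen an ensemble of candidate all-atom models by comparing
# computed inter-R5 distances with pulsed-EPR measurements. All values are
# hypothetical; NASNOX is the tool that would supply the computed distances.
import numpy as np

measured = np.array([2.5, 3.1, 4.2])   # nm, hypothetical EPR distances for 3 label pairs
tolerance = 0.3                         # nm, acceptance window per pair

# rows = candidate models, columns = the same 3 label pairs (hypothetical values)
computed = np.array([
    [2.6, 3.0, 4.1],
    [2.1, 3.6, 4.9],
    [2.4, 3.3, 4.0],
])

consistent = np.all(np.abs(computed - measured) <= tolerance, axis=1)
print("models retained:", np.flatnonzero(consistent))   # -> [0 2]
```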
NASA Astrophysics Data System (ADS)
Kienberger, S.; Notenbaert, A.; Zeil, P.; Bett, B.; Hagenlocher, M.; Omolo, A.
2012-04-01
Climate change has been stated as being one of the greatest challenges to global health in the current century. Climate change impacts on human health and the socio-economic and related poverty consequences are however still poorly understood. While epidemiological issues are strongly coupled with environmental and climatic parameters, the social and economic circumstances of populations might be of equal or even greater importance when trying to identify vulnerable populations and design appropriate and well-targeted adaptation measures. The inter-linkage between climate change, human health risk and socio-economic impacts remains an important - but largely outstanding - research field. We present an overview of how risk is traditionally conceptualised in the human health domain and reflect critically on integrated approaches as currently used in the climate change context. The presentation will also review existing approaches and how they can be integrated towards adaptation tools. Following this review, an integrated risk concept is presented, which is currently being adapted within the EC FP7 research project HEALTHY FUTURES (http://www.healthyfutures.eu/). In this approach, health risk is not only defined through the disease itself (as hazard) but also by the inherent vulnerability of the system, population or region under study. It is in fact the interaction of environment and society that leads to the development of diseases and the subsequent risk of being negatively affected by them. In this conceptual framework, vulnerability is attributed to domains of lack of resilience as well as underlying preconditions determining susceptibilities. To complete the holistic picture, vulnerability can be associated with social, economic, environmental, institutional, cultural and physical dimensions. The proposed framework also establishes the important nexus with adaptation, showing how different measures can help avoid disease outbreaks and reduce vulnerability in order to lower health risks and disease impacts. The proposed framework explains the generic concepts of disease hazard, vulnerability and risk and their connections. It can be applied to many different diseases and implemented in different ways. Statistical or dynamic disease models integrating future climate projections can - for example - be combined with forecast models. These can be evaluated against different socio-economic development pathways and feed into decision support systems with the ultimate aim of designing the most appropriate risk reduction strategies. The paper will present first preliminary results on the mapping of vulnerability for the Eastern African region, including diseases such as malaria, schistosomiasis and Rift Valley fever, and conclude with current research challenges and how they will be addressed within the HEALTHY FUTURES project.
Integration of data-driven and physically-based methods to assess shallow landslides susceptibility
NASA Astrophysics Data System (ADS)
Lajas, Sara; Oliveira, Sérgio C.; Zêzere, José Luis
2016-04-01
Approaches used to assess shallow landslides susceptibility at the basin scale are conceptually different depending on the use of statistical or deterministic methods. The data-driven methods are sustained in the assumption that the same causes are likely to produce the same effects, and for that reason a present/past landslide inventory and a dataset of factors assumed as predisposing factors are crucial for the landslide susceptibility assessment. The physically-based methods are based on a system controlled by physical laws and soil mechanics, where the forces which tend to promote movement are compared with forces that tend to promote resistance to movement. In this case, the evaluation of susceptibility is supported by the calculation of the Factor of Safety (FoS) and dependent on the availability of detailed data related with the slope geometry and hydrological and geotechnical properties of the soils and rocks. Within this framework, this work aims to test two hypotheses: (i) although conceptually distinct and based on contrasting procedures, statistical and deterministic methods generate similar shallow landslides susceptibility results regarding the predictive capacity and spatial agreement; and (ii) the integration of the shallow landslides susceptibility maps obtained with data-driven and physically-based methods, for the same study area, generates a more reliable susceptibility model for shallow landslides occurrence. To evaluate these two hypotheses, we select the Information Value data-driven method and the physically-based Infinite Slope model to evaluate shallow landslides in the study area of the Monfalim and Louriceira basins (13.9 km2), which is located in the north of the Lisbon region (Portugal). The landslide inventory is composed of 111 shallow landslides and was divided into two independent groups based on temporal criteria (age ≤ 1983 and age > 1983): (i) the modelling group (51 cases) was used to define the weights for each predisposing factor (lithology, land use, slope, aspect, curvature, topographic position index and the slope over area ratio) with the Information Value method and was also used to calibrate the strength parameters (cohesion and friction angle) of the different lithological units considered in the Infinite Slope model; and (ii) the validation group (60 cases) was used to independently validate and define the predictive capacity of the shallow landslides susceptibility maps produced with the Information Value method and the Infinite Slope method. The comparison of both landslide susceptibility maps was supported by: (i) the computation of the Receiver Operating Characteristic (ROC) curves; (ii) the calculation of the Area Under the Curve (AUC); and (iii) the evaluation of the spatial agreement between the landslide susceptibility classes. Finally, the susceptibility maps produced with the Information Value and the Infinite Slope methods are integrated into a single landslide susceptibility map based on a set of integration rules defined by cross-validation of the susceptibility classes of both maps and analysis of the corresponding contingency table. This work was supported by the FCT - Portuguese Foundation for Science and Technology and is within the framework of the FORLAND Project. Sérgio Oliveira was funded by a postdoctoral grant (SFRH/BPD/85827/2012) from the Portuguese Foundation for Science and Technology (FCT).
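For orientation, one common form of the infinite-slope factor of safety and the Information Value score are reproduced below; the notation is generic and may differ in detail from the parameterization used in this study.

```latex
% One common form of the infinite-slope factor of safety with slope-parallel
% seepage: m is the saturated fraction of the soil thickness z, \beta the
% slope angle, c' and \phi' the effective strength parameters, and \gamma,
% \gamma_w the soil and water unit weights.
\[
  FoS \;=\; \frac{c' + \left(\gamma z - m\,\gamma_w z\right)\cos^2\!\beta\,\tan\phi'}
                 {\gamma z\,\sin\beta\cos\beta}.
\]
% Information Value score of class j of a predisposing factor, where
% S_j/A_j are the landslide-affected and total areas within class j and
% S/A the same quantities over the whole study area:
\[
  IV_j \;=\; \ln\!\left(\frac{S_j / A_j}{S / A}\right).
\]
```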
Accommodating the ecological fallacy in disease mapping in the absence of individual exposures.
Wang, Feifei; Wang, Jian; Gelfand, Alan; Li, Fan
2017-12-30
In health exposure modeling, in particular, disease mapping, the ecological fallacy arises because the relationship between aggregated disease incidence on areal units and average exposure on those units differs from the relationship between the event of individual incidence and the associated individual exposure. This article presents a novel modeling approach to address the ecological fallacy in the least informative data setting. We assume the known population at risk with an observed incidence for a collection of areal units and, separately, environmental exposure recorded during the period of incidence at a collection of monitoring stations. We do not assume any partial individual level information or random allocation of individuals to observed exposures. We specify a conceptual incidence surface over the study region as a function of an exposure surface resulting in a stochastic integral of the block average disease incidence. The true block level incidence is an unavailable Monte Carlo integration for this stochastic integral. We propose an alternative manageable Monte Carlo integration for the integral. Modeling in this setting is immediately hierarchical, and we fit our model within a Bayesian framework. To alleviate the resulting computational burden, we offer 2 strategies for efficient model fitting: one is through modularization, the other is through sparse or dimension-reduced Gaussian processes. We illustrate the performance of our model with simulations based on a heat-related mortality dataset in Ohio and then analyze associated real data. Copyright © 2017 John Wiley & Sons, Ltd.
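A schematic of the block-averaging construction described above, in generic notation that may differ from the paper's.

```latex
% Schematic only: x(s) is the exposure surface, q(.) the individual-level
% incidence probability, B_i an areal unit with population at risk n_i.
\[
  p(B_i) \;=\; \frac{1}{|B_i|}\int_{B_i} q\bigl(x(s)\bigr)\,ds
  \;\approx\; \frac{1}{L}\sum_{\ell=1}^{L} q\bigl(x(s_\ell)\bigr),
  \qquad s_\ell \stackrel{\mathrm{iid}}{\sim} \mathrm{Unif}(B_i),
\]
% with observed counts modelled as Y_i ~ Binomial(n_i, p(B_i)); the Monte
% Carlo sum stands in for the exact block integral, which is unavailable.
```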
Fundamentals of Structural Geology
NASA Astrophysics Data System (ADS)
Pollard, David D.; Fletcher, Raymond C.
2005-09-01
Fundamentals of Structural Geology provides a new framework for the investigation of geological structures by integrating field mapping and mechanical analysis. Assuming a basic knowledge of physical geology, introductory calculus and physics, it emphasizes the observational data, modern mapping technology, principles of continuum mechanics, and the mathematical and computational skills necessary to quantitatively map, describe, model, and explain deformation in Earth's lithosphere. By starting from the fundamental conservation laws of mass and momentum, the constitutive laws of material behavior, and the kinematic relationships for strain and rate of deformation, the authors demonstrate the relevance of solid and fluid mechanics to structural geology. This book offers a modern quantitative approach to structural geology for advanced students and researchers in structural geology and tectonics. It is supported by a website hosting images from the book, additional colour images, student exercises and MATLAB scripts. Solutions to the exercises are available to instructors. The book integrates field mapping using modern technology with the analysis of structures based on a complete mechanics. MATLAB is used to visualize physical fields and analytical results, and MATLAB scripts can be downloaded from the website to recreate textbook graphics and enable students to explore their choice of parameters and boundary conditions. The supplementary website hosts color images of outcrop photographs used in the text, supplementary color images, and images of textbook figures for classroom presentations. The textbook website also includes student exercises designed to instill the fundamental relationships and to encourage the visualization of the evolution of geological structures; solutions are available to instructors.
[Design and implementation of Chinese materia medica resources survey results display system].
Wang, Hui; Zhang, Xiao-Bo; Ge, Xiao-Guang; Jin, Yan; Wang, Ling; Zhao, Yan-Ping; Jing, Zhi-Xian; Guo, Lan-Ping; Huang, Lu-Qi
2017-11-01
From the beginning of the fourth national census of traditional Chinese medicine resources in 2011, a large amount of data has been collected and compiled, including wild medicinal plant resource data, cultivated medicinal plant information, traditional knowledge, and specimen information. The traditional paper-based recording method is inconvenient for query and application. The B/S architecture, JavaWeb framework and SOA are used to design and develop the fourth national census results display platform. Through data integration and sorting, users are provided with integrated data services and data query and display solutions. The platform realizes fine-grained data classification and offers simple data retrieval and statistical analysis functions. The platform uses Echarts components, GeoServer, OpenLayers and other technologies to provide a variety of data display forms such as charts, maps and other visualization forms, intuitively reflecting the number, distribution and types of Chinese materia medica resources. It meets the data mapping requirements of different levels of users and provides support for management decision-making. Copyright© by the Chinese Pharmaceutical Association.
An integrated method for atherosclerotic carotid plaque segmentation in ultrasound image.
Qian, Chunjun; Yang, Xiaoping
2018-01-01
Carotid artery atherosclerosis is an important cause of stroke. Ultrasound imaging has been widely used in the diagnosis of atherosclerosis. Therefore, segmenting atherosclerotic carotid plaque in ultrasound images is an important task. Accurate plaque segmentation is helpful for the measurement of carotid plaque burden. In this paper, we propose and evaluate a novel learning-based integrated framework for plaque segmentation. In our study, four different classification algorithms, combined with the auto-context iterative algorithm, were employed for pixel-wise classification, integrating features from the ultrasound images and, in later iterations, from the iteratively estimated and refined probability maps. The four classification algorithms were support vector machine with linear kernel, support vector machine with radial basis function kernel, AdaBoost, and random forest. The plaque segmentation was performed on the generated probability map. The performance of the four different learning-based plaque segmentation methods was tested on 29 B-mode ultrasound images. The evaluation indices for our proposed methods consisted of sensitivity, specificity, Dice similarity coefficient, overlap index, error of area, absolute error of area, point-to-point distance, and Hausdorff point-to-point distance, along with the area under the ROC curve. The segmentation method that integrated the random forest and an auto-context model obtained the best results (sensitivity 80.4 ± 8.4%, specificity 96.5 ± 2.0%, Dice similarity coefficient 81.0 ± 4.1%, overlap index 68.3 ± 5.8%, error of area -1.02 ± 18.3%, absolute error of area 14.7 ± 10.9%, point-to-point distance 0.34 ± 0.10 mm, Hausdorff point-to-point distance 1.75 ± 1.02 mm, and area under the ROC curve 0.897), which were among the best compared with those of existing methods. Our proposed learning-based integrated framework could be useful for atherosclerotic carotid plaque segmentation, which will be helpful for the measurement of carotid plaque burden. Copyright © 2017 Elsevier B.V. All rights reserved.
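As a small worked example of one of the evaluation indices listed above, the sketch below computes the Dice similarity coefficient between a binary segmentation and a reference mask on toy arrays.

```python
# Minimal sketch: Dice similarity coefficient between two binary masks.
import numpy as np

def dice(seg: np.ndarray, ref: np.ndarray) -> float:
    """Dice = 2|A ∩ B| / (|A| + |B|) for binary masks."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    denom = seg.sum() + ref.sum()
    return 2.0 * np.logical_and(seg, ref).sum() / denom if denom else 1.0

seg = np.array([[0, 1, 1], [0, 1, 0]])   # toy predicted plaque mask
ref = np.array([[0, 1, 1], [1, 1, 0]])   # toy manual reference mask
print(f"Dice = {dice(seg, ref):.3f}")    # 2*3 / (3+4) ≈ 0.857
```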
McCallum, Meg; Carver, Janet; Dupere, David; Ganong, Sharon; Henderson, J David; McKim, Ann; McNeil-Campbell, Lisa; Richardson, Holly; Simpson, Judy; Tschupruk, Cheryl; Jewers, Heather
2018-05-15
In 2014, Nova Scotia released a provincial palliative care strategy and implementation working groups were established. The Capacity Building and Practice Change Working Group, comprised of health professionals, public advisors, academics, educators, and a volunteer supervisor, was asked to select palliative care education programs for health professionals and volunteers. The first step in achieving this mandate was to establish competencies for health professionals and volunteers caring for patients with life-limiting illness and their families and those specializing in palliative care. In 2015, a literature search for palliative care competencies and an environmental scan of related education programs were conducted. The Irish Palliative Care Competence Framework serves as the foundation of the Nova Scotia Palliative Care Competency Framework. Additional disciplines and competencies were added and any competencies not specific to palliative care were removed. To highlight interprofessional practice, the framework illustrates shared and discipline-specific competencies. Stakeholders were asked to validate the framework and map the competencies to educational programs. Numerous rounds of review refined the framework. The framework includes competencies for 22 disciplines, 9 nursing specialties, and 4 physician specialties. The framework, released in 2017, and the selection and implementation of education programs were a significant undertaking. The framework will support the implementation of the Nova Scotia Integrated Palliative Care Strategy, enhance the interprofessional nature of palliative care, and guide the further implementation of education programs. Other jurisdictions have expressed considerable interest in the framework.
Converting ODM Metadata to FHIR Questionnaire Resources.
Doods, Justin; Neuhaus, Philipp; Dugas, Martin
2016-01-01
Interoperability between systems and data sharing between domains is becoming more and more important. The portal medical-data-models.org offers more than 5,300 UMLS-annotated forms in CDISC ODM format in order to support interoperability, and several additional export formats are available. CDISC's ODM and the Questionnaire resource of HL7's FHIR framework were analyzed, a mapping between their elements was created, and a converter was implemented. The developed converter was integrated into the portal with FHIR Questionnaire XML or JSON download options. New FHIR applications can now use this large library of forms.
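A minimal sketch of the kind of element mapping such a converter performs, turning a simplified ODM ItemDef into a FHIR Questionnaire item; the type-correspondence table and the sample item are illustrative assumptions, not the portal's full converter.

```python
# Minimal sketch: map a simplified ODM ItemDef (OID, question text, data type)
# to a FHIR Questionnaire item. The correspondence table is illustrative.
import json

ODM_TO_FHIR_TYPE = {"text": "string", "integer": "integer", "float": "decimal",
                    "date": "date", "boolean": "boolean"}

def itemdef_to_questionnaire_item(oid: str, question: str, odm_type: str) -> dict:
    return {
        "linkId": oid,                                      # keep the ODM OID as linkId
        "text": question,
        "type": ODM_TO_FHIR_TYPE.get(odm_type, "string"),   # fall back to free text
    }

questionnaire = {
    "resourceType": "Questionnaire",
    "status": "active",
    "item": [itemdef_to_questionnaire_item("I.1", "Body weight (kg)", "float")],
}
print(json.dumps(questionnaire, indent=2))
```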
Efficient and Scalable Graph Similarity Joins in MapReduce
Chen, Yifan; Zhang, Weiming; Tang, Jiuyang
2014-01-01
Along with the emergence of massive graph-modeled data, it is of great importance to investigate graph similarity joins due to their wide applications for multiple purposes, including data cleaning, and near duplicate detection. This paper considers graph similarity joins with edit distance constraints, which return pairs of graphs such that their edit distances are no larger than a given threshold. Leveraging the MapReduce programming model, we propose MGSJoin, a scalable algorithm following the filtering-verification framework for efficient graph similarity joins. It relies on counting overlapping graph signatures for filtering out nonpromising candidates. With the potential issue of too many key-value pairs in the filtering phase, spectral Bloom filters are introduced to reduce the number of key-value pairs. Furthermore, we integrate the multiway join strategy to boost the verification, where a MapReduce-based method is proposed for GED calculation. The superior efficiency and scalability of the proposed algorithms are demonstrated by extensive experimental results. PMID:25121135
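A toy, single-machine illustration of the filtering-verification pattern (not MGSJoin itself, and not MapReduce): prune pairs with little signature overlap, then verify survivors with an exact edit-distance computation; the signature scheme and pruning rule are simplified placeholders.

```python
# Toy filtering-verification loop for graph similarity joins. The signature
# scheme and the pruning threshold are simplified placeholders, not MGSJoin's.
import networkx as nx
from collections import Counter

def signatures(g: nx.Graph) -> Counter:
    # One simple signature per node: its label plus the sorted labels of its neighbours
    return Counter(
        (g.nodes[n]["label"], tuple(sorted(g.nodes[m]["label"] for m in g[n])))
        for n in g
    )

def similarity_join(graphs, tau, min_overlap=1):
    sigs = {i: signatures(g) for i, g in enumerate(graphs)}
    results = []
    for i in range(len(graphs)):
        for j in range(i + 1, len(graphs)):
            overlap = sum((sigs[i] & sigs[j]).values())
            if overlap < min_overlap:                       # filtering phase
                continue
            ged = nx.graph_edit_distance(                   # verification phase
                graphs[i], graphs[j],
                node_match=lambda a, b: a["label"] == b["label"])
            if ged is not None and ged <= tau:
                results.append((i, j, ged))
    return results

g1 = nx.Graph(); g1.add_nodes_from([(0, {"label": "C"}), (1, {"label": "O"})]); g1.add_edge(0, 1)
g2 = nx.Graph(); g2.add_nodes_from([(0, {"label": "C"}), (1, {"label": "O"}), (2, {"label": "H"})])
g2.add_edges_from([(0, 1), (0, 2)])
print(similarity_join([g1, g2], tau=2))
```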
Aulanier, Florian; Simard, Yvan; Roy, Nathalie; Gervaise, Cédric; Bandet, Marion
2017-12-15
Canadian Arctic and Subarctic regions experience a rapid decrease of sea ice accompanied with increasing shipping traffic. The resulting time-space changes in shipping noise are studied for four key regions of this pristine environment, for 2013 traffic conditions and a hypothetical tenfold traffic increase. A probabilistic modeling and mapping framework, called Ramdam, which integrates the intrinsic variability and uncertainties of shipping noise and its effects on marine habitats, is developed and applied. A substantial transformation of soundscapes is observed in areas where shipping noise changes from present occasional-transient contributor to a dominant noise source. Examination of impacts on low-frequency mammals within ecologically and biologically significant areas reveals that shipping noise has the potential to trigger behavioral responses and masking in the future, although no risk of temporary or permanent hearing threshold shifts is noted. Such probabilistic modeling and mapping is strategic in marine spatial planning of this emerging noise issues. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
A VGI data integration framework based on linked data model
NASA Astrophysics Data System (ADS)
Wan, Lin; Ren, Rongrong
2015-12-01
This paper addresses a geographic data integration and sharing method for multiple online VGI data sets. We propose a semantic-enabled framework for an online VGI cooperative application environment to solve a target class of geospatial problems. Based on linked data technologies, one of the core components of the Semantic Web, we construct relationship links among geographic features distributed across diverse VGI platforms by using linked data modeling methods, then deploy these semantically enabled entities on the web, and eventually form an interconnected geographic data network that supports cooperative geospatial applications across multiple VGI data sources. The mapping and transformation from VGI sources to an RDF linked data model is presented to guarantee a uniform data representation model among different online social geographic data sources. We propose a mixed strategy that combines spatial distance similarity and feature name attribute similarity as the measure for comparing and matching geographic features across the various VGI data sets. Our work focuses on applying Markov logic networks to interlink records describing the same entity in different VGI-based linked data sets. In our method, the automatic generation of the co-reference object identification model from geographic linked data is discussed in more detail. The result is a large geographic linked data network spanning loosely coupled VGI web sites. The results of an experiment built on our framework and the evaluation of our method show that the framework is reasonable and practicable.
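A minimal sketch of the mixed matching strategy, scoring candidate feature pairs by combining name similarity with spatial proximity and emitting an owl:sameAs link above a threshold; the weights, thresholds and sample records are illustrative assumptions.

```python
# Minimal sketch: combine name-string similarity and spatial proximity to
# decide whether two VGI features describe the same real-world entity.
import math
from difflib import SequenceMatcher

def haversine_km(lon1, lat1, lon2, lat2):
    lon1, lat1, lon2, lat2 = map(math.radians, (lon1, lat1, lon2, lat2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def match_score(f1, f2, max_dist_km=0.5):
    name_sim = SequenceMatcher(None, f1["name"].lower(), f2["name"].lower()).ratio()
    dist_sim = max(0.0, 1.0 - haversine_km(*f1["lonlat"], *f2["lonlat"]) / max_dist_km)
    return 0.6 * name_sim + 0.4 * dist_sim          # illustrative weighting

osm = {"uri": "http://example.org/osm/123", "name": "Central Park Cafe", "lonlat": (-73.9712, 40.7831)}
wiki = {"uri": "http://example.org/wm/456", "name": "Central Park Café", "lonlat": (-73.9714, 40.7830)}

if match_score(osm, wiki) > 0.8:
    print(f"<{osm['uri']}> owl:sameAs <{wiki['uri']}> .")
```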
NASA Astrophysics Data System (ADS)
Jafarzadegan, K.; Merwade, V.; Saksena, S.
2017-12-01
Using conventional hydrodynamic methods for floodplain mapping in large-scale and data-scarce regions is problematic due to the high cost of these methods, lack of reliable data, and uncertainty propagation. In this study, a new framework is proposed to generate 100-year floodplains for any gauged or ungauged watershed across the United States (U.S.). This framework uses Flood Insurance Rate Maps (FIRMs) together with topographic, climatic and land use data, which are freely available for the entire U.S., for floodplain mapping. The framework consists of three components: a Random Forest classifier for watershed classification, a Probabilistic Threshold Binary Classifier (PTBC) for generating the floodplains, and a lookup table for linking the Random Forest classifier to the PTBC. The effectiveness and reliability of the proposed framework is tested on 145 watersheds from various geographical locations in the U.S. The validation results show that around 80 percent of the watersheds are predicted well, 14 percent have an acceptable fit, and less than five percent are predicted poorly compared to FIRMs. Another advantage of this framework is its ability to generate floodplains for all small rivers and tributaries. Due to its high accuracy and efficiency, this framework can be used as a preliminary decision-making tool to generate 100-year floodplain maps for data-scarce regions and all tributaries where hydrodynamic methods are difficult to use.
Modern Data Center Services Supporting Science
NASA Astrophysics Data System (ADS)
Varner, J. D.; Cartwright, J.; McLean, S. J.; Boucher, J.; Neufeld, D.; LaRocque, J.; Fischman, D.; McQuinn, E.; Fugett, C.
2011-12-01
The National Oceanic and Atmospheric Administration's National Geophysical Data Center (NGDC) World Data Center for Geophysics and Marine Geology provides scientific stewardship, products and services for geophysical data, including bathymetry, gravity, magnetics, seismic reflection, data derived from sediment and rock samples, as well as historical natural hazards data (tsunamis, earthquakes, and volcanoes). Although NGDC has long made many of its datasets available through map and other web services, it has now developed a second generation of services to improve the discovery and access to data. These new services use off-the-shelf commercial and open source software, and take advantage of modern JavaScript and web application frameworks. Services are accessible using both RESTful and SOAP queries as well as Open Geospatial Consortium (OGC) standard protocols such as WMS, WFS, WCS, and KML. These new map services (implemented using ESRI ArcGIS Server) are finer-grained than their predecessors, feature improved cartography, and offer dramatic speed improvements through the use of map caches. Using standards-based interfaces allows customers to incorporate the services without having to coordinate with the provider. Providing fine-grained services increases flexibility for customers building custom applications. The Integrated Ocean and Coastal Mapping program and Coastal and Marine Spatial Planning program are two examples of national initiatives that require common data inventories from multiple sources and benefit from these modern data services. NGDC is also consuming its own services, providing a set of new browser-based mapping applications which allow the user to quickly visualize and search for data. One example is a new interactive mapping application to search and display information about historical natural hazards. NGDC continues to increase the amount of its data holdings that are accessible and is augmenting the capabilities with modern web application frameworks such as Groovy and Grails. Data discovery is being improved and simplified by leveraging ISO metadata standards along with ESRI Geoportal Server.
Chanda, Emmanuel; Ameneshewa, Birkinesh; Mihreteab, Selam; Berhane, Araia; Zehaie, Assefash; Ghebrat, Yohannes; Usman, Abdulmumini
2015-12-02
Contemporary malaria vector control relies on the use of insecticide-based indoor residual spraying (IRS) and long-lasting insecticidal nets (LLINs). However, malaria-endemic countries, including Eritrea, have struggled to deploy these tools effectively due to technical and operational challenges, including the selection of insecticide resistance in malaria vectors. This manuscript outlines the processes undertaken in consolidating strategic planning and operational frameworks for vector control to expedite malaria elimination in Eritrea. The effort to strengthen strategic frameworks for vector control in Eritrea was the 'case' for this study. The integrated vector management (IVM) strategy was developed in 2010 but was not well executed, resulting in a rise in malaria transmission and prompting a process to redefine and relaunch the IVM strategy with the integration of other vector-borne diseases (VBDs) as the focus. The information sources for this study included all available data and accessible archived documentary records on malaria vector control in Eritrea. Structured literature searches of published, peer-reviewed sources using online scientific bibliographic databases, Google Scholar, PubMed and WHO, and a combination of search terms were used to gather data. The literature was reviewed, adapted to the local context and translated into the consolidated strategic framework. In Eritrea, communities are grappling with VBDs of public health concern, including malaria. The Global Fund financed the scale-up of IRS and LLIN programmes in 2014. Eritrea is transitioning towards malaria elimination, and strategic frameworks for vector control have been consolidated by: developing an IVM strategy (2015-2019); updating IRS and larval source management (LSM) guidelines; developing training manuals for IRS and LSM; training national staff in malaria entomology and vector control, including insecticide resistance monitoring techniques; initiating the global plan for insecticide resistance management; conducting needs assessments and developing standard operating procedures for insectaries; and developing a guidance document on malaria vector control based on eco-epidemiological strata, a vector surveillance plan, and harmonized mapping, data collection and reporting tools. Eritrea has successfully consolidated strategic frameworks for vector control. Rational decision-making remains critical to ensure that the interventions are effective and their choice is evidence-based, and to optimize the use of resources for vector control. Implementation of effective IVM requires proper collaboration and coordination, and consistent technical and financial capacity and support, to offer greater benefits.
Decision support system based on DPSIR framework for a low flow Mediterranean river basin
NASA Astrophysics Data System (ADS)
Bangash, Rubab Fatima; Kumar, Vikas; Schuhmacher, Marta
2013-04-01
The application of decision-making practices is effectively enhanced by adopting a procedural approach setting out a general methodological framework within which specific methods, models and tools can be integrated. Integrated Catchment Management is a process that recognizes the river catchment as the basic organizing unit for understanding and managing ecosystem processes. Decision support becomes more complex when unavoidable human activities within a catchment, motivated by multiple and often competing criteria and/or constraints, are taken into account. DPSIR is a causal framework for describing the interactions between society and the environment. This framework has been adopted by the European Environment Agency, and the components of this model are: Driving forces, Pressures, States, Impacts and Responses. The proposed decision support system is a two-step framework based on DPSIR. Considering the first three components of DPSIR (Driving forces, Pressures and States), hydrological and ecosystem services models are developed. The last two components (Impacts and Responses) helped to develop a Bayesian Network to integrate the models. This decision support system also takes account of social, economic and environmental aspects. A small river of Catalonia (northeastern Spain), the Francolí River, with a low flow (~2 m³/s), is selected for the integration of catchment assessment models and to improve knowledge transfer from research to the stakeholders, with a view to improving the decision-making process. DHI's MIKE BASIN software is used to evaluate the low-flow Francolí River with respect to the water bodies' characteristics and to assess the impact of human activities, aiming to achieve good water status for all waters to comply with the WFD's River Basin Management Plan. Based on ArcGIS, MIKE BASIN is a versatile decision support tool that provides a simple and powerful framework for managers and stakeholders to address multisectoral allocation and environmental issues in river basins, while InVEST is a spatially explicit tool used to model and map a suite of ecosystem services affected by land cover changes or climate change impacts. Moreover, the results obtained from the low-flow hydrological simulation and the ecosystem services models serve as useful inputs to develop a decision support system based on the DPSIR framework by integrating the models. A Bayesian Network is used as a knowledge integration and visualization tool to summarize the outcomes of the hydrological and ecosystem services models at the "Response" stage of DPSIR. Bayesian Networks provide a framework for modelling the logical relationship between catchment variables and decision objectives by quantifying the strength of these relationships using conditional probabilities. The participatory nature of this framework can provide better communication of water research, particularly in the context of a perceived lack of future awareness-raising with the public, and helps to develop more sustainable water management strategies. Acknowledgements: The present study was financially supported by the Spanish Ministry of Economy and Competitiveness through the project SCARCE (Consolider-Ingenio 2010 CSD2009-00065). R. F. Bangash also received a PhD fellowship from AGAUR (Commissioner for Universities and Research of the Department of Innovation, Universities and Enterprise of the "Generalitat de Catalunya") and the European Social Fund.
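As a toy illustration of how a Bayesian Network can link catchment variables to a decision objective through conditional probabilities, the sketch below propagates a discrete "low flow" state and a "demand" state to an "ecosystem status" node. The node names and probability values are invented for illustration and are not taken from the Francolí case study.

```python
# Toy Bayesian Network inference by enumeration (illustrative values only).
# P(status | flow, demand) combines two parent nodes into one decision node.
p_flow = {"low": 0.6, "normal": 0.4}                     # P(flow)
p_demand = {"high": 0.5, "moderate": 0.5}                # P(demand)
p_poor_status = {                                        # P(status=poor | flow, demand)
    ("low", "high"): 0.85, ("low", "moderate"): 0.55,
    ("normal", "high"): 0.35, ("normal", "moderate"): 0.10,
}

# Marginal probability of poor ecological status.
p_poor = sum(
    p_flow[f] * p_demand[d] * p_poor_status[(f, d)]
    for f in p_flow for d in p_demand
)
print(f"P(status = poor) = {p_poor:.3f}")
```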
Interpersonal emotion regulation.
Zaki, Jamil; Williams, W Craig
2013-10-01
Contemporary emotion regulation research emphasizes intrapersonal processes such as cognitive reappraisal and expressive suppression, but people experiencing affect commonly choose not to go it alone. Instead, individuals often turn to others for help in shaping their affective lives. How and under what circumstances does such interpersonal regulation modulate emotional experience? Although scientists have examined allied phenomena such as social sharing, empathy, social support, and prosocial behavior for decades, there have been surprisingly few attempts to integrate these data into a single conceptual framework of interpersonal regulation. Here we propose such a framework. We first map a "space" differentiating classes of interpersonal regulation according to whether an individual uses an interpersonal regulatory episode to alter their own or another person's emotion. We then identify 2 types of processes--response-dependent and response-independent--that could support interpersonal regulation. This framework classifies an array of processes through which interpersonal contact fulfills regulatory goals. More broadly, it organizes diffuse, heretofore independent data on "pieces" of interpersonal regulation, and identifies growth points for this young and exciting research domain.
The operational context of care sport connectors in the Netherlands.
Leenaars, K E F; van der Velden-Bollemaat, E C; Smit, E; Wagemakers, A; Molleman, G R M; Koelen, M A
2017-02-22
To stimulate physical activity (PA) and guide primary care patients towards local sport facilities, Care Sport Connectors (CSCs), to whom a broker role has been ascribed, were introduced in 2012 in the Netherlands. The aim of this study is to describe CSCs' operational context. A theoretical framework was developed and used as the starting point for this study. Group interviews were held with policymakers in nine participating municipalities, and, when applicable, the CSC's manager was also present. Prior to the interviews, a first outline of the operational context was mapped, based on the analysis of policy documents and a questionnaire completed by the policymakers. A deductive content analysis, based on the theoretical framework, was used to analyse the interviews. Differences were found in CSCs' operational context in the different municipalities, especially in the extent to which municipalities adopted an integral approach. An integral approach consists of an integral policy in combination with an embedding of this policy in partnerships at management level. This integral approach is reflected in the activities of other municipal operations, for example the implementation of health and PA programs by different organisations. Given the CSC mandate, we think that this integral approach may be supportive of the CSCs' work, because it is reflected in other operations of the municipalities and thus creates favourable conditions for their work. Further study is required to ascertain whether this integral approach is actually supporting CSCs in their work to connect the primary care and the PA sector.
Booth, Andrew; Noyes, Jane; Flemming, Kate; Gerhardus, Ansgar; Wahlster, Philip; van der Wilt, Gert Jan; Mozygemba, Kati; Refolo, Pietro; Sacchini, Dario; Tummers, Marcia; Rehfuess, Eva
2018-07-01
To compare and contrast different methods of qualitative evidence synthesis (QES) against criteria identified from the literature and to map their attributes to inform selection of the most appropriate QES method to answer research questions addressed by qualitative research. Electronic databases, citation searching, and a study register were used to identify studies reporting QES methods. Attributes compiled from 26 methodological papers (2001-2014) were used as a framework for data extraction. Data were extracted into summary tables by one reviewer and then considered within the author team. We identified seven considerations determining choice of methods from the methodological literature, encapsulated within the mnemonic Review question-Epistemology-Time/Timescale-Resources-Expertise-Audience and purpose-Type of data. We mapped 15 different published QES methods against these seven criteria. The final framework focuses on stand-alone QES methods but may also hold potential when integrating quantitative and qualitative data. These findings offer a contemporary perspective as a conceptual basis for future empirical investigation of the advantages and disadvantages of different methods of QES. It is hoped that this will inform appropriate selection of QES approaches.
NASA Astrophysics Data System (ADS)
Rerikh, K. V.
1998-02-01
Using classic results of algebraic geometry for birational mappings of the plane CP², we present a general approach to algebraic integrability of autonomous dynamical systems in C² with discrete time and of systems of two autonomous functional equations for meromorphic functions in one complex variable defined by birational maps in C². General theorems defining the invariant curves and the dynamics of a birational mapping, and a general theorem on necessary and sufficient conditions for integrability of birational plane mappings, are proved on the basis of a new idea: a decomposition of the orbit set of indeterminacy points of the direct map relative to the action of the inverse mapping. A general method of generating integrable mappings and their rational integrals (invariants) I is proposed. Numerical characteristics N_k of the intersections of the orbits Φ_n^(−k)O_i of fundamental or indeterminacy points O_i ∈ O ∩ S of the mapping Φ_n, where O = {O_i} is the set of indeterminacy points of Φ_n and S is the analogous set for the invariant I, with the corresponding set O' ∩ S, where O' = {O'_i} is the set of indeterminacy points of the inverse mapping Φ_n^(−1), are introduced. Using the proposed method we obtain all nine integrable multiparameter quadratic birational reversible mappings with zero fixed point and linear projective symmetry S = CΛC^(−1), Λ = diag(±1), with rational invariants generated by invariant straight lines and conics. The relations of the numbers N_k to such numerical characteristics of discrete dynamical systems as the Arnold complexity, and to their integrability, are established for the integrable mappings obtained. The Arnold complexities of the integrable mappings obtained are determined. The main results are presented in Theorems 2-5, in Tables 1 and 2, and in Appendix A.
Sentinel-1 data exploitation for geohazard activity map generation
NASA Astrophysics Data System (ADS)
Barra, Anna; Solari, Lorenzo; Béjar-Pizarro, Marta; Monserrat, Oriol; Herrera, Gerardo; Bianchini, Silvia; Crosetto, Michele; María Mateos, Rosa; Sarro, Roberto; Moretti, Sandro
2017-04-01
This work is focused on geohazard mapping and monitoring by exploiting Sentinel-1 (A and B) data and DInSAR (Differential Interferometric SAR (Synthetic Aperture Radar)) techniques. The interpretation of DInSAR-derived products (such as the velocity map) can be complex, especially for final users who do not usually work with radar data. The aim of this work is to generate, in a rapid way, a clear product that can be easily exploited by the authorities in geohazard management: intervention planning and prevention activities. Specifically, the presented methodology has been developed in the framework of the European project SAFETY, which aims at providing Civil Protection Authorities (CPA) with the capability of periodically evaluating and assessing the potential impact of geohazards (volcanic activity, earthquakes, landslides and subsidence) on urban areas. The methodology has three phases: interferogram generation; activity map generation, in terms of velocity and accumulated deformation (with time series); and Active Deformation Area (ADA) map generation. The last one is the final product, derived from the original activity map by analyzing the data in a Geographic Information System (GIS) environment, which isolates only the true deformation areas from the noise. This product can be read by the authorities more easily than the original activity map, i.e. it can be better exploited to integrate other information and analyses. It also permits easy monitoring of the active areas.
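A much-simplified way to picture the ADA extraction step is to keep only those measurement points whose deformation velocity exceeds the noise level of the velocity map and then group neighbouring points into areas. The 2-sigma threshold and the clustering choice below are illustrative assumptions and do not reproduce the actual SAFETY processing chain.

```python
# Simplified sketch of isolating active deformation areas from a velocity map.
# The 2*sigma threshold and the connected-component grouping are illustrative.
import numpy as np
from scipy import ndimage

velocity = np.random.normal(0.0, 2.0, (500, 500))   # mm/yr, synthetic map
sigma = np.std(velocity)                             # noise level estimate

active = np.abs(velocity) > 2.0 * sigma              # points moving above noise
labels, n_areas = ndimage.label(active)              # group contiguous pixels
print(f"{n_areas} candidate active deformation areas")
```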
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bandaru, Varaprasad; Izaurralde, Roberto C.; Manowitz, David H.
2013-12-01
The use of marginal lands (MLs) for biofuel production has been contemplated as a promising solution for meeting biofuel demands. However, there have been concerns about the spatial location of MLs, their inherent biofuel potential, and the possible environmental consequences of cultivating energy crops. Here, we developed a new quantitative approach that integrates high-resolution land cover and land productivity maps and uses conditional probability density functions for analyzing land use patterns as a function of land productivity to classify agricultural lands. We subsequently applied this method to determine available productive croplands (P-CLs) and non-crop marginal lands (NC-MLs) in a nine-county region of Southern Michigan. Furthermore, a Spatially Explicit Integrated Modeling Framework (SEIMF) using EPIC (Environmental Policy Integrated Climate) was used to understand the net energy (NE) and soil organic carbon (SOC) implications of cultivating different annual and perennial production systems.
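The classification idea of analysing land use as a function of land productivity can be sketched with conditional probability density functions: estimate the density of productivity values separately for cropland and non-crop samples, and assign a pixel to whichever class is more likely at its productivity level. The synthetic samples and the simple likelihood rule below are an illustrative stand-in, not the authors' calibrated procedure.

```python
# Illustrative conditional-PDF classification of pixels by land productivity.
# Training samples and the likelihood-comparison rule are assumptions.
import numpy as np
from scipy.stats import gaussian_kde

crop_productivity = np.random.normal(7.0, 1.5, 1000)      # synthetic samples
noncrop_productivity = np.random.normal(3.5, 1.5, 1000)

pdf_crop = gaussian_kde(crop_productivity)                 # p(productivity | crop)
pdf_noncrop = gaussian_kde(noncrop_productivity)           # p(productivity | non-crop)

def classify(productivity):
    """Label each pixel 'P-CL' (productive cropland) or 'NC-ML' (non-crop marginal land)."""
    return np.where(pdf_crop(productivity) >= pdf_noncrop(productivity),
                    "P-CL", "NC-ML")

print(classify(np.array([2.0, 5.5, 8.0])))
```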
Semantically-enabled sensor plug & play for the sensor web.
Bröring, Arne; Maúe, Patrick; Janowicz, Krzysztof; Nüst, Daniel; Malewski, Christian
2011-01-01
Environmental sensors have continuously improved by becoming smaller, cheaper, and more intelligent over the past years. As a consequence of these technological advancements, sensors are increasingly deployed to monitor our environment. The large variety of available sensor types with often incompatible protocols complicates the integration of sensors into observing systems. The standardized Web service interfaces and data encodings defined within OGC's Sensor Web Enablement (SWE) framework make sensors available over the Web and hide the heterogeneous sensor protocols from applications. So far, the SWE framework does not describe how to integrate sensors on-the-fly with minimal human intervention. The driver software which enables access to sensors has to be implemented, and the measured sensor data have to be manually mapped to the SWE models. In this article we introduce a Sensor Plug & Play infrastructure for the Sensor Web by combining (1) semantic matchmaking functionality, (2) a publish/subscribe mechanism underlying the Sensor Web, and (3) a model for the declarative description of sensor interfaces which serves as a generic driver mechanism. We implement and evaluate our approach by applying it to an oil spill scenario. The matchmaking is realized using existing ontologies and reasoning engines and provides a strong case for the semantic integration capabilities provided by Semantic Web research.
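The plug-and-play idea rests on two ingredients the abstract names explicitly: a publish/subscribe mechanism and a matchmaking step that pairs a newly announced sensor description with a service able to host it. The tiny broker below illustrates only the control flow; the topic name, the keyword-based matching and the sensor description format are simplified assumptions, not the SWE or SensorML encodings.

```python
# Minimal publish/subscribe broker with keyword matchmaking (illustrative only).
from collections import defaultdict

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)

def matchmaker(sensor_description):
    """Crude semantic matchmaking: route by observed property keyword."""
    if "temperature" in sensor_description["observes"]:
        print("-> registering with the temperature observation service")
    else:
        print("-> no matching service found")

broker = Broker()
broker.subscribe("new-sensor", matchmaker)
broker.publish("new-sensor", {"id": "buoy-7", "observes": ["temperature", "salinity"]})
```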
Mapping Electrical Structures in the Jarud Basin, Northeast China through Magnetotelluric Sounding
NASA Astrophysics Data System (ADS)
Zhao, W.
2015-12-01
In recent years, the China Geological Survey (CGS) has launched 3D geological mapping programs from regional to local scales. The project 'Deep geological survey at the periphery of the Songliao Basin', funded by CGS, was implemented from 2012 to 2014. Its main goals are to reveal the tectonic framework of the Jarud Basin (JB) and to identify the strata distribution of the Permian Linxi Formation by integrating new electromagnetic data with existing geophysical and geological data, since black mudstones in the Linxi Formation have shown shale gas potential. The study area, covered dominantly by Cretaceous-Jurassic igneous rocks with the exception of the southeast part, is situated in Jarud Banner and Ar Horqin Banner, Inner Mongolia, China. It lies tectonically in the southern Great Khingan Range, on the western margin of the Songliao Basin, north of the Xar Moron Fault. Over the period 2012 to 2014, a magnetotelluric survey was carried out in the JB. A total of 926 MT sites with a nominal spacing of 1 km were acquired in the effective frequency range of 0.01-300 Hz along six NW and five NE profiles, covering an area exceeding 10,000 km². After dimensionality analysis and static shift removal, a nonlinear conjugate gradient algorithm was used to conduct 2D inversions of the TM and TE modes. The resistivity models were examined using sensitivity tests. The optimal resistivity models revealed numerous large faults, some of which constitute the boundaries of the JB, and modified the tectonic framework. Integrated with well logging and geological mapping data, the strata of the Linxi Formation were identified and classified into three depressions: Arituguri, Gadasu and Wufen. Attention should be paid to the Gadasu Depression, with an area of around 500 km², since it contains reasonably thick conductive sediments extending to depths exceeding 4 km, which are inferred to be black mudstones with shale gas potential.
A Semiparametric Approach for Composite Functional Mapping of Dynamic Quantitative Traits
Yang, Runqing; Gao, Huijiang; Wang, Xin; Zhang, Ji; Zeng, Zhao-Bang; Wu, Rongling
2007-01-01
Functional mapping has emerged as a powerful tool for mapping quantitative trait loci (QTL) that control developmental patterns of complex dynamic traits. Original functional mapping has been constructed within the context of simple interval mapping, without consideration of separate multiple linked QTL for a dynamic trait. In this article, we present a statistical framework for mapping QTL that affect dynamic traits by capitalizing on the strengths of functional mapping and composite interval mapping. Within this so-called composite functional-mapping framework, functional mapping models the time-dependent genetic effects of a QTL tested within a marker interval using a biologically meaningful parametric function, whereas composite interval mapping models the time-dependent genetic effects of the markers outside the test interval to control the genome background using a flexible nonparametric approach based on Legendre polynomials. Such a semiparametric framework was formulated by a maximum-likelihood model and implemented with the EM algorithm, allowing for the estimation and testing of the mathematical parameters that define the QTL effects and the regression coefficients of the Legendre polynomials that describe the marker effects. Simulation studies were performed to investigate the statistical behavior of composite functional mapping and its advantage in separating multiple linked QTL compared to functional mapping. We used the new mapping approach to analyze a genetic mapping example in rice, leading to the identification of multiple QTL, some of which are linked on the same chromosome, that control the developmental trajectory of leaf age. PMID:17947431
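In the composite functional-mapping model, the time-dependent background effects of markers outside the test interval are expanded in Legendre polynomials. The snippet below shows only how such a nonparametric time curve is evaluated from a set of regression coefficients; the coefficient values and the time scaling are illustrative assumptions, not estimates from the rice data.

```python
# Evaluating a marker-effect curve expanded in Legendre polynomials.
# The coefficients below are placeholders for the fitted regression coefficients.
import numpy as np
from numpy.polynomial import legendre

coeffs = [0.8, 0.3, -0.15, 0.05]          # u_0..u_3 for one background marker
t = np.linspace(0, 1, 11)                 # measurement times
t_scaled = 2.0 * t - 1.0                  # map onto [-1, 1], the Legendre domain

marker_effect = legendre.legval(t_scaled, coeffs)
print(marker_effect)
```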
Animation of Mapped Photo Collections for Storytelling
NASA Astrophysics Data System (ADS)
Fujita, Hideyuki; Arikawa, Masatoshi
Our research goal is to facilitate the sharing of stories with digital photographs. Some map websites now collect stories associated with people's relationships to places. Users map collections of places and include their intangible emotional associations with each location along with photographs, videos, etc. Though this framework of mapping stories is important, it is not sufficiently expressive to communicate stories in a narrative fashion. For example, when the number of the mapped collections of places is particularly large, it is neither easy for viewers to interpret the map nor is it easy for the creator to express a story as a series of events in the real world. This is because each narrative, in the form of a sequence of textual narratives, a sequence of photographs, a movie, or audio, is mapped to just one point. As a result, it is up to the viewer to decide which points on the map must be read, and in what order. The conventional framework is fairly suitable for mapping and expressing fragments or snapshots of a whole story, and not for conveying the whole story as a narrative using the entire map as the setting. We therefore propose a new framework, Spatial Slideshow, for mapping personal photo collections and representing them as stories such as route guides, sightseeing guides, historical topics, fieldwork records, personal diaries, and so on. It is a fusion of personal photo mapping and photo storytelling. Each story is conveyed through a sequence of mapped photographs, presented as a synchronized animation of a map and an enhanced photo slideshow. The main technical novelty of this paper is a method for creating three-dimensional animations of photographs that induce the visual effect of motion from photo to photo. We believe that the proposed framework may have considerable significance in facilitating the grassroots development of spatial content driven by visual communication concerning real-world locations or events.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pietzcker, Robert C.; Ueckerdt, Falko; Carrara, Samuel
Mitigation-Process Integrated Assessment Models (MP-IAMs) are used to analyze long-term transformation pathways of the energy system required to achieve stringent climate change mitigation targets. Due to their substantial temporal and spatial aggregation, IAMs cannot explicitly represent all detailed challenges of integrating the variable renewable energies (VRE) wind and solar in power systems, but rather rely on parameterized modeling approaches. In the ADVANCE project, six international modeling teams have developed new approaches to improve the representation of power sector dynamics and VRE integration in IAMs. In this study, we qualitatively and quantitatively evaluate the modeling progress of the last years and study the impact of VRE integration modeling on VRE deployment in IAM scenarios. For a comprehensive and transparent qualitative evaluation, we first develop a framework of 18 features of power sector dynamics and VRE integration. We then apply this framework to the newly-developed modeling approaches to derive a detailed map of strengths and limitations of the different approaches. For the quantitative evaluation, we compare the IAMs to the detailed hourly-resolution power sector model REMIX. We find that the new modeling approaches manage to represent a large number of features of the power sector, and the numerical results are in reasonable agreement with those derived from the detailed power sector model. Updating the power sector representation and the cost and resources of wind and solar substantially increased wind and solar shares across models: under a carbon price of $30/tCO2 in 2020 (increasing by 5% per year), the model-average cost-minimizing VRE share over the period 2050-2100 is 62% of electricity generation, 24 percentage points higher than with the old model version.
RIMS: An Integrated Mapping and Analysis System with Applications to Earth Sciences and Hydrology
NASA Astrophysics Data System (ADS)
Proussevitch, A. A.; Glidden, S.; Shiklomanov, A. I.; Lammers, R. B.
2011-12-01
A web-based information and computational system for the analysis of spatially distributed Earth system, climate, and hydrologic data has been developed. The system allows visualization, data exploration, querying, manipulation and arbitrary calculations with any loaded gridded or vector polygon dataset. The system's acronym, RIMS, stands for its core functionality as a Rapid Integrated Mapping System. The system can be deployed for global-scale projects as well as for regional hydrology and climatology studies. In particular, the Water Systems Analysis Group of the University of New Hampshire developed global and regional (Northern Eurasia, pan-Arctic) versions of the system with different map projections and specific data. The system has demonstrated its potential for applications in other fields of Earth sciences and education. The key Web server/client components of the framework include (a) a visualization engine built on Open Source libraries (GDAL, PROJ.4, etc.) that are utilized in MapServer; (b) multi-level data querying tools built on XML server-client communication protocols that allow downloading map data on-the-fly to a client web browser; and (c) data manipulation and grid-cell-level calculation tools that mimic desktop GIS software functionality via a web interface. Server-side data management is designed around a simple database of dataset metadata, facilitating the mounting of new data to the system and easy maintenance of existing data. RIMS contains "built-in" river network data that allows on-demand querying of upstream areas, which can be used for spatial data aggregation and analysis of sub-basin areas. RIMS is an ongoing effort and is currently being used to serve a number of websites hosting a suite of hydrologic, environmental and other GIS data.
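Grid-cell-level calculations of the kind RIMS exposes through its web interface ultimately come down to reading a raster into an array and operating on it. The sketch below uses the GDAL Python bindings mentioned among the system's building blocks; the file name and the unit conversion are hypothetical examples, not part of RIMS itself.

```python
# Reading a gridded dataset with GDAL and doing a per-cell calculation.
# The file name and the mm/day -> mm/year conversion are illustrative.
from osgeo import gdal
import numpy as np

dataset = gdal.Open("runoff_mm_per_day.tif")    # hypothetical raster
band = dataset.GetRasterBand(1)
runoff = band.ReadAsArray().astype(float)

nodata = band.GetNoDataValue()
if nodata is not None:
    runoff[runoff == nodata] = np.nan

annual_runoff = runoff * 365.0                  # per-cell calculation
print("mean annual runoff:", np.nanmean(annual_runoff))
```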
Participatory GIS for Soil Conservation in Phewa Watershed of Nepal
NASA Astrophysics Data System (ADS)
Bhandari, K. P.
2012-07-01
Participatory Geographic Information Systems (PGIS) can integrate participatory methodologies with geo-spatial technologies for the representation of the characteristics of a particular place. Over the last decade, researchers have used this method to integrate the local knowledge of communities within a GIS and Society conceptual framework. Participatory GIS is tailored to answer specific geographic questions at the local level, and its modes of implementation vary considerably across space, ranging from field-based, qualitative approaches to more complex web-based applications. Through this broad range of techniques, PGIS is becoming an effective methodology for incorporating local community knowledge into complex spatial decision-making processes. The objective of this study is to reduce soil erosion by formulating general rules for soil conservation through stakeholder participation. A poster map was prepared from satellite imagery, a topographic map and ArcGIS software, incorporating local knowledge. Data were collected from focus group discussions and individual questionnaires to incorporate local knowledge and to derive a risk map based on economic, social and manageable physical factors for the sensitivity analysis. The soil erosion risk map was prepared from the physical factors of the RUSLE model: rainfall-runoff erosivity, soil erodibility, slope length, slope steepness, cover management and conservation practice. After comparison and discussion among stakeholders, researchers and an expert group, the soil erosion risk map showed that management of socioeconomic, social and manageable physical factors can reduce soil erosion. The study showed that preparing the poster GIS map and implementing it in the watershed area could reduce soil erosion in the study area compared to the existing national policy.
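The RUSLE factors listed above combine multiplicatively into an average annual soil loss estimate, A = R × K × LS × C × P. The factor values in the sketch below are arbitrary illustrative numbers, not measurements from the Phewa watershed.

```python
# RUSLE average annual soil loss, A = R * K * LS * C * P.
# All factor values below are illustrative placeholders.
def rusle_soil_loss(R, K, LS, C, P):
    """Return soil loss A (t ha^-1 yr^-1) from the RUSLE factors."""
    return R * K * LS * C * P

A = rusle_soil_loss(R=1200.0,  # rainfall-runoff erosivity (MJ mm ha^-1 h^-1 yr^-1)
                    K=0.03,    # soil erodibility (t ha h ha^-1 MJ^-1 mm^-1)
                    LS=4.5,    # slope length and steepness factor
                    C=0.25,    # cover-management factor
                    P=0.8)     # conservation-practice factor
print(f"Estimated soil loss: {A:.1f} t/ha/yr")
```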
A framework for air quality monitoring based on free public data and open source tools
NASA Astrophysics Data System (ADS)
Nikolov, Hristo; Borisova, Denitsa
2014-10-01
In recent years, space agencies (e.g. NASA, ESA) have increasingly adopted a policy of providing Earth observation (EO) data and end products concerning air quality, especially in large urban areas, without cost to researchers and SMEs. These EO data are complemented by an increasing amount of in-situ data, also provided at no cost either by national authorities or from crowdsourced origins. This accessibility, together with the increased processing capabilities of free and open source software, is a prerequisite for the creation of a solid framework for air quality modeling in support of decision making at medium and large scales. An essential part of this framework is a web-based GIS mapping tool responsible for dissemination of the generated output. In this research an attempt is made to establish a running framework based solely on openly accessible air quality data and a set of freely available software tools for processing and modeling, taking into account the present status quo in Bulgaria. Among the primary sources of data, especially for larger urban areas and for different types of gases and dust particles, are the National Institute of Meteorology and Hydrology of Bulgaria (NIMH) and the National System for Environmental Monitoring managed by the Bulgarian Executive Environmental Agency (ExEA). Both authorities provide concentration data for several gases, among them CO, CO2, NO2 and SO2, and for fine suspended dust (PM10, PM2.5), on a monthly (for some data, daily) basis. In the proposed framework these data will complement data from satellite-based sensors such as the OMI instrument aboard the EOS-Aura satellite and the TROPOMI instrument aboard the future ESA Sentinel-5P mission. An integral part of the framework is the up-to-date land use/land cover map provided by the EEA through the GIO Land CORINE initiative, itself a product of EO data distributed at the European level. First and above all, our effort is focused on providing the wider public living in urbanized areas with one reliable source of information on present air quality conditions. This information might also be used as an indicator of acid rain in agricultural areas close to industrial or power plants. Its availability on a regular basis makes such information a valuable source in case of man-made industrial disasters or incidents such as forest fires. A key issue in developing this framework is to ensure the delivery of reliable air quality data products at a larger scale than those available at the moment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ciuca, Razvan; Hernández, Oscar F., E-mail: razvan.ciuca@mail.mcgill.ca, E-mail: oscarh@physics.mcgill.ca
There exist various proposals to detect cosmic strings from Cosmic Microwave Background (CMB) or 21 cm temperature maps. Current proposals do not aim to find the location of strings on sky maps; all of these approaches can be thought of as a statistic on a sky map. We propose a Bayesian interpretation of cosmic string detection and, within that framework, we derive a connection between estimates of cosmic string locations and the cosmic string tension Gμ. We use this Bayesian framework to develop a machine learning framework for detecting strings from sky maps and outline how to implement this framework with neural networks. The neural network we trained was able to detect and locate cosmic strings on a noiseless CMB temperature map down to a string tension of Gμ = 5×10^(-9), and when analyzing a CMB temperature map that does not contain strings, the neural network gives a 0.95 probability that Gμ ≤ 2.3×10^(-9).
Conceptualising and mapping coupled estuary, coast and inner shelf sediment systems
NASA Astrophysics Data System (ADS)
French, Jon; Burningham, Helene; Thornhill, Gillian; Whitehouse, Richard; Nicholls, Robert J.
2016-03-01
Whilst understanding and predicting the effects of coastal change are primarily modelling problems, it is essential that we have appropriate conceptual frameworks for (1) the formalisation of existing knowledge; (2) the formulation of relevant scientific questions and management issues; (3) the implementation and deployment of predictive models; and (4) meaningful engagement of stakeholders. Important progress continues to be made on the modelling front, but our conceptual frameworks have not evolved at a similar pace. Accordingly, this paper presents a new approach that re-engages with formal systems analysis and provides a mesoscale geomorphological context within which the coastal management challenges of the 21st century can be more effectively addressed. Coastal and Estuarine System Mapping (CESM) is founded on an ontology of landforms and human interventions that is partly inspired by the coastal tract concept and its temporal hierarchy of sediment sharing systems, but places greater emphasis on a hierarchy of spatial scales. This extends from coastal regions, through landform complexes, to landforms, the morphological adjustment of which is constrained by diverse forms of human intervention. Crucially, CESM integrates open coastal environments with estuaries and relevant portions of the inner shelf that have previously been treated separately. In contrast to the nesting of littoral cells that has hitherto framed shoreline management planning, CESM charts a complex web of interactions, of which a sub-set of mass transfer pathways defines the sediment budget, and a multitude of human interventions constrains natural landform behaviour. Conducted within a geospatial framework, CESM constitutes a form of knowledge formalisation in which disparate sources of information (published research, imagery, mapping, raw data etc.) are generalised into usable knowledge. The resulting system maps provide a framework for the development and application of predictive models and a repository for the outputs they generate (not least, flux estimates for the major sediment system pathways). They also permit comparative analyses of the relative abundance of landforms and the multi-scale interactions between them. Finally, they articulate scientific understanding of the structure and function of complex geomorphological systems in a way that is transparent and accessible to diverse stakeholder audiences. As our models of mesoscale landform evolution increase in sophistication, CESM provides a platform for a more participatory approach to their application to coastal and estuarine management.
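CESM's system maps are essentially directed graphs: landform components at several spatial scales linked by sediment-transfer pathways and constrained by interventions. The toy graph below, built with the networkx library, shows how such a map might be encoded and queried; the component names and the flux values are invented for illustration and are not drawn from any published CESM map.

```python
# Toy coastal/estuarine system map as a directed graph (illustrative only).
import networkx as nx

cesm = nx.DiGraph()
# Nodes: landforms and human interventions (hypothetical examples).
cesm.add_node("cliff", kind="landform")
cesm.add_node("beach", kind="landform")
cesm.add_node("ebb delta", kind="landform")
cesm.add_node("groyne field", kind="intervention")

# Edges: sediment transfer pathways with indicative annual fluxes (m^3/yr).
cesm.add_edge("cliff", "beach", flux=5000)
cesm.add_edge("beach", "ebb delta", flux=2000)
cesm.add_edge("groyne field", "beach", flux=0)   # intervention constrains transport

# Query: total sediment input to the beach.
beach_input = sum(d["flux"] for _, _, d in cesm.in_edges("beach", data=True))
print("sediment input to beach:", beach_input, "m^3/yr")
```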
A cost-benefit analysis of The National Map
Halsing, David L.; Theissen, Kevin; Bernknopf, Richard
2003-01-01
The Geography Discipline of the U.S. Geological Survey (USGS) has conducted this cost-benefit analysis (CBA) of The National Map. This analysis is an evaluation of the proposed Geography Discipline initiative to provide the Nation with a mechanism to access current and consistent digital geospatial data. This CBA is a supporting document to accompany the Exhibit 300 Capital Asset Plan and Business Case of The National Map Reengineering Program. The framework for estimating the benefits is based on expected improvements in processing information to perform any of the possible applications of spatial data. This analysis does not attempt to determine the benefits and costs of performing geospatial-data applications. Rather, it estimates the change in the differences between those benefits and costs with The National Map and the current situation without it. The estimates of total costs and benefits of The National Map were based on the projected implementation time, development and maintenance costs, rates of data inclusion and integration, expected usage levels over time, and a benefits estimation model. The National Map provides data that are current, integrated, consistent, complete, and more accessible in order to decrease the cost of implementing spatial-data applications and (or) improve the outcome of those applications. The efficiency gains in per-application improvements are greater than the cost to develop and maintain The National Map, meaning that the program would bring a positive net benefit to the Nation. The average improvement in the net benefit of performing a spatial data application was multiplied by a simulated number of application implementations across the country. The numbers of users, existing applications, and rates of application implementation increase over time as The National Map is developed and accessed by spatial data users around the country. Results from the 'most likely' estimates of model parameters and data inputs indicate that, over its 30-year projected lifespan, The National Map will bring a net present value (NPV) of benefits of $2.05 billion in 2001 dollars. The average time until the initial investments are recovered (the break-even period) is 14 years. Table ES-1 shows a running total of NPV in each year of the simulation model. In year 14, The National Map first shows a positive NPV, and so the table is highlighted in gray after that point. Figure ES-1 is a graph of the total benefit and total cost curves of a single model run over time. The curves cross in year 14, when the project breaks even. A sensitivity analysis of the input variables illustrated that the NPV results for The National Map are quite robust. Figure ES-2 plots the mean NPV results from 60 different scenarios, each consisting of fifty 30-year runs. The error bars represent a two-standard-deviation range around each mean. The analysis that follows contains the details of the cost-benefit analysis, the framework for evaluating economic benefits, a computational simulation tool, and a sensitivity analysis of model variables and values.
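The cost-benefit logic summarised above reduces to discounting annual benefit and cost streams to a net present value and finding the year in which cumulative NPV first turns positive. The discount rate and cash-flow numbers below are invented placeholders, not values from the USGS analysis.

```python
# Net present value and break-even year for a stream of costs and benefits.
# The 30-year cash flows and the 7% discount rate are illustrative assumptions.
def npv_by_year(benefits, costs, rate):
    """Cumulative NPV at the end of each year."""
    cumulative, out = 0.0, []
    for t, (b, c) in enumerate(zip(benefits, costs)):
        cumulative += (b - c) / (1.0 + rate) ** t
        out.append(cumulative)
    return out

benefits = [0, 0, 50, 120, 200] + [260] * 25          # $ millions per year
costs = [150, 140, 90, 60, 50] + [40] * 25
cumulative_npv = npv_by_year(benefits, costs, rate=0.07)

break_even = next(i for i, v in enumerate(cumulative_npv) if v > 0)
print(f"NPV after 30 years: {cumulative_npv[-1]:.0f} M$, break-even in year {break_even}")
```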
A Fast and Scalable Radiation Hybrid Map Construction and Integration Strategy
Agarwala, Richa; Applegate, David L.; Maglott, Donna; Schuler, Gregory D.; Schäffer, Alejandro A.
2000-01-01
This paper describes a fast and scalable strategy for constructing a radiation hybrid (RH) map from data on different RH panels. The maps on each panel are then integrated to produce a single RH map for the genome. Recurring problems in using maps from several sources are that the maps use different markers, the maps do not place the overlapping markers in the same order, and the objective functions for map quality are incomparable. We use methods from combinatorial optimization to develop a strategy that addresses these issues. We show that, by the standard objective functions of obligate chromosome breaks and maximum likelihood, software for the traveling salesman problem produces RH maps of better quality much more quickly than software specifically tailored for RH mapping. We use known algorithms for the longest common subsequence problem as part of our map integration strategy. We demonstrate our methods by reconstructing and integrating maps for markers typed on the Genebridge 4 (GB4) and the Stanford G3 panels publicly available from the RH database. We compare the quality of our integrated map with published maps for the GB4 and G3 panels by considering whether markers occur in the same order on a map and in DNA sequence contigs submitted to GenBank. We find that all of the maps are inconsistent with the sequence data for at least 50% of the contigs, but our integrated maps are more consistent. The map integration strategy scales not only to multiple RH maps but also to any maps that have comparable criteria for measuring map quality. Our software improves on current technology for RH mapping in the areas of computation time and algorithms for considering a large number of markers for mapping. The essential impediments to producing dense high-quality RH maps are data quality and panel size, not computation. PMID:10720576
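The map-integration step relies on the longest common subsequence of marker orders from different panels to find a consistent backbone. A standard dynamic-programming LCS, applied to two hypothetical marker orderings, is sketched below; it is a textbook algorithm rather than the authors' production code, and the marker names are generic placeholders.

```python
# Longest common subsequence of two marker orderings (textbook dynamic program).
def lcs(a, b):
    n, m = len(a), len(b)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    # Trace back the shared marker order.
    out, i, j = [], n, m
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

gb4_order = ["M1", "M2", "M3", "M4", "M5"]   # hypothetical marker order, panel 1
g3_order = ["M2", "M1", "M3", "M5", "M4"]    # hypothetical marker order, panel 2
print(lcs(gb4_order, g3_order))              # consistent backbone shared by both maps
```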
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manem, V; Paganetti, H
Purpose: Evaluate the excess relative risk (ERR) induced by photons and protons in each voxel of the lung, and display it as a three-dimensional map, known as the ERRM (i.e. excess relative risk map), along with the dose distribution map. In addition, we also study the effect of variations in the linear energy transfer (LET) distribution on the ERRM for a given proton plan. Methods: The excess relative risk due to radiation is estimated using the initiation-inactivation-proliferation formalism. This framework accounts for three biological phenomena: mutation induction, cell kill and proliferation. Cell kill and mutation induction are taken as a function of LET using experimental data. LET distributions are calculated using a Monte Carlo algorithm. ERR is then estimated for each voxel in the organ and displayed as a three-dimensional carcinogenic map. Results: The differences in ERR between photons and protons are seen from the three-dimensional ERR map. In addition, we varied the LET of a proton plan and observed the differences in the corresponding ERR maps, demonstrating that variations in the ERR maps depend on features of the proton plan. Additionally, our results suggest that two proton plans with the same integral dose do not necessarily have identical ERR maps; these changes are due to variations in the LET distribution map. Conclusion: Clinically, it is important to have a three-dimensional display of biological end points. This study is an effort to introduce 3D ERR maps into the treatment planning workflow for certain sites such as pediatric head and neck tumors.
IntegratedMap: a Web interface for integrating genetic map data.
Yang, Hongyu; Wang, Hongyu; Gingle, Alan R
2005-05-01
IntegratedMap is a Web application and database schema for storing and interactively displaying genetic map data. Its Web interface includes a menu for direct chromosome/linkage group selection, a search form for selection based on mapped object location and linkage group displays. An overview display provides convenient access to the full range of mapped and anchored object types with genetic locus details, such as numbers, types and names of mapped/anchored objects displayed in a compact scrollable list box that automatically updates based on selected map location and object type. Also, multilinkage group and localized map views are available along with links that can be configured for integration with other Web resources. IntegratedMap is implemented in C#/ASP.NET and the package, including a MySQL schema creation script, is available from http://cggc.agtec.uga.edu/Data/download.asp
Cho, HyunGi; Yeon, Suyong; Choi, Hyunga; Doh, Nakju
2018-01-01
In a group of general geometric primitives, plane-based features are widely used for indoor localization because of their robustness against noise. However, a lack of linearly independent planes may lead to a non-trivial estimation problem. This in turn can cause a degenerate state in which not all states can be estimated. To solve this problem, this paper first proposes a degeneracy detection method. A compensation method that can fix orientations by projecting an inertial measurement unit's (IMU) information is then explained. Experiments were conducted using an IMU-Kinect v2 integrated sensor system prone to falling into degenerate cases owing to its narrow field-of-view. Results showed that the proposed framework could enhance map accuracy by successful detection and compensation of degenerate orientations. PMID:29565287
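One simple way to express the degeneracy condition, a lack of linearly independent planes, is to stack the observed plane normals into a matrix and check its rank (or its smallest singular value): if fewer than three independent normal directions are observed, motion along the missing direction is unconstrained. The sketch below is a generic illustration of that idea, not the detection method of the cited paper.

```python
# Detecting a degenerate plane configuration from observed plane normals.
# The rank/singular-value test is a generic illustration, not the paper's method.
import numpy as np

normals = np.array([           # unit normals of currently observed planes
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 1.0, 0.0],           # parallel to the previous plane
])

_, singular_values, _ = np.linalg.svd(normals)
degenerate = (singular_values < 1e-6).any() or normals.shape[0] < 3
if degenerate:
    print("degenerate: the pose is unconstrained along at least one direction")
```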
Modeling Global Urbanization Supported by Nighttime Light Remote Sensing
NASA Astrophysics Data System (ADS)
Zhou, Y.
2015-12-01
Urbanization, a major driver of global change, profoundly impacts our physical and social world, for example, altering carbon cycling and climate. Understanding these consequences for better scientific insights and effective decision-making unarguably requires accurate information on urban extent and its spatial distributions. In this study, we developed a cluster-based method to estimate the optimal thresholds and map urban extents from the nighttime light remote sensing data, extended this method to the global domain by developing a computational method (parameterization) to estimate the key parameters in the cluster-based method, and built a consistent 20-year global urban map series to evaluate the time-reactive nature of global urbanization (e.g. 2000 in Fig. 1). Supported by urban maps derived from nightlights remote sensing data and socio-economic drivers, we developed an integrated modeling framework to project future urban expansion by integrating a top-down macro-scale statistical model with a bottom-up urban growth model. With the models calibrated and validated using historical data, we explored urban growth at the grid level (1-km) over the next two decades under a number of socio-economic scenarios. The derived spatiotemporal information of historical and potential future urbanization will be of great value with practical implications for developing adaptation and risk management measures for urban infrastructure, transportation, energy, and water systems when considered together with other factors such as climate variability and change, and high impact weather events.
Teaching Population Health: A Competency Map Approach to Education
Kaprielian, Victoria S.; Silberberg, Mina; McDonald, Mary Anne; Koo, Denise; Hull, Sharon K.; Murphy, Gwen; Tran, Anh N.; Sheline, Barbara L.; Halstater, Brian; Martinez-Bianchi, Viviana; Weigle, Nancy J.; de Oliveira, Justine Strand; Sangvai, Devdutta; Copeland, Joyce; Tilson, Hugh H.; Scutchfield, F. Douglas; Michener, J. Lloyd
2013-01-01
A 2012 Institute of Medicine report is the latest in the growing number of calls to incorporate a population health approach in health professionals’ training. Over the last decade, Duke University, particularly its Department of Community and Family Medicine, has been heavily involved with community partners in Durham, North Carolina to improve the local community’s health. Based on these initiatives, a group of interprofessional faculty began tackling the need to fill the curriculum gap to train future health professionals in public health practice, community engagement, critical thinking, and team skills to improve population health effectively in Durham and elsewhere. The Department of Community and Family Medicine has spent years in care delivery redesign and curriculum experimentation, design, and evaluation to distinguish the skills trainees and faculty need for population health improvement and to integrate them into educational programs. These clinical and educational experiences have led to a set of competencies that form an organizational framework for curricular planning and training. This framework delineates which learning objectives are appropriate and necessary for each learning level, from novice through expert, across multiple disciplines and domains. The resulting competency map has guided Duke’s efforts to develop, implement, and assess training in population health for learners and faculty. In this article, the authors describe the competency map development process as well as examples of its application and evaluation at Duke and limitations to its use with the hope that other institutions will apply it in different settings. PMID:23524919
A unified theoretical framework for mapping models for the multi-state Hamiltonian.
Liu, Jian
2016-11-28
We propose a new unified theoretical framework to construct equivalent representations of the multi-state Hamiltonian operator and present several approaches for the mapping onto the Cartesian phase space. After mapping an F-dimensional Hamiltonian onto an (F+1)-dimensional space, creation and annihilation operators are defined such that the (F+1)-dimensional space is complete for any combined excitation. Commutation and anti-commutation relations are then naturally derived, which show that the underlying degrees of freedom are neither bosons nor fermions. This sets the scene for developing equivalent expressions of the Hamiltonian operator in quantum mechanics and their classical/semiclassical counterparts. Six mapping models are presented as examples. The framework also offers a novel way to derive models such as the well-known Meyer-Miller model.
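For orientation, one commonly quoted form of the Meyer-Miller mapping of an F-state Hamiltonian onto Cartesian variables is reproduced below; it is quoted from the general literature as an illustration, not from this paper's derivation, and the zero-point parameter convention varies between authors.

```latex
% One common form of the Meyer-Miller mapping Hamiltonian (conventions vary).
H_{\mathrm{MM}}(\mathbf{x},\mathbf{p})
  = \sum_{n=1}^{F}\sum_{m=1}^{F} H_{nm}
    \left[\tfrac{1}{2}\left(x_n x_m + p_n p_m\right) - \gamma\,\delta_{nm}\right],
\qquad \gamma = \tfrac{1}{2}.
```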
Koh, Hyunwook; Blaser, Martin J; Li, Huilin
2017-04-24
The role of the microbiota in human health and disease has been increasingly studied, gathering momentum through the use of high-throughput technologies. Further identification of the roles of specific microbes is necessary to better understand the mechanisms involved in diseases related to microbiome perturbations. Here, we introduce a new microbiome-based group association testing method, the optimal microbiome-based association test (OMiAT). OMiAT is a data-driven testing method which takes the optimal test among different tests from the sum of powered score tests (SPU) and the microbiome regression-based kernel association test (MiRKAT). We illustrate that OMiAT efficiently discovers significant association signals arising from varying microbial abundances and different relative contributions from microbial abundance and phylogenetic information. We also propose a way to apply it to fine-mapping of diverse upper-level taxa at different taxonomic ranks (e.g., phylum, class, order, family, and genus), as well as the entire microbial community, within a newly introduced microbial taxa discovery framework, microbiome comprehensive association mapping (MiCAM). Our extensive simulations demonstrate that OMiAT is highly robust and powerful compared with other existing methods, while correctly controlling type I error rates. Our real data analyses also confirm that MiCAM is especially efficient for the assessment of upper-level taxa by integrating OMiAT as a group analytic method. OMiAT is attractive in practice due to the high complexity of microbiome data and the unknown true nature of the association. MiCAM also provides a hierarchical association map for numerous microbial taxa and can be used as a guideline for further investigation of the roles of discovered taxa in human health and disease.
NASA Astrophysics Data System (ADS)
Carvalho, João; Inverno, Carlos; Matos, João Xavier; Rosa, Carlos; Granado, Isabel; Branch, Tim; Represas, Patrícia; Carabaneanu, Livia; Matias, Luís; Sousa, Pedro
2017-04-01
The Iberian Pyrite Belt (IPB) hosts world-class massive sulphide deposits, such as Neves-Corvo in Portugal and Rio Tinto in Spain. In Portugal, the Palaeozoic Volcanic-Sedimentary Complex (VSC) hosts these ore deposits, extending from the Grândola-Alcácer region to the Spanish border with a NW-SE to WNW-ESE trend. In the study area, between the Neves-Corvo mine region and Alcoutim (close to the Spanish border), the VSC outcrops only in a small horst near Alcoutim. Sparse exploration drill-hole data indicate that the depth to the top of the VSC varies from several 100 m to about 1 km beneath the Mértola Formation Flysch cover. Mapping of the VSC to the SE of Neves-Corvo mine is an important exploration goal and motivated the acquisition of six 2D seismic reflection profiles with a total length of approximately 82 km in order to map the hidden extension of the VSC. The data, providing information deeper than 10 km at some locations, were integrated in a 3D software environment along with potential-field, geological and drill-hole data to form a 3D structural framework model. Seismic data show strong reflections that represent several long Variscan thrust planes that smoothly dip to the NNE. Outcropping and previously unknown Late Variscan near-vertical faults were also mapped. Our data strongly suggest that the structural framework of Neves-Corvo extends south-eastwards to Alcoutim. Furthermore, the VSC top is located at depths that show the existence within the IPB of new areas with good potential to develop exploration projects envisaging the discovery of massive sulphide deposits of the Neves-Corvo type.
Orga, Ferran; Alías, Francesc; Alsina-Pagès, Rosa Ma
2017-12-23
Noise pollution is a critical factor affecting public health, the relationship between road traffic noise (RTN) and several diseases in urban areas being especially disturbing. The Environmental Noise Directive 2002/49/EC and the CNOSSOS-EU framework are the main instruments of the European Union to identify and combat noise pollution, requiring Member States to compose and publish noise maps and noise management action plans every five years. Nowadays, noise maps are starting to be tailored by means of Wireless Acoustic Sensor Networks (WASN). In order to exclusively monitor the impact of RTN on the well-being of citizens through WASN-based approaches, those noise sources unrelated to RTN, denoted as Anomalous Noise Events (ANEs), should be removed from the noise map generation. This paper introduces an analysis methodology considering both the Signal-to-Noise Ratio (SNR) and the duration of ANEs to evaluate their impact on the A-weighted equivalent RTN level calculation for different integration times. The experiments conducted on 9 h of real-life data from the WASN-based DYNAMAP project show that both individual high-impact events and aggregated medium-impact events significantly bias the equivalent noise levels of the RTN map, making any derived study about public health impact inaccurate.
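The A-weighted equivalent level the paper works with is an energy average of short-term levels over the integration time, and removing ANE-contaminated intervals before averaging is what keeps the RTN map unbiased. The sketch below applies the standard Leq formula to synthetic one-second levels; the ANE mask and the level values are illustrative, not DYNAMAP data.

```python
# Equivalent continuous level Leq from a series of short-term A-weighted levels,
# optionally excluding intervals flagged as anomalous noise events (ANEs).
import numpy as np

def leq(levels_dba, keep=None):
    """Energy-average levels (dBA); 'keep' is an optional boolean mask."""
    levels = np.asarray(levels_dba, dtype=float)
    if keep is not None:
        levels = levels[np.asarray(keep)]
    return 10.0 * np.log10(np.mean(10.0 ** (levels / 10.0)))

one_second_levels = np.random.normal(68.0, 3.0, 3600)   # synthetic hour of data
ane_mask = np.random.rand(3600) > 0.05                  # ~5% of seconds flagged as ANEs

print("Leq with ANEs   :", round(leq(one_second_levels), 1), "dBA")
print("Leq without ANEs:", round(leq(one_second_levels, keep=ane_mask), 1), "dBA")
```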
Semantic framework for mapping object-oriented model to semantic web languages
Ježek, Petr; Mouček, Roman
2015-01-01
The article discusses two main approaches to building semantic structures for electrophysiological metadata: the use of conventional data structures, repositories, and programming languages on the one hand, and the use of formal representations of ontologies known from knowledge representation, such as description logics or semantic web languages, on the other hand. Although knowledge engineering offers languages supporting richer semantic means of expression and technologically advanced approaches, conventional data structures and repositories are still popular among developers, administrators and users because of their simplicity, overall intelligibility, and lower demands on technical equipment. The choice of conventional data resources and repositories, however, raises the question of how and where to add semantics that cannot be naturally expressed using them. As one possible solution, this semantics can be added into the structures of the programming language that accesses and processes the underlying data. To support this idea we introduced a software prototype that enables its users to add semantically richer expressions into Java object-oriented code. This approach does not burden users with additional demands on the programming environment, since reflective Java annotations are used as an entry point for these expressions. Moreover, the additional semantics need not be written by the programmer directly in the code; it can be collected from non-programmers using a graphical user interface. The mapping that allows the transformation of the semantically enriched Java code into the Semantic Web language OWL was proposed and implemented in a library named the Semantic Framework. This approach was validated by the integration of the Semantic Framework into the EEG/ERP Portal and by the subsequent registration of the EEG/ERP Portal in the Neuroscience Information Framework. PMID:25762923
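The paper's Semantic Framework itself is Java-based and driven by reflective annotations; purely to illustrate the kind of object-model-to-OWL mapping involved, the sketch below attaches ontology metadata to a plain Python class and emits OWL triples with rdflib. The namespace, class, and property names are invented for the example and are not the authors' vocabulary.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS, XSD

EX = Namespace("http://example.org/eegerp#")  # hypothetical ontology namespace

# A plain object model with "annotations" attached as class-level metadata,
# loosely mimicking the role of reflective Java annotations in the paper.
class Experiment:
    __ontology__ = {"label": "EEG/ERP experiment",
                    "properties": {"title": XSD.string, "samplingRate": XSD.integer}}

def to_owl(cls):
    """Map an annotated class to OWL: the class becomes an owl:Class and each
    annotated field becomes an owl:DatatypeProperty with a declared range."""
    g = Graph()
    g.bind("owl", OWL)
    g.bind("ex", EX)
    subject = EX[cls.__name__]
    meta = cls.__ontology__
    g.add((subject, RDF.type, OWL.Class))
    g.add((subject, RDFS.label, Literal(meta["label"])))
    for name, rng in meta["properties"].items():
        prop = EX[name]
        g.add((prop, RDF.type, OWL.DatatypeProperty))
        g.add((prop, RDFS.domain, subject))
        g.add((prop, RDFS.range, rng))
    return g

print(to_owl(Experiment).serialize(format="turtle"))
```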
Williams, S.J.; Bliss, J.D.; Arsenault, M.A.; Jenkins, C.J.; Goff, J.A.
2007-01-01
Geologic maps depicting offshore sedimentary features serve many scientific and applied purposes. Such maps have been lacking, but recent computer technology and software offer promise in the capture and display of diverse marine data. Continental margins contain landforms which provide a variety of important functions and contain important sedimentary records. Some shelf areas also contain deposits regarded as potential aggregate resources. Because proper management of coastal and offshore areas is increasingly important, knowledge of the framework geology and marine processes is critical. Especially valuable are comprehensive and integrated digital databases based on high-quality information from original sources. Products of interest are GIS maps containing thematic information, such as sediment character and texture. These products are useful to scientists modeling nearshore and shelf processes as well as planners and managers. The U.S. Geological Survey is leading a national program to gather a variety of extant marine geologic data into the usSEABED database system. This provides centralized, integrated marine geologic data collected over the past 50 years. To date, over 340,000 sediment data points from the U.S. reside in usSEABED, which combines an array of physical data and analytical and descriptive information about the sea floor. These data are available to the marine community through three USGS data reports for the Atlantic, Gulf of Mexico, and Pacific published in 2006, and through the project web sites (http://woodshole.er.usgs.gov/project-pages/aggregates/ and http://walrus.wr.usgs.gov/usseabed/).
Cauzzi, Carlo; Fah, Donat; Wald, David J.; Clinton, John; Losey, Stephane; Wiemer, Stefan
2018-01-01
In Switzerland, nearly all historical Mw ~ 6 earthquakes have induced damaging landslides, rockslides and snow avalanches that, in some cases, also resulted in damage to infrastructure and loss of lives. We describe the customisation to Swiss conditions of a globally calibrated statistical approach originally developed to rapidly assess earthquake-induced landslide likelihoods worldwide. The probability of occurrence of such earthquake-induced effects is modelled through a set of geospatial susceptibility proxies and peak ground acceleration. The predictive model is tuned to capture the observations from past events and optimised for near-real-time estimates based on USGS-style ShakeMaps routinely produced by the Swiss Seismological Service. Our emphasis is on the use of high-resolution geospatial datasets along with additional local information on ground failure susceptibility. Although calibrated on historical events of moderate magnitude, the methodology presented in this paper also yields sensible results for recent low-magnitude events. The model is integrated into the Swiss ShakeMap framework. This study has a high practical relevance to many Swiss ShakeMap stakeholders, especially those managing lifeline systems, and to other global users interested in conducting a similar customisation for their region of interest.
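Approaches of this kind typically pass a linear combination of shaking intensity and geospatial susceptibility proxies through a logistic link to obtain a ground-failure probability per grid cell. The sketch below shows that generic form only; the coefficients and predictor choices are invented for illustration and are not the calibrated Swiss or USGS model.

```python
import numpy as np

def landslide_probability(pga_g, slope_deg, wetness_index, coeffs=(-2.0, 4.0, 0.08, 0.3)):
    """Illustrative logistic model for earthquake-induced ground-failure
    likelihood: a linear combination of shaking (PGA, in g) and geospatial
    susceptibility proxies passed through a logistic link.
    The coefficients are made-up demonstration values."""
    b0, b_pga, b_slope, b_wet = coeffs
    z = b0 + b_pga * np.log(pga_g) + b_slope * slope_deg + b_wet * wetness_index
    return 1.0 / (1.0 + np.exp(-z))

# Grid-cell example: PGA of 0.25 g on a 30-degree slope with moderate wetness.
print(round(landslide_probability(0.25, 30.0, 5.0), 4))
```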
Grid cell hexagonal patterns formed by fast self-organized learning within entorhinal cortex.
Mhatre, Himanshu; Gorchetchnikov, Anatoli; Grossberg, Stephen
2012-02-01
Grid cells in the dorsal segment of the medial entorhinal cortex (dMEC) show remarkable hexagonal activity patterns, at multiple spatial scales, during spatial navigation. It has previously been shown how a self-organizing map can convert firing patterns across entorhinal grid cells into hippocampal place cells that are capable of representing much larger spatial scales. Can grid cell firing fields also arise during navigation through learning within a self-organizing map? This article describes a simple and general mathematical property of the trigonometry of spatial navigation which favors hexagonal patterns. The article also develops a neural model that can learn to exploit this trigonometric relationship. This GRIDSmap self-organizing map model converts path integration signals into hexagonal grid cell patterns of multiple scales. GRIDSmap creates only grid cell firing patterns with the observed hexagonal structure, predicts how these hexagonal patterns can be learned from experience, and can process biologically plausible neural input and output signals during navigation. These results support an emerging unified computational framework based on a hierarchy of self-organizing maps for explaining how entorhinal-hippocampal interactions support spatial navigation. Copyright © 2010 Wiley Periodicals, Inc.
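For readers unfamiliar with the underlying machinery, a self-organizing map learns by repeatedly picking the best-matching unit for each input and pulling that unit and its grid neighbours toward the input. The minimal 1-D Kohonen sketch below shows only this generic competitive-learning step on toy data; it is not the GRIDSmap model and does not by itself produce grid-cell firing fields.

```python
import numpy as np

def train_som(data, n_units=10, epochs=20, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal 1-D Kohonen self-organizing map: winner selection plus a
    Gaussian neighbourhood update, with decaying learning rate and width."""
    rng = np.random.default_rng(seed)
    w = rng.uniform(data.min(), data.max(), size=(n_units, data.shape[1]))
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                  # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 0.5      # decaying neighbourhood width
        for x in rng.permutation(data):
            winner = np.argmin(np.linalg.norm(w - x, axis=1))   # best-matching unit
            dist = np.abs(np.arange(n_units) - winner)          # grid distance to winner
            h = np.exp(-(dist ** 2) / (2 * sigma ** 2))         # neighbourhood function
            w += lr * h[:, None] * (x - w)                      # pull units toward the input
    return w

# Toy input: noisy 2-D points on a circle, standing in for periodic spatial signals.
theta = np.random.default_rng(1).uniform(0, 2 * np.pi, 500)
data = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * np.random.default_rng(2).normal(size=(500, 2))
print(train_som(data).round(2))
```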
Fagerholm, Nora; Käyhkö, Niina; Van Eetvelde, Veerle
2013-09-01
In many developing countries, political documentation acknowledges the crucial elements of participation and spatiality for effective land use planning. However, operative approaches to spatial data inclusion and representation in participatory land management are often lacking. In this paper, we apply and develop an integrated landscape characterization approach to enhance spatial knowledge generation about the complex human-nature interactions in landscapes in the context of Zanzibar, Tanzania. We apply an integrated landscape conceptualization as a theoretical framework where expert and local knowledge can meet in a spatial context. The characterization is based on combining multiple data sources in GIS and involves local communities and their local spatial knowledge from the beginning of the process. Focusing on the expected information needs for community forest management, our characterization integrates physical landscape features and retrospective landscape change data with place-specific community knowledge collected through participatory GIS techniques. The characterization is established in a map form consisting of four themes and their synthesis. The characterization maps are designed to support intuitive interpretation, express the inherently uncertain nature of the data, and are accompanied by photographs to enhance communication. Visual interpretation of the characterization mediates information about the character of areas and places in the studied local landscape, depicting the role of forest resources as part of the landscape entity. We conclude that landscape characterization applied in GIS is a highly promising tool for participatory land and resource management, where spatial argumentation, stakeholder communication, and empowerment are critical issues.
NASA Astrophysics Data System (ADS)
Siddoway, C. S.; White, T.; Elkind, S.; Cox, S. C.; Lyttle, B. S.; Morin, P. J.
2016-12-01
Bedrock exposures are relatively sparse in Marie Byrd Land (MBL), where rock is concealed by the West Antarctic ice sheet, but they provide direct insight into the geological evolution and glacial history of West Antarctica. MBL is tectonically active, as evidenced by Late Pleistocene to Holocene volcanism and 2012 seismicity (3 events, M4.4 to M5.5) at sites beside the Ross Sea. There are geological influences upon the ice sheet, namely subglacial volcanism and associated geothermal flux, fault zone alteration/mineralization, and bedrock roughness. The former may influence the position and velocity of outlet glaciers and the latter may anchor or accelerate sectors of the ice sheet. To make MBL's geological framework accessible to investigators with diverse research priorities, we are preparing the first digital geological map of MBL by compiling ground-based geological data, incorporating firsthand observations, published geological maps and literature. The map covers an on-continent coastal area of 900 000 km2 between 090°W and 160°W, from 72°S to 80°S, at 1:250 000 scale or better. Exposed rock is delimited by 1976 polygons, occupying 410 km2. Supraglacial features (glacial till, seasonal water and blue ice) are also mapped, as a baseline for past and future glaciological change. Rendered in Esri's ArcMap GIS software, the database employs international GeoSciML data protocols for feature classification and description of rock and moraine polygons from the Antarctic Digital Database (www.add.scar.org), with shape and location adjusted to align with features in Landsat Image Mosaic of Antarctica imagery (lima.usgs.gov), where necessary. The GIS database is attribute-rich and queryable, including links to bibliographic source files for primary literature and published maps. It will soon be available as Google Earth KMZ files and an ArcGIS Online map service. An initial application is to the interpretation of sub-ice geology for a subglacial geotectonic map of this active region. This is undertaken as part of ROSETTA-Ice, an integrated systems science investigation of the Ross Ice Shelf that commenced in 2015. The next phases of MBL database development will assess ice sheet-ocean interactions near the grounding line, environmental domain analysis and ecological research.
Laser-Based Slam with Efficient Occupancy Likelihood Map Learning for Dynamic Indoor Scenes
NASA Astrophysics Data System (ADS)
Li, Li; Yao, Jian; Xie, Renping; Tu, Jinge; Feng, Chen
2016-06-01
Location-Based Services (LBS) have attracted growing attention in recent years, especially in indoor environments. The fundamental technique of LBS is map building for unknown environments, also known as simultaneous localization and mapping (SLAM) in the robotics community. In this paper, we propose a novel approach for SLAM in dynamic indoor scenes based on a 2D laser scanner mounted on a mobile Unmanned Ground Vehicle (UGV), with the help of a grid-based occupancy likelihood map. Instead of applying scan matching between two adjacent scans, we propose to match the current scan with the occupancy likelihood map learned from all previous scans at multiple scales, to avoid the accumulation of matching errors. Because the points in a scan are acquired sequentially rather than simultaneously, scan distortion is unavoidable to varying extents. To compensate for the scan distortion caused by the motion of the UGV, we propose to integrate the velocity of the laser range finder (LRF) into the scan matching optimization framework. In addition, to reduce as much as possible the effect of dynamic objects such as walking pedestrians, which are common in indoor scenes, we propose a new occupancy likelihood map learning strategy that increases or decreases the probability of each occupancy grid cell after each scan matching. Experimental results in several challenging indoor scenes demonstrate that our proposed approach is capable of providing high-precision SLAM results.
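The per-cell probability bookkeeping that such a map relies on is commonly done in log-odds form, where each scan raises the value of cells containing laser hits and lowers it along the traversed free space. The sketch below shows only that generic update and the conversion back to probabilities; the increment and clamping constants are illustrative, not the paper's learning strategy.

```python
import numpy as np

L_OCC, L_FREE = 0.85, -0.4   # log-odds increments (illustrative values)
L_MIN, L_MAX = -4.0, 4.0     # clamping keeps the map responsive to change

def update_occupancy(logodds, hit_cells, free_cells):
    """Increase the log-odds of cells where laser returns end (hits) and
    decrease it along the beams (free space), after each scan match.
    `hit_cells` / `free_cells` are (row, col) index arrays."""
    logodds[hit_cells] = np.clip(logodds[hit_cells] + L_OCC, L_MIN, L_MAX)
    logodds[free_cells] = np.clip(logodds[free_cells] + L_FREE, L_MIN, L_MAX)
    return logodds

def occupancy_probability(logodds):
    """Convert log-odds back to occupancy probability in [0, 1]."""
    return 1.0 - 1.0 / (1.0 + np.exp(logodds))

grid = np.zeros((5, 5))
grid = update_occupancy(grid,
                        hit_cells=(np.array([2]), np.array([4])),
                        free_cells=(np.array([2, 2, 2, 2]), np.array([0, 1, 2, 3])))
print(occupancy_probability(grid).round(2))
```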
2014-01-01
The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called “big data” challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphics processing units (GPUs)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields. PMID:25383096
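The two primitives the framework is named after are easy to show in miniature: a map task that emits key-value pairs from each input record, and a reduce task that aggregates all values sharing a key. The pure-Python sketch below counts hypothetical diagnosis codes across patient records to show only this functional pattern; a real deployment would run such functions under Hadoop (for example via Hadoop Streaming) rather than in a single process.

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    """Map task: emit (key, 1) pairs, here one pair per diagnosis code in a
    (patient_id, [codes]) record. The field contents are illustrative only."""
    _, codes = record
    return [(code, 1) for code in codes]

def reduce_phase(pairs):
    """Reduce task: sum the counts for each key, as a reducer would do for
    the pairs routed to it after the shuffle-by-key step."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

records = [("p1", ["E11.9", "I10"]), ("p2", ["I10"]), ("p3", ["E11.9", "E11.9"])]
print(reduce_phase(chain.from_iterable(map(map_phase, records))))
```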
NASA Astrophysics Data System (ADS)
Hagemeier-Klose, M.; Wagner, K.
2009-04-01
Flood risk communication with the general public and the population at risk is becoming increasingly important for flood risk management, especially as a precautionary measure. This is also underlined by the EU Flood Directive. The flood-related authorities therefore have to develop adjusted information tools which meet the demands of different user groups. This article presents the formative evaluation of flood hazard maps and web mapping services according to the specific requirements and needs of the general public, using the dynamic-transactional approach as a theoretical framework. The evaluation was done by a mixture of different methods: an analysis of existing tools, a creative workshop with experts and laymen, and an online survey. Currently existing flood hazard maps, web mapping services, and web GIS still lack a good balance between simplicity and complexity, with adequate readability and usability for the public. Well-designed and associative maps (e.g. using blue colours for water depths), which can be compared with past local flood events and which can create empathy in viewers, can help to raise awareness, heighten the activity and knowledge level, or lead to further information seeking. Concerning web mapping services, a linkage between general flood information, like flood extents of different scenarios and corresponding water depths, and real-time information, like gauge levels, is an important demand by users. Gauge levels of these scenarios are easier to understand than the scientifically correct return periods or annualities. The recently developed Bavarian web mapping service tries to integrate these requirements.
Simplex-stochastic collocation method with improved scalability
NASA Astrophysics Data System (ADS)
Edeling, W. N.; Dwight, R. P.; Cinnella, P.
2016-04-01
The Simplex-Stochastic Collocation (SSC) method is a robust tool used to propagate uncertain input distributions through a computer code. However, it becomes prohibitively expensive for problems with dimensions higher than 5. The main purpose of this paper is to identify bottlenecks, and to improve upon this poor scalability. In order to do so, we propose an alternative interpolation stencil technique based upon the Set-Covering problem, and we integrate the SSC method in the High-Dimensional Model-Reduction framework. In addition, we address the issue of ill-conditioned sample matrices, and we present an analytical map to facilitate uniformly-distributed simplex sampling.
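To make the combinatorial problem behind the stencil selection concrete, the sketch below implements the classic greedy approximation to the Set-Covering problem: repeatedly pick the subset covering the most still-uncovered elements. It is shown only to illustrate the problem the paper maps stencil selection onto; it is not the authors' stencil algorithm itself, and the toy universe and subsets are invented.

```python
def greedy_set_cover(universe, subsets):
    """Greedy set cover: at each step choose the subset that covers the
    largest number of still-uncovered elements of the universe."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = max(subsets, key=lambda s: len(uncovered & subsets[s]))
        if not uncovered & subsets[best]:
            raise ValueError("universe cannot be covered by the given subsets")
        chosen.append(best)
        uncovered -= subsets[best]
    return chosen

# Toy example: points 0-5 as the universe, candidate "stencils" as subsets.
subsets = {"A": {0, 1, 2}, "B": {2, 3}, "C": {3, 4, 5}, "D": {0, 5}}
print(greedy_set_cover(range(6), subsets))
```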
On Developing a Taxonomy for Multidisciplinary Design Optimization: A Decision-Based Perspective
NASA Technical Reports Server (NTRS)
Lewis, Kemper; Mistree, Farrokh
1995-01-01
In this paper, we approach MDO from a Decision-Based Design (DBD) perspective and explore classification schemes for designing complex systems and processes. Specifically, we focus on decisions, which are only a small portion of the Decision Support Problem (DSP) Technique, our implementation of DBD. We map coupled nonhierarchical and hierarchical representations from the DSP Technique into the Balling-Sobieski (B-S) framework (Balling and Sobieszczanski-Sobieski, 1994), and integrate domain-independent linguistic terms to complete our taxonomy. Applications of DSPs to the design of complex, multidisciplinary systems include passenger aircraft, ships, damage tolerant structural and mechanical systems, and thermal energy systems. In this paper we show that the Balling-Sobieski framework is consistent with the Decision Support Problem Technique through the use of linguistic entities to describe the same types of formulations. We show that the underlying linguistics of the solution approaches are the same and can be coalesced into a homogeneous framework upon which to base MDO research, application, and technology. We introduce, in the Balling-Sobieski framework, examples of multidisciplinary design, namely, aircraft, damage tolerant structural and mechanical systems, and thermal energy systems.
NASA Astrophysics Data System (ADS)
Sagarminaga, Y.; Galparsoro, I.; Reig, R.; Sánchez, J. A.
2012-04-01
Since 2000, an intense effort has been conducted in AZTI's Marine Research Division to set up a data management system that could gather all the marine datasets being produced by different in-house research projects. For that, a corporate GIS was designed that included a data and metadata repository, a database, a layer catalog and search application, and an internet map viewer. Several layers, mostly dealing with physical, chemical and biological in-situ sampling, and basic and thematic cartography including bathymetry, geomorphology, different species habitat maps, and human pressure and activities maps, were successfully gathered in this system. Very soon, it was realised that new marine technologies yielding continuous multidimensional data, sometimes called FES (Fluid Earth System) data, were difficult to handle in this structure. The data affected mainly included numerical oceanographic and meteorological models, remote sensing data, coastal RADAR data, and some in-situ observational systems such as CTD casts and moored or Lagrangian buoys. A management system for gridded multidimensional data was developed using standardized formats (netCDF following the CF conventions) and tools such as the THREDDS catalog (UNIDATA/UCAR), providing web services such as OPeNDAP, NCSS, and WCS, as well as the ncWMS service developed by the Reading e-Science Centre. At present, a system (ITSASGIS-5D) is being developed, based on OGC standards and open-source tools, to allow interoperability between all the data types mentioned before. This system includes, on the server side, PostgreSQL/PostGIS databases and GeoServer for GIS layers, and THREDDS/OPeNDAP and ncWMS services for FES gridded data. Moreover, an online client is being developed to allow joint access, user configuration, data visualisation and query, and data distribution. This client uses the MapFish, ExtJS/GeoExt, and OpenLayers libraries. Through this presentation the elements of the first released version of this system will be described and shown, together with the new topics to be developed in future versions, which include, among others, the integration of GeoNetwork libraries and tools for both FES and GIS metadata management, and the use of the new OGC Sensor Observation Service (SOS) to integrate non-gridded multidimensional data such as time series, depth profiles or trajectories provided by different observational systems. The final aim of this approach is to contribute to the multidisciplinary access and use of marine data for management and research activities, and to facilitate the implementation of integrated ecosystem-based approaches in the fields of fisheries advice and management, marine spatial planning, and the implementation of European policies such as the Water Framework Directive, the Marine Strategy Framework Directive or the Habitats Directive.
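Gridded FES data of this kind are usually exchanged as CF-convention netCDF, which is what makes THREDDS services such as OPeNDAP, NCSS, and ncWMS possible on top of them. The sketch below builds and round-trips a tiny CF-style dataset with xarray; the variable names, coordinates, and attributes are invented for the example and are not AZTI's actual conventions.

```python
import numpy as np
import pandas as pd
import xarray as xr

# A minimal CF-style netCDF dataset of the kind served through THREDDS/OPeNDAP.
temp = 15 + 2 * np.random.default_rng(0).standard_normal((3, 4, 5))
ds = xr.Dataset(
    {"sea_water_temperature": (("time", "lat", "lon"), temp,
                               {"units": "degree_Celsius",
                                "standard_name": "sea_water_temperature"})},
    coords={"time": pd.date_range("2012-04-01", periods=3),
            "lat": np.linspace(43.2, 43.8, 4),
            "lon": np.linspace(-3.0, -1.5, 5)},
    attrs={"Conventions": "CF-1.8"},
)
ds.to_netcdf("ocean_demo.nc")                      # write a self-describing file
reopened = xr.open_dataset("ocean_demo.nc")        # read it back, as a client would
print(reopened["sea_water_temperature"].mean().values)
```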
NASA Astrophysics Data System (ADS)
Creed, I. F.; Aldred, D.; Spargo, A.; Bayley, S.
2012-12-01
Wetlands are being lost at an alarming rate in the prairie pothole landscape of North America. The full consequences of this loss are not well understood or recognized due to (1) inadequate or incomplete wetland inventories (with mapping emphasizing permanent and not ephemeral wetlands, and only capturing "easy to observe" wetland area defined by open water and not the true dynamic wetland extent defined by saturated soils), and (2) a lack of appropriate theoretical frameworks to assess the functions and benefits of these wetlands. We present a theoretical framework that integrates indicators to estimate functions and benefits of wetland integrity in central Alberta. We establish indicators using the principles that they be representative of the dominant processes operating on the landscape, simple, and scalable. While some of these indicators may be widely recognized, their implementation is often not comprehensive or complete. First, we develop an automated method for fine-scale mapping of permanent and ephemeral wetlands from a fusion of high-resolution elevation data and aerial photography. Second, we estimate historic wetland loss over the past 50 years, during which intensive domestication of the landscape occurred, by modeling the distribution of wetlands in an undisturbed landscape using area-frequency power functions and calculating the difference from the actual wetland inventory. Third, we define relative wetland assessment units using cluster analysis of hydrological and ecological variables, including climate, geology, topography, soils and land use/land covers. Fourth, for each assessment unit we define indicators of functions and benefits of aquatic ecosystem services including water storage (surface and subsurface), phosphorus retention, nitrate removal, sediment retention, ecological health/biodiversity and human use, and then use practical strategies rooted in the fusion of digital terrain analysis and remote sensing techniques to measure and monitor these indicators over past years. For a time series of wetland loss we derive these indicators of functions and benefits to estimate changes in the provision of specific aquatic ecosystem services on the landscape. Last, we develop formulae for integrating these indicators to determine whether a specific wetland or wetland complex should be prioritized for conservation, exemplifying potential trade-offs among ecosystem services in setting conservation targets on these wetland-dominated landscapes. The proposed theoretical framework evolved from close collaboration between scientists and resource managers, and will inform those engaged in developing wetland policies for a broad range of jurisdictions.
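The historic-loss step relies on the empirical observation that undisturbed wetland area-frequency distributions follow a power function, which can then be compared with the current inventory. The sketch below fits such a power function by log-log regression on synthetic areas; the binning, data, and fitting choices are illustrative, not the authors' calibrated procedure.

```python
import numpy as np

def fit_area_frequency_power_law(areas_ha, n_bins=12):
    """Fit a power function N(A) = c * A**(-b) to a wetland area-frequency
    distribution by linear regression in log-log space.
    Returns the exponent b and the prefactor c."""
    areas = np.asarray(areas_ha, dtype=float)
    bins = np.logspace(np.log10(areas.min()), np.log10(areas.max()), n_bins)
    counts, edges = np.histogram(areas, bins=bins)
    centers = np.sqrt(edges[:-1] * edges[1:])            # geometric bin centres
    keep = counts > 0
    slope, intercept = np.polyfit(np.log10(centers[keep]), np.log10(counts[keep]), 1)
    return -slope, 10 ** intercept

# Synthetic wetland areas (hectares) drawn from a heavy-tailed distribution.
rng = np.random.default_rng(42)
areas = np.round((1.0 / rng.uniform(0.01, 1.0, 2000)) ** 1.2, 3)
b, c = fit_area_frequency_power_law(areas)
print(f"estimated exponent b = {b:.2f}")
```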
NADM Conceptual Model 1.0 -- A Conceptual Model for Geologic Map Information
2004-01-01
Executive Summary -- The NADM Data Model Design Team was established in 1999 by the North American Geologic Map Data Model Steering Committee (NADMSC) with the purpose of drafting a geologic map data model for consideration as a standard for developing interoperable geologic map-centered databases by state, provincial, and federal geological surveys. The model is designed to be a technology-neutral conceptual model that can form the basis for a web-based interchange format using evolving information technology (e.g., XML, RDF, OWL), and guide implementation of geoscience databases in a common conceptual framework. The intended purpose is to allow geologic information sharing between geologic map data providers and users, independent of local information system implementation. The model emphasizes geoscience concepts and relationships related to information presented on geologic maps. Design has been guided by an informal requirements analysis, documentation of existing databases, technology developments, and other standardization efforts in the geoscience and computer-science communities. A key aspect of the model is the notion that representation of the conceptual framework (ontology) that underlies geologic map data must be part of the model, because this framework changes with time and understanding, and varies between information providers. The top level of the model distinguishes geologic concepts, geologic representation concepts, and metadata. The geologic representation part of the model provides a framework for representing the ontology that underlies geologic map data through a controlled vocabulary, and for establishing the relationships between this vocabulary and a geologic map visualization or portrayal. Top-level geologic classes in the model are Earth material (substance), geologic unit (parts of the Earth), geologic age, geologic structure, fossil, geologic process, geologic relation, and geologic event.
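To make the flavour of the top-level classes concrete, the sketch below renders a few of them (Earth material, geologic unit, geologic age) as plain data structures. The attribute names and example values are illustrative assumptions for demonstration, not the normative NADM definitions, which are expressed as a technology-neutral conceptual model rather than code.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class GeologicAge:
    name: str                               # e.g. a controlled-vocabulary term
    older_bound_ma: Optional[float] = None
    younger_bound_ma: Optional[float] = None

@dataclass
class EarthMaterial:
    name: str
    lithology_term: str                     # term from a controlled vocabulary

@dataclass
class GeologicUnit:
    name: str
    age: GeologicAge
    materials: List[EarthMaterial] = field(default_factory=list)

# Illustrative instance only; values are not taken from any NADM dataset.
unit = GeologicUnit(
    name="Example volcanic unit",
    age=GeologicAge("Pliocene", older_bound_ma=5.3, younger_bound_ma=2.6),
    materials=[EarthMaterial("andesite", "volcanic rock")],
)
print(unit)
```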
Visual navigation in insects: coupling of egocentric and geocentric information
Wehner; Michel; Antonsen
1996-01-01
Social hymenopterans such as bees and ants are central-place foragers; they regularly depart from and return to fixed positions in their environment. In returning to the starting point of their foraging excursion or to any other point, they could resort to two fundamentally different ways of navigation by using either egocentric or geocentric systems of reference. In the first case, they would rely on information continuously collected en route (path integration, dead reckoning), i.e. integrate all angles steered and all distances covered into a mean home vector. In the second case, they are expected, at least by some authors, to use a map-based system of navigation, i.e. to obtain positional information by virtue of the spatial position they occupy within a larger environmental framework. In bees and ants, path integration employing a skylight compass is the predominant mechanism of navigation, but geocentred landmark-based information is used as well. This information is obtained while the animal is dead-reckoning and, hence, added to the vector course. For example, the image of the horizon skyline surrounding the nest entrance is retinotopically stored while the animal approaches the goal along its vector course. As shown in desert ants (genus Cataglyphis), there is neither interocular nor intraocular transfer of landmark information. Furthermore, this retinotopically fixed, and hence egocentred, neural snapshot is linked to an external (geocentred) system of reference. In this way, geocentred information might more and more complement and potentially even supersede the egocentred information provided by the path-integration system. In competition experiments, however, Cataglyphis never frees itself of its homeward-bound vector - its safety-line, so to speak - by which it is always linked to home. Vector information can also be transferred to a longer-lasting (higher-order) memory. There is no need to invoke the concept of the mental analogue of a topographic map - a metric map - assembled by the insect navigator. The flexible use of vectors, snapshots and landmark-based routes suffices to interpret the insect's behaviour. The cognitive-map approach in particular, and the representational paradigm in general, are discussed.
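The egocentric strategy described here is easy to state algorithmically: every leg of the outbound path is accumulated as a displacement vector, and the negated sum is the home vector. The sketch below shows only that vector bookkeeping for a hypothetical three-leg excursion; it says nothing about how the skylight compass or landmark snapshots are implemented neurally.

```python
import numpy as np

def home_vector(headings_deg, distances):
    """Egocentric path integration (dead reckoning): accumulate every leg of
    the outbound path as a displacement vector; the negated sum points home.
    Returns (bearing back to the nest in degrees, distance to the nest)."""
    headings = np.radians(np.asarray(headings_deg, dtype=float))
    steps = np.asarray(distances, dtype=float)
    position = np.array([np.sum(steps * np.cos(headings)),
                         np.sum(steps * np.sin(headings))])
    home = -position
    return np.degrees(np.arctan2(home[1], home[0])) % 360.0, np.hypot(*home)

# A hypothetical three-leg foraging excursion: 10 m east, 5 m north, 4 m west.
print(home_vector([0, 90, 180], [10, 5, 4]))
```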
NASA Astrophysics Data System (ADS)
Jokar Arsanjani, Jamal; Vaz, Eric
2015-03-01
Until recently, land surveys and digital interpretation of remotely sensed imagery have been used to generate land use inventories. These techniques, however, are often cumbersome and costly, incurring considerable technical and temporal costs. The advances of Web 2.0 have brought a wide array of technological achievements, stimulating the participatory role in collaborative and crowdsourced mapping products. This has been fostered by GPS-enabled devices and accessible tools that enable visual interpretation of high-resolution satellite images and air photos provided in collaborative mapping projects. Such technologies offer an integrative approach to geography by means of promoting public participation and allowing accurate assessment and classification of land use as well as geographical features. OpenStreetMap (OSM) has supported the evolution of such techniques, contributing to the existence of a large inventory of spatial land use information. This paper explores the introduction of this novel participatory phenomenon for land use classification in Europe's metropolitan regions. We adopt a positivistic approach to comparatively assess the accuracy of these OSM contributions for land use classifications in seven large European metropolitan regions. Thematic accuracy and degree of completeness of OSM data were compared to available Global Monitoring for Environment and Security Urban Atlas (GMESUA) datasets for the chosen metropolises. We further situate our findings on land use within a novel framework for geography, showing that volunteered geographic information (VGI) sources are of great benefit for land use mapping, depending on location and degree of VGI dynamism, and offer a strong alternative to traditional mapping techniques for metropolitan regions throughout Europe. Evaluation of several land use types at the local level suggests that a number of OSM classes (such as anthropogenic land use, agricultural and some natural environment classes) are viable alternatives for land use classification. These classes are highly accurate and can be integrated into planning decisions for stakeholders and policymakers.
ERIC Educational Resources Information Center
Perera, Srinath; Babatunde, Solomon Olusola; Zhou, Lei; Pearson, John; Ekundayo, Damilola
2017-01-01
Recognition of the huge variation between professional graduate degree programmes and employer requirements, especially in the construction industry, necessitated a need for assessing and developing competencies that aligned with professionally oriented programmes. The purpose of this research is to develop a competency mapping framework (CMF) in…
A Hierarchical Framework for State-Space Matrix Inference and Clustering.
Zuo, Chandler; Chen, Kailei; Hewitt, Kyle J; Bresnick, Emery H; Keleş, Sündüz
2016-09-01
In recent years, a large number of genomic and epigenomic studies have been focusing on the integrative analysis of multiple experimental datasets measured over a large number of observational units. The objectives of such studies include not only inferring a hidden state of activity for each unit over individual experiments, but also detecting highly associated clusters of units based on their inferred states. Although there are a number of methods tailored for specific datasets, there is currently no state-of-the-art modeling framework for this general class of problems. In this paper, we develop the MBASIC (Matrix Based Analysis for State-space Inference and Clustering) framework. MBASIC consists of two parts: state-space mapping and state-space clustering. In state-space mapping, it maps observations onto a finite state-space, representing the activation states of units across conditions. In state-space clustering, MBASIC incorporates a finite mixture model to cluster the units based on their inferred state-space profiles across all conditions. Both the state-space mapping and clustering can be simultaneously estimated through an Expectation-Maximization algorithm. MBASIC flexibly adapts to a large number of parametric distributions for the observed data, as well as the heterogeneity in replicate experiments. It allows for imposing structural assumptions on each cluster, and enables model selection using information criteria. In our data-driven simulation studies, MBASIC showed significant accuracy in recovering both the underlying state-space variables and clustering structures. We applied MBASIC to two genome research problems using large numbers of datasets from the ENCODE project. The first application grouped genes based on transcription factor occupancy profiles of their promoter regions in two different cell types. The second application focused on identifying groups of loci that are similar to a GATA2 binding site that is functional at its endogenous locus by utilizing transcription factor occupancy data, and illustrated the applicability of MBASIC in a wide variety of problems. In both studies, MBASIC showed higher levels of raw data fidelity than analyzing these data with a two-step approach using ENCODE results on transcription factor occupancy data.
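Both stages are estimated with an Expectation-Maximization algorithm; as a reminder of what that loop looks like in its simplest form, the sketch below fits a plain 1-D Gaussian mixture by alternating a responsibility (E) step and a parameter-update (M) step. It is only the generic EM machinery, not the MBASIC model with its state-space mapping and structured clusters.

```python
import numpy as np

def em_gaussian_mixture(x, k=2, n_iter=100, seed=0):
    """Plain EM for a 1-D Gaussian mixture: E-step computes component
    responsibilities, M-step re-estimates weights, means and variances."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    mu = rng.choice(x, size=k, replace=False)
    sigma = np.full(k, x.std())
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, means and variances from the responsibilities.
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma

data = np.concatenate([np.random.default_rng(1).normal(0, 1, 300),
                       np.random.default_rng(2).normal(5, 1, 200)])
print(em_gaussian_mixture(data))
```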
NASA Astrophysics Data System (ADS)
Llamas, R. M.; Colditz, R. R.; Ressl, R.; Jurado Cruz, D. A.; Argumedo, J.; Victoria, A.; Meneses, C.
2017-12-01
The North American Land Change Monitoring System (NALCMS) is a tri-national initiative for mapping land cover across Mexico, the United States and Canada, integrating efforts of institutions from the three countries. At the continental scale the group released land cover and change maps derived from MODIS image mosaics at 250 m spatial resolution for 2005 and 2010. Current efforts are based on 30 m Landsat images for 2010 ± 1 year. Each country uses its own mapping approach and sources for ancillary data, while ensuring that maps are produced in a coherent fashion across the continent. This paper presents the methodology and final land cover map of Mexico for the year 2010 that was later integrated into a continental map. The principal input for Mexico was the Monitoring Activity Data for Mexico (MAD-MEX) land cover map (version 4.3), derived from all available mostly cloud-free images for the year 2010. A total of 35 classes were regrouped to the 15 classes of the NALCMS legend present in Mexico. Next, various issues of the automatically generated MAD-MEX land cover mosaic were corrected, such as: filling areas of no data due to no cloud-free observation or gaps in Landsat 7 ETM+ images, filling inland water bodies which were left unclassified due to masking issues, relabeling isolated unclassified or falsely classified pixels, correcting structural mislabeling due to data gaps, reclassifying areas of adjacent scenes with significant class disagreements, and correcting obvious misclassifications, mostly of water and urban areas. In a second step, minor missing areas and the rare snow and ice class were digitized and a road network was added. A product such as the NALCMS land cover map at 30 m for North America is an unprecedented effort and will without doubt be an important source of information for many users around the world who need coherent land cover data over a continental domain as an input for a wide variety of environmental studies. The product release to the general public is expected by late summer of 2017 and will be made available through the Commission for Environmental Cooperation (CEC) at www.cec.org
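Several of the corrections listed above are neighbourhood operations on the class raster. As a simplified stand-in, the sketch below relabels isolated no-data pixels with the most frequent class among their eight neighbours; the class codes and the decision to leave larger gaps untouched are illustrative, not the NALCMS production rules.

```python
import numpy as np

NODATA = 0  # hypothetical code for unclassified / no-data pixels

def fill_isolated_pixels(classes):
    """Relabel isolated no-data pixels with the most frequent class among
    their neighbours, a simplified stand-in for the post-processing steps
    (gap filling, relabelling isolated pixels) described in the abstract."""
    out = classes.copy()
    for r, c in zip(*np.where(classes == NODATA)):
        window = classes[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
        neighbours = window[window != NODATA]
        if neighbours.size:                       # leave fully empty gaps untouched
            values, counts = np.unique(neighbours, return_counts=True)
            out[r, c] = values[np.argmax(counts)]
    return out

demo = np.array([[3, 3, 3, 7],
                 [3, 0, 7, 7],
                 [3, 3, 7, 7]])
print(fill_isolated_pixels(demo))
```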
Geologic and Geophysical Framework of the Santa Rosa 7.5' Quadrangle, Sonoma County, California
McLaughlin, R.J.; Langenheim, V.E.; Sarna-Wojcicki, A. M.; Fleck, R.J.; McPhee, D.K.; Roberts, C.W.; McCabe, C.A.; Wan, Elmira
2008-01-01
The geologic and geophysical maps of the Santa Rosa 7.5' quadrangle and accompanying structure sections portray the sedimentary and volcanic stratigraphy and crustal structure of the Santa Rosa 7.5' quadrangle and provide a context for interpreting the evolution of volcanism and active faulting in this region. The quadrangle is located in the California Coast Ranges north of San Francisco Bay and is traversed by the active Rodgers Creek, Healdsburg and Maacama Fault Zones. The geologic and geophysical data presented in this report are substantial improvements over previous geologic and geophysical maps of the Santa Rosa area, allowing us to address important geologic issues. First, the geologic mapping is integrated with gravity and magnetic data, allowing us to depict the thicknesses of Cenozoic deposits, the depth and configuration of the Mesozoic basement surface, and the geometry of fault structures beneath this region to depths of several kilometers. This information has important implications for constraining the geometries of major active faults and for understanding and predicting the distribution and intensity of damage from ground shaking during earthquakes. Secondly, the geologic map and the accompanying description of the area describe in detail the distribution, geometry and complexity of faulting associated with the Rodgers Creek, Healdsburg and Bennett Valley Fault Zones and associated faults in the Santa Rosa quadrangle. The timing of fault movements is constrained by new 40Ar/39Ar ages and tephrochronologic correlations. These new data provide a better understanding of the stratigraphy of the extensive sedimentary and volcanic cover in the area and, in particular, clarify the formational affinities of Pliocene and Pleistocene nonmarine sedimentary units in the map area. Thirdly, the geophysics, particularly gravity data, indicate the locations of thick sections of sedimentary and volcanic fill within ground water basins of the Santa Rosa plain and Rincon, Bennett, and northwestern Sonoma Valleys, providing geohydrologists a more realistic framework for groundwater flow models.
NASA Astrophysics Data System (ADS)
Albeke, S. E.; Perkins, D. G.; Ewers, S. L.; Ewers, B. E.; Holbrook, W. S.; Miller, S. N.
2015-12-01
The sharing of data and results is paramount for advancing scientific research. The Wyoming Center for Environmental Hydrology and Geophysics (WyCEHG) is a multidisciplinary group that is driving scientific breakthroughs to help manage water resources in the Western United States. WyCEHG is mandated by the National Science Foundation (NSF) to share their data. However, the infrastructure from which to share such diverse, complex and massive amounts of data did not exist within the University of Wyoming. We developed an innovative framework to meet the data organization, sharing, and discovery requirements of WyCEHG by integrating both open and closed source software, embedded metadata tags, semantic web technologies, and a web-mapping application. The infrastructure uses a Relational Database Management System as the foundation, providing a versatile platform to store, organize, and query myriad datasets, taking advantage of both structured and unstructured formats. Detailed metadata are fundamental to the utility of datasets. We tag data with Uniform Resource Identifiers (URI's) to specify concepts with formal descriptions (i.e. semantic ontologies), thus allowing users the ability to search metadata based on the intended context rather than conventional keyword searches. Additionally, WyCEHG data are geographically referenced. Using the ArcGIS API for Javascript, we developed a web mapping application leveraging database-linked spatial data services, providing a means to visualize and spatially query available data in an intuitive map environment. Using server-side scripting (PHP), the mapping application, in conjunction with semantic search modules, dynamically communicates with the database and file system, providing access to available datasets. Our approach provides a flexible, comprehensive infrastructure from which to store and serve WyCEHG's highly diverse research-based data. This framework has not only allowed WyCEHG to meet its data stewardship requirements, but can provide a template for others to follow.
Netzel, Pawel
2017-01-01
The United States is increasingly becoming a multi-racial society. To understand the multiple consequences of this overall trend for our neighborhoods, we need a methodology capable of spatio-temporal analysis of racial diversity at the local level but also across the entire U.S. Furthermore, such a methodology should be accessible to stakeholders ranging from analysts to decision makers. In this paper we present a comprehensive framework for visualizing and analyzing diversity data that fulfills such requirements. The first component of our framework is a U.S.-wide, multi-year database of race sub-population grids which is freely available for download. These 30 m resolution grids have been developed using dasymetric modeling and are available for 1990, 2000, and 2010. We summarize numerous advantages of gridded population data over commonly used Census tract-aggregated data. Using these grids frees analysts from constructing their own and allows them to focus on diversity analysis. The second component of our framework is a set of U.S.-wide, multi-year diversity maps at 30 m resolution. A diversity map is our product that classifies the gridded population into 39 communities based on their degrees of diversity, dominant race, and population density. It provides spatial information on diversity in a single, easy-to-understand map that can be utilized by analysts and end users alike. Maps based on subsequent Censuses provide information about the spatio-temporal dynamics of diversity. Diversity maps are accessible through the GeoWeb application SocScape (http://sil.uc.edu/webapps/socscape_usa/) for immediate online exploration. The third component of our framework is a proposal to quantitatively analyze diversity maps using a set of landscape metrics. Because of its form, a grid-based diversity map could be thought of as a diversity “landscape” and analyzed quantitatively using landscape metrics. We give a brief summary of the most pertinent metrics and demonstrate how they can be applied to diversity maps. PMID:28358862
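Treating a diversity map as a "landscape" makes per-cell diversity indices the natural starting point before landscape metrics are applied. The sketch below computes the Shannon diversity index per grid cell from a stack of sub-population grids; the toy counts are invented, and the thresholds SocScape uses to define its 39 communities are not reproduced here.

```python
import numpy as np

def shannon_diversity(pop_grids):
    """Per-cell Shannon diversity index H = -sum(p_i * ln p_i) computed from a
    stack of sub-population grids (one 2-D grid of counts per group).
    Cells with zero total population are returned as NaN."""
    stack = np.asarray(pop_grids, dtype=float)           # shape: (groups, rows, cols)
    total = stack.sum(axis=0)
    with np.errstate(divide="ignore", invalid="ignore"):
        p = np.where(total > 0, stack / total, 0.0)
        h = -np.where(p > 0, p * np.log(p), 0.0).sum(axis=0)
    return np.where(total > 0, h, np.nan)

# Two toy 2x2 grids for two groups: one fully mixed cell, two single-group cells,
# and one moderately mixed cell.
group_a = np.array([[50, 100], [0, 25]])
group_b = np.array([[50, 0], [80, 75]])
print(shannon_diversity([group_a, group_b]).round(2))
```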
A new pressure ulcer conceptual framework.
Coleman, Susanne; Nixon, Jane; Keen, Justin; Wilson, Lyn; McGinnis, Elizabeth; Dealey, Carol; Stubbs, Nikki; Farrin, Amanda; Dowding, Dawn; Schols, Jos M G A; Cuddigan, Janet; Berlowitz, Dan; Jude, Edward; Vowden, Peter; Schoonhoven, Lisette; Bader, Dan L; Gefen, Amit; Oomens, Cees W J; Nelson, E Andrea
2014-10-01
This paper discusses the critical determinants of pressure ulcer development and proposes a new pressure ulcer conceptual framework. Recent work to develop and validate a new evidence-based pressure ulcer risk assessment framework was undertaken. This formed part of a Pressure UlceR Programme Of reSEarch (RP-PG-0407-10056), funded by the National Institute for Health Research. The foundation for the risk assessment component incorporated a systematic review and a consensus study that highlighted the need to propose a new conceptual framework. Discussion Paper. The new conceptual framework links evidence from biomechanical, physiological and epidemiological evidence, through use of data from a systematic review (search conducted March 2010), a consensus study (conducted December 2010-2011) and an international expert group meeting (conducted December 2011). A new pressure ulcer conceptual framework incorporating key physiological and biomechanical components and their impact on internal strains, stresses and damage thresholds is proposed. Direct and key indirect causal factors suggested in a theoretical causal pathway are mapped to the physiological and biomechanical components of the framework. The new proposed conceptual framework provides the basis for understanding the critical determinants of pressure ulcer development and has the potential to influence risk assessment guidance and practice. It could also be used to underpin future research to explore the role of individual risk factors conceptually and operationally. By integrating existing knowledge from epidemiological, physiological and biomechanical evidence, a theoretical causal pathway and new conceptual framework are proposed with potential implications for practice and research. © 2014 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.
Road Map For Diffusion Of Innovation In Health Care.
Balas, E Andrew; Chapman, Wendy W
2018-02-01
New scientific knowledge and innovation are often slow to disseminate. In other cases, providers rush into adopting what appears to be a clinically relevant innovation, based on a single clinical trial. In reality, adopting innovations without appropriate translation and repeated testing of practical application is problematic. In this article we provide examples of clinical innovations (for example, tight glucose control in critically ill patients) that were adopted inappropriately and that caused what we term a malfunction. To address the issue of malfunctions, we review various examples and suggest frameworks for the diffusion of knowledge leading to the adoption of useful innovations. The resulting model is termed an integrated road map for coordinating knowledge transformation and innovation adoption. We make recommendations for the targeted development of practice change procedures, practice change assessment, structured descriptions of tested interventions, intelligent knowledge management technologies, and policy support for knowledge transformation, including further standardization to facilitate sharing among institutions.
Diffeomorphic Sulcal Shape Analysis on the Cortex
Joshi, Shantanu H.; Cabeen, Ryan P.; Joshi, Anand A.; Sun, Bo; Dinov, Ivo; Narr, Katherine L.; Toga, Arthur W.; Woods, Roger P.
2014-01-01
We present a diffeomorphic approach for constructing intrinsic shape atlases of sulci on the human cortex. Sulci are represented as square-root velocity functions of continuous open curves in ℝ³, and their shapes are studied as functional representations of an infinite-dimensional sphere. This spherical manifold has some advantageous properties – it is equipped with a Riemannian metric on the tangent space and facilitates computational analyses and correspondences between sulcal shapes. Sulcal shape mapping is achieved by computing geodesics in the quotient space of shapes modulo scales, translations, rigid rotations and reparameterizations. The resulting sulcal shape atlas preserves important local geometry inherently present in the sample population. The sulcal shape atlas is integrated in a cortical registration framework and exhibits better geometric matching compared to the conventional Euclidean method. We demonstrate experimental results for sulcal shape mapping, cortical surface registration, and sulcal classification for two different surface extraction protocols for separate subject populations. PMID:22328177
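The square-root velocity representation itself is simple to compute for a sampled curve: differentiate and scale by the inverse square root of the speed. The sketch below does that for a toy helix standing in for a sulcal curve; it stops short of the reparameterization-invariant geodesic computations on the shape sphere that the atlas construction requires.

```python
import numpy as np

def square_root_velocity(curve):
    """Square-root velocity function (SRVF) q(t) = c'(t) / sqrt(||c'(t)||) of a
    discretely sampled curve in R^3. This is the representation under which
    the sulcal shapes are compared; the subsequent geodesic computations on
    the shape sphere are omitted here."""
    c = np.asarray(curve, dtype=float)              # shape: (n_points, 3)
    deriv = np.gradient(c, axis=0)                  # finite-difference derivative
    speed = np.linalg.norm(deriv, axis=1)
    speed = np.where(speed > 1e-12, speed, 1e-12)   # avoid division by zero
    return deriv / np.sqrt(speed)[:, None]

# A toy open curve: one turn of a helix standing in for a sulcal curve.
t = np.linspace(0, 2 * np.pi, 100)
helix = np.c_[np.cos(t), np.sin(t), 0.1 * t]
q = square_root_velocity(helix)
print(q.shape, np.round(np.sum(np.linalg.norm(q, axis=1) ** 2), 2))  # sum ||q||^2 ~ curve length
```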
Architecture of fluid intelligence and working memory revealed by lesion mapping.
Barbey, Aron K; Colom, Roberto; Paul, Erick J; Grafman, Jordan
2014-03-01
Although cognitive neuroscience has made valuable progress in understanding the role of the prefrontal cortex in human intelligence, the functional networks that support adaptive behavior and novel problem solving remain to be well characterized. Here, we studied 158 human brain lesion patients to investigate the cognitive and neural foundations of key competencies for fluid intelligence and working memory. We administered a battery of neuropsychological tests, including the Wechsler Adult Intelligence Scale (WAIS) and the N-Back task. Latent variable modeling was applied to obtain error-free scores of fluid intelligence and working memory, followed by voxel-based lesion-symptom mapping to elucidate their neural substrates. The observed latent variable modeling and lesion results support an integrative framework for understanding the architecture of fluid intelligence and working memory and make specific recommendations for the interpretation and application of the WAIS and N-Back task to the study of fluid intelligence in health and disease.
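Voxel-based lesion-symptom mapping reduces, at its core, to a massive set of voxel-wise comparisons between patients with and without damage at each location. The sketch below runs that comparison as a two-sample t-test per voxel on synthetic data; real analyses such as the one reported here add latent-variable scores, covariates, and multiple-comparison control, none of which are shown.

```python
import numpy as np
from scipy.stats import ttest_ind

def vlsm_t_map(lesion_masks, scores, min_patients=5):
    """Simplified voxel-based lesion-symptom mapping: at every voxel, compare
    the behavioural scores of patients with versus without a lesion there
    using a two-sample t-test. Voxels with too few patients in either group
    are left as NaN."""
    masks = np.asarray(lesion_masks, dtype=bool)     # shape: (patients, voxels)
    scores = np.asarray(scores, dtype=float)
    t_map = np.full(masks.shape[1], np.nan)
    for v in range(masks.shape[1]):
        lesioned, intact = scores[masks[:, v]], scores[~masks[:, v]]
        if len(lesioned) >= min_patients and len(intact) >= min_patients:
            t_map[v], _ = ttest_ind(lesioned, intact)
    return t_map

rng = np.random.default_rng(0)
masks = rng.random((60, 8)) < 0.3                    # toy lesion maps over 8 voxels
scores = rng.normal(100, 15, 60) - 10 * masks[:, 2]  # damage at voxel 2 lowers the score
print(np.round(vlsm_t_map(masks, scores), 2))
```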
NASA Astrophysics Data System (ADS)
Agate, M.; Catalano, R.; Chemello, R.; Lo Iacono, C.; Riggio, S.
2003-04-01
A GEOLOGICAL-ACOUSTICAL FRAMEWORK FOR AN INTEGRATED ENVIRONMENTAL EVALUATION IN MEDITERRANEAN MARINE PROTECTED AREAS. MARETTIMO ISLAND, A CASE STUDY. M. Agate (1), R. Catalano (1), R. Chemello (2), C. Lo Iacono (1) & S. Riggio (2) (1) Dipartimento di Geologia e Geodesia dell'Università di Palermo, via Archirafi 26, 90123 Palermo, clageo@katamail.com, rcatal@unipa.it (2) Dipartimento di Biologia animale dell'Università di Palermo, via Archirafi 18, 90123 Palermo, rchemello@unipa.it New analytical methods have been designed to support an objective, quantitative evaluation of geological components, whose results set the guidelines for a sustainable use of natural resources. We adopted the fundamentals of the seascape concept, based on the thematic elements of landscape ecology and translated into terms consistent with the principles of coastal ecology. The seascape concept is central to our view of the environment and is treated as an integrated unit (Environmental Unit) resulting from a long multidisciplinary effort, carried out in both the field and the laboratory by an interdisciplinary team of experts. Side-scan sonar and multibeam acoustic data collected on the inner shelves of the Marettimo and Ustica Islands (south-western Tyrrhenian Sea) make it possible to sketch geomorphological and sedimentological maps, whose details have been verified down to 45 m depth in diving surveys. On the basis of the collected data sets, the inner shelf (0-60 m) has been subdivided into different portions, following the concept of the Environmental Unit (E.U.). Every E.U. presents constant morphological and sedimentological features that can probably be associated with specific biological communities. To find the relationships between physical settings and communities, geological thematic maps are then overlaid on and fitted to macrobenthic and fishery spatial distribution maps. The result, based on the rules of Environmental Impact Assessment, highlights the major environmental features and territorial links useful for correct evaluation and management of a Marine Protected Area. This strategy has informed the GEBEC project, designed to sketch an overall picture of some coastal areas in the Southern and Central Mediterranean (Egadi Islands, S. Maria di Castellabate coast, Ustica Island) needing protection and sustainable development.
Integrating Health Behavior Theory and Design Elements in Serious Games.
Cheek, Colleen; Fleming, Theresa; Lucassen, Mathijs Fg; Bridgman, Heather; Stasiak, Karolina; Shepherd, Matthew; Orpin, Peter
2015-01-01
Internet interventions for improving health and well-being have the potential to reach many people and fill gaps in service provision. Serious gaming interfaces provide opportunities to optimize user adherence and impact. Health interventions based in theory and evidence and tailored to psychological constructs have been found to be more effective in promoting behavior change. Defining the design elements that engage users and help them to meet their goals can contribute to better informed serious games. Our aim was to elucidate design elements important in SPARX, a serious game for adolescents with depression, from a user-centered perspective. We proposed a model based on an established theory of health behavior change and practical features of serious game design to organize ideas and rationale. We analyzed data from 5 studies comprising a total of 22 focus groups and 66 semistructured interviews conducted with youth and families in New Zealand and Australia who had viewed or used SPARX. User perceptions of the game were applied to this framework. A coherent framework was established using the three constructs of self-determination theory (SDT), autonomy, competence, and relatedness, to organize user perceptions and design elements within four areas important in design: computer game, accessibility, working alliance, and learning in immersion. User perceptions mapped well to the framework, which may assist developers in understanding the context of user needs. By mapping these elements against the constructs of SDT, we were able to propose a sound theoretical base for the model. This study's method allowed for the articulation of design elements in a serious game from a user-centered perspective within a coherent overarching framework. The framework can be used to deliberately incorporate serious game design elements that support a user's sense of autonomy, competence, and relatedness, key constructs which have been found to mediate motivation at all stages of the change process. The resulting model introduces promising avenues for future exploration. Involving users in program design remains an imperative if serious games are to be fit for purpose.
Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew
2015-01-01
Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the task is challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, and the analytics require complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying analytical procedures for geoscientists. PMID:25742012
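The map and reduce stages at the heart of such a framework can be sketched without Hadoop itself. A minimal Python illustration, assuming a hypothetical record layout of (latitude, longitude, temperature) and one-degree grid cells, emits key-value pairs in the map phase and aggregates them per key in the reduce phase; a Hadoop deployment would distribute exactly these two phases across the cluster and keep inputs and outputs in HBase or HDFS.

```python
from collections import defaultdict

def map_phase(records):
    """Mapper: emit (grid-cell id, temperature) pairs from raw records."""
    for lat, lon, temp in records:
        cell = (round(lat), round(lon))   # hypothetical 1-degree binning
        yield cell, temp

def reduce_phase(pairs):
    """Reducer: average all values that share a grid-cell key."""
    groups = defaultdict(list)
    for key, value in pairs:              # the "shuffle": group by key
        groups[key].append(value)
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

# toy records: (lat, lon, temperature)
records = [(40.2, -105.1, 11.3), (40.4, -105.3, 12.1), (35.0, -90.2, 18.4)]
cell_means = reduce_phase(map_phase(records))
print(cell_means)
```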
1988-02-05
AD-A193 971. IEMIS (Integrated Emergency Management Information System) Floodplain Map.. (U). Army Engineer Waterways Experiment Station, Vicksburg, MS. Illustrates the application of the automated mapping capabilities of the Integrated Emergency Management Information System (IEMIS) to FISs. Unclassified.
Wallace, Bryan P.; DiMatteo, Andrew D.; Hurley, Brendan J.; Finkbeiner, Elena M.; Bolten, Alan B.; Chaloupka, Milani Y.; Hutchinson, Brian J.; Abreu-Grobois, F. Alberto; Amorocho, Diego; Bjorndal, Karen A.; Bourjea, Jerome; Bowen, Brian W.; Dueñas, Raquel Briseño; Casale, Paolo; Choudhury, B. C.; Costa, Alice; Dutton, Peter H.; Fallabrino, Alejandro; Girard, Alexandre; Girondot, Marc; Godfrey, Matthew H.; Hamann, Mark; López-Mendilaharsu, Milagros; Marcovaldi, Maria Angela; Mortimer, Jeanne A.; Musick, John A.; Nel, Ronel; Pilcher, Nicolas J.; Seminoff, Jeffrey A.; Troëng, Sebastian; Witherington, Blair; Mast, Roderic B.
2010-01-01
Background Resolving threats to widely distributed marine megafauna requires definition of the geographic distributions of both the threats as well as the population unit(s) of interest. In turn, because individual threats can operate on varying spatial scales, their impacts can affect different segments of a population of the same species. Therefore, integration of multiple tools and techniques — including site-based monitoring, genetic analyses, mark-recapture studies and telemetry — can facilitate robust definitions of population segments at multiple biological and spatial scales to address different management and research challenges. Methodology/Principal Findings To address these issues for marine turtles, we collated all available studies on marine turtle biogeography, including nesting sites, population abundances and trends, population genetics, and satellite telemetry. We georeferenced this information to generate separate layers for nesting sites, genetic stocks, and core distributions of population segments of all marine turtle species. We then spatially integrated this information from fine- to coarse-spatial scales to develop nested envelope models, or Regional Management Units (RMUs), for marine turtles globally. Conclusions/Significance The RMU framework is a solution to the challenge of how to organize marine turtles into units of protection above the level of nesting populations, but below the level of species, within regional entities that might be on independent evolutionary trajectories. Among many potential applications, RMUs provide a framework for identifying data gaps, assessing high diversity areas for multiple species and genetic stocks, and evaluating conservation status of marine turtles. Furthermore, RMUs allow for identification of geographic barriers to gene flow, and can provide valuable guidance to marine spatial planning initiatives that integrate spatial distributions of protected species and human activities. In addition, the RMU framework — including maps and supporting metadata — will be an iterative, user-driven tool made publicly available in an online application for comments, improvements, download and analysis. PMID:21253007
Chambers, Jeanne C.; Beck, Jeffrey L.; Bradford, John B.; Bybee, Jared; Campbell, Steve; Carlson, John; Christiansen, Thomas J; Clause, Karen J.; Collins, Gail; Crist, Michele R.; Dinkins, Jonathan B.; Doherty, Kevin E.; Edwards, Fred; Espinosa, Shawn; Griffin, Kathleen A.; Griffin, Paul; Haas, Jessica R.; Hanser, Steven E.; Havlina, Douglas W.; Henke, Kenneth F.; Hennig, Jacob D.; Joyce, Linda A; Kilkenny, Francis F.; Kulpa, Sarah M; Kurth, Laurie L; Maestas, Jeremy D; Manning, Mary E.; Mayer, Kenneth E.; Mealor, Brian A.; McCarthy, Clinton; Pellant, Mike; Perea, Marco A.; Prentice, Karen L.; Pyke, David A.; Wiechman , Lief A.; Wuenschel, Amarina
2017-01-01
The Science Framework is intended to link the Department of the Interior’s Integrated Rangeland Fire Management Strategy with long-term strategic conservation actions in the sagebrush biome. The Science Framework provides a multiscale approach for prioritizing areas for management and determining effective management strategies within the sagebrush biome. The emphasis is on sagebrush (Artemisia spp.) ecosystems and Greater sage-grouse (Centrocercus urophasianus). The approach provided in the Science Framework links sagebrush ecosystem resilience to disturbance and resistance to nonnative, invasive plant species to species habitat information based on the distribution and abundance of focal species. A geospatial process is presented that overlays information on ecosystem resilience and resistance, species habitats, and predominant threats and that can be used at the mid-scale to prioritize areas for management. A resilience and resistance habitat matrix is provided that can help decisionmakers evaluate risks and determine appropriate management strategies. Prioritized areas and management strategies can be refined by managers and stakeholders at the local scale based on higher resolution data and local knowledge. Decision tools are discussed for determining appropriate management actions for areas that are prioritized for management. Geospatial data, maps, and models are provided through the U.S. Geological Survey (USGS) ScienceBase and Bureau of Land Management (BLM) Landscape Approach Data Portal. The Science Framework is intended to be adaptive and will be updated as additional data become available on other values and species at risk. It is anticipated that the Science Framework will be widely used to: (1) inform emerging strategies to conserve sagebrush ecosystems, sagebrush dependent species, and human uses of the sagebrush system, and (2) assist managers in prioritizing and planning on-the-ground restoration and mitigation actions across the sagebrush biome.
2013-01-01
Background Molecular biology knowledge can be formalized and systematically represented in a computer-readable form as a comprehensive map of molecular interactions. There exists an increasing number of maps of molecular interactions containing detailed and step-wise descriptions of various cell mechanisms. It is difficult to explore these large maps, to organize discussion of their content and to maintain them. Several efforts were recently made to combine these capabilities together in one environment, and NaviCell is one of them. Results NaviCell is a web-based environment for exploiting large maps of molecular interactions, created in CellDesigner, allowing their easy exploration, curation and maintenance. It is characterized by a combination of three essential features: (1) efficient map browsing based on Google Maps; (2) semantic zooming for viewing different levels of details or of abstraction of the map and (3) integrated web-based blog for collecting community feedback. NaviCell can be easily used by experts in the field of molecular biology for studying molecular entities of interest in the context of signaling pathways and crosstalk between pathways within a global signaling network. NaviCell allows both exploration of detailed molecular mechanisms represented on the map and a more abstract view of the map up to a top-level modular representation. NaviCell greatly facilitates curation, maintenance and updating of the comprehensive maps of molecular interactions in an interactive and user-friendly fashion due to an embedded blogging system. Conclusions NaviCell provides user-friendly exploration of large-scale maps of molecular interactions, thanks to Google Maps and WordPress interfaces, with which many users are already familiar. Semantic zooming, which is used for navigating geographical maps, is adopted for molecular maps in NaviCell, making any level of visualization readable. In addition, NaviCell provides a framework for community-based curation of maps. PMID:24099179
Kuperstein, Inna; Cohen, David P A; Pook, Stuart; Viara, Eric; Calzone, Laurence; Barillot, Emmanuel; Zinovyev, Andrei
2013-10-07
Molecular biology knowledge can be formalized and systematically represented in a computer-readable form as a comprehensive map of molecular interactions. There exists an increasing number of maps of molecular interactions containing detailed and step-wise descriptions of various cell mechanisms. It is difficult to explore these large maps, to organize discussion of their content and to maintain them. Several efforts were recently made to combine these capabilities together in one environment, and NaviCell is one of them. NaviCell is a web-based environment for exploiting large maps of molecular interactions, created in CellDesigner, allowing their easy exploration, curation and maintenance. It is characterized by a combination of three essential features: (1) efficient map browsing based on Google Maps; (2) semantic zooming for viewing different levels of details or of abstraction of the map and (3) integrated web-based blog for collecting community feedback. NaviCell can be easily used by experts in the field of molecular biology for studying molecular entities of interest in the context of signaling pathways and crosstalk between pathways within a global signaling network. NaviCell allows both exploration of detailed molecular mechanisms represented on the map and a more abstract view of the map up to a top-level modular representation. NaviCell greatly facilitates curation, maintenance and updating of the comprehensive maps of molecular interactions in an interactive and user-friendly fashion due to an embedded blogging system. NaviCell provides user-friendly exploration of large-scale maps of molecular interactions, thanks to Google Maps and WordPress interfaces, with which many users are already familiar. Semantic zooming, which is used for navigating geographical maps, is adopted for molecular maps in NaviCell, making any level of visualization readable. In addition, NaviCell provides a framework for community-based curation of maps.
Elementary Integrated Curriculum Framework
ERIC Educational Resources Information Center
Montgomery County Public Schools, 2010
2010-01-01
The Elementary Integrated Curriculum (EIC) Framework is the guiding curriculum document for the Elementary Integrated Curriculum and represents the elementary portion of the Montgomery County (Maryland) Public Schools (MCPS) Pre-K-12 Curriculum Frameworks. The EIC Framework contains the detailed indicators and objectives that describe what…
Distributed neural system for emotional intelligence revealed by lesion mapping.
Barbey, Aron K; Colom, Roberto; Grafman, Jordan
2014-03-01
Cognitive neuroscience has made considerable progress in understanding the neural architecture of human intelligence, identifying a broadly distributed network of frontal and parietal regions that support goal-directed, intelligent behavior. However, the contributions of this network to social and emotional aspects of intellectual function remain to be well characterized. Here we investigated the neural basis of emotional intelligence in 152 patients with focal brain injuries using voxel-based lesion-symptom mapping. Latent variable modeling was applied to obtain measures of emotional intelligence, general intelligence and personality from the Mayer, Salovey, Caruso Emotional Intelligence Test (MSCEIT), the Wechsler Adult Intelligence Scale and the Neuroticism-Extroversion-Openness Inventory, respectively. Regression analyses revealed that latent scores for measures of general intelligence and personality reliably predicted latent scores for emotional intelligence. Lesion mapping results further indicated that these convergent processes depend on a shared network of frontal, temporal and parietal brain regions. The results support an integrative framework for understanding the architecture of executive, social and emotional processes and make specific recommendations for the interpretation and application of the MSCEIT to the study of emotional intelligence in health and disease.
Distributed neural system for emotional intelligence revealed by lesion mapping
Colom, Roberto; Grafman, Jordan
2014-01-01
Cognitive neuroscience has made considerable progress in understanding the neural architecture of human intelligence, identifying a broadly distributed network of frontal and parietal regions that support goal-directed, intelligent behavior. However, the contributions of this network to social and emotional aspects of intellectual function remain to be well characterized. Here we investigated the neural basis of emotional intelligence in 152 patients with focal brain injuries using voxel-based lesion-symptom mapping. Latent variable modeling was applied to obtain measures of emotional intelligence, general intelligence and personality from the Mayer, Salovey, Caruso Emotional Intelligence Test (MSCEIT), the Wechsler Adult Intelligence Scale and the Neuroticism-Extroversion-Openness Inventory, respectively. Regression analyses revealed that latent scores for measures of general intelligence and personality reliably predicted latent scores for emotional intelligence. Lesion mapping results further indicated that these convergent processes depend on a shared network of frontal, temporal and parietal brain regions. The results support an integrative framework for understanding the architecture of executive, social and emotional processes and make specific recommendations for the interpretation and application of the MSCEIT to the study of emotional intelligence in health and disease. PMID:23171618
Designing Caregiver-Implemented Shared-Reading Interventions to Overcome Implementation Barriers
Logan, Jessica R.; Damschroder, Laura
2015-01-01
Purpose This study presents an application of the theoretical domains framework (TDF; Michie et al., 2005), an integrative framework drawing on behavior-change theories, to speech-language pathology. Methods A multistep procedure was used to identify barriers affecting caregivers' implementation of shared-reading interventions with their children with language impairment (LI). The authors examined caregiver-level data corresponding to implementation issues from two randomized controlled trials and mapped these to domains in the TDF as well as empirically validated behavior-change techniques. Results Four barriers to implementation were identified as potentially affecting caregivers' implementation: time pressures, reading difficulties, discomfort with reading, and lack of awareness of benefits. These were mapped to 3 TDF domains: intentions, beliefs about capabilities, and skills. In turn, 4 behavior-change techniques were identified as potential vehicles for affecting these domains: reward, feedback, model, and encourage. An ongoing study is described that is determining the effects of these techniques for improving caregivers' implementation of a shared-reading intervention. Conclusions A description of the steps to identifying barriers to implementation, in conjunction with an ongoing experiment that will explicitly determine whether behavior-change techniques affect these barriers, provides a model for how implementation science can be used to identify and overcome implementation barriers in the treatment of communication disorders. PMID:26262941
Designing Caregiver-Implemented Shared-Reading Interventions to Overcome Implementation Barriers.
Justice, Laura M; Logan, Jessica R; Damschroder, Laura
2015-12-01
This study presents an application of the theoretical domains framework (TDF; Michie et al., 2005), an integrative framework drawing on behavior-change theories, to speech-language pathology. A multistep procedure was used to identify barriers affecting caregivers' implementation of shared-reading interventions with their children with language impairment (LI). The authors examined caregiver-level data corresponding to implementation issues from two randomized controlled trials and mapped these to domains in the TDF as well as empirically validated behavior-change techniques. Four barriers to implementation were identified as potentially affecting caregivers' implementation: time pressures, reading difficulties, discomfort with reading, and lack of awareness of benefits. These were mapped to 3 TDF domains: intentions, beliefs about capabilities, and skills. In turn, 4 behavior-change techniques were identified as potential vehicles for affecting these domains: reward, feedback, model, and encourage. An ongoing study is described that is determining the effects of these techniques for improving caregivers' implementation of a shared-reading intervention. A description of the steps to identifying barriers to implementation, in conjunction with an ongoing experiment that will explicitly determine whether behavior-change techniques affect these barriers, provides a model for how implementation science can be used to identify and overcome implementation barriers in the treatment of communication disorders.
Statistical framework and noise sensitivity of the amplitude radial correlation contrast method.
Kipervaser, Zeev Gideon; Pelled, Galit; Goelman, Gadi
2007-09-01
A statistical framework for the amplitude radial correlation contrast (RCC) method, which integrates a conventional pixel threshold approach with cluster-size statistics, is presented. The RCC method uses functional MRI (fMRI) data to group neighboring voxels in terms of their degree of temporal cross correlation and compares coherences in different brain states (e.g., stimulation OFF vs. ON). By defining the RCC correlation map as the difference between two RCC images, the map distribution of two OFF states is shown to be normal, enabling the definition of the pixel cutoff. The empirical cluster-size null distribution obtained after the application of the pixel cutoff is used to define a cluster-size cutoff that allows 5% false positives. Assuming that the fMRI signal equals the task-induced response plus noise, an analytical expression of amplitude-RCC dependency on noise is obtained and used to define the pixel threshold. In vivo and ex vivo data obtained during rat forepaw electric stimulation are used to fine-tune this threshold. Calculating the spatial coherences within in vivo and ex vivo images shows enhanced coherence in the in vivo data, but no dependency on the anesthesia method, magnetic field strength, or depth of anesthesia, strengthening the generality of the proposed cutoffs. Copyright (c) 2007 Wiley-Liss, Inc.
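A simplified sketch of the combined pixel-threshold and cluster-size statistics, assuming synthetic difference maps in place of real RCC images and a plain z-score in place of the analytically derived pixel threshold, might look as follows:

```python
import numpy as np
from scipy import ndimage, stats

def pixel_threshold(diff_map, alpha=0.05):
    """Threshold pixels of an RCC difference map, assuming the OFF-vs-OFF
    difference distribution is approximately normal."""
    z = (diff_map - diff_map.mean()) / diff_map.std()
    return z > stats.norm.ppf(1.0 - alpha)

def cluster_sizes(mask):
    """Sizes of connected supra-threshold clusters."""
    labels, n = ndimage.label(mask)
    if n == 0:
        return np.array([0.0])
    return ndimage.sum(mask, labels, index=np.arange(1, n + 1))

def cluster_size_cutoff(off_off_maps, alpha=0.05):
    """Empirical cluster-size null: pool cluster sizes from OFF-vs-OFF maps
    and take the (1 - alpha) quantile, allowing ~5% false-positive clusters."""
    sizes = np.concatenate([cluster_sizes(pixel_threshold(m)) for m in off_off_maps])
    return np.quantile(sizes, 1.0 - alpha)

rng = np.random.default_rng(0)
off_off = [rng.normal(size=(64, 64)) for _ in range(20)]   # null difference maps
on_off = rng.normal(size=(64, 64))
on_off[20:30, 20:30] += 2.0                                # injected coherence change
cutoff = cluster_size_cutoff(off_off)
surviving = cluster_sizes(pixel_threshold(on_off)) > cutoff
print("cluster-size cutoff:", cutoff, "| surviving clusters:", int(surviving.sum()))
```

The OFF-vs-OFF maps play the role of the empirical null from which the 5% cluster-size cutoff is drawn.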
NASA Astrophysics Data System (ADS)
Pullanagari, Reddy; Kereszturi, Gábor; Yule, Ian J.; Ghamisi, Pedram
2017-04-01
Accurate and spatially detailed mapping of complex urban environments is essential for land managers. Classifying high spectral and spatial resolution hyperspectral images is a challenging task because of their data abundance and computational complexity. Approaches that combine spectral and spatial information in a single classification framework have attracted special attention because of their potential to improve classification accuracy. We extracted multiple features from the spectral and spatial domains of hyperspectral images and evaluated them with two supervised classification algorithms: support vector machines (SVM) and an artificial neural network. The spatial features considered are produced by a gray level co-occurrence matrix and extended multiattribute profiles. All of these features were stacked, and the most informative features were selected using a genetic algorithm-based SVM. After selecting the most informative features, the classification model was integrated with a segmentation map derived using a hidden Markov random field. We tested the proposed method on a real application of a hyperspectral image acquired from AisaFENIX and on widely used hyperspectral images. From the results, it can be concluded that the proposed framework significantly improves classification results across different spectral and spatial resolutions and different instrumentation.
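A reduced sketch of the spectral-spatial stacking idea, with local mean and variance standing in for the GLCM and attribute-profile features, a synthetic cube standing in for AisaFENIX data, and no genetic-algorithm feature selection or Markov-random-field segmentation, can be written with scikit-learn:

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def spatial_features(cube, size=5):
    """Append per-band local mean and variance to the spectral bands
    (a simple stand-in for GLCM / attribute-profile textures)."""
    mean = uniform_filter(cube, size=(size, size, 1))
    var = uniform_filter(cube ** 2, size=(size, size, 1)) - mean ** 2
    return np.concatenate([cube, mean, var], axis=2)

rng = np.random.default_rng(1)
cube = rng.random((60, 60, 30))                      # synthetic hyperspectral cube
labels = np.digitize(cube[:, :, 0], [0.33, 0.66])    # 3 synthetic ground-truth classes

feats = spatial_features(cube)
X = feats.reshape(-1, feats.shape[2])
y = labels.ravel()
train = rng.random(X.shape[0]) < 0.1                 # small labeled subset

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X[train], y[train])
class_map = clf.predict(X).reshape(labels.shape)
test = ~train.reshape(labels.shape)
print("accuracy on unlabeled pixels:", (class_map[test] == labels[test]).mean())
```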
Conflation and integration of archived geologic maps and associated uncertainties
Shoberg, Thomas G.
2016-01-01
Old, archived geologic maps are often available with little or no associated metadata. This creates special problems in terms of extracting their data to use with a modern database. This research focuses on some problems and uncertainties associated with conflating older geologic maps in regions where modern geologic maps are, as yet, non-existent as well as vertically integrating the conflated maps with layers of modern GIS data (in this case, The National Map of the U.S. Geological Survey). Ste. Genevieve County, Missouri was chosen as the test area. It is covered by six archived geologic maps constructed in the years between 1928 and 1994. Conflating these maps results in a map that is internally consistent with these six maps, is digitally integrated with hydrography, elevation and orthoimagery data, and has a 95% confidence interval useful for further data set integration.
Integrating Databases with Maps: The Delivery of Cultural Data through TimeMap.
ERIC Educational Resources Information Center
Johnson, Ian
TimeMap is a unique integration of database management, metadata and interactive maps, designed to contextualise and deliver cultural data through maps. TimeMap extends conventional maps with the time dimension, creating and animating maps "on-the-fly"; delivers them as a kiosk application or embedded in Web pages; links flexibly to…
Myers, Jeffrey D.
2012-01-01
Maps are often used to convey information generated by models, for example, modeled cancer risk from air pollution. The concrete nature of images, such as maps, may convey more certainty than warranted for modeled information. Three map features were selected to communicate the uncertainty of modeled cancer risk: (a) map contours appeared in or out of focus, (b) one or three colors were used, and (c) a verbal-relative or numeric risk expression was used in the legend. Study aims were to assess how these features influenced risk beliefs and the ambiguity of risk beliefs at four assigned map locations that varied by risk level. We applied an integrated conceptual framework to conduct this full factorial experiment with 32 maps that varied by the three dichotomous features and four risk levels; 826 university students participated. Data was analyzed using structural equation modeling. Unfocused contours and the verbal-relative risk expression generated more ambiguity than their counterparts. Focused contours generated stronger risk beliefs for higher risk levels and weaker beliefs for lower risk levels. Number of colors had minimal influence. The magnitude of risk level, conveyed using incrementally darker shading, had a substantial dose-response influence on the strength of risk beliefs. Personal characteristics of prior beliefs and numeracy also had substantial influences. Bottom-up and top-down information processing suggest why iconic visual features of incremental shading and contour focus had the strongest visual influences on risk beliefs and ambiguity. Variations in contour focus and risk expression show promise for fostering appropriate levels of ambiguity. PMID:22985196
Hawboldt, John; Nash, Rose; FitzPatrick, Beverly
2017-03-06
International standards of pharmacy curricula are necessary to ensure student readiness for international placements. This paper explores whether curricula from two pharmacy programs, in Australia and Canada, are congruent with international standards and if students feel prepared for international placements. Nationally prescribed educational standards for the two schools were compared to each other and then against the International Pharmaceutical Federation (FIP) Global Competency Framework. Written student reflections complemented this analysis. Mapping results suggested substantial agreement between the FIP framework and Australia and Canada, with two gaps being identified. Moreover, the students felt their programs prepared them for their international placements. Despite differences in countries, pharmacy programs, and health-systems all students acclimatized to their new practice sites. Implications are that if pharmacy programs align well with FIP, pharmacists should be able to integrate and practise in other jurisdictions that also align with the FIP. This has implications for the mobility of pharmacy practitioners to countries not of their origin of training.
NASA Astrophysics Data System (ADS)
Creed, I. F.; Webster, K. L.; Kreutzweiser, D. P.; Beall, F.
2014-12-01
Canada's boreal forest supports many aquatic ecosystem services (AES) due to the intimate linkage between aquatic systems and their surrounding terrestrial watersheds in forested landscapes. There is an increasing risk to AES because natural development activities (forest management, mining, energy) have resulted in disruptions that deteriorate aquatic ecosystems at local (tens of km²) to regional (hundreds of km²) scales. These activities are intensifying and expanding, placing at risk the healthy aquatic ecosystems that provide AES, which may threaten the continued development of the energy, forest, and mining sectors. Remarkably, we know little about the consequences of these activities for AES. The idea that AES should be explicitly integrated into modern natural resource management regulations is gaining broad acceptance. A major need is the ability to measure cumulative effects and determine thresholds (the points where aquatic ecosystems and their services cannot recover to a desired state within a reasonable time frame) in these cumulative effects. However, there is no single conceptual approach to assessing cumulative effects that is widely accepted by both scientists and managers. We present an integrated science-policy framework that enables the integration of AES into forest management risk assessment and prevention/mitigation strategies. We use this framework to explore the risk of further deterioration of AES by (1) setting risk criteria; (2) using emerging technologies to map process-based indicators representing causes and consequences of risk events for the deterioration of AES; (3) assessing existing prevention and mitigation policies in place to avoid risk events; and (4) identifying priorities for policy change needed to reduce risk events. Ultimately, the success of this framework requires that higher value be placed on AES and, in turn, that the science and management of the boreal forest be improved.
Mapping of Supply Chain Learning: A Framework for SMEs
ERIC Educational Resources Information Center
Thakkar, Jitesh; Kanda, Arun; Deshmukh, S. G.
2011-01-01
Purpose: The aim of this paper is to propose a mapping framework for evaluating supply chain learning potential for the context of small- to medium-sized enterprises (SMEs). Design/methodology/approach: The extracts of recently completed case based research for ten manufacturing SME units and facts reported in the previous research are utilized…
Integrated Tourism E-Commerce Platform for Scenery Administration Bureau, Travel Agency and Tourist
NASA Astrophysics Data System (ADS)
Liang, Zhixue; Wang, Shui
Collaboration among multiple travel agencies and with scenery administration bureaus is vital for small or medium-sized travel companies to succeed in the fierce competition of the tourism industry; business processes such as regrouping individual travelers between different agencies prove to be difficult and make for an unpleasant user experience; tourists want to be more informed and to have more initiative. To address these issues, this paper proposes an integrated tourism e-commerce platform that allows travel agencies, scenery administration bureaus and tourists to interact more smoothly. The platform is constructed upon the J2EE framework and provides online collaboration and coordination for companies as well as information services (such as self-navigation using Google Maps) for tourists. A running implementation of this platform has been put into real business use by a small travel company.
NASA Astrophysics Data System (ADS)
Falco, N.; Wainwright, H. M.; Dafflon, B.; Leger, E.; Peterson, J.; Steltzer, H.; Wilmer, C.; Williams, K. H.; Hubbard, S. S.
2017-12-01
Mountainous watershed systems are characterized by extreme heterogeneity in hydrological and pedological properties that influence biotic activities, plant communities and their dynamics. To gain predictive understanding of how ecosystems and watershed systems evolve under climate change, it is critical to capture such heterogeneity and to quantify the effect of key environmental variables such as topography and soil properties. In this study, we exploit advanced geophysical and remote sensing techniques - coupled with machine learning - to better characterize and quantify the interactions between plant communities' distribution and subsurface properties. First, we have developed a remote sensing data fusion framework based on the random forest (RF) classification algorithm to estimate the spatial distribution of plant communities. The framework allows the integration of both plant spectral and structural information, which are derived from multispectral satellite images and airborne LiDAR data. We then use the RF method to evaluate the estimated plant community map, exploiting the subsurface properties (such as bedrock depth, soil moisture and other properties) and geomorphological parameters (such as slope, curvature) as predictors. Datasets include high-resolution geophysical data (electrical resistivity tomography) and LiDAR digital elevation maps. We demonstrate our approach on a mountain hillslope and meadow within the East River watershed in Colorado, which is considered to be a representative headwater catchment in the Upper Colorado Basin. The obtained results show the existence of co-evolution between above- and below-ground processes; in particular, shrub communities dominate wet and flat areas. We show that successful integration of remote sensing data with geophysical measurements allows us to identify and quantify the key environmental controls on plant communities' distribution, and provides insights into their potential changes under future climate conditions.
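The random forest fusion step can be illustrated with a toy example in which hypothetical NDVI, LiDAR canopy height and slope layers (flattened to per-pixel samples) predict synthetic plant-community labels; the feature importances then play the role of the environmental-control ranking discussed above:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 5000
ndvi = rng.random(n)                     # hypothetical multispectral index
canopy_height = rng.gamma(2.0, 1.5, n)   # hypothetical LiDAR-derived structure
slope = rng.random(n) * 30               # hypothetical DEM-derived predictor

# hypothetical rule generating plant-community labels for the toy data
labels = np.where((ndvi > 0.6) & (canopy_height > 2), 2,
                  np.where(ndvi > 0.4, 1, 0))

X = np.column_stack([ndvi, canopy_height, slope])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)

rf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
rf.fit(X_tr, y_tr)
print("test accuracy:", rf.score(X_te, y_te))
print("feature importances:",
      dict(zip(["ndvi", "height", "slope"], rf.feature_importances_)))
```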
A watershed model to integrate EO data
NASA Astrophysics Data System (ADS)
Jauch, Eduardo; Chambel-Leitao, Pedro; Carina, Almeida; Brito, David; Cherif, Ines; Alexandridis, Thomas; Neves, Ramiro
2013-04-01
MOHID LAND is an open-source watershed model developed by MARETEC and is part of the MOHID Framework. It integrates four media (or compartments): porous media, surface, rivers and atmosphere. The movement of water between these media is based on mass and momentum balance equations. The atmosphere medium is not explicitly simulated. Instead, it is used as a boundary condition to the model through meteorological properties: precipitation, solar radiation, wind speed/direction, relative humidity and air temperature. The surface medium includes the overland runoff and vegetation growth processes and is simulated using a 2D grid. The porous media includes both the unsaturated (soil) and saturated (aquifer) zones and is simulated using a 3D grid. River flow is simulated through a 1D drainage network. All these media are linked through evapotranspiration and flow exchanges (infiltration, river-soil-groundwater flow, surface-river overland flow). Besides the water movement, it is also possible to simulate water quality processes and solute/sediment transport. Model setup includes the definition of the geometry and the properties of each of its compartments. After the setup of the model, the only continuous input data that MOHID LAND requires are the atmosphere properties (boundary conditions), which can be provided as time series or spatial data. MOHID LAND has been adapted over the last 4 years under FP7 and ESA projects to integrate Earth Observation (EO) data, both variable in time and in space. EO data can be used to calibrate/validate the model or as input/assimilation data. The EO data currently used include LULC (Land Use Land Cover) maps, LAI (Leaf Area Index) maps, EVTP (Evapotranspiration) maps and SWC (Soil Water Content) maps. Model results are improved by the EO data, but the advantage of this integration is that the model can still run without the EO data. This means that the model does not stop due to unavailability of EO data and can run in a forecast mode. The LULC maps are coupled with a database that transforms land use into model properties through lookup tables. The LAI maps, usually based on NDVI satellite images, can be used directly as input to the model. When vegetation growth is being simulated, the use of a LAI distributed in space improves the model results by improving, for example, the estimated evapotranspiration, the estimated values of biomass, the nutrient uptake, etc. MOHID LAND calculates a Reference Evapotranspiration (rEVTP) based on the meteorological properties. The Actual Evapotranspiration (aEVTP) is then computed based on vegetation transpiration, soil evaporation and the available water in soil. Alternatively, EO-derived maps of EVTP can be used as input to the model in place of the rEVTP, or even in place of the aEVTP, both being provided as boundary conditions. The same can be done with SWC maps, which can be used to initialize the model's soil water content. The integration of EO data with MOHID LAND was tested and is being continuously developed and applied to support farmers and to help water managers improve water management.
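The lookup-table coupling of LULC maps to model properties can be sketched as follows; the class codes and parameter values are hypothetical illustrations, not the MOHID LAND database:

```python
import numpy as np

# hypothetical lookup table: LULC class code -> (Manning n, rooting depth [m])
LULC_TABLE = {
    1: {"manning_n": 0.035, "root_depth": 0.30},   # cropland
    2: {"manning_n": 0.100, "root_depth": 1.20},   # forest
    3: {"manning_n": 0.025, "root_depth": 0.10},   # bare soil
}

def lulc_to_property(lulc_grid, prop):
    """Turn an EO-derived land-cover grid into a gridded model parameter."""
    out = np.full(lulc_grid.shape, np.nan)
    for code, props in LULC_TABLE.items():
        out[lulc_grid == code] = props[prop]
    return out

lulc = np.array([[1, 1, 2], [2, 3, 3]])
print(lulc_to_property(lulc, "manning_n"))
```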
Depth image super-resolution via semi self-taught learning framework
NASA Astrophysics Data System (ADS)
Zhao, Furong; Cao, Zhiguo; Xiao, Yang; Zhang, Xiaodi; Xian, Ke; Li, Ruibo
2017-06-01
Depth images have recently attracted much attention in computer vision and in high-quality 3D content for 3DTV and 3D movies. In this paper, we present a new semi-self-taught learning framework for enhancing the resolution of depth maps without making use of ancillary color image data at the target resolution, or of multiple aligned depth maps. Our framework consists of cascaded random forests that proceed from coarse to fine results. We learn the surface information and structure transformations both from a small set of high-quality depth exemplars and from the input depth map itself across different scales. Considering that edges play an important role in depth map quality, we optimize an effective regularized objective that operates on the output image space and the input edge space within the random forests. Experiments show the effectiveness and superiority of our method against other techniques with or without aligned RGB information.
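One stage of the coarse-to-fine residual learning can be sketched with a single random forest; the patch features, synthetic depth map and evaluation on the training image are simplifying assumptions rather than the cascaded, edge-regularized objective described above:

```python
import numpy as np
from scipy.ndimage import zoom
from sklearn.ensemble import RandomForestRegressor

def patches(img, size=5):
    """Flattened size x size neighborhood around every pixel (edge-padded)."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty((img.size, size * size))
    idx = 0
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[idx] = padded[i:i + size, j:j + size].ravel()
            idx += 1
    return out

rng = np.random.default_rng(4)
hi_res = zoom(rng.random((16, 16)), 4, order=3)    # synthetic "ground-truth" depth
lo_res = zoom(hi_res, 0.5, order=1)                # simulated low-resolution capture
coarse = zoom(lo_res, 2, order=3)                  # plain interpolation back up

# one stage of the coarse-to-fine idea: learn the residual to the true depth
rf = RandomForestRegressor(n_estimators=50, random_state=0)
rf.fit(patches(coarse), (hi_res - coarse).ravel())
refined = coarse + rf.predict(patches(coarse)).reshape(coarse.shape)
print("interp RMSE :", np.sqrt(np.mean((coarse - hi_res) ** 2)))
print("refined RMSE:", np.sqrt(np.mean((refined - hi_res) ** 2)))  # optimistic: same image used for training
```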
Semantics-informed cartography: the case of Piemonte Geological Map
NASA Astrophysics Data System (ADS)
Piana, Fabrizio; Lombardo, Vincenzo; Mimmo, Dario; Giardino, Marco; Fubelli, Giandomenico
2016-04-01
In modern digital geological maps, namely those supported by a large geo-database and devoted to dynamic, interactive representation on WMS-WebGIS services, there is a need to provide, in explicit form, the geological assumptions used for the design and compilation of the Map database, and to define and/or adopt semantic representations and taxonomies, in order to achieve a formal and interoperable representation of geologic knowledge. These approaches are fundamental for the integration and harmonisation of geological information and services across cultural (e.g. different scientific disciplines) and/or physical barriers (e.g. administrative boundaries). Initiatives such as GeoScience Markup Language (last version is GeoSciML 4.0, 2015, http://www.geosciml.org) and the INSPIRE "Data Specification on Geology" http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_GE_v3.0rc3.pdf (an operative simplification of GeoSciML, last version is 3.0 rc3, 2013), as well as the recent terminological shepherding of the Geoscience Terminology Working Group (GTWG), have been promoting the exchange of geologic knowledge. Grounded in these standard vocabularies, schemas and data models, we provide a shared semantic classification of geological data referring to the case study of the synthetic digital geological map of the Piemonte region (NW Italy), named "GEOPiemonteMap", developed by the CNR Institute of Geosciences and Earth Resources, Torino (CNR IGG TO) and hosted as a dynamic interactive map on the geoportal of the ARPA Piemonte Environmental Agency. The Piemonte Geological Map is grounded on a regional-scale geo-database consisting of some hundreds of GeologicUnits whose thousands of instances (Mapped Features, polygon geometry) occur widely in the Piemonte region, each bounded by GeologicStructures (Mapped Features, line geometry). GeologicUnits and GeologicStructures have been spatially correlated across the whole region and described using the GeoSciML vocabularies. A hierarchical schema is provided for the Piemonte Geological Map that gives the parental relations between several orders of GeologicUnits, referring to the most commonly recurring geological objects and main GeologicEvents, in a logical framework compliant with the GeoSciML and INSPIRE data models. The classification criteria and the Hierarchy Schema used to define the GEOPiemonteMap Legend, as well as the intended meanings of the geological concepts used to achieve the overall classification schema, are explicitly described in several WikiGeo pages (implemented with the "MediaWiki" open source software, https://www.mediawiki.org/wiki/MediaWiki). Moreover, a further step toward a formal classification of the contents (both data and interpretation) of the GEOPiemonteMap was triggered by setting up an ontological framework, named "OntoGeonous", in order to achieve a thorough semantic characterization of the Map.
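The kind of machine-readable hierarchy described above can be sketched with rdflib; the namespace, the isPartOf property and the unit names are illustrative placeholders rather than the actual OntoGeonous or GeoSciML terms:

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

GEO = Namespace("http://example.org/geopiemonte#")   # hypothetical namespace
g = Graph()
g.bind("geo", GEO)

# two orders of the hierarchy: a broad unit and one of its member units
g.add((GEO.DoraMairaUnit, RDF.type, GEO.GeologicUnit))
g.add((GEO.BrossascoIsascaUnit, RDF.type, GEO.GeologicUnit))
g.add((GEO.BrossascoIsascaUnit, GEO.isPartOf, GEO.DoraMairaUnit))
g.add((GEO.BrossascoIsascaUnit, RDFS.label, Literal("Brossasco-Isasca Unit")))

# query the parent of every mapped unit
for child, _, parent in g.triples((None, GEO.isPartOf, None)):
    print(child.split("#")[-1], "->", parent.split("#")[-1])
```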
Ecoregions of Arizona (poster)
Griffith, Glenn E.; Omernik, James M.; Johnson, Colleen Burch; Turner, Dale S.
2014-01-01
Ecoregions denote areas of general similarity in ecosystems and in the type, quality, and quantity of environmental resources; they are designed to serve as a spatial framework for the research, assessment, management, and monitoring of ecosystems and ecosystem components. By recognizing the spatial differences in the capacities and potentials of ecosystems, ecoregions stratify the environment by its probable response to disturbance. These general purpose regions are critical for structuring and implementing ecosystem management strategies across federal agencies, state agencies, and nongovernment organizations that are responsible for different types of resources within the same geographical areas. The Arizona ecoregion map was compiled at a scale of 1:250,000. It revises and subdivides an earlier national ecoregion map that was originally compiled at a smaller scale. The approach used to compile this map is based on the premise that ecological regions can be identified through the analysis of the spatial patterns and the composition of biotic and abiotic phenomena that affect or reflect differences in ecosystem quality and integrity. These phenomena include geology, physiography, vegetation, climate, soils, land use, wildlife, and hydrology. The relative importance of each characteristic varies from one ecological region to another regardless of the hierarchical level. A Roman numeral hierarchical scheme has been adopted for different levels of ecological regions. Level I is the coarsest level, dividing North America into 15 ecological regions. Level II divides the continent into 50 regions. At level III, the continental United States contains 105 ecoregions and the conterminous United States has 85 ecoregions. Level IV is a further subdivision of level III ecoregions. Arizona contains arid deserts and canyonlands, semiarid shrub- and grass-covered plains, woodland- and shrubland-covered hills, lava fields and volcanic plateaus, forested mountains, glaciated peaks, and river alluvial floodplains. Ecological diversity is remarkably high. There are 7 level III ecoregions and 52 level IV ecoregions in Arizona and many continue into ecologically similar parts of adjacent states. This poster is part of a collaborative project primarily between the U.S. Geological Survey (USGS), USEPA National Health and Environmental Effects Research Laboratory (Corvallis, Oregon), USEPA Region IX, U.S. Department of Agriculture (USDA)–Natural Resources Conservation Service (NRCS), The Nature Conservancy, and several Arizona state agencies. The project is associated with an interagency effort to develop a common national framework of ecological regions. Reaching that objective requires recognition of the differences in the conceptual approaches and mapping methodologies applied to develop the most common ecoregion-type frameworks, including those developed by the USDA–Forest Service, the USEPA, and the NRCS. As each of these frameworks is further refined, their differences are becoming less discernible. Collaborative ecoregion projects, such as this one in Arizona, are a step toward attaining consensus and consistency in ecoregion frameworks for the entire nation.
Han, Yuepeng; Chagné, David; Gasic, Ksenija; Rikkerink, Erik H A; Beever, Jonathan E; Gardiner, Susan E; Korban, Schuyler S
2009-03-01
A genome-wide BAC physical map of the apple, Malus x domestica Borkh., has been recently developed. Here, we report on integrating the physical and genetic maps of the apple using a SNP-based approach in conjunction with bin mapping. Briefly, BAC clones located at the ends of BAC contigs were selected and sequenced at both ends. The BAC end sequences (BESs) were used to identify candidate SNPs. Subsequently, these candidate SNPs were genetically mapped using a bin mapping strategy for the purpose of anchoring the physical map onto the genetic map. Using this approach, 52 (23%) of the 228 BESs tested were successfully exploited to develop SNPs. These SNPs anchored 51 contigs, spanning approximately 37 Mb in cumulative physical length, onto 14 linkage groups. The reliability of the integration of the physical and genetic maps using this SNP-based strategy is described, and the results confirm the feasibility of this approach for constructing an integrated physical and genetic map for apple.
NASA Astrophysics Data System (ADS)
Hapca, Simona
2015-04-01
Many soil properties and functions emerge from interactions of physical, chemical and biological processes at microscopic scales, which can be understood only by integrating techniques that are traditionally developed within separate disciplines. While recent advances in imaging techniques, such as X-ray computed tomography (X-ray CT), offer the possibility to reconstruct the 3D physical structure at fine resolutions, existing methods for the distribution of chemicals in soil, based on scanning electron microscopy (SEM) and energy-dispersive X-ray detection (EDX), allow characterization of the chemical composition only on 2D surfaces. At present, direct 3D measurement techniques are still lacking; sequential sectioning of soil, followed by 2D mapping of chemical elements and interpolation to 3D, is an alternative explored in this study. Specifically, we develop an integrated experimental and theoretical framework which combines the 3D X-ray CT imaging technique with 2D SEM-EDX and uses spatial statistics methods to map the chemical composition of soil in 3D. The procedure involves three stages: (1) scanning a resin-impregnated soil cube by X-ray CT, followed by precision cutting to produce parallel thin slices, the surfaces of which are scanned by SEM-EDX; (2) alignment of the 2D chemical maps within the internal 3D structure of the soil cube; and (3) development of spatial statistics methods to predict the chemical composition of the 3D soil based on the observed 2D chemical and 3D physical data. Specifically, three statistical models consisting of a regression tree, a regression tree kriging and a cokriging model were used to predict the 3D spatial distribution of carbon, silicon, iron and oxygen in soil, these chemical elements showing good spatial agreement between the X-ray grayscale intensities and the corresponding 2D SEM-EDX data. Due to the spatial correlation between the physical and chemical data, the regression-tree model showed great potential in predicting chemical composition, in particular for iron, which is generally sparsely distributed in soil. For carbon, silicon and oxygen, which are more densely distributed, the additional kriging of the regression tree residuals improved the prediction significantly, whereas prediction based on co-kriging was less consistent across replicates, underperforming regression-tree kriging. The present study shows great potential in integrating geostatistical methods with imaging techniques to unveil the 3D chemical structure of soil at very fine scales, the framework being suitable for further application to other types of imaging data, such as images of biological thin sections for characterization of microbial distribution. Key words: X-ray CT, SEM-EDX, segmentation techniques, spatial correlation, 3D soil images, 2D chemical maps.
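A compact sketch of regression-tree kriging on synthetic data follows; a Gaussian-process regressor with an RBF kernel stands in for the kriging of the tree residuals, and the voxel coordinates, grayscale covariate and carbon values are simulated rather than measured:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
n = 400
coords = rng.random((n, 3))                      # voxel positions in the cube
gray = rng.random(n)                             # X-ray CT grayscale intensity
carbon = 0.5 * gray + 0.2 * np.sin(6 * coords[:, 0]) + 0.05 * rng.normal(size=n)

# step 1: regression tree on the co-registered covariate (grayscale)
tree = DecisionTreeRegressor(max_depth=4).fit(gray[:, None], carbon)
residuals = carbon - tree.predict(gray[:, None])

# step 2: "krige" the residuals in space (GP regression with an RBF kernel
# plays the role of the kriging step here)
gp = GaussianProcessRegressor(kernel=RBF(0.2) + WhiteKernel(1e-3), normalize_y=True)
gp.fit(coords, residuals)

# prediction at unsampled voxels = tree trend + interpolated residual
new_coords, new_gray = rng.random((5, 3)), rng.random(5)
pred = tree.predict(new_gray[:, None]) + gp.predict(new_coords)
print(pred)
```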
Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher
2014-01-01
The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g., grid computing and graphical processing units (GPUs)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. The paper concludes by summarizing the potential usage of the MapReduce programming framework and the Hadoop platform to process huge volumes of clinical data in medical health informatics related fields.
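A Hadoop Streaming style mapper/reducer pair, written as a single Python script, illustrates the Map and Reduce tasks on a hypothetical clinical record dump (the comma-separated column layout and the lab-test-code field are assumptions, not a real dataset):

```python
#!/usr/bin/env python3
"""Minimal Hadoop Streaming style mapper/reducer sketch: counts how often
each (hypothetical) lab-test code appears in a record dump."""
import sys

def mapper():
    for line in sys.stdin:                       # one clinical record per line
        fields = line.rstrip("\n").split(",")
        if len(fields) >= 2:
            test_code = fields[1]                # hypothetical column layout
            print(f"{test_code}\t1")

def reducer():
    current, count = None, 0
    for line in sys.stdin:                       # input arrives sorted by key
        key, value = line.rstrip("\n").split("\t")
        if key != current and current is not None:
            print(f"{current}\t{count}")
            current, count = key, 0
        current = current or key
        count += int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```

Run locally as `python mr_labcounts.py map < records.csv | sort | python mr_labcounts.py reduce`; under Hadoop Streaming the same two commands would be supplied as the mapper and reducer of the job, with HDFS holding the input and output.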
Research on Integrated Mapping: A Case Study of Integrated Land Use with Swamp Mapping
NASA Astrophysics Data System (ADS)
Zhang, S.; Yan, F.; Chang, L.
2015-12-01
The unified real estate registration system reflects the attention, determination and effort of the CPC Central Committee and the State Council regarding real estate registration in China. However, under the current situation, China's real estate registration work has made limited progress. One of the reasons is that it is hard to express the property rights of real estate on one map under the multi-sector management system. Under the current multi-sector management system in China, different departments usually survey and map only the land types under their own jurisdiction. For example, wetland investigations map only wetland resources and not other resource types. As a result, this causes problems of overlap or omission when integrating the results from different departments. As resources of the earth's surface, the total area of forest, grassland, wetland and so on should be equal to the total surface area; however, under the current system, the sum of the areas of all kinds of resources is not equal to the earth's surface area. Therefore, it is of great importance to express all resources on one map. On one hand, this helps to find out the real area and distribution of resources and avoids the problems of overlap or omission during integration; on the other hand, it is helpful for studying the dynamic change of different resources. Therefore, we first propose "integrated mapping" as a solution, and take integrated land-use and swamp mapping in Northeast China as an example to investigate its feasibility and difficulties. The study showed that integrated land-use and swamp mapping can be achieved by combining land-use survey standards with swamp survey standards and a "second mapping" program. Based on the experience of integrated land-use and swamp mapping, we point out its reference value for integrated mapping and the unified real estate registration system. We conclude that: (1) comprehending and integrating the different survey standards of different resources is the premise of "integrated mapping"; (2) we put forward a "multiple code" and "multiple interpretation" scheme to solve the problem of "attribute overlap"; (3) the area of "attribute overlap" can be segmented by a certain ratio to determine property rights in unified real estate registration.
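The "attribute overlap" and ratio-splitting ideas in points (2) and (3) can be sketched with shapely; the polygons, attribute codes and the 60/40 ratio are purely illustrative assumptions:

```python
from shapely.geometry import Polygon

land_use = Polygon([(0, 0), (4, 0), (4, 3), (0, 3)])   # parcel from the land-use survey
swamp = Polygon([(2, 1), (6, 1), (6, 4), (2, 4)])      # polygon from the wetland survey

overlap = land_use.intersection(swamp)
# "multiple code": keep both attributes on the overlap instead of dropping one
overlap_record = {"geometry": overlap, "codes": ["land_use:cropland", "wetland:swamp"]}

# hypothetical 60/40 split of the overlapping area between the two registries
ratio = 0.6
print("overlap area:", overlap.area)
print("assigned to land-use registry:", ratio * overlap.area)
print("assigned to wetland registry:", (1 - ratio) * overlap.area)
```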
ERIC Educational Resources Information Center
Schwendimann, Beat A.; Linn, Marcia C.
2016-01-01
Concept map activities often lack a subsequent revision step that facilitates knowledge integration. This study compares two collaborative critique activities using a Knowledge Integration Map (KIM), a form of concept map. Four classes of high school biology students (n = 81) using an online inquiry-based learning unit on evolution were assigned…
ESDORA: A Data Archive Infrastructure Using Digital Object Model and Open Source Frameworks
NASA Astrophysics Data System (ADS)
Shrestha, Biva; Pan, Jerry; Green, Jim; Palanisamy, Giriprakash; Wei, Yaxing; Lenhardt, W.; Cook, R. Bob; Wilson, B. E.; Leggott, M.
2011-12-01
There is an array of challenges associated with preserving, managing, and using contemporary scientific data. Large volume, multiple formats and data services, and the lack of a coherent mechanism for metadata/data management are some of the common issues across data centers. It is often difficult to preserve the data history and lineage information, along with other descriptive metadata, hindering the true science value of the archived data products. In this project, we use a digital object abstraction architecture as the information/knowledge framework to address these challenges. We have used the following open-source frameworks: the Fedora-Commons Repository, the Drupal Content Management System, Islandora (a Drupal module) and the Apache Solr search engine. The system is an active archive infrastructure for Earth Science data resources, which includes ingestion, archiving, distribution, and discovery functionalities. We use an ingestion workflow to ingest the data and metadata, in which many different aspects of the data descriptions (including structured and non-structured metadata) are reviewed. The data and metadata are published after multiple rounds of review and are staged during the reviewing phase. Each digital object is encoded in XML for long-term preservation of the content and the relations among the digital items. The software architecture provides a flexible, modularized framework for adding pluggable user-oriented functionality. Solr is used to enable word search as well as faceted search. A home-grown spatial search module is plugged in to allow users to make a spatial selection in a map view. An RDF semantic store within the Fedora-Commons Repository is used for storing information on data lineage, dissemination services, and text-based metadata. We use the semantic notion "isViewerFor" to register internally or externally referenced URLs, which are rendered within the same web browser when possible. With appropriate mapping of content into digital objects, many different data descriptions, including structured metadata, data history, and auditing trails, are captured and coupled with the data content. The semantic store provides a foundation for possible further utilization, including providing a full-fledged Earth Science ontology for data interpretation or lineage tracking. Datasets from the NASA-sponsored Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) as well as from the Synthesis Thematic Data Center (MAST-DC) are used in a testing deployment of the system, which allows us to validate the features and values of the integrated system described here. Overall, we believe that the integrated system is valid, reusable data archive software that provides digital stewardship for Earth Science data content, now and in the future. References: [1] Devarakonda, Ranjeet, and Harold Shanafield. "Drupal: Collaborative framework for science research." Collaboration Technologies and Systems (CTS), 2011 International Conference on. IEEE, 2011. [2] Devarakonda, Ranjeet, et al. "Semantic search integration to climate data." Collaboration Technologies and Systems (CTS), 2014 International Conference on. IEEE, 2014.
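To make the faceted-search step concrete, the sketch below issues a faceted query against a Solr select endpoint using the standard HTTP search parameters; the core URL and facet field names are hypothetical stand-ins, not the archive's actual schema.

```python
# Sketch of a faceted Solr query via the standard HTTP search API.
# The endpoint and field names ("dataset_theme", "spatial_region") are
# hypothetical; they stand in for whatever fields the archive indexes.
import requests

SOLR_SELECT = "http://localhost:8983/solr/archive/select"  # hypothetical core

params = {
    "q": "soil moisture",                                 # free-text word search
    "rows": 10,
    "wt": "json",
    "facet": "true",
    "facet.field": ["dataset_theme", "spatial_region"],   # hypothetical facets
}
resp = requests.get(SOLR_SELECT, params=params, timeout=30)
resp.raise_for_status()
data = resp.json()

print("hits:", data["response"]["numFound"])
print("facets:", data["facet_counts"]["facet_fields"])
```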
NASA Astrophysics Data System (ADS)
Walstra, J.; Heyvaert, V.; Verkinderen, P.
2012-04-01
For many thousands of years the alluvial plains of Khuzestan (SW Iran) have been subject to intensive settlement and agriculture. Ancient societies depended on the position of major rivers for their economic survival and hence, there is ample evidence of human activities trying to control the distribution of water. Throughout the plains ancient irrigation and settlement patterns are visible, although traces are rapidly disappearing due to expanding modern land use. Aim of this study is to unlock and integrate the rich information on landscape and archaeology, which only survives through the available historical imagery and some limited archaeological surveys. A GIS-based geomorphological mapping procedure was developed, using a variety of imagery, including historical aerial photographs, CORONA, Landsat and SPOT images. In addition, supported by the evidence from previous geological field surveys, archaeological elements were identified, mapped and included in a GIS database. The resulting map layers display the positions of successive palaeochannel belts and extensive irrigation networks, together indicating a complex alluvial history characterized by avulsions and significant human impact. As shown in several case-studies, integrating information from multiple disciplines provides valuable insights in the complex landscape evolution of this region, both from geological and historical perspectives. Remote sensing and GIS are essential tools in such a research context. The presented work was undertaken within the framework of the Interuniversity Attraction Pole "Greater Mesopotamia: Reconstruction of its Environment and History" (IAP 6/34), funded by the Belgian Science Policy.
2013-01-01
Background Cassava is a well-known starchy root crop utilized for food, feed and biofuel production. However, a comprehensive understanding of the process of starch production in cassava is not yet available. Results In this work, we exploited the recently released genome information and utilized post-genomic approaches to reconstruct the metabolic pathway of starch biosynthesis in cassava using multiple plant templates. The quality of pathway reconstruction was assured by the employed parsimonious reconstruction framework and the collective validation steps. Our reconstructed pathway is presented in the form of an informative map, which describes all important information of the pathway, and an interactive map, which facilitates the integration of omics data into the metabolic pathway. Additionally, to demonstrate the advantage of the reconstructed pathways beyond just the schematic presentation, the pathway could be used for incorporating the gene expression data obtained from various developmental stages of cassava roots. Our results exhibited the distinct activities of the starch biosynthesis pathway in different stages of root development at the transcriptional level, whereby the activity of the pathway is higher toward the development of mature storage roots. Conclusions To expand its applications, the interactive map of the reconstructed starch biosynthesis pathway is available for download at the SBI group's website (http://sbi.pdti.kmutt.ac.th/?page_id=33). This work is considered a big step in the quantitative modeling pipeline aiming to investigate the dynamic regulation of starch biosynthesis in cassava roots. PMID:23938102
Saithong, Treenut; Rongsirikul, Oratai; Kalapanulak, Saowalak; Chiewchankaset, Porntip; Siriwat, Wanatsanan; Netrphan, Supatcharee; Suksangpanomrung, Malinee; Meechai, Asawin; Cheevadhanarak, Supapon
2013-08-10
Cassava is a well-known starchy root crop utilized for food, feed and biofuel production. However, a comprehensive understanding of the process of starch production in cassava is not yet available. In this work, we exploited the recently released genome information and utilized post-genomic approaches to reconstruct the metabolic pathway of starch biosynthesis in cassava using multiple plant templates. The quality of pathway reconstruction was assured by the employed parsimonious reconstruction framework and the collective validation steps. Our reconstructed pathway is presented in the form of an informative map, which describes all important information of the pathway, and an interactive map, which facilitates the integration of omics data into the metabolic pathway. Additionally, to demonstrate the advantage of the reconstructed pathways beyond just the schematic presentation, the pathway could be used for incorporating the gene expression data obtained from various developmental stages of cassava roots. Our results exhibited the distinct activities of the starch biosynthesis pathway in different stages of root development at the transcriptional level, whereby the activity of the pathway is higher toward the development of mature storage roots. To expand its applications, the interactive map of the reconstructed starch biosynthesis pathway is available for download at the SBI group's website (http://sbi.pdti.kmutt.ac.th/?page_id=33). This work is considered a big step in the quantitative modeling pipeline aiming to investigate the dynamic regulation of starch biosynthesis in cassava roots.
Geovisualization to support the exploration of large health and demographic survey data
Koua, Etien L; Kraak, Menno-Jan
2004-01-01
Background Survey data are increasingly abundant from many international projects and national statistics. They are generally comprehensive and cover local, regional, and national level censuses in many domains including health, demography, human development, and economy. These surveys result in several hundred indicators. Geographical analysis of such large amounts of data is often a difficult task, and searching for patterns is a particularly difficult challenge. Geovisualization research is increasingly dealing with the exploration of patterns and relationships in such large datasets for understanding underlying geographical processes. One of the attempts has been to use Artificial Neural Networks as a technology especially useful in situations where the numbers are vast and the relationships are often unclear or even hidden. Results We investigate ways to integrate computational analysis based on a Self-Organizing Map neural network with visual representations of derived structures and patterns in a framework for exploratory visualization to support visual data mining and knowledge discovery. The framework suggests ways to explore the general structure of the dataset in its multidimensional space in order to provide clues for further exploration of correlations and relationships. Conclusion In this paper, the proposed framework is used to explore demographic and health survey data. Several graphical representations (information spaces) are used to depict the general structure and clustering of the data and to gain insight into the relationships among the different variables. Detailed exploration of correlations and relationships among the attributes is provided. Results of the analysis are also presented in maps and other graphics. PMID:15180898
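To illustrate the computational layer such a framework pairs with visual exploration, here is a minimal Self-Organizing Map training loop in NumPy; the indicator values are random stand-ins for the survey indicators, and the grid size and decay schedule are arbitrary choices for the sketch.

```python
# Minimal Self-Organizing Map (SOM) training loop with NumPy. Synthetic data
# stand in for survey indicators; grid size and decay schedules are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((500, 8))                     # 500 areas x 8 indicators (stand-in data)
grid_h, grid_w = 10, 10
W = rng.random((grid_h, grid_w, X.shape[1])) # codebook (weight) vectors

# grid coordinates of each SOM node, used by the neighborhood function
coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij"), axis=-1)

n_iter, lr0, sigma0 = 2000, 0.5, 3.0
for t in range(n_iter):
    x = X[rng.integers(len(X))]
    # best-matching unit (BMU): node whose weight vector is closest to x
    d = np.linalg.norm(W - x, axis=-1)
    bmu = np.unravel_index(np.argmin(d), d.shape)
    # decay the learning rate and neighborhood radius over time
    frac = t / n_iter
    lr, sigma = lr0 * (1 - frac), sigma0 * np.exp(-3 * frac)
    # Gaussian neighborhood around the BMU on the 2-D grid
    g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1) / (2 * sigma ** 2))
    W += lr * g[..., None] * (x - W)

# assign each area to its BMU to inspect the clustering structure
bmus = [np.unravel_index(np.argmin(np.linalg.norm(W - x, axis=-1)), (grid_h, grid_w)) for x in X]
print("first five BMUs:", bmus[:5])
```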
Febrer, Melanie; Goicoechea, Jose Luis; Wright, Jonathan; McKenzie, Neil; Song, Xiang; Lin, Jinke; Collura, Kristi; Wissotski, Marina; Yu, Yeisoo; Ammiraju, Jetty S. S.; Wolny, Elzbieta; Idziak, Dominika; Betekhtin, Alexander; Kudrna, Dave; Hasterok, Robert; Wing, Rod A.; Bevan, Michael W.
2010-01-01
The pooid subfamily of grasses includes some of the most important crop, forage and turf species, such as wheat, barley and Lolium. Developing genomic resources, such as whole-genome physical maps, for analysing the large and complex genomes of these crops and for facilitating biological research in grasses is an important goal in plant biology. We describe a bacterial artificial chromosome (BAC)-based physical map of the wild pooid grass Brachypodium distachyon and integrate this with whole genome shotgun sequence (WGS) assemblies using BAC end sequences (BES). The resulting physical map contains 26 contigs spanning the 272 Mb genome. BES from the physical map were also used to integrate a genetic map. This provides an independent validation and confirmation of the published WGS assembly. Mapped BACs were used in Fluorescence In Situ Hybridisation (FISH) experiments to align the integrated physical map and sequence assemblies to chromosomes with high resolution. The physical, genetic and cytogenetic maps, integrated with whole genome shotgun sequence assemblies, enhance the accuracy and durability of this important genome sequence and will directly facilitate gene isolation. PMID:20976139
Lacasella, Federica; Marta, Silvio; Singh, Aditya; Stack Whitney, Kaitlin; Hamilton, Krista; Townsend, Phil; Kucharik, Christopher J; Meehan, Timothy D; Gratton, Claudio
2017-03-01
Noxious species, i.e., crop pests or invasive alien species, are major threats to both natural and managed ecosystems. Invasive pests are of special importance, and knowledge about their distribution and abundance is fundamental to minimize economic losses and prioritize management activities. Occurrence models are a common tool used to identify suitable zones and map priority areas (i.e., risk maps) for noxious species management, although they provide a simplified description of species dynamics (i.e., no indication of species density). An alternative is to use abundance models, but translating abundance data into risk maps is often challenging. Here, we describe a general framework for generating abundance-based risk maps using multi-year pest data. We used an extensive data set of 3968 records collected between 2003 and 2013 in Wisconsin during annual surveys of soybean aphid (SBA), an exotic invasive pest in this region. By using an integrative approach, we modelled SBA responses to weather, seasonal, and habitat variability using generalized additive models (GAMs). Our models showed good to excellent performance in predicting SBA occurrence and abundance (TSS = 0.70, AUC = 0.92; R² = 0.63). We found that temperature, precipitation, and growing degree days were the main drivers of SBA trends. In addition, a significant positive relationship between SBA abundance and the availability of overwintering habitats was observed. Our models showed aphid populations were also sensitive to thresholds associated with high and low temperatures, likely related to physiological tolerances of the insects. Finally, the resulting aphid predictions were integrated using a spatial prioritization algorithm ("Zonation") to produce an abundance-based risk map for the state of Wisconsin that emphasized the spatiotemporal consistency and magnitude of past infestation patterns. This abundance-based risk map can provide information on potential foci of pest outbreaks where scouting efforts and prophylactic measures should be concentrated. The approach we took is general, relatively simple, and can be applied to other species, habitats and geographical areas for which species abundance data and biotic and abiotic data are available. © 2016 by the Ecological Society of America.
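As a hedged sketch of the abundance-GAM step, the snippet below fits smooth terms for temperature, precipitation and growing degree days with the pygam package; the choice of pygam and all covariate values are assumptions made for illustration, not the authors' survey data or R setup.

```python
# Sketch of an abundance GAM in the spirit described above, using pygam
# (an assumption; the authors' GAM implementation is not specified here).
# The covariates and data are synthetic stand-ins, not the survey records.
import numpy as np
from pygam import LinearGAM, s

rng = np.random.default_rng(1)
n = 400
temp = rng.uniform(10, 35, n)           # mean temperature (deg C), synthetic
precip = rng.uniform(0, 200, n)         # precipitation (mm), synthetic
gdd = rng.uniform(200, 1500, n)         # growing degree days, synthetic
# synthetic log-abundance with a smooth, threshold-like temperature response
log_abund = 2.0 - 0.01 * (temp - 24) ** 2 + 0.002 * gdd + rng.normal(0, 0.3, n)

X = np.column_stack([temp, precip, gdd])
gam = LinearGAM(s(0) + s(1) + s(2)).fit(X, log_abund)

gam.summary()                            # smooth-term significance and effective d.o.f.
pred = gam.predict(np.array([[28.0, 50.0, 900.0]]))
print("predicted log abundance:", pred)
```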
Reconstruction of biological pathways and metabolic networks from in silico labeled metabolites.
Hadadi, Noushin; Hafner, Jasmin; Soh, Keng Cher; Hatzimanikatis, Vassily
2017-01-01
Reaction atom mappings track the positional changes of all of the atoms between the substrates and the products as they undergo the biochemical transformation. However, information on atom transitions in the context of metabolic pathways is not widely available in the literature. The understanding of metabolic pathways at the atomic level is of great importance as it can deconvolute the overlapping catabolic/anabolic pathways resulting in the observed metabolic phenotype. The automated identification of atom transitions within a metabolic network is a very challenging task, since the degree of complexity of metabolic networks dramatically increases when we move from metabolite-level studies to atom-level studies. Despite being studied extensively through various approaches, the field of atom mapping of metabolic networks lacks an automated approach that (i) accounts for reaction-mechanism information in atom mapping and (ii) is extendable from individual atom-mapped reactions to atom-mapped reaction networks. Here, we introduce a computational framework, iAM.NICE (in silico Atom Mapped Network Integrated Computational Explorer), for the systematic atom-level reconstruction of metabolic networks from in silico labelled substrates. iAM.NICE is, to our knowledge, the first automated atom-mapping algorithm that is based on the underlying enzymatic biotransformation mechanisms, and its application goes beyond individual reactions: it can be used for the reconstruction of atom-mapped metabolic networks. We illustrate the applicability of our method through the reconstruction of atom-mapped reactions of the KEGG database and we provide an example of an atom-level representation of the core metabolic network of E. coli. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
An AFLP genetic linkage map of pacific abalone ( Haliotis discus hannai)
NASA Astrophysics Data System (ADS)
Qi, Li; Yanhong, Xu; Ruihai, Yu; Akihiro, Kijima
2007-07-01
A genetic linkage map of Pacific abalone (Haliotis discus hannai) was constructed using AFLP markers based on a two-way pseudo-testcross strategy in a full-sib family. With 33 primer combinations, a total of 455 markers (225 from the female parent and 230 from the male parent) segregated in a 1:1 ratio, corresponding to DNA polymorphisms heterozygous in one parent and null in the other. The female framework map consisted of 174 markers distributed in 18 linkage groups, equivalent to the H. discus hannai haploid chromosome number, and spanning a total length of 2031.4 cM, with an average interval of 13.0 cM between adjacent markers. The male framework map consisted of 195 markers mapped on 19 linkage groups, spanning a total length of 2273.4 cM, with an average spacing of 12.9 cM between adjacent markers. The estimated coverage for the framework linkage maps was 81.2% for the female and 82.1% for the male, on the basis of two estimates of genome length. Fifty-two markers (11.4%) remained unlinked. The level of segregation distortion observed in this cross was 20.4%. These linkage maps will serve as a starting point for linkage studies in the Pacific abalone with potential application for marker-assisted selection in breeding programs.
Mapping Informative Clusters in a Hierarchical Framework of fMRI Multivariate Analysis
Xu, Rui; Zhen, Zonglei; Liu, Jia
2010-01-01
Pattern recognition methods have become increasingly popular in fMRI data analysis, as they are powerful in discriminating between multi-voxel patterns of brain activities associated with different mental states. However, when they are used in functional brain mapping, the location of discriminative voxels varies significantly, raising difficulties in interpreting the locus of the effect. Here we proposed a hierarchical framework of multivariate approach that maps informative clusters rather than voxels to achieve reliable functional brain mapping without compromising the discriminative power. In particular, we first searched for local homogeneous clusters that consisted of voxels with similar response profiles. Then, a multi-voxel classifier was built for each cluster to extract discriminative information from the multi-voxel patterns. Finally, through multivariate ranking, outputs from the classifiers served as a multi-cluster pattern to identify informative clusters by examining interactions among clusters. Results from both simulated and real fMRI data demonstrated that this hierarchical approach showed better performance in the robustness of functional brain mapping than traditional voxel-based multivariate methods. In addition, the mapped clusters were highly overlapped for two perceptually equivalent object categories, further confirming the validity of our approach. In short, the hierarchical framework of multivariate approach is suitable for both pattern classification and brain mapping in fMRI studies. PMID:21152081
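A simplified sketch of the cluster-then-classify idea follows: voxels with similar response profiles are grouped, a classifier is trained per cluster, and clusters are ranked by cross-validated accuracy. The data are synthetic and the ranking step is reduced to per-cluster accuracy; it is not the authors' implementation.

```python
# Simplified cluster-then-classify sketch with synthetic fMRI-like data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 120, 300
y = rng.integers(0, 2, n_trials)                 # two mental states
data = rng.normal(0, 1, (n_trials, n_voxels))
data[:, :30] += y[:, None] * 0.8                 # first 30 voxels carry signal

# Step 1: cluster voxels by their response profile across trials
n_clusters = 10
labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(data.T)

# Step 2: one multi-voxel classifier per cluster, scored by cross-validation
scores = []
for c in range(n_clusters):
    Xc = data[:, labels == c]
    acc = cross_val_score(LinearSVC(dual=False), Xc, y, cv=5).mean()
    scores.append((c, acc))

# Step 3: rank clusters by how informative their multi-voxel pattern is
for c, acc in sorted(scores, key=lambda t: -t[1]):
    print(f"cluster {c:2d}: {np.sum(labels == c):3d} voxels, CV accuracy {acc:.2f}")
```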
ERIC Educational Resources Information Center
Sevian, H.; Bernholt, S.; Szteinberg, G. A.; Auguste, S.; Pérez, L. C.
2015-01-01
A perspective is presented on how the representation mapping framework by Hahn and Chater (1998) may be used to characterize reasoning during problem solving in chemistry. To provide examples for testing the framework, an exploratory study was conducted with students and professors from three different courses in the middle of the undergraduate…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forest, E.; Bengtsson, J.; Reusch, M.F.
1991-04-01
The full power of Yoshida's technique is exploited to produce an arbitrary-order implicit symplectic integrator and a multi-map explicit integrator. This implicit integrator uses a characteristic function involving the force term alone. We also point out the usefulness of the plain Ruth algorithm in computing Taylor series maps using the techniques first introduced by Berz in his 'COSY-INFINITY' code.
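As a worked illustration of Yoshida's composition idea, the sketch below builds a fourth-order explicit symplectic integrator from three second-order leapfrog (Ruth/Verlet) steps with the standard coefficients w1 = 1/(2 - 2^(1/3)), w0 = -2^(1/3) w1, and checks energy drift on a unit harmonic oscillator; this is textbook material, not the OSTI report's code.

```python
# 4th-order Yoshida integrator composed from three leapfrog steps.
import numpy as np

def leapfrog(q, p, h, force):
    # kick-drift-kick, unit mass
    p = p + 0.5 * h * force(q)
    q = q + h * p
    p = p + 0.5 * h * force(q)
    return q, p

def yoshida4(q, p, h, force):
    w1 = 1.0 / (2.0 - 2.0 ** (1.0 / 3.0))
    w0 = -(2.0 ** (1.0 / 3.0)) * w1
    for w in (w1, w0, w1):
        q, p = leapfrog(q, p, w * h, force)
    return q, p

force = lambda q: -q            # unit harmonic oscillator, H = (p^2 + q^2) / 2
q, p, h = 1.0, 0.0, 0.1
for _ in range(1000):
    q, p = yoshida4(q, p, h, force)
energy_drift = abs(0.5 * (p ** 2 + q ** 2) - 0.5)
print(f"energy drift after 1000 steps: {energy_drift:.2e}")   # stays small
```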
On testing for spatial correspondence between maps of human brain structure and function.
Alexander-Bloch, Aaron F; Shou, Haochang; Liu, Siyuan; Satterthwaite, Theodore D; Glahn, David C; Shinohara, Russell T; Vandekar, Simon N; Raznahan, Armin
2018-06-01
A critical issue in many neuroimaging studies is the comparison between brain maps. Nonetheless, it remains unclear how one should test hypotheses focused on the overlap or spatial correspondence between two or more brain maps. This "correspondence problem" affects, for example, the interpretation of comparisons between task-based patterns of functional activation, resting-state networks or modules, and neuroanatomical landmarks. To date, this problem has been addressed with remarkable variability in terms of methodological approaches and statistical rigor. In this paper, we address the correspondence problem using a spatial permutation framework to generate null models of overlap by applying random rotations to spherical representations of the cortical surface, an approach for which we also provide a theoretical statistical foundation. We use this method to derive clusters of cognitive functions that are correlated in terms of their functional neuroanatomical substrates. In addition, using publicly available data, we formally demonstrate the correspondence between maps of task-based functional activity, resting-state fMRI networks and gyral-based anatomical landmarks. We provide open-access code to implement the methods presented for two commonly-used tools for surface based cortical analysis (https://www.github.com/spin-test). This spatial permutation approach constitutes a useful advance over widely-used methods for the comparison of cortical maps, thereby opening new possibilities for the integration of diverse neuroimaging data. Copyright © 2018 Elsevier Inc. All rights reserved.
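The following is a hedged sketch of the spatial-permutation ("spin") idea: one map is resampled under random rotations of the sphere to build a null distribution for its correlation with another map. The vertices and map values are synthetic; the authors' own implementation is at the repository cited above.

```python
# Spin-test style null distribution via random rotations of spherical points.
import numpy as np
from scipy.spatial.transform import Rotation
from scipy.spatial import cKDTree
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_vertices = 2000
xyz = rng.normal(size=(n_vertices, 3))
xyz /= np.linalg.norm(xyz, axis=1, keepdims=True)    # points on the unit sphere

map_a = xyz[:, 2] + rng.normal(0, 0.3, n_vertices)   # two spatially correlated surface maps
map_b = xyz[:, 2] + rng.normal(0, 0.3, n_vertices)
observed_r, _ = pearsonr(map_a, map_b)

tree = cKDTree(xyz)
null_r = []
for _ in range(1000):
    R = Rotation.random().as_matrix()
    rotated = xyz @ R.T
    # sample map_a at each vertex's rotated position (nearest original vertex)
    _, idx = tree.query(rotated)
    null_r.append(pearsonr(map_a[idx], map_b)[0])

p_spin = (np.sum(np.abs(null_r) >= abs(observed_r)) + 1) / (len(null_r) + 1)
print(f"observed r = {observed_r:.2f}, spin-test p = {p_spin:.3f}")
```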
Computations on metric maps in mammals: getting oriented and choosing a multi-destination route.
Gallistel, C R; Cramer, A E
1996-01-01
The capacity to construct a cognitive map is hypothesized to rest on two foundations: (1) dead reckoning (path integration); (2) the perception of the direction and distance of terrain features relative to the animal. A map may be constructed by combining these two sources of positional information, with the result that the positions of all terrain features are represented in the coordinate framework used for dead reckoning. When animals need to become reoriented in a mapped space, results from rats and human toddlers indicate that they focus exclusively on the shape of the perceived environment, ignoring non-geometric features such as surface colors. As a result, in a rectangular space, they are misoriented half the time even when the two ends of the space differ strikingly in their appearance. In searching for a hidden object after becoming reoriented, both kinds of subjects search on the basis of the object's mapped position in the space rather than on the basis of its relationship to a goal sign (e.g. a distinctive container or nearby marker), even though they have demonstrably noted the relationship between the goal and the goal sign. When choosing a multidestination foraging route, vervet monkeys look at least three destinations ahead, even though they are only capable of keeping a maximum of six destinations in mind at once.
Haron, Zaiton; Bakar, Suhaimi Abu; Dimon, Mohamad Ngasri
2015-01-01
Strategic noise mapping provides important information for noise impact assessment and noise abatement. However, producing reliable strategic noise mapping in a dynamic, complex working environment is difficult. This study proposes the implementation of the random walk approach as a new stochastic technique to simulate noise mapping and to predict the noise exposure level in a workplace. A stochastic simulation framework and software, namely RW-eNMS, were developed to facilitate the random walk approach in noise mapping prediction. This framework considers the randomness and complexity of machinery operation and noise emission levels. Also, it assesses the impact of noise on the workers and the surrounding environment. For data validation, three case studies were conducted to check the accuracy of the prediction data and to determine the efficiency and effectiveness of this approach. The results showed high prediction accuracy, with a majority of absolute differences of less than 2 dBA; also, the predicted noise doses were mostly within the range of measurement. Therefore, the random walk approach was effective in dealing with environmental noise. It can predict strategic noise maps to facilitate noise monitoring and noise control in workplaces. PMID:25875019
Iterative framework radiation hybrid mapping
USDA-ARS?s Scientific Manuscript database
Building comprehensive radiation hybrid maps for large sets of markers is a computationally expensive process, since the basic mapping problem is equivalent to the traveling salesman problem. The mapping problem is also susceptible to noise, and as a result, it is often beneficial to remove markers ...
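Since the abstract frames marker ordering as a traveling-salesman-type problem, the toy sketch below orders markers by minimizing the summed pairwise distance along the order with a nearest-neighbour start and 2-opt improvement. The distance matrix is synthetic and the heuristic is generic; it is not the actual radiation hybrid mapping pipeline.

```python
# Toy TSP-style marker ordering: nearest-neighbour construction plus 2-opt.
import numpy as np

rng = np.random.default_rng(0)
true_pos = np.sort(rng.uniform(0, 100, 20))        # hidden 1-D marker positions
D = np.abs(true_pos[:, None] - true_pos[None, :])  # stand-in for RH breakage distances
D = np.abs((D + rng.normal(0, 1.0, D.shape) + D.T) / 2)   # add symmetric noise

def path_length(order):
    return sum(D[order[i], order[i + 1]] for i in range(len(order) - 1))

# nearest-neighbour construction
order, remaining = [0], set(range(1, len(D)))
while remaining:
    nxt = min(remaining, key=lambda j: D[order[-1], j])
    order.append(nxt)
    remaining.remove(nxt)

# 2-opt improvement: reverse segments while that shortens the path
improved = True
while improved:
    improved = False
    for i in range(1, len(order) - 1):
        for j in range(i + 1, len(order)):
            cand = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
            if path_length(cand) < path_length(order):
                order, improved = cand, True

print("recovered marker order:", order)
print("path length:", round(path_length(order), 1))
```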
Using concept mapping for assessing and promoting relational conceptual change in science
NASA Astrophysics Data System (ADS)
Liu, Xiufeng
2004-05-01
In this article, we adopted the relational conceptual change as our theoretical framework to accommodate current views of conceptual change such as ontological beliefs, epistemological commitment, and social/affective contexts commonly mentioned in the literature. We used a specific concept mapping format and process - digraphs and digraphing - as an operational framework for assessing and promoting relational conceptual change. We wanted to find out how concept mapping can be used to account for relational conceptual change. We collected data from a Grade 12 chemistry class using collaborative computerized concept mapping on an ongoing basis during a unit of instruction. Analysis of progressive concept maps and interview transcripts of representative students and the teacher showed that ongoing and collaborative computerized concept mapping is able to account for student conceptual change in ontological, epistemological, and social/affective domains.
Li, Beiwen; Liu, Ziping; Zhang, Song
2016-10-03
We propose a hybrid computational framework to reduce motion-induced measurement error by combining the Fourier transform profilometry (FTP) and phase-shifting profilometry (PSP). The proposed method is composed of three major steps: Step 1 is to extract continuous relative phase maps for each isolated object with single-shot FTP method and spatial phase unwrapping; Step 2 is to obtain an absolute phase map of the entire scene using PSP method, albeit motion-induced errors exist on the extracted absolute phase map; and Step 3 is to shift the continuous relative phase maps from Step 1 to generate final absolute phase maps for each isolated object by referring to the absolute phase map with error from Step 2. Experiments demonstrate the success of the proposed computational framework for measuring multiple isolated rapidly moving objects.
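To make the phase-shifting side of the pipeline concrete, the sketch below recovers a wrapped phase map from N equally shifted fringe images using the standard N-step formula; the fringes are synthetic and the code is an illustration of PSP phase extraction, not the authors' implementation of the full FTP+PSP framework.

```python
# Minimal N-step phase-shifting phase extraction on synthetic fringes.
import numpy as np

H, W, N = 64, 64, 4
x = np.linspace(0, 4 * np.pi, W)
true_phase = np.tile(x, (H, 1)) + 0.5 * np.sin(np.linspace(0, np.pi, H))[:, None]

# simulate N phase-shifted fringe images I_n = A + B*cos(phi + 2*pi*n/N)
shifts = 2 * np.pi * np.arange(N) / N
frames = [0.5 + 0.4 * np.cos(true_phase + d) for d in shifts]

num = sum(I * np.sin(d) for I, d in zip(frames, shifts))
den = sum(I * np.cos(d) for I, d in zip(frames, shifts))
wrapped = -np.arctan2(num, den)                 # wrapped phase in (-pi, pi]

err = np.angle(np.exp(1j * (wrapped - true_phase)))   # compare modulo 2*pi
print("max wrapped-phase error:", np.abs(err).max())  # ~0 for noise-free fringes
```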
ChlamyCyc: an integrative systems biology database and web-portal for Chlamydomonas reinhardtii.
May, Patrick; Christian, Jan-Ole; Kempa, Stefan; Walther, Dirk
2009-05-04
The unicellular green alga Chlamydomonas reinhardtii is an important eukaryotic model organism for the study of photosynthesis and plant growth. In the era of modern high-throughput technologies there is an imperative need to integrate large-scale data sets from high-throughput experimental techniques using computational methods and database resources to provide comprehensive information about the molecular and cellular organization of a single organism. In the framework of the German Systems Biology initiative GoFORSYS, a pathway database and web-portal for Chlamydomonas (ChlamyCyc) was established, which currently features about 250 metabolic pathways with associated genes, enzymes, and compound information. ChlamyCyc was assembled using an integrative approach combining the recently published genome sequence, bioinformatics methods, and experimental data from metabolomics and proteomics experiments. We analyzed and integrated a combination of primary and secondary database resources, such as existing genome annotations from JGI, EST collections, orthology information, and MapMan classification. ChlamyCyc provides a curated and integrated systems biology repository that will enable and assist in systematic studies of fundamental cellular processes in Chlamydomonas. The ChlamyCyc database and web-portal is freely available under http://chlamycyc.mpimp-golm.mpg.de.
Connecting mathematics learning through spatial reasoning
NASA Astrophysics Data System (ADS)
Mulligan, Joanne; Woolcott, Geoffrey; Mitchelmore, Michael; Davis, Brent
2018-03-01
Spatial reasoning, an emerging transdisciplinary area of interest to mathematics education research, is proving integral to all human learning. It is particularly critical to science, technology, engineering and mathematics (STEM) fields. This project will create an innovative knowledge framework based on spatial reasoning that identifies new pathways for mathematics learning, pedagogy and curriculum. Novel analytical tools will map the unknown complex systems linking spatial and mathematical concepts. It will involve the design, implementation and evaluation of a Spatial Reasoning Mathematics Program (SRMP) in Grades 3 to 5. Benefits will be seen through development of critical spatial skills for students, increased teacher capability and informed policy and curriculum across STEM education.
Operational integration in primary health care: patient encounters and workflows.
Sifaki-Pistolla, Dimitra; Chatzea, Vasiliki-Eirini; Markaki, Adelais; Kritikos, Kyriakos; Petelos, Elena; Lionis, Christos
2017-11-29
Despite several countrywide attempts to strengthen and standardise the primary healthcare (PHC) system, Greece is still lacking a sustainable, policy-based model of integrated services. The aim of our study was to identify operational integration levels through existing patient care pathways and to recommend an alternative PHC model for optimum integration. The study was part of a large state-funded project, which included 22 randomly selected PHC units located across two health regions of Greece. Dimensions of operational integration in PHC were selected based on the work of Kringos and colleagues. A five-point Likert-type scale, coupled with an algorithm, was used to capture and transform theoretical framework features into measurable attributes. PHC services were grouped under the main categories of chronic care, urgent/acute care, preventive care, and home care. A web-based platform was used to assess patient pathways, evaluate integration levels and propose improvement actions. Analysis relied on a comparison of actual pathways versus optimal, the latter ones having been identified through literature review. Overall integration varied among units. The majority (57%) of units corresponded to a basic level. Integration by type of PHC service ranged as follows: basic (86%) or poor (14%) for chronic care units, poor (78%) or basic (22%) for urgent/acute care units, basic (50%) for preventive care units, and partial or basic (50%) for home care units. The actual pathways across all four categories of PHC services differed from those captured in the optimum integration model. Certain similarities were observed in the operational flows between chronic care management and urgent/acute care management. Such similarities were present at the highest level of abstraction, but also in common steps along the operational flows. Existing patient care pathways were mapped and analysed, and recommendations for an optimum integration PHC model were made. The developed web platform, based on a strong theoretical framework, can serve as a robust integration evaluation tool. This could be a first step towards restructuring and improving PHC services within a financially restrained environment.
Applying data fusion techniques for benthic habitat mapping and monitoring in a coral reef ecosystem
NASA Astrophysics Data System (ADS)
Zhang, Caiyun
2015-06-01
Accurate mapping and effective monitoring of benthic habitat in the Florida Keys are critical to developing management strategies for this valuable coral reef ecosystem. For this study, a framework was designed for automated benthic habitat mapping by combining multiple data sources (hyperspectral, aerial photography, and bathymetry data) and four contemporary image processing techniques (data fusion, Object-based Image Analysis (OBIA), machine learning, and ensemble analysis). In the framework, the 1-m digital aerial photograph was first merged with 17-m hyperspectral imagery and 10-m bathymetry data using a pixel/feature-level fusion strategy. The fused dataset was then preclassified by three machine learning algorithms (Random Forest, Support Vector Machines, and k-Nearest Neighbor). Final object-based habitat maps were produced through ensemble analysis of the outcomes from the three classifiers. The framework was tested for classifying group-level (3-class) and code-level (9-class) habitats in a portion of the Florida Keys. Informative and accurate habitat maps were achieved with an overall accuracy of 88.5% and 83.5% for the group-level and code-level classifications, respectively.
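The sketch below illustrates the pre-classify-then-ensemble step: three classifiers trained on a fused feature stack and combined by majority vote. The features and labels are synthetic stand-ins for the fused aerial, hyperspectral and bathymetry layers, and the hard-voting ensemble is a simplification of the object-based ensemble analysis described above.

```python
# Majority-vote ensemble of three classifiers on a synthetic fused feature stack.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_pixels, n_features = 3000, 12            # e.g. fused spectral bands plus depth
X = rng.normal(size=(n_pixels, n_features))
y = (X[:, 0] + 0.5 * X[:, 3] - 0.3 * X[:, 7] > 0).astype(int) + (X[:, 1] > 1).astype(int)  # 3 classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("svm", SVC(kernel="rbf", C=1.0)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
    ],
    voting="hard",                          # majority vote across the three classifiers
)
ensemble.fit(X_tr, y_tr)
print("overall accuracy:", round(accuracy_score(y_te, ensemble.predict(X_te)), 3))
```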
EMAP and EMAGE: a framework for understanding spatially organized data.
Baldock, Richard A; Bard, Jonathan B L; Burger, Albert; Burton, Nicolas; Christiansen, Jeff; Feng, Guanjie; Hill, Bill; Houghton, Derek; Kaufman, Matthew; Rao, Jianguo; Sharpe, James; Ross, Allyson; Stevenson, Peter; Venkataraman, Shanmugasundaram; Waterhouse, Andrew; Yang, Yiya; Davidson, Duncan R
2003-01-01
The Edinburgh MouseAtlas Project (EMAP) is a time-series of mouse-embryo volumetric models. The models provide a context-free spatial framework onto which structural interpretations and experimental data can be mapped. This enables collation, comparison, and query of complex spatial patterns with respect to each other and with respect to known or hypothesized structure. The atlas also includes a time-dependent anatomical ontology and mapping between the ontology and the spatial models in the form of delineated anatomical regions or tissues. The models provide a natural, graphical context for browsing and visualizing complex data. The Edinburgh Mouse Atlas Gene-Expression Database (EMAGE) is one of the first applications of the EMAP framework and provides a spatially mapped gene-expression database with associated tools for data mapping, submission, and query. In this article, we describe the underlying principles of the Atlas and the gene-expression database, and provide a practical introduction to the use of the EMAP and EMAGE tools, including use of new techniques for whole body gene-expression data capture and mapping.
Regional Mapping of Plantation Extent Using Multisensor Imagery
NASA Astrophysics Data System (ADS)
Torbick, N.; Ledoux, L.; Hagen, S.; Salas, W.
2016-12-01
Industrial forest plantations are expanding rapidly across the tropics, and monitoring their extent is critical for understanding environmental and socioeconomic impacts. In this study, new multisensor imagery was evaluated and integrated to extract the strengths of each sensor for mapping plantation extent at regional scales. Three distinctly different landscapes with multiple plantation types were chosen to consider scalability and transferability: Tanintharyi, Myanmar; West Kalimantan, Indonesia; and southern Ghana. Landsat-8 Operational Land Imager (OLI), Phased Array L-band Synthetic Aperture Radar-2 (PALSAR-2), and Sentinel-1A images were fused within a Classification and Regression Tree (CART) framework using random forest and high-resolution surveys. Multi-criteria evaluations showed that both L- and C-band gamma-nought (γ°) backscatter in decibels (dB), Landsat reflectance (ρλ), and texture indices were useful for distinguishing oil palm and rubber plantations from other land types. The classification approach identified 750,822 ha, or 23%, of Tanintharyi, Myanmar, and 216,086 ha, or 25%, of western West Kalimantan as plantation, with very high cross-validation accuracy. The mapping approach was scalable and transferred well across the different geographies and plantation types. As archives for Sentinel-1, Landsat-8, and PALSAR-2 continue to grow, mapping plantation extent and dynamics at moderate resolution over large regions should be feasible.
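One of the inputs mentioned above is a set of texture indices; the sketch below computes grey-level co-occurrence (GLCM) contrast and homogeneity over a small quantized window with scikit-image. The 8-bit patch is a synthetic stand-in for a backscatter or reflectance window, and the window size and offsets are arbitrary choices for the example.

```python
# GLCM texture features (contrast, homogeneity) on a synthetic quantized patch.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
patch = rng.integers(0, 64, size=(32, 32), dtype=np.uint8)   # quantized image window

glcm = graycomatrix(
    patch,
    distances=[1],               # neighbouring pixels
    angles=[0, np.pi / 2],       # horizontal and vertical offsets
    levels=64,
    symmetric=True,
    normed=True,
)
print("contrast   :", graycoprops(glcm, "contrast").mean())
print("homogeneity:", graycoprops(glcm, "homogeneity").mean())
```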
A DSS for sustainable development and environmental protection of agricultural regions.
Manos, Basil D; Papathanasiou, Jason; Bournaris, Thomas; Voudouris, Kostas
2010-05-01
This paper presents a decision support system (DSS) for sustainable development and environmental protection of agricultural regions, developed in the framework of the Interreg-Archimed project entitled WaterMap (development and utilization of vulnerability maps for the monitoring and management of groundwater resources in the ARCHIMED areas). Its aim is to optimize the production plan of an agricultural region taking into account the available resources, the environmental parameters, and the vulnerability map of the region. The DSS is based on a multicriteria optimization model. The spatial integration of vulnerability maps in the DSS enables regional authorities to design policies for optimal agricultural development and groundwater protection from agricultural land uses. The DSS can further be used by local stakeholders to simulate different scenarios and policies based on changes in different social, economic, and environmental parameters. In this way, they can obtain alternative production plans and agricultural land uses as well as estimate the economic, social, and environmental impacts of different policies. The DSS is computerized and supported by a set of relational databases. The corresponding software has been developed on a Microsoft Windows XP platform, using Microsoft Visual Basic, Microsoft Access, and the LINDO library. For demonstration purposes, the paper includes an application of the DSS in a region of Northern Greece.
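As a highly simplified, single-objective stand-in for the kind of crop-plan optimization such a DSS performs, the sketch below maximizes gross margin subject to land, irrigation-water and nitrate-leaching constraints with a linear program. All coefficients are invented for illustration; the actual DSS uses a multicriteria model and the LINDO library.

```python
# Simplified single-objective crop-plan LP (invented coefficients).
from scipy.optimize import linprog

crops   = ["wheat", "maize", "alfalfa"]
margin  = [300, 550, 420]        # EUR per ha (hypothetical)
water   = [1500, 6000, 4500]     # m3 per ha (hypothetical)
nitrate = [20, 45, 10]           # kg N leached per ha (hypothetical)

land_total, water_total, nitrate_cap = 1000, 3_000_000, 25_000

res = linprog(
    c=[-m for m in margin],                      # maximize margin -> minimize -margin
    A_ub=[water, nitrate, [1, 1, 1]],
    b_ub=[water_total, nitrate_cap, land_total],
    bounds=[(0, None)] * 3,
    method="highs",
)
for crop, ha in zip(crops, res.x):
    print(f"{crop:8s}: {ha:7.1f} ha")
print("gross margin (EUR):", round(-res.fun))
```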
Research on strategy marine noise map based on i4ocean platform: Constructing flow and key approach
NASA Astrophysics Data System (ADS)
Huang, Baoxiang; Chen, Ge; Han, Yong
2016-02-01
Noise level in a marine environment has raised extensive concern in the scientific community. The research is carried out on the i4Ocean platform, following a process of integrating ocean noise models; extracting, processing, visualizing, and interpreting noise data; and constructing and publishing the ocean noise map. For the convenience of numerical computation, and based on the characteristics of the ocean noise field, a hybrid propagation model tied to spatial location is suggested: the normal-mode K/I model is used for the far field and the ray-method CANARY model for the near field. Visualizing marine ambient noise data is critical to understanding and predicting marine noise for relevant decision making. The marine noise map is constructed on a virtual ocean scene. The systematic marine noise visualization framework includes preprocessing, coordinate transformation and interpolation, and rendering. The simulation of ocean noise depends on a realistic sea surface, so the dynamic water simulation grid was improved with GPU fusion to achieve seamless combination with the visualization result of the ocean noise. At the same time, profile and spherical visualizations, covering the space and time dimensions, are also provided for the vertical field characteristics of ocean ambient noise. Finally, the marine noise map can be published with grid pre-processing and multistage cache technology to better serve the public.
Loizzo, Joseph J
2016-06-01
Meditation research has begun to clarify the brain effects and mechanisms of contemplative practices while generating a range of typologies and explanatory models to guide further study. This comparative review explores a neglected area relevant to current research: the validity of a traditional central nervous system (CNS) model that coevolved with the practices most studied today and that provides the first comprehensive neural-based typology and mechanistic framework of contemplative practices. The subtle body model, popularly known as the chakra system from Indian yoga, was and is used as a map of CNS function in traditional Indian and Tibetan medicine, neuropsychiatry, and neuropsychology. The study presented here, based on the Nalanda tradition, shows that the subtle body model can be cross-referenced with modern CNS maps and challenges modern brain maps with its embodied network model of CNS function. It also challenges meditation research by: (1) presenting a more rigorous, neural-based typology of contemplative practices; (2) offering a more refined and complete network model of the mechanisms of contemplative practices; and (3) serving as an embodied, interoceptive neurofeedback aid that is more user friendly and complete than current teaching aids for clinical and practical applications of contemplative practice. © 2016 New York Academy of Sciences.
Grand challenges for integrated USGS science—A workshop report
Jenni, Karen E.; Goldhaber, Martin B.; Betancourt, Julio L.; Baron, Jill S.; Bristol, R. Sky; Cantrill, Mary; Exter, Paul E.; Focazio, Michael J.; Haines, John W.; Hay, Lauren E.; Hsu, Leslie; Labson, Victor F.; Lafferty, Kevin D.; Ludwig, Kristin A.; Milly, Paul C. D.; Morelli, Toni L.; Morman, Suzette A.; Nassar, Nedal T.; Newman, Timothy R.; Ostroff, Andrea C.; Read, Jordan S.; Reed, Sasha C.; Shapiro, Carl D.; Smith, Richard A.; Sanford, Ward E.; Sohl, Terry L.; Stets, Edward G.; Terando, Adam J.; Tillitt, Donald E.; Tischler, Michael A.; Toccalino, Patricia L.; Wald, David J.; Waldrop, Mark P.; Wein, Anne; Weltzin, Jake F.; Zimmerman, Christian E.
2017-06-30
Executive Summary: The U.S. Geological Survey (USGS) has a long history of advancing the traditional Earth science disciplines and identifying opportunities to integrate USGS science across disciplines to address complex societal problems. The USGS science strategy for 2007–2017 laid out key challenges in disciplinary and interdisciplinary arenas, culminating in a call for increased focus on a number of crosscutting science directions. Ten years on, to further the goal of integrated science and at the request of the Executive Leadership Team (ELT), a workshop with three dozen invited scientists spanning different disciplines and career stages in the Bureau convened on February 7–10, 2017, at the USGS John Wesley Powell Center for Analysis and Synthesis in Fort Collins, Colorado. The workshop focused on identifying "grand challenges" for integrated USGS science. Individual participants identified nearly 70 potential grand challenges before the workshop and through workshop discussions. After discussion, four overarching grand challenges emerged: Natural resource security; Societal risk from existing and emerging threats; Smart infrastructure development; and Anticipatory science for changing landscapes. Participants also identified a "comprehensive science challenge" that highlights the development of integrative science, data, models, and tools, all interacting in a modular framework, that can be used to address these and other future grand challenges: Earth Monitoring, Analyses, and Projections (EarthMAP). EarthMAP is our long-term vision for an integrated scientific framework that spans traditional scientific boundaries and disciplines, and integrates the full portfolio of USGS science: research, monitoring, assessment, analysis, and information delivery. The Department of Interior, and the Nation in general, have a vast array of information needs. The USGS meets these needs by having a broadly trained and agile scientific workforce. Encouraging and supporting cross-discipline engagement would position the USGS to tackle complex and multifaceted scientific and societal challenges in the 21st Century.
Using concept mapping to design an indicator framework for addiction treatment centres.
Nabitz, Udo; van Den Brink, Wim; Jansen, Paul
2005-06-01
The objective of this study is to determine an indicator framework for addiction treatment centres based on the demands of stakeholders and in alignment with the European Foundation for Quality Management (EFQM) Excellence Model. The setting is the Jellinek Centre based in Amsterdam, the Netherlands, which serves as a prototype for an addiction treatment centre. Concept mapping was used in the construction of the indicator framework. During the 1-day workshop, 16 stakeholders generated, prioritized and sorted 73 items concerning quality and performance. Multidimensional scaling and cluster analysis were applied in constructing a framework consisting of two dimensions and eight clusters. The horizontal axis of the indicator framework is named 'Organization' and has two poles, namely 'Processes' and 'Results'. The vertical axis is named 'Task' and the poles are named 'Efficient treatment' and 'Prevention programs'. The eight clusters in the two-dimensional framework are arranged in the following prioritized sequence: 'Efficient treatment network', 'Effective service', 'Target group', 'Quality of life', 'Efficient service', 'Knowledge transfer', 'Reducing addiction related problems', and 'Prevention programs'. The most important items in the framework are: 'patients are satisfied with their treatment', 'early interventions', and 'efficient treatment chain'. The indicator framework aligns with three clusters of the results criteria of the EFQM Excellence Model. It is based on the stakeholders' perspectives and is believed to be specific to addiction treatment centres. The study demonstrates that concept mapping is a suitable strategy for generating indicator frameworks.
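The concept-mapping computation can be sketched as follows: participants' sortings of statements are turned into a dissimilarity matrix, statements are placed in two dimensions with multidimensional scaling, and the points are then clustered. The sortings below are random stand-ins for the stakeholders' data, and the number of piles and clusters are arbitrary choices matching the study's scale.

```python
# Concept-mapping sketch: co-sorting dissimilarity -> MDS -> clustering.
import numpy as np
from sklearn.manifold import MDS
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(0)
n_statements, n_sorters, n_piles = 73, 16, 8
# each sorter assigns every statement to one of several piles
sortings = rng.integers(0, n_piles, size=(n_sorters, n_statements))

# similarity = fraction of sorters who put the two statements in the same pile
same_pile = (sortings[:, :, None] == sortings[:, None, :]).mean(axis=0)
dissimilarity = 1.0 - same_pile

xy = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(dissimilarity)
clusters = AgglomerativeClustering(n_clusters=8).fit_predict(xy)

for c in range(8):
    members = np.where(clusters == c)[0]
    print(f"cluster {c}: statements {members[:6].tolist()}{' ...' if len(members) > 6 else ''}")
```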
Mathew, Lisa S; Spannagl, Manuel; Al-Malki, Ameena; George, Binu; Torres, Maria F; Al-Dous, Eman K; Al-Azwani, Eman K; Hussein, Emad; Mathew, Sweety; Mayer, Klaus F X; Mohamoud, Yasmin Ali; Suhre, Karsten; Malek, Joel A
2014-04-15
The date palm is one of the oldest cultivated fruit trees. It is critical in many ways to cultures in arid lands by providing highly nutritious fruit while surviving extreme heat and environmental conditions. Despite its importance from antiquity, few genetic resources are available for improving the productivity and development of the dioecious date palm. To date there has been no genetic map, and no sex chromosome has been identified. Here we present the first genetic map for date palm and identify the putative date palm sex chromosome. We placed ~4000 markers on the map using nearly 1200 framework markers spanning a total of 1293 cM. We have integrated the genetic map, derived from the Khalas cultivar, with the draft genome and placed up to 19% of the draft genome sequence scaffolds onto linkage groups for the first time. This analysis revealed approximately 1.9 cM/Mb on the map. Comparison of the date palm linkage groups revealed significant long-range synteny to oil palm. Analysis of the date palm sex-determination region suggests it is telomeric on linkage group 12 and that recombination is not suppressed in the full chromosome. Based on a modified genotyping-by-sequencing approach we have overcome challenges due to the lack of genetic resources and provide the first genetic map for date palm. Combined with the recent draft genome sequence of the same cultivar, this resource offers a critical new tool for date palm biotechnology, palm comparative genomics and a better understanding of sex chromosome development in the palms.
Inferring the effective TOR-dependent network: a computational study in yeast
2013-01-01
Background Calorie restriction (CR) is one of the most conserved non-genetic interventions that extends healthspan in evolutionarily distant species, ranging from yeast to mammals. The target of rapamycin (TOR) has been shown to play a key role in mediating healthspan extension in response to CR by integrating different signals that monitor nutrient-availability and orchestrating various components of cellular machinery in response. Both genetic and pharmacological interventions that inhibit the TOR pathway exhibit a similar phenotype, which is not further amplified by CR. Results In this paper, we present the first comprehensive, computationally derived map of TOR downstream effectors, with the objective of discovering key lifespan mediators, their crosstalk, and high-level organization. We adopt a systematic approach for tracing information flow from the TOR complex and use it to identify relevant signaling elements. By constructing a high-level functional map of TOR downstream effectors, we show that our approach is not only capable of recapturing previously known pathways, but also suggests potential targets for future studies. Information flow scores provide an aggregate ranking of relevance of proteins with respect to the TOR signaling pathway. These rankings must be normalized for degree bias, appropriately interpreted, and mapped to associated roles in pathways. We propose a novel statistical framework for integrating information flow scores, the set of differentially expressed genes in response to rapamycin treatment, and the transcriptional regulatory network. We use this framework to identify the most relevant transcription factors in mediating the observed transcriptional response, and to construct the effective response network of the TOR pathway. This network is hypothesized to mediate life-span extension in response to TOR inhibition. Conclusions Our approach, unlike experimental methods, is not limited to specific aspects of cellular response. Rather, it predicts transcriptional changes and post-translational modifications in response to TOR inhibition. The constructed effective response network greatly enhances understanding of the mechanisms underlying the aging process and helps in identifying new targets for further investigation of anti-aging regimes. It also allows us to identify potential network biomarkers for diagnosis and prognosis of age-related pathologies. PMID:24005029
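One simple proxy for "tracing information flow" from a source complex through an interaction network is a random walk with restart (personalized PageRank) seeded at the source nodes. The sketch below applies that idea to a small invented toy graph of TOR-related yeast proteins; it is an analogy to the approach described above, not the authors' algorithm or data.

```python
# Personalized PageRank as a rough information-flow proxy on a toy network.
import networkx as nx

G = nx.Graph()
edges = [
    ("TOR1", "KOG1"), ("TOR1", "SCH9"), ("SCH9", "RIM15"),
    ("RIM15", "MSN2"), ("RIM15", "GIS1"), ("TOR1", "TAP42"),
    ("TAP42", "SIT4"), ("SIT4", "GLN3"), ("MSN2", "HSP12"),
]
G.add_edges_from(edges)

# restart at the TOR complex; a higher score means more "flow" received from TOR
seeds = {"TOR1": 1.0}
scores = nx.pagerank(G, alpha=0.85, personalization=seeds)

for node, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{node:6s} {score:.3f}")
```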
Metadata mapping and reuse in caBIG.
Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis
2009-02-05
This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG framework or other frameworks that use metadata repositories. The Dice (di-grams) and Dynamic algorithms are compared, and both algorithms have similar performance in matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding the matches are reasonable for the data models examined. This suggests that automatic mapping of UML models and CDEs is feasible within the caBIG framework and potentially any framework that uses a metadata repository. This work opens up the possibility of using mapping algorithms to reduce the cost and time required to map local data models to a reference data model such as those used within caBIG. This effort contributes to facilitating the development of interoperable systems within caBIG as well as other metadata frameworks. Such efforts are critical to address the need to develop systems that can handle the enormous amounts of diverse data that can be leveraged from new biomedical methodologies.
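The character-bigram Dice similarity mentioned above can be sketched as follows; the example attribute and CDE names are hypothetical, and the tokenization is deliberately simple.

```python
# Bigram Dice similarity for lexical matching of attribute names to CDE names.
def bigrams(text: str) -> set:
    t = text.lower().replace("_", " ")
    return {t[i:i + 2] for i in range(len(t) - 1)}

def dice(a: str, b: str) -> float:
    ba, bb = bigrams(a), bigrams(b)
    if not ba or not bb:
        return 0.0
    return 2 * len(ba & bb) / (len(ba) + len(bb))

uml_attributes = ["patientBirthDate", "tumorGrade", "specimenId"]       # hypothetical
cde_names = ["Patient Birth Date", "Histologic Grade", "Specimen Identifier"]  # hypothetical

for attr in uml_attributes:
    best = max(cde_names, key=lambda cde: dice(attr, cde))
    print(f"{attr:18s} -> {best:22s} (Dice = {dice(attr, best):.2f})")
```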
NASA Astrophysics Data System (ADS)
Xu, Jin; Li, Zheng; Li, Shuliang; Zhang, Yanyan
2015-07-01
There is still a lack of effective paradigms and tools for analysing and discovering the contents and relationships of project knowledge contexts in the field of project management. In this paper, a new framework for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps under big data environments is proposed and developed. The conceptual paradigm, theoretical underpinning, extended topic model, and illustration examples of the ontology model for project knowledge maps are presented, with further research work envisaged.
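As a minimal illustration of the topic-model component of such a framework, the sketch below fits LDA to a handful of invented project documents and lists the top terms per topic; a real project knowledge map would link the resulting topics to project items and contexts.

```python
# Minimal LDA topic-model sketch on a toy project-document corpus.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "schedule delay risk mitigation contractor resources",
    "budget cost overrun forecast contingency reserve",
    "stakeholder communication meeting minutes decisions",
    "risk register schedule baseline resources allocation",
    "cost estimate budget approval change request",
    "communication plan stakeholder engagement reporting",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[::-1][:5]]
    print(f"topic {k}: {', '.join(top)}")
```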
Gregoriano cadastre (1818-35) from old maps to a GIS of historical landscape data
NASA Astrophysics Data System (ADS)
Frazzica, V.; Galletti, F.; Orciani, M.; Colosi, L.; Cartaro, A.
2009-04-01
Our analysis covered an area located along the "internal Marche ridge" of the Apennines, in the province of Ancona (Marche Region, Italy). The cartographic work for our historical analysis was conducted by drawing on maps originating from the nineteenth-century Gregoriano Cadastre (Catasto Gregoriano) preserved in the State Archive of Rome, which have been reproduced in digital format, georeferenced and vectorized. With the creation of a database, it has been possible to add to the maps the information gathered from the property registers concerning crop production and socio-economic variables, in order to set up a Geographical Information System (G.I.S.). The combination of the database with the digitized maps has allowed the creation of an unambiguous relation between each parcel and the related historical data, producing an information system which fully and completely preserves the original cadastre data. It was also possible to create a three-dimensional model of the historical landscapes, which permits visualization of the cultural diversification of that historical period. The integration into a Territorial Information System (S.I.T.) of the historical information from the Gregoriano Cadastre, of socio-economic analyses concerning business changes and, in parallel, of the study of the transformations of the territorial framework proved to be a very important instrument for area planning, allowing the identification of specific planning approaches not only for urban settlement but also for the restoration of the variety and complexity of the agricultural landscape. The work opens further research in various directions, identifying pilot areas in which to test new management models and foreseeing the simulation of management impacts both on business profitability and on landscape configuration. A future development of the project is the upgrade and evolution of the database, followed by the acquisition of data related to subsequent historical periods. This will also allow improvement of the three-dimensional model (rendering) of the landscape described in the Gregoriano Cadastre.
A Regulatory Framework for Nanotechnology
informed by a map of the regulatory landscape of nanotechnology and a review of the regulatory frameworks for the aviation and biotechnology industries...aviation and biotechnology and maps the regulatory landscape in the United States by examining stakeholders, regulatory entities, and applicable legislation...state of nanotechnology if the limitations of technical expertise are addressed. This expertise can be provided by advisory committees of technical
NASA Technical Reports Server (NTRS)
Wheeler, Kevin; Timucin, Dogan; Rabbette, Maura; Curry, Charles; Allan, Mark; Lvov, Nikolay; Clanton, Sam; Pilewskie, Peter
2002-01-01
The goal of visual inference programming is to develop a software framework for data analysis and to provide machine learning algorithms for interactive data exploration and visualization. The topics include: 1) Intelligent Data Understanding (IDU) framework; 2) Challenge problems; 3) What's new here; 4) Framework features; 5) Wiring diagram; 6) Generated script; 7) Results of script; 8) Initial algorithms; 9) Independent Component Analysis for instrument diagnosis; 10) Output sensory mapping virtual joystick; 11) Output sensory mapping typing; 12) Closed-loop feedback mu-rhythm control; 13) Closed-loop training; 14) Data sources; and 15) Algorithms. This paper is in viewgraph form.
Mapping of Drug-like Chemical Universe with Reduced Complexity Molecular Frameworks.
Kontijevskis, Aleksejs
2017-04-24
The emergence of the DNA-encoded chemical libraries (DEL) field in the past decade has attracted the attention of the pharmaceutical industry as a powerful mechanism for the discovery of novel drug-like hits for various biological targets. Nuevolution Chemetics technology enables DNA-encoded synthesis of billions of chemically diverse drug-like small molecule compounds, and the efficient screening and optimization of these, facilitating effective identification of drug candidates at an unprecedented speed and scale. Although many approaches have been developed by the cheminformatics community for the analysis and visualization of drug-like chemical space, most of them are restricted to the analysis of a few million compounds at most and cannot handle collections of 10^8-10^12 compounds typical for DELs. To address this big chemical data challenge, we developed the Reduced Complexity Molecular Frameworks (RCMF) methodology as an abstract and very general way of representing chemical structures. By further introducing RCMF descriptors, we constructed a global framework map of drug-like chemical space and demonstrated how chemical space occupied by multi-million-member drug-like Chemetics DNA-encoded libraries and virtual combinatorial libraries with >10^12 members could be analyzed and mapped without a need for library enumeration. We further validate the approach by performing RCMF-based searches in a drug-like chemical universe and mapping Chemetics library selection outputs for LSD1 targets on a global framework chemical space map.
An integrated molecular cytogenetic map of Cucumis sativus L. chromosome 2.
Han, Yonghua; Zhang, Zhonghua; Huang, Sanwen; Jin, Weiwei
2011-01-27
Integration of molecular, genetic and cytological maps is still a challenge for most plant species. Recent progress in molecular and cytogenetic studies created a basis for developing integrated maps in cucumber (Cucumis sativus L.). In this study, eleven fosmid clones and three plasmids containing 45S rDNA, the centromeric satellite repeat Type III and the pericentromeric repeat CsRP1 sequences, respectively, were hybridized to cucumber metaphase chromosomes to assign their cytological location on chromosome 2. Moreover, an integrated molecular cytogenetic map of cucumber chromosome 2 was constructed by fluorescence in situ hybridization (FISH) mapping of 11 fosmid clones together with the cucumber centromere-specific Type III sequence on meiotic pachytene chromosomes. The cytogenetic map was fully integrated with the genetic linkage map since each fosmid clone was anchored by a genetically mapped simple sequence repeat (SSR) marker. The relationship between the genetic and physical distances along the chromosome was analyzed. Recombination was not evenly distributed along the physical length of chromosome 2. Suppression of recombination was found in centromeric and pericentromeric regions. Our results also indicated that the molecular markers composing the linkage map for chromosome 2 provided excellent coverage of the chromosome.
Construction of a microsatellites-based linkage map for the white grouper (Epinephelus aeneus).
Dor, Lior; Shirak, Andrey; Gorshkov, Sergei; Band, Mark R; Korol, Abraham; Ronin, Yefim; Curzon, Arie; Hulata, Gideon; Seroussi, Eyal; Ron, Micha
2014-06-05
The white grouper (Epinephelus aeneus) is a promising candidate for domestication and aquaculture due to its fast growth, excellent taste, and high market price. A linkage map is an essential framework for mapping quantitative trait loci for economic traits and the study of genome evolution. DNA of a single individual was deep-sequenced, and microsatellite markers were identified in 177 of the largest scaffolds of the sequence assembly. The success rate of developing polymorphic homologous markers was 94.9%, compared with 63.1% for heterologous markers from other grouper species. Of the 12 adult mature fish present in the broodstock tank, two males and two females were identified as parents of the assigned offspring by parenthood analysis using 34 heterologous markers. A single full-sib family of 48 individuals was established for the construction of first-generation linkage maps based on genotyping data of 222 microsatellites. The markers were assigned to 24 linkage groups in accordance with the 24 chromosomal pairs. The female and male maps, consisting of 203 and 202 markers, spanned 1053 and 886 cM, with an average intermarker distance of 5.8 and 5.0 cM, respectively. Mapping of markers to linkage group ends was enriched by using markers originating from scaffolds harboring telomeric repeat-containing RNA. Comparative mapping showed high synteny relationships among the white grouper, kelp grouper (E. bruneus), orange-spotted grouper (E. coioides), and Nile tilapia (Oreochromis niloticus). Thus, it would be useful to integrate the markers that were developed for different groupers, depending on sharing of sequence data, into a comprehensive consensus map. Copyright © 2014 Dor et al.
Rotation number of integrable symplectic mappings of the plane
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zolkin, Timofey; Nagaitsev, Sergei; Danilov, Viatcheslav
2017-04-11
Symplectic mappings are discrete-time analogs of Hamiltonian systems. They appear in many areas of physics, including, for example, accelerators, plasma, and fluids. Integrable mappings, a subclass of symplectic mappings, are equivalent to a twist map, with a rotation number that is constant along the phase trajectory. In this letter, we propose a succinct expression to determine the rotation number and present two examples. Similar to the period of the bounded motion in Hamiltonian systems, the rotation number is the most fundamental property of integrable maps and it provides a way to analyze the phase-space dynamics.
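The letter's succinct expression for the rotation number is not reproduced in this record; the sketch below shows a generic numerical estimator that averages the wrapped phase-angle advance along an orbit, validated on a linear rotation, an integrable symplectic map whose rotation number is known exactly.

```python
import math

def rotation_number(map_step, x0, p0, n_iter=100000):
    """Estimate the rotation number of a planar map by averaging the
    wrapped phase-angle advance along one orbit."""
    x, p = x0, p0
    theta = math.atan2(p, x)
    total = 0.0
    for _ in range(n_iter):
        x, p = map_step(x, p)
        new_theta = math.atan2(p, x)
        d = new_theta - theta
        d = (d + math.pi) % (2 * math.pi) - math.pi   # wrap advance into [-pi, pi)
        total += d
        theta = new_theta
    return abs(total) / (2 * math.pi * n_iter)

# Validation on a linear rotation (area-preserving, trivially integrable):
nu = 0.12345
c, s = math.cos(2 * math.pi * nu), math.sin(2 * math.pi * nu)
rotate = lambda x, p: (c * x + s * p, -s * x + c * p)
print(rotation_number(rotate, 1.0, 0.0))   # ~0.12345
```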
Critical thinking in graduate medical education: A role for concept mapping assessment?
West, D C; Pomeroy, J R; Park, J K; Gerstenberger, E A; Sandoval, J
2000-09-06
Tools to assess the evolving conceptual framework of physicians-in-training are limited, despite their critical importance to physicians' evolving clinical expertise. Concept mapping assessment (CMA) enables teachers to view students' organization of their knowledge at various points in training. To assess whether CMA reflects expected differences and changes in the conceptual framework of resident physicians, whether concept maps can be scored reliably, and how well CMA scores relate to the results of standard in-training examination. A group of 21 resident physicians (9 first-year and 12 second- and third-year residents) from a university-based pediatric training program underwent concept map training, drew a preinstruction concept map about seizures, completed an education course on seizures, and then drew a postinstruction map. Maps were scored independently by 3 raters using a standardized method. The study was conducted in May and June 1999. Preinstruction map total scores and subscores in 4 categories compared with postinstruction map scores; map scores of second- and third-year residents compared with first-year residents; and interrater correlation of map scores. Total CMA scores increased after instruction from a mean (SD) preinstruction map score of 429 (119) to a mean postinstruction map score of 516 (196) (P =.03). Second- and third-year residents scored significantly higher than first-year residents before instruction (mean [SD] score of 472 [116] vs 371 [102], respectively; P =.04), but not after instruction (mean [SD] scores, 561 [203] vs 456 [179], respectively; P =.16). Second- and third-year residents had greater preinstruction map complexity as measured by cross-link score (P =.01) than first-year residents. The CMA score had a weak to no correlation with the American Board of Pediatrics In-training Examination score (r = 0.10-0.54). Interrater correlation of map scoring ranged from weak to moderate for the preinstruction map (r = 0.51-0.69) and moderate to strong for the postinstruction map (r = 0.74-0.88). Our data provide preliminary evidence that concept mapping assessment reflects expected differences and change in the conceptual framework of resident physicians. Concept mapping assessment and standardized testing may measure different cognitive domains. JAMA. 2000;284:1105-1110
A complex analysis approach to the motion of uniform vortices
NASA Astrophysics Data System (ADS)
Riccardi, Giorgio
2018-02-01
A new mathematical approach to kinematics and dynamics of planar uniform vortices in an incompressible inviscid fluid is presented. It is based on an integral relation between the Schwarz function of the vortex boundary and the induced velocity. This relation is first used for investigating the kinematics of a vortex having its Schwarz function with two simple poles in a transformed plane. The vortex boundary is the image of the unit circle through the conformal map obtained by conjugating its Schwarz function. The resulting analysis is based on geometric and algebraic properties of that map. Moreover, it is shown that the steady configurations of a uniform vortex, possibly in the presence of point vortices, can also be investigated by means of the integral relation. The vortex equilibria are divided into two classes, depending on the behavior of the velocity on the boundary, measured in a reference system rotating with this curve. If it vanishes, the analysis is rather simple. However, vortices having nonvanishing relative velocity are also investigated, in the presence of a polygonal symmetry. In order to study the vortex dynamics, the definition of the Schwarz function is then extended to a Lagrangian framework. This Lagrangian Schwarz function solves a nonlinear integrodifferential Cauchy problem, which is transformed into a singular integral equation. Its analytical solution is here approached in terms of successive approximations. The self-induced dynamics, as well as the interactions with a point vortex, or between two uniform vortices, are analyzed.
Instruments Measuring Integrated Care: A Systematic Review of Measurement Properties.
Bautista, Mary Ann C; Nurjono, Milawaty; Lim, Yee Wei; Dessers, Ezra; Vrijhoef, Hubertus Jm
2016-12-01
Policy Points: Investigations on systematic methodologies for measuring integrated care should coincide with the growing interest in this field of research. A systematic review of instruments provides insights into integrated care measurement, including setting the research agenda for validating available instruments and informing the decision to develop new ones. This study is the first systematic review of instruments measuring integrated care with an evidence synthesis of the measurement properties. We found 209 index instruments measuring different constructs related to integrated care; the strength of evidence on the adequacy of the majority of their measurement properties remained largely unassessed. Integrated care is an important strategy for increasing health system performance. Despite its growing significance, detailed evidence on the measurement properties of integrated care instruments remains vague and limited. Our systematic review aims to provide evidence on the state of the art in measuring integrated care. Our comprehensive systematic review framework builds on the Rainbow Model for Integrated Care (RMIC). We searched MEDLINE/PubMed for published articles on the measurement properties of instruments measuring integrated care and identified eligible articles using a standard set of selection criteria. We assessed the methodological quality of every validation study reported using the COSMIN checklist and extracted data on study and instrument characteristics. We also evaluated the measurement properties of each examined instrument per validation study and provided a best evidence synthesis on the adequacy of measurement properties of the index instruments. From the 300 eligible articles, we assessed the methodological quality of 379 validation studies from which we identified 209 index instruments measuring integrated care constructs. The majority of studies reported on instruments measuring constructs related to care integration (33%) and patient-centered care (49%); fewer studies measured care continuity/comprehensive care (15%) and care coordination/case management (3%). We mapped 84% of the measured constructs to the clinical integration domain of the RMIC, with fewer constructs related to the domains of professional (3.7%), organizational (3.4%), and functional (0.5%) integration. Only 8% of the instruments were mapped to a combination of domains; none were mapped exclusively to the system or normative integration domains. The majority of instruments were administered to either patients (60%) or health care providers (20%). Of the measurement properties, responsiveness (4%), measurement error (7%), and criterion (12%) and cross-cultural validity (14%) were less commonly reported. We found <50% of the validation studies to be of good or excellent quality for any of the measurement properties. Only a minority of index instruments showed strong evidence of positive findings for internal consistency (15%), content validity (19%), and structural validity (7%); with moderate evidence of positive findings for internal consistency (14%) and construct validity (14%). Our results suggest that the quality of measurement properties of instruments measuring integrated care is in need of improvement with the less-studied constructs and domains to become part of newly developed instruments. © 2016 Milbank Memorial Fund.
Instruments Measuring Integrated Care: A Systematic Review of Measurement Properties
BAUTISTA, MARY ANN C.; NURJONO, MILAWATY; DESSERS, EZRA; VRIJHOEF, HUBERTUS JM
2016-01-01
Policy Points: Investigations on systematic methodologies for measuring integrated care should coincide with the growing interest in this field of research. A systematic review of instruments provides insights into integrated care measurement, including setting the research agenda for validating available instruments and informing the decision to develop new ones. This study is the first systematic review of instruments measuring integrated care with an evidence synthesis of the measurement properties. We found 209 index instruments measuring different constructs related to integrated care; the strength of evidence on the adequacy of the majority of their measurement properties remained largely unassessed. Context: Integrated care is an important strategy for increasing health system performance. Despite its growing significance, detailed evidence on the measurement properties of integrated care instruments remains vague and limited. Our systematic review aims to provide evidence on the state of the art in measuring integrated care. Methods: Our comprehensive systematic review framework builds on the Rainbow Model for Integrated Care (RMIC). We searched MEDLINE/PubMed for published articles on the measurement properties of instruments measuring integrated care and identified eligible articles using a standard set of selection criteria. We assessed the methodological quality of every validation study reported using the COSMIN checklist and extracted data on study and instrument characteristics. We also evaluated the measurement properties of each examined instrument per validation study and provided a best evidence synthesis on the adequacy of measurement properties of the index instruments. Findings: From the 300 eligible articles, we assessed the methodological quality of 379 validation studies from which we identified 209 index instruments measuring integrated care constructs. The majority of studies reported on instruments measuring constructs related to care integration (33%) and patient-centered care (49%); fewer studies measured care continuity/comprehensive care (15%) and care coordination/case management (3%). We mapped 84% of the measured constructs to the clinical integration domain of the RMIC, with fewer constructs related to the domains of professional (3.7%), organizational (3.4%), and functional (0.5%) integration. Only 8% of the instruments were mapped to a combination of domains; none were mapped exclusively to the system or normative integration domains. The majority of instruments were administered to either patients (60%) or health care providers (20%). Of the measurement properties, responsiveness (4%), measurement error (7%), and criterion (12%) and cross-cultural validity (14%) were less commonly reported. We found <50% of the validation studies to be of good or excellent quality for any of the measurement properties. Only a minority of index instruments showed strong evidence of positive findings for internal consistency (15%), content validity (19%), and structural validity (7%); with moderate evidence of positive findings for internal consistency (14%) and construct validity (14%). Conclusions: Our results suggest that the quality of measurement properties of instruments measuring integrated care is in need of improvement with the less-studied constructs and domains to become part of newly developed instruments. PMID:27995711
NASA Astrophysics Data System (ADS)
Lindsay, Jan M.; Robertson, Richard E. A.
2018-04-01
We report on the process of generating the first suite of integrated volcanic hazard zonation maps for the islands of Dominica, Grenada (including Kick 'em Jenny and Ronde/Caille), Nevis, Saba, St. Eustatius, St. Kitts, Saint Lucia and St Vincent in the Lesser Antilles. We developed a systematic approach that accommodated the range in prior knowledge of the volcanoes in the region. A first-order hazard assessment for each island was used to develop one or more scenario(s) of likely future activity, for which scenario-based hazard maps were generated. For the most-likely scenario on each island we also produced a poster-sized integrated volcanic hazard zonation map, which combined the individual hazardous phenomena depicted in the scenario-based hazard maps into integrated hazard zones. We document the philosophy behind the generation of this suite of maps, and the method by which hazard information was combined to create integrated hazard zonation maps, and illustrate our approach through a case study of St. Vincent. We also outline some of the challenges we faced using this approach, and the lessons we have learned by observing how stakeholders have interacted with the maps over the past 10 years. Based on our experience, we recommend that future map makers involve stakeholders in the entire map generation process, especially when making design choices such as type of base map, use of colour and gradational boundaries, and indeed what to depict on the map. We also recommend careful consideration of how to evaluate and depict offshore hazard of island volcanoes, and recommend computer-assisted modelling of all phenomena to generate more realistic hazard footprints. Finally, although our systematic approach to integrating individual hazard data into zones generally worked well, we suggest that a better approach might be to treat the integration of hazards on a case-by-case basis to ensure the final product meets map users' needs. We hope that the documentation of our experience might be useful for other map makers to take into account when creating new or updating existing maps.
NASA Astrophysics Data System (ADS)
Bydlon, S. A.; Beroza, G. C.
2015-12-01
Recent debate on the efficacy of Probabilistic Seismic Hazard Analysis (PSHA), and the utility of hazard maps (i.e. Stein et al., 2011; Hanks et al., 2012), has prompted a need for validation of such maps using recorded strong ground motion data. Unfortunately, strong motion records are limited spatially and temporally relative to the area and time windows hazard maps encompass. We develop a framework to test the predictive powers of PSHA maps that is flexible with respect to a map's specified probability of exceedance and time window, and the strong motion receiver coverage. Using a combination of recorded and interpolated strong motion records produced through the ShakeMap environment, we compile a record of ground motion intensity measures for California from 2002-present. We use this information to perform an area-based test of California PSHA maps inspired by the work of Ward (1995). Though this framework is flexible in that it can be applied to seismically active areas where ShakeMap-like ground shaking interpolations have or can be produced, this testing procedure is limited by the relatively short lifetime of strong motion recordings and by the desire to only test with data collected after the development of the PSHA map under scrutiny. To account for this, we use the assumption that PSHA maps are time independent to adapt the testing procedure for periods of recorded data shorter than the lifetime of a map. We note that accuracy of this testing procedure will only improve as more data is collected, or as the time-horizon of interest is reduced, as has been proposed for maps of areas experiencing induced seismicity. We believe that this procedure can be used to determine whether PSHA maps are accurately portraying seismic hazard and whether discrepancies are localized or systemic.
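A minimal sketch of an area-based comparison in the spirit of Ward (1995), under a time-independent Poisson assumption: the observed fraction of sites whose peak ground motion exceeded the mapped value is compared with the fraction expected for the shorter observation window. The site values below are invented; the authors' actual procedure and data are not reproduced.

```python
import numpy as np

def expected_fraction(p_map, t_map, t_obs):
    """Expected fraction of sites exceeding the mapped ground motion in an
    observation window t_obs, assuming a time-independent Poisson model."""
    rate = -np.log(1.0 - p_map) / t_map      # annual exceedance rate implied by the map
    return 1.0 - np.exp(-rate * t_obs)

# Hypothetical inputs: mapped PGA (g) at 10% in 50 yr, and observed peak PGA per site
# compiled from ShakeMap-style interpolations over a 13-year window.
mapped_pga = np.array([0.30, 0.45, 0.22, 0.60, 0.35])
observed_peak_pga = np.array([0.12, 0.50, 0.05, 0.20, 0.40])

observed_fraction = np.mean(observed_peak_pga > mapped_pga)
print("observed fraction of exceeding sites:", observed_fraction)
print("expected fraction under the map:", expected_fraction(p_map=0.10, t_map=50.0, t_obs=13.0))
```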
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lathrop, R.G. Jr.
1988-01-01
The utility of three operational satellite remote sensing systems, namely the Landsat Thematic Mapper (TM), the SPOT High Resolution Visible (HRV) sensors and the NOAA Advanced Very High Resolution Radiometer (AVHRR), was evaluated as a means of estimating water quality and surface temperature. Empirical calibration through linear regression techniques was used to relate near-simultaneously acquired satellite radiance/reflectance data and water quality observations obtained in Green Bay and the nearshore waters of Lake Michigan. Four dates of TM and one date each of SPOT and AVHRR imagery/surface reference data were acquired and analyzed. Highly significant relationships were identified between the TM and SPOT data and secchi disk depth, nephelometric turbidity, chlorophyll a, total suspended solids (TSS), absorbance, and surface temperature (TM only). The AVHRR data were not analyzed independently but were used for comparison with the TM data. Calibrated water quality image maps were input to a PC-based raster GIS package, EPPL7. Pattern interpretation and spatial analysis techniques were used to document the circulation dynamics and model mixing processes in Green Bay. A GIS facilitates the retrieval, query and spatial analysis of mapped information and provides the framework for an integrated operational monitoring system for the Great Lakes.
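The empirical calibration step described above amounts to an ordinary least-squares regression of a water-quality variable on band radiances; a minimal sketch with invented TM band values and secchi depths is shown below.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical near-simultaneous pairs: band radiances (columns) at station pixels
# and the secchi disk depths (m) measured on the same day.
radiance = np.array([
    [42.1, 18.3, 12.0],
    [39.5, 16.8, 10.7],
    [47.2, 22.5, 15.9],
    [35.0, 14.1,  9.2],
    [44.8, 20.0, 13.4],
])
secchi_depth = np.array([1.8, 2.4, 0.9, 3.1, 1.4])

model = LinearRegression().fit(radiance, secchi_depth)
print("R^2 on calibration data:", model.score(radiance, secchi_depth))

# Applying the fitted model to every pixel yields a calibrated water-quality
# image map that can be loaded into a raster GIS for spatial analysis.
new_pixels = np.array([[40.0, 17.5, 11.0]])
print("predicted secchi depth (m):", model.predict(new_pixels))
```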
An Imaging Sensor-Aided Vision Navigation Approach that Uses a Geo-Referenced Image Database.
Li, Yan; Hu, Qingwu; Wu, Meng; Gao, Yang
2016-01-28
In determining position and attitude, vision navigation via real-time image processing of data collected from imaging sensors is advanced without a high-performance global positioning system (GPS) and an inertial measurement unit (IMU). Vision navigation is widely used in indoor navigation, far space navigation, and multiple sensor-integrated mobile mapping. This paper proposes a novel vision navigation approach aided by imaging sensors and that uses a high-accuracy geo-referenced image database (GRID) for high-precision navigation of multiple sensor platforms in environments with poor GPS. First, the framework of GRID-aided vision navigation is developed with sequence images from land-based mobile mapping systems that integrate multiple sensors. Second, a highly efficient GRID storage management model is established based on the linear index of a road segment for fast image searches and retrieval. Third, a robust image matching algorithm is presented to search and match a real-time image with the GRID. Subsequently, the image matched with the real-time scene is considered to calculate the 3D navigation parameter of multiple sensor platforms. Experimental results show that the proposed approach retrieves images efficiently and has navigation accuracies of 1.2 m in a plane and 1.8 m in height under GPS loss in 5 min and within 1500 m.
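The paper's robust image matching algorithm is not detailed in this record; as a stand-in, the sketch below scores a real-time frame against GRID candidates (pre-filtered by the road-segment index) using ORB features and ratio-filtered Hamming matching in OpenCV. File paths are placeholders.

```python
import cv2

def match_score(query_path, candidate_path, ratio=0.75):
    """Count ratio-filtered ORB matches between a real-time image and a
    geo-referenced database image; higher means a more likely match."""
    img1 = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(candidate_path, cv2.IMREAD_GRAYSCALE)
    orb = cv2.ORB_create(nfeatures=2000)
    _, d1 = orb.detectAndCompute(img1, None)
    _, d2 = orb.detectAndCompute(img2, None)
    if d1 is None or d2 is None:
        return 0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    pairs = matcher.knnMatch(d1, d2, k=2)
    good = [m for m, n in pairs if m.distance < ratio * n.distance]
    return len(good)

# Candidates would come from the road-segment index around the platform's rough position.
candidates = ["grid/seg_0412_img_003.jpg", "grid/seg_0412_img_004.jpg"]  # placeholder paths
best = max(candidates, key=lambda c: match_score("realtime_frame.jpg", c))
print("best GRID match:", best)
```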
Quantifying multiple telecouplings using an integrated suite of spatially-explicit tools
NASA Astrophysics Data System (ADS)
Tonini, F.; Liu, J.
2016-12-01
Telecoupling is an interdisciplinary research umbrella concept that enables natural and social scientists to understand and generate information for managing how humans and nature can sustainably coexist worldwide. To systematically study telecoupling, it is essential to build a comprehensive set of spatially-explicit tools for describing and quantifying multiple reciprocal socioeconomic and environmental interactions between a focal area and other areas. Here we introduce the Telecoupling Toolbox, a new free and open-source set of tools developed to map and identify the five major interrelated components of the telecoupling framework: systems, flows, agents, causes, and effects. The modular design of the toolbox allows the integration of existing tools and software (e.g. InVEST) to assess synergies and tradeoffs associated with policies and other local to global interventions. We show applications of the toolbox using a number of representative studies that address a variety of scientific and management issues related to telecouplings throughout the world. The results suggest that the toolbox can thoroughly map and quantify multiple telecouplings under various contexts while providing users with an easy-to-use interface. It provides a powerful platform to address globally important issues, such as land use and land cover change, species invasion, migration, flows of ecosystem services, and international trade of goods and products.
Dagliati, Arianna; Marinoni, Andrea; Cerra, Carlo; Decata, Pasquale; Chiovato, Luca; Gamba, Paolo; Bellazzi, Riccardo
2015-12-01
A very interesting perspective of "big data" in diabetes management lies in the integration of environmental information with data gathered for clinical and administrative purposes, to increase the capability of understanding spatial and temporal patterns of diseases. Within the MOSAIC project, funded by the European Union with the goal to design new diabetes analytics, we have jointly analyzed a clinical-administrative dataset of nearly 1,000 type 2 diabetes patients with environmental information derived from air quality maps acquired from remote sensing (satellite) data. Within this context we have adopted a general analysis framework able to deal with a large variety of temporal, geo-localized data. Thanks to the exploitation of time series analysis and satellite image processing, we studied whether glycemic control showed seasonal variations and whether these have a spatiotemporal correlation with air pollution maps. We observed a link between the seasonal trends of glycated hemoglobin and air pollution in some of the considered geographic areas. Such findings will need future investigations for further confirmation. This work shows that it is possible to successfully deal with big data by implementing new analytics and how their exploration may provide new scenarios to better understand clinical phenomena. © 2015 Diabetes Technology Society.
Neural mechanisms of discourse comprehension: a human lesion study
Colom, Roberto; Grafman, Jordan
2014-01-01
Discourse comprehension is a hallmark of human social behaviour and refers to the act of interpreting a written or spoken message by constructing mental representations that integrate incoming language with prior knowledge and experience. Here, we report a human lesion study (n = 145) that investigates the neural mechanisms underlying discourse comprehension (measured by the Discourse Comprehension Test) and systematically examine its relation to a broad range of psychological factors, including psychometric intelligence (measured by the Wechsler Adult Intelligence Scale), emotional intelligence (measured by the Mayer, Salovey, Caruso Emotional Intelligence Test), and personality traits (measured by the Neuroticism-Extraversion-Openness Personality Inventory). Scores obtained from these factors were submitted to voxel-based lesion-symptom mapping to elucidate their neural substrates. Stepwise regression analyses revealed that working memory and extraversion reliably predict individual differences in discourse comprehension: higher working memory scores and lower extraversion levels predict better discourse comprehension performance. Lesion mapping results indicated that these convergent variables depend on a shared network of frontal and parietal regions, including white matter association tracts that bind these areas into a coordinated system. The observed findings motivate an integrative framework for understanding the neural foundations of discourse comprehension, suggesting that core elements of discourse processing emerge from a distributed network of brain regions that support specific competencies for executive and social function. PMID:24293267
An Imaging Sensor-Aided Vision Navigation Approach that Uses a Geo-Referenced Image Database
Li, Yan; Hu, Qingwu; Wu, Meng; Gao, Yang
2016-01-01
In determining position and attitude, vision navigation via real-time image processing of data collected from imaging sensors is advanced without a high-performance global positioning system (GPS) and an inertial measurement unit (IMU). Vision navigation is widely used in indoor navigation, far space navigation, and multiple sensor-integrated mobile mapping. This paper proposes a novel vision navigation approach aided by imaging sensors and that uses a high-accuracy geo-referenced image database (GRID) for high-precision navigation of multiple sensor platforms in environments with poor GPS. First, the framework of GRID-aided vision navigation is developed with sequence images from land-based mobile mapping systems that integrate multiple sensors. Second, a highly efficient GRID storage management model is established based on the linear index of a road segment for fast image searches and retrieval. Third, a robust image matching algorithm is presented to search and match a real-time image with the GRID. Subsequently, the image matched with the real-time scene is considered to calculate the 3D navigation parameter of multiple sensor platforms. Experimental results show that the proposed approach retrieves images efficiently and has navigation accuracies of 1.2 m in a plane and 1.8 m in height under GPS loss in 5 min and within 1500 m. PMID:26828496
Dunford, R; Harrison, P A; Jäger, J; Rounsevell, M D A; Tinch, R
Addressing climate change vulnerability requires an understanding of both the level of climate impacts and the capacity of the exposed population to cope. This study developed a methodology for allowing users to explore vulnerability to changes in ecosystem services as a result of climatic and socio-economic changes. It focuses on the vulnerability of Europe across multiple sectors by combining the outputs of a regional integrated assessment (IA) model, the CLIMSAVE IA Platform, with maps of coping capacity based on the five capitals approach. The presented methodology enables stakeholder-derived socio-economic futures to be represented within a quantitative integrated modelling framework in a way that changes spatially and temporally with the socio-economic storyline. Vulnerability was mapped for six key ecosystem services in 40 combined climate and socio-economic scenarios. The analysis shows that, whilst the north and west of Europe are generally better placed to cope with climate impacts than the south and east, coping could be improved in all areas. Furthermore, whilst the lack of coping capacity in dystopian scenarios often leads to greater vulnerability, there are complex interactions between sectors that lead to patterns of vulnerability that vary spatially, with scenario and by sector even within the more utopian futures.
NASA Astrophysics Data System (ADS)
Thompson, J. A.; Giles, K. A.; Rowan, M. G.; Hearon, T. E., IV
2016-12-01
The Paradox Basin in southeastern Utah and southwestern Colorado is a foreland basin formed in response to flexural loading by the Pennsylvanian-aged Uncompahgre uplift during the Ancestral Rocky Mountain orogen. Thick sequences of evaporites (Paradox Formation) were deposited within the foreland basin, which interfinger with clastic sediments in the foredeep and carbonates around the basin margin. Differential loading of the Pennsylvanian-Jurassic sediments onto the evaporites drove synsedimentary halokinesis, creating a series of salt walls and adjacent minibasins within the larger foreland basin. The growing salt walls within the basin influenced patterns of sediment deposition from the Pennsylvanian through the Cretaceous. By integrating previously published mapping with recent field observations, mapping, and subsurface interpretations of well logs and 2D seismic lines, we present interpretations of the timing, geometry, and nature of halokinesis within the Paradox Basin, which record the complex salt tectonic history in the basin. Furthermore, we present recent work on the relationships between the local passive salt history and the formation of syndepositional counter-regional extensional fault systems within the foreland. These results will be integrated into a new regional salt-tectonic and stratigraphic framework of the Paradox Basin, and have broader implications for interpreting sedimentary records in other basins with a mobile substrate.
Nali, C; Balducci, E; Frati, L; Paoli, L; Loppi, S; Lorenzini, G
2007-05-01
A biennial integrated survey, based on the use of vascular plants for the bioindication of the effects of tropospheric ozone together with the use of automatic ozone analysers, as well as the mapping of lichen biodiversity, was performed in the area of Castelfiorentino (Tuscany, central Italy). Photochemically produced ozone proved to be a fundamental presence during the warm season, with maximum hourly means reaching 114 ppb, exceeding the information threshold fixed by the EU: the use of supersensitive tobacco Bel-W3 confirmed the opportunity of carrying out detailed, cost-effective monitoring surveys. The potential didactical and educational implications of this methodology are appealing. Critical levels set for the protection of vegetation were exceeded considerably. The comparison of biomass productivity in sensitive and resistant individuals (NC-S and NC-R white clover clones, in the framework of a European network) provided evidence that ambient ozone levels are associated with a relevant reduction (up to 30%) in the performance of sensitive material; effects on flowering were also pronounced. The economic assessment of such an impact deserves attention. Mapping of epiphytic lichen biodiversity, which has been used to monitor air quality worldwide, was not related to the geographical distribution of ozone as depicted by the tobacco response.
Constructing Adverse Outcome Pathways: a Demonstration of ...
The adverse outcome pathway (AOP) provides a conceptual framework to evaluate and integrate chemical toxicity and its effects across the levels of biological organization. As such, it is essential to develop a resource-efficient and effective approach to extend molecular initiating events (MIEs) of chemicals to their downstream phenotypes of greater regulatory relevance. A number of ongoing public phenomics (high throughput phenotyping) efforts have been generating abundant phenotypic data annotated with ontology terms. These phenotypes can be analyzed semantically and linked to MIEs of interest, all in the context of a knowledge base integrated from a variety of ontologies for various species and knowledge domains. In such analyses, two phenotypic profiles (PPs; anchored by genes or diseases), each characterized by multiple ontology terms, are compared for their semantic similarities within a common ontology graph, but across boundaries of species and knowledge domains. Taking advantage of publicly available ontologies and software tool kits, we have implemented an OS-Mapping (Ontology-based Semantics Mapping) approach as a Java application, and constructed a network of 19383 PPs as nodes with edges weighted by their pairwise semantic similarity scores. Individual PPs were assembled from public phenomics data. Out of the possible 1.87×10^8 pairwise connections among these nodes, about 71% have similarity scores between 0.2 and the maximum possible of 1.0.
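The OS-Mapping similarity computed on the merged ontology graph is not reproduced here; the sketch below only illustrates the common best-match-average pattern for comparing two phenotypic profiles, using a small invented table of pairwise term similarities (e.g., precomputed Lin or Resnik scores).

```python
# Hypothetical pairwise term similarities (e.g., precomputed on a merged ontology).
term_sim = {
    ("HP:0001250", "MP:0000950"): 0.82,   # seizure-like phenotypes
    ("HP:0001250", "MP:0001262"): 0.10,
    ("HP:0004322", "MP:0000950"): 0.05,
    ("HP:0004322", "MP:0001262"): 0.71,   # growth/body-size phenotypes
}

def sim(a, b):
    return term_sim.get((a, b), term_sim.get((b, a), 0.0))

def profile_similarity(profile_a, profile_b):
    """Symmetric best-match average: each term is matched to its most similar
    counterpart in the other profile, and the two directions are averaged."""
    forward = sum(max(sim(a, b) for b in profile_b) for a in profile_a) / len(profile_a)
    backward = sum(max(sim(a, b) for a in profile_a) for b in profile_b) / len(profile_b)
    return 0.5 * (forward + backward)

mie_profile = ["HP:0001250", "HP:0004322"]      # phenotypes linked to a molecular initiating event
disease_profile = ["MP:0000950", "MP:0001262"]  # phenotypes annotated to a candidate outcome
print(profile_similarity(mie_profile, disease_profile))   # 0.765
```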
Using Distributed Data over HBase in Big Data Analytics Platform for Clinical Services
Zamani, Hamid
2017-01-01
Big data analytics (BDA) is important to reduce healthcare costs. However, there are many challenges of data aggregation, maintenance, integration, translation, analysis, and security/privacy. The study objective, to establish an interactive BDA platform with simulated patient data using open-source software technologies, was achieved by construction of a platform framework with the Hadoop Distributed File System (HDFS) using HBase (a key-value NoSQL database). Distributed data structures were generated from benchmarked hospital-specific metadata of nine billion patient records. At optimized iteration, HDFS ingestion of HFiles to HBase store files revealed sustained availability over hundreds of iterations; however, completing MapReduce to HBase required a week (for 10 TB) and a month for three billion (30 TB) indexed patient records, respectively. Inconsistencies found in MapReduce limited the capacity to generate and replicate data efficiently. Apache Spark and Drill showed high performance with high usability for technical support but poor usability for clinical services. Building the hospital system on patient-centric data was challenging in HBase, whereby not all data profiles were fully integrated with the complex patient-to-hospital relationships. However, we recommend using HBase to achieve secured patient data while querying entire hospital volumes in a simplified clinical event model across clinical services. PMID:29375652
Using Distributed Data over HBase in Big Data Analytics Platform for Clinical Services.
Chrimes, Dillon; Zamani, Hamid
2017-01-01
Big data analytics (BDA) is important to reduce healthcare costs. However, there are many challenges of data aggregation, maintenance, integration, translation, analysis, and security/privacy. The study objective, to establish an interactive BDA platform with simulated patient data using open-source software technologies, was achieved by construction of a platform framework with the Hadoop Distributed File System (HDFS) using HBase (a key-value NoSQL database). Distributed data structures were generated from benchmarked hospital-specific metadata of nine billion patient records. At optimized iteration, HDFS ingestion of HFiles to HBase store files revealed sustained availability over hundreds of iterations; however, completing MapReduce to HBase required a week (for 10 TB) and a month for three billion (30 TB) indexed patient records, respectively. Inconsistencies found in MapReduce limited the capacity to generate and replicate data efficiently. Apache Spark and Drill showed high performance with high usability for technical support but poor usability for clinical services. Building the hospital system on patient-centric data was challenging in HBase, whereby not all data profiles were fully integrated with the complex patient-to-hospital relationships. However, we recommend using HBase to achieve secured patient data while querying entire hospital volumes in a simplified clinical event model across clinical services.
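As an illustration of the recommended patient-centric, key-value access pattern, the sketch below stores and scans clinical events in HBase through the happybase client. It assumes an HBase Thrift gateway on localhost and an existing 'patient_events' table with an 'event' column family; the row-key layout and column names are invented, not the platform's actual schema.

```python
import happybase

# Assumes an HBase Thrift server on localhost:9090 and an existing table
# 'patient_events' with column family 'event'; names are illustrative only.
connection = happybase.Connection('localhost', port=9090)
table = connection.table('patient_events')

# Patient-centric row key: patient id plus event timestamp, so a single scan
# with a row prefix returns the full clinical event history of one patient.
row_key = b'patient:000123|2017-01-01T08:30:00'
table.put(row_key, {
    b'event:type': b'lab_result',
    b'event:code': b'HBA1C',
    b'event:value': b'7.2',
})

# Retrieve every event for the patient with one prefix scan.
for key, data in table.scan(row_prefix=b'patient:000123|'):
    print(key, data[b'event:type'], data[b'event:value'])
```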
Profiling a Mind Map User: A Descriptive Appraisal
ERIC Educational Resources Information Center
Tucker, Joanne M.; Armstrong, Gary R.; Massad, Victor J.
2010-01-01
Whether manually or through the use of software, a non-linear information organization framework known as mind mapping offers an alternative method for capturing thoughts, ideas and information to linear thinking modes such as outlining. Mind mapping is brainstorming, organizing, and problem solving. This paper examines mind mapping techniques,…
Valentijn, Pim P; Bruijnzeels, Marc A; de Leeuw, Rob J; Schrijvers, Guus J.P
2012-01-01
Purpose: Capacity problems and political pressures have led to a rapid change in the organization of primary care from monodisciplinary small businesses to complex inter-organizational relationships. It is assumed that inter-organizational collaboration is the driving force to achieve integrated (primary) care. Despite the importance of collaboration and integration of services in primary care, there is no unambiguous definition of either concept. The purpose of this study is to examine and link the conceptualisation and validation of the terms inter-organizational collaboration and integrated primary care using a theoretical framework. Theory: The theoretical framework is based on the complex collaboration process of negotiation among multiple stakeholder groups in primary care. Methods: A literature review of health sciences and business databases, and targeted grey literature sources. Based on the literature review we operationalized the constructs of inter-organizational collaboration and integrated primary care in a theoretical framework. The framework is being validated in an explorative study of 80 primary care projects in the Netherlands. Results and conclusions: Integrated primary care is considered as a multidimensional construct based on a continuum of integration, extending from segregation to integration. The synthesis of the current theories and concepts of inter-organizational collaboration is insufficient to deal with the complexity of collaborative issues in primary care. One coherent and integrated theoretical framework was found that could make the complex collaboration process in primary care transparent. The theoretical framework presented in this study is a first step to understanding the patterns of successful collaboration and integration in primary care services. These patterns can give insights into the organizational forms needed to create a well-working integrated (primary) care system that fits the local needs of a population. Preliminary data on the patterns of collaboration and integration will be presented.
MROrchestrator: A Fine-Grained Resource Orchestration Framework for MapReduce Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Bikash; Prabhakar, Ramya; Kandemir, Mahmut
2012-01-01
Efficient resource management in data centers and clouds running large distributed data processing frameworks like MapReduce is crucial for enhancing the performance of hosted applications and boosting resource utilization. However, existing resource scheduling schemes in Hadoop MapReduce allocate resources at the granularity of fixed-size, static portions of nodes, called slots. In this work, we show that MapReduce jobs have widely varying demands for multiple resources, making the static and fixed-size slot-level resource allocation a poor choice both from the performance and resource utilization standpoints. Furthermore, lack of co-ordination in the management of multiple resources across nodes prevents dynamic slot reconfiguration, and leads to resource contention. Motivated by this, we propose MROrchestrator, a MapReduce resource Orchestrator framework, which can dynamically identify resource bottlenecks, and resolve them through fine-grained, co-ordinated, and on-demand resource allocations. We have implemented MROrchestrator on two 24-node native and virtualized Hadoop clusters. Experimental results with a suite of representative MapReduce benchmarks demonstrate up to 38% reduction in job completion times, and up to 25% increase in resource utilization. We further show how popular resource managers like NGM and Mesos, when augmented with MROrchestrator, can hike up their performance.
A hybrid BAC physical map of potato: a framework for sequencing a heterozygous genome
2011-01-01
Background: Potato is the world's third most important food crop, yet cultivar improvement and genomic research in general remain difficult because of the heterozygous and tetraploid nature of its genome. The development of physical map resources that can facilitate genomic analyses in potato has so far been very limited. Here we present the methods of construction and the general statistics of the first two genome-wide BAC physical maps of potato, which were made from the heterozygous diploid clone RH89-039-16 (RH). Results: First, a gel electrophoresis-based physical map was made by AFLP fingerprinting of 64478 BAC clones, which were aligned into 4150 contigs with an estimated total length of 1361 Mb. Screening of BAC pools, followed by the KeyMaps in silico anchoring procedure, identified 1725 AFLP markers in the physical map, and 1252 BAC contigs were anchored to the ultradense potato genetic map. A second, sequence-tag-based physical map was constructed from 65919 whole genome profiling (WGP) BAC fingerprints and these were aligned into 3601 BAC contigs spanning 1396 Mb. The 39733 BAC clones that overlap between both physical maps provided anchors to 1127 contigs in the WGP physical map, and reduced the number of contigs to around 2800 in each map separately. Both physical maps were 1.64 times longer than the 850 Mb potato genome. Genome heterozygosity and incomplete merging of BAC contigs are two factors that can explain this map inflation. The contig information of both physical maps was united in a single table that describes the hybrid potato physical map. Conclusions: The AFLP physical map has already been used by the Potato Genome Sequencing Consortium for sequencing 10% of the heterozygous genome of clone RH on a BAC-by-BAC basis. By layering a new WGP physical map on top of the AFLP physical map, a genetically anchored genome-wide framework of 322434 sequence tags has been created. This reference framework can be used for anchoring and ordering of genomic sequences of clone RH (and other potato genotypes), and opens the possibility to finish sequencing of the RH genome in a more efficient way via high throughput next generation approaches. PMID:22142254
Developing a Mobile Social Media Framework for Creative Pedagogies
ERIC Educational Resources Information Center
Cochrane, Thomas; Antonczak, Laurent; Guinibert, Matthew; Mulrennan, Danni
2014-01-01
This paper explores an overview of an evolving framework to enable creative pedagogies as applied to three different higher education contexts. Based upon our experiences, we propose a critical framework for supporting and implementing mobile social media for pedagogical change within higher education. Our framework maps the SAMR educational…
Valentijn, Pim P.; Schepman, Sanneke M.; Opheij, Wilfrid; Bruijnzeels, Marc A.
2013-01-01
Introduction: Primary care has a central role in integrating care within a health system. However, conceptual ambiguity regarding integrated care hampers a systematic understanding. This paper proposes a conceptual framework that combines the concepts of primary care and integrated care, in order to understand the complexity of integrated care. Methods: The search method involved a combination of electronic database searches, hand searches of reference lists (snowball method) and contacting researchers in the field. The process of synthesizing the literature was iterative, to relate the concepts of primary care and integrated care. First, we identified the general principles of primary care and integrated care. Second, we connected the dimensions of integrated care and the principles of primary care. Finally, to improve content validity we held several meetings with researchers in the field to develop and refine our conceptual framework. Results: The conceptual framework combines the functions of primary care with the dimensions of integrated care. Person-focused and population-based care serve as guiding principles for achieving integration across the care continuum. Integration plays complementary roles on the micro (clinical integration), meso (professional and organisational integration) and macro (system integration) level. Functional and normative integration ensure connectivity between the levels. Discussion: The presented conceptual framework is a first step to achieve a better understanding of the inter-relationships among the dimensions of integrated care from a primary care perspective. PMID:23687482
Valentijn, Pim P; Schepman, Sanneke M; Opheij, Wilfrid; Bruijnzeels, Marc A
2013-01-01
Primary care has a central role in integrating care within a health system. However, conceptual ambiguity regarding integrated care hampers a systematic understanding. This paper proposes a conceptual framework that combines the concepts of primary care and integrated care, in order to understand the complexity of integrated care. The search method involved a combination of electronic database searches, hand searches of reference lists (snowball method) and contacting researchers in the field. The process of synthesizing the literature was iterative, to relate the concepts of primary care and integrated care. First, we identified the general principles of primary care and integrated care. Second, we connected the dimensions of integrated care and the principles of primary care. Finally, to improve content validity we held several meetings with researchers in the field to develop and refine our conceptual framework. The conceptual framework combines the functions of primary care with the dimensions of integrated care. Person-focused and population-based care serve as guiding principles for achieving integration across the care continuum. Integration plays complementary roles on the micro (clinical integration), meso (professional and organisational integration) and macro (system integration) level. Functional and normative integration ensure connectivity between the levels. The presented conceptual framework is a first step to achieve a better understanding of the inter-relationships among the dimensions of integrated care from a primary care perspective.
MacNab, Ying C
2016-08-01
This paper is concerned with multivariate conditional autoregressive models defined by linear combinations of independent or correlated underlying spatial processes. Known as the linear model of coregionalization, the method offers a systematic and unified approach for formulating multivariate extensions to a broad range of univariate conditional autoregressive models. The resulting multivariate spatial models represent classes of coregionalized multivariate conditional autoregressive models that enable flexible modelling of multivariate spatial interactions, yielding coregionalization models with symmetric or asymmetric cross-covariances of different spatial variation and smoothness. In the context of multivariate disease mapping, for example, they facilitate borrowing strength both over space and across variables, allowing for more flexible multivariate spatial smoothing. Specifically, we present a broadened coregionalization framework to include order-dependent, order-free, and order-robust multivariate models; a new class of order-free coregionalized multivariate conditional autoregressives is introduced. We tackle computational challenges and present solutions that are integral for Bayesian analysis of these models. We also discuss two ways of computing the deviance information criterion for comparison among competing hierarchical models with or without unidentifiable prior parameters. The models and related methodology are developed in the broad context of modelling multivariate data on a spatial lattice and illustrated in the context of multivariate disease mapping. The coregionalization framework and related methods also present a general approach for building spatially structured cross-covariance functions for multivariate geostatistics. © The Author(s) 2016.
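A minimal numpy sketch of the underlying linear-model-of-coregionalization construction with independent latent components (an order-free flavour): two univariate proper CAR covariances with different spatial autocorrelation are mixed through a coregionalization matrix. The lattice, parameters and mixing matrix are invented; the paper's order-dependent and order-robust variants are not shown.

```python
import numpy as np

# Adjacency matrix of a small 4-areal-unit lattice (a path graph) and its degree matrix.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = np.diag(W.sum(axis=1))

def car_covariance(rho, tau=1.0):
    """Covariance of a proper univariate CAR process with precision tau*(D - rho*W)."""
    return np.linalg.inv(tau * (D - rho * W))

# Two independent latent CAR components with different spatial autocorrelation...
Sigma1 = car_covariance(rho=0.9)
Sigma2 = car_covariance(rho=0.3)

# ...mixed through a coregionalization matrix A: y = (A kron I_n) u.
A = np.array([[1.0, 0.0],
              [0.6, 0.8]])
n = W.shape[0]
blk = np.zeros((2 * n, 2 * n))
blk[:n, :n], blk[n:, n:] = Sigma1, Sigma2
A_kron = np.kron(A, np.eye(n))
Sigma_y = A_kron @ blk @ A_kron.T   # joint covariance of the two mapped variables

# The second variable inherits a mixture of both spatial scales, and the
# cross-covariance block links the two disease maps (symmetric here because
# the latent components are independent).
print(np.allclose(Sigma_y[n:, n:], 0.36 * Sigma1 + 0.64 * Sigma2))  # True
print(Sigma_y[:n, n:])
```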
Combining aesthetic with ecological values for landscape sustainability.
Yang, Dewei; Luo, Tao; Lin, Tao; Qiu, Quanyi; Luo, Yunjian
2014-01-01
Humans receive multiple benefits from various landscapes that foster ecological services and aesthetic attractiveness. In this study, a hybrid framework was proposed to evaluate the ecological and aesthetic values of five landscape types in the Houguanhu Region of central China. Data from a public aesthetic survey and a professional ecological assessment were converted into a two-dimensional coordinate system and distribution maps of landscape values. Results showed that natural landscapes (i.e. water body and forest) contributed more positively to both aesthetic and ecological values than semi-natural and human-dominated landscapes (i.e. farmland and non-ecological land). The distribution maps of landscape values indicated that the aesthetic, ecological and integrated landscape values were significantly associated with landscape attributes and human activity intensity. To combine aesthetic preferences with ecological services, several methods (i.e. field survey, landscape value coefficients, a normalization method, a two-dimensional coordinate system, and landscape value distribution maps) were employed in the landscape assessment. Our results could help identify the underlying structure-function-value chain and improve the understanding of multiple functions in landscape planning. The situational context should also be emphasized to bring ecological and aesthetic goals into better alignment.
Mapping public policy options responding to obesity: the case of Spain.
González-Zapata, L I; Ortiz-Moncada, R; Alvarez-Dardet, C
2007-05-01
This study assesses the opinions of the main Spanish stakeholders from food and physical exercise policy networks on public policy options for responding to obesity. We followed the multi-criteria mapping methodology in the framework of the European project 'Policy options in responding to obesity' (PorGrow), through structured interviews with 21 stakeholders. A four-step approach was taken: options, criteria, scoring and weighting, obtaining in this way a measure of the performance of each option that integrates qualitative and quantitative information. In an overall analysis, the more popular policy options were those grouped as educational initiatives: include food and health in the school curriculum, improve health education for the general public, improve the training of health professionals in obesity care and prevention, offer incentives to caterers to provide healthier menus, and improve community sports facilities. Fiscal measures such as subsidies and taxes had the lowest support. The criteria assessed as priorities were grouped as efficacy and societal benefits. Obesity in Spain can be approached through public policies, although the process will not be easy or immediate. The feasibility of change requires committed public policymakers to develop long-term actions that take into account the map of options prioritized by the stakeholders.
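The scoring-and-weighting step of multi-criteria mapping reduces, for one stakeholder, to a weighted sum of criterion scores per option; the sketch below uses invented scores and weights purely to illustrate how option rankings are obtained.

```python
# Invented scores: option -> {criterion: score on a 0-100 scale} for one stakeholder.
scores = {
    "school curriculum":          {"efficacy": 80, "societal benefit": 85, "cost": 60},
    "tax on energy-dense food":   {"efficacy": 55, "societal benefit": 40, "cost": 45},
    "community sport facilities": {"efficacy": 70, "societal benefit": 75, "cost": 50},
}
weights = {"efficacy": 0.5, "societal benefit": 0.3, "cost": 0.2}

def performance(option_scores, weights):
    """Weighted-sum performance of one option (weights assumed to sum to 1)."""
    return sum(weights[c] * s for c, s in option_scores.items())

ranked = sorted(scores, key=lambda o: performance(scores[o], weights), reverse=True)
for option in ranked:
    print(option, round(performance(scores[option], weights), 1))
```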
Combining Aesthetic with Ecological Values for Landscape Sustainability
Yang, Dewei; Luo, Tao; Lin, Tao; Qiu, Quanyi; Luo, Yunjian
2014-01-01
Humans receive multiple benefits from various landscapes that foster ecological services and aesthetic attractiveness. In this study, a hybrid framework was proposed to evaluate the ecological and aesthetic values of five landscape types in the Houguanhu Region of central China. Data from a public aesthetic survey and a professional ecological assessment were converted into a two-dimensional coordinate system and distribution maps of landscape values. Results showed that natural landscapes (i.e. water body and forest) contributed more positively to both aesthetic and ecological values than semi-natural and human-dominated landscapes (i.e. farmland and non-ecological land). The distribution maps of landscape values indicated that the aesthetic, ecological and integrated landscape values were significantly associated with landscape attributes and human activity intensity. To combine aesthetic preferences with ecological services, several methods (i.e. field survey, landscape value coefficients, a normalization method, a two-dimensional coordinate system, and landscape value distribution maps) were employed in the landscape assessment. Our results could help identify the underlying structure-function-value chain and improve the understanding of multiple functions in landscape planning. The situational context should also be emphasized to bring ecological and aesthetic goals into better alignment. PMID:25050886
Giannini, Tereza C; Tambosi, Leandro R; Acosta, André L; Jaffé, Rodolfo; Saraiva, Antonio M; Imperatriz-Fonseca, Vera L; Metzger, Jean Paul
2015-01-01
Ecosystem services provided by mobile agents are increasingly threatened by the loss and modification of natural habitats and by climate change, risking the maintenance of biodiversity, ecosystem functions, and human welfare. Research oriented towards a better understanding of the joint effects of land use and climate change on the provision of specific ecosystem services is therefore essential to safeguard such services. Here we propose a methodological framework, which integrates species distribution forecasts and graph theory to identify key conservation areas that, if protected or restored, could improve habitat connectivity and safeguard ecosystem services. We applied the proposed framework to the provision of pollination services by a tropical stingless bee (Melipona quadrifasciata), a key pollinator of native flora from the Brazilian Atlantic Forest and of important agricultural crops. Based on the current distribution of this bee and that of the plant species it uses for food and nesting, we projected the joint distribution of bees and plants in the future, considering a moderate climate change scenario (following the IPCC). We then used this information, the bee's flight range, and the current mapping of Atlantic Forest remnants to infer habitat suitability and quantify local and regional habitat connectivity for 2030, 2050 and 2080. Our results revealed north-to-south and coastal-to-inland shifts in the pollinator distribution over the next 70 years. Current and future connectivity maps revealed the most important corridors, which if protected or restored, could facilitate the dispersal and establishment of bees during distribution shifts. Our results also suggest that coffee plantations in eastern São Paulo and southern Minas Gerais States could suffer a pollinator deficit in the future, whereas pollination services seem to be secured in southern Brazil. Landowners and governmental agencies could use this information to implement new land use schemes. Overall, our proposed methodological framework could help design novel conservation and agricultural practices that can be crucial to conserve ecosystem services by buffering the joint effect of habitat configuration and climate change.
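A minimal sketch of the graph-theoretic connectivity idea summarized above, assuming habitat patches become graph nodes linked whenever they lie within the pollinator's flight range; the patch coordinates and the flight-range value are illustrative assumptions, not the study's parameters.

```python
# Hedged sketch: patch-level habitat connectivity as a graph (synthetic patches).
import math
import networkx as nx

patches = {                      # habitat patch id -> (x_km, y_km), invented
    "A": (0.0, 0.0), "B": (1.2, 0.4), "C": (2.5, 0.3), "D": (5.0, 1.0),
}
flight_range_km = 1.5            # assumed dispersal/flight range

G = nx.Graph()
G.add_nodes_from(patches)
for p, (xp, yp) in patches.items():
    for q, (xq, yq) in patches.items():
        if p < q and math.hypot(xp - xq, yp - yq) <= flight_range_km:
            G.add_edge(p, q)

# Patches with high betweenness act as stepping stones (candidate corridors);
# connected components show which patch clusters are mutually reachable.
print(nx.betweenness_centrality(G))
print(list(nx.connected_components(G)))
```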
A Flexible Socioeconomic Scenarios Framework for the Study of Plausible Arctic Futures
NASA Astrophysics Data System (ADS)
Reissell, A. K.; Peters, G. P.; Riahi, K.; Kroglund, M.; Lovecraft, A. L.; Nilsson, A. E.; Preston, B. L.; van Ruijven, B. J.
2016-12-01
Future developments of the Arctic region are associated with different drivers of change - climate, environmental, and socio-economic - and their interactions, and are highly uncertain. This uncertainty poses challenges for decision-making, calling for the development of new analytical frameworks. Scenarios - coherent narratives describing potential futures, pathways to those futures, and drivers of change along the way - can be used to explore the consequences of the key uncertainties, particularly in the long term. In a participatory scenarios workshop, we used both top-down and bottom-up approaches to develop a flexible socioeconomic scenarios framework. The top-down approach was linked to the global Integrated Assessment Modeling framework and its Shared Socio-Economic Pathways (SSPs), developing an Arctic extension of the set of five storylines on the main socioeconomic uncertainties in global climate change research. The bottom-up approach included participatory development of narratives originating from within the Arctic region. For the extension of the global SSPs to the regional level, we compared the key elements in the global SSPs (Population, Human Development, Economy & Lifestyle, Policies & Institutions, Technology, and Environment & Natural Resources) with key elements in the Arctic. Additional key elements for the Arctic scenarios include, for example, seasonal migration, the large role of traditional knowledge and culture, the mixed economy, the nested governance structure, human and environmental security, and the quality of infrastructure. The bottom-up results suggested that scenarios developed independently of the SSPs could be mapped back to the SSPs to demonstrate consistency with respect to representing similar boundary conditions. The two approaches are complementary: the top-down approach can be used to set the global socio-economic and climate boundary conditions, while the bottom-up approach provides the regional context. One key uncertainty and driving force is the demand for resources (global or regional), which was mapped against the role of governance as well as adaptive and transformative capacity among actors within the Arctic. Resource demand has a significant influence on the society, culture, economy and environment of the Arctic.
Public health program capacity for sustainability: a new framework
2013-01-01
Background Public health programs can only deliver benefits if they are able to sustain activities over time. There is a broad literature on program sustainability in public health, but it is fragmented and there is a lack of consensus on core constructs. The purpose of this paper is to present a new conceptual framework for program sustainability in public health. Methods This developmental study uses a comprehensive literature review, input from an expert panel, and the results of concept-mapping to identify the core domains of a conceptual framework for public health program capacity for sustainability. The concept-mapping process included three types of participants (scientists, funders, and practitioners) from several public health areas (e.g., tobacco control, heart disease and stroke, physical activity and nutrition, and injury prevention). Results The literature review identified 85 relevant studies focusing on program sustainability in public health. Most of the papers described empirical studies of prevention-oriented programs aimed at the community level. The concept-mapping process identified nine core domains that affect a program’s capacity for sustainability: Political Support, Funding Stability, Partnerships, Organizational Capacity, Program Evaluation, Program Adaptation, Communications, Public Health Impacts, and Strategic Planning. Concept-mapping participants further identified 93 items across these domains that have strong face validity—89% of the individual items composing the framework had specific support in the sustainability literature. Conclusions The sustainability framework presented here suggests that a number of selected factors may be related to a program’s ability to sustain its activities and benefits over time. These factors have been discussed in the literature, but this framework synthesizes and combines the factors and suggests how they may be interrelated with one another. The framework presents domains for public health decision makers to consider when developing and implementing prevention and intervention programs. The sustainability framework will be useful for public health decision makers, program managers, program evaluators, and dissemination and implementation researchers. PMID:23375082
Public health program capacity for sustainability: a new framework.
Schell, Sarah F; Luke, Douglas A; Schooley, Michael W; Elliott, Michael B; Herbers, Stephanie H; Mueller, Nancy B; Bunger, Alicia C
2013-02-01
Public health programs can only deliver benefits if they are able to sustain activities over time. There is a broad literature on program sustainability in public health, but it is fragmented and there is a lack of consensus on core constructs. The purpose of this paper is to present a new conceptual framework for program sustainability in public health. This developmental study uses a comprehensive literature review, input from an expert panel, and the results of concept-mapping to identify the core domains of a conceptual framework for public health program capacity for sustainability. The concept-mapping process included three types of participants (scientists, funders, and practitioners) from several public health areas (e.g., tobacco control, heart disease and stroke, physical activity and nutrition, and injury prevention). The literature review identified 85 relevant studies focusing on program sustainability in public health. Most of the papers described empirical studies of prevention-oriented programs aimed at the community level. The concept-mapping process identified nine core domains that affect a program's capacity for sustainability: Political Support, Funding Stability, Partnerships, Organizational Capacity, Program Evaluation, Program Adaptation, Communications, Public Health Impacts, and Strategic Planning. Concept-mapping participants further identified 93 items across these domains that have strong face validity-89% of the individual items composing the framework had specific support in the sustainability literature. The sustainability framework presented here suggests that a number of selected factors may be related to a program's ability to sustain its activities and benefits over time. These factors have been discussed in the literature, but this framework synthesizes and combines the factors and suggests how they may be interrelated with one another. The framework presents domains for public health decision makers to consider when developing and implementing prevention and intervention programs. The sustainability framework will be useful for public health decision makers, program managers, program evaluators, and dissemination and implementation researchers.
Metadata mapping and reuse in caBIG™
Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis
2009-01-01
Background This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG™). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG™ framework or other frameworks that use metadata repositories. Results The Dice (di-grams) and Dynamic algorithms are compared, and both have similar performance in matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding the matches are reasonable for the data models examined, suggesting that automatic mapping of UML models and CDEs is feasible within the caBIG™ framework and potentially within any framework that uses a metadata repository. Conclusion This work opens up the possibility of using mapping algorithms to reduce the cost and time required to map local data models to a reference data model such as those used within caBIG™. This effort contributes to facilitating the development of interoperable systems within caBIG™ as well as other metadata frameworks. Such efforts are critical to address the need to develop systems to handle the enormous amounts of diverse data that can be leveraged from new biomedical methodologies. PMID:19208192
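For readers unfamiliar with the Dice (di-grams) measure mentioned above, the following is a minimal, hedged sketch of a character-bigram Dice coefficient that could score a UML class-attribute name against a CDE object-property pair; the example strings are hypothetical and this is not the caBIG™ implementation.

```python
# Hedged sketch of a Dice coefficient over character di-grams (bigrams).

def digrams(text: str) -> set:
    """Return the set of character bigrams of a lowercased, lightly normalized string."""
    t = text.lower().replace("_", " ")
    return {t[i:i + 2] for i in range(len(t) - 1)}

def dice(a: str, b: str) -> float:
    """Dice similarity of the two bigram sets (1.0 = identical, 0.0 = disjoint)."""
    da, db = digrams(a), digrams(b)
    if not da and not db:
        return 1.0
    return 2 * len(da & db) / (len(da) + len(db))

# Hypothetical attribute names, only to show the scoring behaviour.
print(dice("patientBirthDate", "Patient Birth Date"))   # high similarity
print(dice("specimenType", "Patient Birth Date"))        # low similarity
```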
NASA Astrophysics Data System (ADS)
Dietrich, Peter; Werban, Ulrike; Sauer, Uta
2010-05-01
High-resolution soil property maps are one major prerequisite for the specific protection of soil functions and the restoration of degraded soils, as well as for sustainable land use, water and environmental management. To generate such maps, the combination of digital soil mapping approaches with remote and proximal soil sensing techniques is most promising. However, a feasible and reliable combination of these technologies for the investigation of large areas (e.g. catchments and landscapes) and the assessment of soil degradation threats is missing. Furthermore, there is insufficient dissemination of knowledge on digital soil mapping and proximal soil sensing in the scientific community, to relevant authorities as well as to prospective users. As one consequence, there is inadequate standardization of techniques. At the poster we present the EU collaborative project iSOIL within the 7th framework program of the European Commission. iSOIL focuses on improving fast and reliable mapping methods for soil properties, soil functions and soil degradation risks. This requires the improvement and integration of advanced soil sampling approaches, geophysical and spectroscopic measuring techniques, as well as pedometric and pedophysical approaches. The focus of the iSOIL project is to develop new, and to improve existing, strategies and innovative methods for generating accurate, high-resolution soil property maps. At the same time, the developments will reduce costs compared to traditional soil mapping. iSOIL tackles these challenges by integrating three major components: (i) high-resolution, non-destructive geophysical (e.g. Electromagnetic Induction, EMI; Ground Penetrating Radar, GPR; magnetics; seismics) and spectroscopic (e.g. Near Surface Infrared, NIR) methods, (ii) concepts of Digital Soil Mapping (DSM) and pedometrics, and (iii) optimized soil sampling with respect to profound soil scientific and (geo)statistical strategies. A special focus of iSOIL lies on the sustainable dissemination of the technologies and concepts developed in the project through workshops for stakeholders and the publication of a handbook "Methods and Technologies for Mapping of Soil Properties, Function and Threat Risks". In addition, the CEN Workshop offers a new mechanism and approach to standardization. During the project we decided that the topic of the CEN Workshop should focus on a voluntary standardization of electromagnetic induction measurements, to ensure that results can be evaluated and processed under uniform circumstances and are comparable. At the poster we will also present the idea and objectives of our CEN Workshop "Best Practice Approach for electromagnetic induction measurements of the near surface" and invite every interested person to participate.
NASA Astrophysics Data System (ADS)
Abedi, Maysam; Norouzi, Gholam-Hossain
2016-04-01
This work presents the promising application of three variants of the TOPSIS method (namely the conventional, adjusted and modified versions) as a straightforward knowledge-driven technique in multi-criteria decision-making processes for the data fusion of a broad exploratory geo-dataset in mineral potential/prospectivity mapping. The method is applied to airborne geophysical data (e.g. potassium radiometry, aeromagnetic and frequency-domain electromagnetic data), surface geological layers (fault and host rock zones), alteration layers extracted from remote sensing satellite imagery, and five evidential attributes from stream sediment geochemical data. The central Iranian volcanic-sedimentary belt in Kerman province in the SE of Iran, which is embedded in the Urumieh-Dokhtar Magmatic Assemblage arc (UDMA), is chosen for integrating the broad evidential layers in the prospective region. The study area has a high potential for ore mineral occurrences, especially porphyry copper/molybdenum, and the generated mineral potential maps aim to outline new prospect zones for further investigation. Two evidential layers, the downward-continued aeromagnetic data and its analytic signal filter, are prepared for the fusion process as geophysically plausible footprints of porphyry-type mineralization. The low values of the apparent resistivity layer calculated from the airborne frequency-domain electromagnetic data are also used as an electrical criterion in this investigation. Four remote sensing evidential layers of argillic, phyllic, propylitic and hydroxyl alterations were extracted from ASTER images in order to map the altered areas associated with porphyry-type deposits, whilst ETM+ satellite imagery was used to map an iron oxide layer. Since potassium alteration is generally the mainstay of porphyry ore mineralization, the airborne potassium radiometry data were used as well. The geochemical layers of the Cu/B/Pb/Zn elements and the first component of a PCA analysis were considered as powerful traces for preparing the final maps. The conventional, adjusted and modified variants of the TOPSIS method produced three mineral potential maps, in which the high-potential zones adequately match formerly worked and currently active mines in the region.
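A minimal sketch of the conventional TOPSIS ranking referred to above, applied to a synthetic decision matrix; the weights and values are placeholders rather than the study's evidential layers, and all criteria are treated as benefit-type for simplicity.

```python
# Hedged sketch of conventional TOPSIS on a tiny synthetic decision matrix.
import numpy as np

# rows = candidate zones, columns = criteria (all treated as "benefit" criteria)
X = np.array([[0.7, 0.4, 0.9],
              [0.2, 0.8, 0.5],
              [0.9, 0.6, 0.3]], dtype=float)
w = np.array([0.5, 0.3, 0.2])          # assumed criterion weights

R = X / np.linalg.norm(X, axis=0)      # vector-normalize each criterion column
V = R * w                              # weighted normalized decision matrix
ideal, anti = V.max(axis=0), V.min(axis=0)

d_plus  = np.linalg.norm(V - ideal, axis=1)   # distance to the ideal solution
d_minus = np.linalg.norm(V - anti,  axis=1)   # distance to the anti-ideal solution
closeness = d_minus / (d_plus + d_minus)      # higher = more prospective

print(np.argsort(closeness)[::-1])     # ranking of zones, best first
```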
On set-valued functionals: Multivariate risk measures and Aumann integrals
NASA Astrophysics Data System (ADS)
Ararat, Cagin
In this dissertation, multivariate risk measures for random vectors and Aumann integrals of set-valued functions are studied. Both are set-valued functionals with values in a complete lattice of subsets of Rm. Multivariate risk measures are considered in a general d-asset financial market with trading opportunities in discrete time. Specifically, the following features of the market are incorporated in the evaluation of multivariate risk: convex transaction costs modeled by solvency regions, intermediate trading constraints modeled by convex random sets, and the requirement of liquidation into the first m ≤ d of the assets. It is assumed that the investor has a "pure" multivariate risk measure R on the space of m-dimensional random vectors which represents her risk attitude towards the assets but does not take into account the frictions of the market. Then, the investor with a d-dimensional position minimizes the set-valued functional R over all m-dimensional positions that she can reach by trading in the market subject to the frictions described above. The resulting functional Rmar on the space of d-dimensional random vectors is another multivariate risk measure, called the market-extension of R. A dual representation for R mar that decomposes the effects of R and the frictions of the market is proved. Next, multivariate risk measures are studied in a utility-based framework. It is assumed that the investor has a complete risk preference towards each individual asset, which can be represented by a von Neumann-Morgenstern utility function. Then, an incomplete preference is considered for multivariate positions which is represented by the vector of the individual utility functions. Under this structure, multivariate shortfall and divergence risk measures are defined as the optimal values of set minimization problems. The dual relationship between the two classes of multivariate risk measures is constructed via a recent Lagrange duality for set optimization. In particular, it is shown that a shortfall risk measure can be written as an intersection over a family of divergence risk measures indexed by a scalarization parameter. Examples include the multivariate versions of the entropic risk measure and the average value at risk. In the second part, Aumann integrals of set-valued functions on a measurable space are viewed as set-valued functionals and a Daniell-Stone type characterization theorem is proved for such functionals. More precisely, it is shown that a functional that maps measurable set-valued functions into a certain complete lattice of subsets of Rm can be written as the Aumann integral with respect to a measure if and only if the functional is (1) additive and (2) positively homogeneous, (3) it preserves decreasing limits, (4) it maps halfspace-valued functions to halfspaces, and (5) it maps shifted cone-valued functions to shifted cones. While the first three properties already exist in the classical Daniell-Stone theorem for the Lebesgue integral, the last two properties are peculiar to the set-valued framework and they suffice to complement the first three properties to identify a set-valued functional as the Aumann integral with respect to a measure.
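One compact way to write the market-extension described in this abstract, under its own notation; the symbol for the set of reachable positions is an illustrative choice, and a closure may or may not be required depending on the complete lattice chosen, so this is a sketch rather than the dissertation's definition.

```latex
% Hedged sketch of the market-extension: \mathcal{A}(X) denotes the set of
% m-dimensional positions reachable from the d-dimensional position X under the
% market frictions (transaction costs, trading constraints, liquidation).
R^{\mathrm{mar}}(X) \;=\; \bigcup_{Y \in \mathcal{A}(X)} R(Y)
```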
Vink, Sylvia; van Tartwijk, Jan; Verloop, Nico; Gosselink, Manon; Driessen, Erik; Bolk, Jan
2016-08-01
To determine the content of integrated curricula, clinical concepts and the underlying basic science concepts need to be made explicit. Preconstructed concept maps are recommended for this purpose. They are mainly constructed by experts. However, concept maps constructed by residents are hypothesized to be less complex and to reveal more tacit basic science concepts, and these basic science concepts are expected to be used for the organization of the maps. These hypotheses are derived from studies of knowledge development in individuals. However, integrated curricula require a high degree of cooperation between clinicians and basic scientists. This study examined whether there are consistent variations regarding the articulation of integration when groups of experienced clinicians and basic scientists and groups of residents and basic scientists-in-training construct concept maps. Seven groups of three clinicians and basic scientists at the experienced level and seven such groups at the resident level constructed concept maps illuminating clinical problems. They were guided by instructions that focused them on the articulation of integration. The concept maps were analysed using features that described integration. Descriptive statistics showed consistent variations between the two expertise levels. The concept maps of the resident groups exceeded those of the experienced groups in articulated integration. First, they used significantly more links between clinical and basic science concepts. Second, these links connected basic science concepts with a greater variety of clinical concepts than in the experienced groups. Third, although residents did not use significantly more basic science concepts, they used them significantly more frequently to organize the clinical concepts. The conclusion was drawn that not all hypotheses could be confirmed and that the resident concept maps were more elaborate than expected. This article discusses the implications for the role that residents and basic scientists-in-training might play in the construction of preconstructed concept maps and the development of integrated curricula.
Okubo, Chris H.; Gaither, Tenielle A.
2017-05-12
This map product contains a set of three 1:18,000-scale maps showing the geology and structure of study areas in the western Candor Chasma region of Valles Marineris, Mars. These maps are part of an informal series of large-scale maps and map-based topical studies aimed at refining current understanding of the geologic history of western Candor Chasma. The map bases consist of digital elevation models and orthorectified images derived from High Resolution Imaging Science Experiment (HiRISE) data. These maps are accompanied by geologic cross sections, colorized elevation maps, and cutouts of HiRISE images showing key superposition relations. Also included in this product is a Correlation of Map Units that integrates units across all three map areas, as well as an integrated Description of Map Units and an integrated Explanation of Map Symbols. The maps were assembled using ArcGIS software produced by Environmental Systems Research Institute (http://www.esri.com). The ArcGIS projects and databases associated with each map are included online as supplemental data.
Development of a prototype spatial information processing system for hydrologic research
NASA Technical Reports Server (NTRS)
Sircar, Jayanta K.
1991-01-01
Significant advances have been made in the last decade in the areas of Geographic Information Systems (GIS) and spatial analysis technology, both in hardware and software. Science user requirements are so problem-specific that currently no single system can satisfy all of the needs. The work presented here forms part of a conceptual framework for an all-encompassing science-user workstation system. While the definition and development of the system as a whole will take several years, it is intended that small-scale projects such as the current work will address some of the more short-term needs. Such projects provide a quick mechanism to integrate tools into the workstation environment, forming a larger, more complete hydrologic analysis platform. Described here are two components that are very important to the practical use of remote sensing and digital map data in hydrology: a graph-theoretic technique to rasterize elevation contour maps, and a system to manipulate synthetic aperture radar (SAR) data files and extract soil moisture data.
Undular bore theory for the Gardner equation
NASA Astrophysics Data System (ADS)
Kamchatnov, A. M.; Kuo, Y.-H.; Lin, T.-C.; Horng, T.-L.; Gou, S.-C.; Clift, R.; El, G. A.; Grimshaw, R. H. J.
2012-09-01
We develop modulation theory for undular bores (dispersive shock waves) in the framework of the Gardner, or extended Korteweg-de Vries (KdV), equation, which is a generic mathematical model for weakly nonlinear and weakly dispersive wave propagation, when effects of higher order nonlinearity become important. Using a reduced version of the finite-gap integration method we derive the Gardner-Whitham modulation system in a Riemann invariant form and show that it can be mapped onto the well-known modulation system for the Korteweg-de Vries equation. The transformation between the two counterpart modulation systems is, however, not invertible. As a result, the study of the resolution of an initial discontinuity for the Gardner equation reveals a rich phenomenology of solutions which, along with the KdV-type simple undular bores, include nonlinear trigonometric bores, solibores, rarefaction waves, and composite solutions representing various combinations of the above structures. We construct full parametric maps of such solutions for both signs of the cubic nonlinear term in the Gardner equation. Our classification is supported by numerical simulations.
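For reference, one common normalization of the Gardner (extended KdV) equation discussed above is sketched below; coefficient conventions vary across the literature, and here a sign parameter stands in for the sign of the cubic nonlinear term mentioned in the abstract.

```latex
% One common normalization of the Gardner (extended Korteweg-de Vries) equation;
% \alpha = \pm 1 selects the sign of the cubic nonlinear term.
u_t + 6\left(u + \alpha\,u^{2}\right)u_x + u_{xxx} = 0
```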
PRoViScout: a planetary scouting rover demonstrator
NASA Astrophysics Data System (ADS)
Paar, Gerhard; Woods, Mark; Gimkiewicz, Christiane; Labrosse, Frédéric; Medina, Alberto; Tyler, Laurence; Barnes, David P.; Fritz, Gerald; Kapellos, Konstantinos
2012-01-01
Mobile systems exploring planetary surfaces in the future will require more autonomy than today's. The EU FP7-SPACE project PRoViScout (2010-2012) establishes the building blocks of such autonomous exploration systems in terms of robotic vision through a decision-based combination of navigation and scientific target selection, and integrates them into a framework ready for and exposed to field demonstration. The PRoViScout on-board system consists of mission management components such as an Executive, a Mars Mission On-Board Planner and Scheduler, a Science Assessment Module, and Navigation & Vision Processing modules. The platform hardware consists of the rover with its sensors and pointing devices. We report on the major building blocks and their functions & interfaces, with emphasis on the computer vision parts such as image acquisition (using a novel zoomed 3D Time-of-Flight & RGB camera), mapping from 3D-TOF data, panoramic image & stereo reconstruction, hazard and slope maps, visual odometry, and the recognition of potentially scientifically interesting targets.
The Applications of Model-Based Geostatistics in Helminth Epidemiology and Control
Magalhães, Ricardo J. Soares; Clements, Archie C.A.; Patil, Anand P.; Gething, Peter W.; Brooker, Simon
2011-01-01
Funding agencies are dedicating substantial resources to tackle helminth infections. Reliable maps of the distribution of helminth infection can assist these efforts by targeting control resources to areas of greatest need. The ability to define the distribution of infection at regional, national and subnational levels has been enhanced greatly by the increased availability of good quality survey data and the use of model-based geostatistics (MBG), enabling spatial prediction in unsampled locations. A major advantage of MBG risk mapping approaches is that they provide a flexible statistical platform for handling and representing different sources of uncertainty, providing plausible and robust information on the spatial distribution of infections to inform the design and implementation of control programmes. Focussing on schistosomiasis and soil-transmitted helminthiasis, with additional examples for lymphatic filariasis and onchocerciasis, we review the progress made to date with the application of MBG tools in large-scale, real-world control programmes and propose a general framework for their application to inform integrative spatial planning of helminth disease control programmes. PMID:21295680
Novel and efficient tag SNPs selection algorithms.
Chen, Wen-Pei; Hung, Che-Lun; Tsai, Suh-Jen Jane; Lin, Yaw-Ling
2014-01-01
SNPs are the most abundant form of genetic variation among species; association studies between complex diseases and SNPs or haplotypes have received great attention. However, these studies are restricted by the cost of genotyping all SNPs; thus, it is necessary to find smaller subsets, or tag SNPs, that represent the rest of the SNPs. In fact, the existing tag SNP selection algorithms are notoriously time-consuming. An efficient algorithm for tag SNP selection is presented and applied to analyze the HapMap YRI data. The experimental results show that the proposed algorithm achieves better performance than the existing tag SNP selection algorithms; in most cases, it is at least ten times faster than the existing methods. In many cases, when the redundant ratio of the block is high, the proposed algorithm can even be thousands of times faster than previously known methods. Tools and web services for haplotype block analysis, integrated via the Hadoop MapReduce framework, have also been developed using the proposed algorithm as the computation kernel.
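As a rough illustration of the tag SNP selection problem described above (not the paper's algorithm), the sketch below greedily picks tag SNPs so that every SNP in a toy LD structure is represented by at least one chosen tag; the SNP names and coverage sets are invented.

```python
# Hedged sketch of greedy tag SNP selection over a toy linkage-disequilibrium (LD) map.

def greedy_tag_snps(coverage: dict) -> list:
    """coverage: snp_id -> set of snp_ids it represents (including itself)."""
    uncovered = set().union(*coverage.values())
    tags = []
    while uncovered:
        # Pick the SNP that represents the largest number of still-uncovered SNPs.
        best = max(coverage, key=lambda s: len(coverage[s] & uncovered))
        tags.append(best)
        uncovered -= coverage[best]
    return tags

# Toy LD structure: SNP1 tags SNP1-3, SNP4 tags SNP4-5, SNP6 only tags itself.
ld = {
    "SNP1": {"SNP1", "SNP2", "SNP3"},
    "SNP2": {"SNP1", "SNP2"},
    "SNP3": {"SNP1", "SNP3"},
    "SNP4": {"SNP4", "SNP5"},
    "SNP5": {"SNP4", "SNP5"},
    "SNP6": {"SNP6"},
}
print(greedy_tag_snps(ld))   # a small set of tags covering all SNPs
```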
NASA Astrophysics Data System (ADS)
Mongeau, R.; Baudouin, Y.; Cavayas, F.
2017-10-01
Ville de Montreal wanted to develop a system to identify heat islands and microparticles at the urban scale and to study their formation. The UQAM and UdeM universities joined their expertise under the framework "Observatoire Spatial Urbain" to create a representative geospatial database of thermal and atmospheric parameters collected during the summer months. They developed an innovative methodology for processing high-resolution hyperspectral images (1-2 m). In partnership with Ville de Montreal, they integrated 3D geospatial data (topography, transportation and meteorology) into the process. The 3D mapping of intra-urban heat islands as well as airborne micro-particles makes it possible, as a first step, to identify problematic situations for future civil protection interventions during extreme heat. Moreover, it will serve as a reference for Ville de Montreal in establishing a strategy for tree planting on the public domain and in the analysis of urban development projects.
Influences on infant speech processing: toward a new synthesis.
Werker, J F; Tees, R C
1999-01-01
To comprehend and produce language, we must be able to recognize the sound patterns of our language and the rules for how these sounds "map on" to meaning. Human infants are born with a remarkable array of perceptual sensitivities that allow them to detect the basic properties that are common to the world's languages. During the first year of life, these sensitivities undergo modification reflecting an exquisite tuning to just that phonological information that is needed to map sound to meaning in the native language. We review this transition from language-general to language-specific perceptual sensitivity that occurs during the first year of life and consider whether the changes propel the child into word learning. To account for the broad-based initial sensitivities and subsequent reorganizations, we offer an integrated transactional framework based on the notion of a specialized perceptual-motor system that has evolved to serve human speech, but which functions in concert with other developing abilities. In so doing, we highlight the links between infant speech perception, babbling, and word learning.
[Discrimination and homophobia associated to the human immunodeficiency virus epidemic].
Orozco-Núñez, Emanuel; Alcalde-Rabanal, Jacqueline Elizabeth; Ruiz-Larios, José Arturo; Sucilla-Pérez, Héctor; García-Cerde, Rodrigo
2015-01-01
To describe a political mapping of discrimination and homophobia associated with human immunodeficiency virus (HIV) in the context of public institutions in Mexico. The political mapping was conducted in six Mexican states. Stakeholders involved in HIV actions from the public and private sectors were included. Semi-structured interviews were applied to explore homophobia and discrimination associated with HIV. Information was systematized using the Policy Maker software, which provides good support for analyzing health policies. Discriminatory and homophobic practices occurred in the public domain, damaging people's integrity via insults, derision and hate crimes. Most stakeholders expressed a supportive position towards preventing discrimination and homophobia, and some of them had great influence on policy-making decisions. It was found that state policy frameworks are less specific in addressing these issues. Homophobia and discrimination associated with HIV are still considered problematic in Mexico. Homophobia is a very sensitive issue that requires further attention. Moreover, an effective exercise of governmental authority requires greater enforcement of laws against discrimination and homophobia.
Verheyen, K.; Guntenspergen, Glenn R.; Biesbrouck, B.; Hermy, M.
2003-01-01
A framework that summarizes the direct and indirect effects of past land use on forest herb recolonization is proposed, and used to analyse the colonization patterns of forest understorey herbaceous species in a 360-ha mixed forest, grassland and arable landscape in the Dijle river valley (central Belgium). Fine-scale distribution maps were constructed for 14 species. The species were mapped in 15 946 forest plots and outside forests (along parcel margins) in 5188 plots. Forest stands varied in age between 1 and more than 224 years. Detailed land-use history data were combined with the species distribution maps to identify species-specific colonization sources and to calculate colonization distances. The six most frequent species were selected for more detailed statistical analysis. Logistic regression models indicated that species frequency in forest parcels was a function of secondary forest age, distance from the nearest colonization source and their interaction. Similar age and distance effects were found within hedgerows. In 199 forest stands, data about soils, canopy structure and the cover of competitive species were collected. The relative importance of habitat quality and spatio-temporal isolation for the colonization of the forest herb species was quantified using structural equation modelling (SEM), within the framework proposed for the effects of past land use. The results of the SEM indicate that, except for the better colonizing species, the measured habitat quality variables are of minor importance in explaining colonization patterns, compared with the combination of secondary forest age and distance from colonization sources. Our results suggest the existence of a two-stage colonization process in which diaspore availability determines the initial pattern, which is affected by environmental sorting at later stages.
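A hedged sketch of the kind of logistic regression described above, modelling species presence as a function of secondary forest age, distance to the nearest colonization source and their interaction; the data are synthetic and the coefficients arbitrary, so this only shows the shape of the analysis, not the Dijle valley results.

```python
# Hedged sketch: presence ~ age * distance logistic model on synthetic plot data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
age = rng.uniform(1, 224, n)              # secondary forest age (years), synthetic
dist = rng.uniform(0, 500, n)             # distance to colonization source (m), synthetic
logit_p = -1.5 + 0.02 * age - 0.01 * dist + 0.00002 * age * dist  # arbitrary coefficients
presence = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

df = pd.DataFrame({"presence": presence, "age": age, "dist": dist})
model = smf.logit("presence ~ age * dist", data=df).fit(disp=0)
print(model.summary())
```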
Gäbler, Gabriele; Coenen, Michaela; Lycett, Deborah; Stamm, Tanja
2018-03-03
High-quality, continuous and safe interdisciplinary healthcare is essential. Nutrition and dietetics plays an important part within the interdisciplinary team in many health conditions. In order to work more effectively as an interdisciplinary team, a common terminology is needed. This study investigates which categories of the ICF-Dietetics are used in clinical dietetic care records in Austria and which are most relevant to a shared language in different medical areas. A national multicenter retrospective study was conducted to collect clinical dietetic care documentation reports. The analysis included a "best fit" framework synthesis and a mapping exercise using the ICF Linking Rules. Medical diagnosis and intervention concepts were excluded from the mapping, since they are not supposed to be classified by the ICF. From 100 dietetic records, 307 concepts from 1807 quotations were extracted. Of these, 241 assessment, dietetics diagnosis, goal setting and evaluation concepts were linked to 153 ICF-Dietetics categories. The majority (91.3%) could be mapped to a precise ICF-Dietetics category. The highest number of ICF-Dietetics categories was found in the medical area of diabetes and metabolism and belonged to the ICF component Body Functions, while very few categories were used from the components Participation and Environmental Factors. The integration of the ICF-Dietetics into the nutrition and dietetic care process is possible. Moreover, it could be considered as a conceptual framework for interdisciplinary nutrition and dietetics care. However, a successful implementation of the ICF-Dietetics in clinical practice requires a paradigm shift from medical diagnosis-focused health care to a holistic perspective of functioning, with more attention to Participation and Environmental Factors. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
Addressing the Challenges of Multi-Domain Data Integration with the SemantEco Framework
NASA Astrophysics Data System (ADS)
Patton, E. W.; Seyed, P.; McGuinness, D. L.
2013-12-01
Data integration across multiple domains will continue to be a challenge with the proliferation of big data in the sciences. Data origination issues and how data are manipulated are critical to enable scientists to understand and consume disparate datasets as research becomes more multidisciplinary. We present the SemantEco framework as an exemplar for designing an integrative portal for data discovery, exploration, and interpretation that uses best practice W3C Recommendations. We use the Resource Description Framework (RDF) with extensible ontologies described in the Web Ontology Language (OWL) to provide graph-based data representation. Furthermore, SemantEco ingests data via the software package csv2rdf4lod, which generates data provenance using the W3C provenance recommendation (PROV). Our presentation will discuss benefits and challenges of semantic integration, their effect on runtime performance, and how the SemantEco framework assisted in identifying performance issues and improved query performance across multiple domains by an order of magnitude. SemantEco benefits from a semantic approach that provides an 'open world', which allows data to incrementally change just as it does in the real world. SemantEco modules may load new ontologies and data using the W3C's SPARQL Protocol and RDF Query Language via HTTP. Modules may also provide user interface elements for applications and query capabilities to support new use cases. Modules can associate with domains, which are first-class objects in SemantEco. This enables SemantEco to perform integration and reasoning both within and across domains on module-provided data. The SemantEco framework has been used to construct a web portal for environmental and ecological data. The portal includes water and air quality data from the U.S. Geological Survey (USGS) and Environmental Protection Agency (EPA) and species observation counts for birds and fish from the Avian Knowledge Network and the Santa Barbara Long Term Ecological Research, respectively. We provide regulation ontologies using OWL2 datatype facets to detect out-of-range measurements for environmental standards set by the EPA, i.a. Users adjust queries using module-defined facets and a map presents the resulting measurement sites. Custom icons identify sites that violate regulations, making them easy to locate. Selecting a site gives the option of charting spatially proximate data from different domains over time. Our portal currently provides 1.6 billion triples of scientific data in RDF. We segment data by ZIP code and reasoning over 2157 measurements with our EPA regulation ontology that contains 131 regulations takes 2.5 seconds on a 2.4 GHz Intel Core 2 Quad with 8 GB of RAM. SemantEco's modular design and reasoning capabilities make it an exemplar for building multidisciplinary data integration tools that provide data access to scientists and the general population alike. Its provenance tracking provides accountability and its reasoning services can assist users in interpreting data. Future work includes support for geographical queries using the Open Geospatial Consortium's GeoSPARQL standard.
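As a small illustration of the RDF-plus-SPARQL style of data access described above, the sketch below builds a toy graph and filters measurement sites against an assumed regulatory limit; the vocabulary URIs, analyte and limit are all invented and are not SemantEco's actual ontology or the EPA's values.

```python
# Hedged sketch of RDF data plus a SPARQL query over a toy water-quality graph.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/water#")   # made-up vocabulary for illustration
g = Graph()
site = URIRef(EX["site-001"])
g.add((site, RDF.type, EX.MeasurementSite))
g.add((site, EX.analyte, Literal("arsenic")))
g.add((site, EX.value, Literal(0.012, datatype=XSD.double)))  # mg/L, invented

# Find sites whose measured value exceeds an assumed regulatory limit of 0.010.
q = """
PREFIX ex: <http://example.org/water#>
SELECT ?site ?value WHERE {
    ?site a ex:MeasurementSite ;
          ex:analyte "arsenic" ;
          ex:value ?value .
    FILTER (?value > 0.010)
}
"""
for row in g.query(q):
    print(row.site, row.value)
```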
NASA Astrophysics Data System (ADS)
Karmakar, Mampi; Maiti, Saumen; Singh, Amrita; Ojha, Maheswar; Maity, Bhabani Sankar
2017-07-01
Modeling and classification of subsurface lithology is very important for understanding the evolution of the earth system. However, precise classification and mapping of lithology using a single framework are difficult due to the complexity and nonlinearity of the problem, driven by limited core sample information. Here, we implement a joint approach by combining unsupervised and supervised methods in a single framework for better classification and mapping of rock types. In the unsupervised part, we use principal component analysis (PCA), K-means cluster analysis (K-means), dendrogram analysis, Fuzzy C-means (FCM) cluster analysis and the self-organizing map (SOM). In the supervised part, we use Bayesian neural networks (BNN) optimized by the Hybrid Monte Carlo (HMC) (BNN-HMC) and the scaled conjugate gradient (SCG) (BNN-SCG) techniques. We use P-wave velocity, density, neutron porosity, resistivity and gamma ray logs of well U1343E of Integrated Ocean Drilling Program (IODP) Expedition 323 in the Bering Sea slope region. While the SOM algorithm allows us to visualize the clustering results in the spatial domain, the combined classification schemes (supervised and unsupervised) uncover different patterns of lithology such as clayey-silt, diatom-silt and silty-clay from an un-cored section of the drilled hole. In addition, the BNN approach is capable of estimating the uncertainty in the predictive modeling of the three types of rocks over the entire lithology section at site U1343. The alternating succession of clayey-silt, diatom-silt and silty-clay may be representative of crustal inhomogeneity in general and thus could be a basis for detailed studies related to the productivity of methane gas in the oceans worldwide. Moreover, at 530 m depth below seafloor (DSF), the transition from Pliocene to Pleistocene could be linked to the lithological alternation between the clayey-silt and the diatom-silt. The present results could provide the basis for detailed studies to gain deeper insight into the Bering Sea's sediment deposition and sequence.
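A minimal sketch of the joint unsupervised-plus-supervised idea described above, using K-means to explore structure in synthetic well-log features and a supervised classifier trained on a labelled (cored) interval to predict lithology elsewhere; a random forest stands in here for the Bayesian neural network, and all data are synthetic.

```python
# Hedged sketch: cluster well-log features, then classify the un-cored section.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
# columns stand in for: P-wave velocity, density, neutron porosity, resistivity, gamma ray
logs = rng.normal(size=(1000, 5))
labels_cored = rng.integers(0, 3, size=300)   # lithology classes on the cored interval

# Unsupervised exploration of structure in the logs.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(logs)

# Supervised prediction over the un-cored section (random forest as a stand-in).
clf = RandomForestClassifier(random_state=0).fit(logs[:300], labels_cored)
predicted = clf.predict(logs[300:])

print(np.bincount(clusters), np.bincount(predicted))
```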
Registration of 4D time-series of cardiac images with multichannel Diffeomorphic Demons.
Peyrat, Jean-Marc; Delingette, Hervé; Sermesant, Maxime; Pennec, Xavier; Xu, Chenyang; Ayache, Nicholas
2008-01-01
In this paper, we propose a generic framework for intersubject non-linear registration of 4D time-series images. In this framework, spatio-temporal registration is defined by mapping trajectories of physical points as opposed to spatial registration that solely aims at mapping homologous points. First, we determine the trajectories we want to register in each sequence using a motion tracking algorithm based on the Diffeomorphic Demons algorithm. Then, we perform simultaneously pairwise registrations of corresponding time-points with the constraint to map the same physical points over time. We show this trajectory registration can be formulated as a multichannel registration of 3D images. We solve it using the Diffeomorphic Demons algorithm extended to vector-valued 3D images. This framework is applied to the inter-subject non-linear registration of 4D cardiac CT sequences.
Neural and cognitive plasticity: from maps to minds.
Mercado, Eduardo
2008-01-01
Some species and individuals are able to learn cognitive skills more flexibly than others. Learning experiences and cortical function are known to contribute to such differences, but the specific factors that determine an organism's intellectual capacities remain unclear. Here, an integrative framework is presented suggesting that variability in cognitive plasticity reflects neural constraints on the precision and extent of an organism's stimulus representations. Specifically, it is hypothesized that cognitive plasticity depends on the number and diversity of cortical modules that an organism has available as well as the brain's capacity to flexibly reconfigure and customize networks of these modules. The author relates this framework to past proposals on the neural mechanisms of intelligence, including (a) the relationship between brain size and intellectual capacity; (b) the role of prefrontal cortex in cognitive control and the maintenance of stimulus representations; and (c) the impact of neural plasticity and efficiency on the acquisition and performance of cognitive skills. The proposed framework provides a unified account of variability in cognitive plasticity as a function of species, age, and individual, and it makes specific predictions about how manipulations of cortical structure and function will impact intellectual capacity. Copyright (c) 2008 APA.
What is missing? An operational inundation mapping framework by SAR data
NASA Astrophysics Data System (ADS)
Shen, X.; Anagnostou, E. N.; Zeng, Z.; Kettner, A.; Hong, Y.
2017-12-01
Compared to optical sensors, synthetic aperture radar (SAR) works day and night and in all weather conditions. In addition, its spatial resolution does not decrease with the height of the platform, which makes it applicable to a range of important studies. However, existing studies have not addressed the operational demands of real-time inundation mapping. The direct proof is that no water-body product exists for any SAR-based satellite. What, then, is missing between science and products? Automation and quality. What makes it so difficult to develop an operational inundation mapping technique based on SAR data? Spectrum-wise, unlike optical water indices such as MNDWI, AWEI, etc., where a relatively constant threshold may apply across image acquisitions, regions and sensors, the threshold to separate water from non-water pixels has to be chosen individually for each SAR image. The optimization of this threshold is the first obstacle to the automation of SAR-based algorithms. Morphologically, the quality and reliability of the results are compromised by over-detection caused by smooth surfaces and shadowed areas, by noise-like speckle, and by under-detection caused by strong-scatterer disturbance. In this study, we propose a three-step framework that addresses all of the aforementioned issues of operational inundation mapping with SAR data. The framework consists of 1) optimization of Wishart distribution parameters for single/dual/fully-polarized SAR data, 2) morphological removal of over-detection, and 3) machine-learning-based removal of under-detection. The framework utilizes not only the SAR data but also the synergy of a digital elevation model (DEM) and fine-resolution optical sensor-based products, including a water probability map, a land cover classification map (optional), and river width. The framework has been validated in multiple areas in different parts of the world using different satellite SAR data and globally available ancillary data products. Therefore, it has the potential to serve as an operational inundation mapping algorithm for any SAR mission, such as SWOT, ALOS, or Sentinel. Selected results using ALOS/PALSAR-1 L-band dual-polarized data around the Connecticut River are provided in the attached figure.
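A hedged sketch of the first two stages of such a pipeline: thresholding a synthetic SAR backscatter image to flag water and then removing small over-detections with a morphological opening; the threshold value is illustrative rather than a fitted Wishart-distribution parameter, and the machine-learning stage is omitted.

```python
# Hedged sketch: threshold low backscatter as water, then clean up with morphology.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
backscatter_db = rng.normal(-12.0, 3.0, size=(256, 256))                   # synthetic land
backscatter_db[100:150, 60:200] = rng.normal(-20.0, 1.0, size=(50, 140))   # synthetic "water"

water = backscatter_db < -17.0                      # illustrative threshold (dB)
water_clean = ndimage.binary_opening(water, structure=np.ones((3, 3)))     # remove speckle-like specks

print(water.sum(), water_clean.sum())
```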
Integrated Quantitative Transcriptome Maps of Human Trisomy 21 Tissues and Cells
Pelleri, Maria Chiara; Cattani, Chiara; Vitale, Lorenza; Antonaros, Francesca; Strippoli, Pierluigi; Locatelli, Chiara; Cocchi, Guido; Piovesan, Allison; Caracausi, Maria
2018-01-01
Down syndrome (DS) is due to the presence of an extra full or partial chromosome 21 (Hsa21). The identification of genes contributing to DS pathogenesis could be the key to any rational therapy of the associated intellectual disability. We aim at generating quantitative transcriptome maps in DS integrating all gene expression profile datasets available for any cell type or tissue, to obtain a complete model of the transcriptome in terms of both expression values for each gene and segmental trend of gene expression along each chromosome. We used the TRAM (Transcriptome Mapper) software for this meta-analysis, comparing transcript expression levels and profiles between DS and normal brain, lymphoblastoid cell lines, blood cells, fibroblasts, thymus and induced pluripotent stem cells, respectively. TRAM combined, normalized, and integrated datasets from different sources and across diverse experimental platforms. The main output was a linear expression value that may be used as a reference for each of up to 37,181 mapped transcripts analyzed, related to both known genes and expression sequence tag (EST) clusters. An independent example in vitro validation of fibroblast transcriptome map data was performed through “Real-Time” reverse transcription polymerase chain reaction showing an excellent correlation coefficient (r = 0.93, p < 0.0001) with data obtained in silico. The availability of linear expression values for each gene allowed the testing of the gene dosage hypothesis of the expected 3:2 DS/normal ratio for Hsa21 as well as other human genes in DS, in addition to listing genes differentially expressed with statistical significance. Although a fraction of Hsa21 genes escapes dosage effects, Hsa21 genes are selectively over-expressed in DS samples compared to genes from other chromosomes, reflecting a decisive role in the pathogenesis of the syndrome. Finally, the analysis of chromosomal segments reveals a high prevalence of Hsa21 over-expressed segments over the other genomic regions, suggesting, in particular, a specific region on Hsa21 that appears to be frequently over-expressed (21q22). Our complete datasets are released as a new framework to investigate transcription in DS for individual genes as well as chromosomal segments in different cell types and tissues. PMID:29740474
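As a toy illustration of the gene-dosage comparison described above, the sketch below computes the mean DS-to-normal expression ratio per chromosome, where a value near 1.5 corresponds to the expected 3:2 ratio for Hsa21; the expression values are invented and are not TRAM output.

```python
# Hedged sketch of a per-chromosome DS/normal expression-ratio summary (toy values).
import pandas as pd

df = pd.DataFrame({
    "gene":        ["APP", "SOD1", "DYRK1A", "GAPDH", "ACTB"],
    "chromosome":  ["21", "21", "21", "12", "7"],
    "ds_expr":     [150.0, 142.0, 160.0, 101.0, 99.0],    # invented expression values
    "normal_expr": [100.0, 95.0, 108.0, 100.0, 100.0],
})
df["ratio"] = df["ds_expr"] / df["normal_expr"]
print(df.groupby("chromosome")["ratio"].mean())   # Hsa21 close to the 3:2 = 1.5 expectation
```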
Knebl, M R; Yang, Z-L; Hutchison, K; Maidment, D R
2005-06-01
This paper develops a framework for regional scale flood modeling that integrates NEXRAD Level III rainfall, GIS, and a hydrological model (HEC-HMS/RAS). The San Antonio River Basin (about 4000 square miles, 10,000 km2) in Central Texas, USA, is the domain of the study because it is a region subject to frequent occurrences of severe flash flooding. A major flood in the summer of 2002 is chosen as a case to examine the modeling framework. The model consists of a rainfall-runoff model (HEC-HMS) that converts precipitation excess to overland flow and channel runoff, as well as a hydraulic model (HEC-RAS) that models unsteady state flow through the river channel network based on the HEC-HMS-derived hydrographs. HEC-HMS is run on a 4 x 4 km grid in the domain, a resolution consistent with the resolution of NEXRAD rainfall taken from the local river authority. Watershed parameters are calibrated manually to produce a good simulation of discharge at 12 subbasins. With the calibrated discharge, HEC-RAS is capable of producing floodplain polygons that are comparable to the satellite imagery. The modeling framework presented in this study incorporates a portion of the recently developed GIS tool named Map to Map that has been created on a local scale and extends it to a regional scale. The results of this research will benefit future modeling efforts by providing a tool for hydrological forecasts of flooding on a regional scale. While designed for the San Antonio River Basin, this regional scale model may be used as a prototype for model applications in other areas of the country.
NASA Astrophysics Data System (ADS)
Shahtahmassebi, Amir Reza; Song, Jie; Zheng, Qing; Blackburn, George Alan; Wang, Ke; Huang, Ling Yan; Pan, Yi; Moore, Nathan; Shahtahmassebi, Golnaz; Sadrabadi Haghighi, Reza; Deng, Jing Song
2016-04-01
A substantial body of literature has accumulated on the topic of using remotely sensed data to map impervious surfaces which are widely recognized as an important indicator of urbanization. However, the remote sensing of impervious surface growth has not been successfully addressed. This study proposes a new framework for deriving and summarizing urban expansion and re-densification using time series of impervious surface fractions (ISFs) derived from remotely sensed imagery. This approach integrates multiple endmember spectral mixture analysis (MESMA), analysis of regression residuals, spatial statistics (Getis_Ord) and urban growth theories; hence, the framework is abbreviated as MRGU. The performance of MRGU was compared with commonly used change detection techniques in order to evaluate the effectiveness of the approach. The results suggested that the ISF regression residuals were optimal for detecting impervious surface changes while Getis_Ord was effective for mapping hotspot regions in the regression residuals image. Moreover, the MRGU outputs agreed with the mechanisms proposed in several existing urban growth theories, but importantly the outputs enable the refinement of such models by explicitly accounting for the spatial distribution of both expansion and re-densification mechanisms. Based on Landsat data, the MRGU is somewhat restricted in its ability to measure re-densification in the urban core but this may be improved through the use of higher spatial resolution satellite imagery. The paper ends with an assessment of the present gaps in remote sensing of impervious surface growth and suggests some solutions. The application of impervious surface fractions in urban change detection is a stimulating new research idea which is driving future research with new models and algorithms.
Advances in Landslide Nowcasting: Evaluation of a Global and Regional Modeling Approach
NASA Technical Reports Server (NTRS)
Kirschbaum, Dalia Bach; Peters-Lidard, Christa; Adler, Robert; Hong, Yang; Kumar, Sujay; Lerner-Lam, Arthur
2011-01-01
The increasing availability of remotely sensed data offers a new opportunity to address landslide hazard assessment at larger spatial scales. A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that may experience landslide activity. This system combines a calculation of static landslide susceptibility with satellite-derived rainfall estimates and uses a threshold approach to generate a set of nowcasts that classify potentially hazardous areas. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale near-real-time landslide hazard assessment efforts, it requires several modifications before it can be fully realized as an operational tool. This study draws upon that prior work's recommendations to develop a new approach for considering landslide susceptibility and hazard at the regional scale. This case study calculates a regional susceptibility map using remotely sensed and in situ information and a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America. The susceptibility map is evaluated with a regional rainfall intensity-duration triggering threshold, and the results are compared with the global algorithm framework for the same event. Evaluation of this regional system suggests that this empirically based approach provides one plausible way to address some of the data and resolution issues identified in the global assessment. The presented methodology is straightforward to implement, improves upon the global approach, and allows results to be transferable between regions. The results also highlight several remaining challenges, including the empirical nature of the algorithm framework and adequate information for algorithm validation. Conclusions suggest that integrating additional triggering factors such as soil moisture may help to improve algorithm accuracy. The regional algorithm scenario represents an important step forward in advancing regional- and global-scale landslide hazard assessment.
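A minimal sketch of a rainfall intensity-duration triggering threshold of the common power-law form I = a * D^(-b); the coefficients below are placeholders, not the regional values derived in the study.

```python
# Hedged sketch of an intensity-duration (I-D) rainfall triggering threshold check.
def exceeds_threshold(intensity_mm_h: float, duration_h: float,
                      a: float = 15.0, b: float = 0.6) -> bool:
    """Return True if the observed rainfall lies above the I-D threshold curve.

    a and b are placeholder coefficients for illustration only.
    """
    return intensity_mm_h > a * duration_h ** (-b)

# Compare two hypothetical rainfall events against the placeholder threshold.
print(exceeds_threshold(intensity_mm_h=12.0, duration_h=6.0))
print(exceeds_threshold(intensity_mm_h=1.0, duration_h=24.0))
```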
Integrable mappings and the notion of anticonfinement
NASA Astrophysics Data System (ADS)
Mase, T.; Willox, R.; Ramani, A.; Grammaticos, B.
2018-06-01
We examine the notion of anticonfinement and the role it has to play in the singularity analysis of discrete systems. A singularity is said to be anticonfined if singular values continue to arise indefinitely for the forward and backward iterations of a mapping, with only a finite number of iterates taking regular values in between. We show through several concrete examples that the behaviour of some anticonfined singularities is strongly related to the integrability properties of the discrete mappings in which they arise, and we explain how to use this information to decide on the integrability or non-integrability of the mapping.
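As a rough illustration of the bookkeeping involved in such a singularity analysis, the sketch below drives one iterate of a second-order rational mapping into a singular value (x_n = ε) and expands the subsequent iterates as Laurent series in ε. The mapping x_{n+1} + x_{n-1} = a/x_n + b is chosen purely for illustration and no claim is made here about its classification; for an anticonfined singularity in the sense defined above, negative powers of ε would keep reappearing indefinitely in both iteration directions, with only finitely many regular iterates in between.

```python
import sympy as sp

eps, a, b, f = sp.symbols('epsilon a b f')

def step(x_prev, x_curr):
    """One forward iteration of the illustrative mapping
    x_{n+1} = a / x_n + b - x_{n-1}."""
    return sp.cancel(a / x_curr + b - x_prev)

# Enter the singularity: the current iterate is driven to zero via x_n = eps.
x_prev, x_curr = f, eps
for n in range(1, 6):
    x_prev, x_curr = x_curr, step(x_prev, x_curr)
    # The leading term of the Laurent expansion in eps shows whether the
    # iterate blows up (negative powers of eps), vanishes, or stays regular.
    print(n, sp.series(x_curr, eps, 0, 2))
```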
Architecture of cognitive flexibility revealed by lesion mapping
Barbey, Aron K.; Colom, Roberto; Grafman, Jordan
2013-01-01
Neuroscience has made remarkable progress in understanding the architecture of human intelligence, identifying a distributed network of brain structures that support goal-directed, intelligent behavior. However, the neural foundations of cognitive flexibility and adaptive aspects of intellectual function remain to be well characterized. Here, we report a human lesion study (n = 149) that investigates the neural bases of key competencies of cognitive flexibility (i.e., mental flexibility and the fluent generation of new ideas) and systematically examine their contributions to a broad spectrum of cognitive and social processes, including psychometric intelligence (Wechsler Adult Intelligence Scale), emotional intelligence (Mayer, Salovey, Caruso Emotional Intelligence Test), and personality (Neuroticism–Extraversion–Openness Personality Inventory). Latent variable modeling was applied to obtain error-free indices of each factor, followed by voxel-based lesion-symptom mapping to elucidate their neural substrates. Regression analyses revealed that latent scores for psychometric intelligence reliably predict latent scores for cognitive flexibility (adjusted R2 = 0.94). Lesion mapping results further indicated that these convergent processes depend on a shared network of frontal, temporal, and parietal regions, including white matter association tracts, which bind these areas into an integrated system. A targeted analysis of the unique variance explained by cognitive flexibility further revealed selective damage within the right superior temporal gyrus, a region known to support insight and the recognition of novel semantic relations. The observed findings motivate an integrative framework for understanding the neural foundations of adaptive behavior, suggesting that core elements of cognitive flexibility emerge from a distributed network of brain regions that support specific competencies for human intelligence. PMID:23721727
The Role of Geologic Mapping in NASA PDSI Planning
NASA Astrophysics Data System (ADS)
Williams, D. A.; Skinner, J. A.; Radebaugh, J.
2017-12-01
Geologic mapping is an investigative process designed to derive the geologic history of planetary objects at local, regional, hemispheric or global scales. Geologic maps are critical products that aid future exploration by robotic spacecraft or human missions, support resource exploration, and provide context for and help guide scientific discovery. Creation of these tools, however, can be challenging in that, relative to their terrestrial counterparts, non-terrestrial planetary geologic maps lack expansive field-based observations. They rely, instead, on integrating diverse data types with a range of spatial scales and areal coverage. These facilitate establishment of geomorphic and geologic context but are generally limited with respect to identifying outcrop-scale textural details and resolving temporal and spatial changes in depositional environments. As a result, planetary maps should be prepared with clearly defined contact and unit descriptions as well as a range of potential interpretations. Today, geologic maps can be made from images obtained during the traverses of the Mars rovers, and for every new planetary object visited by NASA orbital or flyby spacecraft (e.g., Vesta, Ceres, Titan, Enceladus, Pluto). As Solar System Exploration develops and as NASA prepares to send astronauts back to the Moon and on to Mars, the importance of geologic mapping will increase. In this presentation, we will discuss the past role of geologic mapping in NASA's planetary science activities and our thoughts on the role geologic mapping will have in exploration in the coming decades. Challenges that planetary mapping must address include, among others: 1) determining the geologic framework of all Solar System bodies through the systematic development of geologic maps at appropriate scales; 2) developing digital Geographic Information Systems (GIS)-based mapping techniques and standards to assist with communicating map information to the scientific community and public; 3) developing public awareness of the role and application of geologic map information in the resolution of national issues relevant to planetary science and eventual off-planet resource assessments; and 4) using topical science to drive mapping in areas likely to be determined vital to the welfare of endeavors related to planetary science and exploration.
Ellefsen, Karl J.; Burton, William C.; Lacombe, Pierre J.
2012-01-01
Fractured sedimentary bedrock and groundwater at the former Naval Air Warfare Center in West Trenton, New Jersey (United States of America) are contaminated with chlorinated solvents. Predicting contaminant migration or removing the contaminants requires an understanding of the geology. Consequently, the geologic framework near the site was characterized with four different methods having different spatial scales: geologic field mapping, analyses of bedrock drill core, analyses of soil and regolith, and S-wave refraction surveys. A fault zone is in the southeast corner of the site and separates two distinct sedimentary formations; the fault zone dips (steeply) southeasterly, strikes northeasterly, and extends at least 550 m along its strike direction. Drill core from the fault zone is extensively brecciated and includes evidence of tectonic contraction. Approximately 300 m east of this fault zone is another fault zone, which offsets the contact between the two sedimentary formations. The S-wave refraction surveys identified both fault zones beneath soil and regolith and thereby provided constraints on their lateral extent and location.
How scientists develop competence in visual communication
NASA Astrophysics Data System (ADS)
Ostergren, Marilyn
Visuals (maps, charts, diagrams and illustrations) are an important tool for communication in most scientific disciplines, which means that scientists benefit from having strong visual communication skills. This dissertation examines the nature of competence in visual communication and the means by which scientists acquire this competence. This examination takes the form of an extensive multi-disciplinary integrative literature review and a series of interviews with graduate-level science students. The results are presented as a conceptual framework that lays out the components of competence in visual communication, including the communicative goals of science visuals, the characteristics of effective visuals, the skills and knowledge needed to create effective visuals and the learning experiences that promote the acquisition of these forms of skill and knowledge. This conceptual framework can be used to inform pedagogy and thus help graduate students achieve a higher level of competency in this area; it can also be used to identify aspects of acquiring competence in visual communication that need further study.
Erika S. Svendsen; Lindsay K. Campbell; Dana R. Fisher; James J.T. Connolly; Michelle L. Johnson; Nancy Falxa Sonti; Dexter H. Locke; Lynne M. Westphal; Cherie LeBlanc Fisher; Morgan Grove; Michele Romolini; Dale J. Blahna; Kathleen L. Wolf
2016-01-01
The Stewardship Mapping and Assessment Project (STEW-MAP) is designed to answer who, where, why and how environmental stewardship groups are caring for our urbanized landscapes. This report is intended to be a guide for those who wish to start STEW-MAP in their own city. It contains step-by-step directions for how to plan and implement a STEW-MAP project. STEW-MAP is...
Construct Maps as a Foundation for Standard Setting
ERIC Educational Resources Information Center
Wyse, Adam E.
2013-01-01
Construct maps are tools that display how the underlying achievement construct upon which one is trying to set cut-scores is related to other information used in the process of standard setting. This article reviews what construct maps are, uses construct maps to provide a conceptual framework to view commonly used standard-setting procedures (the…
Construct Maps: A Tool to Organize Validity Evidence
ERIC Educational Resources Information Center
McClarty, Katie Larsen
2013-01-01
The construct map is a promising tool for organizing the data standard-setting panelists interpret. The challenge in applying construct maps to standard-setting procedures will be the judicious selection of data to include within this organizing framework. Therefore, this commentary focuses on decisions about what to include in the construct map.…
No clustering for linkage map based on low-copy and undermethylated microsatellites.
Zhou, Yi; Gwaze, David P; Reyes-Valdés, M Humberto; Bui, Thomas; Williams, Claire G
2003-10-01
Clustering has been reported for conifer genetic maps based on hypomethylated or low-copy molecular markers, resulting in uneven marker distribution. To test this, a framework genetic map was constructed from three types of microsatellites: low-copy, undermethylated, and genomic. These Pinus taeda L. microsatellites were mapped using a three-generation pedigree with 118 progeny. The microsatellites were highly informative; of the 32 markers in intercross configuration, 29 were segregating for three or four alleles in the progeny. The sex-averaged map placed 51 of the 95 markers in 15 linkage groups at LOD > 4.0. No clustering or uneven distribution across the genome was observed. The three types of P. taeda microsatellites were randomly dispersed within each linkage group. The 51 microsatellites covered a map distance of 795 cM, with an average distance of 21.8 cM between markers, roughly half of the estimated total map length. The minimum and maximum distances between any two bins were 4.4 and 45.3 cM, respectively. These microsatellites provided anchor points for framework mapping of polymorphisms in P. taeda and other closely related hard pines.
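For readers unfamiliar with the quantities quoted above, the sketch below shows how a two-point LOD score and a map distance in centimorgans are derived from recombinant counts. It is a simplified illustration assuming fully informative meioses and the Haldane map function (no interference); it is not the mapping procedure used in the study, and the counts in the example are hypothetical.

```python
import math

def two_point_lod(n_recombinant, n_total):
    """Two-point LOD score: log10 likelihood ratio of linkage at the estimated
    recombination fraction versus free recombination (r = 0.5)."""
    r = min(max(n_recombinant / n_total, 1e-9), 1 - 1e-9)
    loglik_linked = (n_recombinant * math.log10(r)
                     + (n_total - n_recombinant) * math.log10(1 - r))
    loglik_free = n_total * math.log10(0.5)
    return loglik_linked - loglik_free

def haldane_cm(r):
    """Haldane map distance in centimorgans for recombination fraction r."""
    return -50.0 * math.log(1.0 - 2.0 * r)

# Hypothetical example: 12 recombinants among 118 progeny.
r_hat = 12 / 118
print(round(two_point_lod(12, 118), 1))   # ~18.7, well above the LOD 4.0 threshold
print(round(haldane_cm(r_hat), 1))        # ~11.4 cM
```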
Synchromodal optical in vivo imaging employing microlens array optics: a complete framework
NASA Astrophysics Data System (ADS)
Peter, Joerg
2013-03-01
A complete mathematical framework for preclinical optical imaging (OI) support comprising bioluminescence imaging (BLI), fluorescence surface imaging (FSI) and fluorescence optical tomography (FOT) is presented in which optical data are acquired by means of a microlens array (MLA) based light detector (MLA-D). The MLA-D has been developed to enable unique OI, especially in synchromodal operation with secondary imaging modalities (SIM) such as positron emission tomography (PET) or magnetic resonance imaging (MRI). An MLA-D consists of a (large-area) photon sensor array, a matched MLA for field-of-view definition, and a septum mask of specific geometry made of anodized aluminum that is positioned between the sensor and the MLA to suppress light cross-talk and to shield the sensor's radiofrequency interference signal (essential when used inside an MRI system). The software framework, while freely parameterizable for any MLA-D, is tailored towards an OI prototype system for preclinical SIM application comprising a multitude of cylindrically assembled, gantry-mounted, simultaneously operating MLA-Ds. Besides the MLA-D specificity, the framework incorporates excitation and illumination light-source declarations of large-field and point geometry to facilitate multispectral FSI and FOT as well as three-dimensional object recognition. When used in synchromodal operation, reconstructed tomographic SIM volume data can be used for co-modal image fusion and also as a prior for estimating the imaged object's 3D surface by means of gradient vector flow. Superimposed planar (without object prior) or surface-aligned inverse mapping can be performed to estimate and to fuse the emission light map with the boundary of the imaged object. Triangulation and subsequent optical reconstruction (FOT) or constrained flow estimation (BLI), both including the possibility of SIM priors, can be performed to estimate the internal three-dimensional emission light distribution. The framework exposes a number of variables controlling convergence and computational speed. Utilization and performance are illustrated on experimentally acquired data employing the OI prototype system in stand-alone operation, and when integrated into an unmodified preclinical PET system performing synchromodal BLI-PET in vivo imaging.
Integrating Health Behavior Theory and Design Elements in Serious Games
Fleming, Theresa; Lucassen, Mathijs FG; Bridgman, Heather; Stasiak, Karolina; Shepherd, Matthew; Orpin, Peter
2015-01-01
Background Internet interventions for improving health and well-being have the potential to reach many people and fill gaps in service provision. Serious gaming interfaces provide opportunities to optimize user adherence and impact. Health interventions based in theory and evidence and tailored to psychological constructs have been found to be more effective at promoting behavior change. Defining the design elements which engage users and help them to meet their goals can contribute to better-informed serious games. Objective To elucidate design elements important in SPARX, a serious game for adolescents with depression, from a user-centered perspective. Methods We proposed a model based on an established theory of health behavior change and practical features of serious game design to organize ideas and rationale. We analyzed data from 5 studies comprising a total of 22 focus groups and 66 semistructured interviews conducted with youth and families in New Zealand and Australia who had viewed or used SPARX. User perceptions of the game were applied to this framework. Results A coherent framework was established using the three constructs of self-determination theory (SDT), namely autonomy, competence, and relatedness, to organize user perceptions and design elements within four areas important in design: computer game, accessibility, working alliance, and learning in immersion. User perceptions mapped well to the framework, which may assist developers in understanding the context of user needs. By mapping these elements against the constructs of SDT, we were able to propose a sound theoretical base for the model. Conclusions This study’s method allowed for the articulation of design elements in a serious game from a user-centered perspective within a coherent overarching framework. The framework can be used to deliberately incorporate serious game design elements that support a user’s sense of autonomy, competence, and relatedness, key constructs which have been found to mediate motivation at all stages of the change process. The resulting model introduces promising avenues for future exploration. Involving users in program design remains an imperative if serious games are to be fit for purpose. PMID:26543916
DOT National Transportation Integrated Search
2009-04-10
This report documents research on the conceptual framework of an integrated transportation system with a prototype application under the framework. Three levels of control are involved in this framework: at the global level (an entire transportation ...
Offshore Energy Mapping for Northeast Atlantic and Mediterranean: MARINA PLATFORM project
NASA Astrophysics Data System (ADS)
Kallos, G.; Galanis, G.; Spyrou, C.; Kalogeri, C.; Adam, A.; Athanasiadis, P.
2012-04-01
Deep offshore ocean energy mapping requires detailed modeling of wind, wave, tidal and ocean circulation conditions. It also requires detailed mapping of the associated extremes. An important issue in such work is the co-generation of energy (generation from wind, waves, tides and currents) in order to design platforms in an efficient way. For example, wind and wave fields exhibit significant phase differences, and therefore the energy produced from both sources together requires special analysis. The other two sources, namely tides and currents, have different temporal scales from the previous two. Another important issue is related to the estimation of the environmental frequencies in order to avoid structural problems. These issues are studied within the framework of the FP7 project MARINA PLATFORM. The main objective of the project is to develop deep water structures that can exploit the energy from wind, wave, tidal and ocean current energy sources. In particular, a primary goal will be the establishment of a set of equitable and transparent criteria for the evaluation of multi-purpose platforms for marine renewable energy. Using these criteria, a novel set of design and optimisation tools will be produced addressing new platform design, component engineering, risk assessment, spatial planning and platform-related grid connection concepts, all focussed on system integration and reducing costs. The University of Athens group is in charge of the estimation and mapping of wind, wave, tidal and ocean current resources, estimating the available energy potential, mapping extreme event characteristics and providing any additional environmental parameters required.
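For orientation, the sketch below shows two standard bulk formulas often used in this kind of resource mapping: wind power density per unit rotor area and deep-water wave energy flux per metre of wave crest. These are generic textbook expressions offered as an illustration, not the MARINA PLATFORM model chain, and the sea-state values in the example are hypothetical.

```python
import math

RHO_AIR = 1.225    # kg/m^3, sea-level air density
RHO_SEA = 1025.0   # kg/m^3, sea-water density
G = 9.81           # m/s^2

def wind_power_density(v):
    """Wind power density in W/m^2 for wind speed v (m/s): 0.5 * rho * v^3."""
    return 0.5 * RHO_AIR * v ** 3

def wave_power_per_metre(hs, te):
    """Deep-water wave energy flux in kW per metre of crest length, for
    significant wave height hs (m) and energy period te (s):
    P = rho * g^2 * Hs^2 * Te / (64 * pi)."""
    return RHO_SEA * G ** 2 * hs ** 2 * te / (64.0 * math.pi) / 1000.0

# Hypothetical sea state: 10 m/s wind, Hs = 2.5 m, Te = 8 s.
print(round(wind_power_density(10.0), 1))     # ~612.5 W/m^2
print(round(wave_power_per_metre(2.5, 8.0)))  # ~25 kW/m
```

The phase differences mentioned above mean that time series of both quantities, rather than their long-term means alone, are needed when assessing combined wind-wave platforms.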
The Use of Intervention Mapping to Develop a Tailored Web-Based Intervention, Condom-HIM
2017-01-01
Background Many HIV (human immunodeficiency virus) prevention interventions are currently being implemented and evaluated, with little information published on their development. A framework highlighting the method of development of an intervention can be used by others wanting to replicate interventions or develop similar interventions to suit other contexts and settings. It provides researchers with a comprehensive development process of the intervention. Objective The objective of this paper was to describe how a systematic approach, intervention mapping, was used to develop a tailored Web-based intervention to increase condom use among HIV-positive men who have sex with men. Methods The intervention was developed in consultation with a multidisciplinary team composed of academic researchers, community members, Web designers, and the target population. Intervention mapping involved a systematic process of 6 steps: (1) needs assessment; (2) identification of proximal intervention objectives; (3) selection of theory-based intervention methods and practical strategies; (4) development of intervention components and materials; (5) adoption, implementation, and maintenance; and (6) evaluation planning. Results The application of intervention mapping resulted in the development of a tailored Web-based intervention for HIV-positive men who have sex with men, called Condom-HIM. Conclusions Using intervention mapping as a systematic process to develop interventions is a feasible approach that specifically integrates the use of theory and empirical findings. Outlining the process used to develop a particular intervention provides clarification on the conceptual use of experimental interventions in addition to potentially identifying reasons for intervention failures. PMID:28428162
Yim, Young-Sun; Davis, Georgia L.; Duru, Ngozi A.; Musket, Theresa A.; Linton, Eric W.; Messing, Joachim W.; McMullen, Michael D.; Soderlund, Carol A.; Polacco, Mary L.; Gardiner, Jack M.; Coe, Edward H.
2002-01-01
Three maize (Zea mays) bacterial artificial chromosome (BAC) libraries were constructed from inbred line B73. High-density filter sets from all three libraries, made using different restriction enzymes (HindIII, EcoRI, and MboI, respectively), were evaluated with a set of complex probes including the 185-bp knob repeat, ribosomal DNA, two telomere-associated repeat sequences, four centromere repeats, the mitochondrial genome, a multifragment chloroplast DNA probe, and bacteriophage λ. The results indicate that the libraries are of high quality with low contamination by organellar and λ sequences. The use of libraries from multiple enzymes increased the chance of recovering each region of the genome. Ninety maize restriction fragment-length polymorphism core markers were hybridized to filters of the HindIII library, representing 6× coverage of the genome, to initiate development of a framework for anchoring BAC contigs to the intermated B73 × Mo17 genetic map and to mark the bin boundaries on the physical map. All of the clones used as hybridization probes detected at least three BACs. Twenty-two single-copy number core markers identified an average of 7.4 ± 3.3 positive clones, consistent with the expectation of six clones. This information was integrated with the fingerprinting data generated by the Arizona Genomics Institute to assemble the BACs into fingerprint contigs and contributed to the process of physical map construction. PMID:12481051
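The expectation quoted above follows directly from the library's genome coverage: for a single-copy probe, the number of positive clones is approximately Poisson-distributed with mean equal to the coverage, and the Clarke-Carbon formula gives the probability that a locus is represented at all. A minimal sketch (illustrative only, not the authors' analysis):

```python
import math

def expected_positive_clones(coverage):
    """Expected number of clones hit by a single-copy probe: the Poisson mean
    equals the library's genome coverage."""
    return coverage

def prob_locus_represented(coverage):
    """Clarke-Carbon probability that a given single-copy locus appears at
    least once in the library: 1 - exp(-coverage)."""
    return 1.0 - math.exp(-coverage)

print(expected_positive_clones(6))           # 6 clones expected (7.4 +/- 3.3 observed)
print(round(prob_locus_represented(6), 4))   # ~0.9975
```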
The Scottish Credit and Qualifications Framework: What's Academic Practice Got to Do with It?
ERIC Educational Resources Information Center
Fernie, Scott; Pilcher, Nick; Smith, Karen L.
2014-01-01
National Qualifications Frameworks (NQF) are a globally established and expanding phenomenon. They are increasingly merging and being mapped onto meta-qualifications frameworks. One key NQF in both these roles is the Scottish Credit and Qualifications Framework (SCQF). Much research categorises the different types of NQF, details their success and…
Shen, Nelson; Yufe, Shira; Saadatfard, Omid; Sockalingam, Sanjeev; Wiljer, David
2017-01-01
Information system research has stressed the importance of theory in understanding how user perceptions can motivate the use and adoption of technology such as web-based continuing professional development programs for interprofessional education (WCPD-IPE). A systematic review was conducted to provide an information system perspective on the current state of WCPD-IPE program evaluation and how current evaluations capture essential theoretical constructs in promoting technology adoption. Six databases were searched to identify studies evaluating WCPD-IPE. Three investigators determined the eligibility of the articles. Evaluation items extracted from the studies were assessed using the Kirkpatrick-Barr framework and mapped to the Benefits Evaluation framework. Thirty-seven eligible studies yielded 362 evaluation items for analysis. Most items (n = 252) were assessed as Kirkpatrick-Barr level 1 (reaction) and were mainly focused on the quality (information, service, and system) and satisfaction dimensions of the Benefits Evaluation framework. System quality was the least evaluated quality dimension, accounting for 26 items across 13 studies. WCPD-IPE use was reported in 17 studies, and its antecedent factors were evaluated with varying degrees of comprehensiveness. Although user reactions were commonly evaluated, greater focus on user perceptions of system quality (i.e., functionality and performance), usefulness, and usability of the web-based platform is required. Surprisingly, WCPD-IPE use was reported in less than half of the studies. This is problematic as use is a prerequisite to realizing any individual, organizational, or societal benefit of WCPD-IPE. This review proposes an integrated framework which accounts for these factors and provides a theoretically grounded guide for future evaluations.
2014-01-01
Background The date palm is one of the oldest cultivated fruit trees. It is critical in many ways to cultures in arid lands by providing highly nutritious fruit while surviving extreme heat and environmental conditions. Despite its importance from antiquity, few genetic resources are available for improving the productivity and development of the dioecious date palm. To date there has been no genetic map and no sex chromosome has been identified. Results Here we present the first genetic map for date palm and identify the putative date palm sex chromosome. We placed ~4000 markers on the map using nearly 1200 framework markers spanning a total of 1293 cM. We have integrated the genetic map, derived from the Khalas cultivar, with the draft genome and placed up to 19% of the draft genome sequence scaffolds onto linkage groups for the first time. This analysis revealed approximately 1.9 cM/Mb on the map. Comparison of the date palm linkage groups revealed significant long-range synteny to oil palm. Analysis of the date palm sex-determination region suggests it is telomeric on linkage group 12 and that recombination is not suppressed across the full chromosome. Conclusions Based on a modified genotyping-by-sequencing approach, we have overcome challenges due to the lack of genetic resources and provide the first genetic map for date palm. Combined with the recent draft genome sequence of the same cultivar, this resource offers a critical new tool for date palm biotechnology, palm comparative genomics and a better understanding of sex chromosome development in the palms. PMID:24735434
Voldbjerg, Siri Lygum; Laugesen, Britt; Bahnsen, Iben Bøgh; Jørgensen, Lone; Sørensen, Ingrid Maria; Grønkjaer, Mette; Sørensen, Erik Elgaard
2018-06-01
To describe and discuss the process of integrating the Fundamentals of Care framework into baccalaureate nursing education at a School of Nursing in Denmark. Nursing education plays an essential role in educating nurses to work within healthcare systems in which a demanding workload on nurses results in fundamental nursing care being left undone. Newly graduated nurses often lack the knowledge and skills to meet the challenges of delivering fundamental care in clinical practice. To develop nursing students' understanding of fundamental nursing, the conceptual Fundamentals of Care framework has been integrated into nursing education at a School of Nursing in Denmark. Discursive paper using an adjusted descriptive case study design for describing and discussing the process of integrating the conceptual Fundamentals of Care framework in nursing education. The process of integrating the Fundamentals of Care framework is illuminated through a description of the context in which the process occurs, including the faculty members, lectures, case-based work and the simulation laboratory in nursing education. Based on this description, opportunities such as supporting a holistic approach to evidence-based integrative patient care and challenges such as scepticism among the faculty are discussed. It is suggested how integration of the Fundamentals of Care framework into lectures, case-based work and the simulation laboratory can make fundamental nursing care more explicit in nursing education, support critical thinking and underline the relevance of evidence-based practice. The process relies on a supportive context, a well-informed and engaged faculty, and continuous reflection on how the conceptual framework can be integrated. Integrating the Fundamentals of Care framework can support nursing students' critical thinking and reflection on what fundamental nursing care is and requires, and eventually educate nurses in providing evidence-based fundamental nursing care. © 2018 John Wiley & Sons Ltd.