Sample records for process mapping methodology

  1. The use of concept maps during knowledge elicitation in ontology development processes – the nutrigenomics use case

    PubMed Central

    Castro, Alexander Garcia; Rocca-Serra, Philippe; Stevens, Robert; Taylor, Chris; Nashar, Karim; Ragan, Mark A; Sansone, Susanna-Assunta

    2006-01-01

    Background: Incorporation of ontologies into annotations has enabled 'semantic integration' of complex data, making explicit the knowledge within a certain field. One of the major bottlenecks in developing bio-ontologies is the lack of a unified methodology. Different methodologies have been proposed for different scenarios, but there is no agreed-upon standard methodology for building ontologies. The involvement of geographically distributed domain experts, the need for domain experts to lead the design process, the application of the ontologies and the life cycles of bio-ontologies are amongst the features not considered by previously proposed methodologies. Results: Here, we present a methodology for developing ontologies within the biological domain. We describe our scenario, competency questions, results and milestones for each methodological stage. We introduce the use of concept maps during knowledge acquisition phases as a feasible transition between domain expert and knowledge engineer. Conclusion: The contributions of this paper are the thorough description of the steps we suggest when building an ontology, the example use of concept maps, consideration of applicability to the development of lower-level ontologies and application to decentralised environments. We have found that within our scenario conceptual maps played an important role in the development process. PMID:16725019

  2. Land cover mapping for development planning in Eastern and Southern Africa

    NASA Astrophysics Data System (ADS)

    Oduor, P.; Flores Cordova, A. I.; Wakhayanga, J. A.; Kiema, J.; Farah, H.; Mugo, R. M.; Wahome, A.; Limaye, A. S.; Irwin, D.

    2016-12-01

    Africa continues to experience intensification of land use, driven by competition for resources and a growing population. Land cover maps are some of the fundamental datasets required by numerous stakeholders to inform a number of development decisions. For instance, they can be integrated with other datasets to create value added products such as vulnerability impact assessment maps, and natural capital accounting products. In addition, land cover maps are used as inputs into Greenhouse Gas (GHG) inventories to inform the Agriculture, Forestry and other Land Use (AFOLU) sector. However, the processes and methodologies of creating land cover maps consistent with international and national land cover classification schemes can be challenging, especially in developing countries where skills, hardware and software resources can be limiting. To meet this need, SERVIR Eastern and Southern Africa developed methodologies and stakeholder engagement processes that led to a successful initiative in which land cover maps for 9 countries (Malawi, Rwanda, Namibia, Botswana, Lesotho, Ethiopia, Uganda, Zambia and Tanzania) were developed, using 2 major classification schemes. The first sets of maps were developed based on an internationally acceptable classification system, while the second sets of maps were based on a nationally defined classification system. The mapping process benefited from reviews from national experts and also from technical advisory groups. The maps have found diverse uses, among them the definition of the Forest Reference Levels in Zambia. In Ethiopia, the maps have been endorsed by the national mapping agency as part of national data. The data for Rwanda is being used to inform the Natural Capital Accounting process, through the WAVES program, a World Bank Initiative. This work illustrates the methodologies and stakeholder engagement processes that brought success to this land cover mapping initiative.

  3. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: from cognitive maps to agent-based models.

    PubMed

    Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J

    2015-03-15

    This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an agent-based model. Copyright © 2014 Elsevier Ltd. All rights reserved.
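
    As a toy illustration of steps (2) and (5), the sketch below encodes a small elicited cognitive map as a weighted directed graph with networkx and computes simple structural metrics often reported for such maps; the factors and weights are invented for illustration and are not taken from the ICTAM case study.

    ```python
    import networkx as nx

    # Hypothetical elicited causal links (cause, effect, signed weight in [-1, 1]);
    # these factors are invented, not drawn from the study's interviews.
    links = [
        ("water allocation", "irrigation decision", 0.8),
        ("grape price", "irrigation decision", 0.5),
        ("irrigation decision", "yield", 0.7),
        ("drought forecast", "water allocation", -0.6),
    ]

    G = nx.DiGraph()
    G.add_weighted_edges_from(links)

    # Out-degree suggests driving factors; in-degree suggests outcome factors.
    for node in G.nodes:
        print(f"{node}: out={G.out_degree(node)}, in={G.in_degree(node)}")

    # Total absolute edge weight is a simple centrality used to compare maps.
    und = G.to_undirected()
    centrality = {n: sum(abs(d["weight"]) for _, _, d in und.edges(n, data=True))
                  for n in und.nodes}
    print("most central concept:", max(centrality, key=centrality.get))
    ```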

  4. WE-H-BRC-04: Implement Lean Methodology to Make Our Current Process of CT Simulation to Treatment More Efficient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boddu, S; Morrow, A; Krishnamurthy, N

    Purpose: Our goal is to implement lean methodology to make our current process of CT simulation to treatment more efficient. Methods: In this study, we implemented lean methodology and tools and employed flowcharts in Excel for process mapping. We formed a group of physicians, physicists, dosimetrists, therapists and a clinical physics assistant and huddled bi-weekly to map current value streams. We performed GEMBA walks and observed current processes from scheduling patient CT simulations to treatment plan approval. From this, the entire workflow was categorized into processes, sub-processes, and tasks. For each process we gathered data on touch time, first-time quality, undesirable effects (UDEs), and wait times from relevant members of each task. UDEs were binned per frequency of their occurrence. We huddled to map the future state and to find solutions to high-frequency UDEs. We implemented visual controls, hard stops, and documented issues found during chart checks prior to treatment plan approval. Results: We have identified approximately 64 UDEs in our current workflow that could cause delays, re-work, compromise the quality and safety of patient treatments, or cause wait times between 1 and 6 days. While some UDEs are unavoidable, such as re-planning due to patient weight loss, eliminating avoidable UDEs is our goal. In 2015, we found 399 issues with patient treatment plans, of which 261, 95 and 43 were low, medium and high severity, respectively. We also mapped patient-specific QA processes for IMRT/RapidArc and SRS/SBRT, involving 10 and 18 steps, respectively. From these, 13 UDEs were found and 5 were addressed, which solved 20% of issues. Conclusion: We have successfully implemented lean methodology and tools. We are further mapping treatment-site-specific workflows to identify bottlenecks, potential breakdowns and personnel allocation, and will employ tools like failure mode and effects analysis to mitigate risk factors and make this process efficient.

  5. Participatory Development and Analysis of a Fuzzy Cognitive Map of the Establishment of a Bio-Based Economy in the Humber Region

    PubMed Central

    Penn, Alexandra S.; Knight, Christopher J. K.; Lloyd, David J. B.; Avitabile, Daniele; Kok, Kasper; Schiller, Frank; Woodward, Amy; Druckman, Angela; Basson, Lauren

    2013-01-01

    Fuzzy Cognitive Mapping (FCM) is a widely used participatory modelling methodology in which stakeholders collaboratively develop a 'cognitive map' (a weighted, directed graph), representing the perceived causal structure of their system. This can be directly transformed by a workshop facilitator into simple mathematical models to be interrogated by participants by the end of the session. Such simple models provide thinking tools which can be used for discussion and exploration of complex issues, as well as sense checking the implications of suggested causal links. They increase stakeholder motivation and understanding of whole systems approaches, but cannot be separated from an intersubjective participatory context. Standard FCM methodologies make simplifying assumptions, which may strongly influence results, presenting particular challenges and opportunities. We report on a participatory process, involving local companies and organisations, focussing on the development of a bio-based economy in the Humber region. The initial cognitive map generated consisted of factors considered key for the development of the regional bio-based economy and their directional, weighted, causal interconnections. A verification and scenario generation procedure, to check the structure of the map and suggest modifications, was carried out in a second session. Participants agreed on updates to the original map and described two alternate potential causal structures. In a novel analysis, all map structures were tested using two standard methodologies usually used independently: linear and sigmoidal FCMs, demonstrating some significantly different results alongside some broad similarities. We suggest a development of FCM methodology involving a sensitivity analysis with different mappings and discuss the use of this technique in the context of our case study. Using the results and analysis of our process, we discuss the limitations and benefits of the FCM methodology in this case and in general. We conclude by proposing an extended FCM methodology, including multiple functional mappings within one participant-constructed graph. PMID:24244303
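
    The linear-versus-sigmoidal contrast the authors analyse can be sketched generically. The following is a minimal FCM iteration under assumed conventions (states in [0, 1], update x ← f(Wx)); the weight matrix is invented, not the Humber map.

    ```python
    import numpy as np

    W = np.array([  # hypothetical signed weights: W[i, j] = influence of concept j on i
        [0.0, 0.6, -0.3],
        [0.4, 0.0, 0.5],
        [-0.2, 0.7, 0.0],
    ])

    def step_sigmoid(x, lam=2.0):
        return 1.0 / (1.0 + np.exp(-lam * (W @ x)))   # squash into (0, 1)

    def step_linear(x):
        return np.clip(W @ x, 0.0, 1.0)               # truncate into [0, 1]

    x = np.array([0.5, 0.5, 0.5])
    for f in (step_sigmoid, step_linear):
        s = x.copy()
        for _ in range(100):                          # iterate towards a fixed point
            s = f(s)
        print(f.__name__, "->", np.round(s, 3))       # the two mappings can disagree
    ```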

  6. Process mapping as a tool for home health network analysis.

    PubMed

    Pluto, Delores M; Hirshorn, Barbara A

    2003-01-01

    Process mapping is a qualitative tool that allows service providers, policy makers, researchers, and other concerned stakeholders to get a "bird's eye view" of a home health care organizational network or a very focused, in-depth view of a component of such a network. It can be used to share knowledge about community resources directed at the older population, identify gaps in resource availability and access, and promote on-going collaborative interactions that encourage systemic policy reassessment and programmatic refinement. This article is a methodological description of process mapping, which explores its utility as a practice and research tool, illustrates its use in describing service-providing networks, and discusses some of the issues that are key to successfully using this methodology.

  7. Mapping patterns of change in emotion-focused psychotherapy: Implications for theory, research, practice, and training.

    PubMed

    Watson, Jeanne C

    2018-05-01

    An important objective in humanistic-experiential psychotherapies, and particularly emotion-focused psychotherapy (EFT), is to map patterns of change. Effective mapping of the processes and pathways of change requires that in-session processes be linked to in-session resolutions, immediate post-session changes, intermediate outcome, final therapy outcome, and longer-term change. This is a challenging and long-term endeavour. Fine-grained descriptions of in-session processes that lead to resolution of specific interpersonal and intrapersonal issues, linked with longer-term outcomes, are the foundation of EFT, the process-experiential approach. In this paper, evidence in support of EFT as a treatment approach will be reviewed along with research on two mechanisms of change viewed as central to EFT: clients' emotional processing and the therapeutic relationship conditions. The implications for psychotherapy research are discussed. Given the methodological constraints, there is a need for more innovative methodologies and strategies to investigate specific psychotherapy processes within and across different approaches, to map patterns and mechanisms of change, and to enhance theory, research, practice, and training.

  8. The Use of Outcome Mapping in the Educational Context

    ERIC Educational Resources Information Center

    Lewis, Anna

    2014-01-01

    Outcome Mapping is intended to measure the process by which change occurs; it shifts away from the products of the program to focus on changes in the behaviors, relationships, actions, and/or activities of the people involved in the treatment program. This process-oriented methodology, most often used in designing and evaluating community development…

  9. Friction Stir Process Mapping Methodology

    NASA Technical Reports Server (NTRS)

    Bjorkman, Gerry; Kooney, Alex; Russell, Carolyn

    2003-01-01

    The weld process performance for a given weld joint configuration and tool setup is summarized on a 2-D plot of RPM vs. IPM. A process envelope is drawn within the map to identify the range of acceptable welds. The sweet spot is selected as the nominal weld schedule. The nominal weld schedule is characterized in the expected manufacturing environment. The nominal weld schedule, in conjunction with process control, ensures a consistent and predictable weld performance.
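
    As a sketch of how such a map might be consulted downstream, the check below tests whether a candidate weld schedule lies inside a polygonal envelope on the RPM-IPM plane; the envelope vertices are invented, since the report does not publish them.

    ```python
    from matplotlib.path import Path

    # Hypothetical acceptable-weld envelope as (RPM, IPM) vertices.
    envelope = Path([(200, 2), (200, 6), (350, 10), (500, 8), (500, 3)])

    def weld_ok(rpm, ipm):
        """True if the schedule lies inside the process envelope."""
        return envelope.contains_point((rpm, ipm))

    print(weld_ok(300, 5))   # a candidate near the sweet spot
    print(weld_ok(600, 12))  # outside the mapped region
    ```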

  10. Grounded theory: a methodological spiral from positivism to postmodernism.

    PubMed

    Mills, Jane; Chapman, Ysanne; Bonner, Ann; Francis, Karen

    2007-04-01

    Our aim in this paper is to explain a methodological/methods package devised to incorporate situational and social world mapping with frame analysis, based on a grounded theory study of Australian rural nurses' experiences of mentoring. Situational analysis, as conceived by Adele Clarke, shifts the research methodology of grounded theory from being located within a postpositivist paradigm to a postmodern paradigm. Clarke uses three types of maps during this process: situational, social world and positional, in combination with discourse analysis. During our grounded theory study, the process of concurrent interview data generation and analysis incorporated situational and social world mapping techniques. An outcome of this was our increased awareness of how outside actors influenced participants in their constructions of mentoring. In our attempts to use Clarke's methodological package, however, it became apparent that our constructivist beliefs about human agency could not be reconciled with the postmodern project of discourse analysis. We then turned to the literature on symbolic interactionism and adopted frame analysis as a method to examine the literature on rural nursing and mentoring as a secondary form of data. While we found situational and social world mapping very useful, we were less successful in using positional maps. In retrospect, we would argue that collective action framing provides an alternative to analysing such positions in the literature. This is particularly so for researchers who locate themselves within a constructivist paradigm, and who are therefore unwilling to reject the notion of human agency and the ability of individuals to shape their world in some way. Our example of using this package of situational and social world mapping with frame analysis is intended to assist other researchers to locate participants more transparently in the social worlds that they negotiate in their everyday practice.

  11. Development of a Methodology for Predicting Forest Area for Large-Area Resource Monitoring

    Treesearch

    William H. Cooke

    2001-01-01

    The U.S. Department of Agriculture, Forest Service, Southern Research Station, appointed a remote-sensing team to develop an image-processing methodology for mapping forest lands over large geographic areas. The team has presented a repeatable methodology, which is based on regression modeling of Advanced Very High Resolution Radiometer (AVHRR) and Landsat Thematic...

  12. Using Concept Mapping as a Tool for Program Theory Development

    ERIC Educational Resources Information Center

    Orsi, Rebecca

    2011-01-01

    The purpose of this methodological study is to explore how well a process called "concept mapping" (Trochim, 1989) can articulate the theory which underlies a social program. Articulation of a program's theory is a key step in completing a sound theory-based evaluation (Weiss, 1997a). In this study, concept mapping is used to…

  13. Friction Stir Process Mapping Methodology

    NASA Technical Reports Server (NTRS)

    Kooney, Alex; Bjorkman, Gerry; Russell, Carolyn; Smelser, Jerry (Technical Monitor)

    2002-01-01

    In FSW (friction stir welding), the weld process performance for a given weld joint configuration and tool setup is summarized on a 2-D plot of RPM vs. IPM. A process envelope is drawn within the map to identify the range of acceptable welds. The sweet spot is selected as the nominal weld schedule. The nominal weld schedule is characterized in the expected manufacturing environment. The nominal weld schedule in conjunction with process control ensures a consistent and predictable weld performance.

  14. A time-driven, activity-based costing methodology for determining the costs of red blood cell transfusion in patients with beta thalassaemia major.

    PubMed

    Burns, K E; Haysom, H E; Higgins, A M; Waters, N; Tahiri, R; Rushford, K; Dunstan, T; Saxby, K; Kaplan, Z; Chunilal, S; McQuilten, Z K; Wood, E M

    2018-04-10

    To describe the methodology to estimate the total cost of administration of a single unit of red blood cells (RBC) in adults with beta thalassaemia major in an Australian specialist haemoglobinopathy centre. Beta thalassaemia major is a genetic disorder of haemoglobin associated with multiple end-organ complications and typically requiring lifelong RBC transfusion therapy. New therapeutic agents are becoming available based on advances in understanding of the disorder and its consequences. Assessment of the true total cost of transfusion, incorporating both product and activity costs, is required in order to evaluate the benefits and costs of these new therapies. We describe the bottom-up, time-driven, activity-based costing methodology used to develop process maps to provide a step-by-step outline of the entire transfusion pathway. Detailed flowcharts for each process are described. Direct observations and timing of the process maps document all activities, resources, staff, equipment and consumables in detail. The analysis will include costs associated with performing these processes, including resources and consumables. Sensitivity analyses will be performed to determine the impact of different staffing levels, timings and probabilities associated with performing different tasks. Thirty-one process maps have been developed, with over 600 individual activities requiring multiple timings. These will be used for future detailed cost analyses. Detailed process maps using bottom-up, time-driven, activity-based costing for determining the cost of RBC transfusion in thalassaemia major have been developed. These could be adapted for wider use to understand and compare the costs and complexities of transfusion in other settings. © 2018 British Blood Transfusion Society.
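
    The underlying arithmetic of time-driven activity-based costing is each activity's observed time multiplied by the capacity cost rate of the resource performing it, plus consumables, summed along the process map. A minimal sketch with invented figures (not the study's data):

    ```python
    # Hypothetical activities from one process map:
    # (name, observed minutes, staff rate $/min, consumables $).
    activities = [
        ("pre-transfusion sample collection", 10, 1.2, 3.0),
        ("crossmatch in laboratory",          25, 1.5, 18.0),
        ("bedside checks and administration", 45, 1.2, 40.0),
    ]

    total = 0.0
    for name, minutes, rate_per_min, consumables in activities:
        cost = minutes * rate_per_min + consumables   # time-driven ABC: time x rate
        total += cost
        print(f"{name}: ${cost:.2f}")
    print(f"total cost per RBC unit: ${total:.2f}")
    ```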

  15. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
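
    The loss functions named here have standard textbook forms. The sketch below assumes the classic Taguchi quadratic loss and a bounded inverted normal loss, with invented target and scale parameters, to illustrate the quantification and integration steps; it is not the authors' exact formulation.

    ```python
    import numpy as np

    T = 100.0  # process target (e.g., operating temperature); values are illustrative

    def taguchi_loss(y, k=0.5):
        return k * (y - T) ** 2                     # quadratic loss about the target

    def inverted_normal_loss(y, c_max=500.0, gamma=8.0):
        # Loss rises toward c_max as deviation grows (bounded, unlike Taguchi).
        return c_max * (1.0 - np.exp(-((y - T) ** 2) / (2.0 * gamma ** 2)))

    deviations = np.array([95.0, 103.0, 112.0])     # observed process deviations
    total = taguchi_loss(deviations).sum() + inverted_normal_loss(deviations).sum()
    print(f"integrated economic loss: {total:.1f}")
    ```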

  16. Towards data integration automation for the French rare disease registry.

    PubMed

    Maaroufi, Meriem; Choquet, Rémy; Landais, Paul; Jaulent, Marie-Christine

    2015-01-01

    Building a medical registry upon an existing infrastructure and rooted practices is not an easy task. This is the case for the BNDMR project, the French rare disease registry, which aims to collect administrative and medical data on rare disease patients seen in different hospitals. To avoid duplicating data entry for health professionals, the project plans to deploy connectors with the existing systems to automatically retrieve data. Given the data heterogeneity and the large number of source systems, the automation of connector creation is required. In this context, we propose a methodology that optimizes the use of existing alignment approaches in the data integration processes. The generated mappings are formalized in exploitable mapping expressions. Following this methodology, a process was tested on specific data types of a source system: Booleans and predefined lists. As a result, the effectiveness of the alignment approach used was enhanced and more valid mappings were detected. Nonetheless, further improvements could be made to deal with the semantic issue and to process other data types.
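
    A toy version of the alignment step might look like the following: candidate mappings between source fields and registry fields are scored by name similarity and kept above a threshold. The field names, similarity measure and threshold are all assumptions for illustration, not the BNDMR implementation.

    ```python
    from difflib import SequenceMatcher

    source_fields   = ["patient_deceased", "sex_code", "consent_given"]
    registry_fields = ["deceased", "sex", "informed_consent"]

    def score(a, b):
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    # Keep the best-scoring registry field for each source field above a threshold;
    # weak candidates fall through and would need manual review.
    for s in source_fields:
        best = max(registry_fields, key=lambda r: score(s, r))
        if score(s, best) > 0.5:
            print(f"{s} -> {best}  (score {score(s, best):.2f})")
    ```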

  18. Evidence mapping: illustrating an emerging methodology to improve evidence-based practice in youth mental health.

    PubMed

    Hetrick, Sarah E; Parker, Alexandra G; Callahan, Patrick; Purcell, Rosemary

    2010-12-01

    Within the field of evidence-based practice, a process termed 'evidence mapping' is emerging as a less exhaustive yet systematic and replicable methodology that allows an understanding of the extent and distribution of evidence in a broad clinical area, highlighting both what is known and where gaps in evidence exist. This article describes the general principles of mapping methodology by using illustrations derived from our experience conducting an evidence map of interventions for youth mental-health disorders. Evidence maps are based on an explicit research question relating to the field of enquiry, which may vary in depth, but should be informed by end-users. The research question then drives the search for, and collection of, appropriate studies utilizing explicit and reproducible methods at each stage. This includes clear definition of components of the research question, development of a thorough and reproducible search strategy, development of explicit inclusion and exclusion criteria, and transparent decisions about the level of information to be obtained from each study. Evidence mapping is emerging as a rigorous methodology for gathering and disseminating up-to-date information to end-users. Thoughtful planning and assessment of available resources (e.g. staff, time, budget) are required by those applying this methodology to their particular field of clinical enquiry given the potential scope of the work. The needs of the end-user need to be balanced with available resources. Information derived needs to be effectively communicated, with the uptake of that evidence into clinical practice the ultimate aim. © 2010 The Authors. Journal compilation © 2010 Blackwell Publishing Ltd.

  19. ISSUES IN DIGITAL IMAGE PROCESSING OF AERIAL PHOTOGRAPHY FOR MAPPING SUBMERSED AQUATIC VEGETATION

    EPA Science Inventory

    The paper discusses the numerous issues that needed to be addressed when developing a methodology for mapping Submersed Aquatic Vegetation (SAV) from digital aerial photography. Specifically, we discuss 1) choice of film; 2) consideration of tide and weather constraints; 3) in-s...

  20. GIS and Multi-criteria evaluation (MCE) for landform geodiversity assessment

    NASA Astrophysics Data System (ADS)

    Najwer, Alicja; Reynard, Emmanuel; Zwoliński, Zbigniew

    2014-05-01

    In geomorphology, at the contemporary stage of methodological development, it is very significant to undertake new research problems from both theoretical and applied points of view. As an example of applying geoconservation results in landscape studies and environmental conservation, one can refer to the problem of landform geodiversity. The concept of geodiversity was created relatively recently and, therefore, little progress has been made in its objective assessment and mapping. In order to ensure clarity and coherency, it is recommended that the evaluation process be rigorous. Multi-criteria evaluation meets these criteria well. The main objective of this presentation is to demonstrate a new methodology for the assessment of selected natural environment components in response to the definition of geodiversity, as well as visualization of landform geodiversity, using the opportunities offered by the geoinformation environment. The study area consists of two peculiar alpine valleys: Illgraben and Derborence, located in the Swiss Alps. Apart from glacial and fluvial landforms, the morphology of these two sites is largely due to extreme phenomena (rockslides, torrential processes). Both valleys are recognized as geosites of national importance. The basis of the assessment is the selection of the geographical environment features. Firstly, six factor maps were prepared for each area: the landform energy, the landform fragmentation, the contemporary landform preservation, geological settings and hydrographic elements (lakes and streams). Input maps were then standardized and combined through map algebra operations carried out by multi-criteria evaluation (MCE) with the GIS-based Weighted Sum technique. Weights for particular classes were calculated using the pairwise-comparison matrix method. The final stage of deriving landform geodiversity maps was a reclassification procedure using the natural breaks method. The final maps of landform geodiversity were generated with the same methodological algorithm, multiplying each factor map by its given weight, with a consistency ratio of 0.07. However, the results obtained were radically different. The map of geodiversity for Derborence is characterized by much more significant fragmentation, and areas of low geodiversity constitute a greater contribution. In the Illgraben site, there is a significant contribution of high and very high geodiversity classes. The obtained maps were reviewed during field exploration with positive results, which gives a basis to conclude that the methodology used is correct and can be applied to other similar areas. Therefore, it is very important to develop an objective methodology that can be implemented for areas at the local and regional scale, while also giving satisfactory results for areas with a landscape different from the alpine one. The maps of landform geodiversity may be used for environmental conservation management, preservation of specific features within the geosite perimeter, spatial planning or tourism management.
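
    The weighting and overlay steps follow a standard AHP/weighted-sum pattern, sketched below: pairwise-comparison weights via the principal eigenvector, a consistency ratio, and a weighted sum of standardized factor rasters. The matrix and rasters are invented; the study's actual weights and its consistency ratio of 0.07 are not reproduced.

    ```python
    import numpy as np

    # Pairwise comparison matrix for 3 of the factors (illustrative Saaty scale).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3., 1.0, 2.0],
                  [1/5., 1/2., 1.0]])

    # Principal-eigenvector weights.
    vals, vecs = np.linalg.eig(A)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    w = w / w.sum()

    # Consistency ratio: CR = ((lambda_max - n) / (n - 1)) / RI.
    n = A.shape[0]
    RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]
    CR = ((np.real(vals).max() - n) / (n - 1)) / RI
    print(f"weights {np.round(w, 3)}, CR = {CR:.3f}")  # CR < 0.1 is acceptable

    # Weighted-sum overlay of standardized factor rasters (toy 2x2 grids in [0, 1]).
    factors = np.random.rand(3, 2, 2)
    geodiversity = np.tensordot(w, factors, axes=1)    # sum_i w_i * factor_i
    print(geodiversity)
    ```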

  1. Sustainability assessment in forest management based on individual preferences.

    PubMed

    Martín-Fernández, Susana; Martinez-Falero, Eugenio

    2018-01-15

    This paper presents a methodology to elicit the preferences of any individual in the assessment of sustainable forest management at the stand level. The elicitation procedure was based on the comparison of the sustainability of pairs of forest locations. A sustainability map of the whole territory was obtained according to the individual's preferences. Three forest sustainability indicators were pre-calculated for each point in a study area in a Scots pine forest in the National Park of Sierra de Guadarrama in the Madrid Region in Spain to obtain the best management plan with the sustainability map. We followed a participatory process involving fifty people to assess the sustainability of the forest management and the methodology. The results highlighted the demand for conservative forest management, the usefulness of the methodology for managers, and the importance and necessity of incorporating stakeholders into forestry decision-making processes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Application of self-organizing maps to the study of U-Zr-Ti-Nb distribution in sandstone-hosted uranium ores

    NASA Astrophysics Data System (ADS)

    Klus, Jakub; Pořízka, Pavel; Prochazka, David; Mikysek, Petr; Novotný, Jan; Novotný, Karel; Slobodník, Marek; Kaiser, Jozef

    2017-05-01

    This paper presents a novel approach for processing the spectral information obtained from high-resolution elemental mapping performed by means of Laser-Induced Breakdown Spectroscopy. The proposed methodology is aimed at the description of possible elemental associations within a heterogeneous sample. High-resolution elemental mapping provides a large number of measurements. Moreover, a typical laser-induced plasma spectrum consists of several thousand spectral variables. Analysis of heterogeneous samples, where valuable information is hidden in a limited fraction of the sample mass, requires special treatment. The sample under study is a sandstone-hosted uranium ore that shows irregular distribution of ore elements such as zirconium, titanium, uranium and niobium. The presented processing methodology shows how to reduce the dimensionality of the data while retaining the spectral information by utilizing self-organizing maps (SOM). The spectral information from the SOM is processed further to detect either simultaneous or isolated presence of elements. Conclusions suggested by the SOM are in good agreement with geological studies of mineralization phases performed at the deposit. Deeper investigation of the SOM results enables discrimination of interesting measurements and reveals new possibilities in the visualization of chemical mapping information. The suggested approach improves the description of elemental associations in mineral phases, which is crucial for the mining industry.
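
    A self-organizing map over spectra can be written directly in NumPy. The sketch below uses random data standing in for LIBS spectra and standard online SOM updates; it shows the dimensionality-reduction idea only, not the authors' configuration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    spectra = rng.random((500, 64))        # stand-in for LIBS spectra (shots x channels)

    gx, gy, dim = 6, 6, spectra.shape[1]   # 6x6 SOM grid
    W = rng.random((gx * gy, dim))
    coords = np.array([(i, j) for i in range(gx) for j in range(gy)], float)

    for t in range(2000):                  # online training
        x = spectra[rng.integers(len(spectra))]
        bmu = np.argmin(((W - x) ** 2).sum(axis=1))          # best-matching unit
        lr = 0.5 * np.exp(-t / 1000)                         # decaying learning rate
        sigma = 2.0 * np.exp(-t / 1000)                      # shrinking neighbourhood
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        h = np.exp(-d2 / (2 * sigma ** 2))                   # neighbourhood function
        W += lr * h[:, None] * (x - W)

    # Each spectrum is now summarized by its BMU; counts show how the map organizes.
    bmus = np.argmin(((spectra[:, None, :] - W[None]) ** 2).sum(-1), axis=1)
    print(np.bincount(bmus, minlength=gx * gy).reshape(gx, gy))
    ```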

  3. Does Value Stream Mapping affect the structure, process, and outcome quality in care facilities? A systematic review.

    PubMed

    Nowak, Marina; Pfaff, Holger; Karbach, Ute

    2017-08-24

    Quality improvement within health and social care facilities is needed and has to be evidence-based and patient-centered. Value Stream Mapping, a method of Lean management, aims to increase the patients' value and quality of care by a visualization and quantification of the care process. The aim of this research is to examine the effectiveness of Value Stream Mapping on structure, process, and outcome quality in care facilities. A systematic review is conducted. PubMed, EBSCOhost (including Business Source Complete, Academic Search Complete, PSYCInfo, PSYNDEX, SocINDEX with Full Text), Web of Knowledge, and EMBASE ScienceDirect are searched in February 2016. All peer-reviewed papers evaluating Value Stream Mapping and published in English or German from January 2000 are included. For data synthesis, all study results are categorized into Donabedian's model of structure, process, and outcome quality. To assess and interpret the effectiveness of Value Stream Mapping, the frequencies of the statistically examined results are considered. Of the 903 articles retrieved, 22 studies fulfill the inclusion criteria. Of these, 11 studies are used to answer the research question. Value Stream Mapping has positive effects on the time dimension of process and outcome quality. It seems to reduce non-value-added time (e.g., waiting time) and length of stay. All included studies are before-and-after designs without controls, and methodologically sophisticated studies are missing. For a final conclusion about Value Stream Mapping's effectiveness, more research with improved methodology is needed. Despite this lack of evidence, Value Stream Mapping has the potential to improve quality of care on the time dimension. The contextual influence has to be investigated to make conclusions about the relationship between different quality domains when applying Value Stream Mapping. However, when using this review's conclusions, the limitation of including heterogeneous and potentially biased results has to be considered.

  4. Georeferenced LiDAR 3D vine plantation map generation.

    PubMed

    Llorens, Jordi; Gil, Emilio; Llop, Jordi; Queraltó, Meritxell

    2011-01-01

    The use of electronic devices for canopy characterization has recently been widely discussed. Among such devices, LiDAR sensors appear to be the most accurate and precise. Information obtained from LiDAR readings taken while driving a tractor along a crop row can be managed and transformed into canopy density maps by evaluating the frequency of LiDAR returns. This paper describes a proposed methodology to obtain a georeferenced canopy map by combining the information obtained with LiDAR with that generated using a GPS receiver installed on top of a tractor. Data regarding the velocity of LiDAR measurements and the UTM coordinates of each measured point on the canopy were obtained by applying the proposed transformation process. The process allows overlay of the generated canopy density map on the image of the intended measured area using Google Earth®, providing accurate information about the canopy distribution and/or the location of damage along the rows. This methodology was applied and tested on different vine varieties and crop stages in two important vine production areas in Spain. The results indicate that the georeferenced information obtained with LiDAR sensors appears to be an interesting tool with the potential to improve crop management processes.
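
    The core density-mapping step, binning georeferenced returns into a grid, can be sketched as a 2-D histogram over UTM coordinates (synthetic points below; the real pipeline also fuses GPS timing and sensor geometry).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Synthetic georeferenced canopy returns (UTM easting, northing), in metres.
    east = 430000 + rng.random(5000) * 50
    north = 4610000 + rng.random(5000) * 10

    # Canopy density map: LiDAR return counts per 1 m x 1 m cell.
    density, xedges, yedges = np.histogram2d(
        east, north,
        bins=[np.arange(430000, 430051, 1), np.arange(4610000, 4610011, 1)],
    )
    print(density.shape, int(density.max()), "returns in the densest cell")
    ```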

  5. Updating National Topographic Data Base Using Change Detection Methods

    NASA Astrophysics Data System (ADS)

    Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.

    2016-06-01

    The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, the reality is dynamic and changes occur every day; therefore, users expect that the existing database will portray the current reality. Global mapping projects which are based on community volunteers, such as OSM, update their databases every day based on crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying the objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, the development of mapping technologies, advances in image processing algorithms and computer vision, together with the development of digital aerial cameras with a NIR band and Very High Resolution satellites, allow the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi-Spectral (MS) classification, MS segmentation, object analysis and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step for updating the NTDB at the Survey of Israel.
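
    One plausible building block of such a process, DSM differencing, is easy to illustrate: subtracting the older surface model from the newer one and thresholding flags candidate change pixels for editorial review. The arrays and threshold are synthetic; this is not the Survey of Israel's algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    dsm_old = rng.random((100, 100)) * 5          # synthetic surface heights (m)
    dsm_new = dsm_old.copy()
    dsm_new[40:45, 60:70] += 8.0                  # simulate a new building

    diff = dsm_new - dsm_old
    candidates = np.abs(diff) > 2.5               # height-change threshold (m)
    print(candidates.sum(), "pixels flagged as potential change")
    ```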

  6. Integration of Value Stream Map and Healthcare Failure Mode and Effect Analysis into Six Sigma Methodology to Improve Process of Surgical Specimen Handling.

    PubMed

    Hung, Sheng-Hui; Wang, Pa-Chun; Lin, Hung-Chun; Chen, Hung-Ying; Su, Chao-Ton

    2015-01-01

    Specimen handling is a critical patient safety issue. Problematic handling processes, such as misidentification (of patients, surgical sites, or specimen counts), specimen loss, or improper specimen preparation, can lead to serious patient harm and lawsuits. Value stream map (VSM) is a tool used to find non-value-added work, enhance the quality, and reduce the cost of the studied process. On the other hand, healthcare failure mode and effect analysis (HFMEA) is now frequently employed to avoid possible medication errors in healthcare processes. Both of them have a goal similar to that of the Six Sigma methodology for process improvement. This study proposes a model that integrates VSM and HFMEA into the framework of Six Sigma, which mainly consists of define, measure, analyze, improve, and control (DMAIC). A Six Sigma project for improving the process of surgical specimen handling in a hospital was conducted to demonstrate the effectiveness of the proposed model.

  7. Introduction to a special issue on concept mapping.

    PubMed

    Trochim, William M; McLinden, Daniel

    2017-02-01

    Concept mapping was developed in the 1980s as a unique integration of qualitative (group process, brainstorming, unstructured sorting, interpretation) and quantitative (multidimensional scaling, hierarchical cluster analysis) methods designed to enable a group of people to articulate and depict graphically a coherent conceptual framework or model of any topic or issue of interest. This introduction provides the basic definition and description of the methodology for the newcomer and describes the steps typically followed in its most standard canonical form (preparation, generation, structuring, representation, interpretation and utilization). It also introduces this special issue which reviews the history of the methodology, describes its use in a variety of contexts, shows the latest ways it can be integrated with other methodologies, considers methodological advances and developments, and sketches a vision of the future of the method's evolution. Copyright © 2016 Elsevier Ltd. All rights reserved.
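
    The quantitative half of the method, multidimensional scaling followed by hierarchical clustering of a sorting co-occurrence matrix, can be sketched with scikit-learn and SciPy; the sort counts below are invented.

    ```python
    import numpy as np
    from sklearn.manifold import MDS
    from scipy.cluster.hierarchy import linkage, fcluster

    # Co-occurrence counts: how often each pair of 5 brainstormed statements
    # was sorted into the same pile by participants (invented data).
    co = np.array([[5, 4, 1, 0, 0],
                   [4, 5, 1, 1, 0],
                   [1, 1, 5, 4, 3],
                   [0, 1, 4, 5, 4],
                   [0, 0, 3, 4, 5]], float)
    dist = 1.0 - co / co.max()                     # similarity -> distance

    # 2-D point map of statements.
    xy = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dist)

    # Cluster the point map to form the concept map.
    clusters = fcluster(linkage(xy, method="ward"), t=2, criterion="maxclust")
    print(xy.round(2), clusters)
    ```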

  8. Research Methodologies Explored for a Paradigm Shift in University Teaching.

    ERIC Educational Resources Information Center

    Venter, I. M.; Blignaut, R. J.; Stoltz, D.

    2001-01-01

    Innovative teaching methods such as collaborative learning, teamwork, and mind maps were introduced to teach computer science and statistics courses at a South African university. Soft systems methodology was adapted and used to manage the research process of evaluating the effectiveness of the teaching methods. This research method provided proof…

  9. Mapping knowledge translation and innovation processes in Cancer Drug Development: the case of liposomal doxorubicin.

    PubMed

    Fajardo-Ortiz, David; Duran, Luis; Moreno, Laura; Ochoa, Hector; Castaño, Victor M

    2014-09-03

    We explored how the knowledge translation and innovation processes are structured when they result in innovations, as in the case of liposomal doxorubicin research. In order to map the processes, a literature network analysis was performed with Cytoscape, and semantic analysis was performed by GOPubmed, which is based on the controlled vocabularies MeSH (Medical Subject Headings) and GO (Gene Ontology). We found clusters related to different stages of technological development (invention, innovation and imitation) and of the knowledge translation process (preclinical, translational and clinical research), and we were able to map the historic emergence of Doxil as a paradigmatic nanodrug. This research could be a powerful methodological tool for decision-making and innovation management in drug delivery research.

  10. Automated Recognition of Vegetation and Water Bodies on the Territory of Megacities in Satellite Images of Visible and IR Bands

    NASA Astrophysics Data System (ADS)

    Mozgovoy, Dmitry K.; Hnatushenko, Volodymyr V.; Vasyliev, Volodymyr V.

    2018-04-01

    Vegetation and water bodies are a fundamental element of urban ecosystems, and water mapping is critical for urban and landscape planning and management. A methodology of automated recognition of vegetation and water bodies on the territory of megacities in satellite images of sub-meter spatial resolution of the visible and IR bands is proposed. By processing multispectral images from the satellite SuperView-1A, vector layers of recognized plant and water objects were obtained. Analysis of the results of image processing showed a sufficiently high accuracy of the delineation of the boundaries of recognized objects and a good separation of classes. The developed methodology provides a significant increase of the efficiency and reliability of updating maps of large cities while reducing financial costs. Due to the high degree of automation, the proposed methodology can be implemented in the form of a geo-information web service functioning in the interests of a wide range of public services and commercial institutions.
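
    Index-based thresholding is a common starting point for separating vegetation and water in visible/IR imagery. The sketch below uses NDVI and McFeeters' NDWI with arbitrary thresholds on synthetic bands; it is a stand-in, not the paper's classifier.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    green, red, nir = (rng.random((50, 50)) for _ in range(3))  # synthetic bands

    eps = 1e-9
    ndvi = (nir - red) / (nir + red + eps)      # vegetation index
    ndwi = (green - nir) / (green + nir + eps)  # water index (McFeeters)

    vegetation = ndvi > 0.3                     # thresholds are scene-dependent
    water = ndwi > 0.2
    print(vegetation.sum(), "vegetation px,", water.sum(), "water px")
    ```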

  11. Evolutionary Maps: A New Model for the Analysis of Conceptual Development, with Application to the Diurnal Cycle

    ERIC Educational Resources Information Center

    Navarro, Manuel

    2014-01-01

    This paper presents a model of how children generate concrete concepts from perception through processes of differentiation and integration. The model informs the design of a novel methodology ("evolutionary maps" or "emaps"), whose implementation on certain domains unfolds the web of itineraries that children may follow in the…

  12. A national scale flood hazard mapping methodology: The case of Greece - Protection and adaptation policy approaches.

    PubMed

    Kourgialas, Nektarios N; Karatzas, George P

    2017-12-01

    The present work introduces a national-scale flood hazard assessment methodology using multi-criteria analysis and artificial neural network (ANN) techniques in a GIS environment. The proposed methodology was applied in Greece, where flash floods are a relatively frequent phenomenon that has become more intense over the last decades, causing significant damage in rural and urban sectors. In order to identify the areas most prone to flooding, seven factor maps (directly related to flood generation) were combined in a GIS environment. These factor maps are: a) Flow accumulation (F), b) Land use (L), c) Altitude (A), d) Slope (S), e) soil Erodibility (E), f) Rainfall intensity (R), and g) available water Capacity (C); the initials give the proposed method its name, "FLASERC". The flood hazard for each one of these factors is classified into five categories: very low, low, moderate, high, and very high. The above factors are combined and processed using the appropriate ANN algorithm tool. For the ANN training process, the spatial distribution of historical flood points in Greece within the five flood hazard categories of the aforementioned seven factor maps was used. In this way, the overall flood hazard map for Greece was determined. The final results are verified using additional historical flood events that have occurred in Greece over the last 100 years. In addition, an overview of flood protection measures and adaptation policy approaches is proposed for agricultural and urban areas located in very high flood hazard areas. Copyright © 2017 Elsevier B.V. All rights reserved.
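
    The ANN step can be approximated with any small classifier over the seven factor classes. Below, a scikit-learn MLP is trained on invented factor-class vectors with toy flood labels, standing in for the historical flood points the authors used.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(4)
    # Each sample: hazard class (1-5) for the seven FLASERC factors (F,L,A,S,E,R,C).
    X = rng.integers(1, 6, size=(300, 7)).astype(float)
    y = (X.mean(axis=1) > 3).astype(int)       # toy label: "flooded" where classes high

    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                        random_state=0).fit(X, y)
    print("training accuracy:", clf.score(X, y))
    print("hazard for a new cell:", clf.predict([[5, 4, 2, 5, 4, 5, 3]]))
    ```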

  13. Ohio's Abandoned Mine Lands Reclamation Program: a Study of Data Collection and Evaluation Techniques

    NASA Technical Reports Server (NTRS)

    Sperry, S. L.

    1982-01-01

    The planning process for a statewide reclamation plan of Ohio abandoned minelands, in response to the Federal Surface Mining Control and Reclamation Act of 1977, included: (1) the development of a screening and ranking methodology; (2) the establishment of a statewide review of major watersheds affected by mining; (3) the development of an immediate action process; and (4) a prototypical study of a priority watershed demonstrating the data collection, analysis, display and evaluation to be used for the remaining state watersheds. Historical methods for map information analysis and evaluation, as well as the current methodologies being used, were discussed. Various computer mapping and analysis programs were examined for their usability in evaluating the priority reclamation sites. Hand methods were chosen over automated procedures; intuitive evaluation was the primary reason.

  14. A Multivariate Methodological Workflow for the Analysis of FTIR Chemical Mapping Applied on Historic Paint Stratigraphies

    PubMed Central

    Sciutto, Giorgia; Oliveri, Paolo; Catelli, Emilio; Bonacini, Irene

    2017-01-01

    In the field of applied research in heritage science, the use of multivariate approaches is still quite limited and the chemometric results obtained are often underinterpreted. Within this scenario, the present paper is aimed at disseminating the use of suitable multivariate methodologies and proposes a procedural workflow, applied to a representative group of case studies of considerable importance for conservation purposes, as a guideline for the processing and interpretation of FTIR data. Initially, principal component analysis (PCA) is performed and the score values are converted into chemical maps. Subsequently, the brushing approach is applied, demonstrating its usefulness for a deep understanding of the relationships between the multivariate map and the PC score space, as well as for the identification of the spectral bands mainly involved in the definition of each area localised within the score maps. PMID:29333162
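
    Turning PCA scores into chemical maps amounts to fitting PCA on the unfolded pixel-by-wavenumber matrix and reshaping each component's scores back to the image grid; a sketch with synthetic data:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    h, w, bands = 40, 60, 200                  # map grid and spectral channels
    cube = np.random.default_rng(5).random((h, w, bands))   # stand-in FTIR map

    X = cube.reshape(-1, bands)                # unfold: pixels x wavenumbers
    scores = PCA(n_components=3).fit_transform(X)

    score_maps = scores.reshape(h, w, 3)       # one chemical map per component
    print(score_maps.shape)                    # (40, 60, 3)
    ```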

  15. Body Mapping as a Youth Sexual Health Intervention and Data Collection Tool.

    PubMed

    Lys, Candice; Gesink, Dionne; Strike, Carol; Larkin, June

    2018-06-01

    In this article, we describe and evaluate body mapping as (a) an arts-based activity within Fostering Open eXpression Among Youth (FOXY), an educational intervention targeting Northwest Territories (NWT) youth, and (b) a research data collection tool. Data included individual interviews with 41 female participants (aged 13-17 years) who attended FOXY body mapping workshops in six communities in 2013, field notes taken by the researcher during the workshops and interviews, and written reflections from seven FOXY facilitators on the body mapping process (from 2013 to 2016). Thematic analysis explored the utility of body mapping using a developmental evaluation methodology. The results show body mapping is an intervention tool that supports and encourages participant self-reflection, introspection, personal connectedness, and processing difficult emotions. Body mapping is also a data collection catalyst that enables trust and youth voice in research, reduces verbal communication barriers, and facilitates the collection of rich data regarding personal experiences.

  16. Applying a contemporary grounded theory methodology.

    PubMed

    Licqurish, Sharon; Seibold, Carmel

    2011-01-01

    The aim of this paper is to discuss the application of a contemporary grounded theory methodology to a research project exploring the experiences of students studying for a degree in midwifery. Grounded theory is a qualitative research approach developed by Glaser and Strauss in the 1950s but the methodology for this study was modelled on Clarke's (2005) approach and was underpinned by a symbolic interactionist theoretical perspective, post-structuralist theories of Michel Foucault and a constructionist epistemology. The study participants were 19 midwifery students completing their final placement. Data were collected through individual in-depth interviews and participant observation, and analysed using the grounded theory analysis techniques of coding, constant comparative analysis and theoretical sampling, as well as situational maps. The analysis focused on social action and interaction and the operation of power in the students' environment. The social process in which the students were involved, as well as the actors and discourses that affected the students' competency development, were highlighted. The methodology allowed a thorough exploration of the students' experiences of achieving competency. However, some difficulties were encountered. One of the major issues related to the understanding and application of complex sociological theories that challenged positivist notions of truth and power. Furthermore, the mapping processes were complex. Despite these minor challenges, the authors recommend applying this methodology to other similar research projects.

  17. MapEdit: solution to continuous raster map creation

    NASA Astrophysics Data System (ADS)

    Rančić, Dejan; Djordjević-Kajan, Slobodanka

    2003-03-01

    The paper describes MapEdit, MS Windows™ software for georeferencing and rectification of scanned paper maps. The software produces continuous raster maps which can be used as backgrounds in geographical information systems. The process of continuous raster map creation using MapEdit's "mosaicking" function is also described, as well as the georeferencing and rectification algorithms used in MapEdit. Our approach to georeferencing and rectification, using four control points and two linear transformations for each scanned map part together with the nearest neighbor resampling method, represents a low-cost, high-speed solution that produces continuous raster maps with satisfactory quality for many purposes (±1 pixel). Quality assessment of several continuous raster maps at different scales that have been created using our software and methodology has been undertaken, and the results are presented in the paper. For the quality control of the produced raster maps we referred to three widely adopted standards: the US Standard for Digital Cartographic Data, the National Standard for Spatial Data Accuracy and the US National Map Accuracy Standard. The results obtained during the quality assessment process are given in the paper and show that our maps meet all three standards.
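
    Georeferencing from control points reduces to a least-squares affine fit between pixel and map coordinates, with nearest-neighbour resampling as a rounded inverse lookup. The sketch below uses invented control points and a single affine fit, not MapEdit's exact two-transformation scheme.

    ```python
    import numpy as np

    # Control points: pixel (col, row) -> map (x, y); invented values.
    px = np.array([[0, 0], [1000, 0], [1000, 800], [0, 800]], float)
    mp = np.array([[500000, 4400000], [500500, 4400010],
                   [500510, 4399610], [500010, 4399600]], float)

    # Solve [col row 1] @ M = [x y] in the least-squares sense.
    A = np.hstack([px, np.ones((4, 1))])
    M, *_ = np.linalg.lstsq(A, mp, rcond=None)

    def pixel_to_map(col, row):
        return np.array([col, row, 1.0]) @ M

    def map_to_pixel_nn(x, y):
        """Nearest-neighbour inverse: round the inverse-affine pixel coordinates."""
        Minv, *_ = np.linalg.lstsq(np.hstack([mp, np.ones((4, 1))]), px, rcond=None)
        return np.round(np.array([x, y, 1.0]) @ Minv).astype(int)

    print(pixel_to_map(500, 400), map_to_pixel_nn(500250, 4399805))
    ```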

  18. Uncertainties in ecosystem service maps: a comparison on the European scale.

    PubMed

    Schulp, Catharina J E; Burkhard, Benjamin; Maes, Joachim; Van Vliet, Jasper; Verburg, Peter H

    2014-01-01

    Safeguarding the benefits that ecosystems provide to society is increasingly included as a target in international policies. To support such policies, ecosystem service maps are made. However, there is little attention for the accuracy of these maps. We made a systematic review and quantitative comparison of ecosystem service maps on the European scale to generate insights in the uncertainty of ecosystem service maps and discuss the possibilities for quantitative validation. Maps of climate regulation and recreation were reasonably similar while large uncertainties among maps of erosion protection and flood regulation were observed. Pollination maps had a moderate similarity. Differences among the maps were caused by differences in indicator definition, level of process understanding, mapping aim, data sources and methodology. Absence of suitable observed data on ecosystem services provisioning hampers independent validation of the maps. Consequently, there are, so far, no accurate measures for ecosystem service map quality. Policy makers and other users need to be cautious when applying ecosystem service maps for decision-making. The results illustrate the need for better process understanding and data acquisition to advance ecosystem service mapping, modelling and validation.
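
    A quantitative comparison of two co-registered ecosystem service maps can be as simple as a rank correlation of cell values, one of several plausible similarity measures (the review's own measures are not restated here); the rasters below are synthetic.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(6)
    map_a = rng.random((100, 100))                          # e.g., flood regulation, model A
    map_b = map_a + 0.3 * rng.standard_normal((100, 100))   # model B: correlated but noisy

    rho, p = spearmanr(map_a.ravel(), map_b.ravel())
    print(f"rank agreement between maps: rho = {rho:.2f}")
    ```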

  19. Use of lean and six sigma methodology to improve operating room efficiency in a high-volume tertiary-care academic medical center.

    PubMed

    Cima, Robert R; Brown, Michael J; Hebl, James R; Moore, Robin; Rogers, James C; Kollengode, Anantha; Amstutz, Gwendolyn J; Weisbrod, Cheryl A; Narr, Bradly J; Deschamps, Claude

    2011-07-01

    Operating rooms (ORs) are resource-intense and costly hospital units. Maximizing OR efficiency is essential to maintaining an economically viable institution. OR efficiency projects often focus on a limited number of ORs or cases. Efforts across an entire OR suite have not been reported. Lean and Six Sigma methodologies were developed in the manufacturing industry to increase efficiency by eliminating non-value-added steps. We applied Lean and Six Sigma methodologies across an entire surgical suite to improve efficiency. A multidisciplinary surgical process improvement team constructed a value stream map of the entire surgical process from the decision for surgery to discharge. Each process step was analyzed in 3 domains, i.e., personnel, information processed, and time. Multidisciplinary teams addressed 5 work streams to increase value at each step: minimizing volume variation; streamlining the preoperative process; reducing nonoperative time; eliminating redundant information; and promoting employee engagement. Process improvements were implemented sequentially in surgical specialties. Key performance metrics were collected before and after implementation. Across 3 surgical specialties, process redesign resulted in substantial improvements in on-time starts and reduction in number of cases past 5 pm. Substantial gains were achieved in nonoperative time, staff overtime, and ORs saved. These changes resulted in substantial increases in margin/OR/day. Use of Lean and Six Sigma methodologies increased OR efficiency and financial performance across an entire operating suite. Process mapping, leadership support, staff engagement, and sharing performance metrics are keys to enhancing OR efficiency. The performance gains were substantial, sustainable, positive financially, and transferable to other specialties. Copyright © 2011 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  20. Linear programming model to develop geodiversity map using utility theory

    NASA Astrophysics Data System (ADS)

    Sepehr, Adel

    2015-04-01

    In this article, the classification and mapping of geodiversity based on a quantitative methodology were accomplished using linear programming, the central idea being that geosites and geomorphosites, as main indicators of geodiversity, can be evaluated by utility theory. A linear programming method was applied for geodiversity mapping over Khorasan-Razavi province, located in northeastern Iran. In this route, the main criteria for distinguishing geodiversity potential in the studied area were rock type (lithology), fault position (tectonic processes), karst areas (dynamic processes), the frequency of Aeolian landforms, and surface river forms. These parameters were investigated using thematic maps, including geology, topography and geomorphology at scales of 1:100,000, 1:50,000 and 1:250,000, imagery data including SPOT and ETM+ (Landsat 7), and direct field operations. The geological thematic layer was simplified from the original map using a practical lithologic criterion based on a primary genetic classification of rocks into metamorphic, igneous and sedimentary. The geomorphology map was produced using a DEM at 30 m resolution extracted from ASTER data, geology and Google Earth images. The geology map shows tectonic status, and the geomorphology map indicates dynamic processes and landforms (karst, Aeolian and river). Then, according to utility theory algorithms, we proposed a linear program to classify the degree of geodiversity in the studied area based on geology/morphology parameters. The algorithm used in the methodology consisted of a linear function maximizing geodiversity subject to certain constraints in the form of linear equations. The results of this research indicated three classes of geodiversity potential: low, medium and high. The geodiversity potential shows satisfactory conditions in the karstic areas and the Aeolian landscape. The utility theory used in the research has also reduced the uncertainty of the evaluations.
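
    A linear-programming formulation over utility scores might look like the sketch below: choose per-factor weights to maximize total geodiversity utility subject to linear constraints. The factor utilities, bounds and constraints are all invented; the abstract does not publish the paper's model.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Mean utility contribution of each factor over the study area (invented):
    # lithology, tectonics, karst, aeolian landforms, river forms.
    u = np.array([0.30, 0.15, 0.25, 0.10, 0.20])

    # Maximize u @ w  ==  minimize -u @ w, subject to weights summing to 1
    # and each weight bounded between 0.05 and 0.5.
    res = linprog(c=-u,
                  A_eq=[[1, 1, 1, 1, 1]], b_eq=[1.0],
                  bounds=[(0.05, 0.5)] * 5)
    print("optimal factor weights:", np.round(res.x, 3))
    ```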

  1. An investigation into creative design methodologies for textiles and fashion

    NASA Astrophysics Data System (ADS)

    Gault, Alison

    2017-10-01

    Understanding market intelligence, trends, influences and personal approaches is essential for design students to develop their ideas in textiles and fashion. Identifying different personal approaches, including visual, process-led or concept-driven, by employing creative methodologies is key to developing a brief. A series of ideas or themes starts to emerge and, through the design process, serves to underpin and inform an entire collection. These investigations ensure that the design collections produce a diverse range of outcomes. Following key structures and coherent stages in the design process creates authentic collections in textiles and fashion. A range of undergraduate students presented their design portfolios (180), and the methodologies employed were mapped against success at module level, industry response and graduate employment.

  2. The Greek National Observatory of Forest Fires (NOFFi)

    NASA Astrophysics Data System (ADS)

    Tompoulidou, Maria; Stefanidou, Alexandra; Grigoriadis, Dionysios; Dragozi, Eleni; Stavrakoudis, Dimitris; Gitas, Ioannis Z.

    2016-08-01

    Efficient forest fire management is a key element for alleviating the catastrophic impacts of wildfires. Overall, the effective response to fire events necessitates adequate planning and preparedness before the start of the fire season, as well as quantification of the environmental impacts in case of wildfires. Moreover, the estimation of fire danger provides crucial information required for the optimal allocation and distribution of the available resources. The Greek National Observatory of Forest Fires (NOFFi)—established by the Greek Forestry Service in collaboration with the Laboratory of Forest Management and Remote Sensing of the Aristotle University of Thessaloniki and the International Balkan Center—aims to develop a series of modern products and services for supporting efficient forest fire prevention and management in Greece and the Balkan region, as well as to stimulate the development of transnational fire prevention and impact mitigation policies. More specifically, NOFFi provides three main fire-related products and services: a) a remote sensing-based fuel type mapping methodology, b) a semi-automatic burned area mapping service, and c) a dynamically updatable fire danger index providing mid- to long-term predictions. The fuel type mapping methodology was developed and applied across the country, following an object-oriented approach and using Landsat 8 OLI satellite imagery. The results showcase the effectiveness of the generated methodology in obtaining highly accurate fuel type maps on a national level. The burned area mapping methodology was developed as a semi-automatic object-based classification process, carefully crafted to minimize user interaction and, hence, be easily applicable on a near real-time operational level as well as for mapping historical events. NOFFi's products can be visualized through the interactive Fire Forest portal, which allows the involvement and awareness of the relevant stakeholders via the Public Participation GIS (PPGIS) tool.

  3. Functional-to-form mapping for assembly design automation

    NASA Astrophysics Data System (ADS)

    Xu, Z. G.; Liu, W. M.; Shen, W. D.; Yang, D. Y.; Liu, T. T.

    2017-11-01

    Assembly-level function-to-form mapping is the most effective procedure towards design automation. The research work mainly includes assembly-level function definitions, a product network model and a two-step mapping mechanism. The function-to-form mapping is divided into two steps: the first-step mapping of function to behavior, and the second-step mapping of behavior to structure. After the first-step mapping, the three-dimensional transmission chain (or 3D sketch) is studied, and feasible design computing tools are developed. The mapping procedure is relatively easy to implement interactively but quite difficult to complete automatically, so manual, semi-automatic, automatic and interactive modification of the mapping model are studied. A mechanical hand function-to-form (F-F) mapping process is illustrated to verify the design methodologies.
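
    A toy sketch of the two-step mapping follows, with function, behavior, and structure names invented for illustration; a real system would search a product network model rather than static lookup tables.

    ```python
    # A toy illustration of two-step function-to-form mapping via lookup
    # tables. All names here are hypothetical examples.

    # Step 1: function -> candidate behaviors
    FUNCTION_TO_BEHAVIOR = {
        "grasp object": ["parallel-jaw closing", "three-finger enclosing"],
        "rotate wrist": ["revolute motion"],
    }

    # Step 2: behavior -> candidate structures (form)
    BEHAVIOR_TO_STRUCTURE = {
        "parallel-jaw closing": ["screw-driven slider pair"],
        "three-finger enclosing": ["linkage-driven finger set"],
        "revolute motion": ["servo + bearing assembly"],
    }

    def map_function_to_form(function):
        """Return all (behavior, structure) pairs realizing a function."""
        pairs = []
        for behavior in FUNCTION_TO_BEHAVIOR.get(function, []):
            for structure in BEHAVIOR_TO_STRUCTURE.get(behavior, []):
                pairs.append((behavior, structure))
        return pairs

    print(map_function_to_form("grasp object"))
    ```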

  4. Developing Tsunami Evacuation Plans, Maps, And Procedures: Pilot Project in Central America

    NASA Astrophysics Data System (ADS)

    Arcos, N. P.; Kong, L. S. L.; Arcas, D.; Aliaga, B.; Coetzee, D.; Leonard, J.

    2015-12-01

    In the end-to-end tsunami warning chain, once a forecast is provided and a warning alert issued, communities must know what to do and where to go. The 'where to go' answer should come from reliable and practical community-level tsunami evacuation maps. Following Exercise Pacific Wave 2011, a questionnaire was sent to the 46 Member States of the Pacific Tsunami Warning System (PTWS). The results revealed that over 42 percent of Member States lacked tsunami mass coastal evacuation plans. Additionally, a significant gap in mapping was exposed, as over 55 percent of Member States lacked tsunami evacuation maps, routes, signs and assembly points. Thus, a significant portion of countries in the Pacific lack appropriate tsunami planning and mapping for their at-risk coastal communities. While a variety of tools exist to establish tsunami inundation areas, these are inconsistent, and no methodology has been developed to assist countries in producing tsunami evacuation maps, plans, and procedures. The International Tsunami Information Center (ITIC) and partners are leading a Pilot Project in Honduras demonstrating that globally standardized tools and methodologies can be applied by a country with minimal tsunami warning and mitigation resources to determine tsunami inundation areas and subsequently produce community-owned tsunami evacuation maps and plans for at-risk communities. The Pilot involves a 1- to 2-year process centered on a series of linked tsunami training workshops on evacuation planning, evacuation map development, inundation modeling and map creation, tsunami warning and emergency response Standard Operating Procedures (SOPs), and the conduct of tsunami exercises (including evacuation). The Pilot's completion is capped with a UNESCO/IOC document so that other countries can replicate the process in their tsunami-prone communities.

  5. Evolutionary Maps: A new model for the analysis of conceptual development, with application to the diurnal cycle

    NASA Astrophysics Data System (ADS)

    Navarro, Manuel

    2014-05-01

    This paper presents a model of how children generate concrete concepts from perception through processes of differentiation and integration. The model informs the design of a novel methodology (evolutionary maps, or emaps), whose implementation on certain domains unfolds the web of itineraries that children may follow in the construction of concrete conceptual knowledge and pinpoints, for each conception, the architecture of the conceptual change that leads to the scientific concept. Remarkably, the generative character of its syntax yields conceptions that, if unknown, amount to predictions that can be tested experimentally. Its application to the diurnal cycle (including the sun's trajectory in the sky) indicates that the model is correct and the methodology works (in some domains). Specifically, said emap predicts a number of exotic trajectories of the sun in the sky that, in the experimental work, were drawn spontaneously both on paper and on a dome. Additionally, the application of the emaps theoretical framework in clinical interviews has provided new insight into other cognitive processes. The field of validity of the methodology and its possible applications to science education are discussed.

  6. Description and validation of an automated methodology for mapping mineralogy, vegetation, and hydrothermal alteration type from ASTER satellite imagery with examples from the San Juan Mountains, Colorado

    USGS Publications Warehouse

    Rockwell, Barnaby W.

    2012-01-01

    The efficacy of airborne spectroscopic, or "hyperspectral," remote sensing for geoenvironmental watershed evaluations and deposit-scale mapping of exposed mineral deposits has been demonstrated. However, the acquisition, processing, and analysis of such airborne data at regional and national scales can be time and cost prohibitive. The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) sensor carried by the NASA Earth Observing System Terra satellite was designed for mineral mapping and the acquired data can be efficiently used to generate uniform mineral maps over very large areas. Multispectral remote sensing data acquired by the ASTER sensor were analyzed to identify and map minerals, mineral groups, hydrothermal alteration types, and vegetation groups in the western San Juan Mountains, Colorado, including the Silverton and Lake City calderas. This mapping was performed in support of multidisciplinary studies involving the predictive modeling of surface water geochemistry at watershed and regional scales. Detailed maps of minerals, vegetation groups, and water were produced from an ASTER scene using spectroscopic, expert system-based analysis techniques which have been previously described. New methodologies are presented for the modeling of hydrothermal alteration type based on the Boolean combination of the detailed mineral maps, and for the entirely automated mapping of alteration types, mineral groups, and green vegetation. Results of these methodologies are compared with the more detailed maps and with previously published mineral mapping results derived from analysis of high-resolution spectroscopic data acquired by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) sensor. Such comparisons are also presented for other mineralized and (or) altered areas including the Goldfield and Cuprite mining districts, Nevada and the central Marysvale volcanic field, Wah Wah Mountains, and San Francisco Mountains, Utah. The automated mineral group mapping products described in this study are ideal for application to mineral resource and mineral-environmental assessments at regional and national scales.
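
    The Boolean-combination step lends itself to a short sketch. The mineral masks and combination rules below are simplified assumptions for illustration, not the published USGS expert-system rules.

    ```python
    # A minimal sketch of alteration-type modeling as Boolean combinations
    # of per-pixel mineral maps (numpy masks). Mineral classes and rules
    # are invented stand-ins for the expert-system logic.
    import numpy as np

    rng = np.random.default_rng(0)
    shape = (4, 4)  # stand-in for an ASTER scene grid

    # Per-mineral boolean detection maps, as produced by spectral analysis.
    kaolinite = rng.random(shape) > 0.6
    alunite = rng.random(shape) > 0.8
    chlorite = rng.random(shape) > 0.5
    epidote = rng.random(shape) > 0.7

    # Boolean combination into alteration types.
    argillic = kaolinite | alunite      # clay/sulfate assemblage
    propylitic = chlorite | epidote     # Fe/Mg-OH assemblage
    overlap = argillic & propylitic     # flag ambiguous pixels

    alteration = np.zeros(shape, dtype=np.uint8)  # 0 = unaltered
    alteration[argillic] = 1
    alteration[propylitic] = 2
    alteration[overlap] = 3
    print(alteration)
    ```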

  7. The use of concept mapping in measurement development and evaluation: Application and future directions.

    PubMed

    Rosas, Scott R; Ridings, John W

    2017-02-01

    The past decade has seen an increase in measurement development research in the social and health sciences featuring the use of concept mapping as a core technique. The purpose, application, and utility of concept mapping have varied across this emerging literature. Despite the variety of uses and range of outputs, little has been done to critically review how researchers have approached the application of concept mapping in the measurement development and evaluation process. This article focuses on a review of the current state of practice regarding the use of concept mapping as a methodological tool in this process. We systematically reviewed 23 scale or measure development and evaluation studies, and detail the application of concept mapping in the context of traditional measurement development and psychometric testing processes. Although several limitations surfaced, we found several strengths in the contemporary application of the method. We determined that concept mapping (a) provides a solid method for establishing content validity, (b) facilitates researcher decision-making, (c) offers insight into target population perspectives that are integrated a priori, and (d) establishes a foundation for analytical and interpretative choices. Based on these results, we outline how concept mapping can be situated in the measurement development and evaluation processes for new instrumentation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Digital Methodology to implement the ECOUTER engagement process.

    PubMed

    Wilson, Rebecca C; Butters, Oliver W; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J

    2016-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research — French for 'to listen') is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind-mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the need for purpose-built software for ECOUTER research purposes.

  9. Body Mapping as a Youth Sexual Health Intervention and Data Collection Tool

    PubMed Central

    Lys, Candice; Gesink, Dionne; Strike, Carol; Larkin, June

    2018-01-01

    In this article, we describe and evaluate body mapping as (a) an arts-based activity within Fostering Open eXpression Among Youth (FOXY), an educational intervention targeting Northwest Territories (NWT) youth, and (b) a research data collection tool. Data included individual interviews with 41 female participants (aged 13–17 years) who attended FOXY body mapping workshops in six communities in 2013, field notes taken by the researcher during the workshops and interviews, and written reflections from seven FOXY facilitators on the body mapping process (from 2013 to 2016). Thematic analysis explored the utility of body mapping using a developmental evaluation methodology. The results show body mapping is an intervention tool that supports and encourages participant self-reflection, introspection, personal connectedness, and processing difficult emotions. Body mapping is also a data collection catalyst that enables trust and youth voice in research, reduces verbal communication barriers, and facilitates the collection of rich data regarding personal experiences. PMID:29303048

  10. A hierarchical modeling methodology for the definition and selection of requirements

    NASA Astrophysics Data System (ADS)

    Dufresne, Stephane

    This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure the traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is applied to the requirements definition of a hurricane tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work: (1) improve the requirements mapping process by matching the stakeholder expectations with the concept of operations, systems and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define the systems. These requirements are defined early in conceptual design, where the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow down the requirements from the stakeholder expectations to the system alternatives. A taxonomy of requirements is created to classify the information gathered during the problem definition. Subsequently, the operational and system functions and measures of effectiveness are integrated into a hierarchical model to allow the traceability of the information. Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the epistemic uncertainty. The proposed methodology is applied to the design of a hurricane tracker Unmanned Aerial Vehicle to demonstrate the origin and impact of requirements on the concept of operations and system alternatives. This research demonstrates that the hierarchical modeling methodology provides a traceable flow-down of the requirements from the problem definition to the system alternatives phases of conceptual design.
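
    A minimal sketch of the weight-derivation and uncertainty step is given below, assuming AHP/ANP-style pairwise comparisons: priorities come from the principal eigenvector of a comparison matrix, and Monte Carlo perturbation of the judgments gauges how epistemic uncertainty propagates into the weights. The matrix values are invented, and the full ANP supermatrix computation is not reproduced.

    ```python
    # A sketch of eigenvector-based priority weights with Monte Carlo
    # perturbation of the pairwise judgments. All numbers are invented.
    import numpy as np

    def priority_vector(M):
        """Principal eigenvector of a positive reciprocal matrix, normalized."""
        vals, vecs = np.linalg.eig(M)
        v = np.real(vecs[:, np.argmax(np.real(vals))])
        return v / v.sum()

    # Pairwise comparisons of three requirements criteria (hypothetical).
    A = np.array([[1.0, 3.0, 5.0],
                  [1 / 3, 1.0, 2.0],
                  [1 / 5, 1 / 2, 1.0]])

    rng = np.random.default_rng(42)
    samples = []
    for _ in range(2000):
        M = np.ones_like(A)
        for i in range(3):
            for j in range(i + 1, 3):
                # lognormal noise on each judgment, reciprocity preserved
                M[i, j] = A[i, j] * rng.lognormal(sigma=0.15)
                M[j, i] = 1.0 / M[i, j]
        samples.append(priority_vector(M))

    samples = np.array(samples)
    print("mean weights:", samples.mean(axis=0).round(3))
    print("std  weights:", samples.std(axis=0).round(3))
    ```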

  11. Groundwater pollution risk assessment. Application to different carbonate aquifers in south Spain

    NASA Astrophysics Data System (ADS)

    Jimenez Madrid, A.; Martinez Navarrete, C.; Carrasco Cantos, F.

    2009-04-01

    Water protection has been one of the most important environmental goals of European policy since the 2000/60/EC Water Framework Directive came into force in 2000, and more specifically since 2006 with the 2006/118/EC Directive on groundwater protection. As one of the requirements for tackling groundwater protection, a pollution risk assessment has been made through the analysis of both a map of existing hazardous human activities and an intrinsic aquifer vulnerability map, applying the methodologies proposed by COST Action 620 to an experimental study site in southern Spain containing different carbonate aquifers, which supply 8 towns of 2000 to 2500 inhabitants. In order to generate both maps it was necessary to carry out a field inventory over a 1:10,000 topographic base map, followed by Geographic Information System (GIS) processing. The resulting maps show a clear spatial distribution of both pollution risk and intrinsic vulnerability of the carbonate aquifers studied. As a final result, a map of the intensity of groundwater pollution risk is presented, representing an important basis for the development of a proper methodology for the protection of groundwater resources intended for human consumption. Keywords: Hazard, Vulnerability, Risk, GIS, Protection

  12. GIS methodology for geothermal play fairway analysis: Example from the Snake River Plain volcanic province

    USGS Publications Warehouse

    DeAngelo, Jacob; Shervais, John W.; Glen, Jonathan; Nielson, Dennis L.; Garg, Sabodh; Dobson, Patrick; Gasperikova, Erika; Sonnenthal, Eric; Visser, Charles; Liberty, Lee M.; Siler, Drew; Evans, James P.; Santellanes, Sean

    2016-01-01

    Play fairway analysis in geothermal exploration derives from a systematic methodology originally developed within the petroleum industry and is based on a geologic and hydrologic framework of identified geothermal systems. We are tailoring this methodology to study the geothermal resource potential of the Snake River Plain and surrounding region. This project has contributed to the success of this approach by cataloging the critical elements controlling exploitable hydrothermal systems, establishing risk matrices that evaluate these elements in terms of both probability of success and level of knowledge, and building automated tools to process results. ArcGIS was used to compile a range of different data types, which we refer to as ‘elements’ (e.g., faults, vents, heatflow…), with distinct characteristics and confidence values. Raw data for each element were transformed into data layers with a common format. Because different data types have different uncertainties, each evidence layer had an accompanying confidence layer, which reflects spatial variations in these uncertainties. Risk maps represent the product of evidence and confidence layers, and are the basic building blocks used to construct Common Risk Segment (CRS) maps for heat, permeability, and seal. CRS maps quantify the variable risk associated with each of these critical components. In a final step, the three CRS maps were combined into a Composite Common Risk Segment (CCRS) map for analysis that reveals favorable areas for geothermal exploration. Python scripts were developed to automate data processing and to enhance the flexibility of the data analysis. Python scripting provided the structure that makes a custom workflow possible. Nearly every tool available in the ArcGIS ArcToolbox can be executed using commands in the Python programming language. This enabled the construction of a group of tools that could automate most of the processing for the project. Currently, our tools are repeatable, scalable, modifiable, and transferrable, allowing us to automate the task of data analysis and the production of CRS and CCRS maps. Our ultimate goal is to produce a toolkit that can be imported into ArcGIS and applied to any geothermal play type, with fully tunable parameters that will allow for the production of multiple versions of the CRS and CCRS maps in order to better test for sensitivity and to validate results.
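
    The risk-map algebra described above reduces to simple raster arithmetic. The sketch below uses invented layers and assumes mean aggregation within a component and minimum aggregation across components; the project's actual aggregation rules may differ.

    ```python
    # A minimal sketch of CRS/CCRS construction: risk = evidence x
    # confidence per layer, averaged into component CRS maps, then
    # combined into a composite CCRS map. All layers are invented.
    import numpy as np

    rng = np.random.default_rng(1)
    shape = (3, 3)  # stand-in grid over the study area

    def risk(evidence, confidence):
        return evidence * confidence

    # Heat component from two hypothetical elements (heat flow, vents).
    heat_crs = np.mean(
        [risk(rng.random(shape), rng.random(shape)) for _ in range(2)], axis=0)
    # Permeability component (faults, seismicity) and seal component.
    perm_crs = np.mean(
        [risk(rng.random(shape), rng.random(shape)) for _ in range(2)], axis=0)
    seal_crs = risk(rng.random(shape), rng.random(shape))

    # Composite: a prospect needs heat AND permeability AND seal, so take
    # the limiting (minimum) component at each cell.
    ccrs = np.minimum(np.minimum(heat_crs, perm_crs), seal_crs)
    print(ccrs.round(2))
    ```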

  13. Participatory methodologies in research with children: creative and innovative approaches.

    PubMed

    Pereira, Viviane Ribeiro; Coimbra, Valéria Cristina Christello; Cardoso, Clarissa de Souza; Oliveira, Naiana Alves; Vieira, Ana Cláudia Garcia; Nobre, Márcia de Oliveira; Nino, Magda Eliete Lamas

    2017-05-18

    To describe the use of participatory methodologies in research with children. This is an experience report with a qualitative approach, conducted with children between six and eleven years of age from a municipal school in Pelotas and from the Psychosocial Children and Youth Care Center in São Lourenço do Sul, both municipalities of Rio Grande do Sul State. Data collection was based on records made in field and observation diaries from April to July 2016. The report showed that Photovoice promoted motivation in the group, in addition to increasing the self-esteem and self-confidence of the children. The Five Field Map made it possible to help children express feelings through play. Photovoice and the Five Field Map are seen as tools that enable new methodological approaches in research with children, facilitating the construction of the proposed activities aimed at innovative and creative research processes in health/nursing.

  14. The Scenario-Based Engineering Process (SEP): a user-centered approach for the development of health care systems.

    PubMed

    Harbison, K; Kelly, J; Burnell, L; Silva, J

    1995-01-01

    The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.

  15. Dissecting delays in trauma care using corporate lean six sigma methodology.

    PubMed

    Parks, Jennifer K; Klein, Jorie; Frankel, Heidi L; Friese, Randall S; Shafi, Shahid

    2008-11-01

    The Institute of Medicine has identified trauma center overcrowding as a crisis. We applied corporate Lean Six Sigma methodology to reduce overcrowding by quantifying patient dwell times in trauma resuscitation units (TRU) and to identify opportunities for reducing them. TRU dwell times of all patients treated at a Level I trauma center were measured prospectively during a 3-month period (n = 1,184). Delays were defined as TRU dwell time >6 hours. Using personnel trained in corporate Lean Six Sigma methodology, we created a detailed process map of patient flow through our TRU and measured time spent at each step prospectively during a 24/7 week-long time study (n = 43). Patients with TRU dwell time below the median (3 hours) were compared with those with longer dwell times to identify opportunities for improvement. TRU delays occurred in 183 of 1,184 trauma patients (15%), and peaked on days with >15 patients or with five simultaneous patients. However, 135 delays (74%) occurred on days when

  16. SU-F-T-250: What Does It Take to Correctly Assess the High Failure Modes of an Advanced Radiotherapy Procedure Such as Stereotactic Body Radiation Therapy?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, D; Vile, D; Rosu, M

    Purpose: Assess the correct implementation of the risk-based methodology of TG 100 to optimize quality management and patient safety procedures for Stereotactic Body Radiation Therapy. Methods: A detailed process map of the SBRT treatment procedure was generated by a team of three physicists with varying clinical experience at our institution to assess the potential high-risk failure modes. The probabilities of occurrence (O), severity (S) and detectability (D) for each potential failure mode in each step of the process map were assigned by these individuals independently on a scale from 1 to 10. The risk priority numbers (RPN) were computed and analyzed. The highest 30 potential modes from each physicist's analysis were then compared. Results: The RPN values assessed by the three physicists ranged from 30 to 300. The magnitudes of the RPN values from each physicist were different, and there was no concordance in the highest RPN values recorded by the three physicists independently. The 10 highest RPN values belonged to sub-steps of CT simulation, contouring and delivery in the SBRT process map. For these 10 highest RPN values, at least two physicists, irrespective of their length of experience, had concordance, but no general conclusions emerged. Conclusion: This study clearly shows that the risk-based assessment of a clinical process map requires a great deal of preparation, group discussion, and participation by all stakeholders. One group, albeit physicists, cannot effectively implement the risk-based methodology proposed by TG 100. It should be a team effort in which the physicists can certainly play the leading role. This also corroborates the TG 100 recommendation that risk-based assessment of clinical processes is a multidisciplinary team effort.
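
    The scoring arithmetic at the heart of this exercise is simply RPN = O × S × D on 1–10 scales. A minimal sketch with hypothetical failure modes:

    ```python
    # FMEA risk scoring per TG 100: RPN = O x S x D ranks failure modes.
    # The failure modes and scores below are hypothetical.
    failure_modes = [
        # (process step, failure mode, O, S, D)
        ("CT simulation", "wrong immobilization device", 4, 7, 6),
        ("Contouring",    "target volume mis-delineated", 5, 9, 5),
        ("Delivery",      "incorrect couch shift applied", 3, 9, 4),
    ]

    ranked = sorted(failure_modes, key=lambda m: m[2] * m[3] * m[4],
                    reverse=True)
    for step, mode, o, s, d in ranked:
        print(f"RPN={o * s * d:3d}  {step}: {mode}")
    ```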

  17. Integrating the human element into the systems engineering process and MBSE methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tadros, Michael Samir

    In response to the challenges related to the increasing size and complexity of systems, organizations have recognized the need to integrate human considerations in the beginning stages of systems development. Human Systems Integration (HSI) seeks to accomplish this objective by incorporating human factors within systems engineering (SE) processes and methodologies, which is the focus of this paper. A representative set of HSI methods from multiple sources are organized, analyzed, and mapped to the systems engineering Vee-model. These methods are then consolidated and evaluated against the SE process and Model-Based Systems Engineering (MBSE) methodology to determine where and how they could integrate within systems development activities in the form of specific enhancements. Overall conclusions based on these evaluations are presented and future research areas are proposed.

  18. Topological data analysis of contagion maps for examining spreading processes on networks.

    PubMed

    Taylor, Dane; Klimm, Florian; Harrington, Heather A; Kramár, Miroslav; Mischaikow, Konstantin; Porter, Mason A; Mucha, Peter J

    2015-07-21

    Social and biological contagions are influenced by the spatial embeddedness of networks. Historically, many epidemics spread as a wave across part of the Earth's surface; however, in modern contagions long-range edges-for example, due to airline transportation or communication media-allow clusters of a contagion to appear in distant locations. Here we study the spread of contagions on networks through a methodology grounded in topological data analysis and nonlinear dimension reduction. We construct 'contagion maps' that use multiple contagions on a network to map the nodes as a point cloud. By analysing the topology, geometry and dimensionality of manifold structure in such point clouds, we reveal insights to aid in the modelling, forecast and control of spreading processes. Our approach highlights contagion maps also as a viable tool for inferring low-dimensional structure in networks.
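
    A minimal sketch of the contagion-map construction follows, assuming a Watts-type threshold contagion on a noisy ring lattice (all parameters invented): each node seeds one contagion, and every node is then embedded as the vector of its activation times across contagions, yielding the point cloud whose geometry can be probed with dimension reduction.

    ```python
    # A toy contagion map: threshold contagions on a ring lattice with a
    # few long-range edges; nodes become points via activation times.
    import numpy as np

    rng = np.random.default_rng(7)
    n, T, steps = 40, 0.3, 60

    # Ring lattice (2 neighbors per side) plus random long-range edges.
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for d in (1, 2):
            adj[i, (i + d) % n] = adj[(i + d) % n, i] = True
    for _ in range(8):
        i, j = rng.choice(n, size=2, replace=False)
        adj[i, j] = adj[j, i] = True

    def activation_times(seed):
        """Synchronous threshold contagion; time each node activates
        (np.inf if never activated within the step budget)."""
        active = np.zeros(n, dtype=bool)
        active[[seed, (seed + 1) % n, (seed - 1) % n]] = True  # seed arc
        times = np.where(active, 0.0, np.inf)
        for t in range(1, steps):
            frac = (adj & active).sum(axis=1) / adj.sum(axis=1)
            newly = (~active) & (frac >= T)   # enough active neighbors
            times[newly] = t
            active |= newly
        return times

    # Rows: nodes; columns: contagions. Each row is a point in the cloud.
    cloud = np.array([activation_times(s) for s in range(n)]).T
    print(cloud.shape)
    ```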

  20. TESTING TREE-CLASSIFIER VARIANTS AND ALTERNATE MODELING METHODOLOGIES IN THE EAST GREAT BASIN MAPPING UNIT OF THE SOUTHWEST REGIONAL GAP ANALYSIS PROJECT (SW REGAP)

    EPA Science Inventory

    We tested two methods for dataset generation and model construction, and three tree-classifier variants to identify the most parsimonious and thematically accurate mapping methodology for the SW ReGAP project. Competing methodologies were tested in the East Great Basin mapping un...

  1. The Cellular Automata for modelling of spreading of lava flow on the earth surface

    NASA Astrophysics Data System (ADS)

    Jarna, A.

    2012-12-01

    Volcanic risk assessment is a very important scientific, political and economic issue in densely populated areas close to active volcanoes. The development of effective tools for early prediction of a potential volcanic hazard and for management of crises is paramount. To this date, volcanic hazard maps represent the most appropriate way to illustrate the geographical area that can potentially be affected by a volcanic event. Volcanic hazard maps are usually produced by mapping out old volcanic deposits; however, dynamic lava flow simulation is gaining popularity and can give crucial information to corroborate other methodologies. The methodology used here for the generation of volcanic hazard maps is based on numerical simulation of eruptive processes using the principle of Cellular Automata (CA). The Python script is integrated into ArcToolbox in ArcMap (ESRI), and the user can select several input and output parameters which influence surface morphology, the size and shape of the flow, flow thickness, flow velocity and the length of lava flows. Once the input parameters are selected, the software computes and generates hazard maps on the fly. The results can be exported to Google Maps (.kml format) to visualize the results of the computation. Data from a real lava flow are used to validate the simulation code. A comparison of the simulation results with real lava flows mapped from satellite images will be presented.
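
    A minimal sketch of such a cellular automaton is shown below, assuming a toy DEM and a rule in which each cell passes a fraction of its lava to strictly lower neighbors; the real tool's transfer rule, parameters, and ArcGIS integration are not reproduced here.

    ```python
    # A toy CA for lava spreading over a DEM: each step, cells send a
    # fraction k of the height drop to lower von Neumann neighbors
    # (toroidal borders kept for brevity). All parameters are invented.
    import numpy as np

    rng = np.random.default_rng(3)
    dem = np.cumsum(rng.random((20, 20)), axis=0)[::-1]  # sloping terrain
    lava = np.zeros_like(dem)
    vent, rate, k = (2, 10), 1.0, 0.25   # vent cell, effusion, mobility

    for step in range(200):
        lava[vent] += rate                   # erupt at the vent
        height = dem + lava
        out = np.zeros_like(lava)
        inc = np.zeros_like(lava)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = np.roll(height, (-dr, -dc), axis=(0, 1))
            drop = np.clip(height - nb, 0.0, None)    # only downhill
            flow = np.minimum(k * drop, lava - out)   # bounded by stock
            out += flow
            inc += np.roll(flow, (dr, dc), axis=(0, 1))
        lava += inc - out                    # synchronous update

    print("flooded cells:", int((lava > 0.01).sum()))
    ```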

  2. SiSeRHMap v1.0: a simulator for mapped seismic response using a hybrid model

    NASA Astrophysics Data System (ADS)

    Grelle, G.; Bonito, L.; Lampasi, A.; Revellino, P.; Guerriero, L.; Sappa, G.; Guadagno, F. M.

    2015-06-01

    SiSeRHMap is a computerized methodology capable of drawing up prediction maps of seismic response. It was realized on the basis of a hybrid model which combines different approaches and models in a new and non-conventional way. These approaches and models are organized in a code-architecture composed of five interdependent modules. A GIS (Geographic Information System) Cubic Model (GCM), which is a layered computational structure based on the concept of lithodynamic units and zones, aims at reproducing a parameterized layered subsoil model. A metamodeling process confers a hybrid nature to the methodology. In this process, the one-dimensional linear equivalent analysis produces acceleration response spectra of shear wave velocity-thickness profiles, defined as trainers, which are randomly selected in each zone. Subsequently, a numerical adaptive simulation model (Spectra) is optimized on the above trainer acceleration response spectra by means of a dedicated Evolutionary Algorithm (EA) and the Levenberg-Marquardt Algorithm (LMA) as the final optimizer. In the final step, the GCM Maps Executor module produces a serial map-set of a stratigraphic seismic response at different periods, grid-solving the calibrated Spectra model. In addition, the spectra topographic amplification is also computed by means of a numerical prediction model. This latter is built to match the results of the numerical simulations related to isolate reliefs using GIS topographic attributes. In this way, different sets of seismic response maps are developed, on which, also maps of seismic design response spectra are defined by means of an enveloping technique.

  3. Improving wait times to care for individuals with multimorbidities and complex conditions using value stream mapping.

    PubMed

    Sampalli, Tara; Desy, Michel; Dhir, Minakshi; Edwards, Lynn; Dickson, Robert; Blackmore, Gail

    2015-04-05

    Recognizing the significant impact of wait times for care for individuals with complex chronic conditions, we applied a Lean methodology, namely an adaptation of Value Stream Mapping (VSM), to meet the needs of people with multiple chronic conditions and to improve wait times without additional resources or funding. Over an 18-month period, staff applied a patient-centric approach that included the Lean VSM methodology to improve wait times to care. Our framework of evaluation was grounded in the needs and perspectives of patients and individuals waiting to receive care. Patient-centric views were obtained through surveys such as the Patient Assessment of Chronic Illness Care (PACIC) and process-engineering-based questions. In addition, VSM was used to identify non-value-added processes contributing to wait times. The care team successfully reduced wait times to 2 months in 2014, with no wait times for care anticipated in 2015. Increased patient engagement and satisfaction are also outcomes of this innovative initiative. In addition, successful transformations and implementation have resulted in resource efficiencies without an increase in costs. Patients have shown significant improvements in functional health following the Integrated Chronic Care Service (ICCS) intervention. The methodology will be applied to other chronic disease management areas in Capital Health and the province. Wait times to care in the management of multimorbidities and other complex conditions can add a significant burden not only on the affected individuals but also on the healthcare system. In this study, a novel and modified Lean methodology has been applied to embed the voice of the patient in care delivery processes and to reduce wait times in the management of complex chronic conditions. © 2015 by Kerman University of Medical Sciences.

  4. SiSeRHMap v1.0: a simulator for mapped seismic response using a hybrid model

    NASA Astrophysics Data System (ADS)

    Grelle, Gerardo; Bonito, Laura; Lampasi, Alessandro; Revellino, Paola; Guerriero, Luigi; Sappa, Giuseppe; Guadagno, Francesco Maria

    2016-04-01

    The SiSeRHMap (simulator for mapped seismic response using a hybrid model) is a computerized methodology capable of elaborating prediction maps of seismic response in terms of acceleration spectra. It was realized on the basis of a hybrid model which combines different approaches and models in a new and non-conventional way. These approaches and models are organized in a code architecture composed of five interdependent modules. A GIS (geographic information system) cubic model (GCM), which is a layered computational structure based on the concept of lithodynamic units and zones, aims at reproducing a parameterized layered subsoil model. A meta-modelling process confers a hybrid nature to the methodology. In this process, the one-dimensional (1-D) linear equivalent analysis produces acceleration response spectra for a specified number of site profiles using one or more input motions. The shear wave velocity-thickness profiles, defined as trainers, are randomly selected in each zone. Subsequently, a numerical adaptive simulation model (Emul-spectra) is optimized on the above trainer acceleration response spectra by means of a dedicated evolutionary algorithm (EA) and the Levenberg-Marquardt algorithm (LMA) as the final optimizer. In the final step, the GCM maps executor module produces a serial map set of a stratigraphic seismic response at different periods, grid solving the calibrated Emul-spectra model. In addition, the spectra topographic amplification is also computed by means of a 3-D validated numerical prediction model. This model is built to match the results of the numerical simulations related to isolate reliefs using GIS morphometric data. In this way, different sets of seismic response maps are developed on which maps of design acceleration response spectra are also defined by means of an enveloping technique.

  5. Processing Satellite Imagery To Detect Waste Tire Piles

    NASA Technical Reports Server (NTRS)

    Skiles, Joseph; Schmidt, Cynthia; Wuinlan, Becky; Huybrechts, Catherine

    2007-01-01

    A methodology for processing commercially available satellite spectral imagery has been developed to enable identification and mapping of waste tire piles in California. The California Integrated Waste Management Board initiated the project and provided funding for the method's development. The methodology includes the use of a combination of previously commercially available image-processing and georeferencing software to develop a model that specifically distinguishes between tire piles and other objects. The methodology reduces the time that must be spent to initially survey a region for tire sites, thereby increasing the time inspectors and managers have available for remediation of the sites. Remediation is needed because millions of used tires are discarded every year, waste tire piles pose fire hazards, and mosquitoes often breed in water trapped in tires. It should be possible to adapt the methodology to regions outside California by modifying some of the algorithms implemented in the software to account for geographic differences in spectral characteristics associated with terrain and climate. The task of identifying tire piles in satellite imagery is uniquely challenging because of their low reflectance levels: tires tend to be spectrally confused with shadows and deep water, both of which reflect little light to satellite-borne imaging systems. In this methodology, the challenge is met, in part, by use of software that implements the Tire Identification from Reflectance (TIRe) model. The development of the TIRe model included incorporation of lessons learned in previous research on the detection and mapping of tire piles by use of manual/visual and/or computational analysis of aerial and satellite imagery. The TIRe model is a computational model for identifying tire piles and discriminating between tire piles and other objects. The input to the TIRe model is the georeferenced but otherwise raw satellite spectral images of a geographic region to be surveyed. The TIRe model identifies the darkest objects in the images and, on the basis of spatial and spectral image characteristics, discriminates against other dark objects, which can include vegetation, some bodies of water, and dark soils. The TIRe model can identify piles of as few as 100 tires. The output of the TIRe model is a binary mask showing areas containing suspected tire piles and spectrally similar features. This mask is overlaid on the original satellite imagery and examined by a trained image analyst, who strives to further discriminate against non-tire objects that the TIRe model tentatively identified as tire piles. After the analyst has made adjustments, the mask is used to create a synoptic, geographically accurate tire-pile survey map, which can be overlaid with a road map and/or any other map or set of georeferenced data, according to a customer's preferences.
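
    The dark-object screening idea can be sketched compactly. Thresholds, band arithmetic, and object sizes below are invented for illustration and are not the TIRe model's actual parameters.

    ```python
    # A toy dark-object screen: threshold the darkest pixels, group them
    # into objects, and keep objects whose size and band response are
    # tire-like. All thresholds and bands are hypothetical.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(5)
    red = rng.uniform(0.05, 0.5, (100, 100))   # stand-in reflectance bands
    nir = rng.uniform(0.05, 0.5, (100, 100))

    dark = (red < 0.10) & (nir < 0.10)         # candidate dark pixels
    labels, nobj = ndimage.label(dark)          # connected components

    mask = np.zeros_like(dark)
    for obj in range(1, nobj + 1):
        pix = labels == obj
        if pix.sum() < 4:                      # too small to be a pile
            continue
        # Vegetation and water are dark in red but differ in NIR; require
        # a flat red/NIR response, as rubber is dark in both bands.
        if abs(red[pix].mean() - nir[pix].mean()) < 0.02:
            mask |= pix
    print("suspected tire-pile pixels:", int(mask.sum()))
    ```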

  6. Defining process design space for monoclonal antibody cell culture.

    PubMed

    Abu-Absi, Susan Fugett; Yang, LiYing; Thompson, Patrick; Jiang, Canping; Kandula, Sunitha; Schilling, Bernhard; Shukla, Abhinav A

    2010-08-15

    The concept of design space has been taking root as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. During mapping of the process design space, the multidimensional combination of operational variables is studied to quantify the impact on process performance in terms of productivity and product quality. An efficient methodology to map the design space for a monoclonal antibody cell culture process is described. A failure modes and effects analysis (FMEA) was used as the basis for the process characterization exercise. This was followed by an integrated study of the inoculum stage of the process which includes progressive shake flask and seed bioreactor steps. The operating conditions for the seed bioreactor were studied in an integrated fashion with the production bioreactor using a two stage design of experiments (DOE) methodology to enable optimization of operating conditions. A two level Resolution IV design was followed by a central composite design (CCD). These experiments enabled identification of the edge of failure and classification of the operational parameters as non-key, key or critical. In addition, the models generated from the data provide further insight into balancing productivity of the cell culture process with product quality considerations. Finally, process and product-related impurity clearance was evaluated by studies linking the upstream process with downstream purification. Production bioreactor parameters that directly influence antibody charge variants and glycosylation in CHO systems were identified.
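
    The two-stage DOE structure can be illustrated with a short sketch that generates coded run matrices: a two-level factorial followed by a face-centered central composite design for two hypothetical bioreactor parameters (names and ranges assumed; the study's actual Resolution IV design covered more factors).

    ```python
    # A minimal sketch of factorial + face-centered CCD run generation
    # for two assumed parameters (pH and temperature set points).
    import itertools
    import numpy as np

    # Stage 1: two-level factorial in coded units (-1, +1).
    factorial = np.array(list(itertools.product([-1, 1], repeat=2)))

    # Stage 2: face-centered CCD = factorial + axial + center points.
    axial = np.array([[-1, 0], [1, 0], [0, -1], [0, 1]])
    center = np.zeros((1, 2))
    ccd = np.vstack([factorial, axial, center])

    # Decode to physical units: pH 6.8-7.2, temperature 35-37 C (assumed).
    lo = np.array([6.8, 35.0])
    hi = np.array([7.2, 37.0])
    runs = lo + (ccd + 1) / 2 * (hi - lo)
    print(runs)
    ```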

  7. Scoping meta-review: introducing a new methodology.

    PubMed

    Sarrami-Foroushani, Pooria; Travaglia, Joanne; Debono, Deborah; Clay-Williams, Robyn; Braithwaite, Jeffrey

    2015-02-01

    For researchers, policymakers, and practitioners facing a new field, undertaking a systematic review can typically present a challenge due to the enormous number of relevant papers. A scoping review is a method suggested for addressing this dilemma; however, scoping reviews present their own challenges. This paper introduces the "scoping meta-review" (SMR) for expanding current methodologies and is based on our experiences in mapping the field of consumer engagement in healthcare. During this process, we developed the novel SMR method. An SMR combines aspects of a scoping review and a meta-review to establish an evidence-based map of a field. Similar to a scoping review, an SMR offers a practical and flexible methodology. However, unlike in a traditional scoping review, only systematic reviews are included. Stages of the SMR include: undertaking a preliminary nonsystematic review; building a search strategy; interrogating academic literature databases; classifying and excluding studies based on titles and abstracts; saving the refined database of references; revising the search strategy; selecting and reviewing the full text papers; and thematically analyzing the selected texts and writing the report. The main benefit of an SMR is to map a new field based on high-level evidence provided by systematic reviews. © 2014 Wiley Periodicals, Inc.

  8. A methodology for small scale rural land use mapping in semi-arid developing countries using orbital imagery. Part 3: Review of land use surveys using orbital imagery in the USA

    NASA Technical Reports Server (NTRS)

    Vangenderen, J. L. (Principal Investigator); Lock, B. F.

    1976-01-01

    The author has identified the following significant results. Techniques of preprocessing, interpretation, classification, and ground truth sampling were studied. The review showed the need for a low-cost, low-level-technology, viable, operational methodology to replace the emphasis given in the U.S. to machine processing, which many developing countries cannot afford, understand, or implement.

  9. Zones of Difference, Boundaries of Access: Moral Geography and Community Mapping in Abidjan, Côte d'Ivoire.

    PubMed

    Thomann, Matthew

    2016-01-01

    In Abidjan, Côte d'Ivoire, 18% of men who have sex with men (MSM) are HIV-positive. Based on ethnographic research conducted among HIV peer educators and activists in Abidjan, I examine their narratives and hand-drawn maps of city space. I draw on a methodological process of map-making to examine research participants' evaluations of neighborhoods and link these evaluations to debates over national and cultural belonging in Côte d'Ivoire. I suggest a moral geography emerges from the maps and narratives and ask what the bioethical implications of moral geography are in the context of service delivery and activism among sexual minorities.

  10. A Karnaugh map based approach towards systemic reviews and meta-analysis.

    PubMed

    Hassan, Abdul Wahab; Hassan, Ahmad Kamal

    2016-01-01

    The long-standing study of meta-analyses and systemic reviews has helped researchers draw conclusions from numerous parallel or conflicting studies. Existing studies are presented in tabulated forms which contain appropriate information for specific cases, yet are difficult to visualize. In the meta-analysis of data, this can lead to absorption and subsumption errors, with the undesirable potential for consecutive misunderstandings in social and operational methodologies. The purpose of this study is to investigate an alternate forum for meta-data presentation that relies on humans' strong pictorial perception capability. Analysis of big data is assumed to be a complex and daunting task often reserved for the computational powers of machines, yet there exist mapping tools which can analyze such data in a hands-on manner. Data analysis on such a scale can benefit from the use of statistical tools like Karnaugh maps, where all studies can be put together in a graph-based mapping. Such a formulation can lead to more control in observing patterns in the research community and in analyzing further for uncertainty and reliability metrics. We present a methodological process of converting a well-established study in health care to an equivalent binary representation, followed by furnishing the values onto a Karnaugh map. The data used for the studies presented herein are from Burns et al. (J Publ Health 34(1):138-148, 2011), consisting of retrospectively collected data sets from various studies on clinical coding data accuracy. Using a customized filtration process, a total of 25 studies were selected for review with no, partial, or complete knowledge of six independent variables, thus forming 64 independent cells on a Karnaugh map. The study concluded that this pictorial graphing, as expected, helped in simplifying the overview of meta-analyses and systemic reviews.
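
    The core encoding is easy to sketch: six binary variables define 2^6 = 64 Karnaugh cells, and each study is counted in the cell matching its variable pattern. The variables and studies below are hypothetical stand-ins for the clinical-coding-accuracy attributes.

    ```python
    # A toy Karnaugh-cell tally: encode each study as a 6-bit string over
    # six binary attributes and count studies per cell (64 cells total).
    from collections import Counter

    VARS = ["A", "B", "C", "D", "E", "F"]      # six binary attributes

    studies = [
        {"A": 1, "B": 0, "C": 1, "D": 0, "E": 0, "F": 1},
        {"A": 1, "B": 0, "C": 1, "D": 0, "E": 0, "F": 1},
        {"A": 0, "B": 1, "C": 0, "D": 1, "E": 1, "F": 0},
    ]

    cells = Counter("".join(str(s[v]) for v in VARS) for s in studies)
    for cell, count in sorted(cells.items()):
        print(f"cell {cell}: {count} study(ies)")
    ```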

  11. Attribution of Net Carbon Change by Disturbance Type across Forest Lands of the Continental United States

    NASA Astrophysics Data System (ADS)

    Hagen, S. C.; Harris, N.; Saatchi, S. S.; Domke, G. M.; Woodall, C. W.; Pearson, T.

    2016-12-01

    We generated spatially comprehensive maps of carbon stocks and net carbon changes from US forestlands between 2005 and 2010 and attributed the changes to natural and anthropogenic processes. The prototype system created to produce these maps is designed to assist with national GHG inventories and support decisions associated with land management. Here, we present the results and methodological framework of our analysis. In summary, combining estimates of net C losses and gains results in a net carbon change of 269±49 Tg C yr-1 (sink) in the coterminous US forest land, with carbon loss from harvest acting as the predominant source process.

  12. Sustainability in Brazilian Federal Universities

    ERIC Educational Resources Information Center

    Palma, Lisiane Celia; de Oliveira, Lessandra M.; Viacava, Keitiline R.

    2011-01-01

    Purpose: The purpose of this paper is to identify the number of courses related to sustainability offered in bachelor degree programs of business administration in Brazilian federal universities. Design/methodology/approach: An exploratory research was carried out based on a descriptive scope. The process of mapping federal universities in Brazil…

  13. Massive Cloud Computing Processing of P-SBAS Time Series for Displacement Analyses at Large Spatial Scale

    NASA Astrophysics Data System (ADS)

    Casu, F.; de Luca, C.; Lanari, R.; Manunta, M.; Zinno, I.

    2016-12-01

    A methodology for computing surface deformation time series and mean velocity maps of large areas is presented. Our approach relies on the availability of a multi-temporal set of Synthetic Aperture Radar (SAR) data collected from ascending and descending orbits over an area of interest, and permits estimation of the vertical and horizontal (east-west) displacement components of the Earth's surface. The adopted methodology is based on an advanced Cloud Computing implementation of the Differential SAR Interferometry (DInSAR) Parallel Small Baseline Subset (P-SBAS) processing chain, which allows the unsupervised processing of large SAR data volumes, from the raw (level-0) imagery up to the generation of DInSAR time series and maps. The presented solution, which is highly scalable, has been tested on the ascending and descending ENVISAT SAR archives acquired over a large area of Southern California (US) that extends for about 90,000 km2. This input dataset was processed in parallel by exploiting 280 computing nodes of the Amazon Web Services Cloud environment. Moreover, to produce the final mean deformation velocity maps of the vertical and east-west displacement components of the whole investigated area, we also took advantage of the information available from external GPS measurements, which make it possible to account for regional trends not easily detectable by DInSAR and to refer the P-SBAS measurements to an external geodetic datum. The presented results clearly demonstrate the effectiveness of the proposed approach, which paves the way to the extensive use of the available ERS and ENVISAT SAR data archives. Furthermore, the proposed methodology is particularly suitable for dealing with the very large data flow provided by the Sentinel-1 constellation, thus permitting the extension of DInSAR analyses to a nearly global scale. This work is partially supported by the DPC-CNR agreement, the EPOS-IP project and the ESA GEP project.

  14. Hybrid optical acoustic seafloor mapping

    NASA Astrophysics Data System (ADS)

    Inglis, Gabrielle

    The oceanographic research and industrial communities have a persistent demand for detailed three-dimensional seafloor maps which convey both shape and texture. Such data products are used for archeology, geology, ship inspection, biology, and habitat classification. There are a variety of sensing modalities and processing techniques available to produce these maps, each with its own potential benefits and related challenges. Multibeam sonar and stereo vision are two such sensors with complementary strengths, making them ideally suited for data fusion. Data fusion approaches, however, have seen only limited application to underwater mapping, and there are no established methods for creating hybrid 3D reconstructions from two underwater sensing modalities. This thesis develops a processing pipeline to synthesize hybrid maps from multi-modal survey data. It is helpful to think of this processing pipeline as having two distinct phases: navigation refinement and map construction. This thesis extends existing work in underwater navigation refinement by incorporating methods which increase measurement consistency between the multibeam and the camera. The result is a self-consistent 3D point cloud comprised of camera and multibeam measurements. In the map construction phase, a subset of the multi-modal point cloud retaining the best characteristics of each sensor is selected to be part of the final map. To quantify the desired traits of a map, several characteristics of a useful map are distilled into specific criteria. The different ways that hybrid maps can address these criteria provide justification for producing them as an alternative to current methodologies. The processing pipeline implements multi-modal data fusion and outlier rejection with emphasis on different aspects of map fidelity. The resulting point cloud is evaluated in terms of how well it addresses the map criteria. The final hybrid maps retain the strengths of both sensors and show significant improvement over the single-modality maps and naively assembled multi-modal maps.

  15. Using Lean Six Sigma Methodology to Improve Quality of the Anesthesia Supply Chain in a Pediatric Hospital.

    PubMed

    Roberts, Renée J; Wilson, Ashley E; Quezado, Zenaide

    2017-03-01

    Six Sigma and Lean methodologies are effective quality improvement tools in many health care settings. We applied the DMAIC methodology (define, measure, analyze, improve, control) to address deficiencies in our pediatric anesthesia supply chain. We defined supply chain problems by mapping existing processes and soliciting comments from those involved. We used daily distance walked by anesthesia technicians and number of callouts for missing supplies as measurements that we analyzed before and after implementing improvements (anesthesia cart redesign). We showed improvement in the metrics after those interventions were implemented, and those improvements were sustained and thus controlled 1 year after implementation.

  16. Low cost, multiscale and multi-sensor application for flooded area mapping

    NASA Astrophysics Data System (ADS)

    Giordan, Daniele; Notti, Davide; Villa, Alfredo; Zucca, Francesco; Calò, Fabiana; Pepe, Antonio; Dutto, Furio; Pari, Paolo; Baldo, Marco; Allasia, Paolo

    2018-05-01

    Flood mapping and estimation of the maximum water depth are essential elements for first damage evaluation, civil protection intervention planning and detection of areas where remediation is needed. In this work, we present and discuss a methodology for mapping and quantifying flood severity over floodplains. The proposed methodology considers a multiscale and multi-sensor approach using free or low-cost data and sensors. We applied this method to the November 2016 Piedmont (northwestern Italy) flood. We first mapped the flooded areas at the basin scale using free satellite data of low to medium-high resolution from both SAR (Sentinel-1, COSMO-SkyMed) and multispectral sensors (MODIS, Sentinel-2). Using very- and ultra-high-resolution images from a low-cost aerial platform and a remotely piloted aerial system, we refined the flooded zone and detected the most damaged sector. The presented method considers both urbanised and non-urbanised areas. Nadir images have several limitations, in particular in urbanised areas, where the use of terrestrial images overcame this limitation. Very- and ultra-high-resolution images were processed with structure from motion (SfM) for the creation of 3-D models. These data, combined with an available digital terrain model, allowed us to obtain maps of the flooded area, maximum high-water area and damaged infrastructure.

  17. A simple landslide susceptibility analysis for hazard and risk assessment in developing countries

    NASA Astrophysics Data System (ADS)

    Guinau, M.; Vilaplana, J. M.

    2003-04-01

    In recent years, a number of techniques and methodologies have been developed for mitigating natural disasters. The complexity of these methodologies and the scarcity of material and data series justify the need for simple methodologies to obtain the necessary information for minimising the effects of catastrophic natural phenomena. Work with polygonal maps in a GIS allowed us to develop a simple methodology, applied in an area of 473 km2 in the Departamento de Chinandega (NW Nicaragua). This area was severely affected by a large number of landslides (mainly debris flows) triggered by the Hurricane Mitch rainfalls in October 1998. With the aid of aerial photograph interpretation at 1:40,000 scale, enlarged to 1:20,000, and detailed field work, a landslide map at 1:10,000 scale was constructed. The failure zones of landslides were digitized in order to obtain a failure zone digital map. A terrain unit digital map, in which a series of physical-environmental terrain factors are represented, was also used. Dividing the studied area into two zones (A and B) with homogeneous physical and environmental characteristics allowed us to develop the proposed methodology and to validate it. In zone A, the failure zone digital map is superimposed onto the terrain unit digital map to establish the relationship between the different terrain factors and the failure zones. The numerical expression of this relationship enables us to classify the terrain by its landslide susceptibility. In zone B, this numerical relationship was employed to obtain a landslide susceptibility map, obviating the need for a failure zone map. The validity of the methodology can be tested in this area by using the degree of superposition of the susceptibility map and the failure zone map. The implementation of the methodology in tropical countries with physical and environmental characteristics similar to those of the study area allows a landslide susceptibility analysis to be carried out in areas where landslide records do not exist. This analysis is essential to landslide hazard and risk assessment, which is necessary to determine the actions for mitigating landslide effects, e.g. land planning, emergency aid actions, etc.
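
    A minimal sketch of the zone A calibration described above, under the assumption that both maps are co-registered rasters: the susceptibility index of each terrain unit is taken as the fraction of its area covered by failure zones, which can then be reapplied to the terrain units of zone B.

    import numpy as np

    def unit_susceptibility(unit_raster, failure_raster):
        """Fraction of each terrain unit's area occupied by failure
        zones (zone A calibration). unit_raster holds integer terrain
        unit ids; failure_raster is a boolean failure-zone mask."""
        n_units = unit_raster.max() + 1
        unit_area = np.bincount(unit_raster.ravel(), minlength=n_units)
        fail_area = np.bincount(unit_raster.ravel(),
                                weights=failure_raster.ravel().astype(float),
                                minlength=n_units)
        with np.errstate(invalid="ignore", divide="ignore"):
            return np.where(unit_area > 0, fail_area / unit_area, 0.0)

    # Zone B: reuse the zone A ratios to map susceptibility without a
    # failure inventory, e.g. susceptibility_B = ratios[unit_raster_B].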

  18. The uses of emotion maps in research and clinical practice with families and couples: methodological innovation and critical inquiry.

    PubMed

    Gabb, Jacqui; Singh, Reenee

    2015-03-01

    We explore how "emotion maps" can be productively used in clinical assessment and clinical practice with families and couples. This graphic participatory method was developed in sociological studies to examine everyday family relationships. Emotion maps enable us to effectively "see" the dynamic experience and emotional repertoires of family life. Through the use of a case example, in this article we illustrate how emotion maps can add to the systemic clinicians' repertoire of visual methods. For clinicians working with families, couples, and young people, the importance of gaining insight into how lives are lived, at home, cannot be overstated. Producing emotion maps can encourage critical personal reflection and expedite change in family practice. Hot spots in the household become visualized, facilitating dialogue on prevailing issues and how these events may be perceived differently by different family members. As emotion maps are not reliant on literacy or language skills, they can be completed by parents and children alike, enabling children's perspectives to be heard. Emotion maps can be used as assessment tools, to demonstrate the process of change within families. Furthermore, emotion maps can be extended to use through technology and hence are particularly well suited to working with young people. We end the article with a wider discussion of the place of emotions and emotion maps within systemic psychotherapy. © 2014 The Authors. Family Process published by Wiley Periodicals, Inc. on behalf of Family Process Institute.

  19. Combination of Geophysical Methods to Support Urban Geological Mapping

    NASA Astrophysics Data System (ADS)

    Gabàs, A.; Macau, A.; Benjumea, B.; Bellmunt, F.; Figueras, S.; Vilà, M.

    2014-07-01

    Urban geological mapping is key to assisting the management of newly developed areas, the conversion of current urban areas and the assessment of urban geological hazards. Geophysics can play a pivotal role in yielding subsurface information in urban areas, provided that geophysical methods are capable of dealing with the challenges of these scenarios (e.g., low signal-to-noise ratio or special logistical arrangements). With this principal aim, a specific methodology is developed to characterize lithological changes, to image fault zones and to delineate basin geometry in urban areas. The process combines passive and active techniques as complementary data: the controlled source audio-magnetotelluric method (CSAMT), the magnetotelluric method (MT), microtremor H/V analysis and ambient noise array measurements, to overcome the limitations of traditional geophysical methodology. This study is focused on the Girona and Salt surrounding areas (NE Spain), where some uncertainties in subsurface knowledge (maps of bedrock depth and isopach maps of the thickness of Quaternary sediments) need to be resolved to carry out the 1:5,000 urban geological mapping. These parameters can be estimated using the proposed methodology. (1) The acoustic impedance contrast between Neogene sediments and Paleogene or Paleozoic bedrock is detected with microtremor H/V analysis, which provides the soil resonance frequency. The minimum value obtained is 0.4 Hz in Salt city, and the maximum value is 9.5 Hz in Girona city. The result of this first method is a fast scan of the geometry of the basement. (2) Ambient noise arrays constrain the bedrock depth using measurements of the shear-wave velocity of soft soil. (3) Finally, the electrical resistivity models contribute a good description of lithological changes and fault imaging. The conductive materials (1-100 Ωm) are associated with the Neogene Basin, composed of unconsolidated detrital sediments; medium-resistivity materials (100-400 Ωm) correspond to the Paleogene, and resistive materials (600-1,000 Ωm) are related to the complex basement of Paleozoic granite. The Neogene basin-basement boundary is constrained between the surface and approximately 500 m depth. The new geophysical methodology presented is an optimized and fast tool to refine geological mapping by adding 2D information to traditional geological data and improving the knowledge of the subsoil.
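
    Steps (1) and (2) rest on the standard quarter-wavelength relation between the soil resonance frequency f0 and the soft-layer thickness h, f0 = Vs/(4h). A minimal sketch, with an assumed average shear-wave velocity of 400 m/s purely for illustration (the study measures Vs with ambient noise arrays):

    def bedrock_depth(vs_mps, f0_hz):
        """Quarter-wavelength estimate of soft-soil thickness:
        f0 = Vs / (4 h)  =>  h = Vs / (4 f0)."""
        return vs_mps / (4.0 * f0_hz)

    # Illustrative only (Vs = 400 m/s is an assumed value); the f0
    # extremes reported in the abstract bracket the basin depth:
    print(bedrock_depth(400, 0.4))   # ~250 m  (Salt, f0 = 0.4 Hz)
    print(bedrock_depth(400, 9.5))   # ~10.5 m (Girona, f0 = 9.5 Hz)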

  20. Time-varying bispectral analysis of visually evoked multi-channel EEG

    NASA Astrophysics Data System (ADS)

    Chandran, Vinod

    2012-12-01

    Theoretical foundations of higher order spectral analysis are revisited to examine the use of time-varying bicoherence on non-stationary signals using a classical short-time Fourier approach. A methodology is developed to apply this to evoked EEG responses where a stimulus-locked time reference is available. Short-time windowed ensembles of the response at the same offset from the reference are considered as ergodic cyclostationary processes within a non-stationary random process. Bicoherence can be estimated reliably with known levels at which it is significantly different from zero and can be tracked as a function of offset from the stimulus. When this methodology is applied to multi-channel EEG, it is possible to obtain information about phase synchronization at different regions of the brain as the neural response develops. The methodology is applied to analyze the evoked EEG response to flash visual stimuli to the left and right eye separately. The EEG electrode array is segmented based on bicoherence evolution with time using the mean absolute difference as a measure of dissimilarity. Segment maps confirm the importance of the occipital region in visual processing and demonstrate a link between the frontal and occipital regions during the response. Maps are constructed using bicoherence at bifrequencies that include the alpha band frequency of 8 Hz as well as 4 and 20 Hz. Differences are observed between responses from the left eye and the right eye, and also between subjects. The methodology shows potential as a neurological functional imaging technique that can be further developed for diagnosis and monitoring using scalp EEG, which is less invasive and less expensive than magnetic resonance imaging.
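
    The following sketch shows one standard way to estimate ensemble bicoherence from stimulus-locked trial windows, in the spirit of the short-time approach described above; the estimator (a normalized bispectrum with a Hann taper) is the textbook definition, not necessarily the author's exact implementation.

    import numpy as np

    def bicoherence(trials):
        """Ensemble bicoherence estimate. trials: (n_trials, n_samples)
        array of stimulus-locked EEG windows at one latency offset.
        Returns b2[f1, f2] in [0, 1] for f1 + f2 below Nyquist."""
        n_trials, n = trials.shape
        X = np.fft.rfft(trials * np.hanning(n), axis=1)
        nf = X.shape[1] // 2              # keep f1 + f2 within range
        f1 = np.arange(nf)[:, None]
        f2 = np.arange(nf)[None, :]
        # Triple product X(f1) X(f2) X*(f1+f2), averaged over trials.
        triple = X[:, f1] * X[:, f2] * np.conj(X[:, f1 + f2])
        num = np.abs(triple.mean(axis=0)) ** 2
        den = (np.abs(X[:, f1] * X[:, f2]) ** 2).mean(axis=0) * \
              (np.abs(X[:, f1 + f2]) ** 2).mean(axis=0)
        return num / np.clip(den, 1e-20, None)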

  1. Processing ultrasonic inspection data from multiple scan patterns for turbine rotor weld build-up evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guan, Xuefei; Zhou, S. Kevin; Rasselkorde, El Mahjoub

    The study presents a data processing methodology for weld build-up using multiple scan patterns. To achieve an overall high probability of detection for flaws with different orientations, an inspection procedure with three different scan patterns is proposed. The three scan patterns are a radial-tangential longitude wave pattern, an axial-radial longitude wave pattern, and a tangential shear wave pattern. Scientific fusion of the inspection data is implemented using volume reconstruction techniques. The idea is to perform spatial-domain forward data mapping for all sampling points. A conservative scheme is employed to handle the case in which multiple sampling points are mapped to one grid location. The scheme assigns the maximum value to the grid location to retain the largest equivalent reflector size for that location. The methodology is demonstrated and validated using a realistic ring of weld build-up. Tungsten balls and bars are embedded in the weld build-up during the manufacturing process to represent natural flaws. Flat-bottomed holes and side-drilled holes are installed as artificial flaws. Automatic flaw identification and extraction are demonstrated. Results indicate the inspection procedure with multiple scan patterns can identify all the artificial and natural flaws.

  2. Processing ultrasonic inspection data from multiple scan patterns for turbine rotor weld build-up evaluations

    NASA Astrophysics Data System (ADS)

    Guan, Xuefei; Rasselkorde, El Mahjoub; Abbasi, Waheed; Zhou, S. Kevin

    2015-03-01

    The study presents a data processing methodology for weld build-up using multiple scan patterns. To achieve an overall high probability of detection for flaws with different orientations, an inspection procedure with three different scan patterns is proposed. The three scan patterns are a radial-tangential longitude wave pattern, an axial-radial longitude wave pattern, and a tangential shear wave pattern. Scientific fusion of the inspection data is implemented using volume reconstruction techniques. The idea is to perform spatial-domain forward data mapping for all sampling points. A conservative scheme is employed to handle the case in which multiple sampling points are mapped to one grid location. The scheme assigns the maximum value to the grid location to retain the largest equivalent reflector size for that location. The methodology is demonstrated and validated using a realistic ring of weld build-up. Tungsten balls and bars are embedded in the weld build-up during the manufacturing process to represent natural flaws. Flat-bottomed holes and side-drilled holes are installed as artificial flaws. Automatic flaw identification and extraction are demonstrated. Results indicate the inspection procedure with multiple scan patterns can identify all the artificial and natural flaws.
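
    The conservative gridding scheme described in both records above is easy to state in code. A minimal sketch, assuming the sampling points from all scan patterns have already been mapped into one Cartesian frame; the names and grid parameters are illustrative.

    import numpy as np

    def grid_max_amplitude(points, amps, origin, spacing, shape):
        """Forward-map irregular ultrasonic sampling points onto a
        regular grid; where several samples land in one cell, keep the
        maximum so the largest equivalent reflector size is retained."""
        idx = np.floor((points - origin) / spacing).astype(int)
        ok = np.all((idx >= 0) & (idx < np.array(shape)), axis=1)
        grid = np.zeros(shape, dtype=float)
        np.maximum.at(grid, tuple(idx[ok].T), amps[ok])
        return grid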

  3. Advancing scoping study methodology: a web-based survey and consultation of perceptions on terminology, definition and methodological steps.

    PubMed

    O'Brien, Kelly K; Colquhoun, Heather; Levac, Danielle; Baxter, Larry; Tricco, Andrea C; Straus, Sharon; Wickerson, Lisa; Nayar, Ayesha; Moher, David; O'Malley, Lisa

    2016-07-26

    Scoping studies (or reviews) are a method used to comprehensively map evidence across a range of study designs in an area, with the aim of informing future research practice, programs and policy. However, no universal agreement exists on terminology, definition or methodological steps. Our aim was to understand the experiences of, and considerations for, conducting scoping studies from the perspective of academic and community partners. Primary objectives were to 1) describe experiences conducting scoping studies, including strengths and challenges; and 2) describe perspectives on terminology, definition, and methodological steps. We conducted a cross-sectional web-based survey with clinicians, educators, researchers, knowledge users, representatives from community-based organizations, graduate students, and policy stakeholders with experience and/or interest in conducting scoping studies to gain an understanding of experiences and perspectives on the conduct and reporting of scoping studies. We administered an electronic self-reported questionnaire comprised of 22 items related to experiences with scoping studies, strengths and challenges, opinions on terminology, and methodological steps. We analyzed questionnaire data using descriptive statistics and content analytical techniques. Survey results were discussed during a multi-stakeholder consultation to identify key considerations in the conduct and reporting of scoping studies. Of the 83 invitations, 54 individuals (65%) completed the scoping questionnaire, and 48 (58%) attended the scoping study meeting, from Canada, the United Kingdom and the United States. Many scoping study strengths were also identified as challenges, including breadth of scope and iterative process. No consensus on terminology emerged; however, key defining features that comprised a working definition of scoping studies included the exploratory mapping of literature in a field, an iterative process, inclusion of grey literature, no quality assessment of included studies, and an optional consultation phase. We offer considerations for the conduct and reporting of scoping studies for researchers, clinicians and knowledge users engaging in this methodology. Lack of consensus on scoping terminology, definition and methodological steps persists. Reasons for this may be attributed to the diversity of disciplines adopting this methodology for differing purposes. Further work is needed to establish guidelines on the reporting and methodological quality assessment of scoping studies.

  4. Research, methodology, and applications of probabilistic seismic-hazard mapping of the Central and Eastern United States; minutes of a workshop on June 13-14, 2000, at Saint Louis University

    USGS Publications Warehouse

    Wheeler, Russell L.; Perkins, David M.

    2000-01-01

    The U.S. Geological Survey (USGS) is updating and revising its 1996 national seismic-hazard maps for release in 2001. Part of this process is the convening of four regional workshops with earth scientists and other users of the maps. The second of these workshops was sponsored by the USGS and the Mid-America Earthquake Center, and was hosted by Saint Louis University on June 13-14, 2000. The workshop concentrated on the central and eastern U.S. (CEUS) east of the Rocky Mountains. The tasks of the workshop were to (1) evaluate new research findings that are relevant to seismic hazard mapping, (2) discuss modifications in the inputs and methodology used in the national maps, (3) discuss concerns by engineers and other users about the scientific input to the maps and the use of the hazard maps in building codes, and (4) identify needed research in the CEUS that can improve the seismic hazard maps and reduce their uncertainties. These minutes summarize the workshop discussions. This is not a transcript; some individual remarks and short discussions of side issues and logistics were omitted. Named speakers were sent a draft of the minutes with a request for corrections of any errors in remarks attributed to them. Nine people returned corrections, amplifications, or approvals of their remarks as reported. The rest of this document consists of the meeting agenda, discussion summaries, and a list of the 60 attendees.

  5. Quality and Rigor of the Concept Mapping Methodology: A Pooled Study Analysis

    ERIC Educational Resources Information Center

    Rosas, Scott R.; Kane, Mary

    2012-01-01

    The use of concept mapping in research and evaluation has expanded dramatically over the past 20 years. Researchers in academic, organizational, and community-based settings have applied concept mapping successfully without the benefit of systematic analyses across studies to identify the features of a methodologically sound study. Quantitative…

  6. US Topo Maps 2014: Program updates and research

    USGS Publications Warehouse

    Fishburn, Kristin A.

    2014-01-01

    The U.S. Geological Survey (USGS) US Topo map program is now in year two of its second three-year update cycle. Since the program was launched in 2009, the product and the production system tools and processes have undergone enhancements that have made the US Topo maps a popular success story. Research and development continues with structural and content product enhancements, streamlined and more fully automated workflows, and the evaluation of a GIS-friendly US Topo GIS Packet. In addition, change detection methodologies are under evaluation to further streamline product maintenance and minimize resource expenditures for production in the future. The US Topo map program will continue to evolve in the years to come, providing traditional map users and Geographic Information System (GIS) analysts alike with a convenient, freely available product incorporating nationally consistent data that are quality assured to high standards.

  7. Combining geographic information system, multicriteria evaluation techniques and fuzzy logic in siting MSW landfills

    NASA Astrophysics Data System (ADS)

    Gemitzi, Alexandra; Tsihrintzis, Vassilios A.; Voudrias, Evangelos; Petalas, Christos; Stravodimos, George

    2007-01-01

    This study presents a methodology for siting municipal solid waste landfills, coupling geographic information systems (GIS), fuzzy logic, and multicriteria evaluation techniques. Both exclusionary and non-exclusionary criteria are used. Factors, i.e., non-exclusionary criteria, are divided into two distinct groups which do not have the same level of trade-off. The first group comprises factors related to the physical environment, which cannot be expressed in terms of monetary cost and, therefore, do not easily trade off. The second group includes those factors related to human activities, i.e., socioeconomic factors, which can be expressed as financial cost, thus showing a high level of trade-off. GIS are used for geographic data acquisition and processing. The analytical hierarchy process (AHP) is the multicriteria evaluation technique used, enhanced with fuzzy factor standardization. Besides assigning weights to factors through the AHP, control over the level of risk and trade-off in the siting process is achieved through a second set of weights, i.e., order weights, applied to the factors in each factor group on a pixel-by-pixel basis, thus taking into account the local site characteristics. The method has been applied to Evros prefecture (NE Greece), an area of approximately 4,000 km2. The siting methodology results in two intermediate suitability maps, one related to environmental and the other to socioeconomic criteria. Combination of the two intermediate maps results in the final composite suitability map for landfill siting.
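
    A hedged sketch of two ingredients named above: deriving AHP factor weights from a pairwise comparison matrix (via its principal eigenvector) and combining fuzzy-standardized factor rasters as a weighted linear sum. The pairwise values below are hypothetical, and the paper's additional order weights (applied per pixel to control risk and trade-off) are omitted for brevity.

    import numpy as np

    def ahp_weights(pairwise):
        """Priority vector of an AHP pairwise comparison matrix:
        the normalized principal right eigenvector."""
        vals, vecs = np.linalg.eig(pairwise)
        w = np.real(vecs[:, np.argmax(np.real(vals))])
        return w / w.sum()

    # Hypothetical 3-factor comparison (e.g. slope vs. distance to
    # river vs. land value) on Saaty's 1-9 scale:
    A = np.array([[1.0,   3.0, 5.0],
                  [1/3.0, 1.0, 2.0],
                  [1/5.0, 1/2.0, 1.0]])
    w = ahp_weights(A)

    def weighted_overlay(factors, weights):
        """Pixel-wise weighted linear combination of fuzzy-standardized
        factor rasters (each scaled to [0, 1])."""
        return sum(wi * f for wi, f in zip(weights, factors))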

  8. A Practical Methodology for the Systematic Development of Multiple Choice Tests.

    ERIC Educational Resources Information Center

    Blumberg, Phyllis; Felner, Joel

    Using Guttman's facet design analysis, four parallel forms of a multiple-choice test were developed. A mapping sentence, logically representing the universe of content of a basic cardiology course, specified the facets of the course and the semantic structural units linking them. The facets were: cognitive processes, disease priority, specific…

  9. Analyzing Problem's Difficulty Based on Neural Networks and Knowledge Map

    ERIC Educational Resources Information Center

    Kuo, Rita; Lien, Wei-Peng; Chang, Maiga; Heh, Jia-Sheng

    2004-01-01

    This paper proposes a methodology to calculate both the difficulty of the basic problems and the difficulty of solving a problem. The method to calculate the difficulty of problem is according to the process of constructing a problem, including Concept Selection, Unknown Designation, and Proposition Construction. Some necessary measures observed…

  10. A History and Critique of Quality Evaluation in the UK

    ERIC Educational Resources Information Center

    Harvey, Lee

    2005-01-01

    Purpose: To provide a history of the emergence of quality systems from the mid-1980s. To show how quality became a primary policy concern in higher education policy. To map the development of quality processes and raise questions about dominant approaches and express concerns for the future. Design/methodology/approach: Historical document…

  11. Investigating the Effects of Prompts on Argumentation Style, Consensus and Perceived Efficacy in Collaborative Learning

    ERIC Educational Resources Information Center

    Harney, Owen M.; Hogan, Michael J.; Broome, Benjamin; Hall, Tony; Ryan, Cormac

    2015-01-01

    This paper investigates the effects of task-level versus process-level prompts on levels of perceived and objective consensus, perceived efficacy, and argumentation style in the context of a computer-supported collaborative learning session using Interactive Management (IM), a computer facilitated thought and action mapping methodology. Four…

  12. Changes in the methodology used in the production of the Spanish CORINE: Uncertainty analysis of the new maps

    NASA Astrophysics Data System (ADS)

    García-Álvarez, David; Camacho Olmedo, María Teresa

    2017-12-01

    Since 2012, CORINE has been obtained in Spain by generalizing a more detailed land cover map (SIOSE). This methodological change has meant the production of a new CORINE map, which is different from the existing ones. To analyze how different the new maps are from the previous ones, as well as the advantages and disadvantages of the new methodology, we carried out a comparison of the CORINE maps obtained from both methods (traditional and generalization) for the year 2006. The new CORINE is more detailed and more coherent with the rest of the Spanish Land Use Land Cover (LULC) maps. However, problems have been encountered with regard to the meaning of its classes, the fragmentation of patches and the complexity of its perimeters.

  13. Differences in experiences in rockfall hazard mapping in Switzerland and Principality of Andorra

    NASA Astrophysics Data System (ADS)

    Abbruzzese, J.; Labiouse, V.

    2009-04-01

    The need to cope with rockfall hazard and risk has led many countries to adopt their own strategies for hazard mapping and risk management, based on their own social and political constraints. The experience of every single country in facing this challenge provides useful information and possible approaches to evaluate rockfall hazard and risk. Moreover, with particular regard to the hazard mapping process, some important points are common to many methodologies in Europe, especially the use of rockfall intensity-frequency diagrams to define specific hazard levels. This aspect could suggest a starting point for comparing and possibly harmonising existing methodologies. On the other hand, the results obtained from methodologies used in different countries may be difficult to compare, first because the existing national guidelines are established as a consequence of what each country has learned from dealing with past rockfall events. In particular, diverse social and political considerations influence the definition of the threshold values of the parameters which determine a given degree of hazard, and eventually the type of land use accepted for each hazard level. Therefore, a change in the threshold values for rockfall intensity and frequency is already enough to produce completely different zoning results, even if the same methodology is applied. In relation to this issue, the paper introduces some of the current challenges and difficulties in comparing hazard mapping results in Europe and, subsequently, in developing a common standard procedure to assess rockfall hazard. The present work is part of an ongoing research project whose aim is to improve methodologies for rockfall hazard and risk mapping at the local scale, in the framework of the European project "Mountain Risks: from prediction to management and governance", funded by the European Commission. As a reference, two approaches will be considered, proposed in Switzerland and in the Principality of Andorra, respectively. At first, the guidelines applied in the two countries will be outlined, showing in which ways the corresponding procedures differ. For this purpose, in both cases, the main philosophy in facing rockfall hazard will be discussed, together with its consequences in terms of the resulting intensity-frequency threshold values proposed to determine different classes of hazard. Then, a simple case study carried out in Switzerland, in the Canton of Valais, will show an application of the discussed theoretical issues, by means of a comparison between the two approaches. A rockfall hazard mapping will be performed on a 2D slope profile, following both the Swiss energy-probability threshold values and the ones used in the Principality of Andorra. The analysis of the results will introduce some consequences that the criteria for defining classes of hazard may have on land-use planning, depending on which guidelines are applied in a study site. This aspect involves not only differences in zoning concerning the extension of the areas in danger, but also the influence on land use that the meaning of the same hazard level may have, according to which threshold values for rockfall intensity and frequency are used. These considerations underline the role that social and political decisions can play in the hazard assessment process, on the basis of the experiences and understandings of each country in this field. More precisely, it is rather evident that any comparison and/or harmonisation of hazard mapping results is closely linked to this aspect as well, and not only to more technical matters, such as computing and mapping techniques.
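
    The intensity-frequency zoning logic discussed above can be made concrete with a toy lookup; the thresholds below are illustrative placeholders only, since the abstract's central point is precisely that each country fixes its own values and that changing them changes the zoning.

    def hazard_class(energy_kj, return_period_yr):
        """Toy intensity-frequency matrix in the spirit of Swiss-style
        rockfall zoning. THE THRESHOLDS BELOW ARE ILLUSTRATIVE ONLY:
        national guidelines set their own values, and different values
        yield different zoning for the same slope."""
        intensity = 0 if energy_kj < 30 else (1 if energy_kj < 300 else 2)
        frequency = 2 if return_period_yr < 30 else \
                    (1 if return_period_yr < 100 else 0)
        score = intensity + frequency
        return ["low", "moderate", "moderate", "high", "high"][score]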

  14. A Hidden Portrait by Edgar Degas

    NASA Astrophysics Data System (ADS)

    Thurrowgood, David; Paterson, David; de Jonge, Martin D.; Kirkham, Robin; Thurrowgood, Saul; Howard, Daryl L.

    2016-08-01

    The preservation and understanding of cultural heritage depends increasingly on in-depth chemical studies. Rapid technological advances are forging connections between scientists and arts communities, enabling revolutionary new techniques for non-invasive technical study of culturally significant, highly prized artworks. We have applied a non-invasive, rapid, high definition X-ray fluorescence (XRF) elemental mapping technique to a French Impressionist painting using a synchrotron radiation source, and show how this technology can advance scholarly art interpretation and preservation. We have obtained detailed technical understanding of a painting which could not be resolved by conventional techniques. Here we show 31.6 megapixel scanning XRF derived elemental maps and report a novel image processing methodology utilising these maps to produce a false colour representation of a “hidden” portrait by Edgar Degas. This work provides a cohesive methodology for both imaging and understanding the chemical composition of artworks, and enables scholarly understandings of cultural heritage, many of which have eluded conventional technologies. We anticipate that the outcome from this work will encourage the reassessment of some of the world’s great art treasures.
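
    As an illustration of the final image-processing step (a sketch, not the authors' published algorithm), a false-colour representation can be composed by robustly scaling three elemental maps and assigning them to RGB channels; the element-to-channel assignment and percentile limits here are arbitrary choices.

    import numpy as np

    def false_colour(maps, p_lo=1, p_hi=99):
        """Stack three elemental maps (e.g. Fe, Hg, Cu count maps) into
        an RGB false-colour image after robust percentile scaling."""
        chans = []
        for m in maps:
            lo, hi = np.percentile(m, [p_lo, p_hi])
            chans.append(np.clip((m - lo) / (hi - lo + 1e-12), 0, 1))
        return np.dstack(chans)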

  15. Spectral mapping of brain functional connectivity from diffusion imaging.

    PubMed

    Becker, Cassiano O; Pequito, Sérgio; Pappas, George J; Miller, Michael B; Grafton, Scott T; Bassett, Danielle S; Preciado, Victor M

    2018-01-23

    Understanding the relationship between the dynamics of neural processes and the anatomical substrate of the brain is a central question in neuroscience. On the one hand, modern neuroimaging technologies, such as diffusion tensor imaging, can be used to construct structural graphs representing the architecture of white matter streamlines linking cortical and subcortical structures. On the other hand, temporal patterns of neural activity can be used to construct functional graphs representing temporal correlations between brain regions. Although some studies provide evidence that whole-brain functional connectivity is shaped by the underlying anatomy, the observed relationship between function and structure is weak, and the rules by which anatomy constrains brain dynamics remain elusive. In this article, we introduce a methodology to map the functional connectivity of a subject at rest from his or her structural graph. Using our methodology, we are able to systematically account for the role of structural walks in the formation of functional correlations. Furthermore, in our empirical evaluations, we observe that the eigenmodes of the mapped functional connectivity are associated with activity patterns of different cognitive systems.
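
    One simple way to make "accounting for structural walks" concrete (a sketch, not the authors' spectral mapping) is to approximate functional connectivity as a low-order polynomial in the structural adjacency matrix, F ≈ Σ_k a_k A^k, since the (i, j) entry of A^k counts weighted walks of length k between regions i and j:

    import numpy as np

    def fit_walk_mapping(A, F_emp, K=4):
        """Fit F ≈ sum_k a_k A^k for k = 0..K by least squares. Powers
        of the structural adjacency matrix A count weighted walks of
        length k, so the coefficients weigh how much walks of each
        length shape the empirical functional connectivity F_emp."""
        powers = [np.linalg.matrix_power(A, k) for k in range(K + 1)]
        X = np.stack([P.ravel() for P in powers], axis=1)
        coef, *_ = np.linalg.lstsq(X, F_emp.ravel(), rcond=None)
        F_pred = (X @ coef).reshape(A.shape)
        return coef, F_pred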

  16. A Hidden Portrait by Edgar Degas

    PubMed Central

    Thurrowgood, David; Paterson, David; de Jonge, Martin D.; Kirkham, Robin; Thurrowgood, Saul; Howard, Daryl L.

    2016-01-01

    The preservation and understanding of cultural heritage depends increasingly on in-depth chemical studies. Rapid technological advances are forging connections between scientists and arts communities, enabling revolutionary new techniques for non-invasive technical study of culturally significant, highly prized artworks. We have applied a non-invasive, rapid, high definition X-ray fluorescence (XRF) elemental mapping technique to a French Impressionist painting using a synchrotron radiation source, and show how this technology can advance scholarly art interpretation and preservation. We have obtained detailed technical understanding of a painting which could not be resolved by conventional techniques. Here we show 31.6 megapixel scanning XRF derived elemental maps and report a novel image processing methodology utilising these maps to produce a false colour representation of a “hidden” portrait by Edgar Degas. This work provides a cohesive methodology for both imaging and understanding the chemical composition of artworks, and enables scholarly understandings of cultural heritage, many of which have eluded conventional technologies. We anticipate that the outcome from this work will encourage the reassessment of some of the world’s great art treasures. PMID:27490856

  17. Use of Intervention Mapping to Enhance Health Care Professional Practice: A Systematic Review.

    PubMed

    Durks, Desire; Fernandez-Llimos, Fernando; Hossain, Lutfun N; Franco-Trigo, Lucia; Benrimoj, Shalom I; Sabater-Hernández, Daniel

    2017-08-01

    Intervention Mapping is a planning protocol for developing behavior change interventions, the first three steps of which are intended to establish the foundations and rationales of such interventions. This systematic review aimed to identify programs that used Intervention Mapping to plan changes in health care professional practice. Specifically, it provides an analysis of the information provided by the programs in the first three steps of the protocol to determine their foundations and rationales of change. A literature search was undertaken in PubMed, Scopus, SciELO, and DOAJ using "Intervention Mapping" as keyword. Key information was gathered, including theories used, determinants of practice, research methodologies, theory-based methods, and practical applications. Seventeen programs aimed at changing a range of health care practices were included. The social cognitive theory and the theory of planned behavior were the most frequently used frameworks in driving change within health care practices. Programs used a large variety of research methodologies to identify determinants of practice. Specific theory-based methods (e.g., modelling and active learning) and practical applications (e.g., health care professional training and facilitation) were reported to inform the development of practice change interventions and programs. In practice, Intervention Mapping delineates a three-step systematic, theory- and evidence-driven process for establishing the theoretical foundations and rationales underpinning change in health care professional practice. The use of Intervention Mapping can provide health care planners with useful guidelines for the theoretical development of practice change interventions and programs.

  18. Cartography, new technologies and geographic education: theoretical approaches to research the field

    NASA Astrophysics Data System (ADS)

    Seneme do Canto, Tânia

    2018-05-01

    In order to understand the roles that digital mapping can play in cartographic and geographic education, this paper discusses the theoretical and methodological approach used in research being undertaken in the education of geography teachers. To develop the study, we found in the works of Lankshear and Knobel (2013) a notion of new literacies that allows us to look at the practices within digital mapping from a sociocultural perspective. From them, we conclude that in order to understand the changes that digital cartography is able to foment in geography teaching, it is necessary to go beyond the substitution of means in the classroom and to explore what makes the new mapping practices different from others already consolidated in geography teaching. Therefore, we comment on some features of new forms of cartographic literacy that are in full development with digital technologies, but which are not determined solely by their use. The ideas of Kitchin and Dodge (2007) and Del Casino Junior and Hanna (2006) are also an important reference for the research. Methodologically, this approach helps us to understand that in seeking to comprehend maps and their meanings, irrespective of the medium used, we are dealing with a process of literacy that is very particular and emergent, because it involves not only the characteristics of the map artifact and of the individual that produces or consumes it, but depends mainly on a diversity of interconnections being built between them (map and individual) and the world.

  19. A revised ground-motion and intensity interpolation scheme for shakemap

    USGS Publications Warehouse

    Worden, C.B.; Wald, D.J.; Allen, T.I.; Lin, K.; Garcia, D.; Cua, G.

    2010-01-01

    We describe a weighted-average approach for incorporating various types of data (observed peak ground motions and intensities and estimates from ground-motion prediction equations) into the ShakeMap ground motion and intensity mapping framework. This approach represents a fundamental revision of our existing ShakeMap methodology. In addition, the increased availability of near-real-time macroseismic intensity data, the development of new relationships between intensity and peak ground motions, and new relationships to directly predict intensity from earthquake source information have facilitated the inclusion of intensity measurements directly into ShakeMap computations. Our approach allows for the combination of (1) direct observations (ground-motion measurements or reported intensities), (2) observations converted from intensity to ground motion (or vice versa), and (3) estimated ground motions and intensities from prediction equations or numerical models. Critically, each of the aforementioned data types must include an estimate of its uncertainties, including those caused by scaling the influence of observations to surrounding grid points and those associated with estimates given an unknown fault geometry. The ShakeMap ground-motion and intensity estimates are an uncertainty-weighted combination of these various data and estimates. A natural by-product of this interpolation process is an estimate of total uncertainty at each point on the map, which can be vital for comprehensive inventory loss calculations. We perform a number of tests to validate this new methodology and find that it produces a substantial improvement in the accuracy of ground-motion predictions over empirical prediction equations alone.
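
    At the core of such an uncertainty-weighted combination is the inverse-variance weighted mean, whose combined uncertainty falls out of the same sums. A minimal sketch for one grid point (the operational ShakeMap implementation adds distance-dependent scaling of observation influence, omitted here):

    import numpy as np

    def weighted_combine(values, sigmas):
        """Inverse-variance weighted mean and its standard deviation.
        values / sigmas: estimates of (log) ground motion at one grid
        point from observations, converted intensities and prediction
        equations, each with its own uncertainty."""
        w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
        mean = np.sum(w * np.asarray(values, dtype=float)) / np.sum(w)
        return mean, np.sqrt(1.0 / np.sum(w))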

  20. Development of archetypes for non-ranking classification and comparison of European National Health Technology Assessment systems.

    PubMed

    Allen, Nicola; Pichler, Franz; Wang, Tina; Patel, Sundip; Salek, Sam

    2013-12-01

    European countries are increasingly utilising health technology assessment (HTA) to inform reimbursement decision-making. However, the current European HTA environment is very diverse, and projects are already underway to initiate a more efficient and aligned HTA practice within Europe. This study aims to identify a non-ranking method for classifying the diversity of European HTA agencies' processes and the organisational architecture of national regulatory review to reimbursement systems. Using a previously developed mapping methodology, this research created process maps describing the national processes for regulatory review to reimbursement for 33 European jurisdictions. These process maps enabled the creation of two HTA taxonomic sets. The confluence of the two taxonomic sets was subsequently cross-referenced to identify 10 HTA archetype groups. HTA is a young, rapidly evolving field, and it can be argued that optimal practices for performing HTA are yet to emerge. Therefore, a non-ranking classification approach could objectively characterise and compare the diversity observed in the current European HTA environment. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  1. A methodology for the generation of the 2-D map from unknown navigation environment by traveling a short distance

    NASA Technical Reports Server (NTRS)

    Bourbakis, N.; Sarkar, D.

    1994-01-01

    A technique for the generation of a 2-D space map by traveling a short distance is described. The space to be mapped can be classified as: (1) space without obstacles, (2) space with stationary obstacles, and (3) space with moving obstacles. This paper presents the methodology used to generate a 2-D map of an unknown navigation space. The abilities to minimize redundancy while traveling and to maximize the confidence function for generation of the map are advantages of this technique.

  2. Rapid quantitative chemical mapping of surfaces with sub-2 nm resolution

    NASA Astrophysics Data System (ADS)

    Lai, Chia-Yun; Perri, Saverio; Santos, Sergio; Garcia, Ricardo; Chiesa, Matteo

    2016-05-01

    We present a theory that exploits four observables in bimodal atomic force microscopy to produce maps of the Hamaker constant H. The quantitative H maps may be employed by the broader community to directly interpret the high resolution of standard bimodal AFM images as chemical maps while simultaneously quantifying chemistry in the non-contact regime. We further provide a simple methodology to optimize a range of operational parameters for which H is in the closest agreement with the Lifshitz theory in order to (1) simplify data acquisition and (2) generalize the methodology to any set of cantilever-sample systems.

  3. Physiographic map of the Sicilian region (1:250,000 scale)

    NASA Astrophysics Data System (ADS)

    Priori, Simone; Fantappiè, Maria; Costantini, Edoardo A. C.

    2015-04-01

    Physiographic maps summarize and group the landforms of a territory into areas that are homogeneous in terms of the kind and intensity of the main geomorphological processes. Most physiographic maps are compiled at the national or continental scale. Others have been produced at semi-detailed scales, while examples at the regional scale are much less common. However, the Region being the main administrative level in Europe, such maps can be very useful for land planning in many fields, such as ecological studies, risk maps, and soil mapping. This work presents a methodological example of a regional physiographic map, compiled at 1:250,000 scale, representing the whole Sicilian region, the largest and most characteristic Mediterranean island. The physiographic units were classified by matching thematic layers (NDVI, geology, DEM, land cover) with the main geomorphological processes identified by stereo-interpretation of aerial photographs (1:70,000 scale). In addition, information from other published maps, representing geomorphological forms, aeolian deposits, anthropic terraced slopes, and landslides, was used to improve the accuracy and reliability of the map. The classification of the physiographic units, and thus the map legend, was built on the basis of the literature and taking into account the Italian geomorphological legend. The legend proposed in this map, which can also be applied in other Mediterranean countries, is suitable for different scales. The landform units were grouped on the basis of a geomorphological classification of the forms into: anthropogenic, aeolian, coastal, valley floor, intermountain fluvial, slope erosional, structural, karstic, and volcanic.

  4. The Research of Improving the Particleboard Glue Dosing Process Based on TRIZ Analysis

    NASA Astrophysics Data System (ADS)

    Yu, Huiling; Fan, Delin; Zhang, Yizhuo

    This research creates a design methodology by synthesizing the Theory of Inventive Problem Solving (TRIZ) and cascade control based on a Smith predictor. The particleboard glue supplying and dosing system case study defines the problem and the solution using the methodology proposed in the paper. Status differences in the glue dosing process of particleboard production usually cause inaccurate glue dosing volumes. In order to solve this problem, we applied TRIZ technical contradictions and inventive principles to improve this key process of particleboard production. The improvement method mapped the inaccuracy problem to a TRIZ technical contradiction, and the resulting solution proposed a Smith predictor as the control algorithm in the glue dosing system. This research examines the usefulness of a TRIZ-based problem-solving process designed to improve the problem-solving ability of users in addressing difficult or recurring problems, and also demonstrates the practicality and validity of TRIZ. Several suggestions are presented on how to approach this problem.
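
    A minimal discrete-time sketch of PI control with a Smith predictor for a first-order plant with a pure transport delay, of the kind that makes glue-flow feedback sluggish; all parameter values are illustrative, and a real system's dynamics would have to be identified first.

    import numpy as np

    # Plant: y[k+1] = a*y[k] + b*u[k-d]; the internal model is assumed
    # perfect. All gains and the delay d are illustrative values.
    a, b, d = 0.9, 0.1, 10
    kp, ki = 2.0, 0.15
    N = 200
    y = np.zeros(N); ym = np.zeros(N)   # plant output, delay-free model
    u = np.zeros(N); ibuf = 0.0          # control input, integrator state
    r = 1.0                              # glue flow set-point

    for k in range(N - 1):
        ym_delayed = ym[k - d] if k >= d else 0.0
        # Smith feedback: measured output plus the model's prediction
        # of what the transport delay is currently hiding.
        fb = y[k] + (ym[k] - ym_delayed)
        e = r - fb
        ibuf += e
        u[k] = kp * e + ki * ibuf
        u_delayed = u[k - d] if k >= d else 0.0
        y[k + 1] = a * y[k] + b * u_delayed   # real plant (with delay)
        ym[k + 1] = a * ym[k] + b * u[k]      # internal model, no delay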

  5. Modeling of electrohydrodynamic drying process using response surface methodology

    PubMed Central

    Dalvand, Mohammad Jafar; Mohtasebi, Seyed Saeid; Rafiee, Shahin

    2014-01-01

    The energy consumption index is one of the most important criteria for judging new and emerging drying technologies. One such novel and promising alternative drying process is called electrohydrodynamic (EHD) drying. In this work, solar energy was used to supply the energy required by the EHD drying process. Moreover, response surface methodology (RSM) was used to build a predictive model in order to investigate the combined effects of independent variables such as applied voltage, field strength, number of discharge electrodes (needles), and air velocity on moisture ratio, energy efficiency, and energy consumption as responses of the EHD drying process. A three-level, four-factor Box-Behnken design was employed to evaluate the effects of the independent variables on the system responses. A stepwise approach was followed to build a model that can map the entire response surface. The interior relationships between parameters were well defined by RSM. PMID:24936289
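
    The RSM step amounts to fitting a full second-order polynomial to the designed runs by least squares. A hedged sketch with synthetic data standing in for the actual Box-Behnken run table and measured responses:

    import numpy as np
    from itertools import combinations

    def quadratic_design_matrix(X):
        """Columns: 1, x_i, x_i^2, x_i*x_j -- the full second-order
        model used in response surface methodology."""
        n, k = X.shape
        cols = [np.ones(n)]
        cols += [X[:, i] for i in range(k)]
        cols += [X[:, i] ** 2 for i in range(k)]
        cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]
        return np.column_stack(cols)

    # X: coded levels (-1, 0, +1) of voltage, field strength, number
    # of electrodes and air velocity for each run; y: one measured
    # response (e.g. energy efficiency). Both are synthetic here.
    rng = np.random.default_rng(1)
    X = rng.choice([-1.0, 0.0, 1.0], size=(27, 4))
    y = 1.0 + 0.5 * X[:, 0] - 0.3 * X[:, 1] ** 2 + rng.normal(0, 0.05, 27)
    beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)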

  6. Image analysis method for the measurement of water saturation in a two-dimensional experimental flow tank

    NASA Astrophysics Data System (ADS)

    Belfort, Benjamin; Weill, Sylvain; Lehmann, François

    2017-04-01

    A novel, non-invasive imaging technique that determines 2D maps of water content in unsaturated porous media is presented. This method directly relates digitally measured intensities to the water content of the porous medium. The method requires the classical image analysis steps, i.e., normalization, filtering, background subtraction, scaling and calibration. The main advantages of this approach are that no calibration experiment is needed and that no tracer or dye is injected into the flow tank. The procedure enables effective processing of a large number of photographs and thus produces 2D water content maps at high temporal resolution. A drainage/imbibition experiment in a 2D flow tank with inner dimensions of 40 cm x 14 cm x 6 cm (L x W x D) was carried out to validate the methodology. The accuracy of the proposed approach is assessed using numerical simulations with a state-of-the-art computational code that solves the Richards equation. Comparison of the cumulative mass leaving and entering the flow tank and of the water content maps produced by the photographic measurement technique and the numerical simulations demonstrates the efficiency and high accuracy of the proposed method for investigating vadose zone flow processes. Application examples to a larger flow tank with various boundary conditions are finally presented to illustrate the potential of the methodology.
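
    One simple realization of the intensity-to-water-content mapping (an assumption-laden sketch, not necessarily the authors' exact relation): scale each pixel between dry and fully saturated reference frames taken from the experiment itself, so that no separate calibration experiment is required; the porosity value is hypothetical.

    import numpy as np

    def water_content_map(img, img_dry, img_sat, porosity=0.35):
        """Per-pixel effective saturation from image intensity, scaled
        between dry and fully saturated reference frames of the same
        tank, multiplied by porosity to give volumetric water content."""
        sat = (img - img_dry) / np.clip(img_sat - img_dry, 1e-6, None)
        return porosity * np.clip(sat, 0.0, 1.0)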

  7. Becoming-Learner: Coordinates for Mapping the Space and Subject of Nomadic Pedagogy

    ERIC Educational Resources Information Center

    Fendler, Rachel

    2013-01-01

    How can the process of "becoming learner" be observed, documented, and shared? What methodology could be used to discuss nomadic qualities of learning mobilities? This article argues in favor of an arts-based research approach, specifically social cartography, as a tool that can encourage young people to reflect on their identity as…

  8. A Systematic Software, Firmware, and Hardware Codesign Methodology for Digital Signal Processing

    DTIC Science & Technology

    2014-03-01

    Software and hardware partitioning is a very difficult challenge in the field of embedded system design.

  9. Development of a SPECT-Based Three-Dimensional Treatment Planner for Radionuclide Therapy with Iodine-131.

    NASA Astrophysics Data System (ADS)

    Giap, Huan Bosco

    Accurate calculation of absorbed dose to target tumors and normal tissues in the body is an important requirement for establishing fundamental dose-response relationships for radioimmunotherapy. Two major obstacles have been the difficulty in obtaining an accurate patient-specific 3-D activity map in-vivo and calculating the resulting absorbed dose. This study investigated a methodology for 3-D internal dosimetry, which integrates the 3-D biodistribution of the radionuclide acquired from SPECT with a dose-point kernel convolution technique to provide the 3-D distribution of absorbed dose. Accurate SPECT images were reconstructed with appropriate methods for noise filtering, attenuation correction, and Compton scatter correction. The SPECT images were converted into activity maps using a calibration phantom. The activity map was convolved with an ^{131}I dose-point kernel using a 3-D fast Fourier transform to yield a 3-D distribution of absorbed dose. The 3-D absorbed dose map was then processed to provide the absorbed dose distribution in regions of interest. This methodology can provide heterogeneous distributions of absorbed dose in volumes of any size and shape with nonuniform distributions of activity. Comparison of the activities quantitated by our SPECT methodology to true activities in an Alderson abdominal phantom (with spleen, liver, and spherical tumor) yielded errors of -16.3% to 4.4%. Volume quantitation errors ranged from -4.0 to 5.9% for volumes greater than 88 ml. The percentage differences of the average absorbed dose rates calculated by this methodology and the MIRD S-values were 9.1% for liver, 13.7% for spleen, and 0.9% for the tumor. Good agreement (percent differences were less than 8%) was found between the absorbed dose due to penetrating radiation calculated from this methodology and TLD measurement. More accurate estimates of the 3-D distribution of absorbed dose can be used as a guide in specifying the minimum activity to be administered to patients to deliver a prescribed absorbed dose to tumor without exceeding the toxicity limits of normal tissues.
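
    The dose-calculation step reduces to a 3-D convolution of the activity map with the 131-I dose-point kernel, done efficiently in the Fourier domain. A minimal sketch, with a synthetic inverse-square kernel standing in for a properly tabulated kernel:

    import numpy as np
    from scipy.signal import fftconvolve

    def absorbed_dose_map(activity, kernel):
        """Convolve a 3-D activity map (e.g. Bq per voxel) with a
        dose-point kernel (dose per decay at each voxel offset) via
        FFT; mode='same' keeps the output on the SPECT grid."""
        return fftconvolve(activity, kernel, mode="same")

    # Synthetic isotropic kernel falling off with distance, a
    # placeholder for a tabulated 131-I dose-point kernel:
    z, y, x = np.mgrid[-5:6, -5:6, -5:6].astype(float)
    r = np.sqrt(x**2 + y**2 + z**2) + 0.5
    kernel = 1e-12 / r**2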

  10. Nitrate contamination risk assessment in groundwater at regional scale

    NASA Astrophysics Data System (ADS)

    Daniela, Ducci

    2016-04-01

    Nitrate contamination of groundwater is widespread in the world, due to the intensive use of fertilizers, to leakage from sewage networks and to the presence of old septic systems. This research presents a methodology for groundwater contamination risk assessment using thematic maps derived mainly from the land-use map and from statistical data available at national institutes of statistics (especially demographic and environmental data). The potential nitrate contamination is considered as deriving from three sources: agricultural, urban and periurban. The first is related to the use of fertilizers; for this reason, the land-use map is reclassified on the basis of crop requirements in terms of fertilizers. The urban source is the possibility of leaks from the sewage network and, consequently, is linked to the anthropogenic pressure, expressed by the population density weighted on the basis of the mapped urbanized areas of the municipality. The periurban sources include the un-sewered areas, especially present in the periurban context, where illegal sewage connections coexist with on-site sewage disposal (cesspools, septic tanks and pit latrines). The potential nitrate contamination map is produced by overlaying the agricultural, urban and periurban maps. The map combination process is very simple, being an algebraic combination: the output values are the arithmetic average of the input values. The groundwater vulnerability to contamination can be assessed using parametric methods such as DRASTIC, or simpler ones such as AVI (which involves a limited number of parameters). In most cases, previous documents produced at the regional level can be used. The pollution risk map is obtained by combining the potential nitrate contamination map and the groundwater contamination vulnerability map. The criterion for linking the different GIS layers is again a simple algebraic combination. The methodology has been successfully applied in a large flat area of southern Italy with high NO3 concentrations.
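
    Because the stated combination rule is an arithmetic average of the input layers, the GIS overlay reduces to a one-liner; the layer names below are illustrative.

    import numpy as np

    def average_overlay(*layers):
        """Algebraic map combination as described in the abstract: the
        output value is the pixel-wise arithmetic average of the input
        values."""
        return np.mean(np.stack(layers), axis=0)

    # potential = average_overlay(agricultural, urban, periurban)
    # risk = average_overlay(potential, vulnerability)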

  11. Creating a literature database of low-calorie sweeteners and health studies: evidence mapping.

    PubMed

    Wang, Ding Ding; Shams-White, Marissa; Bright, Oliver John M; Parrott, J Scott; Chung, Mei

    2016-01-05

    Evidence mapping is an emerging tool used to systematically identify, organize and summarize the quantity and focus of scientific evidence on a broad topic, but there are currently no methodological standards. Using the topic of low-calorie sweeteners (LCS) and selected health outcomes, we describe the process of creating an evidence-map database and demonstrate several example descriptive analyses using this database. The process of creating an evidence-map database is described in detail. The steps include: developing a comprehensive literature search strategy, establishing study eligibility criteria and a systematic study selection process, extracting data, developing outcome groups with input from expert stakeholders and tabulating data using descriptive analyses. The database was uploaded onto SRDR™ (Systematic Review Data Repository), an open public data repository. Our final LCS evidence-map database included 225 studies, of which 208 were interventional studies and 17 were cohort studies. An example bubble plot was produced to display the evidence-map data and visualize research gaps according to four parameters: comparison types, population baseline health status, outcome groups, and study sample size. This plot indicated a lack of studies assessing appetite and dietary intake related outcomes using LCS with a sugar intake comparison in people with diabetes. Evidence mapping is an important tool for the contextualization of in-depth systematic reviews within broader literature and identifies gaps in the evidence base, which can be used to inform future research. An open evidence-map database has the potential to promote knowledge translation from nutrition science to policy.

  12. A Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) Determined from Phased Microphone Arrays

    NASA Technical Reports Server (NTRS)

    Brooks, Thomas F.; Humphreys, William M.

    2006-01-01

    Current processing of acoustic array data is burdened with considerable uncertainty. This study reports an original methodology that serves to demystify array results, reduce misinterpretation, and accurately quantify position and strength of acoustic sources. Traditional array results represent noise sources that are convolved with array beamform response functions, which depend on array geometry, size (with respect to source position and distributions), and frequency. The Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) method removes beamforming characteristics from output presentations. A unique linear system of equations accounts for reciprocal influence at different locations over the array survey region. It makes no assumption beyond the traditional processing assumption of statistically independent noise sources. The full rank equations are solved with a new robust iterative method. DAMAS is quantitatively validated using archival data from a variety of prior high-lift airframe component noise studies, including flap edge/cove, trailing edge, leading edge, slat, and calibration sources. Presentations are explicit and straightforward, as the noise radiated from a region of interest is determined by simply summing the mean-squared values over that region. DAMAS can fully replace existing array processing and presentations methodology in most applications. It appears to dramatically increase the value of arrays to the field of experimental acoustics.
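
    The DAMAS equations can be solved with a Gauss-Seidel-style iteration constrained to non-negative source strengths. A minimal sketch, assuming A (the beamformer's response at each grid point to a unit source at every other grid point) and b (the conventional beamform map) have already been assembled; the published method also alternates sweep directions between iterations, which is omitted here.

    import numpy as np

    def damas_solve(A, b, n_iter=100):
        """Iterative DAMAS-style solution of A x = b with x >= 0.
        A[i, j]: beamformer response at grid point i to a unit source
        at point j (diagonal ~1); b: conventional beamform output;
        x: deconvolved source strengths."""
        x = np.zeros_like(b, dtype=float)
        for _ in range(n_iter):
            for i in range(len(b)):
                # Residual excluding the diagonal term, then clamp.
                r = b[i] - A[i, :] @ x + A[i, i] * x[i]
                x[i] = max(0.0, r / A[i, i])
        return x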

  13. Spotlight SAR interferometry for terrain elevation mapping and interferometric change detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eichel, P.H.; Ghiglia, D.C.; Jakowatz, C.V. Jr.

    1996-02-01

    In this report, we employ an approach quite different from any previous work; we show that a new methodology leads to a simpler and clearer understanding of the fundamental principles of SAR interferometry. This methodology also allows implementation of an important collection mode that has not been demonstrated to date. Specifically, we introduce the following six new concepts for the processing of interferometric SAR (INSAR) data: (1) processing using spotlight mode SAR imaging (allowing ultra-high resolution), as opposed to conventional strip-mapping techniques; (2) derivation of the collection geometry constraints required to avoid decorrelation effects in two-pass INSAR; (3) derivation of maximum likelihood estimators for phase difference and the change parameter employed in interferometric change detection (ICD); (4) processing for the two-pass case wherein the platform ground tracks make a large crossing angle; (5) a robust least-squares method for two-dimensional phase unwrapping formulated as a solution to Poisson's equation, instead of using traditional path-following techniques; and (6) the existence of a simple linear scale factor that relates phase differences between two SAR images to terrain height. We show both theoretical analysis, as well as numerous examples that employ real SAR collections, to demonstrate the innovations listed above.
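
    Concept (5) above has a classic closed-form realization: unweighted least-squares phase unwrapping solves a discrete Poisson equation whose right-hand side is built from wrapped phase differences, and the Neumann boundary conditions make it solvable with a type-II DCT (following Ghiglia and Romero, 1994). A minimal sketch:

    import numpy as np
    from scipy.fft import dctn, idctn

    def wrap(p):
        return (p + np.pi) % (2 * np.pi) - np.pi

    def unwrap_ls(psi):
        """Unweighted least-squares phase unwrapping of a wrapped
        interferogram psi: build the wrapped-gradient Laplacian, then
        solve the Neumann Poisson problem in the DCT basis."""
        M, N = psi.shape
        dx = wrap(np.diff(psi, axis=1))
        dy = wrap(np.diff(psi, axis=0))
        rho = np.zeros_like(psi)
        rho[:, :-1] += dx; rho[:, 1:] -= dx
        rho[:-1, :] += dy; rho[1:, :] -= dy
        # rho is the discrete Laplacian of the unwrapped phase;
        # divide by the Laplacian's DCT-II eigenvalues to invert it.
        R = dctn(rho, norm="ortho")
        i = np.arange(M)[:, None]; j = np.arange(N)[None, :]
        denom = 2 * (np.cos(np.pi * i / M) + np.cos(np.pi * j / N) - 2)
        denom[0, 0] = 1.0        # the mean of the phase is arbitrary
        return idctn(R / denom, norm="ortho")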

  14. WOCAT mapping, GIS and the Góis municipality

    NASA Astrophysics Data System (ADS)

    Esteves, T. C. J.; Soares, J. A. A.; Ferreira, A. J. D.; Coelho, C. O. A.; Carreiras, M. A.; Lynden, G. V.

    2012-04-01

    Within the goals of the World Overview of Conservation Approaches and Technologies (WOCAT) association, the established methodology supports the sustainable development of new techniques and decision making in Sustainable Soil Management (SSM). Its main goal is to promote co-existence with nature in order to assure the wellbeing of future generations. SSM is defined as the use of terrestrial resources, including soil, water, fauna and flora, for the production of goods that fulfil human needs, while simultaneously guaranteeing the long-term productive potential of these resources and the maintenance of their environmental functions. The EU-funded DESIRE project (Desertification Mitigation and Remediation of Land: a global approach for local solutions) is centred on SSM; its main goal is the development and study of promising conservation, soil use and management strategies, thereby contributing to the protection of vulnerable arid and semi-arid areas. In Portugal, wildfires are one of the main agents of soil degradation and desertification, so there is an urgent need to establish integrated conservation measures to reduce or prevent these occurrences. To this end, the WOCAT methodology was implemented for the DESIRE project through its three major questionnaires: technologies (WOCAT Technologies), approaches (WOCAT Approaches) and mapping (WOCAT Mapping). The WOCAT Mapping questionnaire was designed to address questions of soil and water degradation, emphasizing the direct and socio-economic causes of this degradation. It evaluates what type of soil degradation is occurring, where and why, and what SSM actions are in practice. Linking this questionnaire to Geographical Information Systems (GIS) makes it possible not only to produce maps but also to calculate areas, taking into account several aspects of soil degradation and conservation. The map database and its outputs provide a comprehensive and powerful tool for obtaining a global vision of the degradation state of a given territory at the desired local or regional scale. For the selected study area, the Portuguese Góis Municipality, however, no base information was ready to be inserted into the geographical database, so the required mapping units had to be created before the WOCAT Mapping questionnaire could be used. As a result, municipal cartography with 39 mapping units was obtained, and exhaustive field work was carried out for each unit, allowing it to be characterized in detail and the information required by WOCAT Mapping to be compiled. These answers give a clearer picture of what is happening in the territory with respect to the techniques used, the degree of degradation and the conservation measures applied. Contact with the municipality's main stakeholders proved essential, since their extensive knowledge of the territory helped validate the results obtained with the WOCAT Mapping methodology.

  15. Optimal health and disease management using spatial uncertainty: a geographic characterization of emergent artemisinin-resistant Plasmodium falciparum distributions in Southeast Asia.

    PubMed

    Grist, Eric P M; Flegg, Jennifer A; Humphreys, Georgina; Mas, Ignacio Suay; Anderson, Tim J C; Ashley, Elizabeth A; Day, Nicholas P J; Dhorda, Mehul; Dondorp, Arjen M; Faiz, M Abul; Gething, Peter W; Hien, Tran T; Hlaing, Tin M; Imwong, Mallika; Kindermans, Jean-Marie; Maude, Richard J; Mayxay, Mayfong; McDew-White, Marina; Menard, Didier; Nair, Shalini; Nosten, Francois; Newton, Paul N; Price, Ric N; Pukrittayakamee, Sasithon; Takala-Harrison, Shannon; Smithuis, Frank; Nguyen, Nhien T; Tun, Kyaw M; White, Nicholas J; Witkowski, Benoit; Woodrow, Charles J; Fairhurst, Rick M; Sibley, Carol Hopkins; Guerin, Philippe J

    2016-10-24

    Artemisinin-resistant Plasmodium falciparum malaria parasites are now present across much of mainland Southeast Asia, where ongoing surveys are measuring and mapping their spatial distribution. These efforts require substantial resources. Here we propose a generic 'smart surveillance' methodology to identify optimal candidate sites for future sampling and thus map the distribution of artemisinin resistance most efficiently. The approach uses the 'uncertainty' map generated iteratively by a geostatistical model to determine optimal locations for subsequent sampling. The methodology is illustrated using recent data on the prevalence of the K13-propeller polymorphism (a genetic marker of artemisinin resistance) in the Greater Mekong Subregion. This methodology, which has broader application to geostatistical mapping in general, could improve the quality and efficiency of drug resistance mapping and thereby guide practical operations to eliminate malaria in affected areas.
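
    A minimal sketch of the 'smart surveillance' idea, using a Gaussian process as a stand-in for the paper's geostatistical model: fit the prevalence surface to surveyed sites, then propose the next site where predictive uncertainty is highest. Coordinates and prevalence values below are illustrative only.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Hypothetical surveyed sites (lon, lat) and K13 marker prevalence.
X_surveyed = np.array([[102.1, 13.5], [104.9, 11.6], [98.7, 16.8]])
prevalence = np.array([0.42, 0.15, 0.33])

gp = GaussianProcessRegressor(kernel=Matern(nu=1.5), normalize_y=True)
gp.fit(X_surveyed, prevalence)

# Evaluate the model over a candidate grid; the posterior standard
# deviation plays the role of the 'uncertainty map'.
lon, lat = np.meshgrid(np.linspace(97, 107, 50), np.linspace(10, 22, 50))
grid = np.column_stack([lon.ravel(), lat.ravel()])
_, std = gp.predict(grid, return_std=True)
next_site = grid[np.argmax(std)]           # optimal candidate for sampling
print(next_site)
```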

  16. Conversion of KEGG metabolic pathways to SBGN maps including automatic layout

    PubMed Central

    2013-01-01

    Background Biologists make frequent use of databases containing large and complex biological networks. One popular database is the Kyoto Encyclopedia of Genes and Genomes (KEGG) which uses its own graphical representation and manual layout for pathways. While some general drawing conventions exist for biological networks, arbitrary graphical representations are very common. Recently, a new standard has been established for displaying biological processes, the Systems Biology Graphical Notation (SBGN), which aims to unify the look of such maps. Ideally, online repositories such as KEGG would automatically provide networks in a variety of notations including SBGN. Unfortunately, this is non‐trivial, since converting between notations may add, remove or otherwise alter map elements so that the existing layout cannot be simply reused. Results Here we describe a methodology for automatic translation of KEGG metabolic pathways into the SBGN format. We infer important properties of the KEGG layout and treat these as layout constraints that are maintained during the conversion to SBGN maps. Conclusions This allows for the drawing and layout conventions of SBGN to be followed while creating maps that are still recognizably the original KEGG pathways. This article details the steps in this process and provides examples of the final result. PMID:23953132

  17. A methodology for small scale rural land use mapping in semi-arid developing countries using orbital imagery. Part 5: Experimental and operational techniques of mapping land use

    NASA Technical Reports Server (NTRS)

    Vangenderen, J. L. (Principal Investigator); Lock, B. F.

    1976-01-01

    The author has identified the following significant results. The scope of the preprocessing techniques was restricted to standard material from the EROS Data Center, accompanied by some enlarging procedures and the use of the diazo process. Investigation showed that the most appropriate sampling strategy for this study is the stratified random technique. A viable sampling procedure, together with a method for determining the minimum number of sample points needed to test the results of any interpretation, is presented.

  18. Mapping a research agenda for the science of team science

    PubMed Central

    Falk-Krzesinski, Holly J; Contractor, Noshir; Fiore, Stephen M; Hall, Kara L; Kane, Cathleen; Keyton, Joann; Klein, Julie Thompson; Spring, Bonnie; Stokols, Daniel; Trochim, William

    2012-01-01

    An increase in cross-disciplinary, collaborative team science initiatives over the last few decades has spurred interest by multiple stakeholder groups in empirical research on scientific teams, giving rise to an emergent field referred to as the science of team science (SciTS). This study employed a collaborative team science concept-mapping evaluation methodology to develop a comprehensive research agenda for the SciTS field. Its integrative mixed-methods approach combined group process with statistical analysis to derive a conceptual framework that identifies research areas of team science and their relative importance to the emerging SciTS field. The findings from this concept-mapping project constitute a lever for moving SciTS forward at theoretical, empirical, and translational levels. PMID:23223093

  19. Fracture mechanism maps in unirradiated and irradiated metals and alloys

    NASA Astrophysics Data System (ADS)

    Li, Meimei; Zinkle, S. J.

    2007-04-01

    This paper presents a methodology for computing a fracture mechanism map in the two-dimensional space of tensile stress and temperature using physically-based constitutive equations. Four principal fracture mechanisms were considered: cleavage fracture, low temperature ductile fracture, transgranular creep fracture, and intergranular creep fracture. The methodology was applied to calculate fracture mechanism maps for several selected reactor materials: CuCrZr, type 316 stainless steel, F82H ferritic-martensitic steel, V-4Cr-4Ti and Mo. The calculated fracture maps are in good agreement with empirical maps obtained from experimental observations. The fracture mechanism maps of unirradiated metals and alloys were modified to include radiation hardening effects on cleavage fracture and high temperature helium embrittlement. Future refinement of fracture mechanism maps is discussed.
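
    In outline, such a map is built by evaluating a failure-time model for each mechanism over a stress-temperature grid and letting the fastest mechanism control each cell. The sketch below uses placeholder failure-time expressions, not the physically-based constitutive equations of the paper.

```python
import numpy as np

# Placeholder failure-time models, one per mechanism (illustrative only).
mechanisms = {
    "cleavage":            lambda s, T: np.exp((700.0 - s) / 50.0),
    "ductile":             lambda s, T: np.exp((900.0 - s) / 80.0),
    "transgranular creep": lambda s, T: np.exp(20000.0 / T - s / 100.0),
    "intergranular creep": lambda s, T: np.exp(24000.0 / T - s / 60.0),
}

stress = np.linspace(50, 1000, 200)        # MPa
temp = np.linspace(300, 1500, 200)         # K
S, T = np.meshgrid(stress, temp)

# Shortest failure time controls each (stress, temperature) cell.
times = np.stack([f(S, T) for f in mechanisms.values()])
controlling = np.argmin(times, axis=0)     # index into mechanisms
```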

  20. Visual EKF-SLAM from Heterogeneous Landmarks †

    PubMed Central

    Esparza-Jiménez, Jorge Othón; Devy, Michel; Gordillo, José L.

    2016-01-01

    Many applications require the localization of a moving object, e.g., a robot, using sensory data acquired from embedded devices. Simultaneous localization and mapping from vision performs both the spatial and temporal fusion of these data on a map when a camera moves in an unknown environment. Such a SLAM process executes two interleaved functions: the front-end detects and tracks features from images, while the back-end interprets features as landmark observations and estimates both the landmarks and the robot positions with respect to a selected reference frame. This paper describes a complete visual SLAM solution, combining both point and line landmarks on a single map. The proposed method has an impact on both the back-end and the front-end. The contributions comprise the use of heterogeneous landmark-based EKF-SLAM (the management of a map composed of both point and line landmarks), including a comparison between landmark parametrizations and an evaluation of how the heterogeneity improves the accuracy of the camera localization; the development of a front-end active-search process for linear landmarks integrated into SLAM; and the experimentation methodology. PMID:27070602
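
    The back-end estimation in any EKF-SLAM variant rests on the standard EKF measurement update. Below is a generic sketch of that update, not the paper's heterogeneous point/line parametrization; the state, measurement model and noise values in the demo are illustrative.

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """One EKF measurement update: state x, covariance P, measurement z,
    h(x) predicted measurement, H measurement Jacobian, R noise covariance."""
    y = z - h(x)                           # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Toy usage: 2-D state observed directly through an identity model.
x, P = np.zeros(2), np.eye(2)
x, P = ekf_update(x, P, np.array([0.3, -0.1]),
                  lambda s: s, np.eye(2), 0.1 * np.eye(2))
print(x)
```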

  1. Synthesis: Intertwining product and process

    NASA Technical Reports Server (NTRS)

    Weiss, David M.

    1990-01-01

    Synthesis is a proposed systematic process for rapidly creating different members of a program family. Family members are described by variations in their requirements. Requirements variations are mapped to variations on a standard design to generate production quality code and documentation. The approach is made feasible by using principles underlying design for change. Synthesis incorporates ideas from rapid prototyping, application generators, and domain analysis. The goals of Synthesis and the Synthesis process are discussed. The technology needed and the feasibility of the approach are also briefly discussed. The status of current efforts to implement Synthesis methodologies is presented.

  2. Calculation of catalyst crust thickness from full elemental laser-induced breakdown spectroscopy images

    NASA Astrophysics Data System (ADS)

    Sorbier, L.; Trichard, F.; Moncayo, S.; Lienemann, C. P.; Motto-Ros, V.

    2018-01-01

    We propose a methodology to compute the crust thickness of an element in an egg-shell catalyst from a two-dimensional elemental map. The methodology handles two important catalyst shapes: infinite extrudates of arbitrary section and spheres. The methodology is validated with synthetic analytical profiles on simple shapes (cylinder and sphere). Its relative accuracy is shown to be within a few percent, decreasing inversely with the square root of the number of sampled pixels. The crust thicknesses obtained by this method from quantitative Pd maps acquired by laser-induced breakdown spectroscopy are comparable with values obtained from electron-probe microanalysis profiles. Some discrepancies are found and are explained by the heterogeneity of the crust thickness within a grain. As a full map is more representative than a single profile, fast mapping and the methodology presented in this paper are expected to become valuable tools for the development of new generations of egg-shell deposited catalysts.
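
    For a circular grain section, a back-of-envelope version of the idea follows from ring geometry alone: if the crust occupies an area fraction f of a disk of radius R, the crust thickness is t = R(1 - sqrt(1 - f)). The sketch below implements this simplified estimator, not the paper's method; the threshold and synthetic map are illustrative.

```python
import numpy as np

def crust_thickness_disk(elem_map, threshold, pixel_size):
    """Estimate egg-shell crust thickness from a 2D elemental map of a
    circular grain section: pixels above `threshold` count as crust,
    pixels above zero as grain."""
    grain = elem_map > 0
    crust = elem_map > threshold
    R = np.sqrt(grain.sum() / np.pi) * pixel_size   # equivalent radius
    f = crust.sum() / grain.sum()                   # crust area fraction
    return R * (1.0 - np.sqrt(1.0 - f))             # outer-ring geometry

# Synthetic disk of radius 60 px with a 10 px high-concentration rim.
yy, xx = np.mgrid[-64:64, -64:64]
r = np.hypot(xx, yy)
pd_map = np.where(r < 60, np.where(r > 50, 1.0, 0.1), 0.0)
print(crust_thickness_disk(pd_map, 0.5, pixel_size=1.0))  # ~10.0
```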

  3. TU-FG-201-12: Designing a Risk-Based Quality Assurance Program for a Newly Implemented Y-90 Microspheres Procedure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vile, D; Zhang, L; Cuttino, L

    2016-06-15

    Purpose: To create a quality assurance program based upon a risk-based assessment of a newly implemented SirSpheres Y-90 procedure. Methods: A process map was created for a newly implemented SirSpheres procedure at a community hospital. The process map documented each step of this collaborative procedure, as well as the roles and responsibilities of each member. From the process map, different potential failure modes were determined, as well as any current controls in place. From this list, a full failure mode and effects analysis (FMEA) was performed by grading each failure mode's likelihood of occurrence, likelihood of detection, and potential severity. These numbers were then multiplied to compute the risk priority number (RPN) for each potential failure mode. Failure modes were then ranked based on their RPN. Additional controls were then added, with failure modes corresponding to the highest RPNs taking priority. Results: A process map was created that succinctly outlined each step in the SirSpheres procedure in its current implementation. From this, 72 potential failure modes were identified and ranked according to their associated RPN. Quality assurance controls and safety barriers were then added, with the failure modes associated with the highest risk being addressed first. Conclusion: A quality assurance program was created from a risk-based assessment of the SirSpheres process. Process mapping and FMEA were effective in identifying potential high-risk failure modes for this new procedure, which were prioritized for new quality assurance controls. TG 100 recommends the fault tree analysis methodology to design a comprehensive and effective QC/QM program, yet we found that simply introducing additional safety barriers to address high-RPN failure modes made the whole process simpler and safer.
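
    The RPN arithmetic itself is simple: occurrence x detection x severity, then rank. A minimal sketch with invented failure modes, not those of the actual Y-90 FMEA:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    description: str
    occurrence: int   # 1-10 likelihood of occurrence
    detection: int    # 1-10 likelihood of escaping detection
    severity: int     # 1-10 potential severity

    @property
    def rpn(self) -> int:
        return self.occurrence * self.detection * self.severity

# Hypothetical failure modes for illustration only.
modes = [
    FailureMode("Wrong activity drawn into vial", 3, 6, 9),
    FailureMode("Catheter position not verified", 2, 5, 10),
    FailureMode("Dose calculation transcription error", 4, 4, 8),
]
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {m.rpn:4d}  {m.description}")
```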

  4. Relationships between palaeogeography and opal occurrence in Australia: A data-mining approach

    NASA Astrophysics Data System (ADS)

    Landgrebe, T. C. W.; Merdith, A.; Dutkiewicz, A.; Müller, R. D.

    2013-07-01

    Age-coded multi-layered geological datasets are becoming increasingly prevalent with the surge in open-access geodata, yet there are few methodologies for extracting geological information and knowledge from these data. We present a novel methodology, based on the open-source GPlates software in which age-coded digital palaeogeographic maps are used to “data-mine” spatio-temporal patterns related to the occurrence of Australian opal. Our aim is to test the concept that only a particular sequence of depositional/erosional environments may lead to conditions suitable for the formation of gem quality sedimentary opal. Time-varying geographic environment properties are extracted from a digital palaeogeographic dataset of the eastern Australian Great Artesian Basin (GAB) at 1036 opal localities. We obtain a total of 52 independent ordinal sequences sampling 19 time slices from the Early Cretaceous to the present-day. We find that 95% of the known opal deposits are tied to only 27 sequences all comprising fluvial and shallow marine depositional sequences followed by a prolonged phase of erosion. We then map the total area of the GAB that matches these 27 opal-specific sequences, resulting in an opal-prospective region of only about 10% of the total area of the basin. The key patterns underlying this association involve only a small number of key environmental transitions. We demonstrate that these key associations are generally absent at arbitrary locations in the basin. This new methodology allows for the simplification of a complex time-varying geological dataset into a single map view, enabling straightforward application for opal exploration and for future co-assessment with other datasets/geological criteria. This approach may help unravel the poorly understood opal formation process using an empirical spatio-temporal data-mining methodology and readily available datasets to aid hypothesis testing.
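
    The sequence screen reduces to membership testing of each grid cell's ordinal environment sequence against the opal-associated library. A toy sketch with invented environment codes and sequences (the paper's library holds 27 such sequences):

```python
# Hypothetical opal-associated sequences of palaeo-environments.
opal_sequences = {
    ("fluvial", "shallow_marine", "erosion"),
    ("shallow_marine", "fluvial", "erosion"),
}

def prospective(cell_sequence, library=opal_sequences):
    """Flag a grid cell whose environment sequence matches the library."""
    return tuple(cell_sequence) in library

print(prospective(["fluvial", "shallow_marine", "erosion"]))  # True
```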

  5. Generation of 2D Land Cover Maps for Urban Areas Using Decision Tree Classification

    NASA Astrophysics Data System (ADS)

    Höhle, J.

    2014-09-01

    A 2D land cover map can automatically and efficiently be generated from high-resolution multispectral aerial images. First, a digital surface model is produced and each cell of the elevation model is then supplemented with attributes. A decision tree classification is applied to extract map objects like buildings, roads, grassland, trees, hedges, and walls from such an "intelligent" point cloud. The decision tree is derived from training areas whose borders are digitized on top of a false-colour orthoimage. The produced 2D land cover map with six classes is then refined using image analysis techniques. The proposed methodology is described step by step. The classification, assessment, and refinement are carried out by the open source software "R"; the generation of the dense and accurate digital surface model by the "Match-T DSM" program of the Trimble Company. A practical example of a 2D land cover map generation is carried out. Images of a multispectral medium-format aerial camera covering an urban area in Switzerland are used. The assessment of the produced land cover map is based on class-wise stratified sampling, where reference values of samples are determined by means of stereo-observations of false-colour stereopairs. The stratified statistical assessment of the produced land cover map with six classes and based on 91 points per class reveals a high thematic accuracy for the classes "building" (99 %, 95 % CI: 95 %-100 %) and "road and parking lot" (90 %, 95 % CI: 83 %-95 %). Some other accuracy measures (overall accuracy, kappa value) and their 95 % confidence intervals are derived as well. The proposed methodology has a high potential for automation and fast processing and may be applied to other scenes and sensors.
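
    A minimal sketch of the per-cell decision tree classification, using scikit-learn in place of the paper's R workflow; the feature set (height above ground, NDVI, two reflectance bands) and the training values are illustrative only.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Each elevation-model cell becomes a feature vector; labels come from
# digitized training areas. All numbers are invented for illustration.
X_train = np.array([[12.0, 0.10, 90, 80],    # building
                    [0.1, 0.05, 60, 70],     # road
                    [0.2, 0.70, 120, 40],    # grassland
                    [8.0, 0.80, 130, 35]])   # tree
y_train = np.array(["building", "road", "grassland", "tree"])

clf = DecisionTreeClassifier(max_depth=5, random_state=0)
clf.fit(X_train, y_train)
print(clf.predict([[10.5, 0.15, 85, 75]]))   # classify a new cell
```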

  6. Studies in Ambulatory Care Quality Assessment in the Indian Health Service. Volume II: Appraisal of System Performance.

    ERIC Educational Resources Information Center

    Nutting, Paul A.; And Others

    Six Indian Health Service (IHS) units, chosen in a non-random manner, were evaluated via a quality assessment methodology currently under development by the IHS Office of Research and Development. A set of seven health problems (tracers) was selected to represent major health problems, and clinical algorithms (process maps) were constructed for…

  7. On the retrieval of crystallographic information from atom probe microscopy data via signal mapping from the detector coordinate space.

    PubMed

    Wallace, Nathan D; Ceguerra, Anna V; Breen, Andrew J; Ringer, Simon P

    2018-06-01

    Atom probe tomography is a powerful microscopy technique capable of reconstructing the 3D position and chemical identity of millions of atoms within engineering materials, at the atomic level. Crystallographic information contained within the data is particularly valuable for the purposes of reconstruction calibration and grain boundary analysis. Typically, analysing these data is a manual, time-consuming and error-prone process. In many cases, the crystallographic signal is so weak that it is difficult to detect at all. In this study, a new automated signal processing methodology is demonstrated. We use the affine properties of the detector coordinate space, or the 'detector stack', as the basis for our calculations. The methodological framework and the visualisation tools are shown to be superior to the standard method of crystallographic pole visualisation directly from field evaporation images, and there is no requirement for iterations between a full real-space initial tomographic reconstruction and the detector stack. The mapping approaches are demonstrated for aluminium, tungsten, magnesium and molybdenum. Implications for reconstruction calibration, accuracy of crystallographic measurements, reliability and repeatability are discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
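
    A minimal sketch of working in the detector coordinate space: accumulate hits into a 2D density map, in which crystallographic poles appear as density anomalies, and inspect its spectrum for periodic structure. The hit data are synthetic and this is not the authors' signal-processing pipeline.

```python
import numpy as np

# Synthetic detector (x, y) hit coordinates for illustration.
rng = np.random.default_rng(0)
hits = rng.uniform(-1, 1, size=(200_000, 2))

# Density map of the detector stack; poles show up as local anomalies.
density, xedges, yedges = np.histogram2d(hits[:, 0], hits[:, 1], bins=256)

# Magnitude spectrum: periodic crystallographic structure, if present,
# concentrates into discrete peaks.
spectrum = np.abs(np.fft.fftshift(np.fft.fft2(density)))
```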

  8. A methodology for small scale rural land use mapping in semi-arid developing countries using orbital imagery. 1: Introduction

    NASA Technical Reports Server (NTRS)

    Vangenderen, J. L. (Principal Investigator); Lock, B. F.

    1976-01-01

    The author has identified the following significant results. This research program has developed a viable methodology for producing small scale rural land use maps in semi-arid developing countries using imagery obtained from orbital multispectral scanners.

  9. Spatio-temporal analysis of prodelta dynamics by means of new satellite generation: the case of Po river by Landsat-8 data

    NASA Astrophysics Data System (ADS)

    Manzo, Ciro; Braga, Federica; Zaggia, Luca; Brando, Vittorio Ernesto; Giardino, Claudia; Bresciani, Mariano; Bassani, Cristiana

    2018-04-01

    This paper describes a procedure to perform spatio-temporal analysis of river plume dispersion in prodelta areas by multi-temporal Landsat-8-derived products for identifying zones sensitive to water discharge and for providing geostatistical patterns of turbidity linked to different meteo-marine forcings. In particular, we characterized the temporal and spatial variability of turbidity and sea surface temperature (SST) in the Po River prodelta (Northern Adriatic Sea, Italy) during the period 2013-2016. To perform this analysis, a two-pronged processing methodology was implemented and the resulting outputs were analysed through a series of statistical tools. A pixel-based spatial correlation analysis was carried out by comparing temporal curves of turbidity and SST hypercubes with in situ time series of wind speed and water discharge, providing correlation coefficient maps. A geostatistical analysis was performed to determine the spatial dependency of the turbidity datasets per each satellite image, providing maps of correlation and variograms. The results show a linear correlation between water discharge and turbidity variations in the points more affected by the buoyant plumes and along the southern coast of Po River delta. Better inverse correlation was found between turbidity and SST during floods rather than other periods. The correlation maps of wind speed with turbidity show different spatial patterns depending on local or basin-scale wind effects. Variogram maps identify different spatial anisotropy structures of turbidity in response to ambient conditions (i.e. strong Bora or Scirocco winds, floods). Since the implemented processing methodology is based on open source software and free satellite data, it represents a promising tool for the monitoring of maritime ecosystems and to address water quality analyses and the investigations of sediment dynamics in estuarine and coastal waters.
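
    A minimal sketch of the pixel-based correlation analysis: correlate each pixel's turbidity time series with the in situ discharge series to obtain a correlation coefficient map. Synthetic arrays stand in for the Landsat-8 products and gauge data.

```python
import numpy as np

def correlation_map(turbidity, discharge):
    """Pearson correlation of each pixel's turbidity history (time, y, x)
    with a single discharge time series (time,)."""
    t = turbidity - turbidity.mean(axis=0)
    d = discharge - discharge.mean()
    num = np.tensordot(d, t, axes=(0, 0))
    den = np.sqrt((t ** 2).sum(axis=0)) * np.sqrt((d ** 2).sum())
    return num / den

turb = np.random.rand(40, 100, 120)   # synthetic turbidity hypercube
q = np.random.rand(40)                # synthetic discharge series
rho = correlation_map(turb, q)        # (100, 120) correlation map
```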

  10. Opinion: Clarifying Two Controversies about Information Mapping's Method.

    ERIC Educational Resources Information Center

    Horn, Robert E.

    1992-01-01

    Describes Information Mapping, a methodology for the analysis, organization, sequencing, and presentation of information and explains three major parts of the method: (1) content analysis, (2) project life-cycle synthesis and integration of the content analysis, and (3) sequencing and formatting. Major criticisms of the methodology are addressed.…

  11. Visual analytics as a translational cognitive science.

    PubMed

    Fisher, Brian; Green, Tera Marie; Arias-Hernández, Richard

    2011-07-01

    Visual analytics is a new interdisciplinary field of study that calls for a more structured scientific approach to understanding the effects of interaction with complex graphical displays on human cognitive processes. Its primary goal is to support the design and evaluation of graphical information systems that better support cognitive processes in areas as diverse as scientific research and emergency management. The methodologies that make up this new field are as yet ill defined. This paper proposes a pathway for development of visual analytics as a translational cognitive science that bridges fundamental research in human/computer cognitive systems and design and evaluation of information systems in situ. Achieving this goal will require the development of enhanced field methods for conceptual decomposition of human/computer cognitive systems that maps onto laboratory studies, and improved methods for conducting laboratory investigations that might better map onto real-world cognitive processes in technology-rich environments. Copyright © 2011 Cognitive Science Society, Inc.

  12. Healthy and productive workers: using intervention mapping to design a workplace health promotion and wellness program to improve presenteeism.

    PubMed

    Ammendolia, Carlo; Côté, Pierre; Cancelliere, Carol; Cassidy, J David; Hartvigsen, Jan; Boyle, Eleanor; Soklaridis, Sophie; Stern, Paula; Amick, Benjamin

    2016-11-25

    Presenteeism is a growing problem in developed countries, mostly due to an aging workforce. The economic costs related to presenteeism exceed those of absenteeism and employer health costs. Employers are implementing workplace health promotion and wellness programs to improve health among workers and reduce presenteeism. How best to design, integrate and deliver these programs is unknown. The main purpose of this study was to use an intervention mapping approach to develop a workplace health promotion and wellness program aimed at reducing presenteeism. We partnered with a large international financial services company and used a qualitative synthesis based on an intervention mapping methodology. Evidence from systematic reviews and key articles on reducing presenteeism and implementing health promotion programs was combined with theoretical models for changing behavior and stakeholder experience. This was then systematically operationalized into a program using discussion groups and consensus among experts and stakeholders. The top health problem impacting our workplace partner was mental health. Depression and stress were the first and second highest causes of productivity loss, respectively. A multi-pronged program with detailed action steps was developed and directed at key stakeholders and health conditions. For mental health, regular sharing focus groups, social networking, monthly personal stories from leadership using webinars and multi-media communications, expert-led workshops, lunch and learn sessions and manager and employee training were part of a comprehensive program. Comprehensive, specific and multi-pronged strategies were developed and aimed at encouraging healthy behaviours that impact presenteeism such as regular exercise, proper nutrition, adequate sleep, smoking cessation, socialization and work-life balance. Limitations of the intervention mapping process included high resource and time requirements, the lack of external input and viewpoints skewed towards middle and upper management, and the use of secondary workplace data of unknown validity and reliability. In general, intervention mapping was a useful method to develop a workplace health promotion and wellness program aimed at reducing presenteeism. The methodology provided a step-by-step process to unravel a complex problem. The process compelled participants to think critically, collaboratively and in nontraditional ways.

  13. Metabolomics for undergraduates: Identification and pathway assignment of mitochondrial metabolites.

    PubMed

    Marques, Ana Patrícia; Serralheiro, Maria Luisa; Ferreira, António E N; Freire, Ana Ponces; Cordeiro, Carlos; Silva, Marta Sousa

    2016-01-01

    Metabolomics is a key discipline in systems biology, together with genomics, transcriptomics, and proteomics. In this omics cascade, the metabolome represents the biochemical products that arise from cellular processes and is often regarded as the final response of a biological system to environmental or genetic changes. The overall screening approach to identify all the metabolites in a given biological system is called metabolic fingerprinting. Using high-resolution and high-mass accuracy mass spectrometry, large metabolome coverage, sensitivity, and specificity can be attained. Although the theoretical concepts of this methodology are usually provided in life-science programs, hands-on laboratory experiments are not usually accessible to undergraduate students. Even if the instruments are available, there are not simple laboratory protocols created specifically for teaching metabolomics. We designed a straightforward hands-on laboratory experiment to introduce students to this methodology, relating it to biochemical knowledge through metabolic pathway mapping of the identified metabolites. This study focuses on mitochondrial metabolomics since mitochondria have a well-known, medium-sized cellular sub-metabolome. These features facilitate both data processing and pathway mapping. In this experiment, students isolate mitochondria from potatoes, extract the metabolites, and analyze them by high-resolution mass spectrometry (using an FT-ICR mass spectrometer). The resulting mass list is submitted to an online program for metabolite identification, and compounds associated with mitochondrial pathways can be highlighted in a metabolic network map. © 2015 The International Union of Biochemistry and Molecular Biology.

  14. Application of a GIS-/remote sensing-based approach for predicting groundwater potential zones using a multi-criteria data mining methodology.

    PubMed

    Mogaji, Kehinde Anthony; Lim, Hwee San

    2017-07-01

    This study integrates the application of Dempster-Shafer-driven evidential belief function (DS-EBF) methodology with remote sensing and geographic information system techniques to analyze surface and subsurface data sets for the spatial prediction of groundwater potential in Perak Province, Malaysia. The study used additional data obtained from the records of the groundwater yield rate of approximately 28 bore well locations. The processed surface and subsurface data produced sets of groundwater potential conditioning factors (GPCFs) from which multiple surface hydrologic and subsurface hydrogeologic parameter thematic maps were generated. The bore well location inventories were partitioned randomly into a ratio of 70% (19 wells) for model training to 30% (9 wells) for model testing. Application of the DS-EBF relationship model algorithms to the surface- and subsurface-based GPCF thematic maps and the bore well locations produced two groundwater potential prediction (GPP) maps based on surface hydrologic and subsurface hydrogeologic characteristics, which established that more than 60% of the study area falls within the moderate-high groundwater potential zones and less than 35% within the low potential zones. The estimated uncertainty values within the range of 0 to 17% for the predicted potential zones were quantified using the uncertainty algorithm of the model. The validation results of the GPP maps using the relative operating characteristic curve method yielded 80 and 68% success rates and 89 and 53% prediction rates for the subsurface hydrogeologic factor (SUHF)- and surface hydrologic factor (SHF)-based GPP maps, respectively. The study results revealed that the SUHF-based GPP map delineated groundwater potential zones more accurately than the SHF-based GPP map. However, significant information on the low degree of uncertainty of the predicted potential zones established the suitability of the two GPP maps for future development of groundwater resources in the area. The overall results proved the efficacy of the data mining model and the geospatial technology in groundwater potential mapping.
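
    At the heart of the DS-EBF approach is Dempster's rule of combination. Below is a minimal sketch for a two-hypothesis frame {groundwater, no groundwater}, with each conditioning factor contributing a (belief, disbelief, uncertainty) mass triple; the numbers are illustrative, not the study's evidential weights.

```python
def combine(m1, m2):
    """Dempster's rule on the frame {G, not-G}: each mass triple is
    (belief, disbelief, uncertainty) and sums to 1."""
    b1, d1, u1 = m1
    b2, d2, u2 = m2
    k = b1 * d2 + d1 * b2                 # conflict between the sources
    norm = 1.0 - k
    b = (b1 * b2 + b1 * u2 + u1 * b2) / norm
    d = (d1 * d2 + d1 * u2 + u1 * d2) / norm
    return b, d, 1.0 - b - d

# Combine two hypothetical conditioning factors.
print(combine((0.6, 0.1, 0.3), (0.5, 0.2, 0.3)))
```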

  15. RF-Based Location Using Interpolation Functions to Reduce Fingerprint Mapping

    PubMed Central

    Ezpeleta, Santiago; Claver, José M.; Pérez-Solano, Juan J.; Martí, José V.

    2015-01-01

    Indoor RF-based localization using fingerprint mapping requires an initial training step, which is a time-consuming process. This localization methodology needs a database of RSSI (Received Signal Strength Indicator) measurements from the communication transceivers, taken at specific locations within the localization area. However, the real-world localization environment is dynamic, and it is necessary to rebuild the fingerprint database whenever environmental changes occur. This paper explores the use of different interpolation functions to complete the fingerprint mapping needed to achieve the sought accuracy, thereby reducing the effort of the training step. Different distributions of test maps and reference points are also evaluated, showing the validity of this proposal and the necessary trade-offs. Results show that the same or similar localization accuracy can be achieved even when only 50% of the initial fingerprint reference points are taken. PMID:26516862
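
    A minimal sketch of fingerprint densification by interpolation: take sparse RSSI reference points and interpolate them onto a finer grid. The coordinates and dBm values are invented for illustration, and cubic interpolation stands in for the several interpolation functions the paper compares.

```python
import numpy as np
from scipy.interpolate import griddata

# Sparse fingerprint reference points (x, y) and RSSI for one access point.
pts = np.array([[0, 0], [0, 4], [4, 0], [4, 4], [2, 2]], dtype=float)
rssi = np.array([-40.0, -55.0, -52.0, -63.0, -48.0])      # dBm

# Interpolate onto a 41x41 grid to complete the fingerprint map.
gx, gy = np.mgrid[0:4:41j, 0:4:41j]
dense = griddata(pts, rssi, (gx, gy), method='cubic')
print(dense.shape)                                        # (41, 41)
```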

  16. Quality initiatives: improving patient flow for a bone densitometry practice: results from a Mayo Clinic radiology quality initiative.

    PubMed

    Aakre, Kenneth T; Valley, Timothy B; O'Connor, Michael K

    2010-03-01

    Lean Six Sigma process improvement methodologies have been used in manufacturing for some time. However, these methodologies are also applicable to radiology as a way to identify opportunities for improvement in patient care delivery settings. A multidisciplinary team of physicians and staff conducted a 100-day quality improvement project with the guidance of a quality advisor. By using the framework of DMAIC (define, measure, analyze, improve, and control), time studies were performed for all aspects of patient and technologist involvement. From these studies, value stream maps for the current state and for the future were developed, and tests of change were implemented. Comprehensive value stream maps showed that before implementation of process changes, an average time of 20.95 minutes was required for completion of a bone densitometry study. Two process changes (ie, tests of change) were undertaken. First, the location for completion of a patient assessment form was moved from inside the imaging room to the waiting area, enabling patients to complete the form while waiting for the technologist. Second, the patient was instructed to sit in a waiting area immediately outside the imaging rooms, rather than in the main reception area, which is far removed from the imaging area. Realignment of these process steps, with reduced technologist travel distances, resulted in a 3-minute average decrease in the patient cycle time. This represented a 15% reduction in the initial patient cycle time with no change in staff or costs. Radiology process improvement projects can yield positive results despite small incremental changes.

  17. Using Time-Driven Activity-Based Costing as a Key Component of the Value Platform: A Pilot Analysis of Colonoscopy, Aortic Valve Replacement and Carpal Tunnel Release Procedures.

    PubMed

    Martin, Jacob A; Mayhew, Christopher R; Morris, Amanda J; Bader, Angela M; Tsai, Mitchell H; Urman, Richard D

    2018-04-01

    Time-driven activity-based costing (TDABC) is a methodology that calculates the costs of healthcare resources consumed as a patient moves along a care process. Limited data exist on the application of TDABC from the perspective of an anesthesia provider. We describe the use of TDABC, a bottom-up costing strategy and financial outcomes for three different medical-surgical procedures. In each case, a multi-disciplinary team created process maps describing the care delivery cycle for a patient encounter using the TDABC methodology. Each step in a process map delineated an activity required for delivery of patient care. The resources (personnel, equipment and supplies) associated with each step were identified. A per minute cost for each resource expended was generated, known as the capacity cost rate, and multiplied by its time requirement. The total cost for an episode of care was obtained by adding the cost of each individual resource consumed as the patient moved along a clinical pathway. We built process maps for colonoscopy in the gastroenterology suite, calculated costs of an aortic valve replacement by comparing surgical aortic valve replacement (SAVR) versus transcatheter aortic valve replacement (TAVR) techniques, and determined the cost of carpal tunnel release in an operating room versus an ambulatory procedure room. TDABC is central to the value-based healthcare platform. Application of TDABC provides a framework to identify process improvements for health care delivery. The first case demonstrates cost-savings and improved wait times by shifting some of the colonoscopies scheduled with an anesthesiologist from the main hospital to the ambulatory facility. In the second case, we show that the deployment of an aortic valve via the transcatheter route front loads the costs compared to traditional, surgical replacement. The last case demonstrates significant cost savings to the healthcare system associated with re-organization of staff required to execute a carpal tunnel release.
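
    The TDABC arithmetic reduces to summing (capacity cost rate x minutes) over the steps of the process map. A small sketch with invented resources, rates and times, not the paper's figures:

```python
# Hypothetical process-map steps for a colonoscopy episode:
# (resource, capacity cost rate in $/min, minutes consumed).
steps = [
    ("pre-procedure nurse", 0.95, 20),
    ("anesthesiologist",    2.60, 30),
    ("endoscopist",         3.10, 25),
    ("procedure room",      1.40, 45),
    ("recovery nurse",      0.95, 35),
]

# Episode cost = sum of (rate x time) over every step of the care cycle.
episode_cost = sum(rate * minutes for _, rate, minutes in steps)
print(f"episode cost: ${episode_cost:.2f}")
```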

  19. Lean methodology for performance improvement in the trauma discharge process.

    PubMed

    O'Mara, Michael Shaymus; Ramaniuk, Aliaksandr; Graymire, Vickie; Rozzell, Monica; Martin, Stacey

    2014-07-01

    High-volume, complex services such as trauma and acute care surgery are at risk for inefficiency. Lean process improvement can reduce health care waste. Lean allows a structured look at processes not easily amenable to analysis. We applied lean methodology to the current state of communication and discharge planning on an urban trauma service, citing areas for improvement. A lean process mapping event was held. The process map was used to identify areas for immediate analysis and intervention-defining metrics for the stakeholders. After intervention, new performance was assessed by direct data evaluation. The process was completed with an analysis of effect and plans made for addressing future focus areas. The primary area of concern identified was interservice communication. Changes centering on a standardized morning report structure reduced the number of consult questions unanswered from 67% to 34% (p = 0.0021). Physical therapy rework was reduced from 35% to 19% (p = 0.016). Patients admitted to units not designated to the trauma service had 1.6 times longer stays (p < 0.0001). The lean process lasted 8 months, and three areas for new improvement were identified: (1) the off-unit patients; (2) patients with length of stay more than 15 days contribute disproportionately to length of stay; and (3) miscommunication exists around patient education at discharge. Lean process improvement is a viable means of health care analysis. When applied to a trauma service with 4,000 admissions annually, lean identifies areas ripe for improvement. Our inefficiencies surrounded communication and patient localization. Strategies arising from the input of all stakeholders led to real solutions for communication through a face-to-face morning report and identified areas for ongoing improvement. This focuses resource use and identifies areas for improvement of throughput in care delivery.

  20. Mapping land cover from satellite images: A basic, low cost approach

    NASA Technical Reports Server (NTRS)

    Elifrits, C. D.; Barney, T. W.; Barr, D. J.; Johannsen, C. J.

    1978-01-01

    Simple, inexpensive methodologies developed for mapping general land cover and land use categories from LANDSAT images are reported. One methodology, a stepwise, interpretive, direct tracing technique was developed through working with university students from different disciplines with no previous experience in satellite image interpretation. The technique results in maps that are very accurate in relation to actual land cover and relative to the small investment in skill, time, and money needed to produce the products.

  1. Land use mapping from CBERS-2 images with open source tools by applying different classification algorithms

    NASA Astrophysics Data System (ADS)

    Sanhouse-García, Antonio J.; Rangel-Peraza, Jesús Gabriel; Bustos-Terrones, Yaneth; García-Ferrer, Alfonso; Mesas-Carrascosa, Francisco J.

    2016-02-01

    Land cover classification is based on large differences between classes but great homogeneity within each of them. This cover is obtained through field work or by means of processing satellite images. Field work involves high costs; therefore, digital image processing techniques have become an important alternative for performing this task. However, in some developing countries, and particularly in the Casacoima municipality in Venezuela, geographic information systems are scarce due to the lack of updated information and the high cost of software licenses. This research proposes a low-cost methodology to develop thematic mapping of local land use and cover types in areas with scarce resources. Thematic mapping was developed from CBERS-2 images and spatial information available on the network using open source tools. Supervised per-pixel and per-region classification methods were applied using different classification algorithms and comparing them among themselves. Per-pixel classification was based on the Maxver (maximum likelihood) and Euclidean distance (minimum distance) algorithms, while per-region classification was based on the Bhattacharya algorithm. Satisfactory results were obtained with per-region classification, with an overall reliability of 83.93% and a kappa index of 0.81. The Maxver algorithm showed a reliability of 73.36% and a kappa index of 0.69, while Euclidean distance obtained 67.17% and 0.61, respectively. The proposed methodology proved very useful for cartographic processing and updating, which in turn supports the development of management and land-use plans. Hence, open source tools are an economically viable alternative not only for forestry organizations but also for the general public, allowing projects to be developed in economically depressed and/or environmentally threatened areas.

  2. The conceptual maps in the development of the course of biology of tenth degree: An investigation experience action in the classroom

    NASA Astrophysics Data System (ADS)

    Samo Goyco, Marisol

    This investigation combines qualitative and quantitative methods. The research explores, observes, records and describes classroom experience in order to examine the teaching of a biology course from a constructivist approach, identifying and correcting students' mistakes. Twenty-five tenth-grade students from a public school specializing in music participated. The research includes concept maps, computer integration, the science programme, the internet, and broadcast and assessment approaches. Concept mapping captures the knowledge and attention of both researcher and students by representing significant relations between concepts. Sustained by the action-research spiral of Carr and Kemmis (1988), the investigator designed every unit considering the students' prior ideas and elaborated the unit plan, maintaining the action-research methodology. This methodology responds to a new teaching paradigm that situates the professor's principal task as contributing to the students' active learning and reflecting on his or her own function and goals in this process. During the research the investigator analysed and recorded observations and materials, and reported the findings of every cycle, evaluating the concept maps, the varied integration of activities and the assessment skills used in socialized discussion. In socialized discussion the participants communicate the concepts that should be attended to; the students expressed among their peers and in front of the investigator how they felt about the resources and the development of the maps. With this information the next cycle was designed in response to the outstanding needs. This reflection generated greater student interest in concept learning, and the students also demonstrated active participation in the learning process. The findings show that concept maps and the integration of resources support concept development, and that open communication with educators benefits both parties. The study suggests that professors continually evaluate their practice, reducing educational stress between students and educators.

  3. Lunar Flashlight and Other Lunar Cubesats

    NASA Technical Reports Server (NTRS)

    Cohen, Barbara

    2017-01-01

    Water is a human-exploitable resource. Lunar Flashlight is a Cubesat mission to detect and map lunar surface ice in permanently-shadowed regions of the lunar south pole. EM-1 will carry 13 Cubesat-class missions to further smallsat science and exploration capabilities; much room to infuse LEO cubesat methodology, models, and technology. Exploring the value of concurrent measurements to measure dynamical processes of water sources and sinks.

  4. Navigating the grounded theory terrain. Part 1.

    PubMed

    Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John

    2011-01-01

    The decision to use grounded theory is not an easy one and this article aims to illustrate and explore the methodological complexity and decision-making process. It explores the decision making of one researcher in the first two years of a grounded theory PhD study looking at the psychosocial training needs of nurses and healthcare assistants working with people with dementia in residential care. It aims to map out three different approaches to grounded theory: classic, Straussian and constructivist. In nursing research, grounded theory is often referred to but it is not always well understood. This confusion is due in part to the history of grounded theory methodology, which is one of development and divergent approaches. Common elements across grounded theory approaches are briefly outlined, along with the key differences of the divergent approaches. Methodological literature pertaining to the three chosen grounded theory approaches is considered and presented to illustrate the options and support the choice made. The process of deciding on classical grounded theory as the version best suited to this research is presented. The methodological and personal factors that directed the decision are outlined. The relative strengths of Straussian and constructivist grounded theories are reviewed. All three grounded theory approaches considered offer the researcher a structured, rigorous methodology, but researchers need to understand their choices and make those choices based on a range of methodological and personal factors. In the second article, the final methodological decision will be outlined and its research application described.

  5. A non-linear dimension reduction methodology for generating data-driven stochastic input models

    NASA Astrophysics Data System (ADS)

    Ganapathysubramanian, Baskar; Zabaras, Nicholas

    2008-06-01

    Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space R^n. An isometric mapping F from M to a low-dimensional, compact, connected set A ⊂ R^d (d ≪ n) is constructed. Given only a finite set of samples of the data, the methodology uses arguments from graph theory and differential geometry to construct the isometric transformation F : M → A. Asymptotic convergence of the representation of M by A is shown. This mapping F serves as an accurate, low-dimensional, data-driven representation of the property variations. The reduced-order model of the material topology and thermal diffusivity variations is subsequently used as an input in the solution of stochastic partial differential equations that describe the evolution of dependent variables. A sparse grid collocation strategy (Smolyak algorithm) is utilized to solve these stochastic equations efficiently. We showcase the methodology by constructing low-dimensional input stochastic models to represent thermal diffusivity in two-phase microstructures. This model is used in analyzing the effect of topological variations of two-phase microstructures on the evolution of temperature in heat conduction processes.
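
    A minimal sketch of the overall pattern, using scikit-learn's Isomap as a stand-in for the paper's graph-based isometric construction of F; the microstructure samples are synthetic and the neighbour count is illustrative.

```python
import numpy as np
from sklearn.manifold import Isomap

# Each row is a flattened microstructure realization living in the
# high-dimensional input space R^n (here synthetic random samples).
samples = np.random.rand(200, 1024)

# Map the sample set to a low-dimensional region A in R^d (d << n).
embedding = Isomap(n_neighbors=10, n_components=3).fit_transform(samples)
print(embedding.shape)   # (200, 3): the reduced-order representation
```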

  6. A methodology for producing small scale rural land use maps in semi-arid developing countries using orbital imagery

    NASA Technical Reports Server (NTRS)

    Vangenderen, J. L. (Principal Investigator); Lock, B. F.

    1976-01-01

    The author has identified the following significant results. Results have shown that it is feasible to design a methodology that can provide suitable guidelines for operational production of small scale rural land use maps of semiarid developing regions from LANDSAT MSS imagery, using inexpensive and unsophisticated visual techniques. The suggested methodology provides immediate practical benefits to map makers attempting to produce land use maps in countries with limited budgets and equipment. Many preprocessing and interpretation techniques were considered, but rejected on the grounds that they were inappropriate mainly due to the high cost of imagery and/or equipment, or due to their inadequacy for use in operational projects in the developing countries. Suggested imagery and interpretation techniques, consisting of color composites and monocular magnification proved to be the simplest, fastest, and most versatile methods.

  7. Strategic environmental noise mapping: methodological issues concerning the implementation of the EU Environmental Noise Directive and their policy implications.

    PubMed

    Murphy, E; King, E A

    2010-04-01

    This paper explores methodological issues and policy implications concerning the implementation of the EU Environmental Noise Directive (END) across Member States. Methodologically, the paper focuses on two key thematic issues relevant to the Directive: (1) calculation methods and (2) mapping methods. For (1), the paper focuses, in particular, on how differing calculation methods influence noise prediction results as well as the value of the EU noise indicator L(den) and its associated implications for comparability of noise data across EU states. With regard to (2), emphasis is placed on identifying the issues affecting strategic noise mapping, estimating population exposure, noise action planning and dissemination of noise mapping results to the general public. The implication of these issues for future environmental noise policy is also examined. Copyright (c) 2009 Elsevier Ltd. All rights reserved.
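
    For reference, the END's L_den indicator combines day, evening and night levels with +5 dB and +10 dB penalties over the default 12/4/8-hour periods. A small sketch of the computation (the input levels are illustrative):

```python
import math

def l_den(l_day, l_evening, l_night):
    """EU Environmental Noise Directive day-evening-night level:
    evening weighted +5 dB and night +10 dB over 12/4/8-hour periods."""
    return 10 * math.log10(
        (12 * 10 ** (l_day / 10)
         + 4 * 10 ** ((l_evening + 5) / 10)
         + 8 * 10 ** ((l_night + 10) / 10)) / 24)

# With the penalties, 65/60/55 dB periods give an L_den of 65.0 dB.
print(round(l_den(65.0, 60.0, 55.0), 1))
```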

  8. Rational Variety Mapping for Contrast-Enhanced Nonlinear Unsupervised Segmentation of Multispectral Images of Unstained Specimen

    PubMed Central

    Kopriva, Ivica; Hadžija, Mirko; Popović Hadžija, Marijana; Korolija, Marina; Cichocki, Andrzej

    2011-01-01

    A methodology is proposed for nonlinear contrast-enhanced unsupervised segmentation of multispectral (color) microscopy images of principally unstained specimens. The methodology exploits spectral diversity and spatial sparseness to find anatomical differences between materials (cells, nuclei, and background) present in the image. It consists of rth-order rational variety mapping (RVM) followed by matrix/tensor factorization. Sparseness constraint implies duality between nonlinear unsupervised segmentation and multiclass pattern assignment problems. Classes not linearly separable in the original input space become separable with high probability in the higher-dimensional mapped space. Hence, RVM mapping has two advantages: it takes implicitly into account nonlinearities present in the image (ie, they are not required to be known) and it increases spectral diversity (ie, contrast) between materials, due to increased dimensionality of the mapped space. This is expected to improve performance of systems for automated classification and analysis of microscopic histopathological images. The methodology was validated using RVM of the second and third orders of the experimental multispectral microscopy images of unstained sciatic nerve fibers (nervus ischiadicus) and of unstained white pulp in the spleen tissue, compared with a manually defined ground truth labeled by two trained pathophysiologists. The methodology can also be useful for additional contrast enhancement of images of stained specimens. PMID:21708116
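
    A minimal sketch of the pipeline's shape: lift pixel spectra with a polynomial (monomial) map to increase separability, then assign materials with a non-negative factorization. This uses scikit-learn stand-ins, not the paper's rth-order RVM or its tensor factorization options; the pixel data are synthetic.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.decomposition import NMF

# Synthetic RGB pixel spectra as rows (values in [0, 1]).
pixels = np.random.rand(5000, 3)

# Second-order monomial lift: classes not linearly separable in the
# original space become separable in the higher-dimensional mapped space.
lifted = PolynomialFeatures(degree=2).fit_transform(pixels)

# Non-negative factorization as a sparseness-promoting stand-in for the
# paper's matrix/tensor factorization; three materials assumed.
W = NMF(n_components=3, max_iter=500).fit_transform(lifted)
labels = W.argmax(axis=1)      # per-pixel material assignment
```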

  9. See what we say: using concept mapping to visualize Latino immigrant's strategies for health interventions.

    PubMed

    Vaughn, Lisa M; Jacquez, Farrah; Marschner, Daniel; McLinden, Daniel

    2016-09-01

    Researchers need specific tools to engage community members in health intervention development to ensure that efforts are contextually appropriate for immigrant populations. The purpose of the study was to generate and prioritize strategies to address obesity, stress and coping, and healthcare navigation that are contextually appropriate and applicable to the Latino immigrant community in Cincinnati, Ohio, and then use the results to develop specific interventions to improve Latino health in our area. A community-academic research team used concept mapping methodology with over 200 Latino immigrants and Latino-serving providers. A community intervention planning session was held to share the final concept maps and vote on strategies. The concept maps and results from the intervention planning session emphasized a community lay health worker model to connect the Latino immigrant community with resources to address obesity, stress and coping, and healthcare navigation. Concept maps allowed for the visualization of health intervention strategies prioritized by the larger Latino immigrant community. Concept maps revealed the appropriate content for health interventions as well as the process community members preferred for intervention delivery.

  10. Mapping coastal morphodynamics with geospatial techniques, Cape Henry, Virginia, USA

    NASA Astrophysics Data System (ADS)

    Allen, Thomas R.; Oertel, George F.; Gares, Paul A.

    2012-01-01

    The advent and proliferation of digital terrain technologies have spawned concomitant advances in coastal geomorphology. Airborne topographic Light Detection and Ranging (LiDAR) has stimulated a renaissance in coastal mapping, and field-based mapping techniques have benefitted from improvements in real-time kinematic (RTK) Global Positioning System (GPS). Varied methodologies for mapping suggest a need to match geospatial products to geomorphic forms and processes, a task that should consider product and process ontologies from each perspective. Towards such synthesis, coastal morphodynamics on a cuspate foreland are reconstructed using spatial analysis. Sequential beach ridge and swale topography is mapped using photogrammetric spot heights and airborne LiDAR data and integrated with digital bathymetry and large-scale vector shoreline data. Isobaths from bathymetric charts were digitized to determine the slope and toe depth of the modern shoreface and a reconstructed three-dimensional antecedent shoreface. Triangulated irregular networks were created for the subaerial cape and for subaqueous shoreface models of the cape's beach-ridge sets, for volumetric analyses. Results provide estimates of relative age and progradation rate and corroborate other paleogeologic sea-level rise data from the region. Swale elevations and other measurements quantifiable in these data provide several parameters suitable for studying coastal geomorphic evolution. Mapped paleoshorelines and volumes suggest the Virginia Beach coastal compartment is related to embryonic spit development from a late Holocene shoreline located some 5 km east of the current beach.
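
    As a simple raster illustration of the volumetric step (the study itself uses triangulated irregular networks), the sketch below differences two hypothetical co-registered elevation surfaces to estimate a sediment volume.

        import numpy as np

        rng = np.random.default_rng(0)
        cell_area = 25.0                                    # m^2 per 5 m x 5 m cell

        dem_modern = rng.uniform(0.0, 6.0, (200, 200))      # present surface (m)
        dem_antecedent = dem_modern - rng.uniform(0.0, 2.0, (200, 200))

        thickness = dem_modern - dem_antecedent             # deposit thickness (m)
        volume = thickness.sum() * cell_area                # sediment volume (m^3)
        print(f"{volume:.0f} m^3")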

  11. Methodology for Designing Fault-Protection Software

    NASA Technical Reports Server (NTRS)

    Barltrop, Kevin; Levison, Jeffrey; Kan, Edwin

    2006-01-01

    A document describes a methodology for designing fault-protection (FP) software for autonomous spacecraft. The methodology embodies and extends established engineering practices in the technical discipline of Fault Detection, Diagnosis, Mitigation, and Recovery, and has been successfully implemented on the Deep Impact spacecraft, a NASA Discovery mission. Based on established concepts of Fault Monitors and Responses, this FP methodology extends the notions of Opinion, Symptom, Alarm (aka Fault), and Response with numerous new notions, sub-notions, software constructs, and logic and timing gates. For example, a Monitor generates a RawOpinion, which graduates into an Opinion, categorized as no-opinion, acceptable, or unacceptable. RaiseSymptom, ForceSymptom, and ClearSymptom govern the establishment of a Symptom and its mapping to an Alarm (aka Fault). Local Response is distinguished from FP System Response. A 1-to-n and n-to-1 mapping is established among Monitors, Symptoms, and Responses. Responses are categorized by device versus by function. Responses operate in tiers, where the early tiers attempt to resolve the Fault in a localized, step-by-step fashion, relegating more system-level response to later tiers. Recovery actions are gated by epoch recovery timing, enabling strategy, urgency, a MaxRetry gate, hardware availability, hazardous versus ordinary fault, and many other priority gates. The methodology is systematic and logical, and uses multiple linked tables, parameter files, and recovery command sequences. The credibility of the FP design is proven via a "top-down" fault-tree analysis and a "bottom-up" functional failure-modes-and-effects analysis. Via this process, the mitigation and recovery strategies per Fault Containment Region scope (width versus depth) the FP architecture.
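
    The Python sketch below conveys the flavor of the Monitor-Symptom-Response chain described above; it is a loose, hypothetical reconstruction, not the documented flight implementation, and all names and thresholds are invented.

        from enum import Enum, auto

        class Opinion(Enum):
            NO_OPINION = auto()
            ACCEPTABLE = auto()
            UNACCEPTABLE = auto()

        def monitor(raw_value: float, limit: float, data_valid: bool) -> Opinion:
            """A Monitor grades a raw reading into an Opinion."""
            if not data_valid:
                return Opinion.NO_OPINION
            return Opinion.UNACCEPTABLE if raw_value > limit else Opinion.ACCEPTABLE

        class Symptom:
            """Persistence-gated symptom: raised after `persistence` consecutive
            unacceptable opinions, cleared again on an acceptable one."""
            def __init__(self, persistence: int = 3) -> None:
                self.persistence = persistence
                self.count = 0
                self.raised = False

            def update(self, opinion: Opinion) -> None:
                if opinion is Opinion.UNACCEPTABLE:
                    self.count += 1
                    self.raised = self.raised or self.count >= self.persistence
                elif opinion is Opinion.ACCEPTABLE:
                    self.count = 0
                    self.raised = False

        def respond(alarm: bool, tier: int) -> str:
            """Tiered responses: localized fixes first, system-level safing last."""
            if not alarm:
                return "no-op"
            return ("retry-device", "swap-to-redundant", "system-safe-mode")[min(tier, 2)]

        # One evaluation cycle for a hypothetical temperature monitor
        sym = Symptom()
        for reading in (71.0, 73.5, 74.2, 75.9):
            sym.update(monitor(reading, limit=72.0, data_valid=True))
        print(sym.raised, respond(sym.raised, tier=0))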

  12. Hungarian contribution to the Global Soil Organic Carbon Map (GSOC17) using advanced machine learning algorithms and geostatistics

    NASA Astrophysics Data System (ADS)

    Szatmári, Gábor; Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2017-04-01

    The knowledge about soil organic carbon (SOC) baselines and changes, and the detection of vulnerable hot spots for SOC losses and gains under climate change and changed land management, is still fairly limited. Thus the Global Soil Partnership (GSP) has been requested to develop a global SOC mapping campaign by 2017. The GSP's concept builds on official national data sets; therefore, a bottom-up (country-driven) approach is pursued. The elaborated Hungarian methodology suits the general specifications of GSOC17 provided by the GSP. The input data for the GSOC17@HU mapping approach involved legacy soil databases, as well as environmental covariates related to the main soil forming factors, such as climate, organisms, relief and parent material. Nowadays, digital soil mapping (DSM) relies heavily on the assumption that the soil property of interest can be modelled as the sum of a deterministic and a stochastic component, which can be treated and modelled separately. We also adopted this assumption in our methodology. In practice, multiple regression techniques are commonly used to model the deterministic part. However, these global (and usually linear) models commonly oversimplify the often complex and non-linear relationships, which has a crucial effect on the resulting soil maps. Thus, we integrated machine learning algorithms (namely random forest and quantile regression forest) in the elaborated methodology, supposing them to be more suitable for the problem at hand. This approach enabled us to model the GSOC17 soil properties in forms as complex and non-linear as the soil itself. Furthermore, it enabled us to model and assess the uncertainty of the results, which is highly relevant in decision making. The applied methodology used a geostatistical approach to model the stochastic part of the spatial variability of the soil properties of interest. We created the GSOC17@HU map at 1 km grid resolution according to the GSP's specifications. The map contributes to the GSP's GSOC17 proposals, as well as to the development of a global soil information system under GSP Pillar 4 on soil data and information. Furthermore, we elaborated the accompanying code (created in the R software environment) in such a way that it can be improved, specified and applied for further uses. Hence, it opens the door to creating countrywide maps with higher grid resolution for SOC (or other soil-related properties) using the advanced methodology, as well as to supporting SOC-related (or other soil-related) country-level decision making. Our paper presents the soil mapping methodology itself, the resulting GSOC17@HU map, and some of the conclusions drawn from the experience and their implications for further use. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
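
    A minimal sketch of the deterministic-plus-stochastic decomposition described above, using a random forest for the trend and a simple interpolation of the residuals as a stand-in for the geostatistical model (the study works in R and additionally uses quantile regression forests for uncertainty); all data here are synthetic.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from scipy.interpolate import griddata

        rng = np.random.default_rng(0)

        # Hypothetical training data: n profiles with coordinates, environmental
        # covariates (climate, relief, ...) and measured SOC
        n = 500
        xy = rng.uniform(0.0, 100.0, (n, 2))         # site coordinates (km)
        covariates = rng.normal(size=(n, 5))
        soc = covariates @ np.array([1.2, -0.5, 0.8, 0.0, 0.3]) + rng.normal(0, 0.5, n)

        # Deterministic component: random forest on the covariates
        rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(covariates, soc)
        residuals = soc - rf.predict(covariates)

        # Stochastic component: spatial interpolation of the residuals
        # (a fitted variogram / kriging model would be used in practice)
        grid_x, grid_y = np.mgrid[0:100:100j, 0:100:100j]
        res_surface = griddata(xy, residuals, (grid_x, grid_y), method="linear")

        # Final map = trend predicted on grid covariates + residual surface
        # (the grid covariates themselves are omitted from this sketch)
        print(res_surface.shape)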

  13. Structural mapping from MSS-LANDSAT imagery: A proposed methodology for international geological correlation studies

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Crepani, E.; Martini, P. R.

    1980-01-01

    A methodology is proposed for international geological correlation studies based on LANDSAT MSS imagery, Bullard's model of continental fit, and compatible structural trends between Northeast Brazil and its West African counterpart. Six extensive lineaments in the Brazilian study area are mapped and discussed according to their regional behavior and in relation to the adjacent continental margin. Among the first conclusions, correlations were found between the Sobral-Pedro II Lineament and the megafaults that surround the West African craton, and between the Pernambuco Lineament and the Ngaoundéré Lineament in Cameroon. Ongoing research to complete the methodological stages includes the mapping of the West African structural framework, reconstruction of the pre-drift puzzle, and an analysis of the counterpart correlations.

  14. Application of Mls Data to the Assessment of Safety-Related Features in the Surrounding Area of Automatically Detected Pedestrian Crossings

    NASA Astrophysics Data System (ADS)

    Soilán, M.; Riveiro, B.; Sánchez-Rodríguez, A.; González-deSantos, L. M.

    2018-05-01

    During the last few years there has been substantial methodological development in the automatic processing of 3D point cloud data acquired by both terrestrial and aerial mobile mapping systems, motivated by improvements in surveying technologies and hardware performance. This paper presents a methodology that first extracts geometric and semantic information about the road markings within the surveyed area from Mobile Laser Scanning (MLS) data, and then employs it to isolate street areas where pedestrian crossings are found and where, therefore, pedestrians are more likely to cross the road. Different safety-related features can then be extracted to characterize how adequately each pedestrian crossing provides for safety, and the results can be displayed in a Geographical Information System (GIS) layer. These features are defined in four processing modules: accessibility analysis, traffic light classification, traffic sign classification, and visibility analysis. The proposed methodology was validated in two cities in the northwest of Spain, obtaining both quantitative and qualitative results for pedestrian crossing classification and for each processing module of the safety assessment of pedestrian crossing environments.
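
    A minimal sketch of the first step, road-marking extraction, assuming entirely synthetic data: retro-reflective paint returns markedly higher intensity than asphalt, so a simple intensity threshold over ground points already isolates candidate marking points; the paper's actual pipeline is more elaborate.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        # Hypothetical MLS ground points: x, y (m), z (m), with ~5% of the points
        # on retro-reflective road paint (high recorded intensity)
        xyz = rng.uniform(0.0, 50.0, (n, 3)) * np.array([1.0, 1.0, 0.01])
        paint = rng.random(n) < 0.05
        intensity = np.where(paint, rng.uniform(0.8, 1.0, n), rng.uniform(0.0, 0.4, n))

        # Simple global threshold; Otsu or adaptive thresholds are common refinements
        thr = intensity.mean() + 2.0 * intensity.std()
        markings = xyz[intensity > thr]

        # Zebra crossings could then be isolated by clustering `markings` into
        # groups of parallel stripes with regular spacing
        print(len(markings))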

  15. Multiscale/multiresolution landslides susceptibility mapping

    NASA Astrophysics Data System (ADS)

    Grozavu, Adrian; Cătălin Stanga, Iulian; Valeriu Patriche, Cristian; Toader Juravle, Doru

    2014-05-01

    Within European strategies, landslides are considered an important threat that requires detailed studies to identify areas where these processes could occur in the future and to design scientific and technical plans for landslide risk mitigation. To this end, assessing and mapping landslide susceptibility is an important preliminary step. Generally, landslide susceptibility at small scale (for large regions) can be assessed through a qualitative approach (expert judgement) based on a few variables, while studies at medium and large scales require a quantitative approach (e.g. multivariate statistics), a larger set of variables and, necessarily, a landslide inventory. Obviously, the results vary more or less from one scale to another, depending on the available input data but also on the applied methodology. Since it is almost impossible to have a complete landslide inventory over large regions (e.g. at the continental level), it is very important to verify the compatibility and validity of results obtained at different scales, identifying the differences and fixing the inherent errors. This paper aims at assessing and mapping landslide susceptibility at the regional level through a multiscale, multiresolution approach, from small scale and low resolution to large scale and high resolution of data and results, comparing the compatibility of the results. While the former could be used for studies at the European and national levels, the latter allow validation of the results, including through field surveys. The test area, namely the Barlad Plateau (more than 9000 sq. km), is located in Eastern Romania, covering a region where both the natural environment and the human factor create a causal context that favors these processes. The landslide predictors were initially derived from various databases available at the pan-European level and progressively completed and/or enhanced together with the scale and resolution: topography (from SRTM at 90 meters to digital elevation models based on topographic maps at 1:25,000 and 1:5,000), lithology (from geological maps at 1:200,000), land cover and land use (from CLC 2006 to maps derived from orthorectified aerial images at 0.5 meters resolution), rainfall (from Worldclim and ECAD to our own data), seismicity (the seismic zonation of Romania), etc. The landslide inventory was created as polygonal data based on aerial images (0.5 meters resolution), the information being aggregated at the county level (NUTS 3) and, eventually, at the communal level (LAU2). The methodological framework is based on logistic regression as a quantitative method and on the analytic hierarchy process as a semi-qualitative method, both being applied once identically for all scales and once recalibrated for each scale and resolution (from 1:1,000,000 at one km pixel resolution to 1:25,000 at ten meters resolution). The predictive performance of the two models was assessed using the ROC (Receiver Operating Characteristic) curve and the AUC (Area Under Curve) parameter, and the results indicate a good correspondence between the susceptibility estimated for the test samples (0.855-0.890) and for the validation samples (0.830-0.865). Finally, the results were compared in pairs in order to fix the errors at small scale and low resolution and to optimize the methodology for landslide susceptibility mapping over large areas.
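
    A minimal sketch of the quantitative branch of such a framework: a logistic regression fitted to hypothetical pixel predictors and a synthetic landslide inventory, validated with the ROC/AUC criterion used in the study.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)

        # Hypothetical per-pixel predictors (slope, lithology, land use, rainfall,
        # seismicity ...) and a binary landslide-inventory label
        n = 20_000
        X = rng.normal(size=(n, 5))
        p = 1.0 / (1.0 + np.exp(-(1.5 * X[:, 0] + 0.8 * X[:, 3] - 1.0)))
        y = rng.binomial(1, p)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        model = LogisticRegression().fit(X_tr, y_tr)

        # Susceptibility = predicted probability; the AUC on held-out pixels plays
        # the role of the 0.83-0.89 validation values reported in the abstract
        auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
        print(f"AUC = {auc:.3f}")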

  16. Arrhythmia Mechanism and Scaling Effect on the Spectral Properties of Electroanatomical Maps With Manifold Harmonics.

    PubMed

    Sanroman-Junquera, Margarita; Mora-Jimenez, Inmaculada; Garcia-Alberola, Arcadio; Caamano, Antonio J; Trenor, Beatriz; Rojo-Alvarez, Jose L

    2018-04-01

    Spatial and temporal processing of intracardiac electrograms provides relevant information to support the arrhythmia ablation during electrophysiological studies. Current cardiac navigation systems (CNS) and electrocardiographic imaging (ECGI) build detailed 3-D electroanatomical maps (EAM), which represent the spatial anatomical distribution of bioelectrical features, such as activation time or voltage. We present a principled methodology for spectral analysis of both EAM geometry and bioelectrical feature in CNS or ECGI, including their spectral representation, cutoff frequency, or spatial sampling rate (SSR). Existing manifold harmonic techniques for spectral mesh analysis are adapted to account for a fourth dimension, corresponding to the EAM bioelectrical feature. Appropriate scaling is required to address different magnitudes and units. With our approach, simulated and real EAM showed strong SSR dependence on both the arrhythmia mechanism and the cardiac anatomical shape. For instance, high frequencies increased significantly the SSR because of the "early-meets-late" in flutter EAM, compared with the sinus rhythm. Besides, higher frequency components were obtained for the left atrium (more complex anatomy) than for the right atrium in sinus rhythm. The proposed manifold harmonics methodology opens the field toward new signal processing tools for principled EAM spatiofeature analysis in CNS and ECGI, and to an improved knowledge on arrhythmia mechanisms.
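
    The sketch below conveys the core idea on a toy example: treat the map as a graph, take Laplacian eigenvectors as the low-to-high-frequency harmonics, and expand the bioelectrical feature in that basis (the paper's method additionally folds the feature into the mesh geometry as a fourth, scaled dimension). The vertices, connectivity and feature here are all hypothetical.

        import numpy as np
        from scipy.sparse import csgraph
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(0)

        # Hypothetical EAM: 500 mesh vertices, one bioelectrical feature per vertex
        verts = rng.normal(size=(500, 3))
        feature = np.sin(2.0 * verts[:, 0]) + 0.1 * rng.normal(size=500)

        # k-nearest-neighbour graph as a stand-in for triangle-mesh connectivity
        _, idx = cKDTree(verts).query(verts, k=7)
        W = np.zeros((500, 500))
        for i, nbrs in enumerate(idx):
            W[i, nbrs[1:]] = 1.0
        W = np.maximum(W, W.T)                       # symmetrise the adjacency

        # Harmonics = eigenvectors of the graph Laplacian; small eigenvalues
        # correspond to low spatial frequencies on the surface
        L = csgraph.laplacian(W, normed=True)
        eigvals, eigvecs = np.linalg.eigh(L)

        # Spectrum of the feature: a fast-decaying spectrum means a coarse spatial
        # sampling rate suffices; "early-meets-late" lines add high frequencies
        coeffs = eigvecs.T @ feature
        print(np.round(np.abs(coeffs[:5]), 2))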

  17. Getting past the dual logic: findings from a pilot asset mapping exercise in Sheffield, UK.

    PubMed

    South, Jane; Giuntoli, Gianfranco; Kinsella, Karina

    2017-01-01

    Asset-based approaches seek to identify and mobilise the personal, social and organisational resources available to communities. Asset mapping is a recognised method of gathering an inventory of neighbourhood assets and is underpinned by a fundamentally different logic to traditional needs assessments. The aim of this paper is to explore how asset mapping might be used as a tool for health improvement. It reports on a qualitative evaluation of a pilot asset mapping project carried out in two economically disadvantaged neighbourhoods in Sheffield, UK. The project involved community health champions working with two community organisations to identify assets linked to the health and wellbeing of their neighbourhoods. The evaluation was undertaken in 2012, after mapping activities had been completed. A qualitative design, using a theory-of-change methodology, was used to explore assumptions between activities, mechanisms and outcomes. Semi-structured interviews were undertaken with a purposive sample of 11 stakeholders, including champions, community staff and strategic partners. Thematic analysis was used, and themes were identified on the process of asset mapping, the role of champions and the early outcomes for neighbourhoods and services. Findings showed that asset mapping was developmental, and understandings grew as participatory activities were planned and implemented. The role of the champions was limited by the numbers involved; nonetheless, meaningful engagement occurred with residents, which led to personal and social resources being identified. Most early outcomes were focused on the lead community organisations. There was less evidence of results feeding into wider planning processes because of the requirements for more quantifiable information. The paper discusses the importance of relational aspects of asset mapping, both within communities and between communities and services. The conclusions are that it is insufficient to switch from the logic of needs to assets without building asset mapping into a broader planning process. © 2015 John Wiley & Sons Ltd.

  18. Structural knowledge learning from maps for supervised land cover/use classification: Application to the monitoring of land cover/use maps in French Guiana

    NASA Astrophysics Data System (ADS)

    Bayoudh, Meriam; Roux, Emmanuel; Richard, Gilles; Nock, Richard

    2015-03-01

    The number of satellites and sensors devoted to Earth observation has increased considerably, delivering extensive data, especially images. At the same time, access to such data and to the tools needed to process them has considerably improved. Given this data flow, automatic image interpretation methods are needed, especially for the monitoring and prediction of environmental and societal changes in highly dynamic socio-environmental contexts. This can be accomplished via artificial intelligence. The concept described here relies on the induction of classification rules that explicitly take structural knowledge into account, using Aleph, an Inductive Logic Programming (ILP) system, combined with a multi-class classification procedure. This methodology was used to monitor changes in land cover/use of the French Guiana coastline. One hundred and fifty-eight classification rules were induced from 3 diachronic land cover/use maps comprising 38 classes. These rules are expressed in first-order logic, which makes them easily understandable by non-experts. A 10-fold cross-validation gave significant average values of 84.62%, 99.57% and 77.22% for classification accuracy, specificity and sensitivity, respectively. Our methodology could be beneficial for automatically classifying new objects and for facilitating object-based classification procedures.
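
    A minimal sketch of the evaluation step, on synthetic data: a decision tree stands in for the first-order rules induced by Aleph (it likewise yields human-readable rules, though only propositional ones), and 10-fold cross-validated predictions give the accuracy, specificity and sensitivity reported above.

        import numpy as np
        from sklearn.model_selection import cross_val_predict
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.metrics import confusion_matrix

        rng = np.random.default_rng(3)

        # Hypothetical object features and a binary class label
        X = rng.normal(size=(1000, 6))
        y = (X[:, 0] + X[:, 2] > 0).astype(int)

        y_hat = cross_val_predict(DecisionTreeClassifier(max_depth=4), X, y, cv=10)
        tn, fp, fn, tp = confusion_matrix(y, y_hat).ravel()

        accuracy = (tp + tn) / (tp + tn + fp + fn)
        specificity = tn / (tn + fp)
        sensitivity = tp / (tp + fn)
        print(round(accuracy, 3), round(specificity, 3), round(sensitivity, 3))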

  19. Spatiotemporal integration of molecular and anatomical data in virtual reality using semantic mapping.

    PubMed

    Soh, Jung; Turinsky, Andrei L; Trinh, Quang M; Chang, Jasmine; Sabhaney, Ajay; Dong, Xiaoli; Gordon, Paul Mk; Janzen, Ryan Pw; Hau, David; Xia, Jianguo; Wishart, David S; Sensen, Christoph W

    2009-01-01

    We have developed a computational framework for spatiotemporal integration of molecular and anatomical datasets in a virtual reality environment. Using two case studies involving gene expression data and pharmacokinetic data, respectively, we demonstrate how existing knowledge bases for molecular data can be semantically mapped onto a standardized anatomical context of human body. Our data mapping methodology uses ontological representations of heterogeneous biomedical datasets and an ontology reasoner to create complex semantic descriptions of biomedical processes. This framework provides a means to systematically combine an increasing amount of biomedical imaging and numerical data into spatiotemporally coherent graphical representations. Our work enables medical researchers with different expertise to simulate complex phenomena visually and to develop insights through the use of shared data, thus paving the way for pathological inference, developmental pattern discovery and biomedical hypothesis testing.

  20. An atlas of ShakeMaps for selected global earthquakes

    USGS Publications Warehouse

    Allen, Trevor I.; Wald, David J.; Hotovec, Alicia J.; Lin, Kuo-Wan; Earle, Paul S.; Marano, Kristin D.

    2008-01-01

    An atlas of maps of peak ground motions and intensity 'ShakeMaps' has been developed for almost 5,000 recent and historical global earthquakes. These maps are produced using established ShakeMap methodology (Wald and others, 1999c; Wald and others, 2005) and constraints from macroseismic intensity data, instrumental ground motions, regional topographically-based site amplifications, and published earthquake-rupture models. Applying the ShakeMap methodology allows a consistent approach to combine point observations with ground-motion predictions to produce descriptions of peak ground motions and intensity for each event. We also calculate an estimated ground-motion uncertainty grid for each earthquake. The Atlas of ShakeMaps provides a consistent and quantitative description of the distribution and intensity of shaking for recent global earthquakes (1973-2007) as well as selected historic events. As such, the Atlas was developed specifically for calibrating global earthquake loss estimation methodologies to be used in the U.S. Geological Survey Prompt Assessment of Global Earthquakes for Response (PAGER) Project. PAGER will employ these loss models to rapidly estimate the impact of global earthquakes as part of the USGS National Earthquake Information Center's earthquake-response protocol. The development of the Atlas of ShakeMaps has also led to several key improvements to the Global ShakeMap system. The key upgrades include: addition of uncertainties in the ground motion mapping, introduction of modern ground-motion prediction equations, improved estimates of global seismic-site conditions (VS30), and improved definition of stable continental region polygons. Finally, we have merged all of the ShakeMaps in the Atlas to provide a global perspective of earthquake ground shaking for the past 35 years, allowing comparison with probabilistic hazard maps. The online Atlas and supporting databases can be found at http://earthquake.usgs.gov/eqcenter/shakemap/atlas.php/.

  1. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    A framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) during the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) select the information products which match or support the accepted processes and products of step 4; and (6) select the design methodology which produces the information products selected in step 5.
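
    The chained mappings of steps 2-5 can be pictured as simple lookups; the sketch below, with entirely hypothetical needs, criteria and products, shows how a quality-needs profile propagates to a selected set of information products.

        # Hypothetical mapping tables for steps 2-5
        NEEDS_TO_CRITERIA = {
            "high reliability": ["fault tolerance", "testability"],
            "ease of change": ["modularity", "documentation"],
        }
        CRITERIA_TO_PRODUCTS = {
            "fault tolerance": ["FMEA report", "redundancy design document"],
            "testability": ["unit test plan"],
            "modularity": ["interface specification"],
            "documentation": ["maintenance manual"],
        }

        def tailor(profile: list[str]) -> set[str]:
            """Select the information products supporting a quality-needs profile."""
            products: set[str] = set()
            for need in profile:
                for criterion in NEEDS_TO_CRITERIA.get(need, []):
                    products.update(CRITERIA_TO_PRODUCTS.get(criterion, []))
            return products

        print(tailor(["high reliability", "ease of change"]))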

  2. Mapping the Ethnographic Journey: A "Road Map" for Novice Researchers Wanting to Engage in Ethnography, Critical Theory and Policy Analysis

    ERIC Educational Resources Information Center

    Naidu, Sham

    2012-01-01

    In this article, the "researcher" narrates the issues faced by novice researchers in choosing the correct lenses to conduct research when searching for the truth via the use of qualitative methodology. It is argued that choosing an appropriate research approach and methodology can be described as an "arduous" journey. For the…

  3. Concept mapping methodology and community-engaged research: A perfect pairing.

    PubMed

    Vaughn, Lisa M; Jones, Jennifer R; Booth, Emily; Burke, Jessica G

    2017-02-01

    Concept mapping methodology as refined by Trochim et al. is uniquely suited to engage communities in all aspects of research from project set-up to data collection to interpreting results to dissemination of results, and an increasing number of research studies have utilized the methodology for exploring complex health issues in communities. In the current manuscript, we present the results of a literature search of peer-reviewed articles in health-related research where concept mapping was used in collaboration with the community. A total of 103 articles met the inclusion criteria. We first address how community engagement was defined in the articles and then focus on the articles describing high community engagement and the associated community outcomes/benefits and methodological challenges. A majority (61%; n=63) of the articles were classified as low to moderate community engagement and participation while 38% (n=39) of the articles were classified as high community engagement and participation. The results of this literature review enhance our understanding of how concept mapping can be used in direct collaboration with communities and highlights the many potential benefits for both researchers and communities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. A methodology for small scale rural land use mapping in semi-arid developing countries using orbital imagery. Part 6: A low-cost method for land use mapping using simple visual techniques of interpretation. [Spain]

    NASA Technical Reports Server (NTRS)

    Vangenderen, J. L. (Principal Investigator); Lock, B. F.

    1976-01-01

    The author has identified the following significant results. It was found that color composite transparencies and monocular magnification provided the best base for land use interpretation. New methods for determining optimum sample sizes and analyzing interpretation accuracy levels were developed. All stages of the methodology were assessed, in the operational sense, during the production of a 1:250,000 rural land use map of Murcia Province, Southeast Spain.

  5. Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.

    2006-12-01

    An increasing number of nations around the world need to develop tsunami mitigation plans, which invariably involve inundation maps for warning guidance and evacuation planning. There is a risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events, with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency managers. Incorrectly assessing tsunami impact is hazardous, as recent events in 2006 in Tonga, Kythira (Greece) and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces and runup on coastal structures, and inundation of coastlines, one must calculate the evolution of the tsunami wave from the deep ocean to its target site numerically. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model used represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have themselves been validated with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with laboratory and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions with any given source motions, validated codes reduce the level of uncertainty in their results to the uncertainty in the geophysical initial conditions. Further, when coupled with real-time free-field tsunami measurements from tsunameters, validated codes are the only choice for realistic forecasting of inundation; the consequences of failure are too ghastly to take chances with numerical procedures that have not been validated. We discuss a ten-step process of benchmark tests for models used for inundation mapping. The associated methodology and algorithms have first to be validated with analytical solutions, then verified with laboratory measurements and field data. The models need to be published in the scientific literature in peer-reviewed journals indexed by ISI. While this process may appear onerous, it reflects our state of knowledge, and is the only defensible methodology when human lives are at stake. Synolakis, C.E., and Bernard, E.N., Tsunami science before and beyond Boxing Day 2004, Phil. Trans. R. Soc. A 364(1845), 2231-2263, 2005.

  6. Rational variety mapping for contrast-enhanced nonlinear unsupervised segmentation of multispectral images of unstained specimen.

    PubMed

    Kopriva, Ivica; Hadžija, Mirko; Popović Hadžija, Marijana; Korolija, Marina; Cichocki, Andrzej

    2011-08-01

    A methodology is proposed for nonlinear contrast-enhanced unsupervised segmentation of multispectral (color) microscopy images of principally unstained specimens. The methodology exploits spectral diversity and spatial sparseness to find anatomical differences between materials (cells, nuclei, and background) present in the image. It consists of rth-order rational variety mapping (RVM) followed by matrix/tensor factorization. Sparseness constraint implies duality between nonlinear unsupervised segmentation and multiclass pattern assignment problems. Classes not linearly separable in the original input space become separable with high probability in the higher-dimensional mapped space. Hence, RVM mapping has two advantages: it takes implicitly into account nonlinearities present in the image (ie, they are not required to be known) and it increases spectral diversity (ie, contrast) between materials, due to increased dimensionality of the mapped space. This is expected to improve performance of systems for automated classification and analysis of microscopic histopathological images. The methodology was validated using RVM of the second and third orders of the experimental multispectral microscopy images of unstained sciatic nerve fibers (nervus ischiadicus) and of unstained white pulp in the spleen tissue, compared with a manually defined ground truth labeled by two trained pathophysiologists. The methodology can also be useful for additional contrast enhancement of images of stained specimens. Copyright © 2011 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.

  7. Landslide susceptibility mapping in the coastal region in the State of São Paulo, Brazil

    NASA Astrophysics Data System (ADS)

    Alvala, R. C.; Camarinha, P. I.; Canavesi, V.

    2013-05-01

    The exposure of populations in risk areas is a matter of global concern, because it is a determining factor in the occurrence of natural disasters. Furthermore, an intensification of extreme hydrometeorological events that trigger disasters has been observed in various parts of the globe, further increasing the need for monitoring and alerting for natural disasters, with the aim of safeguarding life and minimizing economic losses. Accordingly, different methodologies for risk assessment have been proposed, each focusing on specific natural hazards. Particularly in Brazil, whose main economic axes of development lie in the regions near the coast, it is common to observe urbanization advancing onto the steep slopes of mountainous regions. This exposes the population to natural hazards related to mass movements, among which landslides stand out as the cause of many deaths and economic losses every year. Thus, prior to risk analysis (where human occupation intersects with natural hazard), it is essential to analyze susceptibility, which reflects the physical and environmental conditions that trigger such phenomena. However, this task becomes a major challenge due to the difficulty of finding databases of good quality. In this context, this paper presents a methodology based only on spatial information in the public domain, integrated into a free Geographic Information System, in order to analyze landslide susceptibility. In a first effort, we evaluated four counties of Southeastern Brazil - Santos, Cubatão, Caraguatatuba and Ubatuba - located in a region that includes the rugged relief of the Serra do Mar and the transition to the coastal zone, and that has a history of related disasters. Notably, the methodology takes into account many variables that were weighted and combined using the fuzzy gamma technique, such as topography (horizontal and vertical curvature of the slopes), geology, geomorphology, slope, land use and pedology. As a result, we obtain 5 susceptibility classes: very low, low, medium, high and very high. To validate the methodology, the landslide susceptibility map was overlaid on real risk areas previously mapped, provided by the National Centre for Monitoring and Alert of Natural Disasters. This step is important especially for assessing how well the methodology identifies the classes mapped as high and very high susceptibility. The preliminary results indicate that over 70% of the mapped risk areas are located in the most susceptible classes. We observed small inconsistencies related to the spatial displacement of the various databases considered, which have different resolutions and scales. Therefore, the results indicate that the methodology is robust and show the high vulnerability of the counties analyzed, further highlighting that landslide susceptibility should be monitored carefully by decision makers in order to prevent and minimize the impact of natural disasters and to support better territorial planning.
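
    A minimal sketch of the fuzzy gamma overlay at the heart of the method: membership layers are combined as (fuzzy algebraic sum)^gamma x (fuzzy algebraic product)^(1-gamma) and then sliced into five classes; the layer values and the choice gamma = 0.9 are hypothetical.

        import numpy as np

        def fuzzy_gamma(layers: np.ndarray, gamma: float = 0.9) -> np.ndarray:
            """Fuzzy gamma overlay of membership layers in [0, 1],
            shaped (n_layers, rows, cols)."""
            product = np.prod(layers, axis=0)               # fuzzy algebraic product
            fsum = 1.0 - np.prod(1.0 - layers, axis=0)      # fuzzy algebraic sum
            return fsum ** gamma * product ** (1.0 - gamma)

        # Hypothetical membership layers: slope, geology, land use, curvature ...
        rng = np.random.default_rng(4)
        layers = rng.uniform(size=(6, 200, 200))
        susceptibility = fuzzy_gamma(layers, gamma=0.9)

        # Classify into the five classes used in the study (very low ... very high)
        bins = np.quantile(susceptibility, [0.2, 0.4, 0.6, 0.8])
        classes = np.digitize(susceptibility, bins)
        print(np.bincount(classes.ravel()))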

  8. Implementation of efficient trajectories for an ultrasonic scanner using chaotic maps

    NASA Astrophysics Data System (ADS)

    Almeda, A.; Baltazar, A.; Treesatayapun, C.; Mijarez, R.

    2012-05-01

    Typical ultrasonic methodology for nondestructive scanning evaluation uses systematic scanning paths. In many cases this approach is time-inefficient and consumes unnecessary energy and computational power. Here, a methodology is proposed for the scanning of defects using an ultrasonic pulse-echo scanning technique combined with chaotic trajectory generation. It is implemented on a Cartesian-coordinate robotic system developed in our lab. To cover the entire search area, a chaotic function and a proposed mirror mapping were incorporated. To improve the detection probability, the proposed scanning methodology is complemented with a probabilistic approach to discontinuity detection. The developed methodology was found to be more efficient than the traditional ones used to localize and characterize hidden flaws.
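
    A minimal sketch of chaotic trajectory generation, assuming a logistic map on each axis; the paper's actual chaotic function and mirror mapping may differ, and the workspace size and seeds here are hypothetical.

        import numpy as np

        def chaotic_scan(n_points: int, x0: float = 0.3, y0: float = 0.7,
                         r: float = 3.99, size: float = 100.0) -> np.ndarray:
            """Scan positions from two logistic maps, one per Cartesian axis,
            scaled to a `size` x `size` workspace (mm)."""
            pts = np.empty((n_points, 2))
            x, y = x0, y0
            for i in range(n_points):
                x = r * x * (1.0 - x)       # logistic map, chaotic for r near 4
                y = r * y * (1.0 - y)
                pts[i] = (x * size, y * size)
            return pts

        waypoints = chaotic_scan(500)
        print(waypoints[:3])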

  9. [Health vulnerability mapping in the Community of Madrid (Spain)].

    PubMed

    Ramasco-Gutiérrez, Milagros; Heras-Mosteiro, Julio; Garabato-González, Sonsoles; Aránguez-Ruiz, Emiliano; Aguirre Martín-Gil, Ramón

    The Public Health General Directorate of Madrid has developed a health vulnerability mapping methodology to assist regional social health teams in health planning, prioritisation and intervention, based on a model of social determinants of health and an equity approach. This process began with the selection of the areas with the worst social indicators of health vulnerability. Key stakeholders of the region then jointly identified priority areas for intervention and developed a consensual plan of action. We present the outcomes of this experience and its connection with theoretical models of asset-based community development, health-integrated georeferencing systems and community health interventions. Copyright © 2016 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  10. What is an evidence map? A systematic review of published evidence maps and their definitions, methods, and products.

    PubMed

    Miake-Lye, Isomi M; Hempel, Susanne; Shanman, Roberta; Shekelle, Paul G

    2016-02-10

    The need for systematic methods for reviewing evidence is continuously increasing. Evidence mapping is one emerging method. There are no authoritative recommendations for what constitutes an evidence map or what methods should be used, and anecdotal evidence suggests heterogeneity in both. Our objectives are to identify published evidence maps and to compare and contrast the presented definitions of evidence mapping, the domains used to classify data in evidence maps, and the form the evidence map takes. We conducted a systematic review of publications that presented results with a process termed "evidence mapping" or included a figure called an "evidence map." We identified publications from searches of ten databases through 8/21/2015, reference mining, and consulting topic experts. We abstracted the research question, the unit of analysis, the search methods and search period covered, and the country of origin. Data were narratively synthesized. Thirty-nine publications met inclusion criteria. Published evidence maps varied in their definition and the form of the evidence map. Of the 31 definitions provided, 67 % described the purpose as identification of gaps and 58 % referenced a stakeholder engagement process or user-friendly product. All evidence maps explicitly used a systematic approach to evidence synthesis. Twenty-six publications referred to a figure or table explicitly called an "evidence map," eight referred to an online database as the evidence map, and five stated they used a mapping methodology but did not present a visual depiction of the evidence. The principal conclusion of our evaluation of studies that call themselves "evidence maps" is that the implied definition of what constitutes an evidence map is a systematic search of a broad field to identify gaps in knowledge and/or future research needs that presents results in a user-friendly format, often a visual figure or graph, or a searchable database. Foundational work is needed to better standardize the methods and products of an evidence map so that researchers and policymakers will know what to expect of this new type of evidence review. Although an a priori protocol was developed, no registration was completed; this review did not fit the PROSPERO format.

  11. Brief Communication: Mapping river ice using drones and structure from motion

    NASA Astrophysics Data System (ADS)

    Alfredsen, Knut; Haas, Christian; Tuhtan, Jeffrey A.; Zinke, Peggy

    2018-02-01

    In cold climate regions, the formation and break-up of river ice is important for river morphology, winter water supply, and riparian and instream ecology as well as for hydraulic engineering. Data on river ice is therefore significant, both to understand river ice processes directly and to assess ice effects on other systems. Ice measurement is complicated due to difficult site access, the inherent complexity of ice formations, and the potential danger involved in carrying out on-ice measurements. Remote sensing methods are therefore highly useful, and data from satellite-based sensors and, increasingly, aerial and terrestrial imagery are currently applied. Access to low cost drone systems with quality cameras and structure from motion software opens up a new possibility for mapping complex ice formations. Through this method, a georeferenced surface model can be built and data on ice thickness, spatial distribution, and volume can be extracted without accessing the ice, and with considerably fewer measurement efforts compared to traditional surveying methods. A methodology applied to ice mapping is outlined here, and examples are shown of how to successfully derive quantitative data on ice processes.

  12. Advances in the development of common noise assessment methods in Europe: The CNOSSOS-EU framework for strategic environmental noise mapping.

    PubMed

    Kephalopoulos, Stylianos; Paviotti, Marco; Anfosso-Lédée, Fabienne; Van Maercke, Dirk; Shilton, Simon; Jones, Nigel

    2014-06-01

    The Environmental Noise Directive (2002/49/EC) requires EU Member States to determine the exposure to environmental noise through strategic noise mapping and to elaborate action plans in order to reduce noise pollution, where necessary. A common framework for noise assessment methods (CNOSSOS-EU) has been developed by the European Commission in co-operation with the EU Member States to be applied for strategic noise mapping as required by the Environment Noise Directive (2002/49/EC). CNOSSOS-EU represents a harmonised and coherent approach to assess noise levels from the main sources of noise (road traffic, railway traffic, aircraft and industrial) across Europe. This paper outlines the process behind the development of CNOSSOS-EU and the parts of the CNOSSOS-EU core methodological framework which were developed during phase A of the CNOSSOS-EU process (2010-2012), whilst focusing on the main scientific and technical issues that were addressed, and the implementation challenges that are being faced before it can become fully operational in the EU MS. Copyright © 2014. Published by Elsevier B.V.

  13. First Map of Residential Indoor Radon Measurements in Azerbaijan.

    PubMed

    Hoffmann, M; Aliyev, C S; Feyzullayev, A A; Baghirli, R J; Veliyeva, F F; Pampuri, L; Valsangiacomo, C; Tollefsen, T; Cinelli, G

    2017-06-15

    This article describes the results of the first measurements of indoor radon concentrations in Azerbaijan, including a description of the methodology and the mathematical and statistical processing of the results obtained. Measured radon concentrations varied considerably: from almost radon-free houses to around 1100 Bq m⁻³. However, only ~7% of the total number of measurements exceeded the maximum permissible concentrations. Based on these data, maps of the distribution of volumetric activity and elevated indoor radon concentrations in Azerbaijan were created. These maps reflect a mosaic character of the distribution of radon, with enhanced values confined to seismically active areas at the intersection of the active West Caspian fault with sub-latitudinal faults along the Greater and Lesser Caucasus and the Talysh mountains. Spatial correlation of radon and temperature behavior is also described. The data gathered on residential indoor radon have been integrated into the European Indoor Radon Map. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. Automated clustering of probe molecules from solvent mapping of protein surfaces: new algorithms applied to hot-spot mapping and structure-based drug design

    NASA Astrophysics Data System (ADS)

    Lerner, Michael G.; Meagher, Kristin L.; Carlson, Heather A.

    2008-10-01

    Use of solvent mapping, based on multiple-copy minimization (MCM) techniques, is common in structure-based drug discovery. The minima of small-molecule probes define locations for complementary interactions within a binding pocket. Here, we present improved methods for MCM. In particular, a Jarvis-Patrick (JP) method is outlined for grouping the final locations of minimized probes into physical clusters. This algorithm has been tested through a study of protein-protein interfaces, showing the process to be robust, deterministic, and fast in the mapping of protein "hot spots." Improvements in the initial placement of probe molecules are also described. A final application to HIV-1 protease shows how our automated technique can be used to partition data too complicated to analyze by hand. These new automated methods may be easily and quickly extended to other protein systems, and our clustering methodology may be readily incorporated into other clustering packages.
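
    A compact sketch of Jarvis-Patrick clustering as commonly defined: two points join the same cluster when each lies in the other's k-nearest-neighbour list and they share at least kmin neighbours; the probe coordinates, k and kmin here are hypothetical.

        import numpy as np
        from scipy.spatial import cKDTree

        def jarvis_patrick(points: np.ndarray, k: int = 8, kmin: int = 4) -> np.ndarray:
            """Label points by Jarvis-Patrick clustering (union-find merging)."""
            n = len(points)
            _, knn = cKDTree(points).query(points, k=k + 1)
            nbrs = [set(row[1:]) for row in knn]            # drop self-neighbour
            parent = np.arange(n)

            def find(i: int) -> int:
                while parent[i] != i:
                    parent[i] = parent[parent[i]]           # path halving
                    i = parent[i]
                return i

            for i in range(n):
                for j in nbrs[i]:
                    if i in nbrs[j] and len(nbrs[i] & nbrs[j]) >= kmin:
                        parent[find(i)] = find(j)           # merge clusters
            return np.array([find(i) for i in range(n)])

        # Hypothetical minimized probe positions around two hot spots
        rng = np.random.default_rng(5)
        probes = np.vstack([rng.normal(0.0, 0.5, (40, 3)), rng.normal(5.0, 0.5, (40, 3))])
        print(np.unique(jarvis_patrick(probes)).size)   # typically 2 for these blobs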

  15. Psychosocial experiences associated with confirmed and self-identified dyslexia: a participant-driven concept map of adult perspectives.

    PubMed

    Nalavany, Blace Arthur; Carawan, Lena Williams; Rennick, Robyn A

    2011-01-01

    Concept mapping (a mixed qualitative-quantitative methodology) was used to describe and understand the psychosocial experiences of adults with confirmed and self-identified dyslexia. Using innovative processes of art and photography, Phase 1 of the study included 15 adults who participated in focus groups and in-depth interviews and were asked to elucidate their experiences with dyslexia. On index cards, 75 statements and experiences with dyslexia were recorded. The second phase of the study included 39 participants who sorted these statements into self-defined categories and rated each statement to reflect their personal experiences to produce a visual representation, or concept map, of their experience. The final concept map generated nine distinct cluster themes: Organization Skills for Success; Finding Success; A Good Support System Makes the Difference; On Being Overwhelmed; Emotional Downside; Why Can't They See It?; Pain, Hurt, and Embarrassment From Past to Present; Fear of Disclosure; and Moving Forward. Implications of these findings are discussed.

  16. Spatial planning using probabilistic flood maps

    NASA Astrophysics Data System (ADS)

    Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano

    2015-04-01

    Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data and parametric uncertainty. Decision makers prefer less ambiguous information from modellers; this means uncertainty is often suppressed to yield binary flood maps. However, suppressing information may lead to surprises or to misleading decisions. Including uncertain information in the decision-making process is therefore desirable and more transparent. To this end, we utilise prospect theory and information from a probabilistic flood map to evaluate potential decisions. Consequences of the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made between options whose probabilities of occurrence are known, and accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that decision making is most pronounced when there are high gains and losses, implying higher payoffs and penalties and therefore a higher gamble. The methodology may thus be appropriately considered when making decisions based on uncertain information.
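
    A minimal sketch of a prospect-theory evaluation, using the Tversky-Kahneman (1992) value and probability-weighting functions with their published parameter estimates; the flood probability and payoffs are hypothetical, and the paper's exact formulation may differ.

        def value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
            """Value function: concave for gains, convex and steeper for losses."""
            return x ** alpha if x >= 0 else -lam * (-x) ** alpha

        def weight(p: float, gamma: float = 0.61) -> float:
            """Probability weighting: overweights small p, underweights large p."""
            return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

        # Hypothetical choice: develop a parcel that the probabilistic flood map
        # assigns a 10% inundation probability
        p_flood = 0.10
        gain, loss = 100.0, -250.0      # payoff if dry, damage if flooded (kEUR)

        prospect = weight(p_flood) * value(loss) + weight(1.0 - p_flood) * value(gain)
        print(round(prospect, 1))       # negative => a loss-averse actor declines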

  17. Acquisition and Processing of High Resolution Hyperspectral Imageries for the 3d Mapping of Urban Heat Islands and Microparticles of Montreal

    NASA Astrophysics Data System (ADS)

    Mongeau, R.; Baudouin, Y.; Cavayas, F.

    2017-10-01

    Ville de Montreal wanted to develop a system to identify heat islands and microparticles at the urban scale and to study their formation. The UQAM and UdeM universities joined their expertise under the framework "Observatoire Spatial Urbain" to create a representative geospatial database of thermal and atmospheric parameters collected during the summer months. They developed an innovative methodology for processing high-resolution hyperspectral images (1-2 m). In partnership with Ville de Montreal, they integrated 3D geospatial data (topography, transportation and meteorology) in the process. The 3D mapping of intra-urban heat islands as well as airborne microparticles makes it possible, initially, to identify problematic situations for future civil protection interventions during extreme heat. Moreover, it will serve as a reference for Ville de Montreal in establishing a strategy for tree planting on the public domain and in the analysis of urban development projects.

  18. Transition Characteristic Analysis of Traffic Evolution Process for Urban Traffic Network

    PubMed Central

    Chen, Hong; Li, Yang

    2014-01-01

    The characterization of the dynamics of traffic states remains fundamental to finding solutions to diverse traffic problems. To gain more insight into traffic dynamics in the temporal domain, this paper explores temporal characteristics and distinct regularities in the traffic evolution process of an urban traffic network. We define traffic state patterns by clustering multidimensional traffic time series using self-organizing maps and construct a pattern transition network model that is appropriate for representing and analyzing the evolution process. The methodology is illustrated by an application to flow-rate data from multiple road sections of the network of Shenzhen's Nanshan District, China. Analysis and numerical results demonstrate that the methodology permits extracting many useful traffic transition characteristics, including stability, preference, activity, and attractiveness. In addition, more information about the relationships between these characteristics was extracted, which should be helpful in understanding the complex behavior of the temporal evolution of traffic patterns. PMID:24982969
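
    A minimal sketch of the pattern-definition step, assuming the third-party minisom package and synthetic flow-rate data: each time step is assigned to its winning map node, and transitions between consecutive winners define the edges of the transition network.

        import numpy as np
        from minisom import MiniSom     # third-party package, assumed installed

        rng = np.random.default_rng(6)

        # Hypothetical day of traffic: 288 five-minute intervals, flow rate on
        # 20 road sections (normalised)
        day = rng.uniform(size=(288, 20))

        som = MiniSom(4, 4, 20, sigma=1.0, learning_rate=0.5, random_seed=0)
        som.train(day, 5000)

        # Traffic state pattern = winning node per time step; count transitions
        states = [som.winner(v) for v in day]
        transitions: dict = {}
        for a, b in zip(states[:-1], states[1:]):
            transitions[(a, b)] = transitions.get((a, b), 0) + 1
        print(len(transitions), "distinct transitions")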

  19. Rapid identification of kidney cyst mutations by whole exome sequencing in zebrafish

    PubMed Central

    Ryan, Sean; Willer, Jason; Marjoram, Lindsay; Bagwell, Jennifer; Mankiewicz, Jamie; Leshchiner, Ignaty; Goessling, Wolfram; Bagnat, Michel; Katsanis, Nicholas

    2013-01-01

    Forward genetic approaches in zebrafish have provided invaluable information about developmental processes. However, the relative difficulty of mapping and isolating mutations has limited the number of new genetic screens. Recent improvements in the annotation of the zebrafish genome coupled to a reduction in sequencing costs prompted the development of whole genome and RNA sequencing approaches for gene discovery. Here we describe a whole exome sequencing (WES) approach that allows rapid and cost-effective identification of mutations. We used our WES methodology to isolate four mutations that cause kidney cysts; we identified novel alleles in two ciliary genes as well as two novel mutants. The WES approach described here does not require specialized infrastructure or training and is therefore widely accessible. This methodology should thus help facilitate genetic screens and expedite the identification of mutants that can inform basic biological processes and the causality of genetic disorders in humans. PMID:24130329

  1. Theoretical analysis of the all-fiberized, dispersion-managed regenerator for simultaneous processing of WDM channels

    NASA Astrophysics Data System (ADS)

    Kouloumentas, Christos

    2011-09-01

    The concept of the all-fiberized multi-wavelength regenerator is analyzed, and the design methodology for operation at 40 Gb/s is presented. The specific methodology has been applied in the past for the experimental proof-of-principle of the technique, but it has never been reported in detail. The regenerator is based on a strong dispersion map that is implemented using alternating dispersion compensating fibers (DCF) and single-mode fibers (SMF), and minimizes the nonlinear interaction between the wavelength-division multiplexing (WDM) channels. The optimized regenerator design, with +0.86 ps/nm/km average dispersion of the nonlinear fiber section, is further investigated. The specific design is capable of simultaneously processing five WDM channels with 800 GHz channel spacing and providing a Q-factor improvement higher than 1 dB for each channel. The cascadability of the regenerator is also indicated using a six-node metropolitan network simulation model.
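
    The average dispersion of such a map is simply the length-weighted mean of the span dispersions, as the short calculation below shows; the SMF/DCF dispersion values and span lengths are illustrative, not the paper's design values.

        # Typical fibre dispersions (ps/nm/km) and hypothetical span lengths (km)
        d_smf, d_dcf = 17.0, -100.0
        l_smf, l_dcf = 6.0, 1.0

        d_avg = (d_smf * l_smf + d_dcf * l_dcf) / (l_smf + l_dcf)
        print(f"{d_avg:.2f} ps/nm/km")  # 0.29 here; retuning the span lengths
                                        # shifts the map toward values like +0.86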

  2. Road-corridor planning in the EIA procedure in Spain. A review of case studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loro, Manuel, E-mail: manuel.loro@upm.es; Transport Research Centre; Centro de investigación del transporte, TRANSyT-UPM, ETSI Caminos, Canales y Puertos, Universidad Politécnica de Madrid, Prof. Aranguren s/n, 28040 Madrid

    The assessment of different alternatives in road-corridor planning must be based on a number of well-defined territorial variables that serve as decision making criteria, and this requires a high-quality preliminary environmental assessment study. In Spain the formal specifications for the technical requirements stipulate the constraints that must be considered in the early stages of defining road corridors, but not how they should be analyzed and ranked. As part of the feasibility study of a new road definition, the most common methodology is to establish different levels of Territorial Carrying Capacity (TCC) in the study area in order to summarize the territorial variables on thematic maps and to ease the tracing process of road-corridor layout alternatives. This paper explores the variables used in 22 road-construction projects conducted by the Ministry of Public Works that were subject to the Spanish EIA regulation and published between 2006 and 2008. The aim was to evaluate the quality of the methods applied and the homogeneity and suitability of the variables used for defining the TCC. The variables were clustered into physical, environmental, land-use and cultural constraints for the purpose of comparing the TCC values assigned in the studies reviewed. We found the average quality of the studies to be generally acceptable in terms of the justification of the methodology, the weighting and classification of the variables, and the creation of a synthesis map. Nevertheless, the methods for assessing the TCC are not sufficiently standardized; there is a lack of uniformity in the cartographic information sources and methodologies for the TCC valuation. -- Highlights: • We explore 22 road-corridor planning studies subjected to the Spanish EIA regulation. • We analyze the variables selected for defining territorial carrying capacity. • The quality of the studies is acceptable (methodology, variable weighting, mapping). • There is heterogeneity in the methods for territorial carrying capacity valuation.

  3. Moving from theory to practice: A participatory social network mapping approach to address unmet need for family planning in Benin.

    PubMed

    Igras, Susan; Diakité, Mariam; Lundgren, Rebecka

    2017-07-01

    In West Africa, social factors influence whether couples with unmet need for family planning act on birth-spacing desires. Tékponon Jikuagou is testing a social network-based intervention to reduce social barriers by diffusing new ideas. Individuals and groups judged socially influential by their communities provide entrée to networks. A participatory social network mapping methodology was designed to identify these diffusion actors. Analysis of monitoring data, in-depth interviews, and evaluation reports assessed the methodology's acceptability to communities and staff and whether it produced valid, reliable data to identify influential individuals and groups who diffuse new ideas through their networks. Results indicated the methodology's acceptability. Communities were actively and equitably engaged. Staff appreciated its ability to yield timely, actionable information. The mapping methodology also provided valid and reliable information by enabling communities to identify highly connected and influential network actors. Consistent with social network theory, this methodology resulted in the selection of informal groups and individuals in both informal and formal positions. In-depth interview data suggest these actors were diffusing new ideas, further confirming their influence/connectivity. The participatory methodology generated insider knowledge of who has social influence, challenging commonly held assumptions. Collecting and displaying information fostered staff and community learning, laying groundwork for social change.

  4. A method to calibrate channel friction and bathymetry parameters of a Sub-Grid hydraulic model using SAR flood images

    NASA Astrophysics Data System (ADS)

    Wood, M.; Neal, J. C.; Hostache, R.; Corato, G.; Chini, M.; Giustarini, L.; Matgen, P.; Wagener, T.; Bates, P. D.

    2015-12-01

    Synthetic Aperture Radar (SAR) satellites are capable of all-weather day and night observations that can discriminate between land and smooth open water surfaces over large scales. Because of this, there has been much interest in the use of SAR satellite data to improve our understanding of water processes, in particular for fluvial flood inundation mechanisms. Past studies show that integrating SAR-derived data with hydraulic models can improve simulations of flooding. However, while much of this work focuses on improving model channel roughness values or inflows in ungauged catchments, improvement of model bathymetry is often overlooked. The provision of good bathymetric data is critical to the performance of hydraulic models, but there are only a small number of ways to obtain bathymetry information where no direct measurements exist. Spatially distributed river depths are also rarely available. We present a methodology for concurrent calibration of model average channel depth and roughness parameters using SAR images of flood extent and a Sub-Grid model utilising hydraulic geometry concepts. The methodology uses real data from the European Space Agency's archive of ENVISAT[1] Wide Swath Mode images of the River Severn between Worcester and Tewkesbury during flood peaks between 2007 and 2010. Historic ENVISAT WSM images are currently free and easy to access from the archive, but the methodology can be applied with any available SAR data. The approach makes use of the SAR image processing algorithm of Giustarini[2] et al. (2013) to generate binary flood maps. A unique feature of the calibration methodology is to also use parameter 'identifiability' to locate the parameters with higher accuracy from a pre-assigned range (adopting the DYNIA method proposed by Wagener[3] et al., 2003). [1] https://gpod.eo.esa.int/services/ [2] Giustarini. 2013. 'A Change Detection Approach to Flood Mapping in Urban Areas Using TerraSAR-X'. IEEE Transactions on Geoscience and Remote Sensing, vol. 51, no. 4. [3] Wagener. 2003. 'Towards reduced uncertainty in conceptual rainfall-runoff modelling: Dynamic identifiability analysis'. Hydrol. Process. 17, 455-476.
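
    A minimal sketch of the joint depth/roughness calibration idea: score each candidate parameter pair by the overlap between the simulated flood extent and the SAR-derived binary flood map. `run_subgrid_model` is a hypothetical stand-in for the hydraulic model, and the fit statistic is a generic critical-success-index; neither is taken verbatim from the paper.

    ```python
    import numpy as np

    def flood_fit(observed, simulated):
        """Critical-success-index style fit between two binary flood maps."""
        both = np.logical_and(observed, simulated).sum()
        either = np.logical_or(observed, simulated).sum()
        return both / either if either else 0.0

    def calibrate(observed, run_subgrid_model,
                  depths=np.linspace(1.0, 6.0, 11),
                  mannings=np.linspace(0.02, 0.06, 9)):
        """Grid search over average channel depth and roughness;
        run_subgrid_model(depth, roughness) must return a binary extent."""
        best_params, best_score = None, -1.0
        for d in depths:
            for n in mannings:
                sim = run_subgrid_model(depth=d, roughness=n)
                score = flood_fit(observed, sim)
                if score > best_score:
                    best_params, best_score = (d, n), score
        return best_params, best_score
    ```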

  5. The Application of MRI for Depiction of Subtle Blood Brain Barrier Disruption in Stroke

    PubMed Central

    Israeli, David; Tanne, David; Daniels, Dianne; Last, David; Shneor, Ran; Guez, David; Landau, Efrat; Roth, Yiftach; Ocherashvilli, Aharon; Bakon, Mati; Hoffman, Chen; Weinberg, Amit; Volk, Talila; Mardor, Yael

    2011-01-01

    The development of imaging methodologies for detecting blood-brain-barrier (BBB) disruption may help predict stroke patients' propensity to develop hemorrhagic complications following reperfusion. We have developed a delayed contrast extravasation MRI-based methodology enabling real-time depiction of subtle BBB abnormalities in humans with high sensitivity to BBB disruption and high spatial resolution. The increased sensitivity to subtle BBB disruption is obtained by acquiring T1-weighted MRI at relatively long delays (~15 minutes) after contrast injection and subtracting from them images acquired immediately after contrast administration. In addition, the relatively long delays allow for acquisition of high-resolution images, resulting in high-resolution BBB disruption maps. The sensitivity is further increased by image preprocessing with corrections for intensity variations and with whole-body (rigid+elastic) registration. Since only two separate time points are required, the time between the two acquisitions can be used for acquiring routine clinical data, keeping the total imaging time to a minimum. A proof-of-concept study was performed in 34 patients with ischemic stroke and 2 patients with brain metastases undergoing high-resolution T1-weighted MRI acquired at 3 time points after contrast injection. The MR images were pre-processed and subtracted to produce BBB disruption maps. BBB maps of patients with brain metastases and ischemic stroke presented different patterns of BBB opening. The significant advantage of the long extravasation time was demonstrated by a dynamic-contrast-enhancement study performed continuously for 18 min. The high sensitivity of our methodology enabled depiction of clear BBB disruption in 27% of the stroke patients who did not have abnormalities on conventional contrast-enhanced MRI. In 36% of the patients who had abnormalities detectable by conventional MRI, the BBB disruption volumes were significantly larger in the maps than in conventional MRI. These results demonstrate the advantages of delayed contrast extravasation in increasing the sensitivity to subtle BBB disruption in ischemic stroke patients. The calculated disruption maps provide clear depiction of significant volumes of BBB disruption unattainable by conventional contrast-enhanced MRI. PMID:21209786
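
    The core delayed-subtraction step can be sketched in a few lines of numpy; registration and the paper's specific intensity corrections are omitted, and the threshold is an arbitrary placeholder.

    ```python
    import numpy as np

    def bbb_disruption_map(early, delayed, threshold=0.05):
        """Crude sketch of the delayed-subtraction idea: normalize each
        T1-weighted volume, subtract the immediate post-contrast image
        from the ~15-minute delayed image, and keep voxels that gained
        signal (where contrast has accumulated through a leaky BBB).
        Registration and intensity-variation corrections are omitted."""
        early = early / np.mean(early)
        delayed = delayed / np.mean(delayed)
        diff = delayed - early                 # positive = accumulation
        return np.where(diff > threshold, diff, 0.0)
    ```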

  6. The application of MRI for depiction of subtle blood brain barrier disruption in stroke.

    PubMed

    Israeli, David; Tanne, David; Daniels, Dianne; Last, David; Shneor, Ran; Guez, David; Landau, Efrat; Roth, Yiftach; Ocherashvilli, Aharon; Bakon, Mati; Hoffman, Chen; Weinberg, Amit; Volk, Talila; Mardor, Yael

    2010-12-26

    The development of imaging methodologies for detecting blood-brain-barrier (BBB) disruption may help predict stroke patients' propensity to develop hemorrhagic complications following reperfusion. We have developed a delayed contrast extravasation MRI-based methodology enabling real-time depiction of subtle BBB abnormalities in humans with high sensitivity to BBB disruption and high spatial resolution. The increased sensitivity to subtle BBB disruption is obtained by acquiring T1-weighted MRI at relatively long delays (~15 minutes) after contrast injection and subtracting from them images acquired immediately after contrast administration. In addition, the relatively long delays allow for acquisition of high-resolution images, resulting in high-resolution BBB disruption maps. The sensitivity is further increased by image preprocessing with corrections for intensity variations and with whole-body (rigid+elastic) registration. Since only two separate time points are required, the time between the two acquisitions can be used for acquiring routine clinical data, keeping the total imaging time to a minimum. A proof-of-concept study was performed in 34 patients with ischemic stroke and 2 patients with brain metastases undergoing high-resolution T1-weighted MRI acquired at 3 time points after contrast injection. The MR images were pre-processed and subtracted to produce BBB disruption maps. BBB maps of patients with brain metastases and ischemic stroke presented different patterns of BBB opening. The significant advantage of the long extravasation time was demonstrated by a dynamic-contrast-enhancement study performed continuously for 18 min. The high sensitivity of our methodology enabled depiction of clear BBB disruption in 27% of the stroke patients who did not have abnormalities on conventional contrast-enhanced MRI. In 36% of the patients who had abnormalities detectable by conventional MRI, the BBB disruption volumes were significantly larger in the maps than in conventional MRI. These results demonstrate the advantages of delayed contrast extravasation in increasing the sensitivity to subtle BBB disruption in ischemic stroke patients. The calculated disruption maps provide clear depiction of significant volumes of BBB disruption unattainable by conventional contrast-enhanced MRI.

  7. Using Lean methodologies to streamline processing of requests for durable medical equipment and supplies for children with complex conditions.

    PubMed

    Fields, Elise; Neogi, Smriti; Schoettker, Pamela J; Lail, Jennifer

    2017-12-12

    An improvement team from the Complex Care Center at our large pediatric medical center participated in a 60-day initiative to use Lean methodologies to standardize their processes, eliminate waste and improve the timely and reliable provision of durable medical equipment and supplies. The team used value stream mapping to identify processes needing improvement. Improvement activities addressed the initial processing of a request, provider signature on the form, returning the form to the sender, and uploading the completed documents to the electronic medical record. Data on lead time (time between receiving a request and sending the completed request to the Health Information Management department) and process time (amount of time the staff worked on the request) were collected via manual pre- and post-time studies. Following implementation of interventions, the median lead time for processing durable medical equipment and supply requests decreased from 50 days to 3 days (p < 0.0001). Median processing time decreased from 14 min to 9 min (p < 0.0001). The decrease in processing time realized annual cost savings of approximately $11,000. Collaborative leadership and multidisciplinary training in Lean methods allowed the CCC staff to incorporate common sense, standardize practices, and adapt their work environment to improve the timely and reliable provision of equipment and supplies that are essential for their patients. The application of Lean methodologies to processing requests for DME and supplies could also result in a natural spread to other paperwork and requests, thus avoiding delays and potential risk for clinical instability or deterioration. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Mapping tobacco industry strategies in South East Asia for action planning and surveillance

    PubMed Central

    Stillman, F; Hoang, M; Linton, R; Ritthiphakdee, B; Trochim, W

    2008-01-01

    Objective: To develop a comprehensive conceptual framework of tobacco industry tactics in four countries in South East Asia for the purpose of: (1) generating consensus on key areas of importance and feasibility for regional and cross country tobacco industry monitoring and surveillance; (2) developing measures to track and monitor the effects of the tobacco industry and to design counterstrategies; and (3) building capacity to improve tobacco control planning in the participating countries. Design: A structured conceptualisation methodology known as concept mapping was used. The process included brainstorming, sorting and rating of statements describing industry activities. Statistical analyses used multidimensional scaling and cluster analysis. Interpretation of the maps was participatory, using regional tobacco control researchers, practitioners, and policy makers during a face to face meeting. Participants: The 31 participants in this study came from the four countries represented in the project, along with six people from the Johns Hopkins Bloomberg School of Public Health. Conclusions: The map shows eight clusters of industry activities within the four countries. These were arranged into four general sectors: economics, politics, public relations and deception. For project design purposes, the map indicates areas of importance and feasibility for monitoring tobacco industry activities and serves as a basis for an initial discussion about action planning. Furthermore, the development of the map used a consensus-building process across different stakeholders or stakeholder agencies, which is critical when developing regional, cross-border strategies for tracking and surveillance. PMID:18218787
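
    The statistical core of concept mapping (multidimensional scaling of a sorting-derived dissimilarity matrix, followed by clustering) can be sketched as below. The co-sorting data here is synthetic, and the cluster count of eight simply mirrors the result reported above.

    ```python
    import numpy as np
    from sklearn.manifold import MDS
    from sklearn.cluster import KMeans

    # Synthetic stand-in: participants sort statements into piles; how
    # rarely two statements are co-sorted becomes their dissimilarity.
    rng = np.random.default_rng(1)
    n_statements = 40
    dissim = rng.random((n_statements, n_statements))
    dissim = (dissim + dissim.T) / 2      # symmetrise the matrix
    np.fill_diagonal(dissim, 0.0)         # a statement matches itself

    # Project to 2-D with multidimensional scaling, then cluster
    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(dissim)
    labels = KMeans(n_clusters=8, n_init=10,
                    random_state=0).fit_predict(coords)
    print(labels[:10])
    ```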

  9. Large Scale Geomorphic Mapping of Cryoplanation Terraces in Central and Eastern Alaska

    NASA Astrophysics Data System (ADS)

    Queen, C.; Nyland, K. E.; Nelson, F. E.

    2017-12-01

    Cryoplanation terraces (CTs) are large periglacial landforms characterized by alternating treads and risers, giving the appearance of giant staircases ascending ridgecrests and hillsides. The risers (scarps) are typically covered with coarse clastic material, while the surfaces of the nearly planar treads are a mosaic of vegetation, rock debris, and surficial periglacial landforms. CTs are best developed in areas of moderate relief across Beringia, the largely unglaciated region between the Lena and Mackenzie rivers, including Bering Sea islands that were formerly highlands on the Bering Land Bridge. CTs are generally thought to develop through locally intensified weathering at the base of scarps by processes associated with late-lying bodies of snow. This hypothesis has been the subject of much speculative literature, but until recently there have been few process-oriented field studies of these landforms. The work reported here builds on foundational work by R. D. Reger, who inventoried and investigated a large number of CTs in central and western Alaska. The resultant large-scale (1:2000) maps of cryoplanation terraces at Eagle Summit and Mount Fairplay in east-central Alaska were created using traditional and GPS-based mapping methodologies. Pits were excavated at representative locations across treads to obtain information about subsurface characteristics. The resulting maps show the location and morphology of surficial geomorphic features on CT scarps, treads, and sideslopes, superimposed on high-resolution topographic maps and perspective diagrams. GIS-based analysis of the assembled map layers promotes three-dimensional understanding of the spatial relationships between CT morphology, material properties, and erosional processes, and provides key insights into intra- and inter-terrace relationships. In concert with relative and absolute dating of material on the landforms, this research is generally supportive of the "nivation hypothesis" of CT development.

  10. Peer review of health research funding proposals: A systematic map and systematic review of innovations for effectiveness and efficiency.

    PubMed

    Shepherd, Jonathan; Frampton, Geoff K; Pickett, Karen; Wyatt, Jeremy C

    2018-01-01

    To investigate methods and processes for timely, efficient and good quality peer review of research funding proposals in health. A two-stage evidence synthesis: (1) a systematic map to describe the key characteristics of the evidence base, followed by (2) a systematic review of the studies stakeholders prioritised as relevant from the map on the effectiveness and efficiency of peer review 'innovations'. Standard processes included literature searching, duplicate inclusion criteria screening, study keyword coding, data extraction, critical appraisal and study synthesis. A total of 83 studies from 15 countries were included in the systematic map. The evidence base is diverse, investigating many aspects of the systems for, and processes of, peer review. The systematic review included eight studies from Australia, Canada, and the USA, evaluating a broad range of peer review innovations. These studies showed that simplifying the process by shortening proposal forms, using smaller reviewer panels, or expediting processes can speed up the review process and reduce costs, but this might come at the expense of peer review quality, a key aspect that has not been assessed. Virtual peer review using videoconferencing or teleconferencing appears promising for reducing costs by avoiding the need for reviewers to travel, but again any consequences for quality have not been adequately assessed. There is increasing international research activity into the peer review of health research funding. The studies reviewed had methodological limitations and variable generalisability to research funders. Given these limitations it is not currently possible to recommend immediate implementation of these innovations. However, many appear promising based on existing evidence, and could be adapted as necessary by funders and evaluated. Where feasible, experimental evaluation, including randomised controlled trials, should be conducted, evaluating impact on effectiveness, efficiency and quality.

  11. VOLCWORKS: A suite for optimization of hazards mapping

    NASA Astrophysics Data System (ADS)

    Delgado Granados, H.; Ramírez Guzmán, R.; Villareal Benítez, J. L.; García Sánchez, T.

    2012-04-01

    Making hazards maps is a process linking basic science, applied science and engineering for the benefit of society. The methodologies for constructing hazards maps have evolved enormously, together with the tools that forecast the behavior of the materials produced by different eruptive processes. In spite of this evolution, however, the purpose of hazards maps has not changed: prevention and mitigation of volcanic disasters. Integrating different simulation tools for the different processes of a single volcano is a challenge best addressed by software that combines processing, simulation and visualization techniques with suitable data structures, supporting the whole construction process from the integration of geological data, through simulation, to the simplified output needed to design a hazards/scenario map. Scientific visualization is a powerful tool to explore and gain insight into complex data from instruments and simulations. The workflow from data collection, quality control and preparation for simulation through to an appropriate visual presentation is usually fragmented: most groups use a different application for each step, relying on tools that were not built for the specific problem or that were developed by research groups for particular, isolated tasks. In volcanology, owing to its complexity, groups typically examine only one aspect of the phenomenon: ash dispersal, laharic flows, pyroclastic flows, lava flows, or ballistic projectile ejection, among others. However, when studying the hazards associated with the activity of a volcano, it is important to analyze all the processes comprehensively, especially when communicating results to the end users: decision makers and planners. To solve this problem and connect the different parts of the workflow, we are developing the suite VOLCWORKS. Its principle is a flexible-implementation architecture allowing rapid development of software to the extent specified by the needs, including calculations, routines and algorithms, both new and redesigned from software available in the volcanological community, and especially allowing new knowledge, models or software to be transferred into software modules. The design is a component-oriented platform that incorporates particular solutions (routines, simulations, etc.), which can be concatenated for integration or for highlighting information. The platform includes a graphical interface capable of working in different visual environments that can be tailored to different types of users (researchers, lecturers, students, etc.). The platform aims to integrate the simulation and visualization phases, incorporating proven tools that are currently isolated. VOLCWORKS runs under different operating systems (Windows, Linux and Mac OS) and adapts to the context of use automatically at runtime, both in the tasks and their sequence and in the utilization of hardware resources (CPU, GPU, special monitors, etc.). The application can run on a laptop or in a virtual reality room with access to supercomputers.

  12. [Organization of monitoring of electromagnetic radiation in the urban environment].

    PubMed

    Savel'ev, S I; Dvoeglazova, S V; Koz'min, V A; Kochkin, D E; Begishev, M R

    2008-01-01

    The authors describe current approaches to monitoring the environment, including sources of electromagnetic radiation and noise. Electronic maps of the study area are produced by constructing isolines or plotting the actual levels of the controlled factors. These approaches to electromagnetic and acoustic monitoring make it possible to automate the measurement process, to analyze the established situation, and to simplify the risk-control methodology.

  13. 2Loud?: Community mapping of exposure to traffic noise with mobile phones.

    PubMed

    Leao, Simone; Ong, Kok-Leong; Krezel, Adam

    2014-10-01

    Despite ample medical evidence of the adverse impacts of traffic noise on health, most policies for traffic noise management are arbitrary or incomplete, resulting in serious social and economic impacts. Surprisingly, there is limited information about citizens' exposure to traffic noise worldwide. This paper presents the 2Loud? mobile phone application, developed and tested as a methodology to monitor, assess and map citizens' level of exposure to traffic noise, with a focus on the night period and indoor locations, since sleep disturbance is one of the major triggers for ill health related to traffic noise. Based on a community participation experiment using the 2Loud? mobile phone application in a region close to freeways in Australia, the results of this research indicate a good level of accuracy for noise monitoring by mobile phones and also demonstrate significant levels of indoor night exposure to traffic noise in the study area. The proposed methodology, through the data produced and the participatory process involved, can potentially assist in planning and management towards healthier urban environments.
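
    The acoustic aggregation such an application must perform is an energy average, not an arithmetic one. Below is a minimal sketch of the standard equivalent continuous sound level (Leq) calculation, with assumed sample values; the paper's exact processing chain is not specified here.

    ```python
    import math

    def leq(levels_db):
        """Equivalent continuous sound level (Leq) of short dB samples
        of equal duration: average the acoustic energies, then convert
        back to decibels."""
        energies = [10 ** (level / 10.0) for level in levels_db]
        return 10.0 * math.log10(sum(energies) / len(energies))

    # Example: night-time indoor samples in dBA (assumed values)
    print(round(leq([38, 42, 55, 40, 39]), 1))   # one loud event dominates
    ```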

  14. Participatory GIS for Soil Conservation in Phewa Watershed of Nepal

    NASA Astrophysics Data System (ADS)

    Bhandari, K. P.

    2012-07-01

    Participatory Geographic Information Systems (PGIS) can integrate participatory methodologies with geo-spatial technologies for representing the characteristics of a particular place. Over the last decade, researchers have used this approach to integrate the local knowledge of communities within a GIS and Society conceptual framework. Participatory GIS applications are tailored to answer specific geographic questions at the local level, and their modes of implementation vary considerably across space, ranging from field-based, qualitative approaches to more complex web-based applications. Through this broad range of techniques, PGIS is becoming an effective methodology for incorporating local community knowledge into complex spatial decision-making processes. The objective of this study is to reduce soil erosion by formulating general rules for soil conservation with the participation of stakeholders. A poster map was prepared from satellite imagery, topographic maps and ArcGIS software, incorporating local knowledge. Data were collected through focus-group discussions and individual questionnaires in order to incorporate local knowledge and use it to derive a risk map based on economic, social and manageable physical factors for the sensitivity analysis. The soil-erosion risk map was prepared from the physical factors of the RUSLE model: rainfall-runoff erosivity, soil erodibility, slope length, slope steepness, cover-management and conservation practice. Comparison and discussion among stakeholders, researchers and an expert group showed that managing socioeconomic, social and manageable physical factors can reduce soil erosion. The study showed that preparing the poster GIS map and implementing it in the watershed could reduce soil erosion in the study area compared with the existing national policy.
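
    The RUSLE computation named above is a cell-by-cell product of factor rasters, A = R x K x LS x C x P. A minimal sketch with illustrative, uncalibrated values follows; the factors are not Phewa-watershed data.

    ```python
    import numpy as np

    # RUSLE soil-loss estimate, A = R * K * LS * C * P, applied cell-by-cell
    # to raster factor layers. All values are illustrative placeholders.
    shape = (50, 50)
    R  = np.full(shape, 6000.0)  # rainfall-runoff erosivity (MJ mm / ha h yr)
    K  = np.full(shape, 0.03)    # soil erodibility (t ha h / ha MJ mm)
    LS = np.full(shape, 4.5)     # slope length and steepness (dimensionless)
    C  = np.full(shape, 0.2)     # cover-management (dimensionless)
    P  = np.full(shape, 0.8)     # conservation practice (dimensionless)

    A = R * K * LS * C * P       # annual soil loss (t / ha / yr)
    print("mean soil loss:", A.mean(), "t/ha/yr")
    ```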

  15. An optimal baseline selection methodology for data-driven damage detection and temperature compensation in acousto-ultrasonics

    NASA Astrophysics Data System (ADS)

    Torres-Arredondo, M.-A.; Sierra-Pérez, Julián; Cabanes, Guénaël

    2016-05-01

    The process of measuring and analysing the data from a distributed sensor network all over a structural system in order to quantify its condition is known as structural health monitoring (SHM). For the design of a trustworthy health monitoring system, a vast amount of information regarding the inherent physical characteristics of the sources and their propagation and interaction across the structure is crucial. Moreover, any SHM system which is expected to transition to field operation must take into account the influence of environmental and operational changes, which cause modifications in the stiffness and damping of the structure and consequently modify its dynamic behaviour. On that account, special attention is paid in this paper to the development of an efficient SHM methodology where robust signal processing and pattern recognition techniques are integrated for the correct interpretation of complex ultrasonic waves within the context of damage detection and identification. The methodology is based on an acousto-ultrasonics technique where the discrete wavelet transform is evaluated for feature extraction and selection, linear principal component analysis for data-driven modelling and self-organising maps for a two-level clustering under the principle of local density. Finally, the methodology is experimentally demonstrated, and the results show that all damage cases were detectable and identifiable.
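
    A compressed sketch of the signal-processing pipeline (wavelet sub-band energies as features, PCA for data-driven modelling, then clustering) is shown below on synthetic records. It assumes the PyWavelets and scikit-learn packages, and substitutes KMeans for the paper's self-organising maps.

    ```python
    import numpy as np
    import pywt                              # PyWavelets, assumed available
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    def wavelet_features(signal, wavelet="db4", level=4):
        """Energy of each DWT sub-band as a compact feature vector."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        return np.array([np.sum(c ** 2) for c in coeffs])

    # Synthetic stand-ins for ultrasonic records (60 signals, 1024 samples)
    rng = np.random.default_rng(2)
    signals = rng.standard_normal((60, 1024))
    X = np.array([wavelet_features(s) for s in signals])

    # PCA model of the features, then clustering of the scores
    scores = PCA(n_components=3).fit_transform(X)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
    ```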

  16. A scale‐down mimic for mapping the process performance of centrifugation, depth and sterile filtration

    PubMed Central

    Joseph, Adrian; Kenty, Brian; Mollet, Michael; Hwang, Kenneth; Rose, Steven; Goldrick, Stephen; Bender, Jean; Farid, Suzanne S.

    2016-01-01

    In the production of biopharmaceuticals, disk-stack centrifugation is widely used as a harvest step for the removal of cells and cellular debris. Depth filters followed by sterile filters are often then employed to remove residual solids remaining in the centrate. Process development of centrifugation is usually conducted at pilot-scale so as to mimic the commercial-scale equipment, but this method requires large quantities of cell culture and significant levels of effort for successful characterization. A scale-down approach based upon the use of a shear device and a bench-top centrifuge has been extended in this work towards a preparative methodology that successfully predicts the performance of the continuous centrifuge and polishing filters. The use of this methodology allows the effects of cell culture conditions and large-scale centrifugal process parameters on subsequent filtration performance to be assessed at an early stage of process development where material availability is limited. Biotechnol. Bioeng. 2016;113: 1934–1941. © 2016 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals, Inc. PMID:26927621
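
    As background, centrifuge scale-down commonly rests on Sigma theory: matching the flow-to-equivalent-settling-area ratio Q/Σ between scales targets equivalent clarification. The sketch below illustrates that equivalence with assumed values; it is standard textbook background, not necessarily the paper's exact formulation.

    ```python
    # Sigma-theory equivalence (standard centrifugation background):
    # equal Q/Sigma at both scales implies comparable clarification.
    def bench_flow_equivalent(q_large, sigma_large, sigma_bench):
        """Bench-scale throughput giving the same Q/Sigma as at scale."""
        return q_large * sigma_bench / sigma_large

    q_large     = 600.0    # L/h, assumed disk-stack feed rate
    sigma_large = 8.0e4    # m^2, assumed equivalent settling area at scale
    sigma_bench = 10.0     # m^2, assumed bench-centrifuge value

    print(bench_flow_equivalent(q_large, sigma_large, sigma_bench), "L/h")
    ```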

  17. Using Fuzzy Analytic Hierarchy Process multicriteria and Geographical information system for coastal vulnerability analysis in Morocco: The case of Mohammedia

    NASA Astrophysics Data System (ADS)

    Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha

    2016-04-01

    This paper shows a method to assess vulnerability to coastal risks such as erosion or marine submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques within a Geographic Information System (GIS). The coast of Mohammedia, Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP-based methodology. The coastal risk vulnerability mapping draws on multiple causative factors: sea-level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology and distance to urban areas. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The results show that the coastline of Mohammedia is characterized by moderate, high and very high levels of vulnerability to coastal risk. The most vulnerable areas are situated in the east, at Monika and Sablette beaches. This approach combines the efficiency of GIS tools with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies to minimize coastal risks.
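
    The weight-derivation step at the heart of (F)AHP starts from a pairwise comparison matrix. The sketch below uses the crisp geometric-mean method on hypothetical judgements; the paper's fuzzy extension would replace each entry with a triangular fuzzy number.

    ```python
    import numpy as np

    # Criteria weights via the geometric-mean method on a reciprocal
    # pairwise comparison matrix. All judgements below are hypothetical.
    criteria = ["sea level rise", "wave height", "tidal range",
                "erosion", "elevation", "geomorphology", "urban distance"]
    A = np.array([
        [1,   3,   5,   1,   1/3, 3,   5  ],
        [1/3, 1,   3,   1/3, 1/5, 1,   3  ],
        [1/5, 1/3, 1,   1/5, 1/7, 1/3, 1  ],
        [1,   3,   5,   1,   1/3, 3,   5  ],
        [3,   5,   7,   3,   1,   5,   7  ],
        [1/3, 1,   3,   1/3, 1/5, 1,   3  ],
        [1/5, 1/3, 1,   1/5, 1/7, 1/3, 1  ],
    ])
    gm = A.prod(axis=1) ** (1.0 / len(A))   # row geometric means
    weights = gm / gm.sum()                 # normalise to sum to 1
    for name, w in zip(criteria, weights):
        print(f"{name:>15s}: {w:.3f}")
    ```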

  18. Gridded uncertainty in fossil fuel carbon dioxide emission maps, a CDIAC example

    DOE PAGES

    Andres, Robert J.; Boden, Thomas A.; Higdon, David M.

    2016-12-05

    Due to a current lack of physical measurements at appropriate spatial and temporal scales, all current global maps and distributions of fossil fuel carbon dioxide (FFCO2) emissions use one or more proxies to distribute those emissions. These proxies and distribution schemes introduce additional uncertainty into these maps. This paper examines the uncertainty associated with the magnitude of gridded FFCO2 emissions. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. This gridded uncertainty includes uncertainty contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughout this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty. The results of the uncertainty analysis reveal a range of 4–190 %, with an average of 120 % (2 σ) for populated and FFCO2-emitting grid spaces over annual timescales. This paper also describes a methodological change specific to the creation of the Carbon Dioxide Information Analysis Center (CDIAC) FFCO2 emission maps: the change from a temporally fixed population proxy to a temporally varying population proxy.
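
    A toy version of proxy-based gridding: distribute a national total across cells in proportion to population, then attach the paper's reported average 2-sigma fractional uncertainty to every cell. All numbers are illustrative, not CDIAC values.

    ```python
    import numpy as np

    # Distribute a national FFCO2 total over grid cells by population proxy
    # and attach a per-cell 2-sigma fractional uncertainty.
    rng = np.random.default_rng(3)
    population = rng.integers(0, 10_000, size=(20, 20)).astype(float)
    national_total = 5.0e8                 # t CO2 / yr, assumed

    share = population / population.sum()  # proxy-based allocation weights
    emissions = national_total * share

    frac_unc_2sigma = 1.2                  # 120 %, the paper's reported mean
    emissions_unc = emissions * frac_unc_2sigma

    print(np.isclose(emissions.sum(), national_total))  # mass is conserved
    ```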

  19. Gridded uncertainty in fossil fuel carbon dioxide emission maps, a CDIAC example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andres, Robert J.; Boden, Thomas A.; Higdon, David M.

    Due to a current lack of physical measurements at appropriate spatial and temporal scales, all current global maps and distributions of fossil fuel carbon dioxide (FFCO2) emissions use one or more proxies to distribute those emissions. These proxies and distribution schemes introduce additional uncertainty into these maps. This paper examines the uncertainty associated with the magnitude of gridded FFCO2 emissions. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. This gridded uncertainty includes uncertainty contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughout this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty. The results of the uncertainty analysis reveal a range of 4–190 %, with an average of 120 % (2 σ) for populated and FFCO2-emitting grid spaces over annual timescales. This paper also describes a methodological change specific to the creation of the Carbon Dioxide Information Analysis Center (CDIAC) FFCO2 emission maps: the change from a temporally fixed population proxy to a temporally varying population proxy.

  20. Gridded uncertainty in fossil fuel carbon dioxide emission maps, a CDIAC example

    NASA Astrophysics Data System (ADS)

    Andres, Robert J.; Boden, Thomas A.; Higdon, David M.

    2016-12-01

    Due to a current lack of physical measurements at appropriate spatial and temporal scales, all current global maps and distributions of fossil fuel carbon dioxide (FFCO2) emissions use one or more proxies to distribute those emissions. These proxies and distribution schemes introduce additional uncertainty into these maps. This paper examines the uncertainty associated with the magnitude of gridded FFCO2 emissions. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. This gridded uncertainty includes uncertainty contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughout this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty. The results of the uncertainty analysis reveal a range of 4-190 %, with an average of 120 % (2σ) for populated and FFCO2-emitting grid spaces over annual timescales. This paper also describes a methodological change specific to the creation of the Carbon Dioxide Information Analysis Center (CDIAC) FFCO2 emission maps: the change from a temporally fixed population proxy to a temporally varying population proxy.

  1. The ECOUTER methodology for stakeholder engagement in translational research.

    PubMed

    Murtagh, Madeleine J; Minion, Joel T; Turner, Andrew; Wilson, Rebecca C; Blell, Mwenza; Ochieng, Cynthia; Murtagh, Barnaby; Roberts, Stephanie; Butters, Oliver W; Burton, Paul R

    2017-04-04

    Because no single person or group holds knowledge about all aspects of research, mechanisms are needed to support knowledge exchange and engagement. Expertise in the research setting necessarily includes scientific and methodological expertise, but also expertise gained through the experience of participating in research and/or being a recipient of research outcomes (as a patient or member of the public). Engagement is, by its nature, reciprocal and relational: the process of engaging research participants, patients, citizens and others (the many 'publics' of engagement) brings them closer to the research but also brings the research closer to them. When translating research into practice, engaging the public and other stakeholders is explicitly intended to make the outcomes of translation relevant to its constituency of users. In practice, engagement faces numerous challenges and is often time-consuming, expensive and 'thorny' work. We explore the epistemic and ontological considerations and implications of four common critiques of engagement methodologies that contest: representativeness, communication and articulation, impacts and outcome, and democracy. The ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research) methodology addresses problems of representation and epistemic foundationalism using a methodology that asks, "How could it be otherwise?" ECOUTER affords the possibility of engagement where spatial and temporal constraints are present, relying on saturation as a method of 'keeping open' the possible considerations that might emerge and including reflexive use of qualitative analytic methods. This paper describes the ECOUTER process, focusing on one worked example and detailing lessons learned from four other pilots. ECOUTER uses mind-mapping techniques to 'open up' engagement, iteratively and organically. ECOUTER aims to balance the breadth, accessibility and user-determination of the scope of engagement. An ECOUTER exercise comprises four stages: (1) engagement and knowledge exchange; (2) analysis of mindmap contributions; (3) development of a conceptual schema (i.e. a map of concepts and their relationship); and (4) feedback, refinement and development of recommendations. ECOUTER refuses fixed truths but also refuses a fixed nature. Its promise lies in its flexibility, adaptability and openness. ECOUTER will be formed and re-formed by the needs and creativity of those who use it.

  2. Assessing Human Modifications to Floodplains using Large-Scale Hydrogeomorphic Floodplain Modeling

    NASA Astrophysics Data System (ADS)

    Morrison, R. R.; Scheel, K.; Nardi, F.; Annis, A.

    2017-12-01

    Human modifications to floodplains for water resource and flood management purposes have significantly transformed river-floodplain connectivity dynamics in many watersheds. Bridges, levees, reservoirs, shifts in land use, and other hydraulic engineering works have altered flow patterns and caused changes in the timing and extent of floodplain inundation processes. These hydrogeomorphic changes have likely resulted in negative impacts to aquatic habitat and ecological processes. The availability of large-scale topographic datasets at high resolution provides an opportunity for detecting anthropogenic impacts by means of geomorphic mapping. We have developed and are implementing a methodology for comparing a hydrogeomorphic floodplain mapping technique to hydraulically-modeled floodplain boundaries to estimate floodplain loss due to human activities. Our hydrogeomorphic mapping methodology assumes that river valley morphology intrinsically includes information on flood-driven erosion and depositional phenomena. We use a digital elevation model-based algorithm to identify the floodplain as the area of the fluvial corridor lying below water reference levels, which are estimated using a simplified hydrologic model. Results from our hydrogeomorphic method are compared to hydraulically-derived flood zone maps and spatial datasets of levee-protected areas to explore where water management features, such as levees, have changed floodplain dynamics and landscape features. Parameters associated with commonly used F-index functions are quantified and analyzed to better understand how floodplain areas have been reduced within a basin. Preliminary results indicate that the hydrogeomorphic floodplain model is useful for quickly delineating floodplains at large watershed scales, but further analyses are needed to understand the caveats for using the model in determining floodplain loss due to levees. We plan to continue this work by exploring the spatial dependencies of the F-index function. Results from this work have implications for loss of aquatic habitat and ecological functions, and can inform management and restoration activities by highlighting regions with significant floodplain loss.
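
    The DEM-based idea reduces, in its simplest form, to flagging cells at or below a water reference surface. A toy sketch follows; deriving the reference levels from the simplified hydrologic model is the paper's contribution and is not reproduced here.

    ```python
    import numpy as np

    def hydrogeomorphic_floodplain(dem, water_level):
        """Flag cells lying at or below the water reference surface.
        `water_level` is a raster of reference levels estimated along
        the stream network; here it is simply an input."""
        return dem <= water_level

    # Tiny synthetic example: 3x3 DEM (m) against an assumed uniform level
    dem = np.array([[5.0, 3.0, 2.0],
                    [4.0, 2.5, 1.5],
                    [6.0, 3.5, 2.0]])
    water = np.full_like(dem, 3.0)
    print(hydrogeomorphic_floodplain(dem, water))
    ```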

  3. Towards the Development of a Smart Flying Sensor: Illustration in the Field of Precision Agriculture.

    PubMed

    Hernandez, Andres; Murcia, Harold; Copot, Cosmin; De Keyser, Robin

    2015-07-10

    Sensing is an important element in quantifying productivity and product quality and in making decisions. Applications such as mapping, surveillance, exploration and precision agriculture require a reliable platform for remote sensing. This paper presents the first steps towards the development of a smart flying sensor based on an unmanned aerial vehicle (UAV). The concept of smart remote sensing is illustrated and its performance tested for the task of mapping the volume of grain inside a trailer during forage harvesting. Novelty lies in: (1) the development of a position-estimation method with time delay compensation based on inertial measurement unit (IMU) sensors and image processing; (2) a method to build a 3D map using information obtained from a regular camera; and (3) the design and implementation of a path-following control algorithm using model predictive control (MPC). Experimental results on a lab-scale system validate the effectiveness of the proposed methodology.
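
    The time-delay compensation mentioned in point (1) can be sketched as forward-propagating a stale camera fix with IMU kinematics. The constant-acceleration model and all values below are assumptions, not the paper's estimator.

    ```python
    def compensate_delay(p_cam, v_imu, a_imu, tau):
        """Propagate a delayed camera position fix forward by the image
        processing latency tau using IMU velocity and acceleration
        (constant-acceleration assumption; a simplification of the
        paper's estimator)."""
        return tuple(p + v * tau + 0.5 * a * tau ** 2
                     for p, v, a in zip(p_cam, v_imu, a_imu))

    # Fix is 120 ms old; vehicle moving at 0.4 m/s north (assumed values)
    print(compensate_delay((1.0, 2.0, 1.5),      # delayed camera fix (m)
                           (0.0, 0.4, 0.0),      # IMU velocity (m/s)
                           (0.0, 0.0, 0.0),      # IMU acceleration (m/s^2)
                           0.12))                # latency (s)
    ```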

  4. Towards the Development of a Smart Flying Sensor: Illustration in the Field of Precision Agriculture

    PubMed Central

    Hernandez, Andres; Murcia, Harold; Copot, Cosmin; De Keyser, Robin

    2015-01-01

    Sensing is an important element in quantifying productivity and product quality and in making decisions. Applications such as mapping, surveillance, exploration and precision agriculture require a reliable platform for remote sensing. This paper presents the first steps towards the development of a smart flying sensor based on an unmanned aerial vehicle (UAV). The concept of smart remote sensing is illustrated and its performance tested for the task of mapping the volume of grain inside a trailer during forage harvesting. Novelty lies in: (1) the development of a position-estimation method with time delay compensation based on inertial measurement unit (IMU) sensors and image processing; (2) a method to build a 3D map using information obtained from a regular camera; and (3) the design and implementation of a path-following control algorithm using model predictive control (MPC). Experimental results on a lab-scale system validate the effectiveness of the proposed methodology. PMID:26184205

  5. Carbene footprinting accurately maps binding sites in protein-ligand and protein-protein interactions

    NASA Astrophysics Data System (ADS)

    Manzi, Lucio; Barrow, Andrew S.; Scott, Daniel; Layfield, Robert; Wright, Timothy G.; Moses, John E.; Oldham, Neil J.

    2016-11-01

    Specific interactions between proteins and their binding partners are fundamental to life processes. The ability to detect protein complexes, and map their sites of binding, is crucial to understanding basic biology at the molecular level. Methods that employ sensitive analytical techniques such as mass spectrometry have the potential to provide valuable insights with very little material and on short time scales. Here we present a differential protein footprinting technique employing an efficient photo-activated probe for use with mass spectrometry. Using this methodology the location of a carbohydrate substrate was accurately mapped to the binding cleft of lysozyme, and in a more complex example, the interactions between a 100 kDa, multi-domain deubiquitinating enzyme, USP5 and a diubiquitin substrate were located to different functional domains. The much improved properties of this probe make carbene footprinting a viable method for rapid and accurate identification of protein binding sites utilizing benign, near-UV photoactivation.

  6. Enhancing Collaborative and Meaningful Language Learning Through Concept Mapping

    NASA Astrophysics Data System (ADS)

    Marriott, Rita De Cássia Veiga; Torres, Patrícia Lupion

    This chapter aims to investigate new ways of foreign-language teaching/learning via a study of how concept mapping can help develop a student's reading, writing and oral skills as part of a blended methodology for language teaching known as LAPLI (Laboratorio de Aprendizagem de LInguas: The Language Learning Lab). LAPLI is a student-centred and collaborative methodology which encourages students to challenge their limitations and expand their current knowledge whilst developing their linguistic and interpersonal skills. We explore the theories that underpin LAPLI and detail the 12 activities comprising its programme, with specific reference to the use of "concept mapping". An innovative table enabling a formative and summative assessment of the concept maps is formulated. Also presented are some of the qualitative and quantitative results achieved when this methodology was first implemented with a group of pre-service students studying for a degree in English and Portuguese languages at the Catholic University of Parana (PUCPR) in Brazil. The contribution of concept mapping and LAPLI to an understanding of language learning, along with a consideration of the difficulties encountered in its implementation with student groups, is discussed and suggestions made for future research.

  7. Using lean methodology to improve productivity in a hospital oncology pharmacy.

    PubMed

    Sullivan, Peter; Soefje, Scott; Reinhart, David; McGeary, Catherine; Cabie, Eric D

    2014-09-01

    Quality improvements achieved by a hospital pharmacy through the use of lean methodology to guide i.v. compounding workflow changes are described. The outpatient oncology pharmacy of Yale-New Haven Hospital conducted a quality-improvement initiative to identify and implement workflow changes to support a major expansion of chemotherapy services. Applying concepts of lean methodology (i.e., elimination of non-value-added steps and waste in the production process), the pharmacy team performed a failure mode and effects analysis, workflow mapping, and impact analysis; staff pharmacists and pharmacy technicians identified 38 opportunities to decrease waste and increase efficiency. Three workflow processes (order verification, compounding, and delivery) accounted for 24 of 38 recommendations and were targeted for lean process improvements. The workflow was decreased to 14 steps, eliminating 6 non-value-added steps, and pharmacy staff resources and schedules were realigned with the streamlined workflow. The time required for pharmacist verification of patient-specific oncology orders was decreased by 33%; the time required for product verification was decreased by 52%. The average medication delivery time was decreased by 47%. The results of baseline and postimplementation time trials indicated a decrease in overall turnaround time to about 70 minutes, compared with a baseline time of about 90 minutes. The use of lean methodology to identify non-value-added steps in oncology order processing and the implementation of staff-recommended workflow changes resulted in an overall reduction in the turnaround time per dose. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  8. An expert-based approach to forest road network planning by combining Delphi and spatial multi-criteria evaluation.

    PubMed

    Hayati, Elyas; Majnounian, Baris; Abdi, Ehsan; Sessions, John; Makhdoum, Majid

    2013-02-01

    Changes in forest landscapes resulting from road construction have increased remarkably in the last few years. On the other hand, the sustainable management of forest resources can only be achieved through a well-organized road network. In order to minimize the environmental impacts of forest roads, forest road managers must design the road network both efficiently and in an environmentally sound manner. Efficient planning methodologies can assist forest road managers in considering the technical, economic, and environmental factors that affect forest road planning. This paper describes a three-stage methodology using the Delphi method for selecting the important criteria, the Analytic Hierarchy Process for obtaining the relative importance of the criteria, and finally, a spatial multi-criteria evaluation in a geographic information system (GIS) environment for identifying the lowest-impact road network alternative. Results of the Delphi method revealed that ground slope, lithology, distance from the stream network, distance from faults, landslide susceptibility, erosion susceptibility, geology, and soil texture are the most important criteria for forest road planning in the study area. The suitability map for road planning was then obtained by combining the fuzzy map layers of these criteria with respect to their weights. Nine road network alternatives were designed using PEGGER, an ArcView GIS extension, and their values were extracted from the suitability map. Results showed that the methodology was useful for identifying roads that met environmental and cost considerations. Based on this work, we suggest that multi-criteria evaluation and decision-making be applied to forest road planning in other regions, where the road-planning criteria identified in this study may prove useful.

  9. Large-scale mapping of hard-rock aquifer properties applied to Burkina Faso.

    PubMed

    Courtois, Nathalie; Lachassagne, Patrick; Wyns, Robert; Blanchin, Raymonde; Bougaïré, Francis D; Somé, Sylvain; Tapsoba, Aïssata

    2010-01-01

    A country-scale (1:1,000,000) methodology has been developed for hydrogeologic mapping of hard-rock aquifers (granitic and metamorphic rocks) of the type that underlie a large part of the African continent. The method is based on quantifying the "useful thickness" and hydrodynamic properties of such aquifers and uses a recent conceptual model developed for this hydrogeologic context. This model links hydrodynamic parameters (transmissivity, storativity) to lithology and the geometry of the various layers constituting a weathering profile. The country-scale hydrogeological mapping was implemented in Burkina Faso, where a recent 1:1,000,000-scale digital geological map and a database of some 16,000 water wells were used to evaluate the methodology.

  10. Towards a Unified Approach to Information Integration - A review paper on data/information fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitney, Paul D.; Posse, Christian; Lei, Xingye C.

    2005-10-14

    Information or data fusion from different sources is ubiquitous in many applications, from epidemiology, medicine, biology, politics and intelligence to military applications. Data fusion involves integration of spectral, imaging, text and many other sensor data. For example, in epidemiology, information is often obtained from many studies conducted by different researchers in different regions with different protocols. In the medical field, the diagnosis of a disease is often based on imaging (MRI, X-ray, CT), clinical examination and lab results. In the biological field, information is obtained from studies conducted on many different species. In the military field, information is obtained from radar sensors, text messages, chemical and biological sensors, acoustic sensors, optical warning systems and many other sources. Many methodologies are used in the data integration process, from classical and Bayesian approaches to evidence-based expert systems. The implementation of data integration ranges from pure software design to a mixture of software and hardware. In this review we summarize the methodologies and implementations of the data fusion process, and illustrate in more detail the methodologies involved in three examples. We propose a unified multi-stage and multi-path mapping approach to the data fusion process, and point out future prospects and challenges.
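
    A textbook building block among the Bayesian methods surveyed is precision-weighted fusion of two independent Gaussian estimates of the same quantity, sketched below with assumed sensor values.

    ```python
    def fuse_gaussian(mu1, var1, mu2, var2):
        """Precision-weighted fusion of two independent Gaussian
        estimates of the same quantity: the fused mean weights each
        estimate by its inverse variance, and the fused variance is
        the inverse of the summed precisions."""
        w1, w2 = 1.0 / var1, 1.0 / var2
        mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)
        return mu, 1.0 / (w1 + w2)

    # Radar says 10.2 +/- 1.0, optical says 9.6 +/- 0.5 (assumed values)
    print(fuse_gaussian(10.2, 1.0**2, 9.6, 0.5**2))  # -> (9.72, 0.2)
    ```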

  11. From local to national scale DInSAR analysis for the comprehension of Earth's surface dynamics.

    NASA Astrophysics Data System (ADS)

    De Luca, Claudio; Casu, Francesco; Manunta, Michele; Zinno, Ivana; Lanari, Riccardo

    2017-04-01

    Earth Observation techniques can be very helpful for estimating several sources of ground deformation thanks to their large spatial coverage, high resolution and cost effectiveness. In this scenario, Differential Synthetic Aperture Radar Interferometry (DInSAR) is one of the most effective methodologies for its capability to generate spatially dense deformation maps with centimeter to millimeter accuracy. DInSAR exploits the phase difference (interferogram) between SAR image pairs relevant to acquisitions gathered at different times, but with the same illumination geometry and from sufficiently close flight tracks, whose separation is typically referred to as the baseline. Among these, the SBAS algorithm is one of the most widely used DInSAR approaches; it is aimed at generating displacement time series at a multi-scale level by exploiting a set of small baseline interferograms. SBAS, and DInSAR generally, has benefited from the large archives of spaceborne SAR data collected over the years by several satellite systems, in particular the European ERS and ENVISAT sensors, which acquired SAR images worldwide for approximately 20 years. While the application of SBAS to ERS and ENVISAT data at local scale is well documented, very few examples of analyses of those archives at very large spatial scale are available in the literature. This is mainly due to the required processing power (in terms of CPUs, memory and storage) and the limited availability of automatic, unsupervised processing procedures, which are mandatory requirements for obtaining displacement results in a time-effective way. Accordingly, in this work we present a methodology for generating the vertical and horizontal (east-west) components of Earth's surface deformation at very large (national/continental) spatial scale. In particular, it relies on the availability of a set of SAR data collected over an Area of Interest (AoI), which could be some hundreds of thousands of square kilometers wide, from ascending and descending orbits. The exploited SAR data are processed, on a local basis, through the Parallel SBAS (P-SBAS) approach, thus generating the displacement time series and the corresponding mean deformation velocity maps. Subsequently, starting from the DInSAR results so generated, the proposed methodology relies on a proper mosaicking procedure to retrieve the mean velocity maps of the vertical and horizontal (east-west) deformation components over the overall AoI. This technique permits accounting for possible regional trends (e.g., tectonic trends) not easily detectable by local-scale DInSAR analyses. We tested the proposed methodology with the ENVISAT ASAR archives acquired, from ascending and descending orbits, over California (US), covering an area of about 100,000 km2. The presented methodology can be easily applied to other SAR satellite data. Above all, it is particularly suitable for the very large data flow provided by the Sentinel-1 constellation, which collects data with a global coverage policy and an acquisition mode specifically designed for interferometric applications.
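
    The final step, combining ascending and descending mean line-of-sight (LOS) velocity maps into vertical and east-west components, amounts to solving a small linear system per pixel. The sketch below uses a simplified geometry with assumed ENVISAT-like incidence angles and ignores heading effects; it illustrates the standard decomposition, not the exact P-SBAS mosaicking procedure.

    ```python
    import numpy as np

    def decompose_los(v_asc, v_desc, inc_asc=23.0, inc_desc=23.0):
        """Solve ascending/descending mean LOS velocities for vertical
        and east-west components (north-south is poorly constrained by
        near-polar orbits and is neglected). Incidence angles in degrees
        are assumed ENVISAT-like; heading effects are simplified away."""
        ta, td = np.radians(inc_asc), np.radians(inc_desc)
        # LOS = cos(theta)*Vz + sin(theta)*Ve; the ascending and
        # descending passes look in opposite east-west directions,
        # hence the sign flip in the second row.
        G = np.array([[np.cos(ta),  np.sin(ta)],
                      [np.cos(td), -np.sin(td)]])
        vz, ve = np.linalg.solve(G, np.array([v_asc, v_desc]))
        return vz, ve

    print(decompose_los(-4.0, -2.0))   # mm/yr, assumed LOS velocities
    ```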

  12. Exploring gender, age, time and space in research with older Pakistani Muslims in the United Kingdom: formalised research 'ethics' and performances of the public/private divide in 'the field'.

    PubMed

    Zubair, Maria; Victor, Christina

    2015-05-01

    In recent years, there has been an increasing interest in researching ageing ethnic minority populations in the West. However, older people from such minority communities have received comparatively little attention in wide-ranging discussions on appropriate research methodologies. By a process of critically reflecting on our experiences of undertaking fieldwork for our Economic and Social Research Council New Dynamics of Ageing study of 'Families and Caring in South Asian Communities', this paper maps out the key methodological and ethical challenges we faced and, in the process, highlights the importance of developing socially appropriate research methodologies and ethical frameworks for research with such populations. With a reflexive approach, we specifically explore the significance of gender, age, time and space to the fieldwork processes and the 'field' relationships formed at various stages of the research process. In particular, we explore three key emergent issues which conflicted with our formal research protocols and presented particular challenges for us and our older Pakistani Muslim participants: (a) structuring of time in daily life; (b) gendered use of public and private spaces; and (c) orality of informal social contexts and relationships. Using illustrations from our fieldwork which reveal the particular significance of these issues to our fieldwork experiences and performativities of public/private identities, we highlight important tensions between formalised ethical and methodological dimensions of conducting funded research and the realities of being in 'the field'. We conclude the paper by emphasising the need to explore further not only the ways in which researchers can adopt more socially and culturally sensitive data collection processes and methodologies at the micro level of their interactions with research participants, but also contextualising the particular challenges experienced by researchers and their participants in terms of the wider research frameworks and agendas as well as the broader social contexts within which they live and work.

  13. Global land cover mapping: a review and uncertainty analysis

    USGS Publications Warehouse

    Congalton, Russell G.; Gu, Jianyu; Yadav, Kamini; Thenkabail, Prasad S.; Ozdogan, Mutlu

    2014-01-01

    Given the advances in remotely sensed imagery and associated technologies, several global land cover maps have been produced in recent times, including IGBP DISCover, UMD Land Cover, Global Land Cover 2000 and GlobCover 2009. However, the utility of these maps for specific applications has often been hampered by considerable uncertainty and inconsistency. A thorough review of these global land cover projects, including an evaluation of the sources of error and uncertainty, is therefore both prudent and enlightening. This paper describes our work comparing, summarizing and conducting an uncertainty analysis of the four global land cover mapping projects using an error budget approach. The results showed that the classification scheme and the validation methodology had the highest error contribution and implementation priority. A comparison of the classification schemes showed many inconsistencies between the definitions of the map classes. This is especially true for the mixed-type classes, for which the thresholds vary for the attributes/discriminators used in the classification process. Examination of these four global mapping projects provided several important lessons for future global mapping projects, including the need for clear and uniform definitions of the classification scheme and an efficient, practical, and valid design of the accuracy assessment.

  14. A geostatistical approach to estimate mining efficiency indicators with flexible meshes

    NASA Astrophysics Data System (ADS)

    Freixas, Genis; Garriga, David; Fernàndez-Garcia, Daniel; Sanchez-Vila, Xavier

    2014-05-01

    Geostatistics is a branch of statistics developed originally to predict probability distributions of ore grades for mining operations, by treating the attributes of a geological formation at unknown locations as a set of correlated random variables. Mining exploitations typically aim to maintain acceptable ore grades in order to produce commercial products that meet demand. In this context, we present a new geostatistical methodology to estimate strategic efficiency maps that incorporate hydraulic test data, the evolution of concentrations with time obtained from chemical analyses (packer tests and production wells), as well as hydraulic head variations. The methodology is applied to a salt basin in South America. The exploitation is based on the extraction of brines through vertical and horizontal wells. Thereafter, the brines are precipitated in evaporation ponds to obtain target potassium and magnesium salts of economic interest. Lithium carbonate is obtained as a byproduct of the production of potassium chloride. Aside from assembling traditional geostatistical methods, the strength of this study lies in the new methodology developed, which focuses on finding the best sites to exploit the brines while maintaining efficiency criteria. Thus, strategic efficiency indicator maps have been developed under the specific criteria imposed by exploitation standards, to incorporate new extraction wells in new areas that would allow production to be maintained or improved. Results show that the uncertainty quantification of the efficiency plays a dominant role and that the use of flexible meshes, which properly describe the curvilinear features associated with vertical stratification, provides a more consistent estimation of the geological processes. Moreover, we demonstrate that the vertical correlation structure in the given salt basin is essentially linked to variations in the formation thickness, which calls for flexible meshes and non-stationary stochastic processes.
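    A basic building block of any such geostatistical workflow is the experimental variogram, which quantifies the spatial correlation that kriging and related estimators exploit. Below is a minimal isotropic sketch, with synthetic sample values standing in for well measurements; it illustrates the general technique, not the specific methodology of this study.

    ```python
    import numpy as np

    def experimental_variogram(coords, values, lags, tol):
        """Isotropic experimental semivariogram: for each lag h, average
        0.5*(z_i - z_j)**2 over sample pairs separated by [h - tol, h + tol)."""
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        sq = 0.5 * (values[:, None] - values[None, :]) ** 2
        iu = np.triu_indices(len(values), k=1)   # count each pair once
        d, sq = d[iu], sq[iu]
        return np.array([sq[(d >= h - tol) & (d < h + tol)].mean() for h in lags])

    # Example: synthetic brine-concentration samples at 200 well locations.
    rng = np.random.default_rng(0)
    coords = rng.uniform(0, 100, size=(200, 2))              # locations (m)
    values = np.sin(coords[:, 0] / 20) + 0.1 * rng.standard_normal(200)
    lags = np.arange(5, 50, 5.0)
    print(experimental_variogram(coords, values, lags, tol=2.5))
    ```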

  15. Geological Mapping Uses Landsat 4-5TM Satellite Data in Manlai Soum of Omnogovi Aimag

    NASA Astrophysics Data System (ADS)

    Norovsuren, B.

    2014-12-01

    Authors: Bayanmonkh N., Undram G., Tsolmon R., Ariunzul Ya., Bayartungalag B. (Environmental Research Information and Study Center; NUM-ITC-UNESCO Space Science and Remote Sensing International Laboratory, National University of Mongolia; Geology and Hydrology School, Korea University). KEY WORDS: geology, mineral resources, fracture, structure, lithology. ABSTRACT: The geologic map is the most important map used in mining exploration. In Mongolia, geological maps were compiled by Russian geologists using earlier technology, and those maps no longer satisfy present requirements. We therefore aim to improve the geological map, including its fracture, structural and lithological components, using Landsat 4-5 TM satellite data. If a geological map with greater detail can be produced from satellite data, geologists can interpret the mineralogy much more easily. We reviewed the methodologies and research relating to each element of geological mapping, and then applied three different remote sensing methodologies to produce structural, lithological and fracture maps using geographic information system software. The results show a visible improvement in lithological boundaries and a more understandable structural map, and we found that the fracture mapping in the Russian geological map contains considerable distortion. As a result of this research, geologists can read mineralogical elements much more easily, and three previously unmapped important elements were discovered from the satellite imagery.
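    The abstract does not specify which three remote sensing methodologies were used; one standard ingredient of TM-based lithological mapping is band ratioing. The sketch below is a generic illustration under that assumption, not the authors' workflow: the 5/7, 3/1 and 5/4 ratios conventionally highlight hydroxyl-bearing (altered) rocks, iron oxides and ferrous minerals respectively, and the band values here are synthetic.

    ```python
    import numpy as np

    def safe_ratio(a, b):
        # Element-wise ratio that avoids division by zero.
        return np.divide(a, b, out=np.zeros_like(a, dtype=float), where=b > 0)

    # Hypothetical Landsat TM bands (reflectance-like values on a small grid).
    bands = {i: np.random.default_rng(i).uniform(10, 200, (4, 4)) for i in (1, 3, 4, 5, 7)}

    # Classic TM ratio composite for lithological discrimination (R, G, B).
    rgb = np.dstack([safe_ratio(bands[5], bands[7]),   # clays / hydroxyls
                     safe_ratio(bands[3], bands[1]),   # iron oxides
                     safe_ratio(bands[5], bands[4])])  # ferrous minerals
    rgb /= rgb.max(axis=(0, 1))   # normalize each ratio layer for display
    print(rgb.shape)
    ```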

  16. A visual analytical approach to rock art panel condition assessment

    NASA Astrophysics Data System (ADS)

    Vogt, Brandon J.

    Rock art is a term for pecked, scratched, or painted symbols found on rock surfaces, most typically on joint faces called rock art panels. Because rock art exists at the rock-atmosphere interface, it is highly susceptible to the destructive processes of weathering. Thus, rock weathering scientists, including those who study both natural and cultural surfaces, play a key role in understanding rock art longevity. The mapping of weathering forms on rock art panels serves as a basis from which to assess overall panel instability. This work examines fissures, case-hardened surfaces, crumbly disintegration, and lichen. Knowledge of instability, as measured through these and other weathering forms, provides the information that land managers and archaeological conservators require to prioritize panels for remedial action. The work is divided into five chapters, three of which are to be submitted as peer-reviewed journal manuscripts. The second chapter, written as a manuscript for International Newsletter on Rock Art, describes a specific set of criteria that led to the development of a mapping tool for weathering forms, called 'mapping weathering forms in three dimensions' (MapWeF). The third chapter, written as a manuscript for Remote Sensing of Environment, presents the methodology used to develop MapWeF. The chapter incorporates terrestrial laser scanning, a geographic information system (GIS), geovisualization, image analysis, and exploratory spatial data analysis (ESDA) to identify, map, and quantify weathering features known to cause instability on rock art panels. The methodology implemented in the third chapter satisfies the criteria described in Chapter Two. In the fourth chapter, prepared as a manuscript for Geomorphology, MapWeF is applied to a site management case study focusing on a region, southeastern Colorado, with notoriously weak and endangered sandstone rock art panels. The final conclusions chapter describes the contributions of the work to GIScience and rock weathering, and discusses how MapWeF, as a diagnostic tool, fits into a larger national vision by linking existing rock art stability characterizations to cultural resource management-related conservation action.

  17. Analytical results for a stochastic model of gene expression with arbitrary partitioning of proteins

    NASA Astrophysics Data System (ADS)

    Tschirhart, Hugo; Platini, Thierry

    2018-05-01

    In biophysics, the search for analytical solutions of stochastic models of cellular processes is often a challenging task. In recent work on models of gene expression, it was shown that a mapping based on partitioning of Poisson arrivals (PPA-mapping) can lead to exact solutions for previously unsolved problems. While the approach can be used in general when the model involves Poisson processes corresponding to creation or degradation, current applications of the method and new results derived using it have been limited to date. In this paper, we present the exact solution of a variation of the two-stage model of gene expression (with time dependent transition rates) describing the arbitrary partitioning of proteins. The methodology proposed makes full use of the PPA-mapping by transforming the original problem into a new process describing the evolution of three biological switches. Based on a succession of transformations, the method leads to a hierarchy of reduced models. We give an integral expression of the time dependent generating function as well as explicit results for the mean, variance, and correlation function. Finally, we discuss how results for time dependent parameters can be extended to the three-stage model and used to make inferences about models with parameter fluctuations induced by hidden stochastic variables.
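    The two-stage model referred to here has a standard birth-death structure (mRNA creation/degradation, protein creation per mRNA/degradation). As a Monte Carlo baseline against which analytical moments like those derived in the paper can be checked, here is a minimal Gillespie (stochastic simulation algorithm) sketch with constant rates; the rate values are illustrative assumptions, and this is plain simulation, not the PPA-mapping itself.

    ```python
    import numpy as np

    def gillespie_two_stage(k_m, g_m, k_p, g_p, t_end, rng):
        """Gillespie SSA for the two-stage gene expression model:
        mRNA birth/death (k_m, g_m) and protein birth-per-mRNA/death (k_p, g_p).
        Returns the (mRNA, protein) copy numbers at time t_end."""
        t, m, p = 0.0, 0, 0
        while True:
            rates = np.array([k_m, g_m * m, k_p * m, g_p * p])
            total = rates.sum()                    # k_m > 0, so total > 0
            t += rng.exponential(1.0 / total)      # time to next reaction
            if t > t_end:
                return m, p
            r = rng.choice(4, p=rates / total)     # pick which reaction fires
            if   r == 0: m += 1
            elif r == 1: m -= 1
            elif r == 2: p += 1
            else:        p -= 1

    rng = np.random.default_rng(1)
    samples = [gillespie_two_stage(2.0, 1.0, 10.0, 0.1, 50.0, rng) for _ in range(500)]
    prot = np.array([s[1] for s in samples])
    print(prot.mean(), prot.var())   # empirical moments to compare with theory
    ```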

  18. Techniques of remote sensing applied to the environmental analysis of part of an aquifer located in the São José dos Campos Region sp, Brazil.

    PubMed

    Bressan, Mariana Affonseca; Dos Anjos, Célio Eustáquio

    2003-05-01

    Anthropogenic activity at the surface can modify and introduce new mechanisms of recharge to the groundwater system, modifying the rate, frequency and quality of recharge of underground waters. The understanding of these mechanisms and the correct evaluation of such modifications are fundamental in determining the vulnerability of groundwater to contamination. The groundwater flow of the South Paraíba Compartment, in the region of São José dos Campos, São Paulo, is directly related to structural features of the Taubaté Basin; therefore, the analysis of its behaviour enhances the understanding of the tectonic structure. The methodology adopted for this work consists of pre-processing and processing of the satellite images, visual interpretation of HSI products, field work and data integration. The derivation of the main structural features was based on visual analysis of the texture elements of drainage and relief in sedimentary and crystalline rocks. Statistical analysis of the feature densities and of the metric-geometric relations between the analysed elements was conducted. The crystalline rocks, on which the sediments were laid, condition and control the structural arrangement of the sedimentary formations. The formation of the South Paraíba Graben is associated with Cenozoic extensional movement, which reactivated old zones of crustal weakness generated in previous cycles with normal-faulting characteristics. The environmental analysis is based on the integration of the existing methodology for characterising vulnerability to a universal pollutant with fracture-zone density. The digital integration was processed using a GIS (Geographic Information System) to delineate five defined vulnerability classes. The hydrogeological settings were analysed in each thematic map and, using fuzzy logic, an index for each vulnerability class was compiled. The evidence maps could then be combined in a series of steps using map algebra.
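    The fuzzy-logic combination step described here is typically done with fuzzy overlay operators applied to membership-scaled evidence layers. A minimal sketch follows, under assumed layer names and synthetic values; the gamma operator shown is a standard GIS choice, not necessarily the one the authors used.

    ```python
    import numpy as np

    # Two illustrative evidence layers rescaled to fuzzy memberships in [0, 1]:
    # intrinsic aquifer vulnerability and fracture-zone density (hypothetical grids).
    vuln     = np.array([[0.2, 0.7], [0.9, 0.4]])
    fracture = np.array([[0.5, 0.8], [0.6, 0.1]])

    f_and = np.minimum(vuln, fracture)   # fuzzy AND (conservative overlay)
    f_or  = np.maximum(vuln, fracture)   # fuzzy OR  (optimistic overlay)
    gamma = 0.7                          # blends product and algebraic sum
    f_gamma = (vuln * fracture) ** (1 - gamma) * \
              (1 - (1 - vuln) * (1 - fracture)) ** gamma   # fuzzy gamma operator

    # Map the combined index onto five vulnerability classes.
    classes = np.digitize(f_gamma, bins=[0.2, 0.4, 0.6, 0.8]) + 1
    print(f_gamma, classes, sep="\n")
    ```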

  19. Failure mode and effects analysis outputs: are they valid?

    PubMed

    Shebl, Nada Atef; Franklin, Bryony Dean; Barber, Nick

    2012-06-10

    Failure Mode and Effects Analysis (FMEA) is a prospective risk assessment tool that has been widely used within the aerospace and automotive industries and has been utilised within healthcare since the early 1990s. The aim of this study was to explore the validity of FMEA outputs within a hospital setting in the United Kingdom. Two multidisciplinary teams each conducted an FMEA for the use of vancomycin and gentamicin. Four different validity tests were conducted: face validity, by comparing the FMEA participants' mapped processes with observational work; content validity, by presenting the FMEA findings to other healthcare professionals; criterion validity, by comparing the FMEA findings with data reported on the trust's incident report database; and construct validity, by exploring the relevant mathematical theories involved in calculating the FMEA risk priority number (RPN). Face validity was positive, as the researcher documented the same processes of care as those mapped by the FMEA participants. However, other healthcare professionals identified potential failures missed by the FMEA teams. Furthermore, the FMEA teams failed to include failures related to omitted doses, yet these were the failures most commonly reported in the trust's incident database. Calculating the RPN by multiplying severity, probability and detectability scores was deemed invalid because it is based on calculations that breach the mathematical properties of the scales used. There are significant methodological challenges in validating FMEA. It is a useful tool to aid multidisciplinary groups in mapping and understanding a process of care; however, the results of our study cast doubt on its validity. FMEA teams are likely to need different sources of information, besides their personal experience and knowledge, to identify potential failures. As for FMEA's methodology for scoring failures, there were discrepancies between the teams' estimates and similar incidents reported on the trust's incident database. Furthermore, the concept of multiplying ordinal scales to prioritise failures is mathematically flawed. Until FMEA's validity is further explored, healthcare organisations should not depend solely on their FMEA results to prioritise patient safety issues.
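    The mathematical objection to the RPN can be made concrete in a few lines: two failure modes with very different risk profiles can receive identical RPNs, because multiplication treats ordinal scores as if they were equally spaced ratio quantities. The failure modes below are invented for illustration.

    ```python
    # Two hypothetical failure modes with identical Risk Priority Numbers but
    # very different risk profiles, illustrating why multiplying ordinal
    # severity x probability x detectability scores is problematic.
    failures = {
        "wrong drug dispensed": {"severity": 9, "probability": 2, "detectability": 2},
        "delayed chart update": {"severity": 3, "probability": 4, "detectability": 3},
    }
    for name, f in failures.items():
        rpn = f["severity"] * f["probability"] * f["detectability"]
        print(f"{name}: RPN = {rpn}")
    # Both print RPN = 36: the product erases the distinction between a rare
    # catastrophic failure and a common minor one, and assumes the ordinal
    # scale steps are equally spaced, which they are not.
    ```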

  20. Field-based rice classification in Wuhua county through integration of multi-temporal Sentinel-1A and Landsat-8 OLI data

    NASA Astrophysics Data System (ADS)

    Yang, Huijin; Pan, Bin; Wu, Wenfu; Tai, Jianhao

    2018-07-01

    Rice is one of the most important cereals in the world. With changes in agricultural land use, it is urgently necessary to update information about rice planting areas. This study aims to map rice planting areas with a field-based approach through the integration of multi-temporal Sentinel-1A and Landsat-8 OLI data in Wuhua County of South China, an area of many basins and mountains. Using multi-temporal SAR and optical images, this paper proposes a methodology for the identification of rice-planting areas. The methodology mainly consists of SSM applied to time-series SAR images for the calculation of a similarity measure; an image segmentation process applied to the pan-sharpened optical image to search for homogeneous objects; and the integration of SAR and optical data to eliminate speckle. The study compares the per-pixel approach with the per-field approach, and the results show that the highest accuracy (91.38%), obtained with the field-based approach, is 1.18 percentage points higher than that of the pixel-based approach for VH polarization, an improvement attributable to the elimination of speckle noise, as shown by comparing the rice maps of the two approaches. Therefore, the integration of Sentinel-1A and Landsat-8 OLI images with a field-based approach has great potential for mapping rice and other crops.

  1. Extending systems thinking in planning and evaluation using group concept mapping and system dynamics to tackle complex problems.

    PubMed

    Hassmiller Lich, Kristen; Urban, Jennifer Brown; Frerichs, Leah; Dave, Gaurav

    2017-02-01

    Group concept mapping (GCM) has been successfully employed in program planning and evaluation for over 25 years. The broader set of systems thinking methodologies (of which GCM is one) has only recently found its way into the field. We present an overview of systems thinking emerging from a system dynamics (SD) perspective, and illustrate the potential synergy between GCM and SD. As with GCM, participatory processes are frequently employed when building SD models; however, it can be challenging to engage a large and diverse group of stakeholders in the iterative cycles of divergent thinking and consensus building required, while maintaining a broad perspective on the issue being studied. GCM provides a compelling resource for overcoming this challenge by richly engaging a diverse set of stakeholders in broad exploration, structuring, and prioritization. SD provides an opportunity to extend GCM findings by embedding constructs in a testable hypothesis (an SD model) describing how system structure and changes in constructs affect outcomes over time; SD can then be used to simulate the dynamics hypothesized in GCM concept maps, as sketched below. We illustrate the potential of the marriage of these methodologies in a case study of BECOMING, a federally funded program aimed at strengthening the cross-sector system of care for youth with severe emotional disturbances. Copyright © 2016 Elsevier Ltd. All rights reserved.
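    To make the GCM-to-SD handoff concrete, here is a minimal stock-and-flow sketch: a single stock whose inflow and outflow are governed by parameters standing in for prioritized GCM constructs, integrated with Euler's method. The construct names and values are hypothetical, not taken from the BECOMING study.

    ```python
    import numpy as np

    # Minimal system-dynamics sketch: one stock with a capacity-limited inflow
    # and a proportional outflow. "referral_capacity" and "dropout_fraction"
    # are stand-ins for prioritized GCM clusters.
    dt, t_end = 0.25, 24.0            # time step and horizon (months)
    referral_capacity = 12.0          # inflow ceiling (youth/month)
    dropout_fraction = 0.08           # per-month outflow rate
    stock = 40.0                      # youth currently engaged in coordinated care

    history = []
    for t in np.arange(0, t_end, dt):
        inflow = min(referral_capacity, 0.15 * (200 - stock))  # demand-limited
        outflow = dropout_fraction * stock
        stock += dt * (inflow - outflow)   # Euler integration of the stock
        history.append(stock)
    print(round(history[-1], 1))           # simulated engagement after 24 months
    ```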

  2. Evaluation of subsidence hazard in mantled karst setting: a case study from Val d'Orléans (France)

    NASA Astrophysics Data System (ADS)

    Perrin, Jérôme; Cartannaz, Charles; Noury, Gildas; Vanoudheusden, Emilie

    2015-04-01

    Soil subsidence/collapse is a major geohazard in karst regions. It occurs as suffosion or dropout sinkholes developing in the soft cover; less frequently, it corresponds to the breakdown of a karst void ceiling (i.e., a collapse sinkhole). This hazard can pose significant engineering challenges; decision-makers therefore require methodologies for reliable prediction of such hazards (e.g., karst subsidence susceptibility and hazard maps, early-warning monitoring systems). A methodological framework was developed to evaluate the relevant conditioning factors favouring subsidence (Perrin et al., submitted) and then to combine these factors to produce karst subsidence susceptibility maps. This approach was applied to a mantled karst area south of Paris (Val d'Orléans). Results show the significant roles of the overburden lithology (presence/absence of a low-permeability layer) and of the position of the karst aquifer piezometric surface within the overburden. In parallel, an experimental site has been set up to improve the understanding of the key processes leading to subsidence/collapse; it includes piezometers for measuring water levels and physico-chemical parameters in both the alluvial and karst aquifers, as well as surface deformation monitoring. The results should help in designing monitoring systems to anticipate the occurrence of subsidence/collapse. Perrin J., Cartannaz C., Noury G., Vanoudheusden E. 2015. A multicriteria approach to karst subsidence hazard mapping supported by Weights-of-Evidence analysis. Submitted to Engineering Geology.

  3. Probabilistic Flood Maps to support decision-making: Mapping the Value of Information

    NASA Astrophysics Data System (ADS)

    Alfonso, L.; Mukolwe, M. M.; Di Baldassarre, G.

    2016-02-01

    Floods are among the most frequent and disruptive natural hazards affecting humans; significant flood damage is documented worldwide every year. Flood mapping is a common pre-impact flood hazard mitigation measure, for which advanced methods and tools (such as flood inundation models) are used to estimate potential flood extent maps that are then used in spatial planning. However, these tools are affected, largely to an unknown degree, by both epistemic and aleatory uncertainty. Over the past few years, advances in uncertainty analysis with respect to flood inundation modeling have shown that it is appropriate to adopt Probabilistic Flood Maps (PFMs) to account for uncertainty. The question then arises: how can probabilistic flood hazard information be incorporated into spatial planning? A consistent framework to incorporate PFMs into decision-making is required. In this paper, a novel methodology based on theories of decision-making under uncertainty, in particular Value of Information (VOI), is proposed. Specifically, the methodology uses a PFM to generate a VOI map, which highlights floodplain locations where additional information is valuable with respect to the available floodplain management actions and their potential consequences. The methodology is illustrated with a simplified example and also applied to a real case study in the South of France, where a VOI map is analyzed on the basis of historical land use change decisions over a period of 26 years. Results show that the uncertain flood hazard information encapsulated in PFMs can aid decision-making in floodplain planning.
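    A toy VOI calculation for a single floodplain parcel, under assumed payoffs: the value of (here, perfect) information is the difference between the expected payoff when the decision can react to the true state and the expected payoff of the best prior decision. All numbers below are hypothetical and are not taken from the paper's case study.

    ```python
    # Minimal Value of Information sketch for one floodplain parcel.
    # A PFM supplies P(flood); the planner chooses "develop" or "restrict".
    p_flood = 0.3                     # probability from the PFM at this parcel
    payoff = {                        # hypothetical net benefits (k EUR)
        ("develop", "flood"):  -500.0,
        ("develop", "dry"):     200.0,
        ("restrict", "flood"):  -20.0,
        ("restrict", "dry"):    -20.0,
    }

    def expected(action, p):
        return p * payoff[(action, "flood")] + (1 - p) * payoff[(action, "dry")]

    # Prior decision: best expected payoff without further information.
    v_prior = max(expected(a, p_flood) for a in ("develop", "restrict"))
    # With perfect information, the best action is chosen in each state of nature.
    v_perfect = (p_flood * max(payoff[(a, "flood")] for a in ("develop", "restrict"))
                 + (1 - p_flood) * max(payoff[(a, "dry")] for a in ("develop", "restrict")))
    voi = v_perfect - v_prior          # expected value of perfect information
    print(v_prior, v_perfect, voi)     # -10.0, 134.0, 144.0
    ```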

  4. Longitudinal assessment of treatment effects on pulmonary ventilation using 1H/3He MRI multivariate templates

    NASA Astrophysics Data System (ADS)

    Tustison, Nicholas J.; Contrella, Benjamin; Altes, Talissa A.; Avants, Brian B.; de Lange, Eduard E.; Mugler, John P.

    2013-03-01

    The utility of pulmonary functional imaging techniques, such as hyperpolarized 3He MRI, has encouraged their inclusion in research studies for the longitudinal assessment of disease progression and the study of treatment effects. We present methodology for performing voxelwise statistical analysis of ventilation maps derived from hyperpolarized 3He MRI which incorporates multivariate template construction using simultaneous acquisition of 1H and 3He images. Additional processing steps include intensity normalization, bias correction, 4-D longitudinal segmentation, and generation of expected ventilation maps prior to voxelwise regression analysis. The analysis is demonstrated on a cohort of eight individuals with diagnosed cystic fibrosis (CF) undergoing treatment, imaged five times at two-week intervals under a prescribed treatment schedule.

  5. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology for communicating site-specific landslide hazard assessments in a large-scale regional map. Implementation of the proposed methodology results in seismically induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g., 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources, including recent landslide inventories, LIDAR and photogrammetric topographic data, geologic maps, mapped NEHRP site classifications based on the available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating the slopes present along landslide scarps and deposits for the major geologic units. Code was then developed to integrate these layers and perform a rigid sliding-block analysis, determining the amounts and associated probabilities of displacement based on each bin of peak ground acceleration in the seismic hazard curve at each pixel, as sketched below. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils on steep slopes; such conditions pose a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds was generated for the study area. These output maps were then utilized in a performance-based design framework, enabling them to be analyzed in conjunction with other hazards for fully probabilistic hazard evaluation and risk assessment.
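    A minimal sketch of the per-pixel computation described: estimate a rigid sliding-block displacement for each PGA bin, then sum the annual probability mass of the bins whose displacement exceeds a map threshold. The displacement relation used here is one published empirical regression (Jibson, 2007), serving as a stand-in for whatever relation the authors implemented; the hazard-curve numbers and critical acceleration are invented.

    ```python
    import numpy as np

    def newmark_disp_cm(ac, amax):
        """Empirical rigid sliding-block displacement (Jibson, 2007 regression):
        log10(Dn) = 0.215 + log10[(1 - ac/amax)**2.341 * (ac/amax)**-1.438],
        Dn in cm. Returns 0 where the critical acceleration is not exceeded."""
        ratio = ac / amax
        if ratio >= 1.0:
            return 0.0
        return 10 ** (0.215 + np.log10((1 - ratio) ** 2.341 * ratio ** -1.438))

    # Hypothetical hazard-curve bins for one pixel: PGA (g) and the annual
    # probability mass assigned to each bin.
    pga_bins  = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
    bin_probs = np.array([0.02, 0.008, 0.003, 0.001, 0.0004])
    ac = 0.15                              # critical acceleration (g)

    disp = np.array([newmark_disp_cm(ac, a) for a in pga_bins])
    threshold_cm = 1.0                     # illustrative displacement threshold
    p_exceed = bin_probs[disp > threshold_cm].sum()   # annual exceedance prob.
    print(disp.round(1), p_exceed)
    ```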

  6. A non-linear dimension reduction methodology for generating data-driven stochastic input models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ganapathysubramanian, Baskar; Zabaras, Nicholas

    Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space R^n. An isometric mapping F from M to a low-dimensional, compact, connected set A contained in R^d (d <
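    The record is cut off above, but the nonlinear dimension reduction it describes, mapping a high-dimensional set of realizations onto a low-dimensional embedding, can be illustrated with an off-the-shelf isometric method. The sketch below uses scikit-learn's Isomap on synthetic data standing in for microstructure samples; it shows the general technique, not the authors' specific construction.

    ```python
    import numpy as np
    from sklearn.manifold import Isomap

    # Synthetic stand-in for microstructure samples: points on a 1-D manifold
    # embedded in R^10 (a noisy curve), mimicking a set of correlated realizations.
    rng = np.random.default_rng(0)
    t = rng.uniform(0, 3 * np.pi, 400)
    X = np.stack([np.cos(t), np.sin(t)]
                 + [0.05 * rng.standard_normal(400) for _ in range(8)], axis=1)

    # Isometric mapping of the high-dimensional set M to a low-dimensional set A.
    embedding = Isomap(n_neighbors=10, n_components=1).fit_transform(X)
    print(embedding.shape)    # (400, 1): low-dimensional coordinates on A
    ```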

  7. Mobile mapping of sporting event spectators using bluetooth sensors: tour of flanders 2011.

    PubMed

    Versichele, Mathias; Neutens, Tijs; Goudeseune, Stephanie; van Bossche, Frederik; van de Weghe, Nico

    2012-10-22

    Accurate spatiotemporal information on crowds is a necessity for a better management in general and for the mitigation of potential security risks. The large numbers of individuals involved and their mobility, however, make generation of this information non-trivial. This paper proposes a novel methodology to estimate and map crowd sizes using mobile Bluetooth sensors and examines to what extent this methodology represents a valuable alternative to existing traditional crowd density estimation methods. The proposed methodology is applied in a unique case study that uses Bluetooth technology for the mobile mapping of spectators of the Tour of Flanders 2011 road cycling race. The locations of nearly 16,000 cell phones of spectators along the race course were registered and detailed views of the spatiotemporal distribution of the crowd were generated. Comparison with visual head counts from camera footage delivered a detection ratio of 13.0 ± 2.3%, making it possible to estimate the crowd size. To our knowledge, this is the first study that uses mobile Bluetooth sensors to count and map a crowd over space and time.
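    A back-of-envelope use of the reported detection ratio: dividing the Bluetooth device count by the ratio yields a crowd estimate, with first-order propagation of the ratio's uncertainty. The detection count used below is hypothetical, not from the study.

    ```python
    # Scaling Bluetooth detections to a crowd estimate with the reported
    # detection ratio of 13.0 ± 2.3% (mean ± uncertainty).
    detections = 1600                  # unique devices seen at one location (hypothetical)
    ratio, ratio_sd = 0.130, 0.023

    crowd = detections / ratio
    # First-order propagation of the ratio uncertainty: sd(N)/N = sd(r)/r.
    crowd_sd = crowd * ratio_sd / ratio
    print(f"estimated crowd: {crowd:.0f} +/- {crowd_sd:.0f}")
    ```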

  8. Mobile Mapping of Sporting Event Spectators Using Bluetooth Sensors: Tour of Flanders 2011

    PubMed Central

    Versichele, Mathias; Neutens, Tijs; Goudeseune, Stephanie; van Bossche, Frederik; van de Weghe, Nico

    2012-01-01

    Accurate spatiotemporal information on crowds is a necessity for a better management in general and for the mitigation of potential security risks. The large numbers of individuals involved and their mobility, however, make generation of this information non-trivial. This paper proposes a novel methodology to estimate and map crowd sizes using mobile Bluetooth sensors and examines to what extent this methodology represents a valuable alternative to existing traditional crowd density estimation methods. The proposed methodology is applied in a unique case study that uses Bluetooth technology for the mobile mapping of spectators of the Tour of Flanders 2011 road cycling race. The locations of nearly 16,000 cell phones of spectators along the race course were registered and detailed views of the spatiotemporal distribution of the crowd were generated. Comparison with visual head counts from camera footage delivered a detection ratio of 13.0 ± 2.3%, making it possible to estimate the crowd size. To our knowledge, this is the first study that uses mobile Bluetooth sensors to count and map a crowd over space and time. PMID:23202044

  9. Modeling of natural risks in GIS, decision support in the Civil Protection and Emergency Planning

    NASA Astrophysics Data System (ADS)

    Santos, M.; Martins, L.; Moreira, S.; Costa, A.; Matos, F.; Teixeira, M.; Bateira, C.

    2012-04-01

    The assessment of natural hazards in Civil Protection is essential for the prevention and mitigation of emergency situations. This paper presents the results of mapping susceptibility to landslides, floods, forest fires and soil erosion, using GIS (Geographic Information System) tools, in two municipalities, Santo Tirso and Trofa, in the district of Oporto in northwest Portugal. The mapping of natural hazards fits within the legislative plan of the Municipal Civil Protection (Law No. 65/2007 of 12 November) and provides the key elements for planning and preparing an appropriate response should any of the processes/phenomena occur, thus optimizing the procedures for protection and relief provided by the Municipal Civil Protection Service. Susceptibility mapping for landslides, floods, forest fires and soil erosion was performed with GIS tools. The methodology used to compile the maps of landslides, forest fires and soil erosion was based on the modeling of different conditioning factors, validated with field work and event logs. The mapping of susceptibility to floods and flooding was developed using mathematical parameters (statistical, hydrologic and hydraulic), supported by field work and the recognition of the individual characteristics of each sector of analysis, and subsequently analyzed in a GIS environment. The maps were produced at 1:5000 scale, which allows not only the identification of large areas affected by the spatial dynamics of the processes/phenomena, but also more detailed analysis, especially when combined with geographic information systems (GIS), thus making it possible to study more specific situations that require a quick response. The maps developed in this study are fundamental to the understanding, prediction and prevention of the susceptibilities and risks present in the municipalities, and they are a valuable tool in the Emergency Planning process, since they identify priority areas of intervention for more detailed analysis, promote and safeguard mechanisms to prevent injury, and anticipate potential interventions that can minimize risk.

  10. Validation and application of Acoustic Mapping Velocimetry

    NASA Astrophysics Data System (ADS)

    Baranya, Sandor; Muste, Marian

    2016-04-01

    The goal of this paper is to introduce a novel methodology to estimate bedload transport in rivers based on an improved bedform tracking procedure. The measurement technique combines components and processing protocols from two contemporary nonintrusive instruments: acoustic and image-based. The bedform mapping is conducted with acoustic surveys, while the velocity of the bedforms is estimated with processing techniques pertaining to image-based velocimetry. The technique is therefore called Acoustic Mapping Velocimetry (AMV). The implementation of this technique produces a whole-field velocity map associated with the multi-directional bedform movement. Based on the calculated two-dimensional bedform migration velocity field, the bedload transport is estimated using the Exner equation, as illustrated below. A proof-of-concept experiment was performed to validate the AMV-based bedload estimation in a laboratory flume at IIHR-Hydroscience & Engineering (IIHR). The bedform migration was analysed at three different flow discharges. Repeated bed geometry mapping, using a multiple transducer array (MTA), provided acoustic maps, which were post-processed with a particle image velocimetry (PIV) method. Bedload transport rates were calculated along longitudinal sections using the streamwise components of the bedform velocity vectors and the measured bedform heights. The bulk transport rates were compared with the results of concurrent direct physical sampling, and acceptable agreement was found. As a first field implementation of AMV, an attempt was made to estimate bedload transport for a section of the Ohio River in the United States, where bed geometry maps resulting from repeated multibeam echo sounder (MBES) surveys served as input data. Cross-sectional distributions of bedload transport rates from the AMV-based method were compared with those obtained from another non-intrusive technique, ISSDOTv2, developed by the US Army Corps of Engineers (direct samplings being unavailable). The good agreement between the results of the two methods is encouraging and suggests further field tests in varying hydro-morphological situations.
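    The dune-tracking principle behind this estimation can be written compactly: for migrating bedforms, the volumetric bedload rate per unit width is commonly approximated as q_b ≈ β(1 − p)·H·V_c, with H the bedform height, V_c its streamwise migration velocity, p the bed porosity, and β a shape factor (roughly 0.5-0.6 for triangular dunes). The sketch below applies this relation to illustrative numbers; it is not the authors' processing chain.

    ```python
    import numpy as np

    # Bedload transport from tracked bedforms: q_b ~= beta * (1 - p) * H * V_c.
    # All values below are illustrative, not measurements from the paper.
    H   = np.array([0.12, 0.10, 0.15])        # bedform heights (m)
    V_c = np.array([2.0e-4, 2.6e-4, 1.7e-4])  # migration velocities (m/s) from PIV
    porosity, beta = 0.35, 0.55               # bed porosity and dune shape factor

    q_b = beta * (1 - porosity) * H * V_c     # volumetric rate per unit width (m^2/s)
    print(q_b.mean())                         # section-averaged bedload transport
    ```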

  11. A mapping closure for turbulent scalar mixing using a time-evolving reference field

    NASA Technical Reports Server (NTRS)

    Girimaji, Sharath S.

    1992-01-01

    A general mapping-closure approach for modeling scalar mixing in homogeneous turbulence is developed. This approach is different from the previous methods in that the reference field also evolves according to the same equations as the physical scalar field. The use of a time-evolving Gaussian reference field results in a model that is similar to the mapping closure model of Pope (1991), which is based on the methodology of Chen et al. (1989). Both models yield identical relationships between the scalar variance and higher-order moments, which are in good agreement with heat conduction simulation data and can be consistent with any type of ε(φ) evolution. The present methodology can be extended to any reference field whose behavior is known. The possibility of a β-pdf reference field is explored. The shortcomings of the mapping closure methods are discussed, and the limit at which the mapping becomes invalid is identified.

  12. The Uses of Emotion Maps in Research and Clinical Practice with Families and Couples: Methodological Innovation and Critical Inquiry

    PubMed Central

    Gabb, Jacqui; Singh, Reenee

    2015-01-01

    We explore how “emotion maps” can be productively used in clinical assessment and clinical practice with families and couples. This graphic participatory method was developed in sociological studies to examine everyday family relationships. Emotion maps enable us to effectively “see” the dynamic experience and emotional repertoires of family life. Through a case example, in this article we illustrate how emotion maps can add to the systemic clinician's repertoire of visual methods. For clinicians working with families, couples, and young people, the importance of gaining insight into how lives are lived at home cannot be overstated. Producing emotion maps can encourage critical personal reflection and expedite change in family practice. Hot spots in the household become visualized, facilitating dialogue about prevailing issues and how these events may be perceived differently by different family members. As emotion maps are not reliant on literacy or language skills, they can be completed by parents and children alike, enabling children's perspectives to be heard. Emotion maps can be used as assessment tools to demonstrate the process of change within families. Furthermore, emotion maps can be extended through technology and hence are particularly well suited to working with young people. We end the article with a wider discussion of the place of emotions and emotion maps within systemic psychotherapy. PMID:25091031

  13. Revision of Time-Independent Probabilistic Seismic Hazard Maps for Alaska

    USGS Publications Warehouse

    Wesson, Robert L.; Boyd, Oliver S.; Mueller, Charles S.; Bufe, Charles G.; Frankel, Arthur D.; Petersen, Mark D.

    2007-01-01

    We present here time-independent probabilistic seismic hazard maps of Alaska and the Aleutians for peak ground acceleration (PGA) and 0.1, 0.2, 0.3, 0.5, 1.0 and 2.0 second spectral acceleration at probability levels of 2 percent in 50 years (annual probability of 0.000404), 5 percent in 50 years (annual probability of 0.001026) and 10 percent in 50 years (annual probability of 0.0021). These maps represent a revision of existing maps based on newly obtained data and assumptions reflecting best current judgments about methodology and approach. These maps have been prepared following the procedures and assumptions made in the preparation of the 2002 National Seismic Hazard Maps for the lower 48 States. A significant improvement relative to the 2002 methodology is the ability to include variable slip rate along a fault where appropriate. These maps incorporate new data, the responses to comments received at workshops held in Fairbanks and Anchorage, Alaska, in May, 2005, and comments received after draft maps were posted on the National Seismic Hazard Mapping Web Site. These maps will be proposed for adoption in future revisions to the International Building Code. In this documentation we describe the maps and in particular explain and justify changes that have been made relative to the 1999 maps. We are also preparing a series of experimental maps of time-dependent hazard that will be described in future documents.
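    The probability levels stated above convert between a probability of exceedance over 50 years and an equivalent annual probability, under the usual assumption of independence between years. The following snippet reproduces the figures quoted in the record.

    ```python
    def annual_prob(p_in_t, t_years=50):
        """Annual exceedance probability equivalent to probability p_in_t over
        t_years, assuming independent years: p = 1 - (1 - P)**(1/T)."""
        return 1 - (1 - p_in_t) ** (1 / t_years)

    for p in (0.02, 0.05, 0.10):
        print(f"{p:.0%} in 50 years -> annual probability {annual_prob(p):.6f}")
    # 2% -> 0.000404, 5% -> 0.001026, 10% -> 0.002105 (the map's stated levels)
    ```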

  14. Fault region localization (FRL): collaborative product and process improvement based on field performance

    NASA Astrophysics Data System (ADS)

    Mannar, Kamal; Ceglarek, Darek

    2005-11-01

    Customer feedback in the form of warranty/field performance is an important and direct indicator of the quality and robustness of a product. Linking warranty information to manufacturing measurements can identify the key design parameters and process variables (DPs and PVs) that are related to warranty failures. Warranty data have traditionally been used in reliability studies to determine failure distributions and warranty cost. This paper proposes a novel Fault Region Localization (FRL) methodology to map warranty failures to manufacturing measurements (and hence to DPs/PVs) in order to diagnose warranty failures and perform tolerance re-evaluation. The FRL methodology consists of two parts: 1. Identifying relations between warranty failures and DPs/PVs using the Generalized Rough Set (GRS) method. GRS is a supervised learning technique that identifies the specific DPs and PVs related to the given warranty failures and then determines the corresponding Warranty Fault Region (WFR), Normal Region (NR) and Boundary Region (BND); it extends the traditional Rough Set method by allowing for the noise and uncertainty of warranty data classes. 2. Re-evaluating the original tolerances of the DPs/PVs based on the identified WFR and BND regions. The FRL methodology is illustrated using case studies based on two warranty failures from the electronics industry.

  15. Crop biometric maps: the key to prediction.

    PubMed

    Rovira-Más, Francisco; Sáiz-Rubio, Verónica

    2013-09-23

    The sustainability of agricultural production in the twenty-first century, both in industrialized and developing countries, benefits from the integration of farm management with information technology such that individual plants, rows, or subfields may be endowed with a singular "identity." This approach approximates the nature of agricultural processes to the engineering of industrial processes. In order to cope with the vast variability of nature and the uncertainties of agricultural production, the concept of crop biometrics is defined as the scientific analysis of agricultural observations confined to spaces of reduced dimensions and known position with the purpose of building prediction models. This article develops the idea of crop biometrics by setting its principles, discussing the selection and quantization of biometric traits, and analyzing the mathematical relationships among measured and predicted traits. Crop biometric maps were applied to the case of a wine-production vineyard, in which vegetation amount, relative altitude in the field, soil compaction, berry size, grape yield, juice pH, and grape sugar content were selected as biometric traits. The enological potential of grapes was assessed with a quality-index map defined as a combination of titratable acidity, sugar content, and must pH. Prediction models for yield and quality were developed for high and low resolution maps, showing the great potential of crop biometric maps as a strategic tool for vineyard growers as well as for crop managers in general, due to the wide versatility of the methodology proposed.

  16. Crop Biometric Maps: The Key to Prediction

    PubMed Central

    Rovira-Más, Francisco; Sáiz-Rubio, Verónica

    2013-01-01

    The sustainability of agricultural production in the twenty-first century, both in industrialized and developing countries, benefits from the integration of farm management with information technology such that individual plants, rows, or subfields may be endowed with a singular “identity.” This approach approximates the nature of agricultural processes to the engineering of industrial processes. In order to cope with the vast variability of nature and the uncertainties of agricultural production, the concept of crop biometrics is defined as the scientific analysis of agricultural observations confined to spaces of reduced dimensions and known position with the purpose of building prediction models. This article develops the idea of crop biometrics by setting its principles, discussing the selection and quantization of biometric traits, and analyzing the mathematical relationships among measured and predicted traits. Crop biometric maps were applied to the case of a wine-production vineyard, in which vegetation amount, relative altitude in the field, soil compaction, berry size, grape yield, juice pH, and grape sugar content were selected as biometric traits. The enological potential of grapes was assessed with a quality-index map defined as a combination of titratable acidity, sugar content, and must pH. Prediction models for yield and quality were developed for high and low resolution maps, showing the great potential of crop biometric maps as a strategic tool for vineyard growers as well as for crop managers in general, due to the wide versatility of the methodology proposed. PMID:24064605

  17. Evaluation of the clinical process in a critical care information system using the Lean method: a case study

    PubMed Central

    2012-01-01

    Background There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. Methods We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. Results We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. Conclusions The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy. PMID:23259846

  18. Evaluation of the clinical process in a critical care information system using the Lean method: a case study.

    PubMed

    Yusof, Maryati Mohd; Khodambashi, Soudabeh; Mokhtar, Ariffin Marzuki

    2012-12-21

    There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy.

  19. Methodology to design a municipal solid waste generation and composition map: A case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallardo, A., E-mail: gallardo@uji.es; Carlos, M., E-mail: mcarlos@uji.es; Peris, M., E-mail: perism@uji.es

    Highlights: • To draw a waste generation and composition map of a town, many factors must be taken into account. • The proposed methodology offers two different approaches depending on the available data, combined with geographical information systems. • The methodology has been applied to a Spanish city with success. • The methodology will be a useful tool for organizing municipal solid waste management. - Abstract: Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step is to define the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to characterize them beforehand. Moreover, the waste generation and composition patterns may vary across the town and over time. Generally, the data are not homogeneous across the city, as the number of inhabitants is not constant and neither is the economic activity. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies that deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available, and an indirect way when data are lacking and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool for organizing MSW collection routes, including selective collection. The methodology has been verified by successfully applying it to a Spanish town.

  20. Toward a Digital Thread and Data Package for Metals-Additive Manufacturing.

    PubMed

    Kim, D B; Witherell, P; Lu, Y; Feng, S

    2017-01-01

    Additive manufacturing (AM) has been envisioned by many as a driving factor of the next industrial revolution. Potential benefits of AM adoption include the production of low-volume, customized, complicated parts/products, supply chain efficiencies, shortened time-to-market, and environmental sustainability. Work remains, however, for AM to reach the status of a full production-ready technology. Whereas the ability to create unique 3D geometries has been generally proven, production challenges remain, including lack of (1) data manageability through information management systems, (2) traceability to promote product producibility, process repeatability, and part-to-part reproducibility, and (3) accountability through mature certification and qualification methodologies. To address these challenges in part, this paper discusses the building of data models to support the development of validation and conformance methodologies in AM. We present an AM information map that leverages informatics to facilitate part producibility, process repeatability, and part-to-part reproducibility in an AM process. We present three separate case studies to demonstrate the importance of establishing baseline data structures and part provenance through an AM digital thread.

  1. Toward a Digital Thread and Data Package for Metals-Additive Manufacturing

    PubMed Central

    Kim, D. B.; Witherell, P.; Lu, Y.; Feng, S.

    2017-01-01

    Additive manufacturing (AM) has been envisioned by many as a driving factor of the next industrial revolution. Potential benefits of AM adoption include the production of low-volume, customized, complicated parts/products, supply chain efficiencies, shortened time-to-market, and environmental sustainability. Work remains, however, for AM to reach the status of a full production-ready technology. Whereas the ability to create unique 3D geometries has been generally proven, production challenges remain, including lack of (1) data manageability through information management systems, (2) traceability to promote product producibility, process repeatability, and part-to-part reproducibility, and (3) accountability through mature certification and qualification methodologies. To address these challenges in part, this paper discusses the building of data models to support the development of validation and conformance methodologies in AM. We present an AM information map that leverages informatics to facilitate part producibility, process repeatability, and part-to-part reproducibility in an AM process. We present three separate case studies to demonstrate the importance of establishing baseline data structures and part provenance through an AM digital thread. PMID:28691115

  2. 3D nonrigid registration via optimal mass transport on the GPU.

    PubMed

    Ur Rehman, Tauseef; Haber, Eldad; Pryor, Gallagher; Melonakos, John; Tannenbaum, Allen

    2009-12-01

    In this paper, we present a new computationally efficient numerical scheme for the minimizing flow approach to optimal mass transport (OMT), with applications to non-rigid 3D image registration. The approach utilizes all of the gray-scale data in both images, and the optimal mapping from image A to image B is the inverse of the optimal mapping from B to A. Further, no landmarks need to be specified, and the minimizer of the distance functional involved is unique. Our implementation also employs multigrid and parallel methodologies on a consumer graphics processing unit (GPU) for fast computation. Although computing the optimal map has been shown to be computationally expensive in the past, we show that our approach is orders of magnitude faster than previous work and is capable of finding transport maps with optimality measures (mean curl) previously unattainable by other works (which directly influences the accuracy of registration). We give results where the algorithm was used to compute non-rigid registrations of 3D synthetic data as well as intra-patient pre-operative and post-operative 3D brain MRI datasets.

  3. Trends in hard X-ray fluorescence mapping: environmental applications in the age of fast detectors.

    PubMed

    Lombi, E; de Jonge, M D; Donner, E; Ryan, C G; Paterson, D

    2011-06-01

    Environmental samples are extremely diverse but share a tendency for heterogeneity and complexity. This heterogeneity poses methodological challenges when investigating biogeochemical processes. In recent years, the development of analytical tools capable of probing element distribution and speciation at the microscale have allowed this challenge to be addressed. Of these available tools, laterally resolved synchrotron techniques such as X-ray fluorescence mapping are key methods for the in situ investigation of micronutrients and inorganic contaminants in environmental samples. This article demonstrates how recent advances in X-ray fluorescence detector technology are bringing new possibilities to environmental research. Fast detectors are helping to circumvent major issues such as X-ray beam damage of hydrated samples, as dwell times during scanning are reduced. They are also helping to reduce temporal beamtime requirements, making particularly time-consuming techniques such as micro X-ray fluorescence (μXRF) tomography increasingly feasible. This article focuses on μXRF mapping of nutrients and metalloids in environmental samples, and suggests that the current divide between mapping and speciation techniques will be increasingly blurred by the development of combined approaches.

  4. Temporally inter-comparable maps of terrestrial wilderness and the Last of the Wild

    NASA Astrophysics Data System (ADS)

    Allan, James R.; Venter, Oscar; Watson, James E. M.

    2017-12-01

    Wilderness areas, defined as areas free of industrial-scale activities and other human pressures which result in significant biophysical disturbance, are important for biodiversity conservation and for sustaining the key ecological processes underpinning planetary life-support systems. Despite their importance, wilderness areas are being rapidly eroded in extent and fragmented. Here we present the most up-to-date temporally inter-comparable maps of global terrestrial wilderness areas, which are essential for monitoring changes in their extent and for proactively planning conservation interventions to ensure their preservation. Using maps of human pressure on the natural environment for 1993 and 2009, we identified wilderness as all 'pressure-free' lands with a contiguous area >10,000 km2. These places are likely operating in a natural state and represent the most intact habitats globally. We then created a regionally representative map of wilderness following the well-established 'Last of the Wild' methodology, which identifies the 10% of land area with the lowest human pressure within each of Earth's 60 biogeographic realms, and retains the ten largest contiguous areas, along with all contiguous areas >10,000 km2.
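    The realm-wise selection rule at the heart of the 'Last of the Wild' methodology (keep the 10% of pixels with the lowest human pressure within each realm) is easy to sketch. The snippet below uses synthetic pressure values and omits the contiguity filter (>10,000 km2) applied in the paper.

    ```python
    import numpy as np

    # 'Last of the Wild'-style selection: within each biogeographic realm,
    # keep the 10% of pixels with the lowest human pressure (synthetic data).
    rng = np.random.default_rng(7)
    pressure = rng.gamma(2.0, 2.0, size=10_000)      # human footprint scores
    realm = rng.integers(0, 3, size=pressure.size)   # 3 hypothetical realms

    wild = np.zeros(pressure.size, dtype=bool)
    for r in np.unique(realm):
        in_realm = realm == r
        cutoff = np.percentile(pressure[in_realm], 10)  # realm-specific threshold
        wild |= in_realm & (pressure <= cutoff)
    print(wild.mean())   # ~0.10 of pixels flagged as 'wild' overall
    ```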

  5. Radical probing of spliceosome assembly.

    PubMed

    Grewal, Charnpal S; Kent, Oliver A; MacMillan, Andrew M

    2017-08-01

    Here we describe the synthesis and use of a directed hydroxyl radical probe, tethered to a pre-mRNA substrate, to map the structure of this substrate during the spliceosome assembly process. These studies indicate an early organization and proximation of conserved pre-mRNA sequences during spliceosome assembly. This methodology may be adapted to the synthesis of a wide variety of modified RNAs for use as probes of RNA structure and RNA-protein interaction. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Functional approximation using artificial neural networks in structural mechanics

    NASA Technical Reports Server (NTRS)

    Alam, Javed; Berke, Laszlo

    1993-01-01

    The artificial neural networks (ANN) methodology is an outgrowth of research in artificial intelligence. In this study, the feed-forward network model that was proposed by Rumelhart, Hinton, and Williams was applied to the mapping of functions that are encountered in structural mechanics problems. Several different network configurations were chosen to train on the available data for problems in materials characterization and structural analysis of plates and shells. By using the recall process, the accuracy of these trained networks was assessed.
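
    As an illustration of the idea, a one-hidden-layer feed-forward network of the Rumelhart-Hinton-Williams type can be trained by backpropagation to approximate a smooth scalar function; the sketch below uses a stand-in function and made-up hyperparameters rather than the study's materials data.

```python
# Minimal one-hidden-layer feed-forward network (numpy) fitted by gradient
# descent to approximate a smooth 1-D function; all settings are illustrative.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 200)[:, None]
y = np.sin(np.pi * x)                      # stand-in "structural response"

W1 = rng.normal(0, 1, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)
lr = 0.05

for step in range(5000):
    h = np.tanh(x @ W1 + b1)               # hidden layer
    yhat = h @ W2 + b2                     # linear output
    err = yhat - y
    # Backpropagation of the mean-squared error
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

print("final MSE:", float((err**2).mean()))   # "recall" accuracy on the data
```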

  7. [Information system for supporting the Nursing Care Systematization].

    PubMed

    Malucelli, Andreia; Otemaier, Kelly Rafaela; Bonnet, Marcel; Cubas, Marcia Regina; Garcia, Telma Ribeiro

    2010-01-01

    The importance, relevance and necessity of implementing the Nursing Care Systematization in the different environments of professional practice are unquestionable. Taking this as a principle, the motivation emerged to develop an information system to support the Nursing Care Systematization, based on the Nursing Process steps and Human Needs, using the languages of nursing diagnoses, interventions and outcomes for the documentation of professional practice. This paper describes the methodological steps and results of the information system development: requirements elicitation, modeling, object-relational mapping, implementation and system validation.
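
    The abstract names object-relational mapping as one of the development steps. As a purely hypothetical illustration of that step (the table and field names below are our invention, not the system's schema), a diagnosis-intervention mapping might look like this in SQLAlchemy:

```python
# Hypothetical object-relational mapping for nursing-process records,
# in the spirit of the paper's ORM step (table and field names are ours).
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()

class Diagnosis(Base):
    __tablename__ = "nursing_diagnosis"
    id = Column(Integer, primary_key=True)
    label = Column(String, nullable=False)          # standardized diagnosis term
    interventions = relationship("Intervention", back_populates="diagnosis")

class Intervention(Base):
    __tablename__ = "nursing_intervention"
    id = Column(Integer, primary_key=True)
    description = Column(String, nullable=False)
    diagnosis_id = Column(Integer, ForeignKey("nursing_diagnosis.id"))
    diagnosis = relationship("Diagnosis", back_populates="interventions")

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
with Session(engine) as session:
    dx = Diagnosis(label="Impaired skin integrity",
                   interventions=[Intervention(description="Reposition every 2 h")])
    session.add(dx)
    session.commit()
```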

  8. Accuracy assessment of maps of forest condition: Statistical design and methodological considerations [Chapter 5]

    Treesearch

    Raymond L. Czaplewski

    2003-01-01

    No thematic map is perfect. Some pixels or polygons are not accurately classified, no matter how well the map is crafted. Therefore, thematic maps need metadata that sufficiently characterize the nature and degree of these imperfections. To decision-makers, an accuracy assessment helps judge the risks of using imperfect geospatial data. To analysts, an accuracy...
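
    The standard machinery behind such an assessment is the error (confusion) matrix, from which overall, user's and producer's accuracies are derived. A minimal sketch with made-up reference and map labels:

```python
# Sketch of a thematic-map accuracy assessment: error matrix, overall
# accuracy and per-class user's/producer's accuracy (labels are made up).
import numpy as np

classes = ["forest", "shrub", "water"]
reference = np.array([0, 0, 1, 2, 1, 0, 2, 1, 0, 2])   # ground reference labels
mapped    = np.array([0, 1, 1, 2, 1, 0, 2, 0, 0, 2])   # map labels at same sites

k = len(classes)
matrix = np.zeros((k, k), dtype=int)
for r, m in zip(reference, mapped):
    matrix[m, r] += 1                                   # rows: map, cols: reference

overall = np.trace(matrix) / matrix.sum()
users = np.diag(matrix) / matrix.sum(axis=1)            # commission-error view
producers = np.diag(matrix) / matrix.sum(axis=0)        # omission-error view
print(matrix, f"overall accuracy = {overall:.2f}", sep="\n")
```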

  9. Cloud-based computation for accelerating vegetation mapping and change detection at regional to national scales

    Treesearch

    Matthew J. Gregory; Zhiqiang Yang; David M. Bell; Warren B. Cohen; Sean Healey; Janet L. Ohmann; Heather M. Roberts

    2015-01-01

    Mapping vegetation and landscape change at fine spatial scales is needed to inform natural resource and conservation planning, but such maps are expensive and time-consuming to produce. For Landsat-based methodologies, mapping efforts are hampered by the daunting task of manipulating multivariate data for millions to billions of pixels. The advent of cloud-based...

  10. 3D Geological Mapping - uncovering the subsurface to increase environmental understanding

    NASA Astrophysics Data System (ADS)

    Kessler, H.; Mathers, S.; Peach, D.

    2012-12-01

    Geological understanding is required by many disciplines studying natural processes, from hydrology to landscape evolution. The subsurface structure of rocks and soils and their properties occupies three-dimensional (3D) space, and geological processes operate in time. Traditionally, geologists have captured their spatial and temporal knowledge in two-dimensional maps, cross-sections and narrative, because paper maps and, later, two-dimensional geographical information systems (GIS) were the only tools available to them. Another major constraint on using more explicit and numerical systems to express geological knowledge is the fact that a geologist only ever observes and measures a fraction of the system under study. Only on rare occasions does the geologist have access to enough real data to generate meaningful predictions of the subsurface without the input of conceptual understanding developed from knowledge of the geological processes responsible for the deposition, emplacement and diagenesis of the rocks. This in turn has led to geology becoming an increasingly marginalised science as other disciplines have embraced the digital world and have increasingly turned to implicit numerical modelling to understand environmental processes and interactions. Recent developments in geoscience methodology and technology have gone some way towards overcoming these barriers, and geologists across the world are beginning to routinely capture their knowledge and combine it with all available subsurface data (often of highly varying spatial distribution and quality) to create regional and national three-dimensional geological maps. This is re-defining the way geologists interact with other science disciplines, as their concepts and knowledge are now expressed in an explicit form that can be used downstream to inform the structure of process models. For example, groundwater modellers can refine their understanding of groundwater flow in three dimensions or even directly parameterize their numerical models using outputs from 3D mapping. In some cases model code is being re-designed in order to deal with the increasing geological complexity expressed by geologists. These 3D maps have inherent uncertainty, just as their predecessors, 2D geological maps, had, and a significant body of work remains to quantify and effectively communicate this uncertainty. Here we present examples of regional and national 3D maps from Geological Survey Organisations worldwide and show how these are being used to better solve real-life environmental problems. The future challenge for geologists is to make these 3D maps easily available in an accessible and interoperable form so that the environmental science community can truly integrate the hidden subsurface into a common understanding of the whole geosphere.

  11. Land cover mapping of the National Park Service northwest Alaska management area using Landsat multispectral and thematic mapper satellite data

    USGS Publications Warehouse

    Markon, C.J.; Wesser, Sara

    1998-01-01

    A land cover map of the National Park Service northwest Alaska management area was produced using digitally processed Landsat data. These and other environmental data were incorporated into a geographic information system to provide baseline information about the nature and extent of resources present in this northwest Alaskan environment. This report details the methodology, depicts vegetation profiles of the surrounding landscape, and describes the different vegetation types mapped. Portions of nine Landsat satellite (multispectral scanner and thematic mapper) scenes were used to produce a land cover map of the Cape Krusenstern National Monument and Noatak National Preserve and to update an existing land cover map of Kobuk Valley National Park. A Bayesian multivariate classifier was applied to the multispectral data sets, followed by the application of ancillary data (elevation, slope, aspect, soils, watersheds, and geology) to enhance the spectral separation of classes into more meaningful vegetation types. The resulting land cover map contains six major land cover categories (forest, shrub, herbaceous, sparse/barren, water, other) and 19 subclasses encompassing 7 million hectares. General narratives of the distribution of the subclasses throughout the project area are given along with vegetation profiles showing common relationships between topographic gradients and vegetation communities.
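
    The 'Bayesian multivariate classifier' applied to the multispectral bands is, in essence, a Gaussian maximum-likelihood classifier. A generic sketch of that classifier family, trained on synthetic spectra rather than the report's Landsat data:

```python
# Generic Gaussian maximum-likelihood classification of multispectral pixels,
# the family of Bayesian multivariate classifier used for such land cover
# maps; the training spectra here are synthetic stand-ins.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(2)
bands = 4
# Synthetic training spectra (reflectance-like values) for three cover types
train = {c: rng.normal(loc=mu, scale=0.1, size=(50, bands))
         for c, mu in {"forest": 0.2, "shrub": 0.5, "water": 0.05}.items()}

# Fit one multivariate normal per class from its training pixels
models = {c: multivariate_normal(X.mean(0), np.cov(X, rowvar=False))
          for c, X in train.items()}

def classify(pixels):
    """Assign each pixel to the class with the highest likelihood."""
    names = list(models)
    ll = np.column_stack([models[c].logpdf(pixels) for c in names])
    return [names[i] for i in ll.argmax(axis=1)]

print(classify(rng.normal(0.2, 0.1, size=(3, bands))))
```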

  12. Participatory GIS in design of the Wroclaw University of Science and Technology campus web map and spatial analysis of campus area quality

    NASA Astrophysics Data System (ADS)

    Blachowski, Jan; Łuczak, Jakub; Zagrodnik, Paulina

    2018-01-01

    Public participation geographic information system (GIS) and participatory mapping data collection methods are means that enhance capacity in generating, managing, and communicating spatial information in various fields ranging from local planning to environmental management. In this study these methods were used in two ways: first, to gather information through a web-based survey on the additional functionality of the campus web map expected by its potential users, i.e. students, staff and visitors; and second, to collect geographically referenced information on campus areas that are liked and disliked in a geo-survey carried out with the ArcGIS Online GeoForm application. The results of the first survey were used to map facilities such as bicycle infrastructure, building entrances, wheelchair-accessible infrastructure and benches. The results of the second were used to analyse the most and least attractive parts of the campus with heat and hot spot analyses in GIS. In addition, the answers were studied with regard to the visual and functional aspects of the campus area raised in the survey. The thematic layers developed from the field mapping and the geoprocessing of geo-survey data were included in the campus web map project. The paper describes the applied methodology of data collection, processing, analysis, interpretation and geovisualisation.
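
    The heat analysis step can be approximated with a kernel density estimate over the geo-survey points. A sketch with random stand-in coordinates (the grid and projected coordinate values are assumptions, not the study's data):

```python
# Sketch of the heat-map step: kernel density of geo-survey points
# (coordinates are random stand-ins for 'liked' campus locations).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
pts = rng.normal(loc=[510_000, 5_660_000], scale=150, size=(200, 2))  # projected m

kde = gaussian_kde(pts.T)                      # density estimate over x, y
xs = np.linspace(pts[:, 0].min(), pts[:, 0].max(), 100)
ys = np.linspace(pts[:, 1].min(), pts[:, 1].max(), 100)
gx, gy = np.meshgrid(xs, ys)
density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)

# The densest cell marks the most 'liked' part of the campus in this sketch
i, j = np.unravel_index(density.argmax(), density.shape)
print("hottest cell at", xs[j], ys[i])
```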

  13. Environmental hazard mapping using GIS and AHP - A case study of Dong Trieu District in Quang Ninh Province, Vietnam

    NASA Astrophysics Data System (ADS)

    Anh, N. K.; Phonekeo, V.; My, V. C.; Duong, N. D.; Dat, P. T.

    2014-02-01

    In recent years, the Vietnamese economy has been growing rapidly, causing a serious decline in environmental quality, especially in industrial and mining areas. This poses an enormous threat to socially sustainable development and to human health. Environmental quality assessment and protection are complex and dynamic processes, since they involve spatial information from multi-sector, multi-region and multi-field sources and require complicated data processing. Therefore, an effective environmental protection information system is needed, in which influential factors hidden in complex relationships become clear and visible. In this paper, the authors present the methodology used to generate environmental hazard maps by integrating the Analytic Hierarchy Process (AHP) with a Geographical Information System (GIS). We demonstrate the results obtained for the study area in Dong Trieu district. This research contributes an overall perspective of environmental quality and identifies the degraded areas where the administration urgently needs to establish appropriate policy to improve and protect the environment.
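
    At the heart of AHP is the derivation of factor weights from a pairwise comparison matrix via its principal eigenvector, followed by a consistency check. A sketch with an invented 3x3 comparison of hazard factors:

```python
# AHP weight derivation from a pairwise comparison matrix via the principal
# eigenvector, with the standard consistency-ratio check; the 3x3 matrix of
# hazard factors is a made-up example.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # e.g. proximity to mines vs. land use vs. slope
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = eigvals.real.argmax()
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                       # priority weights for the factor layers

n = A.shape[0]
ci = (eigvals.real.max() - n) / (n - 1)
cr = ci / 0.58                     # Saaty's random index for n = 3
print("weights:", w.round(3), "consistency ratio:", round(cr, 3))

# A CR below 0.1 is conventionally acceptable; the hazard map is then the
# GIS weighted overlay: sum_i w[i] * normalized_factor_layer[i]
```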

  14. GIS-mapping of environmental assessment of the territories in the region of intense activity for the oil and gas complex for achievement the goals of the Sustainable Development (on the example of Russia)

    NASA Astrophysics Data System (ADS)

    Yermolaev, Oleg

    2014-05-01

    A uniform system of complex scientific-reference ecological-geographical mapping should serve as the basis for implementing the Sustainable Development (SD) concept in the territories of the Russian Federation subjects or individual regions. In this case, the assessment of the ecological situation in the regions can be addressed by conjoining two interrelated systems: the cartographic and the geoinformational. The report discusses the methodological aspects of Atlas mapping for the purposes of SD in the regions of Russia. The Republic of Tatarstan is viewed as a model territory, where the large-scale oil and gas company "Tatneft" PLC operates. The company has functioned for more than 60 years. Oil fields occupy an area of more than 38,000 km2; about 40,000 oil wells and more than 55,000 km of pipelines are placed in this territory; more than 3 billion tons of oil have been extracted. Approaches to the structure of, and requirements for, the Atlas's content are outlined. The approaches to mapping "an ecological dominant" of SD are conceptually substantiated following the pattern of a large region of Russia. Several thematic mapping themes are distinguished in the Atlas's structure: • The background history of oil-field mine workings; • Nature preservation technologies in oil extraction; • The assessment of natural conditions for human vital activity; • Unfavorable and dangerous natural processes and phenomena; • Anthropogenic effects and changes in the environmental surroundings; • Social-economic processes and phenomena; • Medical-ecological and geochemical processes and phenomena. Within these groups numerous further subgroups can be distinguished. The maps of unfavorable and dangerous processes and phenomena are subdivided in accordance with the types of processes, of endogenous and exogenous origin. Among the maps of anthropogenic effects on the natural surroundings, one can differentiate maps of the influence on different spheres of nature: atmosphere, hydrosphere, lithosphere, biosphere, etc. In this way, all thematic groups are brought together into four main sections: • The introduction (maps of the general condition and social-economic state, and the region's rating in the Republic); • The components of natural and social-economic systems that form the conditions for the ecological situation; • Integrated maps of the exertion and change of the environment; • The strategy to reach an ecological equilibrium. The scale of the Atlas is confirmed by the following data: more than 200 electronic analytical, complex and synthetic maps; more than 1,000 small river basins; 6,000 landscape areas; 500 anthropogenic pollution sources, etc. The extensive information and the richness and diversity of the maps' content, together with the objective indices used in the maps, open wide opportunities to apply different methods of cartographic analysis, from conventional visual interpretation to graphical constructions, cartometry and statistical data treatment. The methods of mathematical mapping and computer modeling make it possible to compute spatial correlations and the mutual conformity of phenomena, to estimate the homogeneity of ecological conditions, and to reveal the leading factors of the distribution and development of phenomena and processes using multidimensional statistical analysis.

  15. Unsupervised Detection of Planetary Craters by a Marked Point Process

    NASA Technical Reports Server (NTRS)

    Troglio, G.; Benediktsson, J. A.; Le Moigne, J.; Moser, G.; Serpico, S. B.

    2011-01-01

    With the launch of several planetary missions in the last decade, a large number of planetary images are being acquired. Automatic and robust processing techniques are preferable for data analysis because of the huge volume of acquired data. Here, the aim is to achieve a robust and general methodology for crater detection. A novel technique based on a marked point process is proposed. First, the contours in the image are extracted. The object boundaries are modeled as a configuration of an unknown number of random ellipses, i.e., the contour image is considered a realization of a marked point process. Then, an energy function is defined, containing both an a priori energy and a likelihood term. The global minimum of this function is estimated by using reversible-jump Markov chain Monte Carlo dynamics and a simulated annealing scheme. The main idea behind marked point processes is to model objects within a stochastic framework: marked point processes represent a very promising current approach in stochastic image modeling and provide a powerful and methodologically rigorous framework to efficiently map and detect objects and structures in an image with excellent robustness to noise. The proposed method for crater detection has several feasible applications. One such application area is image registration by matching the extracted features.
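
    To make the idea concrete, the sketch below is a heavily simplified, didactic analogue of the approach: circles stand in for the ellipse marks, and a birth/death/perturb sampler with simulated annealing minimizes an energy combining an object-count prior with a contour-overlap likelihood. It is not the paper's RJMCMC implementation.

```python
# Toy marked-point-process detection: a configuration of circles is sampled
# with birth/death/perturb proposals under simulated annealing, minimizing
# an energy that rewards overlap with contour pixels and penalizes extra
# objects. Everything here is a didactic stand-in for the real detector.
import math, random
import numpy as np

# Synthetic contour image: one circular crater rim
H = W = 64
yy, xx = np.mgrid[:H, :W]
rim = np.abs(np.hypot(xx - 32, yy - 30) - 10) < 1.0

def fit(c):
    """Fraction of a circle's perimeter lying on contour pixels."""
    cx, cy, r = c
    t = np.linspace(0, 2 * math.pi, 60)
    px = np.clip((cx + r * np.cos(t)).astype(int), 0, W - 1)
    py = np.clip((cy + r * np.sin(t)).astype(int), 0, H - 1)
    return rim[py, px].mean()

def energy(cfg):
    # a-priori term: each object costs 0.5; likelihood: rewards rim overlap
    return 0.5 * len(cfg) - sum(fit(c) for c in cfg)

random.seed(0)
cfg, T = [], 1.0
for step in range(4000):
    new = [list(c) for c in cfg]
    move = random.random()
    if move < 0.3 or not new:                       # birth of a new circle
        new.append([random.uniform(5, W - 5), random.uniform(5, H - 5),
                    random.uniform(5, 15)])
    elif move < 0.5:                                # death of a random circle
        new.pop(random.randrange(len(new)))
    else:                                           # perturb one parameter
        c = random.choice(new)
        c[random.randrange(3)] += random.gauss(0, 1)
    dE = energy(new) - energy(cfg)
    if dE < 0 or random.random() < math.exp(-dE / T):
        cfg = new                                   # Metropolis acceptance
    T *= 0.999                                      # annealing schedule

print("detected:", [[round(v, 1) for v in c] for c in cfg])
```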

  16. Mycobacterium avium subspecies paratuberculosis causes Crohn's disease in some inflammatory bowel disease patients.

    PubMed

    Naser, Saleh A; Sagramsingh, Sudesh R; Naser, Abed S; Thanigachalam, Saisathya

    2014-06-21

    Crohn's disease (CD) is a chronic inflammatory condition that plagues millions all over the world. This debilitating bowel disease can start in early childhood and continue into late adulthood. Signs and symptoms are usually many, and multiple tests are often required for the diagnosis and confirmation of this disease. However, little is still understood about the cause(s) of CD. As a result, several theories have been proposed over the years. One theory in particular is that Mycobacterium avium subspecies paratuberculosis (MAP) is intimately linked to the etiology of CD. This fastidious bacterium, also known to cause Johne's disease in cattle, has infected the intestines of animals for years. It is believed that, due to its thick, waxy cell wall, MAP is able to survive the process of pasteurization as well as the chemical processes used in irrigation purification systems. Subsequently, meat, dairy products and water serve as key vehicles in the transmission of MAP infection to humans (from farm to fork) who have a genetic predisposition, thus leading to the development of CD. The challenges faced in culturing this bacterium from CD patients are many. Examples include its extremely slow growth, lack of cell wall, low abundance, and its mycobactin dependency. In this review article, data from 60 studies showing the detection and isolation of MAP by PCR and culture techniques have been reviewed. Although this review may not be 100% comprehensive of all studies, clearly the majority of the studies overwhelmingly and definitively support the role of MAP in at least 30%-50% of CD patients. It is very possible that the lack of detection of MAP in some CD patients may be due to the absence of a role for MAP in those patients. The latter statement is conditional on the utilization of methodology appropriate for the detection of human MAP strains. Ultimately, stratification of CD and inflammatory bowel disease patients by the presence or absence of MAP is necessary for appropriate and effective treatment, which may lead to a cure.

  17. Mapping as a learning strategy in health professions education: a critical analysis.

    PubMed

    Pudelko, Beatrice; Young, Meredith; Vincent-Lamarre, Philippe; Charlin, Bernard

    2012-12-01

    Mapping is a means of representing knowledge in a visual network and is becoming more commonly used as a learning strategy in medical education. The assumption driving the development and use of concept mapping is that it supports and furthers meaningful learning. The goal of this paper was to examine the effectiveness of concept mapping as a learning strategy in health professions education. The authors conducted a critical analysis of recent literature on the use of concept mapping as a learning strategy in the area of health professions education. Among the 65 studies identified, 63% were classified as empirical work, the majority (76%) of which used pre-experimental designs. Only 24% of empirical studies assessed the impact of mapping on meaningful learning. Results of the analysis do not support the hypothesis that mapping per se furthers and supports meaningful learning, memorisation or factual recall. When documented improvements in learning were found, they often occurred when mapping was used in concert with other strategies, such as collaborative learning or instructor modelling, scaffolding and feedback. Current empirical research on mapping as a learning strategy presents methodological shortcomings that limit its internal and external validity. The results of our analysis indicate that mapping strategies that make use of feedback and scaffolding have beneficial effects on learning. Accordingly, we see a need to expand the process of reflection on the characteristics of representational guidance as it is provided by mapping techniques and tools based on field of knowledge, instructional objectives, and the characteristics of learners in health professions education. © Blackwell Publishing Ltd 2012.

  18. Finding common ground in large carnivore conservation: mapping contending perspectives

    USGS Publications Warehouse

    Mattson, D.J.; Byrd, K.L.; Rutherford, M.B.; Brown, S.R.; Clark, T.W.

    2006-01-01

    Reducing current conflict over large carnivore conservation and designing effective strategies that enjoy broad public support depend on a better understanding of the values, beliefs, and demands of those who are involved or affected. We conducted a workshop attended by diverse participants involved in conservation of large carnivores in the northern U.S. Rocky Mountains, and used Q methodology to elucidate participant perspectives regarding "problems" and "solutions". Q methodology employs qualitative and quantitative techniques to reveal the subjectivity in any situation. We identified four general perspectives for both problems and solutions, three of which (Carnivore Advocates, Devolution Advocates, and Process Reformers) were shared by participants across domains. Agency Empathizers (problems) and Economic Pragmatists (solutions) were not clearly linked. Carnivore and Devolution Advocates expressed diametrically opposed perspectives that legitimized different sources of policy-relevant information ("science" for Carnivore Advocates and "local knowledge" for Devolution Advocates). Despite differences, we identified potential common ground focused on respectful, persuasive, and creative processes that would build understanding and tolerance. © 2006 Elsevier Ltd. All rights reserved.

  19. Computational Acoustic Beamforming for Noise Source Identification for Small Wind Turbines.

    PubMed

    Ma, Ping; Lien, Fue-Sang; Yee, Eugene

    2017-01-01

    This paper develops a computational acoustic beamforming (CAB) methodology for identification of sources of small wind turbine noise. This methodology is validated using the case of the NACA 0012 airfoil trailing edge noise. For this validation case, the predicted acoustic maps were in excellent conformance with the results of the measurements obtained from the acoustic beamforming experiment. Following this validation study, the CAB methodology was applied to the identification of noise sources generated by a commercial small wind turbine. The simulated acoustic maps revealed that the blade tower interaction and the wind turbine nacelle were the two primary mechanisms for sound generation for this small wind turbine at frequencies between 100 and 630 Hz.
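
    The classical building block behind acoustic source maps of this kind is the delay-and-sum beamformer: candidate directions are scanned, and the steering that maximizes the array output power indicates the source. A frequency-domain sketch with an invented 8-microphone array and a simulated plane wave (not the paper's CAB pipeline):

```python
# Minimal delay-and-sum acoustic beamforming sketch: steer a small microphone
# array over candidate directions and pick the one maximizing output power.
# Geometry, source and sample rate are illustrative, not the paper's setup.
import numpy as np

c, fs = 343.0, 48_000                      # speed of sound (m/s), sample rate
mics = np.array([[i * 0.05, 0.0] for i in range(8)])   # 8-mic linear array

# Simulate a 1 kHz plane wave arriving from 40 degrees
true_th = np.deg2rad(40)
t = np.arange(2048) / fs
delays = mics @ np.array([np.cos(true_th), np.sin(true_th)]) / c
signals = np.stack([np.sin(2 * np.pi * 1000 * (t - d)) for d in delays])

def power(theta):
    """Delay-and-sum output power for steering angle theta (phase shifts)."""
    d = mics @ np.array([np.cos(theta), np.sin(theta)]) / c
    spectra = np.fft.rfft(signals, axis=1)
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    aligned = spectra * np.exp(2j * np.pi * freqs * d[:, None])
    return float(np.abs(aligned.sum(axis=0)).sum())

angles = np.deg2rad(np.arange(0, 181, 1))
best = angles[np.argmax([power(a) for a in angles])]
print("estimated source direction:", round(np.degrees(best), 1), "deg")
```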

  20. Harmonization of Multiple Forest Disturbance Data to Create a 1986-2011 Database for the Conterminous United States

    NASA Astrophysics Data System (ADS)

    Soulard, C. E.; Acevedo, W.; Yang, Z.; Cohen, W. B.; Stehman, S. V.; Taylor, J. L.

    2015-12-01

    A wide range of spatial forest disturbance data exist for the conterminous United States, yet inconsistencies between map products arise because of differing programmatic objectives and methodologies. Researchers on the Land Change Research Project (LCRP) are working to assess spatial agreement, characterize uncertainties, and resolve discrepancies between these national level datasets, in regard to forest disturbance. Disturbance maps from the Global Forest Change (GFC), Landfire Vegetation Disturbance (LVD), National Land Cover Dataset (NLCD), Vegetation Change Tracker (VCT), Web-enabled Landsat Data (WELD), and Monitoring Trends in Burn Severity (MTBS) were harmonized using a pixel-based data fusion process. The harmonization process reconciled forest harvesting, forest fire, and remaining forest disturbance across four intervals (1986-1992, 1992-2001, 2001-2006, and 2006-2011) by relying on convergence of evidence across all datasets available for each interval. Pixels with high agreement across datasets were retained, while moderate-to-low agreement pixels were visually assessed and either manually edited using reference imagery or discarded from the final disturbance map(s). National results show that annual rates of forest harvest and overall fire have increased over the past 25 years. Overall, this study shows that leveraging the best elements of readily-available data improves forest loss monitoring relative to using a single dataset to monitor forest change, particularly by reducing commission errors.
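
    The pixel-based convergence-of-evidence step can be pictured as a per-pixel vote across the stacked products. In the sketch below the product list echoes the abstract, but the data and the agreement thresholds are our assumptions:

```python
# Sketch of pixel-wise 'convergence of evidence' fusion: stack disturbance
# flags from several products and keep pixels where enough products agree.
# Product names echo the abstract; the data and thresholds are ours.
import numpy as np

rng = np.random.default_rng(4)
products = ["GFC", "LVD", "NLCD", "VCT", "WELD", "MTBS"]
stack = rng.random((len(products), 200, 200)) < 0.1   # binary disturbance maps

votes = stack.sum(axis=0)                 # per-pixel agreement count
high = votes >= 4                         # high agreement: retain automatically
review = (votes >= 2) & (votes < 4)       # moderate: flag for visual assessment

print(f"retained {high.sum()} px, {review.sum()} px queued for manual review")
```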

  1. Event-Based Tone Mapping for Asynchronous Time-Based Image Sensor

    PubMed Central

    Simon Chane, Camille; Ieng, Sio-Hoi; Posch, Christoph; Benosman, Ryad B.

    2016-01-01

    The asynchronous time-based neuromorphic image sensor ATIS is an array of autonomously operating pixels able to encode luminance information with an exceptionally high dynamic range (>143 dB). This paper introduces an event-based methodology to display data from this type of event-based imager, taking into account the large dynamic range and high temporal accuracy that go beyond available mainstream display technologies. We introduce an event-based tone mapping methodology for asynchronously acquired time-encoded gray-level data. A global and a local tone mapping operator are proposed. Both are designed to operate on a stream of incoming events rather than on time frame windows. Experimental results on real outdoor scenes are presented to evaluate the performance of the tone mapping operators in terms of quality, temporal stability, adaptation capability, and computational time. PMID:27642275
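
    In spirit, a global event-based tone mapping operator updates one display pixel per incoming event from a running estimate of the scene's log-intensity range, instead of processing whole frames. The sketch below is a simplified, assumption-laden illustration, not the ATIS pipeline; a real operator would also refresh previously mapped pixels when the range adapts.

```python
# Sketch of a global event-based tone mapper: each incoming (x, y, intensity)
# event is mapped to an 8-bit display value from a running log-intensity
# range, updating only the changed pixel rather than whole frames. The event
# format and adaptation rule are simplified assumptions, not the ATIS spec.
import math

lo, hi = math.inf, -math.inf              # running log-intensity bounds
display = {}                              # sparse display buffer: (x, y) -> 0..255

def on_event(x, y, intensity):
    """Update the tone-mapped display pixel for one luminance event."""
    global lo, hi
    v = math.log10(intensity)
    lo, hi = min(lo, v), max(hi, v)       # adapt the global operator's range
    span = (hi - lo) or 1.0
    display[(x, y)] = round(255 * (v - lo) / span)

for ev in [(3, 7, 120.0), (4, 7, 0.5), (3, 8, 50_000.0)]:   # >140 dB apart
    on_event(*ev)
print(display)
```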

  2. Towards Web-based representation and processing of health information

    PubMed Central

    Gao, Sheng; Mioc, Darka; Yi, Xiaolun; Anton, Francois; Oldfield, Eddie; Coleman, David J

    2009-01-01

    Background There is great concern within health surveillance about how to grapple with environmental degradation, rapid urbanization, and population mobility and growth. The Internet has emerged as an efficient way to share health information, enabling users to access and understand data at their fingertips. Increasingly complex problems in the health field require increasingly sophisticated computer software, distributed computing power, and standardized data sharing. To address this need, Web-based mapping is now emerging as an important tool to enable health practitioners, policy makers, and the public to understand spatial health risks, population health trends and vulnerabilities. Today several web-based health applications generate dynamic maps; however, for people to fully interpret the maps they need a description of the data sources and of the methods used in the data analysis or statistical modeling. For the representation of health information through Web-mapping applications, there is still no standard format to accommodate all fixed (such as location) and variable (such as age, gender, health outcome, etc.) indicators. Furthermore, net-centric computing has not been adequately applied to support flexible health data processing and mapping online. Results The authors of this study designed a HEalth Representation XML (HERXML) schema that consists of the semantic (e.g., health activity description, data source description, the statistical methodology used for analysis), geometric, and cartographical representations of health data. A case study was carried out on the development of web applications and services within the Canadian Geospatial Data Infrastructure (CGDI) framework for community health programs of the New Brunswick Lung Association. This study facilitated the online processing, mapping and sharing of health information, with the use of HERXML and Open Geospatial Consortium (OGC) services. It provided a new solution for better health data representation and an initial exploration of the Web-based processing of health information. Conclusion The designed HERXML has proven to be an appropriate solution for supporting the Web representation of health information. It can be used by health practitioners, policy makers, and the public in disease etiology, health planning, health resource management, health promotion and health education. The utilization of Web-based processing services in this study provides a flexible way for users to select and use certain processing functions for health data processing and mapping via the Web. This research provides easy access to geospatial and health data for understanding the trends of diseases, and promotes the growth and enrichment of the CGDI in the public health sector. PMID:19159445
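
    To give a flavour of what such a schema separates, here is a purely hypothetical HERXML-style record built with Python's ElementTree; the three facets follow the abstract, but every element and attribute name is our invention:

```python
# Hypothetical HERXML-style record built with ElementTree: the schema's
# three facets (semantic, geometric, cartographic) are taken from the
# abstract, but every element and attribute name below is our invention.
import xml.etree.ElementTree as ET

record = ET.Element("HealthRecord")
sem = ET.SubElement(record, "Semantic")
ET.SubElement(sem, "Activity").text = "Asthma clinic attendance"
ET.SubElement(sem, "DataSource").text = "New Brunswick Lung Association"
ET.SubElement(sem, "Statistic", method="age-standardized rate")

geo = ET.SubElement(record, "Geometric")
ET.SubElement(geo, "Point", lat="45.96", lon="-66.64")   # community centroid

carto = ET.SubElement(record, "Cartographic")
ET.SubElement(carto, "Style", fill="#cc0000", classes="5")

print(ET.tostring(record, encoding="unicode"))
```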

  3. The use of Sentinel-2 imagery for seagrass mapping: Kalloni Gulf (Lesvos Island, Greece) case study

    NASA Astrophysics Data System (ADS)

    Topouzelis, Konstantinos; Charalampis Spondylidis, Spyridon; Papakonstantinou, Apostolos; Soulakellis, Nikolaos

    2016-08-01

    Seagrass meadows play a significant role in ecosystems by stabilizing sediment and improving water clarity, which enhances seagrass growing conditions. Mapping and protecting them is high on the priority list of EU legislation. The traditional use of medium spatial resolution satellite imagery, e.g. Landsat-8 (30 m), is very useful for mapping seagrass meadows on a regional scale. However, the availability of Sentinel-2 data, ESA's recent satellite carrying the Multi-Spectral Instrument (MSI), is expected to improve mapping accuracy. The MSI is designed to support coastline studies thanks to its enhanced spatial and spectral capabilities, e.g. optical bands with 10 m spatial resolution. The present work examines the quality of Sentinel-2 images for seagrass mapping, assesses the ability of each band to detect and discriminate different habitats, and estimates the accuracy of seagrass mapping. After pre-processing steps, e.g. radiometric calibration and atmospheric correction, the image was classified into four classes describing the sub-bottom composition, e.g. seagrass, soft bottom, and hard bottom. Vectors delineating the areas covered by seagrass were extracted from a high-resolution satellite image and used as in situ reference data. The developed methodology was applied in the Gulf of Kalloni (Lesvos Island, Greece). Results showed that Sentinel-2 images can be used robustly for seagrass mapping due to their spatial resolution, band availability and radiometric accuracy.

  4. [Meanings and methods of territorialization in primary health care].

    PubMed

    Pessoa, Vanira Matos; Rigotto, Raquel Maria; Carneiro, Fernando Ferreira; Teixeira, Ana Cláudia de Araújo

    2013-08-01

    Territorially-based participative analytical methodologies that take the environmental question and work into consideration are essential for effective primary health care. The study analyzed work- and environment-related processes in the primary health care area and their repercussions on the health of workers and the community in a rural municipality in Ceará whose economy is based on agriculture for export. It sought to recover the territory and to propose actions focused on health needs identified by the social subjects through the making of social, environmental and work-related maps in workshops within the framework of action research. Examining the situation from a critical perspective, based on social participation and the social determination of the health-disease process with regard to the relations between production, environment and health, was the most important step in the participative map-making process, with the qualitative material interpreted in light of discourse analysis. The process helped identify health needs, supported the recovery of the territory, strengthened cooperation between sectors and the tie between workers' health and that of the environment, and represented an advance towards the eradication of the causes of poor primary health care services.

  5. Stalking Higher Energy Conformers on the Potential Energy Surface of Charged Species.

    PubMed

    Brites, Vincent; Cimas, Alvaro; Spezia, Riccardo; Sieffert, Nicolas; Lisy, James M; Gaigeot, Marie-Pierre

    2015-03-10

    Combined theoretical DFT-MD and RRKM methodologies and experimental spectroscopic infrared predissociation (IRPD) strategies to map potential energy surfaces (PES) of complex ionic clusters are presented, providing lowest and high energy conformers, thresholds to isomerization, and cluster formation pathways. We believe this association not only represents a significant advance in the field of mapping minima and transition states on the PES but also directly measures dynamical pathways for the formation of structural conformers and isomers. Pathways are unraveled over picosecond (DFT-MD) and microsecond (RRKM) time scales while changing the amount of internal energy is experimentally achieved by changing the loss channel for the IRPD measurements, thus directly probing different kinetic and isomerization pathways. Demonstration is provided for Li(+)(H2O)3,4 ionic clusters. Nonstatistical formation of these ionic clusters by both direct and cascade processes, involving isomerization processes that can lead to trapping of high energy conformers along the paths due to evaporative cooling, has been unraveled.

  6. The Concept Maps as a Didactic Resource Tool of Meaningful Learning in Astronomy Themes

    NASA Astrophysics Data System (ADS)

    Silveira, Felipa Pacífico Ribeiro de Assis; Mendonça, Conceição Aparecida Soares

    2015-07-01

    This article presents the results of an investigation that sought to understand the performance of the concept map (CM) as a didactic resource facilitating meaningful learning of scientific concepts on astronomical themes, developed with elementary school students. The methodology employed to obtain and process the data was based on a quantitative and qualitative approach. On the quantitative level we designed a quasi-experimental study with a control group that did not use CMs and an experimental group that did, both evaluated at the beginning and end of the process. In this case, the performance of both groups is displayed in a descriptive and analytical study. In the qualitative approach, the CMs were interpreted through their structure and the meanings assigned and shared by the student during his/her presentation. The results, demonstrated through improved marks, show that the CM made a difference in conceptual learning and in certain skills revealed by learning indicators.

  7. Fast gene ontology based clustering for microarray experiments.

    PubMed

    Ovaska, Kristian; Laakso, Marko; Hautaniemi, Sampsa

    2008-11-21

    Analysis of a microarray experiment often results in a list of hundreds of disease-associated genes. In order to suggest common biological processes and functions for these genes, Gene Ontology (GO) annotations with statistical testing are widely used. However, these analyses can produce a very large number of significantly altered biological processes. Thus, it is often challenging to interpret GO results and identify novel testable biological hypotheses. We present fast software for advanced gene annotation that uses semantic similarity of Gene Ontology terms combined with clustering and heat map visualisation. The methodology allows rapid identification of genes sharing the same Gene Ontology cluster. Our R-based open-source semantic similarity package has a speed advantage of over 2000-fold compared to existing implementations. From the resulting hierarchical clustering dendrogram, genes sharing a GO term can be identified, and their differences in gene expression patterns can be seen from the heat map. These methods facilitate advanced annotation of genes resulting from data analysis.
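
    The clustering step itself is conventional hierarchical clustering on a gene-by-gene semantic-similarity matrix. A sketch with an invented similarity matrix standing in for the package's GO-derived one:

```python
# Sketch of the clustering step: hierarchical clustering of genes from a
# (made-up) GO semantic-similarity matrix, as a stand-in for the paper's
# R package; real use would compute similarities from GO annotations.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

genes = ["TP53", "BRCA1", "MYC", "EGFR", "KRAS"]
sim = np.array([[1.0, 0.8, 0.3, 0.2, 0.3],
                [0.8, 1.0, 0.4, 0.3, 0.2],
                [0.3, 0.4, 1.0, 0.7, 0.6],
                [0.2, 0.3, 0.7, 1.0, 0.8],
                [0.3, 0.2, 0.6, 0.8, 1.0]])   # pretend GO semantic similarity

dist = squareform(1.0 - sim, checks=False)    # condensed distance matrix
tree = linkage(dist, method="average")        # dendrogram behind the heat map
for g, c in zip(genes, fcluster(tree, t=2, criterion="maxclust")):
    print(g, "-> cluster", c)
```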

  8. On Robust Methodologies for Managing Public Health Care Systems

    PubMed Central

    Nimmagadda, Shastri L.; Dreher, Heinz V.

    2014-01-01

    The authors focus on ontology-based multidimensional data warehousing and mining methodologies, addressing various issues in organizing, reporting and documenting diabetic cases and their associated ailments, including causalities. Map and other diagnostic data views, depicting similarity and comparison of attributes extracted from warehouses, are used for understanding the ailments based on gender, age, geography, food habits and other hereditary event attributes. In addition to rigorous data mining and visualization, an added focus is on the interpretive value of data views drawn from fully processed diagnoses and the subsequent prescription of appropriate medications. The proposed methodology is a robust back-end application for web-based patient-doctor consultations and e-health care management systems, through which billions of dollars spent on medical services can be saved, in addition to improving the quality of life and average life span of a person. Government health departments and agencies, private and government medical practitioners, and social welfare organizations are typical users of these systems. PMID:24445953

  9. Improving Efficiency Using Time-Driven Activity-Based Costing Methodology.

    PubMed

    Tibor, Laura C; Schultz, Stacy R; Menaker, Ronald; Weber, Bradley D; Ness, Jay; Smith, Paula; Young, Phillip M

    2017-03-01

    The aim of this study was to increase efficiency in MR enterography using a time-driven activity-based costing methodology. In February 2015, a multidisciplinary team was formed to identify the personnel, equipment, space, and supply costs of providing outpatient MR enterography. The team mapped the current state, completed observations, performed timings, and calculated costs associated with each element of the process. The team used Pareto charts to understand the highest cost and most time-consuming activities, brainstormed opportunities, and assessed impact. Plan-do-study-act cycles were developed to test the changes, and run charts were used to monitor progress. The process changes consisted of revising the workflow associated with the preparation and administration of glucagon, with completed implementation in November 2015. The time-driven activity-based costing methodology allowed the radiology department to develop a process to more accurately identify the costs of providing MR enterography. The primary process modification was reassigning responsibility for the administration of glucagon from nurses to technologists. After implementation, the improvements demonstrated success by reducing non-value-added steps and cost by 13%, staff time by 16%, and patient process time by 17%. The saved process time was used to augment existing examination time slots to more accurately accommodate the entire enterographic examination. Anecdotal comments were captured to validate improved staff satisfaction within the multidisciplinary team. This process provided a successful outcome to address daily workflow frustrations that could not previously be improved. A multidisciplinary team was necessary to achieve success, in addition to the use of a structured problem-solving approach. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
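
    The arithmetic underlying TDABC is simple: each activity's cost is its performer's capacity cost rate multiplied by the activity's duration, summed along the mapped process. A sketch with invented rates and times (not the study's figures) shows why shifting an activity to a lower-rate role reduces cost:

```python
# Time-driven activity-based costing in miniature: cost per activity equals
# (capacity cost rate) x (activity time). Staff rates and times are invented
# to illustrate the mechanics, not the study's actual figures.
rate_per_min = {"nurse": 1.20, "technologist": 0.90, "radiologist": 6.50}
process = [("prep glucagon", "nurse", 12),
           ("administer glucagon", "technologist", 4),
           ("scan", "technologist", 45),
           ("interpret", "radiologist", 15)]

total = 0.0
for activity, role, minutes in process:
    cost = rate_per_min[role] * minutes
    total += cost
    print(f"{activity:22s} {role:13s} {minutes:3d} min  ${cost:6.2f}")
print(f"{'total':40s}${total:6.2f}")
```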

  10. Cognitive mapping tools: review and risk management needs.

    PubMed

    Wood, Matthew D; Bostrom, Ann; Bridges, Todd; Linkov, Igor

    2012-08-01

    Risk managers are increasingly interested in incorporating stakeholder beliefs and other human factors into the planning process. Effective risk assessment and management requires understanding perceptions and beliefs of involved stakeholders, and how these beliefs give rise to actions that influence risk management decisions. Formal analyses of risk manager and stakeholder cognitions represent an important first step. Techniques for diagramming stakeholder mental models provide one tool for risk managers to better understand stakeholder beliefs and perceptions concerning risk, and to leverage this new understanding in developing risk management strategies. This article reviews three methodologies for assessing and diagramming stakeholder mental models--decision-analysis-based mental modeling, concept mapping, and semantic web analysis--and assesses them with regard to their ability to address risk manager needs. © 2012 Society for Risk Analysis.

  11. Enhancing Collaborative and Meaningful Language Learning through Concept Mapping

    NASA Astrophysics Data System (ADS)

    de Cássia Veiga Marriott, Rita; Torres, Patrícia Lupion

    This chapter aims to investigate new ways of foreign-language teaching/learning via a study of how concept mapping can help develop a student's reading, writing and oral skills as part of a blended methodology for language teaching known as LAPLI (Laboratorio de Aprendizagem de LInguas: The Language Learning Lab). LAPLI is a student-centred and collaborative methodology which encourages students to challenge their limitations and expand their current knowledge whilst developing their linguistic and interpersonal skills. We explore the theories that underpin LAPLI and detail the 12 activities comprising its programme, with specific reference to the use of “concept mapping”. An innovative table enabling a formative and summative assessment of the concept maps is formulated. Also presented are some of the qualitative and quantitative results achieved when this methodology was first implemented with a group of pre-service students studying for a degree in English and Portuguese languages at the Catholic University of Parana (PUCPR) in Brazil. The contribution of concept mapping and LAPLI to an understanding of language learning, along with a consideration of the difficulties encountered in its implementation with student groups, is discussed, and suggestions are made for future research.

  12. The multi-copy simultaneous search methodology: a fundamental tool for structure-based drug design.

    PubMed

    Schubert, Christian R; Stultz, Collin M

    2009-08-01

    Fragment-based ligand design approaches, such as the multi-copy simultaneous search (MCSS) methodology, have proven to be useful tools in the search for novel therapeutic compounds that bind pre-specified targets of known structure. MCSS offers a variety of advantages over more traditional high-throughput screening methods, and has been applied successfully to challenging targets. The methodology is quite general and can be used to construct functionality maps for proteins, DNA, and RNA. In this review, we describe the main aspects of the MCSS method and outline the general use of the methodology as a fundamental tool to guide the design of de novo lead compounds. We focus our discussion on the evaluation of MCSS results and the incorporation of protein flexibility into the methodology. In addition, we demonstrate on several specific examples how the information arising from the MCSS functionality maps has been successfully used to predict ligand binding to protein targets and RNA.

  13. Mapping and monitoring smallholder agriculture in Tigray, Ethiopia using sub-meter WorldView and Landsat imagery and high performance computing.

    NASA Astrophysics Data System (ADS)

    Carroll, M.; McCarty, J. L.; Neigh, C. S. R.; Wooten, M.

    2016-12-01

    Very high resolution (VHR) satellite data are experiencing rapid annual growth, producing petabytes of remotely sensed data per year. The WorldView constellation, operated by DigitalGlobe, images over 1.2 billion km2 annually at < 2 m spatial resolution. Due to computation, data cost, and methodological concerns, VHR satellite data have mainly been used to produce needed geospatial information for site-specific phenomena. This project produced a VHR spatiotemporally explicit wall-to-wall cropland area map for the rainfed residential cropland mosaic of the Tigray Region, Ethiopia, which consists entirely of smallholder farms. Moderate resolution satellite data are not adequate in spatial or temporal resolution to capture the total area occupied by smallholder farms, i.e., farms with agricultural fields of ≤ 45 × 45 m in dimension. In order to accurately map smallholder crop area over a large region, hundreds of VHR images spanning two or more years are needed. Sub-meter WorldView-1 and WorldView-2 segmentation results were combined with median phenology amplitude from Landsat 8 data. VHR WorldView-1 and -2 data were obtained from the U.S. National Geospatial-Intelligence Agency (NGA) Commercial Archive Data at NASA Goddard Space Flight Center (GSFC) (http://cad4nasa.gsfc.nasa.gov/). Over 2700 scenes were processed from raw imagery to completed crop map in one week using a semi-automated method on the large computing capacity of the Advanced Data Analytics Platform (ADAPT) at NASA GSFC's NCCS (http://www.nccs.nasa.gov/services/adapt). This methodology is extensible to any land cover type and can easily be expanded to run on much larger regions.

  14. A methodology for the assessment of flood hazards at the regional scale

    NASA Astrophysics Data System (ADS)

    Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Zabeo, Alex; Semenzin, Elena; Marcomini, Antonio

    2013-04-01

    In recent years, the frequency of water-related disasters has increased, and recent flood events in Europe (e.g. 2002 in Central Europe, 2007 in the UK, 2010 in Italy) caused physical-environmental and socio-economic damage. Specifically, floods are the most threatening water-related disaster that affects humans, their lives and properties. Within the KULTURisk project (FP7) a Regional Risk Assessment (RRA) methodology is proposed to evaluate the benefits of risk prevention in terms of reduced environmental risks due to floods. The method is based on the KULTURisk framework and allows the identification and prioritization of targets (i.e. people, buildings, infrastructures, agriculture, natural and semi-natural systems, cultural heritage) and areas at risk from floods in the considered region by comparing the baseline scenario (i.e. current state) with alternative scenarios (i.e. where different structural and/or non-structural measures are planned). The RRA methodology is flexible and can be adapted to different case studies (i.e. large rivers, alpine/mountain catchments, urban areas and coastal areas) and spatial scales (i.e. from the large river to the urban scale). The final aim of RRA is to help decision-makers in examining the possible environmental risks associated with uncertain future flood hazards and in identifying which prevention scenario could be the most suitable one. The RRA methodology employs Multi-Criteria Decision Analysis (MCDA) functions in order to integrate stakeholder preferences and expert judgments into the analysis. Moreover, Geographic Information Systems (GIS) are used to manage, process, analyze, and map data to facilitate the analysis and information sharing with different experts and stakeholders. In order to characterize flood risks, the proposed methodology integrates the output of hydrodynamic models with the analysis of site-specific bio-geophysical and socio-economic indicators (e.g. slope of the territory, land cover, population density, economic activities) of several case studies in order to develop risk maps that identify and prioritize relative hot-spot areas and targets at risk at the regional scale. The main outputs of the RRA are receptor-based maps of risk, useful for communicating the potential implications of floods in non-monetary terms to stakeholders and administrations. These maps can be a basis for the management of flood risks as they can provide information about the indicative number of inhabitants, the types of economic activities, natural systems and cultural heritage potentially affected by flooding. Moreover, they can provide suitable information about flood risk in the considered area in order to define priorities for prevention measures and for land use planning and management. Finally, the outputs of the RRA methodology can be used as input data in the Socio-Economic Regional Risk Assessment methodology for the economic evaluation of different damages (e.g. tangible costs, intangible costs) and for the social assessment considering the benefits of the human dimension of vulnerability (i.e. adaptive and coping capacity). Within the KULTURisk project, the methodology has been applied and validated in several European case studies, and its generalization to address other types of natural hazards (e.g. earthquakes, forest fires) will be evaluated. The preliminary results of the RRA application in the KULTURisk project are presented and discussed here.
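
    The MCDA aggregation at the core of the RRA can be sketched as a weighted sum of normalized indicator layers per map cell. The indicators, weights and data below are illustrative stand-ins, not the project's actual inputs:

```python
# Sketch of an MCDA-style aggregation: normalize site indicators and combine
# them with stakeholder weights into a relative risk score per map cell.
import numpy as np

rng = np.random.default_rng(5)
n_cells = 6
indicators = {                       # raw values per map cell (invented)
    "slope_deg": rng.uniform(0, 30, n_cells),
    "pop_density": rng.uniform(0, 5000, n_cells),
    "land_value": rng.uniform(0, 1e6, n_cells),
}
weights = {"slope_deg": 0.2, "pop_density": 0.5, "land_value": 0.3}

score = np.zeros(n_cells)
for name, vals in indicators.items():
    norm = (vals - vals.min()) / (vals.max() - vals.min())   # rescale to [0, 1]
    if name == "slope_deg":
        norm = 1 - norm              # flatter terrain -> higher flood exposure
    score += weights[name] * norm

print("relative risk per cell:", score.round(2))
print("hot spot: cell", int(score.argmax()))
```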

  15. NREL: International Activities - Pakistan Resource Maps

    Science.gov Websites

    The high-resolution (1-km) annual wind power maps were developed using a numerical modeling approach along with NREL's empirical validation methodology. High-resolution (10-km) annual and seasonal maps, as well as 40-km resolution annual (direct) maps, are available for download in low- and high-resolution image formats.

  16. The Strategic Design Inquiry: A Formal Methodology For Approaching, Designing, Integrating, And Articulating National Strategy

    DTIC Science & Technology

    2014-04-01

    Figure 4: Example cognitive map … aligning planning efforts throughout the government. Even after strategy implementation, SDI calls for continuing, iterative learning and … the design before total commitment to it. Capturing this analysis on a cognitive map allows strategists to articulate a design to government …

  17. Genetic dissection of the maize kernel development process via conditional QTL mapping for three developing kernel-related traits in an immortalized F2 population.

    PubMed

    Zhang, Zhanhui; Wu, Xiangyuan; Shi, Chaonan; Wang, Rongna; Li, Shengfei; Wang, Zhaohui; Liu, Zonghua; Xue, Yadong; Tang, Guiliang; Tang, Jihua

    2016-02-01

    Kernel development is an important dynamic trait that determines the final grain yield in maize. To dissect the genetic basis of maize kernel development process, a conditional quantitative trait locus (QTL) analysis was conducted using an immortalized F2 (IF2) population comprising 243 single crosses at two locations over 2 years. Volume (KV) and density (KD) of dried developing kernels, together with kernel weight (KW) at different developmental stages, were used to describe dynamic changes during kernel development. Phenotypic analysis revealed that final KW and KD were determined at DAP22 and KV at DAP29. Unconditional QTL mapping for KW, KV and KD uncovered 97 QTLs at different kernel development stages, of which qKW6b, qKW7a, qKW7b, qKW10b, qKW10c, qKV10a, qKV10b and qKV7 were identified under multiple kernel developmental stages and environments. Among the 26 QTLs detected by conditional QTL mapping, conqKW7a, conqKV7a, conqKV10a, conqKD2, conqKD7 and conqKD8a were conserved between the two mapping methodologies. Furthermore, most of these QTLs were consistent with QTLs and genes for kernel development/grain filling reported in previous studies. These QTLs probably contain major genes associated with the kernel development process, and can be used to improve grain yield and quality through marker-assisted selection.

  18. Ontological Standardization for Historical Map Collections: Studying the Greek Borderlines of 1881

    NASA Astrophysics Data System (ADS)

    Gkadolou, E.; Tomai, E.; Stefanakis, E.; Kritikos, G.

    2012-07-01

    Historical maps deliver valuable historical information that is applicable in several domains, while documenting the spatiotemporal evolution of the geographical entities depicted therein. In order to use historical cartographic information effectively, semantic documentation of the maps becomes a necessity for resolving semantic ambiguities and structuring the relationship between historical and current geographical space. This paper examines cartographic ontologies as a proposed methodology and presents the first outcomes of the methodology applied to the historical map series «Carte de la nouvelle frontière Turco-Grecque», which set the borderline between Greece and the Ottoman Empire in 1881. The map entities were modelled and compared to current ones so as to record the changes in their spatial and thematic attributes, and an ontology was developed in the Protégé OWL Editor 3.4.4 for the attributes that thoroughly define a historical map and the digitised spatial entities. Special focus was given to the Greek borderline and the changes it caused to other geographic entities.

  19. Footprint Map Partitioning Using Airborne Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Xiong, B.; Oude Elberink, S.; Vosselman, G.

    2016-06-01

    Nowadays many cities and countries are creating 3D building models for better daily management and smarter decision making. The newly created 3D models are required to be consistent with existing 2D footprint maps, so the 2D maps are usually combined with height data for the task of 3D reconstruction. Many buildings are composed of parts that are discontinuous over height. Building parts can be reconstructed independently and combined into a complete building. Therefore, most state-of-the-art work on 3D building reconstruction first decomposes a footprint map into parts. However, those works usually change the footprint maps for easier partitioning and cannot detect building parts that lie fully inside the footprint polygon. In order to solve these problems, we introduce two methodologies, one more dependent on height data and the other more dependent on footprints. We also experimentally evaluate the two methodologies and compare their advantages and disadvantages. The experiments use Airborne Laser Scanning (ALS) data and two vector maps, one at 1:10,000 scale and another at 1:500 scale.

  20. NREL: International Activities - Bhutan Resource Maps

    Science.gov Websites

    The high-resolution annual wind power maps were developed using a numerical modeling approach along with NREL's empirical validation methodology. The high-resolution (10-km) annual solar maps were developed using the time-specific solar mapping approach developed at the U.S. State University of New York at Albany.

  1. A method to preserve trends in quantile mapping bias correction of climate modeled temperature

    NASA Astrophysics Data System (ADS)

    Grillakis, Manolis G.; Koutroulis, Aristeidis G.; Daliakopoulos, Ioannis N.; Tsanis, Ioannis K.

    2017-09-01

    Bias correction of climate variables is a standard practice in climate change impact (CCI) studies. Various methodologies have been developed within the framework of quantile mapping. However, it is well known that quantile mapping may significantly modify the long-term statistics due to the time dependency of the temperature bias. Here, a method to overcome this issue without compromising the day-to-day correction statistics is presented. The methodology separates the modeled temperature signal into a normalized and a residual component relative to the modeled reference period climatology, in order to adjust the biases only for the former and preserve the signal of the latter. The results show that this method allows for the preservation of the originally modeled long-term signal in the mean, the standard deviation, and the higher and lower percentiles of temperature. To illustrate the improvements, the methodology is tested on daily time series obtained from five EURO-CORDEX regional climate models (RCMs).
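
    A minimal sketch of the trend-preserving idea, as we read it: separate the modeled long-term signal from the day-to-day residual, quantile-map only the residual against reference-period observations, and restore the signal afterwards. The data are synthetic and the decomposition is our paraphrase, not the authors' exact formulation.

```python
# Trend-preserving quantile mapping sketch: bias-correct only the residual
# component by empirical CDF matching and restore the modeled signal.
import numpy as np

rng = np.random.default_rng(6)
days = 40 * 365
trend = np.linspace(0, 3.0, days)                    # modeled warming signal
model = 10 + trend + rng.normal(0, 4, days)          # biased modeled series
obs_ref = 12 + rng.normal(0, 3, days // 2)           # reference-period obs

ref = slice(0, days // 2)                            # calibration window
signal = trend - trend[ref].mean()                   # long-term component
resid = model - signal                               # what gets quantile-mapped

# Empirical quantile mapping of the residual on the reference period
q = np.linspace(0.01, 0.99, 99)
src = np.quantile(resid[ref], q)
dst = np.quantile(obs_ref, q)
corrected = np.interp(resid, src, dst) + signal      # restore modeled signal

print("raw trend   :", round(model[-3650:].mean() - model[ref].mean(), 2))
print("corr. trend :", round(corrected[-3650:].mean() - corrected[ref].mean(), 2))
```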

  2. Peer review of health research funding proposals: A systematic map and systematic review of innovations for effectiveness and efficiency

    PubMed Central

    Frampton, Geoff K.; Pickett, Karen; Wyatt, Jeremy C.

    2018-01-01

    Objective To investigate methods and processes for timely, efficient and good quality peer review of research funding proposals in health. Methods A two-stage evidence synthesis: (1) a systematic map to describe the key characteristics of the evidence base, followed by (2) a systematic review of the studies stakeholders prioritised as relevant from the map on the effectiveness and efficiency of peer review ‘innovations’. Standard processes included literature searching, duplicate inclusion criteria screening, study keyword coding, data extraction, critical appraisal and study synthesis. Results A total of 83 studies from 15 countries were included in the systematic map. The evidence base is diverse, investigating many aspects of the systems for, and processes of, peer review. The systematic review included eight studies from Australia, Canada, and the USA, evaluating a broad range of peer review innovations. These studies showed that simplifying the process by shortening proposal forms, using smaller reviewer panels, or expediting processes can speed up the review process and reduce costs, but this might come at the expense of peer review quality, a key aspect that has not been assessed. Virtual peer review using videoconferencing or teleconferencing appears promising for reducing costs by avoiding the need for reviewers to travel, but again any consequences for quality have not been adequately assessed. Conclusions There is increasing international research activity into the peer review of health research funding. The studies reviewed had methodological limitations and variable generalisability to research funders. Given these limitations it is not currently possible to recommend immediate implementation of these innovations. However, many appear promising based on existing evidence, and could be adapted as necessary by funders and evaluated. Where feasible, experimental evaluation, including randomised controlled trials, should be conducted, evaluating impact on effectiveness, efficiency and quality. PMID:29750807

  3. Incorporating Manual and Autonomous Code Generation

    NASA Technical Reports Server (NTRS)

    McComas, David

    1998-01-01

    Code can be generated manually or using code-generating software tools, but how do you integrate the two? This article looks at a design methodology that combines object-oriented design with autonomous code generation for attitude control flight software. Recent improvements in space flight computers are allowing software engineers to spend more time engineering the applications software. The application developed was the attitude control flight software for an astronomical satellite called the Microwave Anisotropy Probe (MAP). The MAP flight system is being designed, developed, and integrated at NASA's Goddard Space Flight Center. The MAP controls engineers are using Integrated Systems Inc.'s MATRIXx for their controls analysis. In addition to providing a graphical analysis environment, MATRIXx includes an automatic code generation facility called AutoCode. This article examines the forces that shaped the final design and describes three highlights of the design process: (1) defining the manual-to-autonomous code interface; (2) applying object-oriented design to the manual flight code; (3) implementing the object-oriented design in C.

  4. Photometric and polarimetric mapping of water turbidity and water depth

    NASA Technical Reports Server (NTRS)

    Halajian, J.; Hallock, H.

    1973-01-01

    A Digital Photometric Mapper (DPM) was used in the Fall of 1971 in an airborne survey of New York and Boston area waters to acquire photometric, spectral and polarimetric data. The object of this study is to analyze these data with quantitative computer processing techniques to assess the potential of the DPM in the measurement and regional mapping of water turbidity and depth. These techniques have been developed and an operational potential has been demonstrated. More emphasis is placed at this time on the methodology of data acquisition, analysis and display than on the quantity of data. The results illustrate the type, quantity and format of information that could be generated operationally with the DPM-type sensor characterized by high photometric stability and fast, accurate digital output. The prototype, single-channel DPM is suggested as a unique research tool for a number of new applications. For the operational mapping of water turbidity and depth, the merits of a multichannel DPM coupled with a laser system are stressed.

  5. Sequence analysis by iterated maps, a review.

    PubMed

    Almeida, Jonas S

    2014-05-01

    Among alignment-free methods, Iterated Maps (IMs) are at a particular extreme: they are also scale free (order free). The use of IMs for sequence analysis is also distinct from other alignment-free methodologies in being rooted in statistical mechanics instead of computational linguistics. Both of these roots go back over two decades to the use of fractal geometry in the characterization of phase-space representations. The time series analysis origin of the field is betrayed by the title of the manuscript that started this alignment-free subdomain in 1990, 'Chaos Game Representation'. The clash between the analysis of sequences as continuous series and the better established use of Markovian approaches to discrete series was almost immediate, with a defining critique published in the same journal two years later. The rest of that decade would go by before the scale-free nature of the IM space was uncovered. The ensuing decade saw this scalability generalized for non-genomic alphabets as well as an interest in its use for graphic representation of biological sequences. Finally, in the past couple of years, in step with the emergence of Big Data and MapReduce as a new computational paradigm, there is a surprising third act in the IM story. Multiple reports have described gains in computational efficiency of multiple orders of magnitude over more conventional sequence analysis methodologies. The stage now appears to be set for a recasting of IMs with a central role in processing next-generation sequencing results.
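
    Since the record traces the field to the 1990 Chaos Game Representation, here is a self-contained sketch of that iterated map for a DNA string: each base pulls the current point halfway towards its corner of the unit square, so k-mer statistics accumulate as fractal densities. The corner assignment is the conventional one; the sequence is a placeholder.

    ```python
    import numpy as np

    CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

    def cgr(seq):
        """Chaos Game Representation: iterate x_{n+1} = (x_n + corner) / 2."""
        pts = np.empty((len(seq), 2))
        x, y = 0.5, 0.5
        for i, base in enumerate(seq.upper()):
            cx, cy = CORNERS[base]
            x, y = (x + cx) / 2.0, (y + cy) / 2.0
            pts[i] = (x, y)
        return pts

    print(cgr("ACGTTGCA"))  # placeholder sequence; scatter-plot pts to see the fractal
    ```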

  6. Enhancement of snow cover change detection with sparse representation and dictionary learning

    NASA Astrophysics Data System (ADS)

    Varade, D.; Dikshit, O.

    2014-11-01

    Sparse representation and decoding is often used for denoising images and compression of images with respect to inherent features. In this paper, we adopt a methodology incorporating sparse representation of a snow cover change map using the K-SVD trained dictionary and sparse decoding to enhance the change map. The pixels often falsely characterized as "changes" are eliminated using this approach. The preliminary change map was generated using differenced NDSI or S3 maps in the case of Resourcesat-2 and Landsat-8 OLI imagery, respectively. These maps are extracted into patches for compressed sensing using the Discrete Cosine Transform (DCT) to generate an initial dictionary, which is trained by the K-SVD approach. The trained dictionary is used for sparse coding of the change map using the Orthogonal Matching Pursuit (OMP) algorithm. The reconstructed change map incorporates a greater degree of smoothing and represents the features (snow cover changes) with better accuracy. The enhanced change map is segmented using k-means to discriminate between the changed and non-changed pixels. The segmented enhanced change map is compared, firstly, with the difference of Support Vector Machine (SVM) classified NDSI maps and, secondly, with reference data generated as a mask by visual interpretation of the two input images. The methodology is evaluated using multi-spectral datasets from Resourcesat-2 and Landsat-8. The k-hat statistic is computed to determine the accuracy of the proposed approach.
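
    A hedged scikit-learn sketch of this pipeline: MiniBatchDictionaryLearning stands in for K-SVD (and random initialization for the DCT dictionary), while the OMP sparse-coding and k-means segmentation steps follow the abstract; the input map and all parameter values are placeholders.

    ```python
    import numpy as np
    from sklearn.feature_extraction.image import extract_patches_2d, reconstruct_from_patches_2d
    from sklearn.decomposition import MiniBatchDictionaryLearning
    from sklearn.cluster import KMeans

    change_map = np.random.rand(64, 64)              # stand-in differenced NDSI map

    patches = extract_patches_2d(change_map, (8, 8))
    X = patches.reshape(len(patches), -1)

    # dictionary learning approximates the K-SVD step; sparse codes are
    # recovered with Orthogonal Matching Pursuit as in the paper
    dico = MiniBatchDictionaryLearning(n_components=64, transform_algorithm="omp",
                                       transform_n_nonzero_coefs=4)
    codes = dico.fit(X).transform(X)
    denoised = (codes @ dico.components_).reshape(patches.shape)
    enhanced = reconstruct_from_patches_2d(denoised, change_map.shape)

    # k-means separates changed from non-changed pixels on the smoothed map
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(enhanced.reshape(-1, 1))
    ```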

  7. Quality and rigor of the concept mapping methodology: a pooled study analysis.

    PubMed

    Rosas, Scott R; Kane, Mary

    2012-05-01

    The use of concept mapping in research and evaluation has expanded dramatically over the past 20 years. Researchers in academic, organizational, and community-based settings have applied concept mapping successfully without the benefit of systematic analyses across studies to identify the features of a methodologically sound study. Quantitative characteristics and estimates of quality and rigor that may guide future studies are lacking. To address this gap, we conducted a pooled analysis of 69 concept mapping studies to describe characteristics across study phases, generate specific indicators of validity and reliability, and examine the relationship between select study characteristics and quality indicators. Individual study characteristics and estimates were pooled and quantitatively summarized, describing the distribution, variation and parameters for each. In addition, variation in the concept mapping data collection in relation to characteristics and estimates was examined. Overall, results suggest concept mapping yields strong internal representational validity and very strong sorting and rating reliability estimates. Validity and reliability were consistently high despite variation in participation and task completion percentages across data collection modes. The implications of these findings as a practical reference to assess the quality and rigor of future concept mapping studies are discussed.

  8. Mapping Natech risk due to earthquakes using RAPID-N

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2013-04-01

    Natural hazard-triggered technological accidents (so-called Natech accidents) at hazardous installations are an emerging risk with possibly serious consequences due to the potential for release of hazardous materials, fires or explosions. For the reduction of Natech risk, one of the highest priority needs is the identification of Natech-prone areas and the systematic assessment of Natech risks. With hardly any Natech risk maps existing within the EU, the European Commission's Joint Research Centre has developed a Natech risk analysis and mapping tool called RAPID-N, which estimates the overall risk of natural-hazard impact to industrial installations and its possible consequences. The results are presented as risk summary reports and interactive risk maps which can be used for decision making. Currently, RAPID-N focuses on Natech risk due to earthquakes at industrial installations. However, it will be extended to also analyse and map Natech risk due to floods in the near future. The RAPID-N methodology is based on the estimation of on-site natural hazard parameters, the use of fragility curves to determine damage probabilities of plant units for various damage states, and the calculation of the spatial extent, severity, and probability of Natech events potentially triggered by the natural hazard. The methodology was implemented as a web-based risk assessment and mapping software tool which allows easy data entry and rapid local or regional risk assessment and mapping. RAPID-N features an innovative property estimation framework to calculate on-site natural hazard parameters, industrial plant and plant unit characteristics, and hazardous substance properties. Custom damage states and fragility curves can be defined for different types of plant units. Conditional relationships can be specified between damage states and Natech risk states, which describe probable Natech event scenarios. Natech consequences are assessed using a custom implementation of the U.S. EPA's Risk Management Program (RMP) Guidance for Offsite Consequence Analysis methodology. This custom implementation is based on the property estimation framework and allows the easy modification of model parameters and the substitution of equations with alternatives. RAPID-N can be applied at different stages of the Natech risk management process: on the one hand, it allows the analysis of hypothetical Natech scenarios to prevent or prepare for a Natech accident by supporting land-use and emergency planning. On the other hand, once a natural disaster occurs, RAPID-N can be used to rapidly locate facilities with potential Natech accident damage based on actual natural-hazard information. This provides a means to warn the population in the vicinity of the facilities in a timely manner. This presentation will introduce the specific features of RAPID-N and show the use of the tool by application to a case-study area.
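
    The fragility-curve step can be sketched with the lognormal form that is standard in seismic fragility analysis; the median capacity and dispersion below are invented for illustration and are not RAPID-N's calibrated values.

    ```python
    from scipy.stats import lognorm

    def damage_probability(pga, median_g, beta):
        """Lognormal fragility curve: P(damage state reached | PGA), with
        median capacity in g and logarithmic standard deviation beta."""
        return lognorm.cdf(pga, s=beta, scale=median_g)

    # e.g. a storage tank with (illustrative) median capacity 0.68 g, beta 0.55
    for pga in (0.2, 0.4, 0.8):
        print(pga, round(damage_probability(pga, 0.68, 0.55), 3))
    ```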

  9. Evaluation of Mapping Methodologies at a Legacy Test Site

    NASA Astrophysics Data System (ADS)

    Sussman, A. J.; Schultz-Fellenz, E. S.; Roback, R. C.; Kelley, R. E.; Drellack, S.; Reed, D.; Miller, E.; Cooper, D. I.; Sandoval, M.; Wang, R.

    2013-12-01

    On June 12th, 1985, a nuclear test with an announced yield between 20-150 kt was detonated in rhyolitic lava in a vertical emplacement borehole at a depth of 608 m below the surface. This test did not collapse to the surface and form a crater, but rather resulted in a subsurface collapse with more subtle surface expressions of deformation, providing an opportunity to evaluate the site using a number of surface mapping methodologies. The site was investigated over a two-year time span by several mapping teams. In order to determine the most time-efficient and accurate approach for mapping post-shot surface features at a legacy test site, a number of different techniques were employed. The site was initially divided into four quarters, with teams applying various methodologies, techniques, and instrumentations to each quarter. Early methods included transect lines and site gridding with a Brunton pocket transit, flagging tape, measuring tape, and stakes; surveying using a hand-held personal GPS to locate observed features with an accuracy of ±5-10 m; and extensive photo-documentation. More recent methods have incorporated the use of near-survey-grade GPS devices to allow careful location and mapping of surface features. Initially, gridding was employed along with the high resolution GPS surveys, but this was found to be time consuming and of little observational value. Raw visual observation (VOB) data included GPS coordinates for artifacts or features of interest, field notes, and photographs. A categorization system was used to organize the myriad of items, in order to aid in database searches and for visual presentation of findings. The collected data set was imported into a geographic information system (GIS) as points, lines, or polygons and overlain onto a digital color orthophoto map of the test site. Once these data were mapped, spectral data were collected using a high resolution field spectrometer. In addition to geo-locating the field observations with 10 cm resolution GPS, LiDAR and hyperspectral imagery were also acquired. The LiDAR and hyperspectral data are being processed and will be added to the existing geo-referenced database as separate information layers for remote sensing analysis of surface features associated with the legacy test. By consolidating the various components of a VOB data point (coordinates, photo and item description) into a standalone database, searching or querying for other components or collects such as subsurface geophysical and/or airborne imagery is made much easier. Work by Los Alamos National Laboratory was sponsored by the National Nuclear Security Administration Award No. DE-AC52-06NA25946/NST10-NCNS-PD00. Work by National Security Technologies, LLC, was performed under Contract No. DE-AC52-06NA25946 with the U.S. Department of Energy.

  10. Where to Go Next? Identifying Target Areas in the North Atlantic for Future Seafloor Mapping Initiatives

    NASA Astrophysics Data System (ADS)

    Woelfl, A. C.; Jencks, J.; Johnston, G.; Varner, J. D.; Devey, C. W.

    2017-12-01

    Human activities are rapidly expanding into the oceans, yet detailed bathymetric maps that would permit governments to formulate sensible usage rules do not exist for most of the seafloor. Changing this situation will require an enormous international mapping effort. To ensure that this effort is directed towards the regions most in need of mapping, we need to know which areas have already been mapped and which areas are potentially most interesting. Despite various mapping efforts in recent years, large parts of the Atlantic still lack detailed bathymetric information. To successfully plan future mapping efforts to fill these gaps, knowledge of current data coverage is imperative to avoid duplication of effort. While certain datasets are publicly available online (e.g. NOAA's NCEI, EMODnet, IHO-DCDB, LDEO's GMRT), many are not. However, with the limited information we do have at hand, the question remains: where should we map next? And what criteria should we take into account? In 2016, a study was undertaken as part of the efforts of the International Atlantic Seabed Mapping Working Group (ASMIWG). The ASMIWG, established by the Tri-Partite Galway Statement Implementation Committee, was tasked to develop a cohesive seabed mapping strategy for the Atlantic Ocean. The aim of our study was to develop a reproducible process for identifying and evaluating potential target areas within the North Atlantic that represent suitable sites for future bathymetric surveys. The sites were selected by applying a GIS-based suitability analysis that included specific user group-based parameters of the marine environment. Furthermore, information regarding current data coverage was gathered and taken into account in the selection process. The results reveal the suitability of sites within the North Atlantic based on the selected criteria. The three potential target sites should be seen as flexible suggestions for future mapping initiatives rather than a rigid, defined set of areas. This methodology can be adjusted to other areas of interest and can include a variety of parameters based on stakeholder interest. Furthermore, this work only included accessible and displayable information about multibeam data coverage and would certainly benefit from more easily available and discoverable data sets, or at least from location information.

  11. Indicators to facilitate the early identification of patients with major depressive disorder in need of highly specialized care: A concept mapping study.

    PubMed

    van Krugten, F C W; Goorden, M; van Balkom, A J L M; Spijker, J; Brouwer, W B F; Hakkaart-van Roijen, L

    2018-04-01

    Early identification of the subgroup of patients with major depressive disorder (MDD) in need of highly specialized care could enhance personalized intervention. This, in turn, may reduce the number of treatment steps needed to achieve and sustain an adequate treatment response. The aim of this study was to identify patient-related indicators that could facilitate the early identification of the subgroup of patients with MDD in need of highly specialized care. Initial patient indicators were derived from a systematic review. Subsequently, a structured conceptualization methodology known as concept mapping was employed to complement the initial list of indicators with clinical expertise and to develop a consensus-based conceptual framework. Subject-matter experts were invited to participate in the subsequent steps (brainstorming, sorting, and rating) of the concept mapping process. A final concept map solution was generated using nonmetric multidimensional scaling and agglomerative hierarchical cluster analyses. In total, 67 subject-matter experts participated in the concept mapping process. The final concept map revealed the following 10 major clusters of indicators: 1-depression severity, 2-onset and (treatment) course, 3-comorbid personality disorder, 4-comorbid substance use disorder, 5-other psychiatric comorbidity, 6-somatic comorbidity, 7-maladaptive coping, 8-childhood trauma, 9-social factors, and 10-psychosocial dysfunction. The study findings highlight the need for a comprehensive assessment of patient indicators in determining the need for highly specialized care, and suggest that the treatment allocation of patients with MDD to highly specialized mental healthcare settings should be guided by the assessment of clinical and nonclinical patient factors.
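
    The analysis step named in the abstract (nonmetric MDS followed by agglomerative clustering of sorted statements) can be sketched with scikit-learn; the sorting data and cluster count here are invented.

    ```python
    import numpy as np
    from sklearn.manifold import MDS
    from sklearn.cluster import AgglomerativeClustering

    # hypothetical sorting data: each expert partitions 8 indicator statements
    # into piles (pile label per statement); values are invented
    sorts = np.array([[0, 0, 1, 1, 2, 2, 2, 3],
                      [0, 0, 0, 1, 1, 2, 2, 2],
                      [0, 1, 1, 2, 2, 3, 3, 3]])

    # co-sort frequency: fraction of experts placing two statements together
    co = (sorts[:, :, None] == sorts[:, None, :]).mean(axis=0)
    dissim = 1.0 - co

    # nonmetric MDS places statements on a 2-D map; hierarchical clustering
    # groups them into candidate clusters, as in the concept mapping analysis
    xy = MDS(n_components=2, metric=False, dissimilarity="precomputed",
             random_state=0).fit_transform(dissim)
    clusters = AgglomerativeClustering(n_clusters=3).fit_predict(xy)
    print(clusters)
    ```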

  12. Mapping Resource Selection Functions in Wildlife Studies: Concerns and Recommendations

    PubMed Central

    Morris, Lillian R.; Proffitt, Kelly M.; Blackburn, Jason K.

    2018-01-01

    Predicting the spatial distribution of animals is an important and widely used tool with applications in wildlife management, conservation, and population health. Wildlife telemetry technology coupled with the availability of spatial data and GIS software have facilitated advancements in species distribution modeling. There are also challenges related to these advancements, including the accurate and appropriate implementation of species distribution modeling methodology. Resource Selection Function (RSF) modeling is a commonly used approach for understanding species distributions and habitat usage, and mapping the RSF results can enhance study findings and make them more accessible to researchers and wildlife managers. Currently, there is no consensus in the literature on the most appropriate method for mapping RSF results, methods are frequently not described, and mapping approaches are not always related to accuracy metrics. We conducted a systematic review of the RSF literature to summarize the methods used to map RSF outputs, discuss the relationship between mapping approaches and accuracy metrics, perform a case study on the implications of employing different mapping methods, and provide recommendations as to appropriate mapping techniques for RSF studies. We found extensive variability in methodology for mapping RSF results. Our case study revealed that the most commonly used approaches for mapping RSF results led to notable differences in the visual interpretation of RSF results, and there is a concerning disconnect between accuracy metrics and mapping methods. We make 5 recommendations for researchers mapping the results of RSF studies, which are focused on carefully selecting and describing the method used to map RSF results, and relating mapping approaches to accuracy metrics. PMID:29887652
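
    A small sketch of why the mapping choice matters: the same fitted RSF scores binned by equal intervals versus quantiles yield very different class maps. The data, model form, and bin counts are illustrative; the paper does not prescribe these.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 2))                    # two habitat covariates
    p_use = 1 / (1 + np.exp(-(1.5 * X[:, 0] - X[:, 1])))
    y = (rng.random(500) < p_use).astype(int)        # used (1) vs available (0)

    rsf = LogisticRegression().fit(X, y)
    w = rsf.predict_proba(X)[:, 1]                   # relative selection scores

    # two common mapping choices applied to the very same scores:
    # equal-interval bins versus quantile (equal-area) bins
    edges_ei = np.linspace(w.min(), w.max(), 11)
    edges_q = np.quantile(w, np.linspace(0, 1, 11))
    map_ei = np.digitize(w, edges_ei[1:-1])
    map_q = np.digitize(w, edges_q[1:-1])
    print(np.bincount(map_ei), np.bincount(map_q))   # very different class sizes
    ```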

  13. Lava Lake Thermal Pattern Classification Using Self-Organizing Maps and Relationships to Eruption Processes at Kīlauea Volcano, Hawaii

    NASA Astrophysics Data System (ADS)

    Burzynski, A. M.; Anderson, S. W.; Morrison, K.; LeWinter, A. L.; Patrick, M. R.; Orr, T. R.; Finnegan, D. C.

    2014-12-01

    Nested within the Halema'uma'u Crater on the summit of Kīlauea Volcano, the active lava lake of Overlook Crater poses hazards to local residents and Hawaii Volcanoes National Park visitors. Since its formation in March 2008, the lava lake has enlarged to ~28,500 m2 and has been closely monitored by researchers at the USGS Hawaiian Volcano Observatory (HVO). Time-lapse images, collected via visible and thermal infrared cameras, reveal thin crustal plates, separated by incandescent cracks, moving across the lake surface as lava circulates beneath. We hypothesize that changes in the size, shape, velocity, and patterns of these crustal plates are related to other eruption processes at the volcano. Here we present a methodology to identify characteristic lava lake surface patterns from thermal infrared video footage using a self-organizing map (SOM) algorithm. The SOM is an artificial neural network that performs unsupervised clustering and enables us to visualize the relationships between groups of input patterns on a 2-dimensional grid. In a preliminary trial, we input ~4 hours of thermal infrared time-lapse imagery collected on December 16-17, 2013 during a transient deflation-inflation deformation event, at a rate of one frame every 10 seconds. During that same time period, we also acquired a series of one-second terrestrial laser scans (TLS) every 30 seconds to provide detailed topography of the lava lake surface. We identified clusters of characteristic thermal patterns using the self-organizing map algorithm in the Matlab SOM Toolbox. Initial results from two SOMs, one large map (81 nodes) and one small map (9 nodes), indicate 4-6 distinct groups of thermal patterns. We compare these surface patterns with lava lake surface slope and crustal plate velocities derived from concurrent TLS surveys and with time series of other eruption variables, including outgassing rates and inflation-deflation events. This methodology may be applied to the continuous stream of thermal video footage at Kīlauea to expand the breadth of eruption information we are able to obtain from a remote thermal infrared camera and may potentially allow for the recognition of lava lake patterns as a proxy for other eruption parameters.
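
    A from-scratch NumPy stand-in for the SOM step (the study used the Matlab SOM Toolbox): nodes on a small grid are pulled towards samples with a shrinking neighbourhood, then each thermal frame is assigned to its best-matching node. The grid size matches the 9-node map; the frame data are placeholders.

    ```python
    import numpy as np

    def train_som(data, grid=(3, 3), iters=5000, lr0=0.5, sigma0=1.5, seed=0):
        """Minimal self-organizing map: the best-matching unit and its grid
        neighbours move towards each sample, with learning rate and
        neighbourhood radius shrinking over time."""
        rng = np.random.default_rng(seed)
        h, w = grid
        nodes = np.array([(i, j) for i in range(h) for j in range(w)], float)
        W = rng.random((h * w, data.shape[1]))
        for t in range(iters):
            x = data[rng.integers(len(data))]
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))
            frac = t / iters
            lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
            d2 = ((nodes - nodes[bmu]) ** 2).sum(axis=1)
            W += lr * np.exp(-d2 / (2 * sigma ** 2))[:, None] * (x - W)
        return W

    frames = np.random.rand(200, 32 * 32)     # placeholder flattened thermal frames
    weights = train_som(frames)               # 9-node map, as in the small SOM
    labels = np.argmin(((frames[:, None, :] - weights[None]) ** 2).sum(-1), axis=1)
    ```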

  14. Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation

    DTIC Science & Technology

    2016-05-01

    ARL-TN-0756, May 2016, US Army Research Laboratory. Report by Clayton M Weiss (Oak Ridge Institute for Science and Education); the indexed excerpt cites work on identifying and mapping flaw size distributions on glass surfaces for predicting mechanical response (International Journal of Applied Glass Science).

  15. Probabilistic self-organizing maps for continuous data.

    PubMed

    Lopez-Rubio, Ezequiel

    2010-10-01

    The original self-organizing feature map did not define any probability distribution on the input space. However, the advantages of introducing probabilistic methodologies into self-organizing map models were soon evident. This has led to a wide range of proposals which reflect the current emergence of probabilistic approaches to computational intelligence. The underlying estimation theories behind them derive from two main lines of thought: the expectation maximization methodology and stochastic approximation methods. Here, we present a comprehensive view of the state of the art, with a unifying perspective of the involved theoretical frameworks. In particular, we examine the most commonly used continuous probability distributions, self-organization mechanisms, and learning schemes. Special emphasis is given to the connections among them and their relative advantages depending on the characteristics of the problem at hand. Furthermore, we evaluate their performance in two typical applications of self-organizing maps: classification and visualization.

  16. Mapping forest inventory and analysis data attributes within the framework of double sampling for stratification design

    Treesearch

    David C. Chojnacky; Randolph H. Wynne; Christine E. Blinn

    2009-01-01

    Methodology is lacking to easily map Forest Inventory and Analysis (FIA) inventory statistics for all attribute variables without having to develop separate models and methods for each variable. We developed a mapping method that can directly transfer tabular data to a map on which pixels can be added any way desired to estimate carbon (or any other variable) for a...

  17. Computational Acoustic Beamforming for Noise Source Identification for Small Wind Turbines

    PubMed Central

    Lien, Fue-Sang

    2017-01-01

    This paper develops a computational acoustic beamforming (CAB) methodology for identification of sources of small wind turbine noise. This methodology is validated using the case of the NACA 0012 airfoil trailing edge noise. For this validation case, the predicted acoustic maps were in excellent conformance with the results of the measurements obtained from the acoustic beamforming experiment. Following this validation study, the CAB methodology was applied to the identification of noise sources generated by a commercial small wind turbine. The simulated acoustic maps revealed that the blade tower interaction and the wind turbine nacelle were the two primary mechanisms for sound generation for this small wind turbine at frequencies between 100 and 630 Hz. PMID:28378012
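
    The beamforming idea can be illustrated with a basic time-domain delay-and-sum sketch (the paper's CAB method couples simulated sources with beamforming; this toy version just localises a synthetic tone with a line array, and the geometry and signals are invented).

    ```python
    import numpy as np

    def delay_and_sum_map(signals, mic_xy, grid_xy, fs, c=343.0):
        """Delay-and-sum beamforming: align each microphone signal with the
        propagation delay from a candidate grid point and sum; true source
        positions appear as power maxima on the map."""
        n_mics, n = signals.shape
        power = np.zeros(len(grid_xy))
        for g, p in enumerate(grid_xy):
            delays = np.linalg.norm(mic_xy - p, axis=1) / c
            shifts = np.round((delays - delays.min()) * fs).astype(int)
            summed = np.zeros(n)
            for m in range(n_mics):
                summed[: n - shifts[m]] += signals[m, shifts[m]:]
            power[g] = np.mean(summed ** 2)
        return power

    # 8-mic line array, one 1 kHz source at (0.5, 1.0) m, synthetic data
    fs, t = 48000, np.arange(4800) / 48000
    mics = np.column_stack([np.linspace(-0.5, 0.5, 8), np.zeros(8)])
    d = np.linalg.norm(mics - np.array([0.5, 1.0]), axis=1)
    sig = np.array([np.sin(2 * np.pi * 1000 * (t - di / 343.0)) for di in d])
    grid = np.array([[x, 1.0] for x in np.linspace(-1, 1, 41)])
    print(grid[np.argmax(delay_and_sum_map(sig, mics, grid, fs))])  # near (0.5, 1.0)
    ```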

  18. Satellite-aided survey sampling and implementation in low- and middle-income contexts: a low-cost/low-tech alternative.

    PubMed

    Haenssgen, Marco J

    2015-01-01

    The increasing availability of online maps, satellite imagery, and digital technology can ease common constraints of survey sampling in low- and middle-income countries. However, existing approaches require specialised software and user skills, professional GPS equipment, and/or commercial data sources; they tend to neglect spatial sampling considerations when using satellite maps; and they continue to face implementation challenges analogous to conventional survey implementation methods. This paper presents an alternative way of utilising satellite maps and digital aides that aims to address these challenges. The case studies of two rural household surveys in Rajasthan (India) and Gansu (China) compare conventional survey sampling and implementation techniques with the use of online map services such as Google, Bing, and HERE maps. Modern yet basic digital technology can be integrated into the processes of preparing, implementing, and monitoring a rural household survey. Satellite-aided systematic random sampling enhanced the spatial representativeness of the village samples and entailed savings of approximately £4000 compared to conventional household listing, while reducing the duration of the main survey by at least 25 %. This low-cost/low-tech satellite-aided survey sampling approach can be useful for student researchers and resource-constrained research projects operating in low- and middle-income contexts with high survey implementation costs. While achieving transparent and efficient survey implementation at low costs, researchers aiming to adopt a similar process should be aware of the locational, technical, and logistical requirements as well as the methodological challenges of this strategy.
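
    The sampling step is plain systematic random sampling over dwellings enumerated from the satellite image; a minimal sketch follows, with the counts invented.

    ```python
    import numpy as np

    def systematic_sample(n_units, n_sample, rng=np.random.default_rng()):
        """Systematic random sampling: a random start, then every k-th unit.
        Applied to dwellings digitised from a satellite map, this spreads
        the sample spatially without a full household listing."""
        k = n_units / n_sample
        start = rng.uniform(0, k)
        return np.floor(start + k * np.arange(n_sample)).astype(int)

    # e.g. select 30 of 412 dwellings digitised from the satellite image
    print(systematic_sample(412, 30))
    ```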

  19. Listening to Students: Customer Journey Mapping at Birmingham City University Library and Learning Resources

    ERIC Educational Resources Information Center

    Andrews, Judith; Eade, Eleanor

    2013-01-01

    Birmingham City University's Library and Learning Resources' strategic aim is to improve student satisfaction. A key element is the achievement of the Customer Excellence Standard. An important component of the standard is the mapping of services to improve quality. Library and Learning Resources has developed a methodology to map these…

  20. Concept Mapping and Misconceptions: A Study of High-School Students' Understandings of Acids and Bases.

    ERIC Educational Resources Information Center

    Ross, Bertram; And Others

    1991-01-01

    An investigation of students' understandings of acids and bases using concept maps, multiple-choice tests, and clinical interviews is described. The methodology and resulting analysis are illustrated with two abbreviated case studies selected from the study. Discussion of concept mapping points to how it starkly represents gaps in the understanding…

  1. Concept Maps: An Alternative Methodology to Assess Young Children

    ERIC Educational Resources Information Center

    Atiles, Julia T.; Dominique-Maikell, Nikole; McKean, Kathleen

    2014-01-01

    The authors investigated the utility and efficacy of using concept maps as a research tool to assess young children. Pre- and post-concept maps have been used as an assessment and evaluation tool with teachers and with older students, typically children who can read and write; this article summarizes an investigation into the utility of using…

  2. The KULTURisk Regional Risk Assessment methodology for water-related natural hazards - Part 2: Application to the Zurich case study

    NASA Astrophysics Data System (ADS)

    Ronco, P.; Bullo, M.; Torresan, S.; Critto, A.; Olschewski, R.; Zappa, M.; Marcomini, A.

    2014-07-01

    The main objective of the paper is the application of the KULTURisk Regional Risk Assessment (KR-RRA) methodology, presented in the companion paper (Part 1, Ronco et al., 2014), to the Sihl River valley in Switzerland. Through a process of tuning the methodology to the site-specific context and features, flood-related risks have been assessed for different receptors in the Sihl River valley, including the city of Zurich, which represents a typical case of river flooding in an urban area. After characterizing the peculiarities of the specific case study, risk maps have been developed under a 300-year return period scenario (selected as the baseline) for six identified relevant targets exposed to flood risk in the Sihl valley, namely: people, economic activities (including buildings, infrastructures and agriculture), natural and semi-natural systems, and cultural heritage. Finally, the total risk index map, which allows areas and hotspots at risk to be identified and ranked by means of Multi-Criteria Decision Analysis tools, has been produced to visualize the spatial pattern of flood risk within the area of study. By means of a tailored participative approach, the total risk maps supplement the consideration of technical experts with the (essential) point of view of the relevant stakeholders for the appraisal of the specific scores and weights related to the receptor-relative risks. The total risk maps obtained for the Sihl River case study are associated with the lower classes of risk. In general, higher relative risks are concentrated in the deeply urbanized area within and around the Zurich city centre and in areas lying just behind the Sihl River course. Here, forecasted injuries and potential fatalities are mainly due to high population density and the high presence of old (vulnerable) people; inundated buildings are mainly classified as continuous and discontinuous urban fabric; and flooded roads, pathways and railways, the majority of them related to the Zurich main train station (Hauptbahnhof), are at high risk of inundation, causing huge indirect damages. The analysis of flood risk to agriculture, natural and semi-natural systems and cultural heritage has pointed out that these receptors could be relatively less impacted by the selected flood scenario, mainly because of their scattered presence. Finally, the application of the KR-RRA methodology to the Sihl River case study, as well as to several other sites across Europe (not presented here), has demonstrated its flexibility and possible adaptation to different geographical and socio-economic contexts, depending on data availability and the peculiarities of the sites, as well as for other hazard scenarios.
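
    The total risk index aggregation can be sketched as a weighted linear combination of normalised receptor risk maps, a typical MCDA overlay; the weights and maps below are placeholders, not the stakeholder-elicited KULTURisk values.

    ```python
    import numpy as np

    def total_risk_index(risk_maps, weights):
        """Rescale each receptor risk map to [0, 1], then combine them with
        normalised weights into a single total risk index surface."""
        weights = np.asarray(weights, float) / np.sum(weights)
        stack = []
        for r in risk_maps:
            r = np.asarray(r, float)
            stack.append((r - r.min()) / (r.max() - r.min()))
        return np.tensordot(weights, np.stack(stack), axes=1)

    # six receptors: people, buildings, infrastructure, agriculture,
    # natural systems, cultural heritage (placeholder random maps)
    maps = [np.random.rand(50, 50) for _ in range(6)]
    tri = total_risk_index(maps, weights=[3, 2, 2, 1, 1, 1])
    ```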

  3. Process mapping evaluation of medication reconciliation in academic teaching hospitals: a critical step in quality improvement.

    PubMed

    Holbrook, Anne; Bowen, James M; Patel, Harsit; O'Brien, Chris; You, John J; Tahavori, Roshan; Doleweerd, Jeff; Berezny, Tim; Perri, Dan; Nieuwstraten, Carmine; Troyan, Sue; Patel, Ameen

    2016-12-30

    Medication reconciliation (MedRec) has been a mandated or recommended activity in Canada, the USA and the UK for nearly 10 years. Accreditation bodies in North America will soon require MedRec for every admission, transfer and discharge of every patient. Studies of MedRec have revealed unintentional discrepancies in prescriptions but no clear evidence that clinically important outcomes are improved, leading to widely variable practices. Our objective was to apply process mapping methodology to MedRec to clarify current processes and resource usage, identify potential efficiencies and gaps in care, and make recommendations for improvement in the light of current literature evidence of effectiveness. Process engineers observed and recorded all MedRec activities at 3 academic teaching hospitals, from initial emergency department triage to patient discharge, for general internal medicine patients. Process maps were validated with frontline staff, then with the study team, managers and patient safety leads to summarise current problems and discuss solutions. Across the 3 hospitals, 5 general problem themes were identified: lack of use of all available medication sources, duplication of effort creating inefficiency, lack of timeliness of completion of the Best Possible Medication History, lack of standardisation of the MedRec process, and suboptimal communication of MedRec issues between physicians, pharmacists and nurses. MedRec as practised in this environment requires improvements in quality, timeliness, consistency and dissemination. Further research exploring efficient use of resources, in terms of personnel and costs, is required.

  4. Terrestrial Ecosystems - Land Surface Forms of the Conterminous United States

    USGS Publications Warehouse

    Cress, Jill J.; Sayre, Roger G.; Comer, Patrick; Warner, Harumi

    2009-01-01

    As part of an effort to map terrestrial ecosystems, the U.S. Geological Survey has generated land surface form classes to be used in creating maps depicting standardized, terrestrial ecosystem models for the conterminous United States, using an ecosystems classification developed by NatureServe. A biophysical stratification approach, developed for South America and now being implemented globally, was used to model the ecosystem distributions. Since land surface forms strongly influence the differentiation and distribution of terrestrial ecosystems, they are one of the key input layers in this biophysical stratification. After extensive investigation into various land surface form mapping methodologies, the decision was made to use the methodology developed by the Missouri Resource Assessment Partnership (MoRAP). MoRAP made modifications to Hammond's land surface form classification, which allowed the use of 30-meter source data and a 1-km2 window for analyzing the data cell and its surrounding cells (neighborhood analysis). While Hammond's methodology was based on three topographic variables (slope, local relief, and profile type), MoRAP's methodology uses only slope and local relief. Using the MoRAP method, slope is classified as gently sloping when more than 50 percent of the area in a 1-km2 neighborhood has slope of less than 8 percent; otherwise the area is considered moderately sloping. Local relief, which is the difference between the maximum and minimum elevation in a neighborhood, is classified into five groups: 0-15 m, 16-30 m, 31-90 m, 91-150 m, and >150 m. The land surface form classes are derived by combining slope and local relief to create eight landform classes, ranging from flat plains (gently sloping, with low local relief) to low hills (not gently sloping, with local relief of up to 150 m). However, in the USGS application of the MoRAP methodology, an additional local relief group was used (>400 m) to capture additional local topographic variation. As a result, low mountains were redefined as not gently sloping with local relief of 151-400 m. The final application of the MoRAP methodology was implemented using the USGS 30-meter National Elevation Dataset and an existing USGS slope dataset that had been derived by calculating the slope from the NED in Universal Transverse Mercator (UTM) coordinates in each UTM zone, and then combining all of the zones into a national dataset. This map shows a smoothed image of the nine land surface form classes based on MoRAP's methodology. Additional information about this map and any data developed for the ecosystems modeling of the conterminous United States is available online at http://rmgsc.cr.usgs.gov/ecosystems/.
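
    A sketch of the MoRAP-style neighbourhood analysis using scipy.ndimage moving-window filters: majority slope class and max-minus-min local relief over a roughly 1-km2 window (33 cells of 30 m); the window size in cells and the demo rasters are illustrative.

    ```python
    import numpy as np
    from scipy.ndimage import maximum_filter, minimum_filter, uniform_filter

    def land_surface_forms(dem, slope_pct, window=33):
        """Classify slope as gentle where more than half of the cells in the
        window have slope < 8%, and compute local relief as the max-minus-min
        elevation in the same window; relief groups follow the abstract,
        including the extra >400 m class."""
        gentle = uniform_filter((slope_pct < 8).astype(float), window) > 0.5
        relief = maximum_filter(dem, window) - minimum_filter(dem, window)
        relief_class = np.digitize(relief, [15, 30, 90, 150, 400])
        return gentle, relief_class

    dem = np.random.rand(200, 200) * 300       # placeholder elevation (m)
    slope = np.random.rand(200, 200) * 20      # placeholder slope (%)
    gentle, relief_class = land_surface_forms(dem, slope)
    ```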

  5. A Bayesian-Based Novel Methodology to Generate Reliable Site Response Mapping Sensitive to Data Uncertainties

    NASA Astrophysics Data System (ADS)

    Chakraborty, A.; Goto, H.

    2017-12-01

    The 2011 off the Pacific coast of Tohoku earthquake caused severe damage in many areas further inside the mainland because of site amplification. Furukawa district in Miyagi Prefecture, Japan recorded significant spatial differences in ground motion even at sub-kilometer scales. The site responses in the damage zone far exceeded the levels in the hazard maps. One reason for this mismatch is that maps follow only the mean value at the measurement locations, with no regard to the data uncertainties, and thus are not always reliable. Our research objective is to develop a methodology to incorporate data uncertainties in mapping and propose a reliable map. The methodology is based on a hierarchical Bayesian model of normally-distributed site responses in space, where the mean (μ), site-specific variance (σ2) and between-sites variance (s2) parameters are treated as unknowns with a prior distribution. The observation data are artificially created site responses with varying means and variances for 150 seismic events across 50 locations in one-dimensional space. Spatially auto-correlated random effects were added to the mean (μ) using a conditionally autoregressive (CAR) prior. Inferences on the unknown parameters are made using Markov Chain Monte Carlo methods from the posterior distribution. The goal is to find reliable estimates of μ sensitive to uncertainties. During initial trials, we observed that the tau (=1/s2) parameter of the CAR prior controls the μ estimation. Using a constraint, s = 1/(k×σ), five spatial models with varying k-values were created. We define reliability to be measured by the model likelihood and propose the maximum likelihood model to be highly reliable. The model with maximum likelihood was selected using a 5-fold cross-validation technique. The results show that the maximum likelihood model (μ*) follows the site-specific mean at low uncertainties and converges to the model mean at higher uncertainties (Fig. 1). This result is highly significant as it successfully incorporates the effect of data uncertainties in mapping. This novel approach can be applied to any research field using mapping techniques. The methodology is now being applied to real records from a very dense seismic network in Furukawa district, Miyagi Prefecture, Japan to generate a reliable map of the site responses.
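
    A stripped-down Gibbs sampler for the non-spatial core of this hierarchy, where site means are shrunk towards a shared mean by precision weighting; the CAR spatial prior and the estimation of s are omitted for brevity, and all data are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # synthetic site responses: 50 locations x 150 events, varying site std
    true_mu = np.sin(np.linspace(0, 3 * np.pi, 50))
    sigma = rng.uniform(0.2, 2.0, 50)                 # site-specific std
    y = true_mu[:, None] + sigma[:, None] * rng.normal(size=(50, 150))

    n = y.shape[1]
    ybar, s = y.mean(axis=1), 0.5                     # s: between-sites std (fixed)
    mu, theta = ybar.copy(), ybar.mean()
    draws = []
    for it in range(2000):
        # mu_i | theta: precision-weighted compromise between the site mean
        # and the shared mean; noisy sites are pulled towards theta
        prec = n / sigma**2 + 1 / s**2
        mean = (n * ybar / sigma**2 + theta / s**2) / prec
        mu = rng.normal(mean, 1 / np.sqrt(prec))
        # theta | mu (flat prior); a CAR prior would replace this global
        # mean with a spatially smoothed neighbour mean
        theta = rng.normal(mu.mean(), s / np.sqrt(len(mu)))
        if it > 500:
            draws.append(mu)
    post_mean = np.mean(draws, axis=0)   # tracks ybar where sigma is small
    ```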

  6. The «New Map of Rome» by Giambattista Nolli: a precise representation of the urban space in the 18th century

    NASA Astrophysics Data System (ADS)

    Lelo, Keti; Travaglini, Carlo Maria

    2010-05-01

    The paper refers to the ongoing experience of the project "The Historic Atlas of Modern Rome" implemented by CROMA (Centro di ateneo per lo studio di Roma) - University Roma Tre. The project combines research in urban history with geographical information systems, and has as its main objective the study of the "historic environment" of Rome and its transformations. In 1748, Giovanni Battista Nolli (1692-1756) published his «New Map of Rome» (Nuova Pianta di Roma). This work represents the first geometrically correct representation of Rome within the city walls, and the only map deriving from a topographical survey of which the procedures are known. The map represents a precious source of information and a valid cartographic basis for the study of the successive phases of the city's development. The presentation will illustrate the characteristics of this cartographic source, the results obtained from the georeferencing process, and the construction of a GIS system for the city of Rome in the 18th century. The described methodology stands at the basis of the first volume of the Atlas, which will shortly be published in print as well as in a digital version, on a CD-ROM containing a graphical interface that permits the interactive interrogation of the map and databases.
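
    The georeferencing step typically reduces to fitting a transform from historical map coordinates to modern geographic coordinates through ground control points; a least-squares affine sketch, with invented control points, follows.

    ```python
    import numpy as np

    def fit_affine(src, dst):
        """Least-squares affine transform from map (pixel) coordinates to
        geographic coordinates, estimated from ground control points; a
        core step in georeferencing a historical map."""
        A = np.hstack([src, np.ones((len(src), 1))])
        params, *_ = np.linalg.lstsq(A, dst, rcond=None)
        return params                         # 3x2: rotation/scale plus offset

    # invented control points: pixel positions and lon/lat equivalents
    src = np.array([[100, 120], [840, 150], [400, 900], [820, 880]], float)
    dst = np.array([[12.47, 41.90], [12.51, 41.90], [12.48, 41.86], [12.51, 41.86]])
    P = fit_affine(src, dst)
    print(np.hstack([src, np.ones((4, 1))]) @ P)  # transformed control points
    ```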

  7. Mapping texts through dimensionality reduction and visualization techniques for interactive exploration of document collections

    NASA Astrophysics Data System (ADS)

    de Andrade Lopes, Alneu; Minghim, Rosane; Melo, Vinícius; Paulovich, Fernando V.

    2006-01-01

    The current availability of information often impairs the tasks of searching, browsing and analyzing information pertinent to a topic of interest. This paper presents a methodology to create a meaningful graphical representation of document corpora targeted at supporting the exploration of correlated documents. The purpose of such an approach is to produce a map of a document body on a research topic or field based on the analysis of document contents and similarities amongst articles. The document map is generated, after text pre-processing, by projecting the data in two dimensions using Latent Semantic Indexing. The projection is followed by hierarchical clustering to support sub-area identification. The map can be interactively explored, helping to narrow down the search for relevant articles. Tests were performed using a collection of documents pre-classified into three research subject classes: Case-Based Reasoning, Information Retrieval, and Inductive Logic Programming. The map produced was capable of separating the main areas, approaching documents by their similarity, revealing possible topics, and identifying boundaries between them. The tool supports the exploration of inter-topic and intra-topic relationships and is useful in many contexts that require deciding on relevant articles to read, such as scientific research, education, and training.
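
    The pipeline maps cleanly onto scikit-learn, with TruncatedSVD on tf-idf vectors playing the role of Latent Semantic Indexing; the four toy documents stand in for the pre-classified corpus.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.cluster import AgglomerativeClustering

    docs = ["case based reasoning retrieves and adapts prior cases",
            "term weighting and inverted indexes for information retrieval",
            "inductive logic programming induces clauses from examples",
            "retrieval models rank documents by similarity to a query"]

    X = TfidfVectorizer(stop_words="english").fit_transform(docs)
    coords = TruncatedSVD(n_components=2).fit_transform(X)   # LSI-style 2-D map
    labels = AgglomerativeClustering(n_clusters=2).fit_predict(coords)
    for c, xy, d in zip(labels, coords, docs):
        print(c, xy.round(2), d[:30])
    ```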

  8. Software engineering and Ada in design

    NASA Technical Reports Server (NTRS)

    Oneill, Don

    1986-01-01

    Modern software engineering promises significant reductions in software costs and improvements in software quality. The Ada language is the focus for these software methodology and tool improvements. The IBM FSD approach, including the software engineering practices that guide the systematic design and development of software products and the management of the software process, is examined. The revised Ada design language adaptation is revealed. This four-level design methodology is detailed, including the purpose of each level, the management strategy that integrates the software design activity with the program milestones, and the technical strategy that maps the Ada constructs to each level of design. A complete description of each design level is provided along with specific design language recording guidelines for each level. Finally, some testimony is offered on education, tools, architecture, and metrics resulting from project use of the four-level Ada design language adaptation.

  9. Markov state models of protein misfolding

    NASA Astrophysics Data System (ADS)

    Sirur, Anshul; De Sancho, David; Best, Robert B.

    2016-02-01

    Markov state models (MSMs) are an extremely useful tool for understanding the conformational dynamics of macromolecules and for analyzing MD simulations in a quantitative fashion. They have been extensively used for peptide and protein folding, for small molecule binding, and for the study of native ensemble dynamics. Here, we adapt the MSM methodology to gain insight into the dynamics of misfolded states. To overcome possible flaws in root-mean-square deviation (RMSD)-based metrics, we introduce a novel discretization approach, based on coarse-grained contact maps. In addition, we extend the MSM methodology to include "sink" states in order to account for the irreversibility (on simulation time scales) of processes like protein misfolding. We apply this method to analyze the mechanism of misfolding of tandem repeats of titin domains, and how it is influenced by confinement in a chaperonin-like cavity.
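
    The "sink" idea is simple to show on a toy transition matrix: estimate row-normalised transition probabilities from discretized trajectories, then make the misfolded state absorbing so probability flux into it never returns. The states and trajectories here are invented.

    ```python
    import numpy as np

    def msm_with_sink(dtrajs, n_states, sink):
        """Row-normalised transition-count matrix with one state made
        absorbing ('sink'): its outgoing transitions are removed so flux
        into it accumulates, mimicking irreversibility on simulation
        timescales."""
        C = np.zeros((n_states, n_states))
        for traj in dtrajs:
            for a, b in zip(traj[:-1], traj[1:]):
                C[a, b] += 1
        T = C / np.maximum(C.sum(axis=1, keepdims=True), 1)
        T[sink] = 0.0
        T[sink, sink] = 1.0                   # absorbing misfolded state
        return T

    dtrajs = [np.array([0, 0, 1, 2, 1, 3, 3]), np.array([0, 1, 1, 3, 3, 3])]
    T = msm_with_sink(dtrajs, n_states=4, sink=3)
    p = np.array([1.0, 0, 0, 0])
    for _ in range(20):
        p = p @ T                             # probability drains into the sink
    print(p)
    ```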

  10. An embodied perspective on expertise in solving the problem of making a geologic map

    NASA Astrophysics Data System (ADS)

    Callahan, Caitlin Norah

    The task of constructing a geologic map is a cognitively and physically demanding field-based problem. The map produced is understood to be an individual's two-dimensional interpretation or mental model of the three-dimensional underlying geology. A popular view within the geoscience community is that teaching students how to make a geologic map is valuable for preparing them to deal with disparate and incomplete data sets, for helping them develop problem-solving skills, and for acquiring expertise in geology. Few previous studies have focused specifically on expertise in geologic mapping. Drawing from literature related to expertise, to problem solving, and to mental models, two overarching research questions were identified: How do geologists of different levels of expertise constrain and solve an ill-structured problem such as making a geologic map? How do geologists address the uncertainties inherent to the processes and interpretations involved in solving a geologic mapping problem? These questions were answered using a methodology that captured the physical actions, expressed thoughts, and navigation paths of geologists as they made a geologic map. Eight geologists, from novice to expert, wore a head-mounted video camera with an attached microphone to record those actions and thoughts, creating "video logs" while in the field. The video logs were also time-stamped, which allowed the visual and audio data to be synchronized with the GPS data that tracked participants' movements in the field. Analysis of the video logs yielded evidence that all eight participants expressed thoughts that reflected the process of becoming mentally situated in the mapping task (e.g. relating distance on a map to distance in three-dimensional space); the prominence of several of these early thoughts waned in the expressed thoughts later in the day. All participants collected several types of data while in the field; novices, however, did so more continuously throughout the day, whereas the experts collected more of their data earlier in the day. Experts and novices also differed in that experts focused more on evaluating certainty in their interpretations; the novices focused more on evaluating the certainty of their observations and sense of location.

  11. Structured syncope care pathways based on lean six sigma methodology optimises resource use with shorter time to diagnosis and increased diagnostic yield.

    PubMed

    Martens, Leon; Goode, Grahame; Wold, Johan F H; Beck, Lionel; Martin, Georgina; Perings, Christian; Stolt, Pelle; Baggerman, Lucas

    2014-01-01

    To conduct a pilot study on the potential to optimise care pathways in syncope/Transient Loss of Consciousness management by using Lean Six Sigma methodology while maintaining compliance with ESC and/or NICE guidelines. Five hospitals in four European countries took part. The Lean Six Sigma methodology consisted of 3 phases: 1) Assessment phase, in which baseline performance was mapped in each centre, processes were evaluated and a new operational model was developed with an improvement plan that included best practices and change management; 2) Improvement phase, in which optimisation pathways and standardised best practice tools and forms were developed and implemented. Staff were trained on new processes and change-management support provided; 3) Sustaining phase, which included support, refinement of tools and metrics. The impact of the implementation of new pathways was evaluated on number of tests performed, diagnostic yield, time to diagnosis and compliance with guidelines. One hospital with focus on geriatric populations was analysed separately from the other four. With the new pathways, there was a 59% reduction in the average time to diagnosis (p = 0.048) and a 75% increase in diagnostic yield (p = 0.007). There was a marked reduction in repetitions of diagnostic tests and improved prioritisation of indicated tests. Applying a structured Lean Six Sigma based methodology to pathways for syncope management has the potential to improve time to diagnosis and diagnostic yield.

  12. Structured Syncope Care Pathways Based on Lean Six Sigma Methodology Optimises Resource Use with Shorter Time to Diagnosis and Increased Diagnostic Yield

    PubMed Central

    Martens, Leon; Goode, Grahame; Wold, Johan F. H.; Beck, Lionel; Martin, Georgina; Perings, Christian; Stolt, Pelle; Baggerman, Lucas

    2014-01-01

    Aims To conduct a pilot study on the potential to optimise care pathways in syncope/Transient Loss of Consciousness management by using Lean Six Sigma methodology while maintaining compliance with ESC and/or NICE guidelines. Methods Five hospitals in four European countries took part. The Lean Six Sigma methodology consisted of 3 phases: 1) Assessment phase, in which baseline performance was mapped in each centre, processes were evaluated and a new operational model was developed with an improvement plan that included best practices and change management; 2) Improvement phase, in which optimisation pathways and standardised best practice tools and forms were developed and implemented. Staff were trained on new processes and change-management support provided; 3) Sustaining phase, which included support, refinement of tools and metrics. The impact of the implementation of new pathways was evaluated on number of tests performed, diagnostic yield, time to diagnosis and compliance with guidelines. One hospital with focus on geriatric populations was analysed separately from the other four. Results With the new pathways, there was a 59% reduction in the average time to diagnosis (p = 0.048) and a 75% increase in diagnostic yield (p = 0.007). There was a marked reduction in repetitions of diagnostic tests and improved prioritisation of indicated tests. Conclusions Applying a structured Lean Six Sigma based methodology to pathways for syncope management has the potential to improve time to diagnosis and diagnostic yield. PMID:24927475

  13. Mapping species distribution of Canarian Monteverde forest by field spectroradiometry and satellite imagery

    NASA Astrophysics Data System (ADS)

    Martín-Luis, Antonio; Arbelo, Manuel; Hernández-Leal, Pedro; Arbelo-Bayó, Manuel

    2016-10-01

    Reliable and updated maps of vegetation in protected natural areas are essential for proper management and conservation. Remote sensing is a valid tool for this purpose. In this study, a methodology based on a WorldView-2 (WV-2) satellite image and in situ spectral signature measurements was applied to map the Canarian Monteverde ecosystem located in the north of Tenerife Island (Canary Islands, Spain). Due to the high spectral similarity of vegetation species in the study zone, a Multiple Endmember Spectral Mixture Analysis (MESMA) was performed. MESMA determines the fractional cover of different components within one pixel and allows for a pixel-by-pixel variation of endmembers. Two libraries of endmembers were collected for the most abundant species in the test area. The first library was collected from in situ spectral signatures measured with an ASD spectroradiometer during a field campaign in June 2015. The second library was obtained from pure pixels identified in the satellite image for the same species. The accuracy of the mapping process was assessed from a set of independent validation plots. The overall accuracy for the ASD-based method was 60.51%, compared to the 86.67% reached for the WV-2-based mapping. The results suggest the possibility of using WV-2 images for monitoring and regularly updating the maps of the Monteverde forest on the island of Tenerife.
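
    A toy MESMA sketch: for each pixel, try every combination of one candidate spectrum per class, unmix with non-negative least squares, and keep the combination with the lowest residual; the spectra and band count are placeholders.

    ```python
    import numpy as np
    from itertools import product
    from scipy.optimize import nnls

    def mesma_pixel(pixel, libraries):
        """Per-pixel endmember selection: the lowest-residual constrained
        unmixing over all endmember combinations is what distinguishes
        MESMA from simple spectral mixture analysis."""
        best = (np.inf, None, None)
        for combo in product(*libraries):
            E = np.column_stack(combo)        # bands x endmembers
            f, resid = nnls(E, pixel)         # non-negative fractions
            if resid < best[0]:
                best = (resid, f / f.sum(), combo)
        return best

    # two species classes, two candidate spectra each (placeholder, 6 bands)
    rng = np.random.default_rng(2)
    libs = [[rng.random(6) for _ in range(2)] for _ in range(2)]
    pixel = 0.6 * libs[0][0] + 0.4 * libs[1][1]
    resid, fractions, _ = mesma_pixel(pixel, libs)
    print(fractions)                          # ~[0.6, 0.4]
    ```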

  14. [Do you mean benchmarking?].

    PubMed

    Bonnet, F; Solignac, S; Marty, J

    2008-03-01

    The purpose of benchmarking is to establish improvement processes by comparing activities to quality standards. The proposed methodology is illustrated by benchmarking business cases performed in medical facilities on items such as nosocomial diseases or the organization of surgery facilities. Moreover, the authors have built a specific graphic tool, enhanced with balanced scorecard figures and mappings, so that the comparison between different anesthesia-reanimation services that are willing to start an improvement program is easy and relevant. This ready-made application is even more accurate where detailed tariffs of activities are implemented.

  15. Indexing Anatomical Phrases in Neuro-Radiology Reports to the UMLS 2005AA

    PubMed Central

    Bashyam, Vijayaraghavan; Taira, Ricky K.

    2005-01-01

    This work describes a methodology to index anatomical phrases to the 2005AA release of the Unified Medical Language System (UMLS). A phrase chunking tool based on Natural Language Processing (NLP) was developed to identify semantically coherent phrases within medical reports. Using this phrase chunker, a set of 2,551 unique anatomical phrases was extracted from brain radiology reports. These phrases were mapped to the 2005AA release of the UMLS using a vector space model. Precision for the task of indexing unique phrases was 0.87. PMID:16778995
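
    A vector-space sketch of the indexing step using character n-gram tf-idf and cosine similarity; the concept strings and identifiers below are invented stand-ins for the UMLS 2005AA source, not real CUIs.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # toy stand-in for UMLS concept names; identifiers are invented
    concepts = {"C0000001": "frontal lobe", "C0000002": "parietal lobe",
                "C0000003": "lateral ventricle"}

    vec = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
    M = vec.fit_transform(concepts.values())

    def index_phrase(phrase):
        """Return the concept whose name is most similar to the
        extracted anatomical phrase in the tf-idf vector space."""
        sims = cosine_similarity(vec.transform([phrase]), M).ravel()
        return list(concepts)[sims.argmax()], round(float(sims.max()), 3)

    print(index_phrase("left frontal lobes"))
    ```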

  16. Development of a 30 m Spatial Resolution Land Cover of Canada: Contribution to the Harmonized North America Land Cover Dataset

    NASA Astrophysics Data System (ADS)

    Pouliot, D.; Latifovic, R.; Olthof, I.

    2017-12-01

    Land cover is needed for a large range of environmental applications regarding climate impacts and adaptation, emergency response, wildlife habitat, air quality, water yield, etc. In Canada, a 2008 user survey revealed that the most practical scale for provision of land cover data is 30 m, nationwide, with an update frequency of five years (Ball, 2008). In response to this need, the Canada Centre for Remote Sensing has generated a 30 m land cover map of Canada for the base year 2010 as part of a planned series of maps at the recommended five-year update frequency. This land cover is the Canadian contribution to the North American Land Change Monitoring System initiative, which seeks to provide harmonized land cover across Canada, the United States, and Mexico. The methodology developed in this research utilized a combination of unsupervised and machine learning techniques to map land cover, blend results between mapping units, locally optimize results, and process some thematic attributes with specific feature sets. Accuracy assessed against available field data averaged 75% across the five study areas assessed. In this presentation an overview of the unique processing aspects, example results, and initial accuracy assessment will be discussed.

  17. Generic framework for mining cellular automata models on protein-folding simulations.

    PubMed

    Diaz, N; Tischer, I

    2016-05-13

    Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework to ease the development process of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. Our framework was developed using a methodology based on design patterns that allows an improved development experience for new algorithms. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future use for the new tool is outlined.
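
    As a minimal illustration of what a binary cellular automaton model is (the paper mines such rules from folding trajectories; the connection to contact maps here is schematic), this sketch applies one elementary CA update step using the standard Wolfram rule encoding.

    ```python
    import numpy as np

    def step(cells, rule=110):
        """One synchronous update of an elementary (radius-1, binary)
        cellular automaton; the rule number encodes the 8-entry lookup
        table over (left, centre, right) neighbourhoods."""
        table = [(rule >> i) & 1 for i in range(8)]
        left, right = np.roll(cells, 1), np.roll(cells, -1)
        return np.array([table[4 * l + 2 * c + r]
                         for l, c, r in zip(left, cells, right)])

    # evolve a toy binary row; mined models act on contact-map states instead
    row = np.zeros(31, dtype=int); row[15] = 1
    for _ in range(5):
        print("".join("#" if v else "." for v in row)); row = step(row)
    ```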

  18. Economic vulnerability of timber resources to forest fires.

    PubMed

    y Silva, Francisco Rodríguez; Molina, Juan Ramón; González-Cabán, Armando; Machuca, Miguel Ángel Herrera

    2012-06-15

    The temporal-spatial planning of activities for a territorial fire management program requires knowing the value of forest ecosystems. In this paper we extend the economic valuation principle to the concept of economic vulnerability and present a methodology for the economic valuation of forest production ecosystems. Forest vulnerability is analyzed using criteria intrinsically associated with forest characterization and with the potential behavior of surface fires. Integrating a mapping process of fire potential with analytical valuation algorithms facilitates the implementation of fire prevention planning. The availability of cartography of the economic vulnerability of forest ecosystems is fundamental for budget optimization and for supporting the decision-making process. Published by Elsevier Ltd.

  19. Assessing Hydrologic Impacts of Future Land Cover Change ...

    EPA Pesticide Factsheets

    Long-term land-use and land cover change and their associated impacts pose critical challenges to sustaining vital hydrological ecosystem services for future generations. In this study, a methodology was developed on the San Pedro River Basin to characterize hydrologic impacts from future urban growth through time. This methodology was then expanded and utilized to characterize the changing hydrology of the South Platte River Basin. Future urban growth is represented by housing density maps generated in decadal intervals from 2010 to 2100, produced by the U.S. Environmental Protection Agency (EPA) Integrated Climate and Land-Use Scenarios (ICLUS) project. ICLUS developed future housing density maps by adapting the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) social, economic, and demographic storylines to the conterminous United States. To characterize hydrologic impacts from future growth, the housing density maps were reclassified to National Land Cover Database (NLCD) 2006 land cover classes and used to parameterize the Soil and Water Assessment Tool (SWAT) using the Automated Geospatial Watershed Assessment (AGWA) tool. The objectives of this project were to 1) develop and describe a methodology for adapting the ICLUS data for use in AGWA as an approach to evaluate basin-wide impacts of development on water quantity and quality, and 2) present initial results from the application of the methodology to

  20. An Ontology for State Analysis: Formalizing the Mapping to SysML

    NASA Technical Reports Server (NTRS)

    Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel

    2012-01-01

    State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis and greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML and approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.

  1. [Molecular combing method in the study of DNA replication parameters in isolated organs of Drosophila melanogaster].

    PubMed

    Ivankin, A V; Kolesnikova, T D; Demakov, S A; Andreenkov, O V; Bil'danova, E R; Andreenkova, N G; Zhimulev, I F

    2011-01-01

    Methods of physical DNA mapping and direct visualization of replication and transcription in specific regions of the genome play a crucial role in studies of the structural and functional organization of eukaryotic genomes. Since DNA strands in cells are organized into higher-order structures and are present as highly compacted chromosomes, the majority of these methods have lower resolution at the chromosomal level. One approach to enhance resolution and mapping accuracy is the method of molecular combing. The method is based on stretching and aligning DNA molecules that are covalently attached by one end to a cover-glass surface. In this article we describe the major methodological steps of molecular combing and their adaptation for studying DNA replication parameters in polyploid and diploid tissues of Drosophila larvae.

  2. A strategic map for high-impact virtual experience design

    NASA Astrophysics Data System (ADS)

    Faste, Haakon; Bergamasco, Massimo

    2009-02-01

    We have employed methodologies of human centered design to inspire and guide the engineering of a definitive low-cost aesthetic multimodal experience intended to stimulate cultural growth. Using a combination of design research, trend analysis and the programming of immersive virtual 3D worlds, over 250 innovative concepts have been brainstormed, prototyped, evaluated and refined. These concepts have been used to create a strategic map for the development of high-impact virtual art experiences, the most promising of which have been incorporated into a multimodal environment programmed in the online interactive 3D platform XVR. A group of test users have evaluated the experience as it has evolved, using a multimodal interface with stereo vision, 3D audio and haptic feedback. This paper discusses the process, content, results, and impact on our engineering laboratory that this research has produced.

  3. Annual Fossil-Fuel CO2 Emissions: Uncertainty of Emissions Gridded by One Degree Latitude by One Degree Longitude (1950-2013) (V. 2016)

    DOE Data Explorer

    Andres, R. J. [CDIAC; Boden, T. A. [CDIAC

    2016-01-01

    The annual, gridded fossil-fuel CO2 emissions uncertainty estimates for 1950-2013 provided in this database are derived from time series of global, regional, and national fossil-fuel CO2 emissions (Boden et al. 2016). Andres et al. (2016) describe the basic methodology for estimating the uncertainty in the gridded fossil-fuel data product. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. This gridded uncertainty includes uncertainty contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughout this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty.

  4. Monthly Fossil-Fuel CO2 Emissions: Uncertainty of Emissions Gridded by One Degree Latitude by One Degree Longitude (Uncertainties, V.2016)

    DOE Data Explorer

    Andres, J.A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Boden, T.A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-01-01

    The monthly, gridded fossil-fuel CO2 emissions uncertainty estimates for 1950-2013 provided in this database are derived from time series of global, regional, and national fossil-fuel CO2 emissions (Boden et al. 2016). Andres et al. (2016) describe the basic methodology for estimating the uncertainty in the gridded fossil-fuel data product. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. This gridded uncertainty includes uncertainty contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughout this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty.

  5. Distributed Fast Self-Organized Maps for Massive Spectrophotometric Data Analysis †.

    PubMed

    Dafonte, Carlos; Garabato, Daniel; Álvarez, Marco A; Manteiga, Minia

    2018-05-03

    Analyzing huge amounts of data becomes essential in the era of Big Data, where databases are populated with hundreds of Gigabytes that must be processed to extract knowledge. Hence, classical algorithms must be adapted towards distributed computing methodologies that leverage the underlying computational power of these platforms. Here, a parallel, scalable, and optimized design for self-organized maps (SOM) is proposed in order to analyze massive data gathered by the spectrophotometric sensor of the European Space Agency (ESA) Gaia spacecraft, although it could be extrapolated to other domains. The performance comparison between the sequential implementation and the distributed ones based on Apache Hadoop and Apache Spark is an important part of the work, as well as the detailed analysis of the proposed optimizations. Finally, a domain-specific visualization tool to explore astronomical SOMs is presented.
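
    As a rough illustration of what the distributed implementations parallelize, here is a minimal sequential SOM training step in NumPy: find the best-matching unit for a sample, then pull nearby map weights toward it under a shrinking Gaussian neighborhood. Map size, decay schedules, and the random 3-D "spectra" are illustrative assumptions; the Hadoop/Spark distribution itself is not reproduced.

```python
# Minimal sequential SOM sketch: best-matching-unit search plus a Gaussian
# neighborhood update. The distributed versions described in the paper
# parallelize this inner loop over data chunks.
import numpy as np

rng = np.random.default_rng(0)
rows, cols, dim = 10, 10, 3
weights = rng.random((rows, cols, dim))
grid = np.dstack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"))

def train_step(weights, x, t, n_steps, lr0=0.5, sigma0=3.0):
    frac = 1 - t / n_steps
    lr, sigma = lr0 * frac, sigma0 * frac + 1e-3       # decaying schedules
    bmu = np.unravel_index(                            # best-matching unit
        np.argmin(((weights - x) ** 2).sum(axis=2)), (rows, cols))
    d2 = ((grid - np.array(bmu)) ** 2).sum(axis=2)     # grid distance to BMU
    h = np.exp(-d2 / (2 * sigma ** 2))[..., None]      # neighborhood kernel
    weights += lr * h * (x - weights)                  # in-place update

data = rng.random((1000, dim))                         # stand-in for spectra
for t, x in enumerate(data):
    train_step(weights, x, t, len(data))
```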

  6. Failure mode and effects analysis outputs: are they valid?

    PubMed Central

    2012-01-01

    Background Failure Mode and Effects Analysis (FMEA) is a prospective risk assessment tool that has been widely used within the aerospace and automotive industries and has been utilised within healthcare since the early 1990s. The aim of this study was to explore the validity of FMEA outputs within a hospital setting in the United Kingdom. Methods Two multidisciplinary teams each conducted an FMEA for the use of vancomycin and gentamicin. Four different validity tests were conducted: · Face validity: by comparing the FMEA participants’ mapped processes with observational work. · Content validity: by presenting the FMEA findings to other healthcare professionals. · Criterion validity: by comparing the FMEA findings with data reported on the trust’s incident report database. · Construct validity: by exploring the relevant mathematical theories involved in calculating the FMEA risk priority number. Results Face validity was positive as the researcher documented the same processes of care as mapped by the FMEA participants. However, other healthcare professionals identified potential failures missed by the FMEA teams. Furthermore, the FMEA groups failed to include failures related to omitted doses; yet these were the failures most commonly reported in the trust’s incident database. Calculating the RPN by multiplying severity, probability and detectability scores was deemed invalid because it is based on calculations that breach the mathematical properties of the scales used. Conclusion There are significant methodological challenges in validating FMEA. It is a useful tool to aid multidisciplinary groups in mapping and understanding a process of care; however, the results of our study cast doubt on its validity. FMEA teams are likely to need different sources of information, besides their personal experience and knowledge, to identify potential failures. As for FMEA’s methodology for scoring failures, there were discrepancies between the teams’ estimates and similar incidents reported on the trust’s incident database. Furthermore, the concept of multiplying ordinal scales to prioritise failures is mathematically flawed. Until FMEA’s validity is further explored, healthcare organisations should not solely depend on their FMEA results to prioritise patient safety issues. PMID:22682433
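
    For readers unfamiliar with the score at issue, the risk priority number is the product RPN = severity × occurrence × detectability, each rated on an ordinal scale. The tiny example below uses invented scores to show the kind of artifact the study criticizes: unrelated failure modes tie, and a maximal-severity failure can rank last.

```python
# RPN = severity x occurrence x detectability, the scoring rule whose
# validity the study questions. Scores are invented for illustration.
failures = {
    "wrong infusion rate": {"S": 9, "O": 2, "D": 5},   # RPN = 90
    "omitted dose":        {"S": 6, "O": 5, "D": 3},   # RPN = 90
    "mislabelled syringe": {"S": 10, "O": 1, "D": 8},  # RPN = 80
}
for name, f in failures.items():
    print(name, f["S"] * f["O"] * f["D"])
# Multiplying ordinal scores treats them as ratio-scale numbers, which is
# the mathematical objection raised in the study.
```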

  7. Rapid in vivo apparent diffusion coefficient mapping of hyperpolarized 13C metabolites.

    PubMed

    Koelsch, Bertram L; Reed, Galen D; Keshari, Kayvan R; Chaumeil, Myriam M; Bok, Robert; Ronen, Sabrina M; Vigneron, Daniel B; Kurhanewicz, John; Larson, Peder E Z

    2015-09-01

    Hyperpolarized 13C magnetic resonance allows for the study of real-time metabolism in vivo, including significant hyperpolarized 13C lactate production in many tumors. Other studies have shown that aggressive and highly metastatic tumors rapidly transport lactate out of cells. Thus, the ability to not only measure the production of hyperpolarized 13C lactate but also understand its compartmentalization using diffusion-weighted MR will provide unique information for improved tumor characterization. We used a bipolar, pulsed-gradient, double spin echo imaging sequence to rapidly generate diffusion-weighted images of hyperpolarized 13C metabolites. Our methodology included a simultaneously acquired B1 map to improve apparent diffusion coefficient (ADC) accuracy and a diffusion-compensated variable flip angle scheme to improve ADC precision. We validated this sequence and methodology in hyperpolarized 13C phantoms. Next, we generated ADC maps of several hyperpolarized 13C metabolites in a normal rat, rat brain tumor, and prostate cancer mouse model using both preclinical and clinical trial-ready hardware. ADC maps of hyperpolarized 13C metabolites provide information about the localization of these molecules in the tissue microenvironment. The methodology presented here allows for further studies to investigate ADC changes due to disease state that may provide unique information about cancer aggressiveness and metastatic potential. © 2014 Wiley Periodicals, Inc.
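
    The abstract does not spell out the ADC computation, but the standard mono-exponential model it relies on is S(b) = S0·exp(-b·ADC), fitted per voxel over the acquired b-values. Below is a minimal log-linear least-squares sketch with synthetic signals; the b-values and noise level are assumptions, not the paper's protocol.

```python
# ADC estimation sketch: fit S(b) = S0 * exp(-b * ADC) by a log-linear
# least-squares fit over the b-values. All numbers are synthetic.
import numpy as np

b = np.array([0.0, 150.0, 300.0, 600.0])            # s/mm^2, illustrative
true_adc = 1.8e-3                                    # mm^2/s
signal = 100.0 * np.exp(-b * true_adc)
signal *= 1 + 0.01 * np.random.default_rng(1).standard_normal(b.size)

A = np.vstack([np.ones_like(b), -b]).T               # ln S = ln S0 - b * ADC
ln_s0, adc = np.linalg.lstsq(A, np.log(signal), rcond=None)[0]
print(f"ADC = {adc:.2e} mm^2/s")                     # recovers ~1.8e-3
```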

  8. GREENSCOPE: A Method for Modeling Chemical Process ...

    EPA Pesticide Factsheets

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Efficiency, and Energy, can evaluate processes with over a hundred different indicators. These indicators provide a means for realizing the principles of green chemistry and green engineering in the context of sustainability. Development of the methodology has centered around three focal points. One is a taxonomy of impacts that describe the indicators and provide absolute scales for their evaluation. The setting of best and worst limits for the indicators allows the user to know the status of the process under study in relation to understood values. Thus, existing or imagined processes can be evaluated according to their relative indicator scores, and process modifications can strive towards realizable targets. A second area of focus is in advancing definitions of data needs for the many indicators of the taxonomy. Each indicator has specific data that are necessary for its calculation. Values needed and data sources have been identified. These needs can be mapped according to the information source (e.g., input stream, output stream, external data, etc.) for each of the bases. The user can visualize data-indicator relationships on the way to choosing selected ones for evalua

  9. Temporally inter-comparable maps of terrestrial wilderness and the Last of the Wild

    PubMed Central

    Allan, James R.; Venter, Oscar; Watson, James E.M.

    2017-01-01

    Wilderness areas, defined as areas free of industrial scale activities and other human pressures which result in significant biophysical disturbance, are important for biodiversity conservation and sustaining the key ecological processes underpinning planetary life-support systems. Despite their importance, wilderness areas are being rapidly eroded in extent and fragmented. Here we present the most up-to-date temporally inter-comparable maps of global terrestrial wilderness areas, which are essential for monitoring changes in their extent, and for proactively planning conservation interventions to ensure their preservation. Using maps of human pressure on the natural environment for 1993 and 2009, we identified wilderness as all ‘pressure free’ lands with a contiguous area >10,000 km2. These places are likely operating in a natural state and represent the most intact habitats globally. We then created a regionally representative map of wilderness following the well-established ‘Last of the Wild’ methodology; which identifies the 10% area with the lowest human pressure within each of Earth’s 60 biogeographic realms, and identifies the ten largest contiguous areas, along with all contiguous areas >10,000 km2. PMID:29231923
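
    The contiguity step of this methodology translates directly into raster operations: take the lowest-pressure fraction of a human-pressure grid, label connected components, and retain components whose area exceeds 10,000 km2. A sketch with a synthetic pressure field standing in for the real human-footprint data (cell size and smoothing are assumptions):

```python
# Sketch of the wilderness-extraction step: lowest-pressure 10% of the
# grid, contiguous labelling, area filter at 10,000 km^2. The smoothed
# random field and 1 km^2 cells are synthetic placeholders.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
pressure = ndimage.gaussian_filter(rng.random((500, 500)), sigma=15)
cell_area_km2 = 1.0

low = pressure <= np.quantile(pressure, 0.10)        # lowest-pressure decile
labels, n = ndimage.label(low)                       # contiguous regions
sizes = ndimage.sum(low, labels, index=np.arange(1, n + 1))
keep = np.flatnonzero(sizes * cell_area_km2 > 10_000) + 1
wild = np.isin(labels, keep)
print(f"{n} patches; {len(keep)} exceed 10,000 km^2; {wild.sum()} cells kept")
```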

  10. Constructing a strategy map for banking institutions with key performance indicators of the balanced scorecard.

    PubMed

    Wu, Hung-Yi

    2012-08-01

    This study presents a structural evaluation methodology to link key performance indicators (KPIs) into a strategy map of the balanced scorecard (BSC) for banking institutions. Corresponding with the four BSC perspectives (finance, customer, internal business process, and learning and growth), the most important evaluation indicators of banking performance are synthesized from the relevant literature and screened by a committee of experts. The Decision Making Trial and Evaluation Laboratory (DEMATEL) method, a multiple criteria analysis tool, is then employed to determine the causal relationships between the KPIs, to identify the critical central and influential factors, and to establish a visualized strategy map with logical links to improve banking performance. An empirical application is provided as an example. According to the expert evaluations, the three most essential KPIs for banking performance are customer satisfaction, sales performance, and customer retention rate. The DEMATEL results demonstrate a clear road map to assist management in prioritizing the performance indicators and in focusing attention on the strategy-related activities of the crucial indicators. According to the constructed strategy map, management could better invest limited resources in the areas that need improvement most. Although these strategy maps of the BSC are not universal, the research results show that the presented approach is an objective and feasible way to construct strategy maps more justifiably. The proposed framework can be applicable to institutions in other industries as well. Copyright © 2011 Elsevier Ltd. All rights reserved.
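
    The DEMATEL computation itself is compact: normalize the expert direct-influence matrix D, form the total-relation matrix T = N(I - N)^-1, and read each factor's prominence (r + c) and net causal role (r - c) from T's row and column sums. A minimal sketch with an invented 3x3 matrix over the three KPIs the study highlights:

```python
# Minimal DEMATEL sketch over three KPIs; influence scores are invented.
import numpy as np

kpis = ["customer satisfaction", "sales performance", "customer retention"]
D = np.array([[0, 3, 4],     # row i = how strongly KPI i influences KPI j
              [1, 0, 2],
              [2, 3, 0]], dtype=float)

N = D / max(D.sum(axis=1).max(), D.sum(axis=0).max())  # normalization
T = N @ np.linalg.inv(np.eye(3) - N)                   # total relation
r, c = T.sum(axis=1), T.sum(axis=0)
for k, prom, rel in zip(kpis, r + c, r - c):
    print(f"{k}: prominence={prom:.2f}, relation={rel:+.2f} "
          f"({'cause' if rel > 0 else 'effect'})")
```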

  11. High and ultra-high resolution metabolite mapping of the human brain using 1H FID MRSI at 9.4T.

    PubMed

    Nassirpour, Sahar; Chang, Paul; Henning, Anke

    2018-03-01

    Magnetic resonance spectroscopic imaging (MRSI) is a promising technique for mapping the spatial distribution of multiple metabolites in the human brain. These metabolite maps can be used as a diagnostic tool to gain insight into several biochemical processes and diseases in the brain. In comparison to lower field strengths, MRSI at ultra-high field strengths benefits from a higher signal to noise ratio (SNR) as well as higher chemical shift dispersion, and hence spectral resolution. This study combines the benefits of an ultra-high field magnet with the advantages of an ultra-short TE and TR single-slice FID-MRSI sequence (such as negligible J-evolution and loss of SNR due to T2 relaxation effects) and presents the first metabolite maps acquired at 9.4T in the healthy human brain at both high (voxel size of 97.6 µL) and ultra-high (voxel size of 24.4 µL) spatial resolutions in scan times of 11 and 46 min, respectively. In comparison to lower field strengths, more anatomically-detailed maps with higher SNR from a larger number of metabolites are shown. A total of 12 metabolites including glutamate (Glu), glutamine (Gln), N-acetyl-aspartyl-glutamate (NAAG), gamma-aminobutyric acid (GABA) and glutathione (GSH) are reliably mapped. A comprehensive description of the methodology behind these maps is provided. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. "Total Deposition (TDEP) Maps"

    EPA Science Inventory

    The presentation provides an update on the use of a hybrid methodology that relies on measured values from national monitoring networks and modeled values from CMAQ to produce maps of total deposition for use in critical loads and other ecological assessments. Additionally, c...

  13. Automating Flood Hazard Mapping Methods for Near Real-time Storm Surge Inundation and Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Weigel, A. M.; Griffin, R.; Gallagher, D.

    2015-12-01

    Storm surge has enough destructive power to damage buildings and infrastructure, erode beaches, and threaten human life across large geographic areas, hence posing the greatest threat of all the hurricane hazards. The United States Gulf of Mexico has proven vulnerable to hurricanes as it has been hit by some of the most destructive hurricanes on record. With projected rises in sea level and increases in hurricane activity, there is a need to better understand the associated risks for disaster mitigation, preparedness, and response. GIS has become a critical tool in enhancing disaster planning, risk assessment, and emergency response by communicating spatial information through a multi-layer approach. However, there is a need for a near real-time method of identifying areas with a high risk of being impacted by storm surge. Research was conducted alongside Baron, a private industry weather enterprise, to facilitate automated modeling and visualization of storm surge inundation and vulnerability on a near real-time basis. This research successfully automated current flood hazard mapping techniques using a GIS framework written in a Python programming environment, and displayed resulting data through an Application Program Interface (API). Data used for this methodology included high resolution topography, NOAA Probabilistic Surge model outputs parsed from Rich Site Summary (RSS) feeds, and the NOAA Census tract level Social Vulnerability Index (SoVI). The development process required extensive data processing and management to provide high resolution visualizations of potential flooding and population vulnerability in a timely manner. The accuracy of the developed methodology was assessed using Hurricane Isaac as a case study, which through a USGS and NOAA partnership, contained ample data for statistical analysis. This research successfully created a fully automated, near real-time method for mapping high resolution storm surge inundation and vulnerability for the Gulf of Mexico, and improved the accuracy and resolution of the Probabilistic Storm Surge model.
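
    The core overlay at the heart of such a pipeline is simple even though the production system involves RSS parsing, GIS tooling, and an API: subtract ground elevation from the surge surface to get inundation depth, then intersect flooded cells with vulnerability scores. A sketch with synthetic stand-ins for the DEM, P-Surge grid, and SoVI layer:

```python
# Core overlay step of a surge-mapping pipeline; all inputs are synthetic
# stand-ins for the DEM, probabilistic surge grid, and SoVI scores.
import numpy as np

rng = np.random.default_rng(3)
dem = rng.uniform(0, 10, (200, 200))       # ground elevation above datum, m
surge = np.full((200, 200), 3.0)           # probabilistic surge height, m
sovi = rng.uniform(0, 1, (200, 200))       # social vulnerability index, 0-1

depth = np.clip(surge - dem, 0, None)      # water depth where surge > land
flooded = depth > 0
high_risk = flooded & (sovi > 0.75)        # flooded and highly vulnerable
print(f"{flooded.mean():.0%} of cells flooded; "
      f"{high_risk.mean():.0%} flooded and highly vulnerable")
```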

  14. Self Consistent Bathymetric Mapping Using Sub-maps: Survey Results From the TAG Hydrothermal Structure

    NASA Astrophysics Data System (ADS)

    Roman, C. N.; Reves-Sohn, R.; Singh, H.; Humphris, S.

    2005-12-01

    The spatial resolution of microbathymetry maps created using robotic vehicles such as ROVs, AUVs and manned submersibles in the deep ocean is currently limited by the accuracy of the vehicle navigation data. Errors in the vehicle position estimate commonly exceed the ranging errors of the acoustic mapping sensor itself, which creates inconsistency in the map making process and produces artifacts that lower resolution and distort map integrity. We present a methodology for producing self-consistent maps and improving vehicle position estimation by exploiting accurate local navigation and utilizing terrain relative measurements. The complete map is broken down into individual "sub-maps", which are generated using short term Doppler based navigation. The sub-maps are pairwise registered to constrain the vehicle position estimates by matching terrain that has been imaged multiple times. This procedure is implemented using a delayed state Kalman filter to incorporate the sub-map registrations as relative position measurements between previously visited vehicle locations. Archiving of previous positions in a filter state vector allows for continual adjustment of the sub-map locations. The terrain registration is accomplished using a two dimensional correlation and a six degree of freedom point cloud alignment method tailored to bathymetric data. This registration procedure is applicable to fully 3 dimensional complex underwater environments. The complete bathymetric map is then created from the union of all sub-maps that have been aligned in a consistent manner. The method is applied to an SM2000 multibeam survey of the TAG hydrothermal structure on the Mid-Atlantic Ridge at 26°N using the Jason II ROV. The survey included numerous crossing tracklines designed to test this algorithm, and the final gridded bathymetry data is sub-meter accurate. The high-resolution map has allowed for the identification of previously unrecognized fracture patterns associated with flow focusing at TAG, as well as imaging of fine-scale features such as individual sulfide talus blocks and ODP re-entry cones.
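
    The pairwise registration step can be illustrated with plain FFT cross-correlation: demean two overlapping depth grids and locate the correlation peak to recover their XY offset. The sketch below applies a known shift to a synthetic grid and recovers it; the paper's full six-degree-of-freedom point-cloud alignment and delayed-state Kalman filtering are not reproduced.

```python
# FFT cross-correlation sketch of the 2-D registration step: recover the
# XY offset between two overlapping depth grids from the correlation peak.
import numpy as np

rng = np.random.default_rng(4)
terrain = rng.standard_normal((300, 300)).cumsum(0).cumsum(1)  # rough floor
sub_a = terrain[50:200, 50:200]
sub_b = np.roll(sub_a, shift=(12, -9), axis=(0, 1))   # known (12, -9) offset
sub_b = sub_b + 0.1 * rng.standard_normal(sub_b.shape)

a, b = sub_a - sub_a.mean(), sub_b - sub_b.mean()
xcorr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
peak = np.unravel_index(xcorr.argmax(), xcorr.shape)
shift = [-(p - s) if p > s // 2 else -p for p, s in zip(peak, xcorr.shape)]
print("estimated shift:", shift)                      # [12, -9]
```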

  15. Towards the Crowdsourcing of Massive Smartphone Assisted-GPS Sensor Ground Observations for the Production of Digital Terrain Models

    PubMed Central

    Massad, Ido

    2018-01-01

    Digital Terrain Models (DTMs) used for the representation of the bare earth are produced from elevation data obtained using high-end mapping platforms and technologies. These require the handling of complex post-processing performed by authoritative and commercial mapping agencies. In this research, we aim to exploit user-generated data to produce DTMs by handling massive volumes of position and elevation data collected using ubiquitous smartphone devices equipped with Assisted-GPS sensors. As massive position and elevation data are collected passively and straightforwardly by pedestrians, cyclists, and drivers, they can be transformed into valuable topographic information. Specifically, in dense and concealed built and vegetated areas, where other technologies fail, handheld devices have an advantage. Still, Assisted-GPS measurements are not as accurate as high-end technologies, requiring pre- and post-processing of observations. We propose the development and implementation of a 2D Kalman filter and smoothing on the acquired crowdsourced observations for topographic representation production. When compared to an authoritative DTM, the results obtained are very promising, producing good elevation values. Today, open-source mapping infrastructures, such as OpenStreetMap, rely primarily on the global authoritative SRTM (Shuttle Radar Topography Mission), which shows similar accuracy but inferior resolution when compared to the results obtained in this research. Accordingly, our crowdsourced methodology has the capacity for reliable topographic representation production that is based on ubiquitous volunteered user-generated data. PMID:29562627
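
    A scalar version of the filtering-and-smoothing idea is easy to sketch: treat terrain elevation along a track as a random walk, run a Kalman filter over the noisy A-GPS samples, then a Rauch-Tung-Stiebel-style backward pass. The 1-D model, noise variances, and track are illustrative assumptions; the paper's filter is two-dimensional.

```python
# 1-D Kalman filter + backward smoother over noisy elevation samples along
# a track (random-walk terrain model). Values are illustrative only.
import numpy as np

rng = np.random.default_rng(5)
true_z = np.cumsum(0.2 * rng.standard_normal(200)) + 50  # smooth terrain, m
obs = true_z + 4.0 * rng.standard_normal(200)            # noisy A-GPS, ~4 m

q, r = 0.04, 16.0                       # process / measurement variances
z, p = np.empty(200), np.empty(200)
z_est, p_est = obs[0], r
for k in range(200):
    p_pred = p_est + q                  # predict (random walk)
    gain = p_pred / (p_pred + r)        # Kalman gain
    z_est += gain * (obs[k] - z_est)    # update with the new sample
    p_est = (1 - gain) * p_pred
    z[k], p[k] = z_est, p_est

zs = z.copy()                           # Rauch-Tung-Striebel backward pass
for k in range(198, -1, -1):
    zs[k] += p[k] / (p[k] + q) * (zs[k + 1] - z[k])

print(f"raw RMSE {np.sqrt(np.mean((obs - true_z) ** 2)):.2f} m, "
      f"smoothed {np.sqrt(np.mean((zs - true_z) ** 2)):.2f} m")
```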

  16. Evidence mapping based on systematic reviews of therapeutic interventions for gastrointestinal stromal tumors (GIST).

    PubMed

    Ballesteros, Mónica; Montero, Nadia; López-Pousa, Antonio; Urrútia, Gerard; Solà, Ivan; Rada, Gabriel; Pardo-Hernandez, Hector; Bonfill, Xavier

    2017-09-07

    Gastrointestinal Stromal Tumours (GISTs) are the most common mesenchymal tumours. Currently, different pharmacological and surgical options are used to treat localised and metastatic GISTs, although this research field is broad and the body of evidence is scattered and expanding. Our objectives are to identify, describe and organise the current available evidence for GIST through an evidence mapping approach. We followed the methodology of Global Evidence Mapping (GEM). We searched Pubmed, EMBASE, The Cochrane Library and Epistemonikos in order to identify systematic reviews (SRs) with or without meta-analyses published between 1990 and March 2016. Two authors assessed eligibility and extracted data. Methodological quality of the included systematic reviews was assessed using AMSTAR. We organised the results according to identified PICO questions and presented the evidence map in tables and a bubble plot. A total of 17 SRs met eligibility criteria. These reviews included 66 individual studies, of which three quarters were either observational or uncontrolled clinical trials. Overall, the quality of the included SRs was moderate or high. In total, we extracted 14 PICO questions from them and the corresponding results mostly favoured the intervention arm. The most common type of study used to evaluate therapeutic interventions in GIST sarcomas has been non-experimental studies. However, the majority of the interventions are reported as beneficial or probably beneficial by the respective authors of SRs. The evidence mapping is a useful and reliable methodology to identify and present the existing evidence about therapeutic interventions.

  17. Mapping of species richness for conservation of biological diversity: conceptual and methodological issues

    Treesearch

    M.J. Conroy; B.R. Noon

    1996-01-01

    Biodiversity mapping (e.g., the Gap Analysis Program [GAP]), in which vegetative features and categories of land use are mapped at coarse spatial scales, has been proposed as a reliable tool for land use decisions (e.g., reserve identification, selection, and design). This implicitly assumes that species richness data collected at coarse spatiotemporal scales provide a...

  18. Remote-sensing applications as utilized in Florida's coastal zone management program

    NASA Technical Reports Server (NTRS)

    Worley, D. R.

    1975-01-01

    Land use maps were developed from photomaps obtained by remote sensing in order to develop a comprehensive state plan for the protection, development, and zoning of coastal regions. Only photographic remote sensors have been used in support of the coastal council's planning/management methodology. Standard photointerpretation and cartographic application procedures for map compilation were used in preparing base maps.

  19. Using Landsat Thematic Mapper and SPOT Satellite Imagery to inventory wetland plants of the Coeur d'Alene Floodplain

    Treesearch

    F. M. Roberts; P. E. Gessler

    2000-01-01

    Landsat Thematic Mapper (TM) and SPOT Satellite Imagery were used to map wetland plant species in the Coeur d'Alene floodplain in northern Idaho. This paper discusses the methodology used to create a wetland plant species map for the floodplain. Species mapped included common cattail (Typha latifolia); water horse-tail (Equisetum...

  20. Applications systems verification and transfer project. Volume 8: Satellite snow mapping and runoff prediction handbook

    NASA Technical Reports Server (NTRS)

    Bowley, C. J.; Barnes, J. C.; Rango, A.

    1981-01-01

    The purpose of the handbook is to update the various snowcover interpretation techniques, document the snow mapping techniques used in the various ASVT study areas, and describe the ways snowcover data have been applied to runoff prediction. Through documentation in handbook form, the methodology developed in the Snow Mapping ASVT can be applied to other areas.

  1. Using object-oriented classification and high-resolution imagery to map fuel types in a Mediterranean region.

    Treesearch

    L. Arroyo; S.P. Healey; W.B. Cohen; D. Cocero; J.A. Manzanera

    2006-01-01

    Knowledge of fuel load and composition is critical in fighting, preventing, and understanding wildfires. Commonly, the generation of fuel maps from remotely sensed imagery has made use of medium-resolution sensors such as Landsat. This paper presents a methodology to generate fuel type maps from high spatial resolution satellite data through object-oriented...

  2. Assessing map accuracy in a remotely sensed, ecoregion-scale cover map

    USGS Publications Warehouse

    Edwards, T.C.; Moisen, Gretchen G.; Cutler, D.R.

    1998-01-01

    Landscape- and ecoregion-based conservation efforts increasingly use a spatial component to organize data for analysis and interpretation. A challenge particular to remotely sensed cover maps generated from these efforts is how best to assess the accuracy of the cover maps, especially when they can exceed 1,000s of km2 in size. Here we develop and describe a methodological approach for assessing the accuracy of large-area cover maps, using as a test case the 21.9 million ha cover map developed for Utah Gap Analysis. As part of our design process, we first reviewed the effect of intracluster correlation and a simple cost function on the relative efficiency of cluster sample designs to simple random designs. Our design ultimately combined clustered and subsampled field data stratified by ecological modeling unit and accessibility (hereafter a mixed design). We next outline estimation formulas for simple map accuracy measures under our mixed design and report results for eight major cover types and the three ecoregions mapped as part of the Utah Gap Analysis. Overall accuracy of the map was 83.2% (SE=1.4). Within ecoregions, accuracy ranged from 78.9% to 85.0%. Accuracy by cover type varied, ranging from a low of 50.4% for barren to a high of 90.6% for man modified. In addition, we examined gains in efficiency of our mixed design compared with a simple random sample approach. In regard to precision, our mixed design was more precise than a simple random design, given fixed sample costs. We close with a discussion of the logistical constraints facing attempts to assess the accuracy of large-area, remotely sensed cover maps.
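
    Under simple random sampling the headline numbers have closed forms: overall accuracy is the confusion-matrix trace over the total count, with a binomial standard error. The sketch below uses an invented three-class matrix; the paper's mixed cluster design replaces these with design-weighted estimators not shown here.

```python
# Overall accuracy and standard error from a confusion matrix (simple-
# random-sampling analogue of the reported estimates). Counts are invented.
import numpy as np

classes = ["forest", "shrub", "barren"]
cm = np.array([[180, 12, 8],     # rows = mapped class, cols = reference
               [15, 140, 10],
               [9, 14, 60]])

n = cm.sum()
acc = np.trace(cm) / n                      # overall accuracy
se = np.sqrt(acc * (1 - acc) / n)           # binomial standard error
users = np.diag(cm) / cm.sum(axis=1)        # per-class user's accuracy
print(f"overall accuracy {acc:.1%} (SE {se:.1%})")
print({c: f"{u:.1%}" for c, u in zip(classes, users)})
```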

  3. Modeling biochemical transformation processes and information processing with Narrator.

    PubMed

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is specifically intended for users aiming to construct and simulate dynamic models of biology without recourse to extensive mathematical detail. Its design facilitates mappings to different formal languages and frameworks. The combined set of features makes Narrator unique among tools of its kind. Narrator is implemented as a Java software program and available as open-source from http://www.narrator-tool.org.

  4. Modeling biochemical transformation processes and information processing with Narrator

    PubMed Central

    Mandel, Johannes J; Fuß, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-01-01

    Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a flexible and intuitive systems biology tool. It is specifically intended for users aiming to construct and simulate dynamic models of biology without recourse to extensive mathematical detail. Its design facilitates mappings to different formal languages and frameworks. The combined set of features makes Narrator unique among tools of its kind. Narrator is implemented as a Java software program and available as open-source from http://www.narrator-tool.org. PMID:17389034
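
    One of the simulation targets mentioned, Gillespie's direct method, is small enough to sketch in full: draw an exponential waiting time from the total propensity, then pick a reaction in proportion to its propensity. The two-reaction birth-death system below is an invented example, not a Narrator model.

```python
# Minimal Gillespie direct-method sketch for an invented birth-death
# system: 0 -> X at rate k1; X -> 0 at rate k2 * X.
import math, random

random.seed(7)
k1, k2 = 5.0, 0.1
x, t, t_end = 0, 0.0, 100.0
while t < t_end:
    a1, a2 = k1, k2 * x                          # reaction propensities
    a0 = a1 + a2
    t += -math.log(1.0 - random.random()) / a0   # exponential waiting time
    x += 1 if random.random() * a0 < a1 else -1  # choose a reaction
print(f"t = {t:.1f}, X = {x}")                   # fluctuates near k1/k2 = 50
```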

  5. Surface Emissivity Maps for Satellite Retrieval of the Longwave Radiation Budget

    NASA Technical Reports Server (NTRS)

    Gupta, Shashi K.; Wilber, Anne C.; Kratz, David P.

    1999-01-01

    This paper presents a brief description of the procedure used to produce global surface emissivity maps for the broadband LW, the 8-12 micrometer window, and 12 narrow LW bands. For a detailed description of the methodology and the input data, the reader is referred to Wilber et al. (1999). These maps are based on a time-independent surface type map published by the IGBP, and laboratory measurements of spectral reflectances of surface materials. These maps represent a first attempt to characterize emissivity based on surface types, and many improvements to the methodology presented here are already underway. Effects of viewing zenith angle and sea state on the emissivity of ocean surface (Smith et al. 1996, Wu and Smith 1997, Masuda et al. 1988) will be taken into account. Measurements from ASTER and MODIS will be incorporated as they become available. Seasonal variation of emissivity based on changes in the characteristics of vegetation will be considered, and the variability of emissivity of barren land areas will be accounted for with the use of Zobler World Soil Maps (Zobler 1986). The current maps have been made available to the scientific community from the web site: http://tanalo.larc.nasa.gov:8080/surf_htmls/SARB_surf.html

  6. Compositional cokriging for mapping the probability risk of groundwater contamination by nitrates.

    PubMed

    Pardo-Igúzquiza, Eulogio; Chica-Olmo, Mario; Luque-Espinar, Juan A; Rodríguez-Galiano, Víctor

    2015-11-01

    Contamination by nitrates is an important cause of groundwater pollution and represents a potential risk to human health. Management decisions must be made using probability maps that assess the nitrate concentration potential of exceeding regulatory thresholds. However these maps are obtained with only a small number of sparse monitoring locations where the nitrate concentrations have been measured. It is therefore of great interest to have an efficient methodology for obtaining those probability maps. In this paper, we make use of the fact that the discrete probability density function is a compositional variable. The spatial discrete probability density function is estimated by compositional cokriging. There are several advantages in using this approach: (i) problems of classical indicator cokriging, like estimates outside the interval (0,1) and order relations, are avoided; (ii) secondary variables (e.g. aquifer parameters) can be included in the estimation of the probability maps; (iii) uncertainty maps of the probability maps can be obtained; (iv) finally, there are modelling advantages, because the variograms and cross-variograms are those of real variables, which do not have the restrictions of indicator variograms and indicator cross-variograms. The methodology was applied to the Vega de Granada aquifer in Southern Spain and the advantages of the compositional cokriging approach were demonstrated. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. A methodological approach and framework for sustainability assessment in NGO-implemented primary health care programs.

    PubMed

    Sarriot, Eric G; Winch, Peter J; Ryan, Leo J; Bowie, Janice; Kouletio, Michelle; Swedberg, Eric; LeBan, Karen; Edison, Jay; Welch, Rikki; Pacqué, Michel C

    2004-01-01

    An estimated 10.8 million children under 5 continue to die each year in developing countries from causes easily treatable or preventable. Non-governmental organizations (NGOs) are frontline implementers of low-cost and effective child health interventions, but their progress toward sustainable child health gains is a challenge to evaluate. This paper presents the Child Survival Sustainability Assessment (CSSA) methodology, a framework and process to map progress towards sustainable child health from the community level and upward. The CSSA was developed with NGOs through a participatory process of research and dialogue. Commitment to sustainability requires a systematic and systemic consideration of human, social and organizational processes beyond a purely biomedical perspective. The CSSA is organized around three interrelated dimensions of evaluation: (1) health and health services; (2) capacity and viability of local organizations; (3) capacity of the community in its social ecological context. The CSSA uses a participatory, action-planning process, engaging a 'local system' of stakeholders in the contextual definition of objectives and indicators. Improved conditions measured in the three dimensions correspond to progress toward a sustainable health situation for the population. This framework opens new opportunities for evaluation and research design and places sustainability at the center of primary health care programming.

  8. Recognising and referring children exposed to domestic abuse: a multi-professional, proactive systems-based evaluation using a modified Failure Mode and Effects Analysis (FMEA).

    PubMed

    Ashley, Laura; Armitage, Gerry; Taylor, Julie

    2017-03-01

    Failure Modes and Effects Analysis (FMEA) is a prospective quality assurance methodology increasingly used in healthcare, which identifies potential vulnerabilities in complex, high-risk processes and generates remedial actions. We aimed, for the first time, to apply FMEA in a social care context to evaluate the process for recognising and referring children exposed to domestic abuse within one Midlands city safeguarding area in England. A multidisciplinary, multi-agency team of 10 front-line professionals undertook the FMEA, using a modified methodology, over seven group meetings. The FMEA included mapping out the process under evaluation to identify its component steps, identifying failure modes (potential errors) and possible causes for each step and generating corrective actions. In this article, we report the output from the FMEA, including illustrative examples of the failure modes and corrective actions generated. We also present an analysis of feedback from the FMEA team and provide future recommendations for the use of FMEA in appraising social care processes and practice. Although challenging, the FMEA was unequivocally valuable for team members and generated a significant number of corrective actions locally for the safeguarding board to consider in its response to children exposed to domestic abuse. © 2016 John Wiley & Sons Ltd.

  9. A MapReduce approach to diminish imbalance parameters for big deoxyribonucleic acid dataset.

    PubMed

    Kamal, Sarwar; Ripon, Shamim Hasnat; Dey, Nilanjan; Ashour, Amira S; Santhi, V

    2016-07-01

    In the age of the information superhighway, big data play a significant role in information processing, extraction, retrieval and management. In computational biology, the continuous challenge is to manage the biological data. Data mining techniques are sometimes imperfect for new space and time requirements. Thus, it is critical to process massive amounts of data to retrieve knowledge. The existing software and automated tools to handle big data sets are not sufficient. As a result, an expandable mining technique that enfolds the large storage and processing capability of distributed or parallel processing platforms is essential. In this analysis, a contemporary distributed clustering methodology for imbalance data reduction using a k-nearest neighbor (K-NN) classification approach has been introduced. The pivotal objective of this work is to illustrate real training data sets with a reduced number of elements or instances. These reduced data sets will ensure faster data classification and standard storage management with less sensitivity. However, general data reduction methods cannot manage very big data sets. To minimize these difficulties, a MapReduce-oriented framework is designed using various clusters of automated contents, comprising multiple algorithmic approaches. To test the proposed approach, a real DNA (deoxyribonucleic acid) dataset that consists of 90 million pairs has been used. The proposed model reduces the imbalanced data sets from large-scale data sets without loss of accuracy. The obtained results show that the MapReduce-based K-NN classifier provided accurate results for big DNA data. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
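
    The division of labor is the essence of the approach: mappers score a query against their shard of the training data and emit local candidates, and a reducer merges the k global nearest and votes. A pure-Python imitation of that flow follows; the toy vectors and labels are invented, and the real system runs on Hadoop MapReduce over the 90-million-pair dataset.

```python
# Pure-Python imitation of MapReduce-style K-NN: mappers emit local k
# nearest per shard; the reducer merges and takes a majority vote.
import heapq

def mapper(shard, query, k):
    scored = [(sum((a - b) ** 2 for a, b in zip(vec, query)), label)
              for vec, label in shard]
    return heapq.nsmallest(k, scored)        # local k nearest

def reducer(partials, k):
    nearest = heapq.nsmallest(k, (p for part in partials for p in part))
    votes = {}
    for _, label in nearest:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)         # majority vote

shards = [
    [((0.1, 0.2), "exon"), ((0.9, 0.8), "intron")],
    [((0.2, 0.1), "exon"), ((0.8, 0.9), "intron")],
]
query = (0.15, 0.15)
print(reducer([mapper(s, query, 3) for s in shards], 3))   # exon
```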

  10. Literature Review on Systems of Systems (SoS): A Methodology With Preliminary Results

    DTIC Science & Technology

    2013-11-01

    [Abstract unavailable; the indexed record contains only front-matter fragments. These indicate that the report's appendices cover the Enhanced ISAAC Neural Simulation Toolkit (EINSTein) and the Map Aware Nonuniform Automata (MANA) agent-based model, including a quadrant chart of SoS and SoSA designs for MANA and maturation scores for its SoS and SoSA software components.]

  11. Functional Maps of Mechanosensory Features in the Drosophila Brain.

    PubMed

    Patella, Paola; Wilson, Rachel I

    2018-04-23

    Johnston's organ is the largest mechanosensory organ in Drosophila. It contributes to hearing, touch, vestibular sensing, proprioception, and wind sensing. In this study, we used in vivo 2-photon calcium imaging and unsupervised image segmentation to map the tuning properties of Johnston's organ neurons (JONs) at the site where their axons enter the brain. We then applied the same methodology to study two key brain regions that process signals from JONs: the antennal mechanosensory and motor center (AMMC) and the wedge, which is downstream of the AMMC. First, we identified a diversity of JON response types that tile frequency space and form a rough tonotopic map. Some JON response types are direction selective; others are specialized to encode amplitude modulations over a specific range (dynamic range fractionation). Next, we discovered that both the AMMC and the wedge contain a tonotopic map, with a significant increase in tonotopy, and a narrowing of frequency tuning, at the level of the wedge. Whereas the AMMC tonotopic map is unilateral, the wedge tonotopic map is bilateral. Finally, we identified a subregion of the AMMC/wedge that responds preferentially to the coherent rotation of the two mechanical organs in the same angular direction, indicative of oriented steady air flow (directional wind). Together, these maps reveal the broad organization of the primary and secondary mechanosensory regions of the brain. They provide a framework for future efforts to identify the specific cell types and mechanisms that underlie the hierarchical re-mapping of mechanosensory information in this system. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. A multivariate geostatistical methodology to delineate areas of potential interest for future sedimentary gold exploration.

    PubMed

    Goovaerts, P; Albuquerque, Teresa; Antunes, Margarida

    2016-11-01

    This paper describes a multivariate geostatistical methodology to delineate areas of potential interest for future sedimentary gold exploration, with an application to an abandoned sedimentary gold mining region in Portugal. The main challenge was the existence of only a dozen gold measurements confined to the grounds of the old gold mines, which precluded the application of traditional interpolation techniques, such as cokriging. The analysis could, however, capitalize on 376 stream sediment samples that were analyzed for twenty two elements. Gold (Au) was first predicted at all 376 locations using linear regression (R² = 0.798) and four metals (Fe, As, Sn and W), which are known to be mostly associated with the local gold's paragenesis. One hundred realizations of the spatial distribution of gold content were generated using sequential indicator simulation and a soft indicator coding of regression estimates, to supplement the hard indicator coding of gold measurements. Each simulated map then underwent a local cluster analysis to identify significant aggregates of low or high values. The one hundred classified maps were processed to derive the most likely classification of each simulated node and the associated probability of occurrence. Examining the distribution of the hot-spots and cold-spots reveals a clear enrichment in Au along the Erges River downstream from the old sedimentary mineralization.
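
    The first stage, regressing gold on its four pathfinder elements, is a plain least-squares fit. The sketch below reproduces only that step on synthetic data (coefficients and noise are invented; the paper reports R² = 0.798 on the 376 real stream-sediment samples); the sequential indicator simulation and local cluster analysis are not shown.

```python
# Ordinary least-squares sketch of the Au-on-pathfinders regression step.
# Data, coefficients, and noise are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(8)
n = 376
X = rng.lognormal(size=(n, 4))                 # Fe, As, Sn, W concentrations
au = X @ np.array([0.5, 1.2, 0.8, 0.3]) + 0.5 * rng.standard_normal(n)

A = np.column_stack([np.ones(n), X])           # design matrix with intercept
beta, *_ = np.linalg.lstsq(A, au, rcond=None)
pred = A @ beta
r2 = 1 - ((au - pred) ** 2).sum() / ((au - au.mean()) ** 2).sum()
print(f"R^2 = {r2:.3f}, coefficients = {np.round(beta[1:], 2)}")
```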

  13. Mycobacterium avium subspecies paratuberculosis causes Crohn's disease in some inflammatory bowel disease patients

    PubMed Central

    Naser, Saleh A; Sagramsingh, Sudesh R; Naser, Abed S; Thanigachalam, Saisathya

    2014-01-01

    Crohn’s disease (CD) is a chronic inflammatory condition that plagues millions all over the world. This debilitating bowel disease can start in early childhood and continue into late adulthood. Signs and symptoms are usually many, and multiple tests are often required for the diagnosis and confirmation of this disease. However, little is still understood about the cause(s) of CD. As a result, several theories have been proposed over the years. One theory in particular is that Mycobacterium avium subspecies paratuberculosis (MAP) is intimately linked to the etiology of CD. This fastidious bacterium, also known to cause Johne’s disease in cattle, has infected the intestines of animals for years. It is believed that due to the thick, waxy cell wall of MAP it is able to survive the process of pasteurization as well as chemical processes seen in irrigation purification systems. Subsequently, meat, dairy products and water serve as key vehicles in the transmission of MAP infection to humans with a genetic predisposition (from farm to fork), thus leading to the development of CD. The challenges faced in culturing this bacterium from CD patients are many. Examples include its extremely slow growth, lack of a cell wall, low abundance, and its mycobactin dependency. In this review article, data from 60 studies showing the detection and isolation of MAP by PCR and culture techniques have been reviewed. Although this review may not comprehensively cover all studies, clearly the majority of the studies overwhelmingly and definitively support the role of MAP in at least 30%-50% of CD patients. It is very possible that the lack of detection of MAP in some CD patients may be due to the absence of a role for MAP in these patients. The latter statement is conditional on the utilization of methodology appropriate for the detection of human MAP strains. Ultimately, stratification of CD and inflammatory bowel disease patients for the presence or absence of MAP is necessary for appropriate and effective treatment which may lead to a cure. PMID:24966610

  14. The use of Remotely Piloted Aircraft Systems for the Innovative Methodologies in thermal Energy Release monitoring

    NASA Astrophysics Data System (ADS)

    Marotta, Enrica; Avino, Rosario; Avvisati, Gala; Belviso, Pasquale; Caliro, Stefano; Caputo, Teresa; Carandente, Antonio; Peluso, Rosario; Sangianantoni, Agata; Sansivero, Fabio; Vilardo, Giuseppe

    2017-04-01

    Recent years have been characterized by the rapid development of Remotely Piloted Aircraft Systems (RPAS), which are becoming cheaper, lighter and more powerful. The concurrent development of high-resolution, lightweight and energy-saving sensors, sometimes specifically designed for airborne applications, is rapidly changing the way monitoring and surveys can be performed in hazardous environments such as volcanoes. An example of this convergence is the new methodology we are currently developing at the INGV-Osservatorio Vesuviano for estimating the thermal energy release of volcanic diffuse degassing areas using ground temperatures from thermal infrared images. Preliminary experiments, carried out during multi-year campaigns inside the La Solfatara crater using thermal infrared images and K-type thermocouples inserted into the ground at various depths, found a correlation between surface temperature and the shallow thermal gradient. Due to the large extent of the areas affected by thermal anomalies, an effective and expeditious tool for acquiring the IR images is an RPAS equipped with high-resolution thermal and visible cameras. These acquisitions make it possible to quickly gather the data needed to produce a heat release map. This map is then orthorectified and geocoded so that it can be superimposed on digital terrain models or on the orthophotogrammetric mosaic obtained after processing photos acquired by RPAS. Such rapidly produced heat-flux maps, taking into account accurate filtering of atmospheric influence, represent a useful tool for volcanic surveillance and monitoring purposes. Before any of these drone activities could begin, all permissions required by the complex Italian regulations had to be obtained.

  15. Optimizing Crawler4j using MapReduce Programming Model

    NASA Astrophysics Data System (ADS)

    Siddesh, G. M.; Suresh, Kavya; Madhuri, K. Y.; Nijagal, Madhushree; Rakshitha, B. R.; Srinivasa, K. G.

    2017-06-01

    The World Wide Web is a decentralized system consisting of a repository of information in the form of web pages. These web pages act as a source of information or data in the present analytics world. Web crawlers are used for extracting useful information from web pages for different purposes. Firstly, they are used in web search engines, where web pages are indexed to form a corpus of information that users can query. Secondly, they are used for web archiving, where web pages are stored for later analysis. Thirdly, they can be used for web mining, where web pages are monitored for copyright purposes. The amount of information processed by a web crawler needs to be improved by using the capabilities of modern parallel processing technologies. In order to improve the parallelism and throughput of crawling, this work proposes optimizing Crawler4j using the Hadoop MapReduce programming model by parallelizing the processing of large input data. Crawler4j is a web crawler that retrieves useful information about the pages that it visits. Crawler4j coupled with the data and computational parallelism of the Hadoop MapReduce programming model improves the throughput and accuracy of web crawling. The experimental results demonstrate that the proposed solution achieves significant improvements with respect to performance and throughput. Hence the proposed approach carves out a new methodology towards optimizing web crawling by achieving significant performance gains.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sahoo, Satiprasad; Dhar, Anirban, E-mail: anirban.dhar@gmail.com; Kar, Amlanjyoti

    Environmental management of an area describes a policy for its systematic and sustainable environmental protection. In the present study, regional environmental vulnerability assessment in the Hirakud command area of Odisha, India is envisaged based on the Grey Analytic Hierarchy Process method (Grey-AHP) using integrated remote sensing (RS) and geographic information system (GIS) techniques. Grey-AHP combines the advantages of the classical analytic hierarchy process (AHP) and the grey clustering method for accurate estimation of weight coefficients. It is a new method for environmental vulnerability assessment. The environmental vulnerability index (EVI) uses natural, environmental and human impact related factors, e.g., soil, geology, elevation, slope, rainfall, temperature, wind speed, normalized difference vegetation index, drainage density, crop intensity, agricultural DRASTIC value, population density and road density. The EVI map has been classified into four environmental vulnerability zones (EVZs), namely 'low', 'moderate', 'high', and 'extreme', encompassing 17.87%, 44.44%, 27.81% and 9.88% of the study area, respectively. The EVI map indicates that the northern part of the study area is more vulnerable from an environmental point of view, and shows close correlation with elevation. Effectiveness of the zone classification is evaluated using the grey clustering method; general effectiveness lies between the "better" and "common" classes. This analysis demonstrates the potential applicability of the methodology. - Highlights: • Environmental vulnerability zone identification based on Grey Analytic Hierarchy Process (AHP) • The effectiveness evaluation by means of a grey clustering method with support from AHP • Use of grey approach eliminates the excessive dependency on the experience of experts.
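
    The classical AHP step that Grey-AHP builds on can be sketched as follows: priority weights are taken from the principal eigenvector of a pairwise comparison matrix, with a consistency ratio as a sanity check. The 3x3 matrix is hypothetical and the grey-clustering extension is not shown.

```python
import numpy as np

def ahp_weights(pcm: np.ndarray):
    """Priority weights and consistency ratio from a pairwise comparison matrix."""
    eigvals, eigvecs = np.linalg.eig(pcm)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = pcm.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # Saaty's random index (subset)
    return w, ci / ri                             # weights, consistency ratio

# Hypothetical 3x3 comparison of evidential layers (Saaty's 1-9 scale).
pcm = np.array([[1.0, 3.0, 5.0],
                [1/3, 1.0, 2.0],
                [1/5, 1/2, 1.0]])
w, cr = ahp_weights(pcm)
print("weights:", w.round(3), "CR:", round(cr, 3))  # CR < 0.1 is acceptable
```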

  17. Implementing the EU Floods Directive (2007/60/EC) in Austria: Flood Risk Management Plans

    NASA Astrophysics Data System (ADS)

    Neuhold, Clemens

    2013-04-01

    The Directive 2007/60/EC of the European Parliament and of the Council of 23 October 2007 on the assessment and management of flood risks (EFD) aims at the reduction of the adverse consequences for human health, the environment, cultural heritage and economic activity associated with floods in the Community. This task is to be achieved in three process steps: (1) preliminary flood risk assessment (finalised by the end of 2011), (2) flood hazard maps and flood risk maps (due 2013) and (3) flood risk management plans (due 2015). Currently, an interdisciplinary national working group is defining the methodological framework for flood risk management plans in Austria, supported by a constant exchange with international bodies and experts. According to the EFD, the components of the flood risk management plan include (excerpt): (1) conclusions of the preliminary flood risk assessment; (2) flood hazard maps and flood risk maps and the conclusions that can be drawn from them; (3) a description of the appropriate objectives of flood risk management; (4) a summary of measures and their prioritisation aiming to achieve those objectives. The poster addresses some of the major challenges in this process, such as the legal provisions, coordination of administrative units and definition of public relations. The implementation of the EFD requires the harmonisation of legal instruments of various disciplines (e.g. water management, spatial planning, civil protection), enabling a coordinated - and ideally binding - practice of flood risk management. This process is highly influenced by the administrative organisation in Austria at the federal, provincial and municipality levels. The Austrian approach meets this organisational framework by structuring the development of the flood risk management plan into three time-steps: (a) federal blueprint, (b) provincial editing and (c) federal finishing as well as reporting to the European Commission. Each time-step addresses different administrative levels and spatial scales, accompanied by the active involvement of interested parties.

  18. Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott

    1995-01-01

    This methodology serves to define a system for effective prioritization of the efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology is a semi-quantitative approach derived from quality function deployment (QFD) techniques. QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. The methodology weighs the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives.
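
    A minimal sketch of such a QFD-style prioritization is shown below: candidates are ranked by a weighted sum of criterion scores. The criteria weights, candidate names and scores are illustrative, not taken from the paper.

```python
# QFD-style prioritization sketch: candidate replacement technologies are
# scored against weighted criteria and ranked by the weighted sum.
criteria = {"environmental": 0.30, "cost": 0.25, "safety": 0.20,
            "reliability": 0.15, "programmatic": 0.10}

candidates = {   # criterion scores on a 1-9 scale (hypothetical)
    "aqueous cleaner": {"environmental": 9, "cost": 5, "safety": 8,
                        "reliability": 6, "programmatic": 7},
    "solvent blend B": {"environmental": 4, "cost": 8, "safety": 5,
                        "reliability": 8, "programmatic": 6},
}

def priority(scores: dict) -> float:
    return sum(criteria[c] * s for c, s in scores.items())

for name, scores in sorted(candidates.items(), key=lambda kv: -priority(kv[1])):
    print(f"{name}: {priority(scores):.2f}")
```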

  19. A deviation based assessment methodology for multiple machine health patterns classification and fault detection

    NASA Astrophysics Data System (ADS)

    Jia, Xiaodong; Jin, Chao; Buzza, Matt; Di, Yuan; Siegel, David; Lee, Jay

    2018-01-01

    Successful applications of the Diffusion Map (DM) in machine failure detection and diagnosis have been reported in several recent studies. DM provides an efficient way to visualize high-dimensional, complex and nonlinear machine data, and thus yields more knowledge about the machine under monitoring. In this paper, a DM-based methodology named DM-EVD is proposed for machine degradation assessment, abnormality detection and diagnosis in an online fashion. Several limitations and challenges of using DM for machine health monitoring are analyzed and addressed. Based on the proposed DM-EVD, a deviation-based methodology is then proposed to accommodate additional dimension reduction methods. In this work, the incorporation of Laplacian Eigenmaps and Principal Component Analysis (PCA) is explored; the latter algorithm, named PCA-Dev, is validated in the case study. To demonstrate the proposed methodology, case studies from diverse fields are presented and investigated. Improved results are reported by benchmarking against other machine learning algorithms.
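
    For orientation, a basic diffusion-map embedding (Gaussian kernel, row-normalized Markov matrix, leading non-trivial eigenvectors) can be sketched as below; this is the textbook construction, not the paper's DM-EVD variant.

```python
import numpy as np

def diffusion_map(X: np.ndarray, eps: float, n_coords: int = 2):
    """Basic diffusion-map embedding of the rows of X."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. distances
    K = np.exp(-d2 / eps)                                  # Gaussian kernel
    P = K / K.sum(axis=1, keepdims=True)                   # row-normalized Markov matrix
    eigvals, eigvecs = np.linalg.eig(P)
    order = np.argsort(-eigvals.real)
    idx = order[1:n_coords + 1]        # skip the trivial constant eigenvector
    return eigvecs[:, idx].real * eigvals[idx].real        # diffusion coordinates

X = np.random.rand(200, 12)            # e.g. 200 feature vectors from machine data
coords = diffusion_map(X, eps=0.5)
print(coords.shape)                    # (200, 2) low-dimensional embedding
```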

  20. Image enhancements of Landsat 8 (OLI) and SAR data for preliminary landslide identification and mapping applied to the central region of Kenya

    NASA Astrophysics Data System (ADS)

    Mwaniki, M. W.; Kuria, D. N.; Boitt, M. K.; Ngigi, T. G.

    2017-04-01

    Image enhancements lead to improved performance and increased accuracy of feature extraction, recognition, identification, classification and hence change detection. This increases the utility of remote sensing for environmental applications and aids disaster monitoring of geohazards involving large areas. The main aim of this study was to compare the effect of image enhancement applied to synthetic aperture radar (SAR) data and Landsat 8 imagery for landslide identification and mapping. The methodology involved pre-processing the Landsat 8 imagery, image co-registration and despeckling of the SAR data, after which the Landsat 8 imagery was enhanced by Principal and Independent Component Analysis (PCA and ICA), a spectral index involving bands 7 and 4, and a False Colour Composite (FCC) built from the components bearing the most geologic information. The SAR data were processed using textural and edge filters and computation of SAR incoherence. The enhanced spatial, textural and edge information from the SAR data was incorporated into the spectral information from the Landsat 8 imagery during the knowledge-based classification. The methodology was tested in the central highlands of Kenya, characterized by rugged terrain and frequent rainfall-induced landslides. The results showed that the SAR data complemented the Landsat 8 data, whose spectral information was enriched by the FCC with enhanced geologic information. The SAR classification depicted landslides along the ridges and lineaments, important information lacking in the Landsat 8 image classification. The success of landslide identification and classification was attributed to the enhancement of geologic features by spectral, textural and roughness properties.
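
    The abstract only states that the spectral index involves bands 7 and 4, so the normalized-difference form below is an assumption used for illustration, not necessarily the authors' exact formula.

```python
import numpy as np

def normalized_index(band7: np.ndarray, band4: np.ndarray) -> np.ndarray:
    """Normalized difference of two Landsat 8 bands (assumed index form)."""
    b7 = band7.astype("float64")
    b4 = band4.astype("float64")
    return (b7 - b4) / np.where(b7 + b4 == 0, 1.0, b7 + b4)

b7 = np.random.randint(0, 65535, (512, 512)).astype("uint16")  # synthetic SWIR2
b4 = np.random.randint(0, 65535, (512, 512)).astype("uint16")  # synthetic red
idx = normalized_index(b7, b4)   # values in [-1, 1]; highlights SWIR-bright rock
```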

  1. Experimental methodology for turbocompressor in-duct noise evaluation based on beamforming wave decomposition

    NASA Astrophysics Data System (ADS)

    Torregrosa, A. J.; Broatch, A.; Margot, X.; García-Tíscar, J.

    2016-08-01

    An experimental methodology is proposed to assess the noise emission of centrifugal turbocompressors like those of automotive turbochargers. A step-by-step procedure is detailed, starting from the theoretical considerations of sound measurement in flow ducts and examining specific experimental setup guidelines and signal processing routines. Special care is taken regarding some limiting factors that adversely affect the measurement of sound intensity in ducts, namely calibration, sensor placement, and frequency ranges and restrictions. In order to provide illustrative examples of the proposed techniques and results, the methodology has been applied to the acoustic evaluation of a small automotive turbocharger in a flow bench. Samples of raw pressure spectra, decomposed pressure waves, calibration results, accurate surge characterization, and final compressor noise maps and estimated spectrograms are provided. The analysis of selected frequency bands successfully shows how different known noise phenomena of particular interest, such as mid-frequency "whoosh noise" and low-frequency surge onset, are correlated with operating conditions of the turbocharger. Comparison against external inlet orifice intensity measurements shows good correlation and improvement with respect to alternative wave decomposition techniques.
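
    A common building block of such in-duct wave decomposition is the two-sensor plane-wave split, sketched below for the no-mean-flow case; sensor positions, frequencies and spectra are synthetic, and the full methodology in the paper handles flow and calibration effects not shown here.

```python
import numpy as np

def decompose_waves(P1, P2, freqs, x1, x2, c=343.0):
    """Split duct pressure spectra at two axial positions into forward (P+)
    and backward (P-) plane-wave components, assuming no mean flow."""
    k = 2 * np.pi * np.asarray(freqs) / c              # wavenumber per bin
    Pp = np.empty(len(freqs), dtype=complex)
    Pm = np.empty(len(freqs), dtype=complex)
    for i, ki in enumerate(k):
        # P(x) = P+ exp(-jkx) + P- exp(+jkx), evaluated at both sensors
        A = np.array([[np.exp(-1j * ki * x1), np.exp(1j * ki * x1)],
                      [np.exp(-1j * ki * x2), np.exp(1j * ki * x2)]])
        Pp[i], Pm[i] = np.linalg.solve(A, [P1[i], P2[i]])
    return Pp, Pm

# Synthetic demo; the system becomes ill-conditioned when the sensor spacing
# is a multiple of a half wavelength, one source of the frequency-range
# restrictions mentioned in the abstract (here: near c / (2 * 0.05) = 3430 Hz).
freqs = np.linspace(20, 3000, 512)
P1 = np.random.randn(512) + 1j * np.random.randn(512)
P2 = np.random.randn(512) + 1j * np.random.randn(512)
Pp, Pm = decompose_waves(P1, P2, freqs, x1=0.0, x2=0.05)
```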

  2. Safety assessment on pedestrian crossing environments using MLS data.

    PubMed

    Soilán, Mario; Riveiro, Belén; Sánchez-Rodríguez, Ana; Arias, Pedro

    2018-02-01

    In the framework of infrastructure analysis and maintenance in an urban environment, it is important to address the safety of every road user. This paper presents a methodology for the evaluation of several safety indicators in pedestrian crossing environments using geometric and radiometric information extracted from 3D point clouds collected by a Mobile Mapping System (MMS). The methodology is divided into four main modules, which analyze the accessibility of the crossing area, the presence of traffic lights and traffic signs, and the visibility between a driver and a pedestrian in the proximity of a pedestrian crossing. The outputs of the analysis are exported to a Geographic Information System (GIS), where they are visualized and can be further processed in the context of city management. The methodology has been tested on approximately 30 pedestrian crossings in cluttered urban environments of two different cities. Results show that MMS are a valid means of assessing the safety of a specific urban environment with regard to its geometric conditions. Remarkable results are presented for traffic light classification, with a global F-score close to 95%.
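
    For reference, the reported global F-score is the usual harmonic combination of precision and recall; the counts below are hypothetical and chosen only to land near the reported 95%.

```python
# F-score from true positives, false positives and false negatives.
def f_score(tp: int, fp: int, fn: int, beta: float = 1.0) -> float:
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

print(f_score(tp=180, fp=10, fn=9))   # hypothetical counts -> ~0.95
```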

  3. Between the Remnants of Colonialism and the Insurgence of Self-Narrative in Constructing Participatory Social Maps: Towards a Land Education Methodology

    ERIC Educational Resources Information Center

    Sato, Michèle; Silva, Regina; Jaber, Michelle

    2014-01-01

    This article summarizes a social mapping project conducted by the Environmental Education, Communication and Arts Research Group from the Federal University of Mato Grosso. The primary goals of the project were to map the vulnerable social groups of Mato Grosso, and identify the social and environmental conflicts that put them in situations of…

  4. Towards a Pre-Intervention Analytical Methodology

    DTIC Science & Technology

    2012-08-01

    such as concept maps (Kaste 2007), mind maps, and dynamically interactive networks, available from a variety of proprietary, government, and open... Joint Publication 3-27, Counterterrorism (JP 3-27). Joint Staff. July 2008. Joint Publication 3-57, Civil-Military Operations (JP 3-57). Kaste, R; E

  5. An Attempt of Formalizing the Selection Parameters for Settlements Generalization in Small-Scales

    NASA Astrophysics Data System (ADS)

    Karsznia, Izabela

    2014-12-01

    The paper addresses one of the most important problems concerning context-sensitive settlement selection for small-scale maps. So far, no formal parameters for small-scale settlement generalization have been specified, hence the problem is an important and innovative challenge. It is also crucial from a practical point of view, as appropriate generalization algorithms are needed for the generalization of the General Geographic Objects Database, an essential Spatial Data Infrastructure component in Poland. The author proposes and verifies quantitative generalization parameters for the settlement selection process in small-scale maps. The selection of settlements was carried out in two research areas, Lower Silesia and Łódź Province. Based on the conducted analysis, appropriate context-sensitive settlement selection parameters have been defined. Particular effort has been made to develop a methodology of quantitative settlement selection that would be useful in automation processes and would keep the specifics of the generalized objects unchanged.
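
    As an example of a formalized selection parameter (illustrative only; not necessarily among those the author derives), Töpfer's radical law relates the number of settlements retained to the ratio of map scales:

```python
import math

def topfer(n_source: int, scale_source: float, scale_target: float) -> int:
    """Number of settlements to retain when generalizing between map scales
    (Töpfer's radical law; scales given as denominators)."""
    return round(n_source * math.sqrt(scale_source / scale_target))

# e.g. from 1:250,000 to 1:1,000,000: keep about half of 1200 settlements
print(topfer(1200, 250_000, 1_000_000))   # -> 600
```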

  6. An Expert Map of Gambling Risk Perception.

    PubMed

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul

    2015-12-01

    The purpose of the current study was to investigate the moderating or mediating role played by risk perception in decision-making, gambling behaviour, and disordered gambling aetiology. Eleven gambling expert clinicians and researchers completed a semi-structured interview derived from mental models and grounded theory methodologies. Expert interview data were used to construct a comprehensive expert mental model 'map' detailing risk-perception related factors contributing to harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and emergent themes. Findings indicated that, in the experts' view, idiosyncratic beliefs among gamblers result in overall underestimation of risk and loss, insufficient prioritization of needs, and inadequate planning and implementation of risk-management strategies. Additional contextual factors influencing the use of risk information (reinforcement and learning; mental states, environmental cues and ambivalence; and socio-cultural and biological variables) acted to shape risk perceptions and increase vulnerability to harm or disordered gambling. It was concluded that understanding the nature, extent and processes by which risk perception predisposes an individual to continue gambling despite adverse consequences can guide the content of preventative educational responsible-gambling campaigns.

  7. SMITHERS: An object-oriented modular mapping methodology for MCNP-based neutronic–thermal hydraulic multiphysics

    DOE PAGES

    Richard, Joshua; Galloway, Jack; Fensin, Michael; ...

    2015-04-04

    A novel object-oriented modular mapping methodology for externally coupled neutronics–thermal hydraulics multiphysics simulations was developed. The Simulator using MCNP with Integrated Thermal-Hydraulics for Exploratory Reactor Studies (SMITHERS) code performs on-the-fly mapping of material-wise power distribution tallies implemented by MCNP-based neutron transport/depletion solvers for use in estimating coolant temperature and density distributions with a separate thermal-hydraulic solver. The key development of SMITHERS is that it reconstructs the hierarchical geometry structure of the material-wise power generation tallies from the depletion solver automatically, with only a modicum of additional information required from the user. In addition, it performs the basis mapping from the combinatorial geometry of the depletion solver to the required geometry of the thermal-hydraulic solver in a generalizable manner, such that it can transparently accommodate varying levels of thermal-hydraulic solver geometric fidelity, from the nodal geometry of multi-channel analysis solvers to the pin-cell level of discretization for sub-channel analysis solvers.
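
    A minimal sketch of the basis-mapping idea: region-wise powers from the transport solver are redistributed onto thermal-hydraulic nodes according to precomputed volume-overlap fractions. The region/node names and numbers are invented for illustration, not SMITHERS data structures.

```python
# Map material-region powers (from the depletion solver) onto TH nodes via
# volume-overlap fractions; all identifiers and values are illustrative.
region_power_w = {"fuel_r1": 1.2e6, "fuel_r2": 0.9e6}

# overlap[region][node]: fraction of the region's volume inside each TH node.
overlap = {
    "fuel_r1": {"ch1_ax1": 0.6, "ch1_ax2": 0.4},
    "fuel_r2": {"ch1_ax2": 0.5, "ch2_ax1": 0.5},
}

node_power_w: dict[str, float] = {}
for region, power in region_power_w.items():
    for node, frac in overlap[region].items():
        node_power_w[node] = node_power_w.get(node, 0.0) + power * frac

print(node_power_w)   # input heat source per thermal-hydraulic node
```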

  8. Integrating Statistical and Expert Knowledge to Develop Phenoregions for the Continental United States

    NASA Astrophysics Data System (ADS)

    Betancourt, J. L.; Biondi, F.; Bradford, J. B.; Foster, J. R.; Henebry, G. M.; Post, E.; Koenig, W.; Hoffman, F. M.; de Beurs, K.; Kumar, J.; Hargrove, W. W.; Norman, S. P.; Brooks, B. G.

    2016-12-01

    Vegetated ecosystems exhibit unique phenological behavior over the course of a year, suggesting that remotely sensed land surface phenology may be useful for characterizing land cover and ecoregions. However, phenology is also strongly influenced by temperature and water stress; insect, fire, and weather disturbances; and climate change over seasonal, interannual, decadal and longer time scales. The normalized difference vegetation index (NDVI), a remotely sensed measure of greenness, provides a useful proxy for land surface phenology. We used NDVI for the conterminous United States (CONUS) derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) every eight days at 250 m resolution for the period 2000-2015 to develop phenological signatures of emergent ecological regimes called phenoregions. We employed a "Big Data" classification approach on a supercomputer, specifically applying an unsupervised data mining technique to this large collection of NDVI measurements, to develop annual maps of phenoregions. This technique produces a prescribed number of prototypical phenological states to which every location belongs in any year. To reduce the impact of short-term disturbances, we derived a single map of the mode of annual phenological states for the CONUS, assigning each map cell to the state with the largest integrated NDVI in cases where multiple states tie for the highest frequency of occurrence. Since the data mining technique is unsupervised, individual phenoregions are not associated with an ecologically understandable label. To add automated supervision to the process, we applied the method of Mapcurves, developed by Hargrove and Hoffman, to associate individual phenoregions with labeled polygons in expert-derived maps of biomes, land cover, and ecoregions. We will present the phenoregions methodology and resulting maps for the CONUS, describe the "label-stealing" technique for ascribing biome characteristics to phenoregions, and introduce a new polar plotting scheme for processing NDVI data by localized seasonality.
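
    The unsupervised step can be imitated with an off-the-shelf clusterer, as in the sketch below using k-means on per-year NDVI curves followed by a per-cell mode; the actual study uses a custom supercomputer classifier, and all array sizes here are arbitrary.

```python
import numpy as np
from scipy.stats import mode
from sklearn.cluster import KMeans

# k-means as a stand-in for the study's unsupervised classifier.
n_cells, n_composites, n_years = 2_000, 46, 16           # 8-day NDVI stack shape
ndvi = np.random.rand(n_years * n_cells, n_composites)   # rows grouped by year

km = KMeans(n_clusters=50, n_init=4, random_state=0).fit(ndvi)
states = km.labels_.reshape(n_years, n_cells)            # annual state maps

# The per-cell mode across years damps short-term disturbances (ties would be
# broken by integrated NDVI in the actual methodology; mode() just picks one).
phenoregion = mode(states, axis=0, keepdims=False).mode
print(phenoregion.shape)                                 # one state per cell
```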

  9. A comparison of the IGBP DISCover and University of Maryland 1 km global land cover products

    USGS Publications Warehouse

    Hansen, M.C.; Reed, B.

    2000-01-01

    Two global 1 km land cover data sets derived from 1992-1993 Advanced Very High Resolution Radiometer (AVHRR) data are currently available, the International Geosphere-Biosphere Programme Data and Information System (IGBP-DIS) DISCover and the University of Maryland (UMd) 1 km land cover maps. This paper makes a preliminary comparison of the methodologies and results of the two products. The DISCover methodology employed an unsupervised clustering classification scheme on a per-continent basis using 12 monthly maximum NDVI composites as inputs. The UMd approach employed a supervised classification tree method in which temporal metrics derived from all AVHRR bands and the NDVI were used to predict class membership across the entire globe. The DISCover map uses the IGBP classification scheme, while the UMd map employs a modified IGBP scheme minus the classes of permanent wetlands, cropland/natural vegetation mosaic and ice and snow. Global area totals of aggregated vegetation types are very similar and have a per-pixel agreement of 74%. For tall versus short/no vegetation, the per-pixel agreement is 84%. For broad vegetation types, core areas map similarly, while transition zones around core areas differ significantly. This results in high regional variability between the maps. Individual class agreement between the two 1 km maps is 49%. Comparison of the maps at a nominal 0.5° resolution with two global ground-based maps shows an improvement of thematic concurrency of 46% when viewing average class agreement. The absence of the cropland mosaic class creates a difficulty in comparing the maps, due to its significant extent in the DISCover map. The DISCover map, in general, has more forest, while the UMd map has considerably more area in the intermediate tree cover classes of woody savanna/woodland and savanna/wooded grassland.
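
    The per-pixel agreement statistic is straightforward to compute from two co-registered class rasters, as sketched below; the class-aggregation dictionary is hypothetical.

```python
import numpy as np

# Fraction of co-registered cells assigned the same (aggregated) class.
def per_pixel_agreement(map_a: np.ndarray, map_b: np.ndarray) -> float:
    assert map_a.shape == map_b.shape
    return float((map_a == map_b).mean())

# Hypothetical aggregation of detailed classes to broad vegetation types.
aggregate = np.vectorize({1: "forest", 2: "forest", 3: "short", 4: "bare"}.get)
a = np.random.randint(1, 5, (100, 100))
b = np.random.randint(1, 5, (100, 100))
print(per_pixel_agreement(aggregate(a), aggregate(b)))
```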

  10. Mapping soil texture classes and optimization of the result by accuracy assessment

    NASA Astrophysics Data System (ADS)

    Laborczi, Annamária; Takács, Katalin; Bakacsi, Zsófia; Szabó, József; Pásztor, László

    2014-05-01

    There are increasing demands nowadays for spatial soil information to support environment-related and land use management decisions. The GlobalSoilMap.net (GSM) project aims to make a new digital soil map of the world using state-of-the-art and emerging technologies for soil mapping and predicting soil properties at fine resolution. Sand, silt and clay are among the mandatory GSM soil properties. Furthermore, soil texture class information is input data for significant agro-meteorological and hydrological models. Our present work aims to compare and evaluate different digital soil mapping methods and variables for producing the most accurate spatial prediction of texture classes in Hungary. In addition to the Hungarian Soil Information and Monitoring System as our basic data, a digital elevation model and its derived components, a geological database, and the physical property maps of the Digital Kreybig Soil Information System have been applied as auxiliary elements. Two approaches have been applied for the mapping process. First, the sand, silt and clay rasters were computed independently using regression kriging (RK); from these rasters, according to the USDA categories, we compiled the texture class map. Different combinations of reference and training soil data and auxiliary covariables resulted in several different maps. However, these results necessarily include the uncertainty of the three kriged rasters. Therefore we applied data mining methods as the second approach to digital soil mapping: by building classification trees and random forests we obtained the texture class maps directly, so that the various results can be compared to the RK maps. The performance of the different methods and data was examined by testing the accuracy of the geostatistically computed and the directly classified results. We used the GSM methodology to assess the most predictive and accurate way of selecting the best among the several resulting maps. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
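
    The compilation of texture classes from kriged sand/silt/clay rasters can be sketched as below; only three of the twelve USDA classes are implemented, as a simplified illustration of the rule-based step.

```python
import numpy as np

# Simplified texture classification from sand/silt/clay percent rasters.
# A real implementation follows the full USDA texture-triangle boundaries.
def texture_class(sand, silt, clay):
    cls = np.full(sand.shape, "loam", dtype=object)          # fallback class
    cls[(clay >= 40) & (sand <= 45) & (silt < 40)] = "clay"
    cls[(sand >= 85) & (silt + 1.5 * clay < 15)] = "sand"
    return cls

sand = np.random.uniform(0, 100, (50, 50))
clay = np.random.uniform(0, 1, (50, 50)) * (100 - sand)
silt = 100 - sand - clay                                     # fractions sum to 100
print(np.unique(texture_class(sand, silt, clay), return_counts=True))
```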

  11. A process proof test for model concepts: Modelling the meso-scale

    NASA Astrophysics Data System (ADS)

    Hellebrand, Hugo; Müller, Christoph; Matgen, Patrick; Fenicia, Fabrizio; Savenije, Huub

    In hydrological modelling the use of detailed soil data is sometimes troublesome, since these data are often hard to obtain and, if available at all, difficult to interpret and process in a way that makes them meaningful for the model at hand. Intuitively, understanding and mapping the dominant runoff processes in the soil shows high potential for improving hydrological models. In this study a labour-intensive methodology for assessing dominant runoff processes is simplified in such a way that detailed soil maps are no longer needed. Nonetheless, there is an ongoing debate on how to integrate this type of information in hydrological models. In this study, dominant runoff processes (DRP) are mapped for meso-scale basins using the permeability of the substratum, land use information and slope in a GIS. During a field campaign the processes were validated, and for each DRP assumptions were made concerning its water storage capacity by combining soil data obtained during the field campaign with soil data from the literature. Second, several parsimoniously parameterized conceptual hydrological models that incorporate certain aspects of the DRP are used. The results of these models are compared with a benchmark model, in which the soil is represented by only one lumped parameter, to test the contribution of the DRP in hydrological models. The proposed methodology is tested for 15 meso-scale river basins located in Luxembourg. The main goal of this study is to investigate whether integrating dominant runoff processes, which have high information content concerning soil characteristics, into hydrological models allows the improvement of simulation results with a view to regionalization and predictions in ungauged basins. The regionalization procedure gave no clear results. The calibration procedure and the well-mixed discharge signal of the calibration basins are considered major causes for this, and they made the deconvolution of discharge signals of meso-scale basins problematic. The results also suggest that DRP could very well display some sort of uniqueness of place, which was not foreseen in the methods from which they were derived. Furthermore, a strong seasonal influence on model performance was observed, implying a seasonal dependence of the DRP. When comparing the performance of the DRP models with the benchmark model, no real distinction was found. To improve the performance of the DRP models used in this study, and of conceptual models in general, there is a need for improved identification of the mechanisms that cause the different dominant runoff processes at the meso-scale. To achieve this, more orthogonal data could be used for a better conceptualization of the DRPs. Model concepts should then be adapted accordingly.
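
    The GIS overlay for DRP mapping can be sketched as a rule-based raster classification, as below; the rule thresholds and DRP class names are illustrative placeholders, not the study's decision scheme.

```python
import numpy as np

# Rule-based overlay of substratum permeability, land use and slope rasters;
# thresholds and class names are illustrative only.
perm = np.random.randint(0, 2, (100, 100))      # 1 = permeable substratum
forest = np.random.randint(0, 2, (100, 100))    # 1 = forested land use
slope = np.random.uniform(0, 30, (100, 100))    # slope in degrees

drp = np.full(perm.shape, "subsurface_flow", dtype=object)
drp[(perm == 0) & (slope >= 5)] = "fast_runoff"          # impermeable, steep
drp[(perm == 0) & (slope < 5)] = "saturated_overland"    # impermeable, flat
drp[(perm == 1) & (forest == 1)] = "deep_percolation"    # permeable, forested
print({k: int(v) for k, v in zip(*np.unique(drp, return_counts=True))})
```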

  12. A simple methodology to produce flood risk maps consistent with FEMA's base flood elevation maps: Implementation and validation over the entire contiguous United States

    NASA Astrophysics Data System (ADS)

    Goteti, G.; Kaheil, Y. H.; Katz, B. G.; Li, S.; Lohmann, D.

    2011-12-01

    In the United States, government agencies as well as the National Flood Insurance Program (NFIP) use flood inundation maps associated with the 100-year return period (base flood elevation, BFE), produced by the Federal Emergency Management Agency (FEMA), as the basis for flood insurance. A credibility check of the flood risk hydraulic models, often employed by insurance companies, is their ability to reasonably reproduce FEMA's BFE maps. We present results from the implementation of a flood modeling methodology aimed towards reproducing FEMA's BFE maps at a very fine spatial resolution using a computationally parsimonious, yet robust, hydraulic model. The hydraulic model used in this study has two components: one for simulating flooding of the river channel and adjacent floodplain, and the other for simulating flooding in the remainder of the catchment. The first component is based on a 1-D wave propagation model, while the second component is based on a 2-D diffusive wave model. The 1-D component captures the flooding from large-scale river transport (including upstream effects), while the 2-D component captures the flooding from local rainfall. The study domain consists of the contiguous United States, hydrologically subdivided into catchments averaging about 500 km2 in area, at a spatial resolution of 30 meters. Using historical daily precipitation data from the Climate Prediction Center (CPC), the precipitation associated with the 100-year return period event was computed for each catchment and was input to the hydraulic model. Flood extent from the FEMA BFE maps is reasonably replicated by the 1-D component of the model (riverine flooding). FEMA's BFE maps only represent the riverine flooding component and are unavailable for many regions of the USA. However, this modeling methodology (1-D and 2-D components together) covers the entire contiguous USA. This study is part of a larger modeling effort from Risk Management Solutions (RMS) to estimate flood risk associated with extreme precipitation events in the USA. Towards this greater objective, state-of-the-art models of flood hazard and stochastic precipitation are being implemented over the contiguous United States. Results from the successful implementation of the modeling methodology will be presented.
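
    The abstract does not state the frequency-analysis method used to derive the 100-year precipitation; the sketch below shows one common choice, a Gumbel fit to annual maxima, with a synthetic record.

```python
import numpy as np
from scipy import stats

# Estimate the 100-year daily precipitation for one catchment from a series
# of annual maxima; the synthetic record stands in for CPC-derived data.
annual_max_mm = np.random.gumbel(loc=60, scale=15, size=60)
loc, scale = stats.gumbel_r.fit(annual_max_mm)
p100 = stats.gumbel_r.ppf(1 - 1 / 100, loc=loc, scale=scale)
print(f"100-year daily precipitation: {p100:.0f} mm")
```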

  13. Geodiversity assessment for environmental management of geomorphosites: Derborence and Illgraben, Swiss Alps

    NASA Astrophysics Data System (ADS)

    Jaskulska, Alicja; Reynard, Emmanuel; Zwoliński, Zbigniew

    2013-04-01

    The concept of geodiversity was created relatively recently and has been accepted by geomorphologists and geologists worldwide. Nevertheless, despite the widespread use of the term, little progress has been made in its evaluation. Until now, only a few authors have addressed, directly or indirectly, methodological issues related to geodiversity estimation. In some studies, geodiversity maps were applied to investigate the spatial or genetic relationships with the richness of particular environmental elements such as geosites, geomorphosites, geoarchaeological and palaeontological sites. However, so far, the spatial differentiation of geodiversity values in areas already accepted as large geomorphosites has not been undertaken. This poster presents a new methodology developed to assess geodiversity in geoinformation environments, tested in two geomorphosites located in the Swiss Alps: Derborence and Illgraben. Derborence is a rather isolated valley where several large rockslides occurred in the past; the sharp rockslide landforms, high limestone cliffs and a lake dammed by the rockslide deposits attract tourists in summer. Part of the valley is a natural reserve managed by Pro Natura (a national environmental association). Illgraben is a steep torrential system on the left bank of the Rhone River valley, characterized by high erosion rates and frequent occurrence of rockfalls and debris flows. The site is the largest active torrential system in Switzerland and is part of a Regional Nature Park. Both geomorphosites are recognized as geosites of national importance. The basis of the assessment is the selection of features of the geographical environment which, on the one hand, describe landforms and, on the other, indicate geomorphometric differences. First, seven factor maps were processed for each area: landform energy derived from a 25-meter digital elevation model, landform fragmentation generated from the Topographic Position Index (TPI), contemporary landform preservation derived from land use classification using high-resolution ortho images, geological settings, geomorphological features, soils and hydrology elements. Input maps were then standardized by attributing grid geodiversity values in five classes to each raster map: very low, low, medium, high and very high geodiversity. The resulting maps were obtained from map algebra operations carried out by multi-criteria evaluation (MCE) with a GIS-based Weighted Linear Combination (WLC) technique. The final geodiversity maps for each of the two geomorphosites were then compared with existing tourist trails and panoramic points to verify whether there are any dependencies. Geosite inventories are a more or less qualitative selection of sites considered important by the scientific community for their contribution to the knowledge of Earth history and, more generally, to society. Some geosites, in particular geomorphosites, can be quite large (several sq. km) and sometimes heterogeneous. The proposed methodology, tested on two Swiss geomorphosites, allows the intrinsic geodiversity differentiation of large geosites to be assessed, and the results could be used for other purposes such as the preservation of specific features within the geosite perimeter, spatial planning or tourist management.
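
    The WLC step can be sketched as below: standardized five-class factor rasters are combined with MCE weights and the score is re-classified into five geodiversity classes. The weights are illustrative, not those used for Derborence or Illgraben.

```python
import numpy as np

# Weighted linear combination of standardized factor rasters (classes 1-5);
# the weights are illustrative placeholders and sum to 1.0.
weights = {"energy": 0.20, "fragmentation": 0.15, "preservation": 0.10,
           "geology": 0.20, "geomorphology": 0.20, "soils": 0.10,
           "hydrology": 0.05}

shape = (200, 200)
factors = {name: np.random.randint(1, 6, shape) for name in weights}

score = sum(w * factors[name] for name, w in weights.items())
# Re-classify the continuous score back into the five classes (1 = very low).
geodiv_class = np.digitize(score, bins=[1.8, 2.6, 3.4, 4.2]) + 1
print(np.bincount(geodiv_class.ravel())[1:])      # cells per class
```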

  14. Lean methodology in i.v. medication processes in a children's hospital.

    PubMed

    L'Hommedieu, Timothy; Kappeler, Karl

    2010-12-15

    The impact of lean methodology on i.v. medication processes in a children's hospital was studied. Medication orders at a children's hospital were analyzed for 30 days to identify the specific times when most medications were changed or discontinued. Value-stream mapping was used to define the current state of preparation and to identify non-value-added tasks in the i.v. medication preparation and dispensing processes. An optimization model was created using specific measurements to establish the optimal number of batches and their preparation times. Returned i.v. medications were collected for 7 days before and after implementation of the lean process to determine the impact of the process changes. Patient-days increased from 1,836 during the first collection period to 2,017 during the second, and the total number of i.v. doses dispensed increased from 8,054 to 9,907. Wasted i.v. doses decreased from 1,339 (16.6% of the total doses dispensed) to 853 (8.6%). With the new process, Nationwide Children's Hospital was projected to realize a weekly savings of $8,197 ($426,244 annually), resulting in a 2.6% reduction in annual drug expenditure. The annual savings is a conservative estimate, due to the 10% increase in patient-days after the lean collection period compared with baseline. The differences in wasted doses and their costs were significant (p < 0.05). Implementing lean concepts in the i.v. medication preparation process had a positive effect on efficiency and drug cost.
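
    The reported figures are internally consistent, as the quick check below shows (16.6% and 8.6% waste rates, and $8,197 x 52 = $426,244):

```python
# Check of the reported waste rates and annualized savings.
doses_before, wasted_before = 8054, 1339
doses_after, wasted_after = 9907, 853

print(f"waste before: {wasted_before / doses_before:.1%}")   # ~16.6%
print(f"waste after:  {wasted_after / doses_after:.1%}")     # ~8.6%
print(f"annual savings: ${8197 * 52:,}")                     # $426,244
```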

  15. Ten Years of Vegetation Change in Northern California Marshlands Detected using Landsat Satellite Image Analysis

    NASA Technical Reports Server (NTRS)

    Potter, Christopher

    2013-01-01

    The Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) methodology was applied to detect changes in perennial vegetation cover at marshland sites in Northern California reported to have undergone restoration between 1999 and 2009. Results showed extensive contiguous areas of restored marshland plant cover at 10 of the 14 sites selected. Gains in woody shrub cover and/or recovery of herbaceous cover that remains productive and evergreen year-round could be mapped from the image results. However, LEDAPS may not be highly sensitive to changes in wetlands restored mainly with seasonal herbaceous cover (e.g., vernal pools), due to the ephemeral nature of the plant greenness signal. Based on this evaluation, the LEDAPS methodology would be capable of fulfilling a pressing need for consistent, continual, low-cost monitoring of changes in marshland ecosystems of the Pacific Flyway.

  16. The Evolutionary History of Protein Domains Viewed by Species Phylogeny

    PubMed Central

    Yang, Song; Bourne, Philip E.

    2009-01-01

    Background Protein structural domains are evolutionary units whose relationships can be detected over long evolutionary distances. The evolutionary history of protein domains, including the origin of protein domains, the identification of domain loss, transfer, duplication and combination with other domains to form new proteins, and the formation of the entire protein domain repertoire, are of great interest. Methodology/Principal Findings A methodology is presented for providing a parsimonious domain history based on gain, loss, vertical and horizontal transfer derived from the complete genomic domain assignments of 1015 organisms across the tree of life. When mapped to species trees the evolutionary history of domains and domain combinations is revealed, and the general evolutionary trend of domain and combination is analyzed. Conclusions/Significance We show that this approach provides a powerful tool to study how new proteins and functions emerged and to study such processes as horizontal gene transfer among more distant species. PMID:20041107
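
    A minimal flavor of such parsimony mapping is Fitch's small-parsimony algorithm on a single domain's presence/absence character, sketched below; the paper's full model also distinguishes gains, losses and horizontal transfers, which this toy version does not.

```python
# Fitch small parsimony for one domain's presence/absence (1/0) across species.
def fitch(tree, states):
    """tree: leaf name or (left, right); states: leaf -> {0} or {1}.
    Returns (state set at this node, parsimony cost below it)."""
    if isinstance(tree, str):
        return states[tree], 0
    (ls, lc), (rs, rc) = fitch(tree[0], states), fitch(tree[1], states)
    inter = ls & rs
    if inter:
        return inter, lc + rc
    return ls | rs, lc + rc + 1     # one gain-or-loss event on this branch

tree = ((("human", "mouse"), "fly"), "yeast")
presence = {"human": {1}, "mouse": {1}, "fly": {0}, "yeast": {0}}
root, cost = fitch(tree, presence)
print(root, cost)   # {0} 1: a single gain on the human-mouse ancestor branch
```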

  17. Numerical prediction of algae cell mixing feature in raceway ponds using particle tracing methods.

    PubMed

    Ali, Haider; Cheema, Taqi A; Yoon, Ho-Sung; Do, Younghae; Park, Cheol W

    2015-02-01

    In the present study, a novel technique involving numerical computation of the mixing length of algae particles in raceway ponds was used to evaluate the mixing process. A mixing length higher than the maximum streamwise distance (MSD) of the algae cells indicates that the cells experienced adequate turbulent mixing in the pond. A coupling methodology was adopted to map the pulsating effects of a 2D paddle wheel onto a 3D raceway pond. Turbulent mixing was examined based on computations of mixing length, residence time, and algae cell distribution in the pond. The results revealed that particle tracing is an improved approach for defining the mixing phenomenon more effectively. Moreover, the algae cell distribution aided in identifying the degree of mixing in terms of mixing length and residence time.
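
    The mixing-length criterion itself reduces to integrating each particle's path length and comparing it with the MSD, as in the sketch below with synthetic trajectories standing in for CFD output; the MSD value is assumed.

```python
import numpy as np

def path_length(traj: np.ndarray) -> float:
    """Integrated path length of a (T, 3) array of particle positions."""
    return float(np.linalg.norm(np.diff(traj, axis=0), axis=1).sum())

msd = 12.0                                        # assumed streamwise extent, m
trajs = [np.cumsum(np.random.normal(0, 0.05, (500, 3)), axis=0)
         for _ in range(100)]                     # 100 synthetic particle tracks
mixing_lengths = np.array([path_length(t) for t in trajs])
well_mixed = (mixing_lengths > msd).mean()
print(f"{well_mixed:.0%} of particles exceed the MSD (adequately mixed)")
```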

  18. Mapping Base Modifications in DNA by Transverse-Current Sequencing

    NASA Astrophysics Data System (ADS)

    Alvarez, Jose R.; Skachkov, Dmitry; Massey, Steven E.; Kalitsov, Alan; Velev, Julian P.

    2018-02-01

    Sequencing DNA modifications and lesions, such as methylation of cytosine and oxidation of guanine, is even more important and challenging than sequencing the genome itself. The traditional methods for detecting DNA modifications are either insensitive to these modifications or require additional processing steps to identify a particular type of modification. Transverse-current sequencing in nanopores can potentially identify the canonical bases and base modifications in the same run. In this work, we demonstrate that the most common DNA epigenetic modifications and lesions can be detected with any predefined accuracy based on their tunneling current signature. Our results are based on simulations of the nanopore tunneling current through DNA molecules, calculated using nonequilibrium electron-transport methodology within an effective multiorbital model derived from first-principles calculations, followed by a base-calling algorithm accounting for neighbor current-current correlations. This methodology can be integrated with existing experimental techniques to improve base-calling fidelity.
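
    A toy version of current-signature base calling is sketched below: each base or modification is assigned a Gaussian-distributed current level and reads are called by nearest level. The current values are invented, and the neighbor current-current correlations the paper models are omitted.

```python
import numpy as np

# Hypothetical mean tunneling-current levels (nA) per base/modification.
levels = {"A": 1.0, "C": 0.7, "G": 1.3, "mC": 0.75, "oxoG": 1.2}
sigma = 0.05                                   # assumed measurement noise

def call_base(current: float) -> str:
    return min(levels, key=lambda b: abs(current - levels[b]))

truth = np.random.choice(list(levels), 1000)
reads = np.array([levels[b] for b in truth]) + np.random.normal(0, sigma, 1000)
calls = np.array([call_base(c) for c in reads])
print("accuracy:", (calls == truth).mean())    # drops as levels overlap
```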

  19. Collective odor source estimation and search in time-variant airflow environments using mobile robots.

    PubMed

    Meng, Qing-Hao; Yang, Wei-Xing; Wang, Yang; Zeng, Ming

    2011-01-01

    This paper addresses the collective odor source localization (OSL) problem in a time-varying airflow environment using mobile robots. A novel OSL methodology which combines odor-source probability estimation and multiple robots' search is proposed. The estimation phase consists of two steps: firstly, the separate probability-distribution map of odor source is estimated via Bayesian rules and fuzzy inference based on a single robot's detection events; secondly, the separate maps estimated by different robots at different times are fused into a combined map by way of distance based superposition. The multi-robot search behaviors are coordinated via a particle swarm optimization algorithm, where the estimated odor-source probability distribution is used to express the fitness functions. In the process of OSL, the estimation phase provides the prior knowledge for the searching while the searching verifies the estimation results, and both phases are implemented iteratively. The results of simulations for large-scale advection-diffusion plume environments and experiments using real robots in an indoor airflow environment validate the feasibility and robustness of the proposed OSL method.

  20. Collective Odor Source Estimation and Search in Time-Variant Airflow Environments Using Mobile Robots

    PubMed Central

    Meng, Qing-Hao; Yang, Wei-Xing; Wang, Yang; Zeng, Ming

    2011-01-01

    This paper addresses the collective odor source localization (OSL) problem in a time-varying airflow environment using mobile robots. A novel OSL methodology which combines odor-source probability estimation and multiple robots’ search is proposed. The estimation phase consists of two steps: firstly, the separate probability-distribution map of odor source is estimated via Bayesian rules and fuzzy inference based on a single robot’s detection events; secondly, the separate maps estimated by different robots at different times are fused into a combined map by way of distance based superposition. The multi-robot search behaviors are coordinated via a particle swarm optimization algorithm, where the estimated odor-source probability distribution is used to express the fitness functions. In the process of OSL, the estimation phase provides the prior knowledge for the searching while the searching verifies the estimation results, and both phases are implemented iteratively. The results of simulations for large-scale advection–diffusion plume environments and experiments using real robots in an indoor airflow environment validate the feasibility and robustness of the proposed OSL method. PMID:22346650
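
    The single-robot estimation step can be sketched as a Bayesian update of a source-probability grid from one detection event, as below; the upwind-decay likelihood is a crude stand-in for the paper's Bayesian-plus-fuzzy model, and map fusion across robots is omitted.

```python
import numpy as np

# Bayesian update of a source-probability grid from one detection event.
nx, ny = 60, 40
xs, ys = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
prior = np.full((nx, ny), 1.0 / (nx * ny))           # uniform prior over cells

def update(prior, robot, wind, detected, spread=8.0):
    rel = np.stack([xs - robot[0], ys - robot[1]])    # source cell minus robot
    dist = np.hypot(rel[0], rel[1])
    upwind = (rel[0] * wind[0] + rel[1] * wind[1]) < 0  # source upwind of robot
    p_detect = np.where(upwind, np.exp(-dist / spread), 1e-3)
    post = prior * (p_detect if detected else 1.0 - p_detect)
    return post / post.sum()

posterior = update(prior, robot=(30, 20), wind=(1.0, 0.0), detected=True)
print(np.unravel_index(posterior.argmax(), posterior.shape))  # likeliest source cell
```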

  1. A combined approach based on MAF analysis and AHP method to fault detection mapping: A case study from a gas field, southwest of Iran

    NASA Astrophysics Data System (ADS)

    Shakiba, Sima; Asghari, Omid; Khah, Nasser Keshavarz Faraj

    2018-01-01

    A combined geostatistical methodology based on Min/Max Auto-correlation Factor (MAF) analysis and the Analytical Hierarchy Process (AHP) is presented to generate a suitable Fault Detection Map (FDM) from seismic attributes. Five seismic attributes derived from a 2D time slice of data from a gas field located in southwest Iran are used: instantaneous amplitude, similarity, energy, frequency, and a Fault Enhancement Filter (FEF). The MAF analysis is implemented to reduce the dimension of the input variables, and the AHP method is then applied to the three resulting de-correlated MAF factors as evidential layers. Three Decision Makers (DMs) construct pairwise comparison matrices (PCMs) to determine the weights of the selected evidential layers. Finally, the weights obtained by AHP were multiplied by the normalized values of each alternative (the MAF layers), and the resulting weighted layers were integrated to prepare the final FDM. Results showed that the proposed algorithm generates a map more acceptable than any individual attribute, sharpening sub-surface discontinuities and enhancing the continuity of detected faults.

  2. Novel Passive Clearing Methods for the Rapid Production of Optical Transparency in Whole CNS Tissue.

    PubMed

    Woo, Jiwon; Lee, Eunice Yoojin; Park, Hyo-Suk; Park, Jeong Yoon; Cho, Yong Eun

    2018-05-08

    Since the development of CLARITY, a bioelectrochemical clearing technique that allows for three-dimensional phenotype mapping within transparent tissues, a multitude of novel clearing methodologies including CUBIC (clear, unobstructed brain imaging cocktails and computational analysis), SWITCH (system-wide control of interaction time and kinetics of chemicals), MAP (magnified analysis of the proteome), and PACT (passive clarity technique), have been established to further expand the existing toolkit for the microscopic analysis of biological tissues. The present study aims to improve upon and optimize the original PACT procedure for an array of intact rodent tissues, including the whole central nervous system (CNS), kidneys, spleen, and whole mouse embryos. Termed psPACT (process-separate PACT) and mPACT (modified PACT), these novel techniques provide highly efficacious means of mapping cell circuitry and visualizing subcellular structures in intact normal and pathological tissues. In the following protocol, we provide a detailed, step-by-step outline on how to achieve maximal tissue clearance with minimal invasion of their structural integrity via psPACT and mPACT.

  3. Planetary Geologic Mapping Handbook - 2010. Appendix

    NASA Technical Reports Server (NTRS)

    Tanaka, K. L.; Skinner, J. A., Jr.; Hare, T. M.

    2010-01-01

    Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces. Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962. Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views, and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete. Terrestrial geologic maps published by the USGS now are primarily digital products using geographic information system (GIS) software and file formats. GIS mapping tools permit easy spatial comparison, generation, importation, manipulation, and analysis of multiple raster image, gridded, and vector data sets. GIS software has also permitted the development of project-specific tools and the sharing of geospatial products among researchers. GIS approaches are now being used in planetary geologic mapping as well. Guidelines or handbooks on techniques in planetary geologic mapping have been developed periodically. As records of the heritage of mapping methods and data, these remain extremely useful guides. However, many of the fundamental aspects of earlier mapping handbooks have evolved significantly, and a comprehensive review of currently accepted mapping methodologies is now warranted. As documented in this handbook, such a review incorporates additional guidelines developed in recent years for planetary geologic mapping by the NASA Planetary Geology and Geophysics (PGG) Program's Planetary Cartography and Geologic Mapping Working Group's (PCGMWG) Geologic Mapping Subcommittee (GEMS) on the selection and use of map bases as well as map preparation, review, publication, and distribution. In light of the current boom in planetary exploration and the ongoing rapid evolution of available data for planetary mapping, this handbook is especially timely.

  4. Groundwater vulnerability and risk mapping using GIS, modeling and a fuzzy logic tool.

    PubMed

    Nobre, R C M; Rotunno Filho, O C; Mansur, W J; Nobre, M M M; Cosenza, C A N

    2007-12-07

    A groundwater vulnerability and risk mapping assessment, based on a source-pathway-receptor approach, is presented for an urban coastal aquifer in northeastern Brazil. A modified version of the DRASTIC methodology was used to map the intrinsic and specific groundwater vulnerability of a 292 km(2) study area. A fuzzy hierarchy methodology was adopted to evaluate the potential contaminant source index, including diffuse and point sources. Numerical modeling was performed for delineation of well capture zones, using MODFLOW and MODPATH. The integration of these elements provided the mechanism to assess groundwater pollution risks and identify areas that must be prioritized in terms of groundwater monitoring and restriction on use. A groundwater quality index based on nitrate and chloride concentrations was calculated, which had a positive correlation with the specific vulnerability index.
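
    For context, the classical DRASTIC index that the modified methodology builds on is a weighted sum of seven parameter ratings using the standard weights; the cell ratings below are hypothetical.

```python
# Classical DRASTIC intrinsic-vulnerability index: ratings (1-10) times the
# standard Aller et al. (1987) weights; higher values mean more vulnerable.
DRASTIC_WEIGHTS = {
    "Depth_to_water": 5, "net_Recharge": 4, "Aquifer_media": 3,
    "Soil_media": 2, "Topography": 1, "Impact_vadose": 5,
    "hydraulic_Conductivity": 3,
}

def drastic_index(ratings: dict) -> int:
    """ratings: parameter -> rating on the 1-10 DRASTIC scale."""
    return sum(DRASTIC_WEIGHTS[p] * r for p, r in ratings.items())

cell = {"Depth_to_water": 9, "net_Recharge": 6, "Aquifer_media": 8,
        "Soil_media": 7, "Topography": 10, "Impact_vadose": 8,
        "hydraulic_Conductivity": 6}   # hypothetical ratings for one cell
print(drastic_index(cell))             # possible range 23-230
```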

  5. Centimeter-Level Robust Gnss-Aided Inertial Post-Processing for Mobile Mapping Without Local Reference Stations

    NASA Astrophysics Data System (ADS)

    Hutton, J. J.; Gopaul, N.; Zhang, X.; Wang, J.; Menon, V.; Rieck, D.; Kipka, A.; Pastor, F.

    2016-06-01

    For almost two decades mobile mapping systems have done their georeferencing using Global Navigation Satellite Systems (GNSS) to measure position and inertial sensors to measure orientation. In order to achieve cm-level position accuracy, a technique referred to as post-processed carrier phase differential GNSS (DGNSS) is used. For this technique to be effective, the maximum distance to a single Reference Station should be no more than 20 km, and when using a network of Reference Stations the distance to the nearest station should be no more than about 70 km. This need to set up local Reference Stations limits productivity and increases costs, especially when mapping large areas or long linear features such as roads or pipelines. An alternative technique to DGNSS for high-accuracy positioning from GNSS is the so-called Precise Point Positioning or PPP method. In this case, instead of differencing the rover observables with the Reference Station observables to cancel out common errors, an advanced model for every aspect of the GNSS error chain is developed and parameterized to within an accuracy of a few cm. The Trimble Centerpoint RTX positioning solution combines the methodology of PPP with advanced ambiguity resolution technology to produce cm-level accuracies without the need for local reference stations. It achieves this through a global deployment of highly redundant monitoring stations that are connected through the internet and are used to determine the precise satellite data with maximum accuracy, robustness, continuity and reliability, along with advanced algorithms and receiver and antenna calibrations. This paper presents a new post-processed realization of the Trimble Centerpoint RTX technology integrated into the Applanix POSPac MMS GNSS-Aided Inertial software for mobile mapping. Real-world results from over 100 airborne flights evaluated against a DGNSS network reference are presented, which show that the post-processed Centerpoint RTX solution agrees with the DGNSS solution to better than 2.9 cm RMSE horizontal and 5.5 cm RMSE vertical. Such accuracies are sufficient to meet the requirements for a majority of airborne mapping applications.
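
    The reported agreement statistics are simple to reproduce from per-epoch coordinate differences, as sketched below with synthetic residuals standing in for the real flight comparisons.

```python
import numpy as np

# RMSE of horizontal and vertical differences between the post-processed PPP
# solution and the DGNSS reference (synthetic residuals in metres).
d_east = np.random.normal(0, 0.02, 5000)
d_north = np.random.normal(0, 0.02, 5000)
d_up = np.random.normal(0, 0.05, 5000)

rmse_h = np.sqrt(np.mean(d_east**2 + d_north**2))
rmse_v = np.sqrt(np.mean(d_up**2))
print(f"horizontal RMSE: {rmse_h * 100:.1f} cm, vertical RMSE: {rmse_v * 100:.1f} cm")
```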

  6. Mapping Sustainability Efforts at the Claremont Colleges

    ERIC Educational Resources Information Center

    Srebotnjak, Tanja; Norgaard, Lee Michelle

    2017-01-01

    Purpose: The purpose of this study is to map and analyze sustainability activities and relationships at the seven Claremont Colleges and graduate institutions using social network analysis (SNA) to inform sustainability planning and programming. Design/methodology/approach: Online surveys and interviews were conducted among faculty, staff and…

  7. Methodology of the interpretation of remote sensing data and applications in geology

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Veneziani, P.; Dosanjos, C. E.

    1981-01-01

    Methods used for interpreting orbital (LANDSAT) data for regional geological mapping in Brazil are examined. Particular attention is given to the levels of analysis used for studying geomorphology, structural geology, lithology, stratigraphy, surface geology, and dynamic processes. Examples of regional mapping described include: (1) rock intrusions in SE Sao Paulo, the southern parts of Minas Gerais, and the states of Rio de Janeiro, and Espiritu Santo; (2) a preliminary survey of Pre-Cambrian geology in the State of Piaui; and (3) the Gondwana Project - surveying Jaguaribe plants. Mineral exploration in Rio Grande do Sul, and the geology of the Alcalino complex of Itatiaia are discussed as well as the use of automatic classifications of rock intrusions and of ilmenite deposits in the Floresta Region. Aerial photography, side looking radar, and thermal infrared scanning are other types of remote sensors also used in prospecting for geothermal anomalies in the city of Caldas Novas-Goias.

“Everybody sounds the same”: Otherwise Overlooked Ideology in Perceptual Dialectology

    PubMed Central

    Evans, Betsy E.

    2014-01-01

    When analyzing dialectology survey data, researchers usually exclude respondents who do not complete the survey as directed. It is argued here that such “unusable” responses can be considered “outlier” data and analyzed rather than be excluded, allowing otherwise overlooked language ideologies to emerge. Responses to a perceptual dialectology map survey in which 31 of the 229 respondents wrote comments on a map of Washington state, without drawing lines around perceived dialect areas as instructed, are described to illustrate this point. In the present data, ideologies such as the homogeneity of dialects and the importance of an urban/rural dichotomy surfaced. These themes are examined in terms outlined by Judith Irvine and Susan Gal in their discussion of how ideological processes are evident in language data. In addition, methodological issues regarding the presuppositions and orientation of respondents to the questionnaire itself are raised. PMID:25859054

  9. Implementing automatic LiDAR and supervised mapping methodologies to quantify agricultural terraced landforms at landscape scale: the case of Veneto Region

    NASA Astrophysics Data System (ADS)

    Eugenio Pappalardo, Salvatore; Ferrarese, Francesco; Tarolli, Paolo; Varotto, Mauro

    2016-04-01

    Traditional agricultural terraced landscapes embody an important cultural value that deserves deeper investigation, both for their role in local heritage and the cultural economy and for their potential geo-hydrological hazard due to abandonment and degradation. Moreover, traditional terraced landscapes are usually based on non-intensive agro-systems and may enhance important ecosystem services such as agro-biodiversity conservation and cultural services. Due to their unplanned genesis, mapping, quantifying and classifying agricultural terraces at the regional scale is often critical, since they are usually set up on geomorphologically and historically complex landscapes. Traditional mapping methods are generally based on scientific literature and local documentation, historical and cadastral sources, technical cartography, visual interpretation of aerial images or, finally, field surveys; their limitations and uncertainty at the regional scale are mainly related to forest cover and the lack of thematic cartography. The Veneto Region (NE Italy) presents a wide heterogeneity of agricultural terraced landscapes, mainly distributed within the hilly and Prealps areas. Previous studies using traditional mapping methods quantified 2,688 ha of terraced areas, with the highest values within the Prealps of Lessinia (1,013 ha, Province of Verona) and in the Brenta Valley (421 ha, Province of Vicenza); however, the terraced features of these case studies show relevant differences in terms of fragmentation and terracing intensity, highlighting dissimilar degrees of clustering: 1.7 ha per terraced area on the one hand (Province of Verona) and 1.2 ha on the other (Province of Vicenza). The aim of this paper is to implement and compare automatic methodologies with traditional survey methodologies to map and assess agricultural terraces in two representative areas of the Veneto Region. Testing different remote sensing analyses, such as LiDAR topographic survey and visual interpretation of aerial orthophotos (RGB+NIR bands), we performed a territorial analysis in the Lessinia and Brenta Valley case studies. Preliminary results show that terraced-feature extraction by automatic LiDAR survey is more efficient both in identifying geometries (walls and terraced surfaces) and in quantifying features under the forest canopy; however, the traditional mapping methodology confirms its strength by combining different methods and data, such as aerial photo interpretation, maps and field surveys. The two methods compared here thus cross-validate each other and provide a better understanding of the complexity of this kind of landscape.
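
    One simple ingredient of automatic terrace extraction from a LiDAR DTM is a slope threshold separating steep risers from flat treads, sketched below with a synthetic surface and illustrative thresholds; the actual workflow is considerably more involved.

```python
import numpy as np

# Slope-threshold candidates for terrace walls (steep risers) and treads
# (near-flat cultivated surfaces) from a DTM; thresholds are illustrative.
def slope_deg(dtm: np.ndarray, cell: float) -> np.ndarray:
    gy, gx = np.gradient(dtm, cell)
    return np.degrees(np.arctan(np.hypot(gx, gy)))

dtm = np.random.rand(300, 300).cumsum(axis=0)   # synthetic sloping surface
s = slope_deg(dtm, cell=1.0)
wall_candidates = s > 35.0
tread_candidates = s < 8.0
print(wall_candidates.mean(), tread_candidates.mean())
```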

  10. An experimental system for flood risk forecasting at global scale

    NASA Astrophysics Data System (ADS)

    Alfieri, L.; Dottori, F.; Kalas, M.; Lorini, V.; Bianchi, A.; Hirpa, F. A.; Feyen, L.; Salamon, P.

    2016-12-01

    Global flood forecasting and monitoring systems are nowadays a reality and are being applied by an increasing range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk-based forecasts, combining streamflow estimations with expected inundated areas and flood impacts. To this end, we have developed an experimental procedure for near-real-time flood mapping and impact assessment based on the daily forecasts issued by the Global Flood Awareness System (GloFAS). The methodology translates GloFAS streamflow forecasts into event-based flood hazard maps based on the predicted flow magnitude, the forecast lead time and a database of flood hazard maps with global coverage. Flood hazard maps are then combined with exposure and vulnerability information to derive flood risk. Impacts of the forecasted flood events are evaluated in terms of flood-prone areas, potential economic damage, and affected population, infrastructure and cities. To further increase the reliability of the proposed methodology, we integrated model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification of impact forecasts. Preliminary tests provided good results and showed the potential of the developed real-time operational procedure for helping emergency response and management. In particular, the link with social media is crucial for improving the accuracy of impact predictions.
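
    The core logic of such a risk-based chain, selecting a pre-computed hazard map by the forecast flow magnitude and overlaying it with exposure, can be sketched as follows. This is a hedged toy example, not the GloFAS implementation; the return periods, depth threshold and gridded data are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    shape = (100, 100)

    # Toy stand-ins for a database of event-based hazard maps (inundation
    # depth in metres) indexed by return period, plus gridded exposure.
    hazard_maps = {rp: rng.gamma(1.0, 0.3 * i + 0.2, shape)
                   for i, rp in enumerate([10, 50, 100, 500])}
    population = rng.poisson(5, shape)          # persons per cell

    def impact_forecast(forecast_return_period, depth_threshold=0.5):
        """Select the hazard map matching the forecast flow magnitude and
        combine it with exposure to estimate affected population."""
        # choose the largest available return period not above the forecast
        rp = max(p for p in hazard_maps if p <= forecast_return_period)
        flooded = hazard_maps[rp] > depth_threshold
        return rp, int(flooded.sum()), int(population[flooded].sum())

    rp, cells, people = impact_forecast(75)
    print(f"using {rp}-yr map: {cells} flooded cells, ~{people} people affected")
    ```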

  11. Research into Australian emergency services personnel mental health and wellbeing: An evidence map.

    PubMed

    Varker, Tracey; Metcalf, Olivia; Forbes, David; Chisolm, Katherine; Harvey, Sam; Van Hooff, Miranda; McFarlane, Alexander; Bryant, Richard; Phelps, Andrea J

    2018-02-01

    Evidence maps are a method of systematically characterising the range of research activity in broad topic areas and a tool for guiding research priorities. 'Evidence-mapping' methodology was used to quantify the nature and distribution of recent peer-reviewed research into the mental health and wellbeing of Australian emergency services personnel. A search of the PsycINFO, EMBASE and Cochrane Library databases was performed for primary research articles published between January 2011 and July 2016. In all, 43 primary research studies were identified and mapped. The majority of the research focused on organisational and individual/social factors and how they relate to mental health problems and wellbeing. Very few studies were detected in several areas, including suicide, personality, stigma, pre-employment factors that may contribute to mental health outcomes, and the use of e-health. No studies were detected that examined the prevalence of self-harm and/or harm to others, bullying, alcohol/substance use, barriers to care, or the experience of families of emergency services personnel. In addition, no comprehensive national study had investigated all sectors of emergency services personnel. This evidence map highlights the need for future research to address the current gaps in mental health and wellbeing research among Australian emergency services personnel. Improved understanding of the mental health and wellbeing of emergency services personnel, and the factors that contribute to them, should guide organisations' wellbeing policies and procedures.

  12. Utilizing Lean Six Sigma Methodology to Improve the Authored Works Command Approval Process at Naval Medical Center San Diego.

    PubMed

    Valdez, Michelle M; Liwanag, Maureen; Mount, Charles; Rodriguez, Rechell; Avalos-Reyes, Elisea; Smith, Andrew; Collette, David; Starsiak, Michael; Green, Richard

    2018-03-14

    Inefficiencies in the command approval process for publications and presentations negatively impact DoD Graduate Medical Education (GME) residency programs' ability to meet ACGME scholarly activity requirements. A preliminary review of the authored works approval process at Naval Medical Center San Diego (NMCSD) disclosed significant inefficiency, variation in process, and a low level of customer satisfaction. In order to facilitate and encourage scholarly activity at NMCSD and meet ACGME requirements, the Executive Steering Council (ESC) chartered an interprofessional team to lead a Lean Six Sigma (LSS) Rapid Improvement Event (RIE) project. Two major outcome metrics were identified: (1) the number of authored works submissions containing all required signatures and (2) customer satisfaction with the authored works process. Primary metric baseline data were gathered using a Clinical Investigations database tracking publications and presentations. Secondary metric baseline data were collected via a customer satisfaction survey of GME faculty and residents. The project team analyzed pre-survey data and utilized LSS tools and methodology including a "gemba" (environment) walk, a cause-and-effect diagram, a critical-to-quality tree, voice of the customer, a "muda" (waste) chart, and pre- and post-event value stream maps. The team selected an electronic submission system as the intervention most likely to positively impact the RIE project outcome measures. The number of authored works compliant with all required signatures improved from 52% to 100%. Customer satisfaction rated as "completely or mostly satisfied" improved from 24% to 97%. For both outcomes, signature compliance and customer satisfaction, statistical significance was achieved with p < 0.0001. This RIE project utilized LSS methodology and tools to improve signature compliance and increase customer satisfaction with the authored works approval process, leading to 100% signature compliance, a comprehensive longitudinal repository of all authored work requests, and a 97% "completely or mostly satisfied" customer rating of the process.

  13. Methodology for Elaborating Regional Susceptibility Maps of Slope Instability: the State of Guerrero (mexico) Case Study

    NASA Astrophysics Data System (ADS)

    González Huesca, A. E.; Ferrés, D.; Domínguez-M, L.

    2013-05-01

    Numerous cases of different types of slope instability occur every year in the mountainous areas of México. These instabilities sometimes severely affect exposed communities, roads and infrastructure, causing deaths and serious material damage, mainly in the states of Puebla, Veracruz, Oaxaca, Guerrero and Chiapas, in the central and southern sectors of the country. The occurrence of slope instability results from the combination of climatic, geologic, hydrologic, geomorphologic and anthropogenic factors. The National Center for Disaster Prevention (CENAPRED) is developing several projects to offer the civil protection authorities of the Mexican states methodologies for assessing the hazard of different natural phenomena at a regional level. In this framework, during the past two years, a methodology was prepared to construct susceptibility maps for slope instability at regional (≤ 1:100 000) and national (≤ 1:1 000 000) levels. This research followed the criteria established by the International Association of Engineering Geology, the highest international authority on this topic. The state of Guerrero was taken as a pilot case for elaborating the susceptibility map for slope instability at a regional level. The major factors considered in the methodology to calculate susceptibility are: a) the slope of the surface, b) the geology and c) the land use, which were integrated using a Geographic Information System (GIS). The arithmetic sum and weighting factors used to obtain the final susceptibility map were based on the average values calculated in individual studies of several cases of slope instability that occurred in the state during the past decade. For each case, the evaluation format proposed by CENAPRED in 2006 in the "Guía Básica para la elaboración de Atlas Estatales y Municipales de Peligros y Riesgos" for evaluating instabilities at a local level was applied. The resulting susceptibility map shows that the central and east-central sectors of the state of Guerrero have the highest susceptibility to slope instability. Future work will elaborate hazard maps of slope instability for the state of Guerrero by combining the susceptibility information with data on trigger factors, such as precipitation and seismicity, for different recurrence periods. The final goal is for this methodology to be applicable to other states of the country, in order to enrich and enhance their hazard and risk atlases.
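
    A weighted arithmetic sum of reclassified factor rasters of this kind can be reproduced in a few lines; the sketch below assumes numpy, and the weights and class breaks are hypothetical stand-ins for the CENAPRED-calibrated values, which the abstract does not list.

    ```python
    import numpy as np

    # Toy factor rasters reclassified to a common 1-5 susceptibility score,
    # as the abstract does for slope, geology and land use.
    rng = np.random.default_rng(1)
    slope_score   = rng.integers(1, 6, (200, 200))
    geology_score = rng.integers(1, 6, (200, 200))
    landuse_score = rng.integers(1, 6, (200, 200))

    # Hypothetical weights standing in for the averages calibrated from
    # past instability cases; the actual values are not in the abstract.
    weights = {"slope": 0.5, "geology": 0.3, "landuse": 0.2}

    susceptibility = (weights["slope"]   * slope_score +
                      weights["geology"] * geology_score +
                      weights["landuse"] * landuse_score)

    # Classify the weighted sum into five ordinal classes for the final map.
    classes = np.digitize(susceptibility, bins=[1.8, 2.6, 3.4, 4.2])
    print(np.bincount(classes.ravel(), minlength=5))
    ```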

  14. Susceptibility and triggering scenarios at a regional scale for shallow landslides

    NASA Astrophysics Data System (ADS)

    Gullà, G.; Antronico, L.; Iaquinta, P.; Terranova, O.

    2008-07-01

    The work aims at identifying susceptible areas and pluviometric triggering scenarios at a regional scale in Calabria (Italy), with reference to shallow landsliding events. The proposed methodology follows a statistical approach and uses a GIS-linked database created to support the various steps of spatial data management and manipulation. The shallow landslide predisposing factors taken into account are derived from (i) the 40-m digital terrain model of the region, covering an extension of ~15,075 km²; (ii) outcropping lithology; (iii) soils; and (iv) land use. More precisely, a slope map was drawn from the digital terrain model. Two kinds of covers [prevalently coarse-grained (CG cover) or fine-grained (FG cover)] were identified by referring the geotechnical characteristics of the geomaterial covers to the lithology map; soilscapes were drawn from soil maps; and finally, the land use map was employed without any prior processing. Subsequently, inventory maps of several past shallow landsliding events, totaling more than 30,000 instabilities detected by field surveys and aerial photo restitution, were employed to calibrate the relative importance of these predisposing factors. The use of single factors (first-level analysis) provides three different susceptibility maps. Second-level analysis, however, enables better location of areas susceptible to shallow landsliding by crossing the single susceptibility maps. On the basis of the susceptibility map obtained by the second-level analysis, five classes of susceptibility to shallow landsliding have been outlined over the regional territory: 8.9% of the territory shows very high susceptibility, 14.3% high, 15% moderate, 3.6% low, and about 58% very low susceptibility. Finally, the maps of two significant past shallow landsliding events and their related rainfalls were used to identify the relevant pluviometric triggering scenarios. Using 205 daily rainfall series, different triggering pluviometric scenarios were identified for the CG and FG covers: a total event rainfall of 365 mm and/or a maximum intensity of 170 mm/d can trigger shallow landsliding for CG covers, while 325 mm and/or 158 mm/d suffice for FG covers. The results obtained from this study can help administrative authorities plan future development activities and mitigation measures in shallow landslide-prone areas. In addition, the proposed methodology can be useful in managing emergency situations at a regional scale for shallow landsliding events triggered by intense rainfalls; through this approach, the susceptibility and pluviometric triggering scenario maps will be improved by means of finer calibration of the involved factors.
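
    Using the thresholds reported above, a minimal triggering-scenario check over a daily rainfall series might look as follows; the event definition used here (a run of consecutive wet days) is a simplification assumed for illustration.

    ```python
    def triggering_alert(daily_rain_mm, cover="CG"):
        """Flag days whose event totals or intensities reach the thresholds
        reported in the abstract (CG: 365 mm total or 170 mm/d;
        FG: 325 mm total or 158 mm/d). Here an "event" is simplified to a
        run of consecutive wet days."""
        total_thr, intensity_thr = {"CG": (365, 170), "FG": (325, 158)}[cover]
        alerts, event_total = [], 0.0
        for day, rain in enumerate(daily_rain_mm):
            event_total = event_total + rain if rain > 0 else 0.0
            if rain >= intensity_thr or event_total >= total_thr:
                alerts.append(day)
        return alerts

    series = [0, 12, 80, 175, 0, 30]   # toy daily rainfall series (mm)
    print(triggering_alert(series, cover="CG"))   # day 3 exceeds 170 mm/d
    ```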

  15. An Investigation of Automatic Change Detection for Topographic Map Updating

    NASA Astrophysics Data System (ADS)

    Duncan, P.; Smit, J.

    2012-08-01

    Changes to the landscape are constantly occurring, and it is essential for geospatial and mapping organisations that these changes are regularly detected and captured so that map databases can be updated to reflect the current status of the landscape. The Chief Directorate of National Geospatial Information (CD: NGI), South Africa's national mapping agency, currently relies on manual methods of detecting and capturing changes. These manual methods are time consuming and labour intensive, and rely on the skills and interpretation of the operator. It is therefore necessary to move towards more automated methods in the production process at CD: NGI. The aim of this research is to investigate a methodology for automatic or semi-automatic change detection for the purpose of updating topographic databases. The method investigated detects changes through image classification as well as spatial analysis, and is focused on urban landscapes. The major data inputs to this study are high-resolution aerial imagery and existing topographic vector data. Initial results indicate that traditional pixel-based image classification approaches are unsatisfactory for large-scale land-use mapping and that object-oriented approaches hold more promise. Even with object-oriented image classification, however, generalization of techniques at a broad scale has produced inconsistent results. A solution may lie in a hybrid approach combining pixel-based and object-oriented techniques.

  16. High resolution regional soil carbon mapping in Madagascar : towards easy to update maps

    NASA Astrophysics Data System (ADS)

    Grinand, Clovis; Dessay, Nadine; Razafimbelo, Tantely; Razakamanarivo, Herintsitoaina; Albrecht, Alain; Vaudry, Romuald; Tiberghien, Matthieu; Rasamoelina, Maminiaina; Bernoux, Martial

    2013-04-01

    Soil organic carbon plays an important role in climate change regulation through carbon emissions and sequestration due to land use changes, notably tropical deforestation. Monitoring soil carbon emissions from shifting cultivation requires evaluating the amount of carbon stored at plot scale with a sufficient level of accuracy to detect changes. The objective of this work was to map soil carbon stocks (to 30 cm and 100 cm depths) for different land uses at regional scale using a high-resolution satellite dataset. The Andohahela National Park and its surroundings (south-east Madagascar), a region with the largest deforestation rate in the country, was selected as a pilot area for developing the methodology. A three-step approach was set up: (i) carbon inventory using mid-infrared spectroscopy and stock calculation, (ii) spatial data processing and (iii) modeling and mapping. Soil spectroscopy was successfully used for measuring organic carbon in this region. The results show that Random Forest was the inference model that produced the best estimates on the calibration and validation datasets. Using a simple and robust method, we estimated uncertainty levels of 35% and 43% for the 30-cm and 100-cm carbon maps, respectively. The approach developed in this study was based on open data and open-source software and can easily be replicated in other regions and for other time periods using updated satellite images.
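
    A minimal sketch of the modeling step, assuming scikit-learn and synthetic stand-ins for the spectroscopy-derived carbon stocks and satellite covariates, could look like this:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import r2_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(2)

    # Toy covariate table standing in for the satellite-derived predictors
    # (e.g. reflectance bands, terrain derivatives) at soil sampling sites.
    X = rng.normal(size=(300, 6))
    # Synthetic 0-30 cm carbon stocks (t C/ha) with noise; real values
    # would come from the mid-infrared spectroscopy inventory.
    y = 40 + 8 * X[:, 0] - 5 * X[:, 1] + rng.normal(0, 4, 300)

    X_cal, X_val, y_cal, y_val = train_test_split(X, y, random_state=0)
    model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_cal, y_cal)
    print("validation R2:", round(r2_score(y_val, model.predict(X_val)), 2))
    # model.predict(covariate_stack) would then be applied cell by cell
    # to map carbon stocks across the study area.
    ```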

  17. Exploring identity within the recovery process of people with serious mental illnesses.

    PubMed

    Buckley-Walker, Kellie; Crowe, Trevor; Caputi, Peter

    2010-01-01

    To examine self-identity within the recovery processes of people with serious mental illnesses using a repertory grid methodology, a cross-sectional study involving 40 mental health service consumers was conducted. Participants rated different "self" and "other" elements on the repertory grid against constructs related to recovery, and completed other recovery-focused measures. Perceptions of one's "ideal self" represented more advanced recovery, in contrast to perceptions of "a person mentally unwell." Current perceptions of self were most similar to perceptions of one's "usual self" and least similar to "a person who is mentally unwell." Increased identification with one's "ideal self" reflected increased hopefulness regarding recovery. The recovery repertory grid shows promise in clinical practice for exploring identity as a key variable within mental health recovery processes. Distance measures of similarity between various self-elements, including perceptions of others, map logically onto the recovery process of hope.

  18. Sentinel-1 data exploitation for geohazard activity map generation

    NASA Astrophysics Data System (ADS)

    Barra, Anna; Solari, Lorenzo; Béjar-Pizarro, Marta; Monserrat, Oriol; Herrera, Gerardo; Bianchini, Silvia; Crosetto, Michele; María Mateos, Rosa; Sarro, Roberto; Moretti, Sandro

    2017-04-01

    This work focuses on geohazard mapping and monitoring by exploiting Sentinel-1 (A and B) data and DInSAR (Differential Interferometric SAR (Synthetic Aperture Radar)) techniques. The interpretation of DInSAR-derived products (such as the velocity map) can be complex, especially for a final user who does not usually work with radar. The aim of this work is to generate, in a rapid way, a clear product that can easily be exploited by the authorities in geohazard management: intervention planning and prevention activities. Specifically, the presented methodology has been developed in the framework of the European project SAFETY, which aims at providing Civil Protection Authorities (CPA) with the capability of periodically evaluating and assessing the potential impact of geohazards (volcanic activity, earthquakes, landslides and subsidence) on urban areas. The methodology has three phases: interferogram generation; activity map generation, in terms of velocity and accumulated deformation (with time series); and Active Deformation Area (ADA) map generation. The last is the final product, derived from the original activity map by analyzing the data in a Geographic Information System (GIS) environment, which isolates the true deformation areas from the noise. This product can be read more easily by the authorities than the original activity map, i.e., it can be better exploited to integrate other information and analyses, and it also permits easy monitoring of the active areas.
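
    The ADA extraction step can be illustrated with a toy velocity map: threshold by a noise-derived velocity level and keep only contiguous clusters of active pixels. The sketch assumes numpy/scipy; the 2-sigma rule and minimum cluster size are assumptions, not the SAFETY project's actual criteria.

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(3)

    # Toy line-of-sight velocity map (mm/yr): background noise plus one
    # subsiding patch, standing in for the DInSAR-derived product.
    velocity = rng.normal(0, 1.5, (120, 120))
    velocity[40:60, 70:95] -= 12.0

    # Points count as "active" when |v| exceeds k times the noise level,
    # a common rule of thumb assumed here for illustration.
    sigma = np.std(velocity[:30, :30])          # noise from a stable area
    active = np.abs(velocity) > 2 * sigma

    # Group contiguous active pixels and keep only clusters large enough
    # to be trusted, isolating true deformation areas from isolated noise.
    labels, n = ndimage.label(active)
    sizes = ndimage.sum(active, labels, index=range(1, n + 1))
    ada = np.isin(labels, [i + 1 for i, s in enumerate(sizes) if s >= 20])
    print(f"{int(ada.sum())} pixels in {int((sizes >= 20).sum())} ADA clusters")
    ```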

  19. iMindMap as an Innovative Tool in Teaching and Learning Accounting: An Exploratory Study

    ERIC Educational Resources Information Center

    Wan Jusoh, Wan Noor Hazlina; Ahmad, Suraya

    2016-01-01

    Purpose: The purpose of this study is to explore the use of iMindMap software as an interactive tool in the teaching and learning method and also to be able to consider iMindMap as an alternative instrument in achieving the ultimate learning outcome. Design/Methodology/Approach: Out of 268 students of the management accounting at the University of…

  20. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is, the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve, which is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve based on a Bayesian fitting methodology. The robust approach allows a significant reduction in the number of simulations and, therefore, in the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
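
    The empirical hazard-curve construction can be sketched directly: count exceedances of each intensity level across the simulated event set and divide by the covered time span. All numbers below are illustrative, not results from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Toy Monte Carlo output: maximum inundation depth (m) at one coastal
    # site for 5000 simulated tsunamigenic events drawn over a stochastic
    # catalogue spanning T years (values and rates are illustrative only).
    T_years = 100_000
    depths = rng.lognormal(mean=-1.0, sigma=1.2, size=5000)

    def hazard_curve(depths, T_years, levels):
        """Empirical mean annual rate of exceedance at given depth levels."""
        return np.array([(depths > h).sum() / T_years for h in levels])

    levels = np.array([0.1, 0.5, 1.0, 2.0, 5.0])
    rates = hazard_curve(depths, T_years, levels)
    for h, lam in zip(levels, rates):
        # Poisson assumption: P(at least one exceedance in 50 years)
        print(f"depth > {h:.1f} m: rate {lam:.2e}/yr, "
              f"P(50 yr) = {1 - np.exp(-50 * lam):.3f}")
    ```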

  1. Landslides susceptibility mapping at Gunung Ciremai National Park

    NASA Astrophysics Data System (ADS)

    Faizin; Nur, Bambang Azis

    2018-02-01

    In addition to agriculture, tourism has become one of the primary sources of economic income for communities around Mount Ciremai, West Java. Unfortunately, the landscape of West Java is prone to many disasters, mainly landslides. Mapping disaster-susceptible areas is therefore needed as a consideration in tourism planning. The study was conducted in Gunung Ciremai National Park, West Java. This paper proposes a methodology to map landslide susceptibility based on spatial data. Using Geographic Information System tools, several environmental parameters such as slope, land use, elevation and lithology are scored to build a landslide susceptibility map. The susceptibility map is then overlaid with the Utilization Zone.

  2. A Two-Layers Based Approach of an Enhanced-Map for Urban Positioning Support

    PubMed Central

    Piñana-Díaz, Carolina; Toledo-Moreo, Rafael; Toledo-Moreo, F. Javier; Skarmeta, Antonio

    2012-01-01

    This paper presents a two-layer enhanced map that can support navigation in urban environments. One layer describes the drivable road, with a special focus on the accurate description of its bounds; this feature can support positioning and advanced map-matching compared with standard polyline-based maps. The other layer depicts building heights and locations, enabling the detection of non-line-of-sight signals coming from GPS satellites not in direct view. Both the concept and the methodology for creating these enhanced maps are presented in the paper. PMID:23202172
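
    A crude sketch of how the building-height layer enables NLOS detection: along a given azimuth, a satellite is masked when the nearest facade subtends a higher elevation angle than the satellite itself. The geometry and values are hypothetical; a real implementation would intersect lines of sight with the map's building footprints.

    ```python
    import math

    def satellite_blocked(building_dist_m, building_top_m, receiver_h_m,
                          sat_elevation_deg):
        """Crude NLOS test along one azimuth: the satellite is masked when
        the building subtends a higher elevation angle than the satellite.
        A full implementation would intersect the line of sight with the
        building-footprint layer of the enhanced map."""
        mask_angle = math.degrees(math.atan2(building_top_m - receiver_h_m,
                                             building_dist_m))
        return sat_elevation_deg < mask_angle

    # 25 m facade, 15 m away, receiver antenna at 1.5 m height:
    print(satellite_blocked(15.0, 25.0, 1.5, sat_elevation_deg=40))  # True
    ```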

  3. Using Six Sigma to improve once daily gentamicin dosing and therapeutic drug monitoring performance.

    PubMed

    Egan, Sean; Murphy, Philip G; Fennell, Jerome P; Kelly, Sinead; Hickey, Mary; McLean, Carolyn; Pate, Muriel; Kirke, Ciara; Whiriskey, Annette; Wall, Niall; McCullagh, Eddie; Murphy, Joan; Delaney, Tim

    2012-12-01

    Safe, effective therapy with the antimicrobial gentamicin requires good practice in dose selection and monitoring of serum levels. Suboptimal therapy occurs with breakdowns in the process of drug dosing, serum blood sampling, laboratory processing and level interpretation; unintentional underdosing may result. This improvement effort aimed to optimise this process in an academic teaching hospital using Six Sigma process improvement methodology. A multidisciplinary project team was formed. Process measures considered critical to quality were defined, and baseline practice was examined through process mapping and audit. Root cause analysis informed improvement measures. These included a new dosing and monitoring schedule, and standardised assay sampling and drug administration timing that maximised local capabilities. Three iterations of the improvement cycle were conducted over a 24-month period. The attainment of serum level sampling in the required time window improved by 85% (p≤0.0001). A 66% improvement in dosing accuracy was observed (p≤0.0001). Unnecessary dose omission while awaiting level results, and inadvertent disruption to therapy due to dosing and monitoring process breakdown, were eliminated. The average daily dose administered increased from 3.39 to 4.78 mg/kg/day. Using Six Sigma methodology enhanced the performance of the gentamicin usage process. Local process-related factors may adversely affect adherence to practice guidelines for gentamicin, a drug which is complex to use. It is vital to adapt dosing guidance and monitoring requirements so that they can be implemented in the clinical environment as a matter of routine. Improvement may be achieved through a structured, localised approach with multidisciplinary stakeholder involvement.

  4. Statechart-based design controllers for FPGA partial reconfiguration

    NASA Astrophysics Data System (ADS)

    Łabiak, Grzegorz; Wegrzyn, Marek; Rosado Muñoz, Alfredo

    2015-09-01

    Statechart diagrams and the UML technique can be a vital part of early conceptual modeling. At present there is not much support in hardware design methodologies for the reconfiguration features of reprogrammable devices. The authors try to bridge the gap between the imprecise UML model and a formal HDL description. The key concept in the authors' proposal is to describe the behavior of the digital controller by statechart diagrams and to map some parts of the behavior into reprogrammable logic by means of groups of states which form sequential automata. The whole process is illustrated by an example with experimental results.

  5. Improving the extraction of crisis information in the context of flood, fire, and landslide rapid mapping using SAR and optical remote sensing data

    NASA Astrophysics Data System (ADS)

    Martinis, Sandro; Clandillon, Stephen; Twele, André; Huber, Claire; Plank, Simon; Maxant, Jérôme; Cao, Wenxi; Caspard, Mathilde; May, Stéphane

    2016-04-01

    Optical and radar satellite remote sensing have proven to provide essential crisis information in a growing number of cases of natural disasters, humanitarian relief activities and civil security issues, through mechanisms such as the Copernicus Emergency Management Service (EMS) of the European Commission and the International Charter 'Space and Major Disasters'. The aforementioned programs and initiatives make use of satellite-based rapid mapping services aimed at delivering reliable and accurate crisis information after natural hazards. Although these services are increasingly operational, they need to be continuously updated and improved through research and development (R&D) activities. The principal objective of ASAPTERRA (Advancing SAR and Optical Methods for Rapid Mapping), the ESA-funded R&D project described here, is to improve, automate and, hence, speed up geo-information extraction procedures in the context of natural hazard response. This is performed through the development, implementation, testing and validation of novel image processing methods using optical and Synthetic Aperture Radar (SAR) data. The methods are mainly developed with data from the German radar satellites TerraSAR-X and TanDEM-X, the French satellite missions Pléiades-1A/1B and the ESA missions Sentinel-1/2, with the aim of better characterizing the potential and limitations of these sensors and their synergy. The resulting algorithms and techniques are evaluated in real-case applications during rapid mapping activities. The project is focused on three types of natural hazards: floods, landslides and fires. This presentation gives an overview of the main methodological developments for each topic, demonstrated in selected test areas. In the context of flood mapping, the following developments are presented: a fully automated Sentinel-1-based processing chain for detecting open flood surfaces; a method for the improved detection of flooded vegetation in Sentinel-1 data using Entropy/Alpha decomposition, unsupervised Wishart classification and object-based post-classification; and semi-automatic approaches for extracting inundated areas and flood traces in rural and urban areas from VHR and HR optical imagery using machine learning techniques. Methodological developments related to fires include the implementation of fast and robust methods for mapping burnt scars through change detection procedures using SAR (Sentinel-1, TerraSAR-X) and HR optical (e.g., SPOT, Sentinel-2) data, as well as the extraction of 3D surface and volume change information from Pléiades stereo pairs. In the context of landslides, fast and transferable change detection procedures based on SAR (TerraSAR-X) and optical (SPOT) data, as well as methods for extracting the extent of landslides based only on polarimetric VHR SAR (TerraSAR-X) data, are presented.
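
    As a minimal illustration of automated SAR-based flood surface detection, the sketch below applies a global Otsu threshold to a toy bimodal backscatter scene (open water appears dark in SAR). It assumes numpy and scikit-image; operational chains such as the one described add terrain and reference-water masking, tiling and refinement steps.

    ```python
    import numpy as np
    from skimage.filters import threshold_otsu

    rng = np.random.default_rng(5)

    # Toy Sentinel-1-like backscatter scene in dB: land around -8 dB, a
    # smooth open-water flood surface around -18 dB.
    scene = rng.normal(-8, 1.5, (200, 200))
    scene[120:180, 30:160] = rng.normal(-18, 1.0, (60, 130))

    # A global Otsu threshold separates the bimodal land/water histogram.
    thr = threshold_otsu(scene)
    flood_mask = scene < thr
    print(f"threshold {thr:.1f} dB, flooded fraction {flood_mask.mean():.2%}")
    ```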

  6. GuidosToolbox: universal digital image object analysis

    Treesearch

    Peter Vogt; Kurt Riitters

    2017-01-01

    The increased availability of mapped environmental data calls for better tools to analyze the spatial characteristics and information contained in those maps. Publicly available, user-friendly and universal tools are needed to foster the interdisciplinary development and application of methodologies for the extraction of image object information properties contained in...

  7. Completion of the 1:1,500,000-Scale Geologic Map of Western Libya Montes and Northwestern Tyrrhena Terra

    NASA Astrophysics Data System (ADS)

    Huff, A. E.; Skinner, J. A.

    2018-06-01

    Final progress report on the 1:1,500,000-scale mapping of western Libya Montes and northwestern Tyrrhena Terra. The final unit names, labels, and descriptions are reported as well as the methodology for age determinations and brief geologic history.

  8. Validating Domain Ontologies: A Methodology Exemplified for Concept Maps

    ERIC Educational Resources Information Center

    Steiner, Christina M.; Albert, Dietrich

    2017-01-01

    Ontologies play an important role as knowledge domain representations in technology-enhanced learning and instruction. Represented in form of concept maps they are commonly used as teaching and learning material and have the potential to enhance positive educational outcomes. To ensure the effective use of an ontology representing a knowledge…

  9. Healthy Universities: Mapping Health-Promotion Interventions

    ERIC Educational Resources Information Center

    Sarmiento, Juan Pablo

    2017-01-01

    Purpose: The purpose of this paper is to map out and characterize existing health-promotion initiatives at Florida International University (FIU) in the USA in order to inform decision makers involved in the development of a comprehensive and a long-term healthy university strategy. Design/methodology/approach: This study encompasses a narrative…

  10. 40 CFR 86.1332-90 - Engine mapping procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Regulations for New Otto-Cycle and Diesel Heavy-Duty Engines; Gaseous and Particulate Exhaust Test Procedures... maximum mapping speed per the following methodologies. (Note paragraph (d)(1) below.) (1) Otto-cycle engines. (i) For ungoverned engines using the transient operating cycle set forth in paragraph (f)(1) of...

  11. Time-lapse electrical surveys to locate infiltration zones in weathered hard rock tropical areas

    NASA Astrophysics Data System (ADS)

    Wubda, M.; Descloitres, M.; Yalo, N.; Ribolzi, O.; Vouillamoz, J. M.; Boukari, M.; Hector, B.; Séguis, L.

    2017-07-01

    In West Africa, infiltration and groundwater recharge processes in hard rock areas depend on climatic, surface and subsurface conditions, and are poorly documented. Part of the reason is that identifying, locating and monitoring these processes is still a challenge. Here, we explore the potential of time-lapse electrical surveys to bring additional information on these processes in two different climatic situations: a semi-arid Sahelian site (northern Burkina Faso) and a humid Sudanian site (northern Benin), focusing respectively on indirect (localized) and direct (diffuse) recharge processes. The methodology is based on dry-season and rainy-season surveys of a typical pond or gully using Electrical Resistivity Tomography (ERT) and frequency-domain electromagnetic (FEM) apparent conductivity mapping. The results show that in the Sahelian zone an indirect recharge occurs as expected, but infiltration does not take place from the center of the pond to the aquifer; rather, it occurs laterally through the banks. In the Sudanian zone, the ERT survey shows a direct recharge process as expected, but also a complicated behavior of groundwater dilution, as well as the role of hardpans in fast infiltration. These processes are confirmed by groundwater monitoring in adjacent observation wells. FEM time-lapse mapping, in contrast, is found to be difficult to interpret quantitatively due to the non-uniqueness of the model, clearly evidenced by comparing FEM results with auger-hole monitoring. Overall, we found that time-lapse ERT can be an efficient way to track infiltration processes across ponds and gullies in both climatic settings, the Sahelian setting providing results that are easier to interpret owing to significant resistivity contrasts between the dry and rainy seasons. Both methods can be used for efficient placement of point sensors in complementary studies. However, FEM time-lapse mapping remains difficult to practice without external information, which makes this method less attractive for quantitative interpretation.
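
    The time-lapse comparison at the heart of the method reduces to a percent change between inverted dry- and wet-season resistivity sections; a toy numpy sketch, with an assumed -30% change criterion, is given below.

    ```python
    import numpy as np

    # Toy inverted resistivity sections (ohm.m) for the dry and rainy
    # seasons; rows = depth levels, columns = positions along the profile.
    dry = np.full((10, 40), 300.0)
    wet = dry.copy()
    wet[0:4, 15:25] = 120.0            # conductive anomaly: infiltration

    # Percent change between seasons; strong decreases flag zones where
    # water content increased, i.e. candidate infiltration pathways.
    change = 100.0 * (wet - dry) / dry
    infiltration = change < -30.0
    print("cells flagged as infiltration:", int(infiltration.sum()))
    ```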

  12. Estimating Forest Canopy Heights and Aboveground Biomass with Simulated ICESat-2 Data

    NASA Astrophysics Data System (ADS)

    Malambo, L.; Narine, L.; Popescu, S. C.; Neuenschwander, A. L.; Sheridan, R.

    2016-12-01

    The Ice, Cloud and land Elevation Satellite (ICESat)-2 is scheduled for launch in 2017, and one of its overall science objectives will be to measure vegetation heights, which can be used to estimate and monitor aboveground biomass (AGB) over large spatial scales. This study develops a methodology for utilizing vegetation data collected by ICESat-2, which will be on a five-year mission from 2017, for mapping forest canopy heights and estimating aboveground forest biomass (AGB). The specific objectives are to (1) simulate ICESat-2 photon-counting lidar (PCL) data, (2) utilize simulated PCL data to estimate forest canopy heights and propose a methodology for upscaling PCL height measurements to obtain spatially contiguous coverage, and (3) estimate and map AGB using simulated PCL data. The laser pulse from ICESat-2 will be divided into three pairs of beams spaced approximately 3 km apart, with footprints measuring approximately 14 m in diameter at 70-cm along-track intervals. Using existing airborne lidar data (ALS) for Sam Houston National Forest (SHNF) and known ICESat-2 beam locations, footprints are generated along beam locations and PCL data are then simulated from discrete-return lidar points within each footprint. By applying data processing algorithms, photons are classified into top-of-canopy points and ground-surface elevation points to yield tree canopy height values within each ICESat-2 footprint. AGB is then estimated using simple linear regression relating AGB from a biomass map generated with ALS data for SHNF to simulated PCL height metrics for 100-m segments along ICESat-2 tracks. Two approaches are also investigated for upscaling AGB estimates to provide wall-to-wall coverage of AGB: (1) co-kriging and (2) Random Forest. The height and AGB maps that are the outcomes of this study will demonstrate how data acquired by ICESat-2 can be used to measure forest parameters and, by extension, estimate forest carbon for climate change initiatives.
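
    The footprint-level height retrieval can be sketched as a percentile difference between classified canopy and ground photons; the RH98-style metric and the synthetic photon elevations below are assumptions for illustration, not the study's exact processing.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Toy classified photons in one 100 m along-track segment:
    # elevations (m) already split into ground and canopy returns.
    ground = rng.normal(250.0, 0.2, 80)
    canopy = 250.0 + rng.uniform(2, 28, 300)

    # Segment-level canopy height: a high canopy percentile minus the
    # ground surface, a common way to stabilise photon-counting noise.
    rh98 = np.percentile(canopy, 98) - np.median(ground)
    print(f"segment canopy height (RH98-like): {rh98:.1f} m")
    # Heights like this, aggregated per segment, feed the linear
    # regression against ALS-derived biomass described above.
    ```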

  13. Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support.

    PubMed

    Proctor, Enola; Luke, Douglas; Calhoun, Annaliese; McMillen, Curtis; Brownson, Ross; McCrary, Stacey; Padek, Margaret

    2015-06-11

    Little is known about how well, or under what conditions, health innovations are sustained and their gains maintained once they are put into practice. Implementation science typically focuses on uptake by early adopters of one healthcare innovation at a time; the later-stage challenges of scaling up and sustaining evidence-supported interventions receive too little attention. This project identifies the challenges associated with sustainability research and generates recommendations for accelerating and strengthening this work. A multi-method, multi-stage approach was used: (1) identifying and recruiting experts in sustainability as participants, (2) conducting research on sustainability using concept mapping, (3) action planning during an intensive working conference of sustainability experts to expand on the quantitative concept mapping results, and (4) consolidating results into a set of recommendations for research, methodological advances, and infrastructure building to advance understanding of sustainability. Participants comprised researchers, funders, and leaders in health, mental health, and public health with a shared interest in the sustainability of evidence-based health care. Prompted to identify important issues for sustainability research, participants generated 91 distinct statements, for which a concept mapping process produced 11 conceptually distinct clusters. During the conference, participants built upon the concept mapping clusters to generate recommendations for sustainability research. The recommendations fell into three domains: (1) pursue high-priority research questions as a unified agenda on sustainability; (2) advance methods for sustainability research; (3) advance infrastructure to support sustainability research. Implementation science needs to pursue the later-stage translation research questions required for population impact. Priorities include conceptual consistency and operational clarity for measuring sustainability, developing evidence about the value of sustaining interventions over time, identifying correlates of sustainability along with strategies for sustaining evidence-supported interventions, advancing the theoretical base and research designs for sustainability research, and advancing the workforce capacity, research culture, and funding mechanisms for this important work.

  14. Overcoming the momentum of anachronism: American geologic mapping in a twenty-first-century world

    USGS Publications Warehouse

    House, P. Kyle; Clark, Ryan; Kopera, Joe

    2013-01-01

    The practice of geologic mapping is undergoing conceptual and methodological transformation. Profound changes in digital technology over the past 10 yr have the potential to impact all aspects of geologic mapping. The future of geologic mapping as a relevant scientific enterprise depends on widespread adoption of new technology and new ideas about the collection, meaning, and utility of geologic map data. It is critical that the geologic community redefine the primary elements of the traditional paper geologic map and improve the integration of the practice of making maps in the field and office with the new ways to record, manage, share, and visualize their underlying data. A modern digital geologic mapping model will enhance scientific discovery, meet the elevated expectations of modern geologic map users, and accommodate inevitable future changes in technology.

  15. Accuracy assessment/validation methodology and results of 2010–11 land-cover/land-use data for Pools 13, 26, La Grange, and Open River South, Upper Mississippi River System

    USGS Publications Warehouse

    Jakusz, J.W.; Dieck, J.J.; Langrehr, H.A.; Ruhser, J.J.; Lubinski, S.J.

    2016-01-11

    Similar to an accuracy assessment (AA), validation involves generating random points based on the total area of each map class. However, instead of collecting field data, two or three individuals not involved in the photo-interpretive mapping separately review each of the points onscreen and record a best-fit vegetation type for each site. Once the individual analyses are complete, the results are joined and a comparative analysis is performed. The objective of this initial analysis is to identify points where the validation results agree (matches) and points where they disagree (mismatches). The individuals then examine each mismatched site and agree upon a final validation class. (If two vegetation types at a specific site appear equally prevalent, the validation team is permitted to assign the site two best-fit vegetation types.) Following the validation team's comparative analysis of vegetation assignments, the data are entered into a database and compared to the mappers' vegetation assignments. Agreements and disagreements between the map and validation classes are identified, and a contingency table is produced. This document presents the AA processes and results for Pools 13 and La Grange, as well as the validation processes and results for Pools 13 and 26 and Open River South.
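
    The final comparison step amounts to cross-tabulating map classes against agreed validation classes; a minimal pandas sketch with invented class labels is shown below.

    ```python
    import pandas as pd

    # Toy per-point results: the mapper's class versus the validation
    # team's agreed best-fit class at each random point.
    mapped    = ["forest", "forest", "marsh", "open water", "marsh", "forest"]
    validated = ["forest", "marsh",  "marsh", "open water", "marsh", "forest"]

    table = pd.crosstab(pd.Series(mapped, name="map class"),
                        pd.Series(validated, name="validation class"))
    print(table)

    agreement = (pd.Series(mapped) == pd.Series(validated)).mean()
    print(f"overall agreement: {agreement:.0%}")
    ```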

  16. Multimedia integration of cartographic source materials for researching and presenting phenomena from economic history

    NASA Astrophysics Data System (ADS)

    Lorek, Dariusz

    2016-12-01

    The article presents a framework for integrating historical sources with elements of the geographical space recorded in unique cartographic materials. The aim of the project was to elaborate a method of integrating spatial data sources that would facilitate studying and presenting phenomena from economic history. The proposed methodology for multimedia integration of old materials made it possible to demonstrate the successive stages of the transformation characteristic of 19th-century space. The point of reference for this integration process was topographic maps from the first half of the 19th century, while the research area comprised the castle complex in Kórnik together with the adjacent small town, a pre-industrial landscape in Wielkopolska (Greater Poland). The collected source material was integrated on the basis of map and plan transformation, graphic processing of scans of old drawings, texture mapping of the facades of historic buildings, and a 360° panorama. The final product was a video of a few minutes, composed of nine sequences. It captures the changing form of the castle building and its facades, the castle park, and the wider topographic and urban surroundings, from the beginning of the 19th century to the present day. For a topographic map sheet dating from the first half of the 19th century, in which the hachuring method had been used to depict land relief, a terrain model was generated. The transition from a parallel to a bird's-eye-view perspective served to demonstrate the distinctive character of the pre-industrial landscape.

  17. An approach for setting evidence-based and stakeholder-informed research priorities in low- and middle-income countries.

    PubMed

    Rehfuess, Eva A; Durão, Solange; Kyamanywa, Patrick; Meerpohl, Joerg J; Young, Taryn; Rohwer, Anke

    2016-04-01

    To derive evidence-based and stakeholder-informed research priorities for implementation in African settings, the international research consortium Collaboration for Evidence-Based Healthcare and Public Health in Africa (CEBHA+) developed and applied a pragmatic approach. First, an online survey and face-to-face consultation between CEBHA+ partners and policy-makers generated priority research areas. Second, evidence maps for these priority research areas identified gaps and related priority research questions. Finally, study protocols were developed for inclusion within a grant proposal. Policy and practice representatives were involved throughout the process. Tuberculosis, diabetes, hypertension and road traffic injuries were selected as priority research areas. Evidence maps covered screening and models of care for diabetes and hypertension, population-level prevention of diabetes and hypertension and their risk factors, and prevention and management of road traffic injuries. Analysis of these maps yielded three priority research questions on hypertension and diabetes and one on road traffic injuries. The four resulting study protocols employ a broad range of primary and secondary research methods; a fifth promotes an integrated methodological approach across all research activities. The CEBHA+ approach, in particular evidence mapping, helped to formulate research questions and study protocols that would be owned by African partners, fill gaps in the evidence base, address policy and practice needs and be feasible given the existing research infrastructure and expertise. The consortium believes that the continuous involvement of decision-makers throughout the research process is an important means of ensuring that studies are relevant to the African context and that findings are rapidly implemented.

  18. SSMap: a new UniProt-PDB mapping resource for the curation of structural-related information in the UniProt/Swiss-Prot Knowledgebase.

    PubMed

    David, Fabrice P A; Yip, Yum L

    2008-09-23

    Sequences and structures provide valuable complementary information on protein features and functions. However, it is not always straightforward for users to gather information concurrently at the sequence and structure levels. The UniProt Knowledgebase (UniProtKB) strives to help users in this undertaking by providing complete cross-references to the Protein Data Bank (PDB) as well as coherent feature annotation using available structural information. In this study, SSMap, a new UniProt-PDB residue-residue level mapping, was generated. The primary objective of this mapping is not only to facilitate the two tasks mentioned above but also to address a number of shortcomings of existing mappings. SSMap is the first isoform-sequence-specific mapping resource and is kept up-to-date for UniProtKB annotation tasks. The method employed by SSMap differs from the other mapping resources in that it stresses the correct reconstruction of the PDB sequence from structures, and the correct attribution of a UniProtKB entry to each PDB chain, through a series of post-processing steps. SSMap was compared to other existing mapping resources in terms of the correctness of the attribution of PDB chains to UniProtKB entries and the quality of the pairwise alignments supporting the residue-residue mapping. SSMap was found to share about 80% of its mappings with other mapping sources. New and alternative mappings proposed by SSMap were mostly good, as assessed by manual verification of data subsets. As for local pairwise alignments, major discrepancies (in both alignment lengths and boundaries), when present, were often due to differences in the methodologies used for the mappings. SSMap provides an independent, good-quality UniProt-PDB mapping. The systematic comparison conducted in this study allows the further identification of general problems in UniProt-PDB mappings, so that both the coverage and the quality of the mappings can be systematically improved for the benefit of the scientific community. The SSMap mapping is currently used to provide PDB cross-references in UniProtKB.
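
    The residue-residue mapping idea can be sketched with a local pairwise alignment between a structure-derived chain sequence and a UniProt sequence, reading the residue correspondences off the aligned blocks. The sketch assumes Biopython's PairwiseAligner and toy sequences; it omits SSMap's sequence reconstruction and post-processing steps.

    ```python
    from Bio import Align  # assumes Biopython is installed

    # Toy sequences: a PDB chain sequence reconstructed from the structure
    # (shorter than the full protein) and the matching UniProt sequence.
    uniprot_seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
    pdb_seq     = "TAYIAKQRQISFVKSHFSRQLEERLG"

    aligner = Align.PairwiseAligner()
    aligner.mode = "local"
    aln = aligner.align(uniprot_seq, pdb_seq)[0]

    # Build the residue-residue map from the aligned blocks (1-based).
    residue_map = {}
    for (u_start, u_end), (p_start, p_end) in zip(*aln.aligned):
        for offset in range(u_end - u_start):
            residue_map[u_start + offset + 1] = p_start + offset + 1
    print(len(residue_map), "residues mapped;",
          "UniProt 3 ->", residue_map[3])
    ```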

  19. Documentation & Condition Mapping for Restoration & Revitalisation of Historic Sheesh Mahal & Char Bagh Complex in Patiala (punjab), India

    NASA Astrophysics Data System (ADS)

    Dasgupta, S.

    2017-08-01

    Located in the northern state of Punjab, the historic city of Patiala has always been a centre of culture in north India and has seen the evolution of its own distinct style of architecture with Rajput and Mughal influences. The city is renowned for its rich architectural heritage, music, craft, sports and cuisine. The fourth Maharaja, Narinder Singh, was a great patron of art, architecture and music, and it was during his time that several palaces such as the Moti Bagh Palace, Sheesh Mahal and Banasur Bagh were designed, followed by the Baradari Palace. Later, it was Maharaja Bhupinder Singh (1900-1938) who made Patiala State famous with his lavish lifestyle. This paper describes the process followed for documentation and condition assessment of the historic Sheesh Mahal & Char Bagh Complex in order to restore and revitalise the palace building and the Mughal garden. The exercise included archival research, field surveys, condition mapping, inventories using traditional methods as well as GIS, and the preparation of restoration and conservation solutions along with a post-conservation management manual. The major challenges encountered were identifying the correct documentation methodology for mapping and managing the large database generated on site. Documentation and mapping were used as significant tools to guide the conservation and management strategy of the complex.

  20. Multi-Hazard Vulnerability Assessment Along the Coast of Visakhapatnam, North-East Coast of India

    NASA Astrophysics Data System (ADS)

    Vivek, G.; Grinivasa Kumar, T.

    2016-08-01

    The current study area is the coastal zone of Visakhapatnam district, Andhra Pradesh, along the east coast of India. This area is vulnerable to many disasters such as storms, cyclones, floods, tsunamis and erosion, and is considered cyclone-prone because of the frequent occurrence of cyclones. Two recent tropical cyclones that formed in the Bay of Bengal, Hudhud (October 13, 2014) and Phailin (October 11, 2013), caused devastating impacts on the eastern coast and showed the country's lack of preparedness for cyclones, storm surges and related natural hazards. The present study aims to develop a methodology for coastal multi-hazard vulnerability assessment, carried out using parameters such as coastal slope, tsunami arrival height, future sea level rise, coastal erosion and tidal range. The multi-hazard vulnerability maps prepared here are a blended, combined overlay of the multiple hazards affecting the coastal zone, and they are further reproduced as risk maps using land use information. The decision-making tools presented here can provide useful information during a disaster for the evacuation process and for evolving a management strategy.

  1. Elaboration of a framework for the compilation of countrywide, digital maps for the satisfaction of recent demands on spatial, soil related information in Hungary

    NASA Astrophysics Data System (ADS)

    Pásztor, László; Dobos, Endre; Szabó, József; Bakacsi, Zsófia; Laborczi, Annamária

    2013-04-01

    There is ample evidence that demand for soil-related information is significant worldwide and still increasing. For a long time, soil maps were typically used to satisfy these demands. With the spread of GI technology, spatial soil information systems (SSIS) and digital soil mapping (DSM) have taken over the role of traditional soil maps. Due to the relatively high cost of data collection, new conventional soil surveys and inventories are becoming less and less frequent, a fact which raises the value of legacy soil information and of the systems serving its digitally processed versions. The existing data contain a wealth of information that can be exploited with the proper methodology. Not only the degree of current needs for soil information has changed but also their nature. Traditionally, the agricultural functions of soils were the focus, which was also reflected in the methodology of data collection and mapping. Recently, the multifunctionality of soils has been gaining more and more ground; consequently, information related to additional soil functions has become equally important. These new types of information requirements, however, generally cannot be fulfilled with new data collections, at least not at the level achieved by traditional soil surveys. Soil monitoring systems have been established for collecting recent information on the various elements of the DPSIR (Driving Forces-Pressures-State-Impacts-Responses) framework, but the primary goal of these systems has not necessarily been mapping, and this is definitely the case for the two currently operating Hungarian soil monitoring systems. In Hungary, soil data requirements are presently fulfilled with the available datasets, either by their direct usage or after specific and often ad hoc thematic and/or spatial inference. Due to the increasingly frequent discrepancies between the available and the expected data, there may be notable imperfections in the accuracy and reliability of the delivered products. Since, as in most of the world, large-scale comprehensive new surveys cannot be expected in the near future, the available legacy data must be relied on. With a recently started project, we intend to significantly extend the ways in which countrywide soil information requirements can be satisfied. In the frame of this project we plan the spatial and thematic data mining of the significant amount of soil-related information available in the form of legacy soil data as well as digital databases and spatial soil information systems. In the course of the analyses we will draw on auxiliary spatial data themes related to environmental elements. Based on the established relationships, we will convert and integrate the specific datasets for the regionalization of the various derived soil parameters. With the aid of GIS and geostatistical tools we will carry out the spatial extension of pedological variables characterizing the state (including degradation), processes and functions of soils. We plan to compile digital soil maps that optimally fulfil national and international demands in terms of thematic, spatial and temporal accuracy. The targeted spatial resolution of the proposed countrywide, digital, thematic soil property and function maps is at least 1:50,000 (approx. 50-100 m raster).
    Our principal objective is the definitive solution of the regionalization of the information collected in two recent, contemporary, national, systematic soil data collections (not designed for mapping purposes) on the current state of soils, in order to produce countrywide maps providing a spatial inventory of certain soil properties, processes and functions with sufficient accuracy and reliability.

  2. Moving forward socio-economically focused models of deforestation.

    PubMed

    Dezécache, Camille; Salles, Jean-Michel; Vieilledent, Ghislain; Hérault, Bruno

    2017-09-01

    Whilst high-resolution spatial variables contribute to a good fit of spatially explicit deforestation models, socio-economic processes are often beyond the scope of these models. Such low interest in the socio-economic dimension of deforestation limits the relevance of these models for decision-making and may be the cause of their failure to accurately predict observed deforestation trends in the medium term. This study proposes a flexible methodology for taking into account multiple drivers of deforestation in tropical forested areas, where the intensity of deforestation is explicitly predicted from socio-economic variables. By coupling a model of deforestation location based on spatial environmental variables with several sub-models of deforestation intensity based on socio-economic variables, we created a map of predicted deforestation over the period 2001-2014 in French Guiana. This map was compared to a reference map for accuracy assessment, not only at the pixel scale but also over cells ranging from 1 to approximately 600 sq. km. Highly significant relationships were explicitly established between deforestation intensity and several socio-economic variables: population growth, the amount of agricultural subsidies, and gold and wood production. Such a precise characterization of socio-economic processes avoids overestimation biases in high-deforestation areas, suggesting a better integration of socio-economic processes into the models. Whilst considering deforestation as a purely geographical process produces conservative models unable to effectively reflect changes in the socio-economic and political contexts influencing deforestation trends, the explicit characterization of the socio-economic dimension of deforestation is critical for creating deforestation scenarios in REDD+ projects. © 2017 John Wiley & Sons Ltd.
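
    The coupling of a location model with intensity sub-models can be illustrated schematically: an intensity model predicts how much each administrative unit deforests, and the location layer decides which pixels receive that quantity. Everything below (districts, suitability scores, pixel budgets) is invented for illustration, not taken from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Toy inputs: a location model gives each pixel a deforestation
    # suitability score; an intensity sub-model predicts, from
    # socio-economic variables, how many pixels each district loses.
    suitability = rng.random((100, 100))                        # location layer
    district = np.repeat(np.arange(4), 2500).reshape(100, 100)  # 4 districts
    predicted_pixels = {0: 300, 1: 50, 2: 800, 3: 0}            # intensity layer

    deforested = np.zeros_like(suitability, dtype=bool)
    for d, n in predicted_pixels.items():
        idx = np.flatnonzero(district == d)
        # allocate the predicted quantity to the most suitable pixels
        top = idx[np.argsort(suitability.ravel()[idx])[-n:]] if n else []
        deforested.ravel()[top] = True
    print("allocated pixels:", int(deforested.sum()))
    ```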

  3. A proposed reductionist solution to address the methodological challenges of inconsistent reflexology maps and poor experimental controls in reflexology research: a discussion paper.

    PubMed

    Jones, Jenny; Thomson, Patricia; Lauder, William; Leslie, Stephen J

    2013-03-01

    Reflexology is a complex massage intervention, based on the concept that specific areas of the feet (reflex points) correspond to individual internal organs within the body. Reflexologists trained in the popular Ingham reflexology method claim that massage to these points, using massage techniques unique to reflexology, stimulates an increase in blood supply to the corresponding organ. Reflexology researchers face two key methodological challenges that need to be addressed if a specific treatment-related hemodynamic effect is to be scientifically demonstrated. The first is the problem of inconsistent reflexology foot maps; the second is the issue of poor experimental controls. This article proposes a potential experimental solution that we believe can address both methodological challenges and in doing so, allow any specific hemodynamic treatment effect unique to reflexology to experimentally reveal itself.

  4. Error and Uncertainty in the Accuracy Assessment of Land Cover Maps

    NASA Astrophysics Data System (ADS)

    Sarmento, Pedro Alexandre Reis

    Traditionally, the accuracy assessment of land cover maps is performed by comparing these maps with a reference database intended to represent the "real" land cover, with this comparison reported through thematic accuracy measures derived from confusion matrices. However, these reference databases are themselves a representation of reality and contain errors due to the human uncertainty involved in assigning the land cover class that best characterizes a certain area, causing bias in the thematic accuracy measures reported to the end users of these maps. The main goal of this dissertation is to develop a methodology that allows the integration of the human uncertainty present in reference databases into the accuracy assessment of land cover maps, and to analyse the impacts that uncertainty may have on the thematic accuracy measures reported to the end users of land cover maps. The utility of including human uncertainty in the accuracy assessment of land cover maps is investigated. Specifically, we studied the utility of fuzzy sets theory, more precisely of fuzzy arithmetic, for a better understanding of the human uncertainty associated with the elaboration of reference databases and of its impacts on the thematic accuracy measures derived from confusion matrices. For this purpose, linguistic values transformed into fuzzy intervals that address the uncertainty in the elaboration of reference databases were used to compute fuzzy confusion matrices. The proposed methodology is illustrated using a case study in which the accuracy of a land cover map for Continental Portugal derived from Medium Resolution Imaging Spectrometer (MERIS) imagery is assessed. The obtained results demonstrate that the inclusion of human uncertainty in reference databases provides much more information about the quality of land cover maps than the traditional approach to accuracy assessment.
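
    The core of the fuzzy-arithmetic idea can be illustrated with plain interval arithmetic on a small confusion matrix: if linguistic confidence values turn each cell into an interval [lo, hi], overall accuracy propagates to an interval as well. The 3-class matrix below is invented for illustration and simplifies the dissertation's fuzzy coding.

    ```python
    # Interval-valued confusion matrix -> interval-valued overall accuracy.
    import numpy as np

    lo = np.array([[80, 3, 2], [4, 70, 5], [1, 6, 60]], dtype=float)
    hi = np.array([[90, 6, 4], [7, 78, 9], [3, 9, 68]], dtype=float)

    diag = np.eye(3, dtype=bool)
    # Tightest bounds: minimal agreement against maximal disagreement, and vice versa.
    oa_min = lo[diag].sum() / (lo[diag].sum() + hi[~diag].sum())
    oa_max = hi[diag].sum() / (hi[diag].sum() + lo[~diag].sum())
    print(f"overall accuracy in [{oa_min:.3f}, {oa_max:.3f}]")
    ```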

  5. Visualizing the semantic content of large text databases using text maps

    NASA Technical Reports Server (NTRS)

    Combs, Nathan

    1993-01-01

    A methodology for generating text map representations of the semantic content of text databases is presented. Text maps provide a graphical metaphor for conceptualizing and visualizing the contents and data interrelationships of large text databases. Described are a set of experiments conducted against the TIPSTER corpora of Wall Street Journal articles. These experiments provide an introduction to current work in the representation and visualization of documents by way of their semantic content.

  6. [Risk maps. The concept and the methodology for their development].

    PubMed

    García Gómez, M M

    1994-01-01

    In this article the concept of the risk map is reviewed. It is considered an instrument for understanding risks and damages in a given environment. A historical review analyzes the birth and evolution of the concept. Different experiences and types of maps in different countries are described. Finally, the operative steps, the data sources and the risk indicators which should be used in Spain are included.

  7. Comparison of 3D point clouds obtained by photogrammetric UAVs and TLS to determine the attitude of dolerite outcrops discontinuities.

    NASA Astrophysics Data System (ADS)

    Duarte, João; Gonçalves, Gil; Duarte, Diogo; Figueiredo, Fernando; Mira, Maria

    2015-04-01

    Photogrammetric Unmanned Aerial Vehicles (UAVs) and Terrestrial Laser Scanners (TLS) are two emerging technologies that allow the production of dense 3D point clouds of the sensed topographic surfaces. Although image-based stereo-photogrammetric point clouds cannot, in general, compete with TLS point clouds on geometric quality, fully automated mapping solutions based on ultra-light UAVs (or drones) have recently become commercially available at very reasonable accuracy and cost for engineering and geological applications. The purpose of this paper is to compare the point clouds generated by these two technologies, in order to automate the manual tasks commonly used to detect and represent the attitude of discontinuities (stereographic projection: Schmidt net, equal area). This fundamental step in all geological/geotechnical studies applied to the extractive industry and engineering works has to be replaced by a more expeditious and reliable methodology, one that avoids difficulties of access and guarantees safe survey conditions. Such a methodology will answer the needs of rock mass evaluation more clearly by mapping the structures present, which will considerably reduce the associated risks (investment, structure dimensioning, safety, etc.). A case study of a dolerite outcrop located in the centre of Portugal (in the volcanic complex of Serra de Todo-o-Mundo, Casais Gaiola, intruded into Jurassic sandstones) is used to assess this methodology. The results obtained show that the 3D point cloud produced by the photogrammetric UAV platform has the appropriate geometric quality for extracting the parameters that define the discontinuities of the dolerite outcrops. Although these parameters are comparable to the manually extracted ones, their quality is inferior to that of the parameters extracted from the TLS point cloud.

  8. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
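
    As a data-structure sketch only (not LSST's actual SysML model or Enterprise Architect schema), the verification chain described above might be captured like this, with each requirement owning a plan whose methods become activities grouped into events:

    ```python
    # Illustrative model of the requirement -> plan -> activity -> event chain.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class VerificationActivity:
        name: str
        method: str      # e.g. "Test", "Analysis", "Inspection", "Demonstration"

    @dataclass
    class VerificationPlan:
        verification_requirement: str
        success_criteria: str
        level: str       # e.g. "System", "Subsystem"
        owner: str
        activities: List[VerificationActivity] = field(default_factory=list)

    @dataclass
    class VerificationEvent:
        name: str        # activities grouped here can run concurrently
        activities: List[VerificationActivity] = field(default_factory=list)

    plan = VerificationPlan(
        verification_requirement="Verify delivered image quality budget",
        success_criteria="Measured PSF within budget",
        level="System",
        owner="Systems Engineering",
        activities=[VerificationActivity("On-sky PSF measurement", "Test")],
    )
    event = VerificationEvent("Commissioning run 1", plan.activities)
    ```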

  9. Improving operating room efficiency in academic children's hospital using Lean Six Sigma methodology.

    PubMed

    Tagge, Edward P; Thirumoorthi, Arul S; Lenart, John; Garberoglio, Carlos; Mitchell, Kenneth W

    2017-06-01

    Lean Six Sigma (LSS) is a process improvement methodology that utilizes a collaborative team effort to improve performance by systematically identifying the root causes of problems. Our objective was to determine whether application of LSS could improve efficiency when applied simultaneously to all services of an academic children's hospital. In our tertiary academic medical center, a multidisciplinary committee was formed, and the entire perioperative process was mapped using fishbone diagrams, Pareto analysis, and other process improvement tools. Results were analyzed for scheduled main operating room (OR) cases at the Children's Hospital in which the surgical attending followed themselves (i.e., performed back-to-back cases). Six hundred twelve cases were included in the seven Children's Hospital ORs over a 6-month period. Turnover Time (the interval between patient OR departure and arrival of the subsequent patient) decreased from a median 41 min in the baseline period to 32 min in the intervention period (p<0.0001). Turnaround Time (the interval between surgical dressing application and the subsequent surgical incision) decreased from a median 81.5 min in the baseline period to 71 min in the intervention period (p<0.0001). These results demonstrate that a coordinated multidisciplinary process improvement redesign can significantly improve efficiency in an academic children's hospital without preselecting specific services, removing surgical residents, or incorporating new personnel or technology. Prospective comparative study, Level II. Copyright © 2017 Elsevier Inc. All rights reserved.
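
    Turnover and turnaround times are strongly skewed, which is why the study reports medians; a rank-based test is the usual companion for such a comparison. The sketch below shows that comparison on synthetic stand-in data (the study's records are not reproduced here, and the abstract does not state which test was used):

    ```python
    # Median comparison of baseline vs intervention turnover times (synthetic data).
    import numpy as np
    from scipy.stats import mannwhitneyu

    rng = np.random.default_rng(2)
    baseline = rng.lognormal(mean=np.log(41), sigma=0.35, size=300)       # minutes
    intervention = rng.lognormal(mean=np.log(32), sigma=0.35, size=312)

    stat, p = mannwhitneyu(baseline, intervention, alternative="two-sided")
    print(f"median before = {np.median(baseline):.0f} min, "
          f"after = {np.median(intervention):.0f} min, p = {p:.2g}")
    ```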

  10. Urban Groundwater Mapping - Bucharest City Area Case Study

    NASA Astrophysics Data System (ADS)

    Gaitanaru, Dragos; Radu Gogu, Constantin; Bica, Ioan; Anghel, Leonard; Amine Boukhemacha, Mohamed; Ionita, Angela

    2013-04-01

    Urban Groundwater Mapping (UGM) is a generic term for a collection of procedures and techniques used to create targeted cartographic representations of the groundwater-related aspects of urban areas. The urban environment alters the physical and chemical characteristics of the underlying aquifers, and the scale of this pressure is controlled by urban development in time and space. A set of thematic maps is needed to obtain a clear image of the spatial and temporal distribution of the different interactions between groundwater and urban structures. The present study describes the methodological approach used to obtain a reliable cartographic product for the Bucharest City area. The first step was to identify the groundwater-related problems and aspects (changes in the groundwater table, infiltration and seepage from and to the city sewer network, contamination spread across all three aquifer systems located in Quaternary sedimentary formations, the dewatering impact of large underground structures, and management and policy drawbacks). The second step was data collection and validation. In urban areas there is a wide spectrum of groundwater-related data providers. Because the data are produced and distributed by different types of organizations (national agencies, private companies, the municipal water regulator, etc.), a validation and cross-checking process is mandatory. The data are stored and managed in a geospatial database, whose design follows an object-oriented paradigm and is easily extensible. The third step consists of a set of procedures based on a multi-criteria assessment that creates the specific setup for the thematic maps. The assessment is based on the following criteria: (1) scale effect, (2) time, (3) vertical distribution and (4) type of groundwater-related problem. The final step is the cartographic representation, in which the urban groundwater maps are created. All the methodological steps are mirrored by programmed procedures developed in a groundwater management platform for urban areas, the core of which is a set of well-defined hydrogeological geospatial queries. The cartographic products (urban groundwater maps) can be used by different types of users: civil engineers, urban planners and scientists, as well as decision- and policy-makers.

  11. Addressing the English Language Arts Technology Standard in a Secondary Reading Methodology Course.

    ERIC Educational Resources Information Center

    Merkley, Donna J.; Schmidt, Denise A.; Allen, Gayle

    2001-01-01

    Describes efforts to integrate technology into a reading methodology course for secondary English majors. Discusses the use of e-mail, multimedia, distance education for videoconferences, online discussion technology, subject-specific software, desktop publishing, a database management system, a concept mapping program, and the use of the World…

  12. Cooperative Autonomous Observation of Volcanic Environments with sUAS

    NASA Astrophysics Data System (ADS)

    Ravela, S.

    2015-12-01

    The Cooperative Autonomous Observing System Project (CAOS) at the MIT Earth Signals and Systems Group has developed methodology and systems for dynamically mapping coherent fluids such as plumes using small unmanned aircraft systems (sUAS). In the CAOS approach, two classes of sUAS, one remote the other in-situ, implement a dynamic data-driven mapping system by closing the loop between Modeling, Estimation, Sampling, Planning and Control (MESPAC). The continually gathered measurements are assimilated to produce maps/analyses which also guide the sUAS network to adaptively resample the environment. Rather than scan the volume in fixed Eulerian or Lagrangian flight plans, the adaptive nature of the sampling process enables objectives for efficiency and resilience to be incorporated. Modeling includes realtime prediction using two types of reduced models, one based on nowcasting remote observations of plume tracer using scale-cascaded alignment, and another based on dynamically-deformable EOF/POD developed for coherent structures. Ensemble-based Information-theoretic machine learning approaches are used for the highly non-linear/non-Gaussian state/parameter estimation, and for planning. Control of the sUAS is based on model reference control coupled with hierarchical PID. MESPAC is implemented in part on a SkyCandy platform, and implements an airborne mesh that provides instantaneous situational awareness and redundant communication to an operating fleet. SkyCandy is deployed on Itzamna Aero's I9X/W UAS with low-cost sensors, and is currently being used to study the Popocatepetl volcano. Results suggest that operational communities can deploy low-cost sUAS to systematically monitor whilst optimizing for efficiency/maximizing resilience. The CAOS methodology is applicable to many other environments where coherent structures are present in the background. More information can be found at caos.mit.edu.

  13. Processing LiDAR Data to Predict Natural Hazards

    NASA Technical Reports Server (NTRS)

    Fairweather, Ian; Crabtree, Robert; Hager, Stacey

    2008-01-01

    ELF-Base and ELF-Hazards (wherein 'ELF' signifies 'Extract LiDAR Features' and 'LiDAR' signifies 'light detection and ranging') are developmental software modules for processing remote-sensing LiDAR data to identify past natural hazards (principally, landslides) and predict future ones. ELF-Base processes raw LiDAR data, including LiDAR intensity data that are often ignored in other software, to create digital terrain models (DTMs) and digital feature models (DFMs) with sub-meter accuracy. ELF-Hazards fuses raw LiDAR data, data from multispectral and hyperspectral optical images, and DTMs and DFMs generated by ELF-Base to generate hazard risk maps. Advanced algorithms in these software modules include line-enhancement and edge-detection algorithms, surface-characterization algorithms, and algorithms that implement innovative data-fusion techniques. The line-extraction and edge-detection algorithms enable users to locate such features as faults and landslide headwall scarps. Also implemented in this software are improved methodologies for identification and mapping of past landslide events by use of (1) accurate, ELF-derived surface characterizations and (2) three LiDAR/optical-data-fusion techniques: post-classification data fusion, maximum-likelihood estimation modeling, and hierarchical within-class discrimination. This software is expected to enable faster, more accurate forecasting of natural hazards than has previously been possible.

  14. A high performance biometric signal and image processing method to reveal blood perfusion towards 3D oxygen saturation mapping

    NASA Astrophysics Data System (ADS)

    Imms, Ryan; Hu, Sijung; Azorin-Peris, Vicente; Trico, Michaël; Summers, Ron

    2014-03-01

    Non-contact imaging photoplethysmography (PPG) is a recent development in the field of physiological data acquisition, currently undergoing a large amount of research to characterize and define the range of its capabilities. Contact-based PPG techniques have been broadly used in clinical scenarios for a number of years to obtain direct information about the degree of oxygen saturation for patients. With the advent of imaging techniques, there is strong potential to enable access to additional information such as multi-dimensional blood perfusion and saturation mapping. The further development of effective opto-physiological monitoring techniques is dependent upon novel modelling techniques coupled with improved sensor design and effective signal processing methodologies. The biometric signal and imaging processing platform (bSIPP) provides a comprehensive set of features for extraction and analysis of recorded iPPG data, enabling direct comparison with other biomedical diagnostic tools such as ECG and EEG. Additionally, utilizing information about the nature of tissue structure has enabled the generation of an engineering model describing the behaviour of light during its travel through the biological tissue. This enables the estimation of the relative oxygen saturation and blood perfusion in different layers of the tissue to be calculated, which has the potential to be a useful diagnostic tool.
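
    The heart of imaging PPG referenced above is simple to sketch: the spatial mean of a skin region of interest, taken frame by frame, yields a pulsatile waveform that can be band-pass filtered around cardiac frequencies. The block below is a hedged toy version with a synthetic "video"; it is not the bSIPP platform's implementation.

    ```python
    # Toy imaging-PPG extraction: ROI mean per frame, then cardiac band-pass.
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 30.0                                      # camera frame rate, Hz
    rng = np.random.default_rng(3)
    t = np.arange(0, 20, 1 / fs)
    # Synthetic video: ROI intensity modulated by a 1.2 Hz "pulse" plus noise.
    video = (0.5 + 0.01 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]
             + 0.005 * rng.normal(size=(t.size, 32, 32)))

    raw_ppg = video.mean(axis=(1, 2))              # one sample per frame
    b, a = butter(3, [0.7, 3.0], btype="bandpass", fs=fs)
    ppg = filtfilt(b, a, raw_ppg)                  # cardiac-band PPG waveform
    ```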

  15. Development and characterization of a 3D high-resolution terrain database

    NASA Astrophysics Data System (ADS)

    Wilkosz, Aaron; Williams, Bryan L.; Motz, Steve

    2000-07-01

    A top-level description of methods used to generate elements of a high resolution 3D characterization database is presented. The database elements are defined as ground plane elevation map, vegetation height elevation map, material classification map, discrete man-made object map, and temperature radiance map. The paper will cover data collection by means of aerial photography, techniques of soft photogrammetry used to derive the elevation data, and the methodology followed to generate the material classification map. The discussion will feature the development of the database elements covering Fort Greely, Alaska. The developed databases are used by the US Army Aviation and Missile Command to evaluate the performance of various missile systems.

  16. Methodology to design a municipal solid waste generation and composition map: a case study.

    PubMed

    Gallardo, A; Carlos, M; Peris, M; Colomer, F J

    2014-11-01

    Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step consists of defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to analyze these factors beforehand. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous across the city, as neither the number of inhabitants nor the economic activity is constant. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies that deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available and an indirect way when there is a lack of data and it is necessary to fall back on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool for organizing MSW collection routes, including selective collection. The methodology has been verified by successfully applying it to a Spanish town. Copyright © 2014 Elsevier Ltd. All rights reserved.
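
    The "indirect way" mentioned above can be sketched in a few lines: when only census data are at hand, per-district generation is estimated as inhabitants times a bibliographic per-capita rate and then binned into classes for a thematic map. District names, the rate and the class breaks below are invented for illustration.

    ```python
    # Indirect estimation of MSW generation per district for a thematic map.
    import pandas as pd

    districts = pd.DataFrame({
        "district": ["A", "B", "C", "D"],
        "inhabitants": [12_000, 4_500, 30_000, 8_800],
    })
    KG_PER_CAPITA_DAY = 1.2        # assumed bibliographic generation rate

    districts["msw_t_per_year"] = (districts["inhabitants"]
                                   * KG_PER_CAPITA_DAY * 365 / 1000)
    districts["map_class"] = pd.cut(districts["msw_t_per_year"],
                                    bins=[0, 2000, 8000, float("inf")],
                                    labels=["low", "medium", "high"])
    print(districts)
    ```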

  17. Nationwide Natural Resource Inventory of the Philippines Using Lidar: Strategies, Progress, and Challenges

    NASA Astrophysics Data System (ADS)

    Blanco, A. C.; Tamondong, A.; Perez, A. M.; Ang, M. R. C.; Paringit, E.; Alberto, R.; Alibuyog, N.; Aquino, D.; Ballado, A.; Garcia, P.; Japitana, M.; Ignacio, M. T.; Macandog, D.; Novero, A.; Otadoy, R. E.; Regis, E.; Rodriguez, M.; Silapan, J.; Villar, R.

    2016-06-01

    The Philippines has embarked on a detailed nationwide natural resource inventory using LiDAR through the Phil-LiDAR 2 Program. This 3-year program has developed and has been implementing mapping methodologies and protocols to produce high-resolution maps of agricultural, forest, coastal marine, hydrological features, and renewable energy resources. The Program has adopted strategies for system and process development, capacity building and enhancement, and expanding the network of collaborations. These strategies include training programs (on point cloud and image processing, GIS, and field surveys), workshops, forums, and colloquiums (program-wide, cluster-based, and project-based), and collaboration with partner national government agencies and other organizations. A cycle of training, implementation, and feedback is in place to continually improve the system and processes. To date, the Program has made progress in developing workflows and in rolling out products such as resource maps and GIS data layers, which are indispensable for planning and decision-making. Challenges remain in speeding up output production (including quality checks) and in ensuring sustainability given the short duration of the program. Enhancements to the workflows and protocols have been incorporated to address data quality and availability issues. More training sessions have been conducted for project staff hired to address human resource gaps. Collaborative arrangements with more partners are being established. To attain sustainability, the Program is developing and instituting a system of training, data updating and sharing, information utilization, and feedback. This requires the collaboration and cooperation of government agencies, LGUs, universities, other organizations, and the communities.

  18. Image analysis method for the measurement of water saturation in a two-dimensional experimental flow tank

    NASA Astrophysics Data System (ADS)

    Belfort, Benjamin; Weill, Sylvain; Lehmann, François

    2017-07-01

    A novel, non-invasive imaging technique is proposed that determines 2D maps of water content in unsaturated porous media. The method directly relates digitally measured intensities to the water content of the porous medium and requires the classical image analysis steps, i.e., normalization, filtering, background subtraction, scaling and calibration. The main advantages of this approach are that no separate calibration experiment is needed, because the calibration curve relating water content to reflected light intensity is established during the main monitoring phase of each experiment, and that no tracer or dye is injected into the flow tank. The procedure enables effective processing of a large number of photographs and thus produces 2D water content maps at high temporal resolution. A drainage/imbibition experiment in a 2D flow tank with inner dimensions of 40 cm × 14 cm × 6 cm (L × W × D) was carried out to validate the methodology. The accuracy of the proposed approach is assessed using a statistical framework to perform an error analysis and numerical simulations with a state-of-the-art computational code that solves the Richards' equation. Comparison of the cumulative mass leaving and entering the flow tank and of the water content maps produced by the photographic measurement technique and the numerical simulations demonstrates the efficiency and high accuracy of the proposed method for investigating vadose zone flow processes. Finally, the photometric procedure has been developed expressly with its extension to heterogeneous media in mind. Other processes may be investigated through different laboratory experiments, which will serve as benchmarks for the validation of numerical codes.
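
    The processing chain named above (normalization, filtering, background subtraction, scaling, calibration) can be sketched as follows. The arrays and the two calibration anchor intensities are assumed values standing in for the dry and saturated reference states observed during the experiment; real use would load the photographs.

    ```python
    # Sketch: from a raw tank photograph to a 2D water-content map.
    import numpy as np
    from scipy.ndimage import median_filter

    rng = np.random.default_rng(4)
    frame = rng.random((140, 400))                 # raw grey-level image
    background = 0.1 * rng.random((140, 400))      # reference/background image

    corrected = median_filter(frame - background, size=3)
    corrected = (corrected - corrected.min()) / (corrected.max() - corrected.min())

    # Calibration from the monitoring phase itself: intensities at known dry
    # (theta_r) and saturated (theta_s) states (assumed values).
    theta_r, theta_s = 0.05, 0.38
    i_dry, i_wet = 0.15, 0.85
    theta = theta_r + (corrected - i_dry) / (i_wet - i_dry) * (theta_s - theta_r)
    theta_map = np.clip(theta, theta_r, theta_s)   # 2D water-content map
    ```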

  19. Overcoming complexities for consistent, continental-scale flood mapping

    NASA Astrophysics Data System (ADS)

    Smith, Helen; Zaidman, Maxine; Davison, Charlotte

    2013-04-01

    The EU Floods Directive requires all member states to produce flood hazard maps by 2013. Although flood mapping practices are well developed in Europe, there are huge variations in the scale and resolution of the maps between individual countries. Since extreme flood events are rarely confined to a single country, this is problematic, particularly for the re/insurance industry, whose exposures often extend beyond country boundaries. Here, we discuss the challenges of large-scale hydrological and hydraulic modelling, using our experience of developing a 12-country model and set of maps to illustrate how consistent, high-resolution river flood maps across Europe can be produced. The main challenges addressed include: data acquisition; manipulating the vast quantities of high-resolution data; and computational resources. Our starting point was to develop robust flood-frequency models that are suitable for estimating peak flows for a range of design flood return periods. We used the index flood approach, based on a statistical analysis of historic river flow data pooled on the basis of catchment characteristics. Historical flow data were therefore sourced for each country and collated into a large pan-European database. After a lengthy validation these data were collated into 21 separate analysis zones or regions, grouping smaller river basins according to their physical and climatic characteristics. The very large continental-scale basins were each modelled separately on account of their size (e.g. Danube, Elbe, Drava and Rhine). Our methodology allows the design flood hydrograph to be predicted at any point on the river network for a range of return periods. Using JFlow+, JBA's proprietary 2D hydraulic model, the calculated out-of-bank flows for all watercourses with an upstream drainage area exceeding 50 km2 were routed across two different Digital Terrain Models in order to map the extent and depth of floodplain inundation. This generated modelling for a total river length of approximately 250,000 km. Such a large-scale, high-resolution modelling exercise is extremely demanding on computational resources and would have been unfeasible without the use of Graphics Processing Units on a network of standard-specification gaming computers; our GPU grid is the world's largest flood-dedicated computer grid. The European river basins were split into approximately 100 separate hydraulic models and managed individually, although care was taken to ensure flow continuity was maintained between models. The flood hazard maps from the modelling were pieced together using GIS techniques to provide flood depth and extent information across Europe to a consistent scale and standard. After discussing the methodological challenges, we present our flood hazard maps and, drawing on extensive validation work, compare these against historical flow records and observed flood extents.
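
    The index-flood approach mentioned above is easy to sketch: each site's annual maxima are scaled by an index flood (here the median annual maximum), the pooled dimensionless data define a regional growth curve, and design flows follow by rescaling. The Gumbel distribution and all numbers below are illustrative assumptions, not the model actually fitted.

    ```python
    # Index-flood sketch: pooled growth curve, then design flows by rescaling.
    import numpy as np
    from scipy.stats import gumbel_r

    rng = np.random.default_rng(5)
    # Annual-maximum series from three hydrologically similar gauges.
    sites = [rng.gumbel(loc=100 * s, scale=25 * s, size=40) for s in (0.8, 1.0, 1.3)]

    pooled = np.concatenate([amax / np.median(amax) for amax in sites])
    loc, scale = gumbel_r.fit(pooled)              # regional growth curve

    T = np.array([10, 100, 1000])                  # return periods, years
    growth = gumbel_r.ppf(1 - 1 / T, loc, scale)

    index_flood = 150.0    # m^3/s, e.g. median annual maximum at the target site
    design_flows = index_flood * growth
    print(dict(zip(T.tolist(), design_flows.round(1))))
    ```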

  1. Using Concept Maps to Reveal Conceptual Typologies

    ERIC Educational Resources Information Center

    Hay, David B.; Kinchin, Ian M.

    2006-01-01

    Purpose: The purpose of this paper is to explain and develop a classification of cognitive structures (or typologies of thought), previously designated as spoke, chain and network thinking by Kinchin "et al." Design/methodology/approach: The paper shows how concept mapping can be used to reveal these conceptual typologies and endeavours to place…

  2. Identifying Multi-Level Culturally Appropriate Smoking Cessation Strategies for Aboriginal Health Staff: A Concept Mapping Approach

    ERIC Educational Resources Information Center

    Dawson, Anna P.; Cargo, Margaret; Stewart, Harold; Chong, Alwin; Daniel, Mark

    2013-01-01

    Aboriginal Australians, including Aboriginal Health Workers (AHWs), smoke at rates double the non-Aboriginal population. This study utilized concept mapping methodology to identify and prioritize culturally relevant strategies to promote smoking cessation in AHWs. Stakeholder participants included AHWs, other health service employees and tobacco…

  3. Mapping of Supply Chain Learning: A Framework for SMEs

    ERIC Educational Resources Information Center

    Thakkar, Jitesh; Kanda, Arun; Deshmukh, S. G.

    2011-01-01

    Purpose: The aim of this paper is to propose a mapping framework for evaluating supply chain learning potential for the context of small- to medium-sized enterprises (SMEs). Design/methodology/approach: The extracts of recently completed case based research for ten manufacturing SME units and facts reported in the previous research are utilized…

  4. A multivariate geostatistical methodology to delineate areas of potential interest for future sedimentary gold exploration

    PubMed Central

    Goovaerts, P.; Albuquerque, Teresa; Antunes, Margarida

    2015-01-01

    This paper describes a multivariate geostatistical methodology to delineate areas of potential interest for future sedimentary gold exploration, with an application to an abandoned sedimentary gold mining region in Portugal. The main challenge was the existence of only a dozen gold measurements confined to the grounds of the old gold mines, which precluded the application of traditional interpolation techniques, such as cokriging. The analysis could, however, capitalize on 376 stream sediment samples that were analyzed for twenty-two elements. Gold (Au) was first predicted at all 376 locations using linear regression (R2 = 0.798) and four metals (Fe, As, Sn and W), which are known to be mostly associated with the local gold's paragenesis. One hundred realizations of the spatial distribution of gold content were generated using sequential indicator simulation and a soft indicator coding of regression estimates, to supplement the hard indicator coding of gold measurements. Each simulated map then underwent a local cluster analysis to identify significant aggregates of low or high values. The one hundred classified maps were processed to derive the most likely classification of each simulated node and the associated probability of occurrence. Examining the distribution of the hot-spots and cold-spots reveals a clear enrichment in Au along the Erges River downstream from the old sedimentary mineralization. PMID:27777638
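
    The first step described above reduces to an ordinary regression of Au on the four pathfinder elements, whose predictions then feed the indicator simulation as soft data. The sketch below reproduces only that step on synthetic concentrations (the paper reports R2 = 0.798 on its real data):

    ```python
    # Soft Au estimates from pathfinder elements via linear regression.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(6)
    X = rng.lognormal(size=(376, 4))               # Fe, As, Sn, W concentrations
    au = 0.5 * X[:, 1] + 0.2 * X[:, 3] + rng.normal(0, 0.2, 376)

    reg = LinearRegression().fit(X, au)
    print(f"R^2 = {reg.score(X, au):.3f}")
    au_soft = reg.predict(X)    # soft data for sequential indicator simulation
    ```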

  5. Spatiotemporal Mapping of Interictal Spike Propagation: A Novel Methodology Applied to Pediatric Intracranial EEG Recordings

    PubMed Central

    Tomlinson, Samuel B.; Bermudez, Camilo; Conley, Chiara; Brown, Merritt W.; Porter, Brenda E.; Marsh, Eric D.

    2016-01-01

    Synchronized cortical activity is implicated in both normative cognitive functioning and many neurologic disorders. For epilepsy patients with intractable seizures, irregular synchronization within the epileptogenic zone (EZ) is believed to provide the network substrate through which seizures initiate and propagate. Mapping the EZ prior to epilepsy surgery is critical for detecting seizure networks in order to achieve postsurgical seizure control. However, automated techniques for characterizing epileptic networks have yet to gain traction in the clinical setting. Recent advances in signal processing and spike detection have made it possible to examine the spatiotemporal propagation of interictal spike discharges across the epileptic cortex. In this study, we present a novel methodology for detecting, extracting, and visualizing spike propagation and demonstrate its potential utility as a biomarker for the EZ. Eighteen presurgical intracranial EEG recordings were obtained from pediatric patients ultimately experiencing favorable (i.e., seizure-free, n = 9) or unfavorable (i.e., seizure-persistent, n = 9) surgical outcomes. Novel algorithms were applied to extract multichannel spike discharges and visualize their spatiotemporal propagation. Quantitative analysis of spike propagation was performed using trajectory clustering and spatial autocorrelation techniques. Comparison of interictal propagation patterns revealed an increase in trajectory organization (i.e., spatial autocorrelation) among seizure-free patients compared with seizure-persistent patients. The pathophysiological basis and clinical implications of these findings are considered. PMID:28066315
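
    One of the quantitative steps named above, spatial autocorrelation of per-electrode values, is commonly summarized with Moran's I; a minimal version under assumed inverse-distance weights is sketched below (the study's exact statistic and weighting scheme are not specified in the abstract):

    ```python
    # Moran's I over per-electrode values with inverse-distance weights.
    import numpy as np

    rng = np.random.default_rng(7)
    coords = rng.random((48, 2))                   # electrode positions
    x = rng.normal(size=48)                        # e.g. mean spike latency

    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    w = np.where(d > 0, 1.0 / d, 0.0)              # zero diagonal

    z = x - x.mean()
    moran_i = (len(x) / w.sum()) * (z @ (w @ z)) / (z @ z)
    print(f"Moran's I = {moran_i:.3f}")
    ```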

  6. Model-Driven Approach for Body Area Network Application Development.

    PubMed

    Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata

    2016-05-12

    This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality and describes the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This makes it possible to obtain an adequate measure of QoS efficiently through the interactive adjustment of the meta-parameter values and the re-generation process for the concrete BAN application.

  7. A methodology for quantifying and mapping ecosystem services provided by watersheds

    USGS Publications Warehouse

    Villamagna, Amy M.; Angermeier, Paul L.

    2015-01-01

    Watershed processes – physical, chemical, and biological – are the foundation for many benefits that ecosystems provide to human societies. A crucial step toward accurately representing those benefits, so they can ultimately inform decisions about land and water management, is the development of a coherent methodology that can translate available data into the ecosystem services (ES) produced by watersheds. ES provide an intuitive way to understand the tradeoffs associated with natural resource management. We provide a synthesis of common terminology and explain a rationale and framework for distinguishing among the components of ecosystem service delivery, including: an ecosystem's capacity to produce a service; societal demand for the service; ecological pressures on the service; and the flow of the service to people. We discuss how the interpretation and measurement of these components can differ among provisioning, regulating, and cultural services and describe selected methods for quantifying ES components as well as constraints on data availability. We also present several case studies to illustrate our methods, including mapping the capacity of several water purification services and the demand for two forms of wildlife-based recreation, and discuss future directions for ecosystem service assessments. Our flexible framework treats service capacity, demand, ecological pressure, and flow as separate but interactive entities to better evaluate the sustainability of service provision across space and time and to help guide management decisions.

  8. The Flint Food Store Survey: combining spatial analysis with a modified Nutrition Environment Measures Survey in Stores (NEMS-S) to measure the community and consumer nutrition environments.

    PubMed

    Shaver, Erika R; Sadler, Richard C; Hill, Alex B; Bell, Kendall; Ray, Myah; Choy-Shin, Jennifer; Lerner, Joy; Soldner, Teresa; Jones, Andrew D

    2018-06-01

    The goal of the present study was to use a methodology that accurately and reliably describes the availability, price and quality of healthy foods at both the store and community levels using the Nutrition Environment Measures Survey in Stores (NEMS-S), and to propose a spatial methodology for integrating these store and community data into measures of objective food access. Two hundred and sixty-five retail food stores in and within 2 miles (3·2 km) of Flint, Michigan, USA, were mapped using ArcGIS mapping software. A survey based on the validated NEMS-S was conducted at each retail food store. Scores were assigned to each store based on a modified version of the NEMS-S scoring system and linked to the mapped locations of stores. Neighbourhood characteristics (race and socio-economic distress) were appended to each store. Finally, spatial and kernel density analyses were run on the mapped store scores to obtain healthy food density metrics. Regression analyses revealed that neighbourhoods with higher socio-economic distress had significantly lower dairy sub-scores compared with their lower-distress counterparts (β coefficient=-1·3; P=0·04). Additionally, supermarkets were present only in neighbourhoods with <60 % African-American population and low socio-economic distress. Two areas in Flint had an overall NEMS-S score of 0. By identifying areas with poor access to healthy foods via a validated metric, this research can be used to help local government and organizations target interventions to high-need areas. Furthermore, the methodology used for the survey and the mapping exercise can be replicated in other cities to provide comparable results.
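
    The kernel density step mentioned above can be sketched with a score-weighted kernel density estimate over store locations, giving a continuous "healthy food density" surface. Coordinates and scores below are synthetic; the study itself worked in ArcGIS.

    ```python
    # Score-weighted kernel density surface over store locations.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(8)
    xy = rng.uniform(0, 10, size=(2, 265))         # store coordinates, km
    scores = rng.integers(1, 30, size=265).astype(float)   # modified NEMS-S scores

    kde = gaussian_kde(xy, weights=scores, bw_method=0.3)
    gx, gy = np.mgrid[0:10:100j, 0:10:100j]
    density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(100, 100)
    ```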

  9. Manyscale Computing for Sensor Processing in Support of Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Schmalz, M.; Chapman, W.; Hayden, E.; Sahni, S.; Ranka, S.

    2014-09-01

    Increasing image and signal data burden associated with sensor data processing in support of space situational awareness implies continuing computational throughput growth beyond the petascale regime. In addition to growing applications data burden and diversity, the breadth, diversity and scalability of high performance computing architectures and their various organizations challenge the development of a single, unifying, practicable model of parallel computation. Therefore, models for scalable parallel processing have exploited architectural and structural idiosyncrasies, yielding potential misapplications when legacy programs are ported among such architectures. In response to this challenge, we have developed a concise, efficient computational paradigm and software called Manyscale Computing to facilitate efficient mapping of annotated application codes to heterogeneous parallel architectures. Our theory, algorithms, software, and experimental results support partitioning and scheduling of application codes for envisioned parallel architectures, in terms of work atoms that are mapped (for example) to threads or thread blocks on computational hardware. Because of the rigor, completeness, conciseness, and layered design of our manyscale approach, application-to-architecture mapping is feasible and scalable for architectures at petascales, exascales, and above. Further, our methodology is simple, relying primarily on a small set of primitive mapping operations and support routines that are readily implemented on modern parallel processors such as graphics processing units (GPUs) and hybrid multi-processors (HMPs). In this paper, we overview the opportunities and challenges of manyscale computing for image and signal processing in support of space situational awareness applications. We discuss applications in terms of a layered hardware architecture (laboratory > supercomputer > rack > processor > component hierarchy). Demonstration applications include performance analysis and results in terms of execution time as well as storage, power, and energy consumption for bus-connected and/or networked architectures. The feasibility of the manyscale paradigm is demonstrated by addressing four principal challenges: (1) architectural/structural diversity, parallelism, and locality, (2) masking of I/O and memory latencies, (3) scalability of design as well as implementation, and (4) efficient representation/expression of parallel applications. Examples will demonstrate how manyscale computing helps solve these challenges efficiently on real-world computing systems.

  10. Assessment of groundwater vulnerability to pollution: a combination of GIS, fuzzy logic and decision making techniques

    NASA Astrophysics Data System (ADS)

    Gemitzi, Alexandra; Petalas, Christos; Tsihrintzis, Vassilios A.; Pisinaras, Vassilios

    2006-03-01

    The assessment of groundwater vulnerability to pollution aims at highlighting areas at high risk of being polluted. This study presents a methodology to estimate the risk of an aquifer being polluted from concentrated and/or dispersed sources, applying an overlay and index method involving several parameters. The parameters are categorized into three factor groups: factor group 1 includes parameters relevant to the internal aquifer system's properties, thus determining the intrinsic aquifer vulnerability to pollution; factor group 2 comprises parameters relevant to the external stresses on the system, such as human activities and rainfall effects; factor group 3 incorporates specific geological settings, such as the presence of geothermal fields or salt intrusion zones, into the computation process. Geographical information systems have been used for data acquisition and processing, coupled with a multicriteria evaluation technique enhanced with fuzzy factor standardization. Moreover, besides assigning weights to factors, a second set of weights, i.e., order weights, has been applied to the factors on a pixel-by-pixel basis, thus allowing control of the level of risk in the vulnerability determination and the enhancement of local site characteristics. Individual analysis of each factor group resulted in three intermediate groundwater-vulnerability maps, which were combined to produce the final composite groundwater vulnerability map for the study area. The method has been applied in the region of Eastern Macedonia and Thrace (Northern Greece), an area of approximately 14,000 km2. The methodology has been tested and calibrated against measured nitrate concentrations in wells in the northwest part of the study area, providing results that support the aggregation and weighting procedure.
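
    The order-weights idea described above is an ordered weighted averaging (OWA) step: after importance weights are applied, the factor values at each pixel are ranked and a second weight vector is applied by rank, steering the aggregation between min-like (risk-averse) and max-like (risk-taking) behaviour. The weights and data below are illustrative only.

    ```python
    # OWA aggregation: factor weights, then order weights applied per pixel.
    import numpy as np

    rng = np.random.default_rng(9)
    n_pixels, n_factors = 1000, 5
    factors = rng.random((n_pixels, n_factors))    # standardized 0-1 factor scores
    factor_w = np.array([0.3, 0.25, 0.2, 0.15, 0.1])

    weighted = factors * factor_w
    # Order weights emphasising the lower-ranked evidence (risk-averse choice).
    order_w = np.array([0.4, 0.3, 0.15, 0.1, 0.05])
    ranked = np.sort(weighted, axis=1)             # ascending rank per pixel
    vulnerability = (ranked * order_w).sum(axis=1) # one score per pixel
    ```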

  11. Planetary Geologic Mapping Handbook - 2009

    NASA Technical Reports Server (NTRS)

    Tanaka, K. L.; Skinner, J. A.; Hare, T. M.

    2009-01-01

    Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces (e.g., Varnes, 1974). Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962 (Hackman, 1962). Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete. Terrestrial geologic maps published by the USGS now are primarily digital products using geographic information system (GIS) software and file formats. GIS mapping tools permit easy spatial comparison, generation, importation, manipulation, and analysis of multiple raster image, gridded, and vector data sets. GIS software has also permitted the development of project-specific tools and the sharing of geospatial products among researchers. GIS approaches are now being used in planetary geologic mapping as well (e.g., Hare and others, 2009). Guidelines or handbooks on techniques in planetary geologic mapping have been developed periodically (e.g., Wilhelms, 1972, 1990; Tanaka and others, 1994). As records of the heritage of mapping methods and data, these remain extremely useful guides. However, many of the fundamental aspects of earlier mapping handbooks have evolved significantly, and a comprehensive review of currently accepted mapping methodologies is now warranted. As documented in this handbook, such a review incorporates additional guidelines developed in recent years for planetary geologic mapping by the NASA Planetary Geology and Geophysics (PGG) Program's Planetary Cartography and Geologic Mapping Working Group's (PCGMWG) Geologic Mapping Subcommittee (GEMS) on the selection and use of map bases as well as map preparation, review, publication, and distribution. In light of the current boom in planetary exploration and the ongoing rapid evolution of available data for planetary mapping, this handbook is especially timely.

  12. Mapping flood and flooding potential indices: a methodological approach to identifying areas susceptible to flood and flooding risk. Case study: the Prahova catchment (Romania)

    NASA Astrophysics Data System (ADS)

    Zaharia, Liliana; Costache, Romulus; Prăvălie, Remus; Ioana-Toroimac, Gabriela

    2017-04-01

    Given that floods continue to cause significant human and material damage worldwide every year, flood risk mitigation is a key issue and a permanent challenge in developing policies and strategies at various spatial scales. A basic phase is therefore the elaboration of flood hazard and flood risk maps, documents which provide essential support for flood risk management. The aim of this paper is to develop an approach that allows for the identification of areas susceptible to flash-floods and flooding, based on the computation and mapping of two indices: the FFPI (Flash-Flood Potential Index) and the FPI (Flooding Potential Index). These indices are obtained by integrating in a GIS environment several geographical variables which control runoff (in the case of the FFPI) and favour flooding (in the case of the FPI). The methodology was applied in the upper (mountainous) and middle (hilly) catchment of the Prahova River, a densely populated and socioeconomically well-developed area which has been affected repeatedly by water-related hazards over the past decades. The resulting maps showing the spatialization of the FFPI and FPI allow for the identification of areas with high susceptibility to flash-floods and flooding. This approach can provide useful mapped information, especially for (generally large) areas for which no flood hazard or risk maps exist. Moreover, the FFPI and FPI maps can constitute a preliminary step for flood risk and vulnerability assessment.

  13. Probability genotype imputation method and integrated weighted lasso for QTL identification.

    PubMed

    Demetrashvili, Nino; Van den Heuvel, Edwin R; Wit, Ernst C

    2013-12-30

    Many QTL studies have two common features: (1) there is often missing marker information, and (2) among the many markers involved in the biological process only a few are causal. In statistics, the second issue falls under the headings "sparsity" and "causal inference". The goal of this work is to develop a two-step statistical methodology for QTL mapping for markers with binary genotypes. The first step introduces a novel imputation method for missing genotypes. The outcomes of the proposed imputation method are probabilities, which serve as weights in the second step, namely the weighted lasso. Sparse phenotype inference is employed to select a set of predictive markers for the trait of interest. Simulation studies validate the proposed methodology under a wide range of realistic settings, and the methodology outperforms alternative imputation and variable selection methods in such studies. The methodology was applied to an Arabidopsis experiment containing 69 markers for 165 recombinant inbred lines of an F8 generation. The results confirm previously identified regions; however, several new markers are also found. On the basis of the inferred ROC behavior these markers show good potential for being real, especially for the germination trait Gmax. Our imputation method shows higher accuracy in terms of sensitivity and specificity compared with the alternative imputation method. Also, the proposed weighted lasso outperforms commonly practiced multiple regression as well as the traditional lasso and the adaptive lasso with three weighting schemes. This means that under realistic missing-data settings this methodology can be used for QTL identification.
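
    The weighted-lasso step can be sketched with a standard trick: scikit-learn's Lasso has no per-feature penalty, but dividing each marker column by its weight, fitting, and rescaling the coefficients is equivalent to penalizing each marker by alpha/weight. Genotypes, weights and effect sizes below are synthetic stand-ins for the paper's imputation probabilities.

    ```python
    # Weighted lasso via column rescaling (higher weight = weaker penalty).
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(10)
    n_lines, n_markers = 165, 69
    X = rng.integers(0, 2, size=(n_lines, n_markers)).astype(float)  # binary genotypes
    beta = np.zeros(n_markers)
    beta[[5, 20, 44]] = [1.2, -0.8, 0.9]           # a few causal QTL
    y = X @ beta + rng.normal(0, 0.5, n_lines)

    w = rng.uniform(0.6, 1.0, n_markers)           # e.g. imputation probabilities
    lasso = Lasso(alpha=0.05).fit(X * w, y)
    coef = lasso.coef_ * w                         # back on the original scale
    print("selected markers:", np.nonzero(coef)[0])
    ```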

  14. Virtual environment navigation with look-around mode to explore new real spaces by people who are blind.

    PubMed

    Lahav, Orly; Gedalevitz, Hadas; Battersby, Steven; Brown, David; Evett, Lindsay; Merritt, Patrick

    2018-05-01

    This paper examines the ability of people who are blind to construct a mental map and perform orientation tasks in real space by using Nintendo Wii technologies to explore virtual environments. The participant explores new spaces through haptic and auditory feedback triggered by pointing or walking in the virtual environments and later constructs a mental map, which can be used to navigate in real space. The study included 10 participants who were congenitally or adventitiously blind, divided into experimental and control groups. The research was implemented using virtual environment exploration and orientation tasks in real spaces, with both qualitative and quantitative methods in its methodology. The results show that the mode of exploration afforded to the experimental group is radically new in orientation and mobility training; as a result, 60% of the experimental participants constructed mental maps that were based on the map model, compared with only 30% of the control group participants. Using technology that enabled them to explore and collect spatial information in a way that is not possible in real space influenced the ability of the experimental group to construct a mental map based on the map model. Implications for rehabilitation: The virtual cane system for the first time enables people who are blind to explore and collect spatial information via the look-around mode in addition to the walk-around mode. People who are blind prefer to use the look-around mode to explore new spaces, as opposed to the walking mode. Although the look-around mode requires users to establish a complex procedure for collecting and processing the spatial data, people who are blind using this mode are able to construct a mental map as a map model. For people who are blind (as for the sighted), construction of a mental map based on the map model offers more flexibility in choosing a walking path in a real space, accounting for changes that occur in the space.

  16. The bedrock electrical conductivity map of the UK

    NASA Astrophysics Data System (ADS)

    Beamish, David

    2013-09-01

    Airborne electromagnetic (AEM) surveys, when regionally extensive, may sample a wide range of geological formations. The majority of AEM surveys can provide estimates of apparent (half-space) conductivity, and such derived data provide a mapping capability. Depth discrimination of the geophysical mapping information is controlled by the bandwidth of each particular system. The objective of this study is to assess the geological information contained in accumulated frequency-domain AEM survey data from the UK, where existing geological mapping can be considered well established. The methodology adopted involves a simple GIS-based spatial join of the AEM and geological databases. A lithology-based classification of bedrock is used to provide an inherent association with the petrophysical rock parameters controlling bulk conductivity. At a scale of 1:625k, the UK digital bedrock geological lexicon comprises just 86 lithological classifications compared with 244 standard lithostratigraphic assignments. The lowest common AEM survey frequency of 3 kHz is found to provide 87% coverage (by area) of the UK formations. The conductivities of the unsampled classes have been assigned on the basis of inherent lithological associations between formations. The statistical analysis uses over 8 million conductivity estimates and provides a new UK national-scale digital map of near-surface bedrock conductivity. The new baseline map, formed from central moments of the statistical distributions, allows assessment/interpretation of data exhibiting departures from the norm. The digital conductivity map developed here is believed to be the first such UK geophysical map compilation for over 75 years. The methodology described can also be applied to many existing AEM data sets.
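    A hedged sketch of the GIS-based spatial join described above, using GeoPandas; the file and column names are hypothetical assumptions, and the study's actual workflow may differ.

      import geopandas as gpd

      # Hypothetical inputs: AEM apparent-conductivity points and the 1:625k
      # bedrock polygons carrying a lithological classification attribute.
      aem = gpd.read_file("aem_conductivity_points.gpkg")
      geology = gpd.read_file("bedrock_625k.gpkg")[["LITH_CLASS", "geometry"]]

      # Attach the enclosing lithology class to every conductivity estimate.
      joined = gpd.sjoin(aem, geology, how="inner", predicate="within")

      # Central moments per class form the baseline conductivity statistics.
      stats = joined.groupby("LITH_CLASS")["conductivity"].agg(
          ["count", "mean", "std", "median"])
      print(stats.sort_values("median", ascending=False).head())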

  17. Multiple ligand simultaneous docking: orchestrated dancing of ligands in binding sites of protein.

    PubMed

    Li, Huameng; Li, Chenglong

    2010-07-30

    Present docking methodologies simulate only a single ligand at a time during the docking process. In reality, the molecular recognition process always involves multiple molecular species. Typical protein-ligand interactions are, for example, substrate and cofactor in a catalytic cycle, metal ion coordination together with ligand(s), and ligand binding with water molecules. To simulate real molecular binding processes, we propose a novel multiple ligand simultaneous docking (MLSD) strategy, which can deal with all the above processes, vastly improving docking sampling and binding free energy scoring. The work also compares two search strategies, the Lamarckian genetic algorithm and particle swarm optimization, which have respective advantages depending on the specific system. The methodology proves robust through systematic testing against several diverse model systems: E. coli purine nucleoside phosphorylase (PNP) in complex with two substrates, the SHP2 N-SH2 domain in complex with two peptides, and Bcl-xL in complex with ABT-737 fragments. In all cases, the final correct docking poses and relative binding free energies were obtained. In the PNP case, the simulations also capture the binding intermediates and reveal the binding dynamics during the recognition processes, which are consistent with the proposed enzymatic mechanism. In the other two cases, conventional single-ligand docking fails due to energetic and dynamic coupling among ligands, whereas MLSD recovers the correct binding modes. These three cases also represent potential applications in exploring enzymatic mechanisms, interpreting noisy X-ray crystallographic maps, and aiding fragment-based drug design, respectively.
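    Particle swarm optimization is one of the two search strategies compared above. The generic minimiser below illustrates the technique on a toy two-variable "energy" whose coupled term mimics two ligands whose optimal placements depend on each other; it is a sketch, not the authors' docking code.

      import numpy as np

      def pso(score, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
          # Generic particle swarm minimiser (a stand-in for pose search).
          rng = np.random.default_rng(1)
          x = rng.uniform(-5, 5, (n_particles, dim))   # particle positions
          v = np.zeros_like(x)
          pbest = x.copy()
          pbest_f = np.apply_along_axis(score, 1, x)   # personal bests
          gbest = pbest[pbest_f.argmin()].copy()       # global best
          for _ in range(iters):
              r1, r2 = rng.random((2, n_particles, dim))
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
              x = x + v
              f = np.apply_along_axis(score, 1, x)
              better = f < pbest_f
              pbest[better], pbest_f[better] = x[better], f[better]
              gbest = pbest[pbest_f.argmin()].copy()
          return gbest, pbest_f.min()

      # Toy "binding energy": two quadratic wells with a coupling term.
      energy = lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2 + 0.5 * p[0] * p[1]
      print(pso(energy, dim=2))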

  18. A comparison of top-down and bottom-up approaches to benthic habitat mapping to inform offshore wind energy development

    NASA Astrophysics Data System (ADS)

    LaFrance, Monique; King, John W.; Oakley, Bryan A.; Pratt, Sheldon

    2014-07-01

    Recent interest in offshore renewable energy within the United States has amplified the need for marine spatial planning to direct management strategies and address competing user demands. To assist this effort in Rhode Island, benthic habitat classification maps were developed for two sites in offshore waters being considered for wind turbine installation. Maps characterizing and representing the distribution and extent of benthic habitats are valuable tools for improving understanding of ecosystem patterns and processes, and for promoting scientifically sound management decisions. This project presented the opportunity to compare the methodologies and resulting map outputs of two classification approaches, “top-down” and “bottom-up”, in the two study areas. This comparison was undertaken to improve understanding of mapping methodologies and their applicability, including the bottom-up approach in offshore environments where data density tends to be lower, and to provide case studies for scientists and managers to consider for their own areas of interest. Such case studies can offer guidance for future work in assessing methodologies and translating them to other areas. The traditional top-down mapping approach identifies biological community patterns based on communities occurring within geologically defined habitat map units, under the concept that geologic environments contain distinct biological assemblages. Alternatively, the bottom-up approach aims to establish habitat map units centered on biological similarity and then uses statistics to identify relationships with associated environmental parameters and determine habitat boundaries. When applied to the two study areas, both mapping approaches produced habitat classes with distinct macrofaunal assemblages, and each established statistically strong and significant biotic-abiotic relationships with geologic features, sediment characteristics, water depth, and/or habitat heterogeneity over various spatial scales. The approaches were also able to integrate various data at differing spatial resolutions. The classification outputs exhibited similar results, including the number of habitat classes generated, the number of species defining the classes, the level of distinction of the biological communities, and dominance by tube-building amphipods. These results indicate that both approaches are able to discern a comparable degree of habitat variability and produce cohesive macrofaunal assemblages. The mapping approaches identify broadly similar benthic habitats at the two study sites, and both methods were able to distinguish the differing levels of heterogeneity between them. The top-down approach to habitat classification was faster and simpler to accomplish with the data available in this study when compared to the bottom-up approach. Additionally, the top-down approach generated full-coverage habitat classes that are clearly delineated and can easily be interpreted by the map user, which is desirable from a management perspective for providing a more complete assessment of the areas of interest. However, a higher level of biological variability was noted in some of the habitat classes created, indicating that the biological communities present in this area are influenced by factors not captured in the broad-scale geological habitat units used in this approach.
    The bottom-up approach was valuable in its ability to more clearly define macrofaunal assemblages among habitats, discern finer-scale habitat characteristics, and directly assess the degree of macrofaunal assemblage variability captured by the environmental parameters. From a user perspective, the map is more complex, which may be perceived as a limitation, though it likely reflects natural gradations in habitat structure and presents a more ecologically realistic portrayal of the study areas. Though more comprehensive, the bottom-up approach in this study was limited by its reliance on full-coverage data to create full-coverage habitat classes. Such classes could only be developed when sediment data were excluded, since this point-sample dataset could not be interpolated due to the high spatial heterogeneity of the study areas. Given a higher density of bottom samples, this issue could be rectified. While the top-down approach was more appropriate for this study, both approaches were found to be suitable for mapping and classifying benthic habitats. In the United States, objectives for mapping and classification for renewable energy development have not been well established. Therefore, at this time, the best-suited approach primarily depends on mapping objectives, resource availability, data quality and coverage, and geographical location, as these factors determine the types of data included, the analyses and modeling that can be performed, and the biotic-abiotic relationships identified.
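    One common way to realise the bottom-up step of defining habitat units from biological similarity is hierarchical clustering on Bray-Curtis dissimilarities of station-by-species abundances; the sketch below uses synthetic data and standard choices (group-average linkage, four classes), which may differ from the authors' statistics.

      import numpy as np
      from scipy.spatial.distance import pdist
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(2)
      # Synthetic station-by-species abundance matrix (40 grabs, 25 taxa).
      abund = rng.poisson(lam=rng.uniform(0.5, 5.0, 25), size=(40, 25))

      # Bray-Curtis dissimilarity, then group-average (UPGMA) clustering.
      d = pdist(abund, metric="braycurtis")
      tree = linkage(d, method="average")
      classes = fcluster(tree, t=4, criterion="maxclust")  # 4 habitat classes
      print(np.bincount(classes)[1:])                      # stations per class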

  19. Mapping Land Cover and Land Use Changes in the Congo Basin Forests with Optical Satellite Remote Sensing: a Pilot Project Exploring Methodologies that Improve Spatial Resolution and Map Accuracy

    NASA Astrophysics Data System (ADS)

    Molinario, G.; Baraldi, A.; Altstatt, A. L.; Nackoney, J.

    2011-12-01

    The University of Maryland has been a USAID Central Africa Regional Program for the Environment (CARPE) cross-cutting partner for many years, providing remote-sensing-derived information on forest cover and forest cover change in support of CARPE's objectives of diminishing forest degradation, forest loss, and biodiversity loss resulting from poor or nonexistent land use planning strategies. Together with South Dakota State University, UMD has provided Congo Basin-wide maps of forest cover loss at a maximum of 60 m resolution, using Landsat imagery and higher resolution imagery for algorithm training and validation. However, to better meet the needs within the CARPE Landscapes, which call for higher resolution, more accurate land cover change maps, UMD has been exploring the use of the SIAM automatic spectral-rule classifier together with pan-sharpened Landsat data (15 m resolution) and Very High Resolution imagery from various sources. The pilot project is being developed in collaboration with the African Wildlife Foundation in the Maringa Lopori Wamba CARPE Landscape. If successful, this methodology will make the creation of high resolution change maps faster and easier, making it accessible to other entities in the Congo Basin that need accurate land cover and land use change maps in order, for example, to create sustainable land use plans, conserve biodiversity and resources, and prepare Reducing Emissions from Deforestation and forest Degradation (REDD) Measurement, Reporting and Verification (MRV) projects. The paper describes the need for higher resolution land cover change maps that focus on forest change dynamics, such as the cycling between primary forest, secondary forest, agriculture, and other expanding and intensifying land uses, in the Maringa Lopori Wamba CARPE Landscape in the Equateur Province of the Democratic Republic of Congo. The methodology uses the SIAM automatic spectral-rule classifier, together with pan-sharpened Landsat imagery at 15 m resolution and Very High Resolution imagery from different sensors, obtained from the Department of Defense database that was recently opened to NASA and its Earth Observation partners. Particular emphasis is placed on the detection of agricultural fields and their expansion into primary forest or intensification in secondary forest and fallow fields, as this is the primary driver of deforestation in this area. Fields in this area are also very small and irregularly shaped, and often partly obscured by neighboring forest canopy, hence the technical challenge of correctly detecting them and tracking them through time. Finally, the potential for use of this methodology in other regions where information on land cover change is needed for land use sustainability planning is also addressed.

  20. A Multitemporal, Multisensor Approach to Mapping the Canadian Boreal Forest

    NASA Astrophysics Data System (ADS)

    Reith, Ernest

    The main anthropogenic source of CO2 emissions is the combustion of fossil fuels, while the clearing and burning of forests contribute significant amounts as well. Vegetation represents a major reservoir for terrestrial carbon stocks, and improving our ability to inventory vegetation will enhance our understanding of the impacts of land cover and climate change on carbon stocks and fluxes. These relationships may be an indication of a series of troubling biosphere-atmosphere feedback mechanisms that need to be better understood and modeled. Valuable land cover information can be provided to the global climate change modeling community using advanced remote sensing capabilities such as the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and Airborne Synthetic Aperture Radar (AIRSAR). Individually and synergistically, these data were successfully used to characterize the complex nature of Canadian boreal forest land cover types. The multiple endmember spectral mixture analysis process was applied to seasonal AVIRIS data to produce species-level vegetated land cover maps of two study sites in the Canadian boreal forest: Old Black Spruce (OBS) and Old Jack Pine (OJP). The highest overall accuracy was assessed to be at least 66% relative to the available reference map, providing evidence that high-quality, species-level land cover mapping of the Canadian boreal forest is achievable at accuracy levels greater than those of previous research efforts in the region. Backscatter information from multichannel, polarimetric SAR, using a binary decision-tree classification technique, was applied to AIRSAR with moderate success to produce maps of the boreal land cover types at both sites, with overall accuracies of at least 59%. A process centered on the noise whitening and principal component analysis features of the minimum noise fraction transform was implemented to leverage synergies contained within spatially coregistered multitemporal and multisensor AVIRIS and AIRSAR data sets, successfully producing high-accuracy boreal forest land cover maps. Overall land cover map accuracies of 78% and 72% were assessed for the OJP and OBS sites, respectively, for either seasonal or multitemporal data sets. High individual land cover accuracies appeared to be independent of site, season, or multisensor combination in the minimum noise fraction-based approach.
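    The spectral mixture idea underlying the analysis above can be illustrated with simple linear unmixing: each pixel spectrum is modelled as a non-negative combination of endmember spectra. The sketch below uses synthetic endmembers and per-pixel non-negative least squares; full multiple endmember spectral mixture analysis additionally iterates over candidate endmember sets.

      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(3)
      bands = 50
      # Synthetic endmember spectra as stand-ins (e.g. spruce, pine, moss).
      E = rng.uniform(0.05, 0.6, size=(bands, 3))
      true_frac = rng.dirichlet(np.ones(3), size=1000)   # per-pixel fractions
      pixels = true_frac @ E.T + rng.normal(0, 0.005, (1000, bands))

      # Non-negative least squares per pixel, then normalise to sum to one.
      frac = np.array([nnls(E, p)[0] for p in pixels])
      frac /= frac.sum(axis=1, keepdims=True)
      print(np.abs(frac - true_frac).mean())             # mean fraction error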

  1. Direct Bio-printing with Heterogeneous Topology Design.

    PubMed

    Ahsan, Amm Nazmul; Xie, Ruinan; Khoda, Bashir

    2017-01-01

    Bio-additive manufacturing is a promising tool for fabricating porous scaffold structures to expedite tissue regeneration processes. Unlike most traditional bulk-material objects, the microstructures of tissues and organs are highly anisotropic, heterogeneous, and porous in nature. However, modelling the internal heterogeneity of tissue/organ structures in the traditional CAD environment is difficult and often inaccurate. Moreover, the de facto STL conversion of bio-models loses information and accumulates errors at each subsequent step (build orientation, slicing, tool-path planning) of the bio-printing process plan. We propose a topology-based scaffold design methodology to accurately represent the heterogeneous internal architecture of tissues/organs. An image analysis technique is used to digitize the topology information contained in medical images of tissues/organs. A weighted topology reconstruction algorithm is implemented to represent the heterogeneity with parametric functions. The parametric functions are then used to map the spatial material distribution. The generated information is transferred directly to the 3D bio-printer, and the heterogeneous porous tissue scaffold structure is manufactured without an STL file. The proposed methodology is implemented to verify the effectiveness of the approach, and the designed example structure is bio-fabricated with a deposition-based bio-additive manufacturing system.
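    A toy sketch of the last idea, mapping a parametric function to spatial material distribution: a radial porosity function evaluated on a grid yields a per-cell deposition density. The function, coefficients, and grid are illustrative assumptions, not the paper's reconstruction algorithm.

      import numpy as np

      def porosity(r, a=0.8, b=0.5):
          # Illustrative parametric field: more porous at the core (r = 0),
          # denser toward the boundary (r = 1); coefficients are invented.
          return a - b * r

      n = 64
      y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
      r = np.clip(np.hypot(x, y), 0.0, 1.0)

      density = 1.0 - porosity(r)                  # material fraction per cell
      inside = np.hypot(x, y) <= 1.0               # circular cross-section
      deposition = np.where(inside, density, 0.0)  # per-cell extrusion weight
      print(deposition.min(), deposition.max())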

  2. Estimating economic losses from earthquakes using an empirical approach

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2013-01-01

    We extended the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) empirical fatality estimation methodology proposed by Jaiswal et al. (2009) to rapidly estimate economic losses after significant earthquakes worldwide. The requisite model inputs are shaking intensity estimates made by the ShakeMap system, the spatial distribution of population available from the LandScan database, modern and historic country or sub-country population and Gross Domestic Product (GDP) data, and economic loss data from Munich Re's historical earthquakes catalog. We developed a strategy to approximately scale GDP-based economic exposure for historical and recent earthquakes in order to estimate economic losses. The process consists of using a country-specific multiplicative factor to accommodate the disparity between economic exposure and the annual per capita GDP, and it has proven successful in hindcasting past losses. Although the loss, population, shaking estimates, and economic data used in the calibration process are uncertain, approximate ranges of losses can be estimated for the primary purpose of gauging the overall scope of the disaster and coordinating response. The proposed methodology is both indirect and approximate and is thus best suited as a rapid loss estimation model for applications like the PAGER system.
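    The core arithmetic can be illustrated as follows: exposure is scaled from per-capita GDP by a country-specific factor and multiplied by an intensity-dependent loss ratio, then summed over shaking bins. All numbers below are invented placeholders, not PAGER calibration values.

      # Hypothetical population exposed per shaking-intensity (MMI) bin.
      population_exposed = {6.0: 500_000, 7.0: 120_000, 8.0: 10_000}
      gdp_per_capita = 4_000          # USD (placeholder)
      alpha = 2.5                     # country-specific exposure multiplier
      loss_ratio = {6.0: 0.001, 7.0: 0.01, 8.0: 0.08}  # fraction of exposure lost

      loss = sum(pop * gdp_per_capita * alpha * loss_ratio[mmi]
                 for mmi, pop in population_exposed.items())
      print(f"estimated loss: ${loss / 1e6:.1f} M")    # -> $25.0 M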

  3. (Semi-)Automated landform mapping of the alpine valley Gradental (Austria) based on LiDAR data

    NASA Astrophysics Data System (ADS)

    Strasser, T.; Eisank, C.

    2012-04-01

    Alpine valleys are typically characterised as complex, hierarchically structured systems with rapid landform changes. Detection of landform changes can be supported by automated geomorphological mapping. In particular, analysis over short time scales requires a method for standardised, unbiased geomorphological map reproduction, which is delivered by automated mapping techniques. In general, digital geomorphological mapping is a challenging task, since knowledge about landforms, with respect to their natural boundaries as well as their hierarchical and scaling relationships, has to be integrated in an objective way. A combination of very high spatial resolution (VHSR) data such as LiDAR and new methods like object-based image analysis (OBIA) allows for a more standardised production of geomorphological maps. In OBIA the processing units are spatially configured objects created by multi-scale segmentation. Therefore, not only spectral information can be used for assigning the objects to geomorphological classes; spatial and topological properties can also be exploited. In this study we focus on the detection of landforms, especially bedrock and sediment deposits (alluvium, debris cones, talus, moraines, rock glaciers), as well as glaciers. The study site Gradental [N 46°58'29.1"/ E 12°48'53.8"] is located in the Schobergruppe (Austria, Carinthia) and is characterised by heterogeneous geological conditions and high process activity. The area is difficult to access and dominated by steep slopes, hindering fast and detailed geomorphological field mapping. Landforms are identified using aerial and terrestrial LiDAR data (1 m spatial resolution). These DEMs are analysed by an object-based hierarchical approach structured in three main steps. The first step is to define the occurring landforms by basic land surface parameters (LSPs), topology, and hierarchy relations. Based on those definitions a semantic model is created. Secondly, a multi-scale segmentation is performed on a three-band LSP layer that integrates slope, aspect, and plan curvature, which express the driving forces of geomorphological processes. In the third step, the generated multi-level object structures are classified in order to produce the geomorphological map. The classification rules are derived from the semantic model. Due to landform-type-specific scale dependencies of the LSPs, the LSP values used in the classification are calculated in a multi-scale manner by progressively enlarging the size of the moving window. In addition, object form properties (density, compactness, rectangular fit) are utilised as additional information for landform characterisation. Validation is performed by intersecting a visually interpreted reference map with the classification output map and calculating accuracy matrices. Validation shows an overall accuracy of 78.25% and a Kappa of 0.65. The natural borders of landforms can be easily detected by the use of slope, aspect, and plan curvature. This study illustrates the potential of OBIA for a more standardised and automated mapping of surface units (landforms, land cover). The presented methodology therefore offers a promising automated geomorphological mapping approach for alpine regions.
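    The validation step, intersecting the reference and output maps and computing accuracy matrices, reduces to deriving overall accuracy and Cohen's Kappa from a confusion matrix, as sketched below with an invented matrix.

      import numpy as np

      # Invented confusion matrix: rows = reference classes, cols = mapped.
      cm = np.array([[50,  4,  2],
                     [ 6, 40,  5],
                     [ 3,  2, 38]], dtype=float)

      n = cm.sum()
      po = np.trace(cm) / n                       # observed overall accuracy
      pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2  # chance agreement
      kappa = (po - pe) / (1 - pe)
      print(f"overall accuracy {po:.3f}, Kappa {kappa:.3f}")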

  4. Dictionary Based Machine Translation from Kannada to Telugu

    NASA Astrophysics Data System (ADS)

    Sindhu, D. V.; Sagar, B. M.

    2017-08-01

    Machine translation is the task of translating from one language to another. For languages with limited linguistic resources, such as Kannada and Telugu, a dictionary-based approach is the most suitable. This paper focuses on dictionary-based machine translation from Kannada to Telugu. The proposed methodology uses a dictionary to translate word by word, without much semantic correlation between the words. The dictionary-based machine translation process has the following sub-processes: morph analyzer, dictionary lookup, transliteration, transfer grammar, and morph generator. As part of this work, a bilingual dictionary with 8000 entries was developed and a suffix mapping table at the tag level was built. The system was tested on children's stories. In the near future the system can be further improved by defining transfer grammar rules.
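    A toy sketch of the word-by-word pipeline (suffix stripping via a tag-level mapping table, then bilingual lookup). All entries are invented, transliterated placeholders; the real system works on Kannada/Telugu script with an 8000-entry dictionary and a full morph analyzer/generator.

      # Invented, transliterated toy resources (placeholders, not real entries).
      bilingual = {"mane": "illu", "huduga": "abbayi"}   # kn stem -> te stem
      suffix_map = {"alli": "lo", "ge": "ki"}            # kn suffix -> te suffix

      def translate_word(word):
          # Morph analysis: greedily split off a known case suffix, if any.
          for kn_suf, te_suf in suffix_map.items():
              if word.endswith(kn_suf) and word[: -len(kn_suf)] in bilingual:
                  stem = word[: -len(kn_suf)]
                  return bilingual[stem] + te_suf       # morph generation
          return bilingual.get(word, word)   # copy unknown words unchanged

      print([translate_word(w) for w in ["manealli", "huduga"]])
      # -> ['illulo', 'abbayi']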

  5. Toxicophore exploration as a screening technology for drug design and discovery: techniques, scope and limitations.

    PubMed

    Singh, Pankaj Kumar; Negi, Arvind; Gupta, Pawan Kumar; Chauhan, Monika; Kumar, Raj

    2016-08-01

    Toxicity is a common drawback of newly designed chemotherapeutic agents. With the exception of pharmacophore-induced toxicity (lack of selectivity at higher concentrations of a drug), the toxicity of chemotherapeutic agents stems from the toxicophore moiety present in the drug. To date, methodologies implemented to determine toxicophores may be broadly classified into biological, bioanalytical, and computational approaches. The biological approach involves analysis of bioactivated metabolites, whereas the computational approach involves QSAR-based methods, mapping techniques, inverse docking techniques, and a few toxicophore identification/estimation tools. Being one of the major steps in the drug discovery process, toxicophore identification has proven to be an essential screening step in drug design and development. This paper, the first of its kind, covers and compares the different methodologies employed in predicting and determining toxicophores, with an emphasis on their scope and limitations. Such information may prove vital in the appropriate selection of methodology and can be used as a screening technology by researchers to discover the toxicophoric potential of their designed and synthesized moieties. Additionally, it can be utilized to manipulate molecules containing toxicophores in such a manner that their toxicity is reduced or eliminated.

  6. Potential and limitations of webcam images for snow cover monitoring in the Swiss Alps

    NASA Astrophysics Data System (ADS)

    Dizerens, Céline; Hüsler, Fabia; Wunderle, Stefan

    2017-04-01

    In Switzerland, several thousand outdoor webcams are currently connected to the Internet. They deliver freely available images that can be used to analyze snow cover variability at high spatio-temporal resolution. To make use of this big data source, we have implemented a webcam-based snow cover mapping procedure, which derives snow cover maps from such webcam images almost automatically. As information about a webcam and its parameters is mostly unavailable, our registration approach automatically resolves these parameters (camera orientation, principal point, field of view) using an estimate of the webcam's position, the mountain silhouette, and a high-resolution digital elevation model (DEM). Combined with automatic snow classification and image alignment using SIFT features, our procedure can be applied to arbitrary images to generate snow cover maps with a minimum of effort. The resulting snow cover maps have the same resolution as the digital elevation model and indicate whether each grid cell is snow-covered, snow-free, or hidden from the webcam's position. So far, we have processed images from about 290 webcams in our archive and evaluated images from 20 webcams using manually selected ground control points (GCPs) to assess the mapping accuracy of our procedure. We present methodological limitations and ongoing improvements, show some applications of our snow cover maps, and demonstrate that webcams not only offer a great opportunity to complement satellite-derived snow retrieval under cloudy conditions, but also serve as a reference for improved validation of satellite-based approaches.
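    A hedged sketch of the SIFT-based image alignment step using OpenCV: match features between a reference frame and a new frame, estimate a RANSAC homography, and warp. File names and thresholds are assumptions; the paper's exact procedure and parameters are not given.

      import cv2
      import numpy as np

      ref = cv2.imread("webcam_reference.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical
      img = cv2.imread("webcam_today.jpg", cv2.IMREAD_GRAYSCALE)      # file names

      sift = cv2.SIFT_create()
      k1, d1 = sift.detectAndCompute(ref, None)
      k2, d2 = sift.detectAndCompute(img, None)

      # Lowe ratio-test matching, then a RANSAC homography to absorb small
      # camera shifts before the snow classification is applied.
      matches = cv2.BFMatcher().knnMatch(d2, d1, k=2)
      good = [m for m, n in matches if m.distance < 0.75 * n.distance]
      src = np.float32([k2[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
      dst = np.float32([k1[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
      H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
      aligned = cv2.warpPerspective(img, H, ref.shape[::-1])  # register to ref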

  7. A New Map of Standardized Terrestrial Ecosystems of the Conterminous United States

    USGS Publications Warehouse

    Sayre, Roger G.; Comer, Patrick; Warner, Harumi; Cress, Jill

    2009-01-01

    A new map of standardized, mesoscale (tens to thousands of hectares) terrestrial ecosystems for the conterminous United States was developed by using a biophysical stratification approach. The ecosystems delineated in this top-down, deductive modeling effort are described in NatureServe's classification of terrestrial ecological systems of the United States. The ecosystems were mapped as physically distinct areas and were associated with known distributions of vegetation assemblages by using a standardized methodology first developed for South America. This approach follows the geoecosystems concept of R.J. Huggett and the ecosystem geography approach of R.G. Bailey. Unique physical environments were delineated through a geospatial combination of national data layers for biogeography, bioclimate, surficial materials lithology, land surface forms, and topographic moisture potential. Combining these layers resulted in a comprehensive biophysical stratification of the conterminous United States, which produced 13,482 unique biophysical areas. These were considered as fundamental units of ecosystem structure and were aggregated into 419 potential terrestrial ecosystems. The ecosystems classification effort preceded the mapping effort and involved the independent development of diagnostic criteria, descriptions, and nomenclature for describing expert-derived ecological systems. The aggregation and labeling of the mapped ecosystem structure units into the ecological systems classification was accomplished in an iterative, expert-knowledge-based process using automated rulesets for identifying ecosystems on the basis of their biophysical and biogeographic attributes. The mapped ecosystems, at a 30-meter base resolution, represent an improvement in spatial and thematic (class) resolution over existing ecoregionalizations and are useful for a variety of applications, including ecosystem services assessments, climate change impact studies, biodiversity conservation, and resource management.
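    The layer-combination step can be sketched compactly: stack coregistered categorical rasters and count unique value combinations, each of which becomes a candidate biophysical unit. The five layers and class counts below are randomly generated stand-ins for the real national data layers.

      import numpy as np

      rng = np.random.default_rng(4)
      h, w = 500, 500
      # Stand-ins for the five categorical layers (biogeography, bioclimate,
      # lithology, land surface forms, topographic moisture potential).
      layers = np.stack([rng.integers(0, k, (h, w))
                         for k in (10, 7, 9, 8, 5)])   # class counts invented

      # Each pixel becomes a 5-tuple; unique tuples = biophysical map units.
      combos = layers.reshape(5, -1).T
      units, inverse = np.unique(combos, axis=0, return_inverse=True)
      unit_map = inverse.reshape(h, w)   # label raster of unique combinations
      print(len(units), "unique biophysical units")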

  8. A new GIS-based tsunami risk evaluation: MeTHuVA (METU tsunami human vulnerability assessment) at Yenikapı, Istanbul

    NASA Astrophysics Data System (ADS)

    Cankaya, Zeynep Ceren; Suzen, Mehmet Lutfi; Yalciner, Ahmet Cevdet; Kolat, Cagil; Zaytsev, Andrey; Aytore, Betul

    2016-07-01

    Istanbul is a mega city with various coastal utilities located on the northern coast of the Sea of Marmara. At Yenikapı, there are critical and vulnerable coastal utilities, structures, and an active metropolitan life. Fishery ports, commercial ports, small craft harbors, passenger terminals for intercity maritime transportation, and waterfront commercial and/or recreational structures with residential/commercial areas and public utility areas are some examples of coastal utilization that are vulnerable to marine disasters. The tsunami risk in the Yenikapı region is therefore an important issue for Istanbul. In this study, a new methodology for assessing tsunami vulnerability in susceptible areas is proposed, with the Yenikapı region chosen as a case study. Available datasets from the Istanbul Metropolitan Municipality and the Turkish Navy are used as inputs for a high-resolution GIS-based multi-criteria decision analysis (MCDA) evaluation of tsunami risk in Yenikapı. A bathymetry and topography database is used for high-resolution tsunami numerical modeling, in which the tsunami hazard, in terms of coastal inundation, is deterministically computed using the NAMI DANCE numerical code under worst-case earthquake scenarios. In order to define the tsunami human vulnerability of the region, two map types, vulnerability at location and evacuation resilience, were created using the analytical hierarchy process (AHP) method of MCDA. The vulnerability at location map is composed of metropolitan use, geology, elevation, and distance-from-shoreline layers, whereas the evacuation resilience map is formed from slope, distance within flat areas, distance to buildings, and distance to road network layers. The tsunami risk map is then computed with the proposed new relationship, which combines the flow depth, vulnerability at location, and evacuation resilience maps.
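    The AHP step assigns layer weights from a pairwise comparison matrix; a standard computation takes the principal eigenvector and checks consistency, as sketched below with illustrative comparison values (not the study's judgments).

      import numpy as np

      # Illustrative pairwise comparisons (Saaty's 1-9 scale) for four layers:
      # metropolitan use, geology, elevation, distance from shoreline.
      A = np.array([[1.0, 3.0, 5.0, 4.0],
                    [1/3, 1.0, 2.0, 2.0],
                    [1/5, 1/2, 1.0, 1.0],
                    [1/4, 1/2, 1.0, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = eigvals.real.argmax()
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                             # AHP criterion weights

      n = A.shape[0]
      ci = (eigvals[k].real - n) / (n - 1)     # consistency index
      cr = ci / 0.90                           # random index RI = 0.90 for n = 4
      print(np.round(w, 3), f"CR = {cr:.3f}")  # CR < 0.1 is conventionally OK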

  9. Planck CMB Anomalies: Astrophysical and Cosmological Secondary Effects and the Curse of Masking

    NASA Astrophysics Data System (ADS)

    Rassat, Anais

    2016-07-01

    Large-scale anomalies have been reported in CMB data with both WMAP and Planck data. These could be due to foreground residuals and/or systematic effects, though their confirmation with Planck data suggests they are not due to a problem in the WMAP or Planck pipelines. If these anomalies are in fact primordial, then understanding their origin is fundamental to either validating the standard model of cosmology or exploring new physics. We investigate three other possible issues: 1) the trade-off between minimising systematics due to foreground contamination (with a conservative mask) and minimising systematics due to masking, 2) astrophysical secondary effects (the kinetic Doppler quadrupole and the kinetic Sunyaev-Zel'dovich effect), and 3) secondary cosmological signals (the integrated Sachs-Wolfe effect). We address the masking issue by considering new procedures that use both WMAP and Planck to produce higher quality full-sky maps using the sparsity methodology (LGMCA maps). We show that the impact of masking dominates that of residual foregrounds, and that the LGMCA full-sky maps can be used without further processing to study anomalies. We consider four official Planck PR1 and two LGMCA CMB maps. Analysis of the observed CMB maps shows that only the low quadrupole and the quadrupole-octopole alignment seem significant, whereas the planar octopole, Axis of Evil, mirror parity, and cold spot are not significant in nearly all maps considered. After subtraction of astrophysical and cosmological secondary effects, only the low quadrupole may still be considered anomalous, meaning that the significance of only one of the six anomalies considered is affected by secondary effect subtraction. In the spirit of reproducible research, all reconstructed maps and codes are available online.

  10. Planck CMB anomalies: astrophysical and cosmological secondary effects and the curse of masking

    NASA Astrophysics Data System (ADS)

    Rassat, A.; Starck, J.-L.; Paykari, P.; Sureau, F.; Bobin, J.

    2014-08-01

    Large-scale anomalies have been reported in CMB data with both WMAP and Planck data. These could be due to foreground residuals and/or systematic effects, though their confirmation with Planck data suggests they are not due to a problem in the WMAP or Planck pipelines. If these anomalies are in fact primordial, then understanding their origin is fundamental to either validating the standard model of cosmology or exploring new physics. We investigate three other possible issues: 1) the trade-off between minimising systematics due to foreground contamination (with a conservative mask) and minimising systematics due to masking, 2) astrophysical secondary effects (the kinetic Doppler quadrupole and the kinetic Sunyaev-Zel'dovich effect), and 3) secondary cosmological signals (the integrated Sachs-Wolfe effect). We address the masking issue by considering new procedures that use both WMAP and Planck to produce higher quality full-sky maps using the sparsity methodology (LGMCA maps). We show that the impact of masking dominates that of residual foregrounds, and that the LGMCA full-sky maps can be used without further processing to study anomalies. We consider four official Planck PR1 and two LGMCA CMB maps. Analysis of the observed CMB maps shows that only the low quadrupole and the quadrupole-octopole alignment seem significant, whereas the planar octopole, Axis of Evil, mirror parity, and cold spot are not significant in nearly all maps considered. After subtraction of astrophysical and cosmological secondary effects, only the low quadrupole may still be considered anomalous, meaning that the significance of only one of the six anomalies considered is affected by secondary effect subtraction. In the spirit of reproducible research, all reconstructed maps and codes will be made available for download at http://www.cosmostat.org/anomaliesCMB.html.

  11. Combined use of GIS and environmental indicators for assessment of chemical, physical and biological soil degradation in a Spanish Mediterranean region.

    PubMed

    de Paz, José-Miguel; Sánchez, Juan; Visconti, Fernando

    2006-04-01

    Soil is one of the main non-renewable natural resources in the world. In the Valencian Community (Mediterranean coast of Spain), it is especially important because agriculture and forest biomass exploitation are two of the main economic activities in the region. More than 44% of the total area is under agriculture and 52% is forested. The frequently arid or semi-arid climate with rainfall concentrated in a few events, usually in autumn and spring, scarce vegetation cover, and eroded, shallow soils in several areas lead to soil degradation processes. These processes, mainly water erosion and salinization, can be intense in many locations within the Valencian Community. Evaluation of soil degradation on a regional scale is important because degradation is incompatible with sustainable development. Policy makers involved in land use planning require tools to evaluate soil degradation so they can go on to develop measures aimed at protecting and conserving soils. In this study, a GIS-based methodology to evaluate physical, chemical, and biological soil degradation was developed for the Valencian Community at a 1/200,000 scale. The information used in this study was obtained from two different sources: (i) a soil survey with more than 850 soil profiles sampled within the Valencian Community, and (ii) the environmental information in the Geo-scientific map of the Valencian Community, digitised on an Arc/Info GIS. Maps of physical, chemical, and biological soil degradation in the Valencian Community at a 1/200,000 scale were obtained using the methodology devised. These maps can be used to make a cost-effective evaluation of soil degradation on a regional scale. Around 29% of the area of the Valencian Community is affected by high to very high physical soil degradation, 36% by high to very high biological degradation, and 6% by high to very high chemical degradation. It is, therefore, necessary to draw up legislation and to establish the policy framework for actions focused on preventing soil degradation and conserving its productive potential.

  12. Assessment of Data Fusion Algorithms for Earth Observation Change Detection Processes.

    PubMed

    Molina, Iñigo; Martinez, Estibaliz; Morillo, Carmen; Velasco, Jesus; Jara, Alvaro

    2016-09-30

    In this work a parametric multi-sensor Bayesian data fusion approach and a Support Vector Machine (SVM) are used for a change detection problem. For this purpose two sets of SPOT5-PAN images were used, from which Change Detection Indices (CDIs) were calculated. To minimize radiometric differences, a methodology based on zonal "invariant features" is suggested. The choice of one CDI over another for a change detection process is a subjective task, as each CDI is probably more or less sensitive to certain types of change. Likewise, this idea can be employed to create and improve a "change map" by exploiting the CDIs' informational content. For this purpose, information metrics such as Shannon entropy and "specific information" have been used to weight the change and no-change categories contained in a given CDI, and are thus introduced into the Bayesian information fusion algorithm. Furthermore, the parameters of the probability density functions (pdfs) that best fit the involved categories have also been estimated. Conversely, these considerations are not necessary for mapping procedures based on the discriminant functions of an SVM. This work has confirmed the capabilities of the probabilistic information fusion procedure under these circumstances.
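    A hedged sketch of the entropy-based weighting idea: compute the Shannon entropy of each CDI's histogram and normalise the entropies into fusion weights. The data are synthetic, and the paper additionally uses "specific information" and fitted pdfs.

      import numpy as np

      def shannon_entropy(values, bins=64):
          hist, _ = np.histogram(values, bins=bins)
          p = hist / hist.sum()
          p = p[p > 0]
          return -(p * np.log2(p)).sum()   # entropy in bits

      rng = np.random.default_rng(5)
      # Synthetic change-detection indices over the same scene, flattened.
      cdis = {"difference": rng.normal(0, 1.0, 10_000),
              "ratio": rng.normal(0, 0.5, 10_000),
              "correlation": rng.normal(0, 2.0, 10_000)}

      h = {name: shannon_entropy(v) for name, v in cdis.items()}
      total = sum(h.values())
      weights = {name: v / total for name, v in h.items()}   # fusion weights
      print(weights)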

  13. Mapping species abundance by a spatial zero-inflated Poisson model: a case study in the Wadden Sea, the Netherlands.

    PubMed

    Lyashevska, Olga; Brus, Dick J; van der Meer, Jaap

    2016-01-01

    The objective of the study was to provide a general procedure for mapping species abundance when the data are zero-inflated and spatially correlated counts. The bivalve species Macoma balthica was observed on a 500×500 m grid in the Dutch part of the Wadden Sea. In total, 66% of the 3451 counts were zeros. A zero-inflated Poisson mixture model was used to relate counts to environmental covariates. Two models were considered, one with relatively fewer covariates (model "small") than the other (model "large"). The models contained two processes: a Bernoulli process (species prevalence) and a Poisson process (species intensity, when the Bernoulli process predicts presence). The model was used to make predictions for sites where only environmental data are available. Predicted prevalences and intensities show that the model "small" predicts lower mean prevalence and higher mean intensity than the model "large". Yet the product of prevalence and intensity, which might be called the unconditional intensity, is very similar. Cross-validation showed that the model "small" performed slightly better, but the difference was small. The proposed methodology might be generally applicable, but is computer intensive.
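    A minimal, non-spatial sketch of fitting such a model with statsmodels' ZeroInflatedPoisson on synthetic data; the study's actual model adds spatial correlation and more covariates, so this only illustrates the two-process structure.

      import numpy as np
      import statsmodels.api as sm
      from statsmodels.discrete.count_model import ZeroInflatedPoisson

      rng = np.random.default_rng(6)
      n = 3451
      depth = rng.uniform(-2, 2, n)              # standardised covariate
      X = sm.add_constant(depth)

      # Synthetic zero-inflated counts: Bernoulli presence times Poisson counts.
      presence = rng.random(n) < 1 / (1 + np.exp(-(0.5 - 1.0 * depth)))
      counts = np.where(presence, rng.poisson(np.exp(1.0 + 0.8 * depth)), 0)

      fit = ZeroInflatedPoisson(counts, X, exog_infl=X).fit(disp=False)
      print(fit.params)                 # inflation and count coefficients
      mu = fit.predict(X, exog_infl=X)  # unconditional intensity per site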

  14. VHR satellite imagery for humanitarian crisis management: a case study

    NASA Astrophysics Data System (ADS)

    Bitelli, Gabriele; Eleias, Magdalena; Franci, Francesca; Mandanici, Emanuele

    2017-09-01

    In recent years, remote sensing data, along with GIS, have been widely employed to support emergency management activities. In this context, the use of satellite images and derived map products has also become more common in the different phases of humanitarian crisis response. In this work, very high resolution satellite imagery was processed to assess the evolution of the Za'atari Refugee Camp, built in Jordan in 2012 by the UN Refugee Agency to host Syrian refugees. Multispectral satellite scenes of the Za'atari area were processed by means of object-based classifications. The main aim of the present work is the development of a semi-automated procedure for multi-temporal camp monitoring, with particular reference to dwelling detection. Whilst automation of feature extraction is widely investigated in the emergency mapping domain, in humanitarian missions the information is often extracted by photointerpretation of the satellite data. This approach requires time for the interpretation; moreover, it is not reliable enough in complex situations, where features of interest are often small, heterogeneous, and inconsistent. Therefore, the present paper discusses a methodology to obtain information for assisting humanitarian crisis management, using a semi-automatic classification approach applied to satellite imagery.

  15. The effect of image sharpness on quantitative eye movement data and on image quality evaluation while viewing natural images

    NASA Astrophysics Data System (ADS)

    Vuori, Tero; Olkkonen, Maria

    2006-01-01

    The aim of the study is to test both customer image quality ratings (subjective image quality) and physical measurement of user behavior (eye movement tracking) to find customer satisfaction differences between imaging technologies. A methodological aim is to determine whether eye movements can be used quantitatively in image quality preference studies. In general, we want to map objective, physically measurable image quality to subjective evaluations and eye movement data. We conducted a series of image quality tests in which the test subjects evaluated image quality while we recorded their eye movements. Results show that eye movement parameters change consistently according to the instructions given to the user and according to physical image quality; for example, saccade duration increased with increasing blur. The results indicate that eye movement tracking could be used to differentiate the image quality evaluation strategies that users have. Results also show that eye movements would help map between technological and subjective image quality. Furthermore, these results lend some empirical support to top-down perceptual processes in image quality perception and evaluation by showing differences between perceptual processes in situations where the cognitive task varies.

  16. Map of assessed tight-gas resources in the United States

    USGS Publications Warehouse

    Biewick, Laura R. H.; ,

    2014-01-01

    This report presents a digital map of tight-gas resource assessments in the United States as part of the U.S. Geological Survey’s (USGS) National Assessment of Oil and Gas Project. Using a geology-based assessment methodology, the USGS quantitatively estimated potential volumes of undiscovered, technically recoverable natural gas resources within tight-gas assessment units (AUs). This is the second digital map product in a series of USGS unconventional oil and gas resource maps. The map plate included in this report can be printed in hard-copy form or downloaded in a Geographic Information System (GIS) data package, including an ArcGIS ArcMap document (.mxd), geodatabase (.gdb), and published map file (.pmf). In addition, the publication access table contains hyperlinks to current USGS tight-gas assessment publications and web pages.

  17. Map of assessed coalbed-gas resources in the United States, 2014

    USGS Publications Warehouse

    ,; Biewick, Laura R. H.

    2014-01-01

    This report presents a digital map of coalbed-gas resource assessments in the United States as part of the U.S. Geological Survey’s (USGS) National Assessment of Oil and Gas Project. Using a geology-based assessment methodology, the USGS quantitatively estimated potential volumes of undiscovered, technically recoverable natural gas resources within coalbed-gas assessment units (AUs). This is the third digital map product in a series of USGS unconventional oil and gas resource maps. The map plate included in this report can be printed in hardcopy form or downloaded in a Geographic Information System (GIS) data package, including an ArcGIS ArcMap document (.mxd), geodatabase (.gdb), and published map file (.pmf). In addition, the publication access table contains hyperlinks to current USGS coalbed-gas assessment publications and web pages.

  18. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    PubMed

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
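    A hedged sketch of the Monte Carlo step: perturb the criteria weights (here with a Dirichlet distribution centred on the baseline), recompute the weighted-overlay susceptibility for each draw, and use the per-cell spread as an uncertainty map. The rasters and weights are synthetic stand-ins, not the study's data.

      import numpy as np

      rng = np.random.default_rng(7)
      n_cells, n_criteria = 10_000, 6
      criteria = rng.random((n_cells, n_criteria))  # standardised criteria layers
      w0 = np.array([0.30, 0.25, 0.15, 0.12, 0.10, 0.08])  # baseline AHP weights

      # 500 Monte Carlo draws around the baseline; a larger concentration
      # keeps weights closer to w0 (i.e. higher confidence in them).
      runs = rng.dirichlet(w0 * 100, size=500)
      scores = criteria @ runs.T                    # susceptibility per draw

      mean_map = scores.mean(axis=1)                # mean susceptibility
      uncertainty_map = scores.std(axis=1)          # per-cell uncertainty
      print(f"mean per-cell std: {uncertainty_map.mean():.4f}")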

  19. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis☆

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987

  20. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  1. From Determinism and Probability to Chaos: Chaotic Evolution towards Philosophy and Methodology of Chaotic Optimization

    PubMed Central

    2015-01-01

    We present and discuss the philosophy and methodology of chaotic evolution, which is theoretically supported by chaos theory. We introduce four chaotic systems, namely the logistic map, tent map, Gaussian map, and Hénon map, into a well-designed chaotic evolution algorithm framework to implement several chaotic evolution (CE) algorithms. By comparing our previously proposed CE algorithm using the logistic map with two canonical differential evolution (DE) algorithms, we analyse and discuss the optimization performance of the CE algorithm. An investigation of the relationship between the optimization capability of the CE algorithm and the distribution characteristic of the chaotic system is conducted and analysed. From the evaluation results, we find that the distribution of the chaotic system is an essential factor influencing the optimization performance of the CE algorithm. We propose a new interactive EC (IEC) algorithm, interactive chaotic evolution (ICE), which replaces the fitness function with a real human within the CE algorithm framework. The CE search scheme is inherently based on a paired-comparison mechanism. A simulation experiment with a pseudo-IEC user is conducted to evaluate our proposed ICE algorithm. The evaluation result indicates that the ICE algorithm can obtain significantly better performance than, or the same performance as, interactive DE. Some open topics on CE, ICE, the fusion of these optimization techniques, algorithmic notation, and others are presented and discussed. PMID:25879067
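    The four chaotic systems named above are each a one-line update rule; the sketch below iterates them with their usual chaotic parameter settings (which may differ from the paper's choices). The Hénon map is two-dimensional, so its x-component would typically drive the sampling.

      import numpy as np

      def logistic(x, r=4.0):
          return r * x * (1.0 - x)

      def tent(x, mu=2.0):
          return mu * min(x, 1.0 - x)

      def gauss_map(x, a=4.9, b=-0.58):
          return np.exp(-a * x * x) + b

      def henon(state, a=1.4, b=0.3):
          x, y = state
          return (1.0 - a * x * x + y, b * x)

      def orbit(step, x0, n=5):
          xs, x = [], x0
          for _ in range(n):
              x = step(x)
              xs.append(x)
          return xs

      print(orbit(logistic, 0.3))        # chaotic at r = 4
      print(orbit(tent, 0.3))            # chaotic at mu = 2
      print(orbit(gauss_map, 0.3))
      print(orbit(henon, (0.1, 0.1)))    # classic (a, b) = (1.4, 0.3)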

  2. From determinism and probability to chaos: chaotic evolution towards philosophy and methodology of chaotic optimization.

    PubMed

    Pei, Yan

    2015-01-01

    We present and discuss the philosophy and methodology of chaotic evolution, which is theoretically supported by chaos theory. We introduce four chaotic systems, namely the logistic map, tent map, Gaussian map, and Hénon map, into a well-designed chaotic evolution algorithm framework to implement several chaotic evolution (CE) algorithms. By comparing our previously proposed CE algorithm using the logistic map with two canonical differential evolution (DE) algorithms, we analyse and discuss the optimization performance of the CE algorithm. An investigation of the relationship between the optimization capability of the CE algorithm and the distribution characteristic of the chaotic system is conducted and analysed. From the evaluation results, we find that the distribution of the chaotic system is an essential factor influencing the optimization performance of the CE algorithm. We propose a new interactive EC (IEC) algorithm, interactive chaotic evolution (ICE), which replaces the fitness function with a real human within the CE algorithm framework. The CE search scheme is inherently based on a paired-comparison mechanism. A simulation experiment with a pseudo-IEC user is conducted to evaluate our proposed ICE algorithm. The evaluation result indicates that the ICE algorithm can obtain significantly better performance than, or the same performance as, interactive DE. Some open topics on CE, ICE, the fusion of these optimization techniques, algorithmic notation, and others are presented and discussed.

  3. Harmonisation of geological data to support geohazard mapping: the case of eENVplus project

    NASA Astrophysics Data System (ADS)

    Cipolloni, Carlo; Krivic, Matija; Novak, Matevž; Pantaloni, Marco; Šinigoj, Jasna

    2014-05-01

    In the eENVplus project, which aims to unlock huge amounts of environmental data managed by national and regional environmental agencies and other public and private organisations, we have developed a cross-border pilot on geological data harmonisation through the integration and harmonisation of existing services. The pilot analyses the methodology and results of the OneGeology-Europe project, elaborated at the 1:1M scale, to point out difficulties and unsolved problems highlighted during that project. This preliminary analysis is followed by a comparison of two geological maps provided by the neighbouring countries, with the objective of comparing and defining the geometric and semantic anomalous contacts between geological polygons and lines in the maps. This phase will be followed by a detailed-scale geological map analysis aimed at resolving the anomalies identified in the previous phase. The two Geological Surveys involved in the pilot will discuss the problems highlighted during this phase. Subsequently, the semantic description will be redefined and the geometry of the polygons in the geological maps will be redrawn or adjusted according to a lithostratigraphic approach that takes into account the homogeneity of age, lithology, depositional environment, and consolidation degree of the geological units. The two Geological Surveys have decided to apply the harmonisation process to two different datasets: the first is the Geological Map at the 1:1,000,000 scale, partially harmonised within the OneGeology-Europe project, which will be re-aligned with the GE INSPIRE data model to produce data and services compliant with the INSPIRE target schema. The main target of the Geological Surveys is to produce data and web services compliant with the wider international schema, which offers more options for providing data, with specific attributes that are important for obtaining the geohazard map, as in this pilot project; we have therefore decided to apply the GeoSciML 3.2 schema to the dataset representing the Geological Map at the 1:100,000 scale. Within the pilot, two main geohazard examples will be realised with a semi-automated procedure based on a specific tool component integrated into the client: a landslide susceptibility map and a potential flooding map. In this work we present the first results obtained with the use-case geo-processing procedure in the first test phase, in which we developed a GE INSPIRE-compliant dataset to produce the landslide and flooding susceptibility maps.

  4. Empirically Guided Coordination of Multiple Evidence-Based Treatments: An Illustration of Relevance Mapping in Children's Mental Health Services

    ERIC Educational Resources Information Center

    Chorpita, Bruce F.; Bernstein, Adam; Daleiden, Eric L.

    2011-01-01

    Objective: Despite substantial progress in the development and identification of psychosocial evidence-based treatments (EBTs) in mental health, there is minimal empirical guidance for selecting an optimal "set" of EBTs maximally applicable and generalizable to a chosen service sample. Relevance mapping is a proposed methodology that…

  5. THEMATIC ACCURACY OF THE 1992 NATIONAL LAND-COVER DATA (NLCD) FOR THE EASTERN UNITED STATES: STATISTICAL METHODOLOGY AND REGIONAL RESULTS

    EPA Science Inventory

    The accuracy of the National Land Cover Data (NLCD) map is assessed via a probability sampling design incorporating three levels of stratification and two stages of selection. Agreement between the map and reference land-cover labels is defined as a match between the primary or a...

  6. A Votable Concept Mapping Approach to Promoting Students' Attentional Behavior: An Analysis of Sequential Behavioral Patterns and Brainwave Data

    ERIC Educational Resources Information Center

    Sun, Jerry Chih-Yuan; Hwang, Gwo-Jen; Lin, Yu-Yan; Yu, Shih-Jou; Pan, Liu-Cheng; Chen, Ariel Yu-Zhen

    2018-01-01

    This study explores the effects of integrated concept maps and classroom polling systems on students' learning performance, attentional behavior, and brainwaves associated with attention. Twenty-nine students from an Educational Research Methodology course were recruited as participants. For data collection, in-class quizzes, attentional behavior…

  7. A Methodology to Assess the Benefit of Operational or Tactic Adjustments to Reduce Marine Corps Fuel Consumption

    DTIC Science & Technology

    2015-12-01

    The project developed a Map Aware Non-Uniform Automata (MANA) model for each SPMAGTF size. The MANA models simulated the maneuver and direct…

  8. New mapping technologies - mapping and imaging from space

    NASA Technical Reports Server (NTRS)

    Blom, R. G.

    2000-01-01

    New and significantly enhanced space-based observational capabilities are available that are of potential use to the hazards community. In combination with existing methodologies, these instruments and data can significantly enhance and extend current procedures for seismic zonation and hazards evaluation. This paper provides a brief overview of several of the more useful data sets available.

  9. Mapping Surface Features Produced by an Active Landslide

    NASA Astrophysics Data System (ADS)

    Parise, Mario; Gueguen, Erwan; Vennari, Carmela

    2016-10-01

    A large landslide reactivated in December 2013 at Montescaglioso, southern Italy, after 56 hours of rainfall. The landslide disrupted over 500 m of a freeway and affected several warehouses, a supermarket, and private homes. After the event, field surveys were performed, aided by visual analysis of terrestrial and helicopter photographs, to compile a map of the surface deformations. The geomorphological features mapped included single fractures, sets of fractures, tension cracks, trenches, and pressure ridges. In this paper we present the methodology used and the map obtained through intensive field work, and we discuss the main surface features produced by the landslide.

  10. Mapping public policy options responding to obesity: the case of Spain.

    PubMed

    González-Zapata, L I; Ortiz-Moncada, R; Alvarez-Dardet, C

    2007-05-01

    This study assesses the opinions of the main Spanish stakeholders from the food and physical exercise policy networks on public policy options for responding to obesity. We followed the multi-criteria mapping methodology in the framework of the European project 'Policy options in responding to obesity' (PorGrow), through structured interviews with 21 stakeholders. A four-step approach was taken: option selection, criteria definition, scoring, and weighting, thereby obtaining a measure of the performance of each option that integrates qualitative and quantitative information. In the overall analysis, the most popular policy options were those grouped as educational initiatives: including food and health in the school curriculum, improving health education for the general public, improving the training of health professionals in obesity care and prevention, giving incentives to caterers to provide healthier menus, and improving community sports facilities. Fiscal measures such as subsidies and taxes had the lowest support. The criteria assessed as priorities were those grouped under efficacy and societal benefits. Obesity in Spain can be addressed through public policies, although the process will be neither easy nor immediate. The feasibility of change requires that public policymakers develop long-term actions that take into account the stakeholders' map of prioritized options.
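
    The four-step multi-criteria mapping procedure reduces, at its core, to a weighted aggregation of per-criterion scores for each policy option. A minimal sketch of that aggregation, with illustrative option names, criteria, weights, and scores (not values from the PorGrow study):

    ```python
    # Hypothetical multi-criteria aggregation: weighted mean of per-criterion scores.
    options = {
        "school curriculum":  {"efficacy": 8, "societal benefit": 7, "cost": 5},
        "food taxes":         {"efficacy": 5, "societal benefit": 4, "cost": 6},
    }
    weights = {"efficacy": 0.5, "societal benefit": 0.3, "cost": 0.2}

    def performance(scores, weights):
        """Weighted performance of one option across all criteria."""
        return sum(weights[c] * s for c, s in scores.items())

    for name, scores in sorted(options.items(),
                               key=lambda kv: -performance(kv[1], weights)):
        print(f"{name}: {performance(scores, weights):.2f}")
    ```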

  11. System dynamic modelling to assess economic viability and risk trade-offs for ecological restoration in South Africa.

    PubMed

    Crookes, D J; Blignaut, J N; de Wit, M P; Esler, K J; Le Maitre, D C; Milton, S J; Mitchell, S A; Cloete, J; de Abreu, P; Fourie nee Vlok, H; Gull, K; Marx, D; Mugido, W; Ndhlovu, T; Nowell, M; Pauw, M; Rebelo, A

    2013-05-15

    Can markets assist by providing support for ecological restoration, and if so, under what conditions? The first step in addressing this question is to develop a consistent methodology for economic evaluation of ecological restoration projects. A risk analysis process was followed in which a system dynamics model was constructed for eight diverse case study sites where ecological restoration is currently being pursued. Restoration costs vary across each of these sites, as do the benefits associated with restored ecosystem functioning. The system dynamics model simulates the ecological, hydrological and economic benefits of ecological restoration and informs a portfolio mapping exercise in which payoffs are matched against the likelihood of success of a project, as well as a number of other factors (such as project costs and risk measures). This is the first known application that couples ecological restoration with system dynamics and portfolio mapping. The results suggest an approach that is able to move beyond traditional indicators of project success, since the effect of discounting is virtually eliminated. We conclude that system dynamics modelling with portfolio mapping can guide decisions on when markets for restoration activities may be feasible.
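
    Portfolio mapping as described here matches each project's payoff against its likelihood of success. A minimal sketch of such a mapping, with hypothetical project names, payoffs, probabilities, and quadrant thresholds (not figures from the eight case studies):

    ```python
    # Hypothetical portfolio map: classify restoration projects by expected payoff
    # and likelihood of success into simple quadrants.
    projects = [
        # (name, annual payoff in currency units, probability of success)
        ("thicket restoration", 120_000, 0.8),
        ("riparian clearing",    40_000, 0.9),
        ("dune stabilisation",   90_000, 0.4),
    ]

    PAYOFF_CUT, PROB_CUT = 75_000, 0.6  # illustrative quadrant thresholds

    for name, payoff, p in projects:
        quadrant = (("high" if payoff >= PAYOFF_CUT else "low") + " payoff / "
                    + ("high" if p >= PROB_CUT else "low") + " likelihood")
        print(f"{name}: expected value {payoff * p:,.0f} ({quadrant})")
    ```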

  12. Wetland Mapping with Quad-Pol Data Acquired during Tandem-X Science Phase

    NASA Astrophysics Data System (ADS)

    Mleczko, M.; Mroz, M.; Fitrzyk, M.

    2016-06-01

    The aim of this study was to exploit fully polarimetric SAR data acquired during the TanDEM-X Science Phase (2014/2015) over the herbaceous wetlands of the Biebrza National Park (BbNP) in north-eastern Poland for mapping seasonally flooded grasslands and permanent natural vegetation associations. The main goal of this work was to estimate the advantage of fully polarimetric radar images (QuadPol) over alternating polarization (AltPol) modes. The methodology consisted of processing several data subsets: polarimetric decompositions of the complex quad-pol datasets, classification of multitemporal backscattering images, complementing the backscattering images with Shannon entropy, and exploitation of the interferometric coherence from tandem operation. In each case the multidimensional stack of images was classified with the ISODATA unsupervised clustering algorithm. With six quad-pol TSX/TDX acquisitions it was possible to correctly distinguish five thematic classes related to water regime: permanent water bodies, temporarily flooded areas, wet grasslands, dry grasslands and common reed. This last category could be distinguished from deciduous forest only with the Yamaguchi four-component decomposition. The interferometric coherence calculated for the tandem pairs turned out to be less effective than expected for this wetland mapping.
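
    The classification step stacks the multitemporal layers per pixel and clusters them without supervision. A minimal sketch is below; it substitutes scikit-learn's KMeans for ISODATA (which additionally splits and merges clusters), and the array shapes and data are placeholders:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Assumed stack: (layers, rows, cols) of backscatter/decomposition channels.
    stack = np.random.rand(12, 200, 200).astype(np.float32)  # placeholder data

    n_layers, rows, cols = stack.shape
    pixels = stack.reshape(n_layers, -1).T          # one feature vector per pixel

    # KMeans as a simple stand-in for ISODATA unsupervised clustering.
    labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(pixels)
    class_map = labels.reshape(rows, cols)          # thematic class per pixel
    print(np.bincount(labels))                      # pixel count per class
    ```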

  13. A geostatistical approach to data harmonization - Application to radioactivity exposure data

    NASA Astrophysics Data System (ADS)

    Baume, O.; Skøien, J. O.; Heuvelink, G. B. M.; Pebesma, E. J.; Melles, S. J.

    2011-06-01

    Environmental issues such as air and groundwater pollution and climate change are frequently studied at spatial scales that cross the boundaries between political and administrative regions. It is common for different administrations to employ different data collection methods. If these differences are not taken into account in spatial interpolation procedures, then biases may appear and cause unrealistic results. The resulting maps may show misleading patterns and lead to wrong interpretations. Also, errors will propagate when these maps are used as input to environmental process models. In this paper we present and apply a geostatistical model that generalizes the universal kriging model such that it can handle heterogeneous data sources. The associated best linear unbiased estimation and prediction (BLUE and BLUP) equations are presented, and it is shown that these lead to harmonized maps from which estimated biases are removed. The methodology is illustrated with an example of country bias removal in a radioactivity exposure assessment for four European countries. The application also addresses multicollinearity problems in data harmonization, which arise when both artificial bias factors and natural drifts are present and cannot easily be distinguished. Solutions for handling multicollinearity are suggested and directions for further investigation are proposed.
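
    The generalization of universal kriging sketched in the abstract can be written by adding per-network bias terms to the drift. The notation below is illustrative, not taken from the paper:

    ```latex
    % Universal kriging extended with per-network bias terms (illustrative notation):
    % Z(s): observed value at location s; f_j: natural drift functions;
    % I_k(s): indicator that s was measured by network k; alpha_k: that network's bias.
    \[
      Z(s) \;=\; \sum_{j} \beta_j f_j(s) \;+\; \sum_{k} \alpha_k I_k(s) \;+\; \varepsilon(s),
    \]
    % where epsilon(s) is a zero-mean, spatially correlated residual. BLUE/BLUP
    % estimate the beta_j and alpha_k jointly; harmonized maps subtract the
    % estimated alpha_k from each network's data.
    ```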

  14. UAV Deployment Exercise for Mapping Purposes: Evaluation of Emergency Response Applications.

    PubMed

    Boccardo, Piero; Chiabrando, Filiberto; Dutto, Furio; Tonolo, Fabio Giulio; Lingua, Andrea

    2015-07-02

    Exploiting the decreasing costs of UAV technology, the humanitarian community started piloting the use of such systems in humanitarian crises several years ago in different application fields, e.g., disaster mapping and information gathering, community capacity building, logistics and even transportation of goods. Part of the authors' group, composed of researchers in the field of applied geomatics, has been piloting the use of UAVs since 2006, with a specific focus on disaster management applications. In the framework of these activities, a UAV deployment exercise was jointly organized with the Regional Civil Protection authority, mainly aimed at assessing the operational procedures for deploying UAVs for mapping purposes and the usability of the acquired data in an emergency response context. The paper describes the technical features of the UAV platforms, comparing the main advantages and disadvantages of fixed-wing versus rotor platforms. The main phases of the adopted operational procedure are discussed and assessed, especially in terms of the time required to carry out each step, highlighting potential bottlenecks, and in view of the national regulatory framework, which is rapidly evolving. Different methodologies for processing the acquired data are described and discussed, evaluating their fitness for emergency response applications.

  15. Semi-automatic mapping of linear-trending bedforms using 'Self-Organizing Maps' algorithm

    NASA Astrophysics Data System (ADS)

    Foroutan, M.; Zimbelman, J. R.

    2017-09-01

    Increasing use of high-resolution spatial data, such as high-resolution satellite or Unmanned Aerial Vehicle (UAV) images of Earth as well as High Resolution Imaging Science Experiment (HiRISE) images of Mars, increases the need for automation techniques capable of extracting detailed geomorphologic elements from such large data sets. Model validation against repeated images in environmental management studies, such as studies of climate-related change, together with growing access to high-resolution satellite imagery, underlines the demand for detailed automatic image-processing techniques in remote sensing. This study presents a methodology based on an unsupervised Artificial Neural Network (ANN) algorithm, known as Self-Organizing Maps (SOM), for the semi-automatic extraction of linear features with small footprints from satellite images. SOM is based on competitive learning and is efficient at handling huge data sets. We applied the SOM algorithm to high-resolution satellite images of Earth and Mars (QuickBird, WorldView and HiRISE) to facilitate and speed up image analysis while improving the accuracy of the results. An overall accuracy of about 98% and a quantization error of 0.001 in the recognition of small linear-trending bedforms demonstrate a promising framework.
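
    A SOM learns a low-dimensional grid of prototype vectors by competitive learning: each sample pulls its best-matching unit and that unit's grid neighbours toward itself. A minimal NumPy sketch, with an illustrative grid size, learning rate, and decay schedule (not the study's settings):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: one feature vector per pixel (e.g., texture measures).
    X = rng.random((1000, 4))

    # 6x6 SOM grid of 4-dimensional prototype vectors.
    grid_h, grid_w = 6, 6
    weights = rng.random((grid_h, grid_w, X.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                                  indexing="ij"), axis=-1)  # (h, w, 2)

    n_iter, lr0, sigma0 = 2000, 0.5, 2.0
    for t in range(n_iter):
        x = X[rng.integers(len(X))]
        # Best-matching unit: grid cell whose prototype is closest to x.
        d = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(d), d.shape)
        # Decaying learning rate and neighbourhood radius.
        frac = t / n_iter
        lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
        # Gaussian neighbourhood pull toward the sample.
        g = np.exp(-np.sum((coords - bmu) ** 2, axis=-1) / (2 * sigma ** 2))
        weights += lr * g[..., None] * (x - weights)

    # Quantization error: mean distance of samples to their BMU prototype.
    flat = weights.reshape(-1, X.shape[1])
    qe = np.mean(np.min(np.linalg.norm(X[:, None] - flat[None], axis=-1), axis=1))
    print(f"quantization error: {qe:.4f}")
    ```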

  16. Nonparametric modeling of longitudinal covariance structure in functional mapping of quantitative trait loci.

    PubMed

    Yap, John Stephen; Fan, Jianqing; Wu, Rongling

    2009-12-01

    Estimation of the covariance structure of longitudinal processes is a fundamental prerequisite for the practical deployment of functional mapping designed to study the genetic regulation and network of quantitative variation in dynamic complex traits. We present a nonparametric approach for estimating the covariance structure of a quantitative trait measured repeatedly at a series of time points. Specifically, we adopt Huang et al.'s (2006, Biometrika 93, 85-98) approach of invoking the modified Cholesky decomposition and converting the problem into modeling a sequence of regressions of responses. A regularized covariance estimator is obtained using a normal penalized likelihood with an L2 penalty. This approach, embedded within a mixture likelihood framework, leads to enhanced accuracy, precision, and flexibility of functional mapping while preserving its biological relevance. Simulation studies are performed to reveal the statistical properties and advantages of the proposed method. A real example from a mouse genome project is analyzed to illustrate the methodology. The new method will provide a useful tool for genome-wide scanning for the existence and distribution of quantitative trait loci underlying a dynamic trait important to agriculture, biology, and health sciences.
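
    The modified Cholesky device turns covariance estimation into a sequence of regressions: each time point is regressed on all earlier ones, and the coefficients and innovation variances rebuild the covariance matrix. A minimal sketch with a ridge (L2) penalty on toy data; this follows the general technique, not the paper's exact penalized-likelihood estimator:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, T_pts, lam = 200, 6, 0.1          # subjects, time points, ridge penalty
    true_cov = 0.5 ** np.abs(np.subtract.outer(range(T_pts), range(T_pts)))
    Y = rng.multivariate_normal(np.zeros(T_pts), true_cov, size=n)  # toy data

    # Modified Cholesky: regress y_t on y_1..y_{t-1}; then T @ Sigma @ T.T = D.
    Tmat = np.eye(T_pts)
    dvar = np.empty(T_pts)
    dvar[0] = Y[:, 0].var()
    for t in range(1, T_pts):
        Xp, y = Y[:, :t], Y[:, t]
        # Ridge-penalized regression coefficients (the L2 penalty).
        phi = np.linalg.solve(Xp.T @ Xp + lam * np.eye(t), Xp.T @ y)
        Tmat[t, :t] = -phi
        dvar[t] = np.mean((y - Xp @ phi) ** 2)   # innovation variance

    Tinv = np.linalg.inv(Tmat)
    Sigma_hat = Tinv @ np.diag(dvar) @ Tinv.T    # regularized covariance estimate
    print(np.round(Sigma_hat, 2))
    ```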

  17. Surrogate based wind farm layout optimization using manifold mapping

    NASA Astrophysics Data System (ADS)

    Kaja Kamaludeen, Shaafi M.; van Zuijle, Alexander; Bijl, Hester

    2016-09-01

    The high computational cost of high-fidelity wake models such as RANS or LES is the primary bottleneck preventing direct high-fidelity wind farm layout optimization (WFLO) with accurate CFD-based wake models. We therefore propose a surrogate-based multi-fidelity WFLO methodology (SWFLO). The surrogate model is built using a surrogate-based optimization (SBO) method referred to as manifold mapping (MM). As a verification, the spacing between two staggered wind turbines was optimized using the proposed surrogate-based methodology, and its performance was compared with that of direct optimization using the high-fidelity model. A significant reduction in computational cost was achieved using MM: a maximum reduction of 65%, while arriving at the same optimum as direct high-fidelity optimization. The similarity between the model responses, as well as the number and position of the mapping points, strongly influences the computational efficiency of the proposed method. As a proof of concept, a realistic WFLO of a small 7-turbine wind farm was performed using the proposed surrogate-based methodology, with two variants of the Jensen wake model with different decay coefficients serving as the fine and coarse models. The proposed SWFLO method arrived at the same optimum as the fine model with very few fine-model simulations.
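
    In one dimension, the manifold-mapping idea can be reduced to a secant-corrected response surrogate: the cheap model's response is shifted and scaled so it matches the fine model's response at the current iterate, and the corrected surrogate is re-optimized. The sketch below applies this to the paper's own proof-of-concept setup, two Jensen wake variants with different decay coefficients; the cost term, parameter values, and the scalar secant correction are illustrative simplifications of full manifold mapping, not the paper's implementation:

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    D0 = 1.0 - np.sqrt(1.0 - 0.8)        # Jensen initial deficit, thrust coeff 0.8

    def deficit(s, k):
        """Jensen wake deficit at spacing s (rotor diameters), decay coefficient k."""
        return D0 / (1.0 + 2.0 * k * s) ** 2

    def objective(d, s):
        """Negative net value: downstream power (1-d)^3 minus a land-use cost."""
        return -((1.0 - d) ** 3 - 0.02 * s)

    fine = lambda s: deficit(s, 0.05)     # "fine" response (slower wake decay)
    coarse = lambda s: deficit(s, 0.10)   # cheap "coarse" response

    bounds = (2.0, 25.0)
    x_prev, x = None, 5.0
    S = 1.0                               # response-correction factor
    for _ in range(20):
        if x_prev is not None and abs(coarse(x) - coarse(x_prev)) > 1e-12:
            # Secant estimate of d(fine)/d(coarse): 1-D manifold-mapping factor.
            S = (fine(x) - fine(x_prev)) / (coarse(x) - coarse(x_prev))
        f_k, c_k = fine(x), coarse(x)
        surrogate = lambda s: objective(f_k + S * (coarse(s) - c_k), s)
        res = minimize_scalar(surrogate, bounds=bounds, method="bounded")
        x_prev, x = x, res.x
        if abs(x - x_prev) < 1e-4:
            break
    print(f"spacing ~ {x:.2f} rotor diameters; fine objective {objective(fine(x), x):.4f}")
    ```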

  18. Sub-hectare crop area mapped wall-to-wall in Tigray Ethiopia with HEC processing of WorldView sub-meter panchromatic image texture

    NASA Astrophysics Data System (ADS)

    Neigh, C. S. R.; Carroll, M.; Wooten, M.; McCarty, J. L.; Powell, B.; Husak, G. J.; Enenkel, M.; Hain, C.

    2017-12-01

    Global food production in the developing world often occurs in sub-hectare fields that are difficult to identify with moderate-resolution satellite imagery. Knowledge about the distribution of these fields is critical to food security programs. We developed a semi-automated image segmentation approach using wall-to-wall sub-meter imagery with high-end computing (HEC) to map crop area (CA) throughout Tigray, Ethiopia, which encompasses over 41,000 km2. Our approach tested multiple HEC processing streams to reduce processing time and minimize mapping error. We applied smoothing kernels at multiple resolutions to capture differences in land-surface texture associated with CA. Typically, very small fields (mean < 2 ha) appear smooth at the 1 m scale compared with natural scrub/shrub woody vegetation, and these features can be segmented in panchromatic imagery with multi-level histogram thresholding. We found that multi-temporal very-high-resolution (VHR) panchromatic imagery, combined with multispectral VHR and moderate-resolution imagery, is sufficient for extracting the critical CA information needed by food security programs. We produced a 2011–2015 CA map using over 3,000 WorldView-1 panchromatic images wall-to-wall in 1/2° mosaics for Tigray, Ethiopia, in one week. We evaluated the CA estimates with nearly 3,000 WorldView-2 2 m multispectral 250 × 250 m image subsets, with seven expert interpretations, and with in-situ global positioning system (GPS) photography. Our CA estimates ranged from 32 to 41% in sub-regions of Tigray, with median maximum per-bin commission and omission errors of 11% and 1%, respectively, and with most of the error occurring in bins of less than 15%. This empirical, simple and low-direct-cost approach, available via U.S. government license agreement and HEC, could be a viable big-data methodology for extracting wall-to-wall CA in other regions of the world with very small agricultural fields of similar image texture.
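
    The segmentation described here rests on two ingredients: a texture measure computed with smoothing kernels at several scales, and multi-level histogram thresholding of that texture. A minimal sketch, assuming local variance as the texture measure and multi-Otsu as the thresholding (placeholder data, not the WorldView processing chain):

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter
    from skimage.filters import threshold_multiotsu

    # Placeholder panchromatic tile; in practice a 1 m WorldView-1 image chip.
    img = np.random.rand(512, 512).astype(np.float32)

    def local_variance(a, size):
        """Texture measure: local variance within a size x size smoothing kernel."""
        mean = uniform_filter(a, size)
        mean_sq = uniform_filter(a * a, size)
        return np.maximum(mean_sq - mean * mean, 0.0)

    # Multi-resolution texture: smooth cropland scores low, scrub/shrub high.
    texture = np.mean([local_variance(img, s) for s in (5, 11, 21)], axis=0)

    # Multi-level histogram thresholding (Otsu) splits texture into 3 classes;
    # the smoothest class is taken as candidate crop area.
    thresholds = threshold_multiotsu(texture, classes=3)
    classes = np.digitize(texture, bins=thresholds)
    crop_mask = classes == 0
    print(f"candidate crop area fraction: {crop_mask.mean():.2%}")
    ```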

  19. Summaries of the thematic conferences on remote sensing for exploration geology

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The Thematic Conference series was initiated to address the need for concentrated discussion of particular remote sensing applications. The program is primarily concerned with the application of remote sensing to mineral and hydrocarbon exploration, with special emphasis on data integration, methodologies, and practical solutions for geologists. Some fifty invited papers are scheduled for eleven plenary sessions, formulated to address such important topics as basement tectonics and their surface expressions, spectral geology, applications for hydrocarbon exploration, and radar applications and future systems. Other invited presentations will discuss geobotanical remote sensing, mineral exploration, engineering and environmental applications, advanced image processing, and integration and mapping.

  20. Applying machine learning to pattern analysis for automated in-design layout optimization

    NASA Astrophysics Data System (ADS)

    Cain, Jason P.; Fakhry, Moutaz; Pathak, Piyush; Sweis, Jason; Gennari, Frank; Lai, Ya-Chieh

    2018-04-01

    Building on previous work on cataloging unique topological patterns in an integrated circuit physical design, a new process is defined in which a risk-scoring methodology is used to rank patterns by manufacturing risk. Patterns with high risk are then mapped to functionally equivalent patterns with lower risk, and the higher-risk patterns are replaced in the design with their lower-risk equivalents. The pattern selection and replacement are fully automated and suitable for full-chip designs. Results from 14nm product designs show that the approach can identify and replace risk patterns with a quantifiable positive impact on the risk score distribution after replacement.
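
    The replacement step reduces to ranking cataloged patterns by a learned risk score and substituting high-risk patterns with functionally equivalent, lower-risk ones. A minimal sketch with hypothetical pattern identifiers, scores, and threshold:

    ```python
    # Hypothetical in-design pattern replacement: score layout patterns by
    # manufacturing risk, then swap high-risk patterns for functionally
    # equivalent lower-risk ones. Names and scores are illustrative.
    RISK_THRESHOLD = 0.7

    pattern_risk = {          # pattern id -> learned manufacturing-risk score
        "P001": 0.92,
        "P002": 0.35,
        "P003": 0.81,
    }
    equivalents = {           # high-risk pattern -> lower-risk equivalent
        "P001": "P104",
        "P003": "P117",
    }

    design = ["P001", "P002", "P003", "P002"]   # pattern instances in the layout

    fixed = [equivalents.get(p, p) if pattern_risk.get(p, 0) >= RISK_THRESHOLD else p
             for p in design]
    print(fixed)   # ['P104', 'P002', 'P117', 'P002']
    ```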
