Science.gov

Sample records for level set framework

  1. Bayesian inversion for facies detection: An extensible level set framework

    NASA Astrophysics Data System (ADS)

    Cardiff, M.; Kitanidis, P. K.

    2009-10-01

    In many cases, it has been assumed that the variability in hydrologic parameters can be adequately described through a simple geostatistical model with a given variogram. In other cases, variability may be best described as a series of "jumps" in parameter behavior, e.g., those that occur at geologic facies contacts. When using indirect measurements such as pump tests to map such heterogeneity (during inverse modeling), the resulting images of the subsurface are always affected by the assumptions invoked. In this paper, we discuss inversion for parameter fields where prior information suggests that major variability can be described by boundaries between geologic units or facies. To identify such parameter fields, we propose a Bayesian level set inversion framework, which allows for flexible zones of any shape, size, and number. We review formulas for defining facies locations using the level set method and for moving the boundaries between zones using a gradient-based technique that improves fit through iterative deformation of the boundaries. We describe the optimization algorithm employed when multiple level set functions are used to represent a field with more than two facies. We extend these formulas to the inverse problem in a Bayesian context, in which prior information is taken into account and through which measures of uncertainty can be derived. We also demonstrate that the level set method is well suited to joint inversion problems and present a strategy for integrating different data types (such as hydrologic and geophysical) without assuming strict petrophysical relations. Our framework for joint inversion also contrasts with many previous methods in that all data sources (e.g., both hydrologic and geophysical) contribute to boundary delineation simultaneously.
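
    The core idea, a level set function whose sign assigns each grid cell to a facies, so that perturbing the function deforms the facies contact without any explicit boundary parameterization, can be sketched in a few lines. This is an illustrative toy, not the authors' code; the grid, conductivity values, and update step are made up:

```python
import numpy as np

# A level set function phi on a 1-D grid; its zero crossing is the facies contact.
x = np.linspace(0.0, 1.0, 101)
phi = x - 0.5                      # contact at x = 0.5

def facies(phi, k_low=1e-6, k_high=1e-3):
    """Map the sign of the level set to the hydraulic conductivity of two facies."""
    return np.where(phi < 0.0, k_low, k_high)

k = facies(phi)
# A gradient-style update shifts the zero level set, i.e. moves the contact,
# with no need to re-parameterize the boundary itself.
phi_new = phi - 0.2                # contact moves from x = 0.5 to x = 0.7
k_new = facies(phi_new)
```

    In a real inversion, the update applied to phi would come from the gradient of the data-misfit (plus prior) objective rather than a fixed shift.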

  2. An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs

    PubMed Central

    Mosaliganti, Kishore R.; Gelas, Arnaud; Megason, Sean G.

    2013-01-01

    In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations, which restricts future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK) v4 architecture, we implemented a level-set software design that is flexible with respect to both numerical representations (continuous, discrete, and sparse) and grid representations (point-, mesh-, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g., gradients and Hessians) across multiple terms are eliminated. The framework further enables the evolution of multiple level sets for multi-object segmentation and the processing of large datasets. To do so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level set in real time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a developing zebrafish
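
    The container-of-terms design can be sketched as follows. This is a hypothetical Python analogue, not the actual ITK v4 C++ API; the class and method names are invented for illustration. The level-set equation is held as a list of term objects, and the update at a grid point is simply the sum of their contributions, so terms can be added or removed without touching the solver:

```python
class CurvatureTerm:
    """Toy regularization term: 1-D second difference as a curvature proxy."""
    def __init__(self, weight):
        self.weight = weight
    def value(self, phi, i):
        return self.weight * (phi[i - 1] - 2 * phi[i] + phi[i + 1])

class PropagationTerm:
    """Toy constant-speed front propagation term."""
    def __init__(self, speed):
        self.speed = speed
    def value(self, phi, i):
        return self.speed

class TermContainer:
    """The PDE is the sum of whatever terms the container currently holds."""
    def __init__(self):
        self.terms = []
    def add(self, term):
        self.terms.append(term)      # terms can be added...
    def remove(self, term):
        self.terms.remove(term)      # ...or deleted at any point in the evolution
    def update(self, phi, i):
        return sum(t.value(phi, i) for t in self.terms)

phi = [0.0, 1.0, 4.0, 9.0]
eq = TermContainer()
eq.add(CurvatureTerm(weight=0.5))
eq.add(PropagationTerm(speed=1.0))
u = eq.update(phi, 2)        # 0.5*(1 - 8 + 9) + 1.0 = 2.0
eq.remove(eq.terms[-1])      # drop the propagation term mid-evolution
u2 = eq.update(phi, 2)       # curvature contribution alone: 1.0
```

    Caching of shared quantities (gradients, Hessians) would sit inside the container, computed once per point and handed to every term.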

  3. An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs.

    PubMed

    Mosaliganti, Kishore R; Gelas, Arnaud; Megason, Sean G

    2013-01-01

    In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations, which restricts future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK) v4 architecture, we implemented a level-set software design that is flexible with respect to both numerical representations (continuous, discrete, and sparse) and grid representations (point-, mesh-, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g., gradients and Hessians) across multiple terms are eliminated. The framework further enables the evolution of multiple level sets for multi-object segmentation and the processing of large datasets. To do so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level set in real time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a developing zebrafish

  4. Setting Healthcare Priorities at the Macro and Meso Levels: A Framework for Evaluation

    PubMed Central

    Barasa, Edwine W.; Molyneux, Sassy; English, Mike; Cleary, Susan

    2015-01-01

    Background: Priority setting in healthcare is a key determinant of health system performance. However, there is no widely accepted priority setting evaluation framework. We reviewed the literature with the aim of developing and proposing a framework for the evaluation of macro and meso level healthcare priority setting practices. Methods: We systematically searched the Econlit, PubMed, CINAHL, and EBSCOhost databases and supplemented this with searches in Google Scholar, relevant websites, and the reference lists of relevant papers. A total of 31 papers on the evaluation of priority setting were identified. These were supplemented by broader theoretical literature related to the evaluation of priority setting. A conceptual review of selected papers was undertaken. Results: Based on a synthesis of the selected literature, we propose an evaluative framework that requires that priority setting practices at the macro and meso levels of the health system meet the following conditions: (1) Priority setting decisions should incorporate both efficiency and equity considerations and achieve the following outcomes: (a) Stakeholder satisfaction, (b) Stakeholder understanding, (c) Shifted priorities (reallocation of resources), and (d) Implementation of decisions. (2) Priority setting processes should also meet the procedural conditions of (a) Stakeholder engagement, (b) Stakeholder empowerment, (c) Transparency, (d) Use of evidence, (e) Revisions, (f) Enforcement, and (g) Being grounded on community values. Conclusion: Available frameworks for the evaluation of priority setting are mostly grounded on procedural requirements, while few have included outcome requirements. There is, however, increasing recognition of the need to incorporate both consequential and procedural considerations in priority setting practices. In this review, we adopt an integrative approach to develop and propose a framework for the evaluation of priority setting practices at the macro and meso levels that draws from these

  5. A coupled level set framework for bladder wall segmentation with application to MR cystography.

    PubMed

    Duan, Chaijie; Liang, Zhengrong; Bao, Shangliang; Zhu, Hongbin; Wang, Su; Zhang, Guangxiang; Chen, John J; Lu, Hongbing

    2010-03-01

    In this paper, we propose a coupled level set (LS) framework for segmentation of the bladder wall using T1-weighted magnetic resonance (MR) images, with clinical application to virtual cystoscopy (i.e., MR cystography). The framework uses two collaborative LS functions and a regional adaptive clustering algorithm to delineate the bladder wall for wall thickness measurement on a voxel-by-voxel basis. It differs significantly from most pre-existing bladder segmentation work in four aspects. First of all, while most previous work segments only the inner border of the wall, or at most manually segments the outer border, our framework extracts both the inner and outer borders automatically, except that the initial seed point is given by manual selection. Secondly, it is adapted to T1-weighted images with decreased intensities in urine, as opposed to the enhanced intensities in the T2-weighted scenario and in computed tomography. Thirdly, by considering the global image intensity distribution and local intensity contrast, the image energy function defined in the framework is more immune to the inhomogeneity effect, motion artifacts, and image noise. Finally, the bladder wall thickness is measured by the length of the integral path between the two borders, which mimics the electric field lines between two iso-potential surfaces. The framework was tested on six datasets with comparison to the well-known Chan-Vese (C-V) LS model. Five experts blindly scored the segmented inner and outer borders of the presented framework and the C-V model. The scores statistically demonstrated the improvement in detecting the inner and outer borders.
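
    The integral-path thickness idea, treating the two borders as iso-potential surfaces and integrating the length of a path traced along the potential gradient, can be illustrated on a synthetic case where the answer is known. This is a hedged sketch, not the authors' implementation: the analytic potential and step sizes are assumptions. Between concentric circles r = 1 and r = 2, the traced length should recover a wall thickness of 1:

```python
import numpy as np

r1, r2 = 1.0, 2.0   # inner and outer "borders" are concentric circles

def potential(p):
    """0 on the inner border, 1 on the outer (iso-potential surfaces)."""
    return (np.hypot(p[0], p[1]) - r1) / (r2 - r1)

def grad_potential(p, h=1e-5):
    # Central finite differences; a real implementation would use image gradients.
    return np.array([
        (potential(p + [h, 0.0]) - potential(p - [h, 0.0])) / (2 * h),
        (potential(p + [0.0, h]) - potential(p - [0.0, h])) / (2 * h),
    ])

def trace_thickness(start, step=1e-3, max_steps=5000):
    """Follow the potential gradient from the inner border and sum path length."""
    p, length = np.array(start, dtype=float), 0.0
    for _ in range(max_steps):
        if potential(p) >= 1.0:      # reached the outer border
            break
        g = grad_potential(p)
        p = p + step * g / np.linalg.norm(g)
        length += step
    return length

thickness = trace_thickness([1.0, 0.0])   # start on the inner border
```

    For this radially symmetric toy the field line is a straight radius, so the integral path length equals the analytic thickness r2 - r1 = 1.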

  6. A unified variational segmentation framework with a level-set based sparse composite shape prior

    NASA Astrophysics Data System (ADS)

    Liu, Wenyang; Ruan, Dan

    2015-03-01

    Image segmentation plays an essential role in many medical applications. Low SNR conditions and various artifacts make its automation challenging. To achieve robust and accurate segmentation results, a good approach is to introduce proper shape priors. In this study, we present a unified variational segmentation framework that regularizes the target shape with a level-set based sparse composite prior. When the variational problem is solved with a block minimization/descent scheme, the regularizing impact of the sparse composite prior can be observed to adjust to the most recent shape estimate, and may be interpreted as a ‘dynamic’ shape prior, yet without compromising convergence thanks to the unified energy framework. The proposed method was applied to segment the corpus callosum from 2D MR images and the liver from 3D CT volumes. Its performance was evaluated using the Dice Similarity Coefficient and Hausdorff distance, and compared with two benchmark level-set based segmentation methods. The proposed method achieved statistically significantly higher accuracy in both experiments and avoided the faulty inclusion/exclusion of surrounding structures with similar intensities, as opposed to the benchmark methods.

  7. A multi-phase level set framework for source reconstruction in bioluminescence tomography

    SciTech Connect

    Huang Heyu; Qu Xiaochao; Liang Jimin; He Xiaowei; Chen Xueli; Yang Da'an; Tian Jie

    2010-07-01

    We propose a novel multi-phase level set algorithm for solving the inverse problem of bioluminescence tomography. The distribution of the unknown interior source is considered piecewise constant and is represented using multiple level set functions. The localization of the interior bioluminescence source is implemented by tracing the evolution of the level set functions. An alternate search scheme is incorporated to ensure a globally optimal reconstruction. Both numerical and physical experiments are performed to evaluate the developed level set reconstruction method. Reconstruction results show that the proposed method can stably resolve the interior source of bioluminescence tomography.
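
    The multi-phase representation, in which the sign patterns of several level set functions jointly encode a piecewise-constant source, can be sketched as follows. The functions and constants below are made-up illustrations, not the authors' reconstruction code; with two level sets, the four sign patterns partition the domain into up to four regions:

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 201)
phi1 = x                   # negative on the left half
phi2 = np.abs(x) - 0.5     # negative in the middle band

def piecewise_source(phi1, phi2, c):
    """c[(s1, s2)] is the constant source value for sign pattern (s1, s2)."""
    s1 = (phi1 >= 0).astype(int)
    s2 = (phi2 >= 0).astype(int)
    out = np.empty_like(phi1)
    for a in (0, 1):
        for b in (0, 1):
            out[(s1 == a) & (s2 == b)] = c[(a, b)]
    return out

# Hypothetical source strengths for the four regions:
source = piecewise_source(
    phi1, phi2, {(0, 0): 3.0, (0, 1): 0.0, (1, 0): 1.5, (1, 1): 0.0}
)
```

    During reconstruction, evolving phi1 and phi2 moves the region boundaries, which is how the interior source is localized.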

  8. An automatic variational level set segmentation framework for computer aided dental X-rays analysis in clinical environments.

    PubMed

    Li, Shuo; Fevens, Thomas; Krzyzak, Adam; Li, Song

    2006-03-01

    An automatic variational level set segmentation framework for Computer Aided Dental X-rays Analysis (CADXA) in clinical environments is proposed. Designed for clinical environments, the segmentation consists of two stages: a training stage and a segmentation stage. During the training stage, manually chosen representative images are first segmented using hierarchical level set region detection. Then window-based feature extraction followed by principal component analysis (PCA) is applied, and the results are used to train a support vector machine (SVM) classifier. During the segmentation stage, dental X-rays are first classified by the trained SVM. The classifier provides initial contours, close to the correct boundaries, for three coupled level sets driven by a proposed pathological variational model, which greatly accelerates the level set segmentation. Based on the segmentation results and on uncertainty maps built from a proposed uncertainty measurement, a computer aided analysis scheme is applied. The experimental results show that the proposed method is able to provide an automatic pathological segmentation which naturally segments problem areas. Based on the segmentation results, the analysis scheme is able to provide dentists with indications of possible problem areas of bone loss and decay. The experimental results also show that the proposed segmentation framework is able to speed up the level set segmentation in clinical environments.

  9. Improving district level health planning and priority setting in Tanzania through implementing accountability for reasonableness framework: Perceptions of stakeholders

    PubMed Central

    2010-01-01

    Background In 2006, researchers and decision-makers launched a five-year project - Response to Accountable Priority Setting for Trust in Health Systems (REACT) - to improve planning and priority-setting through implementing the Accountability for Reasonableness framework in Mbarali District, Tanzania. The objective of this paper is to explore the acceptability of Accountability for Reasonableness from the perspectives of the Council Health Management Team, local government officials, health workforce and members of user boards and committees. Methods Individual interviews were carried out with different categories of actors and stakeholders in the district. The interview guide consisted of a series of questions, asking respondents to describe their perceptions regarding each condition of the Accountability for Reasonableness framework in terms of priority setting. Interviews were analysed using thematic framework analysis. Documentary data were used to support, verify and highlight the key issues that emerged. Results Almost all stakeholders viewed Accountability for Reasonableness as an important and feasible approach for improving priority-setting and health service delivery in their context. However, a few aspects of Accountability for Reasonableness were seen as too difficult to implement given the socio-political conditions and traditions in Tanzania. Respondents mentioned: budget ceilings and guidelines, low level of public awareness, unreliable and untimely funding, as well as the limited capacity of the district to generate local resources as the major contextual factors that hampered the full implementation of the framework in their context. Conclusion This study was one of the first assessments of the applicability of Accountability for Reasonableness in health care priority-setting in Tanzania. The analysis, overall, suggests that the Accountability for Reasonableness framework could be an important tool for improving priority-setting processes in the

  10. A framework for comparing different image segmentation methods and its use in studying equivalences between level set and fuzzy connectedness frameworks

    PubMed Central

    Ciesielski, Krzysztof Chris; Udupa, Jayaram K.

    2011-01-01

    In the current vast image segmentation literature, there seems to be considerable redundancy among algorithms, while there is a serious lack of methods that would allow their theoretical comparison to establish their similarity, equivalence, or distinctness. In this paper, we make an attempt to fill this gap. To accomplish this goal, we argue that: (1) every digital segmentation algorithm A should have a well-defined continuous counterpart M_A, referred to as its model, which constitutes an asymptotic of A as image resolution goes to infinity; (2) the equality of two such models M_A and M_A′ establishes a theoretical (asymptotic) equivalence of their digital counterparts A and A′. Such a comparison is of full theoretical value only when, for each involved algorithm A, its model M_A is proved to be an asymptotic of A. So far, such proofs do not appear anywhere in the literature, even in the case of algorithms introduced as digitizations of continuous models, like level set segmentation algorithms. The main goal of this article is to explore a line of investigation for formally pairing the digital segmentation algorithms with their asymptotic models, justifying such relations with mathematical proofs, and using the results to compare the segmentation algorithms in this general theoretical framework. As a first step towards this general goal, we prove here that the gradient-based thresholding model M_∇ is the asymptotic of the fuzzy connectedness segmentation algorithm of Udupa and Samarasekera used with the gradient-based affinity A_∇. We also argue that, in a sense, M_∇ is the asymptotic of the original front propagation level set algorithm of Malladi, Sethian, and Vemuri, thus establishing a theoretical equivalence between these two specific algorithms. Experimental evidence of this last equivalence is also provided. PMID:21442014
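
    As a concrete, hypothetical 1-D instance of a gradient-based thresholding model of this kind: the segmented object is the connected component, around a seed, of the set where the image gradient magnitude stays below a threshold θ. A minimal sketch with made-up data (not the paper's formal construction):

```python
def segment(f, seed, theta):
    """Connected component of {|grad f| <= theta} containing the seed (1-D)."""
    n = len(f)
    # Central-difference gradient magnitude, clamped at the ends.
    grad = [abs(f[min(i + 1, n - 1)] - f[max(i - 1, 0)]) / 2.0 for i in range(n)]
    obj = {seed}
    # Grow right, then left, while the gradient stays small; this expansion is
    # the 1-D analogue of a front propagation / fuzzy connectedness growth.
    i = seed
    while i + 1 < n and grad[i + 1] <= theta:
        i += 1
        obj.add(i)
    i = seed
    while i - 1 >= 0 and grad[i - 1] <= theta:
        i -= 1
        obj.add(i)
    return obj

# A flat region of intensity 10 bounded by sharp edges:
f = [0, 0, 10, 10, 10, 10, 0, 0]
region = segment(f, seed=4, theta=1.0)   # the low-gradient interior of the plateau
```

    The growth stops exactly where the gradient magnitude exceeds θ, i.e., at the object's edges, which is the sense in which thresholding the gradient and propagating a front from a seed pick out the same region.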

  11. Interoperability Context-Setting Framework

    SciTech Connect

    Widergren, Steven E.; Hardin, Dave; Ambrosio, Ron; Drummond, R.; Gunther, E.; Gilchrist, Grant; Cohen, David

    2007-01-31

    -conditioning (HVAC) unit up several degrees. The resulting load reduction becomes part of an aggregated response from the electricity service provider to the bulk system operator who is now in a better position to manage total system load with available generation. Looking across the electric system, from generating plants, to transmission substations, to the distribution system, to factories, office parks, and buildings, automation is growing, and the opportunities for unleashing new value propositions are exciting. How can we facilitate this change and do so in a way that ensures the reliability of electric resources for the wellbeing of our economy and security? The GridWise Architecture Council (GWAC) mission is to enable interoperability among the many entities that interact with the electric power system. A good definition of interoperability is, “The capability of two or more networks, systems, devices, applications, or components to exchange information between them and to use the information so exchanged.” As a step in the direction of enabling interoperability, the GWAC proposes a context-setting framework to organize concepts and terminology so that interoperability issues can be identified and debated, improvements to address issues articulated, and actions prioritized and coordinated across the electric power community.

  12. A new framework for detection of initial flat polyp candidates based on a dual level set competition model

    NASA Astrophysics Data System (ADS)

    Wang, Huafeng; Li, Lihong C.; Wei, Xinzhou; Liu, Wanquan; Wang, Yuehai; Liang, Zhengrong

    2017-03-01

    Computer-aided detection (CAD) of colonic polyps plays an important role in advancing computed tomographic colonography (CTC) toward a screening modality. Detection of flat polyps is very challenging because of their plaque-like morphology, which offers limited geometric features for detection purposes. In this paper, we present a novel scheme to automatically detect initial polyp candidates (IPCs) for flat polyps in CTC images. First, tagged materials in CTC images were automatically removed via a partial volume (PV) based electronic colon cleansing (ECC) strategy. We then propose a dual level set competition model to segment the volumetric colon wall from CTC images after ECC. In this model, we developed a comprehensive cost function that takes into consideration the essential characteristics of the colon wall, such as the colon mucosa and weak boundaries, to simulate the mutual interference relationships among those compositions of the colon wall. Furthermore, we introduced a CAD scheme based on thickness mapping of the colon wall. By tracing the gradient direction of the potential field between the inner and outer borders of the colon wall, we focus on local thickness measures for the detection of IPCs. The proposed CAD approach was validated on patient CTC scans with flat polyps. Experimental results indicate that the present scheme is very promising for the detection of colonic flat polyp candidates via CTC.

  13. Monitoring Street-Level Spatial-Temporal Variations of Carbon Monoxide in Urban Settings Using a Wireless Sensor Network (WSN) Framework

    PubMed Central

    Wen, Tzai-Hung; Jiang, Joe-Air; Sun, Chih-Hong; Juang, Jehn-Yih; Lin, Tzu-Shiang

    2013-01-01

    Air pollution has become a severe environmental problem due to urbanization and heavy traffic. Monitoring street-level air quality is an important issue, but most official monitoring stations are installed to monitor large-scale air quality conditions, and their limited spatial resolution cannot reflect the detailed variations in air quality that may be induced by traffic jams. By deploying wireless sensors on crossroads and main roads, this study established a pilot framework for a wireless sensor network (WSN)-based real-time monitoring system to understand street-level spatial-temporal changes of carbon monoxide (CO) in urban settings. The system consists of two major components. The first component is the deployment of wireless sensors: we deployed 44 sensor nodes, 40 transmitter nodes, and four gateway nodes in this study. Each sensor node includes a signal processing module, a CO sensor, and a wireless communication module. In order to capture realistic human exposure to traffic pollutants, all sensors were deployed at a height of 1.5 m on lampposts and traffic signs. The study area covers a total length of 1.5 km of Keelung Road in Taipei City. The other component is a map-based monitoring platform for sensor data visualization and manipulation in time and space. Using this intensive real-time street-level monitoring framework, we compared the spatial-temporal patterns of air pollution in different time periods. Our results capture four CO concentration peaks throughout the day at a location along an arterial road near a traffic sign. The hourly average could reach 5.3 ppm from 5:00 pm to 7:00 pm due to traffic congestion. The proposed WSN-based framework captures detailed ground information and the potential risk of human exposure to traffic-related air pollution. It also provides street-level insights into real-time monitoring for further early warning of air pollution and urban environmental management. PMID:24287859

  14. Monitoring street-level spatial-temporal variations of carbon monoxide in urban settings using a wireless sensor network (WSN) framework.

    PubMed

    Wen, Tzai-Hung; Jiang, Joe-Air; Sun, Chih-Hong; Juang, Jehn-Yih; Lin, Tzu-Shiang

    2013-11-27

    Air pollution has become a severe environmental problem due to urbanization and heavy traffic. Monitoring street-level air quality is an important issue, but most official monitoring stations are installed to monitor large-scale air quality conditions, and their limited spatial resolution cannot reflect the detailed variations in air quality that may be induced by traffic jams. By deploying wireless sensors on crossroads and main roads, this study established a pilot framework for a wireless sensor network (WSN)-based real-time monitoring system to understand street-level spatial-temporal changes of carbon monoxide (CO) in urban settings. The system consists of two major components. The first component is the deployment of wireless sensors: we deployed 44 sensor nodes, 40 transmitter nodes, and four gateway nodes in this study. Each sensor node includes a signal processing module, a CO sensor, and a wireless communication module. In order to capture realistic human exposure to traffic pollutants, all sensors were deployed at a height of 1.5 m on lampposts and traffic signs. The study area covers a total length of 1.5 km of Keelung Road in Taipei City. The other component is a map-based monitoring platform for sensor data visualization and manipulation in time and space. Using this intensive real-time street-level monitoring framework, we compared the spatial-temporal patterns of air pollution in different time periods. Our results capture four CO concentration peaks throughout the day at a location along an arterial road near a traffic sign. The hourly average could reach 5.3 ppm from 5:00 pm to 7:00 pm due to traffic congestion. The proposed WSN-based framework captures detailed ground information and the potential risk of human exposure to traffic-related air pollution. It also provides street-level insights into real-time monitoring for further early warning of air pollution and urban environmental management.

  15. Level Set Strategy for SCFT

    NASA Astrophysics Data System (ADS)

    Ouaknin, Gaddiel

    This thesis investigates the design of sharp interface level set methods in the context of self-consistent field theory (SCFT) in polymer physics. SCFT computes the structure and energy of inhomogeneous self-assembling polymers at thermodynamic equilibrium. Level set methods are based on an implicit representation of free boundaries, which enables motions with arbitrary changes in topology. In addition, recent advances on how to impose Robin boundary conditions enable the study of free boundary problems of interest to the self-assembly community. We first present a computational framework, encoded on a forest of quad/oct-trees in a parallel environment. We then present results of imposing sharp Neumann boundary conditions, as first proposed by de Gennes, which enables SCFT computations of meaningful quantities at the boundary of irregular geometries. We then introduce the concept of the functional level-set derivative in the context of SCFT and rigorously derive expressions for the change of energy of a diblock copolymer with respect to an enclosing shape. The level-set derivative is then used to embed SCFT into a variable shape simulator, in which the internal structure and the enclosing shape are coupled together and evolve in tandem in order to reduce the energy of the diblock copolymer. Finally, an algorithm for solving the inverse problem for directed self-assembly is presented.

  16. A framework and a set of tools called Nutting models to estimate retention capacities and loads of nitrogen and phosphorus in rivers at catchment and national level (France)

    NASA Astrophysics Data System (ADS)

    Legeay, Pierre-Louis; Moatar, Florentina; Dupas, Rémi; Gascuel-Odoux, Chantal

    2016-04-01

    The Nutting-N and Nutting-P models (Dupas et al., 2013, 2015) have been developed to estimate nitrogen and phosphorus nonpoint-source emissions to surface water using readily available data. These models were inspired by the US model SPARROW (Smith et al., 1997) and the European model GREEN (Grizzetti et al., 2008), i.e., statistical approaches that link nitrogen and phosphorus surplus to catchment land and river characteristics in order to estimate the relative retention capacities of catchments. The nutrient load (L) at the outlet of each catchment is expressed as: L = R * (B * DS + PS) [1], where DS is diffuse sources (i.e., surplus in kg ha-1 yr-1 for N, and P storage in soil for P), PS is point sources of domestic and industrial origin (kg ha-1 yr-1), and R and B are the river system and basin reduction factors, respectively; they combine observed variables and calibrated parameters. The model was calibrated on independent catchments for the 2005-2009 and 2008-2012 periods. Variables were selected according to the Bayesian Information Criterion (BIC) in order to optimize the predictive performance of the models. From these basic models, several improvements have been made to build a framework and a set of tools: 1) a routing module has been added in order to improve estimations for 4th- and 5th-order streams, i.e., upscaling the basic Nutting approach; 2) a territorial module, in order to test the models at local scale (from 500 to 5000 km²); 3) a seasonal estimation has been investigated. The basic approach as well as the territorial application will be illustrated. These tools allow water managers to identify at-risk areas where high nutrient loads are estimated, as well as areas where retention is potentially high and can buffer high nutrient sources. References: Dupas R., Curie F., Gascuel-Odoux C., Moatar F., Delmas M., Parnaudeau V., Durand P., 2013. Assessing N emissions in surface water at the national level: Comparison of country-wide vs. regionalized models. Science of the Total Environment.
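
    Equation [1] is a simple arithmetic composition, shown here with made-up numbers (the coefficient values are illustrative only; in the Nutting models R and B are calibrated reduction factors, typically in (0, 1]):

```python
def nutrient_load(DS, PS, R, B):
    """Load at the catchment outlet, L = R * (B * DS + PS).

    DS: diffuse sources (e.g. N surplus, kg/ha/yr)
    PS: point sources of domestic/industrial origin (kg/ha/yr)
    B:  basin reduction factor applied to diffuse sources only
    R:  river system reduction factor applied to the combined input
    """
    return R * (B * DS + PS)

# Hypothetical catchment: 30 kg/ha/yr diffuse surplus, 2 kg/ha/yr point sources,
# 75% basin retention (B = 0.25) and 20% in-river retention (R = 0.8).
L = nutrient_load(DS=30.0, PS=2.0, R=0.8, B=0.25)   # 0.8 * (7.5 + 2.0) = 7.6
```

    Note the asymmetry the equation encodes: diffuse sources are attenuated twice (in the basin, then in the river network), while point sources discharge directly to the river and are attenuated only by R.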

  17. Conceptual frameworks for setting environmental standards.

    PubMed

    Philipp, R

    1996-01-01

    Following the Second European Conference on Environment and Health, held from 20 to 22 June 1994 in Helsinki, the World Health Organization (WHO) established a National Environmental Health Action Plan pilot project. During 1995, and as part of its work for this project with the WHO European Environmental Health Committee, the UK Royal Commission on Environmental Pollution began to seek evidence for the basis of setting environmental standards and to ask whether a more consistent and robust basis could be found for establishing them. This paper explores the conceptual frameworks needed to help establish policy and address practical questions associated with different pollutants, exposures and environmental settings. It addresses sustainable development, inter-generational equity and environmental quality, the European Charter on Environment and Health, the Treaty of Maastricht, economic, educational and training issues, risk assessment, the role of environmental epidemiology, and definitions of environmental quality objectives, environmental health indicators, environmental epidemiology and environmental impact assessment.

  18. An adaptive level set method

    SciTech Connect

    Milne, Roger Brent

    1995-12-01

    This thesis describes a new method for the numerical solution of partial differential equations of the parabolic type on an adaptively refined mesh in two or more spatial dimensions. The method is motivated and developed in the context of the level set formulation for the curvature dependent propagation of surfaces in three dimensions. In that setting, it realizes the multiple advantages of decreased computational effort, localized accuracy enhancement, and compatibility with problems containing a range of length scales.

  19. High-Level Application Framework for LCLS

    SciTech Connect

    Chu, P; Chevtsov, S.; Fairley, D.; Larrieu, C.; Rock, J.; Rogind, D.; White, G.; Zalazny, M.; /SLAC

    2008-04-22

    A framework for high level accelerator application software is being developed for the Linac Coherent Light Source (LCLS). The framework is based on plug-in technology developed by an open source project, Eclipse. Many existing functionalities provided by Eclipse are available to high-level applications written within this framework. The framework also contains static data storage configuration and dynamic data connectivity. Because the framework is Eclipse-based, it is highly compatible with any other Eclipse plug-ins. The entire infrastructure of the software framework will be presented. Planned applications and plug-ins based on the framework are also presented.

  20. A Framework for Describing Interlanguages in Multilingual Settings.

    ERIC Educational Resources Information Center

    Tenjoh-Okwen, Thomas

    1989-01-01

    Outlines a contrastive analysis model and a non-contrastive analysis model for studying interlanguage in strictly bilingual settings, and suggests a bidimensional framework, including both linguistic and curricular components, for studying interlanguage in multilingual settings. (21 references) (CB)

  1. Exploring the UMLS: a rough sets based theoretical framework.

    PubMed

    Srinivasan, P

    1999-01-01

    The Unified Medical Language System (UMLS) [1] has a unique and leading position in the evolution of thesauri and metathesauri. Features that set it apart are: its composition from more than fifty component health care vocabularies; the sophisticated UMLS ontology linking the Metathesaurus with structures such as the Semantic Network and the SPECIALIST lexicon; and the high level of social collaboration invested in its construction and growth. It is our thesis that in order to successfully harness such a complex vocabulary for text retrieval we need sophisticated methods derived from a deeper understanding of the UMLS system. Thus we propose a theoretical framework based on the theory of rough sets, that supports the systematic and exploratory investigation of the UMLS Metathesaurus for text retrieval. Our goal is to make it more feasible for individuals such as patients and health care professionals to access relevant information at the point of need.

  2. Can frameworks inform knowledge about health policy processes? Reviewing health policy papers on agenda setting and testing them against a specific priority-setting framework.

    PubMed

    Walt, Gill; Gilson, Lucy

    2014-12-01

    This article systematically reviews a set of health policy papers on agenda setting and tests them against a specific priority-setting framework. The article applies the Shiffman and Smith framework in extracting and synthesizing data from an existing set of papers, purposively identified for their relevance and systematically reviewed. Its primary aim is to assess how far the component parts of the framework help to identify the factors that influence the agenda setting stage of the policy process at global and national levels. It seeks to advance the field and inform the development of theory in health policy by examining the extent to which the framework offers a useful approach for organizing and analysing data. Applying the framework retrospectively to the selected set of papers, it aims to explore influences on priority setting and to assess how far the framework might gain from further refinement or adaptation, if used prospectively. In pursuing its primary aim, the article also demonstrates how the approach of framework synthesis can be used in health policy analysis research. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014; all rights reserved.

  3. Standard Setting to an International Reference Framework: Implications for Theory and Practice

    ERIC Educational Resources Information Center

    Lim, Gad S.; Geranpayeh, Ardeshir; Khalifa, Hanan; Buckendahl, Chad W.

    2013-01-01

    Standard setting theory has largely developed with reference to a typical situation, determining a level or levels of performance for one exam for one context. However, standard setting is now being used with international reference frameworks, where some parameters and assumptions of classical standard setting do not hold. We consider the…

  5. Towards a Framework for Change Detection in Data Sets

    NASA Astrophysics Data System (ADS)

    Böttcher, Mirko; Nauck, Detlef; Ruta, Dymitr; Spott, Martin

    Since the world, with its markets, innovations and customers, is changing faster than ever before, the key to survival for businesses is the ability to detect, assess and respond to changing conditions rapidly and intelligently. Discovering changes and reacting to or acting upon them before others do has therefore become a strategic issue for many companies. However, existing data analysis techniques are insufficient for this task since they typically assume that the domain under consideration is stable over time. This paper presents a framework that detects changes within a data set at virtually any level of granularity. The underlying idea is to derive a rule-based description of the data set at different points in time and to subsequently analyse how these rules change. Nevertheless, further techniques are required to assist the data analyst in interpreting and assessing these changes. Therefore the framework also contains methods to discard rules that are non-drivers for change and to assess the interestingness of detected changes.

  6. On reinitializing level set functions

    NASA Astrophysics Data System (ADS)

    Min, Chohong

    2010-04-01

    In this paper, we consider reinitializing level set functions through the equation ϕ_t + sgn(ϕ_0)(‖∇ϕ‖ - 1) = 0 [16]. The method of Russo and Smereka [11] is adopted for the spatial discretization of the equation: simply speaking, a second-order ENO finite difference with subcell resolution near the interface. Our main interest is the temporal discretization of the equation. We compare three temporal discretizations: the second-order Runge-Kutta method, the forward Euler method, and a Gauss-Seidel iteration of the forward Euler method. The fact that the time in the equation is fictitious suggests the hypothesis that all the temporal discretizations yield the same result in their stationary states. The fact that the absolute stability region of the forward Euler method is not wide enough to include all the eigenvalues of the linearized semi-discrete system of the second-order ENO spatial discretization suggests another hypothesis, namely that the forward Euler temporal discretization should trigger numerical instability. Our results in this paper contradict both hypotheses. The Runge-Kutta and Gauss-Seidel methods attain second-order accuracy, and the forward Euler method converges with order between one and two. Examining all their properties, we conclude that the Gauss-Seidel method is the best of the three: compared to Runge-Kutta, it is twice as fast and requires half the memory at the same accuracy.
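The reinitialization equation above can be sketched in one dimension with a forward Euler time step and a Godunov upwind gradient (a minimal sketch, not the subcell-resolution ENO scheme of Russo and Smereka used in the paper):

```python
import numpy as np

# 1D sketch: solve phi_t + sgn(phi0) * (|phi_x| - 1) = 0 to steady state
# with forward Euler in time and a Godunov upwind one-sided gradient.
def reinitialize(phi0, dx, steps=200, cfl=0.5):
    phi = phi0.copy()
    s = phi0 / np.sqrt(phi0**2 + dx**2)          # smoothed sign of phi0
    dt = cfl * dx
    for _ in range(steps):
        a = np.diff(phi, prepend=phi[0]) / dx    # backward differences
        b = np.diff(phi, append=phi[-1]) / dx    # forward differences
        # Godunov's scheme picks the upwind one-sided difference:
        grad_pos = np.sqrt(np.maximum(np.maximum(a, 0)**2, np.minimum(b, 0)**2))
        grad_neg = np.sqrt(np.maximum(np.minimum(a, 0)**2, np.maximum(b, 0)**2))
        grad = np.where(s > 0, grad_pos, grad_neg)
        phi = phi - dt * s * (grad - 1.0)
    return phi

x = np.linspace(0.0, 1.0, 101)
phi0 = 5.0 * (x - 0.3)                 # correct zero crossing, wrong slope
phi = reinitialize(phi0, x[1] - x[0])
# phi now approximates the signed distance x - 0.3 away from the boundaries,
# while the zero crossing stays at x = 0.3.
```

Replacing the single Euler update with a sweep that reuses already-updated neighbours gives the Gauss-Seidel variant the paper recommends.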

  7. DISJUNCTIVE NORMAL LEVEL SET: AN EFFICIENT PARAMETRIC IMPLICIT METHOD

    PubMed Central

    Mesadi, Fitsum; Cetin, Mujdat; Tasdizen, Tolga

    2016-01-01

    Level set methods are widely used for image segmentation because of their capability to handle topological changes. In this paper, we propose a novel parametric level set method called the Disjunctive Normal Level Set (DNLS), and apply it to both two-phase (single object) and multiphase (multi-object) image segmentations. The DNLS is formed by a union of polytopes, which themselves are formed by intersections of half-spaces. The proposed level set framework has the following major advantages over other level set methods in the literature. First, segmentation using DNLS converges much faster. Second, the DNLS level set function remains regular throughout its evolution. Third, the proposed multiphase version of the DNLS is less sensitive to initialization, and its computational cost and memory requirement remain almost constant as the number of objects to be simultaneously segmented grows. The experimental results show the potential of the proposed method.

  8. DISJUNCTIVE NORMAL LEVEL SET: AN EFFICIENT PARAMETRIC IMPLICIT METHOD.

    PubMed

    Mesadi, Fitsum; Cetin, Mujdat; Tasdizen, Tolga

    2016-09-01

    Level set methods are widely used for image segmentation because of their capability to handle topological changes. In this paper, we propose a novel parametric level set method called the Disjunctive Normal Level Set (DNLS), and apply it to both two-phase (single object) and multiphase (multi-object) image segmentations. The DNLS is formed by a union of polytopes, which themselves are formed by intersections of half-spaces. The proposed level set framework has the following major advantages over other level set methods in the literature. First, segmentation using DNLS converges much faster. Second, the DNLS level set function remains regular throughout its evolution. Third, the proposed multiphase version of the DNLS is less sensitive to initialization, and its computational cost and memory requirement remain almost constant as the number of objects to be simultaneously segmented grows. The experimental results show the potential of the proposed method.
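The disjunctive-normal construction above (a union of polytopes, each an intersection of half-spaces) can be sketched with smooth logic gates; the half-space weights and sharpness below are invented for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dnls(points, polytopes):
    """Disjunctive-normal implicit function evaluated on a set of points.

    polytopes -- list of polytopes; each polytope is a list of (w, b)
    pairs, one per half-space w . x + b > 0 (conjunction of half-spaces);
    the shape is the union (disjunction) of the polytopes.
    Returns values in (0, 1); > 0.5 means inside the shape.
    """
    outside = np.ones(len(points))
    for polytope in polytopes:
        inside = np.ones(len(points))
        for w, b in polytope:
            inside *= sigmoid(points @ np.asarray(w) + b)  # AND of half-spaces
        outside *= 1.0 - inside                            # OR of polytopes
    return 1.0 - outside

# One polytope: the unit square, four half-spaces with sharpness 20.
square = [[((20.0, 0.0), 0.0),     # x > 0
           ((-20.0, 0.0), 20.0),   # x < 1
           ((0.0, 20.0), 0.0),     # y > 0
           ((0.0, -20.0), 20.0)]]  # y < 1
vals = dnls(np.array([[0.5, 0.5], [2.0, 2.0]]), square)
# vals[0] > 0.5 (inside the square), vals[1] < 0.5 (outside)
```

Because the representation is parametric (only the (w, b) pairs evolve), its cost does not grow with the grid, which is the efficiency argument the abstract makes.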

  9. Goal setting and action planning in the rehabilitation setting: development of a theoretically informed practice framework.

    PubMed

    Scobbie, Lesley; Dixon, Diane; Wyke, Sally

    2011-05-01

    Setting and achieving goals is fundamental to rehabilitation practice but has been criticized for being a-theoretical and the key components of replicable goal-setting interventions are not well established. To describe the development of a theory-based goal setting practice framework for use in rehabilitation settings and to detail its component parts. Causal modelling was used to map theories of behaviour change onto the process of setting and achieving rehabilitation goals, and to suggest the mechanisms through which patient outcomes are likely to be affected. A multidisciplinary task group developed the causal model into a practice framework for use in rehabilitation settings through iterative discussion and implementation with six patients. Four components of a goal-setting and action-planning practice framework were identified: (i) goal negotiation, (ii) goal identification, (iii) planning, and (iv) appraisal and feedback. The variables hypothesized to effect change in patient outcomes were self-efficacy and action plan attainment. A theory-based goal setting practice framework for use in rehabilitation settings is described. The framework requires further development and systematic evaluation in a range of rehabilitation settings.

  10. Fast Sparse Level Sets on Graphics Hardware.

    PubMed

    Jalba, Andrei C; van der Laan, Wladimir J; Roerdink, Jos B T M

    2013-01-01

    The level-set method is one of the most popular techniques for capturing and tracking deformable interfaces. Although level sets have demonstrated great potential in visualization and computer graphics applications, such as surface editing and physically based modeling, their use for interactive simulations has been limited due to the high computational demands involved. In this paper, we address this computational challenge by leveraging the increased computing power of graphics processors, to achieve fast simulations based on level sets. Our efficient, sparse GPU level-set method is substantially faster than other state-of-the-art, parallel approaches on both CPU and GPU hardware. We further investigate its performance through a method for surface reconstruction, based on GPU level sets. Our novel multiresolution method for surface reconstruction from unorganized point clouds compares favorably with recent, existing techniques and other parallel implementations. Finally, we point out that both level-set computations and rendering of level-set surfaces can be performed at interactive rates, even on large volumetric grids. Therefore, many applications based on level sets can benefit from our sparse level-set method.
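The core idea behind sparse level sets, updating only a narrow band of cells around the zero level set, can be sketched on the CPU as follows (a minimal illustration, not the GPU data structures of the paper):

```python
import numpy as np

# Update only cells in a narrow band around the zero level set; the rest
# of the grid is untouched, which is what makes sparse schemes fast.
def narrow_band_step(phi, speed, dx, dt, band_width):
    band = np.abs(phi) < band_width * dx                 # active cells
    gy, gx = np.gradient(phi, dx)
    grad = np.sqrt(gx**2 + gy**2)
    phi_new = phi.copy()
    phi_new[band] -= dt * speed * grad[band]             # phi_t + F|grad phi| = 0
    return phi_new, band

n = 64
dx = 1.0 / n
y, x = np.mgrid[0:n, 0:n] * dx
phi = np.sqrt((x - 0.5)**2 + (y - 0.5)**2) - 0.25        # circle, radius 0.25
phi1, band = narrow_band_step(phi, speed=1.0, dx=dx, dt=0.5 * dx, band_width=4)
print(band.mean())   # fraction of active cells, a small part of the grid
```

On a GPU the same active-cell mask becomes a compact work list, so both the update and the rendering touch only the band.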

  11. International Review of Frameworks for Standard Setting & Labeling Development

    SciTech Connect

    Zhou, Nan; Khanna, Nina Zheng; Fridley, David; Romankiewicz, John

    2012-09-01

    As appliance energy efficiency standards and labeling (S&L) programs reach a broader geographic and product scope, a series of sophisticated and complex technical and economic analyses have been adopted by different countries in the world to support and enhance these growing S&L programs. The initial supporting techno-economic and impact analyses for S&L development make up a defined framework and process for setting and developing appropriate appliance efficiency standards and labeling programs. This report reviews in-depth the existing framework for standards setting and label development in the well-established programs of the U.S., Australia and the EU to identify and evaluate major trends in how and why key analyses are undertaken and to understand major similarities and differences between each of the frameworks.

  12. A Lagrangian particle level set method

    NASA Astrophysics Data System (ADS)

    Hieber, Simone E.; Koumoutsakos, Petros

    2005-11-01

    We present a novel particle level set method for capturing interfaces. The level set equation is solved in a Lagrangian frame using particles that carry the level set information. A key aspect of the method involves a consistent remeshing procedure for the regularization of the particle locations when the particle map gets distorted by the advection field. The Lagrangian description of the level set method is inherently adaptive and exact in the case of solid body motions. The efficiency and accuracy of the method is demonstrated in several benchmark problems in two and three dimensions involving pure advection and curvature induced motion of the interface. The simplicity of the particle description is shown to be well suited for real time simulations of surfaces involving cutting and reconnection as in virtual surgery environments.
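The Lagrangian idea, particles carrying the level set values, advection, then remeshing onto a regular grid, can be sketched in one dimension (using plain linear interpolation rather than the paper's remeshing kernels):

```python
import numpy as np

# Particles carry the level set values phi_p; advect them with the flow,
# then remesh (reinterpolate) the values back onto the regular grid.
def advect_and_remesh(xp, phi_p, velocity, dt, grid):
    xp = xp + dt * velocity(xp)          # Lagrangian particle advection
    return np.interp(grid, xp, phi_p)    # remesh particle values to grid

grid = np.linspace(0.0, 1.0, 101)
phi_p = grid - 0.4                       # zero level set at x = 0.4
u = lambda x: np.full_like(x, 0.2)       # uniform rightward flow
phi_new = advect_and_remesh(grid.copy(), phi_p, u, dt=0.5, grid=grid)
# The interface moves from x = 0.4 to x = 0.5; solid-body translation is
# exact in the Lagrangian frame, up to the remeshing interpolation.
```

The remeshing step is the consistency device the abstract highlights: it restores a regular particle map after the flow has distorted it.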

  13. Interpretable Decision Sets: A Joint Framework for Description and Prediction

    PubMed Central

    Lakkaraju, Himabindu; Bach, Stephen H.; Jure, Leskovec

    2016-01-01

    One of the most important obstacles to deploying predictive models is the fact that humans do not understand and trust them. Knowing which variables are important in a model’s prediction and how they are combined can be very powerful in helping people understand and trust automatic decision making systems. Here we propose interpretable decision sets, a framework for building predictive models that are highly accurate, yet also highly interpretable. Decision sets are sets of independent if-then rules. Because each rule can be applied independently, decision sets are simple, concise, and easily interpretable. We formalize decision set learning through an objective function that simultaneously optimizes accuracy and interpretability of the rules. In particular, our approach learns short, accurate, and non-overlapping rules that cover the whole feature space and pay attention to small but important classes. Moreover, we prove that our objective is a non-monotone submodular function, which we efficiently optimize to find a near-optimal set of rules. Experiments show that interpretable decision sets are as accurate at classification as state-of-the-art machine learning techniques. They are also three times smaller on average than rule-based models learned by other methods. Finally, results of a user study show that people are able to answer multiple-choice questions about the decision boundaries of interpretable decision sets and write descriptions of classes based on them faster and more accurately than with other rule-based models that were designed for interpretability. Overall, our framework provides a new approach to interpretable machine learning that balances accuracy, interpretability, and computational efficiency. PMID:27853627

  14. Interpretable Decision Sets: A Joint Framework for Description and Prediction.

    PubMed

    Lakkaraju, Himabindu; Bach, Stephen H; Jure, Leskovec

    2016-08-01

    One of the most important obstacles to deploying predictive models is the fact that humans do not understand and trust them. Knowing which variables are important in a model's prediction and how they are combined can be very powerful in helping people understand and trust automatic decision making systems. Here we propose interpretable decision sets, a framework for building predictive models that are highly accurate, yet also highly interpretable. Decision sets are sets of independent if-then rules. Because each rule can be applied independently, decision sets are simple, concise, and easily interpretable. We formalize decision set learning through an objective function that simultaneously optimizes accuracy and interpretability of the rules. In particular, our approach learns short, accurate, and non-overlapping rules that cover the whole feature space and pay attention to small but important classes. Moreover, we prove that our objective is a non-monotone submodular function, which we efficiently optimize to find a near-optimal set of rules. Experiments show that interpretable decision sets are as accurate at classification as state-of-the-art machine learning techniques. They are also three times smaller on average than rule-based models learned by other methods. Finally, results of a user study show that people are able to answer multiple-choice questions about the decision boundaries of interpretable decision sets and write descriptions of classes based on them faster and more accurately than with other rule-based models that were designed for interpretability. Overall, our framework provides a new approach to interpretable machine learning that balances accuracy, interpretability, and computational efficiency.
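A decision set can be sketched as a collection of independent if-then rules; the rules and records below are invented for illustration, and the tie-break shown (majority vote among fired rules) is one simple choice rather than the paper's learned objective:

```python
# Invented rules over invented records; each rule is independent and
# human-readable, the defining property of a decision set.
rules = [
    (lambda r: r["age"] < 30 and r["exercise"] == "high", "low-risk"),
    (lambda r: r["bmi"] > 35, "high-risk"),
]

def predict(record, rules, default="medium-risk"):
    fired = [label for cond, label in rules if cond(record)]
    if not fired:
        return default                   # no rule covers this record
    # Tie-break when several rules fire: majority vote here; the paper
    # instead prefers the individually most accurate rule.
    return max(set(fired), key=fired.count)

print(predict({"age": 25, "exercise": "high", "bmi": 22}, rules))  # low-risk
print(predict({"age": 50, "exercise": "low", "bmi": 40}, rules))   # high-risk
print(predict({"age": 50, "exercise": "low", "bmi": 25}, rules))   # medium-risk
```

Because no rule depends on another having been checked first, each line can be read on its own, unlike a decision list or tree path.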

  15. Novel gene sets improve set-level classification of prokaryotic gene expression data.

    PubMed

    Holec, Matěj; Kuželka, Ondřej; Železný, Filip

    2015-10-28

    Set-level classification of gene expression data has received significant attention recently. In this setting, high-dimensional vectors of features corresponding to genes are converted into lower-dimensional vectors of features corresponding to biologically interpretable gene sets. The dimensionality reduction brings the promise of a decreased risk of overfitting, potentially resulting in improved accuracy of the learned classifiers. However, recent empirical research has not confirmed this expectation. Here we hypothesize that the reported unfavorable classification results in the set-level framework were due to the adoption of unsuitable gene sets, defined typically on the basis of the Gene Ontology and the KEGG database of metabolic networks. We explore an alternative approach to defining gene sets, based on regulatory interactions, which we expect to collect genes with more correlated expression. We hypothesize that such more correlated gene sets will enable learning of more accurate classifiers. We define two families of gene sets using information on regulatory interactions, and evaluate them on phenotype-classification tasks using public prokaryotic gene expression data sets. From each of the two gene-set families, we first select the best-performing subtype. The two selected subtypes are then evaluated on independent (testing) data sets against state-of-the-art gene sets and against the conventional gene-level approach. The novel gene sets are indeed more correlated than the conventional ones, and lead to significantly more accurate classifiers. Novel gene sets defined on the basis of regulatory interactions thus improve set-level classification of gene expression data. The experimental scripts and other material needed to reproduce the experiments are available at http://ida.felk.cvut.cz/novelgenesets.tar.gz.
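The set-level transform described above can be sketched as a simple aggregation of per-gene features into per-set features; the gene sets and expression values below are invented, and mean aggregation is one common choice:

```python
import numpy as np

def set_level_features(expression, gene_index, gene_sets):
    """Convert a samples-x-genes matrix into a samples-x-sets matrix by
    averaging the expression of the genes in each set."""
    cols = []
    for genes in gene_sets.values():
        idx = [gene_index[g] for g in genes]
        cols.append(expression[:, idx].mean(axis=1))
    return np.column_stack(cols)

expr = np.array([[1.0, 2.0, 3.0, 4.0],
                 [4.0, 3.0, 2.0, 1.0]])          # 2 samples, 4 genes
gene_index = {"gA": 0, "gB": 1, "gC": 2, "gD": 3}
gene_sets = {"regulonX": ["gA", "gB"],           # e.g. genes sharing a regulator
             "regulonY": ["gC", "gD"]}
features = set_level_features(expr, gene_index, gene_sets)
print(features)  # [[1.5 3.5]
                 #  [3.5 1.5]]
```

Any standard classifier can then be trained on `features` instead of `expr`; the paper's point is that sets of co-regulated genes lose less signal under this averaging than Gene Ontology or KEGG sets do.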

  16. Live level set: A hybrid method of livewire and level set for medical image segmentation

    PubMed Central

    Yao, Jianhua; Chen, David

    2008-01-01

    Livewire and level set are popular methods for medical image segmentation. In this article, the authors propose a hybrid method of livewire and level set, termed the live level set (LLS). The LLS replaces the one graph-update iteration in the classic livewire with two iterations of graph updates. The first iteration generates an initial contour for a level set computation. The level set distance is then factored back into the cost function in the second iteration of the graph update. The authors validated the LLS using synthetic images. The results show that the performance of the LLS is superior to both the classic livewire and traditional level set methods in terms of accuracy, reproducibility, smoothness and running time. They also qualitatively evaluated the LLS using real clinical data. PMID:18841864

  17. Level Set Segmentation of Lumbar Vertebrae Using Appearance Models

    NASA Astrophysics Data System (ADS)

    Fritscher, Karl; Leber, Stefan; Schmölz, Werner; Schubert, Rainer

    For the planning of surgical interventions of the spine, exact knowledge of the 3D shape and the local bone quality of vertebrae is of great importance in order to estimate the anchorage strength of screws or implants. As a prerequisite for quantitative analysis, a method for objective and therefore automated segmentation of vertebrae is needed. In this paper, a framework for the automatic segmentation of vertebrae using 3D appearance models in a level set framework is presented. In this framework, model information as well as gradient information and probabilities of pixel intensities at object edges in the unseen image are used. The method is tested on 29 lumbar vertebrae, leading to accurate results, which can be useful for surgical planning and further analysis of the local bone quality.

  18. Setting the stage for master's level success

    NASA Astrophysics Data System (ADS)

    Roberts, Donna

    Comprehensive reading, writing, research, and study skills play a critical role in a graduate student's success and ability to contribute effectively to a field of study. The literature indicated a need to support graduate student success in the areas of mentoring and navigation, as well as research and writing. The purpose of this two-phase mixed-methods explanatory study was to examine factors that characterize student success at the Master's level in the fields of education, sociology and social work. The study was grounded in a transformational learning framework which focused on three levels of learning: technical knowledge, practical or communicative knowledge, and emancipatory knowledge. The study included two data collection points. Phase one consisted of a Master's Level Success questionnaire that was sent via Qualtrics to graduate-level students at three colleges and universities in the Central Valley of California: a California State University campus, a University of California campus, and a private college campus. The results of the chi-square analysis indicated that seven questionnaire items were significant, with p values less than .05. Phase two of the data collection included semi-structured interviews, from which three themes emerged using Dedoose software: (1) the need for more language and writing support at the Master's level, (2) the need for mentoring, especially for second-language learners, and (3) utilizing the strong influence of faculty in student success. It is recommended that institutions continually assess and strengthen their programs to meet the full range of learners and to support students to degree completion.

  19. Level-Set-Segmentierung von Rattenhirn MRTs

    NASA Astrophysics Data System (ADS)

    Eiben, Björn; Kunz, Dietmar; Pietrzyk, Uwe; Palm, Christoph

    In this work, the segmentation of brain tissue from head images of rats using level set methods is proposed. To this end, a two-dimensional, contrast-based approach is extended into a three-dimensional segmenter locally adapted to the image intensity. It is shown that this true 3D approach takes better account of local image structures. In particular, magnetic resonance images (MRIs) with global brightness gradients, caused for example by surface coils, can be segmented more reliably in this way and without additional preprocessing steps. The performance of the algorithm is demonstrated experimentally on three rat brain MRIs.

  20. Framework for State-Level Renewable Energy Market Potential Studies

    SciTech Connect

    Kreycik, C.; Vimmerstedt, L.; Doris, E.

    2010-01-01

    State-level policymakers are relying on estimates of the market potential for renewable energy resources as they set goals and develop policies to accelerate the development of these resources. Therefore, accuracy of such estimates should be understood and possibly improved to appropriately support these decisions. This document provides a framework and next steps for state officials who require estimates of renewable energy market potential. The report gives insight into how to conduct a market potential study, including what supporting data are needed and what types of assumptions need to be made. The report distinguishes between goal-oriented studies and other types of studies, and explains the benefits of each.

  1. Simulation of Etching Profiles Using Level Sets

    NASA Technical Reports Server (NTRS)

    Hwang, Helen; Govindan, T. R.; Meyyappan, M.; Arnold, James O. (Technical Monitor)

    1998-01-01

    Using plasma discharges to etch trenches and via holes in substrates is an important process in semiconductor manufacturing. Ion-enhanced etching involves both neutral fluxes, which are isotropic, and ion fluxes, which are anisotropic. The angular distributions of the ions determine the degree of vertical etch, while the amount of the neutral fluxes determines the etch rate. We have developed a 2D profile evolution simulation which uses level set methods to model the plasma-substrate interface. Using level sets instead of traditional string models avoids the use of complicated delooping algorithms. The simulation calculates the etch rate based on the fluxes and distribution functions of both ions and neutrals. We will present etching profiles of Si substrates in low-pressure (tens of mTorr) Ar/Cl2 discharges for a variety of incident ion angular distributions. Both ion and neutral re-emission fluxes are included in the calculation of the etch rate, and their contributions to the total etch profile will be demonstrated. In addition, we will show RIE lag effects as a function of different trench aspect ratios. (For sample profiles, please see http://www.ipt.arc.nasa.gov/hwangfig1.html)

  3. A Framework for Resolving Geotechnical Parameters in Hardrock Setting

    NASA Astrophysics Data System (ADS)

    Kassam, A.; Milkereit, B.

    2016-12-01

    A new framework for resolving geotechnical parameters in hardrock environments has been established by linking the seismic parameter (sp), bulk modulus (k), and density (d) in a 3D space. Conventional empirical relations (e.g., Birch, Gardner, and Nafe-Drake) use a two-parameter approach correlating seismic velocity and density. These relations show that compressional velocities increase with increasing density, and this trend is observable; however, it holds only for common crystalline rocks and not for high-density ores. Instead of the smooth trend observed for silicate rocks, high-density materials such as massive sulphides and Fe-oxides are outliers and exhibit only a wide scatter. Exploiting borehole seismic data (compressional and shear wave velocities), the new framework uses a three-parameter configuration of an elastic modulus (the bulk modulus), density, and the seismic parameter. The seismic parameter is defined as sp = Vp² - (4/3)Vs² and is used as a proxy for seismic velocities. In the d-k-sp space, all rock types project onto a single surface, including the high-density crustal rocks. Furthermore, the three-parameter fit could serve as the basis for establishing calibration curves for ore materials of known hardrock environments (felsic, intermediate, to mafic settings) and, in turn, for obtaining elastic moduli and density from seismic data. Borehole logging is used extensively in mineral exploration to measure physical rock property contrasts and evaluate the quantitative link between geology and geophysics. A large dataset of over 2000 petrophysical measurements from laboratory experiments and multi-parameter borehole logging databases provided the basis for investigating this new technique.

  4. Texture descriptor approaches to level set segmentation in medical images

    NASA Astrophysics Data System (ADS)

    Olveres, Jimena; Nava, Rodrigo; Moya-Albor, Ernesto; Escalante-Ramírez, Boris; Brieva, Jorge; Cristóbal, Gabriel; Vallejo, Enrique

    2014-05-01

    Medical image analysis has become an important tool for improving medical diagnosis and planning treatments. It involves volume or still-image segmentation, which plays a critical role in understanding image content by facilitating extraction of the anatomical organ or region of interest, and it may also help towards the construction of reliable computer-aided diagnosis systems. Specifically, level set methods have emerged as a general framework for image segmentation; such methods are mainly based on gradient information and provide satisfactory results. However, the noise inherent to images and the lack of contrast information between adjacent regions hamper the performance of the algorithms; thus, other proposals have been suggested in the literature, for instance, characterizing regions as statistical parametric models to guide the level set evolution. In this paper, we study the influence of texture on level-set-based segmentation and propose the use of Hermite features that are incorporated into the level set model to improve organ segmentation, which may be useful for quantifying left ventricular blood flow. The proposal was also compared against other texture descriptors such as local binary patterns, image derivatives, and Hounsfield low-attenuation values.
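Of the texture descriptors compared above, the local binary pattern is easy to sketch; this plain 3x3, 8-neighbour variant is a minimal illustration rather than the rotation-invariant versions often used in practice:

```python
import numpy as np

def lbp(image):
    """3x3 local binary pattern: each of the 8 neighbours contributes one
    bit, set when the neighbour is >= the centre pixel."""
    out = np.zeros(image.shape, dtype=np.uint8)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]   # clockwise from top-left
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = np.roll(np.roll(image, -dy, axis=0), -dx, axis=1)
        out |= (neighbour >= image).astype(np.uint8) << bit
    return out   # border values wrap around and are usually discarded

img = np.array([[5, 5, 5],
                [5, 1, 5],
                [5, 5, 5]])
codes = lbp(img)
print(codes[1, 1])  # 255: every neighbour is >= the dark centre pixel
```

Histograms of such codes over a local window give the texture features that can be fed into a region term of the level set energy.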

  5. A probabilistic level set formulation for interactive organ segmentation

    NASA Astrophysics Data System (ADS)

    Cremers, Daniel; Fluck, Oliver; Rousson, Mikael; Aharon, Shmuel

    2007-03-01

    Level set methods have become increasingly popular as a framework for image segmentation. Yet when used as a generic segmentation tool, they suffer from an important drawback: current formulations do not allow much user interaction. Upon initialization, boundaries propagate to the final segmentation without the user being able to guide or correct the result. In the present work, we address this limitation by proposing a probabilistic framework for image segmentation which integrates input intensity information and user interaction on an equal footing. The resulting algorithm determines the most likely segmentation given the input image and the user input. To allow user interaction in real time during the segmentation, the algorithm is implemented on a graphics card and in a narrow-band formulation.
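
    The idea of putting intensity information and user input on an equal footing can be sketched as a per-pixel Bayes rule, with Gaussian intensity likelihoods and the user strokes encoded as a prior map. The models and parameter values below are assumptions of this sketch, not the paper's exact formulation:

```python
import numpy as np

def posterior_foreground(image, mu_fg, mu_bg, sigma, user_prior):
    """P(foreground | intensity, user input) per pixel: Gaussian
    intensity likelihoods for each region, weighted by a user-drawn
    prior map with values in (0, 1), then normalized."""
    lik_fg = np.exp(-0.5 * ((image - mu_fg) / sigma) ** 2)
    lik_bg = np.exp(-0.5 * ((image - mu_bg) / sigma) ** 2)
    num = lik_fg * user_prior
    return num / (num + lik_bg * (1.0 - user_prior))

# A bright pixel and a dark pixel under a neutral (0.5) user prior;
# the segmentation boundary is the 0.5 isocontour of this posterior.
post = posterior_foreground(np.array([0.9, 0.1]), 1.0, 0.0, 0.2, 0.5)
```

Raising `user_prior` near a user's foreground stroke pulls the 0.5 isocontour outward there, which is one simple way interaction and intensity can enter a single probabilistic objective.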

  6. Iris segmentation using variational level set method

    NASA Astrophysics Data System (ADS)

    Roy, Kaushik; Bhattacharya, Prabir; Suen, Ching Y.

    2011-04-01

    Continuous efforts have been made to process degraded iris images to enhance iris recognition performance in unconstrained situations. Recently, many researchers have focused on developing iris segmentation techniques that can deal with iris images acquired in a non-cooperative environment, where the probability of acquiring unideal iris images is very high due to gaze deviation, noise, blurring, and occlusion by eyelashes, eyelids, glasses, and hair. Although there have been many iris segmentation methods, most focus primarily on the accurate detection of iris images captured in a closely controlled environment. The novelty of this research effort is that we propose to apply a variational level-set-based curve evolution scheme that uses a significantly larger time step to numerically solve the evolution partial differential equation (PDE), accurately segmenting an unideal iris image while drastically speeding up the curve evolution process. The iris boundary represented by the variational level set may break and merge naturally during evolution, so topological changes are handled automatically. The proposed variational model is also robust against poor localization and weak iris/sclera boundaries. To resolve the size irregularities arising from the arbitrary shapes of the extracted iris/pupil regions, a simple method based on connecting adjacent contour points is applied. Furthermore, to reduce the noise effect, we apply a pixel-wise adaptive 2D Wiener filter. The verification and identification performance of the proposed scheme is validated on three challenging iris image datasets, namely, the ICE 2005, the WVU Unideal, and the UBIRIS Version 1.

  7. Advanced level set segmentation of the right atrium in MR

    NASA Astrophysics Data System (ADS)

    Chen, Siqi; Kohlberger, Timo; Kirchberg, Klaus J.

    2011-03-01

    Atrial fibrillation is a common heart arrhythmia and can be effectively treated with ablation. Ablation planning requires 3D models of the patient's left atrium (LA) and/or right atrium (RA); therefore, an automatic segmentation procedure to retrieve these models is desirable. In this study, we investigate the use of advanced level set segmentation approaches to automatically segment the RA in magnetic resonance angiographic (MRA) volume images. A low contrast-to-noise ratio makes the boundary between the RA and nearby structures nearly indistinguishable, so pure data-driven segmentation approaches such as watershed and Chan-Vese methods are bound to fail. Incorporating training shapes through PCA modeling to constrain the segmentation is one popular solution, and it is also used in our segmentation framework. The shape parameters from PCA are optimized with a global histogram-based energy model. However, since the shape parameters span a much smaller space, they cannot capture fine details of the shape. Therefore, we employ a second refinement step after the shape-based segmentation stage, which follows closely the recent work on localized appearance model based techniques. The local appearance model is established through a robust point tracking mechanism and is learned through landmarks embedded on the surface of the training shapes. The key contribution of our work is the combination of a statistical shape prior and a localized appearance prior for level set segmentation of the right atrium from MRA. We test this two-step segmentation framework on porcine RA data to verify the algorithm.

  8. Inter-sectoral Transfer of the Food for Life Settings Framework in England.

    PubMed

    Gray, Selena; Jones, Matthew; Means, Robin; Orme, Judy; Pitt, Hannah; Salmon, Debra

    2017-04-11

    Organisational settings such as schools, workplaces and hospitals are well recognised as key environments for health promotion. Whilst there is extensive literature on specific types of settings, little empirical research has investigated the transfer of frameworks between sectors. This study analyses Food for Life, an England-wide healthy and sustainable food programme that evolved in schools and is being adapted for children's centres, universities, care homes, and hospital settings. Following a case study design, we interviewed 85 stakeholders in nine settings. Food for Life's systemic framework of 'food education, skills and experience', 'food and catering quality', 'community and partnerships' and 'leadership' carried salience in all types of settings. These were perceived to act both as principles and as operational priorities for driving systemic change. However, each setting type differed in terms of the mix of facilitating factors and appropriate indicators for change. Common barriers included the level of culture shift required, cost perceptions and organisational complexity. For settings-based health promotion practice, this study points to the importance of 'frame-working' (the systematic activity of scoping and categorising the field of change) alongside the development and application of benchmarks to stimulate change. These processes are critical in the transfer of learning between sectors in a form that balances commonality with sufficient flexibility to adapt to specific settings. Synergy between types of settings is an under-recognised, but critical, part of action to address complex issues such as those emerging from the intersection between food, health and sustainability. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Chemically Induced Surface Evolutions with Level Sets

    SciTech Connect

    2006-11-17

    ChISELS is used for the theoretical modeling of detailed surface chemistry and concomitant surface evolutions occurring during microsystem fabrication processes conducted at low pressures. Examples include physical vapor deposition (PVD), plasma-enhanced chemical vapor deposition (PECVD), and plasma etching. Evolving interfaces are represented using the level-set method, and the evolution equations are time-integrated using a semi-Lagrangian approach. A ballistic transport model is employed to solve for the fluxes incident on each of the surface elements. Surface chemistry leading to etching or deposition is computed either by coupling to Surface Chemkin (a commercially available code) or by providing user-defined subroutines. The computational meshes used are quad-trees (2-D) and oct-trees (3-D), constructed such that grid refinement is localized to regions near the surface interfaces. As the interface evolves, the mesh is dynamically reconstructed as needed so that the grid remains fine only around the interface. For parallel computation, a domain decomposition scheme with dynamic load balancing is used to distribute the computational work across processors.

  10. A Bayesian Level-Set Inversion Protocol for Structural Zonation

    NASA Astrophysics Data System (ADS)

    Cardiff, M.; Kitanidis, P.

    2008-12-01

    Mapping the variability of subsurface properties via indirect methods is of great importance for problems in contaminant remediation and resource evaluation. In general, methods for inverse modeling commonly assume smooth and/or geostatistical distributions of the parameters being estimated. However, especially for field- and catchment-scale inverse problems, the existence of distinct, separate geologic facies is not consistent with the assumptions of these inversion techniques. Because of this drawback, it is important to develop inversion methods built to image so-called "structural" parameter fields accurately. In our presentation, we discuss the use of a facies-based level set method for imaging geologic parameter fields. The level set framework is applicable when subsurface heterogeneity can be adequately represented as a set of relatively homogeneous geologic facies separated by sharp boundaries. During the inversion optimization, the shapes of the boundaries between facies are optimized in order to improve the data fit. Our method can represent boundaries between arbitrary numbers of facies, and extensions to joint inversion can be handled without relying on petrophysical relations. As examples, we present several synthetic inverse problems that cover realistic estimation problems using nonlinear models with multiple datasets. Throughout our work, we adopt a Bayesian perspective, which allows integration of prior information as well as linearized estimation of uncertainty in the boundary locations.
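
    The facies representation behind such methods can be sketched in a few lines: with two level set functions, the sign pattern at each point selects one of up to four facies, and deforming the zero contours moves the facies boundaries. This is a generic Vese-Chan-style multiphase sketch with invented parameter values, not the authors' code:

```python
import numpy as np

def facies_map(phi1, phi2, values):
    """Multiphase labelling: the signs of two level set functions
    partition the domain into up to four facies; values[s] is the
    (e.g. hydraulic conductivity) parameter of facies s."""
    idx = (phi1 > 0).astype(int) * 2 + (phi2 > 0).astype(int)
    return np.asarray(values)[idx]

# 1-D toy domain with facies boundaries at x = 0 and x = 0.5:
x = np.linspace(-1, 1, 5)
phi1 = x           # zero contour at x = 0
phi2 = x - 0.5     # zero contour at x = 0.5
field = facies_map(phi1, phi2, [1e-6, 1e-5, 1e-4, 1e-3])
```

During inversion, only `phi1` and `phi2` are updated, so the estimated field always remains a sharp-boundary mosaic of the prescribed facies values.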

  11. A Five-Level Design Framework for Bicluster Visualizations.

    PubMed

    Sun, Maoyuan; North, Chris; Ramakrishnan, Naren

    2014-12-01

    Analysts often need to explore and identify coordinated relationships (e.g., four people who visited the same five cities on the same set of days) within large datasets for sensemaking. Biclusters provide a potential solution to ease this process, because each computed bicluster bundles individual relationships into coordinated sets. By understanding such computed structural relations within biclusters, analysts can leverage their domain knowledge and intuition to determine the importance and relevance of the extracted relationships for making hypotheses. However, due to the lack of systematic design guidelines, it is still a challenge to design effective and usable visualizations of biclusters that enhance their perceptibility and interactivity for exploring coordinated relationships. In this paper, we present a five-level design framework for bicluster visualizations, with a survey of the state-of-the-art design considerations and applications that are related or can be applied to bicluster visualizations. We summarize the pros and cons of these design options for supporting user tasks at each of the five levels. Finally, we discuss future research challenges for bicluster visualizations and their incorporation into visual analytics tools.

  12. Decentralized health care priority-setting in Tanzania: evaluating against the accountability for reasonableness framework.

    PubMed

    Maluka, Stephen; Kamuzora, Peter; San Sebastián, Miguel; Byskov, Jens; Olsen, Øystein E; Shayo, Elizabeth; Ndawi, Benedict; Hurtig, Anna-Karin

    2010-08-01

    Priority-setting has become one of the biggest challenges faced by health decision-makers worldwide. Fairness is a key goal of priority-setting and Accountability for Reasonableness has emerged as a guiding framework for fair priority-setting. This paper describes the processes of setting health care priorities in Mbarali district, Tanzania, and evaluates the descriptions against Accountability for Reasonableness. Key informant interviews were conducted with district health managers, local government officials and other stakeholders using a semi-structured interview guide. Relevant documents were also gathered and group priority-setting in the district was observed. The results indicate that, while Tanzania has a decentralized public health care system, the reality of the district level priority-setting process was that it was not nearly as participatory as the official guidelines suggest it should have been. Priority-setting usually occurred in the context of budget cycles and the process was driven by historical allocation. Stakeholders' involvement in the process was minimal. Decisions (but not the reasoning behind them) were publicized through circulars and notice boards, but there were no formal mechanisms in place to ensure that this information reached the public. There were neither formal mechanisms for challenging decisions nor an adequate enforcement mechanism to ensure that decisions were made in a fair and equitable manner. Therefore, priority-setting in Mbarali district did not satisfy all four conditions of Accountability for Reasonableness; namely relevance, publicity, appeals and revision, and enforcement. This paper aims to make two important contributions to this problematic situation. First, it provides empirical analysis of priority-setting at the district level in the contexts of low-income countries. Second, it provides guidance to decision-makers on how to improve fairness, legitimacy, and sustainability of the priority-setting process.

  13. A modular framework for gene set analysis integrating multilevel omics data

    PubMed Central

    Sass, Steffen; Buettner, Florian; Mueller, Nikola S.; Theis, Fabian J.

    2013-01-01

    Modern high-throughput methods allow the investigation of biological functions across multiple ‘omics’ levels. Levels include mRNA and protein expression profiling as well as additional knowledge on, for example, DNA methylation and microRNA regulation. The reason for this interest in multi-omics is that actual cellular responses to different conditions are best explained mechanistically when all omics levels are taken into account. To map gene products to their biological functions, public ontologies like Gene Ontology are commonly used. Many methods have been developed to identify terms in an ontology that are overrepresented within a set of genes. However, these methods are not able to deal appropriately with arbitrary combinations of several data types. Here, we propose a new method to analyse integrated data across multiple omics levels to simultaneously assess their biological meaning. We developed a model-based Bayesian method for inferring interpretable term probabilities in a modular framework. Our Multi-level ONtology Analysis (MONA) algorithm performed significantly better than conventional analyses of individual levels and yields the best results even for sophisticated models including mRNA fine-tuning by microRNAs. The MONA framework is flexible enough to allow for different underlying regulatory motifs or ontologies. It is ready to use for applied researchers and is available as a standalone application from http://icb.helmholtz-muenchen.de/mona. PMID:23975194

  14. Beyond SMART? A New Framework for Goal Setting

    ERIC Educational Resources Information Center

    Day, Trevor; Tosey, Paul

    2011-01-01

    This article extends currently reported theory and practice in the use of learning goals or targets with students in secondary and further education. Goal-setting and action-planning constructs are employed in personal development plans (PDPs) and personal learning plans (PLPs) and are advocated as practice within the English national policy…

  16. Confidence sets for optimal factor levels of a response surface.

    PubMed

    Wan, Fang; Liu, Wei; Bretz, Frank; Han, Yang

    2016-12-01

    Construction of confidence sets for the optimal factor levels is an important topic in response surface methodology. Wan et al. (2015) provided an exact (1-α) confidence set for a maximum or minimum point (i.e., an optimal factor level) of a univariate polynomial function in a given interval. In this article, the method is extended to construct an exact (1-α) confidence set for the optimal factor levels of response surfaces. The construction method is readily applied to many parametric and semiparametric regression models involving a quadratic function. A conservative confidence set is provided as an intermediate step in the construction of the exact confidence set. Two examples are given to illustrate the application of the confidence sets. The comparison between confidence sets indicates that our exact confidence set is better than the only other confidence set available in the statistical literature that guarantees the (1-α) confidence level.

  17. A Systematic Framework for Addressing Treatment Integrity in School Settings

    ERIC Educational Resources Information Center

    Kupzyk, Sara; Shriver, Mark D.

    2016-01-01

    School psychologists are tasked with ensuring treatment integrity because the level of intervention implementation affects decisions about student progress. Treatment integrity includes multiple dimensions that may impact the effectiveness of an intervention including adherence, dosage, quality, and engagement. Unfortunately, treatment integrity…

  19. A contribution to set a legal framework for biofertilisers.

    PubMed

    Malusá, E; Vassilev, N

    2014-08-01

    The extensive research, production and use of microorganisms to improve plant nutrition have resulted in an inconsistent definition of the term "biofertiliser", which, in some cases, is due to the different microbial mechanisms involved. The rationale for adopting the term biofertiliser is that it derives from "biological fertiliser", which, in turn, implies the use of living microorganisms. Here, we propose a definition for this kind of product that distinguishes it from biostimulants and other inorganic and organic fertilisers. Special emphasis is given to microorganisms with multifunctional properties and to biofertilisers containing more than one microorganism. This definition could be included in legal provisions regulating registration and marketing requirements. A set of rules is also proposed that could guarantee the quality of biofertilisers present on the market and thus foster their use by farmers.

  20. Tailoring Healthy Workplace Interventions to Local Healthcare Settings: A Complexity Theory-Informed Workplace of Well-Being Framework.

    PubMed

    Brand, Sarah L; Fleming, Lora E; Wyatt, Katrina M

    2015-01-01

    Many healthy workplace interventions have been developed for healthcare settings to address the consistently low scores of healthcare professionals on assessments of mental and physical well-being. Complex healthcare settings present challenges for the scale-up and spread of successful interventions from one setting to another. Despite general agreement regarding the importance of the local setting in affecting intervention success across different settings, there is no consensus on what it is about a local setting that needs to be taken into account to design healthy workplace interventions appropriate for different local settings. Complexity theory principles were used to understand a workplace as a complex adaptive system and to create a framework of eight domains (system characteristics) that affect the emergence of system-level behaviour. This Workplace of Well-being (WoW) framework is responsive and adaptive to local settings and allows a shared understanding of the enablers and barriers to behaviour change by capturing local information for each of the eight domains. We use the results of applying the WoW framework to one workplace, a UK National Health Service ward, to describe the utility of this approach in informing design of setting-appropriate healthy workplace interventions that create workplaces conducive to healthy behaviour change.

  1. Tailoring Healthy Workplace Interventions to Local Healthcare Settings: A Complexity Theory-Informed Workplace of Well-Being Framework

    PubMed Central

    Brand, Sarah L.; Fleming, Lora E.; Wyatt, Katrina M.

    2015-01-01

    Many healthy workplace interventions have been developed for healthcare settings to address the consistently low scores of healthcare professionals on assessments of mental and physical well-being. Complex healthcare settings present challenges for the scale-up and spread of successful interventions from one setting to another. Despite general agreement regarding the importance of the local setting in affecting intervention success across different settings, there is no consensus on what it is about a local setting that needs to be taken into account to design healthy workplace interventions appropriate for different local settings. Complexity theory principles were used to understand a workplace as a complex adaptive system and to create a framework of eight domains (system characteristics) that affect the emergence of system-level behaviour. This Workplace of Well-being (WoW) framework is responsive and adaptive to local settings and allows a shared understanding of the enablers and barriers to behaviour change by capturing local information for each of the eight domains. We use the results of applying the WoW framework to one workplace, a UK National Health Service ward, to describe the utility of this approach in informing design of setting-appropriate healthy workplace interventions that create workplaces conducive to healthy behaviour change. PMID:26380358

  2. Fast and robust clinical triple-region image segmentation using one level set function.

    PubMed

    Li, Shuo; Fevens, Thomas; Krzyzak, Adam; Jin, Chao; Li, Song

    2006-01-01

    This paper proposes a novel method for clinical triple-region image segmentation using a single level set function. Triple-region image segmentation finds wide application in computer-aided X-ray, CT, MRI and ultrasound image analysis and diagnosis. Usually, multiple level set functions are used consecutively or simultaneously to segment triple-region medical images; these approaches are either time consuming or suffer from convergence problems. With the newly proposed triple-region level set energy modelling, the triple-region segmentation is handled within the two-region level set framework, where only a single level set function is needed. Since only a single level set function is used, the segmentation is much faster and more robust than using multiple level set functions. Adapted to the clinical setting, a clinical acceleration scheme based on individual principal component analysis and a support vector machine classifier is used to accelerate the segmentation. The clinical acceleration scheme takes the strengths of both machine learning and the level set method while limiting their weaknesses to achieve automatic and fast clinical segmentation. Both synthesized and real images are used to test the proposed method. The results show that the proposed method is able to successfully segment the triple-region using a single level set function, and the segmentation is very robust to the placement of the initial contour. With the clinical acceleration scheme, the method still converges quickly to the final segmentation and can be used during pre-processing for automatic computer-aided diagnosis and surgery.
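
    One plausible way a single function can separate three regions, sketched below, is to read the partition off two isolevels of the same φ. This illustrates the general idea only; the paper's actual energy model differs:

```python
import numpy as np

def triple_region_labels(phi, c=1.0):
    """Partition a domain into three regions from ONE level set
    function, using the two isolevels phi = -c and phi = +c:
    region 0 where phi < -c, region 1 where |phi| <= c,
    region 2 where phi > c."""
    labels = np.ones(phi.shape, dtype=int)   # middle region by default
    labels[phi < -c] = 0
    labels[phi > c] = 2
    return labels

# 1-D toy profile crossing both isolevels:
phi = np.linspace(-3, 3, 7)
labels = triple_region_labels(phi)
```

Because both region boundaries come from one evolving φ, only a single PDE has to be integrated, which is the source of the speed-up the abstract describes.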

  3. A Framework for the Study of Complex mHealth Interventions in Diverse Cultural Settings

    PubMed Central

    Yeates, Karen; Perkins, Nancy; Boesch, Lisa; Hua-Stewart, Diane; Liu, Peter; Sleeth, Jessica; Tobe, Sheldon W

    2017-01-01

    Background To facilitate decision-making capacity between options of care under real-life service conditions, clinical trials must be pragmatic to evaluate mobile health (mHealth) interventions under the variable conditions of health care settings with a wide range of participants. The mHealth interventions require changes in the behavior of patients and providers, creating considerable complexity and ambiguity related to causal chains. Process evaluations of the implementation are necessary to shed light on the range of unanticipated effects an intervention may have, what the active ingredients in everyday practice are, how they exert their effect, and how these may vary among recipients or between sites. Objective Building on the CONSORT-EHEALTH (Consolidated Standards of Reporting Trials of Electronic and Mobile HEalth Applications and onLine TeleHealth) statement and participatory evaluation theory, we present a framework for the process evaluations for mHealth interventions in multiple cultural settings. We also describe the application of this evaluation framework to the implementation of DREAM-GLOBAL (Diagnosing hypertension—Engaging Action and Management in Getting Lower BP in Indigenous and LMIC [low- and middle-income countries]), a pragmatic randomized controlled trial (RCT), and mHealth intervention designed to improve hypertension management in low-resource environments. We describe the evaluation questions and the data collection processes developed by us. Methods Our literature review revealed that there is a significant knowledge gap related to the development of a process evaluation framework for mHealth interventions. We used community-based participatory research (CBPR) methods and formative research data to develop a process evaluation framework nested within a pragmatic RCT. Results Four human organizational levels of participants impacted by the mHealth intervention were identified that included patients, providers, community and

  4. Identifying Heterogeneities in Subsurface Environment using the Level Set Method

    SciTech Connect

    Lei, Hongzhuan; Lu, Zhiming; Vesselinov, Velimir Valentinov

    2016-08-25

    These are slides from a presentation on identifying heterogeneities in the subsurface environment using the level set method. The slides begin with the motivation, then explain the Level Set Method (LSM) and its algorithms, present several examples, and conclude with future work.

  5. A new level set model for multimaterial flows

    SciTech Connect

    Starinshak, David P.; Karni, Smadar; Roe, Philip L.

    2014-01-08

    We present a new level set model for representing multimaterial flows in multiple space dimensions. Instead of associating a level set function with a specific fluid material, the function is associated with a pair of materials and the interface that separates them. A voting algorithm collects sign information from all level sets and determines material designations. M(M-1)/2 level set functions might be needed to represent a general M-material configuration; problems of practical interest use far fewer functions, since not all pairs of materials share an interface. The new model is less prone to producing indeterminate material states, i.e., regions claimed by more than one material (overlaps) or by no material at all (vacuums). It outperforms existing material-based level set models without the need for reinitialization schemes, thereby avoiding additional computational costs and preventing excessive numerical diffusion.
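
    The voting step can be sketched directly: each pairwise function φ_ij votes for material i where it is negative and for material j where it is non-negative, and the material with the most votes claims the point. The sign convention and example geometry below are assumptions of this sketch:

```python
import numpy as np

def assign_materials(pairwise_phi, num_materials):
    """Voting over pairwise level sets: pairwise_phi[(i, j)] < 0
    votes for material i, >= 0 votes for material j; the material
    with the most votes wins at each grid point."""
    shape = next(iter(pairwise_phi.values())).shape
    votes = np.zeros((num_materials,) + shape, dtype=int)
    for (i, j), phi in pairwise_phi.items():
        votes[i] += (phi < 0)
        votes[j] += (phi >= 0)
    return votes.argmax(axis=0)

# Three materials along a line, 0 | 1 | 2, with interfaces at
# x = 0.3 and x = 0.7 (each pair gets one signed-distance-like phi):
x = np.linspace(0.0, 1.0, 5)
pairwise = {(0, 1): x - 0.3, (1, 2): x - 0.7, (0, 2): x - 0.5}
mats = assign_materials(pairwise, 3)
```

Because every point gets exactly one argmax winner, the voting never leaves a vacuum and never assigns two materials to the same point, which is the robustness property the abstract emphasises.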

  6. From Mouth-level to Tooth-level DMFS: Conceptualizing a Theoretical Framework

    PubMed Central

    Bandyopadhyay, Dipankar

    2015-01-01

    Objective There is no dearth of correlated count data in biological or clinical settings, and the ability to accurately analyze and interpret such data remains an exciting area of research. In oral health epidemiology, the Decayed, Missing, Filled (DMF) index has been used continuously for over 70 years as the key measure to quantify caries experience. The DMF index projects a subject’s caries status using either the DMF(T), the total number of DMF teeth, or the DMF(S), counting the total DMF tooth surfaces, for that subject. However, surfaces within a particular tooth or subject constitute clustered data, and the DMFS mostly overlooks this clustering effect to attain an over-simplified summary index, ignoring the true tooth-level caries status. Besides, the DMFT/DMFS might exhibit an excess of some specific counts (say, zeroes representing the relatively disease-free carious state), or can exhibit overdispersion, and accounting for the excess responses or overdispersion is a key component in selecting the appropriate modeling strategy. Methods & Results This concept paper presents the rationale and the theoretical framework that a dental researcher might consider at the onset in order to choose a plausible statistical model for tooth-level DMFS. Various nuances related to model fitting, selection and parameter interpretation are also explained. Conclusion The author recommends that conceptualizing the correct stochastic framework should serve as the guiding force in the dental researcher’s ongoing goal of efficiently assessing complex covariate-response relationships. PMID:26618183
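
    The zero-excess and overdispersion point is easy to demonstrate numerically: a zero-inflated Poisson has mean (1-p)·λ but variance (1-p)·λ·(1+p·λ), which exceeds the mean whenever p > 0. A short simulation (illustrative parameters, not dental data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Zero-inflated Poisson: with probability p the count is a structural
# zero (e.g. a caries-free subject), otherwise it is Poisson(lam).
p, lam, n = 0.4, 3.0, 100_000
structural_zero = rng.random(n) < p
counts = np.where(structural_zero, 0, rng.poisson(lam, n))

mean, var = counts.mean(), counts.var()
# Theory: mean = (1-p)*lam = 1.8 and var = (1-p)*lam*(1+p*lam) = 3.96,
# so var > mean -- a plain Poisson fit would be badly misspecified.
```

This is exactly the kind of mean-variance mismatch that motivates zero-inflated or overdispersed count models for tooth-level DMFS data.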

  7. Public health and health promotion capacity at national and regional level: a review of conceptual frameworks.

    PubMed

    Aluttis, Christoph; den Broucke, Stephan Van; Chiotan, Cristina; Costongs, Caroline; Michelsen, Kai; Brand, Helmut

    2014-03-26

    The concept of capacity building for public health has gained much attention during the last decade. National as well as international organizations increasingly focus their efforts on capacity building to improve performance in the health sector. During the past two decades, a variety of conceptual frameworks have been developed which describe relevant dimensions for public health capacity. Notably, these frameworks differ in design and conceptualization. This paper therefore reviews the existing conceptual frameworks and integrates them into one framework containing the most relevant dimensions for public health capacity at the country or regional level. A comprehensive literature search was performed to identify frameworks addressing public health capacity building at the national or regional level. We content-analysed these frameworks to identify the core dimensions of public health capacity. The dimensions were subsequently synthesized into a set of thematic areas to construct a conceptual framework which describes the most relevant dimensions for capacities at the national or regional level. The systematic review resulted in the identification of seven core domains for public health capacity: resources, organizational structures, workforce, partnerships, leadership and governance, knowledge development and country-specific context. Accordingly, these dimensions were used to construct a framework which describes these core domains in more detail. Our research shows that although there is no generally agreed-upon model of public health capacity, a number of key domains for public health and health promotion capacity recur consistently in existing frameworks, regardless of their geographical location or thematic area. As only little work on the core concepts of public health capacities has yet taken place, this study adds value to the discourse by identifying these consistencies across existing frameworks and by synthesising them into a new

  8. Efficient algorithm for level set method preserving distance function.

    PubMed

    Estellers, Virginia; Zosso, Dominique; Lai, Rongjie; Osher, Stanley; Thiran, Jean-Philippe; Bresson, Xavier

    2012-12-01

    The level set method is a popular technique for tracking moving interfaces in several disciplines, including computer vision and fluid dynamics. However, despite its high flexibility, the original level set method is limited by two important numerical issues. First, the level set method does not implicitly preserve the level set function as a distance function, which is necessary to estimate geometric features such as the curvature or the contour normal accurately. Second, the level set algorithm is slow because the time step is limited by the standard Courant-Friedrichs-Lewy (CFL) condition, which is also essential to the numerical stability of the iterative scheme. Recent advances with graph cut methods and continuous convex relaxation methods provide powerful alternatives to the level set method for image processing problems because they are fast, accurate, and guaranteed to find the global minimizer independently of the initialization. These recent techniques use binary functions to represent the contour rather than the distance functions usually considered in the level set method. However, the binary function cannot provide distance information, which can be essential for some applications, such as surface reconstruction from scattered points and cortex segmentation in medical imaging. In this paper, we propose a fast algorithm to preserve distance functions in level set methods. Our algorithm is inspired by recent efficient l1 optimization techniques, and is efficient and easy to implement. It is interesting to note that our algorithm is not limited by the CFL condition and naturally preserves the level set function as a distance function during the evolution, which avoids the classical re-distancing problem in level set methods. We apply the proposed algorithm to image segmentation, where our method proves to be 5-6 times faster than standard distance-preserving level set techniques.
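
    The classical re-distancing step that this algorithm avoids is conventionally performed by iterating the PDE phi_t = sign(phi0) * (1 - |grad phi|) to steady state. A minimal 1D numpy sketch of that conventional step, for context (the grid, iteration count, and upwinding choices are illustrative, not taken from the paper):

```python
import numpy as np

def reinitialize_1d(phi, dx, n_iter=300):
    """Iterate phi_t = sign(phi0) * (1 - |phi_x|) toward a signed
    distance function, using Godunov upwind differences (1D sketch)."""
    dt = 0.5 * dx                  # CFL-limited pseudo-time step
    s = np.sign(phi)               # sign of the initial field, frozen
    for _ in range(n_iter):
        p = np.pad(phi, 1, mode='edge')
        dplus = (p[2:] - p[1:-1]) / dx    # forward difference
        dminus = (p[1:-1] - p[:-2]) / dx  # backward difference
        # Godunov upwinding for |phi_x|
        grad = np.where(
            s > 0,
            np.sqrt(np.maximum(np.maximum(dminus, 0.0)**2,
                               np.minimum(dplus, 0.0)**2)),
            np.sqrt(np.maximum(np.minimum(dminus, 0.0)**2,
                               np.maximum(dplus, 0.0)**2)),
        )
        phi = phi + dt * s * (1.0 - grad)
    return phi

x = np.linspace(-1, 1, 201)
phi0 = x**3                        # same zero crossing as x, but |phi_x| != 1
phi = reinitialize_1d(phi0, x[1] - x[0])
# phi now approximates the signed distance x to the interface at 0
```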

  9. An Expanded Theoretical Framework of Care Coordination Across Transitions in Care Settings.

    PubMed

    Radwin, Laurel E; Castonguay, Denise; Keenan, Carolyn B; Hermann, Cherice

    2016-01-01

    For many patients, high-quality, patient-centered, and cost-effective health care requires coordination among multiple clinicians and settings. Ensuring optimal care coordination requires a clear understanding of how clinician activities and continuity during transitions affect patient-centeredness and quality outcomes. This article describes an expanded theoretical framework to better understand care coordination. The framework provides clear articulation of concepts. Examples are provided of ways to measure the concepts.

  10. The exchange boundary framework: understanding the evolution of power within collaborative decision-making settings.

    PubMed

    Watson, Erin R; Foster-Fishman, Pennie G

    2013-03-01

    Many community decision-making bodies encounter challenges in creating conditions where stakeholders from disadvantaged populations can authentically participate in ways that give them actual influence over decisions affecting their lives (Foster-Fishman et al., Lessons for the journey: Strategies and suggestions for guiding planning, governance, and sustainability in comprehensive community initiatives. W.K. Kellogg Foundation, Battle Creek, MI, 2004). These challenges are often rooted in asymmetrical power dynamics operating within the settings (Prilleltensky, J Commun Psychol 36:116-136, 2008). In response, this paper presents the Exchange Boundary Framework, a new approach for understanding and promoting authentic, empowered participation within collaborative decision-making settings. The framework expands upon theories currently used in the field of community psychology by focusing on the underlying processes through which power operates in relationships and examining the evolution of power dynamics over time. By integrating concepts from social exchange theory (Emerson, Am Soc Rev 27:31-41, 1962) and social boundaries theory (Hayward, Polity 31(1):1-22, 1998), the framework situates power within parallel processes of resource exchange and social regulation. The framework can be used to understand the conditions leading to power asymmetries within collaborative decision-making processes, and to guide efforts to promote more equitable and authentic participation by all stakeholders within these settings. In this paper we describe the Exchange Boundary Framework, apply it to three distinct case studies, and discuss key considerations for its application within collaborative community settings.

  11. A 3D Level Set Method for Microwave Breast Imaging

    PubMed Central

    Colgan, Timothy J.; Hagness, Susan C.; Van Veen, Barry D.

    2015-01-01

    Objective Conventional inverse-scattering algorithms for microwave breast imaging result in moderate resolution images with blurred boundaries between tissues. Recent 2D numerical microwave imaging studies demonstrate that the use of a level set method preserves dielectric boundaries, resulting in a more accurate, higher resolution reconstruction of the dielectric properties distribution. Previously proposed level set algorithms are computationally expensive and thus impractical in 3D. In this paper we present a computationally tractable 3D microwave imaging algorithm based on level sets. Methods We reduce the computational cost of the level set method using a Jacobian matrix, rather than an adjoint method, to calculate Fréchet derivatives. We demonstrate the feasibility of 3D imaging using simulated array measurements from 3D numerical breast phantoms. We evaluate performance by comparing full 3D reconstructions to those from a conventional microwave imaging technique. We also quantitatively assess the efficacy of our algorithm in evaluating breast density. Results Our reconstructions of 3D numerical breast phantoms improve upon those of a conventional microwave imaging technique. The density estimates from our level set algorithm are more accurate than those of conventional microwave imaging, and the accuracy is greater than that reported for mammographic density estimation. Conclusion Our level set method leads to a feasible level of computational complexity for full 3D imaging, and reconstructs the heterogeneous dielectric properties distribution of the breast more accurately than conventional microwave imaging methods. Significance 3D microwave breast imaging using a level set method is a promising low-cost, non-ionizing alternative to current breast imaging techniques. PMID:26011863
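
    In a generic least-squares setting, Jacobian-based derivative computation of the kind the authors favor leads to damped Gauss-Newton updates of the dielectric parameters. A schematic numpy sketch (the Jacobian J and residual r below are random stand-ins; in the paper they would come from the forward electromagnetic model):

```python
import numpy as np

# Regularized Gauss-Newton step: solve (J^T J + lam I) delta = J^T r.
# Sizes and values here are illustrative placeholders only.
rng = np.random.default_rng(1)
J = rng.normal(size=(40, 10))     # Jacobian: d(residual)/d(parameters)
r = rng.normal(size=40)           # data residual at the current iterate
lam = 1e-2                        # Tikhonov damping factor

delta = np.linalg.solve(J.T @ J + lam * np.eye(10), J.T @ r)
# parameters are then updated as p_new = p - delta (sign per convention)
```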

  12. Hippocampus segmentation using locally weighted prior based level set

    NASA Astrophysics Data System (ADS)

    Achuthan, Anusha; Rajeswari, Mandava

    2015-12-01

    Segmentation of the hippocampus is one of the major challenges in medical image segmentation because of its imaging characteristics: its intensity is almost identical to that of adjacent gray matter structures such as the amygdala. This intensity similarity causes the hippocampus to have weak or fuzzy boundaries. Given this challenge, a segmentation method that relies on image information alone may not produce accurate segmentation results. Prior information, such as shape and spatial information, therefore needs to be assimilated into existing segmentation methods to produce the expected segmentation. Previous studies have widely integrated prior information into segmentation methods. However, the prior information has been integrated in a global manner, which does not reflect the real scenario during clinical delineation. Therefore, in this paper, prior information is integrated locally into a level set model. This work utilizes a mean shape model, integrated as prior information into the level set model, to provide automatic initialization for the level set evolution. The local integration of edge-based information and prior information is implemented through an edge weighting map that decides, at the voxel level, which information should be observed during the level set evolution. The edge weighting map indicates which voxels have sufficient edge information. Experiments show that the proposed local integration of prior information into a conventional edge-based level set model, known as the geodesic active contour, yields an improvement of 9% in the averaged Dice coefficient.
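
    One plausible form of such an edge weighting map, sketched here purely for illustration (the edge indicator, the threshold, and the toy image are hypothetical choices, not the authors' exact definitions):

```python
import numpy as np

def edge_weight_map(img, k=10.0, tau=0.5):
    """Hypothetical voxel-wise weighting: where edge information is strong,
    trust the image (w = 1); elsewhere, fall back on the shape prior (w = 0)."""
    gy, gx = np.gradient(img.astype(float))
    g = 1.0 / (1.0 + k * (gx**2 + gy**2))   # edge indicator: small at edges
    w = (g < tau).astype(float)             # 1 = use edge term, 0 = use prior
    return g, w

img = np.zeros((32, 32))
img[:, 16:] = 1.0                           # toy image with one vertical edge
g, w = edge_weight_map(img)
# w is 1 only in the two columns straddling the edge
```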

  13. Construal level mind-sets moderate self- and social stereotyping.

    PubMed

    McCrea, Sean M; Wieber, Frank; Myers, Andrea L

    2012-01-01

    Construal level theory suggests that events and objects can be represented at either a higher, more abstract level involving consideration of superordinate goals, desirability, global processing, and broad categorizations or a lower, more concrete level involving consideration of subordinate goals, feasibility, local processing, and narrow categorizations. Analogously, social targets (including the self) can be represented more broadly, as members of a group, or more narrowly, as individuals. Because abstract construals induce a similarity focus, they were predicted to increase the perceived fit between social targets and a salient social category. Accordingly, placing individuals into a more abstract construal mind-set via an unrelated task increased the activation and use of stereotypes of salient social groups, stereotype-consistent trait ratings of the self, group identification, and stereotype-consistent performance relative to more concrete construal mind-sets. Thus, nonsocial contextual influences (construal level mind-sets) affect stereotyping of self and others.

  14. A PDE-Based Fast Local Level Set Method

    NASA Astrophysics Data System (ADS)

    Peng, Danping; Merriman, Barry; Osher, Stanley; Zhao, Hongkai; Kang, Myungjoo

    1999-11-01

    We develop a fast method to localize the level set method of Osher and Sethian (1988, J. Comput. Phys.79, 12) and address two important issues that are intrinsic to the level set method: (a) how to extend a quantity that is given only on the interface to a neighborhood of the interface; (b) how to reset the level set function to be a signed distance function to the interface efficiently without appreciably moving the interface. This fast local level set method reduces the computational effort by one order of magnitude, works in as much generality as the original one, and is conceptually simple and easy to implement. Our approach differs from previous related works in that we extract all the information needed from the level set function (or functions in multiphase flow) and do not need to find explicitly the location of the interface in the space domain. The complexity of our method to do tasks such as extension and distance reinitialization is O(N), where N is the number of points in space, not O(N log N) as in works by Sethian (1996, Proc. Nat. Acad. Sci. 93, 1591) and Helmsen and co-workers (1996, SPIE Microlithography IX, p. 253). This complexity estimation is also valid for quite general geometrically based front motion for our localized method.
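
    The localization idea, updating only grid points inside a thin tube around the interface rather than the full grid, can be sketched as follows (the grid size and tube width are illustrative choices, not the paper's exact cutoff construction):

```python
import numpy as np

# Signed distance to a circle of radius 0.5 on a 2D grid.
n = 128
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x, indexing='ij')
phi = np.sqrt(X**2 + Y**2) - 0.5

beta = 6 * (x[1] - x[0])        # tube half-width: a few grid cells
band = np.abs(phi) < beta       # only these points would be updated

# A local update touches O(interface) points, not O(N) grid points.
frac = band.mean()              # fraction of the grid inside the tube
```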

  15. The ICF: A Framework for Setting Goals for Children with Speech Impairment

    ERIC Educational Resources Information Center

    McLeod, Sharynne; Bleile, Ken

    2004-01-01

    The International Classification of Functioning, Disability and Health (ICF) (World Health Organization, 2001) is proposed as a framework for integrative goal setting for children with speech impairment. The ICF incorporates both impairment and social factors to consider when selecting appropriate goals to bring about change in the lives of…

  17. Conceptual framework for indexing visual information at multiple levels

    NASA Astrophysics Data System (ADS)

    Jaimes, Alejandro; Chang, Shih-Fu

    1999-12-01

    In this paper, we present a conceptual framework for indexing different aspects of visual information. Our framework unifies concepts from the literature in diverse fields such as cognitive psychology, library sciences, art, and the more recent content-based retrieval. We present multiple level structures for visual and non-visual information. The ten-level visual structure presented provides a systematic way of indexing images based on syntax and semantics, and includes distinctions between general concept and visual concept. We define different types of relations at different levels of the visual structure, and also use a semantic information table to summarize important aspects related to an image. While the focus is on the development of a conceptual indexing structure, our aim is also to bring together the knowledge from various fields, unifying the issues that should be considered when building a digital image library. Our analysis stresses the limitations of state-of-the-art content-based retrieval systems and suggests areas in which improvements are necessary.

  18. A Framework for Translating a High Level Security Policy into Low Level Security Mechanisms

    NASA Astrophysics Data System (ADS)

    Hassan, Ahmed A.; Bahgat, Waleed M.

    2010-01-01

    Security policies have different components; firewalls, active directories, and IDSs are some examples. Enforcing network security policies through low level security mechanisms presents some essential difficulties, chief among them consistency, verification, and maintenance. One approach to overcoming these difficulties is to automate the translation of a high level security policy into low level security mechanisms, and this paper introduces a framework for such an automation process. The framework is described in terms of three phases. In the first phase, all network assets are categorized according to their roles in network security, and the relations between them are identified to constitute the network security model. This model is based on organization-based access control (OrBAC); however, it extends OrBAC to include not only access control policy but also other administrative security policies, such as auditing policy. In addition, the proposed model enables matching of each rule of the high level security policy with the corresponding rules of the low level security policy. In the second phase of the framework, the high level security policy is mapped into the network security model; this phase can be considered a translation of the high level security policy into an intermediate model level. Finally, the intermediate model level is translated automatically into low level security mechanisms. The paper illustrates the applicability of the proposed approach through an application example.
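
    The translation idea can be illustrated with a toy mapping from one high level rule to a firewall command (the rule format, the host/service tables, and the resulting iptables invocation are hypothetical stand-ins for illustration, not the paper's OrBAC-based model):

```python
# Hypothetical high-level rule: roles and services rather than IPs and ports.
high_level = {"role": "web-server", "action": "allow",
              "service": "https", "source": "any"}

# Asset tables from the (hypothetical) network security model.
service_ports = {"https": 443, "http": 80}
role_hosts = {"web-server": ["10.0.0.5"]}

def to_iptables(rule):
    """Translate one high-level rule into low-level firewall commands."""
    lines = []
    for host in role_hosts[rule["role"]]:
        target = "ACCEPT" if rule["action"] == "allow" else "DROP"
        lines.append(
            f"iptables -A FORWARD -p tcp -d {host} "
            f"--dport {service_ports[rule['service']]} -j {target}")
    return lines

rules = to_iptables(high_level)
```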

  19. Level-Set Topology Optimization with Aeroelastic Constraints

    NASA Technical Reports Server (NTRS)

    Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia

    2015-01-01

    Level-set topology optimization is used to design a wing considering skin buckling under static aeroelastic trim loading, as well as dynamic aeroelastic stability (flutter). The level-set function is defined over the entire 3D volume of a transport aircraft wing box. Therefore, the approach is not limited by any predefined structure and can explore novel configurations. The Sequential Linear Programming (SLP) level-set method is used to solve the constrained optimization problems. The proposed method is demonstrated using three problems with mass, linear buckling, and flutter objectives and/or constraints. A constraint aggregation method is used to handle multiple buckling constraints in the wing skins. A continuous flutter constraint formulation is used to handle difficulties arising from discontinuities in the design space caused by a switching of the critical flutter mode.

  20. An improved level set method for vertebra CT image segmentation

    PubMed Central

    2013-01-01

    Background Clinical diagnosis and therapy for lumbar disc herniation require accurate vertebra segmentation. The complex anatomical structure and degenerative deformations of the vertebrae make segmentation challenging. Methods An improved level set method, namely the edge- and region-based level set method (ERBLS), is proposed for vertebra CT image segmentation. By considering the gradient information and local region characteristics of images, the proposed model can efficiently segment images with intensity inhomogeneity and blurry or discontinuous boundaries. To reduce the dependency on manual initialization found in many active contour models and to enable automatic segmentation, a simple initialization method for the level set function is built based on the Otsu threshold. In addition, the need for the costly re-initialization procedure is completely eliminated. Results Experimental results on both synthetic and real images demonstrate that the proposed ERBLS model is very robust and efficient. Compared with the well-known local binary fitting (LBF) model, our method is much more computationally efficient and much less sensitive to the initial contour. The proposed method has also been applied to 56 patient data sets and produced very promising results. Conclusions An improved level set method suitable for vertebra CT image segmentation is proposed. It can segment vertebra CT images with blurry or discontinuous edges and internal inhomogeneity, with no need for re-initialization. PMID:23714300
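
    The Otsu-threshold initialization mentioned above can be sketched as follows, using the standard between-class-variance formulation and a toy image (an illustrative sketch, not the authors' exact implementation):

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Otsu's threshold: maximize the between-class variance of the histogram."""
    hist, edges = np.histogram(img, bins=nbins)
    hist = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(hist)                    # class-0 probability
    w1 = 1.0 - w0
    mu = np.cumsum(hist * centers)          # cumulative mean
    mu_t = mu[-1]                           # total mean
    with np.errstate(divide='ignore', invalid='ignore'):
        sigma_b = (mu_t * w0 - mu)**2 / (w0 * w1)   # between-class variance
    sigma_b[~np.isfinite(sigma_b)] = 0
    return centers[np.argmax(sigma_b)]

# Toy "CT slice": bright object on dark background, plus noise.
rng = np.random.default_rng(0)
img = rng.normal(0.2, 0.05, (64, 64))
img[20:44, 20:44] = rng.normal(0.8, 0.05, (24, 24))

t = otsu_threshold(img)
# Binary-step initial level set: positive inside, negative outside.
phi0 = np.where(img > t, 1.0, -1.0)
```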

  1. Developing a pressure ulcer risk factor minimum data set and risk assessment framework.

    PubMed

    Coleman, Susanne; Nelson, E Andrea; Keen, Justin; Wilson, Lyn; McGinnis, Elizabeth; Dealey, Carol; Stubbs, Nikki; Muir, Delia; Farrin, Amanda; Dowding, Dawn; Schols, Jos M G A; Cuddigan, Janet; Berlowitz, Dan; Jude, Edward; Vowden, Peter; Bader, Dan L; Gefen, Amit; Oomens, Cees W J; Schoonhoven, Lisette; Nixon, Jane

    2014-10-01

    To agree a draft pressure ulcer risk factor Minimum Data Set to underpin the development of a new evidenced-based Risk Assessment Framework. A recent systematic review identified the need for a pressure ulcer risk factor Minimum Data Set and development and validation of an evidenced-based pressure ulcer Risk Assessment Framework. This was undertaken through the Pressure UlceR Programme Of reSEarch (RP-PG-0407-10056), funded by the National Institute for Health Research and incorporates five phases. This article reports phase two, a consensus study. Consensus study. A modified nominal group technique based on the Research and Development/University of California at Los Angeles appropriateness method. This incorporated an expert group, review of the evidence and the views of a Patient and Public Involvement service user group. Data were collected December 2010-December 2011. The risk factors and assessment items of the Minimum Data Set (including immobility, pressure ulcer and skin status, perfusion, diabetes, skin moisture, sensory perception and nutrition) were agreed. In addition, a draft Risk Assessment Framework incorporating all Minimum Data Set items was developed, comprising a two stage assessment process (screening and detailed full assessment) and decision pathways. The draft Risk Assessment Framework will undergo further design and pre-testing with clinical nurses to assess and improve its usability. It will then be evaluated in clinical practice to assess its validity and reliability. The Minimum Data Set could be used in future for large scale risk factor studies informing refinement of the Risk Assessment Framework. © 2014 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.

  2. Developing a pressure ulcer risk factor minimum data set and risk assessment framework

    PubMed Central

    Coleman, Susanne; Nelson, E Andrea; Keen, Justin; Wilson, Lyn; McGinnis, Elizabeth; Dealey, Carol; Stubbs, Nikki; Muir, Delia; Farrin, Amanda; Dowding, Dawn; Schols, Jos MGA; Cuddigan, Janet; Berlowitz, Dan; Jude, Edward; Vowden, Peter; Bader, Dan L; Gefen, Amit; Oomens, Cees WJ; Schoonhoven, Lisette; Nixon, Jane

    2014-01-01

    Aim To agree a draft pressure ulcer risk factor Minimum Data Set to underpin the development of a new evidenced-based Risk Assessment Framework. Background A recent systematic review identified the need for a pressure ulcer risk factor Minimum Data Set and development and validation of an evidenced-based pressure ulcer Risk Assessment Framework. This was undertaken through the Pressure UlceR Programme Of reSEarch (RP-PG-0407-10056), funded by the National Institute for Health Research and incorporates five phases. This article reports phase two, a consensus study. Design Consensus study. Method A modified nominal group technique based on the Research and Development/University of California at Los Angeles appropriateness method. This incorporated an expert group, review of the evidence and the views of a Patient and Public Involvement service user group. Data were collected December 2010–December 2011. Findings The risk factors and assessment items of the Minimum Data Set (including immobility, pressure ulcer and skin status, perfusion, diabetes, skin moisture, sensory perception and nutrition) were agreed. In addition, a draft Risk Assessment Framework incorporating all Minimum Data Set items was developed, comprising a two stage assessment process (screening and detailed full assessment) and decision pathways. Conclusion The draft Risk Assessment Framework will undergo further design and pre-testing with clinical nurses to assess and improve its usability. It will then be evaluated in clinical practice to assess its validity and reliability. The Minimum Data Set could be used in future for large scale risk factor studies informing refinement of the Risk Assessment Framework. PMID:24845398

  3. Bi-directional evolutionary level set method for topology optimization

    NASA Astrophysics Data System (ADS)

    Zhu, Benliang; Zhang, Xianmin; Fatikow, Sergej; Wang, Nianfeng

    2015-03-01

    A bi-directional evolutionary level set method for solving topology optimization problems is presented in this article. The proposed method has three main advantages over the standard level set method. First, new holes can be automatically generated in the design domain during the optimization process. Second, the dependency of the obtained optimized configurations upon the initial configurations is eliminated; optimized configurations can be obtained even when starting from a minimal initial guess. Third, the method can be easily implemented and is computationally more efficient. The validity of the proposed method is tested on the mean compliance minimization problem and the compliant mechanism topology optimization problem.

  4. Geologic setting of the low-level burial grounds

    SciTech Connect

    Lindsey, K.A.; Jaeger, G.K.; Slate, J.L.; Swett, K.J.; Mercer, R.B.

    1994-10-13

    This report describes the regional and site-specific geology of the Hanford Site's low-level burial grounds in the 200 East and West Areas. The report incorporates data from boreholes across the entire 200 Areas, integrating the geology of this area into a single framework. Geologic cross-sections, isopach maps, and structure contour maps of all major geological units from the top of the Columbia River Basalt Group to the surface are included. The physical properties and characteristics of the major suprabasalt sedimentary units are also discussed.

  5. Modeling wildland fire propagation with level set methods

    Treesearch

    V. Mallet; D.E Keyes; F.E. Fendell

    2009-01-01

    Level set methods are versatile and extensible techniques for general front tracking problems, including the practically important problem of predicting the advance of a fire front across expanses of surface vegetation. Given a rule, empirical or otherwise, to specify the rate of advance of an infinitesimal segment of fire front arc normal to itself (i.e., given the...

  6. A Quadrature Free Discontinuous Galerkin Conservative Level Set Scheme

    NASA Astrophysics Data System (ADS)

    Czajkowski, Mark; Desjardins, Olivier

    2010-11-01

    In an effort to improve the scalability and accuracy of the Accurate Conservative Level Set (ACLS) scheme [Desjardins et al., J COMPUT PHYS 227 (2008)], a scheme based on the quadrature free discontinuous Galerkin (DG) methodology has been developed. ACLS relies on a hyperbolic tangent level set function that is transported and reinitialized using conservative schemes in order to alleviate mass conservation issues known to plague level set methods. DG allows for an arbitrarily high order representation of the interface by using a basis of high order polynomials while only using data from the faces of neighboring cells. The small stencil allows DG to have excellent parallel scalability. The diffusion term present in the conservative reinitialization equation is handled using the local DG method [Cockburn et al., SIAM J NUMER ANAL 39, (2001)], while the normals are computed from a limited form of the level set function in order to avoid spurious oscillations. The resulting scheme is shown to be robust, accurate, and highly scalable, making it a method of choice for large-scale simulations of multiphase flows with complex interfacial topology.
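
    The hyperbolic tangent level set function at the heart of ACLS can be sketched as follows (the grid and the thickness parameter eps are illustrative choices):

```python
import numpy as np

# Conservative level set profile: a hyperbolic tangent of the signed
# distance phi, with eps controlling the interface thickness.
x = np.linspace(-1, 1, 401)
phi = x - 0.2                      # signed distance to an interface at x = 0.2
eps = 2 * (x[1] - x[0])            # thickness ~ a couple of grid cells
psi = 0.5 * (np.tanh(phi / (2 * eps)) + 1.0)

# psi goes from 0 to 1 across a thin layer; the 0.5 isocontour is the interface.
interface = x[np.argmin(np.abs(psi - 0.5))]
```

    Because psi is bounded and nearly piecewise-constant, transporting it conservatively keeps the enclosed "mass" (the integral of psi) under control, which is the property the scheme exploits.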

  7. Thermal Infrared Pedestrian Image Segmentation Using Level Set Method

    PubMed Central

    Qiao, Yulong; Wei, Ziwei; Zhao, Yan

    2017-01-01

    The edge-based active contour model has been one of the most influential models in image segmentation, in which the level set method is usually used to minimize the active contour energy function and then find the desired contour. However, for infrared thermal pedestrian images, the traditional level set-based method that uses gradient information as the edge indicator function fails to provide a satisfactory boundary of the target, due to poorly defined boundaries and intensity inhomogeneity. Therefore, we propose a novel level set-based thermal infrared image segmentation method that is able to deal with these problems. Specifically, we first explore the one-bit transform convolution kernel and define a soft mark, which enhances the target boundary. Then we propose a weight function that adaptively adjusts the intensity of the infrared image so as to reduce the intensity inhomogeneity. In the level set formulation, these processes adaptively adjust the edge indicator function, so that the evolving curve stops at the target boundary. We conduct experiments on benchmark infrared pedestrian images and compare the introduced method with state-of-the-art approaches to demonstrate the excellent performance of the proposed method. PMID:28783080

  8. Counselors' Job Satisfaction across Education Levels, Settings, and Specialties

    ERIC Educational Resources Information Center

    Gambrell, Crista E.

    2010-01-01

    This study examined counselor satisfaction across education levels (Masters and Doctorate), work settings (private practice and institutions), and specializations (mental health counselors, school counselors, counselor educators, and creative arts/other counselors). Counseling professionals were surveyed across these variables to…

  9. Optic disc segmentation: level set methods and blood vessels inpainting

    NASA Astrophysics Data System (ADS)

    Almazroa, A.; Sun, Weiwei; Alodhayb, Sami; Raahemifar, Kaamran; Lakshminarayanan, Vasudevan

    2017-03-01

    Segmenting the optic disc (OD) is an important and essential step in creating a frame of reference for diagnosing optic nerve head (ONH) pathology such as glaucoma. Therefore, a reliable OD segmentation technique is necessary for automatic screening of ONH abnormalities. The main contribution of this paper is in presenting a novel OD segmentation algorithm based on applying a level set method on a localized OD image. To prevent the blood vessels from interfering with the level set process, an inpainting technique is applied. The algorithm is evaluated using a new retinal fundus image dataset called RIGA (Retinal Images for Glaucoma Analysis). In the case of low quality images, a double level set is applied in which the first level set is considered to be a localization for the OD. Five hundred and fifty images are used to test the algorithm accuracy as well as its agreement with manual markings by six ophthalmologists. The accuracy of the algorithm in marking the optic disc area and centroid is 83.9%, and the best agreement is observed between the results of the algorithm and manual markings in 379 images.

  10. Thermal Infrared Pedestrian Image Segmentation Using Level Set Method.

    PubMed

    Qiao, Yulong; Wei, Ziwei; Zhao, Yan

    2017-08-06

    The edge-based active contour model has been one of the most influential models in image segmentation, in which the level set method is usually used to minimize the active contour energy function and then find the desired contour. However, for infrared thermal pedestrian images, the traditional level set-based method that uses gradient information as the edge indicator function fails to provide a satisfactory boundary of the target, due to poorly defined boundaries and intensity inhomogeneity. Therefore, we propose a novel level set-based thermal infrared image segmentation method that is able to deal with these problems. Specifically, we first explore the one-bit transform convolution kernel and define a soft mark, which enhances the target boundary. Then we propose a weight function that adaptively adjusts the intensity of the infrared image so as to reduce the intensity inhomogeneity. In the level set formulation, these processes adaptively adjust the edge indicator function, so that the evolving curve stops at the target boundary. We conduct experiments on benchmark infrared pedestrian images and compare the introduced method with state-of-the-art approaches to demonstrate the excellent performance of the proposed method.

  11. Identifying Attributes of CO2 Leakage Zones in Shallow Aquifers Using a Parametric Level Set Method

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.; Islam, A.; Wheeler, M.

    2016-12-01

    Leakage through abandoned wells and geologic faults poses the greatest risk to CO2 storage permanence. For shallow aquifers, secondary CO2 plumes emanating from the leak zones may go undetected for a sustained period of time and have the greatest potential to cause large-scale and long-term environmental impacts. Identification of the attributes of leak zones, including their shape, location, and strength, is required for proper environmental risk assessment. This study applies a parametric level set (PaLS) method to characterize the leakage zone. Level set methods are appealing for tracking topological changes and recovering unknown shapes of objects. However, level set evolution using the conventional level set methods is challenging. In PaLS, the level set function is approximated using a weighted sum of basis functions, and the level set evolution problem is replaced by an optimization problem. The efficacy of PaLS is demonstrated by recovering the source zone created by CO2 leakage into a carbonate aquifer. Our results show that PaLS is a robust source identification method that can recover the approximate source locations in the presence of measurement errors, model parameter uncertainty, and inaccurate initial guesses of source flux strengths. The PaLS inversion framework introduced in this work is generic and can be adapted for any reactive transport model by switching the pre- and post-processing routines.
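
    The PaLS representation, in which the level set function is a weighted sum of basis functions so that shape evolution becomes optimization over the weights, can be sketched with Gaussian radial basis functions (the centers, width, and 0.5 offset below are illustrative choices, not the paper's exact parameterization):

```python
import numpy as np

def pals_phi(points, centers, alpha, width=0.3):
    """Parametric level set: phi(x) = sum_k alpha_k * RBF_k(x) - offset.
    points: (N, 2) query locations; centers: (K, 2); alpha: (K,) weights."""
    d2 = ((points[:, None, :] - centers[None, :, :])**2).sum(-1)
    basis = np.exp(-d2 / (2 * width**2))      # Gaussian RBFs
    return basis @ alpha - 0.5                # zero level set = shape boundary

centers = np.array([[0.0, 0.0], [0.6, 0.0]])  # hypothetical basis centers
alpha = np.array([1.0, 0.8])                  # weights to be optimized

grid = np.stack(np.meshgrid(np.linspace(-1, 1, 50),
                            np.linspace(-1, 1, 50),
                            indexing='ij'), -1).reshape(-1, 2)
inside = pals_phi(grid, centers, alpha) > 0   # leak-zone indicator
```

    Inversion then adjusts alpha (and possibly centers/widths) to fit the data, rather than evolving phi point by point on a grid.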

  12. A variational approach to path planning in three dimensions using level set methods

    NASA Astrophysics Data System (ADS)

    Cecil, Thomas; Marthaler, Daniel E.

    2006-01-01

    In this paper we extend the two-dimensional methods set forth in [T. Cecil, D. Marthaler, A variational approach to search and path planning using level set methods, UCLA CAM Report, 04-61, 2004], proposing a variational approach to a path planning problem in three dimensions using a level set framework. After defining an energy integral over the path, we use gradient flow on the defined energy and evolve the entire path until a locally optimal steady state is reached. We follow the framework for motion of curves in three dimensions set forth in [P. Burchard, L.-T. Cheng, B. Merriman, S. Osher, Motion of curves in three spatial dimensions using a level set approach, J. Comput. Phys. 170(2) (2001) 720-741], modified appropriately to take into account that we allow for paths with positive, varying widths. Applications of this method extend to robotic motion and visibility problems, for example. Numerical methods and algorithms are given, and examples are presented.
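    The underlying variational idea, gradient flow on an energy defined over the whole path until a locally optimal steady state is reached, can be illustrated on a simple discretized curve. This toy sketch uses marker points and numerical gradients rather than the level set representation of the paper, and the obstacle potential is hypothetical:

```python
import numpy as np

def path_energy(path, potential):
    """Stretching energy of the polyline plus a potential cost per node."""
    stretch = (np.diff(path, axis=0) ** 2).sum()
    return stretch + sum(potential(p) for p in path)

def descend(path, potential, steps=300, lr=0.01, h=1e-5):
    """Gradient flow on interior points; endpoints stay fixed."""
    path = path.copy()
    for _ in range(steps):
        grad = np.zeros_like(path)
        for i in range(1, len(path) - 1):
            for k in range(path.shape[1]):
                e0 = path.copy(); e0[i, k] += h
                e1 = path.copy(); e1[i, k] -= h
                grad[i, k] = (path_energy(e0, potential) -
                              path_energy(e1, potential)) / (2 * h)
        path -= lr * grad
    return path

# Hypothetical obstacle near the origin that the path should bend around.
obstacle = lambda p: 5.0 * np.exp(-4.0 * (p ** 2).sum())
init = np.stack([np.linspace(-1, 1, 9),
                 np.full(9, 0.1),
                 np.zeros(9)], axis=1)
final = descend(init, obstacle)
```

    The descent lowers the total energy while the interior of the path bows away from the obstacle, the discrete analogue of evolving the entire path to a locally optimal steady state.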

  13. The distortion of the level set gradient under advection

    NASA Astrophysics Data System (ADS)

    Trujillo, Mario F.; Anumolu, Lakshman; Ryddner, Doug

    2017-04-01

    The practice of periodically reinitializing the level set function is well established in two-phase flow applications as a way of controlling the growth of anomalies and/or numerical errors. In the present work, the underlying roots of this anomalous growth are studied, where it is established that the augmentation of the magnitude of the level set gradient (|∇ϕ|) is directly connected to the nature of the flow field; hence, it is not necessarily the result of some type of numerical error. More specifically, for a general flow field advecting the level set function, it is shown that the eigenpairs of the strain rate tensor are responsible for the rate of change of |∇ϕ| along a fluid particle trajectory. This straining action affects not only the magnitude of |∇ϕ| but also the general character of ϕ, and consequently contributes to the growth in numerical error. These numerical consequences are examined by adopting the Gradient Augmented Level Set method. Specifically, it is shown that the local error for ϕ is directly connected to the size of |∇ϕ| and to the magnitude of the second- and fourth-order derivatives of ϕ. These analytical findings are subsequently supported by various examples. The role of reinitialization is discussed, where it is shown that in cases where the zero level set contour has a local radius of curvature below the local grid resolution, reinitialization exacerbates rather than diminishes the degree of error. For other cases, where the interface is well resolved, reinitialization helps stabilize the error as intended.
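    The claim that the flow field itself, through the strain rate tensor, drives the growth of |∇ϕ| can be checked on a pure straining flow. Along a trajectory the gradient obeys d(∇ϕ)/dt = −(∇u)ᵀ∇ϕ, so for u = (ax, −ay) and ϕ = y the gradient magnitude grows like e^{at} with no numerical error involved. A small sketch (the flow and parameters are illustrative, not from the paper):

```python
import numpy as np

# Material evolution of the level set gradient g = grad(phi) along a
# fluid trajectory: dg/dt = -(grad u)^T g.  For the pure straining flow
# u = (a*x, -a*y), the compressive direction stretches |grad phi|
# exponentially.
a = 0.5
grad_u = np.array([[a, 0.0], [0.0, -a]])   # (grad u)_{ij} = du_i/dx_j

g = np.array([0.0, 1.0])    # phi = y  =>  grad phi = (0, 1)
dt, T = 1e-4, 2.0
for _ in range(int(T / dt)):
    g = g + dt * (-grad_u.T @ g)            # forward Euler

growth = np.linalg.norm(g)                  # approaches exp(a*T) = e
```

    The stretched gradient is exactly the anomalous growth that periodic reinitialization is meant to remove.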

  14. Level set method coupled with Energy Image features for brain MR image segmentation.

    PubMed

    Punga, Mirela Visan; Gaurav, Rahul; Moraru, Luminita

    2014-06-01

    Noise and intensity inhomogeneity are considered among the major drawbacks in the field of brain magnetic resonance (MR) image segmentation. This paper introduces an energy image feature approach for intensity inhomogeneity correction. Our segmentation approach takes advantage of image features and preserves the advantages of level set methods in the region-based active contour framework. The energy image feature is a new image obtained from the original image by replacing each pixel value with the local energy value computed over a 3×3 mask. The performance and utility of the energy image features were tested and compared through two different variants of level set methods: the local and global intensity fitting method, and the selective binary and Gaussian filtering regularized level set method. The reported results demonstrate the flexibility of the energy image feature to adapt to the level set segmentation framework and to perform the challenging task of brain lesion segmentation in a rather robust way.
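    A local energy image of the kind described can be sketched as follows. The record does not give the exact energy definition, so summing squared intensities over the 3×3 mask is assumed here as one common choice:

```python
import numpy as np

def energy_image(img):
    """Replace each pixel by the local energy (here: sum of squared
    intensities) computed over its 3x3 neighbourhood.

    Edge pixels are handled by replicating the border ("edge" padding),
    so every pixel sees a full 3x3 window.
    """
    img = img.astype(float)
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img)
    for dy in range(3):
        for dx in range(3):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]] ** 2
    return out

# On a constant image every 3x3 window sums 9 * 2^2 = 36.
flat = np.full((5, 5), 2.0)
e = energy_image(flat)
```

    The resulting energy image is then fed to the level set segmentation in place of the raw intensities.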

  15. Development of a technical assistance framework for building organizational capacity of health programs in resource-limited settings.

    PubMed

    Reyes, E Michael; Sharma, Anjali; Thomas, Kate K; Kuehn, Chuck; Morales, José Rafael

    2014-09-17

    Little information exists on the technical assistance needs of local indigenous organizations charged with managing HIV care and treatment programs funded by the US President's Emergency Plan for AIDS Relief (PEPFAR). This paper describes the methods used to adapt the Primary Care Assessment Tool (PCAT) framework, which has successfully strengthened HIV primary care services in the US, into one that could strengthen the capacity of local partners to deliver priority health programs in resource-constrained settings by identifying their specific technical assistance needs. Qualitative methods and inductive reasoning approaches were used to conceptualize and adapt the new Clinical Assessment for Systems Strengthening (ClASS) framework. Stakeholder interviews, comparisons of existing assessment tools, and a pilot test helped determine the overall ClASS framework for use in low-resource settings. The framework was further refined one year post-ClASS implementation. Stakeholder interviews, assessment of existing tools, a pilot process and the one-year post-implementation assessment informed the adaptation of the ClASS framework for assessing and strengthening technical and managerial capacities of health programs at three levels: international partner, local indigenous partner, and local partner treatment facility. The PCAT focus on organizational strengths and systems strengthening was retained and implemented in the ClASS framework and approach. A modular format was chosen to allow the use of administrative, fiscal and clinical modules in any combination and to insert new modules as needed by programs. The pilot led to refined pre-visit planning, informed review team composition, increased visit duration, and restructured modules. A web-based toolkit was developed to capture three years of experiential learning; this kit can also be used for independent implementation of the ClASS framework.
A systematic adaptation process has produced a qualitative framework that can

  16. Ambient ultraviolet radiation levels in public shade settings.

    PubMed

    Moise, A F; Aynsley, R

    1999-11-01

    As people become better informed about the harmful effects of prolonged exposure to solar ultraviolet radiation (UVR, 280-400 nm) they will seek the protection of shade, particularly in tropical locations such as Townsville (19 degrees south). Using broad-band radiation sensors for solar ultraviolet-B (280-315 nm), ultraviolet-A (315-400 nm) and daylight (400-800 nm) radiation, the exposure levels were measured in both the horizontal (shaded and unshaded) and vertical (shaded and unshaded) directions. The measurements were conducted at eight locations (shade settings) in Townsville during the period between December 1997 (summer) and May 1998 (beginning of winter). The quality of protection was assessed by the ratio of unshaded to shaded radiation exposure, the UVB/shade protection ratio (UVB-SPR). The UVB-SPR varies considerably between the different shade settings, with a beach umbrella showing the least protection and dense foliage the highest protection. The roof of a house verandah provides little protection if the verandah catches the afternoon sun. Increasing cloud cover decreases the UVB-SPR for all settings because of the increase in the diffuse fraction of the radiation. Only one setting provided a UVB-SPR of 15 or higher, as suggested for protective shading against solar UVB radiation. Shade from direct sunlight alone does not provide enough protection against high levels of solar UVR. Apart from the transmission qualities of the shading material, it is the construction of the whole shade setting that determines the exposure levels underneath. A shade structure with enough overhang is recommended so that high levels of scattered radiation do not reach the skin.

  17. A linear optimal transportation framework for quantifying and visualizing variations in sets of images

    PubMed Central

    Wang, Wei; Slepčev, Dejan; Basu, Saurav; Ozolek, John A.

    2012-01-01

    Transportation-based metrics have long been applied to analyze images, especially where one can interpret the pixel intensities (or derived quantities) as a distribution of ‘mass’ that can be transported without strict geometric constraints. Here we describe a new transportation-based framework for analyzing sets of images. More specifically, we describe a new transportation-related distance between pairs of images, which we denote as linear optimal transportation (LOT). The LOT can be used directly on pixel intensities, and is based on a linearized version of the Kantorovich-Wasserstein metric (an optimal transportation distance, as is the earth mover’s distance). The new framework is especially well suited for computing all pairwise distances for a large database of images efficiently, and thus it can be used for pattern recognition in sets of images. In addition, the new LOT framework also allows for an isometric linear embedding, greatly facilitating the ability to visualize discriminant information in different classes of images. We demonstrate the application of the framework to several tasks such as discriminating nuclear chromatin patterns in cancer cells, decoding differences in facial expressions, galaxy morphologies, as well as subcellular protein distributions. PMID:23729991
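    The optimal transportation distance underlying LOT reduces, in one dimension, to the L1 distance between cumulative distribution functions, which is a convenient way to build intuition for the metric. A sketch on toy histograms (not the LOT embedding itself):

```python
import numpy as np

def emd_1d(p, q, positions):
    """1-D earth mover's distance between two histograms p and q on the
    same grid: the integral of |CDF_p - CDF_q| over the positions."""
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()           # normalize to unit mass
    cdf_diff = np.cumsum(p - q)
    widths = np.diff(positions, append=positions[-1])
    return float(np.sum(np.abs(cdf_diff) * widths))

# Moving a unit spike by one grid cell costs exactly one cell width;
# moving it two cells costs two.
x = np.arange(5, dtype=float)
d1 = emd_1d([1, 0, 0, 0, 0], [0, 1, 0, 0, 0], x)
d2 = emd_1d([1, 0, 0, 0, 0], [0, 0, 1, 0, 0], x)
```

    Unlike pointwise metrics, the cost grows with how far the mass must travel, which is exactly the behaviour that makes transport distances informative for comparing images.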

  18. A theoretical and computational setting for a geometrically nonlinear gradient damage modelling framework

    NASA Astrophysics Data System (ADS)

    Nedjar, B.

    The present work deals with the extension to the geometrically nonlinear case of recently proposed ideas on elastic- and elastoplastic-damage modelling frameworks within the infinitesimal theory. The particularity of these models is that the damage part of the modelling involves the gradient of the damage quantity which, together with the equations of motion, ensues from a new formulation of the principle of virtual power. It is shown how the thermodynamics of irreversible processes is crucial in the characterization of the dissipative phenomena and in setting convenient forms for the constitutive relations. On the numerical side, we discuss the problem of numerically integrating these equations, and the implementation within the context of the finite element method is described in detail. Finally, we present a set of representative numerical simulations to illustrate the effectiveness of the proposed framework.

  19. Fusion of Imperfect Information in the Unified Framework of Random Sets Theory: Application to Target Identification

    DTIC Science & Technology

    2007-11-01

    Florea, Mihai Cristian (Informatique WGZ inc.); Jousselme, Anne-Laure; Bossé, Éloi. Fusion of imperfect information in the unified framework of random sets theory: Application to target identification. Defence R&D Canada – Valcartier, Technical Report DRDC Valcartier TR 2003-319, November 2007.

  20. A chance-constrained programming level set method for longitudinal segmentation of lung tumors in CT.

    PubMed

    Rouchdy, Youssef; Bloch, Isabelle

    2011-01-01

    This paper presents a novel stochastic level set method for the longitudinal tracking of lung tumors in computed tomography (CT). The proposed model addresses the limitations of registration-based and segmentation-based methods for longitudinal tumor tracking. It combines the advantages of each approach using a new probabilistic framework, namely Chance-Constrained Programming (CCP). Lung tumors can shrink or grow over time, which can be reflected in large changes of shape, appearance and volume in CT images. Traditional level set methods with a priori knowledge about shape are not suitable, since the tumors undergo random and large changes in shape. Our CCP level set model allows us to introduce a flexible prior to track structures with a highly variable shape by permitting a constraint violation of the prior up to a specified probability level. The chance constraints are computed from two points given by the user or from segmented tumors from a reference image. The reference image can be one of the images studied or an external template. We present a numerical scheme to approximate the solution of the proposed model and apply it to track lung tumors in CT. Finally, we compare our approach with a Bayesian level set. The CCP level set model gives the best results: it is more coherent with the manual segmentation.
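    The key CCP ingredient, allowing a constraint to be violated with probability at most α, has a closed-form deterministic equivalent when the uncertainty is Gaussian. A toy scalar example (not the paper's segmentation model; parameters are illustrative), verified by Monte Carlo:

```python
import random
from statistics import NormalDist

# Chance constraint: require Pr(xi <= x) >= 1 - alpha for xi ~ N(mu, sigma).
# Its deterministic equivalent is the single inequality
#     x >= mu + z_{1-alpha} * sigma,
# so the smallest feasible x is the (1 - alpha) quantile of xi.
mu, sigma, alpha = 0.0, 1.0, 0.05
x = NormalDist(mu, sigma).inv_cdf(1 - alpha)    # ~ 1.6449

# Monte Carlo check of the violation probability.
rng = random.Random(0)
n = 200_000
violations = sum(rng.gauss(mu, sigma) > x for _ in range(n))
viol_rate = violations / n                      # should be close to alpha
```

    This is the mechanism that lets a CCP formulation keep a shape prior "soft": the prior constraint may fail, but only up to the prescribed probability level.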

  1. Framework development for the assessment of interprofessional teamwork in mental health settings.

    PubMed

    Tomizawa, Ryoko; Shigeta, Masahiro; Reeves, Scott

    2017-01-01

    In mental health settings, interprofessional practice is regarded as a comprehensive approach to prevent relapse and manage chronic conditions through the practice of various teamwork interventions. To reinforce the potential of interprofessional teamwork, it is recommended that theories or conceptual frameworks be employed. Such approaches, however, remain little used for assessing the quality of interprofessional teamwork in mental health settings. This article aimed to present a new conceptual framework for the assessment of interprofessional teamwork based on the findings of a scoping review of the literature. This review was undertaken to identify conceptual frameworks utilised in interprofessional teamwork in mental health settings. After a review of 952 articles, the methodological characteristics extracted from 12 articles were considered. The included studies were synthesised into the Donabedian structure-process-outcome model. The findings revealed that structural issues comprised three elements: professional characteristics, client-care characteristics, and contextual characteristics in organisations. Process issues comprised two elements: team mechanisms and community-oriented services. Finally, outcome issues comprised the following elements: clients' outcomes and professionals' outcomes. The review findings suggested possibilities for further development of how to assess the quality of interprofessional teamwork and provided information about what specific approach is required to improve interprofessional teamwork. Future research should draw on various areas and cultures to clarify the framework's potential for adaptation.

  2. The constrained reinitialization equation for level set methods

    NASA Astrophysics Data System (ADS)

    Hartmann, Daniel; Meinke, Matthias; Schröder, Wolfgang

    2010-03-01

    Based on the constrained reinitialization scheme [D. Hartmann, M. Meinke, W. Schröder, Differential equation based constrained reinitialization for level set methods, J. Comput. Phys. 227 (2008) 6821-6845] a new constrained reinitialization equation incorporating a forcing term is introduced. Two formulations for high-order constrained reinitialization (HCR) are presented combining the simplicity and generality of the original reinitialization equation [M. Sussman, P. Smereka, S. Osher, A level set approach for computing solutions to incompressible two-phase flow, J. Comput. Phys. 114 (1994) 146-159] in terms of high-order standard discretization and the accuracy of the constrained reinitialization scheme in terms of interface displacement. The novel HCR schemes represent simple extensions of standard implementations of the original reinitialization equation. The results evidence the significantly increased accuracy and robustness of the novel schemes.
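    The original reinitialization equation referenced above, φ_τ = S(φ₀)(1 − |∇φ|), can be sketched in one dimension with Godunov upwinding; the forcing and constraint terms that distinguish the HCR schemes are not included in this minimal version:

```python
import numpy as np

def reinitialize_1d(phi0, dx, steps=500):
    """Iterate phi_tau = sign(phi0) * (1 - |phi_x|) to steady state,
    turning phi0 into a signed distance function with the same zero
    crossing.  Godunov upwinding selects the correct one-sided
    differences on each side of the interface."""
    phi = phi0.copy()
    s = np.sign(phi0)
    dtau = 0.5 * dx                       # CFL-limited pseudo-time step
    for _ in range(steps):
        a = np.empty_like(phi); b = np.empty_like(phi)
        a[1:] = (phi[1:] - phi[:-1]) / dx; a[0] = a[1]     # backward diff
        b[:-1] = (phi[1:] - phi[:-1]) / dx; b[-1] = b[-2]  # forward diff
        grad_pos = np.sqrt(np.maximum(np.maximum(a, 0.0) ** 2,
                                      np.minimum(b, 0.0) ** 2))
        grad_neg = np.sqrt(np.maximum(np.minimum(a, 0.0) ** 2,
                                      np.maximum(b, 0.0) ** 2))
        grad = np.where(s > 0, grad_pos, grad_neg)
        phi = phi - dtau * s * (grad - 1.0)
    return phi

# A stretched profile 0.5*x has the right zero set but |phi_x| = 0.5;
# reinitialization restores the signed distance x.  The grid is built
# so that a node sits exactly on the interface.
dx = 0.02
x = (np.arange(101) - 50) * dx
phi = reinitialize_1d(0.5 * x, dx)
```

    With a grid node exactly on the zero crossing this converges to the signed distance; when the interface falls between nodes, plain upwinding displaces it slightly, which is precisely the interface-displacement error the constrained reinitialization scheme targets.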

  3. Skull defect reconstruction based on a new hybrid level set.

    PubMed

    Zhang, Ziqun; Zhang, Ran; Song, Zhijian

    2014-01-01

    Skull defect reconstruction is an important aspect of surgical repair. Historically, a skull defect prosthesis was created by the mirroring technique, surface fitting, or formed templates. These methods are not based on the anatomy of the individual patient's skull, and therefore the prosthesis cannot precisely correct the defect. This study presented a new hybrid level set model, taking into account both global region information, for optimization, and local edge information, for accuracy, while avoiding re-initialization during the evolution of the level set function. Based on the new method, a skull defect was reconstructed, and the skull prosthesis was produced by rapid prototyping technology. This resulted in a skull defect prosthesis that matched the skull defect well, with excellent individual adaptation.

  4. High Frequency Acoustic Propagation using Level Set Methods

    DTIC Science & Technology

    2007-01-01

    Level set methods (LSM) provide a solution of the high frequency approximation to the wave equation. Traditional solutions to the Eikonal equation in high frequency acoustics require many points per wavelength to resolve the wave, so ray tracing is the current standard for high frequency propagation modeling. With a level set method, the normal and curvature can be extracted at any point of the front from the level set function (provided the normal and curvature are well-defined at that point), and LSM may provide an alternative.

  5. A Level Set Filter for Speckle Reduction in SAR Images

    NASA Astrophysics Data System (ADS)

    Li, Hongga; Huang, Bo; Huang, Xiaoxia

    2010-12-01

    Despite much effort and significant progress in recent years, speckle removal for synthetic aperture radar (SAR) images is still a challenging problem in image processing. Unlike traditional noise filters, which are mainly based on local neighborhood statistical averaging or frequency transforms, the speckle reduction method proposed in this paper is based on the theory of level sets, a form of curvature flow propagation. First, based on a partial differential equation formulation, the Lee filter is cast as an anisotropic diffusion function, which we then deduce into a level set formulation. Incorporating level set flow into the method allows the front interface to propagate naturally with topological changes, with a speed proportional to the curvature of the intensity contours in the image. Hence, small speckle disappears quickly, while large-scale interfaces evolve slowly. Second, to preserve finer detailed structures while smoothing the speckle, the evolution is switched between minimum and maximum curvature speed depending on the scale of the speckle. The proposed method has been illustrated by experiments on a simulated image and on ERS-2 SAR images under different circumstances. Its advantages over traditional speckle reduction filter approaches have also been demonstrated.
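    The underlying mechanism, contours moving with curvature-dependent speed so that small speckle shrinks quickly while large flat interfaces barely move, can be sketched as plain mean curvature motion of the image intensities. The Lee-filter coupling and the min/max speed switching described above are omitted from this minimal sketch:

```python
import numpy as np

def curvature_flow_step(I, dt=0.1, eps=1e-8):
    """One explicit step of mean curvature motion,
        I_t = (Ixx*Iy^2 - 2*Ix*Iy*Ixy + Iyy*Ix^2) / (Ix^2 + Iy^2),
    which diffuses intensities along (not across) level lines: small
    closed contours such as speckle collapse quickly, while long flat
    edges evolve slowly."""
    Iy, Ix = np.gradient(I)
    Iyy, _ = np.gradient(Iy)
    Ixy, Ixx = np.gradient(Ix)
    num = Ixx * Iy ** 2 - 2.0 * Ix * Iy * Ixy + Iyy * Ix ** 2
    return I + dt * num / (Ix ** 2 + Iy ** 2 + eps)

# A few steps on pure noise: the high-frequency content is damped.
rng = np.random.default_rng(0)
noisy = rng.standard_normal((64, 64))
smoothed = noisy.copy()
for _ in range(20):
    smoothed = curvature_flow_step(smoothed)
```

    The scale selectivity (small contours vanish first) is what distinguishes curvature flow from isotropic averaging filters.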

  6. Level Set Approach to Anisotropic Wet Etching of Silicon

    PubMed Central

    Radjenović, Branislav; Radmilović-Radjenović, Marija; Mitrić, Miodrag

    2010-01-01

    In this paper a methodology for three-dimensional (3D) modeling and simulation of the profile evolution during anisotropic wet etching of silicon, based on the level set method, is presented. Etching rate anisotropy in silicon is modeled taking into account full silicon symmetry properties, by means of an interpolation technique using experimentally obtained values of the etching rates along thirteen principal and high-index directions in KOH solutions. The resulting level set equations are solved using an open source implementation of the sparse field method (the ITK library, developed in the medical image processing community), extended for the case of non-convex Hamiltonians. Simulation results for some interesting initial 3D shapes are shown, as well as some more practical examples illustrating anisotropic etching simulation in the presence of masks (a simple square aperture mask, convex corner undercutting and convex corner compensation, and formation of suspended structures). The obtained results show that the level set method can be used as an effective tool for wet etching process modeling, and that it is a viable alternative to the Cellular Automata method, which now prevails in simulations of the wet etching process. PMID:22399916

  7. A Level Set Approach to Image Segmentation With Intensity Inhomogeneity.

    PubMed

    Zhang, Kaihua; Zhang, Lei; Lam, Kin-Man; Zhang, David

    2016-02-01

    It is often difficult to accurately segment images with intensity inhomogeneity, because most representative algorithms are region-based and depend on intensity homogeneity of the object of interest. In this paper, we present a novel level set method for image segmentation in the presence of intensity inhomogeneity. The inhomogeneous objects are modeled as Gaussian distributions of different means and variances, and a sliding window is used to map the original image into another domain, where the intensity distribution of each object is still Gaussian but better separated. The means of the Gaussian distributions in the transformed domain can be adaptively estimated by multiplying a bias field with the original signal within the window. A maximum likelihood energy functional is then defined on the whole image region, which combines the bias field, the level set function, and the piecewise constant function approximating the true image signal. The proposed level set method can be directly applied to simultaneous segmentation and bias correction for 3 and 7T magnetic resonance images. Extensive evaluation on synthetic and real images demonstrates the superiority of the proposed method over other representative algorithms.

  8. Improvements to Level Set, Immersed Boundary methods for Interface Tracking

    NASA Astrophysics Data System (ADS)

    Vogl, Chris; Leveque, Randy

    2014-11-01

    It is not uncommon to find oneself solving a moving boundary problem under flow in the context of some application. Of particular interest is the case where the moving boundary exerts a curvature-dependent force on the liquid. Such a force arises when the boundary is resistant to bending or has surface tension. Stable numerical computation of the curvature can be difficult, as the curvature is often described in terms of high-order derivatives of either marker particle positions or a level set function. To address this issue, the level set method is modified to track not only the position of the boundary but the curvature as well. The definition of the signed-distance function used to modify the level set method is also used to develop an interpolation-free, closest-point method. These improvements are used to simulate a bending-resistant, inextensible boundary under shear flow to highlight area and volume conservation, as well as stable curvature calculation. Funded by an NSF MSPRF grant.

  9. A conceptual framework of computations in mid-level vision

    PubMed Central

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P.

    2014-01-01

    If a picture is worth a thousand words, as an English idiom goes, what should those words—or, rather, descriptors—capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations. PMID:25566044

  10. A conceptual framework of computations in mid-level vision.

    PubMed

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P

    2014-01-01

    If a picture is worth a thousand words, as an English idiom goes, what should those words-or, rather, descriptors-capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations.

  11. A level set method for materials with texturally equilibrated pores

    NASA Astrophysics Data System (ADS)

    Ghanbarzadeh, Soheil; Hesse, Marc A.; Prodanović, Maša

    2015-09-01

    Textural equilibrium controls the distribution of the liquid phase in many naturally occurring porous materials, such as partially molten rocks and alloys, salt-brine and ice-water systems. In these materials, pore geometry evolves to minimize the solid-liquid interfacial energy while maintaining a constant dihedral angle, θ, at solid-liquid contact lines. We present a level set method to compute an implicit representation of the liquid-solid interface in textural equilibrium with space-filling tessellations of multiple solid grains in three dimensions. Each grain is represented by a separate level set function, and interfacial energy minimization is achieved by evolving the solid-liquid interface under surface diffusion to a constant mean curvature surface. The liquid volume and dihedral angle constraints are added to the formulation using virtual convective and normal velocity terms. This results in an initial value problem for a system of non-linear coupled PDEs governing the evolution of the level sets for each grain, using the implicit representation of the solid grains as the initial condition. A domain decomposition scheme is devised to restrict the computational domain of each grain to a few grid points around the grain. The coupling between the interfaces is achieved at a higher level on the original computational domain. The spatial resolution of the discretization is improved through high-order spatial differentiation schemes and localization of computations through domain decomposition. Examples of three-dimensional solutions are obtained for different grain distribution networks that illustrate the geometric flexibility of the method.

  12. Crossing levels in systems ergonomics: a framework to support 'mesoergonomic' inquiry.

    PubMed

    Karsh, Ben-Tzion; Waterson, Patrick; Holden, Richard J

    2014-01-01

    In this paper we elaborate and articulate the need for what has been termed 'mesoergonomics'. In particular, we argue that the concept has the potential to bridge the gap between, and integrate, established work within the domains of micro- and macroergonomics. Mesoergonomics is defined as an open systems approach to human factors and ergonomics (HFE) theory and research whereby the relationship between variables in at least two different system levels or echelons is studied, and where the dependent variables are human factors and ergonomic constructs. We present a framework which can be used to structure a set of questions for future work and prompt further empirical and conceptual inquiry. The framework consists of four steps: (1) establishing the purpose of the mesoergonomic investigation; (2) selecting human factors and ergonomics variables; (3) selecting a specific type of mesoergonomic investigation; and (4) establishing relationships between system levels. In addition, we describe two case studies which illustrate the workings of the framework and the value of adopting a mesoergonomic perspective within HFE. The paper concludes with a set of issues which could form part of a future agenda for research within systems ergonomics.

  13. Implementing a framework for goal setting in community based stroke rehabilitation: a process evaluation

    PubMed Central

    2013-01-01

    Background Goal setting is considered ‘best practice’ in stroke rehabilitation; however, there is no consensus regarding the key components of goal setting interventions or how they should be optimally delivered in practice. We developed a theory-based goal setting and action planning framework (G-AP) to guide goal setting practice. G-AP has 4 stages: goal negotiation, goal setting, action planning & coping planning and appraisal & feedback. All stages are recorded in a patient-held record. In this study we examined the implementation, acceptability and perceived benefits of G-AP in one community rehabilitation team with people recovering from stroke. Methods G-AP was implemented for 6 months with 23 stroke patients. In-depth interviews with 8 patients and 8 health professionals were analysed thematically to investigate views of its implementation, acceptability and perceived benefits. Case notes of interviewed patients were analysed descriptively to assess the fidelity of G-AP implementation. Results G-AP was mostly implemented according to protocol, with deviations noted at the planning and the appraisal and feedback stages. Each stage was felt to make a useful contribution to the overall process; however, in practice, goal negotiation and goal setting merged into one stage, and the appraisal and feedback stage included an explicit decision-making component. Only two issues were raised regarding G-AP's acceptability: (i) health professionals were concerned about the impact of goal non-attainment on patients' well-being (patients did not share their concerns), and (ii) some patients and health professionals found the patient-held record unhelpful. G-AP was felt to have a positive impact on patient goal attainment and professional goal setting practice. Collaborative partnerships between health professionals and patients were apparent throughout the process. Conclusions G-AP has been perceived as both beneficial and broadly acceptable in one community

  14. Microarray missing data imputation based on a set theoretic framework and biological knowledge

    PubMed Central

    Gan, Xiangchao; Liew, Alan Wee-Chung; Yan, Hong

    2006-01-01

Gene expression measurements from microarrays usually suffer from the missing value problem. However, many data analysis methods require a complete data matrix. Although existing missing value imputation algorithms perform well in dealing with missing values, they also have their limitations. For example, some algorithms perform well only when strong local correlation exists in the data, while others provide the best estimates when the data are dominated by global structure. In addition, these algorithms do not take any biological constraints into account in their imputation. In this paper, we propose a set theoretic framework based on projection onto convex sets (POCS) for missing data imputation. POCS allows us to incorporate different types of a priori knowledge about missing values into the estimation process. The main idea of POCS is to formulate every piece of prior knowledge into a corresponding convex set and then use a convergence-guaranteed iterative procedure to obtain a solution in the intersection of all these sets. In this work, we design several convex sets that take the biological characteristics of the data into consideration: the first set mainly exploits the local correlation structure among genes in microarray data, while the second set captures the global correlation structure among arrays. The third set (actually a series of sets) exploits the biological phenomenon of synchronization loss in microarray experiments. In cyclic systems, synchronization loss is a common phenomenon, and we construct a series of sets based on this phenomenon for our POCS imputation algorithm. Experiments show that our algorithm achieves a significant reduction of error compared to the KNNimpute, SVDimpute and LSimpute methods. PMID:16549873
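The POCS iteration described above can be sketched in a few lines of numpy: alternately project the estimate onto each convex set until it lies (approximately) in their intersection. The three sets used below (a value-range box, a relaxed pull of missing entries toward row means as a stand-in for the paper's local-correlation set, and exact agreement with observed data) are illustrative assumptions, not the sets constructed in the paper:

```python
import numpy as np

def pocs_impute(X, mask, n_iter=200):
    """Toy POCS imputation. mask is True where X is observed.
    Each loop iteration applies one projection per convex set."""
    lo, hi = X[mask].min(), X[mask].max()
    Y = np.where(mask, X, X[mask].mean())        # initialize missing entries
    for _ in range(n_iter):
        # C1: entries lie in the observed value range (projection onto a box = clip)
        Y = np.clip(Y, lo, hi)
        # C2 (relaxed step): pull missing entries toward their row means,
        # an illustrative surrogate for the local gene-correlation set
        row_means = Y.mean(axis=1, keepdims=True)
        Y = np.where(mask, Y, 0.5 * Y + 0.5 * row_means)
        # C3: agree exactly with the observed data (projection = overwrite)
        Y = np.where(mask, X, Y)
    return Y
```

Because each step is a projection onto a convex set, the iteration converges to a point in the intersection whenever the intersection is nonempty.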

  15. An Adaptive Mesh Refined Gradient-Augmented Level Set Method

    NASA Astrophysics Data System (ADS)

    Nave, Jean-Christophe; Seibold, Benjamin; Rosales, Ruben

    2010-11-01

The Gradient-Augmented Level Set method (GA-LS) was introduced at the 62nd annual APS-DFD meeting by Nave et al. (arXiv:0905.3409). Leveraging the optimal locality and unconditional stability of the method, we present a natural extension to adaptive quad-tree meshes. The new method possesses many desirable features, such as improved mass conservation and reduced computational effort, and, owing to the optimal locality of the underlying GA-LS, it is very easy to implement. Several key benchmark tests will be presented to demonstrate the benefits of the approach and the overall simplicity of the algorithm.

  16. A geometric level set model for ultrasounds analysis

    SciTech Connect

    Sarti, A.; Malladi, R.

    1999-10-01

We propose a partial differential equation (PDE) for filtering and segmentation of echocardiographic images based on a geometric-driven scheme. The method allows edge-preserving image smoothing and a semi-automatic segmentation of the heart chambers that regularizes the shapes and improves edge fidelity, especially in the presence of distinct gaps in the edge map, as is common in ultrasound imagery. A numerical scheme for solving the proposed PDE is borrowed from level set methods. Results on human in vivo acquired 2D, 2D+time, 3D, and 3D+time echocardiographic images are shown.
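A minimal sketch of the geometric idea behind such schemes is one explicit time step of mean-curvature motion on a 2-D array; the discretization below (central differences via `np.gradient`) is an illustrative simplification, not the actual scheme of the paper:

```python
import numpy as np

def curvature_flow_step(phi, dt=0.05, eps=1e-8):
    """One explicit step of mean-curvature motion,
    phi_t = |grad phi| * div(grad phi / |grad phi|),
    which smooths level curves while preserving their overall geometry."""
    phi_y, phi_x = np.gradient(phi)                 # axis 0 = y, axis 1 = x
    norm = np.sqrt(phi_x**2 + phi_y**2) + eps       # regularized gradient norm
    nx, ny = phi_x / norm, phi_y / norm             # unit normal components
    div = np.gradient(nx, axis=1) + np.gradient(ny, axis=0)
    return phi + dt * norm * div
```

Repeated application shortens the level curves of `phi`, which is why a curvature term acts as an edge-preserving regularizer.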

  17. Multiregion level-set partitioning of synthetic aperture radar images.

    PubMed

    Ben Ayed, Ismail; Mitiche, Amar; Belhadj, Ziad

    2005-05-01

The purpose of this study is to investigate Synthetic Aperture Radar (SAR) image segmentation into a given but arbitrary number of gamma-homogeneous regions via active contours and level sets. The segmentation of SAR images is a difficult problem due to the presence of speckle, which can be modeled as strong, multiplicative noise. The proposed algorithm consists of evolving simple closed planar curves, with an explicit correspondence between the interiors of the curves and the regions of the segmentation, to minimize a criterion containing a term of conformity of the data to a speckle model of noise and a regularization term. Results are shown on both synthetic and real images.
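For a sense of what the data-conformity term looks like, here is a sketch assuming an L-look gamma speckle model with maximum-likelihood region means; the function name, the fixed number of looks `L`, and the omission of constants and of the regularization term are all simplifying assumptions of this illustration:

```python
import numpy as np

def gamma_region_energy(I, labels, L=4):
    """Multiregion gamma (speckle) data term:
    sum over regions R of  L * sum_{p in R} (log mu_R + I_p / mu_R),
    where mu_R is the region mean (its ML estimate) and constants are dropped.
    Lower energy means the partition conforms better to the gamma model."""
    E = 0.0
    for r in np.unique(labels):
        vals = I[labels == r]
        mu = vals.mean()
        E += L * np.sum(np.log(mu) + vals / mu)
    return E
```

A partition aligned with the true homogeneous regions yields a lower energy than a misaligned one, which is what drives the curve evolution.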

  18. A risk-informed decision framework for setting environmental windows for dredging projects.

    PubMed

    Suedel, Burton C; Kim, Jongbum; Clarke, Douglas G; Linkov, Igor

    2008-09-15

Sediment dredging is necessary to sustain navigation infrastructure in ports and harbor areas. In the United States alone, between 250 and 300 million cubic yards of sediment are dredged annually. Dredging activities may stress aquatic biota by locally increasing turbidity and suspended sediment concentrations, physically disturbing habitat through elevated sedimentation rates, interfering with migratory behaviors, and hydraulically entraining bottom-dwelling organisms. Environmental windows are a management practice used to alleviate such stresses on resident and transient biota by placing temporal restrictions on the conduct of dredging operations. Adherence to environmental windows can significantly inflate costs for project sponsors and local stakeholders. Since their inception following the passage of NEPA in 1969, the process for setting environmental windows has not followed structured procedures and exemplifies the difficulty inherent in balancing biological resource protection against cost-effective construction and maintenance of navigation infrastructure. Recent developments in the field of risk assessment for non-chemical stressors, as well as experience in implementing structured risk-informed decision-making tools for sediment and natural resource management, are summarized in this paper in relation to setting environmental windows. Combining risk assessment and multi-criteria decision analysis allows development of a framework for an objective process, consistent with recommendations by the National Academy of Sciences, for setting environmental windows. A hypothetical application of the framework for protection of Pacific herring (Clupea pallasii) in San Francisco Bay is discussed.

  19. Space Object Detection and Tracking Within a Finite Set Statistics Framework

    DTIC Science & Technology

    2017-04-13

AFRL-AFOSR-CL-TR-2017-0005, "Space Object Detection & Tracking Within a Finite Set Statistics Framework", Martin Adams, Department of Electrical… Final report dated 21-04-2017, covering 01 Feb 2015 to 31 Jan 2017. Grant No. FA9550-15-1-0069, devoted to the investigation and improvement of the detection and tracking methods of inactive Resident Space Objects (RSOs…

  20. Variational level set segmentation for forest based on MCMC sampling

    NASA Astrophysics Data System (ADS)

    Yang, Tie-Jun; Huang, Lin; Jiang, Chuan-xian; Nong, Jian

    2014-11-01

Environmental protection is one of the themes of today's world. Forests are recyclers of carbon dioxide and a natural oxygen bar. Protecting forests and monitoring forest growth are long-term tasks of environmental protection. Automatically estimating forest coverage from optical remote sensing images by computer is very important: it lets us understand the status of the forest in an area in a timely manner and frees us from tedious manual statistics. To address the computational complexity of global optimization via convexification, this paper proposes a level set segmentation method based on Markov chain Monte Carlo (MCMC) sampling and applies it to forest segmentation in remote sensing images. The presented method does not require any convexity transformation of the target energy functional; instead, it uses an MCMC sampling method with global optimization capability. The local minima that can occur with gradient descent methods are also avoided. There are three major contributions in the paper. Firstly, by using MCMC sampling, convexity of the energy functional is no longer necessary and global optimization can still be achieved. Secondly, taking advantage of the data (texture) and knowledge (a priori color) to guide the construction of the Markov chain, the convergence rate of the Markov chain is improved significantly. Finally, a level set segmentation method integrating a priori color and texture for forest is proposed. The experiments show that our method can efficiently and accurately segment forest in remote sensing images.
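The core idea, sampling a level set function with Metropolis moves instead of descending a gradient, can be sketched as follows. The paper guides its chain with texture and a priori color; the sketch below uses plain single-site Gaussian proposals on a two-phase piecewise-constant (Chan-Vese-style) energy, which is an illustrative simplification:

```python
import numpy as np

def mcmc_level_set(I, n_steps=30000, T=0.05, seed=0):
    """Metropolis sampling of a discretized level set function phi.
    The segmentation is {phi > 0}; the energy is a two-phase
    piecewise-constant data term. No gradient is used, so no
    convexification of the energy is required."""
    rng = np.random.default_rng(seed)
    phi = rng.normal(size=I.shape)

    def energy(p):
        inside, outside = I[p > 0], I[p <= 0]
        e = 0.0
        if inside.size:
            e += ((inside - inside.mean()) ** 2).sum()
        if outside.size:
            e += ((outside - outside.mean()) ** 2).sum()
        return e

    E = energy(phi)
    for _ in range(n_steps):
        i, j = rng.integers(I.shape[0]), rng.integers(I.shape[1])
        prop = phi.copy()
        prop[i, j] += rng.normal(scale=0.5)        # local perturbation of phi
        E_new = energy(prop)
        # Metropolis rule: always accept downhill, sometimes accept uphill
        if E_new < E or rng.random() < np.exp((E - E_new) / T):
            phi, E = prop, E_new
    return phi > 0
```

Because uphill moves are occasionally accepted, the chain can escape the local minima that trap gradient descent.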

  1. Using the level set method to track ice sheet boundaries

    NASA Astrophysics Data System (ADS)

    Lindsey, D. S.; Dupont, T. K.

    2009-12-01

Simulating ice-sheet volume changes requires tracking the interface of ice and its surrounding media, e.g. water, air, and sediment or rock. This can be challenging when using a fixed, or Eulerian, grid and allowing the interface to move via kinematic boundary conditions. For example, the interface may fall between grid points at a given point in time, making the application of boundary conditions less than straightforward. The level set method of Osher and Sethian (1988) offers an alternative approach, wherein a continuous level set function evolves within the domain via the combined kinematics of ice and its encompassing materials. The method's true strength lies in tracking the interface of two materials through time. Pralong and Funk (2004) applied this method to the movement of a glacier's ice/air interface, offering a glimpse of the method's potential for glaciology. Here we perform a simple preliminary test of the method for a two-dimensional (flow-line) model of an ice shelf, comparing the results to analytic approximations of the movement of both the ice/air interface and the ice front. Future experiments will incorporate grounded ice and include basal and lateral-shear stresses. The ultimate goal of this work is to provide a practical approach for two- and three-dimensional ice-sheet models to naturally track their moving boundaries.
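The basic mechanism, transporting a level set function on a fixed grid and reading the interface off its zero crossing, can be sketched in one dimension. The first-order upwind scheme and the crude inflow boundary treatment below are illustrative choices, not the discretization of the cited work:

```python
import numpy as np

def advect_level_set(phi, u, dx, dt, n_steps):
    """First-order upwind transport of a 1-D level set function,
    phi_t + u * phi_x = 0. The zero crossing of phi is the tracked front,
    so the front moves at speed u without any explicit boundary bookkeeping."""
    for _ in range(n_steps):
        if u >= 0:
            dphi = (phi - np.roll(phi, 1)) / dx     # backward difference
            dphi[0] = dphi[1]                        # crude inflow boundary fix
        else:
            dphi = (np.roll(phi, -1) - phi) / dx     # forward difference
            dphi[-1] = dphi[-2]
        phi = phi - dt * u * dphi
    return phi

x = np.linspace(0.0, 1.0, 201)
dx = x[1] - x[0]
# signed-distance initial condition: the front (zero level) starts at x = 0.3
phi = advect_level_set(x - 0.3, u=1.0, dx=dx, dt=0.4 * dx, n_steps=100)
front = x[np.argmin(np.abs(phi))]   # recovered front position after t = 0.2
```

With speed u = 1 for a time t = 0.2, the recovered front sits at x ≈ 0.5, even though no grid point was ever moved.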

  2. Interface Surface Area Tracking for the Conservative Level Set Method

    NASA Astrophysics Data System (ADS)

    Firehammer, Stephanie; Desjardins, Olivier

    2015-11-01

One key question in liquid-gas flows is how to model the interface between phases in a way that conserves mass, momentum, and energy. The accurate conservative level set (ACLS) method of Desjardins et al. provides a tool for tracking a liquid-gas interface with minimal mass conservation issues; however, it does not explicitly compute the interface surface area, and thus nothing can be said a priori about the balance between kinetic energy and surface energy. This work examines an equation for the transport of interface surface area density, which can be written in terms of the gradient of the volume fraction. Furthermore, this presentation will outline a numerical method for jointly transporting a conservative level set and surface area density. Finally, we will explore opportunities for energy conservation via the accurate exchange of energy between the flow field and the interface through surface tension, with test cases showing the results of our extended ACLS method. Funding from the National Science Foundation is gratefully acknowledged.
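The link between surface area and the gradient of the volume fraction comes from the coarea formula: integrating |∇α| over the domain recovers the interface area. A minimal sketch, using a smoothed (tanh) volume-fraction profile of the kind used by conservative level set methods; the profile width `eps` and grid are illustrative:

```python
import numpy as np

def interface_area(alpha, dx):
    """Estimate interface length (2-D) or area (3-D) from a smoothed
    volume fraction via the coarea formula: area ~ sum |grad(alpha)| dx^d."""
    grads = np.gradient(alpha, dx)
    mag = np.sqrt(sum(g**2 for g in grads))
    return mag.sum() * dx**alpha.ndim

# Circle of radius R in the unit square, represented by a tanh profile
n, eps, R = 256, 0.01, 0.25
x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x)
r = np.sqrt((X - 0.5) ** 2 + (Y - 0.5) ** 2)
alpha = 0.5 * (1.0 - np.tanh((r - R) / (2.0 * eps)))
per = interface_area(alpha, x[1] - x[0])   # should approximate 2*pi*R
```

Because the profile integrates to one across the interface, the estimate converges to the true perimeter 2πR as the profile is resolved.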

  3. PET image reconstruction with anatomical edge guided level set prior

    NASA Astrophysics Data System (ADS)

    Cheng-Liao, Jinxiu; Qi, Jinyi

    2011-11-01

Acquiring both anatomical and functional images during one scan, PET/CT systems improve the ability to detect and localize abnormal uptakes. In addition, CT images provide anatomical boundary information that can be used to regularize positron emission tomography (PET) images. Here we propose a new approach to maximum a posteriori reconstruction of PET images with a level set prior guided by anatomical edges. The image prior models both the smoothness of PET images and the similarity between functional boundaries in PET and anatomical boundaries in CT. Level set functions (LSFs) are used to represent smooth and closed functional boundaries. The proposed method does not assume an exact match between PET and CT boundaries. Instead, it encourages similarity between the two boundaries, while allowing different region definitions in PET images to accommodate possible signal and position mismatch between functional and anatomical images. While the functional boundaries are guaranteed to be closed by the LSFs, the proposed method does not require closed anatomical boundaries and can utilize incomplete edges obtained from an automatic edge detection algorithm. We conducted computer simulations to evaluate the performance of the proposed method. Two digital phantoms were constructed based on the Digimouse data and a human CT image, respectively. Anatomical edges were extracted automatically from the CT images. Tumors were simulated in the PET phantoms with different mismatched anatomical boundaries. Compared with existing methods, the new method achieved better bias-variance performance. The proposed method was also applied to real mouse data and achieved higher contrast than other methods.

  4. Implementing accountability for reasonableness framework at district level in Tanzania: a realist evaluation.

    PubMed

    Maluka, Stephen; Kamuzora, Peter; Sansebastián, Miguel; Byskov, Jens; Ndawi, Benedict; Olsen, Øystein E; Hurtig, Anna-Karin

    2011-02-10

    Despite the growing importance of the Accountability for Reasonableness (A4R) framework in priority setting worldwide, there is still an inadequate understanding of the processes and mechanisms underlying its influence on legitimacy and fairness, as conceived and reflected in service management processes and outcomes. As a result, the ability to draw scientifically sound lessons for the application of the framework to services and interventions is limited. This paper evaluates the experiences of implementing the A4R approach in Mbarali District, Tanzania, in order to find out how the innovation was shaped, enabled, and constrained by the interaction between contexts, mechanisms and outcomes. This study draws on the principles of realist evaluation -- a largely qualitative approach, chiefly concerned with testing and refining programme theories by exploring the complex interactions of contexts, mechanisms, and outcomes. Mixed methods were used in data collection, including individual interviews, non-participant observation, and document reviews. A thematic framework approach was adopted for the data analysis. The study found that while the A4R approach to priority setting was helpful in strengthening transparency, accountability, stakeholder engagement, and fairness, the efforts at integrating it into the current district health system were challenging. Participatory structures under the decentralisation framework, central government's call for partnership in district-level planning and priority setting, perceived needs of stakeholders, as well as active engagement between researchers and decision makers all facilitated the adoption and implementation of the innovation. In contrast, however, limited local autonomy, low level of public awareness, unreliable and untimely funding, inadequate accountability mechanisms, and limited local resources were the major contextual factors that hampered the full implementation. 
This study documents an important first step in the

  6. Framework for State-Level Renewable Energy Market Potential Studies

    EPA Pesticide Factsheets

    This document provides a framework and next steps for state officials who require estimates of renewable energy market potential. The report gives insight into how to conduct a market potential study.

  7. Statistics of dark matter halos in the excursion set peak framework

    SciTech Connect

    Lapi, A.; Danese, L. E-mail: danese@sissa.it

    2014-07-01

We derive approximated, yet very accurate, analytical expressions for the abundance and clustering properties of dark matter halos in the excursion set peak framework; the latter relies on the standard excursion set approach, but also includes the effects of a realistic filtering of the density field, a mass-dependent threshold for collapse, and the prescription from peak theory that halos tend to form around density maxima. We find that our approximations work excellently for diverse power spectra, collapse thresholds and density filters. Moreover, when adopting a cold dark matter power spectrum, a top-hat filter and a mass-dependent collapse threshold (supplemented with conceivable scatter), our approximated halo mass function and halo bias represent very well the outcomes of cosmological N-body simulations.

  8. Minimum mutual information based level set clustering algorithm for fast MRI tissue segmentation.

    PubMed

    Dai, Shuanglu; Man, Hong; Zhan, Shu

    2015-01-01

Accurate and accelerated MRI tissue recognition is a crucial preprocessing step for real-time 3D tissue modeling and medical diagnosis. This paper proposes an information de-correlated clustering algorithm, implemented by a variational level set method, for fast tissue segmentation. The key idea is to introduce a local correlation term between the original image and its piecewise-constant approximation into the variational framework. Minimizing this correlation then leads to de-correlated piecewise regions. Firstly, by introducing a continuous bounded variational domain describing the image, a probabilistic image restoration model is assumed to correct the distortion. Secondly, regional mutual information is introduced to measure the correlation between piecewise regions and the original image. As a de-correlated description of the image, the piecewise constants are finally solved by numerical approximation and level set evolution. The converged piecewise constants automatically cluster the image domain into discriminative regions. The segmentation results show that our algorithm performs well in terms of computation time, accuracy, convergence and clustering capability.

  9. Bio-molecule Surfaces Construction via a Higher-Order Level-Set Method.

    PubMed

    Bajaj, Chandrajit L; Xu, Guo-Liang; Zhang, Qin

    2008-11-01

We present a general framework for a higher-order spline level-set (HLS) method and apply it to bio-molecule surface construction. Starting from a first-order energy functional, we obtain a general level set formulation of the geometric partial differential equation, and provide an efficient approach to solving this partial differential equation using a C(2) spline basis. We also present a fast cubic spline interpolation algorithm based on convolution and the Z-transform, which exploits the local relationship of interpolatory cubic spline coefficients with respect to the given function data values. One example of our HLS method is demonstrated: the construction of bio-molecule surfaces (an implicit solvation interface) from individual atomic coordinates and solvated radii.
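The Z-transform approach to interpolatory cubic splines has a classic 1-D form: the inverse of the (1, 4, 1)/6 B-spline filter factorizes into a causal and an anti-causal first-order recursive filter with pole z1 = -2 + √3. The sketch below uses a truncated-horizon, mirror-boundary initialization, which is one common choice rather than necessarily the one used in the paper:

```python
import numpy as np

def cubic_spline_coefficients(s, horizon=30):
    """Interpolatory cubic B-spline coefficients c solving
    (c[k-1] + 4*c[k] + c[k+1]) / 6 = s[k],
    via the causal/anti-causal recursive filter pair obtained from the
    Z-transform factorization of the inverse B-spline filter."""
    z1 = -2.0 + np.sqrt(3.0)                 # pole of the inverse filter
    n = len(s)
    cp = np.empty(n)
    # causal init: truncated sum assuming a mirrored signal
    cp[0] = sum(s[k] * z1**k for k in range(min(n, horizon)))
    for k in range(1, n):                    # causal (left-to-right) pass
        cp[k] = s[k] + z1 * cp[k - 1]
    cm = np.empty(n)
    # anti-causal init for a mirror boundary
    cm[-1] = (z1 / (z1 * z1 - 1.0)) * (cp[-1] + z1 * cp[-2])
    for k in range(n - 2, -1, -1):           # anti-causal (right-to-left) pass
        cm[k] = z1 * (cm[k + 1] - cp[k])
    return 6.0 * cm
```

Applying the (1, 4, 1)/6 filter to the resulting coefficients reproduces the input samples, which is the defining property of interpolatory spline coefficients and the reason this O(n) recursion replaces a banded linear solve.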

  10. Comprehensive evaluation of long-term hydrological data sets: Constraints of the Budyko framework

    NASA Astrophysics Data System (ADS)

    Greve, Peter; Orlowsky, Boris; Seneviratne, Sonia I.

    2013-04-01

An accurate estimate of the climatological land water balance is essential for a wide range of socio-economic issues. Despite the simplicity of the underlying water balance equation, its individual variables are of complex nature. Global estimates, either derived from observations or from models, of precipitation (P) and especially evapotranspiration (ET) are characterized by high uncertainties. This leads to inconsistent results in determining conditions related to the land water balance and its components. In this study, we consider the Budyko framework as a constraint to evaluate long-term hydrological data sets within the period from 1984 to 2005. The Budyko framework is a well-established, empirically based relationship between ET/P and Ep/P, with Ep being the potential evaporation. We use estimates of ET associated with the LandFlux-EVAL initiative (Mueller et al., 2012), either derived from observations, CMIP5 models or land-surface models (LSMs) driven with observation-based forcing or atmospheric reanalyses. Data sets of P comprise all commonly used global observation-based estimates. Ep is determined by methods of differing complexity with recent global temperature and radiation data sets. Based on this comprehensive synthesis of data sets and methods to determine Ep, more than 2000 possible combinations of ET/P in conjunction with Ep/P are created. All combinations are validated against the Budyko curve and against physical limits within the Budyko phase space. For this purpose we develop an error measure, based on the root mean square error, that combines both constraints. We find that uncertainties are mainly induced by the ET data sets. In particular, reanalysis and CMIP5 data sets are characterized by low realism. The realism of LSMs is, moreover, not primarily controlled by the forcing, as different LSMs driven with the same forcing show significantly different error measures. Our comprehensive approach is thus suitable to detect uncertainties
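The two constraints can be combined into a single RMSE-style misfit: distance of (Ep/P, ET/P) points from the Budyko curve, plus a penalty for violating the physical limits ET/P ≤ 1 (water limit) and ET/P ≤ Ep/P (energy limit). The specific combination below is an illustrative stand-in for the error measure developed in the paper:

```python
import numpy as np

def budyko_et_ratio(aridity):
    """Budyko's curve: ET/P as a function of the aridity index Ep/P."""
    phi = np.asarray(aridity, dtype=float)
    return np.sqrt(phi * np.tanh(1.0 / phi) * (1.0 - np.exp(-phi)))

def budyko_error(et_p, ep_p):
    """RMSE misfit against the Budyko curve plus an RMSE penalty for
    exceeding the physical limits ET/P <= min(1, Ep/P)."""
    et_p, ep_p = np.asarray(et_p, float), np.asarray(ep_p, float)
    curve_rmse = np.sqrt(np.mean((et_p - budyko_et_ratio(ep_p)) ** 2))
    violation = np.maximum(et_p - np.minimum(1.0, ep_p), 0.0)
    return curve_rmse + np.sqrt(np.mean(violation ** 2))
```

Data-set combinations falling on the curve and inside the physical limits score near zero, while combinations that drift off the curve or leave the physically admissible wedge are penalized.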

  11. A Level Set Method for vaporizing two-phase flows

    NASA Astrophysics Data System (ADS)

    Tanguy, Sébastien; Ménard, Thibaut; Berlemont, Alain

    2007-02-01

Development and applications of numerical methods devoted to reactive interface simulations are presented. Emphasis is put on vaporization, where numerical difficulties arise in imposing accurate jump conditions for heat and mass transfer. We use both the Level Set Method and the Ghost Fluid Method to capture the interface motion accurately and to handle suitable jump conditions. A local vaporization mass flow rate per unit of surface area is defined, and Stefan flow is involved in the process. Specific care has been devoted to the extension of discontinuous variables across the interface to populate ghost cells, in order to avoid parasitic currents and numerical diffusion across the interface. A projection method is set up to impose both continuity of the velocity field and a divergence-free condition for the extended velocity field across the interface. The d² law is verified in the numerical simulations of the vaporization of an isolated static drop. Results are then presented for a water droplet moving in air. Vapor mass fraction and temperature fields inside and outside the droplet are presented.
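The d² law used for verification states that the squared diameter of a vaporizing droplet decreases linearly in time, d(t)² = d0² − K·t, where K is the evaporation constant. A minimal worked form (the numerical values in the test are illustrative, not from the paper):

```python
def d2_law_diameter(d0, K, t):
    """d^2 law of droplet vaporization: d(t)^2 = d0^2 - K*t.
    Returns the droplet diameter at time t (zero once fully vaporized)."""
    d2 = d0**2 - K * t
    return max(d2, 0.0) ** 0.5

def droplet_lifetime(d0, K):
    """Vaporization time: the instant at which d^2 reaches zero."""
    return d0**2 / K
```

Plotting d² versus t from a simulation and checking it against this straight line (slope −K) is the standard verification the abstract refers to.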

  12. Modeling cellular deformations using the level set formalism

    PubMed Central

    Yang, Liu; Effler, Janet C; Kutscher, Brett L; Sullivan, Sarah E; Robinson, Douglas N; Iglesias, Pablo A

    2008-01-01

    Background Many cellular processes involve substantial shape changes. Traditional simulations of these cell shape changes require that grids and boundaries be moved as the cell's shape evolves. Here we demonstrate that accurate cell shape changes can be recreated using level set methods (LSM), in which the cellular shape is defined implicitly, thereby eschewing the need for updating boundaries. Results We obtain a viscoelastic model of Dictyostelium cells using micropipette aspiration and show how this viscoelastic model can be incorporated into LSM simulations to recreate the observed protrusion of cells into the micropipette faithfully. We also demonstrate the use of our techniques by simulating the cell shape changes elicited by the chemotactic response to an external chemoattractant gradient. Conclusion Our results provide a simple but effective means of incorporating cellular deformations into mathematical simulations of cell signaling. Such methods will be useful for simulating important cellular events such as chemotaxis and cytokinesis. PMID:18652669

  13. Fast parallel algorithms: from images to level sets and labels

    NASA Astrophysics Data System (ADS)

    Nguyen, H. T.; Jung, Ken K.; Raghavan, Raghu

    1990-07-01

Decomposition into level sets refers to assigning a code with respect to intensity or elevation, while labeling refers to assigning a code with respect to disconnected regions. We present a sequence of parallel algorithms for these two processes. The labeling process includes re-assigning labels into a natural sequence and comparing different labeling algorithms. We discuss the difference between edge-based and region-based labeling. The speed improvements in this labeling scheme come from the collective efficiency of several different techniques. We have implemented these algorithms on an in-house built Geometric Single Instruction Multiple Data (GSIMD) parallel machine with global buses and a Multiple Instruction Multiple Data (MIMD) controller. This allows real-time image interpretation on live data at a rate much higher than video rate. Performance figures will be shown.
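The two processes can be illustrated serially: quantize intensities into level-set codes, then label 4-connected components of each level set with a BFS flood fill, renumbering labels into a natural sequence in scan order. This is a plain sequential sketch of the idea, not the paper's parallel algorithms:

```python
import numpy as np
from collections import deque

def decompose_and_label(img, n_levels=4):
    """Decompose an image into intensity level sets, then label the
    4-connected components of each level set. Labels come out renumbered
    in a natural sequence 1, 2, 3, ... in raster-scan order."""
    lo, hi = img.min(), img.max()
    levels = np.minimum(((img - lo) / (hi - lo + 1e-12) * n_levels).astype(int),
                        n_levels - 1)
    labels = np.zeros(img.shape, dtype=int)
    next_label = 1
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            if labels[i, j] == 0:            # unvisited pixel starts a region
                q = deque([(i, j)])
                labels[i, j] = next_label
                while q:                      # BFS flood fill within one level
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < H and 0 <= nx < W
                                and labels[ny, nx] == 0
                                and levels[ny, nx] == levels[y, x]):
                            labels[ny, nx] = next_label
                            q.append((ny, nx))
                next_label += 1
    return levels, labels
```

This is region-based labeling: membership is decided by equal level codes, not by detected edges, which is the distinction the abstract draws.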

  14. Identifying Aquifer Heterogeneities using the Level Set Method

    NASA Astrophysics Data System (ADS)

    Lu, Z.; Vesselinov, V. V.; Lei, H.

    2016-12-01

Material interfaces between hydrostratigraphic units (HSUs) with contrasting aquifer parameters (e.g., strata and facies with different hydraulic conductivity) have a great impact on flow and contaminant transport in the subsurface. However, identifying HSU shapes in the subsurface is challenging and typically relies on tomographic approaches in which a series of steady-state/transient head measurements at spatially distributed observation locations are analyzed using inverse models. In this study, we developed a mathematically rigorous approach for identifying material interfaces among any arbitrary number of HSUs using the level set method. The approach was tested first on several synthetic cases, where the true spatial distribution of HSUs was assumed to be known and the head measurements were taken from a flow simulation with the true parameter fields. These synthetic inversion examples demonstrate that the level set method is capable of characterizing the spatial distribution of the heterogeneity. We then applied the methodology to a large-scale problem in which the spatial distribution of pumping wells and observation well screens is consistent with the actual aquifer contamination (chromium) site at the Los Alamos National Laboratory (LANL), thereby testing the applicability of the methodology at an actual site, and we present preliminary results using actual LANL site data. We also investigated the impact of the number of pumping/observation wells and of the drawdown observation frequencies/intervals on the quality of the inversion results, and examined the uncertainties associated with the estimated HSU shapes and the accuracy of the results under different hydraulic-conductivity contrasts between the HSUs.
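One common way to represent an arbitrary number of units with multiple level set functions is a sign-pattern encoding: m functions partition the domain into up to 2^m facies according to the sign of each function at every point. Whether this study uses exactly this encoding is not stated, so the sketch below is an illustrative convention:

```python
import numpy as np

def facies_from_level_sets(lsfs):
    """Map m level set functions to up to 2**m facies via the sign pattern
    of each function at every grid point. lsfs has shape (m, ...grid)."""
    lsfs = np.asarray(lsfs)
    signs = (lsfs > 0).astype(int)                      # 0/1 sign indicators
    # binary weights 1, 2, 4, ... broadcast over the grid dimensions
    weights = 2 ** np.arange(lsfs.shape[0]).reshape(
        (-1,) + (1,) * (lsfs.ndim - 1))
    return (signs * weights).sum(axis=0)                # facies id 0..2**m-1
```

Moving an interface then amounts to perturbing one level set function; the facies map updates automatically wherever a sign changes, which is what makes the representation convenient for gradient-based boundary deformation.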

  15. Powerful Set-Based Gene-Environment Interaction Testing Framework for Complex Diseases.

    PubMed

    Jiao, Shuo; Peters, Ulrike; Berndt, Sonja; Bézieau, Stéphane; Brenner, Hermann; Campbell, Peter T; Chan, Andrew T; Chang-Claude, Jenny; Lemire, Mathieu; Newcomb, Polly A; Potter, John D; Slattery, Martha L; Woods, Michael O; Hsu, Li

    2015-12-01

Identification of gene-environment interaction (G × E) is important in understanding the etiology of complex diseases. Building on our previously developed Set Based gene EnviRonment InterAction test (SBERIA), in this paper we propose a powerful framework for enhanced set-based G × E testing (eSBERIA). The major challenge of signal aggregation within a set is how to tell signal from noise. eSBERIA tackles this challenge by adaptively aggregating the interaction signals within a set, weighted by the strength of the marginal and correlation screening signals. eSBERIA then combines the screening-informed aggregate test with a variance component test to account for the residual signals. Additionally, we develop a case-only extension of eSBERIA (coSBERIA) and of an existing set-based method, which boosts power not only by exploiting the G-E independence assumption but also by avoiding the need to specify main effects for a large number of variants in the set. Through extensive simulation, we show that coSBERIA and eSBERIA are considerably more powerful than existing methods within the case-only and case-control method categories across a wide range of scenarios. We conduct a genome-wide G × E search by applying our methods to Illumina HumanExome Beadchip data from 10,446 colorectal cancer cases and 10,191 controls and identify two novel interactions between nonsteroidal anti-inflammatory drugs (NSAIDs) and MINK1 and PTCHD3. © 2015 WILEY PERIODICALS, INC.

  16. A framework for transitioning patients from pediatric to adult health settings for patients with neurogenic bladder.

    PubMed

    Lewis, Jennifer; Frimberger, Dominic; Haddad, Emily; Slobodov, Gennady

    2017-04-01

Adolescents with neurogenic bladder are a vulnerable population that severely lacks consistent transitional care from pediatric to adult urology settings. Our practice determined that 100 patients with spina bifida and other neurogenic bladder conditions were not appropriately transferred to the adult setting upon reaching adulthood. We initiated a transitional program to establish a dedicated and formal process for adolescent patients to transition to adult urology. The REACH clinic implements a formalized staging framework to facilitate migration of adolescents and young adults to the adult health setting. A social worker was incorporated to act as a patient advocate, behavioral health consultant, and resource specialist. To date, 45 patients have been enrolled in the transition program; we have identified these patients and categorized them according to the appropriate stage. The REACH clinic has outlined the goals and mission of the program, and the resources utilized are financially practical and feasible, as the program is conducted as a monthly combined clinic. The program has been instrumental in improving tracking and monitoring of these patients through their transition period. Through the efforts of the pediatric and adult urology teams, the REACH program is a dedicated framework that provides structure for the transition of the adolescent patient. The addition of a social worker has enriched rapport and will likely result in improved compliance. This program allows for surveillance and evaluation of patient outcome indicators in this patient population. We believe that early introduction and frequent encounters with the adult urologic team are crucial to successful transitions. Neurourol. Urodynam. 36:973-978, 2017. © 2016 Wiley Periodicals, Inc.

  17. INSTITUTIONALIZING SAFEGUARDS-BY-DESIGN: HIGH-LEVEL FRAMEWORK

    SciTech Connect

    Trond Bjornard PhD; Joseph Alexander; Robert Bean; Brian Castle; Scott DeMuth, Ph.D.; Phillip Durst; Michael Ehinger; Prof. Michael Golay, Ph.D.; Kevin Hase, Ph.D.; David J. Hebditch, DPhil; John Hockert, Ph.D.; Bruce Meppen; James Morgan; Jerry Phillips, Ph.D., PE

    2009-02-01

    The application of a Safeguards-by-Design (SBD) process for new nuclear facilities can reduce proliferation risks. A multi-laboratory team was sponsored in Fiscal Year (FY) 2008 to define a SBD process and determine how it could be incorporated into existing facility design and construction processes. The possibility to significantly influence major design features, such as process selection and plant layout, largely ends with the conceptual design step. Therefore SBD’s principal focus must be on the early inclusion of safeguards requirements and the early identification of beneficial design features. The result could help form the basis for a new international norm for integrating safeguards into facility design. This is an interim report describing progress and project status as of the end of FY08. In this effort, SBD is defined as a structured approach to ensure the timely, efficient, and cost-effective integration of international and national safeguards, physical security, and other nonproliferation objectives into the overall design process for a nuclear facility. A key objective is to ensure that security and nonproliferation issues are considered when weighing facility design alternatives. Central to the work completed in FY08 was a study in which a SBD process was developed in the context of the current DOE facility acquisition process. The DOE study enabled the development of a “SBD design loop” that is suitable for use in any facility design process. It is a graded, iterative process that incorporates safeguards concerns throughout the conceptual, preliminary and final design processes. Additionally, a set of proposed design principles for SBD was developed. A “Generic SBD Process” was then developed. Key features of the process include the initiation of safeguards design activities in the pre-conceptual planning phase, early incorporation of safeguards requirements into the project requirements, early appointment of an SBD team, and

  18. An Analysis Framework Addressing the Scale and Legibility of Large Scientific Data Sets

    SciTech Connect

    Childs, Hank R.

    2006-01-01

    Much of the previous work in the large data visualization area has solely focused on handling the scale of the data. This task is clearly a great challenge and necessary, but it is not sufficient. Applying standard visualization techniques to large scale data sets often creates complicated pictures where meaningful trends are lost. A second challenge, then, is to also provide algorithms that simplify what an analyst must understand, using either visual or quantitative means. This challenge can be summarized as improving the legibility or reducing the complexity of massive data sets. Fully meeting both of these challenges is the work of many, many PhD dissertations. In this dissertation, we describe some new techniques to address both the scale and legibility challenges, in hope of contributing to the larger solution. In addition to our assumption of simultaneously addressing both scale and legibility, we add an additional requirement that the solutions considered fit well within an interoperable framework for diverse algorithms, because a large suite of algorithms is often necessary to fully understand complex data sets. For scale, we present a general architecture for handling large data, as well as details of a contract-based system for integrating advanced optimizations into a data flow network design. We also describe techniques for volume rendering and performing comparisons at the extreme scale. For legibility, we present several techniques. Most noteworthy are equivalence class functions, a technique to drive visualizations using statistical methods, and line-scan based techniques for characterizing shape.
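    The contract-based idea can be sketched in a few lines. This is a hypothetical miniature (the class and field names are invented for illustration, not the dissertation's API): in a pre-execution pass, each filter amends a shared contract describing the data it needs, so the upstream reader can apply optimizations such as ghost-zone generation once, up front.

```python
# Hypothetical sketch of a contract-based data-flow network: before
# execution, each filter declares its requirements on a shared contract,
# and the data source honors the merged contract when reading.
class Contract:
    def __init__(self):
        self.needs_ghost_zones = False
        self.fields = set()

class Filter:
    def modify_contract(self, contract):
        return contract                 # default: no extra requirements
    def execute(self, data):
        raise NotImplementedError

class GradientFilter(Filter):
    def modify_contract(self, contract):
        contract.needs_ghost_zones = True   # stencil ops need neighbor data
        contract.fields.add("pressure")
        return contract
    def execute(self, data):
        p = data["pressure"]
        return {"grad_pressure": [b - a for a, b in zip(p, p[1:])]}

class Pipeline:
    def __init__(self, filters):
        self.filters = filters
    def negotiate(self):
        contract = Contract()
        for f in reversed(self.filters):    # sinks state their needs first
            contract = f.modify_contract(contract)
        return contract
    def run(self, source):
        data = source(self.negotiate())     # reader honors merged contract
        for f in self.filters:
            data = f.execute(data)
        return data

def reader(contract):
    # A real reader would add ghost zones when contract.needs_ghost_zones.
    return {"pressure": [1.0, 2.0, 4.0, 7.0]}

result = Pipeline([GradientFilter()]).run(reader)
print(result["grad_pressure"])   # [1.0, 2.0, 3.0]
```

    The point of the pre-execution pass is that optimizations are negotiated once for the whole network rather than renegotiated per filter.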

  19. Protein complex-based analysis framework for high-throughput data sets.

    PubMed

    Vinayagam, Arunachalam; Hu, Yanhui; Kulkarni, Meghana; Roesel, Charles; Sopko, Richelle; Mohr, Stephanie E; Perrimon, Norbert

    2013-02-26

    Analysis of high-throughput data increasingly relies on pathway annotation and functional information derived from Gene Ontology. This approach has limitations, in particular for the analysis of network dynamics over time or under different experimental conditions, in which modules within a network rather than complete pathways might respond and change. We report an analysis framework based on protein complexes, which are at the core of network reorganization. We generated a protein complex resource for human, Drosophila, and yeast from the literature and databases of protein-protein interaction networks, with each species having thousands of complexes. We developed COMPLEAT (http://www.flyrnai.org/compleat), a tool for data mining and visualization for complex-based analysis of high-throughput data sets, as well as analysis and integration of heterogeneous proteomics and gene expression data sets. With COMPLEAT, we identified dynamically regulated protein complexes among genome-wide RNA interference data sets that used the abundance of phosphorylated extracellular signal-regulated kinase in cells stimulated with either insulin or epidermal growth factor as the output. The analysis predicted that the Brahma complex participated in the insulin response.
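    The complex-centric scoring can be illustrated with a toy example. The aggregation below (a plain mean of member gene scores) is a simplified stand-in for COMPLEAT's actual statistic, and "MediatorX" is an invented complex name:

```python
# Simplified sketch: aggregate per-gene screen scores into per-complex
# scores, so regulation is detected at the level of protein complexes
# rather than single genes. Scoring scheme here is illustrative only.
complexes = {
    "Brahma": ["brm", "mor", "osa"],      # Drosophila Brahma complex members
    "MediatorX": ["med1", "med2"],        # hypothetical complex
}
gene_scores = {"brm": 2.1, "mor": 1.8, "osa": 2.4, "med1": 0.1, "med2": -0.2}

def complex_scores(complexes, gene_scores):
    return {name: sum(gene_scores[g] for g in members) / len(members)
            for name, members in complexes.items()}

scores = complex_scores(complexes, gene_scores)
print(sorted(scores, key=scores.get, reverse=True)[0])   # Brahma
```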

  20. Parallel level-set methods on adaptive tree-based grids

    NASA Astrophysics Data System (ADS)

    Mirzadeh, Mohammad; Guittet, Arthur; Burstedde, Carsten; Gibou, Frederic

    2016-10-01

    We present scalable algorithms for the level-set method on dynamic, adaptive Quadtree and Octree Cartesian grids. The algorithms are fully parallelized and implemented using the MPI standard and the open-source p4est library. We solve the level set equation with a semi-Lagrangian method which, similar to its serial implementation, is free of any time-step restrictions. This is achieved by introducing a scalable global interpolation scheme on adaptive tree-based grids. Moreover, we present a simple parallel reinitialization scheme using the pseudo-time transient formulation. Both parallel algorithms scale on the Stampede supercomputer, where we are currently using up to 4096 CPU cores, the limit of our current account. Finally, a relevant application of the algorithms is presented in modeling a crystallization phenomenon by solving a Stefan problem, illustrating a level of detail that would be impossible to achieve without a parallel adaptive strategy. We believe that the algorithms presented in this article will be of interest and useful to researchers working with the level-set framework and modeling multi-scale physics in general.
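    The key property of the semi-Lagrangian update, freedom from a CFL time-step restriction, can be shown in a serial 1-D sketch (the paper's contribution is the parallel adaptive-grid version; this shows only the core departure-point idea):

```python
import numpy as np

# Semi-Lagrangian level-set update in 1-D:
# phi^{n+1}(x) = phi^n(x - v*dt), evaluated by interpolation at the
# departure points, which is stable for any dt.
def semi_lagrangian_step(phi, x, v, dt):
    departure = x - v * dt                  # trace characteristics backward
    return np.interp(departure, x, phi)     # interpolate phi there

x = np.linspace(0.0, 1.0, 101)
phi = x - 0.25                  # signed distance, interface at x = 0.25
v = 0.5                         # uniform advection speed
dt = 0.1                        # far larger than the explicit CFL limit

phi = semi_lagrangian_step(phi, x, v, dt)

# The zero level set has moved from 0.25 to 0.25 + v*dt = 0.30.
interface = x[np.argmin(np.abs(phi))]
print(round(interface, 2))      # 0.3
```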

  1. Towards a Dynamic Conceptual Framework for English-Medium Education in Multilingual University Settings

    ERIC Educational Resources Information Center

    Dafouz, Emma; Smit, Ute

    2016-01-01

    At a time of increasing internationalization in tertiary education, English-Medium Education in Multilingual University Settings (EMEMUS) has become a common practice. While there is already ample research describing this phenomenon at a local level (Smit and Dafouz 2012a), the theoretical side needs to be elaborated. This article thus aims to…

  3. Device for timing and power level setting for microwave applications

    NASA Astrophysics Data System (ADS)

    Ursu, M.-P.; Buidoş, T.

    2016-08-01

    Nowadays, microwaves are widely used for various technological processes. The microwaves are emitted by magnetrons, which have strict requirements concerning power supplies for the anode and filament cathodes, intensity of the magnetic field, cooling and electromagnetic shielding. The magnetrons do not tolerate any alteration of their required voltages, currents and magnetic fields, which means that their output microwave power is fixed, so the only way to alter the power level is to use time division, turning the magnetron on and off in repetitive time patterns. In order to attain accurate and reproducible results, as well as correct and safe operation of the microwave device, all these requirements must be fulfilled. Safe, correct and reproducible operation of the microwave appliance can be achieved by means of a specially built electronic device, which ensures accurate and reproducible exposure times, interlocking of the commands and automatic switch-off when abnormal operating conditions occur. This driving device, designed and realized during the completion of Mr. Ursu's doctoral thesis, consists of a quartz time base, several programmable frequency and duration dividers, LED displays, sensors and interlocking gates. The active and passive electronic components are placed on custom-made PCBs, designed and made by means of computer-aided applications and machines. The driving commands of the electronic device are delivered to the magnetron power supplies by means of optical zero-crossing relays. The inputs of the electronic driving device can sense the status of the microwave appliance. The user is able to enter the total exposure time, the division factor that sets the output power level and, as a novelty, the clock frequency of the time divider.
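    The time-division principle described above is easy to state numerically: average power equals rated power times the on-fraction of the divider period. A hedged sketch, with hypothetical wattage and slot counts (the abstract does not publish these figures):

```python
# Time-division power setting for a fixed-output magnetron (illustrative
# numbers): with k of n divider slots "on", P_avg = P_rated * k / n.
def average_power(p_rated_w, slots_on, slots_total):
    if not 0 <= slots_on <= slots_total:
        raise ValueError("slots_on must lie in [0, slots_total]")
    return p_rated_w * slots_on / slots_total

def exposure_pattern(slots_on, slots_total):
    # One divider period as an on/off schedule (True = magnetron on).
    return [i < slots_on for i in range(slots_total)]

p_avg = average_power(800.0, 3, 8)
print(p_avg)                     # 300.0 (hypothetical 800 W tube at 3/8 duty)
print(exposure_pattern(3, 8))
```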

  4. A consensus-based educational framework and competency set for the discipline of disaster medicine and public health preparedness.

    PubMed

    Subbarao, Italo; Lyznicki, James M; Hsu, Edbert B; Gebbie, Kristine M; Markenson, David; Barzansky, Barbara; Armstrong, John H; Cassimatis, Emmanuel G; Coule, Philip L; Dallas, Cham E; King, Richard V; Rubinson, Lewis; Sattin, Richard; Swienton, Raymond E; Lillibridge, Scott; Burkle, Frederick M; Schwartz, Richard B; James, James J

    2008-03-01

    Various organizations and universities have developed competencies for health professionals and other emergency responders. Little effort has been devoted to the integration of these competencies across health specialties and professions. The American Medical Association Center for Public Health Preparedness and Disaster Response convened an expert working group (EWG) to review extant competencies and achieve consensus on an educational framework and competency set from which educators could devise learning objectives and curricula tailored to fit the needs of all health professionals in a disaster. The EWG conducted a systematic review of peer-reviewed and non-peer-reviewed published literature. In addition, after-action reports from Hurricane Katrina and relevant publications recommended by EWG members and other subject matter experts were reviewed for congruencies and gaps. Consensus was ensured through a 3-stage Delphi process. The EWG process developed a new educational framework for disaster medicine and public health preparedness based on consensus identification of 7 core learning domains, 19 core competencies, and 73 specific competencies targeted at 3 broad health personnel categories. The competencies can be applied to a wide range of health professionals who are expected to perform at different levels (informed worker/student, practitioner, leader) according to experience, professional role, level of education, or job function. Although these competencies strongly reflect lessons learned following the health system response to Hurricane Katrina, it must be understood that preparedness is a process, and that these competencies must be reviewed continually and refined over time.

  5. GeneSetDB: A comprehensive meta-database, statistical and visualisation framework for gene set analysis

    PubMed Central

    Araki, Hiromitsu; Knapp, Christoph; Tsai, Peter; Print, Cristin

    2012-01-01

    Most “omics” experiments require comprehensive interpretation of the biological meaning of gene lists. To address this requirement, a number of gene set analysis (GSA) tools have been developed. Although the biological value of GSA is strictly limited by the breadth of the gene sets used, very few methods exist for simultaneously analysing multiple publicly available gene set databases. Therefore, we constructed GeneSetDB (http://genesetdb.auckland.ac.nz/haeremai.html), a comprehensive meta-database, which integrates 26 public databases containing diverse biological information with a particular focus on human disease and pharmacology. GeneSetDB enables users to search for gene sets containing a gene identifier or keyword, generate their own gene sets, or statistically test for enrichment of an uploaded gene list across all gene sets, and visualise gene set enrichment and overlap using a clustered heat map. PMID:23650583
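    The kind of over-representation test such tools run can be sketched as a hypergeometric upper-tail test; whether GeneSetDB uses exactly this variant is an assumption:

```python
from math import comb

# Hypergeometric enrichment test: given N genes in the universe, K of them
# in a gene set, and an uploaded list of n genes with k hits in the set,
# the p-value is P(X >= k) for X ~ Hypergeometric(N, K, n).
def enrichment_pvalue(N, K, n, k):
    denom = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / denom

# Toy example: universe of 20 genes, set of 5, list of 5, overlap of 3.
p = enrichment_pvalue(20, 5, 5, 3)
print(round(p, 4))
```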

  7. Medical image segmentation using level set and watershed transform

    NASA Astrophysics Data System (ADS)

    Zhu, Fuping; Tian, Jie

    2003-07-01

    One of the most popular level set algorithms is the so-called fast marching method. In this paper, a medical image segmentation algorithm is proposed based on the combination of the fast marching method and the watershed transformation. First, the original image is smoothed using a nonlinear diffusion filter; then the smoothed image is over-segmented by the watershed algorithm. Last, the image is segmented automatically using the modified fast marching method. Because over-segmentation is introduced, only the arrival time from the seeded point to the boundary of its region needs to be calculated. For other pixels inside the region of the seeded point, the arrival time is not calculated because of the region homogeneity, so the algorithm's speed improves greatly. Moreover, the speed function is redefined based on the statistical similarity degree of the nearby regions. We also extend our algorithm to the 3D case and segment medical image series. Experiments show that the algorithm can quickly and accurately obtain segmentation results for medical images.
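    The arrival-time computation at the heart of fast marching can be sketched as follows. This is a generic first-order fast marching solver for |∇T| = 1/F on a uniform grid, not the paper's modified, region-restricted version:

```python
import heapq

# First-order fast marching: propagate arrival times T outward from seed
# points, freezing the smallest tentative time at each step (Dijkstra-like
# ordering with an Eikonal update instead of graph edge weights).
def fast_marching(speed, seeds, h=1.0):
    rows, cols = len(speed), len(speed[0])
    INF = float("inf")
    T = [[INF] * cols for _ in range(rows)]
    heap = []
    for r, c in seeds:
        T[r][c] = 0.0
        heapq.heappush(heap, (0.0, r, c))
    frozen = set()
    while heap:
        t, r, c = heapq.heappop(heap)
        if (r, c) in frozen:
            continue
        frozen.add((r, c))
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols and (rr, cc) not in frozen:
                tx = min(T[rr][cc + 1] if cc + 1 < cols else INF,
                         T[rr][cc - 1] if cc - 1 >= 0 else INF)
                ty = min(T[rr + 1][cc] if rr + 1 < rows else INF,
                         T[rr - 1][cc] if rr - 1 >= 0 else INF)
                a, b = sorted((tx, ty))
                f = h / speed[rr][cc]
                # Solve (t-a)^2 + (t-b)^2 = f^2; fall back to 1-D update.
                if b - a >= f:
                    t_new = a + f
                else:
                    t_new = (a + b + (2 * f * f - (b - a) ** 2) ** 0.5) / 2
                if t_new < T[rr][cc]:
                    T[rr][cc] = t_new
                    heapq.heappush(heap, (t_new, rr, cc))
    return T

speed = [[1.0] * 5 for _ in range(5)]         # uniform unit speed
T = fast_marching(speed, seeds=[(0, 0)])
print(round(T[0][4], 3))                      # 4.0 along the axis
```

    The paper's modification would compute T only up to the watershed-region boundary of the seeded point and skip interior pixels of homogeneous regions.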

  8. Haustral fold segmentation with curvature-guided level set evolution.

    PubMed

    Zhu, Hongbin; Barish, Matthew; Pickhardt, Perry; Liang, Zhengrong

    2013-02-01

    Human colon has complex structures mostly because of the haustral folds. The folds are thin flat protrusions on the colon wall, which complicate the shape analysis for computer-aided detection (CAD) of colonic polyps. Fold segmentation may help reduce the structural complexity, and the folds can serve as an anatomic reference for computed tomographic colonography (CTC). Therefore, in this study, based on a model of the haustral fold boundaries, we developed a level-set approach to automatically segment the fold surfaces. To evaluate the developed fold segmentation algorithm, we first established the ground truth of haustral fold boundaries by experts' drawing on 15 patient CTC datasets without severe under/over colon distention from two medical centers. The segmentation algorithm successfully detected 92.7% of the folds in the ground truth. In addition to the sensitivity measure, we further developed a merit of segmented-area ratio (SAR), i.e., the ratio between the area of the intersection and union of the expert-drawn folds and the area of the automatically segmented folds, to measure the segmentation accuracy. The segmentation algorithm reached an average value of SAR = 86.2%, showing a good match with the ground truth on the fold surfaces. We believe the automatically segmented fold surfaces have the potential to benefit many postprocedures in CTC, such as CAD, taenia coli extraction, supine-prone registration, etc.
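    As defined, SAR is the Jaccard index of the expert-drawn and automatically segmented masks; a minimal sketch with synthetic masks:

```python
import numpy as np

# Segmented-area ratio (SAR) as described: area of the intersection of the
# expert and automatic fold masks divided by the area of their union.
def segmented_area_ratio(expert_mask, auto_mask):
    inter = np.logical_and(expert_mask, auto_mask).sum()
    union = np.logical_or(expert_mask, auto_mask).sum()
    return inter / union

expert = np.zeros((10, 10), dtype=bool); expert[2:8, 2:8] = True   # 36 px
auto   = np.zeros((10, 10), dtype=bool); auto[3:9, 3:9]   = True   # 36 px

sar = segmented_area_ratio(expert, auto)
print(round(sar, 3))   # 25/47, about 0.532
```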

  9. A National Level Engagement Strategy: A Framework for Action

    DTIC Science & Technology

    2012-05-15

    Figure 9 - Engagement Framework: … back to the desired strategic ends. The model makes the initial assumption that if the…

  10. NPN fuzzy sets and NPN qualitative algebra: a computational framework for bipolar cognitive modeling and multiagent decision analysis.

    PubMed

    Zhang, W R

    1996-01-01

    An NPN (Negative-Positive-Neutral) fuzzy set theory and an NPN qualitative algebra (Q-algebra) are proposed which form a computational framework for bipolar cognitive modeling and multiagent decision analysis. First a 6-valued NPN logic is introduced which extends the usual 4-valued Q-algebra (S, ∼, ⊕, ⊗) with S = {+, -, 0, ?} by adding one more level of specification; and then a real-valued NPN fuzzy logic is introduced which extends the 6-valued model to the real space {∀(x, y) | (x, y) ∈ [-1, 0] × [0, 1]} and adds infinite levels of specification. As a generalization, a fuzzy set theory is presented that allows β-level fuzzy-number-based NPN variables (x, y) to be substituted into (S, ∼, ⊕, ⊗), where ⊗ stands for any NPN T-norm, ⊕ stands for disjunction (∨) or union (∪), and β is the number of α-cuts.

  11. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background: Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective: The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. Methods: The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results: The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the

  12. A framework for different levels of integration of computational models into web-based virtual patients.

    PubMed

    Kononowicz, Andrzej A; Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-23

    Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients' interactivity by enriching them with computational models of physiological and pathological processes. The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the model. The third element is the

  13. Research on classified real-time flood forecasting framework based on K-means cluster and rough set.

    PubMed

    Xu, Wei; Peng, Yong

    2015-01-01

    This research presents a new classified real-time flood forecasting framework. In this framework, historical floods are classified by a K-means cluster according to the spatial and temporal distribution of precipitation, the time variance of precipitation intensity and other hydrological factors. Based on the classified results, a rough set is used to extract the identification rules for real-time flood forecasting. Then, the parameters of different categories within the conceptual hydrological model are calibrated using a genetic algorithm. In real-time forecasting, the corresponding category of parameters is selected for flood forecasting according to the obtained flood information. This research tests the new classified framework on Guanyinge Reservoir and compares the framework with the traditional flood forecasting method. It finds that the performance of the new classified framework is significantly better in terms of accuracy. Furthermore, the framework can be considered in a catchment with fewer historical floods.
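    The classification step can be sketched with a plain K-means implementation; the feature columns below are illustrative stand-ins, not the paper's exact hydrological factors:

```python
import numpy as np

# K-means classification of historical floods on normalized features; in
# the framework, each resulting category gets its own calibrated parameter
# set for real-time forecasting.
def kmeans(X, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each flood to its nearest center, then recompute centers
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# columns (illustrative): rainfall intensity, storm duration, antecedent wetness
floods = np.array([[0.90, 0.20, 0.80], [0.85, 0.25, 0.75],   # short intense storms
                   [0.20, 0.90, 0.40], [0.25, 0.85, 0.45]])  # long gentle storms

centers, labels = kmeans(floods, k=2)
print(labels[0] == labels[1], labels[2] == labels[3], labels[0] != labels[2])
# the two storm types fall into separate categories
```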

  14. Loudness discomfort level for speech: comparison of two instructional sets for saturation sound pressure level selection.

    PubMed

    Beattie, R C; Svihovec, D A; Carmen, R E; Kunkel, H A

    1980-01-01

    This study was undertaken to compare the speech loudness discomfort levels (LDL's) with two instructional sets which have been proposed for saturation sound pressure level selection of hearing aids. The phraseology recommended by McCandless and by Berger was presented to normal-hearing and hearing-impaired listeners. The normal-hearing subjects obtained mean LDL's of 94.6 and 111.9 dB SPL for these respective instructions, which was statistically significant. The hearing-impaired listeners also showed LDL's with Berger's instructions (114.7 dB SPL) to be significantly higher than with McCandless' instructional set (109.3 dB SPL). Consequently, this investigation suggests that these two instructional sets may lead to substantially different saturation sound pressure levels. Further studies are needed to determine the most appropriate phraseology for LDL measurement, including the assessment of speech intelligibility at various saturation sound pressure levels. Another instructional set was constructed which (1) includes an explanation to patients of the purpose and importance of the test, (2) requests listeners to indicate the upper level they are "willing" to listen as opposed to the level they are "able" to listen, (3) instructs patients to search thoroughly around their LDL before making a final judgment, and (4) contains a statement that the LDL should be made with the understanding that the speech could be listened to for a period of time. Whatever instructions are used, clinicians are advised to interpret their LDL's very cautiously until validational studies are available.

  15. Some free boundary problems in potential flow regime using a level set based method

    SciTech Connect

    Garzon, M.; Bobillo-Ares, N.; Sethian, J.A.

    2008-12-09

    Recent advances in the field of fluid mechanics with moving fronts are linked to the use of Level Set Methods, a versatile mathematical technique to follow free boundaries which undergo topological changes. A challenging class of problems in this context are those related to the solution of a partial differential equation posed on a moving domain, in which the boundary condition for the PDE solver has to be obtained from a partial differential equation defined on the front. This is the case for potential flow models with moving boundaries. Moreover, the fluid front will possibly be carrying some material substance which will diffuse on the front and be advected by the front velocity, as for example when surfactants are used to lower surface tension. We present a Level Set based methodology to embed these partial differential equations defined on the front in a complete Eulerian framework, fully avoiding the tracking of fluid particles and its known limitations. To show the advantages of this approach in the field of fluid mechanics, we present in this work one particular application: the numerical approximation of a potential flow model to simulate the evolution and breaking of a solitary wave propagating over a sloping bottom, and compare the level set based algorithm with previous front tracking models.

  16. Image registration via level-set motion: applications to atlas-based segmentation.

    PubMed

    Vemuri, B C; Ye, J; Chen, Y; Leonard, C M

    2003-03-01

    Image registration is an often encountered problem in various fields including medical imaging, computer vision and image processing. Numerous algorithms for registering image data have been reported in these areas. In this paper, we present a novel curve evolution approach expressed in a level-set framework to achieve image intensity morphing and a simple non-linear PDE for the corresponding coordinate registration. The key features of the intensity morphing model are that (a) it is very fast and (b) existence and uniqueness of the solution for the evolution model are established in a Sobolev space as opposed to using viscosity methods. The salient features of the coordinate registration model are its simplicity and computational efficiency. The intensity morph is easily achieved via evolving level-sets of one image into the level-sets of the other. To explicitly estimate the coordinate transformation between the images, we derive a non-linear PDE-based motion model which can be solved very efficiently. We demonstrate the performance of our algorithm on a variety of images including synthetic and real data. As an application of the PDE-based motion model, atlas based segmentation of hippocampal shape from several MR brain scans is depicted. In each of these experiments, automated hippocampal shape recovery results are validated via manual "expert" segmentations.
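    The intensity-morphing evolution can be illustrated in 1-D (the paper's model is 2-D/3-D and includes an explicit coordinate-registration PDE omitted here): level sets of the source image are pushed toward those of the target by dI/dt = (I_target - I)|∇I|.

```python
import numpy as np

# 1-D sketch of intensity morphing via level-set motion: where the images
# already agree the gradient term leaves I fixed; near disagreeing edges
# the source's level sets are driven toward the target's.
def morph_step(I, I_target, dt=0.2):
    grad = np.abs(np.gradient(I))        # |grad I| on a unit-spaced grid
    return I + dt * (I_target - I) * grad

x = np.linspace(0.0, 1.0, 200)
source = (x > 0.3).astype(float)   # step edge at x = 0.3
target = (x > 0.6).astype(float)   # step edge at x = 0.6

I = source.copy()
for _ in range(400):
    I = morph_step(I, target)

err0 = np.abs(source - target).mean()
err = np.abs(I - target).mean()
print(err < err0)   # the morph reduces the mismatch
```

    Because the factor (I_target - I) always points toward the target, the pointwise error is non-increasing, which mirrors the stability of the intensity-morphing model.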

  17. Profile Evolution Simulation in Etching Systems Using Level Set Methods

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Govindan, T. R.; Meyyappan, M.

    1998-01-01

    Semiconductor device profiles are determined by the characteristics of both etching and deposition processes. In particular, a highly anisotropic etch is required to achieve vertical sidewalls. However, etching is comprised of both anisotropic and isotropic components, due to ion and neutral fluxes, respectively. In Ar/Cl2 plasmas, for example, neutral chlorine reacts with the Si surfaces to form silicon chlorides. These compounds are then removed by the impinging ion fluxes. Hence the directionality of the ions (and thus the ion angular distribution function, or IAD), as well as the relative fluxes of neutrals and ions determines the amount of undercutting. One method of modeling device profile evolution is to simulate the moving solid-gas interface between the semiconductor and the plasma as a string of nodes. The velocity of each node is calculated and then the nodes are advanced accordingly. Although this technique appears to be relatively straightforward, extensive looping schemes are required at the profile corners. An alternate method is to use level set theory, which involves embedding the location of the interface in a field variable. The normal speed is calculated at each mesh point, and the field variable is updated. The profile corners are more accurately modeled as the need for looping algorithms is eliminated. The model we have developed is a 2-D Level Set Profile Evolution Simulation (LSPES). The LSPES calculates etch rates of a substrate in low pressure plasmas due to the incident ion and neutral fluxes. For a Si substrate in an Ar/Cl2 gas mixture, for example, the predictions of the LSPES are identical to those from a string evolution model for high neutral fluxes and two different ion angular distributions.(2) In the figure shown, the relative neutral to ion flux in the bulk plasma is 100 to 1. For a moderately isotropic ion angular distribution function as shown in the cases in the left hand column, both the LSPES (top row) and rude's string
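    The field-variable embedding that LSPES builds on can be shown in a 1-D sketch (illustrative only; the simulator itself is 2-D with flux-dependent etch speeds): the front is the zero level set of phi, advanced with phi_t + F|phi_x| = 0 via a first-order Godunov upwind scheme.

```python
import numpy as np

# Advance a front embedded as the zero level set of phi at normal speed F,
# using a first-order Godunov upwind gradient (valid for F > 0).
def advance_front(phi, F, dx, dt, steps):
    for _ in range(steps):
        dminus = (phi - np.roll(phi, 1)) / dx     # backward difference
        dplus = (np.roll(phi, -1) - phi) / dx     # forward difference
        grad = np.sqrt(np.maximum(np.maximum(dminus, 0.0) ** 2,
                                  np.minimum(dplus, 0.0) ** 2))
        phi = phi - dt * F * grad
        phi[0], phi[-1] = phi[1], phi[-2]         # crude boundary copy
    return phi

dx, F, dt = 0.01, 1.0, 0.005                      # CFL: F*dt/dx <= 1
x = np.arange(0.0, 1.0, dx)
phi = x - 0.2                                     # etch front at x = 0.2

phi = advance_front(phi, F, dx, dt, steps=40)     # etch for t = 0.2

front = x[np.argmin(np.abs(phi))]
print(round(front, 2))   # front has moved to about 0.4
```

    No looping schemes are needed at corners: the update is the same at every mesh point, which is the advantage the abstract describes over string models.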

  18. Profile Evolution Simulation in Etching Systems Using Level Set Methods

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Govindan, T. R.; Meyyappan, M.

    1998-01-01

    Semiconductor device profiles are determined by the characteristics of both etching and deposition processes. In particular, a highly anisotropic etch is required to achieve vertical sidewalls. However, etching comprises both anisotropic and isotropic components, due to ion and neutral fluxes, respectively. In Ar/Cl2 plasmas, for example, neutral chlorine reacts with the Si surfaces to form silicon chlorides. These compounds are then removed by the impinging ion fluxes. Hence the directionality of the ions (and thus the ion angular distribution function, or IAD), as well as the relative fluxes of neutrals and ions, determines the amount of undercutting. One method of modeling device profile evolution is to simulate the moving solid-gas interface between the semiconductor and the plasma as a string of nodes. The velocity of each node is calculated and then the nodes are advanced accordingly. Although this technique appears to be relatively straightforward, extensive looping schemes are required at the profile corners. An alternate method is to use level set theory, which involves embedding the location of the interface in a field variable. The normal speed is calculated at each mesh point, and the field variable is updated. The profile corners are more accurately modeled as the need for looping algorithms is eliminated. The model we have developed is a 2-D Level Set Profile Evolution Simulation (LSPES). The LSPES calculates etch rates of a substrate in low pressure plasmas due to the incident ion and neutral fluxes. For a Si substrate in an Ar/Cl2 gas mixture, for example, the predictions of the LSPES are identical to those from a string evolution model for high neutral fluxes and two different ion angular distributions.(2) In the figure shown, the relative neutral to ion flux in the bulk plasma is 100 to 1. For a moderately isotropic ion angular distribution function as shown in the cases in the left hand column, both the LSPES (top row) and rude's string
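
    The level set update the abstract describes (compute the normal speed at each mesh point, then advance the field variable) can be illustrated with a first-order upwind (Osher-Sethian) scheme for a uniform, positive normal speed. This is a minimal sketch, not the LSPES code; the grid size, speed, and time step are arbitrary illustrative choices:

```python
import numpy as np

def evolve_level_set(phi, speed, dx, dt, steps):
    """Advance phi_t + speed * |grad phi| = 0 with a first-order
    Godunov/upwind scheme (valid for speed > 0, outward motion)."""
    for _ in range(steps):
        # One-sided differences in each direction
        dxm = (phi - np.roll(phi, 1, axis=0)) / dx   # backward x
        dxp = (np.roll(phi, -1, axis=0) - phi) / dx  # forward x
        dym = (phi - np.roll(phi, 1, axis=1)) / dx   # backward y
        dyp = (np.roll(phi, -1, axis=1) - phi) / dx  # forward y
        # Upwind gradient magnitude for positive speed
        grad = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                       np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
        phi = phi - dt * speed * grad
    return phi

# Initial interface: circle of radius 0.3, phi = signed distance
n = 64
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x, indexing="ij")
phi0 = np.sqrt(X**2 + Y**2) - 0.3
phi = evolve_level_set(phi0, speed=1.0, dx=x[1] - x[0], dt=0.01, steps=20)
# The zero level set expands by roughly speed * dt * steps = 0.2
```

    The upwind differencing selects the one-sided derivatives consistent with the direction of motion, which is what lets corners form and disappear without the looping schemes that string-of-nodes models need.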

  19. Combating Terrorism: A Conceptual Framework for Targeting at the Operational Level

    DTIC Science & Technology

    2007-11-02

    COMBATING TERRORISM: A CONCEPTUAL FRAMEWORK FOR TARGETING AT THE OPERATIONAL LEVEL A thesis presented to the Faculty of the US Army Command and...ART AND SCIENCE THESIS APPROVAL PAGE Name of Candidate: Lt Col Angus S. J. Fay Thesis Title: Combating Terrorism: A Conceptual Framework for... conceptual framework for targeting terrorism at the operational level is worthy of investigation. Thesis Question Is there utility within the JIPB

  20. A universal surface complexation framework for modeling proton binding onto bacterial surfaces in geologic settings

    USGS Publications Warehouse

    Borrok, D.; Turner, B.F.; Fein, J.B.

    2005-01-01

    Adsorption onto bacterial cell walls can significantly affect the speciation and mobility of aqueous metal cations in many geologic settings. However, a unified thermodynamic framework for describing bacterial adsorption reactions does not exist. This problem originates from the numerous approaches that have been chosen for modeling bacterial surface protonation reactions. In this study, we compile all currently available potentiometric titration datasets for individual bacterial species, bacterial consortia, and bacterial cell wall components. Using a consistent, four discrete site, non-electrostatic surface complexation model, we determine total functional group site densities for all suitable datasets, and present an averaged set of 'universal' thermodynamic proton binding and site density parameters for modeling bacterial adsorption reactions in geologic systems. Modeling results demonstrate that the total concentrations of proton-active functional group sites for the 36 bacterial species and consortia tested are remarkably similar, averaging 3.2 ± 1.0 (1σ) × 10^-4 moles/wet gram. Examination of the uncertainties involved in the development of proton-binding modeling parameters suggests that ignoring factors such as bacterial species, ionic strength, temperature, and growth conditions introduces relatively small error compared to the unavoidable uncertainty associated with the determination of cell abundances in realistic geologic systems. Hence, we propose that reasonable estimates of the extent of bacterial cell wall deprotonation can be made using averaged thermodynamic modeling parameters from all of the experiments that are considered in this study, regardless of bacterial species used, ionic strength, temperature, or growth condition of the experiment. The average site densities for the four discrete sites are 1.1 ± 0.7 × 10^-4, 9.1 ± 3.8 × 10^-5, 5.3 ± 2.1 × 10^-5, and 6.6 ± 3.0 × 10^-5 moles/wet gram bacteria for the sites with pKa values of 3
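
    For a non-electrostatic model like the one above, the extent of deprotonation at a given pH follows directly from mass action at each discrete site. The sketch below uses the four averaged site densities quoted in the abstract; only the first pKa (around 3) survives the truncation, so the remaining pKa values are illustrative placeholders, not the paper's fitted values:

```python
# Averaged site densities from the abstract, in moles per wet gram bacteria.
site_densities = [1.1e-4, 9.1e-5, 5.3e-5, 6.6e-5]
# Only the first pKa (~3) appears in the truncated text; the rest are
# placeholder values for illustration.
pKas = [3.0, 4.7, 6.6, 9.0]

def deprotonated_sites(pH):
    """Total deprotonated site concentration (mol / wet gram) at a given pH.

    Each discrete site R-AH <=> R-A(-) + H(+) follows the non-electrostatic
    mass-action law: deprotonated fraction = 1 / (1 + 10**(pKa - pH))."""
    return sum(d / (1.0 + 10.0 ** (pKa - pH))
               for d, pKa in zip(site_densities, pKas))
```

    At a pH well above all four pKa values the total approaches the summed site density of 3.2 × 10^-4 moles/wet gram, the average reported in the abstract.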

  1. Shared Investment Projects and Forecasting Errors: Setting Framework Conditions for Coordination and Sequencing Data Quality Activities

    PubMed Central

    Leitner, Stephan; Brauneis, Alexander; Rausch, Alexandra

    2015-01-01

    In this paper, we investigate the impact of inaccurate forecasting on the coordination of distributed investment decisions. In particular, by setting up a computational multi-agent model of a stylized firm, we investigate the case of investment opportunities that are mutually carried out by organizational departments. The forecasts of concern pertain to the initial amount of money necessary to launch and operate an investment opportunity, to the expected intertemporal distribution of cash flows, and to the departments’ efficiency in operating the investment opportunity at hand. We propose a budget allocation mechanism for coordinating such distributed decisions. The paper provides guidance on how to set framework conditions, in terms of the number of investment opportunities considered in one round of funding and the number of departments operating one investment opportunity, so that the coordination mechanism is highly robust to forecasting errors. Furthermore, we show that—in some setups—a certain extent of misforecasting is desirable from the firm’s point of view, as it supports the achievement of the corporate objective of value maximization. We then address the question of how to improve forecasting quality, and provide policy advice on how to sequence activities for improving forecasting quality so that the robustness of the coordination mechanism to errors increases in the best possible way. At the same time, we show that wrong decisions regarding the sequencing can lead to a decrease in robustness. Finally, we conduct a comprehensive sensitivity analysis and prove that—in particular for relatively good forecasters—most of our results are robust to changes in setting the parameters of our multi-agent simulation model. PMID:25803736

  2. Shared investment projects and forecasting errors: setting framework conditions for coordination and sequencing data quality activities.

    PubMed

    Leitner, Stephan; Brauneis, Alexander; Rausch, Alexandra

    2015-01-01

    In this paper, we investigate the impact of inaccurate forecasting on the coordination of distributed investment decisions. In particular, by setting up a computational multi-agent model of a stylized firm, we investigate the case of investment opportunities that are mutually carried out by organizational departments. The forecasts of concern pertain to the initial amount of money necessary to launch and operate an investment opportunity, to the expected intertemporal distribution of cash flows, and to the departments' efficiency in operating the investment opportunity at hand. We propose a budget allocation mechanism for coordinating such distributed decisions. The paper provides guidance on how to set framework conditions, in terms of the number of investment opportunities considered in one round of funding and the number of departments operating one investment opportunity, so that the coordination mechanism is highly robust to forecasting errors. Furthermore, we show that, in some setups, a certain extent of misforecasting is desirable from the firm's point of view, as it supports the achievement of the corporate objective of value maximization. We then address the question of how to improve forecasting quality, and provide policy advice on how to sequence activities for improving forecasting quality so that the robustness of the coordination mechanism to errors increases in the best possible way. At the same time, we show that wrong decisions regarding the sequencing can lead to a decrease in robustness. Finally, we conduct a comprehensive sensitivity analysis and prove that, in particular for relatively good forecasters, most of our results are robust to changes in setting the parameters of our multi-agent simulation model.
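
    A toy version of this setting can illustrate how forecasting errors degrade coordination: departments report noisy forecasts of each investment opportunity's value, and a budget mechanism funds the top k forecasts per round. This is a hedged sketch, not the authors' multi-agent model; the function name, the NPV range, and the Gaussian noise model are all illustrative assumptions:

```python
import random

def funding_round(true_npvs, forecast_sd, k, rng):
    """Fund the k projects with the highest *forecast* NPV; return the
    realized total value and the ideal (perfect-forecast) total value."""
    forecasts = [v + rng.gauss(0, forecast_sd) for v in true_npvs]
    chosen = sorted(range(len(true_npvs)), key=lambda i: forecasts[i],
                    reverse=True)[:k]
    realized = sum(true_npvs[i] for i in chosen)
    ideal = sum(sorted(true_npvs, reverse=True)[:k])
    return realized, ideal

rng = random.Random(42)
true_npvs = [rng.uniform(0, 100) for _ in range(10)]
realized, ideal = funding_round(true_npvs, forecast_sd=5.0, k=3, rng=rng)
```

    Sweeping forecast_sd and k in a loop reproduces the kind of question the paper studies: how the number of opportunities per funding round shapes robustness to forecasting errors.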

  3. Simultaneous segmentation and reconstruction: A level set method approach for limited view computed tomography

    PubMed Central

    Yoon, Sungwon; Pineda, Angel R.; Fahrig, Rebecca

    2010-01-01

    Purpose: An iterative tomographic reconstruction algorithm that simultaneously segments and reconstructs the reconstruction domain is proposed and applied to tomographic reconstructions from a sparse number of projection images. Methods: The proposed algorithm uses a two-phase level set method segmentation in conjunction with an iterative tomographic reconstruction to achieve simultaneous segmentation and reconstruction. The simultaneous segmentation and reconstruction is achieved by alternating between level set function evolutions and per-region intensity value updates. To deal with the limited number of projections, a priori information about the reconstruction is enforced via a penalized likelihood function. Specifically, a smooth function within each region (a piecewise smooth function overall) and bounded intensity values for each region are assumed. Such a priori information is formulated into a quadratic objective function with linear bound constraints. The level set function evolutions are achieved by artificially time evolving the level set function in the negative gradient direction; the intensity value updates are achieved by using the gradient projection conjugate gradient algorithm. Results: The proposed simultaneous segmentation and reconstruction results were compared to “conventional” iterative reconstruction (with no segmentation), iterative reconstruction followed by segmentation, and filtered backprojection. Improvements of 6%–13% in the normalized root mean square error were observed when the proposed algorithm was applied to simulated projections of a numerical phantom and to real fan-beam projections of the Catphan phantom, both of which did not satisfy the a priori assumptions. Conclusions: The proposed simultaneous segmentation and reconstruction resulted in improved reconstruction image quality. The algorithm correctly segments the reconstruction space into regions, preserves sharp edges between different regions, and smoothes the noise

  4. Simultaneous segmentation and reconstruction: A level set method approach for limited view computed tomography

    SciTech Connect

    Yoon, Sungwon; Pineda, Angel R.; Fahrig, Rebecca

    2010-05-15

    Purpose: An iterative tomographic reconstruction algorithm that simultaneously segments and reconstructs the reconstruction domain is proposed and applied to tomographic reconstructions from a sparse number of projection images. Methods: The proposed algorithm uses a two-phase level set method segmentation in conjunction with an iterative tomographic reconstruction to achieve simultaneous segmentation and reconstruction. The simultaneous segmentation and reconstruction is achieved by alternating between level set function evolutions and per-region intensity value updates. To deal with the limited number of projections, a priori information about the reconstruction is enforced via a penalized likelihood function. Specifically, a smooth function within each region (a piecewise smooth function overall) and bounded intensity values for each region are assumed. Such a priori information is formulated into a quadratic objective function with linear bound constraints. The level set function evolutions are achieved by artificially time evolving the level set function in the negative gradient direction; the intensity value updates are achieved by using the gradient projection conjugate gradient algorithm. Results: The proposed simultaneous segmentation and reconstruction results were compared to "conventional" iterative reconstruction (with no segmentation), iterative reconstruction followed by segmentation, and filtered backprojection. Improvements of 6%-13% in the normalized root mean square error were observed when the proposed algorithm was applied to simulated projections of a numerical phantom and to real fan-beam projections of the Catphan phantom, both of which did not satisfy the a priori assumptions. Conclusions: The proposed simultaneous segmentation and reconstruction resulted in improved reconstruction image quality. The algorithm correctly segments the reconstruction space into regions, preserves sharp edges between different regions, and smoothes the noise
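
    The alternation at the heart of the method (level set function evolution interleaved with per-region intensity updates) can be caricatured in a two-region piecewise-constant setting without the tomographic forward model. This simplified sketch replaces the level set evolution with a direct region reassignment and the penalized-likelihood intensity update with plain region means, so it only shows the alternating structure, not the paper's algorithm:

```python
import numpy as np

def alternate_segment_reconstruct(img, iters=10):
    """Toy two-phase alternation: assign each pixel to the closer of two
    region intensity values (segmentation step), then update each region's
    intensity as the mean over its pixels (intensity-update step)."""
    c0, c1 = float(img.min()), float(img.max())  # initial region intensities
    seg = img > (c0 + c1) / 2.0
    for _ in range(iters):
        seg = np.abs(img - c1) < np.abs(img - c0)     # segmentation update
        if seg.any() and (~seg).any():
            c0, c1 = float(img[~seg].mean()), float(img[seg].mean())
    return seg, (c0, c1)

# Noisy two-region phantom
rng = np.random.default_rng(0)
truth = np.zeros((32, 32))
truth[8:24, 8:24] = 1.0
img = truth + 0.1 * rng.standard_normal(truth.shape)
seg, (c0, c1) = alternate_segment_reconstruct(img)
```

    In the paper, the segmentation step is a gradient-descent evolution of the level set function and the intensity step is a gradient projection conjugate gradient solve against the projection data; the fixed-point structure is the same.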

  5. Parallel Computation of the Topology of Level Sets

    SciTech Connect

    Pascucci, V; Cole-McLaughlin, K

    2004-12-16

    This paper introduces two efficient algorithms that compute the Contour Tree of a 3D scalar field F and its augmented version with the Betti numbers of each isosurface. The Contour Tree is a fundamental data structure in scientific visualization that is used to preprocess the domain mesh to allow optimal computation of isosurfaces with minimal overhead storage. The Contour Tree can also be used to build user interfaces reporting the complete topological characterization of a scalar field, as shown in Figure 1. Data exploration time is reduced since the user understands the evolution of level set components with changing isovalue. The Augmented Contour Tree provides even more accurate information, segmenting the range space of the scalar field into portions of invariant topology. The exploration time for a single isosurface is also improved since its genus is known in advance. Our first new algorithm augments any given Contour Tree with the Betti numbers of all possible corresponding isocontours in time linear in the size of the tree. Moreover, we show how to extend the scheme introduced in [3] with the Betti number computation without increasing its complexity. Thus, we improve the time complexity of our previous approach [10] from O(m log m) to O(n log n + m), where m is the number of cells and n is the number of vertices in the domain of F. Our second contribution is a new divide-and-conquer algorithm that computes the Augmented Contour Tree with improved efficiency. The approach computes the output Contour Tree by merging two intermediate Contour Trees and is independent of the interpolant. In this way we confine any knowledge regarding a specific interpolant to an independent function that computes the tree for a single cell. We have implemented this function for the trilinear interpolant and plan to replace it with higher order interpolants when needed. The time complexity is O(n + t log n), where t is the number of critical points of F. For the first time
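
    The quantity the Contour Tree organizes, namely how many connected components a level set has as the isovalue changes, can be computed directly for a single isovalue with union-find. This sketch counts superlevel-set components on a 2D grid; the paper works with 3D meshes and computes the whole tree across all isovalues, so this is only the underlying primitive:

```python
import numpy as np

def count_superlevel_components(F, isovalue):
    """Number of 4-connected components of {F >= isovalue} on a 2D grid,
    via union-find with path compression. The Contour Tree encodes how
    this count changes as the isovalue sweeps through the range of F."""
    mask = F >= isovalue
    parent = {}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path compression
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for idx in zip(*np.nonzero(mask)):
        parent[idx] = idx
    for (i, j) in list(parent):
        for nbr in ((i + 1, j), (i, j + 1)):  # right and down neighbors
            if nbr in parent:
                union((i, j), nbr)
    return len({find(a) for a in parent})

# Two Gaussian bumps: one component at low isovalues, two near the peaks
x = np.linspace(-2, 2, 80)
X, Y = np.meshgrid(x, x)
F = np.exp(-((X - 1)**2 + Y**2)) + np.exp(-((X + 1)**2 + Y**2))
```

    Sweeping the isovalue and recording where this count jumps recovers the branching events that the Contour Tree stores explicitly.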

  6. Topology Optimization using the Level Set and eXtended Finite Element Methods: Theory and Applications

    NASA Astrophysics Data System (ADS)

    Villanueva Perez, Carlos Hernan

    Computational design optimization provides designers with automated techniques to develop novel and non-intuitive optimal designs. Topology optimization is a design optimization technique that allows for the evolution of a broad variety of geometries in the optimization process. Traditional density-based topology optimization methods often lack a sufficient resolution of the geometry and physical response, which prevents direct use of the optimized design in manufacturing and the accurate modeling of the physical response of boundary conditions. The goal of this thesis is to introduce a unified topology optimization framework that uses the Level Set Method (LSM) to describe the design geometry and the eXtended Finite Element Method (XFEM) to solve the governing equations and measure the performance of the design. The methodology is presented as an alternative to density-based optimization approaches, and is able to accommodate a broad range of engineering design problems. The framework presents state-of-the-art methods for immersed boundary techniques to stabilize the systems of equations and enforce the boundary conditions, and is studied with applications in 2D and 3D linear elastic structures, incompressible flow, and energy and species transport problems to test the robustness and the characteristics of the method. A comparison of the framework against density-based topology optimization approaches is studied with regards to convergence, performance, and the capability to manufacture the designs. Furthermore, the ability to control the shape of the design to operate within manufacturing constraints is developed and studied. The analysis capability of the framework is validated quantitatively through comparison against previous benchmark studies, and qualitatively through its application to topology optimization problems. The design optimization problems converge to intuitive designs that compare well with the results from previous 2D or density-based studies.

  7. Rehabilitation goal setting with community dwelling adults with acquired brain injury: a theoretical framework derived from clinicians' reflections on practice.

    PubMed

    Prescott, Sarah; Fleming, Jennifer; Doig, Emmah

    2017-06-11

    The aim of this study was to explore clinicians' experiences of implementing goal setting with community dwelling clients with acquired brain injury, and to develop a goal setting practice framework. Grounded theory methodology was employed. Clinicians, representing six disciplines across seven services, were recruited and interviewed until theoretical saturation was achieved. A total of 22 clinicians were interviewed. A theoretical framework was developed to explain how clinicians support clients to actively engage in goal setting in routine practice. The framework incorporates three phases: a needs identification phase, a goal operationalisation phase, and an intervention phase. Contextual factors, including personal and environmental influences, also affect how clinicians and clients engage in this process. Clinicians use additional strategies to support clients with impaired self-awareness. These include structured communication and metacognitive strategies to operationalise goals. For clients with emotional distress, clinicians provide additional time and intervention directed at new identity development. The goal setting practice framework may guide clinicians' understanding of how to engage in client-centred goal setting in brain injury rehabilitation. There is a predilection towards a client-centred goal setting approach in the community setting; however, contextual factors can inhibit implementation of this approach. Implications for Rehabilitation The theoretical framework describes processes used to develop achievable client-centred goals with people with brain injury. Building rapport is a core strategy to engage clients with brain injury in goal setting. Clients with self-awareness impairment benefit from additional metacognitive strategies to participate in goal setting. Clients with emotional distress may need additional time for new identity development.

  8. An Evaluation of the High Level Architecture (HLA) as a Framework for NASA Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reid, Michael R.; Powers, Edward I. (Technical Monitor)

    2000-01-01

    The High Level Architecture (HLA) is a current US Department of Defense and an industry (IEEE-1516) standard architecture for modeling and simulations. It provides a framework and set of functional rules and common interfaces for integrating separate and disparate simulators into a larger simulation. The goal of the HLA is to reduce software costs by facilitating the reuse of simulation components and by providing a runtime infrastructure to manage the simulations. In order to evaluate the applicability of the HLA as a technology for NASA space mission simulations, a Simulations Group at Goddard Space Flight Center (GSFC) conducted a study of the HLA and developed a simple prototype HLA-compliant space mission simulator. This paper summarizes the prototyping effort and discusses the potential usefulness of the HLA in the design and planning of future NASA space missions with a focus on risk mitigation and cost reduction.

  9. Measuring Afterschool Program Quality Using Setting-Level Observational Approaches

    ERIC Educational Resources Information Center

    Oh, Yoonkyung; Osgood, D. Wayne; Smith, Emilie P.

    2015-01-01

    The importance of afterschool hours for youth development is widely acknowledged, and afterschool settings have recently received increasing attention as an important venue for youth interventions, bringing a growing need for reliable and valid measures of afterschool quality. This study examined the extent to which the two observational tools,…

  11. Structural engineering masters level education framework of knowledge for the needs of initial professional practice

    NASA Astrophysics Data System (ADS)

    Balogh, Zsuzsa Enriko

    For at least the last decade, engineering and civil engineering, along with structural engineering as a profession within civil engineering, have faced and continue to face a growing need for "Raising the Bar" in the preparedness of young engineers seeking to become practicing professional engineers. The present consensus of the civil engineering profession is that the increasing need for broad and in-depth knowledge should require young structural engineers to have at least a Masters-Level education. This study focuses on Masters-Level preparedness in the structural engineering area within the civil engineering field. It follows much of the methodology used in the American Society of Civil Engineers (ASCE) Body of Knowledge determination for civil engineering and extends this type of study to better define the portion of the young engineer's preparation beyond the undergraduate program for one specialty area of civil engineering. The objective of this research was to create a Framework of Knowledge for the young engineer which identifies and recognizes the needs of the profession, along with the profession's expectations of how those needs can be achieved in the graduate-level academic setting, in the practice environment, and through lifelong learning opportunities, with an emphasis on the initial five years of experience after completion of a Masters program in structural engineering. This study applied a modified Delphi method to obtain the critical information from members of the structural engineering profession. The results provide a Framework of Knowledge which will be useful to several groups seeking to better ensure the preparedness of future young structural engineers at the Masters-Level.

  12. A framework for performance and data quality assessment of Radio Frequency IDentification (RFID) systems in health care settings.

    PubMed

    van der Togt, Remko; Bakker, Piet J M; Jaspers, Monique W M

    2011-04-01

    RFID offers great opportunities to health care. Nevertheless, prior experiences also show that RFID systems have not been designed and tested in response to the particular needs of health care settings and might introduce new risks. The aim of this study is to present a framework that can be used to assess the performance of RFID systems, particularly in health care settings. We developed a framework describing a systematic approach that can be used for assessing the feasibility of using an RFID technology in a particular healthcare setting and, more specifically, for testing the impact of environmental factors on the quality of RFID-generated data and vice versa. This framework is based on our own experiences with an RFID pilot implementation in an academic hospital in The Netherlands and a literature review concerning RFID test methods and current insights into RFID implementations in healthcare. The implementation of an RFID system within the blood transfusion chain inside a hospital setting was used as a showcase to explain the different phases of the framework. The framework consists of nine phases, including an implementation development plan, RFID and medical equipment interference tests, and data accuracy and data completeness tests to be run in laboratory, simulated field, and real field settings. The potential risks that RFID technologies may bring to the healthcare setting should be thoroughly evaluated before they are introduced into a vital environment. The RFID performance assessment framework that we present can act as a reference model to start an RFID development, engineering, implementation and testing plan and, more specifically, to assess the potential risks of interference and to test the quality of the RFID-generated data potentially influenced by physical objects in specific health care environments. Copyright © 2010 Elsevier Inc. All rights reserved.

  13. Setting the stage for the EPOS ERIC: Integration of the legal, governance and financial framework

    NASA Astrophysics Data System (ADS)

    Atakan, Kuvvet; Bazin, Pierre-Louis; Bozzoli, Sabrina; Freda, Carmela; Giardini, Domenico; Hoffmann, Thomas; Kohler, Elisabeth; Kontkanen, Pirjo; Lauterjung, Jörn; Pedersen, Helle; Saleh, Kauzar; Sangianantoni, Agata

    2017-04-01

    EPOS - the European Plate Observing System - is the ESFRI infrastructure serving the need of the solid Earth science community at large. The EPOS mission is to create a single sustainable, and distributed infrastructure that integrates the diverse European Research Infrastructures for solid Earth science under a common framework. Thematic Core Services (TCS) and Integrated Core Services (Central Hub, ICS-C and Distributed, ICS-D) are key elements, together with NRIs (National Research Infrastructures), in the EPOS architecture. Following the preparatory phase, EPOS has initiated formal steps to adopt an ERIC legal framework (European Research Infrastructure Consortium). The statutory seat of EPOS will be in Rome, Italy, while the ICS-C will be jointly operated by France, UK and Denmark. The TCS planned so far cover: seismology, near-fault observatories, GNSS data and products, volcano observations, satellite data, geomagnetic observations, anthropogenic hazards, geological information modelling, multiscale laboratories and geo-energy test beds for low carbon energy. In the ERIC process, EPOS and all its services must achieve sustainability from a legal, governance, financial, and technical point of view, as well as full harmonization with national infrastructure roadmaps. As EPOS is a distributed infrastructure, the TCSs have to be linked to the future EPOS ERIC from legal and governance perspectives. For this purpose the TCSs have started to organize themselves as consortia and negotiate agreements to define the roles of the different actors in the consortium as well as their commitment to contribute to the EPOS activities. The link to the EPOS ERIC shall be made by service agreements of dedicated Service Providers. A common EPOS data policy has also been developed, based on the general principles of Open Access and paying careful attention to licensing issues, quality control, and intellectual property rights, which shall apply to the data, data products

  14. Telemedicine: what framework, what levels of proof, implementation rules.

    PubMed

    Zannad, Faiez; Maugendre, Philippe; Audry, Antoine; Avril, Carole; Blaise, Lucile; Blin, Olivier; Burnel, Philippe; Falise-Mirat, Béatrice; Girault, Danièle; Giri, Isabelle; Goehrs, Jean-Marie; Lassale, Catherine; Le Meur, Roland; Leurent, Pierre; Ratignier-Carbonneil, Christelle; Rossignol, Patrick; Satonnet, Evelyne; Simon, Pierre; Treluyer, Laurent

    2014-01-01

    The concept of telemedicine was formalised in France in the 2009 "Hospital, patients, health territories" (loi hôpital, patients, santé, territoire) law and the 2010 decree through which it was applied. Many experiments have been carried out and the regulatory institutions (Ministry, Regional Health Agency [Agence régionale de santé, ARS], French National Health Authority [Haute autorité de santé, HAS], etc.) have issued various guidance statements and recommendations on its organisation and on the expectations of its evaluation. With this background, the round table wanted to produce recommendations on different areas of medical telemonitoring (the role of telemonitoring, the regulatory system, the principles for assessment, methods of use and conditions for sustained and seamless deployment). Whilst many studies carried out on new medical telemonitoring approaches have led to the postulate that it offers benefit, both clinically and in terms of patient quality of life, more information is needed to demonstrate its impact on the organisation of healthcare and the associated medico-economic benefit (criteria, methods, resources). Similarly, contractual frameworks for deployment of telemonitoring do exist, although they are complicated and involve many different stakeholders (Director General of the Care Offering [Direction générale de l'offre de soins, DGOS], ARS, HAS, Agency for Shared Health Information Systems [Agence des systèmes d'information partagés de santé, ASIP], French National Data Protection Commission [Commission nationale informatique et libertés, CNIL], French National Medical Council [Conseil national de l'Ordre des médecins, CNOM], etc.) that would benefit from a shared approach and seamless exchange between the partners involved. The current challenge is also to define the conditions required to validate a stable economic model in order to promote organisational change. One topical issue is placing the emphasis on its evaluation and

  15. Exploring Veteran Success through State-Level Administrative Data Sets

    ERIC Educational Resources Information Center

    Massa, Tod; Gogia, Laura

    2017-01-01

    This chapter describes the benefits and challenges of state-level longitudinal data collection on student veterans and offers recommendations for optimizing collection and reporting for the advocacy of student veteran success.

  16. Concurrent Validity of the Independent Reading Level Assessment Framework and a State Assessment

    ERIC Educational Resources Information Center

    Ralston, Nicole C.; Waggoner, Jacqueline M.; Tarasawa, Beth; Jackson, Amy

    2016-01-01

    This study investigates the use of screening assessments within the increasingly popular Response to Intervention (RTI) framework, specifically seeking to collect concurrent validity evidence on one potential new screening tool, the Independent Reading Level Assessment (IRLA) framework. Furthermore, this study builds on existing literature by…

  17. A Conceptual Framework for Educational Design at Modular Level to Promote Transfer of Learning

    ERIC Educational Resources Information Center

    Botma, Yvonne; Van Rensburg, G. H.; Coetzee, I. M.; Heyns, T.

    2015-01-01

    Students bridge the theory-practice gap when they apply in practice what they have learned in class. A conceptual framework was developed that can serve as foundation to design for learning transfer at modular level. The framework is based on an adopted and adapted systemic model of transfer of learning, existing learning theories, constructive…

  19. Threshold estimation based on a p-value framework in dose-response and regression settings.

    PubMed

    Mallik, A; Sen, B; Banerjee, M; Michailidis, G

    2011-12-01

    We use p-values to identify the threshold level at which a regression function leaves its baseline value, a problem motivated by applications in toxicological and pharmacological dose-response studies and environmental statistics. We study the problem in two sampling settings: one where multiple responses can be obtained at a number of different covariate levels, and the other being the standard regression setting with a limited number of response values at each covariate. Our procedure involves testing the hypothesis that the regression function is at its baseline at each covariate value and then computing the potentially approximate p-value of the test. An estimate of the threshold is obtained by fitting a piecewise constant function with a single jump discontinuity, known as a stump, to these observed p-values, as they behave in markedly different ways on the two sides of the threshold. The estimate is shown to be consistent and its finite sample properties are studied through simulations. Our approach is computationally simple and extends to the estimation of the baseline value of the regression function, heteroscedastic errors and to time series. It is illustrated on some real data applications.
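
    The stump-fitting step described above can be sketched in a few lines. This is a minimal illustration of the idea, not the authors' estimator; `fit_stump` is a hypothetical helper that scans every candidate split of the p-values (sorted by covariate) and keeps the one minimizing the within-side squared error.

```python
import numpy as np

def fit_stump(x, p):
    """Fit a piecewise-constant function with a single jump (a 'stump')
    to observed p-values and return the estimated threshold location.

    Left of the threshold the regression function is at baseline, so
    p-values scatter widely (roughly uniform); right of it they collapse
    toward zero.  The split minimizing the summed squared deviation from
    the side-wise means estimates the change point."""
    order = np.argsort(x)
    x, p = np.asarray(x, float)[order], np.asarray(p, float)[order]
    best_err, best_d = np.inf, x[0]
    for i in range(1, len(x)):
        left, right = p[:i], p[i:]
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if err < best_err:
            best_err, best_d = err, x[i]
    return best_d
```

    On idealized data where the p-values drop cleanly from 0.5 to near zero, the returned split coincides with the first covariate value past the true threshold.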

  20. Marker ReDistancing/Level Set Method for High-Fidelity Implicit Interface Tracking

    SciTech Connect

    Robert Nourgaliev; Samet Kadioglu; Vincent Mousseau; Dana Knoll

    2010-02-01

    A hybrid of the Front-Tracking (FT) and the Level-Set (LS) methods is introduced, combining advantages and removing drawbacks of both methods. The kinematics of the interface is treated in a Lagrangian (FT) manner, by tracking markers placed at the interface. The markers are not connected – instead, the interface topology is resolved in an Eulerian (LS) framework, by wrapping a signed distance function around Lagrangian markers each time the markers move. For accuracy and efficiency, we have developed a high-order “anchoring” algorithm and an implicit PDE-based re-distancing. We have demonstrated that the method is 3rd-order accurate in space, near the markers, and therefore 1st-order convergent in curvature; in contrast to traditional PDE-based re-initialization algorithms, which tend to slightly relocate the zero Level Set and can be shown to be non-convergent in curvature. The implicit pseudo-time discretization of the re-distancing equation is implemented within the Jacobian-Free Newton Krylov (JFNK) framework combined with ILU(k) preconditioning. We have demonstrated that the steady-state solutions in pseudo-time can be achieved very efficiently, with iterations (CFL ), in contrast to the explicit re-distancing which requires 100s of iterations with CFL . The most cost-effective algorithm is found to be a hybrid of explicit and implicit discretizations, in which we first apply 10-15 explicit iterations (to bring the initial guess into the ball of convergence of Newton’s method) and then finish with 2-3 implicit steps, bringing the re-distancing equation to a complete steady-state. The eigenscopy of the JFNK-ILU(k) demonstrates the efficiency of the ILU(k) preconditioner, which effectively clusters eigenvalues of the otherwise extremely ill-conditioned Jacobian matrices, thereby enabling the Krylov (GMRES) method to converge with iterations, with only a few levels of ILU fill-ins. Importantly, due to the Level Set localization
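
    For contrast with the implicit JFNK scheme described above, the explicit PDE-based re-distancing that the authors benchmark against can be sketched in one dimension. This is a generic textbook scheme (Godunov upwinding of the pseudo-time equation phi_t = S(phi0)(1 - |phi_x|)) with assumed parameter names, not the implementation reported here.

```python
import numpy as np

def redistance_explicit(phi, dx, iters=100, cfl=0.5):
    """Explicit 1-D PDE re-distancing sketch: iterate
        d(phi)/dtau = S(phi0) * (1 - |d(phi)/dx|)
    in pseudo-time until phi approximates a signed distance function
    (|grad phi| = 1) while keeping the zero crossing roughly in place."""
    phi = phi.copy()
    sgn = phi / np.sqrt(phi**2 + dx**2)  # smoothed sign of the initial field
    dtau = cfl * dx
    for _ in range(iters):
        dm = np.diff(phi, prepend=phi[0]) / dx  # backward difference
        dp = np.diff(phi, append=phi[-1]) / dx  # forward difference
        # Godunov-style selection of the upwind one-sided gradient
        grad = np.where(
            sgn > 0,
            np.sqrt(np.maximum(np.maximum(dm, 0)**2, np.minimum(dp, 0)**2)),
            np.sqrt(np.maximum(np.minimum(dm, 0)**2, np.maximum(dp, 0)**2)),
        )
        phi += dtau * sgn * (1.0 - grad)
    return phi
```

    Starting from a field with the right zero crossing but the wrong slope, a few hundred explicit steps restore |grad phi| to roughly 1 in the interior, which is exactly the iteration count the abstract's hybrid explicit/implicit strategy is designed to avoid.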

  1. High level trigger online calibration framework in ALICE

    NASA Astrophysics Data System (ADS)

    Bablok, S. R.; Djuvsland, Ø.; Kanaki, K.; Nystrand, J.; Richter, M.; Röhrich, D.; Skjerdal, K.; Ullaland, K.; Øvrebekk, G.; Larsen, D.; Alme, J.; Alt, T.; Lindenstruth, V.; Steinbeck, T. M.; Thäder, J.; Kebschull, U.; Böttger, S.; Kalcher, S.; Lara, C.; Panse, R.; Appelshäuser, H.; Ploskon, M.; Helstrup, H.; Hetland, K. F.; Haaland, Ø.; Roed, K.; Thingnæs, T.; Aamodt, K.; Hille, P. T.; Lovhoiden, G.; Skaali, B.; Tveter, T.; Das, I.; Chattopadhyay, S.; Becker, B.; Cicalo, C.; Marras, D.; Siddhanta, S.; Cleymans, J.; Szostak, A.; Fearick, R.; Vaux, G. d.; Vilakazi, Z.

    2008-07-01

    The ALICE High Level Trigger (HLT) is designed to perform event analysis of heavy ion and proton-proton collisions as well as calibration calculations online. A large PC farm, currently under installation, enables analysis algorithms to process these computationally intensive tasks. The HLT receives event data from all major detectors in ALICE. Interfaces to the various other systems provide the analysis software with required additional information. Processed results are sent back to the corresponding systems. To allow online performance monitoring of the detectors, an interface for visualizing these results has been developed.

  2. Joint Infrared Target Recognition and Segmentation Using a Shape Manifold-Aware Level Set

    PubMed Central

    Yu, Liangjiang; Fan, Guoliang; Gong, Jiulu; Havlicek, Joseph P.

    2015-01-01

    We propose new techniques for joint recognition, segmentation and pose estimation of infrared (IR) targets. The problem is formulated in a probabilistic level set framework where a shape constrained generative model is used to provide a multi-class and multi-view shape prior and where the shape model involves a couplet of view and identity manifolds (CVIM). A level set energy function is then iteratively optimized under the shape constraints provided by the CVIM. Since both the view and identity variables are expressed explicitly in the objective function, this approach naturally accomplishes recognition, segmentation and pose estimation as joint products of the optimization process. For realistic target chips, we solve the resulting multi-modal optimization problem by adopting a particle swarm optimization (PSO) algorithm and then improve the computational efficiency by implementing a gradient-boosted PSO (GB-PSO). Evaluation was performed using the Military Sensing Information Analysis Center (SENSIAC) ATR database, and experimental results show that both of the PSO algorithms reduce the cost of shape matching during CVIM-based shape inference. Particularly, GB-PSO outperforms other recent ATR algorithms, which require intensive shape matching, either explicitly (with pre-segmentation) or implicitly (without pre-segmentation). PMID:25938202

  3. Moving Particle Level-Set (MPLS) method for incompressible multiphase flow computation

    NASA Astrophysics Data System (ADS)

    Ng, K. C.; Hwang, Y. H.; Sheu, T. W. H.; Yu, C. H.

    2015-11-01

    An implementation of a multiphase model in a recently developed Moving Particle Pressure Mesh (MPPM) particle-based solver is reported in the current work. By enforcing the divergence-free condition on the background mesh (pressure mesh), the moving particles are merely treated as observation points without intrinsic mass property, which has surmounted several computational deficiencies in the existing Moving Particle Semi-implicit (MPS) method. In the current work, in order to enhance the smoothness of the fluid interface and simulate interfacial flow with large density ratio without rigorous tuning of calibration parameters as required in most of the existing particle methods, a density interpolation scheme is put forward in the current work by using the conservative level-set method to ensure mass conservation. Several multiphase flow cases are simulated and compared with the existing numerical/theoretical solutions. It is encouraging to observe that the present solutions are more accurate than the numerical solutions based on the existing MPS methods. The proposal of the current Moving Particle Level-Set (MPLS) method thus provides a simple yet effective approach in computing incompressible multiphase flow within the numerical framework of particle method.
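
    The density interpolation idea can be illustrated with the standard conservative level set profile: a tanh-shaped, volume-fraction-like field blended convexly between the phase densities. This is a generic sketch of that technique with an assumed interface half-thickness parameter `eps`, not the paper's exact MPLS scheme.

```python
import numpy as np

def conservative_level_set(signed_dist, eps):
    """Map a signed-distance field to the conservative level set
    psi = 0.5 * (tanh(d / (2*eps)) + 1), a smooth field in [0, 1]
    whose interface thickness is controlled by eps."""
    return 0.5 * (np.tanh(signed_dist / (2.0 * eps)) + 1.0)

def interpolate_density(psi, rho_liquid, rho_gas):
    """Density interpolation across the diffuse interface: a convex
    combination of the phase densities weighted by psi, avoiding the
    sharp jump that destabilizes large-density-ratio computations."""
    return rho_gas + (rho_liquid - rho_gas) * psi
```

    Because psi stays bounded in [0, 1] by construction, the interpolated density never over- or under-shoots the two phase densities, even at a 1000:1 ratio.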

  4. Joint infrared target recognition and segmentation using a shape manifold-aware level set.

    PubMed

    Yu, Liangjiang; Fan, Guoliang; Gong, Jiulu; Havlicek, Joseph P

    2015-04-29

    We propose new techniques for joint recognition, segmentation and pose estimation of infrared (IR) targets. The problem is formulated in a probabilistic level set framework where a shape constrained generative model is used to provide a multi-class and multi-view shape prior and where the shape model involves a couplet of view and identity manifolds (CVIM). A level set energy function is then iteratively optimized under the shape constraints provided by the CVIM. Since both the view and identity variables are expressed explicitly in the objective function, this approach naturally accomplishes recognition, segmentation and pose estimation as joint products of the optimization process. For realistic target chips, we solve the resulting multi-modal optimization problem by adopting a particle swarm optimization (PSO) algorithm and then improve the computational efficiency by implementing a gradient-boosted PSO (GB-PSO). Evaluation was performed using the Military Sensing Information Analysis Center (SENSIAC) ATR database, and experimental results show that both of the PSO algorithms reduce the cost of shape matching during CVIM-based shape inference. Particularly, GB-PSO outperforms other recent ATR algorithms, which require intensive shape matching, either explicitly (with pre-segmentation) or implicitly (without pre-segmentation).
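
    A plain particle swarm optimizer of the kind both records build on can be sketched as follows. This is textbook global-best PSO with assumed hyperparameter names (`w`, `c1`, `c2`), not the gradient-boosted GB-PSO variant proposed by the authors.

```python
import numpy as np

def pso(cost, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization sketch.  `cost` maps an
    (n_particles, dim) array of candidates to a vector of costs;
    `bounds` is a (dim, 2) array of [low, high] box constraints."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, (n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_cost = x.copy(), cost(x)
    g = pbest[np.argmin(pbest_cost)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # inertia + attraction to personal best + attraction to global best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        c = cost(x)
        better = c < pbest_cost
        pbest[better], pbest_cost[better] = x[better], c[better]
        g = pbest[np.argmin(pbest_cost)].copy()
    return g, pbest_cost.min()
```

    In the ATR setting the cost would be the level set shape-matching energy over the CVIM's view and identity coordinates; the multi-modality of that energy is what motivates a population-based optimizer over plain gradient descent.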

  5. A novel framework for assessing metadata quality in epidemiological and public health research settings.

    PubMed

    McMahon, Christiana; Denaxas, Spiros

    2016-01-01

    Metadata are critical in epidemiological and public health research. However, a lack of biomedical metadata quality frameworks and limited awareness of the implications of poor quality metadata render data analyses problematic. In this study, we created and evaluated a novel framework to assess metadata quality of epidemiological and public health research datasets. We performed a literature review and surveyed stakeholders to enhance our understanding of biomedical metadata quality assessment. The review identified 11 studies and nine quality dimensions, none of which were specifically aimed at biomedical metadata. 96 individuals completed the survey; of those who submitted data, most only assessed metadata quality sometimes, and eight did not at all. Our framework has four sections: a) general information; b) tools and technologies; c) usability; and d) management and curation. We evaluated the framework using three test cases and sought expert feedback. The framework can assess biomedical metadata quality systematically and robustly.

  6. A novel framework for assessing metadata quality in epidemiological and public health research settings

    PubMed Central

    McMahon, Christiana; Denaxas, Spiros

    2016-01-01

    Metadata are critical in epidemiological and public health research. However, a lack of biomedical metadata quality frameworks and limited awareness of the implications of poor quality metadata render data analyses problematic. In this study, we created and evaluated a novel framework to assess metadata quality of epidemiological and public health research datasets. We performed a literature review and surveyed stakeholders to enhance our understanding of biomedical metadata quality assessment. The review identified 11 studies and nine quality dimensions, none of which were specifically aimed at biomedical metadata. 96 individuals completed the survey; of those who submitted data, most only assessed metadata quality sometimes, and eight did not at all. Our framework has four sections: a) general information; b) tools and technologies; c) usability; and d) management and curation. We evaluated the framework using three test cases and sought expert feedback. The framework can assess biomedical metadata quality systematically and robustly. PMID:27570670

  7. Comparing volume of fluid and level set methods for evaporating liquid-gas flows

    NASA Astrophysics Data System (ADS)

    Palmore, John; Desjardins, Olivier

    2016-11-01

    This presentation demonstrates three numerical strategies for simulating liquid-gas flows undergoing evaporation. The practical aim of this work is to choose a framework capable of simulating the combustion of liquid fuels in an internal combustion engine. Each framework is analyzed with respect to its accuracy and computational cost. All simulations are performed using a conservative, finite volume code for simulating reacting, multiphase flows under the low-Mach assumption. The strategies used in this study correspond to different methods for tracking the liquid-gas interface and handling the transport of the discontinuous momentum and vapor mass fractions fields. The first two strategies are based on conservative, geometric volume of fluid schemes using directionally split and un-split advection, respectively. The third strategy is the accurate conservative level set method. For all strategies, special attention is given to ensuring the consistency between the fluxes of mass, momentum, and vapor fractions. The study performs three-dimensional simulations of an isolated droplet of a single component fuel evaporating into air. Evaporation rates and vapor mass fractions are compared to analytical results.

  8. Bridging Informatics and Implementation Science: Evaluating a Framework to Assess Electronic Health Record Implementations in Community Settings

    PubMed Central

    Richardson, Joshua E.; Abramson, Erika L.; Pfoh, Elizabeth R.; Kaushal, Rainu

    2012-01-01

    Effective electronic health record (EHR) implementations in community settings are critical to promoting safe and reliable EHR use as well as mitigating provider dissatisfaction that often results. The implementation challenge is compounded given the scale and scope of EHR installations that are occurring and will continue to occur over the next five years. However, when compared to EHR evaluations relatively few biomedical informatics researchers have published on evaluating EHR implementations. Fewer still have evaluated EHR implementations in community settings. We report on the methods we used to achieve a novel application of an implementation science framework in informatics to qualitatively evaluate community-based EHR implementations. We briefly provide an overview of the implementation science framework, our methods for adapting it to informatics, the effects the framework had on our qualitative methods of inquiry and analysis, and discuss its potential value for informatics research. PMID:23304351

  9. Investigating the Experience of Outdoor and Adventurous Project Work in an Educational Setting Using a Self-Determination Framework

    ERIC Educational Resources Information Center

    Sproule, John; Martindale, Russell; Wang, John; Allison, Peter; Nash, Christine; Gray, Shirley

    2013-01-01

    The purpose of this study was to carry out a preliminary investigation to explore the use of outdoor and adventurous project work (PW) within an educational setting. Specifically, differences between the PW and normal academic school experiences were examined using a self-determination theory framework integrated with a goal orientation and…

  11. 76 FR 9004 - Public Comment on Setting Achievement Levels in Writing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-16

    ... Public Comment on Setting Achievement Levels in Writing AGENCY: U.S. Department of Education, National Assessment Governing Board. ACTION: Notice, Public Comment on Setting Achievement Levels in Writing. SUMMARY... recommendations to improve the design proposed for setting achievement levels for NAEP in writing. This notice...

  12. 21 CFR 530.23 - Procedure for setting and announcing safe levels.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 6 2014-04-01 2014-04-01 false Procedure for setting and announcing safe levels... for setting and announcing safe levels. (a) FDA may issue an order establishing a safe level for a... in the Federal Register a notice of the order. The notice will include: (1) A statement setting...

  13. Toppled television sets and head injuries in the pediatric population: a framework for prevention.

    PubMed

    Cusimano, Michael D; Parker, Nadine

    2016-01-01

    Injuries to children caused by falling televisions have become more frequent during the last decade. These injuries can be severe and even fatal and are likely to become even more common in the future as TVs increase in size and become more affordable. To formulate guidelines for the prevention of these injuries, the authors systematically reviewed the literature on injuries related to toppling televisions. The authors searched MEDLINE, PubMed, Embase, Scopus, CINAHL (Cumulative Index to Nursing and Allied Health Literature), Cochrane Library, and Google Scholar according to the Cochrane guidelines for all studies involving children 0-18 years of age who were injured by toppled TVs. Factors contributing to injury were categorized using Haddon's Matrix, and the public health approach was used as a framework for developing strategies to prevent these injuries. The vast majority (84%) of the injuries occurred in homes and more than three-fourths were unwitnessed by adult caregivers. The TVs were most commonly large and elevated off the ground. Dressers and other furniture not designed to support TVs were commonly involved in the TV-toppling incident. The case fatality rate varies widely, but almost all deaths reported (96%) were due to brain injuries. Toddlers between the ages of 1 and 3 years most frequently suffer injuries to the head and neck, and they are most likely to suffer severe injuries. Many of these injuries require brain imaging and neurosurgical intervention. Prevention of these injuries will require changes in TV design and legislation as well as increases in public education and awareness. Television-toppling injuries can be easily prevented; however, the rates of injury do not reflect a sufficient level of awareness, nor do they reflect an acceptable effort from an injury prevention perspective.

  14. High-level waste tank farm set point document

    SciTech Connect

    Anthony, J.A. III

    1995-01-15

    Setpoints for nuclear safety-related instrumentation are required for actions determined by the design authorization basis. Minimum requirements need to be established for assuring that setpoints are established and held within specified limits. This document establishes the controlling methodology for changing setpoints of all classifications. The instrumentation under consideration involves the transfer, storage, and volume reduction of radioactive liquid waste in the F- and H-Area High-Level Radioactive Waste Tank Farms. The setpoint document will encompass the PROCESS AREA listed in the Safety Analysis Report (SAR) (DPSTSA-200-10 Sup 18) which includes the diversion box HDB-8 facility. In addition to the PROCESS AREAS listed in the SAR, Building 299-H and the Effluent Transfer Facility (ETF) are also included in the scope.

  15. Level set methods for modelling field evaporation in atom probe.

    PubMed

    Haley, Daniel; Moody, Michael P; Smith, George D W

    2013-12-01

    Atom probe is a nanoscale technique for creating three-dimensional spatially and chemically resolved point datasets, primarily of metallic or semiconductor materials. While atom probe can achieve local high-level resolution, the spatial coherence of the technique is highly dependent upon the evaporative physics in the material and can often result in large geometric distortions in experimental results. The distortions originate from uncertainties in the projection function between the field evaporating specimen and the ion detector. Here we explore the possibility of continuum numerical approximations to the evaporative behavior during an atom probe experiment, and the subsequent propagation of ions to the detector, with particular emphasis placed on the solution of axisymmetric systems, such as isolated particles and multilayer systems. Ultimately, this method may prove critical in rapid modeling of tip shape evolution in atom probe tomography, which itself is a key factor in the rapid generation of spatially accurate reconstructions in atom probe datasets.

  16. Evaluation of the causal framework used for setting national ambient air quality standards.

    PubMed

    Goodman, Julie E; Prueitt, Robyn L; Sax, Sonja N; Bailey, Lisa A; Rhomberg, Lorenz R

    2013-11-01

    A scientifically sound assessment of the potential hazards associated with a substance requires a systematic, objective and transparent evaluation of the weight of evidence (WoE) for causality of health effects. We critically evaluated the current WoE framework for causal determination used in the United States Environmental Protection Agency's (EPA's) assessments of the scientific data on air pollutants for the National Ambient Air Quality Standards (NAAQS) review process, including its methods for literature searches; study selection, evaluation and integration; and causal judgments. The causal framework used in recent NAAQS evaluations has many valuable features, but it could be more explicit in some cases, and some features are missing that should be included in every WoE evaluation. Because of this, it has not always been applied consistently in evaluations of causality, leading to conclusions that are not always supported by the overall WoE, as we demonstrate using EPA's ozone Integrated Science Assessment as a case study. We propose additions to the NAAQS causal framework based on best practices gleaned from a previously conducted survey of available WoE frameworks. A revision of the NAAQS causal framework so that it more closely aligns with these best practices and the full and consistent application of the framework will improve future assessments of the potential health effects of criteria air pollutants by making the assessments more thorough, transparent, and scientifically sound.

  17. Priority setting in HIV/AIDS control in West Java Indonesia: an evaluation based on the accountability for reasonableness framework.

    PubMed

    Tromp, Noor; Prawiranegara, Rozar; Subhan Riparev, Harris; Siregar, Adiatma; Sunjaya, Deni; Baltussen, Rob

    2015-04-01

    Indonesia has insufficient resources to adequately respond to the HIV/AIDS epidemic, and thus faces a great challenge in prioritizing interventions. In many countries, such priority setting processes are typically ad hoc and not transparent leading to unfair decisions. Here, we evaluated the priority setting process in HIV/AIDS control in West Java province against the four conditions of the accountability for reasonableness (A4R) framework: relevance, publicity, appeals and revision, and enforcement. We reviewed government documents and conducted semi-structured qualitative interviews based on the A4R framework with 22 participants of the 5-year HIV/AIDS strategy development for 2008-13 (West Java province) and 2007-11 (Bandung). We found that criteria for priority setting were used implicitly and that the strategies included a wide range of programmes. Many stakeholders were involved in the process but their contribution could be improved and particularly the public and people living with HIV/AIDS could be better engaged. The use of appeal and publicity mechanisms could be more transparent and formally stated. Public regulations are not yet installed to ensure fair priority setting. To increase fairness in HIV/AIDS priority setting, West Java should make improvements on all four conditions of the A4R framework. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014; all rights reserved.

  18. Measuring implementation behaviour of menu guidelines in the childcare setting: confirmatory factor analysis of a theoretical domains framework questionnaire (TDFQ).

    PubMed

    Seward, Kirsty; Wolfenden, Luke; Wiggers, John; Finch, Meghan; Wyse, Rebecca; Oldmeadow, Christopher; Presseau, Justin; Clinton-McHarg, Tara; Yoong, Sze Lin

    2017-04-04

    While there are a number of frameworks which focus on supporting the implementation of evidence-based approaches, few psychometrically valid measures exist to assess constructs within these frameworks. This study aimed to develop and psychometrically assess a scale measuring each domain of the Theoretical Domains Framework for use in assessing the implementation of dietary guidelines within a non-health care setting (childcare services). A 75-item, 14-domain Theoretical Domains Framework Questionnaire (TDFQ) was developed and administered via telephone interview to 202 centre based childcare service cooks who had a role in planning the service menu. Confirmatory factor analysis (CFA) was undertaken to assess the reliability, discriminant validity and goodness of fit of the 14-domain theoretical domain framework measure. For the CFA, five iterative processes of adjustment were undertaken where 14 items were removed, resulting in a final measure consisting of 14 domains and 61 items. For the final measure: the Chi-Square goodness of fit statistic was 3447.19; the Standardized Root Mean Square Residual (SRMR) was 0.070; the Root Mean Square Error of Approximation (RMSEA) was 0.072; and the Comparative Fit Index (CFI) had a value of 0.78. While only one of the three indices supports goodness of fit of the measurement model tested, a 14-domain model with 61 items showed good discriminant validity and internally consistent items. Future research should aim to assess the psychometric properties of the developed TDFQ in other community-based settings.
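
    The fit indices reported above follow standard CFA formulas and can be computed directly from the chi-square statistics. The sketch below uses the conventional definitions; the degrees of freedom and baseline-model values in the usage note are illustrative placeholders, since the abstract does not report them.

```python
import math

def rmsea(chi2, df, n):
    """Root Mean Square Error of Approximation:
    sqrt(max(chi2 - df, 0) / (df * (n - 1))).
    Values around 0.06-0.08 or below are conventionally read as
    acceptable approximate fit."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2, df, chi2_base, df_base):
    """Comparative Fit Index: improvement of the target model over the
    baseline (independence) model; values near 0.90-0.95 or above are
    conventionally read as good fit."""
    d_model = max(chi2 - df, 0.0)
    d_base = max(chi2_base - df_base, 0.0)
    denom = max(d_model, d_base)
    return 1.0 if denom == 0 else 1.0 - d_model / denom
```

    For instance, with the reported chi-square of 3447.19 and N = 202, a hypothetical df of around 1700 would reproduce an RMSEA near the reported 0.072; the actual df is not given in the abstract.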

  19. Setting the Normalcy Level of HI Properties in Isolated Galaxies

    NASA Astrophysics Data System (ADS)

    Espada, D.; Verdes-Montenegro, L.; Athanassoula, E.; Bosma, A.; Huchtmeier, W. K.; Leon, S.; Lisenfeld, U.; Sabater, J.; Sulentic, J.; Verley, S.; Yun, M.

    2010-10-01

    Studying the atomic gas (HI) properties of the most isolated galaxies is essential to quantify the effect that the environment exerts on this sensitive component of the interstellar medium. We observed and compiled HI data for a well defined sample of ˜ 800 galaxies in the Catalog of Isolated Galaxies, as part of the AMIGA project (Analysis of the ISM in Isolated GAlaxies, http://amiga.iaa.es), which considerably enlarges previous samples used to quantify the HI deficiency in galaxies located in denser environments. By studying the shape of 182 HI profiles, we revisited the usually accepted result that, independently of the environment, more than half of the galaxies present a perturbed HI disk. In isolated galaxies this would certainly be a striking result, since these are supposed to be the most relaxed systems, and it has implications for the relaxation time scales of HI disks and the nature of the most frequent perturbing mechanisms in galaxies. Our sample likely exhibits the lowest HI asymmetry level in the local Universe. We found that field samples present an excess of ˜ 20% more asymmetric HI profiles than those in CIG. Still, a small percentage of galaxies in our sample present large asymmetries. Follow-up high resolution VLA maps give insight into the origin of such asymmetries.

  20. Bushmeat genetics: setting up a reference framework for the DNA typing of African forest bushmeat.

    PubMed

    Gaubert, Philippe; Njiokou, Flobert; Olayemi, Ayodeji; Pagani, Paolo; Dufour, Sylvain; Danquah, Emmanuel; Nutsuakor, Mac Elikem K; Ngua, Gabriel; Missoup, Alain-Didier; Tedesco, Pablo A; Dernat, Rémy; Antunes, Agostinho

    2015-05-01

    The bushmeat trade in tropical Africa represents illegal, unsustainable off-takes of millions of tons of wild game - mostly mammals - per year. We sequenced four mitochondrial gene fragments (cyt b, COI, 12S, 16S) in >300 bushmeat items representing nine mammalian orders and 59 morphological species from five western and central African countries (Guinea, Ghana, Nigeria, Cameroon and Equatorial Guinea). Our objectives were to assess the efficiency of cross-species PCR amplification and to evaluate the usefulness of our multilocus approach for reliable bushmeat species identification. We provide a straightforward amplification protocol using a single 'universal' primer pair per gene that generally yielded >90% PCR success rates across orders and was robust to different types of meat preprocessing and DNA extraction protocols. For taxonomic identification, we set up a decision pipeline combining similarity- and tree-based approaches with an assessment of taxonomic expertise and coverage of the GENBANK database. Our multilocus approach permitted us to: (i) adjust for existing taxonomic gaps in GENBANK databases, (ii) assign to the species level 67% of the morphological species hypotheses and (iii) successfully identify samples with uncertain taxonomic attribution (preprocessed carcasses and cryptic lineages). High levels of genetic polymorphism across genes and taxa, together with the excellent resolution observed among species-level clusters (neighbour-joining trees and Klee diagrams) advocate the usefulness of our markers for bushmeat DNA typing. We formalize our DNA typing decision pipeline through an expert-curated query database - DNA BUSHMEAT - that shall permit the automated identification of African forest bushmeat items.
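
    The similarity-based arm of such a decision pipeline reduces to best-hit assignment with an identity threshold. The sketch below is a toy version with hypothetical helper and species names; real pipelines use BLAST-style alignment against curated GENBANK references rather than ungapped comparison.

```python
def percent_identity(a, b):
    """Naive ungapped percent identity between two equal-length aligned
    sequences (a stand-in for a proper alignment score)."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return 100.0 * matches / len(a)

def assign_species(query, references, threshold=98.0):
    """Similarity-based step of a typing pipeline: return the species of
    the best-matching reference if its identity clears the threshold,
    otherwise return None to flag the sample for tree-based or expert
    review (the next stage of the decision pipeline)."""
    best_sp, best_id = None, -1.0
    for species, seq in references.items():
        ident = percent_identity(query, seq)
        if ident > best_id:
            best_sp, best_id = species, ident
    return (best_sp, best_id) if best_id >= threshold else (None, best_id)
```

    The threshold encodes the trade-off the abstract describes: too strict and preprocessed or cryptic samples fall through to manual review; too loose and taxonomic gaps in the reference database produce confident misidentifications.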

  1. An analytical framework for delirium research in palliative care settings: integrated epidemiologic, clinician-researcher, and knowledge user perspectives.

    PubMed

    Lawlor, Peter G; Davis, Daniel H J; Ansari, Mohammed; Hosie, Annmarie; Kanji, Salmaan; Momoli, Franco; Bush, Shirley H; Watanabe, Sharon; Currow, David C; Gagnon, Bruno; Agar, Meera; Bruera, Eduardo; Meagher, David J; de Rooij, Sophia E J A; Adamis, Dimitrios; Caraceni, Augusto; Marchington, Katie; Stewart, David J

    2014-08-01

    Delirium often presents difficult management challenges in the context of goals of care in palliative care settings. The aim was to formulate an analytical framework for further research on delirium in palliative care settings, prioritize the associated research questions, discuss the inherent methodological challenges associated with relevant studies, and outline the next steps in a program of delirium research. We combined multidisciplinary input from delirium researchers and knowledge users at an international delirium study planning meeting, relevant literature searches, focused input of epidemiologic expertise, and a meeting participant and coauthor survey to formulate a conceptual research framework and prioritize research questions. Our proposed framework incorporates three main groups of research questions: the first was predominantly epidemiologic, such as delirium occurrence rates, risk factor evaluation, screening, and diagnosis; the second covers pragmatic management questions; and the third relates to the development of predictive models for delirium outcomes. Based on aggregated survey responses to each research question or domain, the combined modal ratings of "very" or "extremely" important confirmed their priority. Using an analytical framework to represent the full clinical care pathway of delirium in palliative care settings, we identified multiple knowledge gaps in relation to the occurrence rates, assessment, management, and outcome prediction of delirium in this population. The knowledge synthesis generated from adequately powered, multicenter studies to answer the framework's research questions will inform decision making and policy development regarding delirium detection and management and thus help to achieve better outcomes for patients in palliative care settings. Copyright © 2014 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.

  2. Architecture-Driven Level Set Optimization: From Clustering to Subpixel Image Segmentation.

    PubMed

    Balla-Arabe, Souleymane; Gao, Xinbo; Ginhac, Dominique; Brost, Vincent; Yang, Fan

    2016-12-01

    Thanks to their effectiveness, active contour models (ACMs) are of great interest to computer vision scientists. Level set methods (LSMs) constitute the class of geometric active contours; compared with other ACMs, they offer subpixel accuracy and the intrinsic ability to handle topological changes automatically. Nevertheless, LSMs are computationally expensive. One remedy is hardware acceleration on massively parallel devices such as graphics processing units (GPUs), but this raises a question: what accuracy can be reached while keeping the algorithm well suited to a massively parallel architecture? In this paper, we attempt to push the compromise between speed and accuracy, efficiency and effectiveness, beyond that of state-of-the-art methods. To this end, we designed a novel architecture-aware hybrid central processing unit (CPU)-GPU LSM for image segmentation. The initialization step, using the well-known k-means algorithm, is fast even though it executes on the CPU, while the evolution equation of the active contour is inherently local and therefore well suited to GPU-based acceleration. Incorporating local statistics into the level set evolution allows our model to detect boundaries that the clustering algorithm alone does not extract. Compared with several cutting-edge LSMs, the introduced model is faster, more accurate, less prone to local minima, and therefore suitable for automatic systems. Furthermore, it allows two-phase clustering algorithms to benefit from the numerous advantages of LSMs, such as robust, subpixel-accurate segmentation with smooth, closed contours. Intensive experiments demonstrate, objectively and subjectively, the good performance of the introduced framework in terms of both speed and accuracy.
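    The CPU-side initialization described above can be sketched as: cluster intensities with k-means, take the cluster of interest as a binary mask, and embed the mask as a signed distance function to seed the contour. A minimal NumPy sketch under those assumptions (a brute-force distance transform stands in for an efficient one, and the GPU evolution step is omitted):

    ```python
    import numpy as np

    def kmeans_1d(values, k=2, iters=20):
        """Plain k-means on scalar intensities (stand-in for the CPU step)."""
        centers = np.linspace(values.min(), values.max(), k)
        for _ in range(iters):
            labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = values[labels == j].mean()
        return labels, centers

    def signed_distance(mask):
        """Brute-force signed distance to the region boundary; positive inside."""
        h, w = mask.shape
        yy, xx = np.mgrid[0:h, 0:w]
        grid = np.stack([yy.ravel(), xx.ravel()], axis=1).astype(float)
        pts_in = np.argwhere(mask).astype(float)
        pts_out = np.argwhere(~mask).astype(float)
        d_in = np.sqrt(((grid[:, None] - pts_in[None]) ** 2).sum(-1)).min(1)
        d_out = np.sqrt(((grid[:, None] - pts_out[None]) ** 2).sum(-1)).min(1)
        return np.where(mask.ravel(), d_out, -d_in).reshape(mask.shape)

    def initial_level_set(image, k=2):
        """k-means the intensities, take the brightest cluster as the object,
        and embed it as a signed distance function to seed the evolution."""
        labels, centers = kmeans_1d(image.ravel(), k)
        mask = (labels == np.argmax(centers)).reshape(image.shape)
        return signed_distance(mask)

    img = np.zeros((16, 16))
    img[4:12, 4:12] = 1.0             # bright square on dark background
    phi = initial_level_set(img)
    print(phi[8, 8] > 0, phi[0, 0] < 0)   # inside positive, outside negative
    ```

    Because the zero level of phi already hugs the clustered object, the subsequent (GPU) evolution only needs to refine boundaries that clustering alone cannot extract.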

  3. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    SciTech Connect

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-01-01

    We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy, experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters and pore descriptors. On average, PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. DFT-predicted elastic properties such as minimum shear modulus and Young's modulus can differ by an average of 3 and 9 GPa, respectively, for rigid MOFs such as those in the test set. Moreover, we found that the partial charges calculated with vdW-DF2 deviate the most from the other functionals, while there is no significant difference among the partial charges calculated by M06L, PBE, PW91, PBE-D2 and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.
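    Benchmarks of this kind typically reduce to a per-functional error statistic against experiment, such as the mean absolute deviation of predicted lattice constants. A sketch with made-up placeholder numbers (not values from this study):

    ```python
    # Sketch of a per-functional benchmark statistic: mean absolute deviation
    # (MAD) of predicted lattice constants from experiment. All numbers are
    # hypothetical placeholders, not results from the paper.

    def mean_abs_dev(pred, expt):
        """Mean absolute deviation between paired predictions and experiments."""
        return sum(abs(p - e) for p, e in zip(pred, expt)) / len(pred)

    expt_a = [10.2, 25.8, 13.4]                       # experimental a-axis (Å)
    pred_a = {"PBE":    [10.4, 26.1, 13.6],           # hypothetical predictions
              "PBE-D2": [10.25, 25.9, 13.45]}
    for functional, vals in sorted(pred_a.items()):
        print(functional, round(mean_abs_dev(vals, expt_a), 3))
    ```

    The same statistic extends directly to unit cell volumes, bonded parameters, and pore diameters; the paper's 0.5 Å pore-diameter agreement is a statement about exactly this kind of deviation.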

  4. Screening Systems and Decision Making at the Preschool Level: Application of a Comprehensive Validity Framework

    ERIC Educational Resources Information Center

    Kettler, Ryan J.; Feeney-Kettler, Kelly A.

    2011-01-01

    Universal screening is designed to be an efficient method for identifying preschool students with mental health problems, but prior to use, screening systems must be evaluated to determine their appropriateness within a specific setting. In this article, an evidence-based validity framework is applied to four screening systems for identifying…

  5. A global health delivery framework approach to epilepsy care in resource-limited settings.

    PubMed

    Cochran, Maggie F; Berkowitz, Aaron L

    2015-11-15

    The Global Health Delivery (GHD) framework (Farmer, Kim, and Porter, Lancet 2013;382:1060-69) allows for the analysis of health care delivery systems along four axes: a care delivery value chain that incorporates prevention, diagnosis, and treatment of a medical condition; shared delivery infrastructure that integrates care within existing healthcare delivery systems; alignment of care delivery with local context; and generation of economic growth and social development through the health care delivery system. Here, we apply the GHD framework to epilepsy care in rural regions of low- and middle-income countries (LMIC) where there are few or no neurologists. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Conceptual Framework and Levels of Abstraction for a Complex Large-Scale System

    SciTech Connect

    Simpson, Mary J.

    2005-03-23

    A conceptual framework and levels of abstraction are created to apply across all potential threats. Bioterrorism is used as a complex example to describe the general framework. Bioterrorism is unlimited with respect to the use of a specific agent, mode of dissemination, and potential target. Because the threat is open-ended, there is a strong need for a common, systemic understanding of attack scenarios related to bioterrorism. In recognition of this large-scale complex problem, systems are being created to define, design and use the proper level of abstraction and conceptual framework in bioterrorism. The wide variety of biological agents and delivery mechanisms provides an opportunity for dynamic scale changes through the linking or interlinking of existing threat components. Concurrent impacts must be separated and evaluated in terms of a given environment and/or ‘abstraction framework.’

  7. A conceptual framework for organizational readiness to implement nutrition and physical activity programs in early childhood education settings.

    PubMed

    Sharma, Shreela V; Upadhyaya, Mudita; Schober, Daniel J; Byrd-Williams, Courtney

    2014-10-30

    Across multiple sectors, organizational readiness predicts the success of program implementation. However, the factors influencing the readiness of early childhood education (ECE) organizations for implementation of new nutrition and physical activity programs are poorly understood. This study presents a new conceptual framework to measure organizational readiness to implement nutrition and physical activity programs in ECE centers serving children aged 0 to 5 years. The framework was validated for consensus on relevance and generalizability by conducting focus groups; the participants were managers (16 directors and 2 assistant directors) of ECE centers. The framework theorizes that it is necessary to have "collective readiness," which takes into account such factors as resources, organizational operations, work culture, and the collective attitudes, motivation, beliefs, and intentions of ECE staff. Results of the focus groups demonstrated consensus on the relevance of proposed constructs across ECE settings. Including readiness measures during program planning and evaluation could inform implementation of ECE programs targeting nutrition and physical activity behaviors.

  8. Cooperative fuzzy games approach to setting target levels of ECs in quality function deployment.

    PubMed

    Yang, Zhihui; Chen, Yizeng; Yin, Yunqiang

    2014-01-01

    Quality function deployment (QFD) can provide a means of translating customer requirements (CRs) into engineering characteristics (ECs) for each stage of product development and production. The main objective of QFD-based product planning is to determine the target levels of ECs for a new product or service. QFD is a breakthrough tool which can effectively reduce the gap between CRs and a new product/service. Even though there are conflicts among some ECs, the objective of developing a new product is to maximize overall customer satisfaction. Therefore, there may be room for cooperation among ECs. A cooperative game framework combined with fuzzy set theory is developed to determine the target levels of the ECs in QFD. The key to developing the model is the formulation of the bargaining function. In the proposed methodology, the players are viewed as the membership functions of ECs to formulate the bargaining function. The solution for the proposed model is Pareto-optimal. An illustrative example demonstrates the application and performance of the proposed approach.
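    The bargaining idea can be illustrated by treating each EC's fuzzy membership function as a player and choosing the target level that maximizes the product of memberships (a Nash-style bargaining objective). The triangular membership shapes below are illustrative assumptions, not the paper's actual model:

    ```python
    # Sketch: each engineering characteristic (EC) contributes a fuzzy
    # membership "player"; the compromise target level maximizes the product
    # of memberships. Membership shapes are illustrative assumptions.

    def tri(x, a, b, c):
        """Triangular membership function: 0 outside (a, c), peak of 1 at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Two conflicting ECs sharing one normalized design variable x in [0, 1]:
    players = [lambda x: tri(x, 0.0, 0.3, 0.8),   # EC1 prefers lower x
               lambda x: tri(x, 0.2, 0.7, 1.0)]   # EC2 prefers higher x

    best_x, best_val = 0.0, -1.0
    for i in range(1001):                          # grid search over [0, 1]
        x = i / 1000
        val = 1.0
        for mu in players:
            val *= mu(x)
        if val > best_val:
            best_x, best_val = x, val
    print(best_x, round(best_val, 4))              # compromise between the peaks
    ```

    A maximizer of such a product is Pareto-optimal: no EC's membership can be raised without lowering another's, which matches the Pareto-optimality claim in the abstract.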

  9. Cooperative Fuzzy Games Approach to Setting Target Levels of ECs in Quality Function Deployment

    PubMed Central

    Yang, Zhihui; Chen, Yizeng; Yin, Yunqiang

    2014-01-01

    Quality function deployment (QFD) can provide a means of translating customer requirements (CRs) into engineering characteristics (ECs) for each stage of product development and production. The main objective of QFD-based product planning is to determine the target levels of ECs for a new product or service. QFD is a breakthrough tool which can effectively reduce the gap between CRs and a new product/service. Even though there are conflicts among some ECs, the objective of developing a new product is to maximize overall customer satisfaction. Therefore, there may be room for cooperation among ECs. A cooperative game framework combined with fuzzy set theory is developed to determine the target levels of the ECs in QFD. The key to developing the model is the formulation of the bargaining function. In the proposed methodology, the players are viewed as the membership functions of ECs to formulate the bargaining function. The solution for the proposed model is Pareto-optimal. An illustrative example demonstrates the application and performance of the proposed approach. PMID:25097884

  10. An explanatory framework of teachers' perceptions of a positive mealtime environment in a preschool setting.

    PubMed

    Mita, Satoko C; Gray, Samuel A; Goodell, L Suzanne

    2015-07-01

    Attending a preschool center may help preschoolers with growth and development that encourage a healthy lifestyle, including sound eating behaviors. Providing a positive mealtime environment (PME) may be one of the keys to fostering a child's healthy eating habits in the classroom. However, a specific definition of a PME, the components of a PME, or directions on how to create one have not been established. The purpose of this study, therefore, was to explore Head Start teachers' perceptions related to a PME and create a conceptual framework representing these perceptions. To achieve this purpose, researchers conducted 65 in-depth phone interviews with Head Start teachers across the US. Applying principles of grounded theory, researchers developed a conceptual framework depicting teachers' perceptions of a PME, consisting of five key components: (1) the people (i.e., teachers, kitchen staff, parent volunteers, and children), (2) positive emotional tone (e.g., relaxed and happy), (3) rules, expectations, and routines (e.g., family-style mealtime), (4) operations of a PME (i.e., eating, socialization, and learning), and (5) both short- and long-term outcomes of a PME. With this PME framework, researchers may be able to enhance the effectiveness of nutrition interventions related to a PME, focusing on the factors in the conceptual framework as well as barriers associated with achieving these factors.

  11. Validation of the Visitor and Resident Framework in an E-Book Setting

    ERIC Educational Resources Information Center

    Engelsmann, Hazel C.; Greifeneder, Elke; Lauridsen, Nikoline D.; Nielsen, Anja G.

    2014-01-01

    Introduction: By applying the visitor and resident framework to e-book usage, the article explores whether the concepts of a resident and a visitor can help to explain e-book use, and can help to gain a better insight into users' motivations for e-book use. Method: A questionnaire and semi-structured interviews were conducted with users of the…

  12. Empirical Learner Language and the Levels of the "Common European Framework of Reference"

    ERIC Educational Resources Information Center

    Wisniewski, Katrin

    2017-01-01

    The "Common European Framework of Reference" (CEFR) is the most widespread reference tool for linking language tests, curricula, and national educational standards to levels of foreign language proficiency in Europe. In spite of this, little is known about how the CEFR levels (A1-C2) relate to empirical learner language(s). This article…

  13. Alternative Frameworks of the Secondary School Students on the Concept of Condensation at Submicroscopic Level

    ERIC Educational Resources Information Center

    Abdullah, Nurdiana; Surif, Johari; Ismail, Syuhaida

    2016-01-01

    The study was carried out to identify the alternative frameworks on the concept of condensation at submicroscopic level among secondary school students (N = 324). Data was collected by using the qualitative method through the Understanding Test on the Concept of Matter at Submicroscopic Level which consisted of 10 open-ended questions. The…

  14. Learning about Energy: The Influence of Alternative Frameworks, Cognitive Levels, and Closed-Mindedness.

    ERIC Educational Resources Information Center

    Trumper, Ricardo; Gorsky, Paul

    1993-01-01

    This study found no significant relations between junior high school students' (n=50) prior alternative frameworks on energy and their cognitive levels of operation. Significant differences in learning outcomes were achieved by students (n=29) who had higher cognitive level scores. The extent to which students succeeded in learning the energy…

  15. Novel multimodality segmentation using level sets and Jensen-Rényi divergence

    SciTech Connect

    Markel, Daniel; Zaidi, Habib; El Naqa, Issam

    2013-12-15

    Purpose: Positron emission tomography (PET) is playing an increasing role in radiotherapy treatment planning. However, despite progress, robust algorithms for PET and multimodal image segmentation are still lacking, especially if the algorithm were extended to image-guided and adaptive radiotherapy (IGART). This work presents a novel multimodality segmentation algorithm using the Jensen-Rényi divergence (JRD) to evolve the geometric level set contour. The algorithm offers improved noise tolerance which is particularly applicable to segmentation of regions found in PET and cone-beam computed tomography. Methods: A steepest gradient ascent optimization method is used in conjunction with the JRD and a level set active contour to iteratively evolve a contour to partition an image based on statistical divergence of the intensity histograms. The algorithm is evaluated using PET scans of pharyngolaryngeal squamous cell carcinoma with the corresponding histological reference. The multimodality extension of the algorithm is evaluated using 22 PET/CT scans of patients with lung carcinoma and a physical phantom scanned under varying image quality conditions. Results: The average concordance index (CI) of the JRD segmentation of the PET images was 0.56 with an average classification error of 65%. The segmentation of the lung carcinoma images had a maximum diameter relative error of 63%, 19.5%, and 14.8% when using CT, PET, and combined PET/CT images, respectively. The estimated maximal diameters of the gross tumor volume (GTV) showed a high correlation with the macroscopically determined maximal diameters, with an R^2 value of 0.85 and 0.88 using the PET and PET/CT images, respectively. Results from the physical phantom show that the JRD is more robust to image noise compared to mutual information and region growing. Conclusions: The JRD has shown improved noise tolerance compared to mutual information for the purpose of PET image segmentation. Presented is a flexible
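    The statistic driving the contour can be sketched directly: the Jensen-Rényi divergence compares the Rényi entropy of a mixture of histograms with the mixture of their entropies. A minimal version with toy inside/outside histograms (alpha, weights, and histograms are illustrative; the level set evolution itself is not shown):

    ```python
    # Sketch of the Jensen-Rényi divergence (JRD) between the intensity
    # histograms inside and outside the contour. Values are illustrative.
    import math

    def renyi_entropy(p, alpha=0.5):
        """Rényi entropy H_alpha of a normalized histogram p (alpha != 1)."""
        s = sum(pi ** alpha for pi in p if pi > 0)
        return math.log(s) / (1.0 - alpha)

    def jensen_renyi(p, q, w=0.5, alpha=0.5):
        """H_alpha of the mixture minus the mixture of entropies; zero for
        identical histograms, larger the more dissimilar they are
        (nonnegative for alpha in (0, 1], by concavity of H_alpha)."""
        mix = [w * a + (1.0 - w) * b for a, b in zip(p, q)]
        return renyi_entropy(mix, alpha) - (w * renyi_entropy(p, alpha)
                                            + (1.0 - w) * renyi_entropy(q, alpha))

    inside  = [0.8, 0.1, 0.1, 0.0]   # histogram of the bright (tumor) region
    outside = [0.0, 0.1, 0.1, 0.8]   # histogram of the dark background
    print(jensen_renyi(inside, inside))       # identical histograms -> 0.0
    print(jensen_renyi(inside, outside) > 0)  # dissimilar -> positive
    ```

    Gradient ascent on this quantity with respect to the contour pushes the partition toward maximally divergent inside/outside histograms, which is what makes the measure comparatively tolerant of histogram noise.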

  16. An Analytical Framework for Delirium Research in Palliative Care Settings: Integrated Epidemiologic, Clinician-Researcher, and Knowledge User Perspectives

    PubMed Central

    Ansari, Mohammed; Hosie, Annmarie; Kanji, Salmaan; Momoli, Franco; Bush, Shirley H.; Watanabe, Sharon; Currow, David C.; Gagnon, Bruno; Agar, Meera; Bruera, Eduardo; Meagher, David J.; de Rooij, Sophia E.J.A.; Adamis, Dimitrios; Caraceni, Augusto; Marchington, Katie; Stewart, David J.

    2014-01-01

    Context Delirium often presents difficult management challenges in the context of goals of care in palliative care settings. Objectives The aim was to formulate an analytical framework for further research on delirium in palliative care settings, prioritize the associated research questions, discuss the inherent methodological challenges associated with relevant studies, and outline the next steps in a program of delirium research. Methods We combined multidisciplinary input from delirium researchers and knowledge users at an international delirium study planning meeting, relevant literature searches, focused input of epidemiologic expertise, and a meeting participant and coauthor survey to formulate a conceptual research framework and prioritize research questions. Results Our proposed framework incorporates three main groups of research questions: the first was predominantly epidemiologic, such as delirium occurrence rates, risk factor evaluation, screening, and diagnosis; the second covers pragmatic management questions; and the third relates to the development of predictive models for delirium outcomes. Based on aggregated survey responses to each research question or domain, the combined modal ratings of “very” or “extremely” important confirmed their priority. Conclusion Using an analytical framework to represent the full clinical care pathway of delirium in palliative care settings, we identified multiple knowledge gaps in relation to the occurrence rates, assessment, management, and outcome prediction of delirium in this population. The knowledge synthesis generated from adequately powered, multicenter studies to answer the framework’s research questions will inform decision making and policy development regarding delirium detection and management and thus help to achieve better outcomes for patients in palliative care settings. PMID:24726762

  17. A 4D Framework for Ocean Basin Paleodepths and Eustatic Sea Level Change

    NASA Astrophysics Data System (ADS)

    Muller, R.; Sdrolias, M.; Gaina, C.

    2006-12-01

    A digital framework for paleobathymetry of the ocean basins requires the complete reconstruction of ocean floor through time, including the main ocean basins, back-arc basins, and now subducted ocean crust. We reconstruct paleo-oceans by creating "synthetic plates", the locations and geometry of which are established on the basis of preserved ocean crust (magnetic lineations and fracture zones), geological data, and the rules of plate tectonics. We reconstruct the spreading histories of the Pacific, Phoenix, Izanagi, Farallon and Kula plates, the plates involved in the Indian, Atlantic, Caribbean, Arctic, Tethys and Arctic oceanic domains and all plates involved in preserved backarc basins. Based mainly on the GML-standards compliant GPlates software and the Generic Mapping Tools, we have created a set of global oceanic paleo-isochrons and paleoceanic age and depth grids. We show that the late-Cretaceous sea level highstand and the subsequent long-term drop in sea level were primarily caused by the changing age-area distribution of Pacific ocean floor through time. The emplacement of oceanic plateaus resulted in a 40 m sea level rise between 125 and 110 Ma, and a further 60 m rise after 110 Ma, whereas the oceanic age and latitude dependence of marine sediments has resulted in a 40 m sea level rise since about 120 Ma, offsetting the gradual post-80 Ma drop in sea level due to the ageing and deepening mainly of the Pacific ocean basin; the net effect is a drop of about 200 m after 80 Ma. Between 140 Ma and the present, oceanic crustal production dropped by over 40% in the Pacific, but stayed roughly constant in the remaining ocean basins. Our results suggest that the overall magnitude of first-order sea level change implied by Haq's sea level curve is correct.
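    The mechanism linking ocean-floor age to eustatic sea level runs through an age-depth relation for oceanic lithosphere. A commonly used half-space-cooling form (Parsons-Sclater-type, valid for young crust) is sketched below as an assumption; the abstract does not state which relation the authors adopted.

    ```python
    # Sketch of a standard age-depth relation for young oceanic crust:
    # d ~ 2500 m + 350 m * sqrt(age in Myr). This is an assumed, commonly
    # used form, not necessarily the one used in this study.
    import math

    def seafloor_depth(age_myr: float) -> float:
        """Approximate basement depth (m) from crustal age (Myr)."""
        return 2500.0 + 350.0 * math.sqrt(age_myr)

    # An older mean seafloor age implies deeper basins, and hence lower
    # eustatic sea level for a fixed ocean water volume:
    for age in (30, 60, 90):
        print(age, "Myr ->", round(seafloor_depth(age)), "m")
    ```

    Integrating such a relation over the reconstructed age-area distribution of each time slice yields a basin volume, which is how the ageing and deepening of the Pacific floor translates into the long-term sea level drop described above.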

  18. Intervention complexity--a conceptual framework to inform priority-setting in health.

    PubMed

    Gericke, Christian A; Kurowski, Christoph; Ranson, M Kent; Mills, Anne

    2005-04-01

    Health interventions vary substantially in the degree of effort required to implement them. To some extent this is apparent in their financial cost, but the nature and availability of non-financial resources is often of similar importance. In particular, human resource requirements are frequently a major constraint. We propose a conceptual framework for the analysis of interventions according to their degree of technical complexity; this complements the notion of institutional capacity in considering the feasibility of implementing an intervention. Interventions are categorized into four dimensions: characteristics of the basic intervention; characteristics of delivery; requirements on government capacity; and usage characteristics. The analysis of intervention complexity should lead to a better understanding of supply- and demand-side constraints to scaling up, indicate priorities for further research and development, and can point to potential areas for improvement of specific aspects of each intervention to close the gap between the complexity of an intervention and the capacity to implement it. The framework is illustrated using the examples of scaling up condom social marketing programmes, and the DOTS strategy for tuberculosis control in highly resource-constrained countries. The framework could be used as a tool for policy-makers, planners and programme managers when considering the expansion of existing projects or the introduction of new interventions. Intervention complexity thus complements the considerations of burden of disease, cost-effectiveness, affordability and political feasibility in health policy decision-making. Reducing the technical complexity of interventions will be crucial to meeting the health-related Millennium Development Goals.

  19. Multi-domain, Higher Order Level Set Scheme for 3D Image Segmentation on the GPU.

    PubMed

    Sharma, Ojaswa; Zhang, Qin; Anton, François; Bajaj, Chandrajit

    2010-06-13

    Level set method based segmentation provides an efficient tool for topological and geometrical shape handling. Conventional level set surfaces are only C^0 continuous, since the level set evolution involves linear interpolation to compute derivatives. Bajaj et al. present a higher-order method to evaluate level set surfaces that are C^2 continuous, but it is slow due to its high computational burden. In this paper, we provide a higher-order GPU-based solver for fast and efficient segmentation of large volumetric images. We also extend the higher-order method to multi-domain segmentation. Our streaming solver is efficient in memory usage.

  20. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    DOE PAGES

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-01-01

    We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy, experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters and pore descriptors. On average, PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. DFT-predicted elastic properties such as minimum shear modulus and Young's modulus can differ by an average of 3 and 9 GPa, respectively, for rigid MOFs such as those in the test set. Moreover, we found that the partial charges calculated with vdW-DF2 deviate the most from the other functionals, while there is no significant difference among the partial charges calculated by M06L, PBE, PW91, PBE-D2 and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.

  1. A novel level set model with automated initialization and controlling parameters for medical image segmentation.

    PubMed

    Liu, Qingyi; Jiang, Mingyan; Bai, Peirui; Yang, Guang

    2016-03-01

    In this paper, a level set model that requires neither manual placement of an initial contour nor manual setting of controlling parameters is proposed for medical image segmentation. The contribution of this paper is mainly manifested in three points. First, we propose a novel adaptive mean shift clustering method based on global image information to guide the evolution of the level set. By simple threshold processing, the results of mean shift clustering can automatically and quickly generate an initial contour for the level set evolution. Second, we devise several new functions to estimate the controlling parameters of the level set evolution based on the clustering results and image characteristics. Third, the reaction-diffusion method is adopted to supersede the distance regularization term of the RSF level set model, which improves the accuracy and speed of segmentation effectively with less manual intervention. Experimental results demonstrate the performance and efficiency of the proposed model for medical image segmentation.

  2. A framework for outcome-level evaluation of in-service training of health care workers

    PubMed Central

    2013-01-01

    Background In-service training is a key strategic approach to addressing the severe shortage of health care workers in many countries. However, there is a lack of evidence linking these health care worker trainings to improved health outcomes. In response, the United States President’s Emergency Plan for AIDS Relief’s Human Resources for Health Technical Working Group initiated a project to develop an outcome-focused training evaluation framework. This paper presents the methods and results of that project. Methods A general inductive methodology was used for the conceptualization and development of the framework. Fifteen key informant interviews were conducted to explore contextual factors, perceived needs, barriers and facilitators affecting the evaluation of training outcomes. In addition, a thematic analysis of 70 published articles reporting health care worker training outcomes identified key themes and categories. These were integrated, synthesized and compared to several existing training evaluation models. This formed an overall typology which was used to draft a new framework. Finally, the framework was refined and validated through an iterative process of feedback, pilot testing and revision. Results The inductive process resulted in identification of themes and categories, as well as relationships among several levels and types of outcomes. The resulting framework includes nine distinct types of outcomes that can be evaluated, which are organized within three nested levels: individual, organizational and health system/population. The outcome types are: (1) individual knowledge, attitudes and skills; (2) individual performance; (3) individual patient health; (4) organizational systems; (5) organizational performance; (6) organizational-level patient health; (7) health systems; (8) population-level performance; and (9) population-level health. The framework also addresses contextual factors which may influence the outcomes of training, as well as the

  3. A framework for outcome-level evaluation of in-service training of health care workers.

    PubMed

    O'Malley, Gabrielle; Perdue, Thomas; Petracca, Frances

    2013-10-01

    In-service training is a key strategic approach to addressing the severe shortage of health care workers in many countries. However, there is a lack of evidence linking these health care worker trainings to improved health outcomes. In response, the United States President's Emergency Plan for AIDS Relief's Human Resources for Health Technical Working Group initiated a project to develop an outcome-focused training evaluation framework. This paper presents the methods and results of that project. A general inductive methodology was used for the conceptualization and development of the framework. Fifteen key informant interviews were conducted to explore contextual factors, perceived needs, barriers and facilitators affecting the evaluation of training outcomes. In addition, a thematic analysis of 70 published articles reporting health care worker training outcomes identified key themes and categories. These were integrated, synthesized and compared to several existing training evaluation models. This formed an overall typology which was used to draft a new framework. Finally, the framework was refined and validated through an iterative process of feedback, pilot testing and revision. The inductive process resulted in identification of themes and categories, as well as relationships among several levels and types of outcomes. The resulting framework includes nine distinct types of outcomes that can be evaluated, which are organized within three nested levels: individual, organizational and health system/population. The outcome types are: (1) individual knowledge, attitudes and skills; (2) individual performance; (3) individual patient health; (4) organizational systems; (5) organizational performance; (6) organizational-level patient health; (7) health systems; (8) population-level performance; and (9) population-level health. The framework also addresses contextual factors which may influence the outcomes of training, as well as the ability of evaluators to

  4. Intervention complexity--a conceptual framework to inform priority-setting in health.

    PubMed Central

    Gericke, Christian A.; Kurowski, Christoph; Ranson, M. Kent; Mills, Anne

    2005-01-01

    Health interventions vary substantially in the degree of effort required to implement them. To some extent this is apparent in their financial cost, but the nature and availability of non-financial resources is often of similar importance. In particular, human resource requirements are frequently a major constraint. We propose a conceptual framework for the analysis of interventions according to their degree of technical complexity; this complements the notion of institutional capacity in considering the feasibility of implementing an intervention. Interventions are categorized into four dimensions: characteristics of the basic intervention; characteristics of delivery; requirements on government capacity; and usage characteristics. The analysis of intervention complexity should lead to a better understanding of supply- and demand-side constraints to scaling up, indicate priorities for further research and development, and can point to potential areas for improvement of specific aspects of each intervention to close the gap between the complexity of an intervention and the capacity to implement it. The framework is illustrated using the examples of scaling up condom social marketing programmes, and the DOTS strategy for tuberculosis control in highly resource-constrained countries. The framework could be used as a tool for policy-makers, planners and programme managers when considering the expansion of existing projects or the introduction of new interventions. Intervention complexity thus complements the considerations of burden of disease, cost-effectiveness, affordability and political feasibility in health policy decision-making. Reducing the technical complexity of interventions will be crucial to meeting the health-related Millennium Development Goals. PMID:15868020

  5. A framework for evaluating safety-net and other community-level factors on access for low-income populations.

    PubMed

    Davidson, Pamela L; Andersen, Ronald M; Wyn, Roberta; Brown, E Richard

    2004-01-01

    The framework presented in this article extends the Andersen behavioral model of health services utilization research to examine the effects of contextual determinants of access. A conceptual framework is suggested for selecting and constructing contextual (or community-level) variables representing the social, economic, structural, and public policy environment that influence low-income people's use of medical care. Contextual variables capture the characteristics of the population that disproportionately relies on the health care safety net, the public policy support for low-income and safety-net populations, and the structure of the health care market and safety-net services within that market. Until recently, the literature in this area has been largely qualitative and descriptive and few multivariate studies comprehensively investigated the contextual determinants of access. The comprehensive and systematic approach suggested by the framework will enable researchers to strengthen the external validity of results by accounting for the influence of a consistent set of contextual factors across locations and populations. A subsequent article in this issue of Inquiry applies the framework to examine access to ambulatory care for low-income adults, both insured and uninsured.

  6. Improving adolescent health policy: incorporating a framework for assessing state-level policies.

    PubMed

    Brindis, Claire D; Moore, Kristin

    2014-01-01

    Many US policies that affect health are made at the state, not the federal, level. Identifying state-level policies and data to analyze how different policies affect outcomes may help policy makers ascertain the usefulness of their public policies and funding decisions in improving the health of adolescent populations. A framework for describing and assessing the role of federal and state policies on adolescent health and well-being is proposed; an example of how the framework might be applied to the issue of teen childbearing is included. Such a framework can also help inform analyses of whether and how state and federal policies contribute to the variation across states in meeting adolescent health needs. A database on state policies, contextual variables, and health outcomes data can further enable researchers and policy makers to examine how these factors are associated with behaviors they aim to impact.

  7. An improved particle correction procedure for the particle level set method

    NASA Astrophysics Data System (ADS)

    Wang, Zhaoyuan; Yang, Jianming; Stern, Frederick

    2009-09-01

    The particle level set method [D. Enright, R. Fedkiw, J. Ferziger, I. Mitchell, A hybrid particle level set method for improved interface capturing, J. Comput. Phys. 183 (2002) 83-116.] can substantially improve the mass conservation property of the level set method by using Lagrangian marker particles to correct the level set function in the under-resolved regions. In this study, the limitations of the particle level set method due to the errors introduced in the particle correction process are analyzed, and an improved particle correction procedure is developed based on a new interface reconstruction scheme. Moreover, the zero level set is "anchored" as the level set functions are reinitialized; hence the additional particle correction after the level set reinitialization is avoided. With this new scheme, a well-defined zero level set can be obtained and the disturbances to the interface are significantly reduced. Consequently, the particle reseeding operation will barely result in the loss of interface characteristics and can be applied as frequently as necessary. To demonstrate the accuracy and robustness of the proposed method, two extreme particle reseeding strategies, one without reseeding and the other with reseeding every time step, are applied in several benchmark advection tests and the results are compared with each other. Three interfacial flow cases, a 2D surface tension driven oscillating droplet, a 2D gas bubble rising in a quiescent liquid, and a 3D drop impact onto a liquid pool are simulated to illustrate the advantages of the current method over the level set and the original particle level set methods with regard to the smoothness of geometric properties and mass conservation in real physical applications.
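
    As a minimal illustration of the implicit-interface bookkeeping behind this method (our own sketch, not the paper's code): the interface is the zero level set of a signed distance function, and the enclosed "mass" (area in 2D) can be measured with a smoothed Heaviside function. Drift in this quantity under advection and reinitialization is exactly the error that the marker particles are used to correct.

```python
import numpy as np

def signed_distance_circle(n, cx, cy, r, h=1.0):
    """Signed distance to a circle on an n x n grid: negative inside."""
    coords = np.arange(n) * h
    X, Y = np.meshgrid(coords, coords, indexing="ij")
    return np.sqrt((X - cx) ** 2 + (Y - cy) ** 2) - r

def enclosed_area(phi, h=1.0, eps=1.5):
    """Area of the region phi < 0, via a tanh-smoothed Heaviside of width eps*h."""
    H = 0.5 * (1.0 - np.tanh(phi / (eps * h)))
    return float(H.sum()) * h * h

phi = signed_distance_circle(128, 64.0, 64.0, 20.0)
area = enclosed_area(phi)   # should be close to pi * 20**2, about 1256.6
```

    In the full method, Lagrangian marker particles seeded near the interface detect where the grid level set has lost mass in under-resolved regions and rebuild the zero level set locally.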

  8. A framework of comfort for practice: An integrative review identifying the multiple influences on patients' experience of comfort in healthcare settings.

    PubMed

    Wensley, Cynthia; Botti, Mari; McKillop, Ann; Merry, Alan F

    2017-04-01

    Comfort is central to patient experience but the concept of comfort is poorly defined. This review aims to develop a framework representing patients' complex perspective of comfort to inform practice and guide initiatives to improve the quality of healthcare. CINAHL, MEDLINE Complete, PsycINFO and Google Scholar (November 2016); reference lists of included publications. Qualitative and theoretical studies advancing knowledge about the concept of comfort in healthcare settings. Studies rated for methodological quality and relevance to patients' perspectives. Data on design, methods, features of the concept of comfort, influences on patients' comfort. Data were systematically coded and categorized using Framework method. Sixty-two studies (14 theoretical and 48 qualitative) were included. Qualitative studies explored patient and staff perspectives in varying healthcare settings including hospice, emergency departments, paediatric, medical and surgical wards and residential care for the elderly. From patients' perspective, comfort is multidimensional, characterized by relief from physical discomfort and feeling positive and strengthened in one's ability to cope with the challenges of illness, injury and disability. Different factors are important to different individuals. We identified 10 areas of influence within four interrelated levels: patients' use of self-comforting strategies; family presence; staff actions and behaviours; and environmental factors. Our data provide new insights into the nature of comfort as a highly personal and contextual experience influenced in different individuals by different factors that we have classified into a framework to guide practice and quality improvement initiatives.

  9. A framework for testing and promoting expanded dissemination of promising preventive interventions that are being implemented in community settings.

    PubMed

    Mason, W Alex; Fleming, Charles B; Thompson, Ronald W; Haggerty, Kevin P; Snyder, James J

    2014-10-01

    Many evidence-based preventive interventions have been developed in recent years, but few are widely used. With the current focus on efficacy trials, widespread dissemination and implementation of evidence-based interventions are often afterthoughts. One potential strategy for reversing this trend is to find a promising program with a strong delivery vehicle in place and improve and test the program's efficacy through rigorous evaluation. If the program is supported by evidence, the dissemination vehicle is already in place and potentially can be expanded. This strategy has been used infrequently and has met with limited success to date, in part, because the field lacks a framework for guiding such research. To address this gap, we outline a framework for moving promising preventive interventions that are currently being implemented in community settings through a process of rigorous testing and, if needed, program modification in order to promote expanded dissemination. The framework is guided by RE-AIM (Reach, Efficacy/Effectiveness, Adoption, Implementation, and Maintenance) (Glasgow et al., Am J Publ Health 89:1322-1327, 1999), which focuses attention on external as well as internal validity in program tests, and is illustrated with examples. Challenges, such as responding to negative and null results, and opportunities inherent in the framework are discussed.

  10. Integrating Compact Constraint and Distance Regularization with Level Set for Hepatocellular Carcinoma (HCC) Segmentation on Computed Tomography (CT) Images

    NASA Astrophysics Data System (ADS)

    Gui, Luying; He, Jian; Qiu, Yudong; Yang, Xiaoping

    2017-01-01

    This paper presents a variational level set approach to segment lesions with compact shapes in medical images. In this study, we address the problem of segmenting hepatocellular carcinomas, which are usually of various shapes, variable intensities, and weak boundaries. An efficient constraint, called the isoperimetric constraint, is applied in this method to describe the compactness of shapes. In addition, in order to ensure precise segmentation and stable movement of the level set, a distance regularization is also implemented in the proposed variational framework. Our method is applied to segment various hepatocellular carcinoma regions in Computed Tomography images with promising results. Comparison results also show that the proposed method is more accurate than two other approaches.
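
    The compactness prior at work here can be illustrated numerically (our own example, not the paper's implementation): the isoperimetric quotient P^2 / (4 * pi * A) equals 1 for a disk and grows as a shape becomes less compact, which makes it a natural penalty for lesions that are roughly blob-shaped.

```python
import numpy as np

def isoperimetric_quotient(perimeter, area):
    """1.0 for a disk; larger for less compact shapes."""
    return perimeter ** 2 / (4.0 * np.pi * area)

r = 3.0
disk = isoperimetric_quotient(2.0 * np.pi * r, np.pi * r ** 2)  # exactly 1
square = isoperimetric_quotient(4 * 2.0, 2.0 ** 2)              # 4/pi, about 1.27
```

    Penalizing this quotient in the level set energy discourages the contour from leaking through weak boundaries into long, thin protrusions.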

  11. Level-set immersed boundary method for simulating 3D turbulent free surface flows in arbitrarily complex open channels

    NASA Astrophysics Data System (ADS)

    Kang, Seokkoo; Sotiropoulos, Fotis

    2010-11-01

    A numerical method is developed for simulating three-dimensional free surface flows in open channels of arbitrarily complex bathymetry. The complex geometry is handled using the curvilinear immersed boundary (CURVIB) method of Ge and Sotiropoulos (J. of Computational Physics, 2007) and free surface deformation is modeled by employing a two-phase flow level-set approach. A new method is developed for solving the level-set equations and the reinitialization equation in the context of the CURVIB framework. The method is validated for various free-surface model problems and its capabilities are demonstrated by applying it to simulate turbulent free-surface flow in an open channel with embedded complex hydraulic structures.
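
    For orientation, the two-phase level-set approach evolves a scalar field whose zero level set marks the free surface; in standard (generic) form, not quoted from this paper,

```latex
\frac{\partial \phi}{\partial t} + \mathbf{u} \cdot \nabla \phi = 0,
\qquad
\frac{\partial \phi}{\partial \tau} = \operatorname{sgn}(\phi_0)\,\bigl(1 - |\nabla \phi|\bigr),
```

    where the first equation advects the interface with the fluid velocity u, and the second (the reinitialization equation, solved in pseudo-time) restores the signed-distance property |∇φ| = 1 near the interface.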

  12. A level set approach for left ventricle detection in CT images using shape segmentation and optical flow

    NASA Astrophysics Data System (ADS)

    Brieva, Jorge; Moya-Albor, Ernesto; Escalante-Ramírez, Boris

    2015-01-01

    The left ventricle (LV) segmentation plays an important role in a subsequent process for the functional analysis of the LV. Typical segmentation of the endocardium wall in the ventricle excludes papillary muscles which leads to an incorrect measure of the ejected volume in the LV. In this paper we present a new variational strategy using a 2D level set framework that includes a local term for enhancing the low contrast structures and a 2D shape model. The shape model in the level set method is propagated to all image sequences corresponding to the cardiac cycles through the optical flow approach using the Hermite transform. To evaluate our strategy we use the Dice index and the Hausdorff distance to compare the segmentation results with the manual segmentation carried out by the physician.

  13. Joint Target Tracking, Recognition and Segmentation for Infrared Imagery Using a Shape Manifold-Based Level Set

    PubMed Central

    Gong, Jiulu; Fan, Guoliang; Yu, Liangjiang; Havlicek, Joseph P.; Chen, Derong; Fan, Ningjun

    2014-01-01

    We propose a new integrated target tracking, recognition and segmentation algorithm, called ATR-Seg, for infrared imagery. ATR-Seg is formulated in a probabilistic shape-aware level set framework that incorporates a joint view-identity manifold (JVIM) for target shape modeling. As a shape generative model, JVIM features a unified manifold structure in the latent space that is embedded with one view-independent identity manifold and infinite identity-dependent view manifolds. In the ATR-Seg algorithm, the ATR problem is formulated as a sequential level-set optimization process over the latent space of JVIM, so that tracking and recognition can be jointly optimized via implicit shape matching, where target segmentation is achieved as a by-product without any pre-processing or feature extraction. Experimental results on the recently released SENSIAC ATR database demonstrate the advantages and effectiveness of ATR-Seg over two recent ATR algorithms that involve explicit shape matching. PMID:24919014

  14. Joint target tracking, recognition and segmentation for infrared imagery using a shape manifold-based level set.

    PubMed

    Gong, Jiulu; Fan, Guoliang; Yu, Liangjiang; Havlicek, Joseph P; Chen, Derong; Fan, Ningjun

    2014-06-10

    We propose a new integrated target tracking, recognition and segmentation algorithm, called ATR-Seg, for infrared imagery. ATR-Seg is formulated in a probabilistic shape-aware level set framework that incorporates a joint view-identity manifold (JVIM) for target shape modeling. As a shape generative model, JVIM features a unified manifold structure in the latent space that is embedded with one view-independent identity manifold and infinite identity-dependent view manifolds. In the ATR-Seg algorithm, the ATR problem is formulated as a sequential level-set optimization process over the latent space of JVIM, so that tracking and recognition can be jointly optimized via implicit shape matching, where target segmentation is achieved as a by-product without any pre-processing or feature extraction. Experimental results on the recently released SENSIAC ATR database demonstrate the advantages and effectiveness of ATR-Seg over two recent ATR algorithms that involve explicit shape matching.

  15. Education leadership in the clinical health care setting: a framework for nursing education development.

    PubMed

    Mockett, Lynda; Horsfall, Janine; O'Callaghan, Wendy

    2006-12-01

    This paper describes how a new framework for clinical nursing education was introduced at Counties Manukau District Health Board (CMDHB), New Zealand. The project was initiated in response to the significant legislative and post-registration nursing education changes within New Zealand. The journey of change has been a significant undertaking, and has required clear management, strong leadership, perseverance and understanding of the organisation's culture. The approach taken to managing the change had four stages, and reflects various change management models. The first stage, the identification process, identified the impetus for change. Creating the vision is the second stage, which identified what the change would look like within the organisation. To ensure success and to guide the process of change, a realistic and sustainable vision was developed. Implementing the vision was the third stage, which covers the communication and pilot phase of implementing the nursing education framework. Stage four, embedding the vision, explores the process and experiences of changing an education culture and embedding the vision into an organisation. The paper concludes by discussing the importance of implementing robust, consistent, strategic and collaborative processes that reflect and evaluate best educational nursing practice.

  16. The Agenda Setting Function of the Mass Media at Three Levels of "Information Holding"

    ERIC Educational Resources Information Center

    Benton, Marc; Frazier, P. Jean

    1976-01-01

    Extends the theoretical concept of agenda setting to include awareness of general issues, awareness of proposed solutions, and specific knowledge about the proposals. Examines whether or not agenda setting is operative at these levels and compares findings with previous agenda setting studies. (MH)

  17. Educational Preparation and Experiences in the Clinical Setting: Entry-Level Clinical Athletic Trainers' Perspectives

    ERIC Educational Resources Information Center

    Schilling, Jim

    2011-01-01

    Context: The clinical job setting (outpatient/ambulatory/rehabilitation clinic) should no longer be referred to as a nontraditional setting, as it employs the greatest percentage of certified members. Understanding the experiences, knowledge, and skills necessary to be successful in the clinical setting as entry-level certified athletic trainers…

  18. Implementing the New State Framework for History-Social Studies of (Tenth Grade Level).

    ERIC Educational Resources Information Center

    Leavey, Don

    1990-01-01

    Describes experience of implementing new California History Social Science Framework at the tenth grade level at Edison High School, Huntington Beach, California. Discusses the anxieties felt by teachers as they omitted areas of world history to teach selected topics in greater depth. Presents the world history course structure that was developed…

  20. A set of STS assays targeting the chromosome 22 physical framework markers

    SciTech Connect

    MacCollin, M.; Romano, D.; Trofatter, J.; Menon, A.; Gusella, J.; Budarf, M.; Emanuel, B. (Children's Hospital, Philadelphia, PA); Denny, C.; Rouleau, G.; Fontaine, B.

    1993-03-01

    The widespread use of the sequence-tagged site (STS) as a quick, efficient, and reproducible assay for comparing physical and genetic map information promises to greatly facilitate the long-range goals of mapping the human genome. The authors have designed 21 STS assays for loci on human chromosome 22. These assays primarily tag the physical framework markers of the long arm of chromosome 22, but additional assays have been designed from known genes and loci in the neurofibromatosis 2 (NF2) region. The availability of these assays will make these loci available to the research community without physical transfer of materials and will serve as starting points for further efforts to physically map chromosome 22 with yeast artificial chromosome clones. 19 refs., 1 fig., 1 tab.

  1. Motivation and engagement in music and sport: testing a multidimensional framework in diverse performance settings.

    PubMed

    Martin, Andrew J

    2008-02-01

    The present study assessed the application of a multidimensional model of motivation and engagement (the Motivation and Engagement Wheel) and its accompanying instrumentation (the Motivation and Engagement Scale) to the music and sport domains. Participants were 463 young classical musicians (N=224) and sportspeople (N=239). In both music and sport samples, the data confirmed the good fit of the four hypothesized higher-order dimensions and their 11 first-order dimensions: adaptive cognitions (self-efficacy, valuing, mastery orientation), adaptive behaviors (planning, task management, persistence), impeding/maladaptive cognitions (uncertain control, anxiety, failure avoidance), and maladaptive behaviors (self-handicapping, disengagement). Multigroup tests of factor invariance showed that in terms of underlying motivational constructs and the composition of and relationships among these constructs, key subsamples are not substantially different. Moreover-and of particular relevance to issues around the generalizability of the framework-the factor structure for music and sport samples was predominantly invariant.

  2. Evidence-Based Standard Setting: Establishing a Validity Framework for Cut Scores

    ERIC Educational Resources Information Center

    McClarty, Katie Larsen; Way, Walter D.; Porter, Andrew C.; Beimers, Jennifer N.; Miles, Julie A.

    2013-01-01

    Performance standards are a powerful way to communicate K-12 student achievement (e.g., proficiency) and are the cornerstone of standards-based reform. As education reform shifts the focus to college and career readiness, approaches for setting performance standards need to be revised. We argue that the focus on assessing student readiness can…

  3. Design of the control set in the framework of variational data assimilation

    NASA Astrophysics Data System (ADS)

    Gejadze, I. Yu.; Malaterre, P.-O.

    2016-11-01

    Solving data assimilation problems under uncertainty in basic model parameters and in source terms may require a careful design of the control set. The task is to avoid combinations of the control variables which may either lead to ill-posedness of the control problem formulation or compromise the robustness of the solution procedure. We suggest a method for quantifying the performance of a control set which is formed as a subset of the full set of uncertainty-bearing model inputs. Based on this quantity one can decide whether the chosen 'safe' control set is sufficient in terms of prediction accuracy. Technically, the method is a generalization of the 'variational' uncertainty quantification method for observed systems. It is implemented as a matrix-free method, thus allowing high-dimensional applications. Moreover, if Automatic Differentiation is utilized for computing the tangent linear and adjoint mappings, it can be applied to any multi-input 'black-box' system. As an application example we consider the full Saint-Venant hydraulic network model SIC2, which describes the flow dynamics in river and canal networks. The developed methodology seems useful in the context of the future SWOT satellite mission, which will provide observations of river systems whose properties are known with limited precision.

  4. Intellectual Curiosity in Action: A Framework to Assess First-Year Seminars in Liberal Arts Settings

    ERIC Educational Resources Information Center

    Kolb, Kenneth H.; Longest, Kyle C.; Barnett, Jenna C.

    2014-01-01

    Fostering students' intellectual curiosity is a common goal of first-year seminar programs--especially in liberal arts settings. The authors propose an alternative method to assess this ambiguous, value-laden concept. Relying on data gathered from pre- and posttest in-depth interviews of 34 students enrolled in first-year seminars, they construct…

  6. An Examination of the Replicability of Angoff Standard Setting Results within a Generalizability Theory Framework

    ERIC Educational Resources Information Center

    Clauser, Jerome C.; Margolis, Melissa J.; Clauser, Brian E.

    2014-01-01

    Evidence of stable standard setting results over panels or occasions is an important part of the validity argument for an established cut score. Unfortunately, due to the high cost of convening multiple panels of content experts, standards often are based on the recommendation from a single panel of judges. This approach implicitly assumes that…

  7. Treating Voice Disorders in the School-Based Setting: Working within the Framework of IDEA

    ERIC Educational Resources Information Center

    Ruddy, Bari Hoffman; Sapienza, Christine M.

    2004-01-01

    The role of the speech-language pathologist (SLP) has developed considerably over the last 10 years given the medical and technological advances in life-sustaining procedures. Over time, children born with congenital, surgical, or "medically fragile" conditions have become mainstreamed into regular school-based settings, thus extending…

  8. Integrating spatial fuzzy clustering with level set methods for automated medical image segmentation.

    PubMed

    Li, Bing Nan; Chui, Chee Kong; Chang, Stephen; Ong, S H

    2011-01-01

    The performance of level set segmentation is subject to appropriate initialization and optimal configuration of controlling parameters, which require substantial manual intervention. A new fuzzy level set algorithm is proposed in this paper to facilitate medical image segmentation. It is able to evolve directly from the initial segmentation obtained by spatial fuzzy clustering. The controlling parameters of level set evolution are also estimated from the results of fuzzy clustering. Moreover, the fuzzy level set algorithm is enhanced with locally regularized evolution. Such improvements facilitate level set manipulation and lead to more robust segmentation. Performance evaluation of the proposed algorithm was carried out on medical images from different modalities. The results confirm its effectiveness for medical image segmentation.
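
    The initialization idea can be sketched in a few lines. This is a toy 1-D illustration under our own assumptions (two clusters, synthetic intensities), not the authors' implementation: run fuzzy c-means on the intensities, then derive an initial level set function from the membership map of the object cluster, positive inside the object and negative outside.

```python
import numpy as np

def fuzzy_cmeans_1d(x, c=2, m=2.0, iters=50, seed=0):
    """Fuzzy c-means on a flat intensity array; returns (memberships, centers)."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                      # memberships sum to 1 per pixel
    for _ in range(iters):
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = d ** (-2.0 / (m - 1.0))         # standard FCM membership update
        u /= u.sum(axis=0)
    return u, centers

# Toy "image": dark background near 0.1, bright object near 0.9, mild noise.
rng = np.random.default_rng(1)
img = np.concatenate([np.full(200, 0.1), np.full(100, 0.9)])
img = img + 0.02 * rng.standard_normal(img.size)

u, centers = fuzzy_cmeans_1d(img)
obj = int(np.argmax(centers))   # cluster with the brighter center
phi0 = u[obj] - 0.5             # initial level set: zero where membership = 0.5
```

    In the algorithm described above, the same clustering results also supply estimates for the level set controlling parameters; here they only supply the initial contour.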

  9. Steepest-entropy-ascent nonequilibrium quantum thermodynamic framework to model chemical reaction rates at an atomistic level.

    PubMed

    Beretta, G P; Al-Abbasi, Omar; von Spakovsky, M R

    2017-04-01

    The steepest entropy ascent (SEA) dynamical principle provides a general framework for modeling the dynamics of nonequilibrium (NE) phenomena at any level of description, including the atomistic one. It has recently been shown to provide a precise implementation and meaning to the maximum entropy production principle and to encompass many well-established theories of nonequilibrium thermodynamics into a single unifying geometrical framework. Its original formulation in the framework of quantum thermodynamics (QT) assumes the simplest and most natural Fisher-Rao metric to geometrize from a dynamical standpoint the manifold of density operators, which represent the thermodynamic NE states of the system. This simplest SEAQT formulation is used here to develop a general mathematical framework for modeling the NE time evolution of the quantum state of a chemically reactive mixture at an atomistic level. The method is illustrated for a simple two-reaction kinetic scheme of the overall reaction F + H2 ⇔ HF + H in an isolated tank of fixed volume. However, the general formalism is developed for a reactive system subject to multiple reaction mechanisms. To explicitly implement the SEAQT nonlinear law of evolution for the density operator, both the energy and the particle number eigenvalue problems are set up and solved analytically under the dilute gas approximation. The system-level energy and particle number eigenvalues and eigenstates are used in the SEAQT equation of motion to determine the time evolution of the density operator, thus effectively describing the overall kinetics of the reacting system as it relaxes toward stable chemical equilibrium. The predicted time evolution in the near-equilibrium limit is compared to the reaction rates given by a standard detailed kinetic model so as to extract the single time constant needed by the present SEA model.

  10. Steepest-entropy-ascent nonequilibrium quantum thermodynamic framework to model chemical reaction rates at an atomistic level

    NASA Astrophysics Data System (ADS)

    Beretta, G. P.; Al-Abbasi, Omar; von Spakovsky, M. R.

    2017-04-01

    The steepest entropy ascent (SEA) dynamical principle provides a general framework for modeling the dynamics of nonequilibrium (NE) phenomena at any level of description, including the atomistic one. It has recently been shown to provide a precise implementation and meaning to the maximum entropy production principle and to encompass many well-established theories of nonequilibrium thermodynamics into a single unifying geometrical framework. Its original formulation in the framework of quantum thermodynamics (QT) assumes the simplest and most natural Fisher-Rao metric to geometrize from a dynamical standpoint the manifold of density operators, which represent the thermodynamic NE states of the system. This simplest SEAQT formulation is used here to develop a general mathematical framework for modeling the NE time evolution of the quantum state of a chemically reactive mixture at an atomistic level. The method is illustrated for a simple two-reaction kinetic scheme of the overall reaction F + H2 ⇔ HF + H in an isolated tank of fixed volume. However, the general formalism is developed for a reactive system subject to multiple reaction mechanisms. To explicitly implement the SEAQT nonlinear law of evolution for the density operator, both the energy and the particle number eigenvalue problems are set up and solved analytically under the dilute gas approximation. The system-level energy and particle number eigenvalues and eigenstates are used in the SEAQT equation of motion to determine the time evolution of the density operator, thus effectively describing the overall kinetics of the reacting system as it relaxes toward stable chemical equilibrium. The predicted time evolution in the near-equilibrium limit is compared to the reaction rates given by a standard detailed kinetic model so as to extract the single time constant needed by the present SEA model.
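
    Schematically, a SEAQT equation of motion combines Hamiltonian evolution with a dissipative steepest-entropy-ascent term. The following is a generic form for orientation only; the exact dissipator used in the paper may differ:

```latex
\frac{d\hat{\rho}}{dt} = -\frac{i}{\hbar}\,[\hat{H}, \hat{\rho}] + \frac{1}{\tau}\,\hat{D}(\hat{\rho}),
```

    where the dissipative term D pulls the density operator along the direction of steepest entropy ascent with respect to the chosen metric (here Fisher-Rao), and the relaxation time is the single time constant that the abstract describes extracting by comparison with a detailed kinetic model.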

  11. Holocene sea level variations on the basis of integration of independent data sets

    SciTech Connect

    Sahagian, D.; Berkman, P. . Dept. of Geological Sciences and Byrd Polar Research Center)

    1992-01-01

    Variations in sea level through earth history have occurred at a wide variety of time scales. Sea level researchers have attacked the problem of measuring these sea level changes through a variety of approaches, each relevant only to the time scale in question, and usually only to the specific locality from which a specific type of data is derived. There is a plethora of different data types that can be and have been used (locally) for the measurement of Holocene sea level variations. The problem of merging different data sets for the purpose of constructing a global eustatic sea level curve for the Holocene has not previously been adequately addressed. The authors direct their efforts to that end. Numerous studies have been published regarding Holocene sea level changes. These have involved exposed fossil reef elevations, elevations of tidal deltas, depths of intertidal peat deposits, caves, tree rings, ice cores, moraines, eolian dune ridges, marine-cut terrace elevations, marine carbonate species, tide gauges, and lake level variations. Each of these data sets is based on a particular set of assumptions and is valid for a specific set of environments. In order to obtain the most accurate possible sea level curve for the Holocene, these data sets must be merged so that local and other influences can be filtered out of each data set. Since each data set involves very different measurements, each is scaled in order to define the sensitivity of the proxy measurement parameter to sea level, including error bounds. This effectively determines the temporal and spatial resolution of each data set. The level of independence of the data sets is also quantified, in order to rule out the possibility of a common non-eustatic factor affecting more than one variety of data. The Holocene sea level curve is considered to be independent of other factors affecting the proxy data, and is taken to represent the relation between global ocean water and basin volumes.
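
    One standard way to merge independent proxy estimates that carry error bounds, offered here as an illustrative assumption rather than the authors' actual procedure, is inverse-variance weighting, the minimum-variance combination of independent estimates of the same quantity:

```python
import numpy as np

def merge_estimates(values, sigmas):
    """Combine independent estimates of one quantity; returns (mean, sigma)."""
    values = np.asarray(values, dtype=float)
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2   # inverse-variance weights
    mean = np.sum(w * values) / np.sum(w)
    return mean, float(np.sqrt(1.0 / np.sum(w)))

# Hypothetical example: coral terrace, peat deposit, and tide-gauge estimates
# of sea level (metres relative to present) at the same epoch.
level, sigma = merge_estimates([-2.1, -1.8, -2.0], [0.5, 0.3, 0.4])
```

    Data sets that share a non-eustatic driver are not truly independent, so in practice the weights would have to be adjusted to account for the correlations the abstract describes quantifying.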

  12. Disseminating hypnosis to health care settings: Applying the RE-AIM framework

    PubMed Central

    Yeh, Vivian M.; Schnur, Julie B.; Montgomery, Guy H.

    2014-01-01

    Hypnosis is a brief intervention ready for wider dissemination in medical contexts. Overall, hypnosis remains underused despite evidence supporting its beneficial clinical impact. This review will evaluate the evidence supporting hypnosis for dissemination using guidelines formulated by Glasgow and colleagues (1999). Five dissemination dimensions will be considered: Reach, Efficacy, Adoption, Implementation, and Maintenance (RE-AIM). Reach: In medical settings, hypnosis is capable of helping a diverse range of individuals with a wide variety of problems. Efficacy: There is evidence supporting the use of hypnosis for chronic pain, acute pain and emotional distress arising from medical procedures and conditions, cancer treatment-related side-effects and irritable bowel syndrome. Adoption: Although hypnosis is currently not a part of mainstream clinical practices, evidence suggests that patients and healthcare providers are open to trying hypnosis, and may become more so when educated about what hypnosis can do. Implementation: Hypnosis is a brief intervention capable of being administered effectively by healthcare providers. Maintenance: Given the low resource needs of hypnosis, opportunities for reimbursement, and the ability of the intervention to potentially help medical settings reduce costs, the intervention has the qualities necessary to be integrated into routine care in a self-sustaining way in medical settings. In sum, hypnosis is a promising candidate for further dissemination. PMID:25267941

  13. The Harmonizing Outcome Measures for Eczema (HOME) roadmap: a methodological framework to develop core sets of outcome measurements in dermatology.

    PubMed

    Schmitt, Jochen; Apfelbacher, Christian; Spuls, Phyllis I; Thomas, Kim S; Simpson, Eric L; Furue, Masutaka; Chalmers, Joanne; Williams, Hywel C

    2015-01-01

    Core outcome sets (COSs) are consensus-derived minimum sets of outcomes to be assessed in a specific situation. COSs are being increasingly developed to limit outcome-reporting bias, allow comparisons across trials, and strengthen clinical decision making. Despite the increasing interest in outcomes research, methods to develop COSs have not yet been standardized. The aim of this paper is to present the Harmonizing Outcomes Measures for Eczema (HOME) roadmap for the development and implementation of COSs, which was developed on the basis of our experience in the standardization of outcome measurements for atopic eczema. Following the establishment of a panel representing all relevant stakeholders and a research team experienced in outcomes research, the scope and setting of the core set should be defined. The next steps are the definition of a core set of outcome domains such as symptoms or quality of life, followed by the identification or development and validation of appropriate outcome measurement instruments to measure these core domains. Finally, the consented COS needs to be disseminated, implemented, and reviewed. We believe that the HOME roadmap is a useful methodological framework to develop COSs in dermatology, with the ultimate goal of better decision making and promoting patient-centered health care.

  14. Level set based vertebra segmentation for the evaluation of Ankylosing Spondylitis

    NASA Astrophysics Data System (ADS)

    Tan, Sovira; Yao, Jianhua; Ward, Michael M.; Yao, Lawrence; Summers, Ronald M.

    2006-03-01

    Ankylosing Spondylitis is a disease of the vertebra where abnormal bone structures (syndesmophytes) grow at intervertebral disk spaces. Because this growth is so slow as to be undetectable on plain radiographs taken over years, it is necessary to resort to computerized techniques to complement qualitative human judgment with precise quantitative measures on 3-D CT images. Very fine segmentation of the vertebral body is required to capture the small structures caused by the pathology. We propose a segmentation algorithm based on a cascade of three level set stages and requiring no training or prior knowledge. First, the noise inside the vertebral body that often blocks the proper evolution of level set surfaces is attenuated by a sigmoid function whose parameters are determined automatically. The 1st level set (geodesic active contour) is designed to roughly segment the interior of the vertebra despite often highly inhomogeneous and even discontinuous boundaries. The result is used as an initial contour for the 2nd level set (Laplacian level set) that closely captures the inner boundary of the cortical bone. The last level set (reversed Laplacian level set) segments the outer boundary of the cortical bone and also corrects small flaws of the previous stage. We carried out extensive tests on 30 vertebrae (5 from each of 6 patients). Two medical experts scored the results at intervertebral disk spaces focusing on end plates and syndesmophytes. Only two minor segmentation errors at vertebral end plates were reported and two syndesmophytes were considered slightly under-segmented.
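
The sigmoid attenuation used in the first stage of the cascade can be sketched as follows. This mirrors the standard sigmoid intensity mapping (as in ITK's SigmoidImageFilter); the parameter values here are illustrative, not the automatically determined ones described in the abstract:

```python
import numpy as np

def sigmoid_remap(image, alpha, beta, out_min=0.0, out_max=1.0):
    """Sigmoid intensity mapping: compresses intensities far from `beta`,
    which attenuates interior noise before a level set surface is evolved.
    `alpha` controls the width of the transition, `beta` its centre."""
    s = 1.0 / (1.0 + np.exp(-(image - beta) / alpha))
    return out_min + (out_max - out_min) * s

# Toy CT patch (Hounsfield-like values); the mapping is monotone, so bone
# (bright) stays bright while marrow noise is squeezed toward 0:
img = np.array([[50.0, 100.0, 150.0], [200.0, 250.0, 300.0]])
remapped = sigmoid_remap(img, alpha=25.0, beta=175.0)
```

Because the mapping is smooth and monotone, it preserves the boundary ordering that the subsequent geodesic active contour and Laplacian level set stages depend on.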

  15. A Conceptual Framework for Organizational Readiness to Implement Nutrition and Physical Activity Programs in Early Childhood Education Settings

    PubMed Central

    Upadhyaya, Mudita; Schober, Daniel J.; Byrd-Williams, Courtney

    2014-01-01

    Across multiple sectors, organizational readiness predicts the success of program implementation. However, the factors influencing the readiness of early childhood education (ECE) organizations to implement new nutrition and physical activity programs are poorly understood. This study presents a new conceptual framework to measure organizational readiness to implement nutrition and physical activity programs in ECE centers serving children aged 0 to 5 years. The framework was validated for consensus on relevance and generalizability by conducting focus groups; the participants were managers (16 directors and 2 assistant directors) of ECE centers. The framework theorizes that it is necessary to have “collective readiness,” which takes into account such factors as resources, organizational operations, work culture, and the collective attitudes, motivation, beliefs, and intentions of ECE staff. Results of the focus groups demonstrated consensus on the relevance of proposed constructs across ECE settings. Including readiness measures during program planning and evaluation could inform implementation of ECE programs targeting nutrition and physical activity behaviors. PMID:25357258

  16. Ice cover, landscape setting, and geological framework of Lake Vostok, East Antarctica

    USGS Publications Warehouse

    Studinger, M.; Bell, R.E.; Karner, G.D.; Tikku, A.A.; Holt, J.W.; Morse, D.L.; David, L.; Richter, T.G.; Kempf, S.D.; Peters, M.E.; Blankenship, D.D.; Sweeney, R.E.; Rystrom, V.L.

    2003-01-01

    Lake Vostok, located beneath more than 4 km of ice in the middle of East Antarctica, is a unique subglacial habitat and may contain microorganisms with distinct adaptations to such an extreme environment. Melting and freezing at the base of the ice sheet, which slowly flows across the lake, controls the flux of water, biota and sediment particles through the lake. The influx of thermal energy, however, is limited to contributions from below. Thus the geological origin of Lake Vostok is a critical boundary condition for the subglacial ecosystem. We present the first comprehensive maps of ice surface, ice thickness and subglacial topography around Lake Vostok. The ice flow across the lake and the landscape setting are closely linked to the geological origin of Lake Vostok. Our data show that Lake Vostok is located along a major geological boundary. Magnetic and gravity data are distinct east and west of the lake, as is the roughness of the subglacial topography. The physiographic setting of the lake has important consequences for the ice flow and thus the melting and freezing pattern and the lake's circulation. Lake Vostok is a tectonically controlled subglacial lake. The tectonic processes provided the space for a unique habitat and recent minor tectonic activity could have the potential to introduce small, but significant amounts of thermal energy into the lake.

  17. A rough set based rational clustering framework for determining correlated genes.

    PubMed

    Jeyaswamidoss, Jeba Emilyn; Thangaraj, Kesavan; Ramar, Kadarkarai; Chitra, Muthusamy

    2016-06-01

    Cluster analysis plays a foremost role in identifying groups of genes that show similar behavior under a set of experimental conditions. Several clustering algorithms have been proposed for identifying gene behaviors and understanding their significance. The principal aim of this work is to develop an intelligent rough clustering technique that efficiently removes irrelevant dimensions in a high-dimensional space and obtains appropriate, meaningful clusters. This paper proposes a novel biclustering technique based on rough set theory. The proposed algorithm uses the correlation coefficient as a similarity measure to simultaneously cluster both the rows and columns of a gene expression data matrix, and the mean squared residue to generate the initial biclusters. The biclusters are then refined into lower and upper approximations by determining the membership of the genes in the clusters using the mean squared residue. The algorithm is illustrated with yeast gene expression data, and the experiments demonstrate the effectiveness of the method. Its main advantage is that it overcomes both the problem of selecting initial clusters and the restriction that one object belong to only one cluster, by allowing biclusters to overlap.
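
The mean squared residue used to seed and refine the biclusters is the Cheng-Church coherence measure; a minimal sketch is below (the function name and example matrix are illustrative):

```python
import numpy as np

def mean_squared_residue(submatrix):
    """Mean squared residue H(I, J) of a bicluster (genes I x conditions J):
    each entry is compared with its row mean, column mean, and the overall
    mean. Low H means rows and columns vary coherently (additively)."""
    X = np.asarray(submatrix, dtype=float)
    row_mean = X.mean(axis=1, keepdims=True)
    col_mean = X.mean(axis=0, keepdims=True)
    overall = X.mean()
    residue = X - row_mean - col_mean + overall
    return float((residue ** 2).mean())

# A perfectly additive bicluster (each row is a shifted copy) has residue 0:
perfect = [[1, 2, 3], [2, 3, 4], [5, 6, 7]]
```

In the rough-set refinement described above, genes whose residue contribution keeps a bicluster's H low would belong to the lower approximation, while borderline genes fall in the boundary region and may appear in more than one bicluster.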

  18. Experience with low-cost telemedicine in three different settings. Recommendations based on a proposed framework for network performance evaluation.

    PubMed

    Wootton, Richard; Vladzymyrskyy, Anton; Zolfo, Maria; Bonnardot, Laurent

    2011-01-01

    Telemedicine has been used for many years to support doctors in the developing world. Several networks provide services in different settings and in different ways. However, to draw conclusions about which telemedicine networks are successful requires a method of evaluating them. No general consensus or validated framework exists for this purpose. To define a basic method of performance measurement that can be used to improve and compare teleconsultation networks; to employ the proposed framework in an evaluation of three existing networks; to make recommendations about the future implementation and follow-up of such networks. Analysis based on the experience of three telemedicine networks (in operation for 7-10 years) that provide services to doctors in low-resource settings and which employ the same basic design. Although there are many possible indicators and metrics that might be relevant, five measures for each of the three user groups appear to be sufficient for the proposed framework. In addition, from the societal perspective, information about clinical- and cost-effectiveness is also required. The proposed performance measurement framework was applied to three mature telemedicine networks. Despite their differences in terms of activity, size and objectives, their performance in certain respects is very similar. For example, the time to first reply from an expert is about 24 hours for each network. Although all three networks had systems in place to collect data from the user perspective, none of them collected information about the coordinator's time required or about ease of system usage. They had only limited information about quality and cost. Measuring the performance of a telemedicine network is essential in understanding whether the network is working as intended and what effect it is having. Based on long-term field experience, the suggested framework is a practical tool that will permit organisations to assess the performance of their own networks.

  19. Experience with low-cost telemedicine in three different settings. Recommendations based on a proposed framework for network performance evaluation

    PubMed Central

    Wootton, Richard; Vladzymyrskyy, Anton; Zolfo, Maria; Bonnardot, Laurent

    2011-01-01

    Background: Telemedicine has been used for many years to support doctors in the developing world. Several networks provide services in different settings and in different ways. However, to draw conclusions about which telemedicine networks are successful requires a method of evaluating them. No general consensus or validated framework exists for this purpose. Objective: To define a basic method of performance measurement that can be used to improve and compare teleconsultation networks; to employ the proposed framework in an evaluation of three existing networks; to make recommendations about the future implementation and follow-up of such networks. Methods: Analysis based on the experience of three telemedicine networks (in operation for 7–10 years) that provide services to doctors in low-resource settings and which employ the same basic design. Findings: Although there are many possible indicators and metrics that might be relevant, five measures for each of the three user groups appear to be sufficient for the proposed framework. In addition, from the societal perspective, information about clinical- and cost-effectiveness is also required. The proposed performance measurement framework was applied to three mature telemedicine networks. Despite their differences in terms of activity, size and objectives, their performance in certain respects is very similar. For example, the time to first reply from an expert is about 24 hours for each network. Although all three networks had systems in place to collect data from the user perspective, none of them collected information about the coordinator's time required or about ease of system usage. They had only limited information about quality and cost. Conclusion: Measuring the performance of a telemedicine network is essential in understanding whether the network is working as intended and what effect it is having. Based on long-term field experience, the suggested framework is a practical tool that will permit organisations to assess the performance of their own networks.

  20. Level set discrete element method for three-dimensional computations with triaxial case study

    NASA Astrophysics Data System (ADS)

    Kawamoto, Reid; Andò, Edward; Viggiani, Gioacchino; Andrade, José E.

    2016-06-01

    In this paper, we outline the level set discrete element method (LS-DEM) which is a discrete element method variant able to simulate systems of particles with arbitrary shape using level set functions as a geometric basis. This unique formulation allows seamless interfacing with level set-based characterization methods as well as computational ease in contact calculations. We then apply LS-DEM to simulate two virtual triaxial specimens generated from XRCT images of experiments and demonstrate LS-DEM's ability to quantitatively capture and predict stress-strain and volume-strain behavior observed in the experiments.
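
The "computational ease in contact calculations" mentioned above comes from the level set representation itself: a boundary node of one particle penetrates another exactly when the other particle's signed distance function is negative there. A toy sketch of that check, using an analytic sphere in place of the grid-stored level sets of LS-DEM (names and numbers invented):

```python
import numpy as np

def sphere_level_set(center, radius):
    """Signed distance function phi for a spherical particle:
    phi < 0 inside, phi = 0 on the surface, phi > 0 outside.
    LS-DEM stores phi on a grid per particle; an analytic sphere
    stands in for that here."""
    center = np.asarray(center, dtype=float)
    def phi(x):
        return float(np.linalg.norm(np.asarray(x, dtype=float) - center) - radius)
    return phi

def contact(phi, boundary_points):
    """A boundary node of particle B penetrates particle A when phi_A
    evaluated at that node is negative; the most negative value gives
    the penetration depth used in the contact force law."""
    vals = [phi(p) for p in boundary_points]
    depth = -min(vals)
    return depth > 0.0, max(depth, 0.0)

phi_a = sphere_level_set([0.0, 0.0, 0.0], 1.0)
nodes_b = [[1.5, 0.0, 0.0], [0.8, 0.0, 0.0]]  # second node penetrates
touching, depth = contact(phi_a, nodes_b)
```

For arbitrary shapes from XRCT images the same test applies unchanged; only the evaluation of phi changes, from an analytic formula to interpolation on the particle's level set grid.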

  1. Evolving entities: towards a unified framework for understanding diversity at the species and higher levels

    PubMed Central

    Barraclough, Timothy G.

    2010-01-01

    Current approaches to studying the evolution of biodiversity differ in their treatment of species and higher level diversity patterns. Species are regarded as the fundamental evolutionarily significant units of biodiversity, both in theory and in practice, and extensive theory explains how they originate and evolve. However, most species are still delimited using qualitative methods that only relate indirectly to the underlying theory. In contrast, higher level patterns of diversity have been subjected to rigorous quantitative study (using phylogenetics), but theory that adequately explains the observed patterns has been lacking. Most evolutionary analyses of higher level diversity patterns have considered non-equilibrium explanations based on rates of diversification (i.e. exponentially growing clades), rather than equilibrium explanations normally used at the species level and below (i.e. constant population sizes). This paper argues that species level and higher level patterns of diversity can be considered within a common framework, based on equilibrium explanations. It shows how forces normally considered in the context of speciation, namely divergent selection and geographical isolation, can generate evolutionarily significant units of diversity above the level of reproductively isolated species. Prospects for the framework to answer some unresolved questions about higher level diversity patterns are discussed. PMID:20439282

  2. Epidemic Reconstruction in a Phylogenetics Framework: Transmission Trees as Partitions of the Node Set

    PubMed Central

    Hall, Matthew; Woolhouse, Mark; Rambaut, Andrew

    2015-01-01

    The use of genetic data to reconstruct the transmission tree of infectious disease epidemics and outbreaks has been the subject of an increasing number of studies, but previous approaches have usually either made assumptions that are not fully compatible with phylogenetic inference, or, where they have based inference on a phylogeny, have employed a procedure that requires this tree to be fixed. At the same time, the coalescent-based models of the pathogen population that are employed in the methods usually used for time-resolved phylogeny reconstruction are a considerable simplification of the epidemic process, as they assume that pathogen lineages mix freely. Here, we contribute a new method that is simultaneously a phylogeny reconstruction method for isolates taken from an epidemic, and a procedure for transmission tree reconstruction. We observe that, if one or more samples is taken from each host in an epidemic or outbreak and these are used to build a phylogeny, a transmission tree is equivalent to a partition of the set of nodes of this phylogeny, such that each partition element is a set of nodes that is connected in the full tree and contains all the tips corresponding to samples taken from one and only one host. We then implement a Markov chain Monte Carlo (MCMC) procedure for simultaneous sampling from the spaces of both trees, utilising a newly designed set of phylogenetic tree proposals that also respect node partitions. We calculate the posterior probability of these partitioned trees based on a model that acknowledges the population structure of an epidemic by employing an individual-based disease transmission model and a coalescent process taking place within each host. We demonstrate our method, first using simulated data, and then with sequences taken from the H7N7 avian influenza outbreak that occurred in the Netherlands in 2003. We show that it is superior to established coalescent methods for reconstructing the topology and node heights of the
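
The central observation above, that a transmission tree is a partition of the phylogeny's node set into connected, single-host elements, can be checked mechanically. A sketch under simplifying assumptions (one sample per host; the node names are hypothetical):

```python
from collections import defaultdict

def is_valid_transmission_partition(parent, assignment, tip_hosts):
    """Check that `assignment` (node -> host) is a valid transmission-tree
    partition of a rooted phylogeny given as a child -> parent map.
    Every tip must be assigned to its sampled host, and each host's
    element must be connected in the tree."""
    # Tips must carry their own host's label.
    for tip, host in tip_hosts.items():
        if assignment[tip] != host:
            return False
    # Connectedness: within each element, exactly one node (the element's
    # root) may have its parent outside the element; every other node's
    # parent must share its host.
    blocks = defaultdict(list)
    for node, host in assignment.items():
        blocks[host].append(node)
    for host, nodes in blocks.items():
        roots = [n for n in nodes
                 if parent.get(n) is None or assignment[parent[n]] != host]
        if len(roots) != 1:
            return False
    return True

# Toy phylogeny: root -> (anc -> (t1, t2), t3); hosts A (t1, t2) and B (t3).
parent = {"anc": "root", "t1": "anc", "t2": "anc", "t3": "root", "root": None}
tips = {"t1": "A", "t2": "A", "t3": "B"}
ok = is_valid_transmission_partition(
    parent, {"root": "B", "anc": "A", "t1": "A", "t2": "A", "t3": "B"}, tips)
```

Because a rooted tree is acyclic, "exactly one node per element has its parent outside the element" is equivalent to the element being connected, which is why the MCMC proposals only need to preserve this local property.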

  3. Epidemic Reconstruction in a Phylogenetics Framework: Transmission Trees as Partitions of the Node Set.

    PubMed

    Hall, Matthew; Woolhouse, Mark; Rambaut, Andrew

    2015-12-01

    The use of genetic data to reconstruct the transmission tree of infectious disease epidemics and outbreaks has been the subject of an increasing number of studies, but previous approaches have usually either made assumptions that are not fully compatible with phylogenetic inference, or, where they have based inference on a phylogeny, have employed a procedure that requires this tree to be fixed. At the same time, the coalescent-based models of the pathogen population that are employed in the methods usually used for time-resolved phylogeny reconstruction are a considerable simplification of the epidemic process, as they assume that pathogen lineages mix freely. Here, we contribute a new method that is simultaneously a phylogeny reconstruction method for isolates taken from an epidemic, and a procedure for transmission tree reconstruction. We observe that, if one or more samples is taken from each host in an epidemic or outbreak and these are used to build a phylogeny, a transmission tree is equivalent to a partition of the set of nodes of this phylogeny, such that each partition element is a set of nodes that is connected in the full tree and contains all the tips corresponding to samples taken from one and only one host. We then implement a Markov chain Monte Carlo (MCMC) procedure for simultaneous sampling from the spaces of both trees, utilising a newly designed set of phylogenetic tree proposals that also respect node partitions. We calculate the posterior probability of these partitioned trees based on a model that acknowledges the population structure of an epidemic by employing an individual-based disease transmission model and a coalescent process taking place within each host. We demonstrate our method, first using simulated data, and then with sequences taken from the H7N7 avian influenza outbreak that occurred in the Netherlands in 2003. We show that it is superior to established coalescent methods for reconstructing the topology and node heights of the

  4. Ontological Problem-Solving Framework for Assigning Sensor Systems and Algorithms to High-Level Missions

    PubMed Central

    Qualls, Joseph; Russomanno, David J.

    2011-01-01

    The lack of knowledge models to represent sensor systems, algorithms, and missions makes opportunistically discovering a synthesis of systems and algorithms that can satisfy high-level mission specifications impractical. A novel ontological problem-solving framework has been designed that leverages knowledge models describing sensors, algorithms, and high-level missions to facilitate automated inference of assigning systems to subtasks that may satisfy a given mission specification. To demonstrate the efficacy of the ontological problem-solving architecture, a family of persistence surveillance sensor systems and algorithms has been instantiated in a prototype environment to demonstrate the assignment of systems to subtasks of high-level missions. PMID:22164081
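
The assignment of systems to subtasks can be caricatured as capability matching: a system qualifies for a subtask when its declared capabilities cover the subtask's requirements. The framework above uses ontological knowledge models and automated inference rather than plain set containment, so this is only an illustrative stand-in (all names are invented):

```python
def assign_systems(mission_subtasks, system_capabilities):
    """For each subtask, list the sensor systems/algorithms whose
    capability set covers the subtask's required capabilities.
    A set-containment stand-in for OWL-style ontological inference."""
    assignment = {}
    for subtask, required in mission_subtasks.items():
        assignment[subtask] = [s for s, caps in system_capabilities.items()
                               if required <= caps]   # subset test
    return assignment

# Hypothetical persistent-surveillance mission and sensor inventory:
mission = {"detect_intruder": {"infrared", "night"},
           "track_vehicle": {"video", "pan_tilt"}}
systems = {"ir_cam": {"infrared", "night", "fixed"},
           "ptz_cam": {"video", "pan_tilt", "day"}}
result = assign_systems(mission, systems)
```

A real ontological approach additionally exploits class hierarchies and inferred properties (e.g. an infrared camera *implies* night capability), which flat capability sets cannot express.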

  5. Options for future effective water management in Lombok: A multi-level nested framework

    NASA Astrophysics Data System (ADS)

    Sjah, Taslim; Baldwin, Claudia

    2014-11-01

    Previous research on water use in Lombok identified reduced water available in springs and limits on seasonal water availability. It foreshadowed increasing competition for water resources in critical areas of Lombok. This study examines preliminary information on local social-institutional arrangements for water allocation in the context of Ostrom's rules for self-governing institutions. We identify robust customary mechanisms for decision-making about water sharing and rules at a local level and suggest areas of further investigation for strengthening multi-level networked and nested frameworks, in collaboration with higher levels of government.

  6. Setting a Minimum Standard of Care in Clinical Trials: Human Rights and Bioethics as Complementary Frameworks.

    PubMed

    Marouf, Fatma E; Esplin, Bryn S

    2015-06-11

    For the past few decades, there has been intense debate in bioethics about the standard of care that should be provided in clinical trials conducted in developing countries. Some interpret the Declaration of Helsinki to mean that control groups should receive the best intervention available worldwide, while others interpret this and other international guidelines to mean the best local standard of care. Questions of justice are particularly relevant where limited resources mean that the local standard of care is no care at all. Introducing human rights law into this complex and longstanding debate adds a new and important perspective. Through non-derogable rights, including the core obligations of the right to health, human rights law can help set a minimum standard of care.

  7. Development of an evidence-based framework of factors contributing to patient safety incidents in hospital settings: a systematic review

    PubMed Central

    McEachan, Rosemary R C; Giles, Sally J; Sirriyeh, Reema; Watt, Ian S; Wright, John

    2012-01-01

    Objective: The aim of this systematic review was to develop a ‘contributory factors framework’ from a synthesis of empirical work which summarises factors contributing to patient safety incidents in hospital settings. Design: A mixed-methods systematic review of the literature was conducted. Data sources: Electronic databases (Medline, PsycInfo, ISI Web of Knowledge, CINAHL and EMBASE), article reference lists, patient safety websites, registered study databases and author contacts. Eligibility criteria: Studies were included that reported data from primary research in secondary care aiming to identify the contributory factors to error or threats to patient safety. Results: 1502 potential articles were identified. 95 papers (representing 83 studies) which met the inclusion criteria were included, and 1676 contributory factors extracted. Initial coding of contributory factors by two independent reviewers resulted in 20 domains (eg, team factors, supervision and leadership). Each contributory factor was then coded by two reviewers to one of these 20 domains. The majority of studies identified active failures (errors and violations) as factors contributing to patient safety incidents. Individual factors, communication, and equipment and supplies were the other most frequently reported factors within the existing evidence base. Conclusions: This review has culminated in an empirically based framework of the factors contributing to patient safety incidents. This framework has the potential to be applied across hospital settings to improve the identification and prevention of factors that cause harm to patients. PMID:22421911

  8. Effective communication of public health guidance to emergency department clinicians in the setting of emerging incidents: a qualitative study and framework.

    PubMed

    Khan, Yasmin; Sanford, Sarah; Sider, Doug; Moore, Kieran; Garber, Gary; de Villa, Eileen; Schwartz, Brian

    2017-04-28

    agencies and emergency department clinicians at the local level. Our framework, which is grounded in qualitative evidence, focuses on strategies to promote effective communication in the setting of an emerging public health incident and may be useful in informing practice.

  9. A framework for sea level rise vulnerability assessment for southwest U.S. military installations

    USGS Publications Warehouse

    Chadwick, B.; Flick, Reinhard; Helly, J.; Nishikawa, T.; Pei, Fang Wang; O'Reilly, W.; Guza, R.; Bromirski, Peter; Young, A.; Crampton, W.; Wild, B.; Canner, I.

    2011-01-01

    We describe an analysis framework to determine military installation vulnerabilities under increases in local mean sea level as projected over the next century. The effort is in response to an increasing recognition of potential climate change ramifications for national security and recommendations that the DoD assess the impact of climate change on U.S. military installations. Results of the effort described here focus on development of a conceptual framework for sea level rise vulnerability assessment at coastal military installations in the southwest U.S. We introduce the vulnerability assessment in the context of a risk assessment paradigm that incorporates sources in the form of future sea level conditions, pathways of impact including inundation, flooding, erosion and intrusion, and a range of military installation specific receptors such as critical infrastructure and training areas. A unique aspect of the methodology is the capability to develop wave climate projections from GCM outputs and transform these to future wave conditions at specific coastal sites. Future sea level scenarios are considered in the context of installation sensitivity curves which reveal response thresholds specific to each installation, pathway and receptor. In the end, our goal is to provide a military-relevant framework for assessment of accelerated SLR vulnerability, and develop the best scientifically-based scenarios of waves, tides and storms and their implications for DoD installations in the southwestern U.S.
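
At its simplest, the sensitivity-curve idea reduces to comparing a sea level scenario against per-receptor response thresholds read off the installation's curve. A toy sketch with invented receptor names and thresholds (the study's actual curves are site- and pathway-specific):

```python
def exceeded_receptors(slr_scenario_m, thresholds_m):
    """Flag installation receptors whose response threshold is crossed
    under a given local mean sea level rise scenario (metres).
    Thresholds would be read off an installation's sensitivity curve;
    the numbers below are illustrative only."""
    return sorted(r for r, t in thresholds_m.items() if slr_scenario_m >= t)

thresholds = {"pier_access_road": 0.5, "training_beach": 0.3,
              "comm_bunker": 1.2}
at_risk = exceeded_receptors(0.6, thresholds)
```

The threshold structure is what makes the assessment scenario-based rather than continuous: small changes in projected sea level can abruptly add receptors to the at-risk set.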

  10. Basin-scale runoff prediction: An Ensemble Kalman Filter framework based on global hydrometeorological data sets

    NASA Astrophysics Data System (ADS)

    Lorenz, Christof; Tourian, Mohammad J.; Devaraju, Balaji; Sneeuw, Nico; Kunstmann, Harald

    2015-10-01

    In order to cope with the steady decline in the number of in situ gauges worldwide, there is a growing need for alternative methods to estimate runoff. We present an Ensemble Kalman Filter-based approach that allows us to infer runoff for poorly or irregularly gauged basins. The approach focuses on the application of publicly available global hydrometeorological data sets for precipitation (GPCC, GPCP, CRU, UDEL), evapotranspiration (MODIS, FLUXNET, GLEAM, ERA-Interim, GLDAS), and water storage changes (GRACE, WGHM, GLDAS, MERRA LAND). Furthermore, runoff data from the GRDC and satellite altimetry derived estimates are used. We follow a least squares prediction that exploits the joint temporal and spatial auto- and cross-covariance structures of precipitation, evapotranspiration, water storage changes and runoff. We further consider time-dependent uncertainty estimates derived from all data sets. Our in-depth analysis covers 29 large river basins from different climate regions; runoff is predicted for a subset of 16 basins. Six configurations are analyzed: the Ensemble Kalman Filter (Smoother) and the hard (soft) Constrained Ensemble Kalman Filter (Smoother). Comparing the predictions to observed monthly runoff shows correlations larger than 0.5, percentage biases lower than ± 20%, and NSE-values larger than 0.5. A modified NSE-metric, stressing the difference to the mean annual cycle, shows an improvement of runoff predictions for 14 of the 16 basins. The proposed method is able to provide runoff estimates for nearly 100 poorly gauged basins covering an area of more than 11,500,000 km² with a freshwater discharge, in volume, of more than 125,000 m³/s.
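
The reported skill metrics can be sketched as follows. The "modified NSE" is interpreted here as benchmarking against the mean annual cycle instead of the flat mean, which is one common construction and may differ in detail from the authors' definition:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect prediction,
    0 means no better than the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def nse_vs_annual_cycle(obs, sim, months):
    """NSE benchmarked against the mean annual cycle (climatology):
    positive values mean the prediction beats simple climatology."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    months = np.asarray(months)
    clim = np.array([obs[months == m].mean() for m in months])
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - clim) ** 2)

# Tiny synthetic series (two "months" repeating), illustrative values only:
obs, sim = [1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]
skill = nse(obs, sim)
cyc_skill = nse_vs_annual_cycle(obs, sim, months=[1, 2, 1, 2])
```

Benchmarking against the annual cycle is the stricter test for runoff: most of a river's monthly variance is seasonal, so a model can score a high plain NSE while adding nothing beyond climatology.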

  11. Basin-scale runoff prediction: An Ensemble Kalman Filter framework based on global hydrometeorological data sets

    NASA Astrophysics Data System (ADS)

    Kunstmann, Harald; Lorenz, Christof; Tourian, Mohammad; Devaraju, Balaji; Sneeuw, Nico

    2016-04-01

    In order to cope with the steady decline in the number of in situ gauges worldwide, there is a growing need for alternative methods to estimate runoff. We present an Ensemble Kalman Filter-based approach that allows us to infer runoff for poorly or irregularly gauged basins. The approach focuses on the application of publicly available global hydrometeorological data sets for precipitation (GPCC, GPCP, CRU, UDEL), evapotranspiration (MODIS, FLUXNET, GLEAM, ERA-Interim, GLDAS), and water storage changes (GRACE, WGHM, GLDAS, MERRA LAND). Furthermore, runoff data from the GRDC and satellite altimetry derived estimates are used. We follow a least squares prediction that exploits the joint temporal and spatial auto- and cross-covariance structures of precipitation, evapotranspiration, water storage changes and runoff. We further consider time-dependent uncertainty estimates derived from all data sets. Our in-depth analysis covers 29 large river basins from different climate regions; runoff is predicted for a subset of 16 basins. Six configurations are analyzed: the Ensemble Kalman Filter (Smoother) and the hard (soft) Constrained Ensemble Kalman Filter (Smoother). Comparing the predictions to observed monthly runoff shows correlations larger than 0.5, percentage biases lower than ± 20%, and NSE-values larger than 0.5. A modified NSE-metric, stressing the difference to the mean annual cycle, shows an improvement of runoff predictions for 14 of the 16 basins. The proposed method is able to provide runoff estimates for nearly 100 poorly gauged basins covering an area of more than 11,500,000 km² with a freshwater discharge, in volume, of more than 125,000 m³/s.

  12. Mandating influenza vaccinations for health care workers: analysing opportunities for policy change using Kingdon's agenda setting framework.

    PubMed

    Jackson-Lee, Angela; Barr, Neil G; Randall, Glen E

    2016-09-29

    The consequences of annual influenza outbreaks are often underestimated by the general public. Influenza poses a serious public health threat around the world, particularly for the most vulnerable populations. Fortunately, vaccination can mitigate the negative effects of this common infectious disease. Although inoculating frontline health care workers (HCWs) helps minimize disease transmission, some HCWs continue to resist participating in voluntary immunization programs. A potential solution to this problem is government-mandated vaccination for HCWs; however, in practice, there are substantial barriers to the adoption of such policies. The purpose of this paper is to identify the likelihood of adopting a policy for mandatory immunization of HCWs in Ontario based on a historical review of barriers to the agenda setting process. Documents from secondary data sources were analysed using Kingdon's agenda setting framework of three converging streams leading to windows of opportunity for possible policy adoption. The problems, politics, and policies streams of Kingdon's framework have converged and diverged repeatedly over an extended period (policy windows have opened and closed several times). In each instance, a technically feasible solution was available. However, despite the evidence supporting the value of HCW immunization, alignment of the three agenda setting streams occurred for very short periods of time, during which opposition lobby groups reacted, making the proposed solution less politically acceptable. Prior to the adoption of any new policies, issues must reach a government's decision agenda. Based on Kingdon's agenda setting framework, this only occurs when there is alignment of the problems, politics, and policies streams. Understanding this process makes it easier to predict the likelihood of a policy being adopted, and ultimately implemented. Such learning may be applied to policy issues in other jurisdictions. 
In the case of mandatory influenza

  13. [Intellectual development disorders in Latin America: a framework for setting policy priorities for research and care].

    PubMed

    Lazcano-Ponce, Eduardo; Katz, Gregorio; Allen-Leigh, Betania; Magaña Valladares, Laura; Rangel-Eudave, Guillermina; Minoletti, Alberto; Wahlberg, Ernesto; Vásquez, Armando; Salvador-Carulla, Luis

    2013-09-01

    Intellectual development disorders (IDDs) are a set of development disorders characterized by significantly limited cognitive functioning, learning disorders, and disorders related to adaptive skills and behavior. Previously grouped under the term "intellectual disability," this problem has not been widely studied or quantified in Latin America. Those affected are absent from public policy and do not benefit from government social development and poverty reduction strategies. This article offers a critical look at IDDs and describes a new taxonomy; it also proposes recognizing IDDs as a public health issue and promoting the professionalization of care, and suggests an agenda for research and regional action. In Latin America there is no consensus on the diagnostic criteria for IDDs. A small number of rehabilitation programs cover a significant proportion of the people who suffer from IDDs, evidence-based services are not offered, and health care guidelines have not been evaluated. Manuals on psychiatric diagnosis focus heavily on identifying serious IDDs and contribute to underreporting and erroneous classification. The study of these disorders has not been a legal, social science, or public health priority, resulting in a dearth of scientific evidence on them. Specific competencies and professionalization of care for these persons are needed, and interventions must be carried out with a view to prevention, rehabilitation, community integration, and inclusion in the work force.

  14. 3D level set methods for evolving fronts on tetrahedral meshes with adaptive mesh refinement

    DOE PAGES

    Morgan, Nathaniel Ray; Waltz, Jacob I.

    2017-03-02

    The level set method is commonly used to model dynamically evolving fronts and interfaces. In this work, we present new methods for evolving fronts with a specified velocity field or in the surface normal direction on 3D unstructured tetrahedral meshes with adaptive mesh refinement (AMR). The level set field is located at the nodes of the tetrahedral cells and is evolved using new upwind discretizations of Hamilton–Jacobi equations combined with a Runge–Kutta method for temporal integration. The level set field is periodically reinitialized to a signed distance function using an iterative approach with a new upwind gradient. We discuss the details of these level set and reinitialization methods. Results from a range of numerical test problems are presented.

  15. 3D level set methods for evolving fronts on tetrahedral meshes with adaptive mesh refinement

    NASA Astrophysics Data System (ADS)

    Morgan, Nathaniel R.; Waltz, Jacob I.

    2017-05-01

    The level set method is commonly used to model dynamically evolving fronts and interfaces. In this work, we present new methods for evolving fronts with a specified velocity field or in the surface normal direction on 3D unstructured tetrahedral meshes with adaptive mesh refinement (AMR). The level set field is located at the nodes of the tetrahedral cells and is evolved using new upwind discretizations of Hamilton-Jacobi equations combined with a Runge-Kutta method for temporal integration. The level set field is periodically reinitialized to a signed distance function using an iterative approach with a new upwind gradient. The details of these level set and reinitialization methods are discussed. Results from a range of numerical test problems are presented.
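
The normal-direction evolution described above can be illustrated in one dimension. Below is a minimal, pure-Python sketch of the Hamilton-Jacobi equation phi_t + F|phi_x| = 0 with first-order Godunov upwinding on a uniform grid; the grid size, speed F and time step are invented for illustration, and the paper's actual scheme works on 3D tetrahedral meshes with AMR:

```python
import math

n, dx, F, dt = 200, 0.01, 1.0, 0.004      # CFL number F*dt/dx = 0.4
x = [i * dx for i in range(n)]
phi = [xi - 0.3 for xi in x]              # signed distance; front at x = 0.3

def step(phi):
    """One forward-Euler step of phi_t + F*|phi_x| = 0 (Godunov upwind)."""
    new = phi[:]
    for i in range(1, n - 1):
        dminus = (phi[i] - phi[i - 1]) / dx
        dplus = (phi[i + 1] - phi[i]) / dx
        # Godunov flux for F > 0: only "inward-looking" slopes contribute.
        grad = math.sqrt(max(dminus, 0.0) ** 2 + min(dplus, 0.0) ** 2)
        new[i] = phi[i] - dt * F * grad
    new[0], new[-1] = new[1], new[-2]     # crude boundary copy
    return new

for _ in range(50):                        # advance to t = 0.2
    phi = step(phi)

# The zero level set should now sit near x = 0.3 + F*t = 0.5.
front = next(x[i] for i in range(n - 1) if phi[i] <= 0.0 < phi[i + 1])
```

Because phi starts as a signed distance with unit slope, the front advances at exactly speed F; in general the slope drifts, which is why the papers above periodically reinitialize phi to a signed distance function.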

  16. Separation and molecular-level segregation of complex alkane mixtures in metal-organic frameworks.

    PubMed

    Dubbeldam, David; Galvin, Casey J; Walton, Krista S; Ellis, Donald E; Snurr, Randall Q

    2008-08-20

    In this computational work we explore metal-organic frameworks (MOFs) for separating alkanes according to the degree of branching. We show that the structure MOF-1 shows an adsorption hierarchy for a 13-component light naphtha mixture precisely as desired for increasing the research octane number of gasoline. In addition we report an unusual molecular-level segregation of molecules based on their degree of branching.

  17. A CONCEPTUAL FRAMEWORK FOR MANAGING RADIATION DOSE TO PATIENTS IN DIAGNOSTIC RADIOLOGY USING REFERENCE DOSE LEVELS.

    PubMed

    Almén, Anja; Båth, Magnus

    2016-06-01

    The overall aim of the present work was to develop a conceptual framework for managing radiation dose in diagnostic radiology with the intention of supporting optimisation. An optimisation process was first derived. The framework for managing radiation dose, based on the derived optimisation process, was then outlined. The starting point of the optimisation process is four stages: providing equipment, establishing methodology, performing examinations and ensuring quality. The optimisation process comprises a series of activities and actions at these stages. The current system of diagnostic reference levels is an activity in the last stage, ensuring quality. The system is thus a reactive activity that engages the core activity in the radiology department, performing examinations, only to a certain extent. Three reference dose levels (possible, expected and established) were assigned to the three stages in the optimisation process, excluding ensuring quality. A reasonably achievable dose range is also derived, indicating an acceptable deviation from the established dose level. A reasonable radiation dose for a single patient is within this range. The suggested framework for managing radiation dose should be regarded as one part of the optimisation process. The optimisation process constitutes a variety of complementary activities, where managing radiation dose is only one part. This emphasises the need to take a holistic approach, integrating the optimisation process into different clinical activities.

  18. Locally constrained active contour: a region-based level set for ovarian cancer metastasis segmentation

    NASA Astrophysics Data System (ADS)

    Liu, Jianfei; Yao, Jianhua; Wang, Shijun; Linguraru, Marius George; Summers, Ronald M.

    2014-03-01

    Accurate segmentation of ovarian cancer metastases is clinically useful to evaluate tumor growth and determine follow-up treatment. We present a region-based level set algorithm with localization constraints to segment ovarian cancer metastases. Our approach builds on a representative region-based level set, the Chan-Vese model, in which an active contour is driven by region competition. To reduce over-segmentation, we constrain the level set propagation within a narrow image band by embedding a dynamic localization function. The metastasis intensity prior is also estimated from image regions within the level set initialization. The localization function and intensity prior force the level set to stop at the desired metastasis boundaries. Our approach was validated on 19 ovarian cancer metastases with radiologist-labeled ground-truth on contrast-enhanced CT scans from 15 patients. The comparison between our algorithm and geodesic active contour indicated that the volume overlap was 75 ± 10% vs. 56 ± 6%, the Dice coefficient was 83 ± 8% vs. 63 ± 8%, and the average surface distance was 2.2 ± 0.6 mm vs. 4.4 ± 0.9 mm. Experimental results demonstrated that our algorithm outperformed traditional level set algorithms.
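
The region competition driving the Chan-Vese model can be sketched in 1D: the contour moves so that each pixel joins the region whose mean intensity it matches better. The curvature term, localization band and intensity prior of the paper are omitted, and the signal and step size are invented:

```python
signal = [10.0] * 30 + [50.0] * 70        # step "image"; true edge at i = 30
phi = [i - 60.0 for i in range(100)]      # initial contour at i = 60 (wrong spot)

for _ in range(200):
    inside = [s for s, p in zip(signal, phi) if p < 0]
    outside = [s for s, p in zip(signal, phi) if p >= 0]
    c1 = sum(inside) / len(inside)        # mean intensity inside the contour
    c2 = sum(outside) / len(outside)      # mean intensity outside
    # Region competition: phi falls where a pixel matches the inside mean
    # better and rises where it matches the outside mean better.
    phi = [p + 0.01 * ((s - c1) ** 2 - (s - c2) ** 2)
           for s, p in zip(signal, phi)]

boundary = next(i for i in range(99) if phi[i] <= 0 < phi[i + 1])
```

Even though the initial contour is thirty pixels off, the zero crossing settles at the intensity step, which is the behavior the localization constraints above then restrict to a narrow band.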

  19. An adaptive level set approach for incompressible two-phase flows

    SciTech Connect

    Sussman, M.; Almgren, A.S.; Bell, J.B.

    1997-04-01

    In Sussman, Smereka and Osher, a numerical method using the level set approach was formulated for solving incompressible two-phase flow with surface tension. In the level set approach, the interface is represented as the zero level set of a smooth function; this has the effect of replacing the advection of density, which has steep gradients at the interface, with the advection of the level set function, which is smooth. In addition, the interface can merge or break up with no special treatment. The authors maintain the level set function as the signed distance from the interface in order to robustly compute flows with high density ratios and stiff surface tension effects. In this work, they couple the level set scheme to an adaptive projection method for the incompressible Navier-Stokes equations, in order to achieve higher resolution of the interface with a minimum of additional expense. They present two-dimensional axisymmetric and fully three-dimensional results of air bubble and water drop computations.
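
The remark that interfaces "can merge or break up with no special treatment" follows directly from the implicit representation: the union of two bodies is just the pointwise minimum of their signed distance functions. A sketch with made-up geometry:

```python
import math

def circle_sdf(cx, cy, r):
    """Signed distance to a circle: negative inside, positive outside."""
    return lambda x, y: math.hypot(x - cx, y - cy) - r

a = circle_sdf(0.35, 0.5, 0.2)     # "bubble" A
b = circle_sdf(0.65, 0.5, 0.2)     # "bubble" B, overlapping A

def merged(x, y):
    return min(a(x, y), b(x, y))   # union of the two bodies

inside_mid = merged(0.5, 0.5) < 0  # midpoint lies inside the merged body
```

No connectivity bookkeeping is needed: two approaching bubbles merge the instant their negative regions overlap, which is exactly what advecting a single level set function gives for free.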

  20. A discontinuous Galerkin conservative level set scheme for interface capturing in multiphase flows

    SciTech Connect

    Owkes, Mark; Desjardins, Olivier

    2013-09-15

    The accurate conservative level set (ACLS) method of Desjardins et al. [O. Desjardins, V. Moureau, H. Pitsch, An accurate conservative level set/ghost fluid method for simulating turbulent atomization, J. Comput. Phys. 227 (18) (2008) 8395–8416] is extended by using a discontinuous Galerkin (DG) discretization. DG allows for the scheme to have an arbitrarily high order of accuracy with the smallest possible computational stencil resulting in an accurate method with good parallel scaling. This work includes a DG implementation of the level set transport equation, which moves the level set with the flow field velocity, and a DG implementation of the reinitialization equation, which is used to maintain the shape of the level set profile to promote good mass conservation. A near second order converging interface curvature is obtained by following a height function methodology (common amongst volume of fluid schemes) in the context of the conservative level set. Various numerical experiments are conducted to test the properties of the method and show excellent results, even on coarse meshes. The tests include Zalesak’s disk, two-dimensional deformation of a circle, time evolution of a standing wave, and a study of the Kelvin–Helmholtz instability. Finally, this novel methodology is employed to simulate the break-up of a turbulent liquid jet.

  1. A discontinuous Galerkin conservative level set scheme for interface capturing in multiphase flows

    NASA Astrophysics Data System (ADS)

    Owkes, Mark; Desjardins, Olivier

    2013-09-01

    The accurate conservative level set (ACLS) method of Desjardins et al. [O. Desjardins, V. Moureau, H. Pitsch, An accurate conservative level set/ghost fluid method for simulating turbulent atomization, J. Comput. Phys. 227 (18) (2008) 8395-8416] is extended by using a discontinuous Galerkin (DG) discretization. DG allows for the scheme to have an arbitrarily high order of accuracy with the smallest possible computational stencil resulting in an accurate method with good parallel scaling. This work includes a DG implementation of the level set transport equation, which moves the level set with the flow field velocity, and a DG implementation of the reinitialization equation, which is used to maintain the shape of the level set profile to promote good mass conservation. A near second order converging interface curvature is obtained by following a height function methodology (common amongst volume of fluid schemes) in the context of the conservative level set. Various numerical experiments are conducted to test the properties of the method and show excellent results, even on coarse meshes. The tests include Zalesak’s disk, two-dimensional deformation of a circle, time evolution of a standing wave, and a study of the Kelvin-Helmholtz instability. Finally, this novel methodology is employed to simulate the break-up of a turbulent liquid jet.
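
For contrast with signed-distance level sets, the conservative level set representation used by ACLS stores a smeared Heaviside profile psi = ½(tanh(d/(2ε)) + 1) of the signed distance d, whose 0.5-isocontour is the interface. A sketch with an arbitrary profile thickness ε:

```python
import math

eps = 0.02                                   # profile half-thickness (illustrative)

def psi(d):
    """Conservative level set profile built from a signed distance d."""
    return 0.5 * (math.tanh(d / (2.0 * eps)) + 1.0)

# psi rises smoothly from 0 on one side of the interface to 1 on the
# other; the interface itself is the 0.5 level.
values = [psi(d) for d in (-0.2, 0.0, 0.2)]
```

Because psi is bounded and its integral tracks the enclosed volume, transporting psi in conservative form helps mass conservation, which is why the reinitialization equation above restores this tanh profile rather than a signed distance.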

  2. A three-tier framework for monitoring antiretroviral therapy in high HIV burden settings

    PubMed Central

    Osler, Meg; Hilderbrand, Katherine; Hennessey, Claudine; Arendse, Juanita; Goemaere, Eric; Ford, Nathan; Boulle, Andrew

    2014-01-01

    The provision of antiretroviral therapy (ART) in low and middle-income countries is a chronic disease intervention of unprecedented magnitude and is the dominant health systems challenge for high-burden countries, many of which rank among the poorest in the world. Substantial external investment, together with the requirement for service evolution to adapt to changing needs, including the constant shift to earlier ART initiation, makes outcome monitoring and reporting particularly important. However, there is growing concern at the inability of many high-burden countries to report on the outcomes of patients who have been in care for various durations, or even the number of patients in care at a particular point in time. In many instances, countries can only report on the number of patients ever started on ART. Despite paper register systems coming under increasing strain, the evolution from paper directly to complex electronic medical record solutions is not viable in many contexts. Implementing a bridging solution, such as a simple offline electronic version of the paper register, can be a pragmatic alternative. This paper describes and recommends a three-tiered monitoring approach in low- and middle-income countries based on the experience implementing such a system in the Western Cape province of South Africa. A three-tier approach allows Ministries of Health to strategically implement one of the tiers in each facility offering ART services. Each tier produces the same nationally required monthly enrolment and quarterly cohort reports so that outputs from the three tiers can be aggregated into a single database at any level of the health system. The choice of tier is based on context and resources at the time of implementation. As resources and infrastructure improve, more facilities will transition to the next highest and more technologically sophisticated tier. 
Implementing a three-tier monitoring system at country level for pre-antiretroviral wellness, ART

  3. A three-tier framework for monitoring antiretroviral therapy in high HIV burden settings.

    PubMed

    Osler, Meg; Hilderbrand, Katherine; Hennessey, Claudine; Arendse, Juanita; Goemaere, Eric; Ford, Nathan; Boulle, Andrew

    2014-01-01

    The provision of antiretroviral therapy (ART) in low and middle-income countries is a chronic disease intervention of unprecedented magnitude and is the dominant health systems challenge for high-burden countries, many of which rank among the poorest in the world. Substantial external investment, together with the requirement for service evolution to adapt to changing needs, including the constant shift to earlier ART initiation, makes outcome monitoring and reporting particularly important. However, there is growing concern at the inability of many high-burden countries to report on the outcomes of patients who have been in care for various durations, or even the number of patients in care at a particular point in time. In many instances, countries can only report on the number of patients ever started on ART. Despite paper register systems coming under increasing strain, the evolution from paper directly to complex electronic medical record solutions is not viable in many contexts. Implementing a bridging solution, such as a simple offline electronic version of the paper register, can be a pragmatic alternative. This paper describes and recommends a three-tiered monitoring approach in low- and middle-income countries based on the experience implementing such a system in the Western Cape province of South Africa. A three-tier approach allows Ministries of Health to strategically implement one of the tiers in each facility offering ART services. Each tier produces the same nationally required monthly enrolment and quarterly cohort reports so that outputs from the three tiers can be aggregated into a single database at any level of the health system. The choice of tier is based on context and resources at the time of implementation. As resources and infrastructure improve, more facilities will transition to the next highest and more technologically sophisticated tier. 
Implementing a three-tier monitoring system at country level for pre-antiretroviral wellness, ART

  4. Joint inversion of geophysical data using petrophysical clustering and facies deformation with the level set technique

    NASA Astrophysics Data System (ADS)

    Revil, A.

    2015-12-01

    Geological expertise and petrophysical relationships can be brought together to provide prior information while inverting multiple geophysical datasets. The merging of such information can result in more realistic solutions for the distribution of the model parameters, reducing ipso facto the non-uniqueness of the inverse problem. We consider two levels of heterogeneity: facies, described by facies boundaries, and heterogeneities inside each facies, determined by a correlogram. In this presentation, we pose the geophysical inverse problem in terms of Gaussian random fields with mean functions controlled by petrophysical relationships and covariance functions controlled by a prior geological cross-section, including the definition of spatial boundaries for the geological facies. The petrophysical relationship problem is formulated as a regression problem upon each facies. The inversion of the geophysical data is performed in a Bayesian framework. We demonstrate the usefulness of this strategy using a first synthetic case for which we perform a joint inversion of gravity and galvanometric resistivity data with the stations located at the ground surface. The joint inversion is used to recover the density and resistivity distributions of the subsurface. In a second step, we consider the possibility that the facies boundaries are deformable and their shapes are inverted as well. We use the level set approach to perform such deformation, preserving the prior topological properties of the facies throughout the inversion. With the help of prior facies petrophysical relationships and the topological characteristics of each facies, we make posterior inferences about multiple geophysical tomograms based on their corresponding geophysical data misfits. 
The method is applied to a second synthetic case showing that we can recover the heterogeneities inside the facies, the mean values for the petrophysical properties, and, to some extent, the facies boundaries using the 2D joint inversion of

  5. Multiphase permittivity imaging using absolute value electrical capacitance tomography data and a level set algorithm.

    PubMed

    Al Hosani, E; Soleimani, M

    2016-06-28

    Multiphase flow imaging is a very challenging and critical topic in industrial process tomography. In this article, simulation and experimental results of reconstructing the permittivity profile of multiphase material from data collected in electrical capacitance tomography (ECT) are presented. A multiphase narrowband level set algorithm is developed to reconstruct the interfaces between three- or four-phase permittivity values. The level set algorithm is capable of imaging multiphase permittivity by using one set of ECT measurement data, so-called absolute value ECT reconstruction, and this is tested with high-contrast and low-contrast multiphase data. Simulation and experimental results showed the superiority of this algorithm over classical pixel-based image reconstruction methods. The multiphase level set algorithm and absolute ECT reconstruction are presented for the first time, to the best of our knowledge, in this paper and critically evaluated. This article is part of the themed issue 'Supersensing through industrial process tomography'. © 2016 The Author(s).
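
As a side note on how level sets extend beyond two phases: with two level set functions, the four sign combinations can label up to four materials, which is the idea behind multiphase formulations like the one above. A toy sketch with invented phase names and geometry, not the authors' ECT algorithm:

```python
import math

def phase(phi1, phi2):
    """Map the sign pattern of two level set functions to a phase label."""
    if phi1 < 0 and phi2 < 0:
        return "oil"
    if phi1 < 0:
        return "water"
    if phi2 < 0:
        return "gas"
    return "background"

def circle(cx, cy, r):
    """Signed distance to a circle: negative inside, positive outside."""
    return lambda x, y: math.hypot(x - cx, y - cy) - r

phi1 = circle(0.4, 0.5, 0.25)
phi2 = circle(0.7, 0.5, 0.25)
labels = {phase(phi1(x / 10, y / 10), phi2(x / 10, y / 10))
          for x in range(11) for y in range(11)}   # sample an 11x11 grid
```

Two overlapping circles already produce all four sign patterns, so evolving just two smooth functions suffices to reconstruct a four-phase permittivity map.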

  6. A unified framework for developing effective hygiene procedures for hands, environmental surfaces and laundry in healthcare, domestic, food handling and other settings.

    PubMed

    Bloomfield, Sally F; Carling, Philip C; Exner, Martin

    2017-01-01

    Hygiene procedures for hands, surfaces and fabrics are central to preventing spread of infection in settings including healthcare, food production, catering, agriculture, public settings, and home and everyday life. They are used in situations including hand hygiene, clinical procedures, decontamination of environmental surfaces, respiratory hygiene, food handling, laundry hygiene, toilet hygiene and so on. Although the principles are common to all, approaches currently used in different settings are inconsistent. A particular concern is the use of inconsistent terminology, which is misleading, especially to the people we need to communicate with, such as the public or cleaning professionals. This paper reviews the data on current approaches, alongside new insights into developing hygiene procedures. Using these data, we propose a more scientifically grounded framework for developing procedures that maximize protection against infection, based on consistent principles and terminology, and applicable across all settings. A key feature is the use of test models which assess the state of surfaces after treatment rather than product performance alone. This allows procedures that rely on removal of microbes to be compared with those employing chemical or thermal inactivation. This makes it possible to ensure that a consistent "safety target level" is achieved regardless of the type of procedure used, and allows us to deliver maximum health benefit whilst ensuring prudent usage of antimicrobial agents, detergents, water and energy.

  7. Target Detection in SAR Images Based on a Level Set Approach

    SciTech Connect

    Marques, Regis C.P.; Medeiros, Fatima N.S.; Ushizima, Daniela M.

    2008-09-01

    This paper introduces a new framework for point target detection in synthetic aperture radar (SAR) images. We focus on the task of locating reflective small regions using a level set based algorithm. Unlike most approaches to image segmentation, we address an algorithm that incorporates speckle statistics instead of empirical parameters and also discards speckle filtering. The curve evolves according to speckle statistics, initially propagating with a maximum upward velocity in homogeneous areas. Our approach is validated by a series of tests on synthetic and real SAR images and compared with three other segmentation algorithms, demonstrating that it constitutes a novel and efficient method for target detection purposes.

  8. National Service Frameworks and UK general practitioners: street-level bureaucrats at work?

    PubMed

    Checkland, Kath

    2004-11-01

    This paper argues that the past decade has seen significant changes in the nature of medical work in general practice in the UK. Increasing pressure to use normative clinical guidelines and the move towards explicit quantitative measures of performance together have the potential to alter the way in which health care is delivered to patients. Whilst it is possible to view these developments from the well-established sociological perspectives of deprofessionalisation and proletarianisation, this paper takes a view of general practice as work, and uses the ideas of Lipsky to analyse practice-level responses to some of these changes. In addition to evidence-based clinical guidelines, National Service Frameworks, introduced by the UK government in 1997, also specify detailed models of service provision that health care providers are expected to follow. As part of a larger study examining the impact of National Service Frameworks in general practice, the responses of three practices to the first four NSFs were explored. The failure of NSFs to make a significant impact is compared to the practices' positive responses to purely clinical guidelines such as those developed by the British Hypertension Society. Lipsky's concept of public service workers as 'street-level bureaucrats' is discussed and used as a framework within which to view these findings.

  9. The Effects on Motor Performance of Setting an Overt Level of Aspiration by Mentally Retarded Students.

    ERIC Educational Resources Information Center

    Kozar, Bill

    This study investigates the effects of setting an overt level of aspiration on the standing long jump performance of mildly and moderately retarded institutionalized children. Thirty-three mildly retarded and seven moderately retarded students were randomly assigned to either an overt level of aspiration (OLA) group or a control group. Each…

  10. Real Time Global Tests of the ALICE High Level Trigger Data Transport Framework

    NASA Astrophysics Data System (ADS)

    Becker, B.; Chattopadhyay, S.; Cicalo, C.; Cleymans, J.; de Vaux, G.; Fearick, R. W.; Lindenstruth, V.; Richter, M.; Rohrich, D.; Staley, F.; Steinbeck, T. M.; Szostak, A.; Tilsner, H.; Weis, R.; Vilakazi, Z. Z.

    2008-04-01

    The High Level Trigger (HLT) system of the ALICE experiment is an online event filter and trigger system designed for input bandwidths of up to 25 GB/s at event rates of up to 1 kHz. The system is designed as a scalable PC cluster, implementing several hundred nodes. The transport of data in the system is handled by an object-oriented data flow framework operating on the basis of the publisher-subscriber principle, designed to be fully pipelined with minimal processing overhead and communication latency in the cluster. In this paper, we report the latest measurements where this framework has been operated on five different sites over a global north-south link extending more than 10,000 km, processing a "real-time" data flow.
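
The publisher-subscriber principle mentioned above can be sketched minimally as follows; the class and method names are hypothetical, and the real HLT framework is a pipelined, distributed C++ system rather than this in-process toy:

```python
class Publisher:
    """Toy publisher that pushes each event to all subscribed callbacks."""

    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, event):
        for callback in self.subscribers:
            callback(event)

received = []
pub = Publisher()
pub.subscribe(received.append)                    # a downstream analysis stage
pub.publish({"event_id": 1, "size_mb": 2.5})      # an event fragment
```

The decoupling this buys, producers never know who consumes their data, is what lets such a framework scale from one node to a several-hundred-node cluster without changing the processing components.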

  11. A Measurement Framework for Team Level Assessment of Innovation Capability in Early Requirements Engineering

    NASA Astrophysics Data System (ADS)

    Regnell, Björn; Höst, Martin; Nilsson, Fredrik; Bengtsson, Henrik

    When developing software-intensive products for a market-place it is important for a development organisation to create innovative features for coming releases in order to achieve advantage over competitors. This paper focuses on assessment of innovation capability at team level in relation to the requirements engineering that is taking place before the actual product development projects are decided, when new business models, technology opportunities and intellectual property rights are created and investigated through e.g. prototyping and concept development. The result is a measurement framework focusing on four areas: innovation elicitation, selection, impact and ways-of-working. For each area, candidate measurements were derived from interviews to be used as inspiration in the development of a tailored measurement program. The framework is based on interviews with participants of a software team with specific innovation responsibilities and validated through cross-case analysis and feedback from practitioners.

  12. Individualized Education and Competency Development of Croatian Community Pharmacists Using the General Level Framework

    PubMed Central

    Staničić, Živka; Hadžiabdić, Maja Ortner; Mucalo, Iva; Bates, Ian; Duggan, Catherine; Carter, Sarah; Bruno, Andreia; Košiček, Miljenko

    2012-01-01

    Objectives. To measure Croatian community pharmacists’ progress in competency development using the General Level Framework (GLF) as an educational tool in a longitudinal study. Methods. Patient care competencies of 100 community pharmacists were evaluated twice, in 2009 and in 2010 in a prospective cohort study. During this 12-month period, tailored educational programs based on the GLF were organized and conducted, new services and standard operating procedures were implemented, and documentation of contributions to patient care in the pharmacist’s portfolio became mandatory. Results. Pharmacists’ development of all GLF patient care competencies was significant with the greatest improvements seen in the following competencies: patient consultation, monitoring drug therapy, medicine information and patient education, and evaluation of outcomes. Conclusions. This study, which retested the effectiveness of an evidence-based competency framework, confirmed that GLF is a valid educational tool for pharmacist development. PMID:22438595

  13. Breast mass segmentation in digital mammography based on pulse coupled neural network and level set method

    NASA Astrophysics Data System (ADS)

    Xie, Weiying; Ma, Yide; Li, Yunsong

    2015-05-01

    A novel approach to mammographic image segmentation, termed the PCNN-based level set algorithm, is presented in this paper. As its name implies, the method combines a pulse coupled neural network (PCNN) with the variational level set method for medical image segmentation. To date, little work has been done on detecting the initial zero level set contours with a PCNN for subsequent level set evolution. When all the pixels of the input image are fired by the PCNN, small pixel values yield a much more refined segmentation. In a mammographic image the breast tumor presents large pixel values, while the rest of the image is predominantly dark, so we first take the negative of the image before all its pixels are fired by the PCNN. The PCNN is thus employed to achieve mammary-specific initial mass contour detection, after which all the initial contours are extracted. We define the extracted contours as the initial zero level set contours for automatic mass segmentation by the variational level set in mammographic image analysis. Moreover, the proposed algorithm improves the external energy of the variational level set method for low-contrast mammographic images. Because the gray level of the mass region is higher than that of its surroundings, the Laplace operator is used to modify the external energy, which makes the bright spot much brighter than the surrounding pixels. A preliminary evaluation of the proposed method was performed on a known public database, MIAS, rather than on synthetic images. The experimental results demonstrate that our proposed approach can potentially obtain better mass detection results in terms of sensitivity and specificity. 
Ultimately, this algorithm could lead to increase both sensitivity and specificity of the physicians' interpretation of
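The inversion and Laplacian-enhancement steps described above can be sketched in a few lines. This is an illustrative NumPy reconstruction, not the authors' code; the toy image and the 5-point Laplacian stencil are assumptions made for this summary.

```python
import numpy as np

def laplacian_enhance(img):
    """Sharpen bright spots by subtracting a 5-point discrete Laplacian.

    At a local maximum the Laplacian is negative, so img - lap raises
    the peak relative to its surroundings -- the effect used to
    strengthen the external energy around bright mass regions.
    """
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img)
    return img - lap

# Toy "mammogram": dark background with one bright mass-like pixel.
img = np.zeros((9, 9))
img[4, 4] = 1.0
negative = img.max() - img   # the method first negates the mostly dark image
enhanced = laplacian_enhance(img)
```

On this toy image the bright pixel's value grows after enhancement while its neighbors are suppressed, which is the contrast boost the abstract attributes to the modified external energy.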

  14. A hybrid method for pancreas extraction from CT image based on level set methods.

    PubMed

    Jiang, Huiyan; Tan, Hanqing; Fujita, Hiroshi

    2013-01-01

    This paper proposes a novel semiautomatic method to extract the pancreas from abdominal CT images. Traditional level set and region growing methods, which require the initial contour to be placed near the final object boundary, suffer from leakage into the tissues neighboring the pancreas. The proposed method consists of a customized fast-marching level set method, which generates an optimal initial pancreas region to address the level set method's sensitivity to the initial contour location, and a modified distance-regularized level set method, which extracts the pancreas accurately. The novelty of our method lies in the proper selection and combination of level set methods; furthermore, an energy-decrement algorithm and an energy-tune algorithm are proposed to reduce the negative impact of the bonding force caused by connected tissue whose intensity is similar to that of the pancreas. As a result, our method overcomes oversegmentation at weak boundaries and can accurately extract the pancreas from CT images. The proposed method is compared with five other state-of-the-art medical image segmentation methods on a CT image dataset containing abdominal images from 10 patients. The evaluation results demonstrate that our method outperforms the other methods, achieving higher accuracy and less false segmentation in pancreas extraction.
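The fast-marching front propagation that seeds the initial organ region can be sketched as follows. This is a simplified, Dijkstra-style approximation written for this summary; the speed map, seed location, and arrival-time threshold are illustrative assumptions, not values from the paper.

```python
import heapq
import numpy as np

def arrival_times(speed, seed):
    """Dijkstra-style approximation of fast-marching arrival times.

    The travel cost between 4-neighbours is 1/speed, so the front expands
    quickly through high-speed (organ-like) pixels and stalls where the
    speed is low, giving a leak-resistant initial region.
    """
    t = np.full(speed.shape, np.inf)
    t[seed] = 0.0
    heap = [(0.0, seed)]
    while heap:
        d, (i, j) = heapq.heappop(heap)
        if d > t[i, j]:
            continue  # stale heap entry
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < speed.shape[0] and 0 <= nj < speed.shape[1]:
                nd = d + 1.0 / max(speed[ni, nj], 1e-9)
                if nd < t[ni, nj]:
                    t[ni, nj] = nd
                    heapq.heappush(heap, (nd, (ni, nj)))
    return t

# High speed inside a hypothetical organ, low speed outside.
speed = np.full((16, 16), 0.05)
speed[4:12, 4:12] = 1.0
t = arrival_times(speed, (8, 8))
initial_region = t < 10.0  # threshold picks out the fast (organ) region
```

Thresholding the arrival times recovers exactly the high-speed block here; in the paper this role is played by the customized fast-marching step that hands an initial region to the distance-regularized level set.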

  15. Setting the Direction Framework

    ERIC Educational Resources Information Center

    Alberta Education, 2009

    2009-01-01

    Alberta has a long and proud history of meeting the educational needs of students with disabilities and diverse needs. The province serves many thousands of students with behavioural, communication and intellectual needs, as well as students with mental health challenges, learning or physical disabilities, and students who are gifted and talented.…

  16. Bounding probabilistic sea-level projections within the framework of the possibility theory

    NASA Astrophysics Data System (ADS)

    Le Cozannet, Gonéri; Manceau, Jean-Charles; Rohmer, Jeremy

    2017-01-01

    Despite progress in climate change science, projections of future sea-level rise remain highly uncertain, especially owing to large unknowns in the melting processes affecting the ice sheets in Greenland and Antarctica. Based on climate-model outcomes and the expertise of scientists concerned with these issues, the IPCC provided constraints on the quantiles of sea-level projections. Moreover, additional physical limits to future sea-level rise have been established, although only approximately. However, many probability functions can comply with this imprecise knowledge. In this contribution, we provide a framework based on extra-probabilistic theories (namely the possibility theory) to model the uncertainties in sea-level rise projections by 2100 under the RCP 8.5 scenario. The results provide a concise representation of the uncertainties in future sea-level rise and of their intrinsically imprecise nature, including a maximum bound on the total uncertainty. Today, coastal impact studies are increasingly moving away from deterministic sea-level projections, which underestimate expected damages and adaptation needs compared with probabilistic laws. However, we show that the probability functions used so far have explored only a rather conservative subset of the sea-level projections compliant with the IPCC. As a consequence, coastal impact studies relying on these probabilistic sea-level projections are likely to underestimate the possibility of large damages and adaptation needs.
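The possibilistic bounding idea can be illustrated numerically. In possibility theory, a possibility distribution pi bounds a whole family of probability measures: for any event A, N(A) <= P(A) <= Pi(A), where Pi(A) = sup over A of pi and N(A) = 1 - Pi(complement of A). The trapezoidal shape and the sea-level numbers below are purely illustrative assumptions, not the paper's calibration.

```python
def pi(x, support=(0.2, 2.5), core=(0.5, 1.0)):
    """Trapezoidal possibility distribution for SLR by 2100 (metres).

    pi = 1 on the fully plausible core, falling linearly to 0 at the
    physical-limit support bounds (illustrative numbers).
    """
    a, d = support
    b, c = core
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def bounds_exceed(level, grid=None):
    """Upper (possibility) and lower (necessity) probability of SLR > level."""
    grid = grid or [i / 1000.0 for i in range(0, 3001)]
    possibility = max(pi(x) for x in grid if x > level)
    necessity = 1.0 - max(pi(x) for x in grid if x <= level)
    return necessity, possibility

n, p = bounds_exceed(1.2)   # any law consistent with pi must lie in [n, p]
```

Any probability distribution compatible with the imprecise knowledge encoded in pi assigns the exceedance event a probability inside [n, p], which is the "maximum bound of the total uncertainty" notion in the abstract.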

  17. Issues related to setting exemption levels for oil and gas NORM

    SciTech Connect

    Blunt, D. L.; Gooden, D. S.; Smith, K. P.

    1999-11-12

    In the absence of any federal regulations that specifically address the handling and disposal of wastes containing naturally occurring radioactive material (NORM), individual states have taken responsibility for developing their own regulatory programs for NORM. A key issue in developing NORM rules is defining exemption levels: the specific levels or concentrations that determine which waste materials are subject to controlled management. In general, states have drawn upon existing standards and guidelines for similar waste types in establishing exemption levels for NORM. Simply adopting these standards may not be appropriate for oil and gas NORM for several reasons. The Interstate Oil and Gas Compact Commission's NORM Subcommittee has summarized the issues involved in setting exemption levels in a report titled "Naturally Occurring Radioactive Materials (NORM): Issues from the Oil and Gas Point of View". The committee has also recommended a set of exemption levels for controlled practices and for remediation activities on the basis of the issues discussed.

  18. Hepatic vessel segmentation using variational level set combined with non-local robust statistics.

    PubMed

    Lu, Siyu; Huang, Hui; Liang, Ping; Chen, Gang; Xiao, Liang

    2017-02-01

    Hepatic vessel segmentation is a challenging step in therapy guided by magnetic resonance imaging (MRI). This paper presents an improved variational level set method, which uses non-local robust statistics to suppress the influence of noise in MR images. The non-local robust statistics, which represent vascular features, are learned adaptively from seeds provided by users. K-means clustering in the neighborhoods of the seeds is utilized to exclude inappropriate seeds, which are obviously corrupted by noise. The neighborhoods of the appropriate seeds are placed in an array to calculate the non-local robust statistics, from which the variational level set formulation is constructed. Bias correction is utilized in the level set formulation to reduce the influence of intensity inhomogeneity in MRI. Experiments were conducted on real MR images and showed that the proposed method performed better on small hepatic vessel segmentation than other segmentation methods.
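The seed-screening step (K-means over seed neighbourhoods to reject noise-corrupted seeds) can be sketched as below. The two-cluster one-dimensional K-means and the toy intensity values are illustrative stand-ins for the paper's actual features, which are computed from full seed neighbourhoods.

```python
def kmeans_1d(values, iters=20):
    """Two-cluster 1-D K-means; returns (centers, labels)."""
    c0, c1 = min(values), max(values)
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [0 if abs(v - c0) <= abs(v - c1) else 1 for v in values]
        g0 = [v for v, l in zip(values, labels) if l == 0]
        g1 = [v for v, l in zip(values, labels) if l == 1]
        c0 = sum(g0) / len(g0) if g0 else c0
        c1 = sum(g1) / len(g1) if g1 else c1
    return (c0, c1), labels

# Mean intensity of each seed's neighbourhood (hypothetical values):
# vessel seeds cluster high, noise-corrupted seeds fall into the
# low-intensity cluster and are dropped before statistics are learned.
seed_means = [0.82, 0.79, 0.85, 0.15, 0.81, 0.22]
(c0, c1), labels = kmeans_1d(seed_means)
vessel_cluster = 0 if c0 > c1 else 1
kept = [m for m, l in zip(seed_means, labels) if l == vessel_cluster]
```

Only the neighbourhoods of the kept seeds would then contribute to the non-local robust statistics that drive the level set evolution.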

  19. A Variational Level Set Approach to Segmentation and Bias Correction of Images with Intensity Inhomogeneity

    PubMed Central

    Huang, Rui; Ding, Zhaohua; Gatenby, Chris; Metaxas, Dimitris; Gore, John

    2009-01-01

    This paper presents a variational level set approach to joint segmentation and bias correction of images with intensity inhomogeneity. Our method is based on the observation that intensities in a relatively small local region are separable, despite the inseparability of the intensities in the whole image caused by the intensity inhomogeneity. We first define a weighted K-means clustering objective function for the image intensities in a neighborhood around each point, with the cluster centers having a multiplicative factor that estimates the bias within the neighborhood. The objective function is then integrated over the entire domain and incorporated into a variational level set formulation. The energy minimization is performed via a level set evolution process. Our method is able to estimate bias of quite general profiles. Moreover, it is robust to initialization and therefore allows automated application. The proposed method has been used for images of various modalities with promising results. PMID:18982712

  20. [Narrow band multi-region level set method for remote sensing image].

    PubMed

    Fang, Jiang-Xiong; Tu, En-Mei; Yang, Jie; Jia, Zhen-Hong; Kasabov, Nikola

    2011-11-01

    The classical Chan-Vese (C-V) model produces massive redundant contours when used to segment remote sensing images, which have interlaced edges; moreover, it cannot segment homogeneous objects consisting of multiple regions. To overcome these limitations of the C-V model, a narrow band multi-region level set method is proposed. N-1 curves are used for the segmentation of N regions, with each curve representing one region. First, establishing an independent level set function for each region eliminates the redundant contours and avoids the problems of vacuum and overlap. Second, a narrow band approach to the level set evolution reduces the computational cost. Experimental results on remote sensing images verify that our model is efficient and accurate.
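The encoding of N regions by N-1 level set functions, each curve owning one region, can be sketched as a labelling rule: a pixel takes the label of the first function that is positive there, and the background label if none is. This is a schematic reading of the multi-region representation with made-up shapes, not the paper's evolution equations.

```python
import numpy as np

def label_regions(phis):
    """Map N-1 level set functions to N region labels.

    Label i goes to pixels where phi_i is the first positive function;
    pixels inside no curve form region N-1 (the background). Because each
    pixel is claimed exactly once, vacuum and overlap cannot occur.
    """
    shape = phis[0].shape
    labels = np.full(shape, len(phis))        # background label
    claimed = np.zeros(shape, dtype=bool)
    for i, phi in enumerate(phis):
        inside = (phi > 0) & ~claimed
        labels[inside] = i
        claimed |= inside
    return labels

# Two curves (signed-distance-like fields) segment three regions.
y, x = np.mgrid[0:32, 0:32]
phi0 = 8.0 - np.hypot(x - 8, y - 16)    # circle on the left
phi1 = 8.0 - np.hypot(x - 24, y - 16)   # circle on the right
labels = label_regions([phi0, phi1])
```

Every pixel receives exactly one of the three labels, which is the vacuum- and overlap-free property the abstract claims for the multi-region formulation.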

  1. Aerostructural Level Set Topology Optimization for a Common Research Model Wing

    NASA Technical Reports Server (NTRS)

    Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia

    2014-01-01

    The purpose of this work is to use level set topology optimization to improve the design of a representative wing box structure for the NASA common research model. The objective is to minimize the total compliance of the structure under aerodynamic and body force loading, where the aerodynamic loading is coupled to the structural deformation. A taxi bump case was also considered, where only body force loads were applied. The trim condition that aerodynamic lift must balance the total weight of the aircraft is enforced by allowing the root angle of attack to change. The level set optimization method is implemented on an unstructured three-dimensional grid, so that the method can optimize a wing box with arbitrary geometry. Fast matching and upwind schemes are developed for an unstructured grid, which make the level set method robust and efficient. The adjoint method is used to obtain the coupled shape sensitivities required to perform aerostructural optimization of the wing box structure.

  2. Dynamically reconfigurable framework for pixel-level visible light communication projector

    NASA Astrophysics Data System (ADS)

    Zhou, Leijie; Fukushima, Shogo; Naemura, Takeshi

    2014-03-01

    We have developed the Pixel-level Visible Light Communication (PVLC) projector based on the DLP (Digital Light Processing) system. The projector can embed invisible data pixel by pixel into a visible image to realize augmented reality applications. However, it cannot update either invisible or visible contents in real time. In order to solve the problem, we improve the projector so that a PC can dynamically control the system and enable us to achieve a high-frame-rate feature by resolution conversion. This paper proposes the system framework and the design method for the dynamically reconfigurable PVLC projector.
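Pixel-wise embedding of invisible data into a visible image can be illustrated with a least-significant-bit scheme. This is only an analogy chosen for this summary: the actual PVLC projector hides data in the temporal modulation of DLP mirror flips, not in stored bit planes, and the array sizes below are arbitrary.

```python
import numpy as np

def embed(image, data_bits):
    """Hide one bit per pixel in the least significant bit of an 8-bit image."""
    return (image & 0xFE) | data_bits

def extract(stego):
    """Recover the embedded bit plane."""
    return stego & 1

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)
bits = rng.integers(0, 2, size=(4, 4), dtype=np.uint8)
stego = embed(image, bits)
```

The visible image changes by at most one gray level per pixel, while the hidden bit plane is exactly recoverable, mirroring the "invisible data pixel by pixel" idea.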

  3. Setting-level influences on implementation of the responsive classroom approach.

    PubMed

    Wanless, Shannon B; Patton, Christine L; Rimm-Kaufman, Sara E; Deutsch, Nancy L

    2013-02-01

    We used mixed methods to examine the association between setting-level factors and observed implementation of a social and emotional learning intervention (Responsive Classroom® approach; RC). In study 1 (N = 33 3rd grade teachers after the first year of RC implementation), we identified relevant setting-level factors and uncovered the mechanisms through which they related to implementation. In study 2 (N = 50 4th grade teachers after the second year of RC implementation), we validated our most salient Study 1 finding across multiple informants. Findings suggested that teachers perceived setting-level factors, particularly principal buy-in to the intervention and individualized coaching, as influential to their degree of implementation. Further, we found that intervention coaches' perspectives of principal buy-in were more related to implementation than principals' or teachers' perspectives. Findings extend the application of setting theory to the field of implementation science and suggest that interventionists may want to consider particular accounts of school setting factors before determining the likelihood of schools achieving high levels of implementation.

  4. Geometrically constrained isogeometric parameterized level-set based topology optimization via trimmed elements

    NASA Astrophysics Data System (ADS)

    Wang, Yingjun; Benson, David J.

    2016-12-01

    In this paper, an approach based on the fast point-in-polygon (PIP) algorithm and trimmed elements is proposed for isogeometric topology optimization (TO) with arbitrary geometric constraints. The isogeometric parameterized level-set-based TO method, which directly uses non-uniform rational B-splines (NURBS) both for level set function (LSF) parameterization and for objective function calculation, provides higher accuracy and efficiency than previous methods. The integration of trimmed elements is handled by an efficient quadrature rule that generates quadrature points and weights for arbitrary geometric shapes. Numerical examples demonstrate the efficiency and flexibility of the method.
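The point-in-polygon test at the heart of the trimmed-element integration can be sketched with the standard ray-casting algorithm. This is a generic implementation for illustration, not the paper's optimized PIP variant, and the square trimming curve is a made-up example.

```python
def point_in_polygon(px, py, poly):
    """Ray casting: count crossings of a ray from (px, py) toward +x.

    poly is a list of (x, y) vertices; an odd crossing count means the
    point lies inside the (possibly non-convex) trimming curve, so the
    quadrature point at (px, py) would be kept.
    """
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > py) != (y2 > py):  # edge spans the ray's y-coordinate
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > px:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
kept = point_in_polygon(2, 2, square)      # quadrature point inside
trimmed = point_in_polygon(5, 2, square)   # quadrature point trimmed away
```

In a trimmed-element scheme this predicate decides which quadrature points of a cut element contribute to the integral.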

  5. Numerical Schemes for the Hamilton-Jacobi and Level Set Equations on Triangulated Domains

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.; Sethian, James A.

    2006-01-01

    Borrowing from techniques developed for conservation law equations, we have developed both monotone and higher order accurate numerical schemes which discretize the Hamilton-Jacobi and level set equations on triangulated domains. The use of unstructured meshes containing triangles (2D) and tetrahedra (3D) easily accommodates mesh adaptation to resolve disparate level set feature scales with a minimal number of solution unknowns. The minisymposium talk will discuss these algorithmic developments and present sample calculations using our adaptive triangulation algorithm applied to various moving interface problems such as etching, deposition, and curvature flow.
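On a structured one-dimensional grid the monotone-scheme idea can be sketched for the transport form of the level set equation, phi_t + a*phi_x = 0, with first-order upwinding. This is a didactic Cartesian analogue written for this summary; the paper's schemes operate on unstructured triangulations in 2-D and 3-D.

```python
import numpy as np

def upwind_advect(phi, a, dx, dt, steps):
    """First-order upwind advection of a level set function, speed a > 0.

    Upwinding takes the backward difference when information flows in +x,
    which keeps the scheme monotone (no spurious new extrema).
    """
    phi = phi.copy()
    for _ in range(steps):
        dphi = np.empty_like(phi)
        dphi[1:] = (phi[1:] - phi[:-1]) / dx   # backward difference
        dphi[0] = dphi[1]                      # simple inflow boundary
        phi -= dt * a * dphi
    return phi

x = np.linspace(0.0, 1.0, 101)
phi0 = x - 0.3              # zero level set (interface) at x = 0.3
phi = upwind_advect(phi0, a=1.0, dx=0.01, dt=0.005, steps=40)
interface = x[np.argmin(np.abs(phi))]   # after t = 0.2, near x = 0.5
```

The interface advects at unit speed without oscillation; higher-order accuracy on triangulations requires the more elaborate monotone Hamiltonians the abstract refers to.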

  6. A level-set method for thermal motion of bubbles and droplets

    NASA Astrophysics Data System (ADS)

    Balcázar, Néstor; Oliva, Assensi; Rigola, Joaquim

    2016-09-01

    A conservative level-set model for direct simulation of two-phase flows with thermocapillary effects at dynamically deformable interface is presented. The Navier-Stokes equations coupled with the energy conservation equation are solved by means of a finite-volume/level-set method. Some numerical examples including thermocapillary motion of single and multiple fluid particles are computed by means of the present method. The results are compared with analytical solutions and numerical results from the literature as validations of the proposed model.

  7. [Segmentation of the femur from MRI data with shape-based level sets]

    NASA Astrophysics Data System (ADS)

    Dekomien, Claudia; Busch, Martin; Teske, Wolfram; Winter, Susanne

    This work addresses segmentation of the femur from MRI data sets with a shape-based level set approach. The algorithm consists of two phases: model construction and segmentation. In the segmentation phase, an edge-based and an intensity-based optimization criterion were combined. For local refinement of the result, a Laplacian level set method was additionally applied. With this approach, the femur could be segmented well in three different MRI sequences and one fused data set.

  8. Level set segmentation of brain magnetic resonance images based on local Gaussian distribution fitting energy.

    PubMed

    Wang, Li; Chen, Yunjie; Pan, Xiaohua; Hong, Xunning; Xia, Deshen

    2010-05-15

    This paper presents a variational level set approach in a multi-phase formulation to segmentation of brain magnetic resonance (MR) images with intensity inhomogeneity. In our model, the local image intensities are characterized by Gaussian distributions with different means and variances. We define a local Gaussian distribution fitting energy with level set functions and local means and variances as variables. The means and variances of local intensities are considered as spatially varying functions. Therefore, our method is able to deal with intensity inhomogeneity without inhomogeneity correction. Our method has been applied to 3T and 7T MR images with promising results.
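Treating the means and variances of local intensities as spatially varying functions amounts to computing windowed moments. Below is a minimal sketch with a box window; the paper uses Gaussian-weighted neighbourhoods and embeds these statistics in a multi-phase level set energy, so the window shape and sizes here are assumptions.

```python
import numpy as np

def local_stats(img, r=2):
    """Spatially varying mean and variance over (2r+1)^2 box windows."""
    win = 2 * r + 1
    pad = np.pad(img, r, mode='edge')

    def box(a):
        # Sliding-window sums via an integral image (summed-area table).
        c = np.cumsum(np.cumsum(a, axis=0), axis=1)
        c = np.pad(c, ((1, 0), (1, 0)))
        return (c[win:, win:] - c[:-win, win:]
                - c[win:, :-win] + c[:-win, :-win])

    n = win * win
    mean = box(pad) / n
    var = box(pad ** 2) / n - mean ** 2
    return mean, var

# Two-region toy image: local stats are flat inside each region and only
# vary near the boundary, which is what makes local fitting robust to
# slowly varying intensity inhomogeneity.
img = np.zeros((6, 6))
img[:, 3:] = 1.0
mean, var = local_stats(img)
```

Inside each homogeneous region the local mean matches the region intensity and the local variance vanishes; a slowly varying bias field shifts these local statistics smoothly rather than breaking them.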

  9. Individual- and Setting-Level Correlates of Secondary Traumatic Stress in Rape Crisis Center Staff.

    PubMed

    Dworkin, Emily R; Sorell, Nicole R; Allen, Nicole E

    2016-02-01

    Secondary traumatic stress (STS) is an issue of significant concern among providers who work with survivors of sexual assault. Although STS has been studied in relation to individual-level characteristics of a variety of types of trauma responders, less research has focused specifically on rape crisis centers as environments that might convey risk of or protection from STS, and no research to our knowledge has modeled setting-level variation in correlates of STS. The current study uses a sample of 164 staff members representing 40 rape crisis centers across a single Midwestern state to investigate the staff member- and agency-level correlates of STS. Results suggest that correlates exist at both levels of analysis. Younger age and greater severity of sexual assault history were statistically significant individual-level predictors of increased STS. Greater frequency of supervision was more strongly related to secondary stress for non-advocates than for advocates. At the setting level, lower levels of supervision and higher client loads agency-wide accounted for unique variance in staff members' STS. These findings suggest that characteristics of both providers and their settings are important to consider when understanding their STS. © The Author(s) 2014.

  10. Accurate Adaptive Level Set Method and Sharpening Technique for Three Dimensional Deforming Interfaces

    NASA Technical Reports Server (NTRS)

    Kim, Hyoungin; Liou, Meng-Sing

    2011-01-01

    In this paper, we demonstrate improved accuracy of the level set method for resolving deforming interfaces by proposing two key elements: (1) accurate level set solutions on adapted Cartesian grids, obtained by judiciously choosing interpolation polynomials in regions of different grid levels, and (2) enhanced reinitialization via an interface sharpening procedure. The level set equation is solved using a fifth order WENO scheme or a second order central differencing scheme, depending on the availability of uniform stencils at each grid point. Grid adaptation criteria are determined so that the Hamiltonian functions at nodes adjacent to interfaces are always calculated by the fifth order WENO scheme. This selective use of the fifth order WENO and second order central differencing schemes is confirmed to give more accurate results than those in the literature for standard test problems. To further improve accuracy, especially near thin filaments, we suggest an artificial sharpening method, which takes a form similar to the conventional re-initialization method but uses the sign of the curvature instead of the sign of the level set function. Consequently, volume loss due to numerical dissipation on thin filaments is remarkably reduced for the test problems.
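The proposed sharpening step has the same structure as a re-initialization iteration, phi_t = S * (1 - |grad phi|), but with S taken as the sign of the interface curvature rather than the sign of phi. The schematic 2-D Euler step below uses plain central differences and an invented test field; it is not the authors' WENO implementation.

```python
import numpy as np

def sharpen_step(phi, dt=0.1, h=1.0, eps=1e-12):
    """One Euler step of phi_t = sign(kappa) * (1 - |grad phi|).

    kappa = div(grad phi / |grad phi|) is the curvature of the level
    sets; using its sign (instead of sign(phi), as in conventional
    re-initialization) steepens the field around thin filaments.
    """
    gy, gx = np.gradient(phi, h)          # axis-0 then axis-1 derivatives
    norm = np.sqrt(gx ** 2 + gy ** 2) + eps
    nyy, _ = np.gradient(gy / norm, h)    # d/dy of the unit-normal y-part
    _, nxx = np.gradient(gx / norm, h)    # d/dx of the unit-normal x-part
    kappa = nxx + nyy
    return phi + dt * np.sign(kappa) * (1.0 - norm)

# Signed-distance-like field for a circle of radius 5.
y, x = np.mgrid[0:20, 0:20]
phi = 5.0 - np.hypot(x - 10, y - 10)
out = sharpen_step(phi)
```

In practice this update would be iterated between advection steps, playing the role that re-initialization plays in the conventional algorithm.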

  11. Geological repository for nuclear high level waste in France from feasibility to design within a legal framework

    SciTech Connect

    Voizard, Patrice; Mayer, Stefan; Ouzounian, Gerald

    2007-07-01

    Over the past 15 years, the French program on deep geologic disposal of high level and long-lived radioactive waste has benefited from a clear legal framework as the result of the December 30, 1991 French Waste Act. To fulfil its obligations stipulated in this law, ANDRA has submitted the 'Dossier 2005 Argile' (clay) and 'Dossier 2005 Granite' to the French Government. The first of those reports presents a concept for the underground disposal of nuclear waste at a specific clay site and focuses on a feasibility study. Knowledge of the host rock characteristics is based on the investigations carried out at the Meuse/Haute Marne Underground Research Laboratory. The repository concept addresses various issues, the most important of which relate to the large amount of waste, the clay host rock and the reversibility requirement. This phase ended upon review and evaluation of the 'Dossier 2005' by different organisations, including the National Review Board, the National Safety Authority and the NEA International Review Team. By passing the new Planning Act of June 28, 2006, on the sustainable management of radioactive materials and waste, the French parliament further defined a clear legal framework for future work. This Planning Act sets a schedule and defines the objectives for the next phase of repository design, requesting the submission of a construction authorization application by 2015. The law calls for the repository program to be in a position to commission disposal installations by 2025. (authors)

  12. Evidence-informed capacity building for setting health priorities in low- and middle-income countries: A framework and recommendations for further research.

    PubMed

    Li, Ryan; Ruiz, Francis; Culyer, Anthony J; Chalkidou, Kalipso; Hofman, Karen J

    2017-01-01

    Priority-setting in health is risky and challenging, particularly in resource-constrained settings. It is not simply a narrow technical exercise, and involves the mobilisation of a wide range of capacities among stakeholders - not only the technical capacity to "do" research in economic evaluations. Using the Individuals, Nodes, Networks and Environment (INNE) framework, we identify those stakeholders whose capacity needs will vary along the evidence-to-policy continuum. Policymakers and healthcare managers require the capacity to commission and use relevant evidence (including evidence of clinical and cost-effectiveness, and of social values); academics need to understand and respond to decision-makers' needs to produce relevant research. The health system at all levels will need institutional capacity building to incentivise routine generation and use of evidence. Knowledge brokers, including priority-setting agencies (such as England's National Institute for Health and Care Excellence, and the Health Interventions and Technology Assessment Program, Thailand) and the media can play an important role in facilitating engagement and knowledge transfer between the various actors. Especially at the outset but at every step, it is critical that patients and the public understand that trade-offs are inherent in priority-setting, and careful efforts should be made to engage them and to hear their views throughout the process. There is thus no single approach to capacity building; rather, a spectrum of activities that recognises the roles and skills of all stakeholders. A range of methods, including formal and informal training, networking and engagement, and support through collaboration on projects, should be flexibly employed (and tailored to the specific needs of each country) to support institutionalisation of evidence-informed priority-setting. Finally, capacity building should be a two-way process; those who build capacity should also attend to their own capacity.

  13. Evidence-informed capacity building for setting health priorities in low- and middle-income countries: A framework and recommendations for further research

    PubMed Central

    Li, Ryan; Ruiz, Francis; Culyer, Anthony J; Chalkidou, Kalipso; Hofman, Karen J

    2017-01-01

    Priority-setting in health is risky and challenging, particularly in resource-constrained settings. It is not simply a narrow technical exercise, and involves the mobilisation of a wide range of capacities among stakeholders – not only the technical capacity to “do” research in economic evaluations. Using the Individuals, Nodes, Networks and Environment (INNE) framework, we identify those stakeholders whose capacity needs will vary along the evidence-to-policy continuum. Policymakers and healthcare managers require the capacity to commission and use relevant evidence (including evidence of clinical and cost-effectiveness, and of social values); academics need to understand and respond to decision-makers’ needs to produce relevant research. The health system at all levels will need institutional capacity building to incentivise routine generation and use of evidence. Knowledge brokers, including priority-setting agencies (such as England’s National Institute for Health and Care Excellence, and the Health Interventions and Technology Assessment Program, Thailand) and the media can play an important role in facilitating engagement and knowledge transfer between the various actors. Especially at the outset but at every step, it is critical that patients and the public understand that trade-offs are inherent in priority-setting, and careful efforts should be made to engage them and to hear their views throughout the process. There is thus no single approach to capacity building; rather, a spectrum of activities that recognises the roles and skills of all stakeholders. A range of methods, including formal and informal training, networking and engagement, and support through collaboration on projects, should be flexibly employed (and tailored to the specific needs of each country) to support institutionalisation of evidence-informed priority-setting. Finally, capacity building should be a two-way process; those who build capacity should also attend to their own capacity.

  14. A Framework for Lab Work Management in Mass Courses. Application to Low Level Input/Output without Hardware

    ERIC Educational Resources Information Center

    Rodriguez, Santiago; Zamorano, Juan; Rosales, Francisco; Dopico, Antonio Garcia; Pedraza, Jose Luis

    2007-01-01

    This paper describes a complete lab work management framework designed and developed in the authors' department to help teachers to manage the small projects that students are expected to complete as lab assignments during their graduate-level computer engineering studies. The paper focuses on an application example of the framework to a specific…

  15. Demons versus level-set motion registration for coronary 18F-sodium fluoride PET

    NASA Astrophysics Data System (ADS)

    Rubeaux, Mathieu; Joshi, Nikhil; Dweck, Marc R.; Fletcher, Alison; Motwani, Manish; Thomson, Louise E.; Germano, Guido; Dey, Damini; Berman, Daniel S.; Newby, David E.; Slomka, Piotr J.

    2016-03-01

    Ruptured coronary atherosclerotic plaques commonly cause acute myocardial infarction. It has been recently shown that active microcalcification in the coronary arteries, one of the features that characterizes vulnerable plaques at risk of rupture, can be imaged using cardiac gated 18F-sodium fluoride (18F-NaF) PET. We have shown in previous work that a motion correction technique applied to cardiac-gated 18F-NaF PET images can enhance image quality and improve uptake estimates. In this study, we further investigated the applicability of different algorithms for registration of the coronary artery PET images. In particular, we aimed to compare demons vs. level-set nonlinear registration techniques applied for the correction of cardiac motion in coronary 18F-NaF PET. To this end, fifteen patients underwent 18F-NaF PET and prospective coronary CT angiography (CCTA). PET data were reconstructed in 10 ECG gated bins; subsequently these gated bins were registered using demons and level-set methods guided by the extracted coronary arteries from CCTA, to eliminate the effect of cardiac motion on PET images. Noise levels, target-to-background ratios (TBR) and global motion were compared to assess image quality. Compared to the reference standard of using only the diastolic PET image (25% of the counts from PET acquisition), cardiac motion registration using either level-set or demons techniques almost halved image noise due to the use of counts from the full PET acquisition and increased TBR difference between 18F-NaF positive and negative lesions. The demons method produces smoother deformation fields, exhibiting no singularities (which reflects how physically plausible the registration deformation is), as compared to the level-set method, which presents between 4 and 8% of singularities, depending on the coronary artery considered. In conclusion, the demons method produces smoother motion fields as compared to the level-set method, with a motion that is physiologically plausible.
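The demons force that drives each gated bin toward the reference can be sketched per voxel using Thirion's classic passive-force form. This is a generic illustration with an invented toy image pair, not the study's implementation, and the smoothing that normally regularizes the field between iterations is only mentioned in a comment.

```python
import numpy as np

def demons_step(fixed, moving, eps=1e-12):
    """Thirion demons update: u = (m - f) * grad(f) / (|grad f|^2 + (m - f)^2).

    The denominator regularises the force where the gradient or the
    intensity difference vanishes; in practice u is Gaussian-smoothed
    between iterations, which is what yields a smooth motion field.
    """
    diff = moving - fixed
    gy, gx = np.gradient(fixed)
    denom = gx ** 2 + gy ** 2 + diff ** 2 + eps
    return diff * gx / denom, diff * gy / denom

# Toy example: a bright blob shifted by one pixel between two "gates".
y, x = np.mgrid[0:32, 0:32]
fixed = np.exp(-((x - 16) ** 2 + (y - 16) ** 2) / 20.0)
moving = np.exp(-((x - 17) ** 2 + (y - 16) ** 2) / 20.0)
ux, uy = demons_step(fixed, moving)
```

For this one-pixel horizontal offset the x-component of the force field points consistently in one direction, which is the per-voxel signal the iterative registration accumulates.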

  16. Demons versus Level-Set motion registration for coronary 18F-sodium fluoride PET

    PubMed Central

    Rubeaux, Mathieu; Joshi, Nikhil; Dweck, Marc R.; Fletcher, Alison; Motwani, Manish; Thomson, Louise E.; Germano, Guido; Dey, Damini; Berman, Daniel S.; Newby, David E.; Slomka, Piotr J.

    2016-01-01

    Ruptured coronary atherosclerotic plaques commonly cause acute myocardial infarction. It has been recently shown that active microcalcification in the coronary arteries, one of the features that characterizes vulnerable plaques at risk of rupture, can be imaged using cardiac gated 18F-sodium fluoride (18F-NaF) PET. We have shown in previous work that a motion correction technique applied to cardiac-gated 18F-NaF PET images can enhance image quality and improve uptake estimates. In this study, we further investigated the applicability of different algorithms for registration of the coronary artery PET images. In particular, we aimed to compare demons vs. level-set nonlinear registration techniques applied for the correction of cardiac motion in coronary 18F-NaF PET. To this end, fifteen patients underwent 18F-NaF PET and prospective coronary CT angiography (CCTA). PET data were reconstructed in 10 ECG gated bins; subsequently these gated bins were registered using demons and level-set methods guided by the extracted coronary arteries from CCTA, to eliminate the effect of cardiac motion on PET images. Noise levels, target-to-background ratios (TBR) and global motion were compared to assess image quality. Compared to the reference standard of using only the diastolic PET image (25% of the counts from PET acquisition), cardiac motion registration using either level-set or demons techniques almost halved image noise due to the use of counts from the full PET acquisition and increased TBR difference between 18F-NaF positive and negative lesions. The demons method produces smoother deformation fields, exhibiting no singularities (which reflects how physically plausible the registration deformation is), as compared to the level-set method, which presents between 4 and 8% of singularities, depending on the coronary artery considered. In conclusion, the demons method produces smoother motion fields as compared to the level-set method, with a motion that is physiologically plausible.

  17. Demons versus Level-Set motion registration for coronary (18)F-sodium fluoride PET.

    PubMed

    Rubeaux, Mathieu; Joshi, Nikhil; Dweck, Marc R; Fletcher, Alison; Motwani, Manish; Thomson, Louise E; Germano, Guido; Dey, Damini; Berman, Daniel S; Newby, David E; Slomka, Piotr J

    2016-02-27

    Ruptured coronary atherosclerotic plaques commonly cause acute myocardial infarction. It has been recently shown that active microcalcification in the coronary arteries, one of the features that characterizes vulnerable plaques at risk of rupture, can be imaged using cardiac gated (18)F-sodium fluoride ((18)F-NaF) PET. We have shown in previous work that a motion correction technique applied to cardiac-gated (18)F-NaF PET images can enhance image quality and improve uptake estimates. In this study, we further investigated the applicability of different algorithms for registration of the coronary artery PET images. In particular, we aimed to compare demons vs. level-set nonlinear registration techniques applied for the correction of cardiac motion in coronary (18)F-NaF PET. To this end, fifteen patients underwent (18)F-NaF PET and prospective coronary CT angiography (CCTA). PET data were reconstructed in 10 ECG gated bins; subsequently these gated bins were registered using demons and level-set methods guided by the extracted coronary arteries from CCTA, to eliminate the effect of cardiac motion on PET images. Noise levels, target-to-background ratios (TBR) and global motion were compared to assess image quality. Compared to the reference standard of using only the diastolic PET image (25% of the counts from PET acquisition), cardiac motion registration using either level-set or demons techniques almost halved image noise due to the use of counts from the full PET acquisition and increased TBR difference between (18)F-NaF positive and negative lesions. The demons method produces smoother deformation fields, exhibiting no singularities (which reflects how physically plausible the registration deformation is), as compared to the level-set method, which presents between 4 and 8% of singularities, depending on the coronary artery considered. In conclusion, the demons method produces smoother motion fields as compared to the level-set method, with a motion that is physiologically plausible.

  18. Probabilistic framework for assessing the ice sheet contribution to sea level change.

    PubMed

    Little, Christopher M; Urban, Nathan M; Oppenheimer, Michael

    2013-02-26

    Previous sea level rise (SLR) assessments have excluded the potential for dynamic ice loss over much of Greenland and Antarctica, and recently proposed "upper bounds" on Antarctica's 21st-century SLR contribution are derived principally from regions where present-day mass loss is concentrated (basin 15, or B15, drained largely by Pine Island, Thwaites, and Smith glaciers). Here, we present a probabilistic framework for assessing the ice sheet contribution to sea level change that explicitly accounts for mass balance uncertainty over an entire ice sheet. Applying this framework to Antarctica, we find that ongoing mass imbalances in non-B15 basins give an SLR contribution by 2100 that: (i) is comparable to projected changes in B15 discharge and Antarctica's surface mass balance, and (ii) varies widely depending on the subset of basins and observational dataset used in projections. Increases in discharge uncertainty, or decreases in the exceedance probability used to define an upper bound, increase the fractional contribution of non-B15 basins; even weak spatial correlations in future discharge growth rates markedly enhance this sensitivity. Although these projections rely on poorly constrained statistical parameters, they may be updated with observations and/or models at many spatial scales, facilitating a more comprehensive account of uncertainty that, if implemented, will improve future assessments.
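
A probabilistic framework of this kind can be sketched as a Monte Carlo calculation: sample spatially correlated discharge growth rates per basin and read an "upper bound" off an exceedance probability. All basin names beyond B15, rates, and the correlation value below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical Monte Carlo sketch of a basin-level SLR projection.
rng = np.random.default_rng(0)
n_samples = 20000
basins = ["B15", "B16", "B17", "B18"]           # B15 drained by Pine Island/Thwaites/Smith
mean_rate = np.array([0.30, 0.05, 0.05, 0.05])  # mean SLR contribution (made-up units)
std_rate = np.array([0.15, 0.05, 0.05, 0.05])

# Weak positive spatial correlation between basin growth rates.
rho = 0.3
cov = np.outer(std_rate, std_rate) * (rho + (1 - rho) * np.eye(len(basins)))
rates = rng.multivariate_normal(mean_rate, cov, size=n_samples)
rates = np.clip(rates, 0.0, None)               # no negative discharge growth

total = rates.sum(axis=1)
upper_bound_95 = np.quantile(total, 0.95)       # 5% exceedance probability
frac_non_b15 = rates[:, 1:].sum(axis=1).mean() / total.mean()
print(f"95th-percentile total: {upper_bound_95:.3f}")
print(f"mean non-B15 fraction: {frac_non_b15:.2f}")
```

Even this toy version shows the qualitative point: with correlated growth rates, the non-B15 basins contribute a non-negligible fraction of the upper bound.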

  19. Systems Science and Obesity Policy: A Novel Framework for Analyzing and Rethinking Population-Level Planning

    PubMed Central

    Matteson, Carrie L.; Finegood, Diane T.

    2014-01-01

    Objectives. We demonstrate the use of a systems-based framework to assess solutions to complex health problems such as obesity. Methods. We coded 12 documents published between 2004 and 2013 aimed at influencing obesity planning for complex systems design (9 reports from US and Canadian governmental or health authorities, 1 Cochrane review, and 2 Institute of Medicine reports). We sorted data using the intervention-level framework (ILF), a novel solutions-oriented approach to complex problems. An in-depth comparison of 3 documents provides further insight into complexity and systems design in obesity policy. Results. The majority of strategies focused mainly on changing the determinants of energy imbalance (food intake and physical activity). ILF analysis brings to the surface actions aimed at higher levels of system function and points to a need for more innovative policy design. Conclusions. Although many policymakers acknowledge obesity as a complex problem, many strategies stem from the paradigm of individual choice and are limited in scope. The ILF provides a template to encourage natural systems thinking and more strategic policy design grounded in complexity science. PMID:24832406

  20. A Framework for Spatial Assessment of Local Level Vulnerability and Adaptive Capacity to Extreme Heat

    NASA Astrophysics Data System (ADS)

    Wilhelmi, O.; Hayden, M.; Harlan, S.; Ruddell, D.; Komatsu, K.; England, B.; Uejio, C.

    2008-12-01

    Changing climate is predicted to increase the intensity and impacts of heat waves, prompting the need to develop preparedness and adaptation strategies that reduce societal vulnerability. Central to understanding societal vulnerability is adaptive capacity, the potential of a system or population to modify its features or behaviors so as to better cope with existing and anticipated stresses and fluctuations. Adaptive capacity influences adaptation, the actual adjustments made to cope with the impacts from current and future hazardous heat events. Understanding societal risks, vulnerabilities and adaptive capacity to extreme heat events and climate change requires an interdisciplinary approach that includes information about weather and climate, the natural and built environment, social processes and characteristics, interactions with the stakeholders, and an assessment of community vulnerability. This project presents a framework for an interdisciplinary approach and a case study that explores linkages between quantitative and qualitative data for a more comprehensive understanding of local-level vulnerability and adaptive capacity to extreme heat events in Phoenix, Arizona. In this talk, we will present a methodological framework for conducting collaborative research on societal vulnerability and adaptive capacity at a local level that includes the integration of household surveys into a quantitative spatial assessment of societal vulnerability. We highlight a collaborative partnership among researchers, community leaders and public health officials. Linkages between the assessment of local adaptive capacity and the development of regional climate change adaptation strategies will be discussed.

  1. A Bayesian framework for cell-level protein network analysis for multivariate proteomics image data

    NASA Astrophysics Data System (ADS)

    Kovacheva, Violet N.; Sirinukunwattana, Korsuk; Rajpoot, Nasir M.

    2014-03-01

    The recent development of multivariate imaging techniques, such as the Toponome Imaging System (TIS), has facilitated the analysis of multiple co-localisation of proteins. This could hold the key to understanding complex phenomena such as protein-protein interaction in cancer. In this paper, we propose a Bayesian framework for cell level network analysis allowing the identification of several protein pairs having significantly higher co-expression levels in cancerous tissue samples when compared to normal colon tissue. It involves segmenting the DAPI-labeled image into cells and determining the cell phenotypes according to their protein-protein dependence profile. The cells are phenotyped using Gaussian Bayesian hierarchical clustering (GBHC) after feature selection is performed. The phenotypes are then analysed using Difference in Sums of Weighted cO-dependence Profiles (DiSWOP), which detects differences in the co-expression patterns of protein pairs. We demonstrate that the pairs highlighted by the proposed framework have high concordance with recent results using a different phenotyping method. This demonstrates that the results are independent of the clustering method used. In addition, the highlighted protein pairs are further analysed via protein interaction pathway databases and by considering the localization of high protein-protein dependence within individual samples. This suggests that the proposed approach could identify potentially functional protein complexes active in cancer progression and cell differentiation.
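
The phenotyping step described above can be sketched as a clustering of per-cell co-expression profiles. The paper uses Gaussian Bayesian hierarchical clustering (GBHC); a plain 2-means loop stands in for it here, and the "cells" are synthetic feature vectors rather than TIS data.

```python
import numpy as np

# Cluster per-cell protein co-expression profiles into phenotypes.
rng = np.random.default_rng(0)
pheno_a = rng.normal(0.2, 0.05, size=(100, 5))   # synthetic phenotype A profiles
pheno_b = rng.normal(0.7, 0.05, size=(100, 5))   # synthetic phenotype B profiles
cells = np.vstack([pheno_a, pheno_b])

centers = np.stack([cells[0], cells[-1]])        # simple deterministic init
for _ in range(20):
    # Assign each cell to the nearest center, then update the centers.
    d = np.linalg.norm(cells[:, None, :] - centers[None], axis=2)
    labels = d.argmin(axis=1)
    centers = np.stack([cells[labels == k].mean(axis=0) for k in range(2)])
```

Downstream analyses such as DiSWOP would then compare co-expression patterns between the resulting phenotype labels.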

  2. Weld defect detection on digital radiographic image using level set method

    NASA Astrophysics Data System (ADS)

    Halim, Suhaila Abd; Petrus, Bertha Trissan; Ibrahim, Arsmah; Manurung, Yupiter HP; Jayes, Mohd Idris

    2013-09-01

    Segmentation is the most critical task and is widely used to obtain useful information in image processing. In this study, a level set method based on the Chan-Vese model is explored and applied to delineate weld defects on digital radiographic images, and its accuracy is evaluated to measure its performance. A set of images with a region of interest (ROI) that contains a defect is used as input. The ROI images are pre-processed to improve their quality for better detection. Then, the images are segmented using the level set method implemented in MATLAB R2009a. The accuracy of the method is evaluated using the Receiver Operating Characteristic (ROC). Experimental results show that the method generated an area underneath the ROC of 0.7 on the set of images, and the operational point reached corresponds to a sensitivity of 0.6 and a specificity of 0.8. Segmentation techniques such as the Chan-Vese level set can thus assist radiographers in accurately detecting defects on digital radiographic images.
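
A minimal piecewise-constant Chan-Vese iteration can be sketched as below: the two region means are re-estimated each step and the level set is pushed toward whichever mean fits each pixel better. The curvature (smoothing) term is omitted for brevity, and the "weld defect" image is a toy, not radiographic data.

```python
import numpy as np

def chan_vese(image, n_iter=200, dt=0.5, lam1=1.0, lam2=1.0):
    # Initialize the level set as a centered circle (negative inside).
    h, w = image.shape
    y, x = np.mgrid[:h, :w]
    phi = np.sqrt((x - w / 2) ** 2 + (y - h / 2) ** 2) - min(h, w) / 4
    for _ in range(n_iter):
        inside = phi <= 0
        c1 = image[inside].mean() if inside.any() else 0.0
        c2 = image[~inside].mean() if (~inside).any() else 0.0
        # Region competition: negative force where the pixel fits c1 better.
        force = lam1 * (image - c1) ** 2 - lam2 * (image - c2) ** 2
        phi += dt * force / (np.abs(force).max() + 1e-8)
    return phi <= 0

# Toy "defect": a bright square on a dark background.
img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0
mask = chan_vese(img)
```

The returned mask converges to the bright region regardless of where the initial circle sits, which is the property that makes region-based models attractive for low-contrast radiographs.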

  3. Analysis of Forensic Autopsy in 120 Cases of Medical Disputes Among Different Levels of Institutional Settings.

    PubMed

    Yu, Lin-Sheng; Ye, Guang-Hua; Fan, Yan-Yan; Li, Xing-Biao; Feng, Xiang-Ping; Han, Jun-Ge; Lin, Ke-Zhi; Deng, Miao-Wu; Li, Feng

    2015-09-01

    Despite advances in medical science, the causes of death can sometimes only be determined by pathologists after a complete autopsy. Few studies have investigated the importance of forensic autopsy in medically disputed cases among different levels of institutional settings. Our study aimed to analyze forensic autopsy in 120 cases of medical disputes among five levels of institutional settings between 2001 and 2012 in Wenzhou, China. The results showed an overall concordance rate of 55%. Of the 39% of clinically missed diagnoses, cardiovascular pathology comprises 55.32%, while respiratory pathology accounts for the remaining 44.68%. Factors that increase the likelihood of missed diagnoses were private clinics, community settings, and county hospitals. These results support that autopsy remains an important tool in establishing the cause of death in medically disputed cases, which may directly determine or exclude the fault of medical care and thereby help resolve these cases. © 2015 American Academy of Forensic Sciences.

  4. A Multilevel Conceptual Framework to Understand the Role of Food Insecurity on Antiretroviral Therapy Adherence in Low-Resource Settings: From Theory to Practice.

    PubMed

    Masa, Rainier; Chowa, Gina

    2017-04-03

    The objective of this study was to describe a multilevel conceptual framework to understand the role of food insecurity on antiretroviral therapy adherence. The authors illustrated an example of how they used the multilevel framework to develop an intervention for poor people living with HIV in a rural and low-resource community. The framework incorporates intrapersonal, interpersonal, and structural-level theories of understanding and changing health behaviors. The framework recognizes the role of personal, social, and environmental factors on cognition and behavior, with particular attention to ways in which treatment adherence is enabled or prevented by structural conditions, such as food insecurity.

  5. Pull-push level sets: a new term to encode prior knowledge for the segmentation of teeth images

    NASA Astrophysics Data System (ADS)

    de Luis Garcia, Rodrigo; San Jose Estepar, Raul; Alberola-Lopez, Carlos

    2005-04-01

    This paper presents a novel level set method for contour detection in multiple-object scenarios, applied to the segmentation of teeth images. Teeth segmentation from 2D images of dental plaster cast models is a difficult problem because it is necessary to independently segment several objects that have very badly defined borders between them. Current methods for contour detection that employ only image information cannot successfully segment such structures. Prior knowledge about the problem domain is therefore necessary; however, current approaches in the literature are limited to extracting shape information of individual objects, whereas the key factor in such a problem is the relative positions of the different objects composing the anatomical structure. We therefore propose a novel method for introducing such information into a level set framework. This results in a new energy term which can be explained as a regional term that takes into account the relative positions of the different objects, and consequently creates an attraction or repulsion force that favors a determined configuration. The proposed method is compared with balloon and GVF snakes, as well as with the Geodesic Active Regions model, showing accurate results.

  6. An investigation of children's levels of inquiry in an informal science setting

    NASA Astrophysics Data System (ADS)

    Clark-Thomas, Beth Anne

    Elementary school students' understanding of both science content and processes is enhanced by the higher-level thinking associated with inquiry-based science investigations. Informal science setting personnel, elementary school teachers, and curriculum specialists charged with designing inquiry-based investigations would be well served by an understanding of how certain factors influence students' willingness and ability to delve into such higher-level inquiries. This study examined young children's use of inquiry-based materials and the factors which may influence the level of inquiry they engaged in during informal science activities. An informal science setting was selected as the context for the examination of student inquiry behaviors because of the rich inquiry-based environment present at the site and the benefits previously noted in the research regarding the impact of informal science settings upon the construction of knowledge in science. The study revealed several patterns of behavior among children when they are engaged in inquiry-based activities at informal science exhibits. These repeated behaviors varied in the children's apparently purposeful use of the materials at the exhibits. These levels of inquiry behavior were taxonomically defined as high/medium/low within this study utilizing a researcher-developed tool. Furthermore, adult interventions, questions, or prompting were found to impact the level of inquiry engaged in by the children. This study revealed that higher levels of inquiry were preceded by task-directed and physical-feature prompts. Moreover, the levels of inquiry behaviors were halted, even lowered, when preceded by a prompt that focused on a science content or concept question. Results of this study have implications for the enhancement of inquiry-based science activities in elementary schools as well as in informal science settings. These findings have significance for all science educators.

  7. Segmentation of ventricles in Alzheimer mr images using anisotropic diffusion filtering and level set method.

    PubMed

    Anandh, K R; Sujatha, C M; Ramakrishnan, S

    2014-01-01

    Ventricle enlargement is a useful structural biomarker for the diagnosis of Alzheimer's Disease (AD). This devastating neurodegenerative disorder results in progression of dementia. Although AD results in the passive increment of ventricle volume, there exists a large overlap in the volume measurements of AD and normal subjects. Hence, shape-based analysis of ventricle dilation is appropriate to detect the subtle morphological changes between these two groups. In this work, segmentation of ventricles in Alzheimer MR images is performed using a level set method and anisotropic diffusion filtering. Images considered for this study are preprocessed using filters. Anisotropic diffusion filtering is employed to extract the edge map. This filtering performs region-specific smoothing using the diffusion coefficient as a function of the image gradient. Filtered images are subjected to a level set method which employs an improved diffusion rate equation for the level set evolution. Geometric features are extracted from the segmented ventricles. Results show that the diffusion filter could extract an edge map with sharp region boundaries. The modified level set method is able to capture the morphological changes in the ventricles. The observed morphological changes are distinct for normal and AD subjects (p < 0.0001). It is also observed that the ventricles in the AD subjects are noticeably enlarged when compared to normal subjects. Features obtained from the segmented ventricles are also clearly distinct and demonstrate the differences in the AD subjects. As ventricle volume and its morphometry are significant biomarkers, this study seems to be clinically relevant.
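
Anisotropic diffusion of the kind used to build edge maps can be sketched with the classic Perona-Malik scheme: an edge-stopping coefficient g(|∇u|) suppresses diffusion across strong gradients. The parameters and the noisy step-edge image below are illustrative, not those of the study.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=0.1, dt=0.2):
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Finite differences to the four neighbors (periodic via roll).
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # Edge-stopping coefficient g(|grad u|) = exp(-(|grad u| / kappa)^2):
        # smooths flat regions while preserving strong edges.
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += dt * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

# Noisy step edge: diffusion reduces noise but keeps the edge sharp.
rng = np.random.default_rng(1)
img = np.zeros((32, 32))
img[:, 16:] = 1.0
noisy = img + 0.05 * rng.standard_normal(img.shape)
smoothed = anisotropic_diffusion(noisy)
```

Because g is near zero across the step, the two plateaus are denoised independently while the boundary between them stays sharp, which is exactly what a subsequent level set stage needs from its edge map.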

  8. Total variation based edge enhancement for level set segmentation and asymmetry analysis in breast thermograms.

    PubMed

    Prabha, S; Anandh, K R; Sujatha, C M; Ramakrishnan, S

    2014-01-01

    In this work, an attempt has been made to perform asymmetry analysis in breast thermograms using non-linear total variation diffusion filter and reaction diffusion based level set method. Breast images used in this study are obtained from online database of the project PROENG. Initially the images are subjected to total variation (TV) diffusion filter to generate the edge map. Reaction diffusion based level set method is employed to segment the breast tissues using TV edge map as stopping boundary function. Asymmetry analysis is performed on the segmented breast tissues using wavelet based structural texture features. The results show that nonlinear total variation based reaction diffusion level set method could efficiently segment the breast tissues. This method yields high correlation between the segmented output and the ground truth than the conventional level set. Structural texture features extracted from the wavelet coefficients are found to be significant in demarcating normal and abnormal tissues. Hence, it appears that the asymmetry analysis on segmented breast tissues extracted using total variation edge map can be used efficiently to identify the pathological conditions of breast thermograms.

  9. 21 CFR 530.23 - Procedure for setting and announcing safe levels.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., and the availability, if any, of a specific analytical method or methods for drug residue detection... 21 Food and Drugs 6 2010-04-01 false Procedure for setting and announcing safe levels. 530.23 Section 530.23 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN...

  10. Physical Activity Levels in Coeducational and Single-Gender High School Physical Education Settings

    ERIC Educational Resources Information Center

    Hannon, James; Ratliffe, Thomas

    2005-01-01

    The purpose of this study was to investigate the effects of coeducational (coed) and single-gender game-play settings on the activity levels of Caucasian and African American high school physical education students. Students participated in flag football, ultimate Frisbee, and soccer units. Classes were as follows: there were two coed classes, two…

  11. Adaptive segmentation of magnetic resonance images with intensity inhomogeneity using level set method.

    PubMed

    Liu, Lixiong; Zhang, Qi; Wu, Min; Li, Wu; Shang, Fei

    2013-05-01

    It is a big challenge to segment magnetic resonance (MR) images with intensity inhomogeneity. The widely used segmentation algorithms are region based, mostly rely on intensity homogeneity, and can therefore produce inaccurate results. In this paper, we propose a novel region-based active contour model in a variational level set formulation. Based on the fact that intensities in a relatively small local region are separable, a local intensity clustering criterion function is defined. Then, the local function is integrated around the neighborhood center to formulate a global intensity criterion function, which defines the energy term to drive the evolution of the active contour locally. Simultaneously, an intensity fitting term that drives the motion of the active contour globally is added to the energy. In order to segment the image quickly and accurately, we utilize a coefficient to make the segmentation adaptive. Finally, the energy is incorporated into a level set formulation with a level set regularization term, and the energy minimization is conducted by a level set evolution process. Experiments on synthetic and real MR images show the effectiveness of our method. Copyright © 2013 Elsevier Inc. All rights reserved.
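
The local driving force can be sketched by computing region means inside a Gaussian window around every pixel, so the force adapts to a slowly varying bias field. This is a simplification of the variational formulation above; the bias model and image are synthetic, and the window is a plain `scipy.ndimage.gaussian_filter`.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def local_region_force(img, phi, sigma=4.0):
    # Local "inside" and "outside" means c1, c2 in a Gaussian neighborhood.
    inside = (phi <= 0).astype(float)
    eps = 1e-8
    c1 = gaussian_filter(img * inside, sigma) / (gaussian_filter(inside, sigma) + eps)
    c2 = gaussian_filter(img * (1 - inside), sigma) / (gaussian_filter(1 - inside, sigma) + eps)
    # Negative where the pixel fits the local inside mean better.
    return (img - c1) ** 2 - (img - c2) ** 2

# Toy image: a bright disk under a linear intensity bias (inhomogeneity).
y, x = np.mgrid[:64, :64]
bias = 0.5 + x / 128.0
img = bias * ((x - 32) ** 2 + (y - 32) ** 2 <= 100).astype(float) + 0.1 * bias
phi = np.sqrt((x - 32) ** 2 + (y - 32) ** 2) - 10.0
F = local_region_force(img, phi)
```

Because c1 and c2 are local, the force keeps the correct sign even though the absolute intensities drift across the image, which is the failure mode of purely global region models.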

  12. Knowledge Levels of Pre-Service Mathematics Teachers on the Concept of Set

    ERIC Educational Resources Information Center

    Sirmaci, Nur; Tas, Fatih

    2013-01-01

    This study aims to investigate knowledge levels of pre-service mathematics teachers about set. The study was carried out on a total of 196 students studying at the Department of Mathematics Teaching at Kazim Karabekir Faculty of Education in Ataturk University. Concept testing consisting of 7 open-ended questions aiming to analyze the knowledge of…

  13. A Cartesian Adaptive Level Set Method for Two-Phase Flows

    DTIC Science & Technology

    2003-12-01

    Cartesian adaptive level set method 237 REFERENCES CHANDRASEKHAR, S. 1961 Hydrodynamic and Hydromagnetic Stability. Oxford University Press. CHEN, G... Volume Method for Unsteady Incompressible Flow on Hybrid Unstructured Grids. J. Comput. Phys. 162, 411-428. LAMB, H. 1932 Hydrodynamics. Cambridge

  14. A fast level set method for synthetic aperture radar ocean image segmentation.

    PubMed

    Huang, Xiaoxia; Huang, Bo; Li, Hongga

    2009-01-01

    Segmentation of high-noise imagery like Synthetic Aperture Radar (SAR) images is still one of the most challenging tasks in image processing. While the level set, a novel approach based on the analysis of the motion of an interface, can be used to address this challenge, the cell-based iterations may make the process of image segmentation remarkably slow, especially for large images. For this reason fast level set algorithms such as narrow band and fast marching have been attempted. Built upon these, this paper presents an improved fast level set method for SAR ocean image segmentation. The method depends on both an intensity-driven speed and curvature flow, which result in a stable and smooth boundary. Notably, it is optimized to track moving interfaces, keeping up with the point-wise boundary propagation using a single list and a fast up-wind scheme iteration. The list facilitates efficient insertion and deletion of pixels on the propagation front. Meanwhile, the local up-wind scheme is used to update the motion of the curvature front instead of solving partial differential equations. Experiments have been carried out on the extraction of surface slick features from ERS-2 SAR images to substantiate the efficacy of the proposed fast level set method.
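
The fast-marching idea referenced above can be sketched as a Dijkstra-style sweep: arrival times propagate outward from seed pixels through a priority queue, each pixel frozen exactly once in "up-wind" order. This is the simple Dijkstra approximation on a 4-connected grid, not a full finite-difference Eikonal solver.

```python
import heapq
import numpy as np

def fast_march(speed, seeds):
    # Arrival time T satisfies (approximately) |grad T| * speed = 1.
    h, w = speed.shape
    T = np.full((h, w), np.inf)
    heap = []
    for (i, j) in seeds:
        T[i, j] = 0.0
        heapq.heappush(heap, (0.0, i, j))
    frozen = np.zeros((h, w), dtype=bool)
    while heap:
        t, i, j = heapq.heappop(heap)
        if frozen[i, j]:
            continue                      # stale queue entry
        frozen[i, j] = True               # each pixel is finalized once
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w and not frozen[ni, nj]:
                cand = t + 1.0 / speed[ni, nj]
                if cand < T[ni, nj]:
                    T[ni, nj] = cand
                    heapq.heappush(heap, (cand, ni, nj))
    return T

T = fast_march(np.ones((21, 21)), seeds=[(10, 10)])
```

With unit speed this reduces to a grid distance transform; a narrow-band level set scheme uses the same ordered update to confine work to pixels near the front.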

  15. Energy-optimal path planning by stochastic dynamically orthogonal level-set optimization

    NASA Astrophysics Data System (ADS)

    Subramani, Deepak N.; Lermusiaux, Pierre F. J.

    2016-04-01

    A stochastic optimization methodology is formulated for computing energy-optimal paths from among time-optimal paths of autonomous vehicles navigating in a dynamic flow field. Based on partial differential equations, the methodology rigorously leverages the level-set equation that governs time-optimal reachability fronts for a given relative vehicle-speed function. To set up the energy optimization, the relative vehicle-speed and headings are considered to be stochastic and new stochastic Dynamically Orthogonal (DO) level-set equations are derived. Their solution provides the distribution of time-optimal reachability fronts and corresponding distribution of time-optimal paths. An optimization is then performed on the vehicle's energy-time joint distribution to select the energy-optimal paths for each arrival time, among all stochastic time-optimal paths for that arrival time. Numerical schemes to solve the reduced stochastic DO level-set equations are obtained, and accuracy and efficiency considerations are discussed. These reduced equations are first shown to be efficient at solving the governing stochastic level-sets, in part by comparisons with direct Monte Carlo simulations. To validate the methodology and illustrate its accuracy, comparisons with semi-analytical energy-optimal path solutions are then completed. In particular, we consider the energy-optimal crossing of a canonical steady front and set up its semi-analytical solution using an energy-time nested nonlinear double-optimization scheme. We then showcase the inner workings and nuances of the energy-optimal path planning, considering different mission scenarios. Finally, we study and discuss results of energy-optimal missions in a wind-driven barotropic quasi-geostrophic double-gyre ocean circulation.

  16. Automatic Measurement of Thalamic Diameter in 2-D Fetal Ultrasound Brain Images Using Shape Prior Constrained Regularized Level Sets.

    PubMed

    Sridar, Pradeeba; Kumar, Ashnil; Li, Changyang; Woo, Joyce; Quinton, Ann; Benzie, Ron; Peek, Michael J; Feng, Dagan; Kumar, R Krishna; Nanan, Ralph; Kim, Jinman

    2017-07-01

    We derived an automated algorithm for accurately measuring the thalamic diameter from 2-D fetal ultrasound (US) brain images. The algorithm overcomes the inherent limitations of the US image modality: nonuniform density; missing boundaries; and strong speckle noise. We introduced a "guitar" structure that represents the negative space surrounding the thalamic regions. The guitar acts as a landmark for deriving the widest points of the thalamus even when its boundaries are not identifiable. We augmented a generalized level-set framework with a shape prior and constraints derived from statistical shape models of the guitars; this framework was used to segment US images and measure the thalamic diameter. Our segmentation method achieved a higher mean Dice similarity coefficient, Hausdorff distance, specificity, and reduced contour leakage when compared to other well-established methods. The automatic thalamic diameter measurement had an interobserver variability of -0.56 ± 2.29 mm compared to manual measurement by an expert sonographer. Our method was capable of automatically estimating the thalamic diameter, with the measurement accuracy on par with clinical assessment. Our method can be used as part of computer-assisted screening tools that automatically measure the biometrics of the fetal thalamus; these biometrics are linked to neurodevelopmental outcomes.

  17. Loosely coupled level sets for retinal layers and drusen segmentation in subjects with dry age-related macular degeneration

    NASA Astrophysics Data System (ADS)

    Novosel, Jelena; Wang, Ziyuan; de Jong, Henk; Vermeer, Koenraad A.; van Vliet, Lucas J.

    2016-03-01

    Optical coherence tomography (OCT) is used to produce high-resolution three-dimensional images of the retina, which permit the investigation of retinal irregularities. In dry age-related macular degeneration (AMD), a chronic eye disease that causes central vision loss, disruptions such as drusen and changes in retinal layer thicknesses occur which could be used as biomarkers for disease monitoring and diagnosis. Due to the topology disrupting pathology, existing segmentation methods often fail. Here, we present a solution for the segmentation of retinal layers in dry AMD subjects by extending our previously presented loosely coupled level sets framework which operates on attenuation coefficients. In eyes affected by AMD, Bruch's membrane becomes visible only below the drusen and our segmentation framework is adapted to delineate such a partially discernible interface. Furthermore, the initialization stage, which tentatively segments five interfaces, is modified to accommodate the appearance of drusen. This stage is based on Dijkstra's algorithm and combines prior knowledge on the shape of the interface, gradient and attenuation coefficient in the newly proposed cost function. This prior knowledge is incorporated by varying the weights for horizontal, diagonal and vertical edges. Finally, quantitative evaluation of the accuracy shows a good agreement between manual and automated segmentation.
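
The Dijkstra-based initialization described above can be sketched as a minimum-cost left-to-right path through a cost image, which for a column graph reduces to dynamic programming. The per-pixel cost would combine gradient and attenuation-coefficient terms in the real method; here it is a made-up cost image, and diagonal moves carry an extra penalty to mimic the varying edge weights.

```python
import numpy as np

def min_cost_path(cost, diag_penalty=0.1):
    # Accumulate the cheapest cost to reach each pixel, column by column.
    h, w = cost.shape
    acc = cost.copy()
    back = np.zeros((h, w), dtype=int)
    for j in range(1, w):
        for i in range(h):
            best_di, best = 0, acc[i, j - 1]
            for di in (-1, 1):                      # diagonal predecessors
                if 0 <= i + di < h and acc[i + di, j - 1] + diag_penalty < best:
                    best_di, best = di, acc[i + di, j - 1] + diag_penalty
            acc[i, j] += best
            back[i, j] = best_di
    # Trace the path back from the cheapest pixel in the last column.
    path = [int(np.argmin(acc[:, -1]))]
    for j in range(w - 1, 0, -1):
        path.append(path[-1] + back[path[-1], j])
    return path[::-1]

# Toy cost image with a cheap horizontal band at row 7 (the "interface").
cost = np.ones((16, 32))
cost[7, :] = 0.0
path = min_cost_path(cost)
```

Shape priors enter by reweighting the per-pixel cost and the move penalties; a partially discernible interface (e.g., Bruch's membrane below drusen) simply contributes high cost where it is invisible.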

  18. Scope of physician procedures independently billed by mid-level providers in the office setting.

    PubMed

    Coldiron, Brett; Ratnarathorn, Mondhipa

    2014-11-01

    Mid-level providers (nurse practitioners and physician assistants) were originally envisioned to provide primary care services in underserved areas. This study details the current scope of independent procedural billing to Medicare of difficult, invasive, and surgical procedures by medical mid-level providers. To understand the scope of independent billing to Medicare for procedures performed by mid-level providers in an outpatient office setting for a calendar year. Analyses of the 2012 Medicare Physician/Supplier Procedure Summary Master File, which reflects fee-for-service claims that were paid by Medicare, for Current Procedural Terminology procedures independently billed by mid-level providers. Outpatient office setting among health care providers. The scope of independent billing to Medicare for procedures performed by mid-level providers. In 2012, nurse practitioners and physician assistants billed independently for more than 4 million procedures at our cutoff of 5000 paid claims per procedure. Most (54.8%) of these procedures were performed in the specialty area of dermatology. The findings of this study are relevant to safety and quality of care. Recently, the shortage of primary care clinicians has prompted discussion of widening the scope of practice for mid-level providers. It would be prudent to temper widening the scope of practice of mid-level providers by recognizing that mid-level providers are not solely limited to primary care, and may involve procedures for which they may not have formal training.

  19. A predictive coding framework for rapid neural dynamics during sentence-level language comprehension.

    PubMed

    Lewis, Ashley G; Bastiaansen, Marcel

    2015-07-01

    There is a growing literature investigating the relationship between oscillatory neural dynamics measured using electroencephalography (EEG) and/or magnetoencephalography (MEG), and sentence-level language comprehension. Recent proposals have suggested a strong link between predictive coding accounts of the hierarchical flow of information in the brain, and oscillatory neural dynamics in the beta and gamma frequency ranges. We propose that findings relating beta and gamma oscillations to sentence-level language comprehension might be unified under such a predictive coding account. Our suggestion is that oscillatory activity in the beta frequency range may reflect both the active maintenance of the current network configuration responsible for representing the sentence-level meaning under construction, and the top-down propagation of predictions to hierarchically lower processing levels based on that representation. In addition, we suggest that oscillatory activity in the low and middle gamma range reflect the matching of top-down predictions with bottom-up linguistic input, while evoked high gamma might reflect the propagation of bottom-up prediction errors to higher levels of the processing hierarchy. We also discuss some of the implications of this predictive coding framework, and we outline ideas for how these might be tested experimentally. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Online monitoring of oil film using electrical capacitance tomography and level set method.

    PubMed

    Xue, Q; Sun, B Y; Cui, Z Q; Ma, M; Wang, H X

    2015-08-01

    In the application of oil-air lubrication systems, electrical capacitance tomography (ECT) provides a promising way to monitor the oil film in pipelines by reconstructing cross-sectional oil distributions in real time. However, in the case of a small-diameter pipe and a thin oil film, the film thickness is hard to observe visually since the interface of oil and air is not obvious in the reconstructed images. Moreover, artifacts in the reconstructions seriously degrade the effectiveness of image segmentation techniques such as the level set method. In addition, the standard level set method is unsuitable for online monitoring due to its low computation speed. To address these problems, a modified level set method is developed: a distance regularized level set evolution formulation is extended to image two-phase flow online using an ECT system, a narrowband image filter is defined to eliminate the influence of artifacts, and, exploiting the continuity of the oil distribution variation, the detected oil-air interface of a former image is used as the initial contour for the detection of the subsequent frame; thus, the propagation from the initial contour to the boundary can be greatly accelerated, making real-time tracking possible. To verify the feasibility of the proposed method, an oil-air lubrication facility with a 4 mm inner diameter pipe is measured in normal operation using an 8-electrode ECT system. Both simulation and experiment results indicate that the modified level set method is capable of visualizing the oil-air interface accurately online.
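
The warm-start idea, reusing the previous frame's contour as the next frame's initialization, can be sketched as below. The evolution is a bare-bones region-force update with clipping (a crude stand-in for a narrow band), not the distance-regularized model of the paper, and the drifting "oil film" frames are synthetic.

```python
import numpy as np

def evolve(phi, img, n_iter=30, dt=0.5):
    # Minimal two-phase region-competition evolution.
    for _ in range(n_iter):
        inside = phi <= 0
        c1 = img[inside].mean() if inside.any() else 0.0
        c2 = img[~inside].mean() if (~inside).any() else 1.0
        force = (img - c1) ** 2 - (img - c2) ** 2
        phi = np.clip(phi + dt * force / (np.abs(force).max() + 1e-8), -5, 5)
    return phi

def frame(cx):
    # Synthetic frame: a blob drifting to the right.
    y, x = np.mgrid[:32, :32]
    return ((x - cx) ** 2 + (y - 16) ** 2 <= 36).astype(float)

y, x = np.mgrid[:32, :32]
phi = np.sqrt((x - 8) ** 2 + (y - 16) ** 2) - 6.0   # contour init on frame 1
masks = []
for cx in (8, 10, 12, 14):                          # blob center per frame
    phi = evolve(phi, frame(cx))                    # warm start from previous phi
    masks.append(phi <= 0)
```

Because consecutive frames differ only slightly, each warm-started evolution needs far fewer iterations than re-initializing from scratch, which is what makes per-frame tracking feasible online.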

  1. A localized re-initialization equation for the conservative level set method

    NASA Astrophysics Data System (ADS)

    McCaslin, Jeremy O.; Desjardins, Olivier

    2014-04-01

    The conservative level set methodology for interface transport is modified to allow for localized level set re-initialization. This approach is suitable to applications in which there is a significant amount of spatial variability in level set transport. The steady-state solution of the modified re-initialization equation matches that of the original conservative level set provided an additional Eikonal equation is solved, which can be done efficiently through a fast marching method (FMM). Implemented within the context of the accurate conservative level set method (ACLS) (Desjardins et al., 2008, [6]), the FMM solution of this Eikonal equation comes at no additional cost. A metric for the appropriate amount of local re-initialization is proposed based on estimates of local flow deformation and numerical diffusion. The method is compared to standard global re-initialization for two test cases, yielding the expected results that minor differences are observed for Zalesak's disk, and improvements in both mass conservation and interface topology are seen for a drop deforming in a vortex. Finally, the method is applied to simulation of a viscously damped standing wave and a three-dimensional drop impacting on a shallow pool. Negligible differences are observed for the standing wave, as expected. For the last case, results suggest that spatially varying re-initialization provides a reduction in spurious interfacial corrugations, improvements in the prediction of radial growth of the splashing lamella, and a reduction in conservation errors, as well as a reduction in overall computational cost that comes from improved conditioning of the pressure Poisson equation due to the removal of spurious corrugations.
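
    For reference, the conservative level set (Olsson and Kreiss) replaces the signed distance function phi with a smeared Heaviside profile psi = 0.5*(tanh(phi/(2*eps)) + 1), which is what re-initialization restores after advection distorts it. A minimal construction from a binary mask might look like this (the helper names are ours, not the paper's):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def signed_distance(mask, dx=1.0):
    """Signed distance to the interface: positive inside, negative outside."""
    inside = distance_transform_edt(mask)
    outside = distance_transform_edt(~mask)
    return (inside - outside) * dx

def conservative_profile(phi, eps):
    """Hyperbolic tangent profile psi in [0, 1] with interface thickness ~eps."""
    return 0.5 * (np.tanh(phi / (2.0 * eps)) + 1.0)
```

    The paper's contribution is performing this restoration only where estimates of local flow deformation and numerical diffusion indicate it is needed, with the supporting Eikonal equation solved by fast marching.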

  2. A GPU Accelerated Discontinuous Galerkin Conservative Level Set Method for Simulating Atomization

    NASA Astrophysics Data System (ADS)

    Jibben, Zechariah J.

    This dissertation describes a process for interface capturing via an arbitrary-order, nearly quadrature-free, discontinuous Galerkin (DG) scheme for the conservative level set method (Olsson et al., 2005, 2008). The DG numerical method is utilized to solve both advection and reinitialization, and executed on a refined level set grid (Herrmann, 2008) for effective use of processing power. Computation is executed in parallel utilizing both CPU and GPU architectures to make the method feasible at high order. Finally, a sparse data structure is implemented to take full advantage of parallelism on the GPU, where performance relies on well-managed memory operations. With solution variables projected into a kth order polynomial basis, a k + 1 order convergence rate is found for both advection and reinitialization tests using the method of manufactured solutions. Other standard test cases, such as Zalesak's disk and deformation of columns and spheres in periodic vortices, are also performed, showing several orders of magnitude improvement over traditional WENO level set methods. These tests also show the impact of reinitialization, which often increases shape and volume errors as a result of level set scalar trapping by normal vectors calculated from the local level set field. Accelerating advection via GPU hardware is found to provide a 30x speedup for an Nvidia Tesla K20 GPU over a 2.0 GHz Intel Xeon E5-2620 CPU running in serial, with speedup factors increasing with polynomial degree until shared memory is filled. A similar algorithm is implemented for reinitialization, which relies on heavier use of shared and global memory and as a result fills them more quickly and produces smaller speedups of 18x.
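
    The k + 1 convergence rates quoted above are typically established with the method of manufactured solutions: given errors on two grids related by a refinement ratio r, the observed order is log(e_coarse/e_fine)/log(r). A tiny helper (illustrative, not taken from the dissertation):

```python
import math

def observed_order(err_coarse, err_fine, ratio=2.0):
    """Observed order of accuracy from errors on two grids (spacing ratio r)."""
    return math.log(err_coarse / err_fine) / math.log(ratio)
```

    For example, halving the grid spacing and seeing the error drop by a factor of 8 indicates third-order (k = 2) behavior.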

  4. A patient-centered pharmacy services model of HIV patient care in community pharmacy settings: a theoretical and empirical framework.

    PubMed

    Kibicho, Jennifer; Owczarzak, Jill

    2012-01-01

    Reflecting trends in health care delivery, pharmacy practice has shifted from a drug-specific to a patient-centered model of care, aimed at improving the quality of patient care and reducing health care costs. In this article, we outline a theoretical model of patient-centered pharmacy services (PCPS), based on in-depth, qualitative interviews with a purposive sample of 28 pharmacists providing care to HIV-infected patients in specialty, semispecialty, and nonspecialty pharmacy settings. Data analysis was an interactive process informed by pharmacists' interviews and a review of the general literature on patient-centered care, including Medication Therapy Management (MTM) services. Our main finding was that the current models of pharmacy services, including MTM, do not capture the range of pharmacy services provided in excess of mandated drug dispensing services. In this article, we propose a theoretical PCPS model that reflects the actual services pharmacists provide. The model includes five elements: (1) addressing patients as whole, contextualized persons; (2) customizing interventions to unique patient circumstances; (3) empowering patients to take responsibility for their own health care; (4) collaborating with clinical and nonclinical providers to address patient needs; and (5) developing sustained relationships with patients. The overarching goal of PCPS is to empower patients to take responsibility for their own health care and self-manage their HIV infection. Our findings provide the foundation for future studies regarding how widespread these practices are in diverse community settings, the validity of the proposed PCPS model, the potential for standardizing pharmacist practices, and the feasibility of a PCPS framework to reimburse pharmacists' services.

  5. Integrated SFM Techniques Using Data Set from Google Earth 3d Model and from Street Level

    NASA Astrophysics Data System (ADS)

    Inzerillo, L.

    2017-08-01

    Structure from motion (SfM) is a widespread photogrammetric method that applies photogrammetric principles to build a 3D model from a collection of photographs. For some complex historic buildings, such as cathedrals, theatres, or castles, the data set acquired at street level must be supplemented with a UAV data set in order to reconstruct the roof in 3D. However, the use of UAVs is strongly limited by government regulations. In recent years, Google Earth (GE) has been enriched with 3D models of sites around the world. It therefore seemed worthwhile to test the potential offered by GE as a substitute for UAV acquisition, completing the aerial building data set with high-resolution screen images of its 3D models. Users can take unlimited "aerial photos" of a scene while flying around in GE at any viewing angle and altitude. The challenge is to verify the metric reliability of an SfM model built from an integrated data set (street-level images plus GE images), aimed at replacing UAV surveys in urban contexts. This model is called the integrated GE SfM model (i-GESfM). This paper presents a case study: the Cathedral of Palermo.

  6. Level-Set Minimization of Potential Controlled Hadwiger Valuations for Molecular Solvation

    PubMed Central

    Cheng, Li-Tien; Li, Bo; Wang, Zhongming

    2012-01-01

    A level-set method is developed for the numerical minimization of a class of Hadwiger valuations with a potential on a set of three-dimensional bodies. Such valuations are linear combinations of the volume, surface area, and surface integral of mean curvature. The potential increases rapidly as the body shrinks beyond a critical size. The combination of the Hadwiger valuation and the potential is the mean-field free-energy functional of the solvation of non-polar molecules in the recently developed variational implicit-solvent model. This functional of surfaces is minimized by the level-set evolution in the steepest descent of the free energy. The normal velocity of this surface evolution consists of both the mean and Gaussian curvatures, and a lower-order, “forcing” term arising from the potential. The forward Euler method is used to discretize the time derivative with a dynamic time stepping that satisfies a CFL condition. The normal velocity is decomposed into two parts. The first part consists of both the mean and Gaussian curvature terms. It is of parabolic type with parameter correction, and is discretized by central differencing. The second part has all the lower-order terms. It is of hyperbolic type, and is discretized by an upwinding scheme. New techniques of local level-set method and numerical integration are developed. Numerical tests demonstrate a second-order convergence of the method. Examples of application to the modeling of molecular solvation are presented. PMID:22323839
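
    The dynamic time stepping described above must respect both a hyperbolic CFL bound (dt proportional to dx) for the lower-order forcing terms and a parabolic bound (dt proportional to dx^2) for the curvature terms. A generic sketch of such a combined bound (the safety coefficients are illustrative, not the paper's):

```python
def cfl_timestep(dx, v_max, diff_max, c_hyp=0.5, c_par=0.25):
    """Forward-Euler step size limited by hyperbolic and parabolic stability.

    v_max    -- largest magnitude of the lower-order (advective/forcing) velocity
    diff_max -- largest effective diffusion coefficient from the curvature terms
    """
    dt_hyp = c_hyp * dx / v_max if v_max > 0 else float("inf")
    dt_par = c_par * dx * dx / diff_max if diff_max > 0 else float("inf")
    return min(dt_hyp, dt_par)
```

    Because the parabolic bound scales like dx^2, the curvature part dominates the step-size restriction on fine grids, which is why the two parts of the velocity are discretized differently.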

  7. Epifluorescence-based Quantitative Microvasculature Remodeling Using Geodesic Level-Sets and Shape-based Evolution

    PubMed Central

    Bunyak, F.; Palaniappan, K.; Glinskii, O.; Glinskii, V.; Glinsky, V.; Huxley, V.

    2009-01-01

    Accurate vessel segmentation is the first step in analysis of microvascular networks for reliable feature extraction and quantitative characterization. Segmentation of epifluorescent imagery of microvasculature presents a unique set of challenges and opportunities compared to traditional angiogram-based vessel imagery. This paper presents a novel system that combines methods from mathematical morphology, differential geometry, and active contours to reliably detect and segment microvasculature under varying background fluorescence conditions. The system consists of three main modules: vessel enhancement, shape-based initialization, and level-set based segmentation. Vessel enhancement deals with image noise and uneven background fluorescence using anisotropic diffusion and mathematical morphology techniques. Shape-based initialization uses features from the second-order derivatives of the enhanced vessel image and produces a coarse ridge (vessel) mask. Geodesic level-set based active contours refine the coarse ridge map and fix possible discontinuities or leakage of the level set contours that may arise from complex topology or high background fluorescence. The proposed system is tested on epifluorescence-based high resolution images of porcine dura mater microvasculature. Preliminary experiments show promising results. PMID:19163371
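
    The shape-based initialization step, using features from second-order derivatives to produce a coarse ridge mask, can be sketched with the eigenvalues of the Hessian of a Gaussian-smoothed image (a generic vesselness-style sketch, not the authors' exact operator; scale and threshold are illustrative):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def coarse_ridge_mask(img, sigma=2.0, thresh=-0.01):
    """Bright tubular structures have a strongly negative principal Hessian
    eigenvalue across the vessel; keep pixels where it falls below thresh."""
    f = img.astype(float)
    hrr = gaussian_filter(f, sigma, order=(2, 0))  # d^2/dr^2 at scale sigma
    hcc = gaussian_filter(f, sigma, order=(0, 2))  # d^2/dc^2
    hrc = gaussian_filter(f, sigma, order=(1, 1))  # mixed derivative
    # eigenvalues of the 2x2 symmetric Hessian; lam_min is the across-ridge one
    root = np.sqrt(((hrr - hcc) / 2.0) ** 2 + hrc ** 2)
    lam_min = (hrr + hcc) / 2.0 - root
    return lam_min < thresh
```

    In the full system this coarse mask only seeds the geodesic level set stage, which then repairs discontinuities and leakage that a purely local ridge detector cannot.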

  8. An innovative hydrogeologic setting for disposal of low-level radioactive wastes

    NASA Astrophysics Data System (ADS)

    Legrand, Harry E.

    1989-05-01

    A natural unique hydrogeological setting favorable for safe and economical disposal of low-level radioactive wastes occurs in the flat hinterland of southeastern North Carolina. The uniqueness results partly from the absence of vertical and horizontal groundwater gradients, representing a nonflow, or null, zone. The null setting is localized to key horizons 30 to 75 feet below land surface and to areas where glauconitic sandy clays of the Peedee Formation lie under less than 25 feet of surficial sandy clays; the Peedee contains nearly stagnant brackish groundwater slightly below the proposed disposal zone. Issues to overcome include: (1) demonstrating better combined safety and economical features over conventional and prescribed settings, (2) dewatering the low-permeability disposal zone for the 20-year operational period, and (3) changing rules to allow disposal slightly below the zone in which the normal water table occurs. Favorable site characteristics of the key setting are: (1) no major aquifer to contaminate, (2) no surface streams or lakes to contaminate, (3) optimal ion exchange and sorptive capacity (clay and glauconite pellets), (4) no appreciable or distinctive vertical and horizontal gradients, (5) no elongated contaminated plume to develop, (6) no surface erosion, (7) a capable setting for injection of potential contaminated water into deep brackish water wells, if needed and allowed, (8) minimum problems of the “overfilled bathtub effect,” (9) no apparent long-term harmful environmental impact (normal water table would be restored after the 20-year period), (10) relatively inexpensive disposal (engineered barriers not needed and desired), (11) simple and relatively inexpensive monitoring, (12) large tracts of land likely available, and (13) sparse population. In spite of legal and political obstacles to shallow land burial, the null setting described is a capable hydrogeological host to contain low-level radioactive wastes. The setting may have

  9. Minimum data set to measure rehabilitation needs and health outcome after major trauma: application of an international framework.

    PubMed

    Hoffman, Karen P; Playford, Diane E; Grill, Eva; Soberg, Helene L; Brohi, Karim

    2016-06-01

    Measurement of long-term health outcome after trauma remains non-standardized and ambiguous, which limits national and international comparison of the burden of injuries. The World Health Organization (WHO) has recommended the application of the International Classification of Functioning, Disability and Health (ICF) to measure rehabilitation and health outcome worldwide. No previous poly-trauma studies have applied the ICF comprehensively to evaluate outcome after injury. The aim was to apply the ICF categorization in patients with traumatic injuries to identify a minimum data set of important rehabilitation and health outcomes, enabling national and international comparison of outcome data. A mixed-methods design of patient interviews and an online survey was used at an ethnically diverse urban major trauma center in London, with adult patients with major traumatic injuries (poly-trauma) and international health care professionals (HCPs) working in acute and post-acute major trauma settings. The mixed methods investigated patients' and HCPs' perspectives on important rehabilitation and health outcomes. Qualitative patient data and quantitative HCP data were linked to ICF categories. Combined data were refined to identify a minimum data set of important rehabilitation and health outcome categories. Transcribed patient interview data (N=32) were linked to 234 (64%) second-level ICF categories. Two hundred and fourteen HCPs identified 121 of a possible 140 second-level ICF categories (86%) as relevant and important. Patients and HCPs strongly agreed on ICF body structures and body functions categories, which include temperament, energy and drive, memory, emotions, pain, and repair function of the skin. Conversely, patients prioritized domestic tasks, recreation, and work, compared to HCP priorities of self-care and mobility. Twenty-six environmental factors were identified. Patient and HCP data were refined to recommend 109 possible ICF categories for a minimum data set. The

  10. A Study on the Security Levels of Spread-Spectrum Embedding Schemes in the WOA Framework.

    PubMed

    Wang, Yuan-Gen; Zhu, Guopu; Kwong, Sam; Shi, Yun-Qing

    2017-08-23

    Security analysis is a very important issue for digital watermarking. Several years ago, according to Kerckhoffs' principle, the famous four security levels, namely insecurity, key security, subspace security, and stego-security, were defined for spread-spectrum (SS) embedding schemes in the framework of watermarked-only attack. However, up to now there has been little application of the definition of these security levels to the theoretical analysis of the security of SS embedding schemes, due to the difficulty of the theoretical analysis. In this paper, based on the security definition, we present a theoretical analysis to evaluate the security levels of five typical SS embedding schemes: the classical SS, the improved SS (ISS), the circular extension of ISS, and the nonrobust and robust natural watermarking. The theoretical analyses of these typical SS schemes are successfully performed by taking advantage of the convolution of probability distributions to derive the probabilistic models of watermarked signals. Moreover, simulations are conducted to illustrate and validate our theoretical analysis. We believe that the theoretical and practical analysis presented in this paper can bridge the gap between the definition of the four security levels and its application to the theoretical analysis of SS embedding schemes.
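
    As background, the classical SS scheme analyzed in the paper embeds a bit b in {-1, +1} additively along a secret carrier, y = x + alpha*b*u, and detects by correlating the received signal with u. A minimal sketch (variable names are ours):

```python
import numpy as np

def ss_embed(host, bit, carrier, alpha=0.5):
    """Classical additive spread-spectrum embedding: y = x + alpha * b * u."""
    b = 1.0 if bit else -1.0
    return host + alpha * b * carrier

def ss_detect(watermarked, carrier):
    """Correlation detector: the sign of <y, u> recovers b whenever
    alpha * ||u||^2 dominates the host interference <x, u>."""
    return float(watermarked @ carrier) > 0.0
```

    The host-interference term <x, u> is exactly what the improved SS (ISS) variant suppresses, which is one reason the different schemes end up at different security levels under the watermarked-only attack.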

  11. Implementation of E.U. Water Framework Directive: source assessment of metallic substances at catchment levels.

    PubMed

    Chon, Ho-Sik; Ohandja, Dieudonne-Guy; Voulvoulis, Nikolaos

    2010-01-01

    The E.U. Water Framework Directive (WFD) aims to prevent deterioration of water quality and to phase out or reduce the concentrations of priority substances at catchment levels. It requires changes in water management from a local scale to a river basin scale, and establishes Environmental Quality Standards (EQS) as a guideline for the chemical status of receiving waters. Under the Directive, the standards and the scope of investigation for water management are more stringent and broader than in the past, and this change also applies to restoring the levels of metals in water bodies. The aim of this study was to identify anthropogenic emission sources of metallic substances at catchment levels. Potential sources providing substantial amounts of such substances to receiving waters included stormwater, industrial effluents, treated effluents, agricultural drainage, sediments, mining drainage, and landfill leachates. Metallic substances have more emission sources than other dangerous substances at catchment levels. Source assessment for these substances therefore deserves particular attention in restoring chemical status in the context of the WFD. To improve the quality of source assessment, research is needed on the role of societal and environmental parameters and on the contribution of each source to the chemical distribution in receiving waters.

  12. Street Level Hydrology: An Urban Application of the WRF-Hydro Framework in Denver, Colorado

    NASA Astrophysics Data System (ADS)

    Read, L.; Hogue, T. S.; Salas, F. R.; Gochis, D.

    2015-12-01

    Urban flood modeling at the watershed scale carries unique challenges in routing complexity, data resolution, social and political issues, and land surface - infrastructure interactions. The ability to accurately trace and predict the flow of water through the urban landscape enables better emergency response management, floodplain mapping, and data for future urban infrastructure planning and development. These services are of growing importance as urban populations are expected to keep growing by 1.84% per year for the next 25 years, raising the vulnerability of urban regions to damages and loss of life from floods. Although a range of watershed-scale models have been applied in specific urban areas to examine these issues, there is a trend towards national-scale hydrologic modeling, enabled by supercomputing resources, to understand larger system-wide hydrologic impacts and feedbacks. As such, it is important to address how urban landscapes can be represented in large-scale modeling processes. The current project investigates how coupling terrain and infrastructure routing can improve prediction of flow and flooding events over the urban landscape. We utilize the WRF-Hydro modeling framework and a high-resolution terrain routing grid with the goal of compiling the standard data needs for fine-scale urban modeling and dynamic flood forecasting in the urban setting. The city of Denver is selected as a case study, as it has experienced several large flooding events in the last five years and has an annual urban population growth rate of 1.5%, one of the highest in the U.S. Our work highlights the hydro-informatic challenges associated with linking channel networks and drainage infrastructure in an urban area using the WRF-Hydro modeling framework and high-resolution urban models for short-term flood prediction.

  13. Framework for Leadership and Training of Biosafety Level 4 Laboratory Workers

    PubMed Central

    Anderson, Kevin; Bloom, Marshall E.; Estep, James E.; Feldmann, Heinz; Geisbert, Joan B.; Geisbert, Thomas W.; Hensley, Lisa; Holbrook, Michael; Jahrling, Peter B.; Ksiazek, Thomas G.; Korch, George; Patterson, Jean; Skvorak, John P.; Weingartl, Hana

    2008-01-01

    Construction of several new Biosafety Level 4 (BSL-4) laboratories and expansion of existing operations have created an increased international demand for well-trained staff and facility leaders. Directors of most North American BSL-4 laboratories met and agreed upon a framework for leadership and training of biocontainment research and operations staff. They agreed on essential preparation and training that includes theoretical consideration of biocontainment principles, practical hands-on training, and mentored on-the-job experiences relevant to positional responsibilities as essential preparation before a person’s independent access to a BSL-4 facility. They also agreed that the BSL-4 laboratory director is the key person most responsible for ensuring that staff members are appropriately prepared for BSL-4 operations. Although standardized certification of training does not formally exist, the directors agreed that facility-specific, time-limited documentation to recognize specific skills and experiences of trained persons is needed. PMID:18976549

  14. Framework for leadership and training of Biosafety Level 4 laboratory workers.

    PubMed

    Le Duc, James W; Anderson, Kevin; Bloom, Marshall E; Estep, James E; Feldmann, Heinz; Geisbert, Joan B; Geisbert, Thomas W; Hensley, Lisa; Holbrook, Michael; Jahrling, Peter B; Ksiazek, Thomas G; Korch, George; Patterson, Jean; Skvorak, John P; Weingartl, Hana

    2008-11-01

    Construction of several new Biosafety Level 4 (BSL-4) laboratories and expansion of existing operations have created an increased international demand for well-trained staff and facility leaders. Directors of most North American BSL-4 laboratories met and agreed upon a framework for leadership and training of biocontainment research and operations staff. They agreed on essential preparation and training that includes theoretical consideration of biocontainment principles, practical hands-on training, and mentored on-the-job experiences relevant to positional responsibilities as essential preparation before a person's independent access to a BSL-4 facility. They also agreed that the BSL-4 laboratory director is the key person most responsible for ensuring that staff members are appropriately prepared for BSL-4 operations. Although standardized certification of training does not formally exist, the directors agreed that facility-specific, time-limited documentation to recognize specific skills and experiences of trained persons is needed.

  15. Providing conceptual framework support for distributed Web-based simulation within the high-level architecture

    NASA Astrophysics Data System (ADS)

    Page, Ernest H.; Griffin, Sean P.; Rother, S. L.

    1998-08-01

    Web-based simulation, a subject of increasing interest to both simulation researchers and practitioners, has the potential to significantly influence the application and availability of simulation as a problem-solving technique. Web technologies also portend cost-effective distributed modeling and simulation. These applications will require solutions to the systems interoperability problem similar to the DoD High Level Architecture (HLA). The suitability of the HLA to serve 'mainstream' simulation is examined. Approaches for incorporating discrete event simulation conceptual frameworks within the HLA are described and ongoing research in this area noted. Issues raised include a discussion of the appropriate roles for a simulation-support language and a simulation-support architecture.

  16. Assessing the Macro-Level Correlates of Malware Infections Using a Routine Activities Framework.

    PubMed

    Holt, Thomas J; Burruss, George W; Bossler, Adam M

    2016-12-01

    The ability to gain unauthorized access to computer systems to engage in espionage and data theft poses a massive threat to individuals worldwide. There has been minimal focus, however, on the role of malicious software, or malware, which can automate this process. This study examined the macro-correlates of malware infection at the national level by using an open repository of known malware infections and utilizing a routine activities framework. Negative inflated binomial models for counts indicated that nations with greater technological infrastructure, more political freedoms, and less organized crime financial impact were more likely to report malware infections. The number of Computer Emergency Response Teams (CERTs) in a nation was not significantly related to reported malware infection. The implications of the study for the understanding of malware infection, routine activity theory, and target-hardening strategies are discussed.
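
    The count regressions referred to above can be illustrated with a plain NB2 negative binomial model fitted by maximum likelihood (a generic sketch, not the authors' exact specification; the data, covariates, and parameter values below are simulated/invented):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def nb2_negloglik(params, y, X):
    """Negative log-likelihood of an NB2 regression: mu = exp(X @ beta),
    dispersion alpha = exp(log_alpha), variance mu + alpha * mu**2."""
    beta, log_alpha = params[:-1], params[-1]
    size = np.exp(-log_alpha)          # size = 1 / alpha
    mu = np.exp(X @ beta)
    ll = (gammaln(y + size) - gammaln(size) - gammaln(y + 1.0)
          + size * np.log(size / (size + mu))
          + y * np.log(mu / (size + mu)))
    return -ll.sum()

def fit_nb2(y, X):
    """Maximum-likelihood fit; returns (beta..., log_alpha)."""
    p0 = np.zeros(X.shape[1] + 1)
    p0[0] = np.log(y.mean() + 1e-8)    # start the intercept at the sample mean
    res = minimize(nb2_negloglik, p0, args=(y, X),
                   method="Nelder-Mead", options={"maxiter": 5000})
    return res.x
```

    The negative binomial family is the usual choice for overdispersed infection counts across nations, since a Poisson model would understate the variance.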

  17. Not Your Basic Base Levels: Simulations of Erosion and Deposition With Fluctuating Water Levels in Coastal and Enclosed Basin Settings

    NASA Astrophysics Data System (ADS)

    Howard, A. D.; Matsubara, Y.; Lloyd, H.

    2006-12-01

    The DELIM landform evolution model has been adapted to investigate erosional and depositional landforms in two settings with fluctuating base levels. The first is erosion and wave planation of terraced landscapes in Coastal Plain sediments along the estuarine Potomac River. The last 3.5 million years of erosion are simulated with base level fluctuations based upon the long-term oceanic delta 18O record, eustatic sea level changes during the last 120 ka, estimates of the history of tectonic uplift in the region, and maximum depths of incision of the Potomac River during sea-level lowstands. Inhibition of runoff erosion by vegetation has been a crucial factor allowing persistence of uplands in the soft coastal plain bedrock. The role of vegetation is simulated as a contributing area-dependent critical shear stress. Development of wave-cut terraces is simulated by episodic planation of the landscape during base-level highstands. Although low base level excursions are infrequent and of short duration, the total amount of erosion is largely controlled by the depth and frequency of lowstands. The model has also been adapted to account for flow routing and accompanying erosion and sedimentation in landscapes with multiple enclosed depressions. The hydrological portion of the model has been calibrated and tested in the Great Basin and Mojave regions of the southwestern U.S. In such a setting, runoff, largely from mountains, may flow through several lacustrine basins, each with evaporative losses. An iterative approach determines the size and depth of lakes, including overflow (or not), that balances runoff and evaporation. The model utilizes information on temperatures, rainfall, runoff, and evaporation within the region to parameterize evaporation and runoff as functions of latitude, mean annual temperature, precipitation, and elevation. The model is successful in predicting the location of modern perennial lakes in the region as well as that of lakes during the last
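
    The iterative lake-balance idea, adjusting each lake's size until runoff matches evaporation plus any overflow, can be sketched as a fixed-point iteration (the hypsometry function, units, and names here are invented for illustration, not DELIM's actual scheme):

```python
def steady_lake(inflow, evap_rate, area_of, v_spill, v0=0.0, n_iter=500):
    """Iterate lake volume until evaporation (plus spill) balances inflow.

    inflow    -- water delivered per step (e.g. routed mountain runoff)
    evap_rate -- evaporation depth per step
    area_of   -- hypsometry: maps stored volume to lake surface area
    v_spill   -- storage at the basin sill; excess overflows downstream
    """
    v = v0
    for _ in range(n_iter):
        v = max(0.0, v + inflow - evap_rate * area_of(v))
        v = min(v, v_spill)
    overflow = max(0.0, inflow - evap_rate * area_of(v)) if v >= v_spill else 0.0
    return v, overflow
```

    A closed lake settles where evaporation consumes all inflow; once storage reaches the sill, the surplus becomes inflow to the next basin downstream, which is how runoff can route through a chain of lacustrine basins.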

  18. Comparison of bladder segmentation using deep-learning convolutional neural network with and without level sets

    NASA Astrophysics Data System (ADS)

    Cha, Kenny H.; Hadjiiski, Lubomir M.; Samala, Ravi K.; Chan, Heang-Ping; Cohan, Richard H.; Caoili, Elaine M.

    2016-03-01

    We are developing a CAD system for detection of bladder cancer in CTU. In this study we investigated the application of a deep-learning convolutional neural network (DL-CNN) to the segmentation of the bladder, which is a challenging problem because of the strong boundary between the non-contrast and contrast-filled regions in the bladder. We trained a DL-CNN to estimate the likelihood of a pixel being inside the bladder using neighborhood information. The segmented bladder was obtained from thresholding and hole-filling of the likelihood map. We compared the segmentation performance of the DL-CNN alone and with additional cascaded 3D and 2D level sets to refine the segmentation, using 3D hand-segmented contours as the reference standard. The segmentation accuracy was evaluated by five performance measures: average volume intersection %, average % volume error, average absolute % error, average minimum distance, and average Jaccard index, for a data set of 81 training and 92 test cases. For the training set, DL-CNN with level sets achieved performance measures of 87.2+/-6.1%, 6.0+/-9.1%, 8.7+/-6.1%, 3.0+/-1.2 mm, and 81.9+/-7.6%, respectively, while the DL-CNN alone obtained values of 73.6+/-8.5%, 23.0+/-8.5%, 23.0+/-8.5%, 5.1+/-1.5 mm, and 71.5+/-9.2%, respectively. For the test set, the DL-CNN with level sets achieved performance measures of 81.9+/-12.1%, 10.2+/-16.2%, 14.0+/-13.0%, 3.6+/-2.0 mm, and 76.2+/-11.8%, respectively, while DL-CNN alone obtained 68.7+/-12.0%, 27.2+/-13.7%, 27.4+/-13.6%, 5.7+/-2.2 mm, and 66.2+/-11.8%, respectively. DL-CNN alone is effective in segmenting bladders but may not follow the details of the bladder wall. The combination of DL-CNN with level sets provides highly accurate bladder segmentation.
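
    The post-processing described, thresholding the likelihood map and hole-filling, together with the Jaccard index used for evaluation, can be sketched as follows (keeping only the largest connected component is our assumption, not stated in the abstract):

```python
import numpy as np
from scipy.ndimage import binary_fill_holes, label

def mask_from_likelihood(likelihood, thresh=0.5):
    """Threshold the per-pixel likelihood, fill interior holes, and keep the
    largest connected component as the organ mask."""
    mask = binary_fill_holes(likelihood >= thresh)
    labeled, n = label(mask)
    if n == 0:
        return mask
    sizes = np.bincount(labeled.ravel())[1:]   # component sizes, background dropped
    return labeled == (1 + int(sizes.argmax()))

def jaccard(a, b):
    """Jaccard index |A intersect B| / |A union B| of two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    return (a & b).sum() / (a | b).sum()
```

    Hole-filling matters here precisely because the non-contrast region inside the bladder can fall below threshold, leaving an interior cavity in the raw mask.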

  19. Can we decide which outcomes should be measured in every clinical trial? A scoping review of the existing conceptual frameworks and processes to develop core outcome sets.

    PubMed

    Idzerda, Leanne; Rader, Tamara; Tugwell, Peter; Boers, Maarten

    2014-05-01

    The usefulness of randomized controlled trials to advance clinical care depends upon the outcomes reported, but disagreement on the choice of outcome measures has resulted in inconsistency and the potential for reporting bias. One solution to this problem is the development of a core outcome set: a minimum set of outcome measures deemed critical for clinical decision making. Within rheumatology, the Outcome Measures in Rheumatology (OMERACT) initiative has pioneered the development of core outcome sets since 1992. As the number of diseases addressed by OMERACT has increased and its experience in formulating core sets has grown, clarification and update of the conceptual framework and formulation of a more explicit process of area/domain core set development have become necessary. As part of the update process of the OMERACT Filter criteria to version 2, a literature review was undertaken to compare and contrast the OMERACT conceptual framework with others within and outside rheumatology. A scoping search was undertaken to examine the extent, range, and nature of conceptual frameworks for core set outcome selection in health. We searched the following resources: Cochrane Library Methods Group Register; Medline; Embase; PsycInfo; Environmental Studies and Policy Collection; and ABI/INFORM Global. We also conducted a targeted Google search. Five conceptual frameworks were identified: the WHO tripartite definition of health; the 5 Ds (discomfort, disability, drug toxicity, dollar cost, and death); the International Classification of Functioning (ICF); PROMIS (Patient-Reported Outcomes Measurement System); and the Outcomes Hierarchy. Of these, only the 5 Ds and ICF frameworks have been systematically applied in core set development. Outside the area of rheumatology, several core sets were identified; these had been developed through a limited range of consensus-based methods with varying degrees of methodological rigor. None applied a framework to ensure content validity of

  20. Therapeutic and diagnostic set for irradiation of cell lines in low level laser therapy

    NASA Astrophysics Data System (ADS)

    Gryko, Lukasz; Zajac, Andrzej; Gilewski, Marian; Szymanska, Justyna; Goralczyk, Krzysztof

    2014-05-01

    This paper presents an optoelectronic diagnostic set for standardizing the biostimulation procedures performed on cell lines. The basic functional components of the therapeutic set are two digitally controlled illuminators. They are composed of sets of semiconductor emitters - medium-power laser diodes and high-power LEDs emitting radiation across a wide spectral range from 600 nm to 1000 nm. The emitters are coupled with the applicator by fibre optic and optical systems that provide uniform irradiation of the vessel holding the cell culture samples. An integrated spectrometer and optical power meter allow the energy and spectral parameters of the electromagnetic radiation to be controlled during the Low Level Light Therapy procedure. Dedicated power supplies and a digital control system allow each emitter to be powered independently. An active temperature stabilization system was developed to thermally tune the spectral line of the emitted radiation for more efficient matching with the absorption spectra of biological acceptors. Using the set for controlled irradiation, and by measuring the absorption spectrum of the biological medium, it is possible to carry out an objective assessment of the impact of the exposure parameters on the state of cells subjected to Low Level Light Therapy. That procedure allows comparison of the biological response of cell lines after irradiation with radiation of variable spectral and energetic parameters. Experiments were carried out on vascular endothelial cell lines. Cell proliferation was examined after irradiation with LEDs at 645 nm, 680 nm, 740 nm, 780 nm, 830 nm, 870 nm, 890 nm, and 970 nm and with lasers at 650 nm and 830 nm.

  1. Phase field and level set methods for modeling solute precipitation and/or dissolution

    SciTech Connect

    Zhijie Xu; Hai Huang; Paul Meakin

    2012-01-01

    The dynamics of solid-liquid interfaces controlled by solute precipitation and/or dissolution due to chemical reaction at the interface were computed in two dimensions using a phase field model. Sharp-interface asymptotic analysis demonstrated that the phase field solutions should converge to the proper sharp-interface precipitation/dissolution limit. For the purpose of comparison, the sharp-interface model for solute precipitation/dissolution was solved directly using a level set method. In general, the phase field results are found to be in good agreement with the level set results for all reaction rates and geometry configurations investigated. The present study supports the application of both methods to more complicated and realistic reactive systems, including nuclear waste release and mineral precipitation and dissolution.
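    As a concrete illustration of the sharp-interface level set approach used for comparison above, the following minimal sketch (illustrative only, not the authors' code; the grid, speed, and time step are made up) advances a 1D front at constant normal speed V with a first-order Godunov upwind scheme.

```python
import numpy as np

# Illustrative sketch (not from the paper): a 1D dissolution/precipitation
# front tracked as the zero level set of phi, advanced under
#   phi_t + V * |phi_x| = 0
# with first-order Godunov upwind differencing.
def propagate(phi, V, dx, dt, steps):
    phi = phi.copy()
    for _ in range(steps):
        dminus = np.diff(phi, prepend=phi[0]) / dx   # backward difference
        dplus = np.diff(phi, append=phi[-1]) / dx    # forward difference
        if V > 0:
            grad = np.sqrt(np.maximum(dminus, 0) ** 2 + np.minimum(dplus, 0) ** 2)
        else:
            grad = np.sqrt(np.minimum(dminus, 0) ** 2 + np.maximum(dplus, 0) ** 2)
        phi -= dt * V * grad
    return phi

x = np.linspace(0.0, 1.0, 101)
phi = propagate(x - 0.3, V=1.0, dx=0.01, dt=0.005, steps=40)
# the zero crossing (the interface) moves from x = 0.3 to about x = 0.3 + V*t = 0.5
```

    In a real precipitation/dissolution problem V would come from the local reaction rate at the interface rather than being a constant.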

  2. Phase field and level set methods for modeling solute precipitation and/or dissolution

    SciTech Connect

    Xu, Zhijie; Huang, Hai; Li, Xiaoyi; Meakin, Paul

    2012-01-02

    The dynamics of solid-liquid interfaces controlled by solute precipitation and/or dissolution due to chemical reaction at the interface were computed in two dimensions using a phase field model. Sharp-interface asymptotic analysis demonstrated that the phase field solutions should converge to the proper sharp-interface precipitation/dissolution limit. For the purpose of comparison, the sharp-interface model for solute precipitation/dissolution was solved directly using a level set method. In general, the phase field results are found to be in good agreement with the level set results for all reaction rates and geometry configurations. The present study supports the application of both methods to more complicated and realistic reactive systems.

  3. A novel breast ultrasound image segmentation algorithm based on neutrosophic similarity score and level set.

    PubMed

    Guo, Yanhui; Şengür, Abdulkadir; Tian, Jia-Wei

    2016-01-01

    Breast ultrasound (BUS) image segmentation is a challenging task due to speckle noise, the poor quality of ultrasound images, and the variable size and location of breast lesions. In this paper, we propose a new BUS image segmentation algorithm based on a neutrosophic similarity score (NSS) and the level set algorithm. First, the input BUS image is transferred to the NS domain via three membership subsets T, I and F; then a similarity score, NSS, is defined and employed to measure the degree of belonging to the true tumor region. Finally, the level set method is used to segment the tumor from the background tissue region in the NSS image. Experiments have been conducted on a variety of clinical BUS images, and several measurements are used to evaluate and compare the proposed method's performance. The experimental results demonstrate that the proposed method is able to segment BUS images effectively and accurately. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  4. A Cartesian Adaptive Level Set Method for Two-Phase Flows

    NASA Technical Reports Server (NTRS)

    Ham, F.; Young, Y.-N.

    2003-01-01

    In the present contribution we develop a level set method based on local anisotropic Cartesian adaptation as described in Ham et al. (2002). Such an approach should allow for the smallest possible Cartesian grid capable of resolving a given flow. The remainder of the paper is organized as follows. In section 2 the level set formulation for free surface calculations is presented and its strengths and weaknesses relative to the other free surface methods reviewed. In section 3 the collocated numerical method is described. In section 4 the method is validated by solving the 2D and 3D drop oscillation problem. In section 5 we present some results from more complex cases including the 3D drop breakup in an impulsively accelerated free stream, and the 3D immiscible Rayleigh-Taylor instability. Conclusions are given in section 6.

  5. Automatic segmentation of Leishmania parasite in microscopic images using a modified CV level set method

    NASA Astrophysics Data System (ADS)

    Farahi, Maria; Rabbani, Hossein; Talebi, Ardeshir; Sarrafzadeh, Omid; Ensafi, Shahab

    2015-12-01

    Visceral Leishmaniasis is a parasitic disease that affects the liver, spleen and bone marrow. According to the World Health Organization, definitive diagnosis is possible only by direct observation of the Leishman body in microscopic images taken from bone marrow samples. We utilize morphological operations and the CV level set method to segment Leishman bodies in digital color microscopic images captured from bone marrow samples. Linear contrast stretching is used for image enhancement, and a morphological method is applied to determine the parasite regions and remove unwanted objects. Modified global and local CV level set methods are proposed for segmentation, and a shape-based stopping factor is used to accelerate the algorithm. Manual segmentation is considered as ground truth to evaluate the proposed method. The method was tested on 28 samples and achieved a mean segmentation error of 10.90% for the global model and 9.76% for the local model.
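    For readers unfamiliar with the "CV" (Chan-Vese) model this entry builds on, the sketch below gives a minimal global Chan-Vese data-term iteration. It is illustrative only: it omits curvature regularization, the local model, and the paper's shape-based stopping factor, and the test image and parameters are made up.

```python
import numpy as np

# Minimal global Chan-Vese data term: evolve phi so that the image is
# well approximated by two constants, c1 inside (phi > 0) and c2 outside.
def chan_vese_data_term(img, phi, iters=200, dt=0.5, eps=1.0):
    phi = phi.astype(float).copy()
    for _ in range(iters):
        c1 = img[phi > 0].mean()                       # mean intensity inside
        c2 = img[phi <= 0].mean()                      # mean intensity outside
        force = (img - c2) ** 2 - (img - c1) ** 2      # push toward the closer mean
        delta = eps / (np.pi * (eps ** 2 + phi ** 2))  # smoothed Dirac delta
        phi += dt * delta * force
    return phi

# Made-up test case: a bright square on a dark background
img = np.zeros((64, 64))
img[20:44, 20:44] = 1.0
phi0 = -np.ones_like(img)
phi0[16:48, 16:48] = 1.0          # initial contour roughly around the object
phi = chan_vese_data_term(img, phi0)
mask = phi > 0                    # recovers the bright square
```

    The local CV variant mentioned in the abstract replaces the global means c1, c2 with means computed in a window around each pixel.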

  6. Active contour segmentation using level set function with enhanced image from prior intensity.

    PubMed

    Kim, Sunhee; Kim, Youngjun; Lee, Deukhee; Park, Sehyung

    2015-01-01

    This paper presents a new active contour segmentation model using a level set function that can correctly capture both the strong and the weak boundaries of a target enclosed by bright and dark regions at the same time. We introduce an enhanced image obtained from prior information about the intensity of the target. The enhanced image emphasizes the regions where pixels have intensities close to the prior intensity. This enables a desirable segmentation of an image having a partially low contrast with the target surrounded by regions that are brighter or darker than the target. We define an edge indicator function on an original image, and local and regularization forces on an enhanced image. An edge indicator function and two forces are incorporated in order to identify the strong and weak boundaries, respectively. We established an evolution equation of contours in the level set formulation and experimented with several medical images to show the performance of the proposed method.
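    An edge indicator function of the kind defined on the original image here is conventionally g = 1/(1 + |grad(G_sigma * I)|^2). The sketch below computes that standard form; the sigma value and step image are made up, and the paper's enhanced-image local and regularization forces are not reproduced.

```python
import numpy as np
from scipy import ndimage

# Standard edge indicator: near 1 in flat regions, near 0 at strong edges,
# so it can slow a level-set contour at boundaries.
def edge_indicator(img, sigma=1.5):
    smoothed = ndimage.gaussian_filter(img.astype(float), sigma)
    gy, gx = np.gradient(smoothed)                 # gradient of the smoothed image
    return 1.0 / (1.0 + gx ** 2 + gy ** 2)

img = np.zeros((32, 32))
img[:, 16:] = 10.0            # vertical step edge (made-up test image)
g = edge_indicator(img)       # small along the edge column, near 1 elsewhere
```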

  7. Parallel computation of level set method for 500 Hz visual servo control

    NASA Astrophysics Data System (ADS)

    Fei, Xianfeng; Igarashi, Yasunobu; Hashimoto, Koichi

    2008-11-01

    We propose a 2D microorganism tracking system using a parallel level set method and a column parallel vision system (CPV). This system keeps a single microorganism in the middle of the visual field under a microscope by visual servoing of an automated stage. We propose a new energy function for the level set method that constrains the amount of light intensity inside the detected object contour in order to control the number of detected objects. This algorithm is implemented in the CPV system, and the computational time for each frame is approximately 2 ms. A tracking experiment of about 25 s is demonstrated. We also demonstrate that a single paramecium can be kept in track even if other paramecia appear in the visual field and come into contact with the tracked paramecium.

  8. Extending fields in a level set method by solving a biharmonic equation

    NASA Astrophysics Data System (ADS)

    Moroney, Timothy J.; Lusmore, Dylan R.; McCue, Scott W.; McElwain, D. L. Sean

    2017-08-01

    We present an approach for computing extensions of velocities or other fields in level set methods by solving a biharmonic equation. The approach differs from other commonly used approaches to velocity extension because it deals with the interface fully implicitly through the level set function. No explicit properties of the interface, such as its location or the velocity on the interface, are required in computing the extension. These features lead to a particularly simple implementation using either a sparse direct solver or a matrix-free conjugate gradient solver. Furthermore, we propose a fast Poisson preconditioner that can be used to accelerate the convergence of the latter. We demonstrate the biharmonic extension on a number of test problems that serve to illustrate its effectiveness at producing smooth and accurate extensions near interfaces. A further feature of the method is the natural way in which it deals with symmetry and periodicity, ensuring through its construction that the extension field also respects these symmetries.
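    A minimal 1D sketch of the idea is given below. It is my own penalized least-squares variant (minimizing the squared discrete Laplacian with the known values weakly enforced), not the authors' discretization, and the grid, interface location, and field are made up: a field known only at two nodes bracketing an interface is extended smoothly to the whole grid.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

n = 101
x = np.linspace(0.0, 1.0, n)
known = np.zeros(n, dtype=bool)
known[[29, 30]] = True         # nodes bracketing an interface near x = 0.3
data = 2.0 * x                 # field samples (used only at the known nodes)

# Second-difference operator with free (natural) boundary rows
L = diags([np.ones(n - 1), -2.0 * np.ones(n), np.ones(n - 1)], [-1, 0, 1],
          format="lil")
L[0, :] = 0.0
L[n - 1, :] = 0.0
L = L.tocsr()

lam = 1e4                      # penalty weight enforcing the known values
P = diags(known.astype(float))
B = (L.T @ L + lam * P).tocsr()   # normal equations: min ||L u||^2 + lam ||u - data||^2 on known nodes
rhs = np.where(known, lam * data, 0.0)
u = spsolve(B, rhs)            # smooth extension; here it recovers 2*x everywhere
```

    With only two prescribed values the minimizer is the affine interpolant, which is why the extension reproduces 2*x; in 2D or 3D, B would be built from the discrete Laplacian stencil in the same way, and the matrix-free conjugate gradient route described in the abstract avoids forming B at all.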

  9. Unsupervised segmentation of the prostate using MR images based on level set with a shape prior.

    PubMed

    Liu, Xin; Langer, D L; Haider, M A; Van der Kwast, T H; Evans, A J; Wernick, M N; Yetik, I S

    2009-01-01

    Prostate cancer is the second leading cause of cancer death in American men. Current prostate MRI can benefit from automated tumor localization to help guide biopsy, radiotherapy and surgical planning. An important step in automated prostate cancer localization is the segmentation of the prostate. In this paper, we propose a fully automatic method for the segmentation of the prostate. We first apply a deformable ellipse model to find an ellipse that best fits the prostate shape. Then, this ellipse is used to initialize the level set and constrain the level set evolution with a shape penalty term. Finally, post-processing methods are applied to refine the prostate boundaries. We apply the proposed method to real diffusion-weighted (DWI) MRI data to test the performance. Our results show that accurate segmentation can be obtained with the proposed method compared to human readers.

  10. A level set-based shape optimization method for periodic sound barriers composed of elastic scatterers

    NASA Astrophysics Data System (ADS)

    Hashimoto, Hiroshi; Kim, Min-Geun; Abe, Kazuhisa; Cho, Seonho

    2013-10-01

    This paper presents a level set-based topology optimization method for noise barriers formed from an assembly of scatterers. The scattering obstacles are modeled by elastic bodies arranged periodically along the wall. Due to the periodicity, the problem can be reduced to that in a unit cell. The interaction between the elastic scatterers and the acoustic field is described in the context of the level set analysis. The semi-infinite acoustic wave regions located on both sides of the barrier are represented by impedance matrices. The objective function is defined by the energy transmission passing through the barrier. The design sensitivity is evaluated analytically with the aid of adjoint equations. The dependence of the optimal profile on the stiffness of the scatterers and on the target frequency band is examined. The feasibility of the developed optimization method is demonstrated through numerical examples.

  11. Atlas-based segmentation of 3D cerebral structures with competitive level sets and fuzzy control.

    PubMed

    Ciofolo, Cybèle; Barillot, Christian

    2009-06-01

    We propose a novel approach for the simultaneous segmentation of multiple structures with competitive level sets driven by fuzzy control. To this end, several contours evolve simultaneously toward previously defined anatomical targets. A fuzzy decision system combines the a priori knowledge provided by an anatomical atlas with the intensity distribution of the image and the relative position of the contours. This combination automatically determines the directional term of the evolution equation of each level set. This leads to a local expansion or contraction of the contours, in order to match the boundaries of their respective targets. Two applications are presented: the segmentation of the brain hemispheres and the cerebellum, and the segmentation of deep internal structures. Experimental results on real magnetic resonance (MR) images are presented, quantitatively assessed and discussed.

  12. Sensitivity of the ECHAM6 Single Column Model to Vertical Resolution and Implementation of the Level Set Method

    NASA Astrophysics Data System (ADS)

    Cheedela, Suvarchal Kumar; Stevens, Bjorn; Schmidt, Heiko; Mellado, Juan Pedro

    2010-05-01

    Simulating cloud-topped boundary layers has been a daunting task for general circulation models (GCMs) and has challenged our ability to study the role of low clouds in climate in detail. One of the most important, yet difficult to simulate, features of cloud-topped boundary layers is the presence of large vertical gradients. Resolving such gradients is important because of their relevance to key processes that govern the evolution of the boundary layer. For instance, a typical stratocumulus cloud has a thickness on the order of hundreds of meters, and a process such as cloud-top entrainment, which is key to the dynamics of stratocumulus-topped boundary layers, occurs across a very thin interface (on the order of tens of centimeters) with sharp gradients in temperature and humidity. Such sharp gradients are not well represented by current GCMs, and this is evidently linked to the vertical resolution used by these models. In an ongoing study we investigate the sensitivity of the ECHAM6 GCM to vertical resolution in a single column model (SCM) framework. The SCM is a 1D vertical column of a GCM that includes all parameterizations of diabatic processes and is run over a region of interest using prescribed large-scale advective fluxes estimated from observations. As such, the SCM conveniently decouples the large-scale flow from the (diabatic) parameterizations and has the great advantage of high computational speed. Because simply increasing vertical resolution to improve the representation of vertical gradients remains difficult, even in climate models envisaged for the future (principally due to the associated high computational cost), we propose a novel alternative strategy based on the level set method. The level set method here aims to capture effects of very high resolution, such as the cloud-top interface, in lower-vertical-resolution models (current climate models). In this method the 2D interface is represented as a level set of a smoother field, computed as the distance from the

  13. Hydrological drivers of record-setting water level rise on Earth's largest lake system

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Bruxer, J.; Durnford, D.; Smith, J. P.; Clites, A. H.; Seglenieks, F.; Qian, S. S.; Hunter, T. S.; Fortin, V.

    2016-05-01

    Between January 2013 and December 2014, water levels on Lake Superior and Lake Michigan-Huron, the two largest lakes on Earth by surface area, rose at the highest rate ever recorded for a 2 year period beginning in January and ending in December of the following year. This historic event coincided with below-average air temperatures and extensive winter ice cover across the Great Lakes. It also brought an end to a 15 year period of persistently below-average water levels on Lakes Superior and Michigan-Huron that included several months of record-low water levels. To differentiate hydrological drivers behind the recent water level rise, we developed a Bayesian Markov chain Monte Carlo (MCMC) routine for inferring historical estimates of the major components of each lake's water budget. Our results indicate that, in 2013, the water level rise on Lake Superior was driven by increased spring runoff and over-lake precipitation. In 2014, reduced over-lake evaporation played a more significant role in Lake Superior's water level rise. The water level rise on Lake Michigan-Huron in 2013 was also due to above-average spring runoff and persistent over-lake precipitation, while in 2014, it was due to a rare combination of below-average evaporation, above-average runoff and precipitation, and very high inflow rates from Lake Superior through the St. Marys River. We expect, in future research, to apply our new framework across the other Laurentian Great Lakes, and to Earth's other large freshwater basins as well.
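    The paper's Bayesian MCMC water-budget framework is far richer than can be shown here, but its basic mechanism, sampling unobserved water-budget components conditional on noisy observations, can be sketched with a toy random-walk Metropolis example. All numbers (P, R, true_E, noise level, priors) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy monthly water balance dL = P + R - E: infer evaporation E from noisy
# observations of dL, with precipitation P and runoff R taken as known.
P, R, true_E = 80.0, 60.0, 50.0                       # mm/month (made up)
obs = P + R - true_E + rng.normal(0.0, 5.0, size=24)  # 24 months of noisy dL

def log_post(E):
    if E < 0.0:
        return -np.inf                                # evaporation must be non-negative
    log_prior = -0.5 * ((E - 60.0) / 30.0) ** 2       # weak Gaussian prior
    log_like = -0.5 * np.sum((obs - (P + R - E)) ** 2) / 5.0 ** 2
    return log_prior + log_like

# Random-walk Metropolis over the single unknown component E
E, chain = 60.0, []
for _ in range(5000):
    prop = E + rng.normal(0.0, 2.0)
    if np.log(rng.random()) < log_post(prop) - log_post(E):
        E = prop
    chain.append(E)
post_mean = float(np.mean(chain[1000:]))              # close to true_E
```

    In the full problem each component (runoff, over-lake precipitation, evaporation, connecting-channel flows) would carry its own prior and observation model, and the chain would explore them jointly.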

  14. Stochastic level-set variational implicit-solvent approach to solute-solvent interfacial fluctuations.

    PubMed

    Zhou, Shenggao; Sun, Hui; Cheng, Li-Tien; Dzubiella, Joachim; Li, Bo; McCammon, J Andrew

    2016-08-07

    Recent years have seen the initial success of a variational implicit-solvent model (VISM), implemented with a robust level-set method, in efficiently capturing different hydration states and providing quantitatively good estimates of the solvation free energies of biomolecules. The level-set minimization of the VISM solvation free-energy functional over all possible solute-solvent interfaces or dielectric boundaries predicts an equilibrium biomolecular conformation that is often close to an initial guess. In this work, we develop a theory in the form of Langevin geometrical flow to incorporate solute-solvent interfacial fluctuations into the VISM. Such fluctuations are crucial to biomolecular conformational changes and binding processes. We also develop a stochastic level-set method to numerically implement this theory. We describe the interfacial fluctuation through the "normal velocity," which is the solute-solvent interfacial force; derive the corresponding stochastic level-set equation in the Stratonovich sense so that the surface representation is independent of the choice of implicit function; and develop numerical techniques for solving this equation and processing the numerical data. We apply our computational method to study the dewetting transition in the system of two hydrophobic plates and in the hydrophobic cavity of a synthetic host molecule, cucurbit[7]uril. Numerical simulations demonstrate that our approach can describe an underlying system jumping out of a local minimum of the free-energy functional and can capture dewetting transitions of hydrophobic systems. In the case of two hydrophobic plates, we find that the wavelength of interfacial fluctuations has a strong influence on the dewetting transition. In addition, we find that the estimated energy barrier of the dewetting transition scales quadratically with the inter-plate distance, agreeing well with existing molecular dynamics simulation studies.
Our work is a first step toward the inclusion of
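    Schematically (my hedged reading of the abstract, not the authors' exact equation), a stochastic level-set equation of this kind couples the deterministic normal velocity with multiplicative noise:

```latex
% Schematic form only; v_n is the deterministic solute-solvent interfacial
% force (normal velocity), W_t a Wiener process, and \beta an
% inverse-temperature-like noise strength. The Stratonovich product \circ is
% what keeps the dynamics dependent only on the zero level set, not on the
% particular implicit function \phi chosen to represent it.
d\phi = -\,v_n(\mathbf{x},t)\,\lvert\nabla\phi\rvert\,dt
        \;+\; \sqrt{2\beta^{-1}}\,\lvert\nabla\phi\rvert \circ dW_t
```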

  15. Segmentation of the liver from abdominal MR images: a level-set approach

    NASA Astrophysics Data System (ADS)

    Abdalbari, Anwar; Huang, Xishi; Ren, Jing

    2015-03-01

    The use of prior knowledge in the segmentation of abdominal MR images enables a more accurate and comprehensive interpretation of the organ to be segmented. Prior knowledge about an abdominal organ, such as the liver vessels, can be employed to obtain an accurate segmentation of the liver, which in turn leads to an accurate diagnosis or treatment plan. In this paper, a new method for segmenting the liver from abdominal MR images using the liver vessels as prior knowledge is proposed. The method employs the level set technique to segment the liver from abdominal MR images. The speed image used in the level set method is responsible for propagating and stopping region growing at boundaries. The poor contrast of the MR images between the liver and the surrounding organs (i.e., stomach, kidneys, and heart) causes the segmented liver to leak into those organs, leading to inaccurate or incorrect segmentation. For that reason, a second speed image is developed, as an extra term in the level set, to control the front propagation at weak edges with the help of the original speed image. The basic idea of the proposed approach is to use the second speed image as a boundary surface that is approximately orthogonal to the area of the leak. The aim of the new speed image is to slow down the level set propagation and prevent the leak in regions close to the liver boundary. The new speed image is a surface created by filling holes to reconstruct the liver surface. These holes are formed by the exit and entry of the liver vessels and are considered the main cause of the segmentation leak. The results of the proposed method are superior to those of other methods in the literature.

  16. Functional level-set derivative for a polymer self consistent field theory Hamiltonian

    NASA Astrophysics Data System (ADS)

    Ouaknin, Gaddiel; Laachi, Nabil; Bochkov, Daniil; Delaney, Kris; Fredrickson, Glenn H.; Gibou, Frederic

    2017-09-01

    We derive functional level-set derivatives for the Hamiltonian arising in self-consistent field theory, which are required to solve free boundary problems in the self-assembly of polymeric systems such as block copolymer melts. In particular, we consider Dirichlet, Neumann and Robin boundary conditions. We provide numerical examples that illustrate how these shape derivatives can be used to find equilibrium and metastable structures of block copolymer melts with a free surface in both two and three spatial dimensions.

  17. Stochastic level-set variational implicit-solvent approach to solute-solvent interfacial fluctuations

    NASA Astrophysics Data System (ADS)

    Zhou, Shenggao; Sun, Hui; Cheng, Li-Tien; Dzubiella, Joachim; Li, Bo; McCammon, J. Andrew

    2016-08-01

    Recent years have seen the initial success of a variational implicit-solvent model (VISM), implemented with a robust level-set method, in capturing efficiently different hydration states and providing quantitatively good estimation of solvation free energies of biomolecules. The level-set minimization of the VISM solvation free-energy functional of all possible solute-solvent interfaces or dielectric boundaries predicts an equilibrium biomolecular conformation that is often close to an initial guess. In this work, we develop a theory in the form of Langevin geometrical flow to incorporate solute-solvent interfacial fluctuations into the VISM. Such fluctuations are crucial to biomolecular conformational changes and binding process. We also develop a stochastic level-set method to numerically implement such a theory. We describe the interfacial fluctuation through the "normal velocity" that is the solute-solvent interfacial force, derive the corresponding stochastic level-set equation in the sense of Stratonovich so that the surface representation is independent of the choice of implicit function, and develop numerical techniques for solving such an equation and processing the numerical data. We apply our computational method to study the dewetting transition in the system of two hydrophobic plates and a hydrophobic cavity of a synthetic host molecule cucurbit[7]uril. Numerical simulations demonstrate that our approach can describe an underlying system jumping out of a local minimum of the free-energy functional and can capture dewetting transitions of hydrophobic systems. In the case of two hydrophobic plates, we find that the wavelength of interfacial fluctuations has a strong influence to the dewetting transition. In addition, we find that the estimated energy barrier of the dewetting transition scales quadratically with the inter-plate distance, agreeing well with existing studies of molecular dynamics simulations. 
Our work is a first step toward the inclusion of

  18. Building a Community Framework for Adaptation to Sea Level Rise and Inundation

    NASA Astrophysics Data System (ADS)

    Culver, M. E.; Schubel, J.; Davidson, M. A.; Haines, J.

    2010-12-01

    Sea level rise and inundation pose a substantial risk to many coastal communities, and the risk is projected to increase because of continued development, changes in the frequency and intensity of inundation events, and acceleration in the rate of sea-level rise. Calls for action at all levels acknowledge that a viable response must engage federal, state and local expertise, perspectives, and resources in a coordinated and collaborative effort. Representatives from a variety of these agencies and organizations have developed a shared framework to help coastal communities structure and facilitate community-wide adaptation processes and to help agencies determine where investments should be made to enable states and local governments to assess impacts and initiate adaptation strategies over the next decade. For sea level rise planning and implementation, the requirements for high-quality data and information are vast and the availability is limited. Participants stressed the importance of data interoperability to ensure that users are able to apply data from a variety of sources and to improve availability and confidence in the data. Participants prioritized the following six categories of data needed to support future sea level rise planning and implementation:
    - Data to understand land forms and where and how water will flow
    - Monitoring data and environmental drivers
    - Consistent sea level rise scenarios and projections across agencies to support local planning
    - Data to characterize vulnerabilities and impacts of sea level rise
    - Community characteristics
    - Legal frameworks and administrative structure
    To develop a meaningful and effective sea level rise adaptation plan, state and local planners must understand how the availability, scale, and uncertainty of these types of data will impact new guidelines or adaptation measures. The tools necessary to carry out the adaptation planning process need to be understood in terms of data requirements

  19. A Real-Time Algorithm for the Approximation of Level-Set-Based Curve Evolution

    PubMed Central

    Shi, Yonggang; Karl, William Clem

    2010-01-01

    In this paper, we present a complete and practical algorithm for the approximation of level-set-based curve evolution suitable for real-time implementation. In particular, we propose a two-cycle algorithm to approximate level-set-based curve evolution without the need of solving partial differential equations (PDEs). Our algorithm is applicable to a broad class of evolution speeds that can be viewed as composed of a data-dependent term and a curve smoothness regularization term. We achieve curve evolution corresponding to such evolution speeds by separating the evolution process into two different cycles: one cycle for the data-dependent term and a second cycle for the smoothness regularization. The smoothing term is derived from a Gaussian filtering process. In both cycles, the evolution is realized through a simple element switching mechanism between two linked lists that implicitly represent the curve using an integer valued level-set function. By careful construction, all the key evolution steps require only integer operations. A consequence is that we obtain significant computation speedups compared to exact PDE-based approaches while obtaining excellent agreement with these methods for problems of practical engineering interest. In particular, the resulting algorithm is fast enough for use in real-time video processing applications, which we demonstrate through several image segmentation and video tracking experiments. PMID:18390371
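    The element-switching idea can be sketched in a few dozen lines. This is a simplified illustration only, covering just an outward data cycle (the paper also has an inward switch, a Gaussian smoothing cycle, and true linked lists rather than Python sets), with a made-up disk-shaped speed field as the test case. phi takes integer values in {-3, -1, 1, 3}: interior, inside boundary (Lin), outside boundary (Lout), exterior.

```python
import numpy as np

def init_phi(mask):
    h, w = mask.shape
    phi = np.where(mask, -3, 3).astype(int)
    for y in range(h):
        for x in range(w):
            ns = [(y + dy, x + dx) for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))
                  if 0 <= y + dy < h and 0 <= x + dx < w]
            if mask[y, x] and any(not mask[q] for q in ns):
                phi[y, x] = -1          # inside boundary point (Lin)
            elif not mask[y, x] and any(mask[q] for q in ns):
                phi[y, x] = 1           # outside boundary point (Lout)
    return phi

def evolve(phi, F, iters):
    h, w = phi.shape
    nbrs = lambda y, x: [(y + dy, x + dx) for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))
                         if 0 <= y + dy < h and 0 <= x + dx < w]
    Lin = {(y, x) for y in range(h) for x in range(w) if phi[y, x] == -1}
    Lout = {(y, x) for y in range(h) for x in range(w) if phi[y, x] == 1}
    for _ in range(iters):
        # switch_in: move Lout points with outward speed across the front
        for p in [p for p in Lout if F[p] > 0]:
            Lout.discard(p); Lin.add(p); phi[p] = -1
            for q in nbrs(*p):
                if phi[q] == 3:
                    phi[q] = 1; Lout.add(q)
        # clean Lin: points no longer adjacent to the outside become interior
        for p in [p for p in Lin if all(phi[q] < 0 for q in nbrs(*p))]:
            Lin.discard(p); phi[p] = -3
    return phi

# Grow a small seed until it fills a disk-shaped region of positive speed
mask0 = np.zeros((32, 32), dtype=bool)
mask0[14:19, 14:19] = True
yy, xx = np.mgrid[0:32, 0:32]
disk = (yy - 16) ** 2 + (xx - 16) ** 2 <= 64
F = np.where(disk, 1.0, -1.0)
phi = evolve(init_phi(mask0), F, 20)   # (phi < 0) now matches the disk
```

    Because each step only switches list elements and touches 4-neighborhoods, no PDE is solved and, as in the paper, all evolution operations stay in integer arithmetic.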

  20. Vascular tree segmentation in medical images using Hessian-based multiscale filtering and level set method.

    PubMed

    Jin, Jiaoying; Yang, Linjun; Zhang, Xuming; Ding, Mingyue

    2013-01-01

    Vascular segmentation plays an important role in medical image analysis. A novel technique for the automatic extraction of vascular trees from 2D medical images is presented, which combines Hessian-based multiscale filtering and a modified level set method. In the proposed algorithm, the morphological top-hat transformation is first adopted to attenuate the background. Then Hessian-based multiscale filtering is used to enhance vascular structures by combining the Hessian matrix with Gaussian convolution to tune the filtering response to specific scales. Because Gaussian convolution tends to blur vessel boundaries, which makes scale selection inaccurate, an improved level set method is then proposed to extract vascular structures by introducing an external constraint term, related to the standard deviation of the Gaussian function, into the traditional level set. Our approach was tested on synthetic images with vascular-like structures and on 2D slices extracted from real 3D abdominal magnetic resonance angiography (MRA) images along the coronal plane. The segmentation rates for synthetic images are above 95%. The results for MRA images demonstrate that the proposed method can extract most of the vascular structures successfully and accurately. Therefore, the proposed method is effective for vascular tree extraction in medical images.
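    A minimal 2D Hessian-based vesselness filter in the spirit of Frangi et al. is sketched below; it is not the authors' exact filter, and the parameters beta, c, the scale range, and the test image are illustrative. At each scale, scale-normalized second derivatives give a 2x2 Hessian per pixel, whose eigenvalues distinguish tubular structures (one small, one large negative eigenvalue for bright vessels) from blobs and flat background.

```python
import numpy as np
from scipy import ndimage

def vesselness(img, sigmas=(1.0, 2.0, 3.0), beta=0.5, c=15.0):
    img = img.astype(float)
    best = np.zeros_like(img)
    for s in sigmas:
        # scale-normalized second derivatives (gamma = 2)
        Hxx = s ** 2 * ndimage.gaussian_filter(img, s, order=(0, 2))
        Hyy = s ** 2 * ndimage.gaussian_filter(img, s, order=(2, 0))
        Hxy = s ** 2 * ndimage.gaussian_filter(img, s, order=(1, 1))
        # eigenvalues of the symmetric 2x2 Hessian, ordered so |l1| <= |l2|
        tmp = np.sqrt((Hxx - Hyy) ** 2 + 4 * Hxy ** 2)
        l2 = 0.5 * (Hxx + Hyy + tmp)
        l1 = 0.5 * (Hxx + Hyy - tmp)
        swap = np.abs(l1) > np.abs(l2)
        l1[swap], l2[swap] = l2[swap], l1[swap]
        Rb2 = (l1 / (l2 + 1e-12)) ** 2       # blob-vs-line measure
        S2 = l1 ** 2 + l2 ** 2               # second-order structureness
        v = np.exp(-Rb2 / (2 * beta ** 2)) * (1 - np.exp(-S2 / (2 * c ** 2)))
        v[l2 > 0] = 0                        # bright-on-dark vessels need l2 < 0
        best = np.maximum(best, v)           # keep the best response over scales
    return best

img = np.zeros((40, 40))
img[20, :] = 100.0            # a bright horizontal line as a toy "vessel"
v = vesselness(img)           # high along the line, near zero elsewhere
```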

  1. Numerical Simulation of Dynamic Contact Angles and Contact Lines in Multiphase Flows using Level Set Method

    NASA Astrophysics Data System (ADS)

    Pendota, Premchand

    Many physical phenomena and industrial applications involve multiphase fluid flows, and hence it is of high importance to be able to simulate various aspects of these flows accurately. The dynamic contact angles (DCA) and the contact lines at the wall boundaries are two such important aspects. In the past few decades, many mathematical models were developed for predicting the contact angles of the interface with the wall boundary under various flow conditions. These models are used to incorporate the physics of DCA and contact line motion in numerical simulations using various interface capturing/tracking techniques. In the current thesis, a simple approach to incorporate the static and dynamic contact angle boundary conditions using the level set method is developed and implemented in the multiphase CFD codes LIT (Level set Interface Tracking) (Herrmann (2008)) and NGA (flow solver) (Desjardins et al. (2008)). Various DCA models and associated boundary conditions are reviewed. In addition, numerical aspects such as the occurrence of a stress singularity at the contact lines and grid convergence of the macroscopic interface shape are dealt with in the context of the level set approach.

  2. A level set method for cupping artifact correction in cone-beam CT

    SciTech Connect

    Xie, Shipeng; Li, Haibo; Ge, Qi; Li, Chunming

    2015-08-15

    Purpose: To reduce cupping artifacts and improve the contrast-to-noise ratio in cone-beam computed tomography (CBCT). Methods: A level set method is proposed to reduce cupping artifacts in the reconstructed image of CBCT. The authors derive a local intensity clustering property of the CBCT image and define a local clustering criterion function of the image intensities in a neighborhood of each point. This criterion function defines an energy in terms of the level set functions, which represent a segmentation result and the cupping artifacts. The cupping artifacts are estimated as a result of minimizing this energy. Results: The cupping artifacts in CBCT are reduced by an average of 90%. The results indicate that the level set-based algorithm is practical and effective for reducing the cupping artifacts and preserving the quality of the reconstructed image. Conclusions: The proposed method focuses on the reconstructed image without requiring any additional physical equipment, is easily implemented, and provides cupping correction through a single-scan acquisition. The experimental results demonstrate that the proposed method successfully reduces the cupping artifacts.

  3. On the geometry of two-dimensional slices of irregular level sets in turbulent flows

    SciTech Connect

    Catrakis, H.J.; Cook, A.W.; Dimotakis, P.E.; Patton, J.M.

    1998-03-20

Isoscalar surfaces in turbulent flows are found to be more complex than (self-similar) fractals, in both the far field of liquid-phase turbulent jets and in a realization of Rayleigh-Taylor-instability flow. In particular, they exhibit a scale-dependent coverage dimension, D₂(λ), for 2-D slices of scalar level sets, that increases with scale, from unity at small scales to 2 at large scales. For the jet flow and Reynolds numbers investigated, the isoscalar-surface geometry is both scalar-threshold- and Re-dependent; the level-set (coverage) length decreases with increasing Re, indicating enhanced mixing with increasing Reynolds number; and the size distribution of closed regions is well described by lognormal statistics at small scales. A similar D₂(λ) behavior is found for level-set data of 3-D density-interface behavior in recent direct numerical-simulation studies of Rayleigh-Taylor-instability flow. A comparison of (spatial) spectral and isoscalar coverage statistics will be discussed.
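
The scale-dependent coverage dimension described above can be estimated by box counting. The sketch below is illustrative, not the paper's procedure: the test curve (a smooth circle, for which the local slope should stay near 1), scale range, and grid origin are all arbitrary choices.

```python
import numpy as np

def coverage_dimension(points, scales):
    """Box-counting estimate of the scale-dependent coverage dimension
    D2(lambda) = -d log N(lambda) / d log lambda for 2-D points lying
    on a level set, where N(lambda) is the number of boxes of side
    lambda needed to cover the set."""
    counts = []
    for lam in scales:
        # count distinct boxes of side lam occupied by the point set
        boxes = {tuple(np.floor(p / lam).astype(int)) for p in points}
        counts.append(len(boxes))
    log_n = np.log(counts)
    log_l = np.log(scales)
    # local slope (central differences) gives D2 at each scale
    return -np.gradient(log_n, log_l)

# smooth closed curve (a unit circle): D2 stays near 1 at these scales,
# unlike the irregular level sets of the study, whose D2 rises toward 2
t = np.linspace(0.0, 2 * np.pi, 20000, endpoint=False)
pts = np.column_stack([np.cos(t), np.sin(t)])
scales = np.logspace(-2, -0.5, 10)
d2 = coverage_dimension(pts, scales)
```
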

  4. Automatic Lumen Segmentation in Intravascular Optical Coherence Tomography Images Using Level Set

    PubMed Central

    Cheng, Kang; Qin, Xianjing; Yin, Qinye; Li, Jianan; Zhao, Wei

    2017-01-01

Automatic lumen segmentation from intravascular optical coherence tomography (IVOCT) images is important and fundamental work for the diagnosis and treatment of coronary artery disease. However, it is a very challenging task due to irregular lumens caused by unstable plaque and vessel bifurcations, guide wire shadow, and blood artifacts. To address these problems, this paper presents a novel automatic level set based segmentation algorithm that handles the irregular lumen challenge well. Before applying the level set model, a narrow image smoothing filter is proposed to reduce the effect of artifacts and, at the same time, prevent leakage of the level set. Moreover, a divide-and-conquer strategy is proposed to deal with the guide wire shadow. With the proposed method, the influence of irregular lumen, guide wire shadow, and blood artifacts can be appreciably reduced. Finally, experimental results on 880 images from 5 different patients showed that the proposed method is robust and accurate, with an average DSC value of 98.1% ± 1.1%. PMID:28270857

  5. Stabilized Conservative Level Set Method with Adaptive Wavelet-based Mesh Refinement

    NASA Astrophysics Data System (ADS)

    Shervani-Tabar, Navid; Vasilyev, Oleg V.

    2016-11-01

This paper addresses one of the main challenges of the conservative level set method, namely the ill-conditioned behavior of the normal vector away from the interface. An alternative formulation for reconstruction of the interface is proposed. Unlike the commonly used methods which rely on the unit normal vector, the Stabilized Conservative Level Set (SCLS) uses a modified renormalization vector with diminishing magnitude away from the interface. With the new formulation, in the vicinity of the interface the reinitialization procedure utilizes compressive flux and diffusive terms only in the direction normal to the interface, thus preserving the conservative level set properties, while away from the interface the directional diffusion mechanism automatically switches to homogeneous diffusion. The proposed formulation is robust and general. It is especially well suited for use with adaptive mesh refinement (AMR) approaches due to the need for a finer resolution in the vicinity of the interface than in the rest of the domain. All of the results were obtained using the Adaptive Wavelet Collocation Method, a general AMR-type method, which utilizes wavelet decomposition to adapt on steep gradients in the solution while retaining a predetermined order of accuracy.

  6. A Performance Comparison Between a Level Set Method and an Unsplit Volume of Fluid Method

    NASA Astrophysics Data System (ADS)

    Desjardins, Olivier; Chiodi, Robert; Owkes, Mark

    2016-11-01

The simulation of high density ratio liquid-gas flows presents many numerical difficulties due to the necessity to track the interface and the discontinuities in physical properties associated with the interface. Two main categories of methods used to track the interface are level set methods and volume of fluid (VOF) methods. In particular, conservative level set methods track and transport the interface using a scalar field, with the interface profile represented by a hyperbolic tangent function of finite thickness. Volume of fluid methods, on the other hand, store the percentage of each fluid in the computational cells. Both methods offer distinct advantages; however, the strengths and weaknesses of each method relative to the other have yet to be thoroughly investigated. This work compares the accuracy and computational efficiency of an accurate conservative level set method and an unsplit VOF method using canonical test cases, such as Zalesak's disk, the deformation of a circle, and the deformation of a sphere. The mass conservation and ability to correctly predict instability for a more complex case of an air-blast atomization of a planar liquid layer will also be presented.
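
The two interface representations contrasted above can be sketched side by side in 1-D: the conservative level set stores a hyperbolic tangent profile of thickness ~eps, while a VOF method stores the fluid fraction per cell. A hedged illustration; the interface location, profile thickness, and grid are arbitrary choices.

```python
import numpy as np

n, eps = 200, 0.02
x = np.linspace(-1.0, 1.0, n)            # cell centres
phi = x - 0.25                            # signed distance; interface at x = 0.25

# Conservative level set field: smeared Heaviside, 0 in gas, 1 in liquid,
# with the jump replaced by a tanh profile of thickness ~eps
psi = 0.5 * (np.tanh(phi / (2 * eps)) + 1.0)

# VOF-style field for the same interface: exact liquid fraction per cell
dx = x[1] - x[0]
vof = np.clip((x + dx / 2 - 0.25) / dx, 0.0, 1.0)

# Away from the interface the two fields agree exactly; near it, the
# level set smooths the jump, but the integrated "mass" matches the VOF
# mass because the tanh profile is antisymmetric about the interface.
mass_ls = psi.sum() * dx
mass_vof = vof.sum() * dx
```
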

  7. A Quadrature-Free Conservative Level Set RKDG for Simulating Atomization

    NASA Astrophysics Data System (ADS)

    Jibben, Zechariah; Herrmann, Marcus

    2012-11-01

We present an arbitrary high-order, quadrature-free, Runge-Kutta discontinuous Galerkin (RKDG) method for the solution of the conservative level set equation (Olsson et al., 2007), used for capturing phase interfaces in atomizing multiphase flows. Special care is taken to maintain high-order accuracy in the reinitialization equation, using appropriate slope limiters when necessary and a shared basis across cell interfaces for the diffusive flux. For efficiency, we implement the method in the context of the dual narrow band overset mesh approach of the Refined Level Set Grid method (Herrmann, 2008). The accuracy, consistency, and convergence of the resulting method are demonstrated using the method of manufactured solutions (MMS) and several standard test cases, including Zalesak's disk and columns and spheres in prescribed deformation fields. Using MMS, we demonstrate k + 1 order spatial convergence for k-th order orthonormal Legendre polynomial basis functions. We furthermore show several orders of magnitude improvement in shape and volume errors over traditional WENO based distance function level set methods, and k - 1 order spatial convergence of interfacial curvature using direct neighbor cells only. Supported by Stanford's 2012 CTR Summer Program and NSF grant CBET-1054272.

  8. Comparison between advected-field and level-set methods in the study of vesicle dynamics

    NASA Astrophysics Data System (ADS)

    Maitre, E.; Misbah, C.; Peyla, P.; Raoult, A.

    2012-07-01

Phospholipidic membranes and vesicles constitute a basic element in real biological functions. Vesicles are viewed as a model system to mimic basic viscoelastic behaviors of some cells, like red blood cells. Phase field and level-set models are powerful tools to tackle the dynamics of membranes and their coupling to the flow. These two methods are somewhat similar, but to date no bridge has been made between them. This is the first focus of this paper, where we show how the phase-field methods developed in Biben and Misbah (2003) [7], Beaucourt (2004) [9], and Biben (2005) [33] for immersed vesicles can be considered as a level-set method for a particular strain-stress relationship. The main conclusion is that the two methods share several common features, and we provide the correspondence between them. Furthermore, a constitutive viscoelastic law is derived for the composite fluid: the ambient fluid and the membranes. We present two different approaches to deal with the membrane local incompressibility, and point out their differences. Some numerical results following from the level-set approach are presented.

  9. Vascular Tree Segmentation in Medical Images Using Hessian-Based Multiscale Filtering and Level Set Method

    PubMed Central

    Jin, Jiaoying; Yang, Linjun; Zhang, Xuming

    2013-01-01

Vascular segmentation plays an important role in medical image analysis. A novel technique for the automatic extraction of vascular trees from 2D medical images is presented, which combines Hessian-based multiscale filtering and a modified level set method. In the proposed algorithm, the morphological top-hat transformation is first adopted to attenuate the background. Then Hessian-based multiscale filtering is used to enhance vascular structures by combining the Hessian matrix with Gaussian convolution to tune the filtering response to specific scales. Because Gaussian convolution tends to blur vessel boundaries, which makes scale selection inaccurate, an improved level set method is finally proposed to extract vascular structures by introducing into the traditional level set an external constraint term related to the standard deviation of the Gaussian function. Our approach was tested on synthetic images with vascular-like structures and on 2D slices extracted from real 3D abdominal magnetic resonance angiography (MRA) images along the coronal plane. The segmentation rates for synthetic images are above 95%. The results for MRA images demonstrate that the proposed method can extract most of the vascular structures successfully and accurately. Therefore, the proposed method is effective for vascular tree extraction in medical images. PMID:24348738
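
The Hessian-based multiscale filtering step resembles the classic Frangi vesselness filter: at each scale, eigenvalues of the Gaussian-smoothed Hessian distinguish tube-like from blob-like structures. A hedged 2-D sketch; the parameters beta and c, the scale list, and the test image are illustrative choices, not the paper's settings.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def vesselness2d(img, sigmas, beta=0.5, c=0.5):
    """Frangi-style multiscale tubular-structure response for a 2-D
    image with bright vessels on a dark background."""
    best = np.zeros_like(img, dtype=float)
    for s in sigmas:
        # scale-normalised second derivatives (Hessian entries)
        hxx = s**2 * gaussian_filter(img, s, order=(0, 2))
        hyy = s**2 * gaussian_filter(img, s, order=(2, 0))
        hxy = s**2 * gaussian_filter(img, s, order=(1, 1))
        # eigenvalues of [[hxx, hxy], [hxy, hyy]], ordered |l1| <= |l2|
        tmp = np.sqrt((hxx - hyy) ** 2 + 4 * hxy**2)
        l1 = 0.5 * (hxx + hyy - tmp)
        l2 = 0.5 * (hxx + hyy + tmp)
        swap = np.abs(l1) > np.abs(l2)
        l1, l2 = np.where(swap, l2, l1), np.where(swap, l1, l2)
        rb = np.abs(l1) / (np.abs(l2) + 1e-12)      # blobness vs lineness
        s2 = np.sqrt(l1**2 + l2**2)                  # structure strength
        v = np.exp(-rb**2 / (2 * beta**2)) * (1 - np.exp(-s2**2 / (2 * c**2)))
        v[l2 > 0] = 0.0                              # bright ridges have l2 < 0
        best = np.maximum(best, v)                   # keep best scale response
    return best

# synthetic test: a bright 1-pixel line on a dark background
img = np.zeros((64, 64))
img[32, 10:54] = 1.0
v = vesselness2d(img, sigmas=(1.5, 2.0))
```
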

  10. Level Set Based Hippocampus Segmentation in MR Images with Improved Initialization Using Region Growing

    PubMed Central

    Zhou, Zhaozhong; Ding, Xiaokang; Deng, Xiaolei; Zou, Ling; Li, Bailin

    2017-01-01

The hippocampus is known as one of the most important structures implicated in Alzheimer's disease and other neurological disorders. However, segmentation of the hippocampus from MR images is still a challenging task due to its small size, complex shape, low contrast, and discontinuous boundaries. For the accurate and efficient detection of the hippocampus, a new image segmentation method based on adaptive region growing and a level set algorithm is proposed. First, adaptive region growing and morphological operations are performed in the target regions, and the output is used as the initial contour for the level set evolution method. Then, an improved edge-based level set method utilizing global Gaussian distributions with different means and variances is developed to implement the accurate segmentation. Finally, the gradient descent method is adopted to minimize the energy equation. As shown by experimental results, the proposed method can ideally extract contours of the hippocampus that are very close to manual segmentations drawn by specialists. PMID:28191031
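
The region-growing initialization step can be sketched as a breadth-first flood fill that accepts neighbours whose intensity stays near the running region mean. This is a hedged sketch of the idea, not the paper's exact adaptive criterion; the tolerance and test image are illustrative.

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol=0.1):
    """Breadth-first region growing from a seed pixel: a 4-connected
    neighbour joins the region if its intensity is within `tol` of the
    running region mean. The result can seed a level set evolution."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    total, count = float(img[seed]), 1
    q = deque([seed])
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w and not mask[rr, cc]:
                if abs(img[rr, cc] - total / count) <= tol:
                    mask[rr, cc] = True
                    total += img[rr, cc]
                    count += 1
                    q.append((rr, cc))
    return mask

# toy image: a uniform bright structure on a dark background
img = np.zeros((30, 30))
img[8:20, 8:20] = 1.0
init = region_grow(img, seed=(12, 12), tol=0.3)
```
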

  11. Conceptual framework for assessing the response of delta channel networks to Holocene sea level rise

    NASA Astrophysics Data System (ADS)

    Jerolmack, Douglas J.

    2009-08-01

    Recent research has identified two fundamental unit processes that build delta distributary channels. The first is mouth-bar deposition at the shoreline and subsequent channel bifurcation, which is driven by progradation of the shoreline; the second is avulsion to a new channel, a result of aggradation of the delta topset. The former creates relatively small, branching networks such as Wax Lake Delta; the latter generates relatively few, long distributaries such as the Mississippi and Atchafalaya channels on the Mississippi Delta. The relative rate of progradation to aggradation, and hence the creation of accommodation space, emerges as a controlling parameter on channel network form. Field and experimental research has identified sea level as the dominant control on Holocene delta growth worldwide, and has empirically linked channel network changes to changes in the rate of sea level rise. Here I outline a simple modeling framework for distributary network evolution, and use this to explore large-scale changes in Holocene channel pattern that have been observed in deltas such as the Rhine-Meuse and Mississippi. Rapid early- to mid-Holocene sea level rise forced many deltas into an aggradational mode, where I hypothesize that avulsion and the generation of large-scale branches should dominate. Slowing of sea level rise in the last ˜6000 yr allowed partitioning of sediment into progradation, facilitating the growth of smaller-scale distributary trees at the shorelines of some deltas, and a reduction in the number of large-scale branches. Significant antecedent topography modulates delta response; the filling of large incised valleys, for example, caused many deltas to bypass the aggradational phase. Human effects on deltas can be cast in terms of geologic controls affecting accommodation: constriction of channels forces rapid local progradation and mouth-bar bifurcation, while accelerated sea level rise increases aggradation and induces more frequent channel

  12. Stacking sequence and shape optimization of laminated composite plates via a level-set method

    NASA Astrophysics Data System (ADS)

    Allaire, G.; Delgado, G.

    2016-12-01

    We consider the optimal design of composite laminates by allowing a variable stacking sequence and in-plane shape of each ply. In order to optimize both variables we rely on a decomposition technique which aggregates the constraints into one unique constraint margin function. Thanks to this approach, an exactly equivalent bi-level optimization problem is established. This problem is made up of an inner level represented by the combinatorial optimization of the stacking sequence and an outer level represented by the topology and geometry optimization of each ply. We propose for the stacking sequence optimization an outer approximation method which iteratively solves a set of mixed integer linear problems associated to the evaluation of the constraint margin function. For the topology optimization of each ply, we lean on the level set method for the description of the interfaces and the Hadamard method for boundary variations by means of the computation of the shape gradient. Numerical experiments are performed on an aeronautic test case where the weight is minimized subject to different mechanical constraints, namely compliance, reserve factor and buckling load.

  13. Automatic Rooftop Extraction in Stereo Imagery Using Distance and Building Shape Regularized Level Set Evolution

    NASA Astrophysics Data System (ADS)

    Tian, J.; Krauß, T.; d'Angelo, P.

    2017-05-01

Automatic rooftop extraction is one of the most challenging problems in remote sensing image analysis. Classical 2D image processing techniques are expensive due to the large number of features required to locate buildings. This problem can be avoided when 3D information is available. In this paper, we show how to fuse the spectral and height information of stereo imagery to achieve efficient and robust rooftop extraction. In the first step, the digital terrain model (DTM), and in turn the normalized digital surface model (nDSM), is generated using a new step-edge approach. In the second step, the initial building locations and rooftop boundaries are derived by removing low-lying pixels and elevated pixels that are more likely to be trees or shadows. This boundary then serves as the initial level set function, which is further refined to fit the best possible boundaries through distance-regularized level-set curve evolution. During the fitting procedure, the edge-based active contour model is adopted and implemented using edge indicators extracted from the panchromatic image. The performance of the proposed approach is tested using WorldView-2 satellite data captured over Munich.
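
The height-based initialization step above can be sketched in a few lines: subtract the DTM from the DSM to get the nDSM (object height above ground), then keep elevated, non-vegetated pixels as candidate rooftops. A hedged sketch; the thresholds (2 m height, NDVI 0.3) and the toy scene are illustrative, not the paper's values.

```python
import numpy as np

def rooftop_candidates(dsm, dtm, ndvi, min_height=2.0, ndvi_max=0.3):
    ndsm = dsm - dtm                      # normalized DSM: height above ground
    elevated = ndsm > min_height          # drop low-lying (ground) pixels
    not_tree = ndvi < ndvi_max            # drop pixels likely to be vegetation
    return elevated & not_tree            # initial level-set region

# toy scene: flat terrain with one 6 m building and one 8 m tree
dtm = np.zeros((50, 50))
dsm = np.zeros((50, 50))
dsm[10:20, 10:20] = 6.0                   # building roof
dsm[30:35, 30:35] = 8.0                   # tree canopy
ndvi = np.zeros((50, 50))
ndvi[30:35, 30:35] = 0.7                  # vegetation has high NDVI
mask = rooftop_candidates(dsm, dtm, ndvi)
```
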

  14. Assessing levels of adaptation during implementation of evidence-based interventions: introducing the Rogers-Rütten framework.

    PubMed

    Bowen, Shelly-Ann K; Saunders, Ruth P; Richter, Donna L; Hussey, Jim; Elder, Keith; Lindley, Lisa

    2010-12-01

    Most HIV-prevention funding agencies require the use of evidence-based behavioral interventions, tested and proven to be effective through outcome evaluation. Adaptation of programs during implementation is common and may be influenced by many factors, including agency mission, time constraints, and funding streams. There are few theoretical frameworks to understand how these organizational and program-related factors influence the level of adaptation. This study used constructs from both Rogers's diffusion theory and Rütten's framework for policy analysis to create a conceptual framework that identifies determinants hypothesized to affect the level of adaptation. Preliminary measures of these constructs were also developed. This framework and its measures assess organizational and program-related factors associated with adaptation and could serve as a model to assess implementation and adaptation in fields outside of HIV prevention.

  15. Modelling calving front dynamics using a level-set method: application to Jakobshavn Isbræ, West Greenland

    NASA Astrophysics Data System (ADS)

    Bondzio, Johannes H.; Seroussi, Hélène; Morlighem, Mathieu; Kleiner, Thomas; Rückamp, Martin; Humbert, Angelika; Larour, Eric Y.

    2016-03-01

Calving is a major mechanism of ice discharge of the Antarctic and Greenland ice sheets, and a change in calving front position affects the entire stress regime of marine-terminating glaciers. The representation of calving front dynamics in a 2-D or 3-D ice sheet model remains non-trivial. Here, we present the theoretical and technical framework for a level-set method, an implicit boundary tracking scheme, which we implement into the Ice Sheet System Model (ISSM). This scheme allows us to study the dynamic response of a drainage basin to user-defined calving rates. We apply the method to Jakobshavn Isbræ, a major marine-terminating outlet glacier of the West Greenland Ice Sheet. The model robustly reproduces the high sensitivity of the glacier to calving, and we find that enhanced calving triggers significant acceleration of the ice stream. Upstream acceleration is sustained through a combination of mechanisms. However, both lateral stress and ice influx stabilize the ice stream. This study provides new insights into the ongoing changes occurring at Jakobshavn Isbræ and emphasizes that the incorporation of moving boundaries and dynamic lateral effects, not captured in flow-line models, is key for realistic model projections of sea level rise on centennial timescales.
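
The implicit boundary tracking idea can be illustrated in 1-D: the calving front is the zero contour of a level-set field phi, advected with the net front speed (ice velocity minus a user-defined calving rate). A hedged sketch with a first-order upwind scheme; all numbers are illustrative, not glacier data.

```python
import numpy as np

n, length, years = 400, 100e3, 10.0       # grid points, domain (m), run time
x = np.linspace(0.0, length, n)
dx = x[1] - x[0]
phi = x - 60e3                            # signed distance; front at x = 60 km
v_ice, calving = 5e3, 3e3                 # ice velocity, calving rate (m/yr)
w = v_ice - calving                       # net front speed (> 0 means advance)

dt = 0.4 * dx / abs(w)                    # CFL-limited time step
t = 0.0
while t < years:
    step = min(dt, years - t)
    # upwind advection of the level set for w > 0:
    # d(phi)/dt + w * d(phi)/dx = 0
    phi[1:] -= w * step * (phi[1:] - phi[:-1]) / dx
    t += step

# front position = zero crossing of phi; expected at 60 km + w * years
front = x[np.argmin(np.abs(phi))]
```
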

  16. A new framework for estimating return levels using regional frequency analysis

    NASA Astrophysics Data System (ADS)

    Winter, Hugo; Bernardara, Pietro; Clegg, Georgina

    2017-04-01

    We propose a new framework for incorporating more spatial and temporal information into the estimation of extreme return levels. Currently, most studies use extreme value models applied to data from a single site; an approach which is inefficient statistically and leads to return level estimates that are less physically realistic. We aim to highlight the benefits that could be obtained by using methodology based upon regional frequency analysis as opposed to classic single site extreme value analysis. This motivates a shift in thinking, which permits the evaluation of local and regional effects and makes use of the wide variety of data that are now available on high temporal and spatial resolutions. The recent winter storms over the UK during the winters of 2013-14 and 2015-16, which have caused wide-ranging disruption and damaged important infrastructure, provide the main motivation for the current work. One of the most impactful natural hazards is flooding, which is often initiated by extreme precipitation. In this presentation, we focus on extreme rainfall, but shall discuss other meteorological variables alongside potentially damaging hazard combinations. To understand the risks posed by extreme precipitation, we need reliable statistical models which can be used to estimate quantities such as the T-year return level, i.e. the level which is expected to be exceeded once every T-years. Extreme value theory provides the main collection of statistical models that can be used to estimate the risks posed by extreme precipitation events. Broadly, at a single site, a statistical model is fitted to exceedances of a high threshold and the model is used to extrapolate to levels beyond the range of the observed data. However, when we have data at many sites over a spatial domain, fitting a separate model for each separate site makes little sense and it would be better if we could incorporate all this information to improve the reliability of return level estimates. 
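
The T-year return level mentioned above has a standard closed form in a peaks-over-threshold model, where exceedances of a threshold u follow a generalized Pareto distribution (GPD). A hedged sketch; the rainfall numbers are purely illustrative, not fitted to real data.

```python
import math

def return_level(u, sigma, xi, rate_per_year, T):
    """T-year return level for a peaks-over-threshold model:
    exceedances of threshold u follow GPD(sigma, xi) and occur
    rate_per_year times per year on average. Standard result:
    z_T = u + (sigma / xi) * ((rate_per_year * T)**xi - 1)."""
    m = rate_per_year * T                    # expected exceedances in T years
    if abs(xi) < 1e-9:                       # exponential/Gumbel limit xi -> 0
        return u + sigma * math.log(m)
    return u + (sigma / xi) * (m**xi - 1.0)

# illustrative daily-rainfall figures (mm): threshold 30 mm exceeded
# 5 times/yr on average, GPD scale 8, shape 0.1
z100 = return_level(u=30.0, sigma=8.0, xi=0.1, rate_per_year=5.0, T=100)
```

In a regional frequency analysis, the shape parameter xi (and often the scale) would be pooled across sites in a homogeneous region rather than fitted per site, which is precisely where the stated efficiency gain comes from.
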

  17. Improved inhalation technology for setting safe exposure levels for workplace chemicals

    NASA Technical Reports Server (NTRS)

    Stuart, Bruce O.

    1993-01-01

    Threshold Limit Values recommended as allowable air concentrations of a chemical in the workplace are often based upon a no-observable-effect-level (NOEL) determined by experimental inhalation studies using rodents. A 'safe level' for human exposure must then be estimated by the use of generalized safety factors in attempts to extrapolate from experimental rodents to man. The recent development of chemical-specific physiologically-based toxicokinetics makes use of measured physiological, biochemical, and metabolic parameters to construct a validated model that is able to 'scale-up' rodent response data to predict the behavior of the chemical in man. This procedure is made possible by recent advances in personal computer software and the emergence of appropriate biological data, and provides an analytical tool for much more reliable risk evaluation and airborne chemical exposure level setting for humans.

  18. Conceptualizing and assessing heterosexism in high schools: a setting-level approach.

    PubMed

    Chesir-Teran, Daniel

    2003-06-01

Heterosexism is defined as a setting-level process that systematically privileges heterosexuality relative to homosexuality, based on the assumption that heterosexuality, as well as heterosexual power and privilege, are the norm and the ideal. The many ways heterosexism is manifest in the physical-architectural, program-policy, suprapersonal, and social features of high schools are described, followed by a proposal for a comprehensive assessment strategy. Strategies used in previous research are reviewed in terms of what is assessed, how it is assessed, and how it is analyzed. The author advocates for more comprehensive assessments and for school-level analyses to enable comparisons between schools, facilitate research on the effects of heterosexism, and provide a basis for evaluating interventions. Additional issues include reliability and validity, links between heterosexism and other forms of oppression, heterosexism in other contexts or at other levels, and implications for theory and practice in community psychology.

  19. Statistical criteria to set alarm levels for continuous measurements of ground contamination.

    PubMed

    Brandl, A; Jimenez, A D Herrera

    2008-08-01

In the course of the decommissioning of the ASTRA research reactor at the site of the Austrian Research Centers at Seibersdorf, the operator and licensee, Nuclear Engineering Seibersdorf, conducted an extensive site survey and characterization to demonstrate compliance with regulatory site release criteria. This survey included radiological characterization of approximately 400,000 m² of open land on the Austrian Research Centers premises. Part of this survey was conducted using a mobile large-area gas proportional counter, continuously recording measurements while it was moved at a speed of 0.5 m s⁻¹. In order to set reasonable investigation levels, two alarm levels based on statistical considerations were developed. This paper describes the derivation of these alarm levels and the operational experience gained by detector deployment in the field.

  20. Differential optimal dopamine levels for set-shifting and working memory in Parkinson's disease.

    PubMed

    Fallon, Sean James; Smulders, Katrijn; Esselink, Rianne A; van de Warrenburg, Bart P; Bloem, Bastiaan R; Cools, Roshan

    2015-10-01

Parkinson's disease (PD) is an important model for the role of dopamine in supporting human cognition. However, despite the uniformity of midbrain dopamine depletion, only some patients experience cognitive impairment. The neurocognitive mechanisms of this heterogeneity remain unclear. A genetic polymorphism in the catechol O-methyltransferase (COMT) enzyme, predominantly thought to exert its cognitive effect through acting on prefrontal cortex (PFC) dopamine transmission, provides us with an experimental window onto dopamine's role in cognitive performance in PD. In a large cohort of PD patients (n=372), we examined the association between COMT genotype and performance on two tasks known to implicate prefrontal dopamine (spatial working memory and attentional set-shifting) and on a task less sensitive to prefrontal dopamine (paired associates learning). Consistent with the known neuroanatomical locus of its effects, differences between the COMT genotype groups were observed on the dopamine-dependent tasks, but not the paired associates learning task. However, COMT genotype had differential effects on the two prefrontal dopamine tasks. Putative prefrontal dopamine levels influenced spatial working memory in an 'inverted-U'-shaped fashion, whereas a linear, dose-dependent pattern was observed for attentional set-shifting. Cumulatively, these results revise our understanding of when COMT genotype modulates cognitive functioning in PD patients by showing that the behavioural consequences of genetic variation vary according to task demands, presumably because set-shifting and working memory have different optimal dopamine levels.

  1. Study of Burn Scar Extraction Automatically Based on Level Set Method using Remote Sensing Data

    PubMed Central

    Liu, Yang; Dai, Qin; Liu, JianBo; Liu, ShiBin; Yang, Jin

    2014-01-01

Burn scar extraction using remote sensing data is an efficient way to precisely evaluate burn area and measure vegetation recovery. Traditional burn scar extraction methodologies perform poorly on burn scar images with blurred and irregular edges. To address these issues, this paper proposes an automatic burn scar extraction method based on the Level Set Method (LSM). The method exploits the different features available in remote sensing images and considers the practical need to extract the burn scar rapidly and automatically. The approach integrates Change Vector Analysis (CVA), the Normalized Difference Vegetation Index (NDVI) and the Normalized Burn Ratio (NBR) to obtain a difference image, and modifies the conventional Chan-Vese (C-V) level set model with a new initial curve obtained by applying the K-means method to a binary image of the fitting errors of two near-infrared band images. Landsat 5 TM and Landsat 8 OLI data sets are used to validate the proposed method. Comparisons with the conventional C-V model, the Otsu algorithm, and the Fuzzy C-means (FCM) algorithm show that the proposed approach can extract the outline curve of a fire burn scar effectively and accurately. The method has higher extraction accuracy and lower algorithmic complexity than the conventional C-V model. PMID:24503563

  2. Study of burn scar extraction automatically based on level set method using remote sensing data.

    PubMed

    Liu, Yang; Dai, Qin; Liu, Jianbo; Liu, ShiBin; Yang, Jin

    2014-01-01

Burn scar extraction using remote sensing data is an efficient way to precisely evaluate burn area and measure vegetation recovery. Traditional burn scar extraction methodologies perform poorly on burn scar images with blurred and irregular edges. To address these issues, this paper proposes an automatic burn scar extraction method based on the Level Set Method (LSM). The method exploits the different features available in remote sensing images and considers the practical need to extract the burn scar rapidly and automatically. The approach integrates Change Vector Analysis (CVA), the Normalized Difference Vegetation Index (NDVI) and the Normalized Burn Ratio (NBR) to obtain a difference image, and modifies the conventional Chan-Vese (C-V) level set model with a new initial curve obtained by applying the K-means method to a binary image of the fitting errors of two near-infrared band images. Landsat 5 TM and Landsat 8 OLI data sets are used to validate the proposed method. Comparisons with the conventional C-V model, the Otsu algorithm, and the Fuzzy C-means (FCM) algorithm show that the proposed approach can extract the outline curve of a fire burn scar effectively and accurately. The method has higher extraction accuracy and lower algorithmic complexity than the conventional C-V model.
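
The initialization step above, clustering the difference image into two classes to seed the C-V level set, can be sketched with a small two-cluster K-means on pixel intensities. A hedged sketch of the seeding idea only, not the full pipeline; the toy difference image is synthetic.

```python
import numpy as np

def two_means_mask(diff_img, iters=50):
    """Two-cluster K-means on the pixel intensities of a change/difference
    image; returns a binary mask (True = brighter cluster) suitable as an
    initial curve for Chan-Vese level set evolution."""
    x = diff_img.ravel().astype(float)
    c0, c1 = x.min(), x.max()                      # initial cluster centres
    for _ in range(iters):
        assign = np.abs(x - c1) < np.abs(x - c0)   # True -> brighter cluster
        if assign.all() or (~assign).all():
            break                                  # degenerate split; stop
        c0, c1 = x[~assign].mean(), x[assign].mean()
    return assign.reshape(diff_img.shape)

# toy difference image: bright "burn scar" patch on a noisy background
rng = np.random.default_rng(0)
diff = rng.normal(0.1, 0.05, (40, 40))
diff[10:25, 10:25] += 0.8
seed = two_means_mask(diff)
```
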

  3. Polyp enhancing level set evolution of colon wall: method and pilot study.

    PubMed

    Konukoglu, Ender; Acar, Burak; Paik, David S; Beaulieu, Christopher F; Rosenberg, Jarrett; Napel, Sandy

    2007-12-01

Computer aided detection (CAD) in computed tomography colonography (CTC) aims at detecting colonic polyps that are the precursors of colon cancer. In this work, we propose a colon wall evolution algorithm, polyp enhancing level sets (PELS), based on the level-set formulation, which regularizes and enhances polyps as a preprocessing step for CTC CAD algorithms. The underlying idea is to evolve the polyps towards spherical protrusions on the colon wall while keeping other structures, such as haustral folds, relatively unchanged, thereby potentially improving the performance of CTC CAD algorithms, especially for smaller polyps. To evaluate our method, we conducted a pilot study using an arbitrarily chosen CTC CAD method, the surface normal overlap (SNO) CAD algorithm, on a nine-patient CTC data set with 47 polyps of sizes ranging from 2.0 to 17.0 mm in diameter. PELS increased the maximum sensitivity by 8.1% (from 21/37 to 24/37) for small polyps of sizes ranging from 5.0 to 9.0 mm in diameter. This is accompanied by a statistically significant separation between small polyps and false positives. PELS did not change the CTC CAD performance significantly for larger polyps.

  4. Physical Therapy for Young Children Diagnosed with Autism Spectrum Disorders–Clinical Frameworks Model in an Israeli Setting

    PubMed Central

    Atun-Einy, Osnat; Lotan, Meir; Harel, Yael; Shavit, Efrat; Burstein, Shimshon; Kempner, Gali

    2013-01-01

    Recent research findings suggest that many children with Autism Spectrum Disorders (ASD) demonstrate delayed and atypical motor achievements. It has now become clear that a more holistic, integrative and multi-disciplinary intervention is required to effectively address the motor-related impairments of this population. It is also crucial to ensure that this group of clients has access to early physical therapy (PT) interventions. Despite accumulating research on physical interventions, little is known about intervention models for implementation at a national level. This report introduces a model that uniquely illustrates implementation of PT services for a large number of children with ASD. The model has been operating for the past 2 years in one country (Israel), and includes an optional implementation model of PT practice settings for young children diagnosed with ASD. The Israeli setting offers a unique opportunity for implementing PT services for a multitude of children with ASD on a regular basis as an accepted/needed service. The initial outcomes of the present implementation suggest that an intensive PT intervention program might enhance therapeutic outcomes for this population, and contribute to our knowledge on the potential of PT for individuals with ASD. PMID:24400265

  5. Student Performance-University Preference Model: A Framework for Helping Students Choose the Right A-Level Subjects

    ERIC Educational Resources Information Center

    Wilkins, Stephen; Meeran, Sheik

    2011-01-01

    Every year, many students in the UK fail to achieve a place at their preferred university because they take the wrong A-level subjects. This study aims to suggest a framework for helping students choose the right subjects. Data on student achievement in A-level examinations were obtained from a UK sixth form college over a four-year period.…

  6. Modelling Molecular Mechanisms: A Framework of Scientific Reasoning to Construct Molecular-Level Explanations for Cellular Behaviour

    ERIC Educational Resources Information Center

    van Mil, Marc H. W.; Boerwinkel, Dirk Jan; Waarlo, Arend Jan

    2013-01-01

    Although molecular-level details are part of the upper-secondary biology curriculum in most countries, many studies report that students fail to connect molecular knowledge to phenomena at the level of cells, organs and organisms. Recent studies suggest that students lack a framework to reason about complex systems to make this connection. In this…

  8. Structural topology design of container ship based on knowledge-based engineering and level set method

    NASA Astrophysics Data System (ADS)

    Cui, Jin-ju; Wang, De-yu; Shi, Qi-qi

    2015-06-01

    Knowledge-Based Engineering (KBE) is introduced into ship structural design in this paper. From the implementation of KBE, design solutions for both the Rules Design Method (RDM) and the Interpolation Design Method (IDM) are generated, and the corresponding Finite Element (FE) models are built. Topological design of the longitudinal structures is studied, with a Gaussian Process (GP) employed to build the surrogate model for FE analysis. Multi-objective optimization methods based on the Pareto front are used to reduce tank weight and outer surface area simultaneously. Additionally, an enhanced Level Set Method (LSM) employing an implicit algorithm is applied to the topological design of a typical bracket plate, which is used extensively in ship structures. Two different sets of boundary conditions are considered. The proposed methods show satisfactory efficiency and accuracy.
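
    The Pareto-front selection at the heart of such a multi-objective step reduces to a dominance test over evaluated designs. The sketch below is a generic illustration (not the paper's optimizer), assuming each row holds one design's objective values (e.g. weight and outer surface area), both minimized.

```python
import numpy as np

def pareto_front(costs):
    """Boolean mask of non-dominated designs.  Rows = designs, columns =
    objectives (all minimized).  A design is dominated if some other design
    is <= in every objective and strictly < in at least one."""
    n = costs.shape[0]
    nd = np.ones(n, dtype=bool)
    for i in range(n):
        if not nd[i]:
            continue
        # any design that weakly improves every objective and strictly one?
        dominates = np.all(costs <= costs[i], axis=1) & np.any(costs < costs[i], axis=1)
        if np.any(dominates):
            nd[i] = False
    return nd
```

    The surviving rows form the trade-off front from which a final design would be picked.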

  9. A three-dimensional coupled Nitsche and level set method for electrohydrodynamic potential flows in moving domains

    NASA Astrophysics Data System (ADS)

    Johansson, A.; Garzon, M.; Sethian, J. A.

    2016-03-01

    In this paper we present a new algorithm for computing three-dimensional electrohydrodynamic flow in moving domains which can undergo topological changes. We consider a non-viscous, irrotational, perfect conducting fluid and introduce a way to model the electrically charged flow with an embedded potential approach. To numerically solve the resulting system, we combine a level set method to track both the free boundary and the surface velocity potential with a Nitsche finite element method for solving the Laplace equations. This results in an algorithmic framework that does not require body-conforming meshes, works in three dimensions, and seamlessly tracks topological change. Assembling this coupled system requires care: while convergence and stability properties of Nitsche's methods have been well studied for static problems, they have rarely been considered for moving domains or for obtaining the gradients of the solution on the embedded boundary. We therefore investigate the performance of the symmetric and non-symmetric Nitsche formulations, as well as two different stabilization techniques. The global algorithm and in particular the coupling between the Nitsche solver and the level set method are also analyzed in detail. Finally we present numerical results for several time-dependent problems, each one designed to achieve a specific objective: (a) The oscillation of a perturbed sphere, which is used for convergence studies and the examination of the Nitsche methods; (b) The break-up of a two lobe droplet with axial symmetry, which tests the capability of the algorithm to go past flow singularities such as topological changes and preservation of an axi-symmetric flow, and compares results to previous axi-symmetric calculations; (c) The electrohydrodynamical deformation of a thin film and subsequent jet ejection, which will account for the presence of electrical forces in a non-axi-symmetric geometry.

  10. Large-Eddy Simulation of Premixed and Partially Premixed Turbulent Combustion Using a Level Set Method

    NASA Astrophysics Data System (ADS)

    Duchamp de Lageneste, Laurent; Pitsch, Heinz

    2001-11-01

    Level-set methods (G-equation) have recently been used in the context of RANS to model turbulent premixed (Hermann 2000) or partially premixed (Chen 1999) combustion. By directly taking into account unsteady effects, LES can be expected to improve predictions over RANS. Since the reaction zone thickness of premixed flames in technical devices is usually much smaller than the LES grid spacing, chemical reactions occur entirely on the sub-grid scales and hence have to be fully modeled. In the level-set methodology, the flame front is represented by an arbitrary iso-surface G0 of a scalar field G whose evolution is described by the so-called G-equation. This equation is valid only at G = G0, and is hence decoupled from the other G levels. Heat release is then modeled using a flamelet approach in which temperature is determined as a function of G and the mixture fraction Z. In the present study, the proposed approach has been formulated for LES and validated using data from a turbulent Bunsen burner experiment (Chen, Peters 1996). Simulation of an experimental Lean Premixed Prevapourised (LPP) dump combustor (Besson, Bruel 1999, 2000) under different premixed or partially premixed conditions will also be presented.
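
    For reference, in standard level-set combustion notation the G-equation advects the flame-front iso-surface with the flow and propagates it normal to itself at the burning velocity:

```latex
\frac{\partial G}{\partial t} + \mathbf{v}\cdot\nabla G = s_T\,\lvert\nabla G\rvert ,
\qquad \text{evaluated on } G = G_0 ,
```

    where $\mathbf{v}$ is the flow velocity and $s_T$ the (modeled) turbulent burning velocity; since the equation holds only on $G = G_0$, the other level sets carry no physical meaning.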

  11. Computerized segmentation of liver in hepatic CT and MRI by means of level-set geodesic active contouring.

    PubMed

    Suzuki, Kenji; Huynh, Hieu Trung; Liu, Yipeng; Calabrese, Dominic; Zhou, Karen; Oto, Aytekin; Hori, Masatoshi

    2013-01-01

    Computerized liver volumetry has been studied because the current "gold-standard" manual volumetry is subjective and very time-consuming. Liver volumetry is done in either CT or MRI. A number of researchers have developed computerized liver segmentation in CT, but fewer studies have addressed MRI. Our purpose in this study was to develop a general framework for liver segmentation in both CT and MRI. Our scheme consisted of 1) an anisotropic diffusion filter to reduce noise while preserving liver structures, 2) a scale-specific gradient magnitude filter to enhance liver boundaries, 3) a fast-marching algorithm to roughly determine liver boundaries, and 4) a geodesic-active-contour model coupled with a level-set algorithm to refine the initial boundaries. Our CT database contained hepatic CT scans of 18 liver donors obtained under a liver transplant protocol. Our MRI database contained scans of 23 patients acquired on 1.5T MRI scanners. To establish "gold-standard" liver volumes, radiologists manually traced the contour of the liver on each CT or MR slice. We compared our computer volumetry with "gold-standard" manual volumetry. Computer volumetry in CT and MRI reached excellent agreement with manual volumetry (intra-class correlation coefficient = 0.94 and 0.98, respectively). Average user time for computer volumetry in CT and MRI was 0.57 ± 0.06 and 1.0 ± 0.13 min. per case, respectively, whereas those for manual volumetry were 39.4 ± 5.5 and 24.0 ± 4.4 min. per case, respectively, a statistically significant difference (p < .05). Our computerized liver segmentation framework provides an efficient and accurate way of measuring liver volumes in both CT and MRI.
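
    Step 1 of the pipeline, edge-preserving noise reduction, can be illustrated with a minimal Perona-Malik anisotropic diffusion sketch. The exponential edge-stopping function and the periodic borders (via `np.roll`) are simplifying choices for brevity, not necessarily those of the paper.

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.1, dt=0.2):
    """Perona-Malik anisotropic diffusion: smooth noise while preserving
    strong edges.  kappa sets edge sensitivity, dt the explicit time step
    (dt <= 0.25 keeps the 4-neighbour scheme stable)."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # finite differences to the four neighbours (periodic borders)
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, 1, axis=1) - u
        dw = np.roll(u, -1, axis=1) - u
        # edge-stopping function g = exp(-(|grad|/kappa)^2):
        # near 1 in flat regions (diffuse), near 0 across strong edges (stop)
        cn = np.exp(-(dn / kappa) ** 2)
        cs = np.exp(-(ds / kappa) ** 2)
        ce = np.exp(-(de / kappa) ** 2)
        cw = np.exp(-(dw / kappa) ** 2)
        u += dt * (cn * dn + cs * ds + ce * de + cw * dw)
    return u
```

    On a noisy step image, the flat regions are smoothed while the step itself survives, which is exactly what the boundary-enhancement and fast-marching steps downstream rely on.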

  12. Nurses' comfort level with spiritual assessment: a study among nurses working in diverse healthcare settings.

    PubMed

    Cone, Pamela H; Giske, Tove

    2017-10-01

    To gain knowledge about nurses' comfort level in assessing spiritual matters and to learn what questions nurses use in practice related to spiritual assessment. Spirituality is important in holistic nursing care; however, nurses report feeling uncomfortable and ill-prepared to address this domain with patients. Education is reported to impact nurses' ability to engage in spiritual care. This cross-sectional exploratory survey reports on a mixed-method study examining how comfortable nurses are with spiritual assessment. In 2014, a 21-item survey with 10 demographic variables and three open-ended questions was distributed to Norwegian nurses working in diverse care settings, with 172 nurse responses (72% response rate). SPSS was used to analyse quantitative data; thematic analysis examined the open-ended questions. Norwegian nurses reported a high level of comfort with most questions even though spirituality is seen as private. Nurses with some preparation or experience in spiritual care were most comfortable assessing spirituality. Statistically significant correlations were found between the nurses' comfort level with spiritual assessment and their preparedness and sense of the importance of spiritual assessment. How well-prepared nurses felt was related to years of experience, degree of spirituality and religiosity, and importance of spiritual assessment. Many nurses are poorly prepared for spiritual assessment and care among patients in diverse care settings; educational preparation increases their comfort level with facilitating such care. Nurses who felt well prepared in spirituality were more comfortable with the spiritual domain. By fostering a culture where patients' spirituality is discussed and reflected upon in everyday practice and in continued education, nurses' sense of preparedness, and thus their level of comfort, can increase. Clinical supervision and interprofessional collaboration with hospital chaplains and/or other spiritual leaders can

  13. Method: automatic segmentation of mitochondria utilizing patch classification, contour pair classification, and automatically seeded level sets.

    PubMed

    Giuly, Richard J; Martone, Maryann E; Ellisman, Mark H

    2012-02-09

    While progress has been made to develop automatic segmentation techniques for mitochondria, there remains a need for more accurate and robust techniques to delineate mitochondria in serial blockface scanning electron microscopic data. Previously developed texture based methods are limited for solving this problem because texture alone is often not sufficient to identify mitochondria. This paper presents a new three-step method, the Cytoseg process, for automated segmentation of mitochondria contained in 3D electron microscopic volumes generated through serial block face scanning electron microscopic imaging. The method consists of three steps. The first is a random forest patch classification step operating directly on 2D image patches. The second step consists of contour-pair classification. At the final step, we introduce a method to automatically seed a level set operation with output from previous steps. We report accuracy of the Cytoseg process on three types of tissue and compare it to a previous method based on Radon-Like Features. At step 1, we show that the patch classifier identifies mitochondria texture but creates many false positive pixels. At step 2, our contour processing step produces contours and then filters them with a second classification step, helping to improve overall accuracy. We show that our final level set operation, which is automatically seeded with output from previous steps, helps to smooth the results. Overall, our results show that use of contour pair classification and level set operations improve segmentation accuracy beyond patch classification alone. We show that the Cytoseg process performs well compared to another modern technique based on Radon-Like Features. We demonstrated that texture based methods for mitochondria segmentation can be enhanced with multiple steps that form an image processing pipeline. While we used a random-forest based patch classifier to recognize texture, it would be possible to replace this with

  14. A level set based algorithm to reconstruct the urinary bladder from multiple views.

    PubMed

    Ma, Zhen; Jorge, Renato Natal; Mascarenhas, T; Tavares, João Manuel R S

    2013-12-01

    The urinary bladder can be visualized from different views by imaging facilities such as computerized tomography and magnetic resonance imaging. Multi-view imaging can present more details of this pelvic organ and contribute to a more reliable reconstruction. Based on the information from multi-view planes, a level set based algorithm is proposed to reconstruct the 3D shape of the bladder using the cross-sectional boundaries. The algorithm provides a flexible solution to handle the discrepancies from different view planes and can obtain an accurate bladder surface with more geometric details. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.

  15. Differential graded Lie algebras and singularities of level sets of momentum mappings

    NASA Astrophysics Data System (ADS)

    Goldman, William M.; Millson, John J.

    1990-08-01

    The germ of an analytic variety X at a point x∈ X is said to be quadratic if it is bi-analytically isomorphic to the germ of a cone defined by a system of homogeneous quadratic equations at the origin. Arms, Marsden and Moncrief show in [2] that under certain conditions the analytic germ of a level set of a momentum mapping is quadratic. We discuss related ideas in a more algebraic context by associating to an affine Hamiltonian action a differential graded Lie algebra, which, in the presence of an invariant positive complex structure, is formal in the sense of [5].

  16. "EU-on-Demand": Developing National Qualifications Frameworks in a Multi-Level Context

    ERIC Educational Resources Information Center

    Elken, Mari

    2016-01-01

    The development of comprehensive national qualifications frameworks (NQFs) across Europe has been sparked by the introduction of the European Qualifications Framework (EQF) in 2008. Taking an institutional perspective, this article examines the development of NQFs in three countries, in light of developments that have taken place at the European…

  18. Language Arts Curriculum Framework: Sample Grade Level Benchmarks, Grades 5-8.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock.

    Based on the 1998 Arkansas English Language Arts Frameworks, this framework lists benchmarks for grades five through eight in writing; reading; and listening, speaking, and viewing. The writing section's stated standards are to help students employ a wide range of strategies as they write; use different writing process elements appropriately to…

  19. Analytical Overview of the European and Russian Qualifications Frameworks with a Focus on Doctoral Degree Level

    ERIC Educational Resources Information Center

    Chigisheva, Oksana; Bondarenko, Anna; Soltovets, Elena

    2017-01-01

    The paper provides analytical insights into highly acute issues concerning preparation and adoption of Qualifications Frameworks being an adequate response to the growing interactions at the global labor market and flourishing of knowledge economy. Special attention is paid to the analyses of transnational Meta Qualifications Frameworks (A…

  20. Crack Level Estimation Approach for Planetary Gear Sets Based on Simulation Signal and GRA

    NASA Astrophysics Data System (ADS)

    Cheng, Zhe; Hu, Niaoqing; Zuo, Mingjian; Fan, Bin

    2012-05-01

    The planetary gearbox is a critical mechanism in helicopter transmission systems. Tooth failures in planetary gear sets pose great risk to helicopter operations. A crack level estimation methodology has been devised in this paper by integrating a physical model for simulation signal generation with a grey relational analysis (GRA) algorithm for damage level estimation. The proposed method was calibrated first with fault-seeded test data and then validated with data from other tests on a helicopter transmission test rig. The estimation results on the test data coincide with the actual test records, demonstrating the effectiveness and accuracy of the method and offering a novel way to combine model-based and signal-analysis methods for more accurate health monitoring and condition prediction.
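
    The GRA matching step can be sketched generically: the crack level is estimated as the simulated signature whose grey relational grade to the measured features is highest. The feature values, the per-pair min/max normalization, and the distinguishing coefficient rho = 0.5 below are standard textbook simplifications, not the paper's exact formulation.

```python
import numpy as np

def grey_relational_grade(reference, candidate, rho=0.5):
    """Grey relational grade between a measured feature sequence and a
    simulated one.  rho is the distinguishing coefficient (typically 0.5);
    min/max are taken per pair here for brevity."""
    ref = np.asarray(reference, float)
    cand = np.asarray(candidate, float)
    delta = np.abs(ref - cand)
    dmin, dmax = delta.min(), delta.max()
    coeff = (dmin + rho * dmax) / (delta + rho * dmax + 1e-12)
    return coeff.mean()

def estimate_crack_level(measured, simulated_bank):
    """Pick the simulated crack level whose signal features relate most
    closely (highest grade) to the measurement."""
    grades = {lvl: grey_relational_grade(measured, sig)
              for lvl, sig in simulated_bank.items()}
    return max(grades, key=grades.get), grades
```

    With a bank of model-generated signatures at known crack levels, the level of the best-matching signature is reported as the estimate.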

  1. Vessel Segmentation and Blood Flow Simulation Using Level-Sets and Embedded Boundary Methods

    SciTech Connect

    Deschamps, T; Schwartz, P; Trebotich, D; Colella, P; Saloner, D; Malladi, R

    2004-12-09

    In this article we address the problem of blood flow simulation in realistic vascular objects. The anatomical surfaces are extracted by means of Level-Sets methods that accurately model the complex and varying surfaces of pathological objects such as aneurysms and stenoses. The surfaces obtained are defined at the sub-pixel level where they intersect the Cartesian grid of the image domain. It is therefore straightforward to construct embedded boundary representations of these objects on the same grid, for which recent work has enabled discretization of the Navier-Stokes equations for incompressible fluids. While most classical techniques require construction of a structured mesh that approximates the surface in order to extrapolate a 3D finite-element gridding of the whole volume, our method directly simulates the blood-flow inside the extracted surface without losing any complicated details and without building additional grids.

  2. Information Seen as Part of the Development of Living Intelligence: the Five-Leveled Cybersemiotic Framework for FIS

    NASA Astrophysics Data System (ADS)

    Brier, Soren

    2003-06-01

    It is argued that a true transdisciplinary information science going from physical information to phenomenological understanding needs a metaphysical framework. Three different kinds of causality are implied: efficient, formal and final. And at least five different levels of existence are needed: 1. The quantum vacuum fields with entangled causation. 2. The physical level with is energy and force-based efficient causation. 3. The informational-chemical level with its formal causation based on pattern fitting. 4. The biological-semiotic level with its non-conscious final causation and 5. The social-linguistic level of self-consciousness with its conscious goal-oriented final causation. To integrate these consistently in an evolutionary theory as emergent levels, neither mechanical determinism nor complexity theory are sufficient because they cannot be a foundation for a theory of lived meaning. C. S. Peirce's triadic semiotic philosophy combined with a cybernetic and systemic view, like N. Luhmann's, could create the framework I call Cybersemiotics.

  3. Level set-based core segmentation of mammographic masses facilitating three stage (core, periphery, spiculation) analysis.

    PubMed

    Ball, John E; Bruce, Lori Mann

    2007-01-01

    We present mammographic mass core segmentation, based on the Chan-Vese level set method. The proposed method is analyzed via the efficacies of the resulting features. Additionally, the core segmentation method is used to investigate a three stage segmentation approach, i.e. segment the mass core, periphery, and spiculations (if any exist) and use features from these three segmentations to classify the mass as either benign or malignant. The proposed core segmentation method and a proposed end-to-end computer aided detection (CAD) system using the three stage segmentation are implemented and experimentally tested with a set of 60 mammographic images from the Digital Database for Screening Mammography. Receiver operating characteristic (ROC) curve AZ values for morphological and texture features extracted from the core segmentation are shown to be on par with, or better than, those extracted from a periphery segmentation. The efficacy of the core segmentation features when combined with the periphery and spiculation segmentation features is shown to be feature set dependent. The proposed end-to-end system uses stepwise linear discriminant analysis for feature selection and a maximum likelihood classifier. Using all three stages (core + periphery + spiculations) results in an overall accuracy (OA) of 90% with 2 false negatives (FN). Since many CAD systems perform only a periphery analysis, adding core features could potentially increase OA and reduce FN cases.
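
    The Chan-Vese evolution underlying such a core segmentation can be illustrated with a stripped-down piecewise-constant sketch. The omission of the curvature regularization term and the fixed circular initialization are simplifications relative to the full model:

```python
import numpy as np

def chan_vese_core(img, n_iter=200, dt=0.5, init_radius=None):
    """Minimal piecewise-constant Chan-Vese evolution (no curvature term):
    phi evolves so that the two region means c1 (inside, phi > 0) and
    c2 (outside) best fit the image; the zero level set is the contour."""
    h, w = img.shape
    yy, xx = np.mgrid[:h, :w]
    r = init_radius or min(h, w) / 3
    phi = r - np.hypot(yy - h / 2, xx - w / 2)  # initial circle, > 0 inside
    for _ in range(n_iter):
        inside = phi > 0
        c1 = img[inside].mean() if inside.any() else 0.0
        c2 = img[~inside].mean() if (~inside).any() else 0.0
        # region competition: grow where the pixel fits c1 better than c2
        force = -(img - c1) ** 2 + (img - c2) ** 2
        phi += dt * force
    return phi > 0
```

    On a bright mass core against a darker background, the contour locks onto the core boundary, after which morphological and texture features can be measured on the segmented region.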

  4. Theories of behaviour change synthesised into a set of theoretical groupings: introducing a thematic series on the theoretical domains framework.

    PubMed

    Francis, Jill J; O'Connor, Denise; Curran, Janet

    2012-04-24

    Behaviour change is key to increasing the uptake of evidence into healthcare practice. Designing behaviour-change interventions first requires problem analysis, ideally informed by theory. Yet the large number of partly overlapping theories of behaviour makes it difficult to select the most appropriate theory. The need for an overarching theoretical framework of behaviour change was addressed in research in which 128 explanatory constructs from 33 theories of behaviour were identified and grouped. The resulting Theoretical Domains Framework (TDF) appears to be a helpful basis for investigating implementation problems. Research groups in several countries have conducted TDF-based studies. It seems timely to bring together the experience of these teams in a thematic series to demonstrate further applications and to report key developments. This overview article describes the TDF, provides a brief critique of the framework, and introduces this thematic series. In a brief review to assess the extent of TDF-based research, we identified 133 papers that cite the framework. Of these, 17 used the TDF as the basis for empirical studies to explore health professionals' behaviour. The identified papers provide evidence of the impact of the TDF on implementation research. Two major strengths of the framework are its theoretical coverage and its capacity to elicit beliefs that could signify key mediators of behaviour change. The TDF provides a useful conceptual basis for assessing implementation problems, designing interventions to enhance healthcare practice, and understanding behaviour-change processes. We discuss limitations and research challenges and introduce papers in this series.

  5. Theories of behaviour change synthesised into a set of theoretical groupings: introducing a thematic series on the theoretical domains framework

    PubMed Central

    2012-01-01

    Behaviour change is key to increasing the uptake of evidence into healthcare practice. Designing behaviour-change interventions first requires problem analysis, ideally informed by theory. Yet the large number of partly overlapping theories of behaviour makes it difficult to select the most appropriate theory. The need for an overarching theoretical framework of behaviour change was addressed in research in which 128 explanatory constructs from 33 theories of behaviour were identified and grouped. The resulting Theoretical Domains Framework (TDF) appears to be a helpful basis for investigating implementation problems. Research groups in several countries have conducted TDF-based studies. It seems timely to bring together the experience of these teams in a thematic series to demonstrate further applications and to report key developments. This overview article describes the TDF, provides a brief critique of the framework, and introduces this thematic series. In a brief review to assess the extent of TDF-based research, we identified 133 papers that cite the framework. Of these, 17 used the TDF as the basis for empirical studies to explore health professionals’ behaviour. The identified papers provide evidence of the impact of the TDF on implementation research. Two major strengths of the framework are its theoretical coverage and its capacity to elicit beliefs that could signify key mediators of behaviour change. The TDF provides a useful conceptual basis for assessing implementation problems, designing interventions to enhance healthcare practice, and understanding behaviour-change processes. We discuss limitations and research challenges and introduce papers in this series. PMID:22531601

  6. Variational B-spline level-set: a linear filtering approach for fast deformable model evolution.

    PubMed

    Bernard, Olivier; Friboulet, Denis; Thévenaz, Philippe; Unser, Michael

    2009-06-01

    In the field of image segmentation, most level-set-based active-contour approaches take advantage of a discrete representation of the associated implicit function. We present in this paper a different formulation where the implicit function is modeled as a continuous parametric function expressed on a B-spline basis. Starting from the active-contour energy functional, we show that this formulation allows us to compute the solution as a restriction of the variational problem on the space spanned by the B-splines. As a consequence, the minimization of the functional is directly obtained in terms of the B-spline coefficients. We also show that each step of this minimization may be expressed through a convolution operation. Because the B-spline functions are separable, this convolution may in turn be performed as a sequence of simple 1-D convolutions, which yields an efficient algorithm. As a further consequence, each step of the level-set evolution may be interpreted as a filtering operation with a B-spline kernel. Such filtering induces an intrinsic smoothing in the algorithm, which can be controlled explicitly via the degree and the scale of the chosen B-spline kernel. We illustrate the behavior of this approach on simulated as well as experimental images from various fields.
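
    The separable-filtering idea can be sketched as one 1-D pass per axis. The binomial mask below is a stand-in for a B-spline smoothing kernel (the actual kernel depends on the chosen degree and scale), but the structure is the point: n-D smoothing as a sequence of cheap 1-D convolutions.

```python
import numpy as np

# Binomial mask associated with the cubic B-spline, normalized to sum 1.
CUBIC_BSPLINE = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0

def conv1d_axis(u, kernel, axis):
    """1-D correlation along one axis with edge padding (kernel symmetric,
    so correlation equals convolution here)."""
    pad = len(kernel) // 2
    u = np.moveaxis(u, axis, -1)
    up = np.pad(u, [(0, 0)] * (u.ndim - 1) + [(pad, pad)], mode="edge")
    n = u.shape[-1]
    out = sum(k * up[..., i:i + n] for i, k in enumerate(kernel))
    return np.moveaxis(out, -1, axis)

def bspline_filter(phi):
    """Separable smoothing step: one 1-D pass per axis, equivalent to
    convolving with the full n-D tensor-product kernel."""
    out = phi.astype(float)
    for axis in range(phi.ndim):
        out = conv1d_axis(out, CUBIC_BSPLINE, axis)
    return out
```

    Applied to the level-set function at each iteration, this acts as the intrinsic smoothing the abstract describes, with strength controlled by the kernel.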

  7. A novel approach to segmentation and measurement of medical image using level set methods.

    PubMed

    Chen, Yao-Tien

    2017-06-01

    The study proposes a novel approach for segmentation and visualization plus value-added surface area and volume measurements for brain medical image analysis. The proposed method contains edge detection and Bayesian based level set segmentation, surface and volume rendering, and surface area and volume measurements for 3D objects of interest (i.e., brain tumor, brain tissue, or whole brain). Two extensions based on edge detection and Bayesian level set are first used to segment 3D objects. Ray casting and a modified marching cubes algorithm are then adopted to facilitate volume and surface visualization of medical-image dataset. To provide physicians with more useful information for diagnosis, the surface area and volume of an examined 3D object are calculated by the techniques of linear algebra and surface integration. Experiment results are finally reported in terms of 3D object extraction, surface and volume rendering, and surface area and volume measurements for medical image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.
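
    The surface-area and volume measurements for a closed triangle mesh (such as marching-cubes output) reduce to linear algebra: summed triangle areas, and a divergence-theorem volume. A minimal sketch, assuming a watertight mesh with consistently outward-oriented faces:

```python
import numpy as np

def mesh_area_volume(vertices, triangles):
    """Surface area and enclosed volume of a closed triangle mesh with
    outward-oriented faces.  Area sums the triangle areas |(b-a)x(c-a)|/2;
    volume uses the divergence theorem: V = (1/6) * sum a . (b x c)."""
    v = np.asarray(vertices, float)
    t = np.asarray(triangles, int)
    a, b, c = v[t[:, 0]], v[t[:, 1]], v[t[:, 2]]
    area = 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1).sum()
    volume = np.einsum("ij,ij->", a, np.cross(b, c)) / 6.0
    return area, volume
```

    On the unit tetrahedron this returns an area of 3/2 + sqrt(3)/2 and a volume of 1/6, matching the closed-form values.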

  8. Comparisons and Limitations of Gradient Augmented Level Set and Algebraic Volume of Fluid Methods

    NASA Astrophysics Data System (ADS)

    Anumolu, Lakshman; Ryddner, Douglas; Trujillo, Mario

    2014-11-01

    Recent numerical methods for implicit interface transport are generally presented as enjoying higher-order spatial-temporal convergence compared to classical methods or less sophisticated approaches. However, when applied to test cases designed to simulate practical industrial conditions, a significant reduction in convergence is observed in the higher-order methods, whereas the less sophisticated approaches retain their convergence order but exhibit growth in the error norms. This provides an opportunity to understand the underlying issues that cause this decrease in accuracy in both types of methods. As an example, we consider the Gradient Augmented Level Set method (GALS) and a variant of the Volume of Fluid (VoF) method in our study. Results show that while both methods suffer a loss of accuracy, it is the higher-order method that suffers more. The implication is a significant reduction in the performance advantage of the GALS method over the VoF scheme. Reasons for this lie in the behavior of the higher-order derivatives, particularly in situations where the level set field is highly distorted. For the VoF approach, serious spurious deformations of the interface are observed, albeit with deceptively zero mass loss.

  9. Modelling of two-phase flow in a minichannel using level-set method

    NASA Astrophysics Data System (ADS)

    Grzybowski, H.; Mosdorf, R.

    2014-08-01

    Today there is great interest in micro-scale multiphase fluid flow. In this paper, a numerical simulation of two-phase flow inside a 3 mm minichannel was carried out. The liquid-gas interface was captured using the level-set method. During the calculation, stabilization and reinitialization of the level set function were performed in order to maintain the accuracy of the simulation. Incompressible Navier-Stokes equations were solved using COMSOL Multiphysics® on a two-dimensional mesh. The formation of different two-phase flow patterns in the minichannel has been investigated. Three flow patterns were analysed: bubbly flow and two kinds of slug flow, with short and long slugs. It has been shown that unsteady flow at the inlet of the minichannel is responsible for the chaotic character of the changes in slug and bubble sizes; such unsteady flow modifies the distance between the bubbles and slugs. It has also been shown that for low water inlet velocity the two-phase flow pattern becomes more stable.

  10. A mass conserving level set method for detailed numerical simulation of liquid atomization

    SciTech Connect

    Luo, Kun; Shao, Changxiao; Yang, Yue; Fan, Jianren

    2015-10-01

    An improved mass conserving level set method for detailed numerical simulations of liquid atomization is developed to address the mass loss issue in the existing level set method. This method introduces a mass remedy procedure based on the local curvature at the interface and, in principle, can ensure absolute mass conservation of the liquid phase in the computational domain. Three benchmark cases, including Zalesak's disk, a drop deforming in a vortex field, and a binary drop head-on collision, are simulated to validate the present method, and excellent agreement with exact solutions or experimental results is achieved. It is shown that the present method is able to capture the complex interface with second-order accuracy and negligible additional computational cost. The present method is then applied to study more complex flows, such as a drop impacting on a liquid film and swirling liquid sheet atomization, which again demonstrates the advantages of mass conservation and the capability to represent the interface accurately.
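
The paper's remedy redistributes mass locally according to interface curvature. As a much cruder illustration of the same idea of a mass-correcting step (a hypothetical sketch, not the authors' method), one can shift the level set by a global constant chosen by bisection so that the enclosed area matches a target:

```python
import numpy as np

def area(phi, dx):
    """Area of the liquid region {phi < 0}."""
    return np.count_nonzero(phi < 0) * dx * dx

def mass_fix(phi, dx, target_area):
    """Shift phi by a constant c (found by bisection) so the enclosed
    area matches target_area; area(phi + c) decreases monotonically in c."""
    lo, hi = -1.0, 1.0
    for _ in range(60):
        c = 0.5 * (lo + hi)
        if area(phi + c, dx) > target_area:
            lo = c          # still too much liquid: shift further up
        else:
            hi = c
    return phi + 0.5 * (lo + hi)
```

A global shift cannot preserve local interface shape the way a curvature-based remedy does, which is precisely why the paper's local procedure is the better-behaved correction.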

  11. Robust space-time extraction of ventricular surface evolution using multiphase level sets

    NASA Astrophysics Data System (ADS)

    Drapaca, Corina S.; Cardenas, Valerie; Studholme, Colin

    2004-05-01

    This paper focuses on the problem of accurately extracting the CSF-tissue boundary, particularly around the ventricular surface, from serial structural MRI of the brain acquired in imaging studies of aging and dementia. This is a challenging problem because of the common occurrence of peri-ventricular lesions which locally alter the appearance of white matter. We examine a level set approach which evolves a four-dimensional description of the ventricular surface over time. This has the advantage of allowing constraints on the contour in the temporal dimension, improving the consistency of the extracted object over time. We follow the approach proposed by Chan and Vese, which is based on the Mumford-Shah model and implemented using the Osher-Sethian level set method. We have extended this to the four-dimensional case to propagate a 4D contour toward the tissue boundaries through the evolution of a 5D implicit function. For convergence we use region-based information provided by the image rather than its gradient, adapted to allow intensity contrast changes between time frames in the MRI sequence. Results on time sequences of 3D brain MR images are presented and discussed.
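
The Chan-Vese region term mentioned above drives the contour using the mean intensities of the two regions rather than image gradients. A stripped-down 2D sketch (region force only, no curvature/length term, and generic code rather than the authors' 4D implementation):

```python
import numpy as np

def chan_vese_region(img, phi, iters=100, dt=0.5):
    """Evolve phi by the Chan-Vese region force alone:
    phi_t = (I - c1)**2 - (I - c2)**2, where c1 and c2 are the mean
    intensities of the regions {phi < 0} (inside) and {phi >= 0} (outside).
    A pixel that fits the outside mean better gets pushed outside."""
    for _ in range(iters):
        inside = phi < 0
        c1 = img[inside].mean() if inside.any() else 0.0
        c2 = img[~inside].mean() if (~inside).any() else 0.0
        phi = phi + dt * ((img - c1)**2 - (img - c2)**2)
    return phi
```

Because the force depends on region statistics rather than ∇I, the contour can lock onto boundaries even where edges are weak, which is the property the paper exploits for lesion-corrupted ventricular surfaces.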

  12. Level set immersed boundary method for gas-liquid-solid interactions

    NASA Astrophysics Data System (ADS)

    Wang, Shizhao; Balaras, Elias

    2015-11-01

    We will discuss an approach to simulate the interaction between free surface flows and deformable structures. In our formulation the Navier-Stokes equations are solved on a block-structured grid with adaptive mesh refinement, and the pressure jumps across the interface between different phases, which is tracked using a level set approach, are sharply defined. Deformable structures are simulated with a solid mechanics solver utilizing a finite element method. The overall approach is tailored to problems with large displacements/deformations. The boundary conditions on a solid body are imposed using a direct-forcing, immersed boundary method (Vanella & Balaras, J. Comput. Physics, 228(18), 6617-6628, 2009). The flow and structural solvers are coupled by a predictor-corrector, strong-coupling scheme. Consistency between the Eulerian-field-based level set method for the fluid-fluid interface and the Lagrangian-marker-based immersed boundary method for the fluid-structure interface is ensured by reconstructing the flow field around the three-phase intersections. A variety of 2D and 3D problems, ranging from water impact of wedges to entry and exit of cylinders and flexible plates interacting with a free surface, are presented to demonstrate the accuracy of the proposed approach. Supported by ONR N000141110588, monitored by Dr. Thomas Fu.

  13. Time-optimal path planning in dynamic flows using level set equations: theory and schemes

    NASA Astrophysics Data System (ADS)

    Lolla, Tapovan; Lermusiaux, Pierre F. J.; Ueckermann, Mattheus P.; Haley, Patrick J.

    2014-10-01

    We develop an accurate partial differential equation-based methodology that predicts the time-optimal paths of autonomous vehicles navigating in any continuous, strong, and dynamic ocean currents, obviating the need for heuristics. The goal is to predict a sequence of steering directions so that vehicles can best utilize or avoid currents to minimize their travel time. Inspired by the level set method, we derive and demonstrate that a modified level set equation governs the time-optimal path in any continuous flow. We show that our algorithm is computationally efficient and apply it to a number of experiments. First, we validate our approach through a simple benchmark application in a Rankine vortex flow for which an analytical solution is available. Next, we apply our methodology to more complex, simulated flow fields such as unsteady double-gyre flows driven by wind stress and flows behind a circular island. These examples show that time-optimal paths for multiple vehicles can be planned even in the presence of complex flows in domains with obstacles. Finally, we present and support through illustrations several remarks that describe specific features of our methodology.
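
The modified level set equation referred to above evolves a reachability front with vehicle speed F in the flow u: ∂φ/∂t + F|∇φ| + u·∇φ = 0. A first-order upwind sketch for the still-water case (u = 0, F = 1), where the front must expand at unit speed (minimal hypothetical code, not the authors' scheme):

```python
import numpy as np

def expand_front(phi, dx, t_final):
    """First-order upwind solve of phi_t + |grad phi| = 0 (F = 1, u = 0):
    the zero level set expands outward at unit speed."""
    dt = 0.4 * dx                        # CFL-limited time step
    steps = int(round(t_final / dt))
    for _ in range(steps):
        a = (phi - np.roll(phi, 1, axis=0)) / dx    # backward x difference
        b = (np.roll(phi, -1, axis=0) - phi) / dx   # forward x difference
        c = (phi - np.roll(phi, 1, axis=1)) / dx    # backward y difference
        d = (np.roll(phi, -1, axis=1) - phi) / dx   # forward y difference
        # Godunov upwinding for an outward-moving front
        grad = np.sqrt(np.maximum(np.maximum(a, 0)**2, np.minimum(b, 0)**2)
                       + np.maximum(np.maximum(c, 0)**2, np.minimum(d, 0)**2))
        phi = phi - dt * grad
    return phi
```

In the full method the arrival time of this front at the target gives the optimal travel time, and the optimal path is recovered by backtracking through the stored fronts.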

  14. On the Relationship between Variational Level Set-Based and SOM-Based Active Contours

    PubMed Central

    Abdelsamea, Mohammed M.; Gnecco, Giorgio; Gaber, Mohamed Medhat; Elyan, Eyad

    2015-01-01

    Most Active Contour Models (ACMs) treat the image segmentation problem as a functional optimization problem, as they divide an image into several regions by optimizing a suitable functional. Among ACMs, variational level set methods have been used to build an active contour with the aim of modeling arbitrarily complex shapes. Moreover, they can also handle topological changes of the contours. Self-Organizing Maps (SOMs) have attracted the attention of many computer vision scientists, particularly in modeling an active contour based on the idea of utilizing the prototypes (weights) of a SOM to control the evolution of the contour. SOM-based models have generally been proposed with the aim of exploiting the specific ability of SOMs to learn the edge-map information via their topology preservation property, and of overcoming some drawbacks of other ACMs, such as getting trapped in local minima of the image energy functional to be minimized in such models. In this survey, we illustrate the main concepts of variational level set-based ACMs and SOM-based ACMs and their relationship, and review in a comprehensive fashion the development of their state-of-the-art models from a machine learning perspective, with a focus on their strengths and weaknesses. PMID:25960736

  15. Soybean fruit development and set at the node level under combined photoperiod and radiation conditions.

    PubMed

    Nico, Magalí; Mantese, Anita I; Miralles, Daniel J; Kantolic, Adriana G

    2016-01-01

    In soybean, long days during post-flowering increase seed number. This positive photoperiodic effect on seed number has been previously associated with increments in the amount of radiation accumulated during the crop cycle because long days extend the duration of the crop cycle. However, evidence of intra-nodal processes independent of the availability of assimilates suggests that photoperiodic effects at the node level might also contribute to pod set. This work aims to identify the main mechanisms responsible for the increase in pod number per node in response to long days; including the dynamics of flowering, pod development, growth and set at the node level. Long days increased pods per node on the main stems, by increasing pods on lateral racemes (usually dominated positions) at some main stem nodes. Long days lengthened the flowering period and thereby increased the number of opened flowers on lateral racemes. The flowering period was prolonged under long days because effective seed filling was delayed on primary racemes (dominant positions). Long days also delayed the development of flowers into pods with filling seeds, delaying the initiation of pod elongation without modifying pod elongation rate. The embryo development matched the external pod length irrespective of the pod's chronological age. These results suggest that long days during post-flowering enhance pod number per node through a relief of the competition between pods of different hierarchy within the node. The photoperiodic effect on the development of dominant pods, delaying their elongation and therefore postponing their active growth, extends flowering and allows pod set at positions that are usually dominated.

  16. Soybean fruit development and set at the node level under combined photoperiod and radiation conditions

    PubMed Central

    Nico, Magalí; Mantese, Anita I.; Miralles, Daniel J.; Kantolic, Adriana G.

    2016-01-01

    In soybean, long days during post-flowering increase seed number. This positive photoperiodic effect on seed number has been previously associated with increments in the amount of radiation accumulated during the crop cycle because long days extend the duration of the crop cycle. However, evidence of intra-nodal processes independent of the availability of assimilates suggests that photoperiodic effects at the node level might also contribute to pod set. This work aims to identify the main mechanisms responsible for the increase in pod number per node in response to long days; including the dynamics of flowering, pod development, growth and set at the node level. Long days increased pods per node on the main stems, by increasing pods on lateral racemes (usually dominated positions) at some main stem nodes. Long days lengthened the flowering period and thereby increased the number of opened flowers on lateral racemes. The flowering period was prolonged under long days because effective seed filling was delayed on primary racemes (dominant positions). Long days also delayed the development of flowers into pods with filling seeds, delaying the initiation of pod elongation without modifying pod elongation rate. The embryo development matched the external pod length irrespective of the pod’s chronological age. These results suggest that long days during post-flowering enhance pod number per node through a relief of the competition between pods of different hierarchy within the node. The photoperiodic effect on the development of dominant pods, delaying their elongation and therefore postponing their active growth, extends flowering and allows pod set at positions that are usually dominated. PMID:26512057

  17. Standards-Setting Procedures in Accountability Research: Impacts of Conceptual Frameworks and Mapping Procedures on Passing Rates.

    ERIC Educational Resources Information Center

    Wang, LihShing; Pan, Wei; Austin, James T.

    Standard-setting research has yielded a rich array of more than 50 standard-setting procedures, but practitioners are likely to be confused about which to use. By synthesizing the accumulated research on standard setting and progress monitoring, this study developed a three-dimensional taxonomy for conceptualizing and operationalizing the various…

  18. Critical Factors to Consider in Evaluating Standard-Setting Studies to Map Language Test Scores to Frameworks of Language Proficiency

    ERIC Educational Resources Information Center

    Tannenbaum, Richard J.; Cho, Yeonsuk

    2014-01-01

    In this article, we consolidate and present in one place what is known about quality indicators for setting standards so that stakeholders may be able to recognize the signs of standard-setting quality. We use the context of setting standards to associate English language test scores with language proficiency descriptions such as those presented…

  19. Fully implicit methodology for the dynamics of biomembranes and capillary interfaces by combining the level set and Newton methods

    NASA Astrophysics Data System (ADS)

    Laadhari, Aymen; Saramito, Pierre; Misbah, Chaouqi; Székely, Gábor

    2017-08-01

    This framework is concerned with the numerical modeling of the dynamics of individual biomembranes and capillary interfaces in a surrounding Newtonian fluid. A level set approach helps to follow the interface motion. Our method features the use of high order fully implicit time integration schemes that enable to overcome stability issues related to the explicit discretization of the highly non-linear bending force or capillary force. At each time step, the tangent systems are derived and the resulting nonlinear problems are solved by a Newton-Raphson method. Based on the signed distance assumption, several inexact Newton strategies are employed to solve the capillary and vesicle problems and guarantee the second-order convergence behavior. We address in detail the main features of the proposed method, and we report several experiments in the two-dimensional case with the aim of illustrating its accuracy and efficiency. Comparative investigations with respect to the fully explicit scheme depict the stabilizing effect of the new method, which allows to use significantly larger time step sizes.
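
The stability advantage of fully implicit integration that the abstract describes can be seen even on a scalar stiff model problem. A generic sketch (not the membrane solver): backward Euler for y' = -y³, with each implicit step solved by Newton-Raphson:

```python
def backward_euler_newton(y0, dt, n_steps, tol=1e-12):
    """Integrate y' = -y**3 with backward Euler; each implicit step
    solves g(y) = y + dt*y**3 - y_prev = 0 by Newton-Raphson."""
    y = y0
    for _ in range(n_steps):
        y_prev = y
        for _ in range(50):
            g = y + dt * y**3 - y_prev       # residual of the implicit step
            if abs(g) < tol:
                break
            y -= g / (1.0 + 3.0 * dt * y**2)  # Newton update: y - g/g'
    return y

# Stable even with a time step far beyond the explicit stability limit
print(backward_euler_newton(1.0, 0.5, 20))
```

This is exactly the structure the paper scales up: derive the tangent system of the nonlinear (bending/capillary) force, solve each time step by Newton iterations, and thereby take time steps that would blow up an explicit scheme.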

  20. Robust nuclei segmentation in cyto-histopathological images using statistical level set approach with topology preserving constraint

    NASA Astrophysics Data System (ADS)

    Taheri, Shaghayegh; Fevens, Thomas; Bui, Tien D.

    2017-02-01

    Computerized assessments for diagnosis or malignancy grading of cyto-histopathological specimens have drawn increased attention in the field of digital pathology. Automatic segmentation of cell nuclei is a fundamental step in such automated systems. Despite considerable research, nuclei segmentation is still a challenging task due to noise, nonuniform illumination, and, most importantly in 2D projection images, overlapping and touching nuclei. In most published approaches, nuclei refinement is a post-processing step after segmentation, which usually refers to the task of detaching aggregated nuclei or merging over-segmented nuclei. In this work, we present a novel segmentation technique which effectively addresses the problem of individually segmenting touching or overlapping cell nuclei during the segmentation process. The proposed framework is a region-based segmentation method, which consists of three major modules: i) the image is passed through a color deconvolution step to extract the desired stains; ii) the generalized fast radial symmetry (GFRS) transform is then applied to the image, followed by non-maxima suppression, to specify the initial seed points for nuclei and their corresponding GFRS ellipses, which are interpreted as the initial nuclei borders for segmentation; iii) finally, these initial nuclei border curves are evolved through a statistical level set approach with topology preserving criteria, performing segmentation and separation of nuclei at the same time. The proposed method is evaluated using Hematoxylin and Eosin stained and fluorescent stained images, performing qualitative and quantitative analysis and showing that the method outperforms thresholding and watershed segmentation approaches.
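
The color deconvolution step in module (i) unmixes per-stain concentrations via Beer-Lambert optical densities, OD = -log10(I/I0), and the inverse of a stain color matrix. A generic sketch using the widely cited Ruifrok-Johnston H&E stain vectors (treat the exact vectors as an assumption; the paper's own stain matrix is not given in the abstract):

```python
import numpy as np

# Stain color vectors (rows): hematoxylin, eosin, and a residual channel
# from their cross product (assumed Ruifrok-Johnston values).
H = np.array([0.65, 0.70, 0.29])
E = np.array([0.07, 0.99, 0.11])
R = np.cross(H, E)
M = np.stack([H / np.linalg.norm(H),
              E / np.linalg.norm(E),
              R / np.linalg.norm(R)])          # 3x3 stain matrix

def separate_stains(rgb):
    """Map RGB values in (0, 1] to per-stain optical densities.
    Accepts any array with a trailing dimension of 3."""
    od = -np.log10(np.clip(rgb, 1e-6, 1.0))    # Beer-Lambert optical density
    return od @ np.linalg.inv(M)               # unmix: od = conc @ M
```

With the hematoxylin channel isolated this way, nuclei (which take up hematoxylin) stand out for the seed-detection step that follows.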

  1. Whole abdominal wall segmentation using augmented active shape models (AASM) with multi-atlas label fusion and level set

    NASA Astrophysics Data System (ADS)

    Xu, Zhoubing; Baucom, Rebeccah B.; Abramson, Richard G.; Poulose, Benjamin K.; Landman, Bennett A.

    2016-03-01

    The abdominal wall is an important structure differentiating subcutaneous and visceral compartments and intimately involved with maintaining abdominal structure. Segmentation of the whole abdominal wall on routinely acquired computed tomography (CT) scans remains challenging due to variations and complexities of the wall and surrounding tissues. In this study, we propose a slice-wise augmented active shape model (AASM) approach to robustly segment both the outer and inner surfaces of the abdominal wall. Multi-atlas label fusion (MALF) and level set (LS) techniques are integrated into the traditional ASM framework. The AASM approach globally optimizes the landmark updates in the presence of complicated underlying local anatomical contexts. The proposed approach was validated on 184 axial slices of 20 CT scans. The Hausdorff distance against the manual segmentation was significantly reduced using the proposed approach compared to using ASM, MALF, and LS individually. Our segmentation of the whole abdominal wall enables subcutaneous and visceral fat measurement, with high correlation to the measurement derived from manual segmentation. This study presents the first generic algorithm that combines ASM, MALF, and LS, and demonstrates practical application for automatically capturing visceral and subcutaneous fat volumes.
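
The Hausdorff distance used for validation above is the largest of all nearest-neighbor distances between two contours: it penalizes the single worst disagreement. A small numpy-only sketch over point sets sampled from each boundary:

```python
import numpy as np

def hausdorff(A, B):
    """Symmetric Hausdorff distance between point sets A (m, d) and B (n, d)."""
    D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)  # pairwise distances
    return max(D.min(axis=1).max(),   # farthest A point from its nearest B point
               D.min(axis=0).max())   # farthest B point from its nearest A point

square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
shifted = square + [0.0, 0.25]
print(hausdorff(square, shifted))     # -> 0.25
```

Unlike mean surface distance, this metric drops only when every part of the automated contour tracks the manual one, which is why it is a demanding validation criterion here.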

  2. Whole Abdominal Wall Segmentation using Augmented Active Shape Models (AASM) with Multi-Atlas Label Fusion and Level Set

    PubMed Central

    Xu, Zhoubing; Baucom, Rebeccah B.; Abramson, Richard G.; Poulose, Benjamin K.; Landman, Bennett A.

    2016-01-01

    The abdominal wall is an important structure differentiating subcutaneous and visceral compartments and intimately involved with maintaining abdominal structure. Segmentation of the whole abdominal wall on routinely acquired computed tomography (CT) scans remains challenging due to variations and complexities of the wall and surrounding tissues. In this study, we propose a slice-wise augmented active shape model (AASM) approach to robustly segment both the outer and inner surfaces of the abdominal wall. Multi-atlas label fusion (MALF) and level set (LS) techniques are integrated into the traditional ASM framework. The AASM approach globally optimizes the landmark updates in the presence of complicated underlying local anatomical contexts. The proposed approach was validated on 184 axial slices of 20 CT scans. The Hausdorff distance against the manual segmentation was significantly reduced using the proposed approach compared to using ASM, MALF, and LS individually. Our segmentation of the whole abdominal wall enables subcutaneous and visceral fat measurement, with high correlation to the measurement derived from manual segmentation. This study presents the first generic algorithm that combines ASM, MALF, and LS, and demonstrates practical application for automatically capturing visceral and subcutaneous fat volumes. PMID:27127333

  3. Image Segmentation via Convolution of a Level-Set Function with a Rigaut Kernel*

    PubMed Central

    Subakan, Özlem N.; Vemuri, Baba C.

    2009-01-01

    Image segmentation is a fundamental task in Computer Vision, and there are numerous algorithms that have been successfully applied in various domains. There are still plenty of challenges to be met. In this paper, we consider one such challenge: achieving segmentation while preserving complicated and detailed features present in the image, be it a gray-level or a textured image. We present a novel approach that does not make use of any prior information about the objects in the image being segmented. Segmentation is achieved using local orientation information, which is obtained via the application of a steerable Gabor filter bank, in a statistical framework. This information is used to construct a spatially varying kernel called the Rigaut kernel, which is then convolved with the signed distance function of an evolving contour (placed in the image) to achieve segmentation. We present numerous experimental results on real images, including a quantitative evaluation. Superior performance of our technique is depicted via comparison to the state-of-the-art algorithms in the literature. PMID:19209232

  4. Teachers' Lives in Context: A Framework for Understanding Barriers to High Quality Teaching within Resource Deprived Settings

    ERIC Educational Resources Information Center

    Schwartz, Kate; Cappella, Elise; Aber, J. Lawrence

    2016-01-01

    Within low-income communities in low- and high-resource countries, there is a profound need for more effective schools that are better able to foster child and youth development and support student learning. This paper presents a theoretical framework for understanding the role of teacher ecology in influencing teacher effectiveness and, through…

  5. A Framework for State-Level Renewable Energy Market Potential Studies

    EPA Pesticide Factsheets

    This document provides a framework/next steps for state officials who require estimates of renewable energy market potential, shows how to conduct a market potential study, and distinguishes between goal-oriented studies and other types of studies.

  6. Metal-Organic Frameworks for Resonant-Gravimetric Detection of Trace-Level Xylene Molecules.

    PubMed

    Xu, Tao; Xu, Pengcheng; Zheng, Dan; Yu, Haitao; Li, Xinxin

    2016-12-20

    As a typical VOC, xylene is seriously harmful to human health. However, there is a lack of portable sensing methods for directly detecting environmental xylene, which is chemically inert. Especially when the concentration of xylene is below the human olfactory threshold of 470 ppb, people can hardly become aware of and avoid this harmful vapor. Herein the metal-organic framework (MOF) HKUST-1 is explored for the first time for sensing the nonpolar molecule p-xylene, and the sensing mechanism is identified as a host-guest interaction between the MOF and the xylene molecule. By loading MOFs on mass-sensitive resonant cantilevers, sensing experiments on four MOFs (MOF-5, HKUST-1, ZIF-8, and MOF-177) confirm that HKUST-1 has the highest sensitivity to p-xylene. Resonant-gravimetric sensing experiments with our HKUST-1-based sensors demonstrate that trace-level p-xylene at 400 ppb can be detected, which is below the human olfactory threshold of 470 ppb. We attribute the specificity of HKUST-1 to xylene to Cu(2+)-induced moderate Lewis acidity and the "like dissolves like" interaction of the benzene ring. In situ diffuse reflectance infrared Fourier transform spectroscopy (DRIFTS) is used to elucidate the adsorption/sensing mechanism of HKUST-1 toward p-xylene; a p-xylene-adsorption-induced blue shift is observed that confirms the sensing mechanism. The sensor also shows good selectivity against various kinds of common interfering gases, and the long-term repeatability and stability of the sensing material are confirmed over a usage/storage period of two months. This research shows that MOF materials have potential for high-performance chemical sensor applications.

  7. Oligocene sea-level falls recorded in mid-Pacific atoll and archipelagic apron settings

    NASA Astrophysics Data System (ADS)

    Schlanger, S. O.; Premoli Silva, I.

    1986-05-01

    Drilling results from mid-Pacific atoll and archipelagic apron sites in the Line Islands and Marshall Islands provinces lead to the conclusion that Oligocene sea-level falls detected in Atlantic passive margin sequences are also recorded in a mid-plate tectonic setting in the Pacific Basin. The mid-Pacific sea-level falls are recorded by (a) the presence of distinct, coarse-grained, graded beds of turbidite origin, rich in reef-derived skeletal debris of Oligocene, Eocene, and Cretaceous age, that were redeposited in deep-water archipelagic apron carbonate sequences of middle and late Oligocene age now flanking the atolls, and (b) a marked stratigraphic hiatus and solution unconformity in the subsurface of Enewetak atoll, which dates an Oligocene period of atoll emergence correlative with both the deposition of the turbidites and the coastal offlap events discerned on Atlantic passive margins. Correlation of the subsidence path of Enewetak atoll with the development of the Oligocene solution unconformity shows that at ca. 30 Ma sea level was as much as 100 m lower than at present.

  8. The Use of Sound Level Meter Apps in the Clinical Setting.

    PubMed

    Fava, Gaetano; Oliveira, Gisele; Baglione, Melody; Pimpinella, Michael; Spitzer, Jaclyn B

    2016-02-01

    The purpose of this study was to compare sound level meter (SLM) readings obtained using a Larson-Davis (Depew, NY) Model 831 Type 1 SLM, a RadioShack (Fort Worth, TX) SLM, and iPhone 5 (Apple, Cupertino, CA) SLM apps. In Procedure 1, pure tones were measured in an anechoic chamber (125, 250, 500, 1000, 2000, 4000, and 8000 Hz); sound pressure levels (SPLs) ranged from 60 to 100 dB SPL in 10-dB increments. In Procedure 2, human voices were measured. Participants were 20 vocally healthy adults (7 women, 13 men; mean age = 25.1 years). The task was to sustain the vowel "ah" at 3 intensity levels: soft, habitual, and loud. Microphones were lined up at equal distances from the participant's mouth, and recordings were captured simultaneously. Overall, the 3 SLM apps and the RadioShack SLM yielded inconsistent readings compared with the Type 1 SLM. The use of apps for SPL readings in the clinical setting is premature because all 3 apps adopted were incomparable with the Type 1 SLM.
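
An app-versus-meter comparison like the one above rests on the SPL definition itself: SPL = 20·log10(p_rms / p_ref) with p_ref = 20 µPa in air. A quick sketch of the core computation an SLM performs on a captured waveform (unweighted; real meters additionally apply A/C frequency weighting and fast/slow time averaging, which is one place app readings can diverge):

```python
import numpy as np

P_REF = 20e-6   # 20 micropascals, the standard reference pressure in air

def spl_db(pressure_samples):
    """Unweighted sound pressure level of a pressure waveform given in Pa."""
    p_rms = np.sqrt(np.mean(np.square(pressure_samples)))
    return 20.0 * np.log10(p_rms / P_REF)

# A 1 kHz tone with 1 Pa amplitude: p_rms = 1/sqrt(2) Pa, about 94 dB SPL minus 3 dB
t = np.arange(0, 1.0, 1 / 48000.0)
tone = 1.0 * np.sin(2 * np.pi * 1000 * t)
print(round(spl_db(tone), 2))   # -> 90.97
```

Uncalibrated phone microphones have an unknown Pa-per-count scale factor, so even a correct implementation of this formula yields readings offset from a calibrated Type 1 meter.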

  9. A review of the use of human factors classification frameworks that identify causal factors for adverse events in the hospital setting.

    PubMed

    Mitchell, R J; Williamson, A M; Molesworth, B; Chung, A Z Q

    2014-01-01

    Various human factors classification frameworks have been used to identify causal factors for clinical adverse events. A systematic review was conducted to identify human factors classification frameworks that identified the causal factors (including human error) of adverse events in a hospital setting. Six electronic databases were searched, identifying 1997 articles, 38 of which met the inclusion criteria. Most studies included causal contributing factors as well as error and error type, but the nature of coding varied considerably between studies. The ability of human factors classification frameworks to provide information on specific causal factors for an adverse event enables preventive attention to be focused on the areas where improvements are most needed. This review highlighted some areas needing considerable improvement in order to meet this need, including better definition of terms, more emphasis on assessing the reliability of coding, and greater sophistication in the analysis of classification results. Practitioner Summary: Human factors classification frameworks can be used to identify causal factors of clinical adverse events. However, this review suggests that existing frameworks are diverse, limited in their identification of the context of human error, and have poor reliability when used by different individuals.

  10. High-Order Discontinuous Galerkin Level Set Method for Interface Tracking and Re-Distancing on Unstructured Meshes

    NASA Astrophysics Data System (ADS)

    Greene, Patrick; Nourgaliev, Robert; Schofield, Sam

    2015-11-01

    A new sharp high-order interface tracking method for multi-material flow problems on unstructured meshes is presented. The method combines a marker-tracking algorithm with a discontinuous Galerkin (DG) level set method to implicitly track interfaces. DG projection is used to provide a mapping from the Lagrangian marker field to the Eulerian level set field. For the level set re-distancing, we developed a novel marching method that takes advantage of the unique features of the DG representation of the level set. The method efficiently marches outward from the zero level set, with values in new cells computed solely from cell neighbors. Results are presented for a number of different interface geometries, including ones with sharp corners and multiple hierarchical level sets. The method can robustly handle level set discontinuities without explicit use of solution limiters. Results show that the expected high order (3rd and higher) of convergence for the DG representation of the level set is obtained for smooth solutions on unstructured meshes. High-order re-distancing on irregular meshes is a must for applications where the interfacial curvature is important for the underlying physics, such as surface tension, wetting, and detonation shock dynamics. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. Information management release number LLNL-ABS-675636.
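
Re-distancing replaces the level set field with the true distance to its zero contour while keeping the sign. As a brute-force reference against which marching methods like the one above can be checked (hypothetical code, not the DG algorithm), one can sample interface points from zero crossings along grid lines and take exact minimum distances:

```python
import numpy as np

def redistance_bruteforce(phi, x, y):
    """Re-distance phi on a structured grid: locate zero crossings along
    grid lines by linear interpolation, then take the minimum distance
    from every node to the sampled interface points, keeping the sign."""
    pts = []
    nx, ny = phi.shape
    for i in range(nx):               # crossings along the y direction
        for j in range(ny - 1):
            a, b = phi[i, j], phi[i, j + 1]
            if a * b < 0:
                t = a / (a - b)       # linear interpolation of the zero
                pts.append((x[i], y[j] + t * (y[j + 1] - y[j])))
    for j in range(ny):               # crossings along the x direction
        for i in range(nx - 1):
            a, b = phi[i, j], phi[i + 1, j]
            if a * b < 0:
                t = a / (a - b)
                pts.append((x[i] + t * (x[i + 1] - x[i]), y[j]))
    P = np.array(pts)
    X, Y = np.meshgrid(x, y, indexing='ij')
    d = np.sqrt((X[..., None] - P[:, 0])**2
                + (Y[..., None] - P[:, 1])**2).min(axis=-1)
    return np.sign(phi) * d
```

This is O(nodes × interface points) and first-order accurate, which is exactly why fast marching schemes, and high-order DG variants like the one above, are worth the added machinery.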

  11. The Existence of Alternative Framework in Students' Scientific Imagination on the Concept of Matter at Submicroscopic Level: Macro Imagination

    ERIC Educational Resources Information Center

    Abdullah, Nurdiana; Surif, Johari

    2015-01-01

    This study is conducted with the purpose of identifying the alternative framework contained in students' imagination on the concept of matter at the submicroscopic level. Using a purposive sampling design, a total of 15 students are interviewed to obtain the data. Data from document analysis are used to corroborate the interviews.…

  12. How Multi-Levels of Individual and Team Learning Interact in a Public Healthcare Organisation: A Conceptual Framework

    ERIC Educational Resources Information Center

    Doyle, Louise; Kelliher, Felicity; Harrington, Denis

    2016-01-01

    The aim of this paper is to review the relevant literature on organisational learning and offer a preliminary conceptual framework as a basis to explore how the multi-levels of individual learning and team learning interact in a public healthcare organisation. The organisational learning literature highlights a need for further understanding of…

  13. Expected frontiers: Incorporating weather uncertainty into a policy analysis using an integrated bi-level multi-objective optimization framework

    EPA Science Inventory

    Weather is the main driver in both plant use of nutrients and fate and transport of nutrients in the environment. In previous work, we evaluated a green tax for control of agricultural nutrients in a bi-level optimization framework that linked deterministic models. In this study,...

  15. A Generic System-Level Framework for Self-Serve Health Monitoring System through Internet of Things (IoT).

    PubMed

    Ahmed, Mobyen Uddin; Björkman, Mats; Lindén, Maria

    2015-01-01

    Sensor data travel from sensors to a remote server, are analyzed remotely in a distributed manner, and the health status of a user is presented in real time. This paper presents a generic system-level framework for a self-served health monitoring system through the Internet of Things (IoT) to facilitate efficient sensor data management.

  16. A General Framework for Power Analysis to Detect the Moderator Effects in Two- and Three-Level Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Dong, Nianbo; Spybrook, Jessaca; Kelcey, Ben

    2016-01-01

    The purpose of this study is to propose a general framework for power analyses to detect the moderator effects in two- and three-level cluster randomized trials (CRTs). The study specifically aims to: (1) develop the statistical formulations for calculating statistical power, minimum detectable effect size (MDES) and its confidence interval to…

  17. Expected frontiers: Incorporating weather uncertainty into a policy analysis using an integrated bi-level multi-objective optimization framework

    EPA Science Inventory

    Weather is the main driver in both plant use of nutrients and fate and transport of nutrients in the environment. In previous work, we evaluated a green tax for control of agricultural nutrients in a bi-level optimization framework that linked deterministic models. In this study,...

  18. A fully automated approach to prostate biopsy segmentation based on level-set and mean filtering

    PubMed Central

    Vidal, Juan; Bueno, Gloria; Galeotti, John; García-Rojo, Marcial; Relea, Fernanda; Déniz, Oscar

    2011-01-01

    With modern automated microscopes and digital cameras, pathologists no longer have to examine samples through microscope binoculars. Instead, the slide is digitized to an image, which can then be examined on a screen. This creates the possibility for computers to analyze the image. In this work, a fully automated approach to region of interest (ROI) segmentation in prostate biopsy images is proposed, allowing pathologists to focus on the most important areas of the image. The proposed method is based on level-set and mean-filtering techniques for lumen-centered expansion and cell-density localization, respectively. The novelty of the technique lies in its ability to detect complete ROIs, where an ROI comprises the conjunction of three structures (lumen, cytoplasm, and cells), as well as regions with a high density of cells. The method is capable of dealing with full biopsies digitized at different magnifications. In this paper, results are shown for a set of 100 H&E slides, digitized at 5×, and ranging from 12 MB to 500 MB. The tests carried out show an average specificity above 99% across the board and average sensitivities of 95% and 80%, respectively, for the lumen-centered expansion and cell-density localization. The algorithms were also tested with images at 10× magnification (up to 1228 MB), obtaining similar results. PMID:22811961

  19. Setting ozone critical levels for protecting horticultural Mediterranean crops: case study of tomato.

    PubMed

    González-Fernández, I; Calvo, E; Gerosa, G; Bermejo, V; Marzuoli, R; Calatayud, V; Alonso, R

    2014-02-01

    Seven experiments carried out in Italy and Spain have been used to parameterise a stomatal conductance model and to establish exposure- and dose-response relationships for the yield and quality of tomato, with the main goal of setting O3 critical levels (CLe). CLe, with confidence intervals in brackets, were set at an accumulated hourly O3 exposure over 40 nl l(-1), AOT40 = 8.4 (1.2, 15.6) ppm h, and a phytotoxic ozone dose above a threshold of 6 nmol m(-2) s(-1), POD6 = 2.7 (0.8, 4.6) mmol m(-2), for yield, and AOT40 = 18.7 (8.5, 28.8) ppm h and POD6 = 4.1 (2.0, 6.2) mmol m(-2) for quality, both indices performing equally well. CLe confidence intervals provide information on the quality of the dataset and should be included in future calculations of O3 CLe for improving current methodologies. These CLe, derived for sensitive tomato cultivars, should not be applied for quantifying O3-induced losses, at the risk of making important overestimations of the economic losses associated with O3 pollution.
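
    The AOT40 index used above has a simple operational definition: hourly mean O3 concentrations in excess of the 40 nl l(-1) (ppb) threshold are summed over the relevant (conventionally daylight) hours. A minimal sketch, with hypothetical function and variable names:

```python
def aot40_ppm_h(hourly_o3_ppb, daylight_flags):
    """AOT40: Accumulated exposure Over a Threshold of 40 ppb, in ppm h.

    hourly_o3_ppb  -- hourly mean O3 concentrations in ppb (nl l-1)
    daylight_flags -- True where the hour counts towards the index
    """
    excess_ppb_h = sum(
        max(c - 40.0, 0.0)                      # only the excess over 40 ppb
        for c, day in zip(hourly_o3_ppb, daylight_flags)
        if day                                  # night hours are excluded
    )
    return excess_ppb_h / 1000.0                # ppb h -> ppm h
```

    Dose-based indices such as POD6 are analogous but accumulate stomatal ozone flux above a flux threshold rather than ambient concentration above a concentration threshold.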

  20. A GPU-accelerated adaptive discontinuous Galerkin method for level set equation

    NASA Astrophysics Data System (ADS)

    Karakus, A.; Warburton, T.; Aksel, M. H.; Sert, C.

    2016-01-01

    This paper presents a GPU-accelerated nodal discontinuous Galerkin method for the solution of the two- and three-dimensional level set (LS) equation on unstructured adaptive meshes. Using adaptive mesh refinement, computations are localised mostly near the interface location to reduce the computational cost. The small global time step that local refinement would otherwise impose is avoided by local time-stepping based on a multi-rate Adams-Bashforth scheme. Platform independence of the solver is achieved with an extensible multi-threading programming API that allows runtime selection of different computing devices (GPU and CPU) and different threading interfaces (CUDA, OpenCL and OpenMP). Overall, a highly scalable, accurate and mass conservative numerical scheme that preserves the simplicity of LS formulation is obtained. Efficiency, performance and local high-order accuracy of the method are demonstrated through distinct numerical test cases.
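
    The multi-rate time-stepping mentioned above builds on the standard second-order Adams-Bashforth (AB2) integrator, which reuses the previous right-hand-side evaluation. A single-rate AB2 sketch is shown below (this is only the building block; the paper's multi-rate variant, not shown, advances refined regions with smaller local steps):

```python
import math

def ab2(f, y0, t0, dt, steps):
    """Second-order Adams-Bashforth:
        y_{n+1} = y_n + dt * (3/2 f_n - 1/2 f_{n-1}).
    The first step is bootstrapped with forward Euler."""
    t, y = t0, y0
    f_prev = f(t, y)
    y = y + dt * f_prev                 # Euler bootstrap for step 1
    t += dt
    for _ in range(steps - 1):
        f_curr = f(t, y)
        y = y + dt * (1.5 * f_curr - 0.5 * f_prev)
        f_prev = f_curr
        t += dt
    return y

# Decay test problem y' = -y, y(0) = 1; exact solution is exp(-t).
y_num = ab2(lambda t, y: -y, 1.0, 0.0, dt=0.01, steps=100)
```

    Because AB2 only needs one new right-hand-side evaluation per step, it pairs naturally with schemes where evaluations (here, DG residuals) are expensive.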

  1. Numerical Schemes for the Hamilton-Jacobi and Level Set Equations on Triangulated Domains

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.; Sethian, James A.

    1997-01-01

    Borrowing from techniques developed for conservation law equations, numerical schemes which discretize the Hamilton-Jacobi (H-J), level set, and Eikonal equations on triangulated domains are presented. The first scheme is a provably monotone discretization for certain forms of the H-J equations. Unfortunately, the basic scheme lacks proper Lipschitz continuity of the numerical Hamiltonian. By employing a virtual edge flipping technique, Lipschitz continuity of the numerical flux is restored on acute triangulations. Next, schemes are introduced and developed based on the weaker concept of positive coefficient approximations for homogeneous Hamiltonians. These schemes possess a discrete maximum principle on arbitrary triangulations and naturally exhibit proper Lipschitz continuity of the numerical Hamiltonian. Finally, a class of Petrov-Galerkin approximations are considered. These schemes are stabilized via a least-squares bilinear form. The Petrov-Galerkin schemes do not possess a discrete maximum principle but generalize to high order accuracy.
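
    For reference, the three equation families named in this record are closely related; in standard notation (a summary, not taken from the paper itself):

```latex
% General Hamilton-Jacobi equation for the scalar field \phi(x,t)
\partial_t \phi + H(x, \nabla\phi) = 0
% Level set equation: interface moving with normal speed F
\partial_t \phi + F \, \lvert \nabla\phi \rvert = 0
% Eikonal equation: arrival time T of a front with speed F > 0
F \, \lvert \nabla T \rvert = 1
```

    The level set equation is the special case $H(\nabla\phi) = F\,\lvert\nabla\phi\rvert$, and the Eikonal equation is its stationary counterpart, which is why schemes for one transfer to the others.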

  2. Initialisation of 3D level set for hippocampus segmentation from volumetric brain MR images

    NASA Astrophysics Data System (ADS)

    Hajiesmaeili, Maryam; Dehmeshki, Jamshid; Bagheri Nakhjavanlo, Bashir; Ellis, Tim

    2014-04-01

    Shrinkage of the hippocampus is a primary biomarker for Alzheimer's disease and can be measured through accurate segmentation of brain MR images. The paper describes the problem of initialising a 3D level set algorithm for hippocampus segmentation, which must cope with some challenging characteristics of the structure, such as its small size, wide range of intensities, narrow width, and shape variation. In addition, MR images require bias correction, to account for additional inhomogeneity associated with the scanner technology. Due to these inhomogeneities, using a single initialisation seed region inside the hippocampus is prone to failure. Alternative initialisation strategies are explored, such as using multiple initialisations in different sections (the head, body and tail) of the hippocampus. The Dice metric is used to validate the segmentation results with respect to ground truth for a dataset of 25 MR images. Experimental results indicate significant improvement in segmentation performance using the multiple-initialisation technique, yielding more accurate segmentation results for the hippocampus.
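
    The Dice metric used for validation here is straightforward to compute from two binary masks; a minimal sketch (function and argument names are illustrative):

```python
def dice_coefficient(seg, truth):
    """Dice similarity between two binary voxel masks, given as
    flat iterables of 0/1 values: 2|A ∩ B| / (|A| + |B|)."""
    seg, truth = list(seg), list(truth)
    intersection = sum(1 for s, t in zip(seg, truth) if s and t)
    total = sum(seg) + sum(truth)
    # Convention: two empty masks are treated as a perfect match.
    return 2.0 * intersection / total if total else 1.0
```

    A Dice value of 1.0 means perfect overlap with the manual ground truth, 0.0 means no overlap at all.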

  3. Breast cancer diagnosis using level-set statistics and support vector machines.

    PubMed

    Liu, Jianguo; Yuan, Xiaohui; Buckles, Bill P

    2008-01-01

    Breast cancer diagnosis based on microscopic biopsy images and machine learning has demonstrated great promise in the past two decades. Various feature selection (or extraction) and classification algorithms have been attempted with success. However, some feature selection processes are complex and the number of features used can be quite large. We propose a new feature selection method based on level-set statistics. This procedure is simple and, when used with support vector machines (SVM), only a small number of features is needed to achieve satisfactory accuracy, comparable to that obtained with more sophisticated features. Therefore, the classification can be completed in a much shorter time. We use multi-class support vector machines as the classification tool. Numerical results are reported to support the viability of this new procedure.

  4. Patient doses in paediatric CT: feasibility of setting diagnostic reference levels.

    PubMed

    Järvinen, H; Merimaa, K; Seuri, R; Tyrväinen, E; Perhomaa, M; Savikurki-Heikkilä, P; Svedström, E; Ziliukas, J; Lintrop, M

    2011-09-01

    Despite the fact that doses to paediatric patients from computed tomography (CT) examinations are of special concern, few data or studies on the setting of paediatric diagnostic reference levels (DRLs) have been published. In this study, doses to children were estimated from chest and head CT, in order to study the feasibility of DRLs for these examinations. It is shown that, for the DRLs, patient dose data from different CT scanners should be collected in age or weight groups, possibly for different indications. For practical reasons, the DRLs for paediatric chest CT should be given as a continuous DRL curve as a function of patient weight. For paediatric head CT, DRLs for a few age groups could be given. The users of the DRLs should be aware of the calibration phantom applied in the console calibration for different paediatric scanning protocols. The feasibility of DRLs should be re-evaluated every 2-3 y.

  5. Wave breaking over sloping beaches using a coupled boundary integral-level set method

    SciTech Connect

    Garzon, M.; Adalsteinsson, D.; Gray, L.; Sethian, J.A.

    2003-12-08

    We present a numerical method for tracking breaking waves over sloping beaches. We use a fully non-linear potential model for incompressible, irrotational and inviscid flow, and consider the effects of beach topography on breaking waves. The algorithm uses a Boundary Element Method (BEM) to compute the velocity at the interface, coupled to a Narrow Band Level Set Method to track the evolving air/water interface, and an associated extension equation to update the velocity potential both on and off the interface. The formulation of the algorithm is applicable to two and three dimensional breaking waves; in this paper, we concentrate on two-dimensional results showing wave breaking and rollup, and perform numerical convergence studies and comparison with previous techniques.

  6. Application of level-set method for deposition in three-dimensional reconstructed porous media.

    PubMed

    Vu, M T; Adler, P M

    2014-05-01

    Three-dimensional (3D) porous structures which are usually discretized by voxels can also be discretized by the level-set method (LSM) and flow, reactive transport, and structure evolution can be modeled. The determination of the solid-liquid interface is detailed as well as the discretization of the governing equations. Comparisons between a one-dimensional analytical solution and LSM are conducted for validation. Deposition in 3D reconstructed media is studied under various flow and reaction conditions. The evolution of the structure is explored locally by means of the pore geometry and globally by means of the permeability and the porosity. The difference between the voxel method and LSM is discussed during the investigations.

  7. Numerical simulation of overflow at vertical weirs using a hybrid level set/VOF method

    NASA Astrophysics Data System (ADS)

    Lv, Xin; Zou, Qingping; Reeve, Dominic

    2011-10-01

    This paper presents the application of a newly developed free-surface flow model to the practical, yet challenging, problem of overflow at weirs. Since the model takes advantage of the strengths of both the level set and volume of fluid methods and solves the Navier-Stokes equations on an unstructured mesh, it is capable of resolving the time evolution of very complex vortical motions, air entrainment and pressure variations due to violent deformations following overflow of the weir crest. In the present study, two different types of vertical weir, namely broad-crested and sharp-crested, are considered for validation purposes. The calculated overflow parameters such as pressure head distributions, velocity distributions, and water surface profiles are compared against experimental data as well as numerical results available in the literature. A very good quantitative agreement has been obtained. The numerical model thus offers a good alternative to traditional experimental methods in the study of weir problems.

  8. QUANTITATIVE CELL MOTILITY FOR IN VITRO WOUND HEALING USING LEVEL SET-BASED ACTIVE CONTOUR TRACKING.

    PubMed

    Bunyak, Filiz; Palaniappan, Kannappan; Nath, Sumit K; Baskin, Tobias I; Dong, Gang

    2006-04-06

    Quantifying the behavior of cells individually, and in clusters as part of a population, under a range of experimental conditions, is a challenging computational task with many biological applications. We propose a versatile algorithm for segmentation and tracking of multiple motile epithelial cells during wound healing using time-lapse video. The segmentation part of the proposed method relies on a level set-based active contour algorithm that robustly handles a large number of cells. The tracking part relies on a detection-based multiple-object tracking method with delayed decision enabled by multi-hypothesis testing. The combined method is robust to complex cell behavior including division and apoptosis, and to imaging artifacts such as illumination changes.

  9. Line-of-sight-attenuation chemical species tomography through the level set method

    NASA Astrophysics Data System (ADS)

    Twynstra, Matthew G.; Daun, Kyle J.; Waslander, Steven L.

    2014-08-01

    Chemical species tomography based on line-of-sight attenuation (LOSA-CST) is an emerging diagnostic for mapping the concentration of a gaseous species. Since laser absorption measurements alone are insufficient to specify a unique species concentration distribution, reconstruction algorithms must incorporate additional information that promotes presumed physical attributes of the distribution. This paper pioneers the application of the level set method to LOSA-CST. The species concentration distribution is initially represented by a signed distance function, which is progressively deformed by forces that scale with the difference between the measured and simulated absorption data, as well as deviation from spatial smoothness. The final distribution explains the LOSA data and is also qualitatively consistent with mixed advection/diffusion transport physics. The algorithm is demonstrated by solving a simulated laser tomography experiment on a turbulent methane plume.
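
    As a concrete illustration of the signed distance representation described above, the level set function for a simple circular region can be sampled on a grid; the zero level set marks the interface. A toy sketch (names and the circular shape are hypothetical), using the common convention of negative values inside the region:

```python
import math

def signed_distance_circle(x, y, cx=0.0, cy=0.0, r=1.0):
    """Signed distance to a circle of radius r centred at (cx, cy):
    negative inside, zero on the interface, positive outside."""
    return math.hypot(x - cx, y - cy) - r

# Sample the level set function on a small 9x9 grid with spacing 0.5;
# cells where the value changes sign straddle the assumed boundary.
grid = [[signed_distance_circle(ix * 0.5, iy * 0.5)
         for ix in range(-4, 5)] for iy in range(-4, 5)]
```

    In the reconstruction algorithm of the paper, it is this function that is deformed by data-misfit and smoothness forces until its zero level set matches the measured absorption data.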

  10. On the Geometry of the Level Sets of Bounded Static Potentials

    NASA Astrophysics Data System (ADS)

    Agostiniani, Virginia; Mazzieri, Lorenzo

    2017-10-01

    In this paper we present a new approach to the study of asymptotically flat static metrics arising in general relativity. In the case where the static potential is bounded, we introduce new quantities which are proven to be monotone along the level set flow of the potential function. We then show how to use these properties to detect the rotational symmetry of the static solutions, deriving a number of sharp inequalities. Among these, we prove the validity—without any dimensional restriction—of the Riemannian Penrose Inequality, as well as of a reversed version of it, in the class of asymptotically flat static metrics with connected horizon. As a consequence of our analysis, a simple proof of the classical 3-dimensional Black Hole Uniqueness Theorem is recovered and some geometric conditions are discussed under which the same statement holds in higher dimensions.

  11. Defining obesity: second-level agenda setting attributes in black newspapers and general audience newspapers.

    PubMed

    Lee, Hyunmin; Len-Ríos, María E

    2014-01-01

    This content analysis study examines how obesity is depicted in general-audience and Black newspaper stories (N=391) through the lens of second-level agenda setting theory. The results reveal that both Black newspapers and general-audience newspapers generally ascribe individual causes for obesity. While both types of newspapers largely neglected to mention solutions for the problem, Black newspapers were more likely than general-audience newspapers to suggest both individual and societal solutions for treating obesity. For Black newspapers, these solutions more often included community interventions. In addition, Black newspapers more often used a negative tone in stories and more frequently mentioned ethnic and racial minorities as at-risk groups.

  12. Blood pool agent contrast-enhanced MRA: level-set-based artery-vein separation

    NASA Astrophysics Data System (ADS)

    van Bemmel, Cornelis M.; Spreeuwers, Luuk J.; Verdonck, Bert; Viergever, Max A.; Niessen, Wiro J.

    2002-05-01

    Blood pool agents (BPAs) for contrast-enhanced magnetic resonance angiography (CE-MRA) allow prolonged imaging times for higher contrast and resolution by imaging during the steady state, when the contrast agent is distributed through the complete vascular system. However, simultaneous venous and arterial enhancement hampers interpretation. It is shown that arterial and venous segmentation in this equilibrium phase can be achieved if the central arterial axis (CAA) and central venous axis (CVA) are known. Since the CAA cannot straightforwardly be obtained from the steady-state data, images acquired during the first pass of the contrast agent can be utilized to determine the CAA with minimal user initialization. Utilizing the CAA to provide a rough arterial segmentation, the CVA can subsequently be determined from the steady-state dataset. The final segmentations of the arteries and veins are achieved by simultaneously evolving two level sets in the steady-state dataset, starting from the CAA and CVA.

  13. Large deformation solid-fluid interaction via a level set approach.

    SciTech Connect

    Schunk, Peter Randall; Noble, David R.; Baer, Thomas A.; Rao, Rekha Ranjana; Notz, Patrick K.; Wilkes, Edward Dean

    2003-12-01

    Solidification and blood flow seemingly have little in common, but each involves a fluid in contact with a deformable solid. In these systems, the solid-fluid interface moves as the solid advects and deforms, often traversing the entire domain of interest. Currently, these problems cannot be simulated without innumerable expensive remeshing steps, mesh manipulations or decoupling the solid and fluid motion. Despite the wealth of progress recently made in mechanics modeling, this glaring inadequacy persists. We propose a new technique that tracks the interface implicitly and circumvents the need for remeshing and remapping the solution onto the new mesh. The solid-fluid boundary is tracked with a level set algorithm that changes the equation type dynamically depending on the phases present. This novel approach to coupled mechanics problems promises to give accurate stresses, displacements and velocities in both phases, simultaneously.

  14. Where do young Irish women want Chlamydia-screening services to be set up? A qualitative study employing Goffman's impression management framework.

    PubMed

    Balfe, Myles; Brugha, Ruairi; O' Connell, Emer; McGee, Hannah; O' Donovan, Diarmuid

    2010-01-01

    We conducted interviews with 35 young women recruited from eight community healthcare rural and urban settings across two regions of Ireland. The aim of the study was to explore where these women thought Chlamydia-screening services should be located. Respondents wanted screening services to be located in settings where they would not be witnessed either asking for, or being asked to take, Chlamydia tests. Respondents were worried that their identities would become stigmatized if others were to find out that they had accepted screening. Findings are interpreted through Goffman's stigma and impression management framework. We conclude with public health recommendations.

  15. Modeling the advection of discontinuous quantities in Geophysical flows using Particle Level Sets

    NASA Astrophysics Data System (ADS)

    Aleksandrov, V.; Samuel, H.; Evonuk, M.

    2010-12-01

    Advection is one of the major processes that commonly acts on various scales in nature (core formation, mantle convective stirring, multi-phase flows in magma chambers, salt diapirism ...). While this process can be modeled numerically by solving conservation equations, various geodynamic scenarios involve advection of quantities with sharp discontinuities. Unfortunately, in these cases modeling pure advection numerically becomes very challenging, in particular because sharp discontinuities lead to numerical instabilities, which prevent the local use of high order numerical schemes. Several approaches have been used in computational geodynamics in order to overcome this difficulty, with variable amounts of success. Despite the use of correcting filters or non-oscillatory, shock-preserving schemes, Eulerian (fixed grid) techniques generally suffer from artificial numerical diffusion. Lagrangian approaches (dynamic grids or particles) tend to be more popular in computational geodynamics because they are not prone to excessive numerical diffusion. However, these approaches are generally computationally expensive, especially in 3D, and can suffer from spurious statistical noise. As an alternative to these aforementioned approaches, we have applied a relatively recent Particle Level Set method [Enright et al., 2002] for modeling advection of quantities with sharp discontinuities. We have tested this improved method, which combines the best of Eulerian and Lagrangian approaches, against well-known benchmarks and classical geodynamic flows. In each case the accuracy of the Particle Level Set method equals or exceeds that of other Eulerian and Lagrangian methods, and leads to significantly smaller computational cost, in particular in three-dimensional flows, where the reduction of computational time for modeling advection processes is most needed.
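
    The artificial numerical diffusion mentioned above is easy to reproduce with a toy example: first-order upwind advection of a step function on a periodic grid progressively smears the discontinuity, which is exactly what particle corrections are meant to counteract. A minimal illustration (this sketch is not the Particle Level Set method itself):

```python
def upwind_advect(u, c, dx, dt, steps):
    """First-order upwind advection of field u with constant speed c > 0
    on a periodic grid (Python's u[i - 1] wraps around at i = 0)."""
    nu = c * dt / dx                    # CFL number; stable for 0 < nu <= 1
    assert 0.0 < nu <= 1.0
    u = list(u)
    for _ in range(steps):
        u = [u[i] - nu * (u[i] - u[i - 1]) for i in range(len(u))]
    return u

# Advect a sharp step and observe the front being smeared.
n = 100
step0 = [1.0 if i < n // 2 else 0.0 for i in range(n)]
step1 = upwind_advect(step0, c=1.0, dx=1.0, dt=0.5, steps=20)
# Cells that are neither ~0 nor ~1 mark the artificially diffused front.
smeared = sum(1 for v in step1 if 0.01 < v < 0.99)
```

    The scheme conserves the total of u exactly on the periodic grid, yet the initially sharp front spreads over many cells; higher-order schemes reduce, but do not eliminate, this effect near discontinuities.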

  16. An abdominal aortic aneurysm segmentation method: Level set with region and statistical information

    SciTech Connect

    Zhuge Feng; Rubin, Geoffrey D.; Sun Shaohua; Napel, Sandy

    2006-05-15

    We present a system for segmenting the human aortic aneurysm in CT angiograms (CTA), which, in turn, allows measurements of volume and morphological aspects useful for treatment planning. The system estimates a rough 'initial surface', and then refines it using a level set segmentation scheme augmented with two external analyzers: The global region analyzer, which incorporates a priori knowledge of the intensity, volume, and shape of the aorta and other structures, and the local feature analyzer, which uses voxel location, intensity, and texture features to train and drive a support vector machine classifier. Each analyzer outputs a value that corresponds to the likelihood that a given voxel is part of the aneurysm, which is used during level set iteration to control the evolution of the surface. We tested our system using a database of 20 CTA scans of patients with aortic aneurysms. The mean and worst case values of volume overlap, volume error, mean distance error, and maximum distance error relative to human tracing were 95.3% ± 1.4% (s.d.); worst case = 92.9%, 3.5% ± 2.5% (s.d.); worst case = 7.0%, 0.6 ± 0.2 mm (s.d.); worst case = 1.0 mm, and 5.2 ± 2.3 mm (s.d.); worst case = 9.6 mm, respectively. When implemented on a 2.8 GHz Pentium IV personal computer, the mean time required for segmentation was 7.4 ± 3.6 min (s.d.). We also performed experiments that suggest that our method is insensitive to parameter changes within 10% of their experimentally determined values. This preliminary study proves feasibility for an accurate, precise, and robust system for segmentation of the abdominal aneurysm from CTA data, and may be of benefit to patients with aortic aneurysms.

  17. Texture analysis improves level set segmentation of the anterior abdominal wall

    PubMed Central

    Xu, Zhoubing; Allen, Wade M.; Baucom, Rebeccah B.; Poulose, Benjamin K.; Landman, Bennett A.

    2013-01-01

    Purpose: The treatment of ventral hernias (VH) has been a challenging problem for medical care. Repair of these hernias is fraught with failure; recurrence rates ranging from 24% to 43% have been reported, even with the use of biocompatible mesh. Currently, computed tomography (CT) is used to guide intervention through expert, but qualitative, clinical judgments; notably, quantitative metrics based on image-processing are not used. The authors propose that image segmentation methods to capture the three-dimensional structure of the abdominal wall and its abnormalities will provide a foundation on which to measure geometric properties of hernias and surrounding tissues and, therefore, to optimize intervention. Methods: In this study with 20 clinically acquired CT scans on postoperative patients, the authors demonstrated a novel approach to geometric classification of the abdominal wall. The authors’ approach uses a texture analysis based on Gabor filters to extract feature vectors and follows a fuzzy c-means clustering method to estimate voxelwise probability memberships for eight clusters. The memberships estimated from the texture analysis are helpful to identify anatomical structures with inhomogeneous intensities. The membership was used to guide the level set evolution, as well as to derive an initial start close to the abdominal wall. Results: Segmentation results on abdominal walls were both quantitatively and qualitatively validated with surface errors based on manually labeled ground truth. Using texture, mean surface errors for the outer surface of the abdominal wall were less than 2 mm, with 91% of the outer surface less than 5 mm away from the manual tracings; errors were significantly greater (2–5 mm) for methods that did not use the texture. Conclusions: The authors’ approach establishes a baseline for characterizing the abdominal wall for improving VH care. Inherent texture patterns in CT scans are helpful to the tissue classification, and texture

  18. Texture analysis improves level set segmentation of the anterior abdominal wall

    SciTech Connect

    Xu, Zhoubing; Allen, Wade M.; Baucom, Rebeccah B.; Poulose, Benjamin K.; Landman, Bennett A.

    2013-12-15

    Purpose: The treatment of ventral hernias (VH) has been a challenging problem for medical care. Repair of these hernias is fraught with failure; recurrence rates ranging from 24% to 43% have been reported, even with the use of biocompatible mesh. Currently, computed tomography (CT) is used to guide intervention through expert, but qualitative, clinical judgments; notably, quantitative metrics based on image-processing are not used. The authors propose that image segmentation methods to capture the three-dimensional structure of the abdominal wall and its abnormalities will provide a foundation on which to measure geometric properties of hernias and surrounding tissues and, therefore, to optimize intervention. Methods: In this study with 20 clinically acquired CT scans on postoperative patients, the authors demonstrated a novel approach to geometric classification of the abdominal wall. The authors’ approach uses a texture analysis based on Gabor filters to extract feature vectors and follows a fuzzy c-means clustering method to estimate voxelwise probability memberships for eight clusters. The memberships estimated from the texture analysis are helpful to identify anatomical structures with inhomogeneous intensities. The membership was used to guide the level set evolution, as well as to derive an initial start close to the abdominal wall. Results: Segmentation results on abdominal walls were both quantitatively and qualitatively validated with surface errors based on manually labeled ground truth. Using texture, mean surface errors for the outer surface of the abdominal wall were less than 2 mm, with 91% of the outer surface less than 5 mm away from the manual tracings; errors were significantly greater (2–5 mm) for methods that did not use the texture. Conclusions: The authors’ approach establishes a baseline for characterizing the abdominal wall for improving VH care. Inherent texture patterns in CT scans are helpful to the tissue classification, and texture
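
    The fuzzy c-means step described in both versions of this record assigns each voxel a soft membership in several clusters, and those memberships then steer the level set. A minimal scalar-feature sketch of the standard algorithm (illustrative only; the authors apply it to Gabor feature vectors with eight clusters, and the initialisation here is a simplification):

```python
def fuzzy_c_means(points, n_clusters, m=2.0, n_iter=50):
    """Minimal fuzzy c-means on scalar features.

    Returns (centers, memberships), where memberships[i][k] is the
    degree to which point i belongs to cluster k; each row sums to 1.
    """
    pts = sorted(points)
    # Deterministic initialisation: spread centers across the data range.
    step = max(n_clusters - 1, 1)
    centers = [pts[i * (len(pts) - 1) // step] for i in range(n_clusters)]
    memberships = []
    for _ in range(n_iter):
        # Membership update: inverse-distance weights, exponent 2/(m-1).
        memberships = []
        for p in points:
            inv = [(abs(p - c) + 1e-12) ** (-2.0 / (m - 1.0))
                   for c in centers]
            total = sum(inv)
            memberships.append([w / total for w in inv])
        # Center update: membership-weighted means.
        centers = [
            sum(u[k] ** m * p for u, p in zip(memberships, points))
            / sum(u[k] ** m for u in memberships)
            for k in range(n_clusters)
        ]
    return centers, memberships
```

    Unlike hard k-means, every voxel retains a graded membership in each cluster, which is what makes the result usable as a probability-like guide for the subsequent level set evolution.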

  19. An abdominal aortic aneurysm segmentation method: level set with region and statistical information.

    PubMed

    Zhuge, Feng; Rubin, Geoffrey D; Sun, Shaohua; Napel, Sandy

    2006-05-01

    We present a system for segmenting the human aortic aneurysm in CT angiograms (CTA), which, in turn, allows measurements of volume and morphological aspects useful for treatment planning. The system estimates a rough "initial surface," and then refines it using a level set segmentation scheme augmented with two external analyzers: The global region analyzer, which incorporates a priori knowledge of the intensity, volume, and shape of the aorta and other structures, and the local feature analyzer, which uses voxel location, intensity, and texture features to train and drive a support vector machine classifier. Each analyzer outputs a value that corresponds to the likelihood that a given voxel is part of the aneurysm, which is used during level set iteration to control the evolution of the surface. We tested our system using a database of 20 CTA scans of patients with aortic aneurysms. The mean and worst case values of volume overlap, volume error, mean distance error, and maximum distance error relative to human tracing were 95.3% +/- 1.4% (s.d.); worst case = 92.9%, 3.5% +/- 2.5% (s.d.); worst case = 7.0%, 0.6 +/- 0.2 mm (s.d.); worst case = 1.0 mm, and 5.2 +/- 2.3 mm (s.d.); worst case = 9.6 mm, respectively. When implemented on a 2.8 GHz Pentium IV personal computer, the mean time required for segmentation was 7.4 +/- 3.6 min (s.d.). We also performed experiments that suggest that our method is insensitive to parameter changes within 10% of their experimentally determined values. This preliminary study proves feasibility for an accurate, precise, and robust system for segmentation of the abdominal aneurysm from CTA data, and may be of benefit to patients with aortic aneurysms.

  20. Standard Setting in Relation to the Common European Framework of Reference for Languages: The Case of the State Examination of Dutch as a Second Language

    ERIC Educational Resources Information Center

    Bechger, Timo M.; Kuijper, Henk; Maris, Gunter

    2009-01-01

    This article reports on two related studies carried out to link the State examination of Dutch as a second language to the Common European Framework of Reference for languages (CEFR). In the first study, key persons from institutions for higher education were asked to determine the minimally required language level of beginning students. In the…

  2. Characterizing the spatiotemporal variability of groundwater levels of alluvial aquifers in different settings using drought indices

    NASA Astrophysics Data System (ADS)

    Haas, Johannes Christoph; Birk, Steffen

    2017-05-01

    To improve the understanding of how aquifers in different alluvial settings respond to extreme events in a changing environment, we analyze standardized time series of groundwater levels (Standardized Groundwater level Index - SGI), precipitation (Standardized Precipitation Index - SPI), and river stages of three subregions within the catchment of the river Mur (Austria). Using correlation matrices, differences and similarities between the subregions, ranging from the Alpine upstream part of the catchment to its shallow foreland basin, are identified and visualized. Generally, river stages exhibit the highest correlations with groundwater levels, frequently affecting not only the wells closest to the river, but also more distant parts of the alluvial aquifer. As a result, human impacts on the river are transferred to the aquifer, thus affecting the behavior of groundwater levels. Hence, to avoid misinterpretation of groundwater levels in this type of setting, it is important to account for the river and human impacts on it. While the river is a controlling factor in all of the subregions, an influence of precipitation is evident too. Except for deep wells found in an upstream Alpine basin, groundwater levels show the highest correlation with a precipitation accumulation period of 6 months (SPI6). The correlation in the foreland is generally higher than that in the Alpine subregions, thus corresponding to a trend from deeper wells in the Alpine parts of the catchment towards more shallow wells in the foreland. Extreme events are found to affect the aquifer in different ways. As shown with the well-known European 2003 drought and the local 2009 floods, correlations are reduced under flood conditions, but increased under drought. Thus, precipitation, groundwater levels and river stages tend to exhibit uniform behavior under drought conditions, whereas they may show irregular behavior during floods. Similarly, correlations are found to be weaker in years with little
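
    The standardized indices correlated above (SGI, SPI) are, in essence, anomaly series. A z-score sketch with a running n-month accumulation (a simplification: the operational SPI additionally fits a probability distribution, e.g. a gamma fit for precipitation, before standardizing):

```python
import statistics

def standardize(series):
    """Z-score standardization of a time series (simplified SGI/SPI)."""
    mu = statistics.fmean(series)
    sd = statistics.stdev(series)
    return [(x - mu) / sd for x in series]

def accumulate(series, months):
    """Running sum over the preceding `months` values (SPI-n style,
    e.g. months=6 for an SPI6-like accumulation period)."""
    return [sum(series[i - months + 1:i + 1])
            for i in range(months - 1, len(series))]

def pearson(a, b):
    """Pearson correlation via standardized series."""
    za, zb = standardize(a), standardize(b)
    return sum(x * y for x, y in zip(za, zb)) / (len(a) - 1)
```

    With helpers like these, a correlation matrix between groundwater levels, accumulated precipitation, and river stages reduces to evaluating `pearson` over aligned index pairs.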

  3. An ecofeminist conceptual framework to explore gendered environmental health inequities in urban settings and to inform healthy public policy.

    PubMed

    Chircop, Andrea

    2008-06-01

    This theoretical exploration is an attempt to conceptualize the link between gender and urban environmental health. The proposed ecofeminist framework enables an understanding of the link between the urban physical and social environments and health inequities mediated by gender and socioeconomic status. This framework is proposed as a theoretical magnifying glass to reveal the underlying logic that connects environmental exploitation on the one hand, and gendered health inequities on the other. Ecofeminism has the potential to reveal an inherent, normative conceptual analysis and argumentative justification of western society that permits the oppression of women and the exploitation of the environment. This insight will contribute to a better understanding of the mechanisms underlying gendered environmental health inequities and inform healthy public policy that is supportive of urban environmental health, particularly for low-income mothers.

  4. Characterization of mammographic masses based on level set segmentation with new image features and patient information

    SciTech Connect

    Shi Jiazheng; Sahiner, Berkman; Chan Heangping; Ge Jun; Hadjiiski, Lubomir; Helvie, Mark A.; Nees, Alexis; Wu Yita; Wei Jun; Zhou Chuan; Zhang Yiheng; Cui Jing

    2008-01-15

    Computer-aided diagnosis (CAD) for characterization of mammographic masses as malignant or benign has the potential to assist radiologists in reducing the biopsy rate without increasing false negatives. The purpose of this study was to develop an automated method for mammographic mass segmentation and to explore new image-based features, in combination with patient information, in order to improve the performance of mass characterization. The authors' previous CAD system, which used active contour segmentation and morphological, textural, and spiculation features, achieved promising results in mass characterization. The new CAD system is based on the level set method and includes two new types of image features, related to the presence of microcalcifications within the mass and the abruptness of the mass margin, as well as patient age. A linear discriminant analysis (LDA) classifier with stepwise feature selection was used to merge the extracted features into a classification score. The classification accuracy was evaluated using the area under the receiver operating characteristic (ROC) curve. The authors' primary data set consisted of 427 biopsy-proven masses (200 malignant and 227 benign) in 909 regions of interest (ROIs) (451 malignant and 458 benign) from multiple mammographic views. Leave-one-case-out resampling was used for training and testing. The new CAD system, based on the level set segmentation and the new mammographic feature space, achieved a view-based A_z value of 0.83±0.01. The improvement compared to the previous CAD system was statistically significant (p=0.02). When patient age was included in the new CAD system, the view-based and case-based A_z values were 0.85±0.01 and 0.87±0.02, respectively. The study also demonstrated the consistency of the newly developed CAD system by evaluating the statistics of the weights of the LDA classifiers in leave-one-case-out classification. Finally, an independent test on the publicly available digital database
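    The classification-and-evaluation pipeline described in this abstract can be sketched as follows. This is a minimal stand-in, not the authors' implementation: the level set segmentation and stepwise feature selection are omitted, a two-class Fisher discriminant stands in for the full LDA pipeline, and `lda_scores_loo` and `auc` are hypothetical names.

    ```python
    import numpy as np

    def lda_scores_loo(X, y):
        """Leave-one-out classification scores from a two-class Fisher
        linear discriminant: for each sample, fit the discriminant on the
        remaining samples and score the held-out one."""
        X, y = np.asarray(X, float), np.asarray(y)
        n, d = X.shape
        scores = np.empty(n)
        for i in range(n):
            mask = np.arange(n) != i
            Xt, yt = X[mask], y[mask]
            mu0, mu1 = Xt[yt == 0].mean(0), Xt[yt == 1].mean(0)
            # pooled within-class scatter, with a small ridge for stability
            Sw = np.cov(Xt[yt == 0].T) + np.cov(Xt[yt == 1].T)
            w = np.linalg.solve(Sw + 1e-6 * np.eye(d), mu1 - mu0)
            scores[i] = X[i] @ w                  # higher = more "class 1"
        return scores

    def auc(scores, y):
        """Area under the ROC curve via the rank (Mann-Whitney) statistic."""
        pos, neg = scores[y == 1], scores[y == 0]
        gt = (pos[:, None] > neg[None, :]).mean()
        eq = (pos[:, None] == neg[None, :]).mean()
        return gt + 0.5 * eq
    ```

    Here leave-one-out is per sample; the paper's leave-one-case-out variant would instead hold out all ROIs belonging to the same patient at once, to avoid optimistic bias from correlated views.
    
    
    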

  5. 5-SPICE: the application of an original framework for community health worker program design, quality improvement and research agenda setting

    PubMed Central

    Palazuelos, Daniel; DaEun Im, Dana; Peckarsky, Matthew; Schwarz, Dan; Farmer, Didi Bertrand; Dhillon, Ranu; Johnson, Ari; Orihuela, Claudia; Hackett, Jill; Bazile, Junior; Berman, Leslie; Ballard, Madeleine; Panjabi, Raj; Ternier, Ralph; Slavin, Sam; Lee, Scott; Selinsky, Steve; Mitnick, Carole Diane

    2013-01-01

    Introduction: Despite decades of experience with community health workers (CHWs) in a wide variety of global health projects, there is no established conceptual framework that structures how implementers and researchers can understand, study and improve their respective programs based on lessons learned by other CHW programs. Objective: To apply an original, non-linear framework and case study method, 5-SPICE, to multiple sister projects of a large, international non-governmental organization (NGO), and other CHW projects. Design: Engaging a large group of implementers, researchers and the best available literature, the 5-SPICE framework was refined and then applied to a selection of CHW programs. Insights gleaned from the case study method were summarized in a tabular format named the '5×5-SPICE chart'. This format graphically lists the ways in which essential CHW program elements interact, both positively and negatively, in the implementation field. Results: The 5×5-SPICE charts reveal a variety of insights that come from a more complex understanding of how essential CHW projects interact and influence each other in their unique context. Some have been well described in the literature previously, while others are exclusive to this article. An analysis of how best to compensate CHWs is also offered as an example of the type of insights that this method may yield. Conclusions: The 5-SPICE framework is a novel instrument that can be used to guide discussions about CHW projects. Insights from this process can help guide quality improvement efforts, or be used as hypotheses that form the basis of a program's research agenda. Recent experience with research protocols embedded into successfully implemented projects demonstrates how such hypotheses can be rigorously tested. PMID:23561023

  6. 5-SPICE: the application of an original framework for community health worker program design, quality improvement and research agenda setting.

    PubMed

    Palazuelos, Daniel; Ellis, Kyla; Im, Dana DaEun; Peckarsky, Matthew; Schwarz, Dan; Farmer, Didi Bertrand; Dhillon, Ranu; Johnson, Ari; Orihuela, Claudia; Hackett, Jill; Bazile, Junior; Berman, Leslie; Ballard, Madeleine; Panjabi, Raj; Ternier, Ralph; Slavin, Sam; Lee, Scott; Selinsky, Steve; Mitnick, Carole Diane

    2013-04-03

    Despite decades of experience with community health workers (CHWs) in a wide variety of global health projects, there is no established conceptual framework that structures how implementers and researchers can understand, study and improve their respective programs based on lessons learned by other CHW programs. To apply an original, non-linear framework and case study method, 5-SPICE, to multiple sister projects of a large, international non-governmental organization (NGO), and other CHW projects. Engaging a large group of implementers, researchers and the best available literature, the 5-SPICE framework was refined and then applied to a selection of CHW programs. Insights gleaned from the case study method were summarized in a tabular format named the '5×5-SPICE charts'. This format graphically lists the ways in which essential CHW program elements interact, both positively and negatively, in the implementation field. The 5×5-SPICE charts reveal a variety of insights that come from a more complex understanding of how essential CHW projects interact and influence each other in their unique context. Some have been well described in the literature previously, while others are exclusive to this article. An analysis of how best to compensate CHWs is also offered as an example of the type of insights that this method may yield. The 5-SPICE framework is a novel instrument that can be used to guide discussions about CHW projects. Insights from this process can help guide quality improvement efforts, or be used as hypotheses that form the basis of a program's research agenda. Recent experience with research protocols embedded into successfully implemented projects demonstrates how such hypotheses can be rigorously tested.
