Science.gov

Sample records for level set framework

  1. An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs

    PubMed Central

    Mosaliganti, Kishore R.; Gelas, Arnaud; Megason, Sean G.

    2013-01-01

    In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations, which restricts future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK) v4 architecture, we implemented a level-set software design that is flexible across numerical representations (continuous, discrete, and sparse) and grid representations (point-, mesh-, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g., gradients and Hessians) across multiple terms are eliminated. The framework further enables the evolution of multiple level sets for multi-object segmentation and the processing of large datasets. To do so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level set in real time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a developing zebrafish
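The container-of-terms design described above can be sketched in a few lines: the PDE right-hand side is simply the sum of whatever term functions are currently registered. This is a hypothetical NumPy illustration, not ITK v4 code; the term functions and their simplified forms are invented.

```python
import numpy as np

def curvature_term(phi, weight=1.0):
    """Curvature-like smoothing term (here a plain Laplacian as a proxy)."""
    gy, gx = np.gradient(phi)
    gyy, _ = np.gradient(gy)
    _, gxx = np.gradient(gx)
    return weight * (gxx + gyy)

def expansion_term(phi, weight=1.0):
    """Constant expansion (balloon-like) term proportional to |grad phi|."""
    gy, gx = np.gradient(phi)
    return weight * np.sqrt(gx**2 + gy**2)

def evolve(phi, terms, dt=0.1, steps=10):
    """Explicit Euler update: phi_t = sum of all registered terms."""
    for _ in range(steps):
        phi = phi + dt * sum(t(phi) for t in terms)
    return phi

# Signed-distance-like initial function: circle of radius 3 on a 32x32 grid.
y, x = np.mgrid[0:32, 0:32]
phi0 = np.sqrt((x - 16.0)**2 + (y - 16.0)**2) - 3.0

terms = [curvature_term, expansion_term]   # terms can be added or removed freely
phi = evolve(phi0, terms)
print(phi.shape)  # → (32, 32)
```

Swapping the equation then amounts to editing the `terms` container rather than rewriting the solver, which is the flexibility the abstract describes.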

  3. Setting Healthcare Priorities at the Macro and Meso Levels: A Framework for Evaluation

    PubMed Central

    Barasa, Edwine W.; Molyneux, Sassy; English, Mike; Cleary, Susan

    2015-01-01

    Background: Priority setting in healthcare is a key determinant of health system performance. However, there is no widely accepted priority setting evaluation framework. We reviewed literature with the aim of developing and proposing a framework for the evaluation of macro and meso level healthcare priority setting practices. Methods: We systematically searched Econlit, PubMed, CINAHL, and EBSCOhost databases and supplemented this with searches in Google Scholar, relevant websites and reference lists of relevant papers. A total of 31 papers on evaluation of priority setting were identified. These were supplemented by broader theoretical literature related to evaluation of priority setting. A conceptual review of selected papers was undertaken. Results: Based on a synthesis of the selected literature, we propose an evaluative framework that requires that priority setting practices at the macro and meso levels of the health system meet the following conditions: (1) Priority setting decisions should incorporate both efficiency and equity considerations as well as the following outcomes; (a) Stakeholder satisfaction, (b) Stakeholder understanding, (c) Shifted priorities (reallocation of resources), and (d) Implementation of decisions. (2) Priority setting processes should also meet the procedural conditions of (a) Stakeholder engagement, (b) Stakeholder empowerment, (c) Transparency, (d) Use of evidence, (e) Revisions, (f) Enforcement, and (g) Being grounded on community values. Conclusion: Available frameworks for the evaluation of priority setting are mostly grounded on procedural requirements, while few have included outcome requirements. There is, however, increasing recognition of the need to incorporate both consequential and procedural considerations in priority setting practices. 
In this review, we adopt an integrative approach to develop and propose a framework for the evaluation of priority setting practices at the macro and meso levels that draws from these

  4. A coupled level set framework for bladder wall segmentation with application to MR cystography.

    PubMed

    Duan, Chaijie; Liang, Zhengrong; Bao, Shangliang; Zhu, Hongbin; Wang, Su; Zhang, Guangxiang; Chen, John J; Lu, Hongbing

    2010-03-01

    In this paper, we propose a coupled level set (LS) framework for segmentation of the bladder wall using T1-weighted magnetic resonance (MR) images, with clinical application to virtual cystoscopy (i.e., MR cystography). The framework uses two collaborative LS functions and a regional adaptive clustering algorithm to delineate the bladder wall for wall thickness measurement on a voxel-by-voxel basis. It differs from most pre-existing bladder segmentation work in four aspects. First, while most previous work segments only the inner border of the wall, or at most segments the outer border manually, our framework extracts both the inner and outer borders automatically, except that the initial seed point is selected manually. Second, it is adapted to T1-weighted images with decreased intensities in urine, as opposed to the enhanced intensities in the T2-weighted scenario and in computed tomography. Third, by considering the global image intensity distribution and local intensity contrast, the image energy function defined in the framework is more robust to inhomogeneity effects, motion artifacts, and image noise. Finally, the bladder wall thickness is measured by the length of the integral path between the two borders, which mimics an electric field line between two iso-potential surfaces. The framework was tested on six datasets, with comparison to the well-known Chan-Vese (C-V) LS model. Five experts blindly scored the inner and outer borders segmented by the presented framework and by the C-V model. The scores statistically demonstrated the improvement in detecting the inner and outer borders.

  5. A unified variational segmentation framework with a level-set based sparse composite shape prior

    NASA Astrophysics Data System (ADS)

    Liu, Wenyang; Ruan, Dan

    2015-03-01

    Image segmentation plays an essential role in many medical applications. Low SNR conditions and various artifacts make its automation challenging. To achieve robust and accurate segmentation results, a good approach is to introduce proper shape priors. In this study, we present a unified variational segmentation framework that regularizes the target shape with a level-set based sparse composite prior. When the variational problem is solved with a block minimization/descent scheme, the regularizing impact of the sparse composite prior can be observed to adjust to the most recent shape estimate, and may be interpreted as a ‘dynamic’ shape prior, yet without compromising convergence thanks to the unified energy framework. The proposed method was applied to segment the corpus callosum from 2D MR images and the liver from 3D CT volumes. Its performance was evaluated using the Dice Similarity Coefficient and the Hausdorff distance, and compared with two benchmark level-set based segmentation methods. The proposed method achieved statistically significantly higher accuracy in both experiments and avoided the faulty inclusion/exclusion of surrounding structures with similar intensities, as opposed to the benchmark methods.
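As a loose illustration of the sparse composite idea (not the authors' formulation): the level-set values of a target shape can be approximated by a sparse linear combination of flattened shape templates. Here the sparse weights are found with plain ISTA (iterative soft-thresholding); all names, dimensions, and parameter values are invented.

```python
import numpy as np

def ista(D, y, lam=0.05, iters=200):
    """Solve min_w 0.5*||D w - y||^2 + lam*||w||_1 by proximal gradient."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the smooth part
    w = np.zeros(D.shape[1])
    for _ in range(iters):
        g = D.T @ (D @ w - y)              # gradient step
        z = w - g / L
        w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return w

rng = np.random.default_rng(0)
D = rng.standard_normal((100, 8))          # 8 flattened shape templates (columns)
w_true = np.zeros(8)
w_true[[1, 5]] = [1.0, -0.5]
y = D @ w_true                             # target shape = sparse template mix
w = ista(D, y)
print(np.flatnonzero(np.abs(w) > 0.1))     # → [1 5]: dominant templates recovered
```

The 'dynamic' behavior the abstract describes corresponds to re-solving this sparse fit against each updated shape estimate inside the outer minimization loop.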

  6. A multi-phase level set framework for source reconstruction in bioluminescence tomography

    SciTech Connect

    Huang Heyu; Qu Xiaochao; Liang Jimin; He Xiaowei; Chen Xueli; Yang Da'an; Tian Jie

    2010-07-01

    We propose a novel multi-phase level set algorithm for solving the inverse problem of bioluminescence tomography. The distribution of the unknown interior source is considered piecewise constant and is represented using multiple level set functions. The localization of the interior bioluminescence source is implemented by tracing the evolution of the level set functions. An alternate search scheme is incorporated to help ensure the global optimality of the reconstruction. Both numerical and physical experiments are performed to evaluate the developed level set reconstruction method. The reconstruction results show that the proposed method can stably resolve the interior source in bioluminescence tomography.
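The piecewise-constant, multi-level-set representation mentioned above can be illustrated with a toy example: with two level set functions, the four sign patterns label up to four constant-valued source regions. The geometry and source values below are invented for the sketch.

```python
import numpy as np

def piecewise_source(phi1, phi2, c):
    """Map the sign pattern of two level set functions to constant values.

    c[k] is the source value for pattern k = 2*(phi1 >= 0) + (phi2 >= 0).
    """
    key = (phi1 >= 0).astype(int) * 2 + (phi2 >= 0).astype(int)
    return np.choose(key, c)

# Two circular source regions on a 32x32 grid (interiors where -phi >= 0).
y, x = np.mgrid[0:32, 0:32]
phi1 = np.sqrt((x - 9.0)**2 + (y - 16.0)**2) - 6.0    # first source region
phi2 = np.sqrt((x - 23.0)**2 + (y - 16.0)**2) - 6.0   # second source region
src = piecewise_source(-phi1, -phi2, c=[0.0, 1.0, 2.0, 3.0])
print(np.unique(src))   # three labels appear: the circles do not overlap here
```

Evolving `phi1` and `phi2` then moves the region boundaries, which is how the reconstruction traces the source support.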

  7. An automatic variational level set segmentation framework for computer aided dental X-rays analysis in clinical environments.

    PubMed

    Li, Shuo; Fevens, Thomas; Krzyzak, Adam; Li, Song

    2006-03-01

    An automatic variational level set segmentation framework for Computer Aided Dental X-rays Analysis (CADXA) in clinical environments is proposed. Designed for clinical use, the segmentation consists of two stages: a training stage and a segmentation stage. During the training stage, manually chosen representative images are first segmented using hierarchical level set region detection. Then window-based feature extraction followed by principal component analysis (PCA) is applied, and the results are used to train a support vector machine (SVM) classifier. During the segmentation stage, dental X-rays are first classified by the trained SVM. The classifier provides initial contours, close to the correct boundaries, for three coupled level sets driven by a proposed pathological variational model, which greatly accelerates the level set segmentation. Based on the segmentation results and on uncertainty maps built from a proposed uncertainty measurement, a computer aided analysis scheme is applied. The experimental results show that the proposed method is able to provide an automatic pathological segmentation that naturally segments problem areas. Based on the segmentation results, the analysis scheme is able to give dentists indications of possible problem areas of bone loss and decay. The experimental results also show that the proposed segmentation framework is able to speed up level set segmentation in clinical environments.
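A hedged sketch of the training-stage feature pipeline (window-based feature extraction followed by PCA), implemented with NumPy only; the SVM training step is omitted here, and the image, window size, and dimensionality are invented.

```python
import numpy as np

def window_features(img, size=4):
    """Slide a non-overlapping window over the image; return flattened patches."""
    h, w = img.shape
    return np.array([img[i:i + size, j:j + size].ravel()
                     for i in range(0, h - size + 1, size)
                     for j in range(0, w - size + 1, size)])

def pca_fit(X, k=3):
    """Return the mean and the top-k principal axes of the feature matrix."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

rng = np.random.default_rng(1)
img = rng.random((16, 16))                 # stand-in for a dental X-ray
X = window_features(img)                   # 16 patches of 16 pixels each
mu, comps = pca_fit(X, k=3)
Z = (X - mu) @ comps.T                     # reduced features for a classifier
print(Z.shape)  # → (16, 3)
```

The reduced features `Z` are what would be fed to the SVM (from a dedicated library) in the pipeline the abstract describes.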

  8. Improving district level health planning and priority setting in Tanzania through implementing accountability for reasonableness framework: Perceptions of stakeholders

    PubMed Central

    2010-01-01

    Background: In 2006, researchers and decision-makers launched a five-year project - Response to Accountable Priority Setting for Trust in Health Systems (REACT) - to improve planning and priority-setting through implementing the Accountability for Reasonableness framework in Mbarali District, Tanzania. The objective of this paper is to explore the acceptability of Accountability for Reasonableness from the perspectives of the Council Health Management Team, local government officials, the health workforce, and members of user boards and committees. Methods: Individual interviews were carried out with different categories of actors and stakeholders in the district. The interview guide consisted of a series of questions asking respondents to describe their perceptions regarding each condition of the Accountability for Reasonableness framework in terms of priority setting. Interviews were analysed using thematic framework analysis. Documentary data were used to support, verify and highlight the key issues that emerged. Results: Almost all stakeholders viewed Accountability for Reasonableness as an important and feasible approach for improving priority-setting and health service delivery in their context. However, a few aspects of Accountability for Reasonableness were seen as too difficult to implement given the socio-political conditions and traditions in Tanzania. Respondents mentioned budget ceilings and guidelines, low levels of public awareness, unreliable and untimely funding, and the limited capacity of the district to generate local resources as the major contextual factors that hampered the full implementation of the framework in their context. Conclusion: This study was one of the first assessments of the applicability of Accountability for Reasonableness in health care priority-setting in Tanzania. The analysis, overall, suggests that the Accountability for Reasonableness framework could be an important tool for improving priority-setting processes in the

  9. Interoperability Context-Setting Framework

    SciTech Connect

    Widergren, Steven E.; Hardin, Dave; Ambrosio, Ron; Drummond, R.; Gunther, E.; Gilchrist, Grant; Cohen, David

    2007-01-31

    …air-conditioning (HVAC) unit up several degrees. The resulting load reduction becomes part of an aggregated response from the electricity service provider to the bulk system operator, who is now in a better position to manage total system load with the available generation. Looking across the electric system, from generating plants, to transmission substations, to the distribution system, to factories, office parks, and buildings, automation is growing, and the opportunities for unleashing new value propositions are exciting. How can we facilitate this change, and do so in a way that ensures the reliability of electric resources for the well-being of our economy and security? The GridWise Architecture Council (GWAC) mission is to enable interoperability among the many entities that interact with the electric power system. A good definition of interoperability is: “The capability of two or more networks, systems, devices, applications, or components to exchange information between them and to use the information so exchanged.” As a step toward enabling interoperability, the GWAC proposes a context-setting framework to organize concepts and terminology so that interoperability issues can be identified and debated, improvements to address those issues articulated, and actions prioritized and coordinated across the electric power community.

  10. Level Set Strategy for SCFT

    NASA Astrophysics Data System (ADS)

    Ouaknin, Gaddiel

    This thesis investigates the design of sharp interface level set methods in the context of self-consistent field theory (SCFT) in polymer physics. SCFT computes the structure and energy of inhomogeneous self-assembling polymers at thermodynamic equilibrium. Level set methods are based on an implicit representation of free boundaries, which enables motions with arbitrary changes in topology. In addition, recent advances in imposing Robin boundary conditions enable the study of free boundary problems of interest to the self-assembly community. We first present a computational framework, encoded on a forest of quad/oct-trees in a parallel environment. We then present results of imposing sharp Neumann boundary conditions, as first proposed by de Gennes, which enables SCFT computations of meaningful quantities at the boundary of irregular geometries. We then introduce the concept of a functional level-set derivative in the context of SCFT and rigorously derive expressions for the change of energy of a diblock copolymer with respect to an enclosing shape. The level-set derivative is then used to embed SCFT in a variable-shape simulator, in which the internal structure and the enclosing shape are coupled together and evolve in tandem to reduce the energy of the diblock copolymer. Finally, an algorithm for solving the inverse problem for directed self-assembly is presented.

  11. Monitoring street-level spatial-temporal variations of carbon monoxide in urban settings using a wireless sensor network (WSN) framework.

    PubMed

    Wen, Tzai-Hung; Jiang, Joe-Air; Sun, Chih-Hong; Juang, Jehn-Yih; Lin, Tzu-Shiang

    2013-11-27

    Air pollution has become a severe environmental problem due to urbanization and heavy traffic. Monitoring street-level air quality is an important issue, but most official monitoring stations are installed to monitor large-scale air quality conditions, and their limited spatial resolution cannot reflect the detailed variations in air quality that may be induced by traffic jams. By deploying wireless sensors on crossroads and main roads, this study established a pilot framework for a wireless sensor network (WSN)-based real-time monitoring system to understand street-level spatial-temporal changes of carbon monoxide (CO) in urban settings. The system consists of two major components. The first component is the deployment of wireless sensors: we deployed 44 sensor nodes, 40 transmitter nodes and four gateway nodes in this study. Each sensor node includes a signal processing module, a CO sensor and a wireless communication module. In order to capture realistic human exposure to traffic pollutants, all sensors were deployed at a height of 1.5 m on lampposts and traffic signs. The study area covers a total length of 1.5 km of Keelung Road in Taipei City. The other component is a map-based monitoring platform for sensor data visualization and manipulation in time and space. Using this intensive, real-time street-level monitoring framework, we compared the spatial-temporal patterns of air pollution in different time periods. Our results capture four CO concentration peaks throughout the day at a location along an arterial road near a traffic sign, where the hourly average could reach 5.3 ppm from 5:00 pm to 7:00 pm due to traffic congestion. The proposed WSN-based framework captures detailed ground information and the potential risk of human exposure to traffic-related air pollution. It also provides street-level insights for real-time monitoring, further early warning of air pollution, and urban environmental management.
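The hourly aggregation behind figures like the 5.3 ppm rush-hour average can be sketched as follows, using synthetic readings rather than the study's data; timestamps, counts, and concentrations are invented.

```python
import numpy as np

def hourly_means(hours, ppm):
    """Average CO readings that fall within each hour of the day."""
    out = np.full(24, np.nan)
    for h in range(24):
        sel = hours == h
        if sel.any():
            out[h] = ppm[sel].mean()
    return out

# Synthetic readings: 6 samples per hour, elevated CO during 17:00-19:00.
hours = np.repeat(np.arange(24), 6)
ppm = np.where((hours >= 17) & (hours < 19), 5.3, 1.2)

avg = hourly_means(hours, ppm)
print(int(np.nanargmax(avg)))  # → 17: the evening rush-hour peak
```

A real deployment would group by sensor node as well as by hour, but the binning step is the same.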

  13. A framework and a set of tools called Nutting models to estimate retention capacities and loads of nitrogen and phosphorus in rivers at catchment and national level (France)

    NASA Astrophysics Data System (ADS)

    Legeay, Pierre-Louis; Moatar, Florentina; Dupas, Rémi; Gascuel-Odoux, Chantal

    2016-04-01

    The Nutting-N and Nutting-P models (Dupas et al., 2013, 2015) have been developed to estimate nitrogen and phosphorus nonpoint-source emissions to surface water using readily available data. These models were inspired by the US model SPARROW (Smith et al., 1997) and the European model GREEN (Grizzetti et al., 2008), i.e., statistical approaches that link nitrogen and phosphorus surplus to catchment land and river characteristics in order to find the catchments' relative retention capacities. The nutrient load (L) at the outlet of each catchment is expressed as: L = R*(B*DS + PS) [1], where DS is diffuse sources (i.e., surplus in kg.ha-1.yr-1 for N, P storage in soil for P), PS is point sources of domestic and industrial origin (kg.ha-1.yr-1), and R and B are the river-system and basin reduction factors, respectively; both combine observed variables and calibrated parameters. The model was calibrated on independent catchments for the 2005-2009 and 2008-2012 periods. Variables were selected according to the Bayesian Information Criterion (BIC) in order to optimize the predictive performance of the models. From these basic models, several improvements have been made to build a framework and a set of tools: 1) a routing module has been added in order to improve estimations for stream orders 4 and 5, i.e., upscaling the basic Nutting approach; 2) a territorial module has been developed in order to test the models at the local scale (from 500 to 5000 km²); 3) seasonal estimation has been investigated. The basic approach as well as the territorial application will be illustrated. These tools allow water managers to identify areas at risk, where high nutrient loads are estimated, as well as areas where retention is potentially high and can buffer high nutrient sources. References: Dupas R., Curie F., Gascuel-Odoux C., Moatar F., Delmas M., Parnaudeau V., Durand P., 2013. Assessing N emissions in surface water at the national level: Comparison of country-wide vs. regionalized models. Science of the Total Environment.
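Equation [1] above is straightforward to evaluate; the sketch below uses invented, uncalibrated coefficient values purely to show the arithmetic.

```python
def nutrient_load(DS, PS, R, B):
    """Load at the catchment outlet per equation [1]: L = R * (B * DS + PS).

    DS: diffuse sources, PS: point sources (both kg.ha-1.yr-1);
    R, B: river-system and basin reduction factors.
    """
    return R * (B * DS + PS)

# Hypothetical catchment: diffuse surplus 20, point sources 2,
# river reduction factor 0.8, basin reduction factor 0.5.
L = nutrient_load(DS=20.0, PS=2.0, R=0.8, B=0.5)
print(round(L, 2))  # → 9.6
```

In the calibrated models, R and B are themselves functions of observed catchment variables and fitted parameters rather than fixed constants.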

  14. An adaptive level set method

    SciTech Connect

    Milne, Roger Brent

    1995-12-01

    This thesis describes a new method for the numerical solution of partial differential equations of the parabolic type on an adaptively refined mesh in two or more spatial dimensions. The method is motivated and developed in the context of the level set formulation for the curvature dependent propagation of surfaces in three dimensions. In that setting, it realizes the multiple advantages of decreased computational effort, localized accuracy enhancement, and compatibility with problems containing a range of length scales.

  15. High-Level Application Framework for LCLS

    SciTech Connect

    Chu, P; Chevtsov, S.; Fairley, D.; Larrieu, C.; Rock, J.; Rogind, D.; White, G.; Zalazny, M.; /SLAC

    2008-04-22

    A framework for high-level accelerator application software is being developed for the Linac Coherent Light Source (LCLS). The framework is based on the plug-in technology developed by an open source project, Eclipse. Many existing functionalities provided by Eclipse are available to high-level applications written within this framework. The framework also contains static data storage configuration and dynamic data connectivity. Because the framework is Eclipse-based, it is highly compatible with any other Eclipse plug-ins. The entire infrastructure of the software framework is presented, along with planned applications and plug-ins based on it.

  16. A Framework for Describing Interlanguages in Multilingual Settings.

    ERIC Educational Resources Information Center

    Tenjoh-Okwen, Thomas

    1989-01-01

    Outlines a contrastive analysis model and a non-contrastive analysis model for studying interlanguage in strictly bilingual settings, and suggests a bidimensional framework, including both linguistic and curricular components, for studying interlanguage in multilingual settings. (21 references) (CB)

  17. Exploring the UMLS: a rough sets based theoretical framework.

    PubMed

    Srinivasan, P

    1999-01-01

    The Unified Medical Language System (UMLS) [1] has a unique and leading position in the evolution of thesauri and metathesauri. Features that set it apart are: its composition from more than fifty component health care vocabularies; the sophisticated UMLS ontology linking the Metathesaurus with structures such as the Semantic Network and the SPECIALIST lexicon; and the high level of social collaboration invested in its construction and growth. It is our thesis that in order to successfully harness such a complex vocabulary for text retrieval we need sophisticated methods derived from a deeper understanding of the UMLS system. Thus we propose a theoretical framework based on the theory of rough sets, that supports the systematic and exploratory investigation of the UMLS Metathesaurus for text retrieval. Our goal is to make it more feasible for individuals such as patients and health care professionals to access relevant information at the point of need.

  18. Standard Setting to an International Reference Framework: Implications for Theory and Practice

    ERIC Educational Resources Information Center

    Lim, Gad S.; Geranpayeh, Ardeshir; Khalifa, Hanan; Buckendahl, Chad W.

    2013-01-01

    Standard setting theory has largely developed with reference to a typical situation, determining a level or levels of performance for one exam for one context. However, standard setting is now being used with international reference frameworks, where some parameters and assumptions of classical standard setting do not hold. We consider the…

  19. Can frameworks inform knowledge about health policy processes? Reviewing health policy papers on agenda setting and testing them against a specific priority-setting framework.

    PubMed

    Walt, Gill; Gilson, Lucy

    2014-12-01

    This article systematically reviews a set of health policy papers on agenda setting and tests them against a specific priority-setting framework. The article applies the Shiffman and Smith framework in extracting and synthesizing data from an existing set of papers, purposively identified for their relevance and systematically reviewed. Its primary aim is to assess how far the component parts of the framework help to identify the factors that influence the agenda setting stage of the policy process at global and national levels. It seeks to advance the field and inform the development of theory in health policy by examining the extent to which the framework offers a useful approach for organizing and analysing data. Applying the framework retrospectively to the selected set of papers, it aims to explore influences on priority setting and to assess how far the framework might gain from further refinement or adaptation, if used prospectively. In pursuing its primary aim, the article also demonstrates how the approach of framework synthesis can be used in health policy analysis research.

  20. Towards a Framework for Change Detection in Data Sets

    NASA Astrophysics Data System (ADS)

    Böttcher, Mirko; Nauck, Detlef; Ruta, Dymitr; Spott, Martin

    Since the world with its markets, innovations and customers is changing faster than ever before, the key to survival for businesses is the ability to detect, assess and respond to changing conditions rapidly and intelligently. Discovering changes and reacting to or acting upon them before others do has therefore become a strategic issue for many companies. However, existing data analysis techniques are insufficient for this task since they typically assume that the domain under consideration is stable over time. This paper presents a framework that detects changes within a data set at virtually any level of granularity. The underlying idea is to derive a rule-based description of the data set at different points in time and to subsequently analyse how these rules change. Nevertheless, further techniques are required to assist the data analyst in interpreting and assessing these changes. Therefore, the framework also contains methods to discard rules that are non-drivers for change and to assess the interestingness of the detected changes.
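The core idea, deriving rules at two points in time and analysing how they change, can be caricatured in a few lines; the rule names, support values, and threshold below are invented for the sketch.

```python
def changed_rules(support_t1, support_t2, threshold=0.05):
    """Return rules whose absolute support change exceeds the threshold.

    Rules absent at one time point are treated as having zero support then.
    """
    rules = set(support_t1) | set(support_t2)
    return sorted(r for r in rules
                  if abs(support_t2.get(r, 0.0) - support_t1.get(r, 0.0)) > threshold)

# Rule-based descriptions of the same data set at two points in time.
t1 = {"age<30 -> buys": 0.40, "city=London -> churns": 0.10}
t2 = {"age<30 -> buys": 0.42, "city=London -> churns": 0.25}

print(changed_rules(t1, t2))  # → ['city=London -> churns']
```

The framework's additional machinery (discarding non-driver rules, scoring interestingness) would then operate on the flagged changes rather than on raw supports.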

  1. DISJUNCTIVE NORMAL LEVEL SET: AN EFFICIENT PARAMETRIC IMPLICIT METHOD

    PubMed Central

    Mesadi, Fitsum; Cetin, Mujdat; Tasdizen, Tolga

    2016-01-01

    Level set methods are widely used for image segmentation because of their capability to handle topological changes. In this paper, we propose a novel parametric level set method called the Disjunctive Normal Level Set (DNLS), and apply it to both two-phase (single object) and multiphase (multi-object) image segmentation. The DNLS is formed by a union of polytopes, which themselves are formed by intersections of half-spaces. The proposed level set framework has the following major advantages compared to other level set methods available in the literature. First, segmentation using the DNLS converges much faster. Second, the DNLS level set function remains regular throughout its evolution. Third, the proposed multiphase version of the DNLS is less sensitive to initialization, and its computational cost and memory requirements remain almost constant as the number of objects to be simultaneously segmented grows. The experimental results show the potential of the proposed method.
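The disjunctive-normal construction, a union (max) of polytopes, each an intersection (min) of half-spaces h(x) = w·x + b ≥ 0, can be sketched with hard min/max; the actual DNLS uses smooth (sigmoid-based) approximations so the function is differentiable, and the shapes below are invented.

```python
import numpy as np

def polytope(x, halfspaces):
    """Signed membership of x in an intersection of half-spaces (min rule)."""
    return min(w @ x + b for w, b in halfspaces)

def dnls(x, polytopes):
    """Union of polytopes (max rule): f(x) >= 0 means x is inside the shape."""
    return max(polytope(x, hs) for hs in polytopes)

# An axis-aligned unit-height box [x0, x1] x [0, 1] from four half-spaces.
def box(x0, x1):
    return [(np.array([1.0, 0.0]), -x0),   # x >= x0
            (np.array([-1.0, 0.0]), x1),   # x <= x1
            (np.array([0.0, 1.0]), 0.0),   # y >= 0
            (np.array([0.0, -1.0]), 1.0)]  # y <= 1

shapes = [box(0.0, 1.0), box(2.0, 3.0)]    # two disjoint squares
print(dnls(np.array([0.5, 0.5]), shapes) >= 0,   # inside the first square
      dnls(np.array([1.5, 0.5]), shapes) >= 0)   # in the gap between them
```

Because the representation is parametric (the half-space weights), evolving the shape means updating a small parameter vector instead of a dense grid of level-set values.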

  2. Fast Sparse Level Sets on Graphics Hardware.

    PubMed

    Jalba, Andrei C; van der Laan, Wladimir J; Roerdink, Jos B T M

    2013-01-01

    The level-set method is one of the most popular techniques for capturing and tracking deformable interfaces. Although level sets have demonstrated great potential in visualization and computer graphics applications, such as surface editing and physically based modeling, their use for interactive simulations has been limited due to the high computational demands involved. In this paper, we address this computational challenge by leveraging the increased computing power of graphics processors, to achieve fast simulations based on level sets. Our efficient, sparse GPU level-set method is substantially faster than other state-of-the-art, parallel approaches on both CPU and GPU hardware. We further investigate its performance through a method for surface reconstruction, based on GPU level sets. Our novel multiresolution method for surface reconstruction from unorganized point clouds compares favorably with recent, existing techniques and other parallel implementations. Finally, we point out that both level-set computations and rendering of level-set surfaces can be performed at interactive rates, even on large volumetric grids. Therefore, many applications based on level sets can benefit from our sparse level-set method.

  3. International Review of Frameworks for Standard Setting & Labeling Development

    SciTech Connect

    Zhou, Nan; Khanna, Nina Zheng; Fridley, David; Romankiewicz, John

    2012-09-01

    As appliance energy efficiency standards and labeling (S&L) programs reach a broader geographic and product scope, a series of sophisticated and complex technical and economic analyses have been adopted by different countries in the world to support and enhance these growing S&L programs. The initial supporting techno-economic and impact analyses for S&L development make up a defined framework and process for setting and developing appropriate appliance efficiency standards and labeling programs. This report reviews in depth the existing frameworks for standard setting and label development in the well-established programs of the U.S., Australia and the EU to identify and evaluate major trends in how and why key analyses are undertaken and to understand major similarities and differences between each of the frameworks.

  4. A Lagrangian particle level set method

    NASA Astrophysics Data System (ADS)

    Hieber, Simone E.; Koumoutsakos, Petros

    2005-11-01

    We present a novel particle level set method for capturing interfaces. The level set equation is solved in a Lagrangian frame using particles that carry the level set information. A key aspect of the method involves a consistent remeshing procedure for the regularization of the particle locations when the particle map gets distorted by the advection field. The Lagrangian description of the level set method is inherently adaptive and exact in the case of solid body motions. The efficiency and accuracy of the method are demonstrated in several benchmark problems in two and three dimensions involving pure advection and curvature induced motion of the interface. The simplicity of the particle description is shown to be well suited for real time simulations of surfaces involving cutting and reconnection as in virtual surgery environments.
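
    The core Lagrangian idea can be sketched in a few lines: particles carry their level-set values unchanged while only their positions are advected. The function names and the forward-Euler stepping below are illustrative simplifications; the paper relies on higher-order integration together with the remeshing procedure described above.

```python
import numpy as np

def advect_particles(positions, values, velocity, dt, steps):
    """Advect particles that carry level-set values through a velocity
    field. Only positions move; the carried values are transported
    unchanged, so solid-body motion introduces no numerical diffusion
    in the values themselves. Forward-Euler stepping for brevity."""
    pos = positions.copy()
    for _ in range(steps):
        pos = pos + dt * velocity(pos)
    return pos, values

def rotation(pos):
    """Rigid rotation about the origin: u(x, y) = (-y, x)."""
    return np.stack([-pos[:, 1], pos[:, 0]], axis=1)
```

    With a small time step, a particle started at (1, 0) ends up near (0, 1) after a quarter turn, and its level-set value is exactly preserved.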

  5. Interpretable Decision Sets: A Joint Framework for Description and Prediction

    PubMed Central

    Lakkaraju, Himabindu; Bach, Stephen H.; Leskovec, Jure

    2016-01-01

    One of the most important obstacles to deploying predictive models is the fact that humans do not understand and trust them. Knowing which variables are important in a model’s prediction and how they are combined can be very powerful in helping people understand and trust automatic decision making systems. Here we propose interpretable decision sets, a framework for building predictive models that are highly accurate, yet also highly interpretable. Decision sets are sets of independent if-then rules. Because each rule can be applied independently, decision sets are simple, concise, and easily interpretable. We formalize decision set learning through an objective function that simultaneously optimizes accuracy and interpretability of the rules. In particular, our approach learns short, accurate, and non-overlapping rules that cover the whole feature space and pay attention to small but important classes. Moreover, we prove that our objective is a non-monotone submodular function, which we efficiently optimize to find a near-optimal set of rules. Experiments show that interpretable decision sets are as accurate at classification as state-of-the-art machine learning techniques. They are also three times smaller on average than rule-based models learned by other methods. Finally, results of a user study show that people are able to answer multiple-choice questions about the decision boundaries of interpretable decision sets and write descriptions of classes based on them faster and more accurately than with other rule-based models that were designed for interpretability. Overall, our framework provides a new approach to interpretable machine learning that balances accuracy, interpretability, and computational efficiency. PMID:27853627
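
    Because the rules of a decision set apply independently, prediction is little more than a rule scan plus a default label. The rules and names below are hypothetical, purely for illustration; the learned sets in the paper are non-overlapping by construction, so the order-based tie-breaking here is only a fallback.

```python
def predict(decision_set, default_label, example):
    """Apply an interpretable decision set: each if-then rule fires
    independently; an example matched by no rule gets the default."""
    for condition, label in decision_set:
        if condition(example):
            return label
    return default_label

# Two illustrative if-then rules over a dict-shaped example.
rules = [
    (lambda e: e["age"] < 30 and e["smoker"], "high-risk"),
    (lambda e: e["age"] >= 60, "high-risk"),
]
```

    Each rule can be read, checked, or removed on its own, which is exactly the interpretability property the framework optimizes for.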

  7. Level Set Segmentation of Lumbar Vertebrae Using Appearance Models

    NASA Astrophysics Data System (ADS)

    Fritscher, Karl; Leber, Stefan; Schmölz, Werner; Schubert, Rainer

    For the planning of surgical interventions of the spine, exact knowledge of the 3D shape and the local bone quality of vertebrae is of great importance in order to estimate the anchorage strength of screws or implants. As a prerequisite for quantitative analysis, a method for objective and therefore automated segmentation of vertebrae is needed. In this paper a framework for the automatic segmentation of vertebrae using 3D appearance models in a level set framework is presented. In this framework, model information as well as gradient information and probabilities of pixel intensities at object edges in the unseen image are used. The method is tested on 29 lumbar vertebrae, leading to accurate results, which can be useful for surgical planning and further analysis of the local bone quality.

  8. Setting the stage for master's level success

    NASA Astrophysics Data System (ADS)

    Roberts, Donna

    Comprehensive reading, writing, research, and study skills play a critical role in a graduate student's success and ability to contribute to a field of study effectively. The literature indicated a need to support graduate student success in the areas of mentoring and navigation, as well as research and writing. The purpose of this two-phased mixed methods explanatory study was to examine factors that characterize student success at the Master's level in the fields of education, sociology and social work. The study was grounded in a transformational learning framework which focused on three levels of learning: technical knowledge, practical or communicative knowledge, and emancipatory knowledge. The study included two data collection points. Phase one consisted of a Master's Level Success questionnaire that was sent via Qualtrics to graduate level students at three colleges and universities in the Central Valley of California: a California State University campus, a University of California campus, and a private college campus. The results of the chi-square indicated that seven questionnaire items were significant with p values less than .05. Phase two of the data collection consisted of semi-structured interviews; analysis using Dedoose software yielded three themes: (1) the need for more language and writing support at the Master's level, (2) the need for mentoring, especially for second-language learners, and (3) utilizing the strong influence of faculty in student success. It is recommended that institutions continually assess and strengthen their programs to meet the full range of learners and to support students to degree completion.

  9. Level Set Segmentation of Rat Brain MRIs

    NASA Astrophysics Data System (ADS)

    Eiben, Björn; Kunz, Dietmar; Pietrzyk, Uwe; Palm, Christoph

    In this work, the segmentation of brain tissue from head images of rats using level set methods is proposed. For this purpose, a two-dimensional, contrast-based approach is extended to a three-dimensional segmenter that adapts locally to the image intensity. It is shown that this true 3D approach takes local image structures better into account. In particular, magnetic resonance images (MRIs) with global brightness gradients, caused for example by surface coils, can be segmented more reliably and without additional preprocessing steps in this way. The performance of the algorithm is demonstrated experimentally on three rat brain MRIs.

  10. Simulation of Etching Profiles Using Level Sets

    NASA Technical Reports Server (NTRS)

    Hwang, Helen; Govindan, T. R.; Meyyappan, M.; Arnold, James O. (Technical Monitor)

    1998-01-01

    Using plasma discharges to etch trenches and via holes in substrates is an important process in semiconductor manufacturing. Ion enhanced etching involves both neutral fluxes, which are isotropic, and ion fluxes, which are anisotropic. The angular distributions of the ions determine the degree of vertical etch, while the amount of the neutral fluxes determines the etch rate. We have developed a 2D profile evolution simulation which uses level set methods to model the plasma-substrate interface. Using level sets instead of traditional string models avoids the use of complicated delooping algorithms. The simulation calculates the etch rate based on the fluxes and distribution functions of both ions and neutrals. We will present etching profiles of Si substrates in low pressure (tens of mTorr) Ar/Cl2 discharges for a variety of incident ion angular distributions. Both ion and neutral re-emission fluxes are included in the calculation of the etch rate, and their contributions to the total etch profile will be demonstrated. In addition, we will show RIE lag effects as a function of different trench aspect ratios. (For sample profiles, please see http://www.ipt.arc.nasa.gov/hwangfig1.html)

  11. Etch Profile Simulation Using Level Set Methods

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    Etching and deposition of materials are critical steps in semiconductor processing for device manufacturing. Both etching and deposition may have isotropic and anisotropic components, due to directional sputtering and redeposition of materials, for example. Previous attempts at modeling profile evolution have used so-called "string theory" to simulate the moving solid-gas interface between the semiconductor and the plasma. One complication of this method is that extensive de-looping schemes are required at the profile corners. We will present a 2D profile evolution simulation using level set theory to model the surface. By embedding the location of the interface in a field variable, the need for de-looping schemes is eliminated and profile corners are more accurately modeled. This level set profile evolution model will calculate both isotropic and anisotropic etch and deposition rates of a substrate in low pressure (tens of mTorr) plasmas, considering the incident ion energy angular distribution functions and neutral fluxes. We will present etching profiles of Si substrates in Ar/Cl2 discharges for various incident ion energies and trench geometries.
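
    The key point, embedding the interface as the zero level of a field variable, can be sketched with a first-order upwind update of phi_t + F |grad phi| = 0 for a constant etch speed F. This is a generic Osher-Sethian-style scheme, not the simulation described above; the grid size, the periodic boundary handling via np.roll, and all names are illustrative assumptions.

```python
import numpy as np

def evolve(phi, speed, dx, dt, steps):
    """First-order level-set update for phi_t + F |grad phi| = 0 on a
    2-D grid with spatially constant speed F > 0 (Godunov upwinding).
    The zero contour of phi is the moving interface; corners need no
    de-looping because the interface is implicit in the field.
    np.roll imposes periodic boundaries, fine while the front stays
    away from the domain edge."""
    for _ in range(steps):
        dmx = (phi - np.roll(phi, 1, axis=0)) / dx   # backward diffs
        dpx = (np.roll(phi, -1, axis=0) - phi) / dx  # forward diffs
        dmy = (phi - np.roll(phi, 1, axis=1)) / dx
        dpy = (np.roll(phi, -1, axis=1) - phi) / dx
        grad = np.sqrt(np.maximum(dmx, 0)**2 + np.minimum(dpx, 0)**2 +
                       np.maximum(dmy, 0)**2 + np.minimum(dpy, 0)**2)
        phi = phi - dt * speed * grad
    return phi
```

    Starting from the signed distance to a circle of radius 0.3 and etching at unit speed for time 0.1, the zero contour moves out to radius roughly 0.4, with no explicit front tracking.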

  12. Texture descriptor approaches to level set segmentation in medical images

    NASA Astrophysics Data System (ADS)

    Olveres, Jimena; Nava, Rodrigo; Moya-Albor, Ernesto; Escalante-Ramírez, Boris; Brieva, Jorge; Cristóbal, Gabriel; Vallejo, Enrique

    2014-05-01

    Medical image analysis has become an important tool for improving medical diagnosis and planning treatments. It involves volume or still image segmentation that plays a critical role in understanding image content by facilitating extraction of the anatomical organ or region-of-interest. It also may help towards the construction of reliable computer-aided diagnosis systems. Specifically, level set methods have emerged as a general framework for image segmentation; such methods are mainly based on gradient information and provide satisfactory results. However, the noise inherent to images and the lack of contrast information between adjacent regions hamper the performance of the algorithms; thus, other proposals have been suggested in the literature, for instance, characterizing regions with statistical parametric models to guide level set evolution. In this paper, we study the influence of texture on level-set-based segmentation and propose the use of Hermite features that are incorporated into the level set model to improve organ segmentation, which may be useful for quantifying left ventricular blood flow. The proposal was also compared against other texture descriptors such as local binary patterns, image derivatives, and Hounsfield low attenuation values.

  13. A probabilistic level set formulation for interactive organ segmentation

    NASA Astrophysics Data System (ADS)

    Cremers, Daniel; Fluck, Oliver; Rousson, Mikael; Aharon, Shmuel

    2007-03-01

    Level set methods have become increasingly popular as a framework for image segmentation. Yet when used as a generic segmentation tool, they suffer from an important drawback: Current formulations do not allow much user interaction. Upon initialization, boundaries propagate to the final segmentation without the user being able to guide or correct the segmentation. In the present work, we address this limitation by proposing a probabilistic framework for image segmentation which integrates input intensity information and user interaction on equal footings. The resulting algorithm determines the most likely segmentation given the input image and the user input. In order to allow a user interaction in real-time during the segmentation, the algorithm is implemented on a graphics card and in a narrow band formulation.

  14. Iris segmentation using variational level set method

    NASA Astrophysics Data System (ADS)

    Roy, Kaushik; Bhattacharya, Prabir; Suen, Ching Y.

    2011-04-01

    Continuous efforts have been made to process degraded iris images for enhancement of the iris recognition performance in unconstrained situations. Recently, many researchers have focused on developing iris segmentation techniques which can deal with iris images in a non-cooperative environment, where the probability of acquiring unideal iris images is very high due to gaze deviation, noise, blurring, and occlusion by eyelashes, eyelids, glasses, and hair. Although there have been many iris segmentation methods, most focus primarily on the accurate detection of iris images captured in a closely controlled environment. The novelty of this research effort is that we propose to apply a variational level set-based curve evolution scheme that uses a significantly larger time step to numerically solve the evolution partial differential equation (PDE) for accurate segmentation of an unideal iris image, thereby speeding up the curve evolution process drastically. The iris boundary represented by the variational level set may break and merge naturally during evolution, and thus, the topological changes are handled automatically. The proposed variational model is also robust against poor localization and weak iris/sclera boundaries. In order to solve the size irregularities occurring due to arbitrary shapes of the extracted iris/pupil regions, a simple method is applied based on connection of adjacent contour points. Furthermore, to reduce the noise effect, we apply a pixel-wise adaptive 2D Wiener filter. The verification and identification performance of the proposed scheme is validated on three challenging iris image datasets, namely, the ICE 2005, the WVU Unideal, and the UBIRIS Version 1.

  15. Framework for State-Level Renewable Energy Market Potential Studies

    SciTech Connect

    Kreycik, C.; Vimmerstedt, L.; Doris, E.

    2010-01-01

    State-level policymakers are relying on estimates of the market potential for renewable energy resources as they set goals and develop policies to accelerate the development of these resources. Therefore, accuracy of such estimates should be understood and possibly improved to appropriately support these decisions. This document provides a framework and next steps for state officials who require estimates of renewable energy market potential. The report gives insight into how to conduct a market potential study, including what supporting data are needed and what types of assumptions need to be made. The report distinguishes between goal-oriented studies and other types of studies, and explains the benefits of each.

  16. Advanced level set segmentation of the right atrium in MR

    NASA Astrophysics Data System (ADS)

    Chen, Siqi; Kohlberger, Timo; Kirchberg, Klaus J.

    2011-03-01

    Atrial fibrillation is a common heart arrhythmia, and can be effectively treated with ablation. Ablation planning requires 3D models of the patient's left atrium (LA) and/or right atrium (RA), therefore an automatic segmentation procedure to retrieve these models is desirable. In this study, we investigate the use of advanced level set segmentation approaches to automatically segment the RA in magnetic resonance angiographic (MRA) volume images. A low contrast to noise ratio makes the boundary between the RA and the nearby structures nearly indistinguishable. Therefore, pure data driven segmentation approaches such as watershed and Chan-Vese methods are bound to fail. Incorporating training shapes through PCA modeling to constrain the segmentation is one popular solution, and is also used in our segmentation framework. The shape parameters from PCA are optimized with a global histogram based energy model. However, since the shape parameters span a much smaller space, they cannot capture fine details of the shape. Therefore, we employ a second refinement step after the shape based segmentation stage, which follows closely the recent work of localized appearance model based techniques. The local appearance model is established through a robust point tracking mechanism and is learned through landmarks embedded on the surface of training shapes. The key contribution of our work is the combination of a statistical shape prior and a localized appearance prior for level set segmentation of the right atrium from MRA. We test this two step segmentation framework on porcine RA to verify the algorithm.
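
    The first, shape-constrained stage rests on a standard PCA shape model: candidate segmentations are restricted to the mean training shape plus a few principal modes of variation, which is also why fine details require the later refinement step. Below is a generic sketch of such a model over stacked landmark vectors; all names are illustrative, not the authors' code.

```python
import numpy as np

def fit_shape_model(training_shapes, n_modes):
    """PCA shape model: rows of training_shapes are flattened landmark
    vectors. Returns the mean shape and the first n_modes principal
    modes of variation (rows of V^T from the SVD of centered data)."""
    mean = training_shapes.mean(axis=0)
    centered = training_shapes - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_modes]

def reconstruct(mean, modes, coeffs):
    """Shape generated by a small parameter vector b: s = mean + b V."""
    return mean + coeffs @ modes
```

    Optimizing only the low-dimensional coefficient vector (instead of a free-form contour) is what keeps the shape-constrained stage robust to the weak RA boundaries.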

  17. A Bayesian Level-Set Inversion Protocol for Structural Zonation

    NASA Astrophysics Data System (ADS)

    Cardiff, M.; Kitanidis, P.

    2008-12-01

    Mapping the variability of subsurface properties via indirect methods is of great importance for problems in contaminant remediation and resource evaluation. In general, methods for inverse modeling commonly assume smooth and/or geostatistical distributions of the parameters being estimated. However, especially for field- and catchment-scale inverse problems, the existence of distinct, separate geologic facies is not consistent with the assumptions of these inversion techniques. Because of this drawback, it is important that we develop inversion methods that are built for imaging so-called "structural" parameter fields accurately. In our presentation, we discuss the use of a facies-based level set method for imaging geologic parameter fields. The level set framework is applicable when subsurface heterogeneity can be adequately represented as a set of relatively homogeneous geologic facies separated by sharp boundaries. During the inversion optimization, the shapes of the boundaries between facies are optimized in order to improve the data fit. Our method can represent boundaries between arbitrary numbers of facies, and extensions to joint inversion can be handled without relying on petrophysical relations. As examples, we present several synthetic inverse problems that cover realistic estimation problems using nonlinear models with multiple datasets. Throughout our work, we adopt a Bayesian perspective which allows integration of prior information as well as linearized estimation of uncertainty in the boundary locations.

  18. Decentralized health care priority-setting in Tanzania: evaluating against the accountability for reasonableness framework.

    PubMed

    Maluka, Stephen; Kamuzora, Peter; San Sebastián, Miguel; Byskov, Jens; Olsen, Øystein E; Shayo, Elizabeth; Ndawi, Benedict; Hurtig, Anna-Karin

    2010-08-01

    Priority-setting has become one of the biggest challenges faced by health decision-makers worldwide. Fairness is a key goal of priority-setting and Accountability for Reasonableness has emerged as a guiding framework for fair priority-setting. This paper describes the processes of setting health care priorities in Mbarali district, Tanzania, and evaluates the descriptions against Accountability for Reasonableness. Key informant interviews were conducted with district health managers, local government officials and other stakeholders using a semi-structured interview guide. Relevant documents were also gathered and group priority-setting in the district was observed. The results indicate that, while Tanzania has a decentralized public health care system, the reality of the district level priority-setting process was that it was not nearly as participatory as the official guidelines suggest it should have been. Priority-setting usually occurred in the context of budget cycles and the process was driven by historical allocation. Stakeholders' involvement in the process was minimal. Decisions (but not the reasoning behind them) were publicized through circulars and notice boards, but there were no formal mechanisms in place to ensure that this information reached the public. There were neither formal mechanisms for challenging decisions nor an adequate enforcement mechanism to ensure that decisions were made in a fair and equitable manner. Therefore, priority-setting in Mbarali district did not satisfy all four conditions of Accountability for Reasonableness; namely relevance, publicity, appeals and revision, and enforcement. This paper aims to make two important contributions to this problematic situation. First, it provides empirical analysis of priority-setting at the district level in the contexts of low-income countries. Second, it provides guidance to decision-makers on how to improve fairness, legitimacy, and sustainability of the priority-setting process.

  19. A modular framework for gene set analysis integrating multilevel omics data

    PubMed Central

    Sass, Steffen; Buettner, Florian; Mueller, Nikola S.; Theis, Fabian J.

    2013-01-01

    Modern high-throughput methods allow the investigation of biological functions across multiple ‘omics’ levels. Levels include mRNA and protein expression profiling as well as additional knowledge on, for example, DNA methylation and microRNA regulation. The reason for this interest in multi-omics is that actual cellular responses to different conditions are best explained mechanistically when taking all omics levels into account. To map gene products to their biological functions, public ontologies like Gene Ontology are commonly used. Many methods have been developed to identify terms in an ontology that are overrepresented within a set of genes. However, these methods are not able to appropriately deal with arbitrary combinations of several data types. Here, we propose a new method to analyse integrated data across multiple omics levels to simultaneously assess their biological meaning. We developed a model-based Bayesian method for inferring interpretable term probabilities in a modular framework. Our Multi-level ONtology Analysis (MONA) algorithm performed significantly better than conventional analyses of individual levels and yields best results even for sophisticated models including mRNA fine-tuning by microRNAs. The MONA framework is flexible enough to allow for different underlying regulatory motifs or ontologies. It is ready-to-use for applied researchers and is available as a standalone application from http://icb.helmholtz-muenchen.de/mona. PMID:23975194

  20. Beyond SMART? A New Framework for Goal Setting

    ERIC Educational Resources Information Center

    Day, Trevor; Tosey, Paul

    2011-01-01

    This article extends currently reported theory and practice in the use of learning goals or targets with students in secondary and further education. Goal-setting and action-planning constructs are employed in personal development plans (PDPs) and personal learning plans (PLPs) and are advocated as practice within the English national policy…

  1. Confidence sets for optimal factor levels of a response surface.

    PubMed

    Wan, Fang; Liu, Wei; Bretz, Frank; Han, Yang

    2016-12-01

    Construction of confidence sets for the optimal factor levels is an important topic in response surface methodology. In Wan et al. (2015), an exact (1-α) confidence set was provided for a maximum or minimum point (i.e., an optimal factor level) of a univariate polynomial function in a given interval. In this article, the method is extended to construct an exact (1-α) confidence set for the optimal factor levels of response surfaces. The construction method is readily applied to many parametric and semiparametric regression models involving a quadratic function. A conservative confidence set is provided as an intermediate step in the construction of the exact confidence set. Two examples are given to illustrate the application of the confidence sets. The comparison between confidence sets indicates that our exact confidence set is better than the only other confidence set available in the statistical literature that guarantees the (1-α) confidence level.
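
    The point estimate at the heart of such a confidence set is the maximiser of a fitted quadratic over the design interval. The sketch below computes only that point estimate for the univariate case; the exact (1-α) set construction of the paper is substantially more involved, and all names here are illustrative.

```python
import numpy as np

def optimal_factor_level(x, y, interval):
    """Fit a univariate quadratic response surface by least squares and
    return the factor level that maximises it over the interval: the
    candidates are the two endpoints and, if it falls inside, the
    stationary point -c1 / (2 c2) of the fitted polynomial."""
    c2, c1, c0 = np.polyfit(x, y, 2)  # highest degree first
    lo, hi = interval
    candidates = [lo, hi]
    if c2 != 0:
        stat = -c1 / (2 * c2)
        if lo <= stat <= hi:
            candidates.append(stat)
    return max(candidates, key=lambda t: c2 * t**2 + c1 * t + c0)
```

    For a concave fit the stationary point wins; for a convex one the maximum over the interval sits at an endpoint, which is why both endpoints stay in the candidate list.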

  2. A Systematic Framework for Addressing Treatment Integrity in School Settings

    ERIC Educational Resources Information Center

    Kupzyk, Sara; Shriver, Mark D.

    2016-01-01

    School psychologists are tasked with ensuring treatment integrity because the level of intervention implementation affects decisions about student progress. Treatment integrity includes multiple dimensions that may impact the effectiveness of an intervention including adherence, dosage, quality, and engagement. Unfortunately, treatment integrity…

  3. A contribution to set a legal framework for biofertilisers.

    PubMed

    Malusá, E; Vassilev, N

    2014-08-01

    The extensive research, production and use of microorganisms to improve plant nutrition have resulted in an inconsistent definition of the term "biofertiliser" which, in some cases, is due to the different microbial mechanisms involved. The rationale for adopting the term biofertiliser is that it derives from "biological fertiliser", which, in turn, implies the use of living microorganisms. Here, we propose a definition for this kind of product that distinguishes it from biostimulants and other inorganic and organic fertilisers. Special emphasis is given to microorganisms with multifunctional properties and biofertilisers containing more than one microorganism. This definition could be included in legal provisions regulating registration and marketing requirements. A set of rules is also proposed which could guarantee the quality of biofertilisers present on the market and thus foster their use by farmers.

  4. Tailoring Healthy Workplace Interventions to Local Healthcare Settings: A Complexity Theory-Informed Workplace of Well-Being Framework.

    PubMed

    Brand, Sarah L; Fleming, Lora E; Wyatt, Katrina M

    2015-01-01

    Many healthy workplace interventions have been developed for healthcare settings to address the consistently low scores of healthcare professionals on assessments of mental and physical well-being. Complex healthcare settings present challenges for the scale-up and spread of successful interventions from one setting to another. Despite general agreement regarding the importance of the local setting in affecting intervention success across different settings, there is no consensus on what it is about a local setting that needs to be taken into account to design healthy workplace interventions appropriate for different local settings. Complexity theory principles were used to understand a workplace as a complex adaptive system and to create a framework of eight domains (system characteristics) that affect the emergence of system-level behaviour. This Workplace of Well-being (WoW) framework is responsive and adaptive to local settings and allows a shared understanding of the enablers and barriers to behaviour change by capturing local information for each of the eight domains. We use the results of applying the WoW framework to one workplace, a UK National Health Service ward, to describe the utility of this approach in informing design of setting-appropriate healthy workplace interventions that create workplaces conducive to healthy behaviour change.

  5. Identifying Heterogeneities in Subsurface Environment using the Level Set Method

    SciTech Connect

    Lei, Hongzhuan; Lu, Zhiming; Vesselinov, Velimir Valentinov

    2016-08-25

    These are slides from a presentation on identifying heterogeneities in the subsurface environment using the level set method. The slides begin with the motivation, explain the Level Set Method (LSM) and its algorithms, present some examples, and conclude with future work.

  6. A new level set model for multimaterial flows

    SciTech Connect

    Starinshak, David P.; Karni, Smadar; Roe, Philip L.

    2014-01-08

We present a new level set model for representing multimaterial flows in multiple space dimensions. Instead of associating a level set function with a specific fluid material, the function is associated with a pair of materials and the interface that separates them. A voting algorithm collects sign information from all level sets and determines material designations. M(M−1)/2 level set functions might be needed to represent a general M-material configuration; problems of practical interest use far fewer functions, since not all pairs of materials share an interface. The new model is less prone to producing indeterminate material states, i.e., regions claimed by more than one material (overlaps) or no material at all (vacuums). It outperforms existing material-based level set models without the need for reinitialization schemes, thereby avoiding additional computational costs and preventing excessive numerical diffusion.
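
The pairwise voting idea can be sketched numerically. The following is a minimal one-dimensional illustration of sign-based voting over pairwise level sets (the helper name and the material configuration are illustrative assumptions, not code from the paper):

```python
import numpy as np

def vote_materials(pairwise, M, shape):
    # pairwise maps (i, j) with i < j to a level-set array phi_ij;
    # positive values vote for material i, negative values for material j.
    votes = np.zeros((M,) + shape)
    for (i, j), phi in pairwise.items():
        votes[i] += (phi > 0)
        votes[j] += (phi < 0)
    return np.argmax(votes, axis=0)

# Three materials on [0, 1]: material 0 on [0, 0.4), 1 on [0.4, 0.7), 2 on [0.7, 1].
x = np.linspace(0.0, 1.0, 101)
pairwise = {
    (0, 1): 0.4 - x,   # interface between materials 0 and 1 at x = 0.4
    (1, 2): 0.7 - x,   # interface between materials 1 and 2 at x = 0.7
    (0, 2): 0.4 - x,   # materials 0 and 2 never meet; any consistent sign choice works
}
labels = vote_materials(pairwise, 3, x.shape)
```

With M = 3, all M(M−1)/2 = 3 pairwise functions are used here; in practice, pairs that never share an interface can simply be dropped.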

  7. From Mouth-level to Tooth-level DMFS: Conceptualizing a Theoretical Framework

    PubMed Central

    Bandyopadhyay, Dipankar

    2015-01-01

Objective There is no dearth of correlated count data in biological or clinical settings, and the ability to accurately analyze and interpret such data remains an exciting area of research. In oral health epidemiology, the Decayed, Missing, Filled (DMF) index has been used continuously for over 70 years as the key measure to quantify caries experience. The DMF index summarizes a subject's caries status using either the DMF(T), the total number of DMF teeth, or the DMF(S), the total number of DMF tooth surfaces, for that subject. However, surfaces within a particular tooth or subject constitute clustered data, and the DMFS mostly overlooks this clustering effect to attain an over-simplified summary index, ignoring the true tooth-level caries status. Moreover, the DMFT/DMFS might exhibit an excess of some specific counts (say, zeroes, representing a relatively disease-free carious state) or overdispersion, and accounting for these excess responses or for overdispersion remains a key component in selecting the appropriate modeling strategy. Methods & Results This concept paper presents the rationale and the theoretical framework a dental researcher might consider at the outset in order to choose a plausible statistical model for tooth-level DMFS. Various nuances related to model fitting, selection, and parameter interpretation are also explained. Conclusion The author recommends that conceptualizing the correct stochastic framework serve as the guiding force in the dental researcher's never-ending goal of assessing complex covariate-response relationships efficiently. PMID:26618183
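
As a minimal illustration of the zero-inflation idea discussed above (a sketch, not the paper's model), a zero-inflated Poisson mixes a point mass at zero with an ordinary Poisson count distribution:

```python
import math

def zip_pmf(k, lam, pi):
    # Zero-inflated Poisson: with probability pi the count is a structural zero,
    # otherwise it is drawn from a Poisson(lam) distribution.
    pois = math.exp(-lam) * lam**k / math.factorial(k)
    return pi * (k == 0) + (1 - pi) * pois

# Hypothetical surface-level caries counts: 40% structural zeros, Poisson mean 2.5.
p0 = zip_pmf(0, 2.5, 0.4)                              # inflated zero probability
total = sum(zip_pmf(k, 2.5, 0.4) for k in range(50))   # pmf mass (sums to 1)
```

The probability of a zero count, pi + (1 − pi)e^(−λ), exceeds the plain Poisson value e^(−λ), which is exactly the excess-zero behavior the abstract describes.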

  8. Public health and health promotion capacity at national and regional level: a review of conceptual frameworks.

    PubMed

    Aluttis, Christoph; den Broucke, Stephan Van; Chiotan, Cristina; Costongs, Caroline; Michelsen, Kai; Brand, Helmut

    2014-03-26

The concept of capacity building for public health has gained much attention during the last decade. National as well as international organizations increasingly focus their efforts on capacity building to improve performance in the health sector. During the past two decades, a variety of conceptual frameworks have been developed which describe relevant dimensions for public health capacity. Notably, these frameworks differ in design and conceptualization. This paper therefore reviews the existing conceptual frameworks and integrates them into one framework, which contains the most relevant dimensions for public health capacity at the country or regional level. A comprehensive literature search was performed to identify frameworks addressing public health capacity building at the national or regional level. We content-analysed these frameworks to identify the core dimensions of public health capacity. The dimensions were subsequently synthesized into a set of thematic areas to construct a conceptual framework which describes the most relevant dimensions for capacities at the national or regional level. The systematic review resulted in the identification of seven core domains for public health capacity: resources, organizational structures, workforce, partnerships, leadership and governance, knowledge development, and country-specific context. Accordingly, these dimensions were used to construct a framework which describes these core domains in more detail. Our research shows that although there is no generally agreed-upon model of public health capacity, a number of key domains for public health and health promotion capacity recur consistently in existing frameworks, regardless of their geographical location or thematic area. As only little work on the core concepts of public health capacities has yet taken place, this study adds value to the discourse by identifying these consistencies across existing frameworks and by synthesising them into a new framework.

  9. A 3D Level Set Method for Microwave Breast Imaging

    PubMed Central

    Colgan, Timothy J.; Hagness, Susan C.; Van Veen, Barry D.

    2015-01-01

Objective Conventional inverse-scattering algorithms for microwave breast imaging result in moderate resolution images with blurred boundaries between tissues. Recent 2D numerical microwave imaging studies demonstrate that the use of a level set method preserves dielectric boundaries, resulting in a more accurate, higher resolution reconstruction of the dielectric properties distribution. Previously proposed level set algorithms are computationally expensive and thus impractical in 3D. In this paper we present a computationally tractable 3D microwave imaging algorithm based on level sets. Methods We reduce the computational cost of the level set method using a Jacobian matrix, rather than an adjoint method, to calculate Fréchet derivatives. We demonstrate the feasibility of 3D imaging using simulated array measurements from 3D numerical breast phantoms. We evaluate performance by comparing full 3D reconstructions to those from a conventional microwave imaging technique. We also quantitatively assess the efficacy of our algorithm in evaluating breast density. Results Our reconstructions of 3D numerical breast phantoms improve upon those of a conventional microwave imaging technique. The density estimates from our level set algorithm are more accurate than those of conventional microwave imaging, and the accuracy is greater than that reported for mammographic density estimation. Conclusion Our level set method leads to a feasible level of computational complexity for full 3D imaging, and reconstructs the heterogeneous dielectric properties distribution of the breast more accurately than conventional microwave imaging methods. Significance 3D microwave breast imaging using a level set method is a promising low-cost, non-ionizing alternative to current breast imaging techniques. PMID:26011863

  10. Hippocampus segmentation using locally weighted prior based level set

    NASA Astrophysics Data System (ADS)

    Achuthan, Anusha; Rajeswari, Mandava

    2015-12-01

Segmentation of the hippocampus is one of the major challenges in medical image segmentation due to its imaging characteristics: its intensity is almost identical to that of adjacent gray matter structures, such as the amygdala. This intensity similarity causes the hippocampus to have weak or fuzzy boundaries. Given this main challenge, a segmentation method that relies on image information alone may not produce accurate segmentation results. Prior information, such as shape and spatial information, therefore needs to be assimilated into existing segmentation methods to produce the expected segmentation. Previous studies have widely integrated prior information into segmentation methods. However, the prior information has usually been integrated in a global manner, which does not reflect the real scenario during clinical delineation. Therefore, in this paper, prior information is integrated locally into a level set model. This work utilizes a mean shape model, integrated as prior information into the level set model, to provide automatic initialization for the level set evolution. The local integration of edge-based information and prior information is implemented through an edge weighting map that decides, at the voxel level, which information should be observed during the level set evolution. The edge weighting map indicates which voxels carry sufficient edge information. Experiments show that the proposed local integration of prior information into a conventional edge-based level set model, the geodesic active contour, yields a 9% improvement in the average Dice coefficient.

  11. The exchange boundary framework: understanding the evolution of power within collaborative decision-making settings.

    PubMed

    Watson, Erin R; Foster-Fishman, Pennie G

    2013-03-01

Many community decision-making bodies encounter challenges in creating conditions where stakeholders from disadvantaged populations can authentically participate in ways that give them actual influence over decisions affecting their lives (Foster-Fishman et al., Lessons for the journey: Strategies and suggestions for guiding planning, governance, and sustainability in comprehensive community initiatives. W.K. Kellogg Foundation, Battle Creek, MI, 2004). These challenges are often rooted in asymmetrical power dynamics operating within the settings (Prilleltensky, J Commun Psychol 36:116-136, 2008). In response, this paper presents the Exchange Boundary Framework, a new approach for understanding and promoting authentic, empowered participation within collaborative decision-making settings. The framework expands upon theories currently used in the field of community psychology by focusing on the underlying processes through which power operates in relationships and examining the evolution of power dynamics over time. By integrating concepts from social exchange theory (Emerson, Am Soc Rev 27:31-41, 1962) and social boundaries theory (Hayward, Polity 31(1):1-22, 1998), the framework situates power within parallel processes of resource exchange and social regulation. The framework can be used to understand the conditions leading to power asymmetries within collaborative decision-making processes, and to guide efforts to promote more equitable and authentic participation by all stakeholders within these settings. In this paper we describe the Exchange Boundary Framework, apply it to three distinct case studies, and discuss key considerations for its application within collaborative community settings.

  12. Construal level mind-sets moderate self- and social stereotyping.

    PubMed

    McCrea, Sean M; Wieber, Frank; Myers, Andrea L

    2012-01-01

    Construal level theory suggests that events and objects can be represented at either a higher, more abstract level involving consideration of superordinate goals, desirability, global processing, and broad categorizations or a lower, more concrete level involving consideration of subordinate goals, feasibility, local processing, and narrow categorizations. Analogously, social targets (including the self) can be represented more broadly, as members of a group, or more narrowly, as individuals. Because abstract construals induce a similarity focus, they were predicted to increase the perceived fit between social targets and a salient social category. Accordingly, placing individuals into a more abstract construal mind-set via an unrelated task increased the activation and use of stereotypes of salient social groups, stereotype-consistent trait ratings of the self, group identification, and stereotype-consistent performance relative to more concrete construal mind-sets. Thus, nonsocial contextual influences (construal level mind-sets) affect stereotyping of self and others.

  13. A PDE-Based Fast Local Level Set Method

    NASA Astrophysics Data System (ADS)

    Peng, Danping; Merriman, Barry; Osher, Stanley; Zhao, Hongkai; Kang, Myungjoo

    1999-11-01

We develop a fast method to localize the level set method of Osher and Sethian (1988, J. Comput. Phys. 79, 12) and address two important issues that are intrinsic to the level set method: (a) how to extend a quantity that is given only on the interface to a neighborhood of the interface; (b) how to reset the level set function to be a signed distance function to the interface efficiently without appreciably moving the interface. This fast local level set method reduces the computational effort by one order of magnitude, works in as much generality as the original one, and is conceptually simple and easy to implement. Our approach differs from previous related works in that we extract all the information needed from the level set function (or functions in multiphase flow) and do not need to find explicitly the location of the interface in the space domain. The complexity of our method to do tasks such as extension and distance reinitialization is O(N), where N is the number of points in space, not O(N log N) as in works by Sethian (1996, Proc. Nat. Acad. Sci. 93, 1591) and Helmsen and co-workers (1996, SPIE Microlithography IX, p. 253). This complexity estimation is also valid for quite general geometrically based front motion for our localized method.
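
The localization idea, updating the level set only inside a narrow tube around the interface, can be sketched in one dimension as follows (a minimal upwind-advection illustration with assumed parameters, not the authors' implementation):

```python
import numpy as np

def advect_local(phi, vel, dx, dt, width):
    # Advance phi by one step of the advection equation phi_t + vel * phi_x = 0,
    # but only inside the narrow band |phi| < width around the interface.
    band = np.abs(phi) < width
    dphi = np.zeros_like(phi)
    dphi[1:] = (phi[1:] - phi[:-1]) / dx   # backward difference (upwind for vel > 0)
    phi_new = phi.copy()
    phi_new[band] = phi[band] - dt * vel * dphi[band]
    return phi_new

x = np.arange(-100, 101) * 0.01          # grid on [-1, 1]
phi = x.copy()                           # signed distance to an interface at x = 0
phi = advect_local(phi, vel=1.0, dx=0.01, dt=0.005, width=0.2)
```

The zero crossing moves right by vel·dt = 0.005 while the grid points outside the tube are never touched, which is where the order-of-magnitude saving comes from.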

  14. Level-Set Topology Optimization with Aeroelastic Constraints

    NASA Technical Reports Server (NTRS)

    Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia

    2015-01-01

    Level-set topology optimization is used to design a wing considering skin buckling under static aeroelastic trim loading, as well as dynamic aeroelastic stability (flutter). The level-set function is defined over the entire 3D volume of a transport aircraft wing box. Therefore, the approach is not limited by any predefined structure and can explore novel configurations. The Sequential Linear Programming (SLP) level-set method is used to solve the constrained optimization problems. The proposed method is demonstrated using three problems with mass, linear buckling and flutter objective and/or constraints. A constraint aggregation method is used to handle multiple buckling constraints in the wing skins. A continuous flutter constraint formulation is used to handle difficulties arising from discontinuities in the design space caused by a switching of the critical flutter mode.

  15. An improved level set method for vertebra CT image segmentation

    PubMed Central

    2013-01-01

Background Clinical diagnosis and therapy for lumbar disc herniation require accurate vertebra segmentation. The complex anatomical structure and the degenerative deformations of the vertebrae make segmentation challenging. Methods An improved level set method, namely the edge- and region-based level set method (ERBLS), is proposed for vertebra CT image segmentation. By considering the gradient information and local region characteristics of images, the proposed model can efficiently segment images with intensity inhomogeneity and blurry or discontinuous boundaries. To reduce the dependency on manual initialization present in many active contour models and to enable automatic segmentation, a simple initialization method for the level set function is built, which utilizes the Otsu threshold. In addition, the need for the costly re-initialization procedure is completely eliminated. Results Experimental results on both synthetic and real images demonstrate that the proposed ERBLS model is very robust and efficient. Compared with the well-known local binary fitting (LBF) model, our method is much more computationally efficient and much less sensitive to the initial contour. The proposed method has also been applied to 56 patient data sets and produced very promising results. Conclusions An improved level set method suitable for vertebra CT image segmentation is proposed. It has the flexibility to segment vertebra CT images with blurry or discontinuous edges and internal inhomogeneity, with no need for re-initialization. PMID:23714300
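
The Otsu-based automatic initialization can be sketched as follows. This is a minimal illustration on a synthetic bimodal image, assuming a binary step function (±c) as the initial level set; the helper names are illustrative, not code from the paper:

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    # Exhaustive search for the threshold that maximizes between-class variance.
    hist, edges = np.histogram(img, bins=nbins)
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(hist)                      # class-0 (background) pixel counts
    w1 = w0[-1] - w0                          # class-1 (foreground) pixel counts
    m0 = np.cumsum(hist * centers)
    mu0 = np.divide(m0, w0, out=np.zeros_like(m0), where=w0 > 0)
    mu1 = np.divide(m0[-1] - m0, w1, out=np.zeros_like(m0), where=w1 > 0)
    var_between = w0 * w1 * (mu0 - mu1) ** 2
    return centers[np.argmax(var_between)]

def init_level_set(img, c=2.0):
    # Binary step initialization: +c inside the bright region, -c outside.
    t = otsu_threshold(img)
    return np.where(img > t, c, -c)

# Synthetic bimodal image: dark background (~30), bright square (~200).
rng = np.random.default_rng(0)
img = rng.normal(30, 5, (64, 64))
img[20:40, 20:40] = rng.normal(200, 5, (20, 20))
phi0 = init_level_set(img)
```

Starting the evolution from this step function avoids manual contour placement; models that drop the re-initialization requirement can evolve it directly.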

  16. Bi-directional evolutionary level set method for topology optimization

    NASA Astrophysics Data System (ADS)

    Zhu, Benliang; Zhang, Xianmin; Fatikow, Sergej; Wang, Nianfeng

    2015-03-01

A bi-directional evolutionary level set method for solving topology optimization problems is presented in this article. The proposed method has three main advantages over the standard level set method. First, new holes can be automatically generated in the design domain during the optimization process. Second, the dependency of the obtained optimized configurations upon the initial configurations is eliminated. Optimized configurations can be obtained even when starting from a minimal initial guess. Third, the method can be easily implemented and is computationally more efficient. The validity of the proposed method is tested on the mean compliance minimization problem and the compliant mechanisms topology optimization problem.

  17. The ICF: A Framework for Setting Goals for Children with Speech Impairment

    ERIC Educational Resources Information Center

    McLeod, Sharynne; Bleile, Ken

    2004-01-01

    The International Classification of Functioning, Disability and Health (ICF) (World Health Organization, 2001) is proposed as a framework for integrative goal setting for children with speech impairment. The ICF incorporates both impairment and social factors to consider when selecting appropriate goals to bring about change in the lives of…

  18. Geologic setting of the low-level burial grounds

    SciTech Connect

    Lindsey, K.A.; Jaeger, G.K.; Slate, J.L.; Swett, K.J.; Mercer, R.B.

    1994-10-13

This report describes the regional and site-specific geology of the Hanford Site's low-level burial grounds in the 200 East and West Areas. The report incorporates data from boreholes across the entire 200 Areas, integrating the geology of this area into a single framework. Geologic cross-sections, isopach maps, and structure contour maps of all major geological units from the top of the Columbia River Basalt Group to the surface are included. The physical properties and characteristics of the major suprabasalt sedimentary units are also discussed.

  19. A Quadrature Free Discontinuous Galerkin Conservative Level Set Scheme

    NASA Astrophysics Data System (ADS)

    Czajkowski, Mark; Desjardins, Olivier

    2010-11-01

In an effort to improve the scalability and accuracy of the Accurate Conservative Level Set (ACLS) scheme [Desjardins et al., J COMPUT PHYS 227 (2008)], a scheme based on the quadrature-free discontinuous Galerkin (DG) methodology has been developed. ACLS relies on a hyperbolic tangent level set function that is transported and reinitialized using conservative schemes in order to alleviate the mass conservation issues known to plague level set methods. DG allows for an arbitrarily high-order representation of the interface by using a basis of high-order polynomials while only using data from the faces of neighboring cells. The small stencil gives DG excellent parallel scalability. The diffusion term present in the conservative reinitialization equation is handled using the local DG method [Cockburn et al., SIAM J NUMER ANAL 39, (2001)], while the normals are computed from a limited form of the level set function in order to avoid spurious oscillations. The resulting scheme is shown to be robust, accurate, and highly scalable, making it a method of choice for large-scale simulations of multiphase flows with complex interfacial topology.
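
The hyperbolic tangent profile underlying conservative level set methods such as ACLS can be sketched directly from a signed-distance function (a minimal illustration; the interface location and thickness parameter are assumptions):

```python
import numpy as np

def conservative_profile(phi_sd, eps):
    # Map a signed-distance function to the hyperbolic tangent profile used by
    # conservative level set schemes: psi lies in (0, 1), the interface is the
    # psi = 0.5 isocontour, and eps controls the profile thickness.
    return 0.5 * (np.tanh(phi_sd / (2.0 * eps)) + 1.0)

x = np.linspace(-1.0, 1.0, 401)
psi = conservative_profile(x - 0.25, eps=0.1)   # interface at x = 0.25
```

Because psi is bounded and smooth, its integral (the "mass") can be transported with conservative schemes, which is what alleviates the mass loss of standard signed-distance level sets.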

  20. The distortion of the level set gradient under advection

    NASA Astrophysics Data System (ADS)

    Trujillo, Mario F.; Anumolu, Lakshman; Ryddner, Doug

    2017-04-01

The practice of periodically reinitializing the level set function is well established in two-phase flow applications as a way of controlling the growth of anomalies and/or numerical errors. In the present work, the underlying roots of this anomalous growth are studied, where it is established that the augmentation of the magnitude of the level set gradient (|∇ϕ|) is directly connected to the nature of the flow field; hence, it is not necessarily the result of some type of numerical error. More specifically, for a general flow field advecting the level set function, it is shown that the eigenpairs of the strain rate tensor are responsible for the rate of change of |∇ϕ| along a fluid particle trajectory. This straining action not only affects the magnitude of |∇ϕ|, but the general character of ϕ, and consequently contributes to the growth in numerical error. These numerical consequences are examined by adopting the Gradient Augmented Level Set method. Specifically, it is shown that the local error for ϕ is directly connected to the size of |∇ϕ| and to the magnitude of the second and fourth order derivatives of ϕ. These analytical findings are subsequently supported by various examples. The role of reinitialization is discussed, where it is shown that in cases where the zero level set contour has a local radius of curvature that is below the local grid resolution, reinitialization exacerbates rather than diminishes the degree of error. For other cases, where the interface is well resolved, reinitialization helps stabilize the error as intended.
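
The role of the strain-rate eigenpairs can be checked numerically in a simple setting. The sketch below (an illustration with an assumed pure-strain velocity field, not one of the paper's cases) integrates the gradient equation dg/dt = −(∇u)ᵀ g along a particle path for u = (ax, −ay); a gradient aligned with the contracting y-axis grows like e^(at), with no numerical error involved:

```python
import numpy as np

a = 0.8
grad_u = np.array([[a, 0.0],
                   [0.0, -a]])   # pure strain: eigenvalues +a (x-axis) and -a (y-axis)

def integrate_gradient(g0, t, steps=10000):
    # Forward-Euler integration of dg/dt = -(grad u)^T g along a trajectory,
    # where g stands for the level-set gradient carried by a fluid particle.
    g = np.array(g0, dtype=float)
    dt = t / steps
    for _ in range(steps):
        g = g - dt * (grad_u.T @ g)
    return g

g_grow = integrate_gradient([0.0, 1.0], t=1.0)    # |g| -> e^{a t}: gradient steepens
g_decay = integrate_gradient([1.0, 0.0], t=1.0)   # |g| -> e^{-a t}: gradient flattens
```
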

  1. A variational approach to path planning in three dimensions using level set methods

    NASA Astrophysics Data System (ADS)

    Cecil, Thomas; Marthaler, Daniel E.

    2006-01-01

    In this paper we extend the two-dimensional methods set forth in [T. Cecil, D. Marthaler, A variational approach to search and path planning using level set methods, UCLA CAM Report, 04-61, 2004], proposing a variational approach to a path planning problem in three dimensions using a level set framework. After defining an energy integral over the path, we use gradient flow on the defined energy and evolve the entire path until a locally optimal steady state is reached. We follow the framework for motion of curves in three dimensions set forth in [P. Burchard, L.-T. Cheng, B. Merriman, S. Osher, Motion of curves in three spatial dimensions using a level set approach, J. Comput. Phys. 170(2) (2001) 720-741], modified appropriately to take into account that we allow for paths with positive, varying widths. Applications of this method extend to robotic motion and visibility problems, for example. Numerical methods and algorithms are given, and examples are presented.

  2. Ambient ultraviolet radiation levels in public shade settings.

    PubMed

    Moise, A F; Aynsley, R

    1999-11-01

As people become better informed about the harmful effects of prolonged exposure to solar ultraviolet radiation (UVR, 280-400 nm) they will seek the protection of shade, particularly in tropical locations such as Townsville (19 degrees south). Using broad-band radiation sensors for solar ultraviolet-B (280-315 nm), ultraviolet-A (315-400 nm) and daylight (400-800 nm) radiation, the exposure levels were measured in both the horizontal (shaded and unshaded) and vertical (shaded and unshaded) directions. The measurements were conducted at eight locations (shade settings) in Townsville during the period between December 1997 (summer) and May 1998 (beginning of winter). The quality of protection was assessed by the ratio of unshaded to shaded radiation exposure, the UVB/shade protection ratio (UVB-SPR). The UVB-SPR varies considerably between the different shade settings, with a beach umbrella showing the least protection and dense foliage the highest protection. The roof of a house verandah can provide only limited protection if the verandah catches the afternoon sun. Increasing cloud cover decreases the UVB-SPR for all settings because of the increase in the diffuse fraction of the radiation. Only one setting provided a UVB-SPR of 15 or higher, as suggested for protective shading against solar UVB radiation. Shade from direct sunlight alone does not provide enough protection against high levels of solar UVR. Apart from the transmission qualities of the shading material, it is the construction of the whole shade setting that determines the exposure levels underneath. A shade structure with enough overhang is recommended so that high levels of scattered radiation do not reach the skin.

  3. Level set method coupled with Energy Image features for brain MR image segmentation.

    PubMed

    Punga, Mirela Visan; Gaurav, Rahul; Moraru, Luminita

    2014-06-01

Noise and intensity inhomogeneity are considered among the major challenges in brain magnetic resonance (MR) image segmentation. This paper introduces an energy image feature approach for intensity inhomogeneity correction. Our segmentation approach takes advantage of image features while preserving the advantages of level set methods in a region-based active contour framework. The energy image feature is a new image obtained from the original image by replacing each pixel value with the local energy value computed over a 3×3 mask. The performance and utility of the energy image feature were tested and compared through two different variants of level set methods: the encompassed local and global intensity fitting method, and the selective binary and Gaussian filtering regularized level set method. The reported results demonstrate the flexibility of the energy image feature to adapt to the level set segmentation framework and to perform the challenging task of brain lesion segmentation in a rather robust way.
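
The energy-image construction can be sketched as follows, assuming "local energy" means the mean of squared intensities over the 3×3 mask (the paper's exact definition may differ slightly):

```python
import numpy as np

def energy_image(img):
    # Replace each pixel by the local energy (mean of squared intensities)
    # over its 3x3 neighborhood; edges are handled by reflection padding.
    p = np.pad(img.astype(float), 1, mode='reflect')
    out = np.zeros_like(img, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += p[1 + dy:p.shape[0] - 1 + dy, 1 + dx:p.shape[1] - 1 + dx] ** 2
    return out / 9.0

# A single bright pixel spreads its energy over its 3x3 neighborhood.
img = np.array([[0, 0, 0, 0],
                [0, 9, 0, 0],
                [0, 0, 0, 0],
                [0, 0, 0, 0]], dtype=float)
e = energy_image(img)
```

The level set evolution then runs on `e` instead of the raw intensities, which is how the approach compensates for intensity inhomogeneity.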

  4. Skull defect reconstruction based on a new hybrid level set.

    PubMed

    Zhang, Ziqun; Zhang, Ran; Song, Zhijian

    2014-01-01

Skull defect reconstruction is an important aspect of surgical repair. Historically, a skull defect prosthesis was created by the mirroring technique, surface fitting, or formed templates. These methods are not based on the anatomy of the individual patient's skull, and therefore the prosthesis cannot precisely correct the defect. This study presents a new hybrid level set model that takes into account both global region information, for optimization, and local edge information, for accuracy, while avoiding re-initialization during the evolution of the level set function. Based on the new method, a skull defect was reconstructed, and the skull prosthesis was produced by rapid prototyping technology. The result was a skull defect prosthesis that matched the defect well, with excellent individual adaptation.

  5. The constrained reinitialization equation for level set methods

    NASA Astrophysics Data System (ADS)

    Hartmann, Daniel; Meinke, Matthias; Schröder, Wolfgang

    2010-03-01

    Based on the constrained reinitialization scheme [D. Hartmann, M. Meinke, W. Schröder, Differential equation based constrained reinitialization for level set methods, J. Comput. Phys. 227 (2008) 6821-6845] a new constrained reinitialization equation incorporating a forcing term is introduced. Two formulations for high-order constrained reinitialization (HCR) are presented combining the simplicity and generality of the original reinitialization equation [M. Sussman, P. Smereka, S. Osher, A level set approach for computing solutions to incompressible two-phase flow, J. Comput. Phys. 114 (1994) 146-159] in terms of high-order standard discretization and the accuracy of the constrained reinitialization scheme in terms of interface displacement. The novel HCR schemes represent simple extensions of standard implementations of the original reinitialization equation. The results evidence the significantly increased accuracy and robustness of the novel schemes.
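
The original reinitialization equation of Sussman, Smereka and Osher, ϕ_τ = sign(ϕ₀)(1 − |∇ϕ|), can be sketched in one dimension with Godunov upwinding. This minimal illustration (the standard equation, not the constrained HCR scheme itself) drives ϕ₀ = 2x toward the signed-distance function x while the zero crossing stays put:

```python
import numpy as np

def reinitialize_1d(phi, dx, iters=400, dtau=None):
    # Pseudo-time iteration of phi_tau = sign(phi0) * (1 - |phi_x|) using the
    # Godunov upwind discretization of |phi_x|.
    if dtau is None:
        dtau = 0.5 * dx                       # CFL-stable pseudo-time step
    s = np.sign(phi)                          # sign of the *initial* field, held fixed
    for _ in range(iters):
        dm = np.zeros_like(phi)
        dp = np.zeros_like(phi)
        dm[1:] = (phi[1:] - phi[:-1]) / dx    # backward difference
        dp[:-1] = (phi[1:] - phi[:-1]) / dx   # forward difference
        dm[0], dp[-1] = dm[1], dp[-2]         # one-sided values at the domain ends
        ap, am = np.maximum(dm, 0.0), np.minimum(dm, 0.0)
        bp, bm = np.maximum(dp, 0.0), np.minimum(dp, 0.0)
        grad = np.where(s > 0,
                        np.sqrt(np.maximum(ap**2, bm**2)),
                        np.sqrt(np.maximum(am**2, bp**2)))
        phi = phi - dtau * s * (grad - 1.0)
    return phi

x = np.arange(-100, 101) * 0.01           # grid on [-1, 1], interface at x = 0
phi = reinitialize_1d(2.0 * x, dx=0.01)   # converges toward the signed distance x
```

The interface-displacement error that motivates the constrained schemes shows up when the zero crossing falls between grid points; pinning it exactly on a node, as here, hides that effect.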

  6. A chance-constrained programming level set method for longitudinal segmentation of lung tumors in CT.

    PubMed

    Rouchdy, Youssef; Bloch, Isabelle

    2011-01-01

    This paper presents a novel stochastic level set method for the longitudinal tracking of lung tumors in computed tomography (CT). The proposed model addresses the limitations of registration based and segmentation based methods for longitudinal tumor tracking. It combines the advantages of each approach using a new probabilistic framework, namely Chance-Constrained Programming (CCP). Lung tumors can shrink or grow over time, which can be reflected in large changes of shape, appearance and volume in CT images. Traditional level set methods with a priori knowledge about shape are not suitable since the tumors are undergoing random and large changes in shape. Our CCP level set model allows to introduce a flexible prior to track structures with a highly variable shape by permitting a constraint violation of the prior up to a specified probability level. The chance constraints are computed from two given points by the user or from segmented tumors from a reference image. The reference image can be one of the images studied or an external template. We present a numerical scheme to approximate the solution of the proposed model and apply it to track lung tumors in CT. Finally, we compare our approach with a Bayesian level set. The CCP level set model gives the best results: it is more coherent with the manual segmentation.

  7. High Frequency Acoustic Propagation using Level Set Methods

    DTIC Science & Technology

    2007-01-01

    solution of the high frequency approximation to the wave equation. Traditional solutions to the Eikonal equation in high frequency acoustics are...curvature can be extracted at any point of the front from the level set function (provided the normal and curvature are well-defined at that point ), and... points per wavelength to resolve the wave). Ray tracing is therefore the current standard for high frequency propagation modeling. LSM may provide
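
For context, the Eikonal equation |∇T| = 1/c that arises in the high-frequency approximation can be solved on a grid by the fast sweeping method; a minimal sketch with assumed grid parameters (not the implementation described in this report):

```python
import numpy as np

def fast_sweep_eikonal(speed, src, h, sweeps=4):
    # Solve |grad T| = 1/speed by Gauss-Seidel sweeps in the four alternating
    # grid orderings (fast sweeping), with the Godunov upwind update.
    ny, nx = speed.shape
    T = np.full((ny, nx), np.inf)
    T[src] = 0.0
    orders = [(range(ny), range(nx)),
              (range(ny), range(nx - 1, -1, -1)),
              (range(ny - 1, -1, -1), range(nx)),
              (range(ny - 1, -1, -1), range(nx - 1, -1, -1))]
    for _ in range(sweeps):
        for ys, xs in orders:
            for i in ys:
                for j in xs:
                    if (i, j) == src:
                        continue
                    a = min(T[i - 1, j] if i > 0 else np.inf,
                            T[i + 1, j] if i < ny - 1 else np.inf)
                    b = min(T[i, j - 1] if j > 0 else np.inf,
                            T[i, j + 1] if j < nx - 1 else np.inf)
                    if np.isinf(a) and np.isinf(b):
                        continue              # no upwind information yet
                    f = h / speed[i, j]
                    if abs(a - b) >= f:       # causal update from one direction
                        t = min(a, b) + f
                    else:                     # two-directional quadratic update
                        t = 0.5 * (a + b + np.sqrt(2.0 * f * f - (a - b) ** 2))
                    T[i, j] = min(T[i, j], t)
    return T

T = fast_sweep_eikonal(np.ones((21, 21)), src=(10, 10), h=1.0)  # point source at center
```

With uniform unit speed the scheme reproduces straight-ray travel times exactly along grid lines; off-axis values converge to the viscosity solution with first-order error.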

  8. A Level Set Filter for Speckle Reduction in SAR Images

    NASA Astrophysics Data System (ADS)

    Li, Hongga; Huang, Bo; Huang, Xiaoxia

    2010-12-01

Despite much effort and significant progress in recent years, speckle removal for Synthetic Aperture Radar (SAR) images remains a challenging problem in image processing. Unlike traditional noise filters, which are mainly based on local neighborhood statistical averaging or frequency transforms, the speckle reduction method proposed in this paper is based on the theory of level sets, one form of curvature flow propagation. First, based on partial differential equations, the Lee filter is cast as an anisotropic diffusion formulation, which is then further deduced into a level set formulation. The level set flow allows the front interface to propagate naturally with topological changes, with a speed proportional to the curvature of the intensity contours in the image. Hence, small speckle disappears quickly, while large-scale interfaces evolve slowly. Second, to preserve finer detailed structures while smoothing the speckle, the evolution switches between minimum and maximum curvature speed depending on the scale of the speckle. The proposed method is illustrated by experiments on a simulated image and on ERS-2 SAR images under different circumstances. Its advantages over traditional speckle reduction filter approaches are also demonstrated.

  9. Level Set Approach to Anisotropic Wet Etching of Silicon

    PubMed Central

    Radjenović, Branislav; Radmilović-Radjenović, Marija; Mitrić, Miodrag

    2010-01-01

In this paper a methodology for the three dimensional (3D) modeling and simulation of the profile evolution during anisotropic wet etching of silicon based on the level set method is presented. Etching rate anisotropy in silicon is modeled taking into account full silicon symmetry properties, by means of an interpolation technique using experimentally obtained values for the etching rates along thirteen principal and high-index directions in KOH solutions. The resulting level set equations are solved using an open source implementation of the sparse field method (the ITK library, developed in the medical image processing community), extended for the case of non-convex Hamiltonians. Simulation results for some interesting initial 3D shapes, as well as some more practical examples illustrating anisotropic etching simulation in the presence of masks (simple square aperture mask, convex corner undercutting and convex corner compensation, formation of suspended structures), are also shown. The obtained results show that the level set method can be used as an effective tool for wet etching process modeling, and that it is a viable alternative to the Cellular Automata method which now prevails in simulations of the wet etching process. PMID:22399916

  10. A linear optimal transportation framework for quantifying and visualizing variations in sets of images

    PubMed Central

    Wang, Wei; Slepčev, Dejan; Basu, Saurav; Ozolek, John A.

    2012-01-01

    Transportation-based metrics for comparing images have long been applied to analyze images, especially where one can interpret the pixel intensities (or derived quantities) as a distribution of ‘mass’ that can be transported without strict geometric constraints. Here we describe a new transportation-based framework for analyzing sets of images. More specifically, we describe a new transportation-related distance between pairs of images, which we denote as linear optimal transportation (LOT). The LOT can be used directly on pixel intensities, and is based on a linearized version of the Kantorovich-Wasserstein metric (an optimal transportation distance, as is the earth mover’s distance). The new framework is especially well suited for efficiently computing all pairwise distances for a large database of images, and thus it can be used for pattern recognition in sets of images. In addition, the new LOT framework also allows for an isometric linear embedding, greatly facilitating the ability to visualize discriminant information in different classes of images. We demonstrate the application of the framework to several tasks, such as discriminating nuclear chromatin patterns in cancer cells, decoding differences in facial expressions and galaxy morphologies, and distinguishing subcellular protein distributions. PMID:23729991
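
    For intuition about the underlying metric: in one dimension the Kantorovich-Wasserstein (earth mover's) distance has a closed form, the L1 distance between the two CDFs. The sketch below is this classical 1-D case, not the paper's linearized LOT embedding; the histogram support is an illustrative assumption:

```python
import numpy as np

def wasserstein1_1d(p, q, positions):
    """W1 between two 1-D discrete distributions on a common support:
    the integral of |CDF_p - CDF_q| over the support."""
    p = np.asarray(p, float); q = np.asarray(q, float)
    p = p / p.sum(); q = q / q.sum()          # normalize to unit mass
    cdf_diff = np.cumsum(p - q)
    dx = np.diff(positions)
    return float(np.sum(np.abs(cdf_diff[:-1]) * dx))

positions = np.arange(10.0)
p = np.zeros(10); p[0] = 1.0                  # all mass at position 0
q = np.zeros(10); q[3] = 1.0                  # all mass at position 3
d = wasserstein1_1d(p, q, positions)          # mass must travel 3 units
```

    The LOT framework linearizes this metric around a reference image so that pairwise distances become Euclidean distances between embedded points, which is what makes all-pairs computation on large databases tractable.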

  11. A theoretical and computational setting for a geometrically nonlinear gradient damage modelling framework

    NASA Astrophysics Data System (ADS)

    Nedjar, B.

    The present work deals with the extension to the geometrically nonlinear case of recently proposed ideas on elastic- and elastoplastic-damage modelling frameworks within the infinitesimal theory. The particularity of these models is that the damage part of the modelling involves the gradient of the damage quantity, which, together with the equations of motion, ensues from a new formulation of the principle of virtual power. It is shown how the thermodynamics of irreversible processes is crucial in characterizing the dissipative phenomena and in setting convenient forms for the constitutive relations. On the numerical side, we discuss the problem of numerically integrating these equations, and the implementation within the context of the finite element method is described in detail. Finally, we present a set of representative numerical simulations to illustrate the effectiveness of the proposed framework.

  12. Framework development for the assessment of interprofessional teamwork in mental health settings.

    PubMed

    Tomizawa, Ryoko; Shigeta, Masahiro; Reeves, Scott

    2017-01-01

    In mental health settings, interprofessional practice is regarded as a comprehensive approach to preventing relapse and managing chronic conditions through various teamwork interventions. To reinforce the potential of interprofessional teamwork, it is recommended that theories or conceptual frameworks be employed. There continues, however, to be limited use of such approaches for assessing the quality of interprofessional teamwork in mental health settings. This article presents a new conceptual framework for the assessment of interprofessional teamwork based on the findings of a scoping review of the literature. The review was undertaken to identify conceptual frameworks utilised for interprofessional teamwork in mental health settings. After reviewing 952 articles, the methodological characteristics extracted from 12 articles were considered. The included studies were synthesised into the Donabedian structure-process-outcome model. The findings revealed that structural issues comprised three elements: professional characteristics, client-care characteristics, and contextual characteristics in organisations. Process issues comprised two elements: team mechanisms and community-oriented services. Finally, outcome issues comprised the following elements: clients' outcomes and professionals' outcomes. The review findings suggested possibilities for further developing the assessment of the quality of interprofessional teamwork and provided information about the specific approaches required to improve it. Future research should examine various settings and cultures to clarify the framework's potential for adaptation.

  13. Multiregion level-set partitioning of synthetic aperture radar images.

    PubMed

    Ben Ayed, Ismail; Mitiche, Amar; Belhadj, Ziad

    2005-05-01

    The purpose of this study is to investigate Synthetic Aperture Radar (SAR) image segmentation into a given but arbitrary number of gamma-homogeneous regions via active contours and level sets. The segmentation of SAR images is a difficult problem due to the presence of speckle, which can be modeled as strong, multiplicative noise. The proposed algorithm consists of evolving simple closed planar curves, with an explicit correspondence between the interiors of the curves and the regions of the segmentation, to minimize a criterion containing a term of conformity of the data to a speckle model of noise and a regularization term. Results are shown on both synthetic and real images.
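
    The "conformity of the data to a speckle model" term can be illustrated with a per-region gamma negative log-likelihood. The sketch below only scores a fixed labeling rather than evolving a curve; the number of looks (4) and the synthetic two-region image are assumptions for illustration:

```python
import numpy as np

def gamma_nll(pixels, looks=4.0):
    """Negative log-likelihood (up to data-only constants) of an L-look
    gamma speckle model, with the region mean as the MLE parameter."""
    mu = pixels.mean()
    return looks * np.sum(np.log(mu) + pixels / mu)

def energy(img, mask, looks=4.0):
    """Two-region data-conformity term: sum of per-region gamma NLLs."""
    return gamma_nll(img[mask], looks) + gamma_nll(img[~mask], looks)

rng = np.random.default_rng(1)
img = np.empty((64, 64))
true = np.zeros((64, 64), bool)
true[:, :32] = True                                       # left half, mean 1
img[true] = rng.gamma(4.0, 1.0 / 4.0, true.sum())
img[~true] = rng.gamma(4.0, 5.0 / 4.0, (~true).sum())     # right half, mean 5

wrong = np.zeros((64, 64), bool)
wrong[:32, :] = True                                      # top half: misaligned
e_true, e_wrong = energy(img, true), energy(img, wrong)
```

    A curve evolution driven by this criterion moves the region boundary so as to decrease the energy; here the correct vertical partition scores strictly lower than the misaligned horizontal one.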

  14. A geometric level set model for ultrasounds analysis

    SciTech Connect

    Sarti, A.; Malladi, R.

    1999-10-01

    We propose a partial differential equation (PDE) for the filtering and segmentation of echocardiographic images based on a geometric-driven scheme. The method allows edge-preserving image smoothing and a semi-automatic segmentation of the heart chambers that regularizes the shapes and improves edge fidelity, especially in the presence of distinct gaps in the edge map, as is common in ultrasound imagery. A numerical scheme for solving the proposed PDE is borrowed from level set methods. Results on human in vivo acquired 2D, 2D + time, 3D, and 3D + time echocardiographic images are shown.

  15. A conceptual framework of computations in mid-level vision.

    PubMed

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P

    2014-01-01

    If a picture is worth a thousand words, as the English idiom goes, what should those words (or, rather, descriptors) capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into the underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly similar units, and re-estimated via recurrent loops according to the task demands. Finally, we suggest using datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations.

  17. Microarray missing data imputation based on a set theoretic framework and biological knowledge

    PubMed Central

    Gan, Xiangchao; Liew, Alan Wee-Chung; Yan, Hong

    2006-01-01

    Gene expressions measured using microarrays usually suffer from the missing value problem. However, many data analysis methods require a complete data matrix. Although existing missing value imputation algorithms have shown good performance in dealing with missing values, they also have their limitations. For example, some algorithms perform well only when strong local correlation exists in the data, while others provide the best estimates when the data are dominated by global structure. In addition, these algorithms do not take any biological constraints into account in their imputation. In this paper, we propose a set theoretic framework based on projection onto convex sets (POCS) for missing data imputation. POCS allows us to incorporate different types of a priori knowledge about missing values into the estimation process. The main idea of POCS is to formulate every piece of prior knowledge as a corresponding convex set and then use a convergence-guaranteed iterative procedure to obtain a solution in the intersection of all these sets. In this work, we design several convex sets that take the biological characteristics of the data into consideration: the first set mainly exploits the local correlation structure among genes in microarray data, while the second set captures the global correlation structure among arrays. The third set (actually a series of sets) exploits the biological phenomenon of synchronization loss in microarray experiments. Synchronization loss is a common phenomenon in cyclic systems, and we construct a series of sets based on it for our POCS imputation algorithm. Experiments show that our algorithm can achieve a significant reduction of error compared to the KNNimpute, SVDimpute and LSimpute methods. PMID:16549873
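
    The convergence-guaranteed iterative procedure of POCS is simply alternating projections. The toy sketch below projects onto two hyperplanes in the plane (stand-ins for the paper's correlation- and synchronization-based sets, which are far richer) and converges to a point in their intersection:

```python
import numpy as np

def project_affine(x, a, b):
    """Project x onto the hyperplane {y : a . y = b}, a convex set."""
    return x - (a @ x - b) / (a @ a) * a

# Two convex sets whose intersection is the single point (2, 1).
a1, b1 = np.array([1.0, 0.0]), 2.0   # { x : x0 = 2 }
a2, b2 = np.array([1.0, 1.0]), 3.0   # { x : x0 + x1 = 3 }

x = np.array([5.0, -3.0])            # arbitrary starting estimate
for _ in range(100):                 # POCS iteration: alternate projections
    x = project_affine(project_affine(x, a1, b1), a2, b2)
```

    In the imputation setting, one of the projections resets the observed entries to their measured values, while the others enforce the local, global, and synchronization-loss structure; each projection is onto a convex set, so the same convergence guarantee applies.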

  18. Variational level set segmentation for forest based on MCMC sampling

    NASA Astrophysics Data System (ADS)

    Yang, Tie-Jun; Huang, Lin; Jiang, Chuan-xian; Nong, Jian

    2014-11-01

    Environmental protection is one of the themes of today's world. The forest is a recycler of carbon dioxide and a natural oxygen bar, and protecting forests and monitoring forest growth is a long-term task of environmental protection. Automatically estimating the forest coverage rate from optical remote sensing images by computer is very important: it allows the status of the forest in an area to be assessed in a timely manner and frees analysts from tedious manual statistics. To address the computational complexity of global optimization via convexification, this paper proposes a level set segmentation method based on Markov chain Monte Carlo (MCMC) sampling and applies it to forest segmentation in remote sensing images. The presented method does not require any convexity transformation of the target energy functional; instead, it uses an MCMC sampling method with global optimization capability. The local minima that can occur with the gradient descent method are also avoided. The paper makes three major contributions. Firstly, by using MCMC sampling, convexity of the energy functional is no longer necessary and global optimization can still be achieved. Secondly, by taking advantage of data (texture) and knowledge (a priori color) to guide the construction of the Markov chains, their convergence rate is improved significantly. Finally, a level set segmentation method integrating a priori color and texture for forests is proposed. The experiments show that our method can efficiently and accurately segment forests in remote sensing images.
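
    The global-optimization property of MCMC sampling can be seen on a toy energy with two basins, where gradient descent started on the wrong side stays stuck. This is plain Metropolis sampling on a 1-D energy, not the paper's chain over level set configurations; the energy, temperature, and step size are illustrative assumptions:

```python
import numpy as np

def metropolis_minimize(energy, x0, steps=20000, step=0.5, temp=1.0, seed=0):
    """Metropolis sampling of exp(-E/T). Uphill moves are accepted with
    probability exp(-dE/T), so the chain can escape local minima that
    would trap a gradient descent."""
    rng = np.random.default_rng(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    for _ in range(steps):
        x_new = x + rng.normal(scale=step)
        e_new = energy(x_new)
        if e_new < e or rng.random() < np.exp((e - e_new) / temp):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
    return best_x

# two basins: a local minimum near +1, the global minimum near -1
energy = lambda x: (x**2 - 1.0)**2 + 0.3 * x
x_best = metropolis_minimize(energy, x0=1.0)   # start in the wrong basin
```

    The chain repeatedly crosses the barrier at x = 0 and settles its best sample in the deeper basin; a pure gradient step from x0 = 1 never would.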

  19. Interface Surface Area Tracking for the Conservative Level Set Method

    NASA Astrophysics Data System (ADS)

    Firehammer, Stephanie; Desjardins, Olivier

    2015-11-01

    One key question in liquid-gas flows is how to model the interface between phases in a way that is mass, momentum, and energy conserving. The accurate conservative level set (ACLS) method of Desjardins et al. provides a tool for tracking a liquid-gas interface with minimal mass conservation issues; however, it does not explicitly compute the interface surface area, and thus nothing can be said a priori about the balance between kinetic energy and surface energy. This work examines an equation for the transport of interface surface area density, which can be written in terms of the gradient of the volume fraction. Furthermore, this presentation will outline a numerical method for jointly transporting a conservative level set and the surface area density. Finally, we will explore opportunities for energy conservation via the accurate exchange of energy between the flow field and the interface through surface tension, with test cases to show the results of our extended ACLS method. Funding from the National Science Foundation is gratefully acknowledged.
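
    Although the work above transports surface area density as its own field, the basic quantity can be estimated directly from a level set: by the coarea formula, the interface measure is the integral of delta(phi)|grad phi|. A smoothed-delta sketch in 2-D, where the "area" is a perimeter; the grid, smoothing width, and cosine-shaped delta are standard choices assumed here for illustration:

```python
import numpy as np

def interface_length(phi, dx=1.0, eps=1.5):
    """Estimate interface length as the integral of delta_eps(phi) * |grad phi|,
    using a smoothed (cosine) delta of half-width eps grid cells."""
    gy, gx = np.gradient(phi, dx)
    gmag = np.sqrt(gx**2 + gy**2)
    delta = np.where(np.abs(phi) < eps,
                     (1.0 + np.cos(np.pi * phi / eps)) / (2.0 * eps), 0.0)
    return float(np.sum(delta * gmag) * dx * dx)

n = 128
y, x = np.mgrid[:n, :n].astype(float)
phi = np.sqrt((x - 64.0)**2 + (y - 64.0)**2) - 30.0   # circle of radius 30
length = interface_length(phi)                         # expect ~ 2*pi*30
```

    Transporting the surface density as a separate conserved field, as in the abstract, avoids re-deriving this quantity from a level set whose gradient may degrade during advection.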

  20. Using the level set method to track ice sheet boundaries

    NASA Astrophysics Data System (ADS)

    Lindsey, D. S.; Dupont, T. K.

    2009-12-01

    Simulating ice-sheet volume changes requires tracking the interface of ice and its surrounding media, e.g. water, air, and sediment or rock. This can be challenging when using a fixed, or Eulerian, grid and allowing the interface to move via kinematic boundary conditions. For example, the interface may fall between grid points at a given point in time, making the application of boundary conditions less than straightforward. The level set method of Osher and Sethian (1988) offers an alternative approach, wherein a continuous level set function evolves within the domain via the combined kinematics of ice and its encompassing materials. The method's true strength lies in tracking the interface of two materials through time. Pralong and Funk (2004) applied this method to the movement of a glacier's ice/air interface, offering a glimpse of the method's potential for glaciology. Here we perform a simple preliminary test of the method for a two-dimensional (flow-line) model of an ice shelf, comparing the results to analytic approximations of the movement of both the ice/air interface and the ice front. Future experiments will incorporate grounded ice and include basal and lateral-shear stresses. The ultimate goal of this work is to provide a practical approach for two- and three-dimensional ice-sheet models to naturally track their moving boundaries.

  1. Crossing levels in systems ergonomics: a framework to support 'mesoergonomic' inquiry.

    PubMed

    Karsh, Ben-Tzion; Waterson, Patrick; Holden, Richard J

    2014-01-01

    In this paper we elaborate and articulate the need for what has been termed 'mesoergonomics'. In particular, we argue that the concept has the potential to bridge the gap between, and integrate, established work within the domains of micro- and macroergonomics. Mesoergonomics is defined as an open systems approach to human factors and ergonomics (HFE) theory and research whereby the relationship between variables in at least two different system levels or echelons is studied, and where the dependent variables are human factors and ergonomic constructs. We present a framework which can be used to structure a set of questions for future work and prompt further empirical and conceptual inquiry. The framework consists of four steps: (1) establishing the purpose of the mesoergonomic investigation; (2) selecting human factors and ergonomics variables; (3) selecting a specific type of mesoergonomic investigation; and (4) establishing relationships between system levels. In addition, we describe two case studies which illustrate the workings of the framework and the value of adopting a mesoergonomic perspective within HFE. The paper concludes with a set of issues which could form part of a future agenda for research within systems ergonomics.

  2. A risk-informed decision framework for setting environmental windows for dredging projects.

    PubMed

    Suedel, Burton C; Kim, Jongbum; Clarke, Douglas G; Linkov, Igor

    2008-09-15

    Sediment dredging is necessary to sustain navigation infrastructure in ports and harbor areas. In the United States alone, between 250 and 300 million cubic yards of sediment are dredged annually. Dredging activities may cause stress on aquatic biota by locally increasing turbidity and suspended sediment concentrations, physically disturbing habitat through elevated sedimentation rates, interfering with migratory behaviors, and hydraulically entraining bottom-dwelling organisms. Environmental windows are a management practice used to alleviate such stresses on resident and transient biota by placing temporal restrictions on the conduct of dredging operations. Adherence to environmental windows can significantly inflate costs for project sponsors and local stakeholders. Since their inception following the passage of NEPA in 1969, the process for setting environmental windows has not followed structured procedures and represents an example of the difficulty inherent in achieving a balance between biological resource protection and cost-effective construction and maintenance of navigation infrastructure. Recent developments in the field of risk assessment for non-chemical stressors, as well as experience in implementing structured risk-informed decision-making tools for sediment and natural resource management, are summarized in this paper in relation to setting environmental windows. Combining risk assessment and multi-criteria decision analysis allows development of a framework for an objective process consistent with recommendations by the National Academy of Sciences for setting environmental windows. A hypothetical application of the framework for protection of Pacific herring (Clupea pallasii) in San Francisco Bay is discussed.

  3. Bio-molecule Surfaces Construction via a Higher-Order Level-Set Method.

    PubMed

    Bajaj, Chandrajit L; Xu, Guo-Liang; Zhang, Qin

    2008-11-01

    We present a general framework for a higher-order spline level-set (HLS) method and apply it to bio-molecule surface construction. Starting from a first-order energy functional, we obtain a general level set formulation of a geometric partial differential equation, and provide an efficient approach to solving this partial differential equation using a C(2) spline basis. We also present a fast cubic spline interpolation algorithm based on convolution and the Z-transform, which exploits the local relationship of interpolatory cubic spline coefficients with respect to given function data values. We demonstrate one example of our HLS method: the construction of bio-molecule surfaces (an implicit solvation interface) from their individual atomic coordinates and solvation radii.
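
    A recursive (Z-transform) cubic spline prefilter of the kind the abstract mentions can be sketched as follows, after Unser's classic two-pass algorithm; the boundary handling here uses a truncated causal sum rather than exact mirror conditions, so accuracy is best away from the ends. This is a generic 1-D sketch, not the paper's own implementation:

```python
import numpy as np

def spline_prefilter(s):
    """Cubic B-spline coefficients via the recursive (Z-transform) filter:
    one causal and one anticausal first-order pass, pole z1 = sqrt(3) - 2."""
    s = np.asarray(s, float)
    n = len(s)
    z1 = np.sqrt(3.0) - 2.0
    c = np.empty(n)
    c[0] = sum(s[k] * z1**k for k in range(n))   # truncated causal init
    for k in range(1, n):                        # causal pass
        c[k] = s[k] + z1 * c[k - 1]
    c[-1] = (z1 / (z1**2 - 1.0)) * (c[-1] + z1 * c[-2])
    for k in range(n - 2, -1, -1):               # anticausal pass
        c[k] = z1 * (c[k + 1] - c[k])
    return 6.0 * c

def bspline3(x):
    """The cubic B-spline basis function."""
    x = abs(x)
    if x < 1.0:
        return 2.0 / 3.0 - x**2 + x**3 / 2.0
    if x < 2.0:
        return (2.0 - x)**3 / 6.0
    return 0.0

def evaluate(c, x):
    """Evaluate sum_k c[k] * B3(x - k); reproduces the samples at the knots."""
    k0 = int(np.floor(x))
    return sum(c[k] * bspline3(x - k)
               for k in range(max(k0 - 1, 0), min(k0 + 3, len(c))))

s = np.sin(np.linspace(0.0, 3.0, 20))
c = spline_prefilter(s)
```

    Because the two recursive passes invert the B-spline sampling filter (c[k-1] + 4c[k] + c[k+1])/6, evaluating the spline at an interior knot returns the original sample, which is the "local relationship" the abstract exploits.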

  4. Fast parallel algorithms: from images to level sets and labels

    NASA Astrophysics Data System (ADS)

    Nguyen, H. T.; Jung, Ken K.; Raghavan, Raghu

    1990-07-01

    Decomposition into level sets refers to assigning a code with respect to intensity or elevation, while labeling refers to assigning a code with respect to disconnected regions. We present a sequence of parallel algorithms for these two processes. The labeling process includes re-assigning labels into a natural sequence and comparing different labeling algorithms. We discuss the difference between edge-based and region-based labeling. The speed improvements in this labeling scheme come from the collective efficiency of different techniques. We have implemented these algorithms on an in-house built Geometric Single Instruction Multiple Data (GSIMD) parallel machine with global buses and a Multiple Instruction Multiple Data (MIMD) controller. This allows real-time image interpretation on live data at a rate much higher than video rate. Performance figures will be shown.

  5. Modeling cellular deformations using the level set formalism

    PubMed Central

    Yang, Liu; Effler, Janet C; Kutscher, Brett L; Sullivan, Sarah E; Robinson, Douglas N; Iglesias, Pablo A

    2008-01-01

    Background Many cellular processes involve substantial shape changes. Traditional simulations of these cell shape changes require that grids and boundaries be moved as the cell's shape evolves. Here we demonstrate that accurate cell shape changes can be recreated using level set methods (LSM), in which the cellular shape is defined implicitly, thereby eschewing the need for updating boundaries. Results We obtain a viscoelastic model of Dictyostelium cells using micropipette aspiration and show how this viscoelastic model can be incorporated into LSM simulations to recreate the observed protrusion of cells into the micropipette faithfully. We also demonstrate the use of our techniques by simulating the cell shape changes elicited by the chemotactic response to an external chemoattractant gradient. Conclusion Our results provide a simple but effective means of incorporating cellular deformations into mathematical simulations of cell signaling. Such methods will be useful for simulating important cellular events such as chemotaxis and cytokinesis. PMID:18652669

  6. A Level Set Method for vaporizing two-phase flows

    NASA Astrophysics Data System (ADS)

    Tanguy, Sébastien; Ménard, Thibaut; Berlemont, Alain

    2007-02-01

    Development and applications of numerical methods devoted to reactive interface simulations are presented. Emphasis is put on vaporization, where numerical difficulties arise in imposing accurate jump conditions for heat and mass transfers. We use both the Level Set Method and the Ghost Fluid Method to capture the interface motion accurately and to handle suitable jump conditions. A local vaporization mass flow rate per unit of surface area is defined and Stefan flow is involved in the process. Specific care has been devoted to the extension of discontinuous variables across the interface to populate ghost cells, in order to avoid parasitic currents and numerical diffusion across the interface. A projection method is set up to impose both the velocity field continuity and a divergence-free condition for the extended velocity field across the interface. The d² law is verified in the numerical simulations of the vaporization of an isolated static drop. Results are then presented for a water droplet moving in air. Vapor mass fraction and temperature fields inside and outside the droplet are presented.

  7. Statistics of dark matter halos in the excursion set peak framework

    SciTech Connect

    Lapi, A.; Danese, L. E-mail: danese@sissa.it

    2014-07-01

    We derive approximated, yet very accurate, analytical expressions for the abundance and clustering properties of dark matter halos in the excursion set peak framework; the latter relies on the standard excursion set approach, but also includes the effects of a realistic filtering of the density field, a mass-dependent threshold for collapse, and the prescription from peak theory that halos tend to form around density maxima. We find that our approximations work excellently for diverse power spectra, collapse thresholds and density filters. Moreover, when adopting a cold dark matter power spectrum, a top-hat filter and a mass-dependent collapse threshold (supplemented with conceivable scatter), our approximated halo mass function and halo bias represent very well the outcomes of cosmological N-body simulations.

  8. Comprehensive evaluation of long-term hydrological data sets: Constraints of the Budyko framework

    NASA Astrophysics Data System (ADS)

    Greve, Peter; Orlowsky, Boris; Seneviratne, Sonia I.

    2013-04-01

    An accurate estimate of the climatological land water balance is essential for a wide range of socio-economic issues. Despite the simplicity of the underlying water balance equation, its individual variables are of complex nature. Global estimates, whether derived from observations or from models, of precipitation (P) and especially evapotranspiration (ET) are characterized by high uncertainties. This leads to inconsistent results in determining conditions related to the land water balance and its components. In this study, we consider the Budyko framework as a constraint to evaluate long-term hydrological data sets within the period from 1984 to 2005. The Budyko framework is a well established, empirically based relationship between ET/P and Ep/P, with Ep being the potential evaporation. We use estimates of ET associated with the LandFlux-EVAL initiative (Mueller et al., 2012), either derived from observations, CMIP5 models or land-surface models (LSMs) driven with observation-based forcing or atmospheric reanalyses. Data sets of P comprise all commonly used global observation-based estimates. Ep is determined by methods of differing complexity using recent global temperature and radiation data sets. Based on this comprehensive synthesis of data sets and methods to determine Ep, more than 2000 possible combinations of ET/P in conjunction with Ep/P are created. All combinations are validated against the Budyko curve and against physical limits within the Budyko phase space. For this purpose we develop an error measure based on the root mean square error which combines both constraints. We find that uncertainties are mainly induced by the ET data sets. In particular, reanalysis and CMIP5 data sets are characterized by low realism. The realism of LSMs is further not primarily controlled by the forcing, as different LSMs driven with the same forcing show significantly different error measures. Our comprehensive approach is thus suitable to detect uncertainties
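
    The constraint itself is compact: the Budyko curve relates the evaporative index ET/P to the aridity index phi = Ep/P, and any data combination must also respect the water limit (ET <= P) and the energy limit (ET <= Ep). A sketch using Budyko's original curve form; the paper's combined RMSE-based error measure is not reproduced here, and the sample aridity values are illustrative:

```python
import numpy as np

def budyko_curve(phi):
    """Budyko's curve: ET/P = sqrt(phi * tanh(1/phi) * (1 - exp(-phi))),
    with phi = Ep/P the aridity index."""
    phi = np.asarray(phi, float)
    return np.sqrt(phi * np.tanh(1.0 / phi) * (1.0 - np.exp(-phi)))

def violates_limits(et_over_p, phi):
    """Flag points outside the physical limits ET <= P and ET <= Ep."""
    et_over_p = np.asarray(et_over_p, float)
    return (et_over_p > 1.0) | (et_over_p > phi)

phi = np.array([0.25, 0.5, 1.0, 2.0, 5.0, 50.0])
et_p = budyko_curve(phi)   # points on the curve never violate the limits
```

    The curve approaches the energy limit ET/P = phi in wet climates (phi -> 0) and the water limit ET/P = 1 in dry climates (phi -> infinity), which is why data combinations falling outside these bounds can be rejected outright.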

  9. Implementing accountability for reasonableness framework at district level in Tanzania: a realist evaluation

    PubMed Central

    2011-01-01

    Background Despite the growing importance of the Accountability for Reasonableness (A4R) framework in priority setting worldwide, there is still an inadequate understanding of the processes and mechanisms underlying its influence on legitimacy and fairness, as conceived and reflected in service management processes and outcomes. As a result, the ability to draw scientifically sound lessons for the application of the framework to services and interventions is limited. This paper evaluates the experiences of implementing the A4R approach in Mbarali District, Tanzania, in order to find out how the innovation was shaped, enabled, and constrained by the interaction between contexts, mechanisms and outcomes. Methods This study draws on the principles of realist evaluation -- a largely qualitative approach, chiefly concerned with testing and refining programme theories by exploring the complex interactions of contexts, mechanisms, and outcomes. Mixed methods were used in data collection, including individual interviews, non-participant observation, and document reviews. A thematic framework approach was adopted for the data analysis. Results The study found that while the A4R approach to priority setting was helpful in strengthening transparency, accountability, stakeholder engagement, and fairness, the efforts at integrating it into the current district health system were challenging. Participatory structures under the decentralisation framework, central government's call for partnership in district-level planning and priority setting, perceived needs of stakeholders, as well as active engagement between researchers and decision makers all facilitated the adoption and implementation of the innovation. In contrast, however, limited local autonomy, low level of public awareness, unreliable and untimely funding, inadequate accountability mechanisms, and limited local resources were the major contextual factors that hampered the full implementation. Conclusion This study

  10. Framework for State-Level Renewable Energy Market Potential Studies

    EPA Pesticide Factsheets

    This document provides a framework and next steps for state officials who require estimates of renewable energy market potential. The report gives insight into how to conduct a market potential study.

  11. Parallel level-set methods on adaptive tree-based grids

    NASA Astrophysics Data System (ADS)

    Mirzadeh, Mohammad; Guittet, Arthur; Burstedde, Carsten; Gibou, Frederic

    2016-10-01

    We present scalable algorithms for the level-set method on dynamic, adaptive Quadtree and Octree Cartesian grids. The algorithms are fully parallelized and implemented using the MPI standard and the open-source p4est library. We solve the level set equation with a semi-Lagrangian method which, similar to its serial implementation, is free of any time-step restrictions. This is achieved by introducing a scalable global interpolation scheme on adaptive tree-based grids. Moreover, we present a simple parallel reinitialization scheme using the pseudo-time transient formulation. Both parallel algorithms scale on the Stampede supercomputer, where we are currently using up to 4096 CPU cores, the limit of our current account. Finally, a relevant application of the algorithms is presented in modeling a crystallization phenomenon by solving a Stefan problem, illustrating a level of detail that would be impossible to achieve without a parallel adaptive strategy. We believe that the algorithms presented in this article will be of interest and useful to researchers working with the level-set framework and modeling multi-scale physics in general.
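
    The semi-Lagrangian update mentioned above traces characteristics backward and interpolates, which is what removes the time-step restriction. The sketch below is a serial, uniform-grid version with bilinear interpolation (no adaptive tree, no MPI, and the velocity field and grid are illustrative assumptions):

```python
import numpy as np

def semi_lagrangian_step(phi, u, v, dt, dx=1.0):
    """One semi-Lagrangian step for phi_t + u . grad(phi) = 0: trace each
    grid point back along the velocity and bilinearly interpolate phi there.
    Unconditionally stable, i.e. no CFL restriction on dt."""
    n, m = phi.shape
    j, i = np.meshgrid(np.arange(m), np.arange(n))
    xs = np.clip(j - u * dt / dx, 0, m - 1)      # departure points
    ys = np.clip(i - v * dt / dx, 0, n - 1)
    x0 = np.floor(xs).astype(int); y0 = np.floor(ys).astype(int)
    x1 = np.minimum(x0 + 1, m - 1); y1 = np.minimum(y0 + 1, n - 1)
    fx = xs - x0; fy = ys - y0
    return ((1 - fy) * ((1 - fx) * phi[y0, x0] + fx * phi[y0, x1])
            + fy * ((1 - fx) * phi[y1, x0] + fx * phi[y1, x1]))

# uniform rightward velocity: the zero level set translates by u*dt
n = 64
y, x = np.mgrid[:n, :n].astype(float)
phi = np.sqrt((x - 20.0)**2 + (y - 32.0)**2) - 8.0   # circle at x = 20
phi2 = semi_lagrangian_step(phi, u=5.0, v=0.0, dt=2.0)  # CFL number 10
```

    Note the step uses dt ten times larger than an explicit upwind scheme would allow on this grid; in the parallel setting, the interpolation at departure points is what requires the scalable global interpolation scheme the abstract describes.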

  12. Device for timing and power level setting for microwave applications

    NASA Astrophysics Data System (ADS)

    Ursu, M.-P.; Buidoş, T.

    2016-08-01

    Nowadays, microwaves are widely used for various technological processes. Microwaves are emitted by magnetrons, which have strict requirements concerning the power supplies for the anode and the filament cathode, the intensity of the magnetic field, cooling, and electromagnetic shielding. Magnetrons do not tolerate any alteration of their required voltages, currents, and magnetic fields, which means that their output microwave power is fixed; the only way to alter the power level is time-division, turning the magnetron on and off in repetitive time patterns. In order to attain accurate and reproducible results, as well as correct and safe operation of the microwave device, all these requirements must be fulfilled. Safe, correct, and reproducible operation of the microwave appliance can be achieved by means of a specially built electronic device, which ensures accurate and reproducible exposure times, interlocking of the commands, and automatic switch-off when abnormal operating conditions occur. This driving device, designed and realized during the completion of Mr. Ursu's doctoral thesis, consists of a quartz time base, several programmable frequency and duration dividers, LED displays, sensors, and interlocking gates. The active and passive electronic components are placed on custom-made PCBs, designed and made by means of computer-aided applications and machines. The driving commands of the electronic device are delivered to the magnetron power supplies by means of optical zero-crossing relays. The inputs of the electronic driving device can sense the status of the microwave appliance. The user is able to enter the total exposure time, the division factor that sets the output power level and, as a novelty, the clock frequency of the time divider.
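
    The time-division power setting reduces to a duty-cycle calculation: the magnetron runs at its fixed rated power, and the average delivered power is that power scaled by on-time over period. A minimal sketch (the 800 W rating and the timing values are illustrative assumptions, not figures from the paper):

```python
def average_power(p_magnetron_w, on_time_s, period_s):
    """Time-division power setting: the magnetron output is fixed, so the
    average power is the rated power times the duty cycle on_time/period."""
    if not 0.0 <= on_time_s <= period_s:
        raise ValueError("on-time must lie within the switching period")
    return p_magnetron_w * on_time_s / period_s

p_avg = average_power(800.0, 6.0, 30.0)   # 20% duty cycle of an 800 W tube
```

    The "division factor" the user enters in the described device plays exactly this role: it fixes the on/off pattern, and hence the duty cycle, without ever altering the magnetron's operating point.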

  13. Medical image segmentation using level set and watershed transform

    NASA Astrophysics Data System (ADS)

    Zhu, Fuping; Tian, Jie

    2003-07-01

    One of the most popular level set algorithms is the so-called fast marching method. In this paper, a medical image segmentation algorithm is proposed based on the combination of the fast marching method and the watershed transformation. First, the original image is smoothed using a nonlinear diffusion filter; then the smoothed image is over-segmented by the watershed algorithm. Finally, the image is segmented automatically using the modified fast marching method. Because over-segmentation is introduced, only the arrival time from the seeded point to the boundary of its region needs to be calculated; for the other pixels inside the seeded point's region, the arrival time need not be computed owing to region homogeneity, so the algorithm's speed improves greatly. Moreover, the speed function is redefined based on the statistical similarity of neighboring regions. We also extend the algorithm to 3D and segment medical image series. Experiments show that the algorithm obtains segmentation results of medical images quickly and accurately.
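
The fast marching core referred to above can be sketched as a first-order upwind Eikonal solver driven by a min-heap. This is the textbook method solving |∇T|·F = 1, not the authors' region-restricted modification with a statistical speed function:

```python
import heapq
import numpy as np

def fast_marching(speed, seed):
    """First-arrival times T from a seed pixel, solving |grad T| * F = 1
    with a first-order upwind update and a min-heap of trial points."""
    ny, nx = speed.shape
    T = np.full((ny, nx), np.inf)
    T[seed] = 0.0
    frozen = np.zeros((ny, nx), bool)
    heap = [(0.0, seed)]
    while heap:
        t, (j, i) = heapq.heappop(heap)
        if frozen[j, i]:
            continue
        frozen[j, i] = True            # smallest trial value is final
        for dj, di in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nj, ni = j + dj, i + di
            if 0 <= nj < ny and 0 <= ni < nx and not frozen[nj, ni]:
                tx = min(T[nj, ni - 1] if ni > 0 else np.inf,
                         T[nj, ni + 1] if ni < nx - 1 else np.inf)
                ty = min(T[nj - 1, ni] if nj > 0 else np.inf,
                         T[nj + 1, ni] if nj < ny - 1 else np.inf)
                a, b = sorted((tx, ty))
                h = 1.0 / speed[nj, ni]
                if b - a >= h:                 # one-sided update
                    t_new = a + h
                else:                          # two-sided quadratic update
                    t_new = 0.5 * (a + b + np.sqrt(2 * h * h - (a - b) ** 2))
                if t_new < T[nj, ni]:
                    T[nj, ni] = t_new
                    heapq.heappush(heap, (t_new, (nj, ni)))
    return T
```

The paper's speed-up comes from invoking this machinery only near watershed-region boundaries instead of at every pixel.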

  14. Haustral fold segmentation with curvature-guided level set evolution.

    PubMed

    Zhu, Hongbin; Barish, Matthew; Pickhardt, Perry; Liang, Zhengrong

    2013-02-01

    Human colon has complex structures mostly because of the haustral folds. The folds are thin flat protrusions on the colon wall, which complicate the shape analysis for computer-aided detection (CAD) of colonic polyps. Fold segmentation may help reduce the structural complexity, and the folds can serve as an anatomic reference for computed tomographic colonography (CTC). Therefore, in this study, based on a model of the haustral fold boundaries, we developed a level-set approach to automatically segment the fold surfaces. To evaluate the developed fold segmentation algorithm, we first established the ground truth of haustral fold boundaries by experts' drawing on 15 patient CTC datasets without severe under/over colon distention from two medical centers. The segmentation algorithm successfully detected 92.7% of the folds in the ground truth. In addition to the sensitivity measure, we further developed a merit of segmented-area ratio (SAR), i.e., the ratio between the area of the intersection and union of the expert-drawn folds and the area of the automatically segmented folds, to measure the segmentation accuracy. The segmentation algorithm reached an average value of SAR = 86.2%, showing a good match with the ground truth on the fold surfaces. We believe the automatically segmented fold surfaces have the potential to benefit many postprocedures in CTC, such as CAD, taenia coli extraction, supine-prone registration, etc.

  15. Towards a Dynamic Conceptual Framework for English-Medium Education in Multilingual University Settings

    ERIC Educational Resources Information Center

    Dafouz, Emma; Smit, Ute

    2016-01-01

    At a time of increasing internationalization in tertiary education, English-Medium Education in Multilingual University Settings (EMEMUS) has become a common practice. While there is already ample research describing this phenomenon at a local level (Smit and Dafouz 2012a), the theoretical side needs to be elaborated. This article thus aims to…

  16. An Analysis Framework Addressing the Scale and Legibility of Large Scientific Data Sets

    SciTech Connect

    Childs, Hank R.

    2006-01-01

    Much of the previous work in the large data visualization area has solely focused on handling the scale of the data. This task is clearly a great challenge and necessary, but it is not sufficient. Applying standard visualization techniques to large scale data sets often creates complicated pictures where meaningful trends are lost. A second challenge, then, is to also provide algorithms that simplify what an analyst must understand, using either visual or quantitative means. This challenge can be summarized as improving the legibility or reducing the complexity of massive data sets. Fully meeting both of these challenges is the work of many, many PhD dissertations. In this dissertation, we describe some new techniques to address both the scale and legibility challenges, in hope of contributing to the larger solution. In addition to our assumption of simultaneously addressing both scale and legibility, we add an additional requirement that the solutions considered fit well within an interoperable framework for diverse algorithms, because a large suite of algorithms is often necessary to fully understand complex data sets. For scale, we present a general architecture for handling large data, as well as details of a contract-based system for integrating advanced optimizations into a data flow network design. We also describe techniques for volume rendering and performing comparisons at the extreme scale. For legibility, we present several techniques. Most noteworthy are equivalence class functions, a technique to drive visualizations using statistical methods, and line-scan based techniques for characterizing shape.

  17. Protein complex-based analysis framework for high-throughput data sets.

    PubMed

    Vinayagam, Arunachalam; Hu, Yanhui; Kulkarni, Meghana; Roesel, Charles; Sopko, Richelle; Mohr, Stephanie E; Perrimon, Norbert

    2013-02-26

    Analysis of high-throughput data increasingly relies on pathway annotation and functional information derived from Gene Ontology. This approach has limitations, in particular for the analysis of network dynamics over time or under different experimental conditions, in which modules within a network rather than complete pathways might respond and change. We report an analysis framework based on protein complexes, which are at the core of network reorganization. We generated a protein complex resource for human, Drosophila, and yeast from the literature and databases of protein-protein interaction networks, with each species having thousands of complexes. We developed COMPLEAT (http://www.flyrnai.org/compleat), a tool for data mining and visualization for complex-based analysis of high-throughput data sets, as well as analysis and integration of heterogeneous proteomics and gene expression data sets. With COMPLEAT, we identified dynamically regulated protein complexes among genome-wide RNA interference data sets that used the abundance of phosphorylated extracellular signal-regulated kinase in cells stimulated with either insulin or epidermal growth factor as the output. The analysis predicted that the Brahma complex participated in the insulin response.

  18. GeneSetDB: A comprehensive meta-database, statistical and visualisation framework for gene set analysis

    PubMed Central

    Araki, Hiromitsu; Knapp, Christoph; Tsai, Peter; Print, Cristin

    2012-01-01

    Most “omics” experiments require comprehensive interpretation of the biological meaning of gene lists. To address this requirement, a number of gene set analysis (GSA) tools have been developed. Although the biological value of GSA is strictly limited by the breadth of the gene sets used, very few methods exist for simultaneously analysing multiple publicly available gene set databases. Therefore, we constructed GeneSetDB (http://genesetdb.auckland.ac.nz/haeremai.html), a comprehensive meta-database, which integrates 26 public databases containing diverse biological information with a particular focus on human disease and pharmacology. GeneSetDB enables users to search for gene sets containing a gene identifier or keyword, generate their own gene sets, or statistically test for enrichment of an uploaded gene list across all gene sets, and visualise gene set enrichment and overlap using a clustered heat map. PMID:23650583

  19. Loudness discomfort level for speech: comparison of two instructional sets for saturation sound pressure level selection.

    PubMed

    Beattie, R C; Svihovec, D A; Carmen, R E; Kunkel, H A

    1980-01-01

    This study was undertaken to compare speech loudness discomfort levels (LDLs) under two instructional sets which have been proposed for saturation sound pressure level selection of hearing aids. The phraseology recommended by McCandless and by Berger was presented to normal-hearing and hearing-impaired listeners. The normal-hearing subjects obtained mean LDLs of 94.6 and 111.9 dB SPL for these respective instructions, a statistically significant difference. The hearing-impaired listeners also showed significantly higher LDLs with Berger's instructions (114.7 dB SPL) than with McCandless' instructional set (109.3 dB SPL). Consequently, this investigation suggests that these two instructional sets may lead to substantially different saturation sound pressure levels. Further studies are needed to determine the most appropriate phraseology for LDL measurement, including the assessment of speech intelligibility at various saturation sound pressure levels. Another instructional set was constructed which (1) includes an explanation to patients of the purpose and importance of the test, (2) requests listeners to indicate the upper level they are "willing" to listen to, as opposed to the level they are "able" to listen to, (3) instructs patients to search thoroughly around their LDL before making a final judgment, and (4) contains a statement that the LDL judgment should be made with the understanding that the speech could be listened to for a period of time. Whatever instructions are used, clinicians are advised to interpret their LDLs very cautiously until validational studies are available.

  20. INSTITUTIONALIZING SAFEGUARDS-BY-DESIGN: HIGH-LEVEL FRAMEWORK

    SciTech Connect

    Trond Bjornard PhD; Joseph Alexander; Robert Bean; Brian Castle; Scott DeMuth, Ph.D.; Phillip Durst; Michael Ehinger; Prof. Michael Golay, Ph.D.; Kevin Hase, Ph.D.; David J. Hebditch, DPhil; John Hockert, Ph.D.; Bruce Meppen; James Morgan; Jerry Phillips, Ph.D., PE

    2009-02-01

    The application of a Safeguards-by-Design (SBD) process for new nuclear facilities can reduce proliferation risks. A multi-laboratory team was sponsored in Fiscal Year (FY) 2008 to define a SBD process and determine how it could be incorporated into existing facility design and construction processes. The possibility to significantly influence major design features, such as process selection and plant layout, largely ends with the conceptual design step. Therefore SBD’s principal focus must be on the early inclusion of safeguards requirements and the early identification of beneficial design features. The result could help form the basis for a new international norm for integrating safeguards into facility design. This is an interim report describing progress and project status as of the end of FY08. In this effort, SBD is defined as a structured approach to ensure the timely, efficient, and cost-effective integration of international and national safeguards, physical security, and other nonproliferation objectives into the overall design process for a nuclear facility. A key objective is to ensure that security and nonproliferation issues are considered when weighing facility design alternatives. Central to the work completed in FY08 was a study in which a SBD process was developed in the context of the current DOE facility acquisition process. The DOE study enabled the development of a “SBD design loop” that is suitable for use in any facility design process. It is a graded, iterative process that incorporates safeguards concerns throughout the conceptual, preliminary and final design processes. Additionally, a set of proposed design principles for SBD was developed. A “Generic SBD Process” was then developed. Key features of the process include the initiation of safeguards design activities in the pre-conceptual planning phase, early incorporation of safeguards requirements into the project requirements, early appointment of an SBD team, and

  1. Profile Evolution Simulation in Etching Systems Using Level Set Methods

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Govindan, T. R.; Meyyappan, M.

    1998-01-01

    Semiconductor device profiles are determined by the characteristics of both etching and deposition processes. In particular, a highly anisotropic etch is required to achieve vertical sidewalls. However, etching is comprised of both anisotropic and isotropic components, due to ion and neutral fluxes, respectively. In Ar/Cl2 plasmas, for example, neutral chlorine reacts with the Si surfaces to form silicon chlorides. These compounds are then removed by the impinging ion fluxes. Hence the directionality of the ions (and thus the ion angular distribution function, or IAD), as well as the relative fluxes of neutrals and ions, determines the amount of undercutting. One method of modeling device profile evolution is to simulate the moving solid-gas interface between the semiconductor and the plasma as a string of nodes. The velocity of each node is calculated and then the nodes are advanced accordingly. Although this technique appears to be relatively straightforward, extensive looping schemes are required at the profile corners. An alternate method is to use level set theory, which involves embedding the location of the interface in a field variable. The normal speed is calculated at each mesh point, and the field variable is updated. The profile corners are more accurately modeled as the need for looping algorithms is eliminated. The model we have developed is a 2-D Level Set Profile Evolution Simulation (LSPES). The LSPES calculates etch rates of a substrate in low pressure plasmas due to the incident ion and neutral fluxes. For a Si substrate in an Ar/Cl2 gas mixture, for example, the predictions of the LSPES are identical to those from a string evolution model for high neutral fluxes and two different ion angular distributions.(2) In the figure shown, the relative neutral to ion flux in the bulk plasma is 100 to 1. For a moderately isotropic ion angular distribution function as shown in the cases in the left hand column, both the LSPES (top row) and rude's string
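
The field-variable update LSPES relies on can be sketched as one Godunov upwind step of the level-set equation phi_t + F|grad phi| = 0 on a uniform grid, with the local etch rate F (from ion and neutral fluxes) supplied as an array. The names and the F >= 0 restriction are simplifying assumptions:

```python
import numpy as np

def advance_front(phi, F, dt, dx):
    """One upwind (Godunov) step of phi_t + F * |grad phi| = 0 for a
    front moving in its normal direction with speed F >= 0. One-sided
    differences are combined so that information flows with the front,
    which is what removes the corner-handling logic of string models."""
    dmx = (phi - np.roll(phi, 1, axis=1)) / dx   # backward x-difference
    dpx = (np.roll(phi, -1, axis=1) - phi) / dx  # forward x-difference
    dmy = (phi - np.roll(phi, 1, axis=0)) / dx
    dpy = (np.roll(phi, -1, axis=0) - phi) / dx
    grad = np.sqrt(np.maximum(dmx, 0) ** 2 + np.minimum(dpx, 0) ** 2
                   + np.maximum(dmy, 0) ** 2 + np.minimum(dpy, 0) ** 2)
    return phi - dt * F * grad
```

The np.roll boundaries wrap, so a real simulation would pad or mask the domain edges; interior points behave as in the standard scheme.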

  2. Some free boundary problems in potential flow regime using a level set based method

    SciTech Connect

    Garzon, M.; Bobillo-Ares, N.; Sethian, J.A.

    2008-12-09

    Recent advances in the field of fluid mechanics with moving fronts are linked to the use of Level Set Methods, a versatile mathematical technique to follow free boundaries which undergo topological changes. A challenging class of problems in this context is those related to the solution of a partial differential equation posed on a moving domain, in which the boundary condition for the PDE solver has to be obtained from a partial differential equation defined on the front. This is the case of potential flow models with moving boundaries. Moreover, the fluid front will possibly be carrying some material substance which will diffuse in the front and be advected by the front velocity, as, for example, surfactants used to lower surface tension. We present a Level Set based methodology to embed these partial differential equations defined on the front in a complete Eulerian framework, fully avoiding the tracking of fluid particles and their known limitations. To show the advantages of this approach in the field of fluid mechanics we present in this work one particular application: the numerical approximation of a potential flow model to simulate the evolution and breaking of a solitary wave propagating over a sloping bottom, and compare the level set based algorithm with previous front tracking models.

  3. Image registration via level-set motion: applications to atlas-based segmentation.

    PubMed

    Vemuri, B C; Ye, J; Chen, Y; Leonard, C M

    2003-03-01

    Image registration is an often encountered problem in various fields including medical imaging, computer vision and image processing. Numerous algorithms for registering image data have been reported in these areas. In this paper, we present a novel curve evolution approach expressed in a level-set framework to achieve image intensity morphing and a simple non-linear PDE for the corresponding coordinate registration. The key features of the intensity morphing model are that (a) it is very fast and (b) existence and uniqueness of the solution for the evolution model are established in a Sobolev space as opposed to using viscosity methods. The salient features of the coordinate registration model are its simplicity and computational efficiency. The intensity morph is easily achieved via evolving level-sets of one image into the level-sets of the other. To explicitly estimate the coordinate transformation between the images, we derive a non-linear PDE-based motion model which can be solved very efficiently. We demonstrate the performance of our algorithm on a variety of images including synthetic and real data. As an application of the PDE-based motion model, atlas based segmentation of hippocampal shape from several MR brain scans is depicted. In each of these experiments, automated hippocampal shape recovery results are validated via manual "expert" segmentations.

  4. Research on classified real-time flood forecasting framework based on K-means cluster and rough set.

    PubMed

    Xu, Wei; Peng, Yong

    2015-01-01

    This research presents a new classified real-time flood forecasting framework. In this framework, historical floods are classified by a K-means cluster according to the spatial and temporal distribution of precipitation, the time variance of precipitation intensity and other hydrological factors. Based on the classified results, a rough set is used to extract the identification rules for real-time flood forecasting. Then, the parameters of different categories within the conceptual hydrological model are calibrated using a genetic algorithm. In real-time forecasting, the corresponding category of parameters is selected for flood forecasting according to the obtained flood information. This research tests the new classified framework on Guanyinge Reservoir and compares the framework with the traditional flood forecasting method. It finds that the performance of the new classified framework is significantly better in terms of accuracy. Furthermore, the framework can be considered in a catchment with fewer historical floods.
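
The classify-then-calibrate idea above can be sketched with a minimal k-means and a nearest-centroid parameter lookup. The rough-set rule extraction and the conceptual hydrological model are replaced here by a plain nearest-centroid stand-in, and all names are illustrative:

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Minimal k-means used to group historical floods by features such
    as precipitation distribution and intensity variance."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        new = np.array([X[labels == c].mean(0) if np.any(labels == c)
                        else centers[c] for c in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return centers, labels

def select_parameters(event, centers, params_by_class):
    """At forecast time, pick the calibrated parameter set of the
    nearest flood category (a stand-in for the rough-set rules)."""
    c = int(np.argmin(((centers - event) ** 2).sum(-1)))
    return params_by_class[c]
```

In the paper each category's hydrological-model parameters come from a genetic-algorithm calibration; here `params_by_class` is simply whatever calibrated sets that step produced.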

  5. A universal surface complexation framework for modeling proton binding onto bacterial surfaces in geologic settings

    USGS Publications Warehouse

    Borrok, D.; Turner, B.F.; Fein, J.B.

    2005-01-01

    Adsorption onto bacterial cell walls can significantly affect the speciation and mobility of aqueous metal cations in many geologic settings. However, a unified thermodynamic framework for describing bacterial adsorption reactions does not exist. This problem originates from the numerous approaches that have been chosen for modeling bacterial surface protonation reactions. In this study, we compile all currently available potentiometric titration datasets for individual bacterial species, bacterial consortia, and bacterial cell wall components. Using a consistent, four discrete site, non-electrostatic surface complexation model, we determine total functional group site densities for all suitable datasets, and present an averaged set of 'universal' thermodynamic proton binding and site density parameters for modeling bacterial adsorption reactions in geologic systems. Modeling results demonstrate that the total concentrations of proton-active functional group sites for the 36 bacterial species and consortia tested are remarkably similar, averaging 3.2 ± 1.0 (1σ) × 10⁻⁴ moles/wet gram. Examination of the uncertainties involved in the development of proton-binding modeling parameters suggests that ignoring factors such as bacterial species, ionic strength, temperature, and growth conditions introduces relatively small error compared to the unavoidable uncertainty associated with the determination of cell abundances in realistic geologic systems. Hence, we propose that reasonable estimates of the extent of bacterial cell wall deprotonation can be made using averaged thermodynamic modeling parameters from all of the experiments that are considered in this study, regardless of bacterial species used, ionic strength, temperature, or growth condition of the experiment. The average site densities for the four discrete sites are 1.1 ± 0.7 × 10⁻⁴, 9.1 ± 3.8 × 10⁻⁵, 5.3 ± 2.1 × 10⁻⁵, and 6.6 ± 3.0 × 10⁻⁵ moles/wet gram bacteria for the sites with pKa values of 3
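
The non-electrostatic model described above can be evaluated directly: a discrete site of density C_i and acidity constant pKa_i is deprotonated in the fraction 1/(1 + 10**(pKa_i - pH)). A sketch using site densities of the magnitude quoted in the abstract; the pKa values below are purely hypothetical, since the abstract's pKa list is truncated:

```python
def deprotonated_charge(ph, sites):
    """Negative charge (moles/wet gram) developed on the cell wall at a
    given pH under a non-electrostatic surface complexation model.
    `sites` is a list of (site_density, pKa) pairs; each site is
    deprotonated in the fraction 1 / (1 + 10**(pKa - pH))."""
    return sum(c / (1.0 + 10.0 ** (pka - ph)) for c, pka in sites)

# Hypothetical four-site parameterization (densities in mol/wet g).
EXAMPLE_SITES = [(1.1e-4, 3.0), (9.1e-5, 4.8), (5.3e-5, 6.8), (6.6e-5, 9.1)]
```

At pH equal to a site's pKa, exactly half of that site's density carries charge, which is a quick sanity check on any implementation.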

  6. Parallel Computation of the Topology of Level Sets

    SciTech Connect

    Pascucci, V; Cole-McLaughlin, K

    2004-12-16

    This paper introduces two efficient algorithms that compute the Contour Tree of a 3D scalar field F and its augmented version with the Betti numbers of each isosurface. The Contour Tree is a fundamental data structure in scientific visualization that is used to preprocess the domain mesh to allow optimal computation of isosurfaces with minimal overhead storage. The Contour Tree can also be used to build user interfaces reporting the complete topological characterization of a scalar field, as shown in Figure 1. Data exploration time is reduced since the user understands the evolution of level set components with changing isovalue. The Augmented Contour Tree provides even more accurate information, segmenting the range space of the scalar field into portions of invariant topology. The exploration time for a single isosurface is also improved since its genus is known in advance. Our first new algorithm augments any given Contour Tree with the Betti numbers of all possible corresponding isocontours in linear time with the size of the tree. Moreover we show how to extend the scheme introduced in [3] with the Betti number computation without increasing its complexity. Thus, we improve on the time complexity from our previous approach [10] from O(m log m) to O(n log n + m), where m is the number of cells and n is the number of vertices in the domain of F. Our second contribution is a new divide-and-conquer algorithm that computes the Augmented Contour Tree with improved efficiency. The approach computes the output Contour Tree by merging two intermediate Contour Trees and is independent of the interpolant. In this way we confine any knowledge regarding a specific interpolant to an independent function that computes the tree for a single cell. We have implemented this function for the trilinear interpolant and plan to replace it with higher order interpolants when needed. The time complexity is O(n + t log n), where t is the number of critical points of F. For the first time
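
The evolution of level set components with changing isovalue, which the Contour Tree encodes, can be sketched with a union-find sweep over vertices sorted by function value. This is the component-counting core behind join-tree construction, not the paper's full augmented algorithm:

```python
def sublevel_components(values, edges):
    """Count connected components of the sublevel set {f <= v} as the
    isovalue v sweeps upward over a graph (vertex values + edge list),
    using union-find: a new local minimum creates a component, a merge
    at a saddle destroys one."""
    parent = list(range(len(values)))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    order = sorted(range(len(values)), key=lambda i: values[i])
    active = set()
    history = []                            # (isovalue, component count)
    count = 0
    for v in order:
        active.add(v)
        count += 1                          # v starts as its own component
        for a, b in edges:
            if v in (a, b) and a in active and b in active:
                ra, rb = find(a), find(b)
                if ra != rb:
                    parent[ra] = rb
                    count -= 1              # two components merged
        history.append((values[v], count))
    return history
```

Running the same sweep downward gives the split tree; combining the two trees yields the Contour Tree, which is where the paper's divide-and-conquer merging comes in.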

  7. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. Methods The framework was constructed using an iterative process informed by Soft Systems Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the

  8. Shared investment projects and forecasting errors: setting framework conditions for coordination and sequencing data quality activities.

    PubMed

    Leitner, Stephan; Brauneis, Alexander; Rausch, Alexandra

    2015-01-01

    In this paper, we investigate the impact of inaccurate forecasting on the coordination of distributed investment decisions. In particular, by setting up a computational multi-agent model of a stylized firm, we investigate the case of investment opportunities that are mutually carried out by organizational departments. The forecasts of concern pertain to the initial amount of money necessary to launch and operate an investment opportunity, to the expected intertemporal distribution of cash flows, and to the departments' efficiency in operating the investment opportunity at hand. We propose a budget allocation mechanism for coordinating such distributed decisions. The paper provides guidance on how to set framework conditions, in terms of the number of investment opportunities considered in one round of funding and the number of departments operating one investment opportunity, so that the coordination mechanism is highly robust to forecasting errors. Furthermore, we show that, in some setups, a certain extent of misforecasting is desirable from the firm's point of view as it supports the achievement of the corporate objective of value maximization. We then address the question of how to improve forecasting quality in the best possible way, and provide policy advice on how to sequence activities for improving forecasting quality so that the robustness of the coordination mechanism to errors increases in the best possible way. At the same time, we show that wrong decisions regarding the sequencing can lead to a decrease in robustness. Finally, we conduct a comprehensive sensitivity analysis and prove that, in particular for relatively good forecasters, most of our results are robust to changes in setting the parameters of our multi-agent simulation model.

  9. Simultaneous segmentation and reconstruction: A level set method approach for limited view computed tomography

    PubMed Central

    Yoon, Sungwon; Pineda, Angel R.; Fahrig, Rebecca

    2010-01-01

    Purpose: An iterative tomographic reconstruction algorithm that simultaneously segments and reconstructs the reconstruction domain is proposed and applied to tomographic reconstructions from a sparse number of projection images. Methods: The proposed algorithm uses a two-phase level set method segmentation in conjunction with an iterative tomographic reconstruction to achieve simultaneous segmentation and reconstruction. The simultaneous segmentation and reconstruction is achieved by alternating between level set function evolutions and per-region intensity value updates. To deal with the limited number of projections, a priori information about the reconstruction is enforced via a penalized likelihood function. Specifically, the reconstruction is assumed to be smooth within each region (piecewise smooth) and to have bounded intensity values in each region. Such a priori information is formulated into a quadratic objective function with linear bound constraints. The level set function evolutions are achieved by artificially time evolving the level set function in the negative gradient direction; the intensity value updates are achieved by using the gradient projection conjugate gradient algorithm. Results: The proposed simultaneous segmentation and reconstruction results were compared to “conventional” iterative reconstruction (with no segmentation), iterative reconstruction followed by segmentation, and filtered backprojection. Improvements of 6%–13% in the normalized root mean square error were observed when the proposed algorithm was applied to simulated projections of a numerical phantom and to real fan-beam projections of the Catphan phantom, both of which did not satisfy the a priori assumptions. Conclusions: The proposed simultaneous segmentation and reconstruction resulted in improved reconstruction image quality. The algorithm correctly segments the reconstruction space into regions, preserves sharp edges between different regions, and smoothes the noise
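
The alternation between level set evolution and per-region intensity updates can be sketched in a piecewise-constant toy form, where each pixel's level set value is driven toward whichever of the two region means explains it better. This is a deliberate simplification, not the paper's penalized-likelihood tomographic formulation:

```python
import numpy as np

def two_phase_alternation(img, iters=50, dt=0.5):
    """Alternate (a) per-region intensity updates: recompute the mean of
    the inside (phi > 0) and outside regions, with (b) a level-set-like
    update: push phi positive where the inside mean fits the pixel
    better, negative otherwise."""
    phi = img - img.mean()                      # crude initialization
    for _ in range(iters):
        inside = phi > 0
        c1 = img[inside].mean() if inside.any() else img.max()
        c2 = img[~inside].mean() if (~inside).any() else img.min()
        force = (img - c2) ** 2 - (img - c1) ** 2   # > 0: inside fits better
        phi = phi + dt * force
    return phi > 0, (c1, c2)
```

In the paper the intensity step is a constrained quadratic solve (gradient projection conjugate gradient) against the projection data rather than a simple mean, but the alternating structure is the same.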

  10. Simultaneous segmentation and reconstruction: A level set method approach for limited view computed tomography

    SciTech Connect

    Yoon, Sungwon; Pineda, Angel R.; Fahrig, Rebecca

    2010-05-15

    Purpose: An iterative tomographic reconstruction algorithm that simultaneously segments and reconstructs the reconstruction domain is proposed and applied to tomographic reconstructions from a sparse number of projection images. Methods: The proposed algorithm uses a two-phase level set method segmentation in conjunction with an iterative tomographic reconstruction to achieve simultaneous segmentation and reconstruction. The simultaneous segmentation and reconstruction is achieved by alternating between level set function evolutions and per-region intensity value updates. To deal with the limited number of projections, a priori information about the reconstruction is enforced via a penalized likelihood function. Specifically, the reconstruction is assumed to be smooth within each region (piecewise smooth) and to have bounded intensity values in each region. Such a priori information is formulated into a quadratic objective function with linear bound constraints. The level set function evolutions are achieved by artificially time evolving the level set function in the negative gradient direction; the intensity value updates are achieved by using the gradient projection conjugate gradient algorithm. Results: The proposed simultaneous segmentation and reconstruction results were compared to "conventional" iterative reconstruction (with no segmentation), iterative reconstruction followed by segmentation, and filtered backprojection. Improvements of 6%-13% in the normalized root mean square error were observed when the proposed algorithm was applied to simulated projections of a numerical phantom and to real fan-beam projections of the Catphan phantom, both of which did not satisfy the a priori assumptions. Conclusions: The proposed simultaneous segmentation and reconstruction resulted in improved reconstruction image quality. The algorithm correctly segments the reconstruction space into regions, preserves sharp edges between different regions, and smoothes the noise

  11. Combating Terrorism: A Conceptual Framework for Targeting at the Operational Level

    DTIC Science & Technology

    2007-11-02

    COMBATING TERRORISM: A CONCEPTUAL FRAMEWORK FOR TARGETING AT THE OPERATIONAL LEVEL. A thesis presented to the Faculty of the US Army Command and... Name of Candidate: Lt Col Angus S. J. Fay. Thesis Title: Combating Terrorism: A Conceptual Framework for... The record's remaining fragments state that a conceptual framework for targeting terrorism at the operational level is worthy of investigation, and pose the thesis question: Is there utility within the JIPB

  12. Topology Optimization using the Level Set and eXtended Finite Element Methods: Theory and Applications

    NASA Astrophysics Data System (ADS)

    Villanueva Perez, Carlos Hernan

    Computational design optimization provides designers with automated techniques to develop novel and non-intuitive optimal designs. Topology optimization is a design optimization technique that allows for the evolution of a broad variety of geometries in the optimization process. Traditional density-based topology optimization methods often lack a sufficient resolution of the geometry and physical response, which prevents direct use of the optimized design in manufacturing and accurate modeling of the physical response and boundary conditions. The goal of this thesis is to introduce a unified topology optimization framework that uses the Level Set Method (LSM) to describe the design geometry and the eXtended Finite Element Method (XFEM) to solve the governing equations and measure the performance of the design. The methodology is presented as an alternative to density-based optimization approaches, and is able to accommodate a broad range of engineering design problems. The framework presents state-of-the-art immersed boundary techniques to stabilize the systems of equations and enforce the boundary conditions, and is studied with applications in 2D and 3D linear elastic structures, incompressible flow, and energy and species transport problems to test the robustness and the characteristics of the method. The framework is compared against density-based topology optimization approaches with regard to convergence, performance, and the manufacturability of the resulting designs. Furthermore, the ability to control the shape of the design to operate within manufacturing constraints is developed and studied. The analysis capability of the framework is validated quantitatively through comparison against previous benchmark studies, and qualitatively through its application to topology optimization problems. The design optimization problems converge to intuitive designs and agree well with the results of previous 2D and density-based studies.

  13. Measuring Afterschool Program Quality Using Setting-Level Observational Approaches

    ERIC Educational Resources Information Center

    Oh, Yoonkyung; Osgood, D. Wayne; Smith, Emilie P.

    2015-01-01

    The importance of afterschool hours for youth development is widely acknowledged, and afterschool settings have recently received increasing attention as an important venue for youth interventions, bringing a growing need for reliable and valid measures of afterschool quality. This study examined the extent to which the two observational tools,…

  14. An Evaluation of the High Level Architecture (HLA) as a Framework for NASA Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reid, Michael R.; Powers, Edward I. (Technical Monitor)

    2000-01-01

    The High Level Architecture (HLA) is a current US Department of Defense and an industry (IEEE-1516) standard architecture for modeling and simulations. It provides a framework and set of functional rules and common interfaces for integrating separate and disparate simulators into a larger simulation. The goal of the HLA is to reduce software costs by facilitating the reuse of simulation components and by providing a runtime infrastructure to manage the simulations. In order to evaluate the applicability of the HLA as a technology for NASA space mission simulations, a Simulations Group at Goddard Space Flight Center (GSFC) conducted a study of the HLA and developed a simple prototype HLA-compliant space mission simulator. This paper summarizes the prototyping effort and discusses the potential usefulness of the HLA in the design and planning of future NASA space missions with a focus on risk mitigation and cost reduction.

  15. Marker ReDistancing/Level Set Method for High-Fidelity Implicit Interface Tracking

    SciTech Connect

    Robert Nourgaliev; Samet Kadioglu; Vincent Mousseau; Dana Knoll

    2010-02-01

    A hybrid of the Front-Tracking (FT) and the Level-Set (LS) methods is introduced, combining the advantages and removing the drawbacks of both methods. The kinematics of the interface is treated in a Lagrangian (FT) manner, by tracking markers placed at the interface. The markers are not connected – instead, the interface topology is resolved in an Eulerian (LS) framework, by wrapping a signed distance function around the Lagrangian markers each time the markers move. For accuracy and efficiency, we have developed a high-order “anchoring” algorithm and an implicit PDE-based re-distancing. We have demonstrated that the method is 3rd-order accurate in space near the markers, and therefore 1st-order convergent in curvature; in contrast, traditional PDE-based re-initialization algorithms tend to slightly relocate the zero Level Set and can be shown to be non-convergent in curvature. The implicit pseudo-time discretization of the re-distancing equation is implemented within the Jacobian-Free Newton Krylov (JFNK) framework combined with ILU(k) preconditioning. We have demonstrated that the steady-state solutions in pseudo-time can be achieved very efficiently, with iterations (CFL ), in contrast to the explicit re-distancing, which requires hundreds of iterations with CFL . The most cost-effective algorithm is found to be a hybrid of explicit and implicit discretizations, in which we first apply 10-15 iterations with the explicit discretization (to bring the initial guess into the ball of convergence for Newton’s method) and then finish with 2-3 implicit steps, bringing the re-distancing equation to a complete steady state. The eigenscopy of the JFNK-ILU(k) demonstrates the efficiency of the ILU(k) preconditioner, which effectively clusters the eigenvalues of the otherwise extremely ill-conditioned Jacobian matrices, thereby enabling the Krylov (GMRES) method to converge with iterations, with only a few levels of ILU fill-in. Importantly, due to the Level Set localization
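
    The "wrapping" step can be illustrated with a deliberately naive sketch: rebuild a signed-distance field on a grid by brute-force nearest-marker distance (the sign here comes from a supplied inside test; the paper's method instead uses high-order anchoring and an implicit PDE, so everything below is an illustrative assumption):

```python
import numpy as np

def redistance_from_markers(markers, grid_x, grid_y, inside_fn):
    """Toy re-distancing: rebuild a signed-distance level set from
    Lagrangian interface markers by brute-force nearest-marker distance.
    `inside_fn` supplies the sign (a real code would infer it locally)."""
    X, Y = np.meshgrid(grid_x, grid_y, indexing="ij")
    pts = np.stack([X.ravel(), Y.ravel()], axis=1)
    # Unsigned distance from every grid point to the closest marker.
    d = np.min(np.linalg.norm(pts[:, None, :] - markers[None, :, :], axis=2), axis=1)
    sign = np.where(inside_fn(pts[:, 0], pts[:, 1]), -1.0, 1.0)
    return (sign * d).reshape(X.shape)

# Markers sampled on a unit circle; the rebuilt field should be ~ r - 1.
theta = np.linspace(0, 2 * np.pi, 400, endpoint=False)
markers = np.stack([np.cos(theta), np.sin(theta)], axis=1)
x = np.linspace(-2, 2, 41)
phi = redistance_from_markers(markers, x, x, lambda px, py: px**2 + py**2 < 1)
```

    The brute-force search is O(grid × markers); the localization mentioned in the abstract is what makes the real method affordable.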

  16. Joint Infrared Target Recognition and Segmentation Using a Shape Manifold-Aware Level Set

    PubMed Central

    Yu, Liangjiang; Fan, Guoliang; Gong, Jiulu; Havlicek, Joseph P.

    2015-01-01

    We propose new techniques for joint recognition, segmentation and pose estimation of infrared (IR) targets. The problem is formulated in a probabilistic level set framework where a shape constrained generative model is used to provide a multi-class and multi-view shape prior and where the shape model involves a couplet of view and identity manifolds (CVIM). A level set energy function is then iteratively optimized under the shape constraints provided by the CVIM. Since both the view and identity variables are expressed explicitly in the objective function, this approach naturally accomplishes recognition, segmentation and pose estimation as joint products of the optimization process. For realistic target chips, we solve the resulting multi-modal optimization problem by adopting a particle swarm optimization (PSO) algorithm and then improve the computational efficiency by implementing a gradient-boosted PSO (GB-PSO). Evaluation was performed using the Military Sensing Information Analysis Center (SENSIAC) ATR database, and experimental results show that both of the PSO algorithms reduce the cost of shape matching during CVIM-based shape inference. Particularly, GB-PSO outperforms other recent ATR algorithms, which require intensive shape matching, either explicitly (with pre-segmentation) or implicitly (without pre-segmentation). PMID:25938202

  17. Joint infrared target recognition and segmentation using a shape manifold-aware level set.

    PubMed

    Yu, Liangjiang; Fan, Guoliang; Gong, Jiulu; Havlicek, Joseph P

    2015-04-29

    We propose new techniques for joint recognition, segmentation and pose estimation of infrared (IR) targets. The problem is formulated in a probabilistic level set framework where a shape constrained generative model is used to provide a multi-class and multi-view shape prior and where the shape model involves a couplet of view and identity manifolds (CVIM). A level set energy function is then iteratively optimized under the shape constraints provided by the CVIM. Since both the view and identity variables are expressed explicitly in the objective function, this approach naturally accomplishes recognition, segmentation and pose estimation as joint products of the optimization process. For realistic target chips, we solve the resulting multi-modal optimization problem by adopting a particle swarm optimization (PSO) algorithm and then improve the computational efficiency by implementing a gradient-boosted PSO (GB-PSO). Evaluation was performed using the Military Sensing Information Analysis Center (SENSIAC) ATR database, and experimental results show that both of the PSO algorithms reduce the cost of shape matching during CVIM-based shape inference. Particularly, GB-PSO outperforms other recent ATR algorithms, which require intensive shape matching, either explicitly (with pre-segmentation) or implicitly (without pre-segmentation).
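
    For readers unfamiliar with the optimizer used in entries 16 and 17, here is a minimal sketch of plain global-best PSO on a multi-modal cost surface. This is generic PSO, not the paper's gradient-boosted GB-PSO, and all parameter values are conventional defaults, not values from the paper:

```python
import numpy as np

def pso(cost, bounds, n_particles=30, n_iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best particle swarm optimizer for multi-modal costs."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, lo.shape[0]))
    v = np.zeros_like(x)
    pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
    g = pbest[pbest_cost.argmin()].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random((2,) + x.shape)
        # Velocity blends inertia, personal-best pull, and global-best pull.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(p) for p in x])
        improved = c < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], c[improved]
        g = pbest[pbest_cost.argmin()].copy()
    return g, pbest_cost.min()

# 2D Rastrigin: many local minima, global minimum 0 at the origin.
def rastrigin(p):
    return 10 * p.size + np.sum(p**2 - 10 * np.cos(2 * np.pi * p))

best, best_cost = pso(rastrigin, (np.full(2, -5.12), np.full(2, 5.12)))
```

    In the paper the cost would be the shape-matching energy over the CVIM manifold coordinates rather than a benchmark function.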

  18. Structural engineering masters level education framework of knowledge for the needs of initial professional practice

    NASA Astrophysics Data System (ADS)

    Balogh, Zsuzsa Enriko

    For at least the last decade, engineering, civil engineering, and structural engineering as a profession within civil engineering have faced and continue to face a growing need for "Raising the Bar" of preparedness among young engineers seeking to become practicing professional engineers. The present consensus of the civil engineering profession is that the increasing need for broad and in-depth knowledge should require young structural engineers to have at least a Masters-Level education. This study focuses on Masters-Level preparedness in the structural engineering area within the civil engineering field. It follows much of the methodology used in the American Society of Civil Engineers (ASCE) Body of Knowledge determination for civil engineering and extends this type of study to better define the portion of the young engineer's preparation beyond the undergraduate program for one specialty area of civil engineering. The objective of this research was to create a Framework of Knowledge for the young engineer which identifies and recognizes the needs of the profession, along with the profession's expectations of how those needs can be achieved in the graduate-level academic setting, in the practice environment, and through lifelong learning opportunities, with an emphasis on the initial five years of experience past completion of a Masters program in structural engineering. This study applied a modified Delphi method to obtain the critical information from members of the structural engineering profession. The results provide a Framework of Knowledge which will be useful to several groups seeking to better ensure the preparedness of future young structural engineers at the Masters-Level.

  19. Comparing volume of fluid and level set methods for evaporating liquid-gas flows

    NASA Astrophysics Data System (ADS)

    Palmore, John; Desjardins, Olivier

    2016-11-01

    This presentation demonstrates three numerical strategies for simulating liquid-gas flows undergoing evaporation. The practical aim of this work is to choose a framework capable of simulating the combustion of liquid fuels in an internal combustion engine. Each framework is analyzed with respect to its accuracy and computational cost. All simulations are performed using a conservative, finite volume code for simulating reacting, multiphase flows under the low-Mach assumption. The strategies used in this study correspond to different methods for tracking the liquid-gas interface and handling the transport of the discontinuous momentum and vapor mass fractions fields. The first two strategies are based on conservative, geometric volume of fluid schemes using directionally split and un-split advection, respectively. The third strategy is the accurate conservative level set method. For all strategies, special attention is given to ensuring the consistency between the fluxes of mass, momentum, and vapor fractions. The study performs three-dimensional simulations of an isolated droplet of a single component fuel evaporating into air. Evaporation rates and vapor mass fractions are compared to analytical results.
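
    The analytical reference alluded to at the end of entry 19 is typically the classical d²-law for a quasi-steady evaporating droplet: the squared diameter shrinks linearly in time, d²(t) = d₀² − K·t. A small sketch of that reference solution (the droplet size and evaporation constant K below are assumed for illustration, not values from the study):

```python
import numpy as np

def d2_law_diameter(d0, K, t):
    """d^2-law for an isolated evaporating droplet: d^2(t) = d0^2 - K t.
    Returns the diameter, clipped at zero once the droplet is consumed."""
    d2 = d0**2 - K * np.asarray(t, dtype=float)
    return np.sqrt(np.clip(d2, 0.0, None))

d0, K = 50e-6, 1.0e-7   # 50 micron droplet; assumed evaporation constant (m^2/s)
t_life = d0**2 / K      # droplet lifetime predicted by the d^2-law
```

    Numerical evaporation rates from the VOF or level set solvers can then be checked against the slope K of the simulated d² history.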

  20. 21 CFR 530.23 - Procedure for setting and announcing safe levels.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 6 2012-04-01 2012-04-01 false Procedure for setting and announcing safe levels... for setting and announcing safe levels. (a) FDA may issue an order establishing a safe level for a... in the Federal Register a notice of the order. The notice will include: (1) A statement setting...

  1. 21 CFR 530.23 - Procedure for setting and announcing safe levels.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 6 2014-04-01 2014-04-01 false Procedure for setting and announcing safe levels... for setting and announcing safe levels. (a) FDA may issue an order establishing a safe level for a... in the Federal Register a notice of the order. The notice will include: (1) A statement setting...

  2. A novel framework for assessing metadata quality in epidemiological and public health research settings.

    PubMed

    McMahon, Christiana; Denaxas, Spiros

    2016-01-01

    Metadata are critical in epidemiological and public health research. However, a lack of biomedical metadata quality frameworks and limited awareness of the implications of poor quality metadata render data analyses problematic. In this study, we created and evaluated a novel framework to assess metadata quality of epidemiological and public health research datasets. We performed a literature review and surveyed stakeholders to enhance our understanding of biomedical metadata quality assessment. The review identified 11 studies and nine quality dimensions, none of which were specifically aimed at biomedical metadata. 96 individuals completed the survey; of those who submitted data, most assessed metadata quality only sometimes, and eight did not at all. Our framework has four sections: a) general information; b) tools and technologies; c) usability; and d) management and curation. We evaluated the framework using three test cases and sought expert feedback. The framework can assess biomedical metadata quality systematically and robustly.

  3. A novel framework for assessing metadata quality in epidemiological and public health research settings

    PubMed Central

    McMahon, Christiana; Denaxas, Spiros

    2016-01-01

    Metadata are critical in epidemiological and public health research. However, a lack of biomedical metadata quality frameworks and limited awareness of the implications of poor quality metadata render data analyses problematic. In this study, we created and evaluated a novel framework to assess metadata quality of epidemiological and public health research datasets. We performed a literature review and surveyed stakeholders to enhance our understanding of biomedical metadata quality assessment. The review identified 11 studies and nine quality dimensions, none of which were specifically aimed at biomedical metadata. 96 individuals completed the survey; of those who submitted data, most assessed metadata quality only sometimes, and eight did not at all. Our framework has four sections: a) general information; b) tools and technologies; c) usability; and d) management and curation. We evaluated the framework using three test cases and sought expert feedback. The framework can assess biomedical metadata quality systematically and robustly. PMID:27570670

  4. Telemedicine: what framework, what levels of proof, implementation rules.

    PubMed

    Zannad, Faiez; Maugendre, Philippe; Audry, Antoine; Avril, Carole; Blaise, Lucile; Blin, Olivier; Burnel, Philippe; Falise-Mirat, Béatrice; Girault, Danièle; Giri, Isabelle; Goehrs, Jean-Marie; Lassale, Catherine; Le Meur, Roland; Leurent, Pierre; Ratignier-Carbonneil, Christelle; Rossignol, Patrick; Satonnet, Evelyne; Simon, Pierre; Treluyer, Laurent

    2014-01-01

    The concept of telemedicine was formalised in France in the 2009 "Hospital, patients, health territories" (loi hôpital, patients, santé, territoire) law and the 2010 decree through which it was applied. Many experiments have been carried out and the regulatory institutions (Ministry, Regional Health Agency [Agence régionale de santé, ARS], French National Health Authority [Haute autorité de santé, HAS], etc.) have issued various guidance statements and recommendations on its organisation and on the expectations of its evaluation. With this background, the round table wanted to produce recommendations on different areas of medical telemonitoring (the role of telemonitoring, the regulatory system, the principles for assessment, methods of use and conditions for sustained and seamless deployment). Whilst many studies carried out on new medical telemonitoring approaches have led to the postulate that it offers benefit, both clinically and in terms of patient quality of life, more information is needed to demonstrate its impact on the organisation of healthcare and the associated medico-economic benefit (criteria, methods, resources). Similarly, contractual frameworks for deployment of telemonitoring do exist, although they are complicated and involve many different stakeholders (Director General of the Care Offering [Direction générale de l'offre de soins, DGOS], ARS, HAS, Agency for Shared Health Information Systems [Agence des systèmes d'information partagés de santé, ASIP], French National Data Protection Commission [Commission nationale informatique et libertés, CNIL], French National Medical Council [Conseil national de l'Ordre des médecins, CNOM], etc.) that would benefit from a shared approach and seamless exchange between the partners involved. The current challenge is also to define the conditions required to validate a stable economic model in order to promote organisational change. One topical issue is placing the emphasis on its evaluation and

  5. The Uniframe System-Level Generative Programming Framework

    DTIC Science & Technology

    2003-08-01

    on this component. Auxiliary Aspect In addition to computation and cooperation, mobility, security, and fault tolerance are necessary features...postprocessing collaborators separated by comma or NONE> 8. Auxiliary Attributes: 8.1 Mobility: <Yes/No> 8.2 Security: <security level> 8.3 Fault...depend on this component. Auxiliary Attributes: o Mobility: This entry indicates whether the component is mobile or not. o Security: This entry

  6. A Conceptual Framework for Educational Design at Modular Level to Promote Transfer of Learning

    ERIC Educational Resources Information Center

    Botma, Yvonne; Van Rensburg, G. H.; Coetzee, I. M.; Heyns, T.

    2015-01-01

    Students bridge the theory-practice gap when they apply in practice what they have learned in class. A conceptual framework was developed that can serve as foundation to design for learning transfer at modular level. The framework is based on an adopted and adapted systemic model of transfer of learning, existing learning theories, constructive…

  7. Concurrent Validity of the Independent Reading Level Assessment Framework and a State Assessment

    ERIC Educational Resources Information Center

    Ralston, Nicole C.; Waggoner, Jacqueline M.; Tarasawa, Beth; Jackson, Amy

    2016-01-01

    This study investigates the use of screening assessments within the increasingly popular Response to Intervention (RTI) framework, specifically seeking to collect concurrent validity evidence on one potential new screening tool, the Independent Reading Level Assessment (IRLA) framework. Furthermore, this study builds on existing literature by…

  8. High-level waste tank farm set point document

    SciTech Connect

    Anthony, J.A. III

    1995-01-15

    Setpoints for nuclear safety-related instrumentation are required for actions determined by the design authorization basis. Minimum requirements need to be established for assuring that setpoints are established and held within specified limits. This document establishes the controlling methodology for changing setpoints of all classifications. The instrumentation under consideration involves the transfer, storage, and volume reduction of radioactive liquid waste in the F- and H-Area High-Level Radioactive Waste Tank Farms. The setpoint document will encompass the PROCESS AREA listed in the Safety Analysis Report (SAR) (DPSTSA-200-10 Sup 18) which includes the diversion box HDB-8 facility. In addition to the PROCESS AREAS listed in the SAR, Building 299-H and the Effluent Transfer Facility (ETF) are also included in the scope.

  9. Investigating the Experience of Outdoor and Adventurous Project Work in an Educational Setting Using a Self-Determination Framework

    ERIC Educational Resources Information Center

    Sproule, John; Martindale, Russell; Wang, John; Allison, Peter; Nash, Christine; Gray, Shirley

    2013-01-01

    The purpose of this study was to carry out a preliminary investigation to explore the use of outdoor and adventurous project work (PW) within an educational setting. Specifically, differences between the PW and normal academic school experiences were examined using a self-determination theory framework integrated with a goal orientation and…

  10. Toppled television sets and head injuries in the pediatric population: a framework for prevention.

    PubMed

    Cusimano, Michael D; Parker, Nadine

    2016-01-01

    Injuries to children caused by falling televisions have become more frequent during the last decade. These injuries can be severe and even fatal and are likely to become even more common in the future as TVs increase in size and become more affordable. To formulate guidelines for the prevention of these injuries, the authors systematically reviewed the literature on injuries related to toppling televisions. The authors searched MEDLINE, PubMed, Embase, Scopus, CINAHL (Cumulative Index to Nursing and Allied Health Literature), Cochrane Library, and Google Scholar according to the Cochrane guidelines for all studies involving children 0-18 years of age who were injured by toppled TVs. Factors contributing to injury were categorized using Haddon's Matrix, and the public health approach was used as a framework for developing strategies to prevent these injuries. The vast majority (84%) of the injuries occurred in homes and more than three-fourths were unwitnessed by adult caregivers. The TVs were most commonly large and elevated off the ground. Dressers and other furniture not designed to support TVs were commonly involved in the TV-toppling incident. The case fatality rate varies widely, but almost all deaths reported (96%) were due to brain injuries. Toddlers between the ages of 1 and 3 years most frequently suffer injuries to the head and neck, and they are most likely to suffer severe injuries. Many of these injuries require brain imaging and neurosurgical intervention. Prevention of these injuries will require changes in TV design and legislation as well as increases in public education and awareness. Television-toppling injuries can be easily prevented; however, the rates of injury do not reflect a sufficient level of awareness, nor do they reflect an acceptable effort from an injury prevention perspective.

  11. Evaluation of the causal framework used for setting national ambient air quality standards.

    PubMed

    Goodman, Julie E; Prueitt, Robyn L; Sax, Sonja N; Bailey, Lisa A; Rhomberg, Lorenz R

    2013-11-01

    A scientifically sound assessment of the potential hazards associated with a substance requires a systematic, objective and transparent evaluation of the weight of evidence (WoE) for causality of health effects. We critically evaluated the current WoE framework for causal determination used in the United States Environmental Protection Agency's (EPA's) assessments of the scientific data on air pollutants for the National Ambient Air Quality Standards (NAAQS) review process, including its methods for literature searches; study selection, evaluation and integration; and causal judgments. The causal framework used in recent NAAQS evaluations has many valuable features, but it could be more explicit in some cases, and some features are missing that should be included in every WoE evaluation. Because of this, it has not always been applied consistently in evaluations of causality, leading to conclusions that are not always supported by the overall WoE, as we demonstrate using EPA's ozone Integrated Science Assessment as a case study. We propose additions to the NAAQS causal framework based on best practices gleaned from a previously conducted survey of available WoE frameworks. A revision of the NAAQS causal framework so that it more closely aligns with these best practices and the full and consistent application of the framework will improve future assessments of the potential health effects of criteria air pollutants by making the assessments more thorough, transparent, and scientifically sound.

  12. Architecture-Driven Level Set Optimization: From Clustering to Subpixel Image Segmentation.

    PubMed

    Balla-Arabe, Souleymane; Gao, Xinbo; Ginhac, Dominique; Brost, Vincent; Yang, Fan

    2016-12-01

    Thanks to their effectiveness, active contour models (ACMs) are of great interest to computer vision scientists. Level set methods (LSMs) refer to the class of geometric active contours. Compared with other ACMs, in addition to subpixel accuracy, LSMs have the intrinsic ability to handle topological changes automatically. Nevertheless, LSMs are computationally expensive. One solution to their time-consumption problem is hardware acceleration using massively parallel devices such as graphics processing units (GPUs). But the question is: what accuracy can we reach while still maintaining an algorithm well suited to massively parallel architectures? In this paper, we attempt to push the compromise between speed and accuracy, efficiency and effectiveness, to a higher level than state-of-the-art methods. To this end, we designed a novel architecture-aware hybrid central processing unit (CPU)-GPU LSM for image segmentation. The initialization step, using the well-known k-means algorithm, is fast although executed on a CPU, while the evolution equation of the active contour is inherently local and therefore suitable for GPU-based acceleration. The incorporation of local statistics in the level set evolution allows our model to detect new boundaries that are not extracted by the clustering algorithm. Compared with some cutting-edge LSMs, the introduced model is faster, more accurate, less prone to local minima, and therefore suitable for automatic systems. Furthermore, it allows two-phase clustering algorithms to benefit from the numerous LSM advantages, such as the ability to achieve robust and subpixel-accurate segmentation results with smooth and closed contours. Intensive experiments demonstrate, objectively and subjectively, the good performance of the introduced framework in terms of both speed and accuracy.
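
    The two-stage idea, a cheap clustering pass that seeds the level set, can be sketched as follows on 1D intensity data. The Lloyd-style two-class k-means and the ±1 initialization are illustrative assumptions; the paper's GPU evolution step is omitted:

```python
import numpy as np

def kmeans_init_levelset(image, n_iters=20):
    """Sketch of a two-stage scheme: a fast two-class k-means pass
    on intensities seeds an initial level-set field for later refinement."""
    # Lloyd's algorithm with k = 2, seeded at the intensity extremes.
    c = np.array([image.min(), image.max()], dtype=float)
    for _ in range(n_iters):
        labels = np.abs(image[..., None] - c).argmin(axis=-1)
        for k in range(2):
            if (labels == k).any():
                c[k] = image[labels == k].mean()
    # Initialize the level set as +1 inside the bright class, -1 outside;
    # a real implementation would then evolve this field on the GPU.
    phi0 = np.where(labels == c.argmax(), 1.0, -1.0)
    return phi0, c

# Bimodal intensities: the seed contour should separate the two modes.
rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(0.2, 0.05, 500), rng.normal(0.8, 0.05, 500)])
phi0, centers = kmeans_init_levelset(img)
```

    The local statistics mentioned in the abstract would then let the evolution recover boundaries the clustering misses.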

  13. Bushmeat genetics: setting up a reference framework for the DNA typing of African forest bushmeat.

    PubMed

    Gaubert, Philippe; Njiokou, Flobert; Olayemi, Ayodeji; Pagani, Paolo; Dufour, Sylvain; Danquah, Emmanuel; Nutsuakor, Mac Elikem K; Ngua, Gabriel; Missoup, Alain-Didier; Tedesco, Pablo A; Dernat, Rémy; Antunes, Agostinho

    2015-05-01

    The bushmeat trade in tropical Africa represents illegal, unsustainable off-takes of millions of tons of wild game - mostly mammals - per year. We sequenced four mitochondrial gene fragments (cyt b, COI, 12S, 16S) in >300 bushmeat items representing nine mammalian orders and 59 morphological species from five western and central African countries (Guinea, Ghana, Nigeria, Cameroon and Equatorial Guinea). Our objectives were to assess the efficiency of cross-species PCR amplification and to evaluate the usefulness of our multilocus approach for reliable bushmeat species identification. We provide a straightforward amplification protocol using a single 'universal' primer pair per gene that generally yielded >90% PCR success rates across orders and was robust to different types of meat preprocessing and DNA extraction protocols. For taxonomic identification, we set up a decision pipeline combining similarity- and tree-based approaches with an assessment of taxonomic expertise and coverage of the GENBANK database. Our multilocus approach permitted us to: (i) adjust for existing taxonomic gaps in GENBANK databases, (ii) assign to the species level 67% of the morphological species hypotheses and (iii) successfully identify samples with uncertain taxonomic attribution (preprocessed carcasses and cryptic lineages). High levels of genetic polymorphism across genes and taxa, together with the excellent resolution observed among species-level clusters (neighbour-joining trees and Klee diagrams) advocate the usefulness of our markers for bushmeat DNA typing. We formalize our DNA typing decision pipeline through an expert-curated query database - DNA BUSHMEAT - that shall permit the automated identification of African forest bushmeat items.

  14. Cooperative fuzzy games approach to setting target levels of ECs in quality function deployment.

    PubMed

    Yang, Zhihui; Chen, Yizeng; Yin, Yunqiang

    2014-01-01

    Quality function deployment (QFD) provides a means of translating customer requirements (CRs) into engineering characteristics (ECs) at each stage of product development and production. The main objective of QFD-based product planning is to determine the target levels of the ECs for a new product or service. QFD is a breakthrough tool which can effectively reduce the gap between CRs and a new product/service. Even though there are conflicts among some ECs, the objective of developing a new product is to maximize overall customer satisfaction; therefore, there may be room for cooperation among the ECs. A cooperative game framework combined with fuzzy set theory is developed to determine the target levels of the ECs in QFD. The key to developing the model is the formulation of the bargaining function. In the proposed methodology, the membership functions of the ECs are viewed as the players in formulating the bargaining function. The solution of the proposed model is Pareto-optimal. An illustrative example demonstrates the application and performance of the proposed approach.
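
    The cooperative-bargaining idea can be sketched with two conflicting ECs whose fuzzy memberships act as players; picking the target that maximizes the product of memberships is the classical Nash bargaining product. The triangular membership shapes below are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def nash_bargain_targets(memberships, grid):
    """Pick the target level maximizing the product of the players'
    (ECs') fuzzy membership values over a candidate grid."""
    vals = np.prod([m(grid) for m in memberships], axis=0)
    return grid[vals.argmax()], vals.max()

# Two conflicting ECs: one prefers low settings, the other high.
def mu_cost(x):
    return np.clip(1.0 - x, 0.0, 1.0)   # satisfaction falls as x rises

def mu_perf(x):
    return np.clip(x, 0.0, 1.0)         # satisfaction rises with x

x_grid = np.linspace(0.0, 1.0, 1001)
x_star, sat = nash_bargain_targets([mu_cost, mu_perf], x_grid)
```

    For these symmetric memberships the bargaining solution lands at the midpoint, splitting the conflict evenly between the two ECs.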

  15. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    SciTech Connect

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-01-01

    We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy, experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters and pore descriptors. On average, PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. The DFT-predicted elastic properties, such as the minimum shear modulus and Young's modulus, can differ by an average of 3 and 9 GPa, respectively, for rigid MOFs such as those in the test set. Moreover, the partial charges calculated by vdW-DF2 deviate the most from those of the other functionals, while there is no significant difference between the partial charges calculated by M06L, PBE, PW91, PBE-D2, and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitudes of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.

  16. A global health delivery framework approach to epilepsy care in resource-limited settings.

    PubMed

    Cochran, Maggie F; Berkowitz, Aaron L

    2015-11-15

    The Global Health Delivery (GHD) framework (Farmer, Kim, and Porter, Lancet 2013;382:1060-69) allows for the analysis of health care delivery systems along four axes: a care delivery value chain that incorporates prevention, diagnosis, and treatment of a medical condition; shared delivery infrastructure that integrates care within existing healthcare delivery systems; alignment of care delivery with local context; and generation of economic growth and social development through the health care delivery system. Here, we apply the GHD framework to epilepsy care in rural regions of low- and middle-income countries (LMIC) where there are few or no neurologists.

  17. Novel multimodality segmentation using level sets and Jensen-Rényi divergence

    SciTech Connect

    Markel, Daniel; Zaidi, Habib; El Naqa, Issam

    2013-12-15

    Purpose: Positron emission tomography (PET) is playing an increasing role in radiotherapy treatment planning. However, despite progress, robust algorithms for PET and multimodal image segmentation are still lacking, especially if the algorithm is to be extended to image-guided and adaptive radiotherapy (IGART). This work presents a novel multimodality segmentation algorithm using the Jensen-Rényi divergence (JRD) to evolve a geometric level set contour. The algorithm offers improved noise tolerance, which is particularly applicable to segmentation of regions found in PET and cone-beam computed tomography. Methods: A steepest gradient ascent optimization method is used in conjunction with the JRD and a level set active contour to iteratively evolve a contour that partitions an image based on the statistical divergence of the intensity histograms. The algorithm is evaluated using PET scans of pharyngolaryngeal squamous cell carcinoma with the corresponding histological reference. The multimodality extension of the algorithm is evaluated using 22 PET/CT scans of patients with lung carcinoma and a physical phantom scanned under varying image quality conditions. Results: The average concordance index (CI) of the JRD segmentation of the PET images was 0.56 with an average classification error of 65%. The segmentation of the lung carcinoma images had a maximum diameter relative error of 63%, 19.5%, and 14.8% when using CT, PET, and combined PET/CT images, respectively. The estimated maximal diameters of the gross tumor volume (GTV) showed a high correlation with the macroscopically determined maximal diameters, with an R² value of 0.85 and 0.88 using the PET and PET/CT images, respectively. Results from the physical phantom show that the JRD is more robust to image noise compared to mutual information and region growing. Conclusions: The JRD has shown improved noise tolerance compared to mutual information for the purpose of PET image segmentation. Presented is a flexible
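
    The divergence driving the contour can be sketched directly. The block below implements one common form of the Jensen-Rényi divergence between two intensity histograms (equal weights, Rényi order α in (0, 1)); the paper's exact weights and order are not stated in this abstract, so these are assumptions.

```python
import math

# Jensen-Rényi divergence: Rényi entropy of the histogram mixture minus the
# mixture of the individual entropies. Zero iff the histograms coincide
# (for alpha in (0, 1), where the Rényi entropy is concave).

def renyi_entropy(p, alpha=0.5):
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def jrd(p, q, w=0.5, alpha=0.5):
    m = [w * pi + (1 - w) * qi for pi, qi in zip(p, q)]  # mixture histogram
    return renyi_entropy(m, alpha) - (w * renyi_entropy(p, alpha)
                                      + (1 - w) * renyi_entropy(q, alpha))

inside = [0.7, 0.2, 0.1]   # intensity histogram inside the evolving contour
outside = [0.1, 0.2, 0.7]  # intensity histogram outside
print(jrd(inside, outside) > jrd(inside, inside))  # distinct regions diverge
```

    Gradient ascent on this quantity pushes the level set toward a partition whose inside and outside histograms are maximally dissimilar.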

  18. An explanatory framework of teachers' perceptions of a positive mealtime environment in a preschool setting.

    PubMed

    Mita, Satoko C; Gray, Samuel A; Goodell, L Suzanne

    2015-07-01

    Attending a preschool center may help preschoolers with growth and development that encourage a healthy lifestyle, including sound eating behaviors. Providing a positive mealtime environment (PME) may be one of the keys to fostering a child's healthy eating habits in the classroom. However, a specific definition of a PME, the components of a PME, or directions on how to create one have not been established. The purpose of this study, therefore, was to explore Head Start teachers' perceptions related to a PME and create a conceptual framework representing these perceptions. To achieve this purpose, researchers conducted 65 in-depth phone interviews with Head Start teachers around the US. Applying principles of grounded theory, researchers developed a conceptual framework depicting teachers' perceptions of PME, consisting of five key components: (1) the people (i.e., teachers, kitchen staff, parent volunteers, and children), (2) positive emotional tone (e.g., relaxed and happy), (3) rules, expectations, and routines (e.g., family-style mealtime), (4) operations of a PME (i.e., eating, socialization, and learning), and (5) both short- and long-term outcomes of a PME. With this PME framework, researchers may be able to enhance the effectiveness of nutrition interventions related to a PME, focusing on the factors in the conceptual framework as well as barriers associated with achieving these factors.

  19. Validation of the Visitor and Resident Framework in an E-Book Setting

    ERIC Educational Resources Information Center

    Engelsmann, Hazel C.; Greifeneder, Elke; Lauridsen, Nikoline D.; Nielsen, Anja G.

    2014-01-01

    Introduction: By applying the visitor and resident framework on e-book usage, the article explores whether the concepts of a resident and a visitor can help to explain e-book use, and can help to gain a better insight into users' motivations for e-book use. Method: A questionnaire and semi-structured interviews were conducted with users of the…

  20. Screening Systems and Decision Making at the Preschool Level: Application of a Comprehensive Validity Framework

    ERIC Educational Resources Information Center

    Kettler, Ryan J.; Feeney-Kettler, Kelly A.

    2011-01-01

    Universal screening is designed to be an efficient method for identifying preschool students with mental health problems, but prior to use, screening systems must be evaluated to determine their appropriateness within a specific setting. In this article, an evidence-based validity framework is applied to four screening systems for identifying…

  1. Conceptual Framework and Levels of Abstraction for a Complex Large-Scale System

    SciTech Connect

    Simpson, Mary J.

    2005-03-23

    A conceptual framework and levels of abstraction are created to apply across all potential threats. Bioterrorism is used as a complex example to describe the general framework. Bioterrorism is unlimited with respect to the use of a specific agent, mode of dissemination, and potential target. Because the threat is open-ended, there is a strong need for a common, systemic understanding of attack scenarios related to bioterrorism. In recognition of this large-scale complex problem, systems are being created to define, design and use the proper level of abstraction and conceptual framework in bioterrorism. The wide variety of biological agents and delivery mechanisms provide an opportunity for dynamic scale changes by the linking or interlinking of existing threat components. Concurrent impacts must be separated and evaluated in terms of a given environment and/or ‘abstraction framework.’

  2. Alternative Frameworks of the Secondary School Students on the Concept of Condensation at Submicroscopic Level

    ERIC Educational Resources Information Center

    Abdullah, Nurdiana; Surif, Johari; Ismail, Syuhaida

    2016-01-01

    The study was carried out to identify the alternative frameworks on the concept of condensation at submicroscopic level among secondary school students (N = 324). Data was collected by using the qualitative method through the Understanding Test on the Concept of Matter at Submicroscopic Level which consisted of 10 open-ended questions. The…

  3. A novel level set model with automated initialization and controlling parameters for medical image segmentation.

    PubMed

    Liu, Qingyi; Jiang, Mingyan; Bai, Peirui; Yang, Guang

    2016-03-01

    In this paper, a level set model that requires neither manual generation of an initial contour nor manual setting of controlling parameters is proposed for medical image segmentation. The contribution of this paper is mainly manifested in three points. First, we propose a novel adaptive mean shift clustering method based on global image information to guide the evolution of the level set. By simple threshold processing, the results of mean shift clustering can automatically and quickly generate an initial contour for level set evolution. Second, we devise several new functions to estimate the controlling parameters of the level set evolution based on the clustering results and image characteristics. Third, the reaction diffusion method is adopted to supersede the distance regularization term of the RSF level set model, which effectively improves the accuracy and speed of segmentation with less manual intervention. Experimental results demonstrate the performance and efficiency of the proposed model for medical image segmentation.
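
    The automatic-initialization step can be sketched as follows. This is an assumption about the general recipe (thresholded clustering output converted to a binary-step level set function), not the paper's exact procedure.

```python
# Hypothetical sketch: turn a thresholded clustering result into an initial
# level set function. A common choice is a binary step: -c inside the
# detected region, +c outside; evolution then refines this rough contour.

def init_level_set(mask, c=2.0):
    return [[-c if inside else c for inside in row] for row in mask]

# Rough 3x3 object region in a 5x5 image, as a clustering/threshold result.
mask = [[1 <= x <= 3 and 1 <= y <= 3 for x in range(5)] for y in range(5)]
phi = init_level_set(mask)
print(phi[2][2], phi[0][0])  # negative inside the object, positive outside
```

    Starting from a data-driven mask like this removes the need for a user-drawn initial contour.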

  4. Intervention complexity--a conceptual framework to inform priority-setting in health.

    PubMed

    Gericke, Christian A; Kurowski, Christoph; Ranson, M Kent; Mills, Anne

    2005-04-01

    Health interventions vary substantially in the degree of effort required to implement them. To some extent this is apparent in their financial cost, but the nature and availability of non-financial resources is often of similar importance. In particular, human resource requirements are frequently a major constraint. We propose a conceptual framework for the analysis of interventions according to their degree of technical complexity; this complements the notion of institutional capacity in considering the feasibility of implementing an intervention. Interventions are categorized into four dimensions: characteristics of the basic intervention; characteristics of delivery; requirements on government capacity; and usage characteristics. The analysis of intervention complexity should lead to a better understanding of supply- and demand-side constraints to scaling up, indicate priorities for further research and development, and can point to potential areas for improvement of specific aspects of each intervention to close the gap between the complexity of an intervention and the capacity to implement it. The framework is illustrated using the examples of scaling up condom social marketing programmes, and the DOTS strategy for tuberculosis control in highly resource-constrained countries. The framework could be used as a tool for policy-makers, planners and programme managers when considering the expansion of existing projects or the introduction of new interventions. Intervention complexity thus complements the considerations of burden of disease, cost-effectiveness, affordability and political feasibility in health policy decision-making. Reducing the technical complexity of interventions will be crucial to meeting the health-related Millennium Development Goals.

  5. A 4D Framework for Ocean Basin Paleodepths and Eustatic Sea Level Change

    NASA Astrophysics Data System (ADS)

    Muller, R.; Sdrolias, M.; Gaina, C.

    2006-12-01

    A digital framework for paleobathymetry of the ocean basins requires the complete reconstruction of ocean floor through time, including the main ocean basins, back-arc basins, and now-subducted ocean crust. We reconstruct paleo-oceans by creating "synthetic plates", the locations and geometry of which are established on the basis of preserved ocean crust (magnetic lineations and fracture zones), geological data, and the rules of plate tectonics. We reconstruct the spreading histories of the Pacific, Phoenix, Izanagi, Farallon and Kula plates, the plates involved in the Indian, Atlantic, Caribbean, Arctic, and Tethys oceanic domains, and all plates involved in preserved back-arc basins. Based mainly on the GML-standards-compliant GPlates software and the Generic Mapping Tools, we have created a set of global oceanic paleo-isochrons and paleoceanic age and depth grids. We show that the Late Cretaceous sea level highstand and the subsequent long-term drop in sea level were primarily caused by the changing age-area distribution of Pacific ocean floor through time. The emplacement of oceanic plateaus resulted in a 40 m sea level rise between 125 and 110 Ma and a further 60 m rise after 110 Ma, whereas the oceanic age and latitude dependence of marine sediments resulted in a 40 m sea level rise since about 120 Ma, offsetting the gradual post-80 Ma drop in sea level due to the aging and deepening mainly of the Pacific ocean basin, with the net effect being a drop of about 200 m after 80 Ma. Between 140 Ma and the present, oceanic crustal production dropped by over 40% in the Pacific but stayed roughly constant in the remaining ocean basins. Our results suggest that the overall magnitude of first-order sea level change implied by Haq's sea level curve is correct.
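
    The age-area mechanism behind this sea level argument can be illustrated with the classic Parsons-Sclater half-space cooling relation; this calibration is an assumption for illustration, not necessarily the one used by the authors.

```python
import math

# Half-space cooling sketch: seafloor subsides as it ages,
#   depth(t) ~ 2500 + 350 * sqrt(t)   [depth in m, t in Myr, t < ~70 Myr]
# (Parsons & Sclater calibration, assumed here for illustration).

def seafloor_depth(age_myr):
    return 2500.0 + 350.0 * math.sqrt(age_myr)

# A younger mean seafloor age gives a shallower basin, which displaces water
# upward and raises eustatic sea level -- the core of the age-area argument.
young_ocean = sum(seafloor_depth(t) for t in range(0, 40)) / 40
old_ocean = sum(seafloor_depth(t) for t in range(0, 70)) / 70
print(young_ocean < old_ocean)  # True: younger ocean floor, shallower basin
```

    As the Pacific floor aged after 80 Ma, its mean depth increased, drawing sea level down exactly as the abstract describes.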

  6. LANL2DZ basis sets recontracted in the framework of density functional theory.

    PubMed

    Chiodo, S; Russo, N; Sicilia, E

    2006-09-14

    In this paper we report recontracted LANL2DZ basis sets for first-row transition metals. The valence-electron-shell basis functions were recontracted using the PWP86 generalized gradient approximation functional and the hybrid B3LYP functional. Starting from the original LANL2DZ basis sets, a cyclic method was used to variationally optimize the contraction coefficients, while the contraction scheme was held fixed at that of the original LANL2DZ basis functions. The performance of the recontracted basis sets was analyzed by direct comparison between calculated and experimental excitation and ionization energies. The results reported here, compared with those obtained using the original basis sets, clearly show an improvement in the reproduction of the corresponding experimental gaps.

  7. Educational Preparation and Experiences in the Clinical Setting: Entry-Level Clinical Athletic Trainers' Perspectives

    ERIC Educational Resources Information Center

    Schilling, Jim

    2011-01-01

    Context: The clinical job setting (outpatient/ambulatory/rehabilitation clinic) should no longer be referred to as a nontraditional setting, as it employs the greatest percentage of certified members. Understanding the experiences, knowledge, and skills necessary to be successful in the clinical setting as entry-level certified athletic trainers…

  8. Integrating Compact Constraint and Distance Regularization with Level Set for Hepatocellular Carcinoma (HCC) Segmentation on Computed Tomography (CT) Images

    NASA Astrophysics Data System (ADS)

    Gui, Luying; He, Jian; Qiu, Yudong; Yang, Xiaoping

    2017-01-01

    This paper presents a variational level set approach to segmenting lesions with compact shapes on medical images. In this study, we address the problem of segmenting hepatocellular carcinomas, which usually have various shapes, variable intensities, and weak boundaries. An efficient constraint, called the isoperimetric constraint, is applied in this method to describe the compactness of shapes. In addition, to ensure precise segmentation and stable movement of the level set, a distance regularization is also implemented in the proposed variational framework. Our method is applied to segment various hepatocellular carcinoma regions in Computed Tomography images with promising results. Comparison results also show that the proposed method is more accurate than two other approaches.
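
    The compactness measure behind the isoperimetric constraint can be written down directly. The normalization below is the standard isoperimetric ratio; the paper's exact formulation may differ.

```python
import math

# Isoperimetric compactness ratio: C = 4*pi*Area / Perimeter^2.
# C = 1 only for a disk; every other shape scores below 1, so penalizing
# low C drives the evolving contour toward compact, blob-like lesions.

def compactness(area, perimeter):
    return 4.0 * math.pi * area / perimeter ** 2

r = 3.0
circle = compactness(math.pi * r * r, 2.0 * math.pi * r)  # analytically 1
square = compactness(1.0, 4.0)                            # pi/4, ~0.785
print(round(circle, 6), round(square, 3))
```

    A ragged, tendril-like contour has a large perimeter for its area and scores far below 1, so the constraint suppresses leakage through weak boundaries.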

  9. Intervention complexity--a conceptual framework to inform priority-setting in health.

    PubMed Central

    Gericke, Christian A.; Kurowski, Christoph; Ranson, M. Kent; Mills, Anne

    2005-01-01

    Health interventions vary substantially in the degree of effort required to implement them. To some extent this is apparent in their financial cost, but the nature and availability of non-financial resources is often of similar importance. In particular, human resource requirements are frequently a major constraint. We propose a conceptual framework for the analysis of interventions according to their degree of technical complexity; this complements the notion of institutional capacity in considering the feasibility of implementing an intervention. Interventions are categorized into four dimensions: characteristics of the basic intervention; characteristics of delivery; requirements on government capacity; and usage characteristics. The analysis of intervention complexity should lead to a better understanding of supply- and demand-side constraints to scaling up, indicate priorities for further research and development, and can point to potential areas for improvement of specific aspects of each intervention to close the gap between the complexity of an intervention and the capacity to implement it. The framework is illustrated using the examples of scaling up condom social marketing programmes, and the DOTS strategy for tuberculosis control in highly resource-constrained countries. The framework could be used as a tool for policy-makers, planners and programme managers when considering the expansion of existing projects or the introduction of new interventions. Intervention complexity thus complements the considerations of burden of disease, cost-effectiveness, affordability and political feasibility in health policy decision-making. Reducing the technical complexity of interventions will be crucial to meeting the health-related Millennium Development Goals. PMID:15868020

  10. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    DOE PAGES

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-01-01

    We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy, experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters and pore descriptors. On average, PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. DFT-predicted elastic properties such as the minimum shear modulus and Young's modulus can differ by an average of 3 and 9 GPa, respectively, for rigid MOFs such as those in the test set. Moreover, we find that the partial charges calculated by vdW-DF2 deviate the most from those of the other functionals, while there is no significant difference among the partial charges calculated by M06L, PBE, PW91, PBE-D2, and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.

  11. Classification of large-scale fundus image data sets: a cloud-computing framework.

    PubMed

    Roychowdhury, Sohini

    2016-08-01

    Large medical image data sets with high dimensionality require a substantial amount of computation time for data creation and data processing. This paper presents a novel generalized method that finds optimal image-based feature sets that reduce computational time complexity while maximizing overall classification accuracy for detection of diabetic retinopathy (DR). First, region-based and pixel-based features are extracted from fundus images for classification of DR lesions and vessel-like structures. Next, feature ranking strategies are used to distinguish the optimal classification feature sets. DR lesion and vessel classification accuracies are computed using the boosted decision tree and decision forest classifiers in the Microsoft Azure Machine Learning Studio platform, respectively. For images from the DIARETDB1 data set, 40 of its highest-ranked features are used to classify four DR lesion types with an average classification accuracy of 90.1% in 792 seconds. Also, for classification of red lesion regions and hemorrhages from microaneurysms, accuracies of 85% and 72% are observed, respectively. For images from the STARE data set, 40 high-ranked features can classify minor blood vessels with an accuracy of 83.5% in 326 seconds. Such cloud-based fundus image analysis systems can significantly enhance the borderline classification performances in automated screening systems.
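
    The rank-then-truncate step can be sketched generically. The abstract does not specify the ranking strategy, so the class-separation score below is purely illustrative.

```python
# Hypothetical feature-ranking sketch: score each feature by the absolute
# difference of its class-conditional means, normalized by the feature's
# range, then sort descending. Keeping only the top-k ranked features is
# what cuts the computational time complexity.

def rank_features(rows, labels):
    n_feat = len(rows[0])
    scores = []
    for f in range(n_feat):
        pos = [r[f] for r, y in zip(rows, labels) if y == 1]
        neg = [r[f] for r, y in zip(rows, labels) if y == 0]
        mean_pos, mean_neg = sum(pos) / len(pos), sum(neg) / len(neg)
        spread = (max(pos + neg) - min(pos + neg)) or 1.0
        scores.append((abs(mean_pos - mean_neg) / spread, f))
    return [f for _, f in sorted(scores, reverse=True)]

rows = [[0.9, 5.0], [0.8, 5.1], [0.1, 4.9], [0.2, 5.0]]  # toy feature table
labels = [1, 1, 0, 0]
ranked = rank_features(rows, labels)
print(ranked[0])  # feature 0 separates the classes best here
```

    In the paper's pipeline the ranked list would feed the top 40 features to the boosted decision tree or decision forest classifier.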

  12. Integrating spatial fuzzy clustering with level set methods for automated medical image segmentation.

    PubMed

    Li, Bing Nan; Chui, Chee Kong; Chang, Stephen; Ong, S H

    2011-01-01

    The performance of level set segmentation is subject to appropriate initialization and optimal configuration of controlling parameters, which require substantial manual intervention. A new fuzzy level set algorithm is proposed in this paper to facilitate medical image segmentation. It is able to evolve directly from the initial segmentation produced by spatial fuzzy clustering. The controlling parameters of the level set evolution are also estimated from the results of fuzzy clustering. Moreover, the fuzzy level set algorithm is enhanced with locally regularized evolution. Such improvements facilitate level set manipulation and lead to more robust segmentation. Performance evaluation of the proposed algorithm was carried out on medical images from different modalities. The results confirm its effectiveness for medical image segmentation.
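
    The clustering that seeds the level set can be sketched with the standard fuzzy c-means membership update. This 1-D version is illustrative only: the paper's spatial variant adds a neighborhood term, and the centers would normally be iterated rather than fixed.

```python
# Standard fuzzy c-means membership formula (fuzzifier m > 1):
#   u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
# where d_ik is the distance of point k to cluster center i.

def fcm_memberships(xs, centers, m=2.0):
    memberships = []
    for x in xs:
        d = [abs(x - c) + 1e-12 for c in centers]  # avoid division by zero
        row = []
        for i in range(len(centers)):
            row.append(1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                 for j in range(len(centers))))
        memberships.append(row)
    return memberships

u = fcm_memberships([0.1, 0.9, 0.15], centers=[0.1, 0.9])
print(u[0][0] > 0.9, u[1][1] > 0.9)  # points near a center score near 1
```

    Thresholding these memberships yields the initial contour, and their sharpness can inform the evolution's controlling parameters.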

  13. A framework for testing and promoting expanded dissemination of promising preventive interventions that are being implemented in community settings.

    PubMed

    Mason, W Alex; Fleming, Charles B; Thompson, Ronald W; Haggerty, Kevin P; Snyder, James J

    2014-10-01

    Many evidence-based preventive interventions have been developed in recent years, but few are widely used. With the current focus on efficacy trials, widespread dissemination and implementation of evidence-based interventions are often afterthoughts. One potential strategy for reversing this trend is to find a promising program with a strong delivery vehicle in place and improve and test the program's efficacy through rigorous evaluation. If the program is supported by evidence, the dissemination vehicle is already in place and potentially can be expanded. This strategy has been used infrequently and has met with limited success to date, in part, because the field lacks a framework for guiding such research. To address this gap, we outline a framework for moving promising preventive interventions that are currently being implemented in community settings through a process of rigorous testing and, if needed, program modification in order to promote expanded dissemination. The framework is guided by RE-AIM (Reach, Efficacy/Effectiveness, Adoption, Implementation, and Maintenance) (Glasgow et al., Am J Publ Health 89:1322-1327, 1999), which focuses attention on external as well as internal validity in program tests, and is illustrated with examples. Challenges, such as responding to negative and null results, and opportunities inherent in the framework are discussed.

  14. Holocene sea level variations on the basis of integration of independent data sets

    SciTech Connect

    Sahagian, D.; Berkman, P. (Dept. of Geological Sciences and Byrd Polar Research Center)

    1992-01-01

    Variations in sea level through earth history have occurred at a wide variety of time scales. Sea level researchers have attacked the problem of measuring these sea level changes through a variety of approaches, each relevant only to the time scale in question, and usually only to the specific locality from which a specific type of data is derived. There is a plethora of different data types that can be, and have been, used locally for the measurement of Holocene sea level variations. The problem of merging different data sets for the purpose of constructing a global eustatic sea level curve for the Holocene has not previously been adequately addressed. The authors direct their efforts to that end. Numerous studies have been published regarding Holocene sea level changes. These have involved exposed fossil reef elevations, elevations of tidal deltas, depths of intertidal peat deposits, caves, tree rings, ice cores, moraines, eolian dune ridges, marine-cut terrace elevations, marine carbonate species, tide gauges, and lake level variations. Each of these data sets is based on a particular set of assumptions and is valid for a specific set of environments. In order to obtain the most accurate possible sea level curve for the Holocene, these data sets must be merged so that local and other influences can be filtered out of each data set. Since each data set involves very different measurements, each is scaled in order to define the sensitivity of the proxy measurement parameter to sea level, including error bounds. This effectively determines the temporal and spatial resolution of each data set. The level of independence of the data sets is also quantified, in order to rule out the possibility of a common non-eustatic factor affecting more than one variety of data. The resulting Holocene sea level curve is considered to be independent of other factors affecting the proxy data and is taken to represent the relation between global ocean water and basin volumes.
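
    One simple reading of "scaling each data set with error bounds and merging" is an inverse-variance weighted combination of independent proxy estimates. The sketch below shows that idea with made-up numbers; it is not the authors' stated algorithm.

```python
# Hypothetical merge of independent sea level proxies for one time slice.
# Each proxy contributes (estimate_m, sigma_m); the inverse-variance
# weighted mean downweights the noisier proxies.

def merge(estimates):
    weights = [1.0 / sigma ** 2 for _, sigma in estimates]
    mean = (sum(w * value for w, (value, _) in zip(weights, estimates))
            / sum(weights))
    sigma = (1.0 / sum(weights)) ** 0.5  # combined error bound
    return mean, sigma

proxies = [(-2.0, 0.5),   # e.g. fossil reef elevation (illustrative values)
           (-1.6, 0.4),   # e.g. intertidal peat depth
           (-2.4, 1.0)]   # e.g. marine-cut terrace elevation
mean, sigma = merge(proxies)
print(round(mean, 2), round(sigma, 2))
```

    Note the combined error bound is smaller than any single proxy's, which is only valid if the proxies are genuinely independent; hence the abstract's emphasis on quantifying independence.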

  15. Design of the control set in the framework of variational data assimilation

    NASA Astrophysics Data System (ADS)

    Gejadze, I. Yu.; Malaterre, P.-O.

    2016-11-01

    Solving data assimilation problems under uncertainty in basic model parameters and in source terms may require a careful design of the control set. The task is to avoid combinations of control variables which may either lead to ill-posedness of the control problem formulation or compromise the robustness of the solution procedure. We suggest a method for quantifying the performance of a control set which is formed as a subset of the full set of uncertainty-bearing model inputs. Based on this quantity, one can decide whether the chosen 'safe' control set is sufficient in terms of prediction accuracy. Technically, the method is a generalization of the 'variational' uncertainty quantification method for observed systems. It is implemented as a matrix-free method, thus allowing high-dimensional applications. Moreover, if Automatic Differentiation is utilized for computing the tangent linear and adjoint mappings, it can be applied to any multi-input 'black-box' system. As an application example we consider the full Saint-Venant hydraulic network model SIC2, which describes flow dynamics in river and canal networks. The developed methodology seems useful in the context of the future SWOT satellite mission, which will provide observations of river systems whose properties are known with limited precision.

  16. An Examination of the Replicability of Angoff Standard Setting Results within a Generalizability Theory Framework

    ERIC Educational Resources Information Center

    Clauser, Jerome C.; Margolis, Melissa J.; Clauser, Brian E.

    2014-01-01

    Evidence of stable standard setting results over panels or occasions is an important part of the validity argument for an established cut score. Unfortunately, due to the high cost of convening multiple panels of content experts, standards often are based on the recommendation from a single panel of judges. This approach implicitly assumes that…

  17. Intellectual Curiosity in Action: A Framework to Assess First-Year Seminars in Liberal Arts Settings

    ERIC Educational Resources Information Center

    Kolb, Kenneth H.; Longest, Kyle C.; Barnett, Jenna C.

    2014-01-01

    Fostering students' intellectual curiosity is a common goal of first-year seminar programs--especially in liberal arts settings. The authors propose an alternative method to assess this ambiguous, value-laden concept. Relying on data gathered from pre- and posttest in-depth interviews of 34 students enrolled in first-year seminars, they construct…

  18. Evidence-Based Standard Setting: Establishing a Validity Framework for Cut Scores

    ERIC Educational Resources Information Center

    McClarty, Katie Larsen; Way, Walter D.; Porter, Andrew C.; Beimers, Jennifer N.; Miles, Julie A.

    2013-01-01

    Performance standards are a powerful way to communicate K-12 student achievement (e.g., proficiency) and are the cornerstone of standards-based reform. As education reform shifts the focus to college and career readiness, approaches for setting performance standards need to be revised. We argue that the focus on assessing student readiness can…

  19. Treating Voice Disorders in the School-Based Setting: Working within the Framework of IDEA

    ERIC Educational Resources Information Center

    Ruddy, Bari Hoffman; Sapienza, Christine M.

    2004-01-01

    The role of the speech-language pathologist (SLP) has developed considerably over the last 10 years given the medical and technological advances in life-sustaining procedures. Over time, children born with congenital, surgical, or "medically fragile" conditions have become mainstreamed into regular school-based settings, thus extending…

  20. A set of STS assays targeting the chromosome 22 physical framework markers

    SciTech Connect

    MacCollin, M.; Romano, D.; Trofatter, J.; Menon, A.; Gusella, J.; Budarf, M.; Emanuel, B. (Children's Hospital, Philadelphia, PA); Denny, C.; Rouleau, G.; Fontaine, B.

    1993-03-01

    The widespread use of the sequence-tagged site (STS) as a quick, efficient, and reproducible assay for comparing physical and genetic map information promises to greatly facilitate the long-range goal of mapping the human genome. The authors have designed 21 STS assays for loci on human chromosome 22. These assays primarily tag the physical framework markers of the long arm of chromosome 22, but additional assays have been designed from known genes and loci in the neurofibromatosis 2 (NF2) region. The availability of these assays will make these loci accessible to the research community without physical transfer of materials and will provide starting points for further efforts to physically map chromosome 22 with yeast artificial chromosome clones. 19 refs., 1 fig., 1 tab.

  1. Improving adolescent health policy: incorporating a framework for assessing state-level policies.

    PubMed

    Brindis, Claire D; Moore, Kristin

    2014-01-01

    Many US policies that affect health are made at the state, not the federal, level. Identifying state-level policies and data to analyze how different policies affect outcomes may help policy makers ascertain the usefulness of their public policies and funding decisions in improving the health of adolescent populations. A framework for describing and assessing the role of federal and state policies on adolescent health and well-being is proposed; an example of how the framework might be applied to the issue of teen childbearing is included. Such a framework can also help inform analyses of whether and how state and federal policies contribute to the variation across states in meeting adolescent health needs. A database on state policies, contextual variables, and health outcomes data can further enable researchers and policy makers to examine how these factors are associated with behaviors they aim to impact.

  2. Disseminating hypnosis to health care settings: Applying the RE-AIM framework

    PubMed Central

    Yeh, Vivian M.; Schnur, Julie B.; Montgomery, Guy H.

    2014-01-01

    Hypnosis is a brief intervention ready for wider dissemination in medical contexts. Overall, hypnosis remains underused despite evidence supporting its beneficial clinical impact. This review will evaluate the evidence supporting hypnosis for dissemination using guidelines formulated by Glasgow and colleagues (1999). Five dissemination dimensions will be considered: Reach, Efficacy, Adoption, Implementation, and Maintenance (RE-AIM). Reach: In medical settings, hypnosis is capable of helping a diverse range of individuals with a wide variety of problems. Efficacy: There is evidence supporting the use of hypnosis for chronic pain, acute pain and emotional distress arising from medical procedures and conditions, cancer treatment-related side-effects, and irritable bowel syndrome. Adoption: Although hypnosis is currently not a part of mainstream clinical practices, evidence suggests that patients and healthcare providers are open to trying hypnosis, and may become more so when educated about what hypnosis can do. Implementation: Hypnosis is a brief intervention capable of being administered effectively by healthcare providers. Maintenance: Given the low resource needs of hypnosis, opportunities for reimbursement, and the ability of the intervention to potentially help medical settings reduce costs, the intervention has the qualities necessary to be integrated into routine care in a self-sustaining way in medical settings. In sum, hypnosis is a promising candidate for further dissemination. PMID:25267941

  3. Level set based vertebra segmentation for the evaluation of Ankylosing Spondylitis

    NASA Astrophysics Data System (ADS)

    Tan, Sovira; Yao, Jianhua; Ward, Michael M.; Yao, Lawrence; Summers, Ronald M.

    2006-03-01

Ankylosing Spondylitis is a disease of the vertebra in which abnormal bone structures (syndesmophytes) grow at intervertebral disk spaces. Because this growth is so slow as to be undetectable on plain radiographs taken over years, it is necessary to resort to computerized techniques that complement qualitative human judgment with precise quantitative measures on 3-D CT images. Very fine segmentation of the vertebral body is required to capture the small structures caused by the pathology. We propose a segmentation algorithm based on a cascade of three level set stages that requires no training or prior knowledge. First, the noise inside the vertebral body that often blocks the proper evolution of level set surfaces is attenuated by a sigmoid function whose parameters are determined automatically. The first level set (geodesic active contour) is designed to roughly segment the interior of the vertebra despite often highly inhomogeneous and even discontinuous boundaries. The result is used as an initial contour for the second level set (Laplacian level set), which closely captures the inner boundary of the cortical bone. The last level set (reversed Laplacian level set) segments the outer boundary of the cortical bone and also corrects small flaws of the previous stage. We carried out extensive tests on 30 vertebrae (5 from each of 6 patients). Two medical experts scored the results at intervertebral disk spaces, focusing on end plates and syndesmophytes. Only two minor segmentation errors at vertebral end plates were reported, and two syndesmophytes were considered slightly under-segmented.
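The sigmoid preprocessing step described above admits a compact sketch. A minimal illustration (not the authors' pipeline; the parameter names alpha and beta are hypothetical stand-ins for the automatically determined values) of how a sigmoid remaps intensities to compress interior noise before the geodesic active contour stage:

```python
import numpy as np

def sigmoid_rescale(image, alpha, beta):
    """Map intensities through a sigmoid: values near `beta` are spread
    out while outliers are compressed toward 0 or 1; `alpha` controls
    the width of the transition band."""
    return 1.0 / (1.0 + np.exp(-(image - beta) / alpha))

# Toy CT slice: noisy interior around a nominal trabecular intensity.
rng = np.random.default_rng(0)
slice_ = rng.normal(loc=100.0, scale=15.0, size=(64, 64))
smoothed = sigmoid_rescale(slice_, alpha=10.0, beta=100.0)
print(smoothed.min() >= 0.0 and smoothed.max() <= 1.0)  # True: output bounded in (0, 1)
```

The bounded, smoothly varying output is what keeps the subsequent level-set evolution from being blocked by intensity outliers.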

  4. Implementing the New State Framework for History-Social Science (Tenth Grade Level).

    ERIC Educational Resources Information Center

    Leavey, Don

    1990-01-01

Describes the experience of implementing the new California History-Social Science Framework at the tenth grade level at Edison High School, Huntington Beach, California. Discusses the anxieties felt by teachers as they omitted areas of world history to teach selected topics in greater depth. Presents the world history course structure that was developed…

  5. A level-set procedure for the design of electromagnetic metamaterials.

    PubMed

    Zhou, Shiwei; Li, Wei; Sun, Guangyong; Li, Qing

    2010-03-29

Achieving negative permittivity and negative permeability represents a key research topic in the design of metamaterials. This paper introduces a level-set-based topology optimization method in which the interface between the vacuum and metal phases is implicitly expressed as the zero-level contour of a higher-dimensional level-set function. Following a sensitivity analysis, the optimization maximizes the objective based on the normal direction of the level-set function and the induced current flow, thereby generating the desired patterns of current flow on the metal surface. As a benchmark example, the U-shaped structure and its variations are obtained from the level-set topology optimization. Numerical examples demonstrate that both negative permittivity and negative permeability can be attained.
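The implicit representation at the heart of this approach can be sketched in a few lines. A minimal illustration (not the paper's optimizer; grid size and radius are arbitrary) of a material layout described by the zero contour of a level-set function:

```python
import numpy as np

# A level-set function phi over a design grid: the metal/vacuum interface
# is the zero contour, with metal where phi < 0 (one common sign
# convention; the paper's choice may differ).
n = 101
x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
phi = np.sqrt(x**2 + y**2) - 0.5   # signed distance to a circle of radius 0.5
metal = phi < 0                    # implicit material layout, no explicit mesh

# Evolving the design = transporting phi with a normal velocity V derived
# from the sensitivity analysis: phi_t + V * |grad phi| = 0.
area = metal.mean() * 4.0          # metal fraction of the 2x2 domain
print(round(area, 3))              # close to pi * 0.5**2 ≈ 0.785
```

Because the interface is implicit, topological changes (holes merging or splitting) during optimization require no remeshing.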

  6. Level set discrete element method for three-dimensional computations with triaxial case study

    NASA Astrophysics Data System (ADS)

    Kawamoto, Reid; Andò, Edward; Viggiani, Gioacchino; Andrade, José E.

    2016-06-01

    In this paper, we outline the level set discrete element method (LS-DEM) which is a discrete element method variant able to simulate systems of particles with arbitrary shape using level set functions as a geometric basis. This unique formulation allows seamless interfacing with level set-based characterization methods as well as computational ease in contact calculations. We then apply LS-DEM to simulate two virtual triaxial specimens generated from XRCT images of experiments and demonstrate LS-DEM's ability to quantitatively capture and predict stress-strain and volume-strain behavior observed in the experiments.
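The computational ease in contact calculations noted above can be sketched as follows. A hedged 2D illustration (LS-DEM itself is 3D and far more involved; the shapes and node counts here are invented): a particle stored as a discretized level set lets penetration be tested by interpolating that level set at another particle's boundary nodes:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Particle A's shape stored as a discretized level set (signed distance)
# on a local grid; contact with particle B is detected by evaluating this
# level set at B's boundary nodes: negative values mean penetration.
g = np.linspace(-2.0, 2.0, 81)
X, Y = np.meshgrid(g, g, indexing="ij")
phi_A = np.sqrt(X**2 + Y**2) - 1.0   # A: unit disk at the origin

interp = RegularGridInterpolator((g, g), phi_A)

# B: a disk of radius 0.5 centred at (1.2, 0); sample nodes on its boundary.
theta = np.linspace(0.0, 2.0 * np.pi, 32, endpoint=False)
nodes = np.stack([1.2 + 0.5 * np.cos(theta), 0.5 * np.sin(theta)], axis=1)

d = interp(nodes)              # level-set values of A at B's nodes
penetrating = nodes[d < 0.0]   # nodes inside A indicate contact
print(len(penetrating) > 0)    # True: centres 1.2 apart, radii sum 1.5
```

The same lookup also yields the penetration depth (the value of d) and, via the level-set gradient, the contact normal.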

  7. The Harmonizing Outcome Measures for Eczema (HOME) roadmap: a methodological framework to develop core sets of outcome measurements in dermatology.

    PubMed

    Schmitt, Jochen; Apfelbacher, Christian; Spuls, Phyllis I; Thomas, Kim S; Simpson, Eric L; Furue, Masutaka; Chalmers, Joanne; Williams, Hywel C

    2015-01-01

    Core outcome sets (COSs) are consensus-derived minimum sets of outcomes to be assessed in a specific situation. COSs are being increasingly developed to limit outcome-reporting bias, allow comparisons across trials, and strengthen clinical decision making. Despite the increasing interest in outcomes research, methods to develop COSs have not yet been standardized. The aim of this paper is to present the Harmonizing Outcomes Measures for Eczema (HOME) roadmap for the development and implementation of COSs, which was developed on the basis of our experience in the standardization of outcome measurements for atopic eczema. Following the establishment of a panel representing all relevant stakeholders and a research team experienced in outcomes research, the scope and setting of the core set should be defined. The next steps are the definition of a core set of outcome domains such as symptoms or quality of life, followed by the identification or development and validation of appropriate outcome measurement instruments to measure these core domains. Finally, the consented COS needs to be disseminated, implemented, and reviewed. We believe that the HOME roadmap is a useful methodological framework to develop COSs in dermatology, with the ultimate goal of better decision making and promoting patient-centered health care.

  8. Ice cover, landscape setting, and geological framework of Lake Vostok, East Antarctica

    USGS Publications Warehouse

    Studinger, M.; Bell, R.E.; Karner, G.D.; Tikku, A.A.; Holt, J.W.; Morse, D.L.; David, L.; Richter, T.G.; Kempf, S.D.; Peters, M.E.; Blankenship, D.D.; Sweeney, R.E.; Rystrom, V.L.

    2003-01-01

Lake Vostok, located beneath more than 4 km of ice in the middle of East Antarctica, is a unique subglacial habitat and may contain microorganisms with distinct adaptations to such an extreme environment. Melting and freezing at the base of the ice sheet, which slowly flows across the lake, controls the flux of water, biota and sediment particles through the lake. The influx of thermal energy, however, is limited to contributions from below. Thus the geological origin of Lake Vostok is a critical boundary condition for the subglacial ecosystem. We present the first comprehensive maps of ice surface, ice thickness and subglacial topography around Lake Vostok. The ice flow across the lake and the landscape setting are closely linked to the geological origin of Lake Vostok. Our data show that Lake Vostok is located along a major geological boundary. Magnetic and gravity data are distinct east and west of the lake, as is the roughness of the subglacial topography. The physiographic setting of the lake has important consequences for the ice flow and thus the melting and freezing pattern and the lake's circulation. Lake Vostok is a tectonically controlled subglacial lake. The tectonic processes provided the space for a unique habitat and recent minor tectonic activity could have the potential to introduce small, but significant amounts of thermal energy into the lake. © 2002 Elsevier Science B.V. All rights reserved.

  9. A rough set based rational clustering framework for determining correlated genes.

    PubMed

    Jeyaswamidoss, Jeba Emilyn; Thangaraj, Kesavan; Ramar, Kadarkarai; Chitra, Muthusamy

    2016-06-01

    Cluster analysis plays a foremost role in identifying groups of genes that show similar behavior under a set of experimental conditions. Several clustering algorithms have been proposed for identifying gene behaviors and to understand their significance. The principal aim of this work is to develop an intelligent rough clustering technique, which will efficiently remove the irrelevant dimensions in a high-dimensional space and obtain appropriate meaningful clusters. This paper proposes a novel biclustering technique that is based on rough set theory. The proposed algorithm uses correlation coefficient as a similarity measure to simultaneously cluster both the rows and columns of a gene expression data matrix and mean squared residue to generate the initial biclusters. Furthermore, the biclusters are refined to form the lower and upper boundaries by determining the membership of the genes in the clusters using mean squared residue. The algorithm is illustrated with yeast gene expression data and the experiment proves the effectiveness of the method. The main advantage is that it overcomes the problem of selection of initial clusters and also the restriction of one object belonging to only one cluster by allowing overlapping of biclusters.
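The mean squared residue used above to generate and refine the biclusters is a standard quantity; a minimal sketch (not the authors' implementation) of its computation:

```python
import numpy as np

def mean_squared_residue(A, rows, cols):
    """Cheng and Church style mean squared residue of the bicluster
    defined by index sets `rows` x `cols` of expression matrix A:
    the mean of (a_ij - rowmean_i - colmean_j + overall_mean)^2."""
    sub = A[np.ix_(rows, cols)]
    row_mean = sub.mean(axis=1, keepdims=True)
    col_mean = sub.mean(axis=0, keepdims=True)
    all_mean = sub.mean()
    residue = sub - row_mean - col_mean + all_mean
    return float((residue ** 2).mean())

# A perfectly additive (shifted) expression pattern has zero residue,
# the ideal bicluster under this score.
A = np.add.outer([0.0, 1.0, 2.0], [10.0, 20.0, 30.0])
print(mean_squared_residue(A, [0, 1, 2], [0, 1, 2]))  # 0.0
```

In the rough-set refinement, genes whose residue keeps them unambiguously inside a bicluster go to its lower approximation, while borderline genes populate the upper approximation, allowing the overlap between biclusters the abstract describes.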

  10. A Conceptual Framework for Organizational Readiness to Implement Nutrition and Physical Activity Programs in Early Childhood Education Settings

    PubMed Central

    Upadhyaya, Mudita; Schober, Daniel J.; Byrd-Williams, Courtney

    2014-01-01

Across multiple sectors, organizational readiness predicts the success of program implementation. However, the factors influencing the readiness of early childhood education (ECE) organizations to implement new nutrition and physical activity programs are poorly understood. This study presents a new conceptual framework to measure organizational readiness to implement nutrition and physical activity programs in ECE centers serving children aged 0 to 5 years. The framework was validated for consensus on relevance and generalizability by conducting focus groups; the participants were managers (16 directors and 2 assistant directors) of ECE centers. The framework theorizes that it is necessary to have “collective readiness,” which takes into account such factors as resources, organizational operations, work culture, and the collective attitudes, motivation, beliefs, and intentions of ECE staff. Results of the focus groups demonstrated consensus on the relevance of proposed constructs across ECE settings. Including readiness measures during program planning and evaluation could inform implementation of ECE programs targeting nutrition and physical activity behaviors. PMID:25357258

  11. Epidemic Reconstruction in a Phylogenetics Framework: Transmission Trees as Partitions of the Node Set.

    PubMed

    Hall, Matthew; Woolhouse, Mark; Rambaut, Andrew

    2015-12-01

The use of genetic data to reconstruct the transmission tree of infectious disease epidemics and outbreaks has been the subject of an increasing number of studies, but previous approaches have usually either made assumptions that are not fully compatible with phylogenetic inference, or, where they have based inference on a phylogeny, have employed a procedure that requires this tree to be fixed. At the same time, the coalescent-based models of the pathogen population that are employed in the methods usually used for time-resolved phylogeny reconstruction are a considerable simplification of the epidemic process, as they assume that pathogen lineages mix freely. Here, we contribute a new method that is simultaneously a phylogeny reconstruction method for isolates taken from an epidemic, and a procedure for transmission tree reconstruction. We observe that, if one or more samples are taken from each host in an epidemic or outbreak and these are used to build a phylogeny, a transmission tree is equivalent to a partition of the set of nodes of this phylogeny, such that each partition element is a set of nodes that is connected in the full tree and contains all the tips corresponding to samples taken from one and only one host. We then implement a Markov chain Monte Carlo (MCMC) procedure for simultaneous sampling from the spaces of both trees, utilising a newly designed set of phylogenetic tree proposals that also respect node partitions. We calculate the posterior probability of these partitioned trees based on a model that acknowledges the population structure of an epidemic by employing an individual-based disease transmission model and a coalescent process taking place within each host. We demonstrate our method, first using simulated data, and then with sequences taken from the H7N7 avian influenza outbreak that occurred in the Netherlands in 2003. We show that it is superior to established coalescent methods for reconstructing the topology and node heights of the
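The partition condition stated above (each element connected in the tree and containing the tips of exactly one host) can be checked mechanically. A toy sketch, not the authors' implementation, with an invented three-host tree:

```python
from collections import defaultdict

def valid_partition(parent, tips_host, partition):
    """Check that `partition` (node -> host label) is a valid transmission
    tree: each element induces a connected subtree and each tip lies in
    its own host's element. `parent` maps child -> parent node;
    `tips_host` maps tip -> sampled host."""
    by_host = defaultdict(set)
    for node, host in partition.items():
        by_host[host].add(node)
    # Connectedness: within an element, exactly one node (its "root")
    # may have a parent outside the element.
    for host, nodes in by_host.items():
        roots = [n for n in nodes if parent.get(n) not in nodes]
        if len(roots) != 1:
            return False
    # Tip condition: a tip must sit in the element of its own host.
    return all(partition[tip] == host for tip, host in tips_host.items())

# Tiny tree: root R has children I and tip c (host C); I has tips a, b.
parent = {"I": "R", "a": "I", "b": "I", "c": "R"}
tips_host = {"a": "A", "b": "B", "c": "C"}
partition = {"R": "C", "I": "A", "a": "A", "b": "B", "c": "C"}
print(valid_partition(parent, tips_host, partition))  # True
```

In the MCMC described above, tree proposals are accepted only if the induced partition still satisfies exactly these constraints.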

  12. Setting a Minimum Standard of Care in Clinical Trials: Human Rights and Bioethics as Complementary Frameworks.

    PubMed

    Marouf, Fatma E; Esplin, Bryn S

    2015-06-11

    For the past few decades, there has been intense debate in bioethics about the standard of care that should be provided in clinical trials conducted in developing countries. Some interpret the Declaration of Helsinki to mean that control groups should receive the best intervention available worldwide, while others interpret this and other international guidelines to mean the best local standard of care. Questions of justice are particularly relevant where limited resources mean that the local standard of care is no care at all. Introducing human rights law into this complex and longstanding debate adds a new and important perspective. Through non-derogable rights, including the core obligations of the right to health, human rights law can help set a minimum standard of care.

  13. Basin-scale runoff prediction: An Ensemble Kalman Filter framework based on global hydrometeorological data sets

    NASA Astrophysics Data System (ADS)

    Lorenz, Christof; Tourian, Mohammad J.; Devaraju, Balaji; Sneeuw, Nico; Kunstmann, Harald

    2015-10-01

In order to cope with the steady decline of the number of in situ gauges worldwide, there is a growing need for alternative methods to estimate runoff. We present an Ensemble Kalman Filter based approach that allows us to infer runoff for poorly or irregularly gauged basins. The approach focuses on the application of publicly available global hydrometeorological data sets for precipitation (GPCC, GPCP, CRU, UDEL), evapotranspiration (MODIS, FLUXNET, GLEAM, ERA interim, GLDAS), and water storage changes (GRACE, WGHM, GLDAS, MERRA LAND). Furthermore, runoff data from the GRDC and satellite altimetry derived estimates are used. We follow a least squares prediction that exploits the joint temporal and spatial auto- and cross-covariance structures of precipitation, evapotranspiration, water storage changes and runoff. We further consider time-dependent uncertainty estimates derived from all data sets. Our in-depth analysis comprises 29 large river basins of different climate regions, with which runoff is predicted for a subset of 16 basins. Six configurations are analyzed: the Ensemble Kalman Filter (Smoother) and the hard (soft) Constrained Ensemble Kalman Filter (Smoother). Comparing the predictions to observed monthly runoff shows correlations larger than 0.5, percentage biases lower than ± 20%, and NSE-values larger than 0.5. A modified NSE-metric, stressing the difference to the mean annual cycle, shows an improvement of runoff predictions for 14 of the 16 basins. The proposed method is able to provide runoff estimates for nearly 100 poorly gauged basins covering an area of more than 11,500,000 km2 with a freshwater discharge, in volume, of more than 125,000 m3/s.
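The analysis step of a stochastic Ensemble Kalman Filter, the core ingredient of the approach, can be sketched as follows. This is a generic textbook formulation, not the authors' constrained variants, and the one-dimensional runoff example is invented:

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """Stochastic EnKF analysis step.
    X: (n_state, n_ens) forecast ensemble; y: (n_obs,) observation;
    H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) obs covariance."""
    n_obs, n_ens = H.shape[0], X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)
    P = A @ A.T / (n_ens - 1)                        # ensemble covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
    return X + K @ (Y - H @ X)                       # analysis ensemble

rng = np.random.default_rng(1)
X = rng.normal(5.0, 2.0, size=(1, 500))   # prior runoff ensemble ~ N(5, 4)
y = np.array([3.0])                       # one (pseudo-)observation
H = np.array([[1.0]])
R = np.array([[1.0]])
Xa = enkf_update(X, y, H, R, rng)
print(abs(Xa.mean() - 3.4) < 0.3)         # posterior mean near (3/1 + 5/4)/(1 + 1/4) = 3.4
```

In the paper's setting, the state vector stacks precipitation, evapotranspiration, storage change and runoff, so the ensemble cross-covariances let observed components correct the unobserved runoff.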

  14. Basin-scale runoff prediction: An Ensemble Kalman Filter framework based on global hydrometeorological data sets

    NASA Astrophysics Data System (ADS)

    Kunstmann, Harald; Lorenz, Christof; Tourian, Mohammad; Devaraju, Balaji; Sneeuw, Nico

    2016-04-01

In order to cope with the steady decline of the number of in situ gauges worldwide, there is a growing need for alternative methods to estimate runoff. We present an Ensemble Kalman Filter based approach that allows us to infer runoff for poorly or irregularly gauged basins. The approach focuses on the application of publicly available global hydrometeorological data sets for precipitation (GPCC, GPCP, CRU, UDEL), evapotranspiration (MODIS, FLUXNET, GLEAM, ERA interim, GLDAS), and water storage changes (GRACE, WGHM, GLDAS, MERRA LAND). Furthermore, runoff data from the GRDC and satellite altimetry derived estimates are used. We follow a least squares prediction that exploits the joint temporal and spatial auto- and cross-covariance structures of precipitation, evapotranspiration, water storage changes and runoff. We further consider time-dependent uncertainty estimates derived from all data sets. Our in-depth analysis comprises 29 large river basins of different climate regions, with which runoff is predicted for a subset of 16 basins. Six configurations are analyzed: the Ensemble Kalman Filter (Smoother) and the hard (soft) Constrained Ensemble Kalman Filter (Smoother). Comparing the predictions to observed monthly runoff shows correlations larger than 0.5, percentage biases lower than ± 20%, and NSE-values larger than 0.5. A modified NSE-metric, stressing the difference to the mean annual cycle, shows an improvement of runoff predictions for 14 of the 16 basins. The proposed method is able to provide runoff estimates for nearly 100 poorly gauged basins covering an area of more than 11,500,000 km2 with a freshwater discharge, in volume, of more than 125,000 m3/s.

  15. An adaptive level set approach for incompressible two-phase flows

    SciTech Connect

    Sussman, M.; Almgren, A.S.; Bell, J.B.

    1997-04-01

    In Sussman, Smereka and Osher, a numerical method using the level set approach was formulated for solving incompressible two-phase flow with surface tension. In the level set approach, the interface is represented as the zero level set of a smooth function; this has the effect of replacing the advection of density, which has steep gradients at the interface, with the advection of the level set function, which is smooth. In addition, the interface can merge or break up with no special treatment. The authors maintain the level set function as the signed distance from the interface in order to robustly compute flows with high density ratios and stiff surface tension effects. In this work, they couple the level set scheme to an adaptive projection method for the incompressible Navier-Stokes equations, in order to achieve higher resolution of the interface with a minimum of additional expense. They present two-dimensional axisymmetric and fully three-dimensional results of air bubble and water drop computations.
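The key idea, advecting the smooth level-set function instead of the discontinuous density, can be illustrated in one dimension. A minimal first-order upwind sketch (not the authors' adaptive projection scheme; grid, velocity and time step are invented):

```python
import numpy as np

# Instead of advecting a discontinuous density field, advect the smooth
# level-set function phi (signed distance to the interface) and recover
# density from its sign: rho = rho1 + (rho2 - rho1) * H(phi).
n, L, u, dt = 400, 1.0, 1.0, 0.001
x = np.linspace(0.0, L, n, endpoint=False)
dx = L / n
phi = x - 0.3                        # interface initially at x = 0.3

for _ in range(200):                 # first-order upwind for u > 0 (CFL = 0.4)
    phi[1:] -= u * dt / dx * (phi[1:] - phi[:-1])
    phi[0] = phi[1] - dx             # maintain the signed-distance slope at inflow

interface = x[np.argmin(np.abs(phi))]
print(round(interface, 2))           # interface moved to 0.3 + u*t = 0.5
```

Because phi stays linear across the interface, the advection introduces no smearing there, which is exactly why the density (reconstructed from a smoothed Heaviside of phi) keeps a sharp, well-resolved jump.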

  16. A discontinuous Galerkin conservative level set scheme for interface capturing in multiphase flows

    NASA Astrophysics Data System (ADS)

    Owkes, Mark; Desjardins, Olivier

    2013-09-01

    The accurate conservative level set (ACLS) method of Desjardins et al. [O. Desjardins, V. Moureau, H. Pitsch, An accurate conservative level set/ghost fluid method for simulating turbulent atomization, J. Comput. Phys. 227 (18) (2008) 8395-8416] is extended by using a discontinuous Galerkin (DG) discretization. DG allows for the scheme to have an arbitrarily high order of accuracy with the smallest possible computational stencil resulting in an accurate method with good parallel scaling. This work includes a DG implementation of the level set transport equation, which moves the level set with the flow field velocity, and a DG implementation of the reinitialization equation, which is used to maintain the shape of the level set profile to promote good mass conservation. A near second order converging interface curvature is obtained by following a height function methodology (common amongst volume of fluid schemes) in the context of the conservative level set. Various numerical experiments are conducted to test the properties of the method and show excellent results, even on coarse meshes. The tests include Zalesak’s disk, two-dimensional deformation of a circle, time evolution of a standing wave, and a study of the Kelvin-Helmholtz instability. Finally, this novel methodology is employed to simulate the break-up of a turbulent liquid jet.
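The height-function curvature estimate mentioned above reduces, in 2D, to differentiating interface heights. A minimal sketch (not the DG implementation; the stencil width and circle radius are arbitrary):

```python
import numpy as np

def curvature_from_heights(h, dx):
    """Second-order height-function curvature kappa = h'' / (1 + h'^2)^(3/2),
    evaluated at the central column of a 3-column stencil."""
    hx = (h[2] - h[0]) / (2.0 * dx)
    hxx = (h[2] - 2.0 * h[1] + h[0]) / dx**2
    return hxx / (1.0 + hx**2) ** 1.5

# Interface = circle of radius 2; heights sampled from h(x) = sqrt(R^2 - x^2).
R, dx = 2.0, 0.05
xs = np.array([-dx, 0.0, dx])
h = np.sqrt(R**2 - xs**2)
kappa = curvature_from_heights(h, dx)
print(abs(abs(kappa) - 1.0 / R) < 1e-3)   # True: |kappa| converges to 1/R
```

In the paper the heights are integrals of the conservative level set across columns of cells, but the curvature formula applied to them is the same.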

  17. A discontinuous Galerkin conservative level set scheme for interface capturing in multiphase flows

    SciTech Connect

    Owkes, Mark Desjardins, Olivier

    2013-09-15

    The accurate conservative level set (ACLS) method of Desjardins et al. [O. Desjardins, V. Moureau, H. Pitsch, An accurate conservative level set/ghost fluid method for simulating turbulent atomization, J. Comput. Phys. 227 (18) (2008) 8395–8416] is extended by using a discontinuous Galerkin (DG) discretization. DG allows for the scheme to have an arbitrarily high order of accuracy with the smallest possible computational stencil resulting in an accurate method with good parallel scaling. This work includes a DG implementation of the level set transport equation, which moves the level set with the flow field velocity, and a DG implementation of the reinitialization equation, which is used to maintain the shape of the level set profile to promote good mass conservation. A near second order converging interface curvature is obtained by following a height function methodology (common amongst volume of fluid schemes) in the context of the conservative level set. Various numerical experiments are conducted to test the properties of the method and show excellent results, even on coarse meshes. The tests include Zalesak’s disk, two-dimensional deformation of a circle, time evolution of a standing wave, and a study of the Kelvin–Helmholtz instability. Finally, this novel methodology is employed to simulate the break-up of a turbulent liquid jet.

  18. Locally constrained active contour: a region-based level set for ovarian cancer metastasis segmentation

    NASA Astrophysics Data System (ADS)

    Liu, Jianfei; Yao, Jianhua; Wang, Shijun; Linguraru, Marius George; Summers, Ronald M.

    2014-03-01

Accurate segmentation of ovarian cancer metastases is clinically useful to evaluate tumor growth and determine follow-up treatment. We present a region-based level set algorithm with localization constraints to segment ovarian cancer metastases. Our approach is established on a representative region-based level set, the Chan-Vese model, in which an active contour is driven by region competition. To reduce over-segmentation, we constrain the level set propagation within a narrow image band by embedding a dynamic localization function. The metastasis intensity prior is also estimated from image regions within the level set initialization. The localization function and intensity prior force the level set to stop at the desired metastasis boundaries. Our approach was validated on 19 ovarian cancer metastases with radiologist-labeled ground-truth on contrast-enhanced CT scans from 15 patients. The comparison between our algorithm and geodesic active contour indicated that the volume overlap was 75 ± 10% vs. 56 ± 6%, the Dice coefficient was 83 ± 8% vs. 63 ± 8%, and the average surface distance was 2.2 ± 0.6 mm vs. 4.4 ± 0.9 mm. Experimental results demonstrated that our algorithm outperformed traditional level set algorithms.
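The overlap metrics reported above are straightforward to compute from binary masks. A minimal sketch (the toy masks are invented; the study used radiologist-labeled CT ground truth):

```python
import numpy as np

def overlap_metrics(seg, gt):
    """Volume overlap (Jaccard index) and Dice coefficient between two
    binary masks, the agreement metrics reported for the segmentations."""
    seg, gt = seg.astype(bool), gt.astype(bool)
    inter = np.logical_and(seg, gt).sum()
    union = np.logical_or(seg, gt).sum()
    dice = 2.0 * inter / (seg.sum() + gt.sum())
    return inter / union, dice

seg = np.zeros((10, 10), dtype=bool)
seg[2:8, 2:8] = True                     # algorithm output: 36 voxels
gt = np.zeros((10, 10), dtype=bool)
gt[4:8, 2:8] = True                      # ground truth: 24 voxels, fully inside seg
jac, dice = overlap_metrics(seg, gt)
print(round(jac, 3), round(dice, 3))     # 0.667 0.8
```

Note that Dice is always at least as large as Jaccard, which matches the pattern in the reported numbers (83% vs. 75%, 63% vs. 56%).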

  19. [Intellectual development disorders in Latin America: a framework for setting policy priorities for research and care].

    PubMed

    Lazcano-Ponce, Eduardo; Katz, Gregorio; Allen-Leigh, Betania; Magaña Valladares, Laura; Rangel-Eudave, Guillermina; Minoletti, Alberto; Wahlberg, Ernesto; Vásquez, Armando; Salvador-Carulla, Luis

    2013-09-01

    Intellectual development disorders (IDDs) are a set of development disorders characterized by significantly limited cognitive functioning, learning disorders, and disorders related to adaptive skills and behavior. Previously grouped under the term "intellectual disability," this problem has not been widely studied or quantified in Latin America. Those affected are absent from public policy and do not benefit from government social development and poverty reduction strategies. This article offers a critical look at IDDs and describes a new taxonomy; it also proposes recognizing IDDs as a public health issue and promoting the professionalization of care, and suggests an agenda for research and regional action. In Latin America there is no consensus on the diagnostic criteria for IDDs. A small number of rehabilitation programs cover a significant proportion of the people who suffer from IDDs, evidence-based services are not offered, and health care guidelines have not been evaluated. Manuals on psychiatric diagnosis focus heavily on identifying serious IDDs and contribute to underreporting and erroneous classification. The study of these disorders has not been a legal, social science, or public health priority, resulting in a dearth of scientific evidence on them. Specific competencies and professionalization of care for these persons are needed, and interventions must be carried out with a view to prevention, rehabilitation, community integration, and inclusion in the work force.

  20. Evolving entities: towards a unified framework for understanding diversity at the species and higher levels

    PubMed Central

    Barraclough, Timothy G.

    2010-01-01

    Current approaches to studying the evolution of biodiversity differ in their treatment of species and higher level diversity patterns. Species are regarded as the fundamental evolutionarily significant units of biodiversity, both in theory and in practice, and extensive theory explains how they originate and evolve. However, most species are still delimited using qualitative methods that only relate indirectly to the underlying theory. In contrast, higher level patterns of diversity have been subjected to rigorous quantitative study (using phylogenetics), but theory that adequately explains the observed patterns has been lacking. Most evolutionary analyses of higher level diversity patterns have considered non-equilibrium explanations based on rates of diversification (i.e. exponentially growing clades), rather than equilibrium explanations normally used at the species level and below (i.e. constant population sizes). This paper argues that species level and higher level patterns of diversity can be considered within a common framework, based on equilibrium explanations. It shows how forces normally considered in the context of speciation, namely divergent selection and geographical isolation, can generate evolutionarily significant units of diversity above the level of reproductively isolated species. Prospects for the framework to answer some unresolved questions about higher level diversity patterns are discussed. PMID:20439282

  1. Options for future effective water management in Lombok: A multi-level nested framework

    NASA Astrophysics Data System (ADS)

    Sjah, Taslim; Baldwin, Claudia

    2014-11-01

    Previous research on water use in Lombok identified reduced water available in springs and limits on seasonal water availability. It foreshadowed increasing competition for water resources in critical areas of Lombok. This study examines preliminary information on local social-institutional arrangements for water allocation in the context of Ostrom's rules for self-governing institutions. We identify robust customary mechanisms for decision-making about water sharing and rules at a local level and suggest areas of further investigation for strengthening multi-level networked and nested frameworks, in collaboration with higher levels of government.

  2. Ontological Problem-Solving Framework for Assigning Sensor Systems and Algorithms to High-Level Missions

    PubMed Central

    Qualls, Joseph; Russomanno, David J.

    2011-01-01

    The lack of knowledge models to represent sensor systems, algorithms, and missions makes opportunistically discovering a synthesis of systems and algorithms that can satisfy high-level mission specifications impractical. A novel ontological problem-solving framework has been designed that leverages knowledge models describing sensors, algorithms, and high-level missions to facilitate automated inference of assigning systems to subtasks that may satisfy a given mission specification. To demonstrate the efficacy of the ontological problem-solving architecture, a family of persistence surveillance sensor systems and algorithms has been instantiated in a prototype environment to demonstrate the assignment of systems to subtasks of high-level missions. PMID:22164081

  3. A framework for sea level rise vulnerability assessment for southwest U.S. military installations

    USGS Publications Warehouse

    Chadwick, B.; Flick, Reinhard; Helly, J.; Nishikawa, T.; Pei, Fang Wang; O'Reilly, W.; Guza, R.; Bromirski, Peter; Young, A.; Crampton, W.; Wild, B.; Canner, I.

    2011-01-01

We describe an analysis framework to determine military installation vulnerabilities under increases in local mean sea level as projected over the next century. The effort is in response to an increasing recognition of potential climate change ramifications for national security and recommendations that DoD conduct assessments of the impact of climate change on U.S. military installations. Results of the effort described here focus on development of a conceptual framework for sea level rise vulnerability assessment at coastal military installations in the southwest U.S. We introduce the vulnerability assessment in the context of a risk assessment paradigm that incorporates sources in the form of future sea level conditions, pathways of impact including inundation, flooding, erosion and intrusion, and a range of military installation specific receptors such as critical infrastructure and training areas. A unique aspect of the methodology is the capability to develop wave climate projections from GCM outputs and transform these to future wave conditions at specific coastal sites. Future sea level scenarios are considered in the context of installation sensitivity curves which reveal response thresholds specific to each installation, pathway and receptor. In the end, our goal is to provide a military-relevant framework for assessment of accelerated SLR vulnerability, and develop the best scientifically-based scenarios of waves, tides and storms and their implications for DoD installations in the southwestern U.S. © 2011 MTS.

  4. Joint inversion of geophysical data using petrophysical clustering and facies deformation with the level set technique

    NASA Astrophysics Data System (ADS)

    Revil, A.

    2015-12-01

    Geological expertise and petrophysical relationships can be brought together to provide prior information while inverting multiple geophysical datasets. The merging of such information can result in more realistic solution in the distribution of the model parameters, reducing ipse facto the non-uniqueness of the inverse problem. We consider two level of heterogeneities: facies, described by facies boundaries and heteroegenities inside each facies determined by a correlogram. In this presentation, we pose the geophysical inverse problem in terms of Gaussian random fields with mean functions controlled by petrophysical relationships and covariance functions controlled by a prior geological cross-section, including the definition of spatial boundaries for the geological facies. The petrophysical relationship problem is formulated as a regression problem upon each facies. The inversion of the geophysical data is performed in a Bayesian framework. We demonstrate the usefulness of this strategy using a first synthetic case for which we perform a joint inversion of gravity and galvanometric resistivity data with the stations located at the ground surface. The joint inversion is used to recover the density and resistivity distributions of the subsurface. In a second step, we consider the possibility that the facies boundaries are deformable and their shapes are inverted as well. We use the level set approach to perform such deformation preserving prior topological properties of the facies throughout the inversion. With the help of prior facies petrophysical relationships and topological characteristic of each facies, we make posterior inference about multiple geophysical tomograms based on their corresponding geophysical data misfits. 
The method is applied to a second synthetic case showing that we can recover the heterogeneities inside the facies, the mean values for the petrophysical properties, and, to some extent, the facies boundaries using the 2D joint inversion of
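    The facies parameterization described in this record can be sketched with a single level set function whose sign selects the facies; the dipping-interface geometry and density values below are illustrative assumptions, not values from the presentation.

    ```python
    import numpy as np

    # Hypothetical 2D cross-section: one level set function phi splits the
    # model into two facies, each carrying its own mean density.
    nx, nz = 64, 32
    x = np.linspace(0.0, 1.0, nx)
    z = np.linspace(0.0, 0.5, nz)
    X, Z = np.meshgrid(x, z)

    # Signed distance-like function for a dipping interface z = 0.2 + 0.1*x.
    phi = Z - (0.2 + 0.1 * X)

    # Per-facies mean densities in kg/m^3 (illustrative values only).
    rho_facies1, rho_facies2 = 2300.0, 2700.0
    density = np.where(phi < 0.0, rho_facies1, rho_facies2)

    # Deforming the facies boundary amounts to perturbing phi; for a small
    # perturbation only the zero contour moves, so topology is preserved.
    phi_deformed = phi - 0.02
    density_deformed = np.where(phi_deformed < 0.0, rho_facies1, rho_facies2)
    ```

    Shifting phi enlarges the upper facies, which is how boundary updates enter the inversion without re-meshing the model.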

  5. A three-tier framework for monitoring antiretroviral therapy in high HIV burden settings

    PubMed Central

    Osler, Meg; Hilderbrand, Katherine; Hennessey, Claudine; Arendse, Juanita; Goemaere, Eric; Ford, Nathan; Boulle, Andrew

    2014-01-01

    The provision of antiretroviral therapy (ART) in low and middle-income countries is a chronic disease intervention of unprecedented magnitude and is the dominant health systems challenge for high-burden countries, many of which rank among the poorest in the world. Substantial external investment, together with the requirement for service evolution to adapt to changing needs, including the constant shift to earlier ART initiation, makes outcome monitoring and reporting particularly important. However, there is growing concern at the inability of many high-burden countries to report on the outcomes of patients who have been in care for various durations, or even the number of patients in care at a particular point in time. In many instances, countries can only report on the number of patients ever started on ART. Despite paper register systems coming under increasing strain, the evolution from paper directly to complex electronic medical record solutions is not viable in many contexts. Implementing a bridging solution, such as a simple offline electronic version of the paper register, can be a pragmatic alternative. This paper describes and recommends a three-tiered monitoring approach in low- and middle-income countries based on the experience implementing such a system in the Western Cape province of South Africa. A three-tier approach allows Ministries of Health to strategically implement one of the tiers in each facility offering ART services. Each tier produces the same nationally required monthly enrolment and quarterly cohort reports so that outputs from the three tiers can be aggregated into a single database at any level of the health system. The choice of tier is based on context and resources at the time of implementation. As resources and infrastructure improve, more facilities will transition to the next highest and more technologically sophisticated tier. 
Implementing a three-tier monitoring system at country level for pre-antiretroviral wellness, ART

  7. The Effects on Motor Performance of Setting an Overt Level of Aspiration by Mentally Retarded Students.

    ERIC Educational Resources Information Center

    Kozar, Bill

    This study investigates the effects of setting an overt level of aspiration on the standing long jump performance of mildly and moderately retarded institutionalized children. Thirty-three mildly retarded and seven moderately retarded students were randomly assigned to either an overt level of aspiration (OLA) group or a control group. Each…

  8. Target Detection in SAR Images Based on a Level Set Approach

    SciTech Connect

    Marques, Regis C.P.; Medeiros, Fatima N.S.; Ushizima, Daniela M.

    2008-09-01

    This paper introduces a new framework for point target detection in synthetic aperture radar (SAR) images. We focus on the task of locating small reflective regions using a level set-based algorithm. Unlike most approaches to image segmentation, we present an algorithm that incorporates speckle statistics instead of empirical parameters and that also dispenses with speckle filtering. The curve evolves according to speckle statistics, initially propagating with a maximum upward velocity in homogeneous areas. Our approach is validated by a series of tests on synthetic and real SAR images and compared with three other segmentation algorithms, demonstrating that it constitutes a novel and efficient method for target detection.
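    The abstract does not give the exact form of the speckle-statistics speed term; a minimal sketch, assuming an L-look Gamma speckle model and a log-likelihood-ratio speed (positive toward target-like pixels), might look like:

    ```python
    import numpy as np

    # Assumed model: L-look SAR intensity under fully developed speckle
    # follows a Gamma law with mean mu. A pointwise log-likelihood ratio
    # between a bright-target mean and a background mean can drive the front.
    L = 4  # number of looks (assumed)

    def gamma_loglike(I, mu, L):
        # Log-density of the L-look Gamma speckle model, constants dropped.
        return (L - 1) * np.log(I) - L * I / mu - L * np.log(mu)

    def front_speed(I, mu_target, mu_background, L=4):
        # Positive where the target hypothesis is more likely.
        return gamma_loglike(I, mu_target, L) - gamma_loglike(I, mu_background, L)

    rng = np.random.default_rng(0)
    background = rng.gamma(shape=L, scale=1.0 / L, size=1000)   # mean 1.0
    target = rng.gamma(shape=L, scale=10.0 / L, size=1000)      # mean 10.0

    v_bg = front_speed(background, 10.0, 1.0)
    v_tg = front_speed(target, 10.0, 1.0)
    ```

    On average the speed is negative in homogeneous background and positive inside bright targets, so the curve expands onto reflective regions and retreats elsewhere.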

  9. Breast mass segmentation in digital mammography based on pulse coupled neural network and level set method

    NASA Astrophysics Data System (ADS)

    Xie, Weiying; Ma, Yide; Li, Yunsong

    2015-05-01

    A novel approach to mammographic image segmentation, termed the PCNN-based level set algorithm, is presented in this paper. As the name implies, the method combines a pulse coupled neural network (PCNN) with the variational level set method for medical image segmentation. To date, little work has been done on detecting initial zero level set contours with a PCNN for subsequent level set evolution. In a mammographic image the breast tumor has high pixel values while the surrounding region is predominantly dark, so we first take the negative of the image before firing all of its pixels with the PCNN. The PCNN is thus employed to achieve mammary-specific initial mass contour detection, and the extracted contours are defined as the initial zero level set contours for automatic mass segmentation by the variational level set method. Furthermore, the proposed algorithm improves the external energy of the variational level set method for low-contrast mammographic images: since the gray level of the mass region is higher than that of the surrounding region, the Laplace operator is used to modify the external energy, making bright spots brighter relative to the surrounding pixels. A preliminary evaluation of the proposed method was performed on a known public database, MIAS, rather than on synthetic images. The experimental results demonstrate that our proposed approach can obtain better mass detection results in terms of sensitivity and specificity. 
Ultimately, this algorithm could increase both the sensitivity and specificity of physicians' interpretation of

  10. A CONCEPTUAL FRAMEWORK FOR MANAGING RADIATION DOSE TO PATIENTS IN DIAGNOSTIC RADIOLOGY USING REFERENCE DOSE LEVELS.

    PubMed

    Almén, Anja; Båth, Magnus

    2016-06-01

    The overall aim of the present work was to develop a conceptual framework for managing radiation dose in diagnostic radiology with the intention of supporting optimisation. An optimisation process was first derived, and the framework for managing radiation dose, based on this process, was then outlined. The optimisation process is organised into four stages: providing equipment, establishing methodology, performing examinations and ensuring quality, and comprises a series of activities and actions at these stages. The current system of diagnostic reference levels is an activity in the last stage, ensuring quality. The system is thus a reactive activity and only to a certain extent engages the core activity of the radiology department, performing examinations. Three reference dose levels (possible, expected and established) were assigned to the first three stages of the optimisation process, excluding ensuring quality. A reasonably achievable dose range is also derived, indicating an acceptable deviation from the established dose level; a reasonable radiation dose for a single patient lies within this range. The suggested framework for managing radiation dose should be regarded as one part of the optimisation process, which constitutes a variety of complementary activities. This emphasises the need for a holistic approach that integrates the optimisation process into different clinical activities.
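    A toy illustration of the three reference dose levels and the reasonably achievable range described in this record; all names and numbers below are hypothetical, since the paper defines the concepts rather than specific values.

    ```python
    # Hypothetical reference dose levels for one examination type (mGy).
    POSSIBLE = 0.8      # achievable with the provided equipment
    EXPECTED = 1.0      # expected with the established methodology
    ESTABLISHED = 1.2   # established level for routine examinations
    TOLERANCE = 0.3     # assumed acceptable deviation from the established level

    def assess_patient_dose(dose_mgy):
        """Classify a single-patient dose against the established level."""
        low, high = ESTABLISHED - TOLERANCE, ESTABLISHED + TOLERANCE
        if low <= dose_mgy <= high:
            return "reasonable"
        return "below range" if dose_mgy < low else "above range"
    ```

    A dose outside the reasonably achievable range flags the examination for review rather than declaring it unsafe, matching the framework's optimisation-oriented intent.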

  11. A hybrid method for pancreas extraction from CT image based on level set methods.

    PubMed

    Jiang, Huiyan; Tan, Hanqing; Fujita, Hiroshi

    2013-01-01

    This paper proposes a novel semiautomatic method to extract the pancreas from abdominal CT images. Traditional level set and region growing methods, which require the initial contour to be located near the final object boundary, suffer from leakage into tissues neighbouring the pancreas region. The proposed method consists of a customized fast-marching level set method, which generates an optimal initial pancreas region to address the sensitivity of the level set method to the initial contour location, and a modified distance regularized level set method, which accurately extracts the pancreas. The novelty of our method lies in the proper selection and combination of level set methods; furthermore, an energy-decrement algorithm and an energy-tune algorithm are proposed to reduce the negative impact of the bonding force caused by connected tissue whose intensity is similar to that of the pancreas. As a result, our method overcomes the shortcoming of oversegmentation at weak boundaries and can accurately extract the pancreas from CT images. The proposed method is compared to five other state-of-the-art medical image segmentation methods on a CT image dataset containing abdominal images from 10 patients. The evaluated results demonstrate that our method outperforms the other methods, achieving higher accuracy and less false segmentation in pancreas extraction.

  12. Issues related to setting exemption levels for oil and gas NORM

    SciTech Connect

    Blunt, D. L.; Gooden, D. S.; Smith, K. P.

    1999-11-12

    In the absence of any federal regulations that specifically address the handling and disposal of wastes containing naturally occurring radioactive material (NORM), individual states have taken responsibility for developing their own regulatory programs for NORM. A key issue in developing NORM rules is defining exemption levels: specific levels or concentrations that determine which waste materials are subject to controlled management. In general, states have drawn upon existing standards and guidelines for similar waste types in establishing exemption levels for NORM. Simply adopting these standards may not be appropriate for oil and gas NORM for several reasons. The Interstate Oil and Gas Compact Commission's NORM Subcommittee has summarized the issues involved in setting exemption levels in a report titled "Naturally Occurring Radioactive Materials (NORM): Issues from the Oil and Gas Point of View". The committee has also recommended a set of exemption levels for controlled practices and for remediation activities on the basis of the issues discussed.
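    A minimal sketch of exemption-level screening as discussed in this record; the 30 pCi/g radium figure and the sample values are purely illustrative, not the subcommittee's recommendation.

    ```python
    # Hypothetical state exemption level for radium activity concentration.
    EXEMPTION_LEVEL_PCI_G = 30.0  # pCi/g, illustrative only

    def requires_controlled_management(radium_pci_g):
        # Waste above the exemption level is subject to controlled management.
        return radium_pci_g > EXEMPTION_LEVEL_PCI_G

    # Illustrative measured activities for common oil-field NORM wastes.
    samples = {"produced sand": 12.0, "pipe scale": 85.0, "tank sludge": 31.5}
    regulated = sorted(k for k, v in samples.items()
                       if requires_controlled_management(v))
    ```

    The regulatory question the record raises is precisely where to set `EXEMPTION_LEVEL_PCI_G`: a threshold borrowed from a different waste type can misclassify oil and gas NORM streams.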

  13. Aerostructural Level Set Topology Optimization for a Common Research Model Wing

    NASA Technical Reports Server (NTRS)

    Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia

    2014-01-01

    The purpose of this work is to use level set topology optimization to improve the design of a representative wing box structure for the NASA common research model. The objective is to minimize the total compliance of the structure under aerodynamic and body force loading, where the aerodynamic loading is coupled to the structural deformation. A taxi bump case was also considered, where only body force loads were applied. The trim condition that aerodynamic lift must balance the total weight of the aircraft is enforced by allowing the root angle of attack to change. The level set optimization method is implemented on an unstructured three-dimensional grid, so that the method can optimize a wing box with arbitrary geometry. Fast matching and upwind schemes are developed for an unstructured grid, which make the level set method robust and efficient. The adjoint method is used to obtain the coupled shape sensitivities required to perform aerostructural optimization of the wing box structure.

  14. A Variational Level Set Approach to Segmentation and Bias Correction of Images with Intensity Inhomogeneity

    PubMed Central

    Huang, Rui; Ding, Zhaohua; Gatenby, Chris; Metaxas, Dimitris; Gore, John

    2009-01-01

    This paper presents a variational level set approach to joint segmentation and bias correction of images with intensity inhomogeneity. Our method is based on the observation that intensities in a relatively small local region are separable, despite the inseparability of the intensities in the whole image caused by the intensity inhomogeneity. We first define a weighted K-means clustering objective function for image intensities in a neighborhood around each point, with the cluster centers having a multiplicative factor that estimates the bias within the neighborhood. The objective function is then integrated over the entire domain and incorporated into a variational level set formulation. The energy minimization is performed via a level set evolution process. Our method is able to estimate bias fields of quite general profiles. Moreover, it is robust to initialization and therefore allows automated application. The proposed method has been used for images of various modalities with promising results. PMID:18982712
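    The local-separability observation underpinning the method can be reproduced on synthetic 1D data; the class intensities and bias field below are assumed for illustration.

    ```python
    import numpy as np

    # Two-class 1D image with a smooth multiplicative bias: globally the class
    # intensities overlap, but within a small window they remain separable.
    n = 400
    x = np.linspace(0.0, 1.0, n)
    true_class = (np.sin(20.0 * x) > 0).astype(float)   # alternating classes
    J = np.where(true_class > 0, 100.0, 60.0)           # unbiased intensities
    b = 1.0 + 1.5 * x                                   # smooth bias (assumed)
    I = b * J                                           # observed image

    # Globally, the brightest biased class-0 pixels exceed the darkest
    # class-1 pixels, so one threshold cannot separate the classes.
    global_overlap = bool(I[true_class == 0].max() > I[true_class == 1].min())

    # Locally (a 40-sample window containing both classes), the bias is nearly
    # constant and a single threshold separates the classes again.
    lo, hi = 100, 140
    w_I, w_c = I[lo:hi], true_class[lo:hi]
    local_separable = bool(w_I[w_c == 0].max() < w_I[w_c == 1].min())
    ```

    The paper's weighted K-means energy exploits exactly this: clustering within each neighborhood, with a shared multiplicative factor per neighborhood estimating the bias.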

  15. Hepatic vessel segmentation using variational level set combined with non-local robust statistics.

    PubMed

    Lu, Siyu; Huang, Hui; Liang, Ping; Chen, Gang; Xiao, Liang

    2017-02-01

    Hepatic vessel segmentation is a challenging step in therapy guided by magnetic resonance imaging (MRI). This paper presents an improved variational level set method, which uses non-local robust statistics to suppress the influence of noise in MR images. The non-local robust statistics, which represent vascular features, are learned adaptively from seeds provided by users. K-means clustering in the neighborhoods of the seeds is utilized to exclude inappropriate seeds, which are obviously corrupted by noise. The neighborhoods of appropriate seeds are placed in an array to calculate the non-local robust statistics, from which the variational level set formulation can be constructed. Bias correction is utilized in the level set formulation to reduce the influence of the intensity inhomogeneity of MRI. Experiments were conducted on real MR images and showed that the proposed method performed better on small hepatic vessel segmentation compared with other segmentation methods.

  16. Setting-level influences on implementation of the responsive classroom approach.

    PubMed

    Wanless, Shannon B; Patton, Christine L; Rimm-Kaufman, Sara E; Deutsch, Nancy L

    2013-02-01

    We used mixed methods to examine the association between setting-level factors and observed implementation of a social and emotional learning intervention (Responsive Classroom® approach; RC). In study 1 (N = 33 3rd grade teachers after the first year of RC implementation), we identified relevant setting-level factors and uncovered the mechanisms through which they related to implementation. In study 2 (N = 50 4th grade teachers after the second year of RC implementation), we validated our most salient Study 1 finding across multiple informants. Findings suggested that teachers perceived setting-level factors, particularly principal buy-in to the intervention and individualized coaching, as influential to their degree of implementation. Further, we found that intervention coaches' perspectives of principal buy-in were more related to implementation than principals' or teachers' perspectives. Findings extend the application of setting theory to the field of implementation science and suggest that interventionists may want to consider particular accounts of school setting factors before determining the likelihood of schools achieving high levels of implementation.

  17. Setting the Direction Framework

    ERIC Educational Resources Information Center

    Alberta Education, 2009

    2009-01-01

    Alberta has a long and proud history of meeting the educational needs of students with disabilities and diverse needs. The province serves many thousands of students with behavioural, communication and intellectual needs, as well as students with mental health challenges, learning or physical disabilities, and students who are gifted and talented.…

  18. A level-set method for thermal motion of bubbles and droplets

    NASA Astrophysics Data System (ADS)

    Balcázar, Néstor; Oliva, Assensi; Rigola, Joaquim

    2016-09-01

    A conservative level-set model for the direct simulation of two-phase flows with thermocapillary effects at dynamically deformable interfaces is presented. The Navier-Stokes equations, coupled with the energy conservation equation, are solved by means of a finite-volume/level-set method. Numerical examples including the thermocapillary motion of single and multiple fluid particles are computed with the present method. The results are compared with analytical solutions and numerical results from the literature to validate the proposed model.

  19. Numerical Schemes for the Hamilton-Jacobi and Level Set Equations on Triangulated Domains

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.; Sethian, James A.

    2006-01-01

    Borrowing from techniques developed for conservation law equations, we have developed both monotone and higher order accurate numerical schemes which discretize the Hamilton-Jacobi and level set equations on triangulated domains. The use of unstructured meshes containing triangles (2D) and tetrahedra (3D) easily accommodates mesh adaptation to resolve disparate level set feature scales with a minimal number of solution unknowns. The minisymposium talk will discuss these algorithmic developments and present sample calculations using our adaptive triangulation algorithm applied to various moving interface problems such as etching, deposition, and curvature flow.
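    A minimal Cartesian sketch of a monotone upwind (Godunov) discretization of the level set equation phi_t + F |grad phi| = 0; the triangulated-domain schemes of this record generalize the same monotone-flux construction to unstructured meshes, so the structured version is shown here only for brevity.

    ```python
    import numpy as np

    # First-order Godunov upwind update for phi_t + F |grad phi| = 0, F > 0.
    def upwind_step(phi, F, dx, dt):
        dmx = (phi - np.roll(phi, 1, axis=0)) / dx    # backward differences
        dpx = (np.roll(phi, -1, axis=0) - phi) / dx   # forward differences
        dmy = (phi - np.roll(phi, 1, axis=1)) / dx
        dpy = (np.roll(phi, -1, axis=1) - phi) / dx
        # Godunov flux for F > 0: keep only the upwind one-sided differences.
        grad = np.sqrt(np.maximum(dmx, 0.0) ** 2 + np.minimum(dpx, 0.0) ** 2
                       + np.maximum(dmy, 0.0) ** 2 + np.minimum(dpy, 0.0) ** 2)
        return phi - dt * F * grad

    n = 64
    dx = 1.0 / n
    y, x = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n), indexing="ij")
    phi = np.sqrt((x - 0.5) ** 2 + (y - 0.5) ** 2) - 0.2  # circle, radius 0.2

    area_before = int((phi < 0.0).sum())
    for _ in range(20):
        phi = upwind_step(phi, F=1.0, dx=dx, dt=0.5 * dx)  # CFL-limited steps
    area_after = int((phi < 0.0).sum())
    # With F = 1 the front moves outward along its normal, so the enclosed
    # region (phi < 0) grows.
    ```

    Monotonicity of the flux is what makes such schemes stable on the non-smooth solutions typical of etching and deposition fronts.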

  20. Geometrically constrained isogeometric parameterized level-set based topology optimization via trimmed elements

    NASA Astrophysics Data System (ADS)

    Wang, Yingjun; Benson, David J.

    2016-12-01

    In this paper, an approach based on a fast point-in-polygon (PIP) algorithm and trimmed elements is proposed for isogeometric topology optimization (TO) with arbitrary geometric constraints. The isogeometric parameterized level-set-based TO method, which directly uses non-uniform rational basis splines (NURBS) for both level set function (LSF) parameterization and objective function calculation, provides higher accuracy and efficiency than previous methods. The integration of trimmed elements is handled by an efficient quadrature rule that can place quadrature points and weights for arbitrary geometric shapes. Numerical examples demonstrate the efficiency and flexibility of the method.
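    A ray-casting point-in-polygon test, sketched here as one plausible form of the fast PIP classification such trimmed-element quadrature relies on (the paper's exact algorithm may differ).

    ```python
    # Ray casting: count crossings of a horizontal ray from the query point
    # with the polygon edges; an odd count means the point is inside.
    def point_in_polygon(px, py, poly):
        inside = False
        n = len(poly)
        for i in range(n):
            x1, y1 = poly[i]
            x2, y2 = poly[(i + 1) % n]
            if (y1 > py) != (y2 > py):                  # edge spans the ray
                x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
                if px < x_cross:                        # crossing to the right
                    inside = not inside
            # Points exactly on an edge are treated as outside in this sketch.
        return inside

    square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
    ```

    In a trimmed-element setting, each quadrature point is classified this way so that only the part of the element inside the constraint region contributes weight.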

  1. Segmentation of the Femur from MRI Data with Shape-Based Level Sets

    NASA Astrophysics Data System (ADS)

    Dekomien, Claudia; Busch, Martin; Teske, Wolfram; Winter, Susanne

    This work addresses the segmentation of the femur from MRI data sets using a shape-based level set approach. The algorithm consists of two phases: model construction and segmentation. In the segmentation phase, an edge-based and an intensity-based optimisation criterion were combined. For local improvement of the result, a Laplacian level set method was additionally applied. With this approach the femur was segmented well in three different MRI sequences and one fused data set.

  2. Level set segmentation of brain magnetic resonance images based on local Gaussian distribution fitting energy.

    PubMed

    Wang, Li; Chen, Yunjie; Pan, Xiaohua; Hong, Xunning; Xia, Deshen

    2010-05-15

    This paper presents a variational level set approach in a multi-phase formulation to segmentation of brain magnetic resonance (MR) images with intensity inhomogeneity. In our model, the local image intensities are characterized by Gaussian distributions with different means and variances. We define a local Gaussian distribution fitting energy with level set functions and local means and variances as variables. The means and variances of local intensities are considered as spatially varying functions. Therefore, our method is able to deal with intensity inhomogeneity without inhomogeneity correction. Our method has been applied to 3T and 7T MR images with promising results.

  3. Individual- and Setting-Level Correlates of Secondary Traumatic Stress in Rape Crisis Center Staff.

    PubMed

    Dworkin, Emily R; Sorell, Nicole R; Allen, Nicole E

    2016-02-01

    Secondary traumatic stress (STS) is an issue of significant concern among providers who work with survivors of sexual assault. Although STS has been studied in relation to individual-level characteristics of a variety of types of trauma responders, less research has focused specifically on rape crisis centers as environments that might convey risk of or protection from STS, and no research to our knowledge has modeled setting-level variation in correlates of STS. The current study uses a sample of 164 staff members representing 40 rape crisis centers across a single Midwestern state to investigate the staff member- and agency-level correlates of STS. Results suggest that correlates exist at both levels of analysis. Younger age and greater severity of sexual assault history were statistically significant individual-level predictors of increased STS. Greater frequency of supervision was more strongly related to secondary traumatic stress for non-advocates than for advocates. At the setting level, lower levels of supervision and higher client loads agency-wide accounted for unique variance in staff members' STS. These findings suggest that characteristics of both providers and their settings are important to consider when understanding their STS.

  4. Accurate Adaptive Level Set Method and Sharpening Technique for Three Dimensional Deforming Interfaces

    NASA Technical Reports Server (NTRS)

    Kim, Hyoungin; Liou, Meng-Sing

    2011-01-01

    In this paper, we demonstrate improved accuracy of the level set method for resolving deforming interfaces by proposing two key elements: (1) accurate level set solutions on adapted Cartesian grids, obtained by judiciously choosing interpolation polynomials in regions of different grid levels, and (2) enhanced reinitialization via an interface sharpening procedure. The level set equation is solved using a fifth order WENO scheme or a second order central differencing scheme depending on the availability of uniform stencils at each grid point. Grid adaptation criteria are determined so that the Hamiltonian functions at nodes adjacent to interfaces are always calculated by the fifth order WENO scheme. This selective use of the fifth order WENO and second order central differencing schemes is confirmed to give more accurate results than those in the literature for standard test problems. To further improve accuracy, especially near thin filaments, we suggest an artificial sharpening method, which takes a form similar to the conventional re-initialization method but utilizes the sign of the curvature instead of the sign of the level set function. Consequently, volume loss due to numerical dissipation on thin filaments is remarkably reduced for the test problems.
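    The conventional re-initialization PDE that the proposed sharpening method resembles, phi_t = sign(phi0)(1 - |grad phi|), can be sketched in 1D with Godunov upwinding; this is a generic illustration, not the paper's implementation.

    ```python
    import numpy as np

    # Relax phi toward a signed distance function (|grad phi| -> 1) while
    # keeping the zero level set approximately fixed.
    def reinitialize(phi0, dx, dt, steps):
        phi = phi0.copy()
        s = np.sign(phi0)
        for _ in range(steps):
            dm = (phi - np.roll(phi, 1)) / dx        # backward difference
            dp = (np.roll(phi, -1) - phi) / dx       # forward difference
            # Godunov upwinding, one branch per side of the interface.
            gp = np.sqrt(np.maximum(np.maximum(dm, 0.0) ** 2,
                                    np.minimum(dp, 0.0) ** 2))
            gm = np.sqrt(np.maximum(np.minimum(dm, 0.0) ** 2,
                                    np.maximum(dp, 0.0) ** 2))
            grad = np.where(s > 0, gp, gm)
            phi = phi - dt * s * (1.0 - grad)
        return phi

    n = 200
    x = np.linspace(-1.0, 1.0, n)
    dx = x[1] - x[0]
    phi0 = 3.0 * (x ** 2 - 0.25)   # zeros at +/-0.5, |grad| far from 1
    phi = reinitialize(phi0, dx, dt=0.5 * dx, steps=300)
    ```

    After relaxation the gradient magnitude is close to 1 away from the kinks and the two zero crossings survive; the paper's variant replaces sign(phi0) with the sign of the curvature to sharpen thin filaments instead.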

  5. Demons versus Level-Set motion registration for coronary (18)F-sodium fluoride PET.

    PubMed

    Rubeaux, Mathieu; Joshi, Nikhil; Dweck, Marc R; Fletcher, Alison; Motwani, Manish; Thomson, Louise E; Germano, Guido; Dey, Damini; Berman, Daniel S; Newby, David E; Slomka, Piotr J

    2016-02-27

    Ruptured coronary atherosclerotic plaques commonly cause acute myocardial infarction. It has been recently shown that active microcalcification in the coronary arteries, one of the features that characterizes vulnerable plaques at risk of rupture, can be imaged using cardiac gated (18)F-sodium fluoride ((18)F-NaF) PET. We have shown in previous work that a motion correction technique applied to cardiac-gated (18)F-NaF PET images can enhance image quality and improve uptake estimates. In this study, we further investigated the applicability of different algorithms for registration of the coronary artery PET images. In particular, we aimed to compare demons vs. level-set nonlinear registration techniques applied for the correction of cardiac motion in coronary (18)F-NaF PET. To this end, fifteen patients underwent (18)F-NaF PET and prospective coronary CT angiography (CCTA). PET data were reconstructed in 10 ECG gated bins; subsequently these gated bins were registered using demons and level-set methods guided by the extracted coronary arteries from CCTA, to eliminate the effect of cardiac motion on PET images. Noise levels, target-to-background ratios (TBR) and global motion were compared to assess image quality. Compared to the reference standard of using only diastolic PET image (25% of the counts from PET acquisition), cardiac motion registration using either level-set or demons techniques almost halved image noise due to the use of counts from the full PET acquisition and increased TBR difference between (18)F-NaF positive and negative lesions. The demons method produces smoother deformation fields, exhibiting no singularities (which reflects how physically plausible the registration deformation is), as compared to the level-set method, which presents between 4 and 8% of singularities, depending on the coronary artery considered. In conclusion, the demons method produces smoother motion fields as compared to the level-set method, with a motion that is
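    The classic Thirion demons update, on which demons registration methods like the one compared here build, can be illustrated in 1D; this is a generic sketch with assumed parameters, not the authors' implementation.

    ```python
    import numpy as np

    # Demons update at each point: u += (m - f) * f' / (f'^2 + (m - f)^2),
    # with f the fixed image and m the warped moving image. Smoothing u each
    # iteration is what yields the smooth deformation fields the abstract
    # credits the demons method with.
    def gaussian_smooth(u, sigma):
        k = np.arange(-3 * int(sigma), 3 * int(sigma) + 1)
        w = np.exp(-0.5 * (k / sigma) ** 2)
        w /= w.sum()
        return np.convolve(u, w, mode="same")

    def demons_1d(fixed, moving, iters=100, sigma=4.0):
        x = np.arange(len(fixed), dtype=float)
        u = np.zeros_like(fixed)
        grad_f = np.gradient(fixed)
        for _ in range(iters):
            warped = np.interp(x + u, x, moving)
            diff = warped - fixed
            denom = grad_f ** 2 + diff ** 2
            step = np.where(denom > 1e-9, diff * grad_f / denom, 0.0)
            u = gaussian_smooth(u - step, sigma)
        return u

    x = np.arange(200, dtype=float)
    fixed = np.exp(-0.5 * ((x - 100) / 10) ** 2)
    moving = np.exp(-0.5 * ((x - 110) / 10) ** 2)   # shifted copy of fixed
    u = demons_1d(fixed, moving)
    warped = np.interp(x + u, x, moving)
    mse_before = float(np.mean((moving - fixed) ** 2))
    mse_after = float(np.mean((warped - fixed) ** 2))
    ```

    The Gaussian regularization of `u` is the design choice behind the singularity-free deformation fields reported for demons, in contrast to the level-set registration it is compared against.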

  8. Weld defect detection on digital radiographic image using level set method

    NASA Astrophysics Data System (ADS)

    Halim, Suhaila Abd; Petrus, Bertha Trissan; Ibrahim, Arsmah; Manurung, Yupiter HP; Jayes, Mohd Idris

    2013-09-01

    Segmentation is the most critical task and is widely used to obtain useful information in image processing. In this study, a level set method based on the Chan-Vese model is explored and applied to delineate weld defects on digital radiographic images, and its accuracy is evaluated to measure its performance. A set of images with a region of interest (ROI) containing a defect is used as input. The ROI images are pre-processed to improve their quality for better detection. Then, each image is segmented using the level set method, implemented in MATLAB R2009a. The accuracy of the method is evaluated using Receiver Operating Characteristic (ROC) analysis. Experimental results show that the method achieved an area under the ROC curve of 0.7 on the image set, and the operating point reached corresponds to a sensitivity of 0.6 and a specificity of 0.8. Segmentation techniques such as the Chan-Vese level set are thus able to assist radiographers in accurately detecting defects on digital radiographic images.
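The Chan-Vese model referenced above evolves a level set so that the two regions it separates are each well explained by their mean intensity. A minimal sketch of the region-competition idea in Python (illustrative only: the curvature regularization of the full model is omitted, and this is not the paper's MATLAB implementation):

```python
import numpy as np

def chan_vese_simplified(img, n_iter=50):
    """Toy piecewise-constant (Chan-Vese style) segmentation.
    The curvature (length) penalty of the full model is omitted, so this
    shows only the region-competition part of the energy."""
    h, w = img.shape
    yy, xx = np.mgrid[:h, :w]
    # initialize the level set as a signed distance to a centered circle
    phi = np.hypot(yy - h / 2.0, xx - w / 2.0) - min(h, w) / 4.0
    for _ in range(n_iter):
        inside = phi < 0
        c1 = img[inside].mean() if inside.any() else 0.0      # mean inside
        c2 = img[~inside].mean() if (~inside).any() else 0.0  # mean outside
        # region force: pixels better explained by c1 push phi negative
        force = (img - c1) ** 2 - (img - c2) ** 2
        phi += force / (np.abs(force).max() + 1e-12)  # normalized step
    return phi < 0  # boolean mask of the segmented region

# synthetic radiograph: a bright "defect" on a dark background
img = np.zeros((64, 64))
img[20:40, 20:40] = 1.0
mask = chan_vese_simplified(img)
```

On this synthetic image the mask converges to the bright square; the full model's length penalty additionally smooths the contour, which matters on noisy radiographs.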

  9. Bounding probabilistic sea-level projections within the framework of the possibility theory

    NASA Astrophysics Data System (ADS)

    Le Cozannet, Gonéri; Manceau, Jean-Charles; Rohmer, Jeremy

    2017-01-01

    Despite progress in climate change science, projections of future sea-level rise remain highly uncertain, especially due to large unknowns in the melting processes affecting the ice sheets in Greenland and Antarctica. Based on climate-model outcomes and the expertise of scientists concerned with these issues, the IPCC provided constraints on the quantiles of sea-level projections. Moreover, additional physical limits to future sea-level rise have been established, although only approximately. However, many probability functions can comply with this imprecise knowledge. In this contribution, we provide a framework based on extra-probabilistic theories (namely, possibility theory) to model the uncertainties in sea-level rise projections by 2100 under the RCP 8.5 scenario. The results provide a concise representation of the uncertainties in future sea-level rise and of their intrinsically imprecise nature, including a maximum bound of the total uncertainty. Today, coastal impact studies are increasingly moving away from deterministic sea-level projections, which underestimate expected damages and adaptation needs compared to probabilistic laws. However, we show that the probability functions used so far have explored only a rather conservative subset of the sea-level projections compliant with the IPCC. As a consequence, coastal impact studies relying on these probabilistic sea-level projections are expected to underestimate the possibility of large damages and adaptation needs.

  10. Analysis of Forensic Autopsy in 120 Cases of Medical Disputes Among Different Levels of Institutional Settings.

    PubMed

    Yu, Lin-Sheng; Ye, Guang-Hua; Fan, Yan-Yan; Li, Xing-Biao; Feng, Xiang-Ping; Han, Jun-Ge; Lin, Ke-Zhi; Deng, Miao-Wu; Li, Feng

    2015-09-01

    Despite advances in medical science, the cause of death can sometimes be determined only by a pathologist after a complete autopsy. Few studies have investigated the importance of forensic autopsy in medically disputed cases across different levels of institutional settings. Our study aimed to analyze forensic autopsies in 120 cases of medical disputes among five levels of institutional settings between 2001 and 2012 in Wenzhou, China. The results showed an overall concordance rate of 55%. Of the 39% of clinically missed diagnoses, cardiovascular pathology comprised 55.32%, while respiratory pathology accounted for the remaining 44.68%. Factors that increased the likelihood of missed diagnoses were private clinics, community settings, and county hospitals. These results support the view that autopsy remains an important tool for establishing the cause of death in medically disputed cases, which may directly determine or exclude the fault of medical care and thereby help resolve these cases.

  11. An investigation of children's levels of inquiry in an informal science setting

    NASA Astrophysics Data System (ADS)

    Clark-Thomas, Beth Anne

    Elementary school students' understanding of both science content and processes is enhanced by the higher-level thinking associated with inquiry-based science investigations. Informal science setting personnel, elementary school teachers, and curriculum specialists charged with designing inquiry-based investigations would be well served by an understanding of the varying influence of certain factors upon students' willingness and ability to delve into such higher-level inquiries. This study examined young children's use of inquiry-based materials and factors which may influence the level of inquiry they engaged in during informal science activities. An informal science setting was selected as the context for the examination of student inquiry behaviors because of the rich inquiry-based environment present at the site and the benefits previously noted in the research regarding the impact of informal science settings upon the construction of knowledge in science. The study revealed several patterns of behavior among children when they are engaged in inquiry-based activities at informal science exhibits. These repeated behaviors varied in the children's apparently purposeful use of the materials at the exhibits. These levels of inquiry behavior were taxonomically defined as high/medium/low within this study using a researcher-developed tool. Furthermore, adult interventions, questions, or prompting were found to impact the level of inquiry engaged in by the children. This study revealed that higher levels of inquiry were preceded by task-directed and physical-feature prompts. Moreover, levels of inquiry behavior were halted, even lowered, when preceded by a prompt that focused on a science content or concept question. Results of this study have implications for the enhancement of inquiry-based science activities in elementary schools as well as in informal science settings. These findings have significance for all science educators.

  12. A fast level set method for synthetic aperture radar ocean image segmentation.

    PubMed

    Huang, Xiaoxia; Huang, Bo; Li, Hongga

    2009-01-01

    Segmentation of high-noise imagery such as Synthetic Aperture Radar (SAR) images is still one of the most challenging tasks in image processing. While the level set method, an approach based on the analysis of the motion of an interface, can be used to address this challenge, the cell-based iterations may make the process of image segmentation remarkably slow, especially for large images. For this reason, fast level set algorithms such as narrow band and fast marching have been developed. Building upon these, this paper presents an improved fast level set method for SAR ocean image segmentation. The method depends on both an intensity-driven speed and curvature flow, which together yield a stable and smooth boundary. Notably, it is optimized to track moving interfaces, keeping up with the point-wise boundary propagation using a single list and a fast upwind-scheme iteration. The list facilitates efficient insertion and deletion of pixels on the propagation front, while the local upwind scheme is used to update the motion of the front instead of solving partial differential equations. Experiments on extraction of surface slick features from ERS-2 SAR images substantiate the efficacy of the proposed fast level set method.
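The fast marching scheme mentioned above computes the arrival time T of a monotonically advancing front by always accepting the grid point with the smallest tentative time, in the spirit of Dijkstra's algorithm. A minimal first-order sketch for unit speed on a uniform grid (a generic textbook version, not the single-list variant proposed in the paper):

```python
import heapq
import numpy as np

def fast_marching(shape, seeds, speed=1.0):
    """First-order fast marching: solve |grad T| = 1/speed on a uniform
    grid (spacing 1) from the given seed points. Returns arrival times T."""
    T = np.full(shape, np.inf)
    heap = []
    for s in seeds:
        T[s] = 0.0
        heapq.heappush(heap, (0.0, s))
    accepted = np.zeros(shape, dtype=bool)
    while heap:
        t, (i, j) = heapq.heappop(heap)
        if accepted[i, j]:
            continue
        accepted[i, j] = True  # smallest tentative time is final
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < shape[0] and 0 <= nj < shape[1] and not accepted[ni, nj]:
                # upwind neighbor values in each axis direction
                tx = min(T[ni - 1, nj] if ni > 0 else np.inf,
                         T[ni + 1, nj] if ni < shape[0] - 1 else np.inf)
                ty = min(T[ni, nj - 1] if nj > 0 else np.inf,
                         T[ni, nj + 1] if nj < shape[1] - 1 else np.inf)
                a, b = sorted((tx, ty))
                h = 1.0 / speed
                # quadratic update (T-a)^2 + (T-b)^2 = h^2 when both
                # neighbors are usable, else the 1D fallback
                if np.isfinite(b) and b - a < h:
                    t_new = 0.5 * (a + b + np.sqrt(2 * h * h - (b - a) ** 2))
                else:
                    t_new = a + h
                if t_new < T[ni, nj]:
                    T[ni, nj] = t_new
                    heapq.heappush(heap, (t_new, (ni, nj)))
    return T

T = fast_marching((21, 21), seeds=[(10, 10)])
```

The zero level set at time t is the contour {T = t}, so the whole front evolution is recovered in a single sweep; this is what makes fast marching attractive for large SAR scenes.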

  13. Segmentation of ventricles in Alzheimer mr images using anisotropic diffusion filtering and level set method.

    PubMed

    Anandh, K R; Sujatha, C M; Ramakrishnan, S

    2014-01-01

    Ventricle enlargement is a useful structural biomarker for the diagnosis of Alzheimer's Disease (AD), a devastating neurodegenerative disorder that results in progressive dementia. Although AD results in a passive increase of ventricle volume, there is a large overlap between the volume measurements of AD and normal subjects. Hence, shape-based analysis of ventricle dilation is appropriate to detect the subtle morphological changes between these two groups. In this work, segmentation of the ventricles in Alzheimer MR images is performed using a level set method and anisotropic diffusion filtering. Images considered for this study are first preprocessed using filters. Anisotropic diffusion filtering is employed to extract the edge map; this filter performs region-specific smoothing using a diffusion coefficient that is a function of the image gradient. Filtered images are then subjected to a level set method which employs an improved diffusion rate equation for the level set evolution. Geometric features are extracted from the segmented ventricles. Results show that the diffusion filter can extract edge maps with sharp region boundaries, and that the modified level set method is able to capture the morphological changes in the ventricles. The observed morphological changes are distinct for normal and AD subjects (p < 0.0001), and the ventricles of AD subjects are noticeably enlarged compared to those of normal subjects. Features obtained from the segmented ventricles are also clearly distinct and demonstrate the differences in the AD subjects. As ventricle volume and its morphometry are significant biomarkers, this study appears to be clinically relevant.
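The abstract does not name the exact anisotropic diffusion scheme; the classic choice is Perona-Malik, in which the diffusion coefficient decays with the image gradient so that smoothing proceeds within regions but stops at edges. A minimal sketch under that assumption:

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=0.2, dt=0.2):
    """Perona-Malik diffusion: I_t = div(c(|grad I|) grad I) with
    c(g) = exp(-(g/kappa)^2), which suppresses smoothing across edges.
    Boundaries are periodic via np.roll, adequate for this illustration."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # differences toward each 4-neighbor
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, 1, axis=1) - u
        dw = np.roll(u, -1, axis=1) - u
        # edge-stopping conductances: near 1 in flat regions, near 0 at edges
        cn = np.exp(-(dn / kappa) ** 2)
        cs = np.exp(-(ds / kappa) ** 2)
        ce = np.exp(-(de / kappa) ** 2)
        cw = np.exp(-(dw / kappa) ** 2)
        u += dt * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

# noisy step image: noise is smoothed, the step (an "edge") survives
rng = np.random.default_rng(0)
base = np.zeros((32, 32))
base[:, 16:] = 1.0
noisy = base + 0.05 * rng.standard_normal(base.shape)
out = anisotropic_diffusion(noisy)
```

This edge-preserving behavior is what makes the filtered edge map usable as a stopping function for the subsequent level set evolution.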

  14. Total variation based edge enhancement for level set segmentation and asymmetry analysis in breast thermograms.

    PubMed

    Prabha, S; Anandh, K R; Sujatha, C M; Ramakrishnan, S

    2014-01-01

    In this work, an attempt has been made to perform asymmetry analysis in breast thermograms using a non-linear total variation diffusion filter and a reaction-diffusion based level set method. Breast images used in this study are obtained from the online database of the project PROENG. Initially the images are subjected to a total variation (TV) diffusion filter to generate the edge map. A reaction-diffusion based level set method is then employed to segment the breast tissues using the TV edge map as the stopping boundary function. Asymmetry analysis is performed on the segmented breast tissues using wavelet-based structural texture features. The results show that the nonlinear total variation based reaction-diffusion level set method can efficiently segment the breast tissues. This method yields higher correlation between the segmented output and the ground truth than the conventional level set method. Structural texture features extracted from the wavelet coefficients are found to be significant in demarcating normal and abnormal tissues. Hence, asymmetry analysis of breast tissues segmented using the total variation edge map can be used to efficiently identify pathological conditions in breast thermograms.

  15. 76 FR 9004 - Public Comment on Setting Achievement Levels in Writing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-16

    ... Document. Materials for Review and Comment Policymakers, teachers, researchers, State and local writing specialists, members of professional writing and teacher organizations, and members of the public are invited... Public Comment on Setting Achievement Levels in Writing AGENCY: U.S. Department of Education,...

  16. Energy-optimal path planning by stochastic dynamically orthogonal level-set optimization

    NASA Astrophysics Data System (ADS)

    Subramani, Deepak N.; Lermusiaux, Pierre F. J.

    2016-04-01

    A stochastic optimization methodology is formulated for computing energy-optimal paths from among time-optimal paths of autonomous vehicles navigating in a dynamic flow field. Based on partial differential equations, the methodology rigorously leverages the level-set equation that governs time-optimal reachability fronts for a given relative vehicle-speed function. To set up the energy optimization, the relative vehicle-speed and headings are considered to be stochastic and new stochastic Dynamically Orthogonal (DO) level-set equations are derived. Their solution provides the distribution of time-optimal reachability fronts and the corresponding distribution of time-optimal paths. An optimization is then performed on the vehicle's energy-time joint distribution to select the energy-optimal paths for each arrival time, among all stochastic time-optimal paths for that arrival time. Numerical schemes to solve the reduced stochastic DO level-set equations are obtained, and accuracy and efficiency considerations are discussed. These reduced equations are first shown to be efficient at solving the governing stochastic level-sets, in part by comparisons with direct Monte Carlo simulations. To validate the methodology and illustrate its accuracy, comparisons with semi-analytical energy-optimal path solutions are then completed. In particular, we consider the energy-optimal crossing of a canonical steady front and set up its semi-analytical solution using an energy-time nested nonlinear double-optimization scheme. We then showcase the inner workings and nuances of the energy-optimal path planning, considering different mission scenarios. Finally, we study and discuss results of energy-optimal missions in a wind-driven barotropic quasi-geostrophic double-gyre ocean circulation.
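For context, the time-optimal level-set equation underlying this line of work is a Hamilton-Jacobi equation of the form (symbols here are generic, not quoted from the paper: phi is the level-set function, F(t) the relative vehicle speed, and V(x, t) the dynamic flow field):

```latex
\frac{\partial \phi}{\partial t} + F(t)\,|\nabla \phi| + \mathbf{V}(\mathbf{x},t) \cdot \nabla \phi = 0
```

The zero level set of phi is the time-optimal reachability front; the stochastic DO equations referred to above evolve a reduced-order expansion of phi when F and the headings are uncertain.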

  17. Pull-push level sets: a new term to encode prior knowledge for the segmentation of teeth images

    NASA Astrophysics Data System (ADS)

    de Luis Garcia, Rodrigo; San Jose Estepar, Raul; Alberola-Lopez, Carlos

    2005-04-01

    This paper presents a novel level set method for contour detection in multiple-object scenarios, applied to the segmentation of teeth images. Teeth segmentation from 2D images of dental plaster cast models is a difficult problem because it is necessary to independently segment several objects that have very poorly defined borders between them. Current methods for contour detection which employ only image information cannot successfully segment such structures. It is therefore necessary to use prior knowledge about the problem domain; however, current approaches in the literature are limited to extracting shape information about individual objects, whereas the key factor in such a problem is the relative positions of the different objects composing the anatomical structure. We therefore propose a novel method for introducing such information into a level set framework. This results in a new energy term which can be interpreted as a regional term that takes into account the relative positions of the different objects, and consequently creates an attraction or repulsion force that favors a particular configuration. The proposed method is compared with balloon and GVF snakes, as well as with the Geodesic Active Regions model, showing accurate results.

  18. Dynamically reconfigurable framework for pixel-level visible light communication projector

    NASA Astrophysics Data System (ADS)

    Zhou, Leijie; Fukushima, Shogo; Naemura, Takeshi

    2014-03-01

    We have developed the Pixel-level Visible Light Communication (PVLC) projector based on the DLP (Digital Light Processing) system. The projector can embed invisible data pixel by pixel into a visible image to realize augmented reality applications. However, it cannot update either invisible or visible contents in real time. In order to solve the problem, we improve the projector so that a PC can dynamically control the system and enable us to achieve a high-frame-rate feature by resolution conversion. This paper proposes the system framework and the design method for the dynamically reconfigurable PVLC projector.

  19. A GPU Accelerated Discontinuous Galerkin Conservative Level Set Method for Simulating Atomization

    NASA Astrophysics Data System (ADS)

    Jibben, Zechariah J.

    This dissertation describes a process for interface capturing via an arbitrary-order, nearly quadrature-free, discontinuous Galerkin (DG) scheme for the conservative level set method (Olsson et al., 2005, 2008). The DG numerical method is utilized to solve both advection and reinitialization, and executed on a refined level set grid (Herrmann, 2008) for effective use of processing power. Computation is executed in parallel utilizing both CPU and GPU architectures to make the method feasible at high order. Finally, a sparse data structure is implemented to take full advantage of parallelism on the GPU, where performance relies on well-managed memory operations. With solution variables projected into a kth-order polynomial basis, a (k+1)-order convergence rate is found for both advection and reinitialization tests using the method of manufactured solutions. Other standard test cases, such as Zalesak's disk and the deformation of columns and spheres in periodic vortices, are also performed, showing several orders of magnitude improvement over traditional WENO level set methods. These tests also show the impact of reinitialization, which often increases shape and volume errors as a result of level set scalar trapping by normal vectors calculated from the local level set field. Accelerating advection via GPU hardware is found to provide a 30x speedup, comparing a 2.0 GHz Intel Xeon E5-2620 CPU in serial against an Nvidia Tesla K20 GPU, with speedup factors increasing with polynomial degree until shared memory is filled. A similar algorithm is implemented for reinitialization, which relies on heavier use of shared and global memory and as a result fills them more quickly, producing smaller speedups of 18x.

  20. A localized re-initialization equation for the conservative level set method

    NASA Astrophysics Data System (ADS)

    McCaslin, Jeremy O.; Desjardins, Olivier

    2014-04-01

    The conservative level set methodology for interface transport is modified to allow for localized level set re-initialization. This approach is suitable to applications in which there is a significant amount of spatial variability in level set transport. The steady-state solution of the modified re-initialization equation matches that of the original conservative level set provided an additional Eikonal equation is solved, which can be done efficiently through a fast marching method (FMM). Implemented within the context of the accurate conservative level set method (ACLS) (Desjardins et al., 2008, [6]), the FMM solution of this Eikonal equation comes at no additional cost. A metric for the appropriate amount of local re-initialization is proposed based on estimates of local flow deformation and numerical diffusion. The method is compared to standard global re-initialization for two test cases, yielding the expected results that minor differences are observed for Zalesak's disk, and improvements in both mass conservation and interface topology are seen for a drop deforming in a vortex. Finally, the method is applied to simulation of a viscously damped standing wave and a three-dimensional drop impacting on a shallow pool. Negligible differences are observed for the standing wave, as expected. For the last case, results suggest that spatially varying re-initialization provides a reduction in spurious interfacial corrugations, improvements in the prediction of radial growth of the splashing lamella, and a reduction in conservation errors, as well as a reduction in overall computational cost that comes from improved conditioning of the pressure Poisson equation due to the removal of spurious corrugations.
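For reference, the global conservative level set re-initialization that this work localizes takes the following standard form in the cited literature (psi in [0, 1] is the smeared Heaviside profile, n the interface normal, epsilon the interface thickness parameter; this is the textbook equation, not the paper's modified localized version):

```latex
\frac{\partial \psi}{\partial \tau}
+ \nabla \cdot \big( \psi (1 - \psi)\, \hat{\mathbf{n}} \big)
= \nabla \cdot \big( \varepsilon\, (\nabla \psi \cdot \hat{\mathbf{n}})\, \hat{\mathbf{n}} \big)
```

The compressive flux sharpens the profile while the diffusion along the normal keeps its thickness at O(epsilon); at steady state psi is a tanh profile across the interface, which is what makes localized re-initialization compatible with the global solution once the additional Eikonal equation is solved.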

  1. Online monitoring of oil film using electrical capacitance tomography and level set method

    SciTech Connect

    Xue, Q. Ma, M.; Sun, B. Y.; Cui, Z. Q.; Wang, H. X.

    2015-08-15

    In oil-air lubrication systems, electrical capacitance tomography (ECT) provides a promising way of monitoring the oil film in pipelines by reconstructing cross-sectional oil distributions in real time. However, for small-diameter pipes with a thin oil film, the film thickness is hard to observe visually since the oil-air interface is not obvious in the reconstructed images, and the artifacts present in the reconstructions seriously degrade the effectiveness of image segmentation techniques such as the level set method. Moreover, the level set method is too slow for online monitoring. To address these problems, a modified level set method is developed: a distance regularized level set evolution formulation is extended to image two-phase flow online using an ECT system; a narrowband image filter is defined to eliminate the influence of artifacts; and, exploiting the continuity of the oil distribution over time, the detected oil-air interface of one frame is used as the initial contour for the detection of the subsequent frame. The propagation from the initial contour to the boundary is thus greatly accelerated, making real-time tracking possible. To verify the feasibility of the proposed method, an oil-air lubrication facility with a 4 mm inner diameter pipe was measured in normal operation using an 8-electrode ECT system. Both simulation and experimental results indicate that the modified level set method is capable of accurately visualizing the oil-air interface online.
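The distance regularized level set evolution (DRLSE) formulation that this work extends is commonly written as follows (standard form from the DRLSE literature, with mu, lambda, alpha the regularization, length and area weights, g an edge indicator function, delta a smoothed Dirac delta, and d_p derived from the distance-regularization potential; the paper's narrowband modification is not shown):

```latex
\frac{\partial \phi}{\partial t}
= \mu\, \nabla \cdot \big( d_p(|\nabla \phi|)\, \nabla \phi \big)
+ \lambda\, \delta_{\epsilon}(\phi)\, \nabla \cdot \left( g\, \frac{\nabla \phi}{|\nabla \phi|} \right)
+ \alpha\, g\, \delta_{\epsilon}(\phi)
```

The first term keeps phi close to a signed distance function without explicit re-initialization, which is part of what makes frame-to-frame contour propagation cheap enough for online tracking.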

  2. Automatic Measurement of Thalamic Diameter in 2D Fetal Ultrasound Brain Images using Shape Prior Constrained Regularized Level Sets.

    PubMed

    Sridar, Pradeeba; Kumar, Ashnil; Li, Changyang; Woo, Joyce; Quinton, Ann; Benzie, Ron; Peek, Michael; Feng, Dagan; Ramarathnam, Krishna Kumar; Nanan, Ralph; Kim, Jinman

    2016-06-20

    We derived an automated algorithm for accurately measuring the thalamic diameter from 2D fetal ultrasound (US) brain images. The algorithm overcomes the inherent limitations of the US image modality: non-uniform density, missing boundaries, and strong speckle noise. We introduced a 'guitar' structure that represents the negative space surrounding the thalamic regions. The guitar acts as a landmark for deriving the widest points of the thalamus even when its boundaries are not identifiable. We augmented a generalized level-set framework with a shape prior and constraints derived from statistical shape models of the guitars; this framework was used to segment US images and measure the thalamic diameter. Our segmentation method achieved a higher mean Dice similarity coefficient and specificity, a lower Hausdorff distance, and reduced contour leakage when compared to other well-established methods. The automatic thalamic diameter measurement had an inter-observer variability of -0.56±2.29 millimeters compared to manual measurement by an expert sonographer. Our method was capable of automatically estimating the thalamic diameter, with measurement accuracy on par with clinical assessment. Our method can be used as part of computer-assisted screening tools that automatically measure the biometrics of the fetal thalamus; these biometrics are linked to neuro-developmental outcomes.

  3. Loosely coupled level sets for retinal layers and drusen segmentation in subjects with dry age-related macular degeneration

    NASA Astrophysics Data System (ADS)

    Novosel, Jelena; Wang, Ziyuan; de Jong, Henk; Vermeer, Koenraad A.; van Vliet, Lucas J.

    2016-03-01

    Optical coherence tomography (OCT) is used to produce high-resolution three-dimensional images of the retina, which permit the investigation of retinal irregularities. In dry age-related macular degeneration (AMD), a chronic eye disease that causes central vision loss, disruptions such as drusen and changes in retinal layer thicknesses occur which could be used as biomarkers for disease monitoring and diagnosis. Due to the topology disrupting pathology, existing segmentation methods often fail. Here, we present a solution for the segmentation of retinal layers in dry AMD subjects by extending our previously presented loosely coupled level sets framework which operates on attenuation coefficients. In eyes affected by AMD, Bruch's membrane becomes visible only below the drusen and our segmentation framework is adapted to delineate such a partially discernible interface. Furthermore, the initialization stage, which tentatively segments five interfaces, is modified to accommodate the appearance of drusen. This stage is based on Dijkstra's algorithm and combines prior knowledge on the shape of the interface, gradient and attenuation coefficient in the newly proposed cost function. This prior knowledge is incorporated by varying the weights for horizontal, diagonal and vertical edges. Finally, quantitative evaluation of the accuracy shows a good agreement between manual and automated segmentation.

  4. Streaming level set algorithm for 3D segmentation of confocal microscopy images.

    PubMed

    Gouaillard, Alexandre; Mosaliganti, Kishore; Gelas, Arnaud; Souhait, Lydie; Obholzer, Nikolaus; Megason, Sean

    2009-01-01

    We present a high-performance variant of the popular geodesic active contours, which are used for splitting cell clusters in microscopy images. Previously, we implemented a linear pipelined version that incorporates as many cues as possible into a suitable level-set speed function so that an evolving contour exactly segments a cell/nucleus blob. We use image gradients, distance maps, multiple-channel information and a shape model to drive the evolution. We also developed a dedicated seeding strategy that uses the spatial coherency of the data to generate an overcomplete set of seeds, along with a quality metric that is used to decide which seed should be used for a given cell. However, the computational performance of any level-set methodology is quite poor when applied to thousands of 3D datasets, each containing thousands of cells, as are common in confocal microscopy. In this work, we explore methods to stream the algorithm in shared-memory, multi-core environments. By partitioning the input and output using spatial data structures, we ensure the spatial coherency needed by our seeding algorithm and drastically improve speed without memory overhead. Our results show speed-ups up to a factor of six.

  5. A Multilevel Conceptual Framework to Understand the Role of Food Insecurity on Antiretroviral Therapy Adherence in Low-Resource Settings: From Theory to Practice.

    PubMed

    Masa, Rainier; Chowa, Gina

    2017-04-03

    The objective of this study was to describe a multilevel conceptual framework to understand the role of food insecurity on antiretroviral therapy adherence. The authors illustrated an example of how they used the multilevel framework to develop an intervention for poor people living with HIV in a rural and low-resource community. The framework incorporates intrapersonal, interpersonal, and structural-level theories of understanding and changing health behaviors. The framework recognizes the role of personal, social, and environmental factors on cognition and behavior, with particular attention to ways in which treatment adherence is enabled or prevented by structural conditions, such as food insecurity.

  6. Level-Set Minimization of Potential Controlled Hadwiger Valuations for Molecular Solvation

    PubMed Central

    Cheng, Li-Tien; Li, Bo; Wang, Zhongming

    2012-01-01

    A level-set method is developed for the numerical minimization of a class of Hadwiger valuations with a potential on a set of three-dimensional bodies. Such valuations are linear combinations of the volume, surface area, and surface integral of mean curvature. The potential increases rapidly as the body shrinks beyond a critical size. The combination of the Hadwiger valuation and the potential is the mean-field free-energy functional of the solvation of non-polar molecules in the recently developed variational implicit-solvent model. This functional of surfaces is minimized by level-set evolution in the steepest descent of the free energy. The normal velocity of this surface evolution consists of both the mean and Gaussian curvatures, and a lower-order, “forcing” term arising from the potential. The forward Euler method is used to discretize the time derivative with dynamic time stepping that satisfies a CFL condition. The normal velocity is decomposed into two parts. The first part consists of both the mean and Gaussian curvature terms; it is of parabolic type with parameter correction, and is discretized by central differencing. The second part contains all the lower-order terms; it is of hyperbolic type, and is discretized by an upwinding scheme. New techniques for the local level-set method and numerical integration are developed. Numerical tests demonstrate second-order convergence of the method. Examples of application to the modeling of molecular solvation are presented. PMID:22323839

  7. Geological repository for nuclear high level waste in France from feasibility to design within a legal framework

    SciTech Connect

    Voizard, Patrice; Mayer, Stefan; Ouzounian, Gerald

    2007-07-01

    Over the past 15 years, the French program on deep geologic disposal of high level and long-lived radioactive waste has benefited from a clear legal framework as the result of the December 30, 1991 French Waste Act. To fulfil its obligations stipulated in this law, ANDRA has submitted the 'Dossier 2005 Argile' (clay) and 'Dossier 2005 Granite' to the French Government. The first of those reports presents a concept for the underground disposal of nuclear waste at a specific clay site and focuses on a feasibility study. Knowledge of the host rock characteristics is based on the investigations carried out at the Meuse/Haute Marne Underground Research Laboratory. The repository concept addresses various issues, the most important of which relates to the large amount of waste, the clay host rock and the reversibility requirement. This phase has ended upon review and evaluation of the 'Dossier 2005' made by different organisations including the National Review Board, the National Safety Authority and the NEA International Review Team. By passing the 'new', June 28, 2006 Planning Act on the sustainable management of radioactive materials and waste, the French parliament has further defined a clear legal framework for future work. This June 28 Planning Act thus sets a schedule and defines the objectives for the next phase of repository design in requesting the submission of a construction authorization application by 2015. The law calls for the repository program to be in a position to commission disposal installations by 2025. (authors)

  8. An innovative hydrogeologic setting for disposal of low-level radioactive wastes

    NASA Astrophysics Data System (ADS)

    Legrand, Harry E.

    1989-05-01

    A natural unique hydrogeological setting favorable for safe and economical disposal of low-level radioactive wastes occurs in the flat hinterland of southeastern North Carolina. The uniqueness results partly from the absence of vertical and horizontal groundwater gradients, representing a nonflow, or null, zone. The null setting is localized to key horizons 30 to 75 feet below land surface and to areas where glauconitic sandy clays of the Peedee Formation lie under less than 25 feet of surficial sandy clays; the Peedee contains nearly stagnant brackish groundwater slightly below the proposed disposal zone. Issues to overcome include: (1) demonstrating better combined safety and economical features over conventional and prescribed settings, (2) dewatering the low-permeability disposal zone for the 20-year operational period, and (3) changing rules to allow disposal slightly below the zone in which the normal water table occurs. Favorable site characteristics of the key setting are: (1) no major aquifer to contaminate, (2) no surface streams or lakes to contaminate, (3) optimal ion exchange and sorptive capacity (clay and glauconite pellets), (4) no appreciable or distinctive vertical and horizontal gradients, (5) no elongated contaminated plume to develop, (6) no surface erosion, (7) a capable setting for injection of potential contaminated water into deep brackish water wells, if needed and allowed, (8) minimum problems of the “overfilled bathtub effect,” (9) no apparent long-term harmful environmental impact (normal water table would be restored after the 20-year period), (10) relatively inexpensive disposal (engineered barriers not needed and desired), (11) simple and relatively inexpensive monitoring, (12) large tracts of land likely available, and (13) sparse population. In spite of legal and political obstacles to shallow land burial, the null setting described is a capable hydrogeological host to contain low-level radioactive wastes. The setting may have

  9. A Framework for Lab Work Management in Mass Courses. Application to Low Level Input/Output without Hardware

    ERIC Educational Resources Information Center

    Rodriguez, Santiago; Zamorano, Juan; Rosales, Francisco; Dopico, Antonio Garcia; Pedraza, Jose Luis

    2007-01-01

    This paper describes a complete lab work management framework designed and developed in the authors' department to help teachers to manage the small projects that students are expected to complete as lab assignments during their graduate-level computer engineering studies. The paper focuses on an application example of the framework to a specific…

  10. A Bayesian framework for cell-level protein network analysis for multivariate proteomics image data

    NASA Astrophysics Data System (ADS)

    Kovacheva, Violet N.; Sirinukunwattana, Korsuk; Rajpoot, Nasir M.

    2014-03-01

    The recent development of multivariate imaging techniques, such as the Toponome Imaging System (TIS), has facilitated the analysis of the co-localisation of multiple proteins. This could hold the key to understanding complex phenomena such as protein-protein interaction in cancer. In this paper, we propose a Bayesian framework for cell-level network analysis allowing the identification of several protein pairs having significantly higher co-expression levels in cancerous tissue samples when compared to normal colon tissue. It involves segmenting the DAPI-labeled image into cells and determining the cell phenotypes according to their protein-protein dependence profile. The cells are phenotyped using Gaussian Bayesian hierarchical clustering (GBHC) after feature selection is performed. The phenotypes are then analysed using Difference in Sums of Weighted cO-dependence Profiles (DiSWOP), which detects differences in the co-expression patterns of protein pairs. We demonstrate that the pairs highlighted by the proposed framework have high concordance with recent results obtained using a different phenotyping method. This indicates that the results are independent of the clustering method used. In addition, the highlighted protein pairs are further analysed via protein interaction pathway databases and by considering the localization of high protein-protein dependence within individual samples. This suggests that the proposed approach could identify potentially functional protein complexes active in cancer progression and cell differentiation.

  11. A Framework for Spatial Assessment of Local Level Vulnerability and Adaptive Capacity to Extreme Heat

    NASA Astrophysics Data System (ADS)

    Wilhelmi, O.; Hayden, M.; Harlan, S.; Ruddell, D.; Komatsu, K.; England, B.; Uejio, C.

    2008-12-01

    Changing climate is predicted to increase the intensity and impacts of heat waves, prompting the need to develop preparedness and adaptation strategies that reduce societal vulnerability. Central to understanding societal vulnerability is adaptive capacity, the potential of a system or population to modify its features or behaviors so as to better cope with existing and anticipated stresses and fluctuations. Adaptive capacity influences adaptation, the actual adjustments made to cope with the impacts of current and future hazardous heat events. Understanding societal risks, vulnerabilities, and adaptive capacity to extreme heat events and climate change requires an interdisciplinary approach that includes information about weather and climate, the natural and built environment, social processes and characteristics, interactions with stakeholders, and an assessment of community vulnerability. This project presents an interdisciplinary framework and a case study that explore linkages between quantitative and qualitative data for a more comprehensive understanding of local-level vulnerability and adaptive capacity to extreme heat events in Phoenix, Arizona. In this talk, we present a methodological framework for conducting collaborative research on societal vulnerability and adaptive capacity at the local level that integrates household surveys into a quantitative spatial assessment of societal vulnerability. We highlight a collaborative partnership among researchers, community leaders, and public health officials. Linkages between the assessment of local adaptive capacity and the development of regional climate change adaptation strategies will be discussed.

  12. Systems Science and Obesity Policy: A Novel Framework for Analyzing and Rethinking Population-Level Planning

    PubMed Central

    Matteson, Carrie L.; Finegood, Diane T.

    2014-01-01

    Objectives. We demonstrate the use of a systems-based framework to assess solutions to complex health problems such as obesity. Methods. We coded 12 documents published between 2004 and 2013 aimed at influencing obesity planning for complex systems design (9 reports from US and Canadian governmental or health authorities, 1 Cochrane review, and 2 Institute of Medicine reports). We sorted data using the intervention-level framework (ILF), a novel solutions-oriented approach to complex problems. An in-depth comparison of 3 documents provides further insight into complexity and systems design in obesity policy. Results. The majority of strategies focused mainly on changing the determinants of energy imbalance (food intake and physical activity). ILF analysis brings to the surface actions aimed at higher levels of system function and points to a need for more innovative policy design. Conclusions. Although many policymakers acknowledge obesity as a complex problem, many strategies stem from the paradigm of individual choice and are limited in scope. The ILF provides a template to encourage natural systems thinking and more strategic policy design grounded in complexity science. PMID:24832406

  13. Probabilistic framework for assessing the ice sheet contribution to sea level change.

    PubMed

    Little, Christopher M; Urban, Nathan M; Oppenheimer, Michael

    2013-02-26

    Previous sea level rise (SLR) assessments have excluded the potential for dynamic ice loss over much of Greenland and Antarctica, and recently proposed "upper bounds" on Antarctica's 21st-century SLR contribution are derived principally from regions where present-day mass loss is concentrated (basin 15, or B15, drained largely by Pine Island, Thwaites, and Smith glaciers). Here, we present a probabilistic framework for assessing the ice sheet contribution to sea level change that explicitly accounts for mass balance uncertainty over an entire ice sheet. Applying this framework to Antarctica, we find that ongoing mass imbalances in non-B15 basins give an SLR contribution by 2100 that: (i) is comparable to projected changes in B15 discharge and Antarctica's surface mass balance, and (ii) varies widely depending on the subset of basins and observational dataset used in projections. Increases in discharge uncertainty, or decreases in the exceedance probability used to define an upper bound, increase the fractional contribution of non-B15 basins; even weak spatial correlations in future discharge growth rates markedly enhance this sensitivity. Although these projections rely on poorly constrained statistical parameters, they may be updated with observations and/or models at many spatial scales, facilitating a more comprehensive account of uncertainty that, if implemented, will improve future assessments.
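The abstract's observation that even weak spatial correlations in discharge growth rates markedly widen the aggregated projection can be illustrated with a small Monte Carlo sketch. All numbers below (basin count, rates in illustrative SLR-equivalent units, correlation) are assumptions for illustration, not values from the paper:

```python
import numpy as np

# Sample per-basin contributions jointly, once independently and once
# with weak positive spatial correlation, then compare the spread of
# the summed (ice-sheet-wide) contribution.
rng = np.random.default_rng(0)
n_basins, n_draws, mu, sigma, rho = 10, 20000, 0.05, 0.03, 0.3

cov_indep = np.eye(n_basins) * sigma ** 2
cov_corr = np.full((n_basins, n_basins), rho * sigma ** 2)
np.fill_diagonal(cov_corr, sigma ** 2)

total_indep = rng.multivariate_normal(np.full(n_basins, mu), cov_indep,
                                      n_draws).sum(axis=1)
total_corr = rng.multivariate_normal(np.full(n_basins, mu), cov_corr,
                                     n_draws).sum(axis=1)
# The correlated case has a markedly wider spread (heavier upper tail),
# since Var(sum) = n * sigma^2 * (1 + (n - 1) * rho).
```

With these numbers the correlated standard deviation is nearly twice the independent one, even though rho = 0.3 is modest, mirroring the sensitivity the authors describe.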

  14. Comparison of bladder segmentation using deep-learning convolutional neural network with and without level sets

    NASA Astrophysics Data System (ADS)

    Cha, Kenny H.; Hadjiiski, Lubomir M.; Samala, Ravi K.; Chan, Heang-Ping; Cohan, Richard H.; Caoili, Elaine M.

    2016-03-01

    We are developing a CAD system for detection of bladder cancer in CTU. In this study we investigated the application of a deep-learning convolutional neural network (DL-CNN) to the segmentation of the bladder, which is a challenging problem because of the strong boundary between the non-contrast and contrast-filled regions in the bladder. We trained a DL-CNN to estimate the likelihood of a pixel being inside the bladder using neighborhood information. The segmented bladder was obtained from thresholding and hole-filling of the likelihood map. We compared the segmentation performance of the DL-CNN alone and with additional cascaded 3D and 2D level sets to refine the segmentation, using 3D hand-segmented contours as the reference standard. The segmentation accuracy was evaluated by five performance measures: average volume intersection %, average % volume error, average absolute % error, average minimum distance, and average Jaccard index for a data set of 81 training and 92 test cases. For the training set, the DL-CNN with level sets achieved performance measures of 87.2+/-6.1%, 6.0+/-9.1%, 8.7+/-6.1%, 3.0+/-1.2 mm, and 81.9+/-7.6%, respectively, while the DL-CNN alone obtained values of 73.6+/-8.5%, 23.0+/-8.5%, 23.0+/-8.5%, 5.1+/-1.5 mm, and 71.5+/-9.2%, respectively. For the test set, the DL-CNN with level sets achieved performance measures of 81.9+/-12.1%, 10.2+/-16.2%, 14.0+/-13.0%, 3.6+/-2.0 mm, and 76.2+/-11.8%, respectively, while the DL-CNN alone obtained 68.7+/-12.0%, 27.2+/-13.7%, 27.4+/-13.6%, 5.7+/-2.2 mm, and 66.2+/-11.8%, respectively. The DL-CNN alone is effective in segmenting bladders but may not follow the details of the bladder wall. The combination of DL-CNN with level sets provides highly accurate bladder segmentation.
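The post-processing step described above (thresholding the likelihood map, then filling interior holes) can be sketched minimally as follows, assuming NumPy/SciPy and an illustrative 0.5 threshold (the actual threshold used in the paper is not stated here):

```python
import numpy as np
from scipy import ndimage

def segment_from_likelihood(likelihood, threshold=0.5):
    """Threshold a per-pixel likelihood map and fill interior holes,
    mirroring the post-processing of the DL-CNN output described in
    the abstract (threshold value is an assumption)."""
    mask = likelihood >= threshold
    return ndimage.binary_fill_holes(mask)

# Toy likelihood map: a block of high likelihood whose interior pixel
# the network scored poorly, as might happen inside the bladder.
lik = np.zeros((9, 9))
lik[2:7, 2:7] = 0.9      # region of high likelihood
lik[4, 4] = 0.1          # interior pixel the network missed
mask = segment_from_likelihood(lik)   # hole-filling recovers the pixel
```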

  15. Unsupervised segmentation of the prostate using MR images based on level set with a shape prior.

    PubMed

    Liu, Xin; Langer, D L; Haider, M A; Van der Kwast, T H; Evans, A J; Wernick, M N; Yetik, I S

    2009-01-01

    Prostate cancer is the second leading cause of cancer death in American men. Current prostate MRI can benefit from automated tumor localization to help guide biopsy, radiotherapy, and surgical planning. An important step in automated prostate cancer localization is the segmentation of the prostate. In this paper, we propose a fully automatic method for the segmentation of the prostate. We first apply a deformable ellipse model to find an ellipse that best fits the prostate shape. Then, this ellipse is used to initialize the level set and to constrain the level set evolution with a shape penalty term. Finally, post-processing methods are applied to refine the prostate boundaries. We apply the proposed method to real diffusion-weighted imaging (DWI) MRI data to test its performance. Our results show that accurate segmentation can be obtained with the proposed method compared to human readers.
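The abstract does not specify how the ellipse is fitted. One common moment-based way to fit an ellipse to a binary shape, which could serve as such an initialization, is sketched below; the helper name and the test shape are hypothetical:

```python
import numpy as np

def fit_ellipse_moments(mask):
    """Fit an ellipse to a binary mask via second-order image moments:
    centre = centroid, axes/orientation from the eigen-decomposition of
    the pixel-coordinate covariance. For a filled ellipse,
    Var(along an axis) = (semi-axis)^2 / 4, so semi-axis = 2*sqrt(var).
    """
    ys, xs = np.nonzero(mask)
    centre = (xs.mean(), ys.mean())
    cov = np.cov(np.vstack([xs, ys]))
    evals, evecs = np.linalg.eigh(cov)      # ascending eigenvalues
    semi_axes = 2.0 * np.sqrt(evals)        # (minor, major)
    return centre, semi_axes, evecs

# Synthetic filled ellipse: centre (30, 20), semi-axes 10 and 5.
yy, xx = np.mgrid[0:41, 0:61]
mask = ((xx - 30) / 10.0) ** 2 + ((yy - 20) / 5.0) ** 2 <= 1.0
centre, (b_est, a_est), _ = fit_ellipse_moments(mask)
```

The recovered centre and semi-axes closely match the generating parameters, up to discretization error.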

  16. A level set-based shape optimization method for periodic sound barriers composed of elastic scatterers

    NASA Astrophysics Data System (ADS)

    Hashimoto, Hiroshi; Kim, Min-Geun; Abe, Kazuhisa; Cho, Seonho

    2013-10-01

    This paper presents a level set-based topology optimization method for noise barriers formed from an assembly of scatterers. The scattering obstacles are modeled by elastic bodies arranged periodically along the wall. Due to the periodicity, the problem can be reduced to one in a unit cell. The interaction between the elastic scatterers and the acoustic field is described in the context of the level set analysis. The semi-infinite acoustic wave regions located on both sides of the barrier are represented by impedance matrices. The objective function is defined by the energy transmission passing through the barrier. The design sensitivity is evaluated analytically with the aid of adjoint equations. The dependency of the optimal profile on the stiffness of the scatterers and on the target frequency band is examined. The feasibility of the developed optimization method is demonstrated through numerical examples.

  17. A novel approach to segmentation and measurement of medical image using level set methods.

    PubMed

    Chen, Yao-Tien

    2017-02-17

    This study proposes a novel approach for segmentation and visualization, together with surface area and volume measurements, for brain medical image analysis. The proposed method comprises edge detection and Bayesian-based level set segmentation, surface and volume rendering, and surface area and volume measurements for 3D objects of interest (i.e., brain tumor, brain tissue, or the whole brain). Two extensions based on edge detection and the Bayesian level set are first used to segment 3D objects. Ray casting and a modified marching cubes algorithm are then adopted to facilitate volume and surface visualization of the medical image dataset. To provide physicians with more useful information for diagnosis, the surface area and volume of an examined 3D object are calculated by the techniques of linear algebra and surface integration. Experimental results are reported in terms of 3D object extraction, surface and volume rendering, and surface area and volume measurements for medical image analysis.

  18. Active contour segmentation using level set function with enhanced image from prior intensity.

    PubMed

    Kim, Sunhee; Kim, Youngjun; Lee, Deukhee; Park, Sehyung

    2015-01-01

    This paper presents a new active contour segmentation model using a level set function that can correctly capture both the strong and the weak boundaries of a target enclosed by bright and dark regions at the same time. We introduce an enhanced image obtained from prior information about the intensity of the target. The enhanced image emphasizes regions where pixels have intensities close to the prior intensity. This enables a desirable segmentation of an image with partially low contrast in which the target is surrounded by regions that are brighter or darker than the target. We define an edge indicator function on the original image, and local and regularization forces on the enhanced image. The edge indicator function and the two forces are incorporated to identify the strong and weak boundaries, respectively. We established an evolution equation of the contours in the level set formulation and experimented with several medical images to show the performance of the proposed method.
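The abstract does not give the exact formula for the enhanced image. A plausible Gaussian-weighted sketch that emphasizes intensities near the prior value might look like this (the functional form and `sigma` are assumptions, not taken from the paper):

```python
import numpy as np

def enhance(image, prior_intensity, sigma=0.1):
    """Map pixels whose intensity is close to the prior intensity of
    the target toward 1, and all others toward 0. This is one simple
    way to build an 'enhanced image' from a prior intensity."""
    return np.exp(-((image - prior_intensity) ** 2) / (2.0 * sigma ** 2))

# Toy image: target intensity 0.5 surrounded by darker (0.0) and
# brighter (1.0) regions, as in the scenario the paper addresses.
img = np.array([[0.0, 0.5, 1.0],
                [0.5, 0.5, 0.0]])
enh = enhance(img, prior_intensity=0.5)
# Pixels at the prior intensity map to 1.0; both darker and brighter
# neighbours are suppressed toward 0.
```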

  19. A novel breast ultrasound image segmentation algorithm based on neutrosophic similarity score and level set.

    PubMed

    Guo, Yanhui; Şengür, Abdulkadir; Tian, Jia-Wei

    2016-01-01

    Breast ultrasound (BUS) image segmentation is a challenging task due to speckle noise, the poor quality of ultrasound images, and the variable size and location of breast lesions. In this paper, we propose a new BUS image segmentation algorithm based on a neutrosophic similarity score (NSS) and the level set algorithm. First, the input BUS image is transformed into the NS domain via three membership subsets T, I, and F; then, a similarity score, NSS, is defined and employed to measure the degree of belonging to the true tumor region. Finally, the level set method is used to segment the tumor from the background tissue region in the NSS image. Experiments have been conducted on a variety of clinical BUS images, and several measurements are used to evaluate and compare the proposed method's performance. The experimental results demonstrate that the proposed method is able to segment the BUS images effectively and accurately.

  20. A Cartesian Adaptive Level Set Method for Two-Phase Flows

    NASA Technical Reports Server (NTRS)

    Ham, F.; Young, Y.-N.

    2003-01-01

    In the present contribution we develop a level set method based on local anisotropic Cartesian adaptation as described in Ham et al. (2002). Such an approach should allow for the smallest possible Cartesian grid capable of resolving a given flow. The remainder of the paper is organized as follows. In section 2 the level set formulation for free surface calculations is presented and its strengths and weaknesses relative to other free surface methods are reviewed. In section 3 the collocated numerical method is described. In section 4 the method is validated by solving the 2D and 3D drop oscillation problem. In section 5 we present results from more complex cases, including 3D drop breakup in an impulsively accelerated free stream and the 3D immiscible Rayleigh-Taylor instability. Conclusions are given in section 6.
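The core transport step of any level set formulation can be illustrated in one dimension. The following first-order upwind sketch (illustrative grid and speed, not the paper's adaptive collocated scheme) advects the zero level set, which marks the interface, at the prescribed speed:

```python
import numpy as np

# Solve phi_t + u * phi_x = 0 with a first-order upwind scheme (u > 0).
# The zero level set of the signed-distance function phi is the
# interface; pure advection moves it at speed u.
nx, u, dt, steps = 101, 1.0, 0.005, 40
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
phi = x - 0.3                       # interface initially at x = 0.3

for _ in range(steps):
    # Upwind (backward) difference, stable for u > 0 with u*dt/dx <= 1.
    phi[1:] = phi[1:] - u * dt / dx * (phi[1:] - phi[:-1])

# The interface has moved to x = 0.3 + u * steps * dt = 0.5.
interface = x[np.argmin(np.abs(phi))]
```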

  1. Automatic segmentation of Leishmania parasite in microscopic images using a modified CV level set method

    NASA Astrophysics Data System (ADS)

    Farahi, Maria; Rabbani, Hossein; Talebi, Ardeshir; Sarrafzadeh, Omid; Ensafi, Shahab

    2015-12-01

    Visceral Leishmaniasis is a parasitic disease that affects the liver, spleen, and bone marrow. According to a World Health Organization report, definitive diagnosis is possible only by direct observation of the Leishman body in microscopic images taken from bone marrow samples. We utilize morphological operations and the CV (Chan-Vese) level set method to segment Leishman bodies in digital color microscopic images captured from bone marrow samples. A linear contrast stretching method is used for image enhancement, and a morphological method is applied to determine the parasite regions and remove unwanted objects. Modified global and local CV level set methods are proposed for segmentation, and a shape-based stopping factor is used to accelerate the algorithm. Manual segmentation is taken as the ground truth to evaluate the proposed method. The method was tested on 28 samples and achieved a mean segmentation error of 10.90% for the global model and 9.76% for the local model.
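A bare-bones sketch of the piecewise-constant Chan-Vese data term (omitting the curvature regularization and the shape-based stopping factor the paper adds) behaves like alternating two-means: update the two region means, then reassign each pixel to the closer mean:

```python
import numpy as np

def chan_vese_pc(image, n_iter=20):
    """Minimal piecewise-constant Chan-Vese sketch (no curvature
    term): alternately update the region means c1, c2 and reassign
    each pixel to whichever mean fits it better. This captures only
    the data-fitting part of the CV model used in the paper."""
    inside = image >= image.mean()          # crude initialization
    for _ in range(n_iter):
        c1 = image[inside].mean()
        c2 = image[~inside].mean()
        inside = (image - c1) ** 2 <= (image - c2) ** 2
    return inside

# Synthetic image: a bright blob (stand-in for a Leishman body) on a
# darker background.
img = np.full((32, 32), 0.1)
img[8:24, 8:24] = 0.9
mask = chan_vese_pc(img)
```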

  2. Atlas-based segmentation of 3D cerebral structures with competitive level sets and fuzzy control.

    PubMed

    Ciofolo, Cybèle; Barillot, Christian

    2009-06-01

    We propose a novel approach for the simultaneous segmentation of multiple structures with competitive level sets driven by fuzzy control. To this end, several contours evolve simultaneously toward previously defined anatomical targets. A fuzzy decision system combines the a priori knowledge provided by an anatomical atlas with the intensity distribution of the image and the relative position of the contours. This combination automatically determines the directional term of the evolution equation of each level set. This leads to a local expansion or contraction of the contours, in order to match the boundaries of their respective targets. Two applications are presented: the segmentation of the brain hemispheres and the cerebellum, and the segmentation of deep internal structures. Experimental results on real magnetic resonance (MR) images are presented, quantitatively assessed and discussed.

  3. Therapeutic and diagnostic set for irradiation the cell lines in low level laser therapy

    NASA Astrophysics Data System (ADS)

    Gryko, Lukasz; Zajac, Andrzej; Gilewski, Marian; Szymanska, Justyna; Goralczyk, Krzysztof

    2014-05-01

    This paper presents an optoelectronic diagnostic set for standardizing biostimulation procedures performed on cell lines. The basic functional components of the therapeutic set are two digitally controlled illuminators. They are composed of sets of semiconductor emitters, medium-power laser diodes and high-power LEDs, emitting radiation across a wide spectral range from 600 nm to 1000 nm. The emitters are coupled to the applicator by fibre-optic and optical systems that provide uniform irradiation of the vessel holding the cell culture samples. An integrated spectrometer and optical power meter allow the energy and spectral parameters of the electromagnetic radiation to be controlled during the Low Level Light Therapy procedure. Dedicated power supplies and a digital control system allow each emitter to be powered independently. An active temperature stabilization system was developed to thermally tune the emitted spectral line for more efficient matching with the absorption spectra of biological acceptors. Using the set for controlled irradiation and for measuring the absorption spectrum of the biological medium, it is possible to objectively assess the impact of exposure parameters on the state of cells subjected to Low Level Light Therapy. This procedure allows the biological response of cell lines to be compared after irradiation with radiation of varying spectral and energy parameters. The research was carried out on vascular endothelial cell lines. Cell proliferation was examined after irradiation with LEDs at 645 nm, 680 nm, 740 nm, 780 nm, 830 nm, 870 nm, 890 nm, and 970 nm, and with lasers at 650 nm and 830 nm.

  4. Hydrological drivers of record-setting water level rise on Earth's largest lake system

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Bruxer, J.; Durnford, D.; Smith, J. P.; Clites, A. H.; Seglenieks, F.; Qian, S. S.; Hunter, T. S.; Fortin, V.

    2016-05-01

    Between January 2013 and December 2014, water levels on Lake Superior and Lake Michigan-Huron, the two largest lakes on Earth by surface area, rose at the highest rate ever recorded for a 2 year period beginning in January and ending in December of the following year. This historic event coincided with below-average air temperatures and extensive winter ice cover across the Great Lakes. It also brought an end to a 15 year period of persistently below-average water levels on Lakes Superior and Michigan-Huron that included several months of record-low water levels. To differentiate hydrological drivers behind the recent water level rise, we developed a Bayesian Markov chain Monte Carlo (MCMC) routine for inferring historical estimates of the major components of each lake's water budget. Our results indicate that, in 2013, the water level rise on Lake Superior was driven by increased spring runoff and over-lake precipitation. In 2014, reduced over-lake evaporation played a more significant role in Lake Superior's water level rise. The water level rise on Lake Michigan-Huron in 2013 was also due to above-average spring runoff and persistent over-lake precipitation, while in 2014, it was due to a rare combination of below-average evaporation, above-average runoff and precipitation, and very high inflow rates from Lake Superior through the St. Marys River. We expect, in future research, to apply our new framework across the other Laurentian Great Lakes, and to Earth's other large freshwater basins as well.

  5. Segmentation of the liver from abdominal MR images: a level-set approach

    NASA Astrophysics Data System (ADS)

    Abdalbari, Anwar; Huang, Xishi; Ren, Jing

    2015-03-01

    Using prior knowledge in the segmentation of abdominal MR images enables a more accurate and comprehensive interpretation of the organ to be segmented. Prior knowledge about abdominal structures such as the liver vessels can be employed to obtain an accurate segmentation of the liver, which in turn supports an accurate diagnosis or treatment plan. In this paper, a new method for segmenting the liver from abdominal MR images using the liver vessels as prior knowledge is proposed. The method employs the level set technique to segment the liver from abdominal MR images. The speed image used in the level set method is responsible for propagating the front and stopping its growth at boundaries. The poor contrast of the MR images between the liver and the surrounding organs (e.g., stomach, kidneys, and heart) causes the segmented liver to leak into those organs, leading to inaccurate or incorrect segmentation. For that reason, a second speed image is developed as an extra term in the level set to control the front propagation at weak edges with the help of the original speed image. The basic idea of the proposed approach is to use the second speed image as a boundary surface that is approximately orthogonal to the area of the leak. The aim of the new speed image is to slow down the level set propagation and prevent leaks in regions close to the liver boundary. The new speed image is a surface created by filling holes to reconstruct the liver surface. These holes are formed by the exit and entry of the liver vessels and are considered the main cause of the segmentation leak. Results show that the proposed method outperforms other methods in the literature.
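A common choice of gradient-based speed image, which the braking role of the paper's second speed image conceptually resembles, can be sketched as follows (the functional form and `alpha` are assumptions, not taken from the paper):

```python
import numpy as np

def edge_stopping_speed(image, alpha=1.0):
    """Sketch of a gradient-based speed image for level set
    propagation: F = 1 / (1 + alpha * |grad I|^2). The front moves
    fast in flat regions and slows near strong edges; a second speed
    image can play an analogous braking role near weak boundaries."""
    gy, gx = np.gradient(image)
    return 1.0 / (1.0 + alpha * (gx ** 2 + gy ** 2))

# Toy image with a sharp vertical edge between columns 3 and 4.
img = np.zeros((8, 8))
img[:, 4:] = 10.0
speed = edge_stopping_speed(img)
# speed is 1.0 in flat regions and drops close to 0 at the edge.
```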

  6. Stochastic level-set variational implicit-solvent approach to solute-solvent interfacial fluctuations.

    PubMed

    Zhou, Shenggao; Sun, Hui; Cheng, Li-Tien; Dzubiella, Joachim; Li, Bo; McCammon, J Andrew

    2016-08-07

    Recent years have seen the initial success of a variational implicit-solvent model (VISM), implemented with a robust level-set method, in efficiently capturing different hydration states and providing quantitatively good estimates of the solvation free energies of biomolecules. The level-set minimization of the VISM solvation free-energy functional over all possible solute-solvent interfaces or dielectric boundaries predicts an equilibrium biomolecular conformation that is often close to an initial guess. In this work, we develop a theory in the form of a Langevin geometrical flow to incorporate solute-solvent interfacial fluctuations into the VISM. Such fluctuations are crucial to biomolecular conformational changes and binding processes. We also develop a stochastic level-set method to numerically implement this theory. We describe the interfacial fluctuation through the "normal velocity" that is the solute-solvent interfacial force, derive the corresponding stochastic level-set equation in the sense of Stratonovich so that the surface representation is independent of the choice of implicit function, and develop numerical techniques for solving such an equation and processing the numerical data. We apply our computational method to study the dewetting transition in the system of two hydrophobic plates and in a hydrophobic cavity of the synthetic host molecule cucurbit[7]uril. Numerical simulations demonstrate that our approach can describe an underlying system jumping out of a local minimum of the free-energy functional and can capture dewetting transitions of hydrophobic systems. In the case of two hydrophobic plates, we find that the wavelength of interfacial fluctuations has a strong influence on the dewetting transition. In addition, we find that the estimated energy barrier of the dewetting transition scales quadratically with the inter-plate distance, agreeing well with existing molecular dynamics simulation studies.
Our work is a first step toward the inclusion of

  7. Stochastic level-set variational implicit-solvent approach to solute-solvent interfacial fluctuations

    NASA Astrophysics Data System (ADS)

    Zhou, Shenggao; Sun, Hui; Cheng, Li-Tien; Dzubiella, Joachim; Li, Bo; McCammon, J. Andrew

    2016-08-01

    Recent years have seen the initial success of a variational implicit-solvent model (VISM), implemented with a robust level-set method, in efficiently capturing different hydration states and providing quantitatively good estimates of the solvation free energies of biomolecules. The level-set minimization of the VISM solvation free-energy functional over all possible solute-solvent interfaces or dielectric boundaries predicts an equilibrium biomolecular conformation that is often close to an initial guess. In this work, we develop a theory in the form of a Langevin geometrical flow to incorporate solute-solvent interfacial fluctuations into the VISM. Such fluctuations are crucial to biomolecular conformational changes and binding processes. We also develop a stochastic level-set method to numerically implement this theory. We describe the interfacial fluctuation through the "normal velocity" that is the solute-solvent interfacial force, derive the corresponding stochastic level-set equation in the sense of Stratonovich so that the surface representation is independent of the choice of implicit function, and develop numerical techniques for solving such an equation and processing the numerical data. We apply our computational method to study the dewetting transition in the system of two hydrophobic plates and in a hydrophobic cavity of the synthetic host molecule cucurbit[7]uril. Numerical simulations demonstrate that our approach can describe an underlying system jumping out of a local minimum of the free-energy functional and can capture dewetting transitions of hydrophobic systems. In the case of two hydrophobic plates, we find that the wavelength of interfacial fluctuations has a strong influence on the dewetting transition. In addition, we find that the estimated energy barrier of the dewetting transition scales quadratically with the inter-plate distance, agreeing well with existing molecular dynamics simulation studies.
Our work is a first step toward the inclusion of

  8. Level Set Based Hippocampus Segmentation in MR Images with Improved Initialization Using Region Growing

    PubMed Central

    Zhou, Zhaozhong; Ding, Xiaokang; Deng, Xiaolei; Zou, Ling; Li, Bailin

    2017-01-01

    The hippocampus is known as one of the most important structures implicated in Alzheimer's disease and other neurological disorders. However, segmentation of the hippocampus from MR images is still a challenging task due to its small size, complex shape, low contrast, and discontinuous boundaries. For accurate and efficient detection of the hippocampus, a new image segmentation method based on adaptive region growing and a level set algorithm is proposed. First, adaptive region growing and morphological operations are performed in the target regions, and the output is used as the initial contour for the level set evolution. Then, an improved edge-based level set method utilizing global Gaussian distributions with different means and variances is developed to implement the accurate segmentation. Finally, a gradient descent method is adopted to minimize the energy functional. As demonstrated by the experimental results, the proposed method extracts hippocampus contours that are very close to the manual segmentations drawn by specialists. PMID:28191031
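The region-growing initialization can be sketched with a simple intensity-tolerance flood fill; the adaptive thresholding and morphological clean-up of the paper are omitted, and `tol` is an assumed parameter:

```python
import numpy as np
from collections import deque

def region_grow(image, seed, tol=0.2):
    """Minimal region-growing sketch: starting from a seed pixel,
    greedily absorb 4-connected neighbours whose intensity is within
    `tol` of the seed intensity. The resulting mask could serve as an
    initial contour for a subsequent level set evolution."""
    h, w = image.shape
    ref = image[seed]
    grown = np.zeros((h, w), dtype=bool)
    grown[seed] = True
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not grown[ny, nx]
                    and abs(image[ny, nx] - ref) <= tol):
                grown[ny, nx] = True
                queue.append((ny, nx))
    return grown

# Toy image: a darker 8x8 structure of interest on a bright background.
img = np.full((16, 16), 0.9)
img[4:12, 4:12] = 0.3
mask = region_grow(img, seed=(8, 8))   # grows to exactly the structure
```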

  9. Automatic Lumen Segmentation in Intravascular Optical Coherence Tomography Images Using Level Set

    PubMed Central

    Cheng, Kang; Qin, Xianjing; Yin, Qinye; Li, Jianan; Zhao, Wei

    2017-01-01

    Automatic lumen segmentation from intravascular optical coherence tomography (IVOCT) images is an important and fundamental task for the diagnosis and treatment of coronary artery disease. However, it is very challenging due to irregular lumina caused by unstable plaque and vessel bifurcations, guide wire shadows, and blood artifacts. To address these problems, this paper presents a novel automatic level set based segmentation algorithm that copes well with irregular lumina. Before applying the level set model, a narrow image smoothing filter is proposed to reduce the effect of artifacts while preventing level set leakage. Moreover, a divide-and-conquer strategy is proposed to deal with the guide wire shadow. With the proposed method, the influence of irregular lumina, guide wire shadows, and blood artifacts can be appreciably reduced. Finally, experimental results on 880 images from 5 different patients showed that the proposed method is robust and accurate, with an average DSC value of 98.1% ± 1.1%. PMID:28270857

  10. A level set method for cupping artifact correction in cone-beam CT

    SciTech Connect

    Xie, Shipeng; Li, Haibo; Ge, Qi; Li, Chunming

    2015-08-15

    Purpose: To reduce cupping artifacts and improve the contrast-to-noise ratio in cone-beam computed tomography (CBCT). Methods: A level set method is proposed to reduce cupping artifacts in the reconstructed image of CBCT. The authors derive a local intensity clustering property of the CBCT image and define a local clustering criterion function of the image intensities in a neighborhood of each point. This criterion function defines an energy in terms of the level set functions, which represent a segmentation result and the cupping artifacts. The cupping artifacts are estimated as a result of minimizing this energy. Results: The cupping artifacts in CBCT are reduced by an average of 90%. The results indicate that the level set-based algorithm is practical and effective for reducing the cupping artifacts and preserving the quality of the reconstructed image. Conclusions: The proposed method focuses on the reconstructed image without requiring any additional physical equipment, is easily implemented, and provides cupping correction through a single-scan acquisition. The experimental results demonstrate that the proposed method successfully reduces the cupping artifacts.

  11. Comparison between advected-field and level-set methods in the study of vesicle dynamics

    NASA Astrophysics Data System (ADS)

    Maitre, E.; Misbah, C.; Peyla, P.; Raoult, A.

    2012-07-01

    Phospholipidic membranes and vesicles constitute a basic element in real biological functions. Vesicles are viewed as a model system to mimic basic viscoelastic behaviors of some cells, like red blood cells. Phase field and level-set models are powerful tools to tackle the dynamics of membranes and their coupling to the flow. These two methods are somewhat similar, but to date no bridge has been made between them. Building that bridge is the first focus of this paper: we show how the phase-field methods developed in Biben and Misbah (2003) [7], Beaucourt (2004) [9], and Biben (2005) [33] for immersed vesicles can be regarded as a level-set method for a particular strain-stress relationship. The main conclusion is that the two methods share several common features, and we provide the correspondence between them. Furthermore, a constitutive viscoelastic law is derived for the composite fluid: the ambient fluid and the membranes. We present two different approaches to dealing with the local incompressibility of the membrane and point out their differences. Some numerical results obtained with the level-set approach are presented.

  12. Vascular Tree Segmentation in Medical Images Using Hessian-Based Multiscale Filtering and Level Set Method

    PubMed Central

    Jin, Jiaoying; Yang, Linjun; Zhang, Xuming

    2013-01-01

    Vascular segmentation plays an important role in medical image analysis. A novel technique for the automatic extraction of vascular trees from 2D medical images is presented, which combines Hessian-based multiscale filtering and a modified level set method. In the proposed algorithm, the morphological top-hat transformation is first applied to attenuate the background. Then Hessian-based multiscale filtering is used to enhance vascular structures, combining the Hessian matrix with Gaussian convolution to tune the filtering response to specific scales. Because Gaussian convolution tends to blur vessel boundaries, which makes scale selection inaccurate, an improved level set method is finally proposed to extract vascular structures by introducing an external constraint term, related to the standard deviation of the Gaussian function, into the traditional level set formulation. Our approach was tested on synthetic images with vascular-like structures and on 2D slices extracted from real 3D abdominal magnetic resonance angiography (MRA) images along the coronal plane. The segmentation rates for synthetic images are above 95%. The results for MRA images demonstrate that the proposed method can extract most of the vascular structures successfully and accurately. Therefore, the proposed method is effective for vascular tree extraction in medical images. PMID:24348738
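
    The Hessian-based multiscale filtering stage described above can be sketched in a few lines of NumPy/SciPy. This is a generic Frangi-style 2-D vesselness filter, not the authors' implementation; the scales and the parameters beta and c are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def vesselness_2d(image, sigmas=(1.0, 2.0, 4.0), beta=0.5, c=15.0):
    """Sketch of 2-D Frangi-style vesselness: at each scale, build the
    Gaussian-smoothed Hessian, take its eigenvalues, and score bright tubular
    structures; the multiscale response is the maximum over scales."""
    response = np.zeros_like(image, dtype=float)
    for sigma in sigmas:
        # Scale-normalized second derivatives (Hessian entries).
        Hxx = gaussian_filter(image, sigma, order=(0, 2)) * sigma**2
        Hyy = gaussian_filter(image, sigma, order=(2, 0)) * sigma**2
        Hxy = gaussian_filter(image, sigma, order=(1, 1)) * sigma**2
        # Eigenvalues of the symmetric 2x2 Hessian, sorted so |l1| <= |l2|.
        tmp = np.sqrt((Hxx - Hyy) ** 2 + 4 * Hxy**2)
        mu1, mu2 = (Hxx + Hyy + tmp) / 2, (Hxx + Hyy - tmp) / 2
        l1 = np.where(np.abs(mu1) <= np.abs(mu2), mu1, mu2)
        l2 = np.where(np.abs(mu1) > np.abs(mu2), mu1, mu2)
        Rb = np.abs(l1) / (np.abs(l2) + 1e-12)   # blob-vs-line measure
        S = np.sqrt(l1**2 + l2**2)               # second-order structureness
        v = np.exp(-Rb**2 / (2 * beta**2)) * (1 - np.exp(-S**2 / (2 * c**2)))
        v[l2 > 0] = 0.0                          # keep bright-on-dark ridges only
        response = np.maximum(response, v)
    return response

# Toy test image: one bright horizontal line on a dark background.
img = np.zeros((64, 64))
img[32, 8:56] = 100.0
v = vesselness_2d(img)
```

    A bright line gives one large negative Hessian eigenvalue across the vessel and one near-zero eigenvalue along it, so the ratio Rb and the structureness S push the response toward 1 on the vessel and toward 0 in flat background.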

  13. A Quadrature-Free Conservative Level Set RKDG for Simulating Atomization

    NASA Astrophysics Data System (ADS)

    Jibben, Zechariah; Herrmann, Marcus

    2012-11-01

    We present an arbitrary high-order, quadrature-free, Runge-Kutta discontinuous Galerkin (RKDG) method for the solution of the conservative level set equation (Olsson et al., 2007), used for capturing phase interfaces in atomizing multiphase flows. Special care is taken to maintain high-order accuracy in the reinitialization equation, using appropriate slope limiters when necessary and a shared basis across cell interfaces for the diffusive flux. For efficiency, we implement the method in the context of the dual narrow band overset mesh approach of the Refined Level Set Grid method (Herrmann, 2008). The accuracy, consistency, and convergence of the resulting method are demonstrated using the method of manufactured solutions (MMS) and several standard test cases, including Zalesak's disk and columns and spheres in prescribed deformation fields. Using MMS, we demonstrate (k+1)-order spatial convergence for k-th order orthonormal Legendre polynomial basis functions. We furthermore show several orders of magnitude improvement in shape and volume errors over traditional WENO-based distance-function level set methods, and (k-1)-order spatial convergence of interfacial curvature using direct neighbor cells only. Supported by Stanford's 2012 CTR Summer Program and NSF grant CBET-1054272.
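
    The conservative level set equation of Olsson et al. replaces the signed distance function with a smeared Heaviside profile, and its reinitialization balances a compressive flux against diffusion along the interface normal. The 1-D sketch below is our own illustration with invented parameters (not the RKDG scheme of the abstract): it relaxes an over-smeared profile to the equilibrium hyperbolic-tangent shape while conserving the integral of ψ.

```python
import numpy as np

# 1-D sketch of conservative level set reinitialization (after Olsson et al.):
#   d(psi)/dtau + d/dx[psi(1 - psi)] = eps * d2(psi)/dx2,   normal n = +1.
# Compression steepens the profile, diffusion smooths it; the balance is
# psi = 1/(1 + exp(-x/eps)) = 0.5*(1 + tanh(x/(2*eps))).
nx, eps = 200, 0.05
x = np.linspace(-1.0, 1.0, nx)
dx = x[1] - x[0]
dt = 0.2 * dx**2 / eps                     # explicit-Euler stable step
psi = np.clip((x + 0.3) / 0.6, 0.0, 1.0)   # a smeared, non-equilibrium profile
mass0 = psi.sum() * dx

for _ in range(6000):
    f = psi * (1.0 - psi)                  # compressive flux
    dfdx = np.gradient(f, dx)
    d2psi = np.gradient(np.gradient(psi, dx), dx)
    psi = psi + dt * (eps * d2psi - dfdx)
    psi[0], psi[-1] = 0.0, 1.0             # far-field phase values

mass = psi.sum() * dx
target = 1.0 / (1.0 + np.exp(-x / eps))    # equilibrium tanh/logistic profile
```

    The steady profile is exactly the shape the RKDG reinitialization must maintain at high order; the point of the conservative formulation is that the integral of ψ (the "mass") is preserved while the profile steepens.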

  14. Stabilized Conservative Level Set Method with Adaptive Wavelet-based Mesh Refinement

    NASA Astrophysics Data System (ADS)

    Shervani-Tabar, Navid; Vasilyev, Oleg V.

    2016-11-01

    This paper addresses one of the main challenges of the conservative level set method, namely the ill-conditioned behavior of the normal vector away from the interface. An alternative formulation for reconstruction of the interface is proposed. Unlike the commonly used methods, which rely on the unit normal vector, the Stabilized Conservative Level Set (SCLS) method uses a modified renormalization vector with diminishing magnitude away from the interface. With the new formulation, in the vicinity of the interface the reinitialization procedure utilizes compressive flux and diffusive terms only in the direction normal to the interface, thus preserving the conservative level set properties, while away from the interface the directional diffusion mechanism automatically switches to homogeneous diffusion. The proposed formulation is robust and general. It is especially well suited for use with adaptive mesh refinement (AMR) approaches due to the need for a finer resolution in the vicinity of the interface than in the rest of the domain. All of the results were obtained using the Adaptive Wavelet Collocation Method, a general AMR-type method which uses wavelet decomposition to adapt to steep gradients in the solution while retaining a predetermined order of accuracy.

  15. A Performance Comparison Between a Level Set Method and an Unsplit Volume of Fluid Method

    NASA Astrophysics Data System (ADS)

    Desjardins, Olivier; Chiodi, Robert; Owkes, Mark

    2016-11-01

    The simulation of high density ratio liquid-gas flows presents many numerical difficulties due to the necessity of tracking the interface and the discontinuities in physical properties associated with it. Two main categories of interface-tracking methods are level set methods and volume of fluid (VOF) methods. In particular, conservative level set methods track and transport the interface using a scalar field, with the interface profile represented by a hyperbolic tangent function of finite thickness. Volume of fluid methods, on the other hand, store the percentage of each fluid in the computational cells. Both methods offer distinct advantages; however, their strengths and weaknesses relative to each other have yet to be thoroughly investigated. This work compares the accuracy and computational efficiency of an accurate conservative level set method and an unsplit VOF method using canonical test cases, such as Zalesak's disk, the deformation of a circle, and the deformation of a sphere. Mass conservation and the ability to correctly predict instability in a more complex case, the air-blast atomization of a planar liquid layer, will also be presented.
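
    Zalesak's disk, mentioned among the canonical test cases, has a conventional setup that is easy to reproduce: a slotted disk in solid-body rotation, with interface methods judged by how well the slot survives one full revolution. The sketch below builds the classic 100×100 configuration; the parameter values are the conventional ones, not necessarily those used in this comparison.

```python
import numpy as np

# Classic Zalesak slotted-disk initial condition on a [0,100]^2 grid:
# a disk of radius 15 centered at (50, 75), with a slot 5 wide reaching up
# to y = 85, advected by solid-body rotation about the domain center.
# Exact advection returns the disk unchanged after one revolution; numerical
# schemes are judged by how much the slot is smeared or lost.
n = 100
y, x = np.mgrid[0:n, 0:n] + 0.5              # cell-center coordinates
disk = (x - 50.0) ** 2 + (y - 75.0) ** 2 <= 15.0**2
slot = (np.abs(x - 50.0) <= 2.5) & (y <= 85.0)
phi = (disk & ~slot).astype(float)           # volume-fraction-style indicator

# Solid-body rotation (period 628 steps in Zalesak's original setup).
omega = 2.0 * np.pi / 628.0
u = -omega * (y - 50.0)                      # x-velocity
v = omega * (x - 50.0)                       # y-velocity
area = phi.sum()                             # exact advection preserves this
```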

  16. On the geometry of two-dimensional slices of irregular level sets in turbulent flows

    SciTech Connect

    Catrakis, H.J.; Cook, A.W.; Dimotakis, P.E.; Patton, J.M.

    1998-03-20

    Isoscalar surfaces in turbulent flows are found to be more complex than (self-similar) fractals, both in the far field of liquid-phase turbulent jets and in a realization of Rayleigh-Taylor-instability flow. In particular, 2-D slices of scalar level sets exhibit a scale-dependent coverage dimension, D2(λ), that increases with scale, from unity at small scales to 2 at large scales. For the jet flow and Reynolds numbers investigated, the isoscalar-surface geometry is both scalar-threshold- and Re-dependent; the level-set (coverage) length decreases with increasing Re, indicating enhanced mixing with increasing Reynolds number; and the size distribution of closed regions is well described by lognormal statistics at small scales. A similar D2(λ) behavior is found for level-set data of 3-D density-interface behavior in recent direct numerical-simulation studies of Rayleigh-Taylor-instability flow. A comparison of (spatial) spectral and isoscalar coverage statistics is also discussed.
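
    The coverage dimension D2(λ) can be estimated by box counting: cover the 2-D slice of the level set with boxes of side λ, count the boxes N(λ) that intersect it, and form the local logarithmic slope D2(λ) = -d log N / d log λ. The sketch below is our own illustration for the simplest case, a smooth circle, where the slope should sit near 1 at all resolved scales; a turbulent isoscalar slice would instead show the scale-dependent rise toward 2 described in the abstract.

```python
import numpy as np

def coverage_counts(points, scales):
    """For each box size lam, count boxes containing at least one point."""
    counts = []
    for lam in scales:
        boxes = {(int(px // lam), int(py // lam)) for px, py in points}
        counts.append(len(boxes))
    return np.array(counts, dtype=float)

# A smooth closed level set: a unit circle sampled densely.
t = np.linspace(0.0, 2.0 * np.pi, 20000, endpoint=False)
pts = np.column_stack([np.cos(t), np.sin(t)])
scales = np.array([0.2, 0.1, 0.05, 0.025, 0.0125])
N = coverage_counts(pts, scales)

# Local coverage dimension between successive scale pairs:
# D2(lam) = -d log N / d log lam.
D2 = -np.diff(np.log(N)) / np.diff(np.log(scales))
```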

  17. A Real-Time Algorithm for the Approximation of Level-Set-Based Curve Evolution

    PubMed Central

    Shi, Yonggang; Karl, William Clem

    2010-01-01

    In this paper, we present a complete and practical algorithm for the approximation of level-set-based curve evolution suitable for real-time implementation. In particular, we propose a two-cycle algorithm to approximate level-set-based curve evolution without the need to solve partial differential equations (PDEs). Our algorithm is applicable to a broad class of evolution speeds that can be viewed as composed of a data-dependent term and a curve smoothness regularization term. We achieve curve evolution corresponding to such evolution speeds by separating the evolution process into two different cycles: one cycle for the data-dependent term and a second cycle for the smoothness regularization. The smoothing term is derived from a Gaussian filtering process. In both cycles, the evolution is realized through a simple element switching mechanism between two linked lists that implicitly represent the curve using an integer-valued level-set function. By careful construction, all the key evolution steps require only integer operations. A consequence is that we obtain significant computational speedups compared to exact PDE-based approaches, while obtaining excellent agreement with these methods for problems of practical engineering interest. In particular, the resulting algorithm is fast enough for use in real-time video processing applications, which we demonstrate through several image segmentation and video tracking experiments. PMID:18390371
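
    The element-switching idea is simple to illustrate. The sketch below is a heavily reduced version in the spirit of the algorithm, for the special case of a uniformly positive speed: the curve is represented implicitly by an integer mask, the set of exterior boundary pixels is rebuilt each pass, and "evolution" is just switching those pixels into the region, all in integer arithmetic. The real algorithm maintains two persistent linked lists (Lin and Lout), signed integer level-set values, a data-driven speed, and a separate smoothing cycle, none of which is reproduced here.

```python
import numpy as np

def evolve_outward(mask, iterations):
    """Simplified list-based curve evolution: with speed F > 0 everywhere,
    each iteration switches the exterior boundary pixels ('Lout') into the
    region. No PDE is solved and no floating-point arithmetic is needed."""
    mask = mask.copy()
    for _ in range(iterations):
        interior = mask == 1
        # Lout: exterior pixels with a 4-connected interior neighbor.
        nbr = np.zeros_like(interior)
        nbr[1:, :] |= interior[:-1, :]
        nbr[:-1, :] |= interior[1:, :]
        nbr[:, 1:] |= interior[:, :-1]
        nbr[:, :-1] |= interior[:, 1:]
        lout = nbr & ~interior
        mask[lout] = 1                  # switch_in step for F > 0
    return mask

seed = np.zeros((32, 32), dtype=int)
seed[16, 16] = 1                        # initial curve: a single pixel
grown = evolve_outward(seed, 5)         # diamond of radius 5 (4-connectivity)
```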

  18. Numerical Simulation of Dynamic Contact Angles and Contact Lines in Multiphase Flows using Level Set Method

    NASA Astrophysics Data System (ADS)

    Pendota, Premchand

    Many physical phenomena and industrial applications involve multiphase fluid flows, and hence it is of high importance to be able to simulate various aspects of these flows accurately. The Dynamic Contact Angles (DCA) and the contact lines at the wall boundaries are two such important aspects. In the past few decades, many mathematical models have been developed for predicting the contact angles of the interface with the wall boundary under various flow conditions. These models are used to incorporate the physics of DCA and contact line motion in numerical simulations using various interface capturing/tracking techniques. In the current thesis, a simple approach to incorporate static and dynamic contact angle boundary conditions using the level set method is developed and implemented in the multiphase CFD codes LIT (Level set Interface Tracking) (Herrmann (2008)) and NGA (flow solver) (Desjardins et al. (2008)). Various DCA models and associated boundary conditions are reviewed. In addition, numerical aspects such as the occurrence of a stress singularity at the contact lines and grid convergence of the macroscopic interface shape are dealt with in the context of the level set approach.

  19. Implementation of E.U. Water Framework Directive: source assessment of metallic substances at catchment levels.

    PubMed

    Chon, Ho-Sik; Ohandja, Dieudonne-Guy; Voulvoulis, Nikolaos

    2010-01-01

    The E.U. Water Framework Directive (WFD) aims to prevent deterioration of water quality and to phase out or reduce the concentrations of priority substances at catchment levels. It requires changes in water management from a local scale to a river basin scale, and establishes Environmental Quality Standards (EQS) as a guideline for the chemical status of receiving waters. Under the Directive, the standards and scope of investigation for water management are more stringent and broader than in the past, and this change also needs to be applied to restoring the level of metals in water bodies. The aim of this study was to identify anthropogenic emission sources of metallic substances at catchment levels. Potential sources providing substantial amounts of such substances to receiving waters include stormwater, industrial effluents, treated effluents, agricultural drainage, sediments, mining drainage and landfill leachates. Metallic substances have more emission sources than other dangerous substances at catchment levels. Source assessment for these substances therefore deserves greater attention in restoring their chemical status in the context of the WFD. To improve the quality of source assessment, research is needed on the role of societal and environmental parameters and on the contribution of each source to the chemical distribution in receiving waters.

  20. Stacking sequence and shape optimization of laminated composite plates via a level-set method

    NASA Astrophysics Data System (ADS)

    Allaire, G.; Delgado, G.

    2016-12-01

    We consider the optimal design of composite laminates by allowing a variable stacking sequence and in-plane shape of each ply. In order to optimize both variables we rely on a decomposition technique which aggregates the constraints into one unique constraint margin function. Thanks to this approach, an exactly equivalent bi-level optimization problem is established. This problem is made up of an inner level, the combinatorial optimization of the stacking sequence, and an outer level, the topology and geometry optimization of each ply. For the stacking sequence optimization we propose an outer approximation method which iteratively solves a set of mixed integer linear problems associated with the evaluation of the constraint margin function. For the topology optimization of each ply, we rely on the level set method for the description of the interfaces and on the Hadamard method of boundary variations by means of the computation of the shape gradient. Numerical experiments are performed on an aeronautic test case where the weight is minimized subject to different mechanical constraints, namely compliance, reserve factor and buckling load.

  1. Street Level Hydrology: An Urban Application of the WRF-Hydro Framework in Denver, Colorado

    NASA Astrophysics Data System (ADS)

    Read, L.; Hogue, T. S.; Salas, F. R.; Gochis, D.

    2015-12-01

    Urban flood modeling at the watershed scale carries unique challenges in routing complexity, data resolution, social and political issues, and land surface-infrastructure interactions. The ability to accurately trace and predict the flow of water through the urban landscape enables better emergency response management, floodplain mapping, and data for future urban infrastructure planning and development. These services are of growing importance as urban population is expected to continue increasing by 1.84% per year for the next 25 years, increasing the vulnerability of urban regions to damages and loss of life from floods. Although a range of watershed-scale models have been applied in specific urban areas to examine these issues, there is a trend towards national-scale hydrologic modeling enabled by supercomputing resources to understand larger system-wide hydrologic impacts and feedbacks. As such, it is important to address how urban landscapes can be represented in large-scale modeling processes. The current project investigates how coupling terrain and infrastructure routing can improve prediction of flow and flooding events over the urban landscape. We utilize the WRF-Hydro modeling framework and a high-resolution terrain routing grid with the goal of compiling the standard data needs for fine-scale urban modeling and dynamic flood forecasting in the urban setting. The city of Denver is selected as a case study, as it has experienced several large flooding events in the last five years and has an urban annual population growth rate of 1.5%, one of the highest in the U.S. Our work highlights the hydro-informatic challenges associated with linking channel networks and drainage infrastructure in an urban area using the WRF-Hydro modeling framework and high-resolution urban models for short-term flood prediction.

  2. Assessing the Macro-Level Correlates of Malware Infections Using a Routine Activities Framework.

    PubMed

    Holt, Thomas J; Burruss, George W; Bossler, Adam M

    2016-12-01

    The ability to gain unauthorized access to computer systems to engage in espionage and data theft poses a massive threat to individuals worldwide. There has been minimal focus, however, on the role of malicious software, or malware, which can automate this process. This study examined the macro-correlates of malware infection at the national level by using an open repository of known malware infections and utilizing a routine activities framework. Zero-inflated negative binomial count models indicated that nations with greater technological infrastructure, more political freedoms, and less organized-crime financial impact were more likely to report malware infections. The number of Computer Emergency Response Teams (CERTs) in a nation was not significantly related to reported malware infection. The implications of the study for the understanding of malware infection, routine activity theory, and target-hardening strategies are discussed.

  3. Providing conceptual framework support for distributed Web-based simulation within the high-level architecture

    NASA Astrophysics Data System (ADS)

    Page, Ernest H.; Griffin, Sean P.; Rother, S. L.

    1998-08-01

    Web-based simulation, a subject of increasing interest to both simulation researchers and practitioners, has the potential to significantly influence the application and availability of simulation as a problem-solving technique. Web technologies also portend cost-effective distributed modeling and simulation. These applications will require solutions to the systems interoperability problem similar to the DoD High Level Architecture (HLA). The suitability of the HLA to serve 'mainstream' simulation is examined. Approaches for incorporating discrete event simulation conceptual frameworks within the HLA are described, and ongoing research in this area is noted. Issues raised include a discussion of the appropriate roles for a simulation-support language and a simulation-support architecture.

  4. Conceptualizing and assessing heterosexism in high schools: a setting-level approach.

    PubMed

    Chesir-Teran, Daniel

    2003-06-01

    Heterosexism is defined as a setting-level process that systematically privileges heterosexuality relative to homosexuality, based on the assumption that heterosexuality, as well as heterosexual power and privilege, are the norm and the ideal. The many ways heterosexism is manifest in the physical-architectural, program-policy, suprapersonal, and social features of high schools are described, followed by a proposal for a comprehensive assessment strategy. Strategies used in previous research are reviewed in terms of what is assessed, how it is assessed, and how it is analyzed. The author advocates for more comprehensive assessments and for school-level analyses to enable comparisons between schools, facilitate research on the effects of heterosexism, and provide a basis for evaluating interventions. Additional issues include reliability and validity, links between heterosexism and other forms of oppression, heterosexism in other contexts or at other levels, and implications for theory and practice in community psychology.

  5. Improved inhalation technology for setting safe exposure levels for workplace chemicals

    NASA Technical Reports Server (NTRS)

    Stuart, Bruce O.

    1993-01-01

    Threshold Limit Values recommended as allowable air concentrations of a chemical in the workplace are often based upon a no-observable-effect-level (NOEL) determined by experimental inhalation studies using rodents. A 'safe level' for human exposure must then be estimated by the use of generalized safety factors in attempts to extrapolate from experimental rodents to man. The recent development of chemical-specific physiologically-based toxicokinetics makes use of measured physiological, biochemical, and metabolic parameters to construct a validated model that is able to 'scale-up' rodent response data to predict the behavior of the chemical in man. This procedure is made possible by recent advances in personal computer software and the emergence of appropriate biological data, and provides an analytical tool for much more reliable risk evaluation and airborne chemical exposure level setting for humans.

  6. Statistical criteria to set alarm levels for continuous measurements of ground contamination.

    PubMed

    Brandl, A; Jimenez, A D Herrera

    2008-08-01

    In the course of the decommissioning of the ASTRA research reactor at the site of the Austrian Research Centers at Seibersdorf, the operator and licensee, Nuclear Engineering Seibersdorf, conducted an extensive site survey and characterization to demonstrate compliance with regulatory site release criteria. This survey included radiological characterization of approximately 400,000 m² of open land on the Austrian Research Centers premises. Part of this survey was conducted using a mobile large-area gas proportional counter, continuously recording measurements while it was moved at a speed of 0.5 m/s. In order to set reasonable investigation levels, two alarm levels based on statistical considerations were developed. This paper describes the derivation of these alarm levels and the operational experience gained by detector deployment in the field.
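
    As a hedged illustration of what a statistically derived alarm level can look like (the numbers below are invented and the paper's actual derivation is not reproduced): treat the background as Poisson, fix a tolerable false-alarm probability per measurement interval, and read the alarm count off the Poisson quantile.

```python
from scipy.stats import poisson

# Illustrative alarm-level calculation (parameters are assumptions, not data
# from the survey): a counter records counts in fixed intervals while moving;
# with mean background mu per interval, set the alarm at the smallest count
# whose false-alarm probability per measurement is below alpha.
mu = 40.0          # assumed mean background counts per interval
alpha = 1e-4       # tolerated false-alarm probability per measurement
alarm_level = int(poisson.ppf(1.0 - alpha, mu)) + 1

# Achieved false-alarm probability: P(X >= alarm_level) under background.
p_false = 1.0 - poisson.cdf(alarm_level - 1, mu)
```

    With many measurements recorded per second along a survey line, alpha must be chosen small enough that the expected number of false alarms over the whole survey stays acceptable.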

  7. Differential optimal dopamine levels for set-shifting and working memory in Parkinson's disease.

    PubMed

    Fallon, Sean James; Smulders, Katrijn; Esselink, Rianne A; van de Warrenburg, Bart P; Bloem, Bastiaan R; Cools, Roshan

    2015-10-01

    Parkinson's disease (PD) is an important model for the role of dopamine in supporting human cognition. However, despite the uniformity of midbrain dopamine depletion, only some patients experience cognitive impairment. The neurocognitive mechanisms of this heterogeneity remain unclear. A genetic polymorphism in the catechol O-methyltransferase (COMT) enzyme, predominantly thought to exert its cognitive effect through acting on prefrontal cortex (PFC) dopamine transmission, provides us with an experimental window onto dopamine's role in cognitive performance in PD. In a large cohort of PD patients (n=372), we examined the association between COMT genotype and two tasks known to implicate prefrontal dopamine (spatial working memory and attentional set-shifting) and a task less sensitive to prefrontal dopamine (paired associates learning). Consistent with the known neuroanatomical locus of its effects, differences between the COMT genotype groups were observed on the dopamine-dependent tasks, but not on the paired associates learning task. However, COMT genotype had differential effects on the two prefrontal dopamine tasks. Putative prefrontal dopamine levels influenced spatial working memory in an 'Inverted-U'-shaped fashion, whereas a linear, dose-dependent pattern was observed for attentional set-shifting. Cumulatively, these results revise our understanding of when COMT genotype modulates cognitive functioning in PD patients by showing that the behavioural consequences of genetic variation vary according to task demands, presumably because set-shifting and working memory have different optimal dopamine levels.

  8. Study of burn scar extraction automatically based on level set method using remote sensing data.

    PubMed

    Liu, Yang; Dai, Qin; Liu, Jianbo; Liu, ShiBin; Yang, Jin

    2014-01-01

    Burn scar extraction using remote sensing data is an efficient way to precisely evaluate burn area and measure vegetation recovery. Traditional burn scar extraction methodologies perform poorly on burn scar images with blurred and irregular edges. To address these issues, this paper proposes an automatic burn scar extraction method based on the Level Set Method (LSM). The method exploits the different features available in remote sensing images and addresses the practical need to extract burn scars rapidly and automatically. The approach integrates Change Vector Analysis (CVA), the Normalized Difference Vegetation Index (NDVI) and the Normalized Burn Ratio (NBR) to obtain a difference image, and modifies the conventional Chan-Vese (C-V) level set model with a new initial curve, obtained from a binary image produced by applying the K-means method to the fitting errors of two near-infrared band images. Landsat 5 TM and Landsat 8 OLI data sets are used to validate the proposed method. Comparisons with the conventional C-V model, the Otsu algorithm, and the Fuzzy C-means (FCM) algorithm show that the proposed approach can extract the outline of a fire burn scar effectively and accurately. The method has higher extraction accuracy and lower algorithmic complexity than the conventional C-V model.
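
    The initialization idea, building a difference image and clustering it into two classes to seed the C-V evolution, can be sketched as follows. This is our own minimal illustration on synthetic data: it thresholds an NBR difference image with a two-cluster 1-D K-means, whereas the paper combines CVA, NDVI, and NBR and clusters fitting errors of two near-infrared band images.

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio from near-infrared and shortwave-infrared bands."""
    return (nir - swir) / (nir + swir + 1e-9)

def two_means_mask(values, iters=50):
    """1-D K-means with k=2; returns a boolean mask of the high cluster."""
    c_lo, c_hi = values.min(), values.max()
    for _ in range(iters):
        hi = np.abs(values - c_hi) < np.abs(values - c_lo)
        c_lo, c_hi = values[~hi].mean(), values[hi].mean()
    return hi

# Synthetic pre/post-fire scene: the burned patch loses NIR reflectance
# and gains SWIR reflectance (values invented for illustration).
rng = np.random.default_rng(0)
nir_pre = 0.5 + 0.02 * rng.standard_normal((64, 64))
swir_pre = 0.2 + 0.02 * rng.standard_normal((64, 64))
nir_post, swir_post = nir_pre.copy(), swir_pre.copy()
nir_post[20:40, 20:40] = 0.15
swir_post[20:40, 20:40] = 0.35

dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)  # high where burned
init_mask = two_means_mask(dnbr)   # initial curve for a C-V style evolution
```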

  9. Structural topology design of container ship based on knowledge-based engineering and level set method

    NASA Astrophysics Data System (ADS)

    Cui, Jin-ju; Wang, De-yu; Shi, Qi-qi

    2015-06-01

    Knowledge-Based Engineering (KBE) is introduced into ship structural design in this paper. Using the KBE implementation, design solutions are generated for both the Rules Design Method (RDM) and the Interpolation Design Method (IDM), along with the corresponding Finite Element (FE) models. Topological design of the longitudinal structures is studied, with a Gaussian Process (GP) employed to build the surrogate model for FE analysis. Pareto-front-based multi-objective optimization methods are used to reduce tank weight and outer surface area simultaneously. Additionally, an enhanced Level Set Method (LSM), which employs an implicit algorithm, is applied to the topological design of the typical bracket plate used extensively in ship structures. Two different sets of boundary conditions are considered. The proposed methods show satisfactory efficiency and accuracy.

  10. Conceptual framework for assessing the response of delta channel networks to Holocene sea level rise

    NASA Astrophysics Data System (ADS)

    Jerolmack, Douglas J.

    2009-08-01

    Recent research has identified two fundamental unit processes that build delta distributary channels. The first is mouth-bar deposition at the shoreline and subsequent channel bifurcation, which is driven by progradation of the shoreline; the second is avulsion to a new channel, a result of aggradation of the delta topset. The former creates relatively small, branching networks such as Wax Lake Delta; the latter generates relatively few, long distributaries such as the Mississippi and Atchafalaya channels on the Mississippi Delta. The relative rate of progradation to aggradation, and hence the creation of accommodation space, emerges as a controlling parameter on channel network form. Field and experimental research has identified sea level as the dominant control on Holocene delta growth worldwide, and has empirically linked channel network changes to changes in the rate of sea level rise. Here I outline a simple modeling framework for distributary network evolution, and use this to explore large-scale changes in Holocene channel pattern that have been observed in deltas such as the Rhine-Meuse and Mississippi. Rapid early- to mid-Holocene sea level rise forced many deltas into an aggradational mode, where I hypothesize that avulsion and the generation of large-scale branches should dominate. Slowing of sea level rise in the last ~6000 yr allowed partitioning of sediment into progradation, facilitating the growth of smaller-scale distributary trees at the shorelines of some deltas, and a reduction in the number of large-scale branches. Significant antecedent topography modulates delta response; the filling of large incised valleys, for example, caused many deltas to bypass the aggradational phase. Human effects on deltas can be cast in terms of geologic controls affecting accommodation: constriction of channels forces rapid local progradation and mouth-bar bifurcation, while accelerated sea level rise increases aggradation and induces more frequent channel avulsions.

  11. Large-Eddy Simulation of Premixed and Partially Premixed Turbulent Combustion Using a Level Set Method

    NASA Astrophysics Data System (ADS)

    Duchamp de Lageneste, Laurent; Pitsch, Heinz

    2001-11-01

    Level-set methods (G-equation) have recently been used in the context of RANS to model turbulent premixed (Hermann 2000) or partially premixed (Chen 1999) combustion. By directly taking into account unsteady effects, LES can be expected to improve predictions over RANS. Since the reaction zone thickness of premixed flames in technical devices is usually much smaller than the LES grid spacing, chemical reactions occur entirely on the sub-grid scales and hence have to be modeled entirely. In the level-set methodology, the flame front is represented by an arbitrary iso-surface G = G0 of a scalar field G whose evolution is described by the so-called G-equation. This equation is only valid at G = G0, and hence decoupled from other G levels. Heat release is then modeled using a flamelet approach in which temperature is determined as a function of G and the mixture fraction Z. In the present study, the proposed approach has been formulated for LES and validated using data from a turbulent Bunsen burner experiment (Chen, Peters 1996). Simulations of an experimental Lean Premixed Prevapourised (LPP) dump combustor (Besson, Bruel 1999, 2000) under different premixed or partially premixed conditions will also be presented.
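
    For reference, the G-equation used in this class of models is commonly written as follows (after Peters' level-set formulation; exact closure terms vary between authors and between the RANS and LES contexts):

```latex
\frac{\partial G}{\partial t} + \mathbf{u} \cdot \nabla G = s_T\,\lvert \nabla G \rvert,
\qquad \text{flame front} = \{\mathbf{x} : G(\mathbf{x},t) = G_0\},
```

    with u the (filtered) flow velocity and s_T the (subgrid) turbulent burning velocity. Because the equation carries physical meaning only on the iso-surface G = G0, heat release cannot be read off G directly and must come from a separate flamelet model, here parameterized by G and the mixture fraction Z.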

  12. Physical Therapy for Young Children Diagnosed with Autism Spectrum Disorders–Clinical Frameworks Model in an Israeli Setting

    PubMed Central

    Atun-Einy, Osnat; Lotan, Meir; Harel, Yael; Shavit, Efrat; Burstein, Shimshon; Kempner, Gali

    2013-01-01

    Recent research findings suggest that many children with Autism Spectrum Disorders (ASD) demonstrate delayed and atypical motor achievements. It has now become clear that a more holistic, integrative and multi-disciplinary intervention is required to effectively address the motor-related impairments of this population. It is also crucial to ensure that this group of clients has access to early physical therapy (PT) interventions. Despite accumulating research on physical interventions, little is known about intervention models for implementation at a national level. This report introduces a model that uniquely illustrates the implementation of PT services for a large number of children with ASD. The model has been operating for the past 2 years in one country (Israel), and includes an optional implementation model of PT practice settings for young children diagnosed with ASD. The Israeli setting offers a unique opportunity for implementing PT services for a multitude of children with ASD on a regular basis as an accepted/needed service. The initial outcomes of the present implementation suggest that an intensive PT intervention program might enhance therapeutic outcomes for this population, and contribute to our knowledge on the potential of PT for individuals with ASD. PMID:24400265

  13. Differential graded Lie algebras and singularities of level sets of momentum mappings

    NASA Astrophysics Data System (ADS)

    Goldman, William M.; Millson, John J.

    1990-08-01

    The germ of an analytic variety X at a point x ∈ X is said to be quadratic if it is bi-analytically isomorphic to the germ of a cone defined by a system of homogeneous quadratic equations at the origin. Arms, Marsden and Moncrief show in [2] that under certain conditions the analytic germ of a level set of a momentum mapping is quadratic. We discuss related ideas in a more algebraic context by associating to an affine Hamiltonian action a differential graded Lie algebra, which, in the presence of an invariant positive complex structure, is formal in the sense of [5].

  14. Computerized segmentation of liver in hepatic CT and MRI by means of level-set geodesic active contouring.

    PubMed

    Suzuki, Kenji; Huynh, Hieu Trung; Liu, Yipeng; Calabrese, Dominic; Zhou, Karen; Oto, Aytekin; Hori, Masatoshi

    2013-01-01

    Computerized liver volumetry has been studied, because the current "gold-standard" manual volumetry is subjective and very time-consuming. Liver volumetry is done in either CT or MRI. A number of researchers have developed computerized liver segmentation for CT, but there are far fewer studies for MRI. Our purpose in this study was to develop a general framework for liver segmentation in both CT and MRI. Our scheme consisted of 1) an anisotropic diffusion filter to reduce noise while preserving liver structures, 2) a scale-specific gradient magnitude filter to enhance liver boundaries, 3) a fast-marching algorithm to roughly determine liver boundaries, and 4) a geodesic-active-contour model coupled with a level-set algorithm to refine the initial boundaries. Our CT database contained hepatic CT scans of 18 liver donors obtained under a liver transplant protocol. Our MRI database contained scans of 23 patients acquired with 1.5T MRI scanners. To establish "gold-standard" liver volumes, radiologists manually traced the contour of the liver on each CT or MR slice. We compared our computer volumetry with "gold-standard" manual volumetry. Computer volumetry in CT and MRI reached excellent agreement with manual volumetry (intra-class correlation coefficient = 0.94 and 0.98, respectively). Average user time for computer volumetry in CT and MRI was 0.57 ± 0.06 and 1.0 ± 0.13 min. per case, respectively, whereas those for manual volumetry were 39.4 ± 5.5 and 24.0 ± 4.4 min. per case, respectively, with statistically significant difference (p < .05). Our computerized liver segmentation framework provides an efficient and accurate way of measuring liver volumes in both CT and MRI.
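
    Step 3 of the pipeline, fast marching, generalizes Dijkstra's algorithm from graphs to first-arrival times of a front. The sketch below is a generic first-order fast-marching solver for |∇T| = 1/speed on a uniform grid (plain Python; illustrative only, not the authors' implementation):

```python
import heapq

def fast_march(speed, h, seeds):
    """First-arrival times T solving |grad T| = 1/speed with upwind updates."""
    ny, nx = len(speed), len(speed[0])
    INF = float("inf")
    T = [[INF] * nx for _ in range(ny)]
    done = [[False] * nx for _ in range(ny)]
    heap = []
    for (j, i) in seeds:
        T[j][i] = 0.0
        heapq.heappush(heap, (0.0, j, i))
    while heap:
        t, j, i = heapq.heappop(heap)
        if done[j][i]:
            continue                       # stale heap entry
        done[j][i] = True
        for dj, di in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            jj, ii = j + dj, i + di
            if 0 <= jj < ny and 0 <= ii < nx and not done[jj][ii]:
                # smallest known neighbour along each axis (upwind direction)
                ty = min(T[jj - 1][ii] if jj > 0 else INF,
                         T[jj + 1][ii] if jj < ny - 1 else INF)
                tx = min(T[jj][ii - 1] if ii > 0 else INF,
                         T[jj][ii + 1] if ii < nx - 1 else INF)
                a, b = sorted((tx, ty))
                f = h / speed[jj][ii]
                if b - a >= f:             # one-sided update
                    t_new = a + f
                else:                      # two-sided quadratic update
                    t_new = 0.5 * (a + b + (2 * f * f - (b - a) ** 2) ** 0.5)
                if t_new < T[jj][ii]:
                    T[jj][ii] = t_new
                    heapq.heappush(heap, (t_new, jj, ii))
    return T

speed = [[1.0] * 5 for _ in range(5)]      # uniform unit speed
T = fast_march(speed, 1.0, [(0, 0)])       # front released from one corner
```

With unit speed the arrival times approximate Euclidean distance from the seed; in a segmentation setting the speed map would instead be low near strong boundaries so the front stalls there.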

  15. Assessing levels of adaptation during implementation of evidence-based interventions: introducing the Rogers-Rütten framework.

    PubMed

    Bowen, Shelly-Ann K; Saunders, Ruth P; Richter, Donna L; Hussey, Jim; Elder, Keith; Lindley, Lisa

    2010-12-01

    Most HIV-prevention funding agencies require the use of evidence-based behavioral interventions, tested and proven to be effective through outcome evaluation. Adaptation of programs during implementation is common and may be influenced by many factors, including agency mission, time constraints, and funding streams. There are few theoretical frameworks to understand how these organizational and program-related factors influence the level of adaptation. This study used constructs from both Rogers's diffusion theory and Rütten's framework for policy analysis to create a conceptual framework that identifies determinants hypothesized to affect the level of adaptation. Preliminary measures of these constructs were also developed. This framework and its measures assess organizational and program-related factors associated with adaptation and could serve as a model to assess implementation and adaptation in fields outside of HIV prevention.

  16. Vessel Segmentation and Blood Flow Simulation Using Level-Sets and Embedded Boundary Methods

    SciTech Connect

    Deschamps, T; Schwartz, P; Trebotich, D; Colella, P; Saloner, D; Malladi, R

    2004-12-09

    In this article we address the problem of blood flow simulation in realistic vascular objects. The anatomical surfaces are extracted by means of Level-Set methods that accurately model the complex and varying surfaces of pathological objects such as aneurysms and stenoses. The surfaces obtained are defined at the sub-pixel level where they intersect the Cartesian grid of the image domain. It is therefore straightforward to construct embedded boundary representations of these objects on the same grid, for which recent work has enabled discretization of the Navier-Stokes equations for incompressible fluids. While most classical techniques require construction of a structured mesh that approximates the surface in order to extrapolate a 3D finite-element gridding of the whole volume, our method directly simulates the blood-flow inside the extracted surface without losing any complicated details and without building additional grids.
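
    The key geometric step above — locating the surface at sub-pixel accuracy where the zero level set crosses grid edges — comes down to linear interpolation of the level-set values. A 1-D sketch (hypothetical names, plain Python):

```python
def zero_crossings(xs, phi):
    """Sub-grid locations where a sampled level-set function changes sign."""
    pts = []
    for i in range(len(phi) - 1):
        a, b = phi[i], phi[i + 1]
        if a == 0.0:
            pts.append(xs[i])                       # zero lands on a node
        elif a * b < 0.0:                           # sign change on this edge
            t = a / (a - b)                         # linear interpolation weight
            pts.append(xs[i] + t * (xs[i + 1] - xs[i]))
    return pts

# sample phi(x) = |x| - 0.5, whose zero set is x = -0.5 and x = +0.5
xs = [-1.0 + 0.01 * i for i in range(201)]
phi = [abs(x) - 0.5 for x in xs]
crossings = zero_crossings(xs, phi)                 # two points near ±0.5
```

In 2D/3D the same per-edge interpolation yields the cut positions that define the embedded-boundary cells on the original Cartesian grid.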

  17. Level set-based core segmentation of mammographic masses facilitating three stage (core, periphery, spiculation) analysis.

    PubMed

    Ball, John E; Bruce, Lori Mann

    2007-01-01

    We present mammographic mass core segmentation, based on the Chan-Vese level set method. The proposed method is analyzed via resulting feature efficacies. Additionally, the core segmentation method is used to investigate the idea of a three stage segmentation approach, i.e. segment the mass core, periphery, and spiculations (if any exist) and use features from these three segmentations to classify the mass as either benign or malignant. The proposed core segmentation method and a proposed end-to-end computer aided detection (CAD) system using a three stage segmentation are implemented and experimentally tested with a set of 60 mammographic images from the Digital Database of Screening Mammography. Receiver operating characteristic (ROC) curve AZ values for morphological and texture features extracted from the core segmentation are shown to be on par, or better, than those extracted from a periphery segmentation. The efficacy of the core segmentation features when combined with the periphery and spiculation segmentation features are shown to be feature set dependent. The proposed end-to-end system uses stepwise linear discriminant analysis for feature selection and a maximum likelihood classifier. Using all three stages (core + periphery + spiculations) results in an overall accuracy (OA) of 90% with 2 false negatives (FN). Since many CAD systems only perform a periphery analysis, adding core features could be a benefit to potentially increase OA and reduce FN cases.
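
    The Chan-Vese model referenced above drives the level set by how well each pixel fits the mean intensity inside (c1) versus outside (c2) the contour. A minimal sketch of that fitting term (plain Python; the curvature/length regularization and all mammography-specific details are omitted, and every name and value is illustrative):

```python
def chan_vese_step(img, phi, dt=0.5):
    """One gradient step of the (regularization-free) Chan-Vese fitting term."""
    ny, nx = len(img), len(img[0])
    inside  = [img[i][j] for i in range(ny) for j in range(nx) if phi[i][j] > 0]
    outside = [img[i][j] for i in range(ny) for j in range(nx) if phi[i][j] <= 0]
    c1 = sum(inside) / max(len(inside), 1)    # mean intensity inside the contour
    c2 = sum(outside) / max(len(outside), 1)  # mean intensity outside
    # pixels closer to c1 than to c2 push phi up (into the region), and vice versa
    return [[phi[i][j] + dt * ((img[i][j] - c2) ** 2 - (img[i][j] - c1) ** 2)
             for j in range(nx)] for i in range(ny)]

# a bright 4x4 "core" in an 8x8 image; start from an oversized initial contour
img = [[1.0 if 2 <= i <= 5 and 2 <= j <= 5 else 0.0 for j in range(8)]
       for i in range(8)]
phi = [[1.0 if 1 <= i <= 6 and 1 <= j <= 6 else -1.0 for j in range(8)]
       for i in range(8)]
for _ in range(50):
    phi = chan_vese_step(img, phi)
segmented = [[1 if phi[i][j] > 0 else 0 for j in range(8)] for i in range(8)]
```

After a few dozen iterations the sign of phi separates the bright core from the background; the full model adds a length penalty that smooths the recovered boundary.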

  18. Modelling of two-phase flow in a minichannel using level-set method

    NASA Astrophysics Data System (ADS)

    Grzybowski, H.; Mosdorf, R.

    2014-08-01

    Today there is great interest in micro-scale multiphase fluid flow. In this paper, the numerical simulation of two-phase flow inside a 3 mm minichannel was carried out. The liquid-gas interface was captured using the level-set method. During the calculation, stabilization and reinitialization of the level-set function were performed in order to obtain the proper accuracy of the simulation. The incompressible Navier-Stokes equations were solved using COMSOL Multiphysics® on a two-dimensional mesh. The process of formation of different two-phase flow patterns in the minichannel has been investigated. Three flow patterns were analysed in the simulation: bubbly flow and two kinds of slug flow, with short and long slugs. It has been shown that unsteady flow at the inlet of the minichannel is responsible for the chaotic character of the changes in slug and bubble sizes. Such unsteady flow modifies the distance between the bubbles and slugs. It has also been shown that for low water inlet velocity the two-phase flow pattern becomes more stable.
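
    The reinitialization mentioned above restores the signed-distance property |∇φ| = 1 without moving the zero level. A 1-D sketch using the standard pseudo-time equation φ_τ = sign(φ₀)(1 − |∇φ|) with first-order Godunov upwinding (plain Python; illustrative, not the COMSOL implementation):

```python
# Start from a field with the right sign pattern but badly wrong slope,
# and relax it back to a signed distance function.
N = 101
dx = 1.0 / (N - 1)
x = [i * dx for i in range(N)]
phi = [(xi - 0.5) ** 3 for xi in x]        # distorted level-set function
sgn = [(p > 0) - (p < 0) for p in phi]     # frozen sign of the initial field

dtau = 0.5 * dx                            # CFL-limited pseudo-time step
for _ in range(400):
    new = phi[:]
    for i in range(1, N - 1):
        a = (phi[i] - phi[i - 1]) / dx     # backward difference
        b = (phi[i + 1] - phi[i]) / dx     # forward difference
        if sgn[i] > 0:                     # Godunov upwind gradient magnitude
            g = max(max(a, 0.0) ** 2, min(b, 0.0) ** 2) ** 0.5
        elif sgn[i] < 0:
            g = max(min(a, 0.0) ** 2, max(b, 0.0) ** 2) ** 0.5
        else:
            continue                       # the interface node never moves
        new[i] = phi[i] + dtau * sgn[i] * (1.0 - g)
    new[0], new[-1] = 2 * new[1] - new[2], 2 * new[-2] - new[-3]
    phi = new

# interior error against the exact signed distance x - 0.5
err = max(abs(phi[i] - (x[i] - 0.5)) for i in range(5, N - 5))
```

Information propagates outward from the interface at unit speed, so the pseudo-time integration only needs to run long enough for the correction to reach the farthest point of interest.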

  19. Level set immersed boundary method for gas-liquid-solid interactions

    NASA Astrophysics Data System (ADS)

    Wang, Shizhao; Balaras, Elias

    2015-11-01

    We will discuss an approach to simulate the interaction between free surface flows and deformable structures. In our formulation the Navier-Stokes equations are solved on a block-structured grid with adaptive mesh refinement, and the pressure jumps across the interface between different phases, which is tracked using a level set approach, are sharply defined. Deformable structures are simulated with a solid mechanics solver utilizing a finite element method. The overall approach is tailored to problems with large displacements/deformations. The boundary conditions on a solid body are imposed using a direct forcing, immersed boundary method (Vanella & Balaras, J. Comput. Physics, 228(18), 6617-6628, 2009). The flow and structural solvers are coupled by a predictor-corrector, strong-coupling scheme. The consistency between the Eulerian field based level set method for the fluid-fluid interface and the Lagrangian marker based immersed boundary method for the fluid-structure interface is ensured by reconstructing the flow field around the three-phase intersections. A variety of 2D and 3D problems, ranging from water impact of wedges to entry and exit of cylinders and flexible plates interacting with a free surface, are presented to demonstrate the accuracy of the proposed approach. Supported by ONR N000141110588 monitored by Dr. Thomas Fu.

  20. A mass conserving level set method for detailed numerical simulation of liquid atomization

    SciTech Connect

    Luo, Kun; Shao, Changxiao; Yang, Yue; Fan, Jianren

    2015-10-01

    An improved mass conserving level set method for detailed numerical simulations of liquid atomization is developed to address the issue of mass loss in the existing level set method. This method introduces a mass remedy procedure based on the local curvature at the interface, and in principle, can ensure the absolute mass conservation of the liquid phase in the computational domain. Three benchmark cases, including Zalesak's disk, a drop deforming in a vortex field, and the binary drop head-on collision, are simulated to validate the present method, and excellent agreement with exact solutions or experimental results is achieved. It is shown that the present method is able to capture the complex interface with second-order accuracy and negligible additional computational cost. The present method is then applied to study more complex flows, such as a drop impacting on a liquid film and the swirling liquid sheet atomization, which again demonstrates the advantages of mass conservation and the capability to represent the interface accurately.
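
    The paper's remedy is local and curvature-weighted; as a minimal illustration of the underlying idea only, the sketch below restores a reference liquid area after a perturbation by shifting the level set by a constant found with bisection (plain Python; all names and values hypothetical, and deliberately much cruder than the published method):

```python
def area(phi, h):
    """Crude area of the liquid region, counted as cells with phi < 0."""
    return sum(h * h for row in phi for v in row if v < 0.0)

def restore_mass(phi, h, target):
    """Shift phi by a constant c (bisection) so the enclosed area matches target."""
    lo, hi = -1.0, 1.0                      # bracket for the shift
    for _ in range(60):
        c = 0.5 * (lo + hi)
        if area([[v + c for v in row] for row in phi], h) > target:
            lo = c                          # too much liquid: raise the level set
        else:
            hi = c                          # too little liquid: lower it
    return [[v + c for v in row] for row in phi]

# signed distance to a circle of radius 0.4; a +0.05 perturbation shrinks it
h = 0.02
grid = [-1.0 + h * k for k in range(101)]
phi0 = [[(xx * xx + yy * yy) ** 0.5 - 0.4 for xx in grid] for yy in grid]
target = area(phi0, h)
phi_bad = [[v + 0.05 for v in row] for row in phi0]   # mass has been lost
phi_fix = restore_mass(phi_bad, h, target)
```

A uniform shift cannot respect local interface physics, which is exactly why the published procedure distributes the correction according to the local curvature instead.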

  1. Robust space-time extraction of ventricular surface evolution using multiphase level sets

    NASA Astrophysics Data System (ADS)

    Drapaca, Corina S.; Cardenas, Valerie; Studholme, Colin

    2004-05-01

    This paper focuses on the problem of accurately extracting the CSF-tissue boundary, particularly around the ventricular surface, from serial structural MRI of the brain acquired in imaging studies of aging and dementia. This is a challenging problem because of the common occurrence of peri-ventricular lesions which locally alter the appearance of white matter. We examine a level set approach which evolves a four dimensional description of the ventricular surface over time. This has the advantage of allowing constraints on the contour in the temporal dimension, improving the consistency of the extracted object over time. We follow the approach proposed by Chan and Vese which is based on the Mumford and Shah model and implemented using the Osher and Sethian level set method. We have extended this to the 4 dimensional case to propagate a 4D contour toward the tissue boundaries through the evolution of a 5D implicit function. For convergence we use region-based information provided by the image rather than the gradient of the image. This is adapted to allow intensity contrast changes between time frames in the MRI sequence. Results on time sequences of 3D brain MR images are presented and discussed.

  2. Time-optimal path planning in dynamic flows using level set equations: theory and schemes

    NASA Astrophysics Data System (ADS)

    Lolla, Tapovan; Lermusiaux, Pierre F. J.; Ueckermann, Mattheus P.; Haley, Patrick J.

    2014-10-01

    We develop an accurate partial differential equation-based methodology that predicts the time-optimal paths of autonomous vehicles navigating in any continuous, strong, and dynamic ocean currents, obviating the need for heuristics. The goal is to predict a sequence of steering directions so that vehicles can best utilize or avoid currents to minimize their travel time. Inspired by the level set method, we derive and demonstrate that a modified level set equation governs the time-optimal path in any continuous flow. We show that our algorithm is computationally efficient and apply it to a number of experiments. First, we validate our approach through a simple benchmark application in a Rankine vortex flow for which an analytical solution is available. Next, we apply our methodology to more complex, simulated flow fields such as unsteady double-gyre flows driven by wind stress and flows behind a circular island. These examples show that time-optimal paths for multiple vehicles can be planned even in the presence of complex flows in domains with obstacles. Finally, we present and support through illustrations several remarks that describe specific features of our methodology.

  3. Comparisons and Limitations of Gradient Augmented Level Set and Algebraic Volume of Fluid Methods

    NASA Astrophysics Data System (ADS)

    Anumolu, Lakshman; Ryddner, Douglas; Trujillo, Mario

    2014-11-01

    Recent numerical methods for implicit interface transport are generally presented as enjoying a higher order of spatial-temporal convergence than classical or less sophisticated approaches. However, when applied to test cases designed to simulate practical industrial conditions, a significant reduction in convergence is observed in the higher-order methods, whereas the less sophisticated approaches retain their convergence rate but exhibit growth in the error norms. This provides an opportunity to understand the underlying issues that cause this decrease in accuracy in both types of methods. As an example, we consider the Gradient Augmented Level Set method (GALS) and a variant of the Volume of Fluid (VoF) method in our study. Results show that while both methods suffer from a loss of accuracy, it is the higher-order method that suffers more. The implication is a significant reduction in the performance advantage of the GALS method over the VoF scheme. The reasons lie in the behavior of the higher-order derivatives, particularly in situations where the level set field is highly distorted. For the VoF approach, serious spurious deformations of the interface are observed, albeit with a deceptive zero loss of mass.

  4. Variational B-spline level-set: a linear filtering approach for fast deformable model evolution.

    PubMed

    Bernard, Olivier; Friboulet, Denis; Thévenaz, Philippe; Unser, Michael

    2009-06-01

    In the field of image segmentation, most level-set-based active-contour approaches take advantage of a discrete representation of the associated implicit function. We present in this paper a different formulation where the implicit function is modeled as a continuous parametric function expressed on a B-spline basis. Starting from the active-contour energy functional, we show that this formulation allows us to compute the solution as a restriction of the variational problem on the space spanned by the B-splines. As a consequence, the minimization of the functional is directly obtained in terms of the B-spline coefficients. We also show that each step of this minimization may be expressed through a convolution operation. Because the B-spline functions are separable, this convolution may in turn be performed as a sequence of simple 1-D convolutions, which yields an efficient algorithm. As a further consequence, each step of the level-set evolution may be interpreted as a filtering operation with a B-spline kernel. Such filtering induces an intrinsic smoothing in the algorithm, which can be controlled explicitly via the degree and the scale of the chosen B-spline kernel. We illustrate the behavior of this approach on simulated as well as experimental images from various fields.
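
    The separability argument above can be demonstrated directly: convolving rows and then columns with a 1-D B-spline kernel reproduces the full 2-D convolution with the tensor-product kernel. A sketch using the integer samples of the cubic B-spline, β³(0) = 2/3 and β³(±1) = 1/6 (plain Python; helper names are hypothetical):

```python
def conv1d(seq, ker):
    """'Same'-size 1-D convolution with zero padding (kernel is symmetric)."""
    r = len(ker) // 2
    out = []
    for i in range(len(seq)):
        s = 0.0
        for k, w in enumerate(ker):
            j = i + k - r
            if 0 <= j < len(seq):
                s += w * seq[j]
        out.append(s)
    return out

def separable_conv2d(img, ker):
    """Row pass then column pass: equivalent to 2-D tensor-product filtering."""
    ny, nx = len(img), len(img[0])
    rows = [conv1d(row, ker) for row in img]
    cols = [conv1d([rows[i][j] for i in range(ny)], ker) for j in range(nx)]
    return [[cols[j][i] for j in range(nx)] for i in range(ny)]

bspline3 = [1 / 6, 2 / 3, 1 / 6]     # cubic B-spline sampled at -1, 0, 1
img = [[0.0] * 5 for _ in range(5)]
img[2][2] = 1.0                      # unit impulse
out = separable_conv2d(img, bspline3)
```

The impulse response is the outer product of the 1-D kernel with itself, which is the filtering interpretation of each level-set evolution step described in the abstract: two cheap 1-D passes instead of one 2-D convolution.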

  5. On the Relationship between Variational Level Set-Based and SOM-Based Active Contours

    PubMed Central

    Abdelsamea, Mohammed M.; Gnecco, Giorgio; Gaber, Mohamed Medhat; Elyan, Eyad

    2015-01-01

    Most Active Contour Models (ACMs) deal with the image segmentation problem as a functional optimization problem, as they work on dividing an image into several regions by optimizing a suitable functional. Among ACMs, variational level set methods have been used to build an active contour with the aim of modeling arbitrarily complex shapes. Moreover, they can also handle topological changes of the contours. Self-Organizing Maps (SOMs) have attracted the attention of many computer vision scientists, particularly in modeling an active contour based on the idea of utilizing the prototypes (weights) of a SOM to control the evolution of the contour. SOM-based models have been proposed in general with the aim of exploiting the specific ability of SOMs to learn the edge-map information via their topology preservation property and overcoming some drawbacks of other ACMs, such as trapping into local minima of the image energy functional to be minimized in such models. In this survey, we illustrate the main concepts of variational level set-based ACMs, SOM-based ACMs, and their relationship and review in a comprehensive fashion the development of their state-of-the-art models from a machine learning perspective, with a focus on their strengths and weaknesses. PMID:25960736

  6. Student Performance-University Preference Model: A Framework for Helping Students Choose the Right A-Level Subjects

    ERIC Educational Resources Information Center

    Wilkins, Stephen; Meeran, Sheik

    2011-01-01

    Every year, many students in the UK fail to achieve a place at their preferred university because they take the wrong A-level subjects. This study aims to suggest a framework for helping students choose the right subjects. Data on student achievement in A-level examinations were obtained from a UK sixth form college over a four-year period.…

  7. Modelling Molecular Mechanisms: A Framework of Scientific Reasoning to Construct Molecular-Level Explanations for Cellular Behaviour

    ERIC Educational Resources Information Center

    van Mil, Marc H. W.; Boerwinkel, Dirk Jan; Waarlo, Arend Jan

    2013-01-01

    Although molecular-level details are part of the upper-secondary biology curriculum in most countries, many studies report that students fail to connect molecular knowledge to phenomena at the level of cells, organs and organisms. Recent studies suggest that students lack a framework to reason about complex systems to make this connection. In this…

  8. Soybean fruit development and set at the node level under combined photoperiod and radiation conditions.

    PubMed

    Nico, Magalí; Mantese, Anita I; Miralles, Daniel J; Kantolic, Adriana G

    2016-01-01

    In soybean, long days during post-flowering increase seed number. This positive photoperiodic effect on seed number has been previously associated with increments in the amount of radiation accumulated during the crop cycle because long days extend the duration of the crop cycle. However, evidence of intra-nodal processes independent of the availability of assimilates suggests that photoperiodic effects at the node level might also contribute to pod set. This work aims to identify the main mechanisms responsible for the increase in pod number per node in response to long days, including the dynamics of flowering, pod development, growth and set at the node level. Long days increased pods per node on the main stems, by increasing pods on lateral racemes (usually dominated positions) at some main stem nodes. Long days lengthened the flowering period and thereby increased the number of opened flowers on lateral racemes. The flowering period was prolonged under long days because effective seed filling was delayed on primary racemes (dominant positions). Long days also delayed the development of flowers into pods with filling seeds, delaying the initiation of pod elongation without modifying pod elongation rate. The embryo development matched the external pod length irrespective of the pod's chronological age. These results suggest that long days during post-flowering enhance pod number per node through a relief of the competition between pods of different hierarchy within the node. The photoperiodic effect on the development of dominant pods, delaying their elongation and therefore postponing their active growth, extends flowering and allows pod set at positions that are usually dominated.

  9. Whole abdominal wall segmentation using augmented active shape models (AASM) with multi-atlas label fusion and level set

    NASA Astrophysics Data System (ADS)

    Xu, Zhoubing; Baucom, Rebeccah B.; Abramson, Richard G.; Poulose, Benjamin K.; Landman, Bennett A.

    2016-03-01

    The abdominal wall is an important structure differentiating subcutaneous and visceral compartments and intimately involved with maintaining abdominal structure. Segmentation of the whole abdominal wall on routinely acquired computed tomography (CT) scans remains challenging due to variations and complexities of the wall and surrounding tissues. In this study, we propose a slice-wise augmented active shape model (AASM) approach to robustly segment both the outer and inner surfaces of the abdominal wall. Multi-atlas label fusion (MALF) and level set (LS) techniques are integrated into the traditional ASM framework. The AASM approach globally optimizes the landmark updates in the presence of complicated underlying local anatomical contexts. The proposed approach was validated on 184 axial slices of 20 CT scans. The Hausdorff distance against the manual segmentation was significantly reduced using the proposed approach compared to that using ASM, MALF, and LS individually. Our segmentation of the whole abdominal wall enables the subcutaneous and visceral fat measurement, with high correlation to the measurement derived from manual segmentation. This study presents the first generic algorithm that combines ASM, MALF, and LS, and demonstrates practical application for automatically capturing visceral and subcutaneous fat volumes.

  10. Whole Abdominal Wall Segmentation using Augmented Active Shape Models (AASM) with Multi-Atlas Label Fusion and Level Set

    PubMed Central

    Xu, Zhoubing; Baucom, Rebeccah B.; Abramson, Richard G.; Poulose, Benjamin K.; Landman, Bennett A.

    2016-01-01

    The abdominal wall is an important structure differentiating subcutaneous and visceral compartments and intimately involved with maintaining abdominal structure. Segmentation of the whole abdominal wall on routinely acquired computed tomography (CT) scans remains challenging due to variations and complexities of the wall and surrounding tissues. In this study, we propose a slice-wise augmented active shape model (AASM) approach to robustly segment both the outer and inner surfaces of the abdominal wall. Multi-atlas label fusion (MALF) and level set (LS) techniques are integrated into the traditional ASM framework. The AASM approach globally optimizes the landmark updates in the presence of complicated underlying local anatomical contexts. The proposed approach was validated on 184 axial slices of 20 CT scans. The Hausdorff distance against the manual segmentation was significantly reduced using the proposed approach compared to that using ASM, MALF, and LS individually. Our segmentation of the whole abdominal wall enables the subcutaneous and visceral fat measurement, with high correlation to the measurement derived from manual segmentation. This study presents the first generic algorithm that combines ASM, MALF, and LS, and demonstrates practical application for automatically capturing visceral and subcutaneous fat volumes. PMID:27127333

  11. Standards-Setting Procedures in Accountability Research: Impacts of Conceptual Frameworks and Mapping Procedures on Passing Rates.

    ERIC Educational Resources Information Center

    Wang, LihShing; Pan, Wei; Austin, James T.

    Standard-setting research has yielded a rich array of more than 50 standard-setting procedures, but practitioners are likely to be confused about which to use. By synthesizing the accumulated research on standard setting and progress monitoring, this study developed a three-dimensional taxonomy for conceptualizing and operationalizing the various…

  12. Critical Factors to Consider in Evaluating Standard-Setting Studies to Map Language Test Scores to Frameworks of Language Proficiency

    ERIC Educational Resources Information Center

    Tannenbaum, Richard J.; Cho, Yeonsuk

    2014-01-01

    In this article, we consolidate and present in one place what is known about quality indicators for setting standards so that stakeholders may be able to recognize the signs of standard-setting quality. We use the context of setting standards to associate English language test scores with language proficiency descriptions such as those presented…

  13. Oligocene sea-level falls recorded in mid-Pacific atoll and archipelagic apron settings

    NASA Astrophysics Data System (ADS)

    Schlanger, S. O.; Premoli Silva, I.

    1986-05-01

    Drilling results from mid-Pacific atoll and archipelagic apron sites in the Line Islands and Marshall Islands provinces lead to the conclusion that Oligocene sea-level falls detected in Atlantic passive margin sequences are also recorded in a mid-plate-tectonic setting in the Pacific Basin. The mid-Pacific sea-level falls are recorded by (a) the presence of distinct, coarse-grained, graded beds of turbidite origin, rich in reef-derived skeletal debris of Oligocene, Eocene, and Cretaceous age, that were redeposited in deep-water archipelagic apron carbonate sequences of middle and late Oligocene age now flanking the atolls and (b) a marked stratigraphic hiatus and solution unconformity in the subsurface of Enewetak atoll which dates an Oligocene period of atoll emergence correlative with both the deposition of the turbidites and the coastal offlap events discerned in Atlantic passive margins. Correlation of the subsidence path of Enewetak atoll with the development of the Oligocene solution unconformity shows that ca. 30 Ma sea level was as much as 100 m lower than at present.

  14. Language Arts Curriculum Framework: Sample Grade Level Benchmarks, Grades 5-8.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock.

    Based on the 1998 Arkansas English Language Arts Frameworks, this framework lists benchmarks for grades five through eight in writing; reading; and listening, speaking, and viewing. The writing section's stated standards are to help students employ a wide range of strategies as they write; use different writing process elements appropriately to…

  15. "EU-on-Demand": Developing National Qualifications Frameworks in a Multi-Level Context

    ERIC Educational Resources Information Center

    Elken, Mari

    2016-01-01

    The development of comprehensive national qualifications frameworks (NQFs) across Europe has been sparked by the introduction of the European Qualifications Framework (EQF) in 2008. Taking an institutional perspective, this article examines the development of NQFs in three countries, in light of developments that have taken place at the European…

  16. Information Seen as Part of the Development of Living Intelligence: the Five-Leveled Cybersemiotic Framework for FIS

    NASA Astrophysics Data System (ADS)

    Brier, Soren

    2003-06-01

    It is argued that a true transdisciplinary information science going from physical information to phenomenological understanding needs a metaphysical framework. Three different kinds of causality are implied: efficient, formal and final. And at least five different levels of existence are needed: 1. The quantum vacuum fields with entangled causation. 2. The physical level with its energy- and force-based efficient causation. 3. The informational-chemical level with its formal causation based on pattern fitting. 4. The biological-semiotic level with its non-conscious final causation, and 5. The social-linguistic level of self-consciousness with its conscious goal-oriented final causation. To integrate these consistently in an evolutionary theory as emergent levels, neither mechanical determinism nor complexity theory is sufficient, because they cannot be a foundation for a theory of lived meaning. C. S. Peirce's triadic semiotic philosophy combined with a cybernetic and systemic view, like N. Luhmann's, could create the framework I call Cybersemiotics.

  17. High-Order Discontinuous Galerkin Level Set Method for Interface Tracking and Re-Distancing on Unstructured Meshes

    NASA Astrophysics Data System (ADS)

    Greene, Patrick; Nourgaliev, Robert; Schofield, Sam

    2015-11-01

    A new sharp high-order interface tracking method for multi-material flow problems on unstructured meshes is presented. The method combines the marker-tracking algorithm with a discontinuous Galerkin (DG) level set method to implicitly track interfaces. DG projection is used to provide a mapping from the Lagrangian marker field to the Eulerian level set field. For the level set re-distancing, we developed a novel marching method that takes advantage of the unique features of the DG representation of the level set. The method efficiently marches outward from the zero level set with values in the new cells being computed solely from cell neighbors. Results are presented for a number of different interface geometries including ones with sharp corners and multiple hierarchical level sets. The method can robustly handle the level set discontinuities without explicit utilization of solution limiters. Results show that the expected high order (3rd and higher) of convergence for the DG representation of the level set is obtained for smooth solutions on unstructured meshes. High-order re-distancing on irregular meshes is a must for applications where the interfacial curvature is important for the underlying physics, such as surface tension, wetting and detonation shock dynamics. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. Information management release number LLNL-ABS-675636.

  18. Defining obesity: second-level agenda setting attributes in black newspapers and general audience newspapers.

    PubMed

    Lee, Hyunmin; Len-Ríos, María E

    2014-01-01

    This content analysis study examines how obesity is depicted in general-audience and Black newspaper stories (N=391) through the lens of second-level agenda setting theory. The results reveal that both Black newspapers and general-audience newspapers generally ascribe individual causes for obesity. While both types of newspapers largely neglected to mention solutions for the problem, Black newspapers were more likely than general-audience newspapers to suggest both individual and societal solutions for treating obesity. For Black newspapers, these solutions more often included community interventions. In addition, Black newspapers more often used a negative tone in stories and more frequently mentioned ethnic and racial minorities as at-risk groups.

  19. QUANTITATIVE CELL MOTILITY FOR IN VITRO WOUND HEALING USING LEVEL SET-BASED ACTIVE CONTOUR TRACKING.

    PubMed

    Bunyak, Filiz; Palaniappan, Kannappan; Nath, Sumit K; Baskin, Tobias I; Dong, Gang

    2006-04-06

    Quantifying the behavior of cells individually, and in clusters as part of a population, under a range of experimental conditions, is a challenging computational task with many biological applications. We propose a versatile algorithm for segmentation and tracking of multiple motile epithelial cells during wound healing using time-lapse video. The segmentation part of the proposed method relies on a level set-based active contour algorithm that robustly handles a large number of cells. The tracking part relies on a detection-based multiple-object tracking method with delayed decision enabled by multi-hypothesis testing. The combined method is robust to complex cell behavior including division and apoptosis, and to imaging artifacts such as illumination changes.

  20. Wave breaking over sloping beaches using a coupled boundary integral-level set method

    SciTech Connect

    Garzon, M.; Adalsteinsson, D.; Gray, L.; Sethian, J.A.

    2003-12-08

    We present a numerical method for tracking breaking waves over sloping beaches. We use a fully non-linear potential model for incompressible, irrotational and inviscid flow, and consider the effects of beach topography on breaking waves. The algorithm uses a Boundary Element Method (BEM) to compute the velocity at the interface, coupled to a Narrow Band Level Set Method to track the evolving air/water interface, and an associated extension equation to update the velocity potential both on and off the interface. The formulation of the algorithm is applicable to two- and three-dimensional breaking waves; in this paper, we concentrate on two-dimensional results showing wave breaking and rollup, and perform numerical convergence studies and comparisons with previous techniques.

  1. Numerical simulation of overflow at vertical weirs using a hybrid level set/VOF method

    NASA Astrophysics Data System (ADS)

    Lv, Xin; Zou, Qingping; Reeve, Dominic

    2011-10-01

    This paper presents the application of a newly developed free surface flow model to the practical, yet challenging, problem of overflow at vertical weirs. Since the model takes advantage of the strengths of both the level set and volume of fluid methods and solves the Navier-Stokes equations on an unstructured mesh, it is capable of resolving the time evolution of very complex vortical motions, air entrainment and pressure variations due to violent deformations following overflow of the weir crest. In the present study, two different types of vertical weir, namely broad-crested and sharp-crested, are considered for validation purposes. The calculated overflow parameters such as pressure head distributions, velocity distributions, and water surface profiles are compared against experimental data as well as numerical results available in the literature. A very good quantitative agreement has been obtained. The numerical model thus offers a good alternative to traditional experimental methods in the study of weir problems.

  2. Large deformation solid-fluid interaction via a level set approach.

    SciTech Connect

    Schunk, Peter Randall; Noble, David R.; Baer, Thomas A.; Rao, Rekha Ranjana; Notz, Patrick K.; Wilkes, Edward Dean

    2003-12-01

    Solidification and blood flow seemingly have little in common, but each involves a fluid in contact with a deformable solid. In these systems, the solid-fluid interface moves as the solid advects and deforms, often traversing the entire domain of interest. Currently, these problems cannot be simulated without innumerable expensive remeshing steps, mesh manipulations or decoupling the solid and fluid motion. Despite the wealth of progress recently made in mechanics modeling, this glaring inadequacy persists. We propose a new technique that tracks the interface implicitly and circumvents the need for remeshing and remapping the solution onto the new mesh. The solid-fluid boundary is tracked with a level set algorithm that changes the equation type dynamically depending on the phases present. This novel approach to coupled mechanics problems promises to give accurate stresses, displacements and velocities in both phases, simultaneously.

  3. Initialisation of 3D level set for hippocampus segmentation from volumetric brain MR images

    NASA Astrophysics Data System (ADS)

    Hajiesmaeili, Maryam; Dehmeshki, Jamshid; Bagheri Nakhjavanlo, Bashir; Ellis, Tim

    2014-04-01

    Shrinkage of the hippocampus is a primary biomarker for Alzheimer's disease and can be measured through accurate segmentation of brain MR images. This paper describes the problem of initialising a 3D level set algorithm for hippocampus segmentation, which must cope with some challenging characteristics of the structure, such as its small size, wide range of intensities, narrow width, and shape variation. In addition, MR images require bias correction to account for the additional inhomogeneity associated with the scanner technology. Due to these inhomogeneities, using a single initialisation seed region inside the hippocampus is prone to failure. Alternative initialisation strategies are explored, such as using multiple initialisations in different sections (the head, body and tail) of the hippocampus. The Dice metric is used to validate our segmentation results with respect to ground truth for a dataset of 25 MR images. Experimental results indicate significant improvement in segmentation performance using the multiple-initialisation techniques, yielding more accurate segmentation results for the hippocampus.

  4. Breast cancer diagnosis using level-set statistics and support vector machines.

    PubMed

    Liu, Jianguo; Yuan, Xiaohui; Buckles, Bill P

    2008-01-01

    Breast cancer diagnosis based on microscopic biopsy images and machine learning has demonstrated great promise in the past two decades. Various feature selection (or extraction) and classification algorithms have been attempted with success. However, some feature selection processes are complex and the number of features used can be quite large. We propose a new feature selection method based on level-set statistics. This procedure is simple and, when used with support vector machines (SVM), only a small number of features is needed to achieve satisfactory accuracy that is comparable to those using more sophisticated features. Therefore, the classification can be completed in much shorter time. We use multi-class support vector machines as the classification tool. Numerical results are reported to support the viability of this new procedure.
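As a rough illustration of the kind of feature the abstract alludes to, one simple reading of "level-set statistics" is the area of the superlevel set of the image at each of a few intensity thresholds. The sketch below (plain Python; the function name and thresholds are hypothetical, and this is not necessarily the statistic used in the paper) computes such a feature vector, which could then be passed to an SVM classifier:

```python
def level_set_features(image, thresholds):
    """Hypothetical level-set statistic: for each threshold t, record the
    fraction of pixels with intensity >= t (the area of the superlevel set).
    A simplified stand-in for the feature extraction described in the abstract."""
    flat = [p for row in image for p in row]
    n = len(flat)
    return [sum(1 for p in flat if p >= t) / n for t in thresholds]

img = [[0, 50],
       [100, 200]]
print(level_set_features(img, [0, 64, 128]))  # → [1.0, 0.5, 0.25]
```

The appeal of such features is that they are cheap to compute and low-dimensional, which matches the abstract's claim that only a small number of features is needed.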

  5. A GPU-accelerated adaptive discontinuous Galerkin method for level set equation

    NASA Astrophysics Data System (ADS)

    Karakus, A.; Warburton, T.; Aksel, M. H.; Sert, C.

    2016-01-01

    This paper presents a GPU-accelerated nodal discontinuous Galerkin method for the solution of two- and three-dimensional level set (LS) equation on unstructured adaptive meshes. Using adaptive mesh refinement, computations are localised mostly near the interface location to reduce the computational cost. Small global time step size resulting from the local adaptivity is avoided by local time-stepping based on a multi-rate Adams-Bashforth scheme. Platform independence of the solver is achieved with an extensible multi-threading programming API that allows runtime selection of different computing devices (GPU and CPU) and different threading interfaces (CUDA, OpenCL and OpenMP). Overall, a highly scalable, accurate and mass conservative numerical scheme that preserves the simplicity of LS formulation is obtained. Efficiency, performance and local high-order accuracy of the method are demonstrated through distinct numerical test cases.
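The multi-rate scheme mentioned in the abstract builds on standard Adams-Bashforth time stepping. A minimal single-rate second-order Adams-Bashforth (AB2) integrator, the basic building block (the multi-rate grouping of elements by local time-step size is omitted here), can be sketched as:

```python
def ab2(f, y0, t0, dt, steps):
    """Second-order Adams-Bashforth:
        y_{n+1} = y_n + dt * (3/2 * f_n - 1/2 * f_{n-1}),
    bootstrapped with one forward-Euler step since AB2 needs two history values."""
    y, t = y0, t0
    f_prev = f(t, y)
    y = y + dt * f_prev          # Euler start-up step
    t += dt
    for _ in range(steps - 1):
        f_curr = f(t, y)
        y = y + dt * (1.5 * f_curr - 0.5 * f_prev)
        f_prev = f_curr
        t += dt
    return y

# y' = y, y(0) = 1, integrated to t = 1; the result approximates e ≈ 2.71828
print(ab2(lambda t, y: y, 1.0, 0.0, 0.001, 1000))
```

Because AB2 reuses the previous right-hand-side evaluation, it needs only one new evaluation of f per step, which is what makes Adams-Bashforth families attractive for expensive DG residual evaluations.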

  6. Patient doses in paediatric CT: feasibility of setting diagnostic reference levels.

    PubMed

    Järvinen, H; Merimaa, K; Seuri, R; Tyrväinen, E; Perhomaa, M; Savikurki-Heikkilä, P; Svedström, E; Ziliukas, J; Lintrop, M

    2011-09-01

    Despite the fact that doses to paediatric patients from computed tomography (CT) examinations are of special concern, only a few data or studies on the setting of paediatric diagnostic reference levels (DRLs) have been published. In this study, doses to children were estimated from chest and head CT in order to study the feasibility of DRLs for these examinations. It is shown that for the DRLs, patient dose data from different CT scanners should be collected in age or weight groups, possibly for different indications. For practical reasons, the DRLs for paediatric chest CT should be given as a continuous DRL curve as a function of patient weight. For paediatric head CT, DRLs for a few age groups could be given. The users of the DRLs should be aware of the calibration phantom applied in the console calibration for different paediatric scanning protocols. The feasibility of DRLs should be re-evaluated every 2-3 y.

  7. Numerical Schemes for the Hamilton-Jacobi and Level Set Equations on Triangulated Domains

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.; Sethian, James A.

    1997-01-01

    Borrowing from techniques developed for conservation law equations, numerical schemes which discretize the Hamilton-Jacobi (H-J), level set, and Eikonal equations on triangulated domains are presented. The first scheme is a provably monotone discretization for certain forms of the H-J equations. Unfortunately, the basic scheme lacks proper Lipschitz continuity of the numerical Hamiltonian. By employing a virtual edge flipping technique, Lipschitz continuity of the numerical flux is restored on acute triangulations. Next, schemes are introduced and developed based on the weaker concept of positive coefficient approximations for homogeneous Hamiltonians. These schemes possess a discrete maximum principle on arbitrary triangulations and naturally exhibit proper Lipschitz continuity of the numerical Hamiltonian. Finally, a class of Petrov-Galerkin approximations are considered. These schemes are stabilized via a least-squares bilinear form. The Petrov-Galerkin schemes do not possess a discrete maximum principle but generalize to high order accuracy.
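The monotone upwind construction is easiest to see in one dimension on a uniform grid. Below is a minimal first-order fast-sweeping solver for the Eikonal equation |u'| = 1, a structured-grid analogue rather than the triangulated schemes of the paper (names and grid setup are illustrative):

```python
def eikonal_1d(n, dx, sources):
    """First-order upwind fast-sweeping solver for |u'(x)| = 1 on a uniform
    1D grid with u = 0 at the given source indices. The upwind update
    u[i] = min(u[i], u[neighbor] + dx) is monotone: raising a neighbor's
    value can never lower u[i]. In 1D two sweeps suffice for convergence."""
    INF = float("inf")
    u = [INF] * n
    for s in sources:
        u[s] = 0.0
    for _ in range(2):
        for i in range(1, n):               # left-to-right sweep
            u[i] = min(u[i], u[i - 1] + dx)
        for i in range(n - 2, -1, -1):      # right-to-left sweep
            u[i] = min(u[i], u[i + 1] + dx)
    return u

print(eikonal_1d(7, 1.0, [3]))  # → [3.0, 2.0, 1.0, 0.0, 1.0, 2.0, 3.0]
```

The discrete maximum principle discussed in the abstract is the multi-dimensional, unstructured-mesh generalization of this monotonicity property.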

  8. Setting ozone critical levels for protecting horticultural Mediterranean crops: case study of tomato.

    PubMed

    González-Fernández, I; Calvo, E; Gerosa, G; Bermejo, V; Marzuoli, R; Calatayud, V; Alonso, R

    2014-02-01

    Seven experiments carried out in Italy and Spain have been used to parameterise a stomatal conductance model and to establish exposure- and dose-response relationships for the yield and quality of tomato, with the main goal of setting O3 critical levels (CLe). CLe, with confidence intervals in brackets, were set at an accumulated hourly O3 exposure over 40 nl l(-1) of AOT40 = 8.4 (1.2, 15.6) ppm h and a phytotoxic ozone dose above a threshold of 6 nmol m(-2) s(-1) of POD6 = 2.7 (0.8, 4.6) mmol m(-2) for yield, and AOT40 = 18.7 (8.5, 28.8) ppm h and POD6 = 4.1 (2.0, 6.2) mmol m(-2) for quality, both indices performing equally well. CLe confidence intervals provide information on the quality of the dataset and should be included in future calculations of O3 CLe to improve current methodologies. These CLe, derived for sensitive tomato cultivars, should not be applied for quantifying O3-induced losses, at the risk of making important overestimations of the economic losses associated with O3 pollution.
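The AOT40 index used above has a simple, standard arithmetic definition: accumulate, over the relevant daylight hours, the excess of the hourly ozone concentration above 40 ppb (nl l(-1) is approximately equivalent to ppb). A minimal sketch, assuming the caller passes only the hours that should be accumulated (daylight-hour and growing-season filtering is omitted):

```python
def aot40(hourly_ppb):
    """AOT40 in ppm h: accumulated ozone exposure over a 40 ppb threshold.
    Only the excess above 40 ppb is summed, and only for hours exceeding it."""
    excess_ppb_h = sum(c - 40.0 for c in hourly_ppb if c > 40.0)
    return excess_ppb_h / 1000.0  # convert ppb h -> ppm h

# three hours at 60, 45 and 30 ppb: excesses 20 + 5 + 0 = 25 ppb h
print(aot40([60.0, 45.0, 30.0]))  # → 0.025 (ppm h)
```

A critical level such as AOT40 = 8.4 ppm h is then simply the seasonal accumulation above which yield losses are expected for sensitive cultivars.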

  9. Teachers' Lives in Context: A Framework for Understanding Barriers to High Quality Teaching within Resource Deprived Settings

    ERIC Educational Resources Information Center

    Schwartz, Kate; Cappella, Elise; Aber, J. Lawrence

    2016-01-01

    Within low-income communities in low- and high-resource countries, there is a profound need for more effective schools that are better able to foster child and youth development and support student learning. This paper presents a theoretical framework for understanding the role of teacher ecology in influencing teacher effectiveness and, through…

  10. Modeling the advection of discontinuous quantities in Geophysical flows using Particle Level Sets

    NASA Astrophysics Data System (ADS)

    Aleksandrov, V.; Samuel, H.; Evonuk, M.

    2010-12-01

    Advection is one of the major processes that commonly acts on various scales in nature (core formation, mantle convective stirring, multi-phase flows in magma chambers, salt diapirism ...). While this process can be modeled numerically by solving conservation equations, various geodynamic scenarios involve advection of quantities with sharp discontinuities. Unfortunately, in these cases modeling pure advection numerically becomes very challenging, in particular because sharp discontinuities lead to numerical instabilities, which prevent the local use of high-order numerical schemes. Several approaches have been used in computational geodynamics to overcome this difficulty, with variable amounts of success. Despite the use of correcting filters or non-oscillatory, shock-preserving schemes, Eulerian (fixed grid) techniques generally suffer from artificial numerical diffusion. Lagrangian approaches (dynamic grids or particles) tend to be more popular in computational geodynamics because they are not prone to excessive numerical diffusion. However, these approaches are generally computationally expensive, especially in 3D, and can suffer from spurious statistical noise. As an alternative to the aforementioned approaches, we have applied a relatively recent Particle Level Set method [Enright et al., 2002] for modeling the advection of quantities in the presence of sharp discontinuities. We have tested this improved method, which combines the best of the Eulerian and Lagrangian approaches, against well-known benchmarks and classical geodynamic flows. In each case the accuracy of the Particle Level Set method equals or exceeds that of other Eulerian and Lagrangian methods, and it leads to significantly smaller computational cost, in particular in three-dimensional flows, where the reduction of computational time for modeling advection processes is most needed.

  11. An abdominal aortic aneurysm segmentation method: level set with region and statistical information.

    PubMed

    Zhuge, Feng; Rubin, Geoffrey D; Sun, Shaohua; Napel, Sandy

    2006-05-01

    We present a system for segmenting the human aortic aneurysm in CT angiograms (CTA), which, in turn, allows measurements of volume and morphological aspects useful for treatment planning. The system estimates a rough "initial surface," and then refines it using a level set segmentation scheme augmented with two external analyzers: The global region analyzer, which incorporates a priori knowledge of the intensity, volume, and shape of the aorta and other structures, and the local feature analyzer, which uses voxel location, intensity, and texture features to train and drive a support vector machine classifier. Each analyzer outputs a value that corresponds to the likelihood that a given voxel is part of the aneurysm, which is used during level set iteration to control the evolution of the surface. We tested our system using a database of 20 CTA scans of patients with aortic aneurysms. The mean and worst case values of volume overlap, volume error, mean distance error, and maximum distance error relative to human tracing were 95.3% +/- 1.4% (s.d.); worst case = 92.9%, 3.5% +/- 2.5% (s.d.); worst case = 7.0%, 0.6 +/- 0.2 mm (s.d.); worst case = 1.0 mm, and 5.2 +/- 2.3 mm (s.d.); worst case = 9.6 mm, respectively. When implemented on a 2.8 GHz Pentium IV personal computer, the mean time required for segmentation was 7.4 +/- 3.6 min (s.d.). We also performed experiments that suggest that our method is insensitive to parameter changes within 10% of their experimentally determined values. This preliminary study proves feasibility for an accurate, precise, and robust system for segmentation of the abdominal aneurysm from CTA data, and may be of benefit to patients with aortic aneurysms.

  12. Texture analysis improves level set segmentation of the anterior abdominal wall

    PubMed Central

    Xu, Zhoubing; Allen, Wade M.; Baucom, Rebeccah B.; Poulose, Benjamin K.; Landman, Bennett A.

    2013-01-01

    Purpose: The treatment of ventral hernias (VH) has been a challenging problem for medical care. Repair of these hernias is fraught with failure; recurrence rates ranging from 24% to 43% have been reported, even with the use of biocompatible mesh. Currently, computed tomography (CT) is used to guide intervention through expert, but qualitative, clinical judgments, notably, quantitative metrics based on image-processing are not used. The authors propose that image segmentation methods to capture the three-dimensional structure of the abdominal wall and its abnormalities will provide a foundation on which to measure geometric properties of hernias and surrounding tissues and, therefore, to optimize intervention. Methods: In this study with 20 clinically acquired CT scans on postoperative patients, the authors demonstrated a novel approach to geometric classification of the abdominal wall. The authors’ approach uses a texture analysis based on Gabor filters to extract feature vectors and follows a fuzzy c-means clustering method to estimate voxelwise probability memberships for eight clusters. The memberships estimated from the texture analysis are helpful to identify anatomical structures with inhomogeneous intensities. The membership was used to guide the level set evolution, as well as to derive an initial start close to the abdominal wall. Results: Segmentation results on abdominal walls were both quantitatively and qualitatively validated with surface errors based on manually labeled ground truth. Using texture, mean surface errors for the outer surface of the abdominal wall were less than 2 mm, with 91% of the outer surface less than 5 mm away from the manual tracings; errors were significantly greater (2–5 mm) for methods that did not use the texture. Conclusions: The authors’ approach establishes a baseline for characterizing the abdominal wall for improving VH care. Inherent texture patterns in CT scans are helpful to the tissue classification, and texture

  13. Texture analysis improves level set segmentation of the anterior abdominal wall

    SciTech Connect

    Xu, Zhoubing; Allen, Wade M.; Baucom, Rebeccah B.; Poulose, Benjamin K.; Landman, Bennett A.

    2013-12-15

    Purpose: The treatment of ventral hernias (VH) has been a challenging problem for medical care. Repair of these hernias is fraught with failure; recurrence rates ranging from 24% to 43% have been reported, even with the use of biocompatible mesh. Currently, computed tomography (CT) is used to guide intervention through expert, but qualitative, clinical judgments, notably, quantitative metrics based on image-processing are not used. The authors propose that image segmentation methods to capture the three-dimensional structure of the abdominal wall and its abnormalities will provide a foundation on which to measure geometric properties of hernias and surrounding tissues and, therefore, to optimize intervention. Methods: In this study with 20 clinically acquired CT scans on postoperative patients, the authors demonstrated a novel approach to geometric classification of the abdominal wall. The authors’ approach uses a texture analysis based on Gabor filters to extract feature vectors and follows a fuzzy c-means clustering method to estimate voxelwise probability memberships for eight clusters. The memberships estimated from the texture analysis are helpful to identify anatomical structures with inhomogeneous intensities. The membership was used to guide the level set evolution, as well as to derive an initial start close to the abdominal wall. Results: Segmentation results on abdominal walls were both quantitatively and qualitatively validated with surface errors based on manually labeled ground truth. Using texture, mean surface errors for the outer surface of the abdominal wall were less than 2 mm, with 91% of the outer surface less than 5 mm away from the manual tracings; errors were significantly greater (2–5 mm) for methods that did not use the texture. Conclusions: The authors’ approach establishes a baseline for characterizing the abdominal wall for improving VH care. Inherent texture patterns in CT scans are helpful to the tissue classification, and texture

  14. An abdominal aortic aneurysm segmentation method: Level set with region and statistical information

    SciTech Connect

    Zhuge Feng; Rubin, Geoffrey D.; Sun Shaohua; Napel, Sandy

    2006-05-15

    We present a system for segmenting the human aortic aneurysm in CT angiograms (CTA), which, in turn, allows measurements of volume and morphological aspects useful for treatment planning. The system estimates a rough 'initial surface', and then refines it using a level set segmentation scheme augmented with two external analyzers: The global region analyzer, which incorporates a priori knowledge of the intensity, volume, and shape of the aorta and other structures, and the local feature analyzer, which uses voxel location, intensity, and texture features to train and drive a support vector machine classifier. Each analyzer outputs a value that corresponds to the likelihood that a given voxel is part of the aneurysm, which is used during level set iteration to control the evolution of the surface. We tested our system using a database of 20 CTA scans of patients with aortic aneurysms. The mean and worst case values of volume overlap, volume error, mean distance error, and maximum distance error relative to human tracing were 95.3% ± 1.4% (s.d.); worst case = 92.9%, 3.5% ± 2.5% (s.d.); worst case = 7.0%, 0.6 ± 0.2 mm (s.d.); worst case = 1.0 mm, and 5.2 ± 2.3 mm (s.d.); worst case = 9.6 mm, respectively. When implemented on a 2.8 GHz Pentium IV personal computer, the mean time required for segmentation was 7.4 ± 3.6 min (s.d.). We also performed experiments that suggest that our method is insensitive to parameter changes within 10% of their experimentally determined values. This preliminary study proves feasibility for an accurate, precise, and robust system for segmentation of the abdominal aneurysm from CTA data, and may be of benefit to patients with aortic aneurysms.

  15. Characterization of mammographic masses based on level set segmentation with new image features and patient information

    SciTech Connect

    Shi Jiazheng; Sahiner, Berkman; Chan Heangping; Ge Jun; Hadjiiski, Lubomir; Helvie, Mark A.; Nees, Alexis; Wu Yita; Wei Jun; Zhou Chuan; Zhang Yiheng; Cui Jing

    2008-01-15

    Computer-aided diagnosis (CAD) for characterization of mammographic masses as malignant or benign has the potential to assist radiologists in reducing the biopsy rate without increasing false negatives. The purpose of this study was to develop an automated method for mammographic mass segmentation and to explore new image-based features, in combination with patient information, in order to improve the performance of mass characterization. The authors' previous CAD system, which used active contour segmentation and morphological, textural, and spiculation features, achieved promising results in mass characterization. The new CAD system is based on the level set method and includes two new types of image features, related to the presence of microcalcifications within the mass and the abruptness of the mass margin, and patient age. A linear discriminant analysis (LDA) classifier with stepwise feature selection was used to merge the extracted features into a classification score. The classification accuracy was evaluated using the area under the receiver operating characteristic curve. The authors' primary data set consisted of 427 biopsy-proven masses (200 malignant and 227 benign) in 909 regions of interest (ROIs) (451 malignant and 458 benign) from multiple mammographic views. Leave-one-case-out resampling was used for training and testing. The new CAD system based on the level set segmentation and the new mammographic feature space achieved a view-based A_z value of 0.83 ± 0.01. The improvement compared to the previous CAD system was statistically significant (p=0.02). When patient age was included in the new CAD system, view-based and case-based A_z values were 0.85 ± 0.01 and 0.87 ± 0.02, respectively. The study also demonstrated the consistency of the newly developed CAD system by evaluating the statistics of the weights of the LDA classifiers in leave-one-case-out classification. Finally, an independent test on the publicly available digital database

  16. A review of the use of human factors classification frameworks that identify causal factors for adverse events in the hospital setting.

    PubMed

    Mitchell, R J; Williamson, A M; Molesworth, B; Chung, A Z Q

    2014-01-01

    Various human factors classification frameworks have been used to identify causal factors for clinical adverse events. A systematic review was conducted to identify human factors classification frameworks that identified the causal factors (including human error) of adverse events in a hospital setting. Six electronic databases were searched, identifying 1997 articles, of which 38 met the inclusion criteria. Most studies included causal contributing factors as well as error and error type, but the nature of coding varied considerably between studies. The ability of human factors classification frameworks to provide information on specific causal factors for an adverse event enables preventive attention to be focused on areas where improvements are most needed. This review highlighted some areas needing considerable improvement in order to meet this need, including better definition of terms, more emphasis on assessing the reliability of coding, and greater sophistication in the analysis of the results of the classification. Practitioner Summary: Human factors classification frameworks can be used to identify causal factors of clinical adverse events. However, this review suggests that existing frameworks are diverse, limited in their identification of the context of human error and have poor reliability when used by different individuals.

  17. Standard Setting in Relation to the Common European Framework of Reference for Languages: The Case of the State Examination of Dutch as a Second Language

    ERIC Educational Resources Information Center

    Bechger, Timo M.; Kuijper, Henk; Maris, Gunter

    2009-01-01

    This article reports on two related studies carried out to link the State examination of Dutch as a second language to the Common European Framework of Reference for languages (CEFR). In the first study, key persons from institutions for higher education were asked to determine the minimally required language level of beginning students. In the…

  18. A Framework for State-Level Renewable Energy Market Potential Studies

    EPA Pesticide Factsheets

    This document provides a framework and next steps for state officials who require estimates of renewable energy market potential, shows how to conduct a market potential study, and distinguishes between goal-oriented studies and other types of studies.

  19. Generalized cost-effectiveness analysis for national-level priority-setting in the health sector

    PubMed Central

    Hutubessy, Raymond; Chisholm, Dan; Edejer, Tessa Tan-Torres

    2003-01-01

    Cost-effectiveness analysis (CEA) is potentially an important aid to public health decision-making but, with some notable exceptions, its use and impact at the level of individual countries is limited. A number of potential reasons may account for this, among them technical shortcomings associated with the generation of current economic evidence, political expediency, social preferences and systemic barriers to implementation. As a form of sectoral CEA, Generalized CEA sets out to overcome a number of these barriers to the appropriate use of cost-effectiveness information at the regional and country level. Its application via WHO-CHOICE provides a new economic evidence base, as well as underlying methodological developments, concerning the cost-effectiveness of a range of health interventions for leading causes of, and risk factors for, disease. The estimated sub-regional costs and effects of different interventions provided by WHO-CHOICE can readily be tailored to the specific context of individual countries, for example by adjustment to the quantity and unit prices of intervention inputs (costs) or the coverage, efficacy and adherence rates of interventions (effectiveness). The potential usefulness of this information for health policy and planning is in assessing if current intervention strategies represent an efficient use of scarce resources, and which of the potential additional interventions that are not yet implemented, or not implemented fully, should be given priority on the grounds of cost-effectiveness. Health policy-makers and programme managers can use results from WHO-CHOICE as a valuable input into the planning and prioritization of services at national level, as well as a starting point for additional analyses of the trade-off between the efficiency of interventions in producing health and their impact on other key outcomes such as reducing inequalities and improving the health of the poor. PMID:14687420
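The core arithmetic behind such analyses is simple. As an illustrative sketch (function names and numbers are invented for the example), generalized CEA compares interventions against a "doing nothing" null scenario via average cost-effectiveness ratios, while conventional incremental ratios compare interventions against each other:

```python
def acer(cost_vs_null, dalys_averted_vs_null):
    """Average cost-effectiveness ratio against the null scenario, in the
    spirit of generalized CEA: cost per DALY averted relative to doing nothing."""
    return cost_vs_null / dalys_averted_vs_null

def icer(cost_a, effect_a, cost_b, effect_b):
    """Incremental cost-effectiveness ratio of intervention B versus A:
    extra cost per extra DALY averted when moving from A to B."""
    return (cost_b - cost_a) / (effect_b - effect_a)

# hypothetical interventions, costs in arbitrary currency units
print(acer(50000.0, 400.0))                    # → 125.0 per DALY averted
print(icer(50000.0, 400.0, 90000.0, 500.0))    # → 400.0 per extra DALY averted
```

Evaluating every intervention against the same null scenario, rather than only against current practice, is what lets results such as those from WHO-CHOICE be transferred and re-contextualized across countries.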

  20. An ecofeminist conceptual framework to explore gendered environmental health inequities in urban settings and to inform healthy public policy.

    PubMed

    Chircop, Andrea

    2008-06-01

    This theoretical exploration is an attempt to conceptualize the link between gender and urban environmental health. The proposed ecofeminist framework enables an understanding of the link between the urban physical and social environments and health inequities mediated by gender and socioeconomic status. This framework is proposed as a theoretical magnifying glass to reveal the underlying logic that connects environmental exploitation on the one hand, and gendered health inequities on the other. Ecofeminism has the potential to reveal an inherent, normative conceptual analysis and argumentative justification of western society that permits the oppression of women and the exploitation of the environment. This insight will contribute to a better understanding of the mechanisms underlying gendered environmental health inequities and inform healthy public policy that is supportive of urban environmental health, particularly for low-income mothers.

  1. Sparsity and level set regularization for diffuse optical tomography using a transport model in 2D

    NASA Astrophysics Data System (ADS)

    Prieto, Kernel; Dorn, Oliver

    2017-01-01

    In this paper we address an inverse problem for the time-dependent linear transport equation (or radiative transfer equation) in 2D having in mind applications in diffuse optical tomography (DOT). We propose two new reconstruction algorithms which so far have not been applied to such a situation and compare their performances in certain practically relevant situations. The first of these reconstruction algorithms uses a sparsity promoting regularization scheme, whereas the second one uses a simultaneous level set reconstruction scheme for two parameters of the linear transport equation. We will also compare the results of both schemes with a third scheme which is a more traditional L2-based Landweber-Kaczmarz scheme. We focus our attention on the DOT application of imaging the human head of a neonate where the simpler diffusion approximation is not well-suited for the inversion due to the presence of a clear layer beneath the skull which is filled with ‘low-scattering’ cerebrospinal fluid. This layer, even if its location and characteristics are known a priori, poses significant difficulties for most reconstruction schemes due to its ‘wave-guiding’ property which reduces sensitivity of the data to the interior regions. A further complication arises due to the necessity to reconstruct simultaneously two different parameters of the linear transport equation, the scattering and the absorption cross-section, from the same data set. A significant ‘cross-talk’ between these two parameters is usually expected. Our numerical experiments indicate that each of the three considered reconstruction schemes has its merits and performs differently but reasonably well when the clear layer is a priori known. We also demonstrate the behavior of the three algorithms in the particular situation where the clear layer is unknown during the reconstruction.

  2. Expected frontiers: Incorporating weather uncertainty into a policy analysis using an integrated bi-level multi-objective optimization framework

    EPA Science Inventory

    Weather is the main driver in both plant use of nutrients and fate and transport of nutrients in the environment. In previous work, we evaluated a green tax for control of agricultural nutrients in a bi-level optimization framework that linked deterministic models. In this study,...

  3. A General Framework for Power Analysis to Detect the Moderator Effects in Two- and Three-Level Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Dong, Nianbo; Spybrook, Jessaca; Kelcey, Ben

    2016-01-01

    The purpose of this study is to propose a general framework for power analyses to detect the moderator effects in two- and three-level cluster randomized trials (CRTs). The study specifically aims to: (1) develop the statistical formulations for calculating statistical power, minimum detectable effect size (MDES) and its confidence interval to…

  4. A Generic System-Level Framework for Self-Serve Health Monitoring System through Internet of Things (IoT).

    PubMed

    Ahmed, Mobyen Uddin; Björkman, Mats; Lindén, Maria

    2015-01-01

    Sensor data travel from sensors to a remote server, are analyzed remotely in a distributed manner, and the health status of a user is presented in real time. This paper presents a generic system-level framework for a self-served health monitoring system through the Internet of Things (IoT) to facilitate efficient sensor data management.

  5. How Multi-Levels of Individual and Team Learning Interact in a Public Healthcare Organisation: A Conceptual Framework

    ERIC Educational Resources Information Center

    Doyle, Louise; Kelliher, Felicity; Harrington, Denis

    2016-01-01

    The aim of this paper is to review the relevant literature on organisational learning and offer a preliminary conceptual framework as a basis to explore how the multi-levels of individual learning and team learning interact in a public healthcare organisation. The organisational learning literature highlights a need for further understanding of…

  6. The Existence of Alternative Framework in Students' Scientific Imagination on the Concept of Matter at Submicroscopic Level: Macro Imagination

    ERIC Educational Resources Information Center

    Abdullah, Nurdiana; Surif, Johari

    2015-01-01

    This study is conducted with the purpose of identifying the alternative framework contained in students' imagination on the concept of matter at the submicroscopic level. Using a purposive sampling technique, a total of 15 students were interviewed to obtain the data. Data from document analysis were used to strengthen the interview findings.…

  7. Low-level 14C methane oxidation rate measurements modified for remote field settings

    NASA Astrophysics Data System (ADS)

    Pack, M. A.; Pohlman, J.; Ruppel, C. D.; Xu, X.

    2012-12-01

    Aerobic methane oxidation limits atmospheric methane emissions from degraded subsea permafrost and dissociated methane hydrates in high latitude oceans. Methane oxidation rate measurements are a crucial tool for investigating the efficacy of this process, but are logistically challenging when working on small research vessels in remote settings. We modified a low-level 14C-CH4 oxidation rate measurement for use in the Beaufort Sea above hydrate-bearing sediments during August 2012. Application of the more common 3H-CH4 rate measurement that uses 10^6 times more radioactivity was not practical because the R/V Ukpik cannot accommodate a radiation van. The low-level 14C measurement does not require a radiation van, but careful isolation of the 14C-label is essential to avoid contaminating natural abundance 14C measurements. We used 14C-CH4 with a total activity of 1.1 μCi, which is far below the 100 μCi permitting level. In addition, we modified field procedures to simplify and shorten sample processing. The original low-level 14C-CH4 method requires 6 steps in the field: (1) collect water samples in glass serum bottles, (2) inject 14C-CH4 into bottles, (3) incubate for 24 hours, (4) filter to separate the methanotrophic bacterial cells from the aqueous sample, (5) kill the filtrate with sodium hydroxide (NaOH), and (6) purge with nitrogen to remove unused 14C-CH4. Onshore, the 14C-CH4 respired to carbon dioxide or incorporated into cell material by methanotrophic bacteria during incubation is quantified by accelerator mass spectrometry (AMS). We conducted an experiment to test the possibility of storing samples for purging and filtering back onshore (steps 4 and 6). We subjected a series of water samples to steps 1-3 & 5, and preserved them with mercuric chloride (HgCl2) instead of NaOH because HgCl2 is less likely to break down cell material during storage. The 14C-content of the carbon dioxide in samples preserved with HgCl2 and stored for up to 2 weeks was stable

  8. Detection of colonic polyp candidates with level set-based thickness mapping over the colon wall

    NASA Astrophysics Data System (ADS)

    Han, Hao; Li, Lihong; Duan, Chaijie; Zhao, Yang; Wang, Huafeng; Liang, Zhengrong

    2015-03-01

    Further improvement of computer-aided detection (CADe) of colonic polyps is vital to advance computed tomographic colonography (CTC) toward a screening modality, where the detection of flat polyps is especially challenging because limited image features can be extracted from flat polyps, and traditional geometric feature-based CADe methods usually fail to detect such polyps. In this paper, we present a novel pipeline to automatically detect initial polyp candidates (IPCs), especially flat polyps, from CTC images. First, the colon wall mucosa was extracted via a partial volume segmentation approach as a volumetric layer, where the inner border of the colon wall can be obtained by shrinking the volumetric layer using level set based adaptive convolution. Then the outer border of the colon wall (or the colon wall serosa) was segmented via a combined implementation of geodesic active contour and the Mumford-Shah functional in a coarse-to-fine manner. Finally, the wall thickness was estimated along a unique path between the segmented inner and outer borders with consideration of the volumetric layers and was mapped onto a patient-specific three-dimensional (3D) colon wall model. The IPC detection results can usually be better visualized in a 2D image flattened from the 3D model, where abnormalities were detected by Z-score transformation of the thickness values. The proposed IPC detection approach was validated on 11 patients with 22 CTC scans, each scan having at least one flat polyp annotation. The presented pipeline was effective in detecting some flat polyps that were missed by our CADe system while keeping false detections at a relatively low level. This preliminary study indicates that the presented pipeline can be incorporated into an existing CADe system to enhance the polyp detection power, especially for flat polyps.
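
    The final detection step above applies a Z-score transformation to the mapped thickness values. A minimal sketch of that idea on invented thickness numbers (the threshold and data are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def zscore_outliers(thickness, z_thresh=2.0):
    """Flag thickness values whose Z-score exceeds a threshold.

    Stand-in for the Z-score step: candidates are locations where the
    wall thickness deviates strongly from the mean of the thickness map.
    """
    t = np.asarray(thickness, dtype=float)
    z = (t - t.mean()) / t.std()
    return z, z > z_thresh

# Mostly normal wall thickness (~3 mm) with one thickened spot.
thickness_map = np.array([3.0, 3.1, 2.9, 3.0, 3.2, 2.8, 3.1, 9.0])
z, flags = zscore_outliers(thickness_map)
```

    Only the 9.0 mm value is flagged; in the paper the same transform runs over the whole flattened 2D thickness image rather than a short vector.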

  9. Two-phase electro-hydrodynamic flow modeling by a conservative level set model.

    PubMed

    Lin, Yuan

    2013-03-01

    The principles of electro-hydrodynamic (EHD) flow have been known for more than a century and have been adopted for various industrial applications, for example, fluid mixing and demixing. Analytical solutions of such EHD flow exist only in a limited number of scenarios, for example, predicting a small deformation of a single droplet in a uniform electric field. Numerical modeling of such phenomena can provide significant insights into EHD multiphase flows. During the last decade, many numerical results have been reported that provide novel and useful tools for studying multiphase EHD flow. Based on a conservative level set method, the proposed model is able to simulate large deformations of a droplet by a steady electric field, which is beyond the regime of theoretical prediction. The model is validated for both leaky dielectrics and perfect dielectrics, and is found to be in excellent agreement with existing analytical solutions and numerical studies in the literature. Furthermore, simulations of the deformation of a water droplet in decyl alcohol in a steady electric field match published experimental data better than the theoretical prediction for large deformations. Therefore the proposed model can serve as a practical and accurate tool for simulating two-phase EHD flow.
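
    Conservative level set methods (in the Olsson-Kreiss spirit) replace the signed-distance function with a smeared Heaviside profile and re-initialize it in flux form, so the integral of the level set function — the "mass" of the marked phase — is preserved. A 1D toy sketch of one such re-initialization step, with all grid parameters invented for illustration; this is not the paper's 2D EHD solver.

```python
import numpy as np

# Smeared-Heaviside profile phi in [0, 1] marking the interval |x-0.5|<0.3.
n, L = 200, 1.0
dx = L / n
x = (np.arange(n) + 0.5) * dx
eps = 1.5 * dx                      # interface half-thickness parameter
phi = 0.5 * (1.0 + np.tanh((0.3 - np.abs(x - 0.5)) / (2 * eps)))

def reinit_step(phi, dx, eps, dtau):
    """One conservative re-initialization step in flux form:
    compression phi(1-phi)*n balanced against diffusion eps*phi_x."""
    grad = np.gradient(phi, dx)
    nrm = np.sign(grad)             # interface normal in 1D
    phi_f = 0.5 * (phi[1:] + phi[:-1])          # face-centered values
    n_f = 0.5 * (nrm[1:] + nrm[:-1])
    flux = phi_f * (1 - phi_f) * n_f - eps * (phi[1:] - phi[:-1]) / dx
    flux = np.concatenate(([0.0], flux, [0.0]))  # zero-flux boundaries
    return phi - dtau * (flux[1:] - flux[:-1]) / dx

mass0 = phi.sum() * dx
for _ in range(50):
    phi = reinit_step(phi, dx, eps, dtau=0.25 * dx)
mass1 = phi.sum() * dx
```

    Because the update is written as a difference of face fluxes with zero boundary flux, the total mass is conserved to machine precision — the property that distinguishes this formulation from standard signed-distance re-initialization.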

  10. Automated Robust Image Segmentation: Level Set Method Using Nonnegative Matrix Factorization with Application to Brain MRI.

    PubMed

    Dera, Dimah; Bouaynaya, Nidhal; Fathallah-Shaykh, Hassan M

    2016-07-01

    We address the problem of fully automated region discovery and robust image segmentation by devising a new deformable model based on the level set method (LSM) and probabilistic nonnegative matrix factorization (NMF). We describe the use of NMF to calculate the number of distinct regions in the image and to derive the local distribution of the regions, which is incorporated into the energy functional of the LSM. The results demonstrate that our NMF-LSM method is superior to other approaches when applied to synthetic binary and gray-scale images and to clinical magnetic resonance images (MRI) of the human brain with and without a malignant brain tumor, glioblastoma multiforme. In particular, the NMF-LSM method is fully automated, highly accurate, less sensitive to the initial selection of the contour(s) or initial conditions, more robust to noise and model parameters, and able to detect distinct regions as small as desired. These advantages stem from the fact that the proposed method relies on histogram information instead of intensity values and does not introduce nuisance model parameters. These properties provide a general approach for automated robust region discovery and segmentation in heterogeneous images. Compared with the retrospective radiological diagnoses of two patients with non-enhancing grade 2 and 3 oligodendroglioma, the NMF-LSM method detects earlier progression times and appears suitable for monitoring tumor response. The NMF-LSM method fills an important need for automated segmentation of clinical MRI.
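
    The NMF ingredient above factors a nonnegative data matrix into nonnegative parts. A minimal sketch using the standard Lee-Seung multiplicative updates on a synthetic matrix; the data and rank are invented, and this is the plain Frobenius-norm NMF, not the paper's probabilistic variant.

```python
import numpy as np

def nmf(V, k, n_iter=2000, seed=0):
    """Lee-Seung multiplicative updates for V ~= W @ H, all entries >= 0."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + 0.1
    H = rng.random((k, n)) + 0.1
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)   # update H, keep >= 0
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)   # update W, keep >= 0
    return W, H

# Toy "histogram" matrix built exactly from two nonnegative sources,
# standing in for the region histograms NMF-LSM feeds into the LSM energy.
rng = np.random.default_rng(1)
V = rng.random((30, 2)) @ rng.random((2, 8))
W, H = nmf(V, k=2)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

    The multiplicative form of the updates guarantees that W and H stay nonnegative at every iteration, which is why no explicit projection step is needed.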

  11. DSA Image Blood Vessel Skeleton Extraction Based on Anti-concentration Diffusion and Level Set Method

    NASA Astrophysics Data System (ADS)

    Xu, Jing; Wu, Jian; Feng, Daming; Cui, Zhiming

    Serious vascular diseases such as carotid stenosis, aneurysm and vascular malformation may lead to stroke, the third leading cause of death and the number one cause of disability. In the clinical diagnosis and treatment of cerebral vascular diseases, effective detection and description of the vascular structure in two-dimensional angiography sequence images — that is, blood vessel skeleton extraction — has long been a difficult problem. This paper discusses two-dimensional blood vessel skeleton extraction based on the level set method. First, the DSA image is preprocessed: an anti-concentration diffusion model is used for effective enhancement, and an improved Otsu local thresholding technique based on regional division is used for binarization. Then, vascular skeleton extraction based on the GMM (group marching method) with fast sweeping is carried out. Experiments show that our approach not only improves the time complexity but also achieves good extraction results.
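
    The binarization step above relies on Otsu thresholding. A compact global-Otsu sketch is shown below — the paper uses an improved local variant over regional divisions, and the bimodal test data here are invented.

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Otsu's method: pick the threshold maximizing between-class variance."""
    hist, edges = np.histogram(img, bins=nbins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                   # probability of class 0 up to each bin
    mu = np.cumsum(p * centers)         # cumulative mean
    mu_t = mu[-1]
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    sigma_b = np.zeros(nbins)
    sigma_b[valid] = (mu_t * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(sigma_b)]

# Bimodal test image: dark background around 50, bright vessels around 200.
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(50, 10, 5000), rng.normal(200, 10, 1000)])
t = otsu_threshold(img)
```

    The chosen threshold falls in the valley between the two intensity modes, separating background from vessel pixels.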

  12. Cerebral Arteries Extraction using Level Set Segmentation and Adaptive Tracing for CT Angiography

    SciTech Connect

    Zhang Yong; Zhou Xiaobo; Srinivasan, Ranga; Wong, Stephen T. C.; Young, Geoff

    2007-11-02

    We propose an approach for extracting cerebral arteries from partial Computed Tomography Angiography (CTA). The challenges of extracting cerebral arteries from CTA come from the fact that arteries are usually surrounded by bones and veins in the lower portion of a CTA volume. There exists strong intensity-value overlap between vessels and surrounding objects. Besides, it is inappropriate to assume that the 2D cross sections of arteries are circles or ellipses, especially for abnormal vessels. The course of the arteries can change suddenly in 3D space. In this paper, a method based on level set segmentation is proposed to target this challenging problem. For the lower portion of a CTA volume, we use the geodesic active contour method to detect cross sections of arteries in 2D. The medial axis of the artery is obtained by adaptively tracking along its course. This is done by finding the minimal cross section obtained by cutting the arteries under different angles in 3D spherical space. The method is highly automated, with the minimal user input of providing only the starting point and initial direction of the arteries of interest.

  13. Automatic Four-Chamber Segmentation Using Level-Set Method and Split Energy Function

    PubMed Central

    Kang, Ho Chul; Shin, Juneseuk

    2016-01-01

    Objectives In this paper, we present an automatic method to segment the four chambers by extracting the whole heart, separating the left and right sides of the heart, and splitting the atrium and ventricle regions of each side efficiently in cardiac computed tomography angiography (CTA). Methods We smooth the images by applying filters to remove noise. Next, the volume of interest is detected by using k-means clustering. In this step, the whole heart is coarsely extracted, and it is used for seed volumes in the next step. Then, we detect seed volumes using a geometric analysis based on anatomical information and separate the left and right heart regions with the power watershed algorithm. Finally, we refine the left and right sides of the heart using the level-set method, and extract the atrium and ventricle from the left and right heart regions using the split energy function. Results We tested the proposed heart segmentation method using 20 clinical scan datasets acquired from various patients. To validate the proposed heart segmentation method, we evaluated its accuracy in segmenting the four chambers based on four error evaluation metrics. The average differences between the manual and automatic segmentations were less than approximately 3.3%. Conclusions The proposed method extracts the four chambers of the heart accurately, demonstrating that this approach can assist the cardiologist. PMID:27895960
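
    The volume-of-interest step above relies on k-means clustering. A minimal Lloyd's-algorithm sketch on synthetic 2D points (invented data, not the paper's CTA features):

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Plain Lloyd's algorithm: alternate nearest-center assignment
    and centroid update."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Two well-separated clusters, mimicking a coarse heart-vs-background split.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (100, 2)), rng.normal(5, 0.3, (100, 2))])
labels, centers = kmeans(X, k=2)
```

    With well-separated blobs the assignment stabilizes after a few sweeps; in the paper the same idea runs on voxel features to isolate the heart region coarsely.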

  14. Numerical Simulation of Two-phase flow with Phase Change Using the Level-set Method

    NASA Astrophysics Data System (ADS)

    Li, Hongying; Lou, Jing; Pan, Lunsheng; Yap, Yitfatt

    2016-11-01

    Multiphase flow with phase change is widely encountered in many engineering applications. A distinct feature of these applications is the phase transition from one phase to another due to a non-uniform temperature distribution. Such processes generally release or absorb a large amount of energy, with mass transfer occurring simultaneously, and occasionally demand great caution, for example because of the high pressure generated by evaporation. This article presents a numerical model for the simulation of two-fluid flow with phase change. Of the two fluids, one changes its state due to phase change. Such a problem involves two substances with three phases as well as two different interfaces, i.e., the interface between the two substances and the interface of one substance between its two phases. Two level-set functions are used to capture the two interfaces. The model is validated against one-dimensional and two-dimensional liquid evaporation. With the code validated, it is applied to different phase change problems, including (1) a falling evaporating droplet and a rising bubble and (2) two-fluid stratified flow with solidification of one fluid. For the first case, comparisons of the bubble and droplet topologies and the flow and temperature fields are made between the falling evaporating droplet and a falling droplet without evaporation. For the second demonstration case, the effect of the superheat temperature on the solidification process is investigated.

  15. Intracerebral Transplants and Memory Dysfunction: Circuitry Repair or Functional Level Setting?

    PubMed Central

    Will, Bruno; Kelche, Christian; Cassel, Jean-Christophe

    2000-01-01

    Intracerebral grafting techniques of fetal neural cells have been used essentially with two main types of lesion paradigms, namely damage to long projection systems, in which the source and the target are clearly separate, and damage to neurons that are involved in local circuits within a small (sub)region of the brain. With the first lesion paradigm, grafts placed homotopically (in the source) are not appropriate because their fibers grow poorly through the host parenchyma and fail to reach their normal target. To be successful, the grafts must be placed ectopically in the target region of the damaged projection systems, where generally they work as level-setting systems. Conversely, with the second paradigm, the grafts are supposed to compensate for a local loss of neurons and must be placed homotopically to induce functional effects that are based on the reconstruction of a point-to-point circuitry. By inserting a biological or artificial bridging-substrate between the source and the target of long projection systems, it might be possible to combine the positive effects of both homotopic and ectopic grafting by achieving both target reinnervation and normal control of the grafted neurons within the source area. These issues are illustrated and discussed in this review. PMID:10709217

  16. An adaptive multiresolution gradient-augmented level set method for advection problems

    NASA Astrophysics Data System (ADS)

    Schneider, Kai; Kolomenskiy, Dmitry; Nave, Jean-Christophe

    2014-11-01

    Advection problems are encountered in many applications, such as the transport of passive scalars modeling pollution or mixing in chemical engineering. In some problems, the solution develops small-scale features localized in a part of the computational domain. If the location of these features changes in time, the efficiency of the numerical method can be significantly improved by adapting the partition dynamically to the solution. We present a space-time adaptive scheme for solving advection equations in two space dimensions. The third-order accurate gradient-augmented level set method, using a semi-Lagrangian formulation with backward time integration, is coupled with a point-value multiresolution analysis using Hermite interpolation. Locally refined dyadic spatial grids are thus introduced and efficiently implemented with dynamic quad-tree data structures. For adaptive time integration, an embedded Runge-Kutta method is employed. The precision of the new fully adaptive method is analysed, and the speed-up in CPU time and the memory compression with respect to a uniform grid discretization are reported.
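
    The backbone of the scheme above is semi-Lagrangian advection: trace each grid node backward along the velocity and interpolate the old field at the departure point. A 1D periodic sketch with linear interpolation (the gradient-augmented method uses Hermite interpolation and also transports gradient data; the grid, velocity and profile here are invented):

```python
import numpy as np

def semi_lagrangian_step(phi, u, dt, dx):
    """One semi-Lagrangian step for phi_t + u*phi_x = 0 on a periodic grid:
    trace each node back to its departure point, interpolate linearly."""
    n = len(phi)
    nodes = np.arange(n) * dx
    s = ((nodes - u * dt) % (n * dx)) / dx   # departure points, in grid units
    i0 = np.floor(s).astype(int)
    w = s - i0                               # linear interpolation weight
    i0 %= n
    i1 = (i0 + 1) % n
    return (1 - w) * phi[i0] + w * phi[i1]

n, u = 256, 1.0
dx = 1.0 / n
x = np.arange(n) * dx
phi = np.exp(-((x - 0.3) ** 2) / 0.002)      # Gaussian bump at x = 0.3
dt = 2.0 * dx                                # CFL = 2: fine for semi-Lagrangian
for _ in range(32):                          # total displacement 32*u*dt = 0.25
    phi = semi_lagrangian_step(phi, u, dt, dx)
```

    Because the departure point is found by backward tracing rather than by a flux balance, the time step is not bound by the CFL condition — one of the main attractions of semi-Lagrangian schemes.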

  17. Coupled Segmentation of Nuclear and Membrane-bound Macromolecules through Voting and Multiphase Level Set.

    PubMed

    Chang, Hang; Wen, Quan; Parvin, Bahram

    2015-03-01

    Membrane-bound macromolecules play an important role in tissue architecture and cell-cell communication, and are regulated by almost one-third of the genome. At the optical scale, one group of membrane proteins express themselves as linear structures along the cell surface boundaries, while others are sequestered; this paper targets the former group. Segmentation of these membrane proteins on a cell-by-cell basis enables the quantitative assessment of localization for comparative analysis. However, such membrane proteins typically lack continuity, and their intensity distributions are often very heterogeneous; moreover, nuclei can form large clumps, which further impedes the quantification of membrane signals on a cell-by-cell basis. To tackle these problems, we introduce a three-step process to (i) regularize the membrane signal through iterative tangential voting, (ii) constrain the location of surface proteins by nuclear features, where clumps of nuclei are segmented through a Delaunay triangulation approach, and (iii) assign membrane-bound macromolecules to individual cells through an application of multiphase geodesic level sets. We have validated our method using both synthetic data and a dataset of 200 images, and demonstrate the efficacy of our approach with superior performance.

  18. Coupled Segmentation of Nuclear and Membrane-bound Macromolecules through Voting and Multiphase Level Set

    PubMed Central

    Wen, Quan

    2014-01-01

    Membrane-bound macromolecules play an important role in tissue architecture and cell-cell communication, and are regulated by almost one-third of the genome. At the optical scale, one group of membrane proteins express themselves as linear structures along the cell surface boundaries, while others are sequestered; this paper targets the former group. Segmentation of these membrane proteins on a cell-by-cell basis enables the quantitative assessment of localization for comparative analysis. However, such membrane proteins typically lack continuity, and their intensity distributions are often very heterogeneous; moreover, nuclei can form large clumps, which further impedes the quantification of membrane signals on a cell-by-cell basis. To tackle these problems, we introduce a three-step process to (i) regularize the membrane signal through iterative tangential voting, (ii) constrain the location of surface proteins by nuclear features, where clumps of nuclei are segmented through a Delaunay triangulation approach, and (iii) assign membrane-bound macromolecules to individual cells through an application of multiphase geodesic level sets. We have validated our method using both synthetic data and a dataset of 200 images, and demonstrate the efficacy of our approach with superior performance. PMID:25530633

  19. Semi-Automated Detection of Surface Degradation on Bridges Based on a Level Set Method

    NASA Astrophysics Data System (ADS)

    Masiero, A.; Guarnieri, A.; Pirotti, F.; Vettore, A.

    2015-08-01

    Due to the effect of climate factors, natural phenomena and human usage, buildings and infrastructure are subject to progressive degradation. The deterioration of these structures has to be monitored in order to avoid hazards for human beings and for the natural environment in their neighborhood. Hence, on the one hand, monitoring such infrastructure is of primary importance. On the other hand, nowadays this monitoring effort is mostly carried out by expert and skilled personnel, who follow the overall data acquisition, analysis and result reporting process, making the whole monitoring procedure quite expensive for public (and private) agencies. This paper proposes the use of a partially user-assisted procedure in order to reduce the monitoring cost and to make the obtained results less subjective as well. The developed method relies on the use of images acquired with standard cameras, even by inexperienced personnel. Deterioration on the infrastructure surface is detected by image segmentation based on a level set method. The results of the semi-automated analysis procedure are remapped onto a 3D model of the infrastructure obtained by means of a terrestrial laser scanning acquisition. The proposed method has been successfully tested on a portion of a road bridge in Perarolo di Cadore (BL), Italy.

  20. Analysis of adequacy levels for human resources improvement within primary health care framework in Africa

    PubMed Central

    Parent, Florence; Fromageot, Audrey; Coppieters, Yves; Lejeune, Colette; Lemenu, Dominique; Garant, Michèle; Piette, Danielle; Levêque, Alain; De Ketele, Jean-Marie

    2005-01-01

    Human resources in the health care systems of sub-Saharan Africa generally show a lack of adequacy between the skills expected of professionals and the health care needs expressed by the populations. It is, however, possible to analyse these various inadequacies in human resource management and their determinants to enhance the effectiveness of the health care system. From two projects focused on nurse professionals within the health care system in Central Africa, we present an analytic grid for adequacy levels looking into the following aspects:
    - adequacy between skills-based profiles for health system professionals, quality of care and service delivery (health care system/medical standards), and the needs and expectations of the populations,
    - adequacy between the allocation of health system professionals, quality of care and services delivered (health care system/medical standards), and the needs and expectations of the populations,
    - adequacy between human resource management within the health care system and medical standards,
    - adequacy between human resource management within education/teaching/training and the needs of the health care and education sectors,
    - adequacy between basic and ongoing education and the realities of the tasks expected of and implemented by the different categories of professionals within the health care system,
    - adequacy between the intentions of initial and ongoing training and teaching programs in health sciences for trainers (teachers/supervisors/health care system professionals/directors (teaching managers) of schools...).
    This tool is necessary for decision-makers as well as for health care system professionals who share common objectives for change at each level of intervention within the health system. Setting this adequacy implies interdisciplinary and participative approaches for the actors concerned in order to provide an overall vision of a system broader than the health district, small island with self-rationality, and in which

  1. Using Economic Evidence to Set Healthcare Priorities in Low‐Income and Lower‐Middle‐Income Countries: A Systematic Review of Methodological Frameworks

    PubMed Central

    Mitton, Craig; Doyle‐Waters, Mary M.; Drake, Tom; Conteh, Lesong; Newall, Anthony T.; Onwujekwe, Obinna; Jan, Stephen

    2016-01-01

    Abstract Policy makers in low‐income and lower‐middle‐income countries (LMICs) are increasingly looking to develop ‘evidence‐based’ frameworks for identifying priority health interventions. This paper synthesises and appraises the literature on methodological frameworks – which incorporate economic evaluation evidence – for the purpose of setting healthcare priorities in LMICs. A systematic search of Embase, MEDLINE, Econlit and PubMed identified 3968 articles with a further 21 articles identified through manual searching. A total of 36 papers were eligible for inclusion. These covered a wide range of health interventions with only two studies including health systems strengthening interventions related to financing, governance and human resources. A little under half of the studies (39%) included multiple criteria for priority setting, most commonly equity, feasibility and disease severity. Most studies (91%) specified a measure of ‘efficiency’ defined as cost per disability‐adjusted life year averted. Ranking of health interventions using multi‐criteria decision analysis and generalised cost‐effectiveness were the most common frameworks for identifying priority health interventions. Approximately a third of studies discussed the affordability of priority interventions. Only one study identified priority areas for the release or redeployment of resources. The paper concludes by highlighting the need for local capacity to conduct evaluations (including economic analysis) and empowerment of local decision‐makers to act on this evidence. PMID:26804361
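
    Most of the reviewed frameworks rank interventions by the "efficiency" measure of cost per DALY averted. A trivial sketch of such a ranking; all intervention names and numbers below are invented for illustration, not taken from the review.

```python
# Each entry: (name, total cost, DALYs averted) — hypothetical values.
interventions = [
    ("bednets",     120_000.0, 4_000.0),
    ("vaccination",  90_000.0, 4_500.0),
    ("screening",   200_000.0, 2_500.0),
]

# Cost per DALY averted; lower means more "efficient".
cost_per_daly = {name: cost / dalys for name, cost, dalys in interventions}

# Rank interventions from most to least cost-effective.
ranked = sorted(interventions, key=lambda t: t[1] / t[2])
```

    Real frameworks layer further criteria on top of this single ratio — equity, feasibility, disease severity, affordability — which is what the multi-criteria decision analysis approaches in the review formalize.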

  2. Using Economic Evidence to Set Healthcare Priorities in Low-Income and Lower-Middle-Income Countries: A Systematic Review of Methodological Frameworks.

    PubMed

    Wiseman, Virginia; Mitton, Craig; Doyle-Waters, Mary M; Drake, Tom; Conteh, Lesong; Newall, Anthony T; Onwujekwe, Obinna; Jan, Stephen

    2016-02-01

    Policy makers in low-income and lower-middle-income countries (LMICs) are increasingly looking to develop 'evidence-based' frameworks for identifying priority health interventions. This paper synthesises and appraises the literature on methodological frameworks--which incorporate economic evaluation evidence--for the purpose of setting healthcare priorities in LMICs. A systematic search of Embase, MEDLINE, Econlit and PubMed identified 3968 articles with a further 21 articles identified through manual searching. A total of 36 papers were eligible for inclusion. These covered a wide range of health interventions with only two studies including health systems strengthening interventions related to financing, governance and human resources. A little under half of the studies (39%) included multiple criteria for priority setting, most commonly equity, feasibility and disease severity. Most studies (91%) specified a measure of 'efficiency' defined as cost per disability-adjusted life year averted. Ranking of health interventions using multi-criteria decision analysis and generalised cost-effectiveness were the most common frameworks for identifying priority health interventions. Approximately a third of studies discussed the affordability of priority interventions. Only one study identified priority areas for the release or redeployment of resources. The paper concludes by highlighting the need for local capacity to conduct evaluations (including economic analysis) and empowerment of local decision-makers to act on this evidence.

  3. Extrapolation of G0W0 energy levels from small basis sets for elements from H to Cl

    NASA Astrophysics Data System (ADS)

    Zhu, Tong; Blum, Volker

    G0W0 calculations based on orbitals from a density-functional theory reference are widely used to predict carrier levels in molecular and inorganic materials. Their computational feasibility, however, is limited by the need to evaluate slow-converging sums over unoccupied states, requiring large basis sets paired with unfavorable scaling exponents to evaluate the self-energy. In the quantum chemistry literature, complete basis set (CBS) extrapolation strategies have been used successfully to overcome this problem for total energies. We here apply the principle of basis set extrapolation to G0W0 energy levels. For a set of 49 small molecules and clusters containing the elements H, Li through F, and Na through Cl, we test established extrapolation strategies based on Dunning's correlation-consistent (cc) basis sets (aug)-cc-pVNZ (N=2-5), as well as numeric atom-centered NAO-VCC-nZ (n=2-5) basis sets in the FHI-aims all-electron code. For the occupied and lowest unoccupied levels, different extrapolation strategies agree within +/-50 meV when based on the large 4Z and 5Z basis sets. We show that extrapolation from the much smaller 2Z and 3Z basis sets is possible with largest errors of +/-100 meV, based on a refinement of the NAO-VCC-nZ basis sets.
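
    The standard two-point CBS extrapolation assumes the energy approaches its limit as E(N) = E_CBS + A/N^3 in the basis-set cardinal number N, so two calculations at different N determine E_CBS exactly. A sketch on synthetic numbers (the energies below are made up, not data from this work):

```python
def cbs_extrapolate(e_m, m, e_n, n):
    """Two-point extrapolation to the complete-basis-set limit,
    assuming E(N) = E_CBS + A / N**3 with cardinal numbers m < n."""
    return (e_n * n**3 - e_m * m**3) / (n**3 - m**3)

# Synthetic energy levels (eV) generated exactly from E_CBS = -7.20, A = 1.5.
e_cbs_true, a = -7.20, 1.5
e3 = e_cbs_true + a / 3**3   # hypothetical "3Z" value
e4 = e_cbs_true + a / 4**3   # hypothetical "4Z" value
e_cbs = cbs_extrapolate(e3, 3, e4, 4)
```

    On real data the 1/N^3 form holds only approximately, which is why the abstract compares extrapolations from (4Z, 5Z) against those from the cheaper (2Z, 3Z) pairs.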

  4. Modeling Primary Breakup: A Three-Dimensional Eulerian Level Set/Vortex Sheet Method for Two-Phase Interface Dynamics

    NASA Technical Reports Server (NTRS)

    Herrmann, M.

    2003-01-01

    This paper is divided into four parts. First, the level set/vortex sheet method for three-dimensional two-phase interface dynamics is presented. Second, the LSS model for the primary breakup of turbulent liquid jets and sheets is outlined and all terms requiring subgrid modeling are identified. Then, preliminary three-dimensional results of the level set/vortex sheet method are presented and discussed. Finally, conclusions are drawn and an outlook to future work is given.

  5. Fostering Multirepresentational Levels of Chemical Concepts: A Framework to Develop Educational Software

    ERIC Educational Resources Information Center

    Marson, Guilherme A.; Torres, Bayardo B.

    2011-01-01

    This work presents a convenient framework for developing interactive chemical education software to facilitate the integration of macroscopic, microscopic, and symbolic dimensions of chemical concepts--specifically, via the development of software for gel permeation chromatography. The instructional role of the software was evaluated in a study…

  6. Toward accurate tooth segmentation from computed tomography images using a hybrid level set model

    SciTech Connect

    Gan, Yangzhou; Zhao, Qunfei; Xia, Zeyang E-mail: jing.xiong@siat.ac.cn; Hu, Ying; Xiong, Jing E-mail: jing.xiong@siat.ac.cn; Zhang, Jianwei

    2015-01-15

    Purpose: A three-dimensional (3D) model of the teeth provides important information for orthodontic diagnosis and treatment planning. Tooth segmentation is an essential step in generating the 3D digital model from computed tomography (CT) images. The aim of this study is to develop an accurate and efficient tooth segmentation method from CT images. Methods: The 3D dental CT volumetric images are segmented slice by slice in a two-dimensional (2D) transverse plane. The 2D segmentation is composed of a manual initialization step and an automatic slice-by-slice segmentation step. In the manual initialization step, the user manually picks a starting slice and selects a seed point for each tooth in this slice. In the automatic slice segmentation step, a developed hybrid level set model is applied to segment tooth contours from each slice. A tooth contour propagation strategy is employed to initialize the level set function automatically. Cone beam CT (CBCT) images of two subjects were used to tune the parameters. Images of 16 additional subjects were used to validate the performance of the method. Volume overlap metrics and surface distance metrics were adopted to assess the segmentation accuracy quantitatively. The volume overlap metrics were volume difference (VD, mm³) and Dice similarity coefficient (DSC, %). The surface distance metrics were average symmetric surface distance (ASSD, mm), root mean square (RMS) symmetric surface distance (RMSSSD, mm), and maximum symmetric surface distance (MSSD, mm). Computation time was recorded to assess the efficiency. The performance of the proposed method has been compared with two state-of-the-art methods. Results: For the tested CBCT images, the VD, DSC, ASSD, RMSSSD, and MSSD for the incisor were 38.16 ± 12.94 mm³, 88.82 ± 2.14%, 0.29 ± 0.03 mm, 0.32 ± 0.08 mm, and 1.25 ± 0.58 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the canine were 49.12 ± 9.33 mm³, 91.57 ± 0.82%, 0.27 ± 0.02 mm, 0
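The two volume overlap metrics above are computed directly from binary segmentation masks; a minimal sketch (masks flattened to 1-D sequences for brevity, voxel volume assumed known):

```python
def overlap_metrics(mask_a, mask_b, voxel_volume_mm3=1.0):
    """Volume difference (VD, mm^3) and Dice similarity coefficient (DSC, %)
    between two binary segmentations given as flat 0/1 sequences."""
    a = sum(mask_a)                                   # voxels in mask A
    b = sum(mask_b)                                   # voxels in mask B
    inter = sum(x and y for x, y in zip(mask_a, mask_b))  # shared voxels
    vd = abs(a - b) * voxel_volume_mm3
    dsc = 100.0 * 2 * inter / (a + b)
    return vd, dsc
```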

  7. Birth choices in Timor-Leste: a framework for understanding the use of maternal health services in low resource settings.

    PubMed

    Wild, Kayli; Barclay, Lesley; Kelly, Paul; Martins, Nelson

    2010-12-01

    The high rate of maternal mortality in Timor-Leste is a persistent problem which has been exacerbated by the long history of military occupation and ongoing political crises since independence in 1999. It is similar to other developing countries where there have been slow declines in maternal mortality despite 20 years of Safe Motherhood interventions. The national Ministry of Health, United Nations (UN) agencies and non-government organisations (NGOs) have attempted to reduce maternal mortality by enacting policies and interventions to increase the number of births in health centres and hospitals. Despite considerable effort in promoting facility-based delivery, most Timorese women birth at home and the lack of midwives means few women have access to a skilled birth attendant. This paper investigates factors influencing access to and use of maternal health services in rural areas of Timor-Leste. It draws on 21 interviews and 11 group discussions with Timorese women and their families collected over two periods of fieldwork, one month in September 2006 and five months from July to December 2007. Theoretical concepts from anthropology and health social science are used to explore individual, social, political and health system issues which affect the way in which maternal health services are utilised. In drawing together a range of theories this paper aims to extend explanations around access to maternal health services in developing countries. An empirically informed framework is proposed which illustrates the complex factors that influence women's birth choices. This framework can be used by policy-makers, practitioners, donors and researchers to think critically about policy decisions and where investments can have the most impact for improving maternal health in Timor-Leste and elsewhere.

  8. CT liver volumetry using geodesic active contour segmentation with a level-set algorithm

    NASA Astrophysics Data System (ADS)

    Suzuki, Kenji; Epstein, Mark L.; Kohlbrenner, Ryan; Obajuluwa, Ademola; Xu, Jianwu; Hori, Masatoshi; Baron, Richard

    2010-03-01

    Automatic liver segmentation on CT images is challenging because the liver often abuts other organs of a similar density. Our purpose was to develop an accurate automated liver segmentation scheme for measuring liver volumes. We developed an automated volumetry scheme for the liver in CT based on a five-step schema. First, an anisotropic smoothing filter was applied to portal-venous phase CT images to remove noise while preserving the liver structure, followed by an edge enhancer to enhance the liver boundary. By using the boundary-enhanced image as a speed function, a fast-marching algorithm generated an initial surface that roughly estimated the liver shape. A geodesic-active-contour segmentation algorithm coupled with level-set contour evolution refined the initial surface so as to more precisely fit the liver boundary. The liver volume was calculated based on the refined liver surface. Hepatic CT scans of eighteen prospective liver donors were obtained under a liver transplant protocol with a multi-detector CT system. Automated liver volumes obtained were compared with those manually traced by a radiologist, used as the "gold standard." The mean liver volume obtained with our scheme was 1,520 cc, whereas the mean manual volume was 1,486 cc, with a mean absolute difference of 104 cc (7.0%). CT liver volumetrics based on our automated scheme agreed excellently with the "gold-standard" manual volumetrics (intra-class correlation coefficient of 0.95), with no statistically significant difference (p(F<=f)=0.32), and required substantially less completion time. Our automated scheme provides an efficient and accurate way of measuring liver volumes.

  9. A unified EM approach to bladder wall segmentation with coupled level-set constraints

    PubMed Central

    Han, Hao; Li, Lihong; Duan, Chaijie; Zhang, Hao; Zhao, Yang; Liang, Zhengrong

    2013-01-01

    Magnetic resonance (MR) imaging-based virtual cystoscopy (VCys), as a non-invasive, safe and cost-effective technique, has shown promise for early diagnosis and recurrence management of bladder carcinoma. One primary goal of VCys is to identify bladder lesions with abnormal bladder wall thickness, and consequently a precise segmentation of the inner and outer borders of the wall is required. In this paper, we propose a unified expectation-maximization (EM) approach to the maximum-a-posteriori (MAP) solution of bladder wall segmentation, by integrating a novel adaptive Markov random field (AMRF) model and the coupled level-set (CLS) information into the prior term. The proposed approach is applied to the segmentation of T1-weighted MR images, where the wall is enhanced while the urine and surrounding soft tissues are suppressed. By introducing scale-adaptive neighborhoods as well as adaptive weights into the conventional MRF model, the AMRF model takes the local information into account more accurately. In order to mitigate the influence of image artifacts adjacent to the bladder wall and to preserve the continuity of the wall surface, we apply geometrical constraints on the wall using our previously developed CLS method. This paper not only evaluates the robustness of the presented approach against the known ground truth of simulated digital phantoms, but further compares its performance with our previous CLS approach via both volunteer and patient studies. Statistical analysis of experts’ scores of the segmented borders from both approaches demonstrates that our new scheme is more effective in extracting the bladder wall. Based on the wall thickness calibrated from the segmented single-layer borders, a three-dimensional virtual bladder model can be constructed and the wall thickness can be mapped onto the model, where bladder lesions will eventually be detected via experts’ visualization and/or computer-aided detection. PMID:24001932
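The EM machinery at the core of such an approach can be illustrated on a toy 1-D intensity distribution with two Gaussian classes (a generic maximum-likelihood EM sketch; the paper's actual MAP formulation adds the AMRF and CLS prior terms, which are omitted here):

```python
import math

def em_two_gaussians(data, iters=50):
    """Fit a two-component 1-D Gaussian mixture by EM.

    No spatial prior is used -- this is a stripped-down stand-in for the
    MAP formulation described in the abstract."""
    mu = [min(data), max(data)]          # crude deterministic initialisation
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each sample
        resp = []
        for x in data:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate means, variances, and mixing weights
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
            pi[k] = nk / len(data)
    return mu, var, pi
```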

  10. Level-set surface segmentation and registration for computing intrasurgical deformations

    NASA Astrophysics Data System (ADS)

    Audette, Michel A.; Peters, Terence M.

    1999-05-01

    We propose a method for estimating intrasurgical brain shift for image-guided surgery. This method consists of five stages: the identification of relevant anatomical surfaces within the MRI/CT volume, range-sensing of the skin and cortex in the OR, rigid registration of the skin range image with its MRI/CT homologue, non-rigid motion tracking over time of cortical range images, and lastly, interpolation of this surface displacement information over the whole brain volume via a realistically valued finite element model of the head. This paper focuses on the anatomical surface identification and cortical range surface tracking problems. The surface identification scheme implements a recent algorithm which embeds 3D surface segmentation as the level-set of a 4D moving front. A by-product of this stage is a Euclidean distance and closest point map which is later exploited to speed up the rigid and non-rigid surface registration. The range sensor uses both laser-based triangulation and defocusing techniques to produce a 2D range profile, and is linearly swept across the skin or cortical surface to produce a 3D range image. The surface registration technique is of the iterative closest point type, where each iteration benefits from looking up, rather than searching for, explicit closest point pairs. These explicit point pairs in turn are used in conjunction with a closed-form SVD-based rigid transformation computation and with fast recursive splines to make each rigid and non-rigid registration iteration essentially instantaneous. Our method is validated with a novel deformable brain-shaped phantom made of Polyvinyl Alcohol Cryogel.
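The closed-form rigid-transformation step can be sketched in 2-D, where the SVD-based computation reduces to a single rotation angle (an illustrative 2-D analogue of the 3-D computation; point pairs are assumed already matched by the closest-point lookup):

```python
import math

def rigid_register_2d(src, dst):
    """Least-squares rotation + translation mapping matched 2-D point pairs
    src onto dst. Returns (theta, tx, ty) such that
    dst ~= R(theta) @ src + (tx, ty)."""
    n = len(src)
    cs = [sum(p[i] for p in src) / n for i in (0, 1)]   # source centroid
    cd = [sum(p[i] for p in dst) / n for i in (0, 1)]   # target centroid
    sxx = syx = 0.0
    for (ax, ay), (bx, by) in zip(src, dst):
        ax, ay, bx, by = ax - cs[0], ay - cs[1], bx - cd[0], by - cd[1]
        sxx += ax * bx + ay * by   # sum of dot products (centered)
        syx += ax * by - ay * bx   # sum of cross products (centered)
    theta = math.atan2(syx, sxx)   # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = cd[0] - (c * cs[0] - s * cs[1])
    ty = cd[1] - (s * cs[0] + c * cs[1])
    return theta, tx, ty
```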

  11. Analysis and improvement of data-set level file distribution in Disk Pool Manager

    NASA Astrophysics Data System (ADS)

    Cadellin Skipsey, Samuel; Purdie, Stuart; Britton, David; Mitchell, Mark; Bhimji, Wahid; Smith, David

    2014-06-01

    Of the three most widely used implementations of the WLCG Storage Element specification, Disk Pool Manager[1, 2] (DPM) has the simplest implementation of file placement balancing (StoRM doesn't attempt this, leaving it up to the underlying filesystem, which can be very sophisticated in itself). DPM uses a round-robin algorithm (with optional filesystem weighting) for placing files across filesystems and servers. This does a reasonable job of evenly distributing files across the storage array provided to it. However, it does not offer any guarantees of the evenness of distribution of that subset of files associated with a given "dataset" (which often maps onto a "directory" in the DPM namespace (DPNS)). It is useful to consider a concept of "balance", where an optimally balanced set of files indicates that the files are distributed evenly across all of the pool nodes. At best, the round-robin algorithm maintains balance; it has no mechanism to improve it. In the past year or more, larger DPM sites have noticed load spikes on individual disk servers, and suspected that these were exacerbated by excesses of files from popular datasets on those servers. We present here a software tool which analyses file distribution for all datasets in a DPM SE, providing a measure of the poorness of file location in this context. Further, the tool provides a list of file movement actions which will improve dataset-level file distribution, and can action those file movements itself. We present results of such an analysis on the UKI-SCOTGRID-GLASGOW Production DPM.
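A minimal version of such a per-dataset balance analysis might look like the following (the imbalance score and the move heuristic are our illustrative assumptions, not the tool's actual metric; note also that servers holding no file of the dataset are invisible to this sketch):

```python
from collections import Counter

def dataset_balance(replica_servers):
    """replica_servers: one server name per file of a single dataset.

    Returns (imbalance, move): imbalance is the spread between the most and
    least loaded servers (0 or 1 ~ as even as round-robin can make it), and
    move suggests relocating one file from the hottest to the coldest server."""
    counts = Counter(replica_servers)
    hot = max(counts, key=counts.get)
    cold = min(counts, key=counts.get)
    imbalance = counts[hot] - counts[cold]
    return imbalance, ((hot, cold) if imbalance > 1 else None)
```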

  12. Multiatlas segmentation of thoracic and abdominal anatomy with level set-based local search.

    PubMed

    Schreibmann, Eduard; Marcus, David M; Fox, Tim

    2014-07-08

    Segmentation of organs at risk (OARs) remains one of the most time-consuming tasks in radiotherapy treatment planning. Atlas-based segmentation methods using single templates have emerged as a practical approach to automate the process for brain or head and neck anatomy, but pose significant challenges in regions where large interpatient variations are present. We show that significant changes are needed to autosegment thoracic and abdominal datasets by combining multi-atlas deformable registration with a level set-based local search. Segmentation is hierarchical, with a first stage detecting bulk organ location, and a second step adapting the segmentation to fine details present in the patient scan. The first stage is based on warping multiple presegmented templates to the new patient anatomy using a multimodality deformable registration algorithm able to cope with changes in scanning conditions and artifacts. These segmentations are compacted into a probabilistic map of organ shape using the STAPLE algorithm. Final segmentation is obtained by adjusting the probability map for each organ type, using customized combinations of delineation filters exploiting prior knowledge of organ characteristics. Validation is performed by comparing automated and manual segmentation using the Dice coefficient, measured at an average of 0.971 for the aorta, 0.869 for the trachea, 0.958 for the lungs, 0.788 for the heart, 0.912 for the liver, 0.884 for the kidneys, 0.888 for the vertebrae, 0.863 for the spleen, and 0.740 for the spinal cord. Accurate atlas segmentation for abdominal and thoracic regions can be achieved using a multi-atlas and per-structure refinement strategy. To improve clinical workflow and efficiency, the algorithm was embedded in a software service that applies it automatically to acquired scans without any user interaction.
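The fusion of multiple warped atlas segmentations into a probability map, and the per-organ threshold adjustment, can be illustrated with a simple per-voxel average and threshold (a majority-vote stand-in; STAPLE itself additionally weights each atlas by its estimated performance, which is omitted here):

```python
def fuse_segmentations(masks, threshold=0.5):
    """masks: binary atlas segmentations as flat 0/1 lists, one per atlas.

    Returns the per-voxel probability map and the thresholded consensus
    mask. Adjusting `threshold` per organ mimics the per-structure
    refinement step described above."""
    n = len(masks)
    prob = [sum(voxels) / n for voxels in zip(*masks)]
    return prob, [1 if p >= threshold else 0 for p in prob]
```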

  13. Breast mass segmentation on dynamic contrast-enhanced magnetic resonance scans using the level set method

    NASA Astrophysics Data System (ADS)

    Shi, Jiazheng; Sahiner, Berkman; Chan, Heang-Ping; Paramagul, Chintana; Hadjiiski, Lubomir M.; Helvie, Mark; Wu, Yi-Ta; Ge, Jun; Zhang, Yiheng; Zhou, Chuan; Wei, Jun

    2008-03-01

    The goal of this study was to develop an automated method to segment breast masses on dynamic contrast-enhanced (DCE) magnetic resonance (MR) scans that were performed to monitor breast cancer response to neoadjuvant chemotherapy. A radiologist experienced in interpreting breast MR scans defined the mass using a cuboid volume of interest (VOI). Our method then used the K-means clustering algorithm followed by morphological operations for initial mass segmentation on the VOI. The initial segmentation was then refined by a three-dimensional level set (LS) method. The velocity field of the LS method was formulated in terms of the mean curvature which guaranteed the smoothness of the surface and the Sobel edge information which attracted the zero LS to the desired mass margin. We also designed a method to reduce segmentation leak by adapting a region growing technique. Our method was evaluated on twenty DCE-MR scans of ten patients who underwent neoadjuvant chemotherapy. Each patient had pre- and post-chemotherapy DCE-MR scans on a 1.5 Tesla magnet. Computer segmentation was applied to coronal T1-weighted images. The in-plane pixel size ranged from 0.546 to 0.703 mm and the slice thickness ranged from 2.5 to 4.0 mm. The flip angle was 15 degrees, repetition time ranged from 5.98 to 6.7 ms, and echo time ranged from 1.2 to 1.3 ms. The computer segmentation results were compared to the radiologist's manual segmentation in terms of the overlap measure defined as the ratio of the intersection of the computer and the radiologist's segmentations to the radiologist's segmentation. Pre- and post-chemotherapy masses had overlap measures of 0.81+/-0.11 (mean+/-s.d.) and 0.70+/-0.21, respectively.
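The K-means step used for initial mass segmentation can be sketched in 1-D on voxel intensities (Lloyd's algorithm; k >= 2 is assumed, and the initialisation by spreading centres over the sorted range is our choice, not the paper's):

```python
def kmeans_1d(values, k=2, iters=20):
    """Cluster scalar intensities into k groups; returns (centers, labels)."""
    vs = sorted(values)
    # spread initial centres evenly over the sorted intensity range
    centers = [vs[int(i * (len(vs) - 1) / (k - 1))] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            # assign each value to its nearest centre
            clusters[min(range(k), key=lambda j: abs(v - centers[j]))].append(v)
        # move each centre to the mean of its cluster (keep it if empty)
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    labels = [min(range(k), key=lambda j: abs(v - centers[j])) for v in values]
    return centers, labels
```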

  14. 3D mapping of airway wall thickening in asthma with MSCT: a level set approach

    NASA Astrophysics Data System (ADS)

    Fetita, Catalin; Brillet, Pierre-Yves; Hartley, Ruth; Grenier, Philippe A.; Brightling, Christopher

    2014-03-01

    Assessing airway wall thickness in multi-slice computed tomography (MSCT) as an image marker for airway disease phenotyping such as asthma and COPD is a current trend and challenge for the scientific community working in lung imaging. This paper addresses the same problem from a different point of view: considering the expected wall thickness-to-lumen-radius ratio for a normal subject as known and constant throughout the whole airway tree, the aim is to build up a 3D map of airway wall regions of larger thickness and to define an overall score able to highlight a pathological status. In this respect, the local dimension (caliber) of the previously segmented airway lumen is obtained at each point by exploiting the granulometry morphological operator. A level set function is defined based on this caliber information and on the expected wall thickness ratio, which yields a good estimate of the airway wall throughout all segmented lumen generations. Next, the vascular (or mediastinal dense tissue) contact regions are automatically detected and excluded from analysis. For the remaining airway wall border points, the real wall thickness is estimated based on tissue density analysis in the airway radial direction; thick wall points are highlighted on a 3D representation of the airways and several quantification scores are defined. The proposed approach is fully automatic and was evaluated (proof of concept) on a patient selection coming from different databases including mild and severe asthmatics and normal cases. This preliminary evaluation confirms the discriminative power of the proposed approach regarding different phenotypes and is currently being extended to larger cohorts.

  15. Utilization Frameworks for Evaluation Reporting.

    ERIC Educational Resources Information Center

    Haenn, Joseph F.; Owens, Thomas R.

    Two utilization and implementation frameworks (known as Knowledge Production Utilization (KPU) frameworks) are related to the planning and reporting aspects of an evaluation: the Dissemination Analysis Group (DAG) model and the Hall and Loucks Level of Use (LoU) Scale. This comparison is set against a background wherein literature is reviewed to…

  16. A public health framework to translate risk factors related to political violence and war into multi-level preventive interventions.

    PubMed

    De Jong, Joop T V M

    2010-01-01

    Political violence, armed conflicts and human rights violations are produced by a variety of political, economic and socio-cultural factors. Conflicts can be analyzed with an interdisciplinary approach to obtain a global understanding of the relative contribution of risk and protective factors. A public health framework was designed to address these risk factors and protective factors. The framework resulted in a matrix that combined primary, secondary and tertiary interventions with their implementation on the levels of the society-at-large, the community, and the family and individual. Subsequently, the risk and protective factors were translated into multi-sectoral, multi-modal and multi-level preventive interventions involving the economy, governance, diplomacy, the military, human rights, agriculture, health, and education. Then the interventions were slotted in their appropriate place in the matrix. The interventions can be applied in an integrative form by international agencies, governments and non-governmental organizations, and molded to meet the requirements of the historic, political-economic and socio-cultural context. The framework maps the complementary fit among the different actors while engaging themselves in preventive, rehabilitative and reconstructive interventions. The framework shows how the economic, diplomatic, political, criminal justice, human rights, military, health and rural development sectors can collaborate to promote peace or prevent the aggravation or continuation of violence. A deeper understanding of the association between risk and protective factors and the developmental pathways of generic, country-specific and culture-specific factors leading to political violence is needed.

  17. Stress distribution in fixed-partial prosthesis and peri-implant bone tissue with different framework materials and vertical misfit levels: a three-dimensional finite element analysis.

    PubMed

    Bacchi, Ataís; Consani, Rafael L X; Mesquita, Marcelo F; dos Santos, Mateus B F

    2013-09-01

    The purpose of this study was to evaluate the influence of superstructure material and vertical misfits on the stresses created in an implant-supported partial prosthesis. A three-dimensional (3-D) finite element model was prepared based on common clinical data. The posterior part of a severely resorbed jaw with two osseointegrated implants at the second premolar and second molar regions was modeled using specific modeling software (SolidWorks 2010). Finite element models were created by importing the solid model into mechanical simulation software (ANSYS Workbench 11). The models were divided into groups according to the prosthesis framework material (type IV gold alloy, silver-palladium alloy, commercially pure titanium, cobalt-chromium alloy, or zirconia) and vertical misfit level (10 µm, 50 µm, and 100 µm) created at one implant-prosthesis interface. The gap of the vertical misfit was set to be closed and the stress values were measured in the framework, porcelain veneer, retention screw, and bone tissue. Stiffer materials led to higher stress concentration in the framework and increased stress values in the retention screw, while in the same circumstances, the porcelain veneer showed lower stress values, and there was no significant difference in stress in the peri-implant bone tissue. A considerable increase in stress concentration was observed in all the structures evaluated within the misfit amplification. The framework material influenced the stress concentration in the prosthetic structures and retention screw, but not that in bone tissue. All the structures were significantly influenced by the increase in the misfit levels.

  18. Interprofessional team building in the palliative home care setting: Use of a conceptual framework to inform a pilot evaluation.

    PubMed

    Shaw, James; Kearney, Colleen; Glenns, Brenda; McKay, Sandra

    2016-01-01

    Home-based palliative care is increasingly dependent on interprofessional teams to deliver collaborative care that more adequately meets the needs of clients and families. The purpose of this pilot evaluation was to qualitatively explore the views of an interprofessional group of home care providers (occupational therapists, nurses, personal support work supervisors, community care coordinators, and a team coordinator) regarding a pilot project encouraging teamwork in interprofessional palliative home care services. We used qualitative methods, informed by an interprofessional conceptual framework, to analyse participants' accounts and provide recommendations regarding strategies for interprofessional team building in palliative home health care. Findings suggest that encouraging practitioners to share past experiences and foster common goals for palliative care are important elements of team building in interprofessional palliative care. Also, establishing a team leader who emphasises sharing power among team members and addressing the need for mutual emotional support may help to maximise interprofessional teamwork in palliative home care. These findings may be used to develop and test more comprehensive efforts to promote stronger interprofessional teamwork in palliative home health care delivery.

  19. A machine learning framework for auto classification of imaging system exams in hospital setting for utilization optimization.

    PubMed

    Patil, Meru A; Patil, Ravindra B; Krishnamoorthy, P; John, Jacob

    2016-08-01

    In a clinical environment, the Interventional X-Ray (IXR) system is used on various anatomies and for various types of procedures. It is important to correctly classify each exam of the IXR system into the respective procedure and/or assign it to the correct anatomy. This classification enhances the productivity of the system through better scheduling of the Cath lab, provides a means for hospital management to forecast device usage/revenue for the system, and supports targeted treatment planning for a disease/anatomy. Although classifying each exam into its respective procedure/anatomy may appear a simple task, in real-life hospital settings it is well known that the same system settings are used to perform different types of procedures, and such usage leads to under-utilization of the system. In this work, a method is developed to classify exams into their respective anatomical type by applying machine-learning techniques (SVM, KNN and decision trees) to log information from the systems. The classification result is promising, with an accuracy greater than 90%.
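Of the three classifiers mentioned, k-NN is the simplest to sketch; a minimal version over numeric features extracted from system logs (the feature names, values and labels below are hypothetical, chosen only to illustrate the classification step):

```python
from collections import Counter

def knn_classify(train, query, k=3):
    """train: list of (feature_vector, label) pairs; returns the majority
    label among the k nearest neighbours (squared Euclidean distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda item: dist2(item[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# hypothetical per-exam log features: (mean tube voltage kV, frames per run)
train = [((70, 15), "cardiac"), ((72, 14), "cardiac"), ((71, 16), "cardiac"),
         ((95, 4), "vascular"), ((98, 5), "vascular"), ((96, 3), "vascular")]
```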

  20. SET overexpression in HEK293 cells regulates mitochondrial uncoupling proteins levels within a mitochondrial fission/reduced autophagic flux scenario

    SciTech Connect

    Almeida, Luciana O.; Goto, Renata N.; Neto, Marinaldo P.C.; Sousa, Lucas O.; Curti, Carlos; Leopoldino, Andréia M.

    2015-03-06

    We hypothesized that SET, a protein accumulated in some cancer types and Alzheimer disease, is involved in cell death through mitochondrial mechanisms. We addressed the mRNA and protein levels of the mitochondrial uncoupling proteins UCP1, UCP2 and UCP3 (S and L isoforms) by quantitative real-time PCR and immunofluorescence as well as other mitochondrial involvements, in HEK293 cells overexpressing the SET protein (HEK293/SET), either in the presence or absence of oxidative stress induced by the pro-oxidant t-butyl hydroperoxide (t-BHP). SET overexpression in HEK293 cells decreased UCP1 and increased UCP2 and UCP3 (S/L) mRNA and protein levels, whilst also preventing lipid peroxidation and decreasing the content of cellular ATP. SET overexpression also (i) decreased the area of mitochondria and increased the number of organelles and lysosomes, (ii) increased mitochondrial fission, as demonstrated by increased FIS1 mRNA and FIS-1 protein levels, an apparent accumulation of DRP-1 protein, and an increase in the VDAC protein level, and (iii) reduced autophagic flux, as demonstrated by a decrease in LC3B lipidation (LC3B-II) in the presence of chloroquine. Therefore, SET overexpression in HEK293 cells promotes mitochondrial fission and reduces autophagic flux in apparent association with up-regulation of UCP2 and UCP3; this implies a potential involvement in cellular processes that are deregulated such as in Alzheimer's disease and cancer. - Highlights: • SET, UCPs and autophagy prevention are correlated. • SET action has mitochondrial involvement. • UCP2/3 may reduce ROS and prevent autophagy. • SET protects cell from ROS via UCP2/3.

  2. Nurse staffing levels and outcomes - mining the UK national data sets for insight.

    PubMed

    Leary, Alison; Tomai, Barbara; Swift, Adrian; Woodward, Andrew; Hurst, Keith

    2017-04-18

    Purpose: Despite the generation of mass data by the nursing workforce, determining the impact of the contribution to patient safety remains challenging. Several cross-sectional studies have indicated a relationship between staffing and safety. The purpose of this paper is to uncover possible associations and explore if a deeper understanding of relationships between staffing and other factors such as safety could be revealed within routinely collected national data sets. Design/methodology/approach: Two longitudinal routinely collected data sets consisting of 30 years of UK nurse staffing data and seven years of National Health Service (NHS) benchmark data such as survey results, safety and other indicators were used. A correlation matrix was built and a linear correlation operation was applied (Pearson product-moment correlation coefficient). Findings: A number of associations were revealed within both the UK staffing data set and the NHS benchmarking data set. However, the challenges of using these data sets soon became apparent. Practical implications: Staff time and effort are required to collect these data. The limitations of these data sets include inconsistent data collection and quality. The mode of data collection and the itemset collected should be reviewed to generate a data set with robust clinical application. Originality/value: This paper revealed that relationships are likely to be complex and non-linear; however, the main contribution of the paper is the identification of the limitations of routinely collected data. Much time and effort is expended in collecting this data; however, its validity, usefulness and method of routine national data collection appear to require re-examination.
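The linear correlation operation named above is the Pearson product-moment coefficient, computable directly:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient between two series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

Note that r only captures linear association; as the findings above stress, staffing-outcome relationships are likely complex and non-linear, which is one limit of building a single correlation matrix.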

  3. Optimal Design in Three-Level Block Randomized Designs with Two Levels of Nesting: An ANOVA Framework with Random Effects

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2013-01-01

    Large-scale experiments that involve nested structures may assign treatment conditions either to subgroups such as classrooms or to individuals such as students within subgroups. Key aspects of the design of such experiments include knowledge of the variance structure in higher levels and the sample sizes necessary to reach sufficient power to…

  4. Extension of the Mass-Conserving Level-Set method to unstructured polyhedral control volumes for two-phase flows

    NASA Astrophysics Data System (ADS)

    Raees, Fahim; van der Heul, Duncan R.; Vuik, Kees

    2013-11-01

    In this research, we present the Mass-Conserving Level-Set (MCLS) method for the simulation of two-dimensional, incompressible, immiscible two-phase flows, using a discretisation scheme that can accurately and efficiently handle domains of arbitrary geometrical complexity. The level set and the volume-of-fluid fraction are evolved at each time step on unstructured triangular grids. A higher-order Discontinuous Galerkin finite element method is used for spatial discretisation of the level set advection equation. The volume-of-fluid fraction is advected geometrically using a Lagrangian-Eulerian method, which is accurately mass conserving, easy to implement on unstructured grids, and avoids overlapping regions during the advection. The advected level set is corrected locally to make it mass conserving by means of an explicit, invertible relation between the local level set and the volume-of-fluid fraction, termed the Volume-of-Fluid function. The results show that the proposed method is accurately mass conserving, and higher-order convergence on unstructured grids is demonstrated for the different test cases.
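
    A one-dimensional analogue of the explicit, invertible level-set/volume-fraction relation can be sketched as follows; the function names and the linear-interface assumption are illustrative, not the paper's exact formulation.

```python
import numpy as np

def vof_from_levelset(phi_c, slope, h):
    """Volume-of-Fluid function Psi: fraction of a 1-D cell of width h
    where phi > 0, assuming phi varies linearly with the given slope
    across the cell (a simplified 1-D analogue of the MCLS relation)."""
    return np.clip(0.5 + phi_c / (abs(slope) * h), 0.0, 1.0)

def levelset_from_vof(psi, slope, h):
    """Inverse relation: correct the cell-centred level set so that the
    cell reproduces a prescribed volume fraction psi (mass conservation)."""
    return (np.clip(psi, 0.0, 1.0) - 0.5) * abs(slope) * h

h, s = 1.0, 2.0
for phi in (-2.0, -0.4, 0.0, 0.4, 2.0):
    psi = vof_from_levelset(phi, s, h)
    # inverting recovers phi whenever the interface actually cuts the cell
    print(phi, psi, levelset_from_vof(psi, s, h))
```

    The pair is invertible exactly in the cells the interface crosses, which is where the local mass-conservation correction is applied.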

  5. Farm Level--Setting Up and Using the Tripod Level, Staking Out Foundations, Differential Leveling, and Staking Out Fence Lines. Student Materials. V.A. III. V-E-1, V-E-2.

    ERIC Educational Resources Information Center

    Texas A and M Univ., College Station. Vocational Instructional Services.

    Designed for use by individuals enrolled in vocational agricultural classes, these student materials deal with setting up and using a tripod level, staking out foundations, differential leveling, and staking out fence lines. Topics covered in the unit are different kinds of tripod levels, the parts of a tripod level, transporting a tripod level,…

  6. Basis set limit geometries for ammonia at the SCF and MP2 levels of theory

    NASA Technical Reports Server (NTRS)

    Defrees, D. J.; Mclean, A. D.

    1984-01-01

    The controversy over the Hartree-Fock bond angle of NH3 is resolved, and the convergence of the geometry for the molecule as the basis set is systematically improved is examined with both SCF and correlated MP2 wave functions. The results of the geometrical optimizations, carried out in four stages with a series of uncontracted basis sets, are shown. The structure obtained for NH3 supports the conclusion of Radom and Rodwell (1980) that the Hartree-Fock limit angle is significantly greater than was previously believed.

  7. Using the Gene Ontology to Scan Multi-Level Gene Sets for Associations in Genome Wide Association Studies

    PubMed Central

    Schaid, Daniel J.; Sinnwell, Jason P.; Jenkins, Gregory D.; McDonnell, Shannon K.; Ingle, James N.; Kubo, Michiaki; Goss, Paul E.; Costantino, Joseph P.; Wickerham, D. Lawrence; Weinshilboum, Richard M.

    2011-01-01

    Gene-set analyses have been widely used in gene expression studies, and some of the developed methods have been extended to genome wide association studies (GWAS). Yet, complications due to linkage disequilibrium (LD) among single nucleotide polymorphisms (SNPs), and variable numbers of SNPs per gene and genes per gene-set, have plagued current approaches, often leading to ad hoc “fixes”. To overcome some of the current limitations, we developed a general approach to scan GWAS SNP data for both gene-level and gene-set analyses, building on score statistics for generalized linear models, and taking advantage of the directed acyclic graph structure of the gene ontology when creating gene-sets. However, other types of gene-set structures can be used, such as the popular Kyoto Encyclopedia of Genes and Genomes (KEGG). Our approach combines SNPs into genes, and genes into gene-sets, but ensures that positive and negative effects of genes on a trait do not cancel. To control for multiple testing of many gene-sets, we use an efficient computational strategy that accounts for LD and provides accurate step-down adjusted p-values for each gene-set. Application of our methods to two different GWAS provides guidance on the potential strengths and weaknesses of our proposed gene-set analyses. PMID:22161999

  8. Comparing Panelists' Understanding of Standard Setting across Multiple Levels of an Alternate Science Assessment

    ERIC Educational Resources Information Center

    Hansen, Mary A.; Lyon, Steven R.; Heh, Peter; Zigmond, Naomi

    2013-01-01

    Large-scale assessment programs, including alternate assessments based on alternate achievement standards (AA-AAS), must provide evidence of technical quality and validity. This study provides information about the technical quality of one AA-AAS by evaluating the standard setting for the science component. The assessment was designed to have…

  9. A novel region-based level set method initialized with mean shift clustering for automated medical image segmentation.

    PubMed

    Bai, Pei Rui; Liu, Qing Yi; Li, Lei; Teng, Sheng Hua; Li, Jing; Cao, Mao Yong

    2013-11-01

    Appropriate initialization and stable evolution are desirable criteria to satisfy in level set methods. In this study, a novel region-based level set method utilizing both global and local image information complementarily is proposed. The global image information is extracted from mean shift clustering without any prior knowledge. Appropriate initial contours are obtained by regulating the clustering results. The local image information, as extracted by a data fitting energy, is employed to maintain a stable evolution of the zero level set curves. The advantages of the proposed method are as follows. First, the controlling parameters of the evolution can be easily estimated by the clustering results. Second, the automaticity of the model increases because of a reduction in computational cost and manual intervention. Experimental results confirm the efficiency and accuracy of the proposed method for medical image segmentation.
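
    The clustering step that supplies the global image information can be sketched with a minimal 1-D mean shift on pixel intensities; the flat kernel, bandwidth, and toy intensity populations are illustrative simplifications, not the authors' implementation.

```python
import numpy as np

def mean_shift_1d(values, bandwidth=10.0, iters=50):
    """Minimal 1-D mean shift on intensity values (flat kernel).
    Each point climbs to the mean of its bandwidth neighbourhood;
    converged points sharing a mode form one cluster."""
    modes = values.astype(float).copy()
    for _ in range(iters):
        for i, m in enumerate(modes):
            neigh = values[np.abs(values - m) <= bandwidth]
            modes[i] = neigh.mean()
    # merge modes closer than the bandwidth into cluster centres
    centres = []
    for m in np.sort(modes):
        if not centres or m - centres[-1] > bandwidth:
            centres.append(m)
    return np.array(centres)

# two intensity populations, e.g. object vs. background grey levels
rng = np.random.default_rng(1)
pix = np.concatenate([rng.normal(60, 3, 200), rng.normal(140, 3, 200)])
centres = mean_shift_1d(pix)
print(centres)   # two centres, one per intensity population
```

    In the paper's setting, contours regulated from such cluster centres serve as the automatic initialization of the level set, removing the need for manual seeding.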

  10. Goodness-of-fit measures for individual-level models of infectious disease in a Bayesian framework.

    PubMed

    Gardner, A; Deardon, R; Darlington, G

    2011-12-01

    In simple models there are a variety of tried and tested ways to assess goodness-of-fit. However, in complex non-linear models, such as spatio-temporal individual-level models, less research has been done on how best to ascertain goodness-of-fit. Often such models are fitted within a Bayesian statistical framework, since such a framework is ideally placed to account for the many areas of data uncertainty. Within a Bayesian context, a major tool for assessing goodness-of-fit is the posterior predictive distribution. That is, a distribution for a test statistic is found through simulation from the posterior distribution and then compared with the observed test statistic for the data. Here, we examine different test statistics and ascertain how well they can detect model misspecification via a simulation study.
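
    The posterior predictive check described above can be sketched with a conjugate Poisson-Gamma toy model standing in for a fitted individual-level model; the data, prior, and test statistic are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Observed counts assumed ~ Poisson; the posterior for the rate is then
# Gamma (conjugate, near-flat prior) -- a toy stand-in for a fitted ILM.
y_obs = rng.poisson(4.0, size=50)
post_shape, post_rate = y_obs.sum() + 1, len(y_obs)

def T(y):                      # test statistic: variance-to-mean ratio,
    return y.var() / y.mean()  # which probes for over-dispersion

# posterior predictive distribution of T, found through simulation
reps = []
for _ in range(2000):
    lam = rng.gamma(post_shape, 1.0 / post_rate)   # draw from posterior
    y_rep = rng.poisson(lam, size=len(y_obs))      # replicate the data
    reps.append(T(y_rep))

p = np.mean(np.array(reps) >= T(y_obs))  # posterior predictive p-value
print(round(p, 3))  # values near 0 or 1 flag model misspecification
```

    The simulation study in the paper asks, in effect, how reliably such p-values move toward the extremes for different choices of T when the model is deliberately misspecified.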

  11. Establishing optimal project-level strategies for pavement maintenance and rehabilitation - A framework and case study

    NASA Astrophysics Data System (ADS)

    Irfan, Muhammad; Bilal Khurshid, Muhammad; Bai, Qiang; Labi, Samuel; Morin, Thomas L.

    2012-05-01

    This article presents a framework and an illustrative example for identifying the optimal pavement maintenance and rehabilitation (M&R) strategy using a mixed-integer nonlinear programming model. The objective function is to maximize the cost-effectiveness expressed as the ratio of the effectiveness to the cost. The constraints for the optimization problem are related to performance, budget, and choice. Two different formulations of effectiveness are derived using treatment-specific performance models for each constituent treatment of the strategy; and cost is expressed in terms of the agency and user costs over the life cycle. The proposed methodology is demonstrated using a case study. Probability distributions are established for the optimization input variables and Monte Carlo simulations are carried out to yield optimal solutions. Using the results of these simulations, M&R strategy contours are developed as a novel tool that can help pavement managers quickly identify the optimal M&R strategy for a given pavement section.
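
    The Monte Carlo step can be sketched as follows; the candidate strategies, effectiveness and cost figures, and the simple enumeration over candidates (in place of the paper's mixed-integer nonlinear program) are all hypothetical.

```python
import random

random.seed(0)

# Hypothetical M&R strategies: (name, mean effectiveness, mean life-cycle cost)
strategies = [
    ("thin overlay every 8 yr",   120.0,  60.0),
    ("microsurfacing every 4 yr",  90.0,  40.0),
    ("reconstruction at year 15", 200.0, 130.0),
]

def simulate(n=5000):
    """Monte Carlo over uncertain effectiveness/cost inputs; returns the
    strategy with the highest mean cost-effectiveness (eff/cost) ratio."""
    scores = {name: 0.0 for name, _, _ in strategies}
    for _ in range(n):
        for name, eff, cost in strategies:
            e = random.gauss(eff, 0.1 * eff)              # ~10% uncertainty
            c = max(random.gauss(cost, 0.1 * cost), 1e-6) # keep cost positive
            scores[name] += (e / c) / n
    return max(scores, key=scores.get), scores

best, scores = simulate()
print(best)
```

    Repeating such simulations across pavement conditions and budgets is what yields the M&R strategy contours the article proposes as a decision tool.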

  12. Robust Framework to Combine Diverse Classifiers Assigning Distributed Confidence to Individual Classifiers at Class Level

    PubMed Central

    Arshad, Sannia; Rho, Seungmin

    2014-01-01

    We present a classification framework that combines multiple heterogeneous classifiers in the presence of class label noise. An extension of m-Mediods based modeling is presented that generates models of the various classes whilst identifying and filtering noisy training data. The noise-free data are further used to learn models for other classifiers such as GMM and SVM. A weight learning method is then introduced to learn per-class weights for the different classifiers, from which an ensemble is constructed. For this purpose, we apply a genetic algorithm to search for an optimal weight vector on which the classifier ensemble is expected to give the best accuracy. The proposed approach is evaluated on a variety of real-life datasets and compared with existing standard ensemble techniques such as AdaBoost, Bagging, and Random Subspace Methods. Experimental results show the superiority of the proposed ensemble method over its competitors, especially in the presence of class label noise and imbalanced classes. PMID:25295302
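
    The class-level weight learning can be sketched on toy data as follows; a plain random search stands in for the paper's genetic algorithm, and the three classifiers' votes are invented for illustration.

```python
import random

random.seed(0)

# Toy setting: three classifiers each vote for a class in {0, 1}; we search
# for class-level weights so the weighted-vote ensemble maximises accuracy.
# Each sample: (per-classifier predicted class, true class).
data = [((0, 0, 1), 0), ((0, 1, 1), 0), ((1, 1, 0), 1),
        ((1, 1, 1), 1), ((0, 0, 0), 0), ((1, 0, 1), 1)] * 10

def accuracy(w):
    """w[c][k]: confidence weight of classifier k when it votes for class c."""
    correct = 0
    for preds, truth in data:
        votes = {0: 0.0, 1: 0.0}
        for k, c in enumerate(preds):
            votes[c] += w[c][k]
        correct += (max(votes, key=votes.get) == truth)
    return correct / len(data)

best_w, best_acc = None, -1.0
for _ in range(500):   # random search standing in for the genetic algorithm
    w = {c: [random.random() for _ in range(3)] for c in (0, 1)}
    acc = accuracy(w)
    if acc > best_acc:
        best_w, best_acc = w, acc
print(best_acc)
```

    Assigning weights per class (rather than one weight per classifier) lets the ensemble trust a classifier on the classes it handles well while discounting it elsewhere, which is the distributed-confidence idea in the title.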

  13. Multisite Evaluation of a Data Quality Tool for Patient-Level Clinical Data Sets

    PubMed Central

    Huser, Vojtech; DeFalco, Frank J.; Schuemie, Martijn; Ryan, Patrick B.; Shang, Ning; Velez, Mark; Park, Rae Woong; Boyce, Richard D.; Duke, Jon; Khare, Ritu; Utidjian, Levon; Bailey, Charles

    2016-01-01

    Introduction: Data quality and fitness for analysis are crucial if outputs of analyses of electronic health record data or administrative claims data are to be trusted by the public and the research community. Methods: We describe a data quality analysis tool (called Achilles Heel) developed by the Observational Health Data Sciences and Informatics (OHDSI) collaborative and compare outputs from this tool as it was applied to 24 large healthcare datasets across seven different organizations. Results: We highlight 12 data quality rules that identified issues in at least 10 of the 24 datasets and provide a full set of 71 rules identified in at least one dataset. Achilles Heel is freely available software that provides a useful starter set of data quality rules, with the ability to add additional rules. We also present results of a structured email-based interview of all participating sites that collected qualitative comments about the value of Achilles Heel for data quality evaluation. Discussion: Our analysis represents the first comparison of outputs from a data quality tool that implements a fixed (but extensible) set of data quality rules. Thanks to a common data model, we were able to quickly compare multiple datasets originating from several countries in America, Europe and Asia. PMID:28154833

  14. Development of depression in survivors of childhood and adolescent cancer: a multi-level life course conceptual framework.

    PubMed

    Kaye, Erica C; Brinkman, Tara M; Baker, Justin N

    2017-03-09

    As therapeutic and supportive care interventions become increasingly effective, growing numbers of childhood and adolescent cancer survivors face a myriad of physical and psychological sequelae secondary to their disease and treatment. Mental health issues, in particular, present a significant problem in this unique patient population, with depression affecting a sizable number of childhood and adolescent cancer survivors. Multiple key determinants impact a survivor's risk of developing depression, with variables traversing biologic, individual, family, community, and global levels, and spanning the life course of human development from the preconception and prenatal periods to adulthood. A multi-level life course conceptual model offers a valuable framework to identify and organize the diverse variables that modulate the risk of developing depression in survivors of childhood and adolescent cancer. This review describes the first multi-level life course perspective applied to the development of depression in childhood and adolescent cancer survivors. This conceptual framework may be used to guide the investigation of mental health interventions for survivors of childhood and adolescent cancer, ensuring that key determinants of depression are adequately addressed across various levels and throughout the life trajectory.

  15. Embracing a Common Focus: A Framework for Middle Level Teacher Preparation

    ERIC Educational Resources Information Center

    Faulkner, Shawn A.; Howell, Penny B.; Cook, Chris M.

    2013-01-01

    As more and more states make a commitment to specialized middle level teacher preparation, teacher education programs across the country must make the necessary adjustments to ensure middle level teachers are prepared to be successful. Unfortunately, individual state and institutional requirements often make this challenging and can result in…

  16. Homelessness Outcome Reporting Normative Framework: Systems-Level Evaluation of Progress in Ending Homelessness

    ERIC Educational Resources Information Center

    Austen, Tyrone; Pauly, Bernie

    2012-01-01

    Homelessness is a serious and growing issue. Evaluations of systemic-level changes are needed to determine progress in reducing or ending homelessness. The report card methodology is one means of systems-level assessment. Rather than solely establishing an enumeration, homelessness report cards can capture pertinent information about structural…

  17. Adaptive local basis set for Kohn-Sham density functional theory in a discontinuous Galerkin framework II: Force, vibration, and molecular dynamics calculations

    NASA Astrophysics Data System (ADS)

    Zhang, Gaigong; Lin, Lin; Hu, Wei; Yang, Chao; Pask, John E.

    2017-04-01

    Recently, we have proposed the adaptive local basis set for electronic structure calculations based on Kohn-Sham density functional theory in a pseudopotential framework. The adaptive local basis set is efficient and systematically improvable for total energy calculations. In this paper, we present the calculation of atomic forces, which can be used for a range of applications such as geometry optimization and molecular dynamics simulation. We demonstrate that, under mild assumptions, the computation of atomic forces can scale nearly linearly with the number of atoms in the system using the adaptive local basis set. We quantify the accuracy of the Hellmann-Feynman forces for a range of physical systems, benchmarked against converged planewave calculations, and find that the adaptive local basis set is efficient for both force and energy calculations, requiring at most a few tens of basis functions per atom to attain accuracies required in practice. Since the adaptive local basis set has implicit dependence on atomic positions, Pulay forces are in general nonzero. However, we find that the Pulay force is numerically small and systematically decreasing with increasing basis completeness, so that the Hellmann-Feynman force is sufficient for basis sizes of a few tens of basis functions per atom. We verify the accuracy of the computed forces in static calculations of quasi-1D and 3D disordered Si systems, vibration calculation of a quasi-1D Si system, and molecular dynamics calculations of H2 and liquid Al-Si alloy systems, where we show systematic convergence to benchmark planewave results and results from the literature.
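
    The force decomposition discussed above can be written schematically as follows; this is the standard textbook form for an atom-centred, position-dependent basis {φ_μ}, not the paper's exact expressions.

```latex
\mathbf{F}_I \;=\; -\frac{\mathrm{d}E}{\mathrm{d}\mathbf{R}_I}
\;=\; \underbrace{-\left.\frac{\partial E}{\partial \mathbf{R}_I}
\right|_{\{\phi_\mu\}\ \mathrm{fixed}}}_{\text{Hellmann-Feynman}}
\;\underbrace{-\,\sum_{\mu}\frac{\partial E}{\partial \phi_\mu}\,
\frac{\partial \phi_\mu}{\partial \mathbf{R}_I}}_{\text{Pulay}}
```

    The Pulay term vanishes for a complete or position-independent basis; the paper's finding that it decreases systematically with basis completeness corresponds to the second term becoming negligible at a few tens of adaptive local basis functions per atom.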

  18. The Integrated Behavioural Model for Water, Sanitation, and Hygiene: a systematic review of behavioural models and a framework for designing and evaluating behaviour change interventions in infrastructure-restricted settings

    PubMed Central

    2013-01-01

    Background Promotion and provision of low-cost technologies that enable improved water, sanitation, and hygiene (WASH) practices are seen as viable solutions for reducing high rates of morbidity and mortality due to enteric illnesses in low-income countries. A number of theoretical models, explanatory frameworks, and decision-making models have emerged which attempt to guide behaviour change interventions related to WASH. The design and evaluation of such interventions would benefit from a synthesis of this body of theory informing WASH behaviour change and maintenance. Methods We completed a systematic review of existing models and frameworks through a search of related articles available in PubMed and in the grey literature. Information on the organization of behavioural determinants was extracted from the references that fulfilled the selection criteria and synthesized. Results from this synthesis were combined with other relevant literature, and from feedback through concurrent formative and pilot research conducted in the context of two cluster-randomized trials on the efficacy of WASH behaviour change interventions to inform the development of a framework to guide the development and evaluation of WASH interventions: the Integrated Behavioural Model for Water, Sanitation, and Hygiene (IBM-WASH). Results We identified 15 WASH-specific theoretical models, behaviour change frameworks, or programmatic models, of which 9 addressed our review questions. Existing models under-represented the potential role of technology in influencing behavioural outcomes, focused on individual-level behavioural determinants, and had largely ignored the role of the physical and natural environment. IBM-WASH attempts to correct this by acknowledging three dimensions (Contextual Factors, Psychosocial Factors, and Technology Factors) that operate on five-levels (structural, community, household, individual, and habitual). Conclusions A number of WASH-specific models and frameworks

  19. Leveling

    USGS Publications Warehouse

    1966-01-01

    Geodetic leveling by the U.S. Geological Survey provides a framework of accurate elevations for topographic mapping. Elevations are referred to the Sea Level Datum of 1929. Lines of leveling may be run either with automatic or with precise spirit levels, by either the center-wire or the three-wire method. For future use, the surveys are monumented with bench marks, using standard metal tablets or other marking devices. The elevations are adjusted by least squares or other suitable method and are published in lists of control.
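
    The least-squares adjustment mentioned above can be sketched for a tiny hypothetical level network; the bench-mark layout and observed height differences are invented for illustration.

```python
import numpy as np

# Hypothetical three-bench-mark level network adjusted by least squares.
# Unknown heights h1, h2 are relative to a fixed datum bench mark h0 = 0.
# Observed height differences: (from_mark, to_mark, observed_diff_m).
obs = [(0, 1, 2.314), (1, 2, 1.105), (0, 2, 3.426)]  # 7 mm misclosure

A, b = [], []
for f, t, d in obs:
    row = [0.0, 0.0]
    if f:                 # datum mark 0 carries no unknown
        row[f - 1] -= 1.0
    if t:
        row[t - 1] += 1.0
    A.append(row)
    b.append(d)

# Least squares distributes the loop misclosure over the observations
heights, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
print(np.round(heights, 4))   # adjusted heights of bench marks 1 and 2
```

    Real geodetic adjustments weight each observation (e.g. by line length) and solve far larger networks, but the structure of the problem is the same.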

  20. The critical size is set at a single-cell level by growth rate to attain homeostasis and adaptation.

    PubMed

    Ferrezuelo, Francisco; Colomina, Neus; Palmisano, Alida; Garí, Eloi; Gallego, Carme; Csikász-Nagy, Attila; Aldea, Martí

    2012-01-01

    Budding yeast cells are assumed to trigger Start and enter the cell cycle only after they attain a critical size set by external conditions. However, arguing against deterministic models of cell size control, cell volume at Start displays great individual variability even under constant conditions. Here we show that cell size at Start is robustly set at a single-cell level by the volume growth rate in G1, which explains the observed variability. We find that this growth-rate-dependent sizer is intimately hardwired into the Start network and the Ydj1 chaperone is key for setting cell size as a function of the individual growth rate. Mathematical modelling and experimental data indicate that a growth-rate-dependent sizer is sufficient to ensure size homeostasis and, as a remarkable advantage over a rigid sizer mechanism, it reduces noise in G1 length and provides an immediate solution for size adaptation to external conditions at a population level.

  1. Wave energy level and geographic setting correlate with Florida beach water quality.

    PubMed

    Feng, Zhixuan; Reniers, Ad; Haus, Brian K; Solo-Gabriele, Helena M; Kelly, Elizabeth A

    2016-03-15

    Many recreational beaches suffer from elevated levels of microorganisms, resulting in beach advisories and closures due to lack of compliance with Environmental Protection Agency guidelines. We conducted the first statewide beach water quality assessment by analyzing decadal records of fecal indicator bacteria (enterococci and fecal coliform) levels at 262 Florida beaches. The objectives were to depict synoptic patterns of beach water quality exceedance along the entire Florida shoreline and to evaluate their relationships with wave condition and geographic location. Percent exceedances based on enterococci and fecal coliform were negatively correlated with both long-term mean wave energy and beach slope. Also, Gulf of Mexico beaches exceeded the thresholds significantly more than Atlantic Ocean ones, perhaps partially due to the lower wave energy. A possible linkage between wave energy level and water quality is beach sand, a pervasive nonpoint source that tends to harbor more bacteria in the low-wave-energy environment.

  2. Area-level risk factors for adverse birth outcomes: trends in urban and rural settings

    PubMed Central

    2013-01-01

    Background Significant and persistent racial and income disparities in birth outcomes exist in the US. The analyses in this manuscript examine whether adverse birth outcome time trends and associations between area-level variables and adverse birth outcomes differ by urban–rural status. Methods Alabama births records were merged with ZIP code-level census measures of race, poverty, and rurality. B-splines were used to determine long-term preterm birth (PTB) and low birth weight (LBW) trends by rurality. Logistic regression models were used to examine differences in the relationships between ZIP code-level percent poverty or percent African-American with either PTB or LBW. Interactions with rurality were examined. Results Population dense areas had higher adverse birth outcome rates compared to other regions. For LBW, the disparity between population dense and other regions increased during the 1991–2005 time period, and the magnitude of the disparity was maintained through 2010. Overall PTB and LBW rates have decreased since 2006, except within isolated rural regions. The addition of individual-level socioeconomic or race risk factors greatly attenuated these geographical disparities, but isolated rural regions maintained increased odds of adverse birth outcomes. ZIP code-level percent poverty and percent African American both had significant relationships with adverse birth outcomes. Poverty associations remained significant in the most population-dense regions when models were adjusted for individual-level risk factors. Conclusions Population dense urban areas have heightened rates of adverse birth outcomes. High-poverty African American areas have higher odds of adverse birth outcomes in urban versus rural regions. These results suggest there are urban-specific social or environmental factors increasing risk for adverse birth outcomes in underserved communities. On the other hand, trends in PTBs and LBWs suggest interventions that have decreased adverse

  3. An automatic method for fast and accurate liver segmentation in CT images using a shape detection level set method

    NASA Astrophysics Data System (ADS)

    Lee, Jeongjin; Kim, Namkug; Lee, Ho; Seo, Joon Beom; Won, Hyung Jin; Shin, Yong Moon; Shin, Yeong Gil

    2007-03-01

    Automatic liver segmentation is still a challenging task due to the ambiguity of the liver boundary and the complex context of nearby organs. In this paper, we propose a faster and more accurate approach to liver segmentation in CT images using an enhanced level set method. The speed image for level-set propagation is smoothly generated by increasing the number of iterations in anisotropic diffusion filtering. This prevents the level-set propagation from stopping in front of local minima, which prevail in liver CT images due to irregular intensity distributions of the interior liver region. The curvature term of the shape-modeling level-set method captures the shape variations of the liver along the slice direction well. Finally, a rolling-ball algorithm is applied to include enhanced vessels near the liver boundary. Our approach was tested against manual segmentation results for eight CT scans with 5 mm slice spacing, using the average distance and volume errors. The average distance error between corresponding liver boundaries is 1.58 mm and the average volume error is 2.2%. The average processing time for the segmentation of each slice is 5.2 seconds, which is much faster than conventional methods. The accurate and fast results of our method will expedite the next stage, liver volume quantification, for liver transplantation.
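
    The anisotropic diffusion step that generates the smooth speed image can be sketched with a classic Perona-Malik scheme on a toy image; the parameters, the synthetic "liver" image, and the periodic boundary handling are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def perona_malik(img, iters=20, kappa=15.0, dt=0.2):
    """Anisotropic diffusion (Perona-Malik): smooths homogeneous regions
    while preserving strong edges. More iterations yield a smoother speed
    image, helping level-set propagation pass interior local minima."""
    u = img.astype(float).copy()
    for _ in range(iters):
        # neighbour differences (periodic borders via np.roll, for brevity)
        dn = np.roll(u, 1, 0) - u
        ds = np.roll(u, -1, 0) - u
        de = np.roll(u, -1, 1) - u
        dw = np.roll(u, 1, 1) - u
        g = lambda d: np.exp(-(d / kappa) ** 2)   # edge-stopping function
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

# toy "organ" image: bright region with noisy interior on a dark background
rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[16:48, 16:48] = 100.0
img += rng.normal(0, 5, img.shape)

sm = perona_malik(img, iters=40)
# interior noise shrinks while the 100-level step edge survives
print(img[20:44, 20:44].std(), sm[20:44, 20:44].std())
```

    The small intensity differences inside the region diffuse away while the large step at the boundary is blocked by the edge-stopping function, which is exactly the behaviour wanted in a speed image.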

  4. The Daily Events and Emotions of Master's-Level Family Therapy Trainees in Off-Campus Practicum Settings

    ERIC Educational Resources Information Center

    Edwards, Todd M.; Patterson, Jo Ellen

    2012-01-01

    The Day Reconstruction Method (DRM) was used to assess the daily events and emotions of one program's master's-level family therapy trainees in off-campus practicum settings. This study examines the DRM reports of 35 family therapy trainees in the second year of their master's program in marriage and family therapy. Four themes emerged from the…

  5. Simulation of Heterogeneous Atom Probe Tip Shapes Evolution during Field Evaporation Using a Level Set Method and Different Evaporation Models

    SciTech Connect

    Xu, Zhijie; Li, Dongsheng; Xu, Wei; Devaraj, Arun; Colby, Robert J.; Thevuthasan, Suntharampillai; Geiser, B. P.; Larson, David J.

    2015-04-01

    In atom probe tomography (APT), accurate reconstruction of the spatial positions of field-evaporated ions from measured detector patterns depends upon a correct understanding of the dynamic tip shape evolution and the evaporation laws of the component atoms. Artifacts in APT reconstructions of heterogeneous materials can be attributed to the assumption of homogeneous evaporation of all the elements in the material, in addition to the assumption of a steady-state hemispherical dynamic tip shape. A level-set-based specimen shape evolution model is developed in this study to simulate the evaporation of synthetic layered-structure APT tips. The shape evolution predicted by the level set model qualitatively agrees with the finite element method and with literature data obtained using the finite difference method. The asymmetric evolving shape predicted by the level set model demonstrates the complex evaporation behavior of a heterogeneous tip; the interface curvature can potentially lead to artifacts in the APT reconstruction of such materials. Compared with other APT simulation methods, the new method provides a smoother interface representation with the aid of its intrinsic sub-grid accuracy. Two evaporation models (linear and exponential evaporation laws) are implemented in the level set simulations, and the effect of the evaporation law on the tip shape evolution is also presented.

  6. An on-line learning tracking of non-rigid target combining multiple-instance boosting and level set

    NASA Astrophysics Data System (ADS)

    Chen, Mingming; Cai, Jingju

    2013-10-01

    Visual tracking algorithms based on online boosting generally use a rectangular bounding box to represent the position of the target, while the actual shape of the target is usually irregular. This causes the classifier to learn features of the non-target parts of the rectangular region, thereby reducing the performance of the classifier and leading to drift. To avoid the limitations of the bounding box, we propose a novel tracking-by-detection algorithm involving level set segmentation, which ensures that the classifier learns only the features of the real target area within the tracking box. Because the shape of the target changes only slightly between two adjacent frames, and the current level set algorithm avoids re-initialization of the signed distance function, only a few iterations are needed to converge to the position of the target contour in the next frame. We also improve the level set energy function so that the zero level set is less likely to converge to a false contour. In addition, we use gradient boosting to improve the original multiple-instance learning (MIL) algorithm, as in the WMILtracker, which greatly speeds up the tracker. Our algorithm outperforms the original MILtracker in both speed and precision. Compared with the WMILtracker, our algorithm runs at almost the same speed, but avoids the drift caused by background learning, so its precision is better.

  7. Evaluating the Use of Synoptic Assessment to Engage and Develop Lower Level Higher Education Students within a Further Education Setting

    ERIC Educational Resources Information Center

    Southall, Jane; Wason, Hilary

    2016-01-01

    Engaging less academically qualified Higher Education students being taught within a Further Education setting, who have weaker study skills and little experience of independent learning, is challenging. Confidence and motivation levels are often low and they feel overwhelmed. Effective assessment design is crucial and needs to capitalise on…

  8. Inversion and classification studies of live-site production-level MetalMapper data sets

    NASA Astrophysics Data System (ADS)

    Shubitidze, F.; Fernández, J. P.; Miller, J.; Keranen, J.; Barrowes, B. E.; Bijamov, A.

    2012-06-01

    This paper illustrates the discrimination performance of a set of advanced models at an actual UXO live site. The suite of methods, which combines the orthonormalized volume magnetic source (ONVMS) model, a data-preprocessing technique based on joint diagonalization (JD), and differential evolution (DE) minimization, among others, was tested at the former Camp Beale in California. The data for the study were collected independently by two UXO production teams from Parsons and CH2M HILL using the MetalMapper (MM) sensor in cued mode; each set of data was also processed independently. Initially all data were inverted using a multi-target version of the combined ONVMS-DE algorithm, which provided intrinsic parameters (the total ONVMS amplitudes) that were then used to perform classification after having been inspected by an expert. Classification of the Parsons data was conducted by a Sky Research production team using a fingerprinting approach; analysis of the CH2M HILL data was performed by a Sky/Dartmouth R&D team using unsupervised clustering. During the classification stage the analysts requested the ground truth for selected anomalies typical of the different clusters; this was then used to classify them using a probability function. This paper reviews the data inversion, processing, and discrimination schemes involving the advanced EMI methods and presents the classification results obtained for both the CH2M HILL and the Parsons data. Independent scoring by the Institute for Defense Analyses reveals superb all-around classification performance.

  9. Care pathways across the primary-hospital care continuum: using the multi-level framework in explaining care coordination

    PubMed Central

    2013-01-01

    Background Care pathways are widely used in hospitals for a structured and detailed planning of the care process. There is a growing interest in extending care pathways into primary care to improve quality of care by increasing care coordination. Evidence is sparse about the relationship between care pathways and care coordination. The multi-level framework explores care coordination across organizations and states that (inter)organizational mechanisms have an effect on the relationships between healthcare professionals, resulting in quality and efficiency of care. The aim of this study was to assess the extent to which care pathways support or create elements of the multi-level framework necessary to improve care coordination across the primary - hospital care continuum. Methods This study is an in-depth analysis of five existing local community projects located in four different regions in Flanders (Belgium) to determine whether the available empirical evidence supported or refuted the theoretical expectations from the multi-level framework. Data were gathered using mixed methods, including structured face-to-face interviews, participant observations, documentation and a focus group. Multiple cases were analyzed performing a cross case synthesis to strengthen the results. Results The development of a care pathway across the primary-hospital care continuum, supported by a step-by-step scenario, led to the use of existing and newly constructed structures, data monitoring and the development of information tools. The construction and use of these inter-organizational mechanisms had a positive effect on exchanging information, formulating and sharing goals, defining and knowing each other’s roles, expectations and competences and building qualitative relationships. Conclusion Care pathways across the primary-hospital care continuum enhance the components of care coordination. PMID:23919518

  10. A framework for the recognition of high-level surgical tasks from video images for cataract surgeries.

    PubMed

    Lalys, F; Riffaud, L; Bouget, D; Jannin, P

    2012-04-01

    The need for better integration of the new generation of computer-assisted surgical systems has recently been emphasized. One necessity for achieving this objective is to retrieve data from the operating room (OR) with different sensors, then to derive models from these data. Recently, the use of videos from cameras in the OR has demonstrated its efficiency. In this paper, we propose a framework to assist in the development of systems for the automatic recognition of high-level surgical tasks using microscope video analysis, and we validated its use on cataract procedures. The idea is to combine state-of-the-art computer vision techniques with time series analysis. The first step of the framework consisted of defining several visual cues for extracting semantic information, thereby characterizing each frame of the video. Five image-based classifiers were implemented accordingly. A pupil segmentation step was also applied for dedicated visual cue detection. Time series classification algorithms were then applied to model the time-varying data; dynamic time warping and hidden Markov models were tested. This combination draws on the advantages of both families of methods for a better understanding of the problem. The framework was validated through various studies: six binary visual cues were chosen along with 12 phases to detect, obtaining accuracies of 94%.
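    As a minimal illustration of one of the two time-series techniques named above, dynamic time warping, here is a sketch of a DTW distance plus a nearest-template classifier. The sequences, phase labels, and the `classify` helper are invented for this sketch, not taken from the paper.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three allowed warping moves
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(query, templates):
    """Assign the query the label of the nearest reference template."""
    return min(templates, key=lambda label: dtw_distance(query, templates[label]))
```

    In a phase-recognition setting, `templates` would hold one representative visual-cue sequence per surgical phase, and each new video segment would be labelled by its nearest template under DTW.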

  11. Building a conceptual framework to culturally adapt health promotion and prevention programs at the deep structural level.

    PubMed

    Wang-Schweig, Meme; Kviz, Frederick J; Altfeld, Susan J; Miller, Arlene M; Miller, Brenda A

    2014-07-01

    The debate on the effectiveness and merit for the amount of time, effort, and resources to culturally adapt health promotion and prevention programs continues. This may be due, in large part, to the lack of theory in commonly used methods to match programmatic content and delivery to the culture of a population, particularly at the deep structural level. This paper asserts that prior to the cultural adaptation of prevention programs, it is necessary to first develop a conceptual framework. We propose a multiphase approach to address key challenges in the science of cultural adaptation by first identifying and exploring relevant cultural factors that may affect the targeted health-related behavior prior to proceeding through steps of a stage model. The first phase involves developing an underlying conceptual framework that integrates cultural factors to ground this process. The second phase employs the different steps of a stage model. For Phase I of our approach, we offer four key steps and use our research study as an example of how these steps were applied to build a framework for the cultural adaptation of a family-based intervention to prevent adolescent alcohol use, Guiding Good Choices (GGC), to Chinese American families. We then provide a summary of the preliminary evidence from a few key relationships that were tested among our sample with the greater purpose of discussing how these findings might be used to culturally adapt GGC.

  12. High performance in healthcare priority setting and resource allocation: A literature- and case study-based framework in the Canadian context.

    PubMed

    Smith, Neale; Mitton, Craig; Hall, William; Bryan, Stirling; Donaldson, Cam; Peacock, Stuart; Gibson, Jennifer L; Urquhart, Bonnie

    2016-08-01

    Priority setting and resource allocation, or PSRA, are key functions of executive teams in healthcare organizations. Yet decision-makers often base their choices on historical patterns of resource distribution or political pressures. Our aim was to provide leaders with guidance on how to improve PSRA practice, by creating organizational contexts which enable high performance. We carried out in-depth case studies of six Canadian healthcare organizations to obtain from healthcare leaders their understanding of the concept of high performance in PSRA and the factors which contribute to its achievement. Individual and group interviews were carried out (n = 62) with senior managers, middle managers and Board members. Site observations and document review were used to assist researchers in interpreting the interview data. Qualitative data were analyzed iteratively with the literature on empirical examples of PSRA practice, in order to develop a framework of high performance in PSRA. The framework consists of four domains - structures, processes, attitudes and behaviours, and outcomes - within which are 19 specific elements. The emergent themes derive from case studies in different kinds of health organizations (urban/rural, small/large) across Canada. The elements can serve as a checklist for 'high performance' in PSRA. This framework provides a means by which decision-makers in healthcare might assess their practice and identify key areas for improvement. The findings are likely generalizable, certainly within Canada but also across countries. This work constitutes, to our knowledge, the first attempt to present a full package of elements comprising high performance in health care PSRA.

  13. Ready Set Exercises. "Ready-Set-ABE" To Ease Students' Transition into ABE Level Studies.

    ERIC Educational Resources Information Center

    Molek, Carol

    This booklet is intended to assist tutors in helping transitional and low-level adult basic education (ABE) students acquire the reading skills required to make a successful adjustment to regular ABE classes. The exercises provided are intended primarily for use in student-tutor learning teams, with students gradually completing greater portions…

  14. Connected Functional Working Spaces: A Framework for the Teaching and Learning of Functions at Upper Secondary Level

    ERIC Educational Resources Information Center

    Minh, Tran Kiem; Lagrange, Jean-Baptiste

    2016-01-01

    This paper aims to contribute to remedying the narrow treatment of functions at upper secondary level. Assuming that students make sense of functions by working on functional situations in distinctive settings, we propose to consider functional working spaces inspired by geometrical working spaces. We analyse a classroom situation based on a…

  15. Measuring Oxygen Cost During Level Walking in Individuals with Acquired Brain Injury in the Clinical Setting

    PubMed Central

    Dawes, Helen; Collett, Johnathen; Ramsbottom, Roger; Howells, Ken; Sackley, Cath; Wade, Derick

    2004-01-01

    This study examined the test-retest reliability of oxygen cost (ml·kg⁻¹·m⁻¹) during level walking in individuals with acquired brain injury (ABI). Ten individuals with ABI (5 men, 5 women; traumatic brain injury, n = 1; central pontine myelinolysis, n = 1; stroke, n = 8) and 21 healthy controls (11 men, 10 women) participated. Measurements of gross and net (walking minus resting) oxygen consumption (ml·kg⁻¹·min⁻¹) and oxygen cost during level walking at self-selected speeds were taken on two occasions within one week. Oxygen cost was significantly lower (p < 0.05) in individuals with ABI on the second test versus the first test. Percentage variability in oxygen cost from test to re-test ranged from 14.7 to 17.3% in the control group and from 17.4 to 20.8% in the brain injury group. Clinical populations may demonstrate a significant decrease in oxygen cost between testing occasions and require at least one period of familiarisation if oxygen cost is used as an outcome measure during level walking. The amount of familiarisation has yet to be investigated in individuals with ABI. Key points: individuals with brain injury may demonstrate a significant decrease in oxygen cost between testing occasions during level walking; they may require at least one period of familiarisation if oxygen cost is used as an outcome measure; the degree of familiarisation required in this clinical group needs further investigation. PMID:24482582
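    Oxygen cost is oxygen consumption normalised by walking speed (ml·kg⁻¹·min⁻¹ divided by m·min⁻¹ gives ml·kg⁻¹·m⁻¹). That definitional relation is standard, but the function name and all numeric values below are invented for illustration, not taken from the study.

```python
def oxygen_cost(vo2_walk, vo2_rest, speed_m_per_min, net=True):
    """Oxygen cost of walking (ml/kg/m): VO2 (ml/kg/min) divided by speed (m/min).

    net=True subtracts resting VO2 first, matching the study's 'net' measure."""
    vo2 = vo2_walk - vo2_rest if net else vo2_walk
    return vo2 / speed_m_per_min

# Hypothetical values: 12 ml/kg/min walking, 3.5 resting, at 60 m/min
net_cost = oxygen_cost(12.0, 3.5, 60.0)
gross_cost = oxygen_cost(12.0, 3.5, 60.0, net=False)
```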

  16. Optimal Sampling of Units in Three-Level Cluster Randomized Designs: An Ancova Framework

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2011-01-01

    Field experiments with nested structures assign entire groups such as schools to treatment and control conditions. Key aspects of such cluster randomized experiments include knowledge of the intraclass correlation structure and the sample sizes necessary to achieve adequate power to detect the treatment effect. The units at each level of the…

  17. Developing a Theoretical Framework for Classifying Levels of Context Use for Mathematical Problems

    ERIC Educational Resources Information Center

    Almuna Salgado, Felipe

    2016-01-01

    This paper aims to revisit and clarify the term problem context and to develop a theoretical classification of the construct of levels of context use (LCU) to analyse how the context of a problem is used to formulate a problem in mathematical terms and to interpret the answer in relation to the context of a given problem. Two criteria and six…

  18. Alcohol and Sexual Risk: An Event-Level Analysis in Commercial Sex Setting

    PubMed Central

    Chen, Yiyun; Li, Xiaoming; Shen, Zhiyong; Zhou, Yuejiao; Tang, Zhenzhu

    2013-01-01

    Objective To assess the episodic relationship between alcohol and sexual risk in multiple-client occasions among female sex workers (FSWs). Methods Data on alcohol use and sexual episodes with clients from the past two days were collected among FSWs in Guangxi, China (n=336 for yesterday, and n=299 for the day before yesterday). Logistic regression using generalized estimating equations with robust variance estimation was used to assess the alcohol-sexual risk relationship, controlling for contextual variables salient in the setting of commercial sex. Results Alcohol use among FSWs was associated with a higher likelihood of unprotected sex with clients during both days. This relationship was modified by the number of clients received within a day. Additionally, having a larger number of drinks was associated with higher odds of unprotected sex, but the association was not consistent across the two days. Conclusion Findings from the study support an association between alcohol use and sexual risk among FSWs. The design of alcohol and sexual risk reduction interventions among FSWs in China may take advantage of the interaction between contextual factors and alcohol use on sexual risk. PMID:24045031
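    The study uses GEE-based logistic regression to handle correlation between episodes from the same woman. As a far simpler, purely illustrative stand-in, the raw episode-level association can be summarised as a cross-product odds ratio from a 2×2 table; all counts below are hypothetical, and this ignores the within-person clustering that GEE accounts for.

```python
def odds_ratio(exposed_event, exposed_no_event, unexposed_event, unexposed_no_event):
    """Cross-product odds ratio from a 2x2 event-level table: (a*d)/(b*c)."""
    return (exposed_event * unexposed_no_event) / (exposed_no_event * unexposed_event)

# Hypothetical episode counts: rows = alcohol use, columns = unprotected sex
or_alcohol = odds_ratio(
    exposed_event=30,      # alcohol + unprotected sex
    exposed_no_event=20,   # alcohol + protected sex
    unexposed_event=15,    # no alcohol + unprotected sex
    unexposed_no_event=35, # no alcohol + protected sex
)
```

    An OR above 1 would indicate that alcohol-involved episodes have higher odds of unprotected sex; a GEE model additionally adjusts the standard errors (and covariates) for repeated episodes per woman.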

  19. A high-speed DAQ framework for future high-level trigger and event building clusters

    NASA Astrophysics Data System (ADS)

    Caselle, M.; Ardila Perez, L. E.; Balzer, M.; Dritschler, T.; Kopmann, A.; Mohr, H.; Rota, L.; Vogelgesang, M.; Weber, M.

    2017-03-01

    Modern data acquisition and trigger systems require a throughput of several GB/s and latencies of the order of microseconds. To satisfy such requirements, a heterogeneous readout system based on FPGA readout cards and GPU-based computing nodes coupled by InfiniBand has been developed. The incoming data from the back-end electronics are delivered directly into the internal memory of GPUs through dedicated peer-to-peer PCIe communication. High-performance DMA engines have been developed for direct communication between FPGAs and GPUs using "DirectGMA" (AMD) and "GPUDirect" (NVIDIA) technologies. The proposed infrastructure is a candidate for future generations of event-building clusters, high-level trigger filter farms and low-level trigger systems. In this paper the heterogeneous FPGA-GPU architecture is presented and its performance discussed.
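    The throughput requirement above fixes a simple sizing budget: link bandwidth divided by event size gives the sustainable event rate per link. The paper only says "several GB/s", so the link bandwidth and event size below are hypothetical round numbers for illustration.

```python
# Back-of-envelope budget for a single FPGA-to-GPU link (hypothetical numbers)
link_gbytes_per_s = 4.0   # assumed sustained link bandwidth, GB/s
event_size_mb = 2.0       # assumed event size, MB

# Sustainable event rate: convert GB/s to MB/s, divide by MB per event
events_per_s = (link_gbytes_per_s * 1024) / event_size_mb

# Average spacing between events at that rate, in microseconds
event_spacing_us = 1e6 / events_per_s
```

    A real farm then scales out by running many such links in parallel and keeping the per-event trigger decision within its microsecond-scale latency budget.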

  20. Screening and Intervention for Intimate Partner Violence in Healthcare Settings: Creating Sustainable System-Level Programs

    PubMed Central

    Rhodes, Karin; Brown, Jeremy

    2015-01-01

    Abstract Among the barriers to routine screening for intimate partner violence (IPV) are time constraints, a lack of protocols and policies, and departmental philosophies of care that may conflict with IPV screening recommendations. To address these barriers, systems-level interventions are needed; in this article, we describe one model that may overcome these obstacles. We discuss how this systemic approach may best be implemented in both out-patient clinics and emergency departments (EDs) and note that evidence for its success will be required. PMID:25412012

  1. Effects of Relative Mean Sea Level Variations on Tidal Networks Generated on Experimental Setting

    NASA Astrophysics Data System (ADS)

    Stefanon, L.; Carniello, L.; D'Alpaos, A.; Rinaldo, A.

    2012-12-01

    We present the results of laboratory experiments carried out in a large experimental apparatus aimed at reproducing a typical lagoonal environment subject to tidal forcings. The experimental apparatus consists of two adjoining basins reproducing the sea and the lagoon. The tide is generated at the sea by a sharp-edged vertical steel weir, oscillating vertically. The weir is driven by purpose-built software which continuously corrects the weir motion on the basis of water levels measured at the sea, allowing us to generate a sinusoidal tide of fixed amplitude and period, oscillating around mean water level. The bottom of the lagoon is covered by a layer of cohesionless plastic grains with a density of 1041 kg/m3, characterized by a nearly uniform grain size distribution with a median grain size of 0.8 mm. The lack of external sediment supply, the absence of vegetation, and the prevalence of bedload transport prevent any deposition processes and lateral surface accretion, attributing a purely erosive character to the experimental lagoon. As a consequence, in this experimental lagoon the main morphodynamic process responsible for tidal network initiation and development is the differential erosion between the channels and the adjacent surface. The experiments were designed to analyze the effects of mean sea level variations on channel network dynamics, focusing on the changes of the relevant geomorphic characteristics of the experimental networks, such as drainage density, based on the probability distribution of unchanneled lengths, and flowing tidal prism. Our results suggest that a decrease in the tidal prism leads to network retreat and contraction of channel cross sections. Conversely, an increase in the tidal prism promotes network re-incision and re-expansion of channel cross sections. In general, contractions and expansions tend to occur within the same planar blueprint and the network re-expands cutting

  2. A Software Process Framework for the SEI Capability Maturity Model: Repeatable Level

    DTIC Science & Technology

    1993-06-01

    Vilfredo Pareto, that most effects come from relatively few causes, i.e., 80% of the effects come from 20% of the possible causes. peer review - A review...to the next maturity level. Using the Pareto principle [Juran88b], the CMM prescribes the "vital few" key process areas to focus on depending on an...or interfacing with the individuals responsible for performing in the topic area. (See train for contrast.) Pareto analysis - The analysis of defects

  3. Framework for DOE mixed low-level waste disposal: Site fact sheets

    SciTech Connect

    Gruebel, M.M.; Waters, R.D.; Hospelhorn, M.B.; Chu, M.S.Y.

    1994-11-01

    The Department of Energy (DOE) is required to prepare and submit Site Treatment Plans (STPS) pursuant to the Federal Facility Compliance Act (FFCAct). Although the FFCAct does not require that disposal be addressed in the STPS, the DOE and the States recognize that treatment of mixed low-level waste will result in residues that will require disposal in either low-level waste or mixed low-level waste disposal facilities. As a result, the DOE is working with the States to define and develop a process for evaluating disposal-site suitability in concert with the FFCAct and development of the STPS. Forty-nine potential disposal sites were screened; preliminary screening criteria reduced the number of sites for consideration to twenty-six. The DOE then prepared fact sheets for the remaining sites. These fact sheets provided additional site-specific information for understanding the strengths and weaknesses of the twenty-six sites as potential disposal sites. The information also provided the basis for discussion among affected States and the DOE in recommending sites for more detailed evaluation.

  4. Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework.

    PubMed

    Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana

    2014-06-01

    Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd.
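    The paper's machinery (SPDE-based spatial statistics computed through Gaussian Markov random fields) is far beyond a snippet, but its core computational idea, that a sparse precision matrix encodes smoothness and the Gaussian posterior follows by adding precisions, can be sketched on a toy 1-D smoothing problem. Every number below (grid size, precisions, the synthetic signal) is illustrative only, not from the paper.

```python
import numpy as np

n = 50
# First-difference matrix encoding a random-walk (smoothness) prior
D = np.diff(np.eye(n), axis=0)              # shape (n-1, n)
tau, sigma2 = 10.0, 0.5                     # hypothetical prior precision / noise variance
Q_prior = tau * D.T @ D + 1e-6 * np.eye(n)  # banded (sparse-structured) GMRF precision

# Synthetic noisy observations of a smooth field
rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, 3, n))
y = truth + rng.normal(scale=np.sqrt(sigma2), size=n)

# Gaussian conjugacy: posterior precision is the sum of precisions,
# and the posterior mean solves one (banded) linear system
Q_post = Q_prior + np.eye(n) / sigma2
mu_post = np.linalg.solve(Q_post, y / sigma2)
```

    In the real application the precision matrix comes from an SPDE discretisation over Antarctica and the "observations" are multiple satellite data streams, but the solve-a-sparse-system structure is the same, which is what makes the hierarchical model computationally feasible.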

  5. Impacts of static pressure set level on HVAC energy consumption and indoor conditions

    SciTech Connect

    Liu, M.; Zhu, Y.; Claridge, D.E.; White, E.

    1997-12-31

    Air static pressure must be maintained at a certain level leaving the air-handling unit (AHU) to force a suitable amount of air through the terminal boxes. However, an excessive static pressure level is often used due to (1) lack of a control device in a constant-volume (CV) system, (2) a malfunctioning control device in a variable-air-volume (VAV) system, and (3) fear of failure to maintain room temperature. High static pressure often develops excessive damper leakage in older mixing boxes. This results in inappropriate mixing of hot and cold air for dual-duct systems, excessive reheat in single-duct systems, and an excessive amount of air entering the space. Consequently, the actual fan power and heating and cooling energy consumption all become significantly higher than the design values. Even worse, the system may not be able to maintain room conditions due to unwanted simultaneous heating and cooling and may be noisy due to the excessive static pressure. This paper proposes controlling the hot duct pressure and using variable-frequency drives (VFDs) to control the fan static pressure, i.e., the cold duct pressure, for dual-duct air-handling units. Both a theoretical analysis and results from a case study are presented.

  6. RD50 value as the criterion for setting maximum admissible levels of occupational exposure to irritants in Poland.

    PubMed

    Kupczewska-Dobecka, Małgorzata; Soćko, Renata; Czerczak, Sławomir

    2006-01-01

    The aim of this work is to analyse Maximum Admissible Concentration (MAC) values proposed for irritants by the Group of Experts for Chemical Agents in Poland, based on the RD50 value. In 1994-2004, MAC values for irritants based on the RD50 value were set for 17 chemicals. For the purpose of the analysis, 1/10 RD50, 1/100 RD50 and the MAC/RD50 ratio were calculated. The determined MAC values are within the 0.01-0.09 RD50 range. The RD50 value is a good rough criterion for setting MAC values for irritants, and it makes it possible to quickly estimate admissible exposure levels. It has become clear that, in some cases, simply setting the MAC value for an irritant at the level of 0.03 RD50 may be insufficient to determine precisely the possible hazard to workers' health. Other available toxicological data, such as the NOAEL (No-Observed-Adverse-Effect Level) and LOAEL (Lowest-Observed-Adverse-Effect Level), should always be considered as well.
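    The 0.03·RD50 rule discussed above is a one-line calculation; the sketch below just makes it explicit and checks a candidate value against the paper's reported 0.01-0.09 RD50 range. The function name and the 200 ppm example are illustrative.

```python
def mac_from_rd50(rd50, fraction=0.03):
    """Rough MAC estimate as a fixed fraction of the RD50 sensory-irritation benchmark."""
    return fraction * rd50

# E.g., a hypothetical RD50 of 200 ppm gives a candidate MAC of ~6 ppm;
# its MAC/RD50 ratio then sits inside the paper's reported 0.01-0.09 range.
mac = mac_from_rd50(200.0)
ratio = mac / 200.0
```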

  7. Implementing and Measuring the Level of Laboratory Service Integration in a Program Setting in Nigeria

    PubMed Central

    Mbah, Henry; Negedu-Momoh, Olubunmi Ruth; Adedokun, Oluwasanmi; Ikani, Patrick Anibbe; Balogun, Oluseyi; Sanwo, Olusola; Ochei, Kingsley; Ekanem, Maurice; Torpey, Kwasi

    2014-01-01

    Background The surge of donor funds to fight the HIV/AIDS epidemic inadvertently resulted in the setup of laboratories as parallel structures to rapidly respond to the identified need. However, these parallel structures are a threat to the existing fragile laboratory systems, and laboratory service integration is critical to remedy this situation. This paper describes an approach to quantitatively measure and track integration of HIV-related laboratory services into the mainstream laboratory services and highlights some key intervention steps taken to enhance service integration. Method A quantitative before-and-after study was conducted in 122 Family Health International (FHI360)-supported health facilities across Nigeria. A minimum service package was identified, covering management structure; trainings; equipment utilization and maintenance; and information, commodity and quality management for laboratory integration. A checklist was used to assess facilities at baseline and 3-month follow-up. Level of integration was assessed on an ordinal scale (0 = no integration, 1 = partial integration, 2 = full integration) for each service package. A composite score grading, expressed as a percentage of the total obtainable score of 14, was defined and used to classify facilities (≥80% FULL, 25% to 79% PARTIAL and <25% NO integration). Weaknesses were noted and addressed. Results We analyzed 9 (7.4%) primary, 104 (85.2%) secondary and 9 (7.4%) tertiary level facilities. There were statistically significant differences in integration levels between baseline and 3-month follow-up (p<0.01). Baseline median total integration score was 4 (IQR 3 to 5) compared to 7 (IQR 4 to 9) at 3-month follow-up (p = 0.000). Partially and fully integrated laboratory systems numbered 64 (52.5%) and 0 (0.0%) at baseline, compared to 100 (82.0%) and 3 (2.4%) respectively at 3-month follow-up (p = 0.000). Discussion This project showcases our novel approach to measure the status of each
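    The scoring scheme described above (seven service packages, each scored 0/1/2, total out of 14, with percentage cut-offs) is easy to mechanise. The sketch below reads the abstract's "≤80% FULL" as a typo for "≥80%", since the categories must partition the range; the function name and example scores are illustrative.

```python
def classify_integration(scores, max_total=14):
    """Map ordinal per-package scores (0 = none, 1 = partial, 2 = full)
    to an overall integration category via the composite percentage."""
    pct = 100.0 * sum(scores) / max_total
    if pct >= 80:        # assumes the abstract's "<=80% FULL" means ">=80%"
        return "FULL"
    elif pct >= 25:
        return "PARTIAL"
    return "NO"

# Hypothetical facility: seven packages scored 1,1,1,0,2,1,1 -> 7/14 = 50%
category = classify_integration([1, 1, 1, 0, 2, 1, 1])
```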

  8. Exploring a morphodynamic modeling framework for reef island evolution under sea-level rise

    NASA Astrophysics Data System (ADS)

    Lorenzo Trueba, J.; Ashton, A. D.; Donnelly, J. P.

    2013-12-01

    Global sea-level rise rates have increased over the last century, with dramatic rate increases expected over the coming century and beyond. Not only are rates projected to approach those of the previous deglaciation, the actual increase in elevation by the end of the century (potentially 1 m or more) will be significant in terms of the elevations of low-lying coastal landforms. Coral reef islands, often called 'cays' or 'motus', which generally comprise the subaerial portion of atolls, are particularly sensitive to sea-level rise. These landforms are typically low-lying (on the order of meters high) and are formed of wave-transported detrital sediment perched atop coralline rock. As opposed to barrier islands, which can be supplied by offshore sediment from the shoreface, breakdown of corals and the shallow offshore lithology can serve as a source of sediment to reef islands, which can help build these islands as sea level rises. Here, we present a morphodynamic model to explore the combined effects of sea-level rise, sediment supply, and overwash processes on the evolution of reef islands. Model results demonstrate how reef islands are particularly sensitive to the offshore generation of sediment. When this onshore sediment supply is low, islands migrate lagoonward via storm overwash. Islands migrate over the proximal lagoonward regions, which tend to include a shallow (~2 m) platform, until they reach the edge of a typically very deep lagoon (up to 60 m or more). At the lagoon edge, reef islands stop their migration and eventually drown as overwash sediment flux is lost to the lagoon. In contrast, a high supply of offshore sediment can bulwark reef islands before they reach the lagoon edge. One possibility is that the island attains a 'static equilibrium' in which the overwash flux fills the top-barrier accommodation created by sea-level rise, and the island surface area is maintained. When the sediment supply is very high, however, the island can undergo rapid

  9. Transmembrane proteoglycans control stretch-activated channels to set cytosolic calcium levels

    PubMed Central

    Gopal, Sandeep; Søgaard, Pernille; Multhaupt, Hinke A.B.; Pataki, Csilla; Okina, Elena; Xian, Xiaojie; Pedersen, Mikael E.; Stevens, Troy; Griesbeck, Oliver; Park, Pyong Woo; Pocock, Roger

    2015-01-01

    Transmembrane heparan sulfate proteoglycans regulate multiple aspects of cell behavior, but the molecular basis of their signaling is unresolved. The major family of transmembrane proteoglycans is the syndecans, present in virtually all nucleated cells, but with mostly unknown functions. Here, we show that syndecans regulate transient receptor potential canonical (TRPCs) channels to control cytosolic calcium equilibria and consequent cell behavior. In fibroblasts, ligand interactions with heparan sulfate of syndecan-4 recruit cytoplasmic protein kinase C to target serine714 of TRPC7 with subsequent control of the cytoskeleton and the myofibroblast phenotype. In epidermal keratinocytes a syndecan–TRPC4 complex controls adhesion, adherens junction composition, and early differentiation in vivo and in vitro. In Caenorhabditis elegans, the TRPC orthologues TRP-1 and -2 genetically complement the loss of syndecan by suppressing neuronal guidance and locomotory defects related to increases in neuronal calcium levels. The widespread and conserved syndecan–TRPC axis therefore fine-tunes cytoskeletal organization and cell behavior. PMID:26391658

  10. Creating an occupational therapy Level II fieldwork experience in a county jail setting.

    PubMed

    Provident, Ingrid M; Joyce-Gaguzis, Kelly

    2005-01-01

    Although occupational therapy services have been rendered in prisons historically, only one occupational therapy program currently exists in a county jail: the Allegheny County Jail Project (ACJ Project). The offenders who populate county jails experience occupational deprivation. The participants of the ACJ Project have benefited from occupational therapy intervention that was initiated during incarceration and continued following their release from jail in order to help them resume productive life roles and to reduce the reoccurrence of engagement in criminal behaviors (recidivism rate). As of June 2003, the ACJ Project has successfully affected the lifestyle patterns of its participants and overall public safety by helping 63% of participants secure gainful employment and by helping 91.8% of participants maintain their freedom after prison. The purpose of this report is to describe the process and benefits of implementing fieldwork opportunities for Level II occupational therapy students in a best practice occupational therapy program in a nontraditional environment: a county jail.

  11. A Unified Framework for Multi-level Processing of Complex Data

    DTIC Science & Technology

    2014-12-04

    each point representing an image thumb-nail, highlight of a medical record, spectral curve for every pixel of an HSI cube, etc. A weighted graph, with data similarities as weights, is... Invited speaker, two lectures: "Multi-level methods for image inpainting" and "Manifold approach to high-dimensional data

  12. Using the World Health Organization's 4S-Framework to Strengthen National Strategies, Policies and Services to Address Mental Health Problems in Adolescents in Resource-Constrained Settings

    PubMed Central

    2011-01-01

    Background Most adolescents live in resource-constrained countries, and their mental health has been less well recognised than other aspects of their health. The World Health Organization's 4-S Framework provides a structure for national initiatives to improve adolescent health through: gathering and using strategic information; developing evidence-informed policies; scaling up provision and use of health services; and strengthening linkages with other government sectors. The aim of this paper is to discuss how the findings of a recent systematic review of mental health problems in adolescents in resource-constrained settings might be applied using the 4-S Framework. Method Analysis of the implications of the findings of a systematic search of the English-language literature for national strategies, policies, services and cross-sectoral linkages to improve the mental health of adolescents in resource-constrained settings. Results Data are available for only 33/112 (29%) resource-constrained countries, but in all countries where data are available, non-psychotic mental health problems in adolescents are identifiable, prevalent and associated with reduced quality of life, impaired participation and compromised development. In the absence of evidence about effective interventions in these settings, expert opinion is that a broad public policy response is required, addressing direct strategies for prevention, early intervention and treatment; health service and health workforce requirements; social inclusion of marginalised groups of adolescents; and specific education. Specific endorsed strategies include public education, parent education, training for teachers and primary healthcare workers, psycho-educational curricula, identification through periodic screening of the most vulnerable and referral for care, and the availability of counsellors or other identified trained staff members in schools from whom adolescents can seek assistance for personal, peer and family

  13. Examining Screening-Level Multimedia Models Through a Comparison Framework for Landfill Management.

    PubMed

    Asif, Zunaira; Chen, Zhi

    2016-01-01

    Two models for evaluating the transport and fate of benzene were studied and compared in this paper. A fugacity model and an analytical environmental multimedia model (AEMM) were used to reconcile the fate and mass transfer of benzene observed at a landfill site. The comparison of the two models was based on average concentrations and the partition behavior of benzene among three different phases, i.e., air, soil, and groundwater. In the fugacity study, about 99.6% of the total benzene flux was distributed into air from the landfill source. According to the AEMM, diffusive gas flux was also the predominant mechanism for benzene released from the landfill, and advection of gas and liquid was the second dominant transport mechanism at steady-state conditions. Overall, the fugacity modeling (Levels I and II) confirms the fate and transport mechanisms of benzene released from the landfill when compared with the AEMM. However, the concentrations, advection, and diffusion fluxes of benzene predicted by the fugacity model differed from the AEMM results due to variation in input parameters. In comparison with experimental observations, the fugacity model showed a larger error than the AEMM, as the fugacity model treats the site as a single well-mixed box. This study confirms that the fugacity model is a screening-level tool, to be used in conjunction with the more detailed AEMM, which can then support the strategic decision-making stage.
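    A Level I fugacity calculation of the kind referenced above is a closed-form mass balance: a single system-wide fugacity f = M / Σ(Vᵢ·Zᵢ), after which each compartment's mass is Vᵢ·Zᵢ·f. The fugacity capacities, volumes, and total inventory below are invented for illustration; they are not the paper's benzene parameters, though they reproduce the qualitative result that nearly all mass partitions to air.

```python
# Hypothetical fugacity capacities Z (mol/(m^3*Pa)) and compartment volumes V (m^3)
Z = {"air": 4.0e-4, "water": 2.3e-3, "soil": 1.0e-2}  # illustrative values only
V = {"air": 1.0e9, "water": 1.0e6, "soil": 1.0e5}
M_total = 1000.0  # mol of benzene in the system (illustrative)

# Level I: one fugacity f for the whole system at equilibrium
f = M_total / sum(V[c] * Z[c] for c in Z)

# Mass and mass fraction in each compartment
mass = {c: V[c] * Z[c] * f for c in Z}
frac_air = mass["air"] / M_total
```

    With these invented values roughly 99% of the mass ends up in air, which is why a Level I screen, despite treating the site as one well-mixed box, can still flag the dominant exposure pathway before a more detailed model like the AEMM is applied.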

  15. High Levels of Post-Abortion Complication in a Setting Where Abortion Service Is Not Legalized

    PubMed Central

    Melese, Tadele; Habte, Dereje; Tsima, Billy M.; Mogobe, Keitshokile Dintle; Chabaesele, Kesegofetse; Rankgoane, Goabaone; Keakabetse, Tshiamo R.; Masweu, Mabole; Mokotedi, Mosidi; Motana, Mpho; Moreri-Ntshabele, Badani

    2017-01-01

    Background Maternal mortality due to abortion complications stands among the three leading causes of maternal death in Botswana, where there is a restrictive abortion law. This study aimed at assessing the patterns and determinants of post-abortion complications. Methods A retrospective institution-based cross-sectional study was conducted at four hospitals from January to August 2014. Data were extracted from patients’ records with regard to their socio-demographic variables, abortion complications and length of hospital stay. Descriptive statistics and bivariate analysis were employed. Result A total of 619 patients’ records were reviewed, with a mean (SD) age of 27.12 (5.97) years. The majority of abortions (95.5%) were reported to be spontaneous and 3.9% of the abortions were induced by the patient. Two thirds of the patients were admitted on their first visit to the hospitals and one third were referrals from other health facilities. Two thirds of the patients were admitted as a result of incomplete abortion, followed by inevitable abortion (16.8%). Offensive vaginal discharge (17.9%), tender uterus (11.3%), septic shock (3.9%) and pelvic peritonitis (2.4%) were among the physical findings recorded on admission. Clinically detectable anaemia, evidenced by pallor, was found to be the leading major complication, in 193 (31.2%) of the cases, followed by hypovolemic and septic shock (65; 10.5%). There were a total of 9 abortion-related deaths, a case fatality rate of 1.5%. Self-induced abortion and delayed uterine evacuation of more than six hours were found to have a significant association with post-abortion complications (p-values of 0.018 and 0.035, respectively). Conclusion Abortion-related complications and deaths are high in our setting, where abortion is illegal. Mechanisms need to be devised in the health facilities to evacuate the uterus in good time whenever it is indicated, and to be equipped to handle the fatal complications. There is an indication for

  16. Hydrogeologic setting east of a low-level radioactive-waste disposal site near Sheffield, Illinois

    USGS Publications Warehouse

    Foster, J.B.; Garklavs, George; Mackey, G.W.

    1984-01-01

    Core samples from 45 test wells and 4 borings were used to describe the glacial geology of the area east of the low-level radioactive-waste disposal site near Sheffield, Bureau County, Illinois. Previous work has shown that shallow ground water beneath the disposal site flows east through a pebbly-sand unit of the Toulon Member of the Glasford Formation. The pebbly sand was found in core samples from wells in an area extending northeast from the waste-disposal site to a strip-mine lake and east along the south side of the lake. Other stratigraphic units identified in the study area are correlated with units found on the disposal site. The pebbly-sand unit of the Toulon Member grades from a pebbly sand on site into a coarse gravel with sand and pebbles towards the lake. The Hulick Till Member, a key bed, underlies the Toulon Member throughout most of the study area. A narrow channel-like depression in the Hulick Till is filled with coarse gravelly sand of the Toulon Member. The filled depression extends eastward from near the northeast corner of the waste-disposal site to the strip-mine lake. (USGS)

  17. Gradient Augmented Level Set Method for Two Phase Flow Simulations with Phase Change

    NASA Astrophysics Data System (ADS)

    Anumolu, C. R. Lakshman; Trujillo, Mario F.

    2016-11-01

    A sharp interface capturing approach is presented for two-phase flow simulations with phase change. The Gradient Augmented Level Set (GALS) method is coupled with the two-phase momentum and energy equations to advect the liquid-gas interface and predict heat transfer with phase change. The Ghost Fluid Method (GFM) is adopted for the velocity field to discretize the advection and diffusion terms in the interfacial region. Furthermore, the GFM is employed to treat the discontinuities in the stress tensor, velocity, and temperature gradient, yielding an accurate treatment of the jump conditions. Thermal convection and diffusion terms are approximated by explicitly identifying the interface location, resulting in a sharp treatment for the energy solution. This sharp treatment is extended to estimate the interfacial mass transfer rate. Within each computational cell, a cubic Hermite interpolating polynomial is employed to describe the interface location, which is locally fourth-order accurate. This extent of subgrid-level description provides an accurate methodology for treating various interfacial processes with a high degree of sharpness. The ability to predict the interface and temperature evolutions accurately is illustrated by comparing numerical results with existing 1D to 3D analytical solutions.
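
The interface-capturing idea can be illustrated with a much simpler scheme than the paper's: first-order upwind advection of a one-dimensional level-set field, with the interface tracked as the zero crossing of φ. The gradient-augmented method additionally transports ∇φ and uses Hermite interpolation for higher accuracy; this sketch (with invented grid parameters) only shows the basic mechanism.

```python
def advect_level_set(phi, u, dx, dt, steps):
    """First-order upwind advection of a 1-D level-set field phi by a
    constant velocity u (illustration only; GALS is higher order)."""
    phi = list(phi)
    n = len(phi)
    for _ in range(steps):
        new = phi[:]
        for i in range(1, n - 1):
            # Upwind difference: take the gradient from the inflow side.
            if u > 0:
                grad = (phi[i] - phi[i - 1]) / dx
            else:
                grad = (phi[i + 1] - phi[i]) / dx
            new[i] = phi[i] - dt * u * grad
        phi = new
    return phi

def zero_crossing(phi, dx):
    """Locate the interface (phi = 0) by linear interpolation."""
    for i in range(len(phi) - 1):
        if phi[i] <= 0.0 < phi[i + 1] or phi[i] >= 0.0 > phi[i + 1]:
            return dx * (i + phi[i] / (phi[i] - phi[i + 1]))
    return None

dx, dt, u = 0.01, 0.005, 1.0                  # CFL = u*dt/dx = 0.5
phi0 = [x * dx - 0.25 for x in range(101)]    # signed distance, interface at x = 0.25
phi = advect_level_set(phi0, u, dx, dt, steps=40)   # advance to t = 0.2
pos = zero_crossing(phi, dx)
print(pos)   # interface carried to x = 0.25 + u*t = 0.45
```

Because the initial field is exactly linear, the upwind scheme transports it without distortion here; on curved interfaces the first-order scheme smears, which is precisely what the gradient-augmented approach is designed to avoid.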

  18. Development of a Software Framework for System-Level Carbon Sequestration Risk Assessment

    SciTech Connect

    Miller, R.

    2013-02-28

    The overall purpose of this project was to identify, evaluate, select, develop, and test a suite of enhancements to the GoldSim software program, in order to make it a better tool for use in support of Carbon Capture and Sequestration (CCS) projects. The GoldSim software is a foundational tool used by scientists at NETL and at other laboratories and research institutions to evaluate system-level risks of proposed CCS projects. The primary product of the project was a series of successively improved versions of the GoldSim software, supported by an extensive User’s Guide. All of the enhancements were tested by scientists at Los Alamos National Laboratory, and several of the enhancements have already been incorporated into the CO2-PENS sequestration model.

  19. Selective removal of cesium and strontium using porous frameworks from high level nuclear waste.

    PubMed

    Aguila, Briana; Banerjee, Debasis; Nie, Zimin; Shin, Yongsoon; Ma, Shengqian; Thallapally, Praveen K

    2016-05-01

    Efficient and cost-effective removal of the radioactive isotopes ¹³⁷Cs and ⁹⁰Sr found in spent fuel is an important step for safe, long-term storage of nuclear waste. Solid-state materials such as resins and titanosilicate zeolites have been assessed for the removal of Cs and Sr from aqueous solutions, but there is room for improvement in terms of capacity and selectivity. Herein, we report the Cs⁺ and Sr²⁺ exchange potential of an ultrastable MOF, namely MIL-101-SO3H, as a function of different contact times, concentrations, and pH levels, and in the presence of competing ions. Our preliminary results suggest that MOFs with suitable ion exchange groups can be promising alternative materials for cesium and strontium removal.

  20. Setting action levels for drinking water: are we protecting our health or our economy (or our backs!)?

    PubMed

    Reimann, Clemens; Banks, David

    2004-10-01

    Clean and healthy drinking water is important for life. Drinking water can be drawn from streams, lakes and rivers, directly collected (and stored) from rain, acquired by desalination of ocean water and melting of ice or it can be extracted from groundwater resources. Groundwater may reach the earth's surface in the form of springs or can be extracted via dug or drilled wells; it also contributes significantly to river baseflow. Different water quality issues have to be faced when utilising these different water resources. Some of these are at present largely neglected in water quality regulations. This paper focuses on the inorganic chemical quality of natural groundwater. Possible health effects, the problems of setting meaningful action levels or maximum admissible concentrations (MAC-values) for drinking water, and potential shortcomings in current legislation are discussed. An approach to setting action levels based on transparency, toxicological risk assessment, completeness, and identifiable responsibility is suggested.

  1. Level Set Methods for Optimization Problems Involving Geometry and Constraints. I. Frequencies of a Two-Density Inhomogeneous Drum

    NASA Astrophysics Data System (ADS)

    Osher, Stanley J.; Santosa, Fadil

    2001-07-01

    Many problems in engineering design involve optimizing the geometry to maximize a certain design objective. Geometrical constraints are often imposed. In this paper, we use the level set method devised in (Osher and Sethian, J. Comput. Phys. 79, 12 (1988)), the variational level set calculus presented in (Zhao et al., J. Comput. Phys. 127, 179 (1996)), and the projected gradient method, as in (Rudin et al., Physica D 60, 259 (1992)), to construct a simple numerical approach for problems of this type. We apply this technique to a model problem involving a vibrating system whose resonant frequency or whose spectral gap is to be optimized subject to constraints on geometry. Our numerical results are quite promising. We expect to use this approach to deal with a wide class of optimal design problems in the future.
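
The projected-gradient ingredient of this approach can be shown in isolation: step along the negative gradient, then project back onto the constraint set. The quadratic objective and linear constraint below are toy stand-ins, not the paper's eigenvalue problem.

```python
# Minimal projected-gradient sketch: minimize sum((x_i - a_i)^2)
# subject to the linear constraint sum(x) = c. The projection onto
# {x : sum(x) = c} just shifts every component by the mean gap.
def project_onto_sum(x, c):
    shift = (c - sum(x)) / len(x)
    return [xi + shift for xi in x]

def projected_gradient(grad, x0, c, step=0.1, iters=200):
    x = project_onto_sum(x0, c)
    for _ in range(iters):
        g = grad(x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
        x = project_onto_sum(x, c)            # restore feasibility
    return x

a = [0.9, 0.3, 0.2]                           # toy target point
grad = lambda x: [2 * (xi - ai) for xi, ai in zip(x, a)]
x_opt = projected_gradient(grad, [0.0, 0.0, 0.0], 1.0)
print(x_opt)   # closest feasible point to a on the plane sum(x) = 1
```

The analytic solution shifts each a_i by (1 − Σa)/n, i.e. by −0.4/3 here; the iteration converges to it geometrically.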

  2. A level-set approach for eddy current imaging of defects in a conductive half-space

    NASA Astrophysics Data System (ADS)

    Litman, A.; Lesselier, D.; Santosa, F.

    The retrieval of the shape of a cylindrical defect of low conductivity buried in a conductive half-space is investigated from aspect-limited, frequency-diverse data. The sources of the interrogating fields and the receivers of the scattered (anomalous) fields are both placed on the same side of a particular interface; the defect is embedded on the other side. We derive an iterative process based on level-set methods, which have been shown to be effective in treating problems with propagating fronts and are based on the ideas developed by Osher and Sethian. At each iteration, the boundary of the defect moves with a speed term that minimizes the residual in the data fit. The resulting equation of motion is solved by employing entropy-satisfying upwind finite-difference schemes.

  3. WriteSmoothing: Improving Lifetime of Non-volatile Caches Using Intra-set Wear-leveling

    SciTech Connect

    Mittal, Sparsh; Vetter, Jeffrey S; Li, Dong

    2014-01-01

    Driven by the trend of increasing core counts and the bandwidth-wall problem, the size of last-level caches (LLCs) has greatly increased. Since SRAM consumes high leakage power, researchers have explored the use of non-volatile memories (NVMs) for designing caches, as they provide high density and consume low leakage power. However, since NVMs have low write endurance and existing cache management policies are unaware of write variation, effective wear-leveling techniques are required to achieve reasonable cache lifetimes with NVMs. We present WriteSmoothing, a technique for mitigating intra-set write variation in NVM caches. WriteSmoothing logically divides the cache sets into multiple modules. For each module, WriteSmoothing collectively records the number of writes to each way across the module's sets. It then periodically makes the most frequently written ways in a module unavailable, shifting the write pressure to the other ways in the sets of the module. Extensive simulation results show that, on average, for single- and dual-core system configurations, WriteSmoothing improves cache lifetime by 2.17X and 2.75X, respectively. Also, its implementation overhead is small and it works well for a wide range of algorithm and system parameters.
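
The record-then-block idea can be sketched with a toy counter model: keep per-way write counts for a module, and at the end of each interval make the hottest way unavailable so a skewed workload spreads over the remaining ways. The class names, interval length, and round-robin remapping below are illustrative simplifications, not the paper's mechanism.

```python
# Toy sketch of intra-set wear-leveling in the spirit of WriteSmoothing:
# counters per way, periodic blocking of the most-written way.
import random

class WearLevelingModule:
    def __init__(self, num_ways):
        self.writes = [0] * num_ways
        self.unavailable = None          # at most one blocked way

    def record_write(self, way):
        self.writes[way] += 1

    def pick_way(self):
        ways = [w for w in range(len(self.writes)) if w != self.unavailable]
        return random.choice(ways)

    def smooth(self):
        # Block the hottest way for the next interval.
        self.unavailable = max(range(len(self.writes)),
                               key=lambda w: self.writes[w])

random.seed(0)
# Without smoothing, a skewed workload hammers way 0.
biased = WearLevelingModule(4)
for _ in range(1000):
    biased.record_write(0 if random.random() < 0.7 else random.randrange(4))

# With periodic smoothing, writes to the blocked hot way are remapped.
leveled = WearLevelingModule(4)
for i in range(1000):
    if i % 100 == 0:
        leveled.smooth()
    way = 0 if random.random() < 0.7 else leveled.pick_way()
    if way == leveled.unavailable:
        way = leveled.pick_way()         # hot way blocked: remap the write
    leveled.record_write(way)

spread = lambda m: max(m.writes) - min(m.writes)
print(spread(biased), spread(leveled))   # smoothing narrows the write gap
```

With the skew above, the unsmoothed module ends up with one way absorbing roughly 70% of all writes, while the smoothed module's counters stay far closer together, which is the lifetime-extending effect the paper quantifies.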

  4. Dynamic compensation mechanism gives rise to period and duty-cycle level sets in oscillatory neuronal models.

    PubMed

    Rotstein, Horacio G; Olarinre, Motolani; Golowasch, Jorge

    2016-11-01

    Rhythmic oscillation in neurons can be characterized by various attributes, such as the oscillation period and duty cycle. The values of these features depend on the amplitudes of the participating ionic currents, which can be characterized by their maximal conductance values. Recent experimental and theoretical work has shown that the values of these attributes can be maintained constant for different combinations of two or more ionic currents of varying conductances, defining what are known as level sets in conductance space. In two-dimensional conductance spaces, a level set is a curve, often a line, along which a particular oscillation attribute value is conserved. In this work, we use modeling, dynamical systems tools (phase-space analysis), and numerical simulations to investigate the possible dynamic mechanisms responsible for the generation of period and duty-cycle level sets in simplified (linearized and FitzHugh-Nagumo) and conductance-based (Morris-Lecar) models of neuronal oscillations. A simplistic hypothesis would be that the tonic balance between ionic currents with the same or opposite effective signs is sufficient to create level sets. According to this hypothesis, the dynamics of each ionic current during a given cycle are well captured by some constant quantity (e.g., maximal conductances), and the phase-plane diagrams are identical or almost identical (e.g., cubic-like nullclines with the same maxima and minima) for different combinations of these maximal conductances. In contrast, we show that these mechanisms are dynamic and involve the complex interaction between the nonlinear voltage dependencies and the effective time scales at which the ionic currents' dynamical variables operate.
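
Numerically, a level set in a two-parameter conductance space can be traced by fixing one conductance and root-finding on the other: for each g1, bisection finds the g2 giving the target period. The period function below is an invented monotone toy, not a neuronal model, but the tracing procedure is the same.

```python
# Trace a level set {(g1, g2) : T(g1, g2) = target} by bisection on g2.
# T's functional form here is an illustrative stand-in only.
def period(g1, g2):
    return 10.0 / (g1 + 2.0 * g2)      # toy: period falls as conductances rise

def g2_on_level_set(g1, target, lo=1e-6, hi=100.0, tol=1e-10):
    """Bisection on g2: period is decreasing in g2, so bracket and halve."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if period(g1, mid) > target:
            lo = mid                    # period too long: raise g2
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

target = 2.0
curve = [(g1, g2_on_level_set(g1, target)) for g1 in (1.0, 2.0, 3.0)]
print(curve)   # along the curve, period(g1, g2) stays at the target
```

For this toy T the level set is the line g1 + 2·g2 = 5, so the traced points should fall on it; with a simulated neuron, `period` would instead run the model and measure the oscillation.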

  5. Robust Anisotropic Diffusion Based Edge Enhancement for Level Set Segmentation and Asymmetry Analysis of Breast Thermograms using Zernike Moments.

    PubMed

    Prabha, S; Sujatha, C M; Ramakrishnan, S

    2015-01-01

    Breast thermography plays a major role in the early detection of breast cancer, in which thermal variations are associated with the precancerous state of the breast. The distribution of asymmetrical thermal patterns indicates pathological conditions in breast thermal images. In this work, asymmetry analysis of breast thermal images is carried out using level set segmentation and Zernike moments. The breast tissues are subjected to Tukey’s biweight robust anisotropic diffusion filtering (TBRAD) for the generation of an edge map. The reaction-diffusion level set method is employed for segmentation, in which the TBRAD edge map is used as the stopping criterion during the level set evolution. Zernike moments are extracted from the segmented breast tissues to perform asymmetry analysis. Results show that the TBRAD filter is able to enhance the edges near the inframammary folds and lower breast boundaries effectively. The segmented breast tissues are found to be continuous and to have sharper boundaries. This method yields a high degree of correlation (98%) between the segmented output and the ground truth images. Among the extracted Zernike features, higher-order moments are found to be significant in demarcating normal and carcinoma breast tissues, by 9%. It appears that the methodology adopted here is useful for accurate segmentation and differentiation of normal and carcinoma breast tissues for automated diagnosis of breast abnormalities.
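
The edge-preserving behaviour of robust anisotropic diffusion with Tukey's biweight can be shown in one dimension: gradients above the scale σ carry zero diffusive flux (the edge is frozen), while smaller fluctuations are smoothed away. The signal, σ, and step parameters below are illustrative, not the paper's 2-D thermogram pipeline.

```python
# 1-D robust anisotropic diffusion with Tukey's biweight edge-stopping
# function: g(x) = 0.5*(1 - (x/sigma)^2)^2 for |x| <= sigma, else 0.
def tukey_g(x, sigma):
    if abs(x) > sigma:
        return 0.0                       # outlier gradient: no diffusion
    r = (x / sigma) ** 2
    return 0.5 * (1.0 - r) ** 2

def diffuse(signal, sigma, lam=0.1, steps=50):
    s = list(signal)
    n = len(s)
    for _ in range(steps):
        new = s[:]
        for i in range(1, n - 1):
            dE = s[i + 1] - s[i]         # east and west neighbour gradients
            dW = s[i - 1] - s[i]
            new[i] = s[i] + lam * (tukey_g(dE, sigma) * dE
                                   + tukey_g(dW, sigma) * dW)
        s = new
    return s

# Small ripples (amplitude 0.05) around a unit step edge at index 5.
noisy = [0, 0.05, 0, 0.05, 0, 1, 1.05, 1, 1.05, 1]
d = diffuse(noisy, sigma=0.3)
print(d[4], d[5])   # the step survives while the ripples are flattened
```

Because the step gradient (≈1) exceeds σ, its flux is exactly zero at every iteration, which is the mechanism that keeps TBRAD edge maps sharp for the subsequent level set evolution.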

  6. Pulmonary Nodule Detection Model Based on SVM and CT Image Feature-Level Fusion with Rough Sets

    PubMed Central

    Lu, Huiling; Zhang, Junjie; Shi, Hongbin

    2016-01-01

    In order to improve the detection accuracy of pulmonary nodules in CT images, and considering two problems of pulmonary nodule detection models, namely unreasonable feature structure and nontightness of feature representation, a pulmonary nodule detection algorithm is proposed based on SVM and CT image feature-level fusion with rough sets. Firstly, CT images of pulmonary nodules are analyzed, and 42-dimensional feature components are extracted, including six new 3-dimensional features proposed in this paper and other 2-dimensional and 3-dimensional features. Secondly, these features are reduced five times with rough sets based on feature-level fusion. Thirdly, a grid optimization model is used to optimize the kernel function of the support vector machine (SVM), which is used as a classifier to identify pulmonary nodules. Finally, lung CT images of 70 patients with pulmonary nodules are collected as the original samples, which are used to verify the effectiveness and stability of the proposed model through four groups of comparative experiments. The experimental results show that the effectiveness and stability of the proposed model based on rough set feature-level fusion are improved to some degree. PMID:27722173
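
The rough-set reduction step can be illustrated with a greedy reduct on a tiny decision table: grow an attribute subset until no two samples agree on all selected attributes yet disagree on the label. The four-sample table is hypothetical, not the paper's 42-dimensional CT features.

```python
# Greedy rough-set style attribute reduction: keep adding the attribute
# that removes the most label inconsistencies until the projection is
# consistent. The decision table below is an invented toy example.
def consistent(samples, attrs):
    seen = {}
    for features, label in samples:
        key = tuple(features[a] for a in attrs)
        if seen.setdefault(key, label) != label:
            return False
    return True

def inconsistencies(samples, attrs):
    """Count samples whose projection clashes with an earlier label."""
    seen, clashes = {}, 0
    for features, label in samples:
        key = tuple(features[a] for a in attrs)
        if seen.setdefault(key, label) != label:
            clashes += 1
    return clashes

def greedy_reduct(samples, num_attrs):
    chosen = []
    while not consistent(samples, chosen):
        best = min((a for a in range(num_attrs) if a not in chosen),
                   key=lambda a: inconsistencies(samples, chosen + [a]))
        chosen.append(best)
    return chosen

# (feature vector, label): attribute 1 alone determines the label here.
table = [
    ((0, 0, 1), "benign"),
    ((1, 0, 0), "benign"),
    ((0, 1, 1), "nodule"),
    ((1, 1, 0), "nodule"),
]
print(greedy_reduct(table, 3))   # → [1]
```

In the full pipeline the reduced feature set would then be fed to the grid-optimized SVM; here the reduct simply exposes that attributes 0 and 2 are redundant for this table.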

  7. Comparison of different statistical methods for estimation of extreme sea levels with wave set-up contribution

    NASA Astrophysics Data System (ADS)

    Kergadallan, Xavier; Bernardara, Pietro; Benoit, Michel; Andreewsky, Marc; Weiss, Jérôme

    2013-04-01

    Estimating the probability of occurrence of extreme sea levels is a central issue for the protection of the coast. Return periods of sea level with wave set-up contribution are estimated here at one site: Cherbourg, France, on the English Channel. The methodology follows two steps: the first is the computation of the joint probability of simultaneous wave height and still sea level; the second is the interpretation of those joint probabilities to assess the sea level for a given return period. Two different approaches were evaluated to compute the joint probability of simultaneous wave height and still sea level: the first is multivariate extreme value distributions of logistic type, in which all components of the variables become large simultaneously; the second is a conditional approach for multivariate extreme values, in which only one component of the variables has to be large. Two different methods were applied to estimate the sea level with wave set-up contribution for a given return period: Monte Carlo simulation, in which the estimation is more accurate but needs more computation time, and classical ocean engineering design contours of inverse-FORM type, in which the method is simpler and allows more complex estimation of the wave set-up part (wave propagation to the coast, for example). We compare results from the two approaches with the two methods. To be able to use both the Monte Carlo simulation and design contours methods, wave set-up is estimated with a simple empirical formula. We show the advantages of the conditional approach compared to the multivariate extreme value approach when extreme sea levels occur when either surge or wave height is large. We discuss the validity of the ocean engineering design contours method, which is an alternative when the computation of sea levels is too complex to use the Monte Carlo simulation method.
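
The Monte Carlo route can be sketched end to end: draw correlated surge and wave-height maxima, add an empirical set-up contribution, and read a return level off the empirical quantile with annual exceedance probability 1/T. The marginal distributions, dependence structure, and the fixed set-up fraction are all invented assumptions, not the Cherbourg fits.

```python
# Toy Monte-Carlo return-level estimate for sea level = surge + wave set-up.
import random

def simulate_annual_maxima(n_years, rng):
    maxima = []
    for _ in range(n_years):
        surge = rng.gammavariate(2.0, 0.3)                    # m, toy surge
        hs = 2.0 + 1.5 * surge + rng.gammavariate(2.0, 0.5)   # m, correlated Hs
        setup = 0.2 * hs                  # crude empirical set-up fraction
        maxima.append(surge + setup)
    return maxima

def return_level(maxima, return_period_years):
    """Empirical quantile with annual exceedance probability 1/T."""
    s = sorted(maxima)
    rank = int(len(s) * (1.0 - 1.0 / return_period_years))
    return s[min(rank, len(s) - 1)]

rng = random.Random(42)
maxima = simulate_annual_maxima(50000, rng)
rl10 = return_level(maxima, 10)
rl100 = return_level(maxima, 100)
print(rl10, rl100)   # the 100-year level exceeds the 10-year level
```

A production study would replace the samplers with the fitted joint extreme-value model and propagate the waves to the coast; the quantile-reading step stays the same.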

  8. Levels of 8-OxodG Predict Hepatobiliary Pathology in Opisthorchis viverrini Endemic Settings in Thailand

    PubMed Central

    Jariwala, Amar R.; Sithithaworn, Jiraporn; Sripa, Banchob; Brindley, Paul J.; Laha, Thewarach; Mairiang, Eimorn; Pairojkul, Chawalit; Khuntikeo, Narong; Mulvenna, Jason; Sithithaworn, Paiboon; Bethony, Jeffrey M.

    2015-01-01

    Opisthorchis viverrini is distinct among helminth infections as it drives a chronic inflammatory response in the intrahepatic bile duct that progresses from advanced periductal fibrosis (APF) to cholangiocarcinoma (CCA). Extensive research shows that oxidative stress (OS) plays a critical role in the transition from chronic O. viverrini infection to CCA. OS also results in the excision of a modified DNA lesion (8-oxodG) into urine, the levels of which can be detected by immunoassay. Herein, we measured concentrations of urine 8-oxodG by immunoassay from the following four groups in the Khon Kaen Cancer Cohort study: (1) O. viverrini negative individuals, (2) O. viverrini positive individuals with no APF as determined by abdominal ultrasound, (3) O. viverrini positive individuals with APF as determined by abdominal ultrasound, and (4) O. viverrini induced cases of CCA. A logistic regression model was used to evaluate the utility of creatinine-adjusted urinary 8-oxodG among these groups, along with demographic, behavioral, and immunological risk factors. Receiver operating characteristic (ROC) curve analysis was used to evaluate the predictive accuracy of urinary 8-oxodG for APF and CCA. Elevated concentrations of 8-oxodG in urine positively associated with APF and CCA in a strongly dose-dependent manner. Urinary 8-oxodG concentrations also accurately predicted whether an individual presented with APF or CCA compared to O. viverrini infected individuals without these pathologies. In conclusion, urinary 8-oxodG is a robust ‘candidate’ biomarker of the progression of APF and CCA from chronic opisthorchiasis, which is indicative of the critical role that OS plays in both of these advanced hepatobiliary pathologies. The findings also confirm our previous observations that severe liver pathology occurs early and asymptomatically in residents of O. viverrini endemic regions, where individuals are infected for years (often decades) with this food-borne pathogen. These

  9. Levels of 8-OxodG Predict Hepatobiliary Pathology in Opisthorchis viverrini Endemic Settings in Thailand.

    PubMed

    Saichua, Prasert; Yakovleva, Anna; Kamamia, Christine; Jariwala, Amar R; Sithithaworn, Jiraporn; Sripa, Banchob; Brindley, Paul J; Laha, Thewarach; Mairiang, Eimorn; Pairojkul, Chawalit; Khuntikeo, Narong; Mulvenna, Jason; Sithithaworn, Paiboon; Bethony, Jeffrey M

    2015-01-01

    Opisthorchis viverrini is distinct among helminth infections as it drives a chronic inflammatory response in the intrahepatic bile duct that progresses from advanced periductal fibrosis (APF) to cholangiocarcinoma (CCA). Extensive research shows that oxidative stress (OS) plays a critical role in the transition from chronic O. viverrini infection to CCA. OS also results in the excision of a modified DNA lesion (8-oxodG) into urine, the levels of which can be detected by immunoassay. Herein, we measured concentrations of urine 8-oxodG by immunoassay from the following four groups in the Khon Kaen Cancer Cohort study: (1) O. viverrini negative individuals, (2) O. viverrini positive individuals with no APF as determined by abdominal ultrasound, (3) O. viverrini positive individuals with APF as determined by abdominal ultrasound, and (4) O. viverrini induced cases of CCA. A logistic regression model was used to evaluate the utility of creatinine-adjusted urinary 8-oxodG among these groups, along with demographic, behavioral, and immunological risk factors. Receiver operating characteristic (ROC) curve analysis was used to evaluate the predictive accuracy of urinary 8-oxodG for APF and CCA. Elevated concentrations of 8-oxodG in urine positively associated with APF and CCA in a strongly dose-dependent manner. Urinary 8-oxodG concentrations also accurately predicted whether an individual presented with APF or CCA compared to O. viverrini infected individuals without these pathologies. In conclusion, urinary 8-oxodG is a robust 'candidate' biomarker of the progression of APF and CCA from chronic opisthorchiasis, which is indicative of the critical role that OS plays in both of these advanced hepatobiliary pathologies. The findings also confirm our previous observations that severe liver pathology occurs early and asymptomatically in residents of O. viverrini endemic regions, where individuals are infected for years (often decades) with this food-borne pathogen. These
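
The ROC analysis used here has a compact numerical core: the AUC equals the probability that a randomly chosen case has a higher biomarker value than a randomly chosen control (the Mann-Whitney formulation, with ties counted half). The 8-oxodG values below are made-up illustration numbers, not the study's measurements.

```python
# ROC AUC via the Mann-Whitney U statistic: fraction of (control, case)
# pairs in which the case's biomarker value is higher.
def roc_auc(controls, cases):
    wins = 0.0
    for x in controls:
        for y in cases:
            if y > x:
                wins += 1.0
            elif y == x:
                wins += 0.5              # ties count half
    return wins / (len(controls) * len(cases))

controls = [1.1, 2.0, 2.4, 3.0]          # e.g. uninfected individuals
cases = [2.2, 3.5, 4.1, 5.0]             # e.g. APF or CCA patients
print(roc_auc(controls, cases))          # → 0.875
```

An AUC of 0.5 means the marker is uninformative and 1.0 means perfect separation; the dose-dependent association reported in the abstract corresponds to AUCs well above 0.5.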

  10. A farm-level precision land management framework based on integer programming

    PubMed Central

    Li, Qi; Hu, Guiping; Jubery, Talukder Zaki; Ganapathysubramanian, Baskar

    2017-01-01

    Farmland management involves several planning and decision making tasks, including seed selection and irrigation management. A farm-level precision farmland management model based on mixed integer linear programming is proposed in this study. Optimal decisions are designed for pre-season planning of crops and irrigation water allocation. The model captures the effect of the size and shape of the decision scale as well as special irrigation patterns. The authors illustrate the model with a case study of a farm in the state of California in the U.S. and show the model can capture the impact of precision farm management on profitability. The results show that a threefold increase in annual net profit for farmers could be achieved by carefully choosing irrigation and seed selection. Although farmers could increase profits by applying precision management to seed or irrigation alone, the profit increase is more significant if farmers apply precision management to seed and irrigation simultaneously. The proposed model can also serve as a risk analysis tool for farmers facing seasonal irrigation water limits, as well as a quantitative tool to explore the impact of precision agriculture. PMID:28346499
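
The structure of such a model — discrete crop choices per field, binary irrigation decisions, and a shared water budget — can be shown on a toy instance small enough to solve by exhaustive enumeration (a real model would hand the same formulation to a MILP solver). All profit and water numbers below are invented.

```python
# Toy farm-planning integer program: pick one crop per field and decide
# which fields to irrigate, subject to a shared water budget.
from itertools import product

# per field: crop -> (profit_rainfed, profit_irrigated, water_needed)
fields = [
    {"corn": (40, 90, 3), "soy": (55, 70, 2)},
    {"corn": (35, 95, 4), "soy": (50, 65, 2)},
]
WATER_BUDGET = 4

def best_plan(fields, budget):
    best = (float("-inf"), None)
    crops = [list(f) for f in fields]
    for choice in product(*crops):                 # one crop per field
        for irrig in product([0, 1], repeat=len(fields)):
            water = sum(fields[i][c][2] * irrig[i]
                        for i, c in enumerate(choice))
            if water > budget:
                continue                           # infeasible: over budget
            profit = sum(fields[i][c][irrig[i]]
                         for i, c in enumerate(choice))
            best = max(best, (profit, (choice, irrig)))
    return best

profit, plan = best_plan(fields, WATER_BUDGET)
print(profit, plan)   # → 150: rainfed soy on field 0, irrigated corn on field 1
```

Note the coupling the abstract emphasises: neither irrigating everything nor picking the best rainfed crops reaches 150; the optimum spends the whole budget on the field where irrigation pays most and compensates with the better rainfed crop elsewhere.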

  12. Setting the most robust effluent level under severe uncertainty: application of information-gap decision theory to chemical management.

    PubMed

    Yokomizo, Hiroyuki; Naito, Wataru; Tanaka, Yoshinari; Kamo, Masashi

    2013-11-01

    Decisions in ecological risk management for chemical substances must be made based on incomplete information, due to uncertainties. To protect ecosystems from the adverse effects of chemicals, a precautionary approach is often taken. The precautionary approach, which is based on conservative assumptions about the risks of chemical substances, can be applied when selecting management models and data. This approach can provide an adequate margin of safety for ecosystems by reducing exposure to harmful substances, either by reducing the use of target chemicals or by putting in place strict water quality criteria. However, the reduction of chemical use or effluent concentrations typically entails a financial burden, and the cost effectiveness of the precautionary approach may be small. Hence, we need a formulaic methodology for chemical risk management that can sufficiently protect ecosystems in a cost-effective way even when we do not have sufficient information. Information-gap decision theory provides such a methodology: it determines which action is the most robust to uncertainty by guaranteeing an acceptable outcome under the largest degree of uncertainty, without requiring information about the extent of parameter uncertainty at the outset. In this paper, we illustrate the application of information-gap decision theory to derive a framework for setting effluent limits on pollutants from point sources under uncertainty. Our application incorporates a cost for reducing pollutant emissions and a cost to wildlife species affected by the pollutant. Our framework enables us to settle upon actions that deal with severe uncertainty in the ecological risk management of chemicals.
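
The info-gap calculation can be sketched numerically: for each candidate effluent limit, the robustness is the largest uncertainty horizon α for which the worst-case total cost (abatement plus ecological damage) still stays below an acceptable ceiling, and the chosen limit is the one maximizing that robustness. The cost model and all numbers below are invented for illustration.

```python
# Info-gap robustness sketch for choosing an effluent limit q.
def total_cost(limit, toxicity):
    abatement = 50.0 / limit            # tighter limits cost more to meet
    damage = toxicity * limit           # damage grows with emitted pollutant
    return abatement + damage

def robustness(limit, nominal_toxicity, ceiling, alpha_step=0.01):
    """Largest alpha with worst-case cost <= ceiling, scanned numerically.
    Within the horizon |t - nominal| <= alpha the worst case is the highest
    toxicity, since damage increases monotonically with toxicity."""
    alpha = 0.0
    while total_cost(limit, nominal_toxicity + alpha + alpha_step) <= ceiling:
        alpha += alpha_step
    return alpha

nominal, ceiling = 2.0, 40.0
candidates = [1.0, 2.0, 5.0, 10.0]
scores = {q: robustness(q, nominal, ceiling) for q in candidates}
best = max(candidates, key=lambda q: scores[q])
print(scores, best)
```

In this toy model the nominally cheapest limit is 5.0, but the limit 2.0 tolerates the largest toxicity error before the cost ceiling is breached, so the info-gap criterion prefers it: robustness, not nominal optimality, drives the choice.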

  13. Providing a navigable route for acute medicine nurses to advance their practice: a framework of ascending levels of practice.

    PubMed

    Lees-Deutsch, Liz; Christian, Jan; Setchfield, Ian

    2016-01-01

    This article conveys concerns raised by delegates at the International SAM Conference (Manchester, 2015) regarding how to advance nursing practice in acute medicine. It endeavors to capture the essence of 'how to advance practice' and 'how to integrate advanced practice' within the workforce structures of an acute medicine unit (AMU). It addresses the production of tacit knowledge and its recognition and integration into developing the nursing workforce. The current context of NHS efficiencies and recruitment issues emphasizes the value of retaining tacit knowledge. Uniquely, this article offers an early conceptual framework through which levels of advancement, and potential transition points for advancing nursing practice in acute medicine, are articulated. Determining how to advance requires identification of prior accomplishments such as tacit knowledge, experiential learning, CPD, specialist courses and management experience. This requires nurses to make judicious decisions about advancing their practice, and to distinguish between 'amassing experience' and 'career progression'. The article aims to stimulate thinking around the practicalities of advancement, the value of tacit knowledge and its potential realization through the framework trajectory.

  14. Inverting Glacial Isostatic Adjustment with Paleo Sea Level Records using Bayesian Framework and Burgers Rheology

    NASA Astrophysics Data System (ADS)

    Caron, L.; Metivier, L.; Greff-Lefftz, M.; Fleitout, L.; Rouby, H.

    2015-12-01

    Glacial Isostatic Adjustment (GIA) models most often assume a mantle with a viscoelastic Maxwell rheology and a given ice history model. Here we use a Bayesian Markov chain Monte Carlo formalism to invert the global GIA signal simultaneously for the mechanical properties of the mantle and for the volumes of the various ice sheets, using as starting ice models two distinct previously published ice histories. Burgers as well as Maxwell rheologies are considered. The fitted data consist of 5720 paleo sea level records from the last 35 kyr, with a worldwide distribution. Our ambition is to present not only the best fitting model, but also the range of possible solutions (within the explored space of parameters) with their respective probability of explaining the data, and thus to reveal the trade-off effects and the range of uncertainty affecting the parameters. Our a posteriori probability maps exhibit in all cases two distinct peaks: both are characterized by an upper mantle viscosity around 5 × 10²⁰ Pa·s, but one of the peaks features a lower mantle viscosity around 3 × 10²¹ Pa·s while the other indicates a lower mantle viscosity of more than 1 × 10²² Pa·s. The global maximum depends upon the starting ice history and the chosen rheology: the first peak (P1) has the highest probability only in the case with a Maxwell rheology and an ice history based on ICE-5G, while the second peak (P2) is favored when using the ANU-based ice history or a Burgers rheology, and is our preferred solution as it is also consistent with long-term geodynamics and gravity gradient anomalies over Laurentide. P2 is associated with larger volumes for the Laurentian and Fennoscandian ice sheets and, as a consequence of the total ice volume balance, smaller volumes for the Antarctic ice sheet. This last point interferes with the estimate of present-day ice melting in Antarctica from GRACE data. Finally, we find that P2 with a Burgers rheology favors the existence of a tectosphere, i.e. a viscous sublithospheric layer.
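
The Metropolis-Hastings machinery behind such an inversion can be shown on a one-parameter toy: recover a single relaxation time τ from synthetic "sea-level" observations that decay as exp(−t/τ). The forward model, noise level, and flat prior are toy assumptions standing in for the full GIA forward model and its high-dimensional parameter space.

```python
# Stripped-down random-walk Metropolis sampler for one decay parameter.
import math, random

rng = random.Random(7)
TRUE_TAU, SIGMA = 4.0, 0.05
times = [0.5 * i for i in range(1, 21)]                    # kyr before present
data = [math.exp(-t / TRUE_TAU) + rng.gauss(0, SIGMA) for t in times]

def log_posterior(tau):
    if not 0.1 < tau < 50.0:                               # flat prior support
        return float("-inf")
    misfit = sum((d - math.exp(-t / tau)) ** 2 for t, d in zip(times, data))
    return -misfit / (2 * SIGMA ** 2)

def metropolis(n_iter, start=10.0, step=0.5):
    tau, lp = start, log_posterior(start)
    samples = []
    for _ in range(n_iter):
        cand = tau + rng.gauss(0, step)                    # random-walk proposal
        lp_cand = log_posterior(cand)
        if math.log(rng.random()) < lp_cand - lp:          # accept/reject
            tau, lp = cand, lp_cand
        samples.append(tau)
    return samples

chain = metropolis(5000)
burned = chain[1000:]                                      # discard burn-in
posterior_mean = sum(burned) / len(burned)
print(posterior_mean)   # close to TRUE_TAU
```

The histogram of `burned` is the 1-D analogue of the paper's a posteriori probability maps; with two viscosity parameters and multimodal data it could show exactly the kind of twin peaks described in the abstract.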

  15. TDP-43 aggregation mirrors TDP-43 knockdown, affecting the expression levels of a common set of proteins

    PubMed Central

    Prpar Mihevc, S.; Baralle, Marco; Buratti, Emanuele; Rogelj, Boris

    2016-01-01

    TDP-43 protein plays an important role in regulating transcriptional repression, RNA metabolism, and splicing. It typically shuttles between the nucleus and the cytoplasm to perform its functions, while abnormal cytoplasmic aggregation of TDP-43 has been associated with the neurodegenerative diseases amyotrophic lateral sclerosis (ALS) and frontotemporal lobar degeneration (FTLD). For the purpose of this study we selected a set of proteins that were misregulated following silencing of TDP-43 and analysed their expression in the TDP-43-aggregation model cell line HEK293 Flp-in Flag-TDP-43-12x-Q/N F4L. Following TDP-43 sequestration in insoluble aggregates, we observed higher nuclear levels of EIF4A3 and POLDIP3β, whereas nuclear levels of DNMT3A, HNRNPA3, PABPC1 and POLDIP3α, and cytoplasmic levels of RANBP1, dropped. In addition, immunofluorescence signal intensity quantifications showed increased nuclear expression of HNRNPL and YARS, and downregulation of cytoplasmic DPCD. Furthermore, cytoplasmic levels of the predominantly nuclear protein ALYREF increased. In conclusion, by identifying a common set of proteins that are differentially expressed in a similar manner in these two different conditions, we show that TDP-43 aggregation has a comparable effect to TDP-43 knockdown. PMID:27665936

  16. Investigation of indoor air volatile organic compounds concentration levels in dental settings and some related methodological issues.

    PubMed

    Santarsiero, Anna; Fuselli, Sergio; Piermattei, Alessandro; Morlino, Roberta; De Blasio, Giorgia; De Felice, Marco; Ortolani, Emanuela

    2009-01-01

    The assessment of indoor air volatile organic compounds (VOCs) concentration levels in dental settings is of considerable health relevance because of the potentially large occupational exposure to many diverse contaminants. The comparison of the VOC profile of indoor conditions with the corresponding outdoor concentrations, as well as the discovery of possible correlations between specific dental activities and variations in VOC concentrations, are of utmost importance for a reliable characterization of the health risk for dentists and dental staff. In this study we review the most relevant environmental studies addressing VOC contamination levels in dental settings. We analyze the methodological problems this kind of study must face, and we report preliminary results of an indoor air investigation carried out at a dental hospital in Italy, the "Ospedale odontoiatrico George Eastman" of Rome, in which general lines for the environmental analysis of dental settings are sketched. The aim of this work is to identify, by means of a case study, the kinds of problems a typical indoor air investigation of an enclosed (non-industrial) environment has to cope with.

  17. Joint multiregion segmentation and parametric estimation of image motion by basis function representation and level set evolution.

    PubMed

    Vázquez, Carlos; Mitiche, Amar; Laganière, Robert

    2006-05-01

    The purpose of this study is to investigate a variational method for joint segmentation and parametric estimation of image motion by basis function representation of motion and level set evolution. The functional contains three terms. One is a classic regularization term that biases the solution toward a segmentation with smooth boundaries. A second term biases the solution toward a segmentation whose boundaries coincide with motion discontinuities, following a description of motion discontinuities by a function of the image spatio-temporal variations. The third term refers to region information and measures the conformity of the parametric representation of the motion of each segmentation region to the image spatio-temporal variations. The components of motion in each region of segmentation are represented as functions in a space generated by a set of basis functions. The coefficients of the motion components, expressed as combinations of the basis functions, are the parameters of the representation. The necessary conditions for a minimum of the functional, which are derived taking into consideration the dependence of the motion parameters on segmentation, lead to an algorithm which condenses to concurrent curve evolution, implemented via level sets, and estimation of the parameters by least squares within each region of segmentation. The algorithm and its implementation are verified on synthetic and real images using a basis of cosine transforms.
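    The estimation step, least squares fitting of basis-function coefficients within a region, can be sketched as follows. This is a hypothetical 1-D illustration with a cosine basis; the basis size, the sampling, and the noiseless motion field are assumptions, not the paper's setup.

```python
import numpy as np

# A motion component is represented as a combination of cosine basis
# functions; its coefficients are recovered by least squares within one
# region (the estimation step only, without the curve evolution).
x = np.linspace(0.0, 1.0, 50)
basis = np.stack([np.cos(k * np.pi * x) for k in range(4)], axis=1)  # design matrix

true_coeffs = np.array([0.5, -0.2, 0.0, 0.1])
motion = basis @ true_coeffs              # noiseless observed motion samples

est_coeffs, *_ = np.linalg.lstsq(basis, motion, rcond=None)
```

    In the full algorithm this fit alternates with the level set curve evolution, since the region boundaries determine which samples enter each fit.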

  18. Efficient model chemistries for peptides. I. General framework and a study of the heterolevel approximation in RHF and MP2 with Pople split-valence basis sets.

    PubMed

    Echenique, Pablo; Alonso, José Luis

    2008-07-15

    We present an exhaustive study of more than 250 ab initio potential energy surfaces (PESs) of the model dipeptide HCO-L-Ala-NH(2). The model chemistries (MCs) investigated are constructed as homo- and heterolevels involving possibly different RHF and MP2 calculations for the geometry and the energy. The basis sets used belong to a sample of 39 representatives from Pople's split-valence families, ranging from the small 3-21G to the large 6-311++G(2df,2pd). The reference PES to which the rest are compared is the MP2/6-311++G(2df,2pd) homolevel, which, as far as we are aware, is the most accurate PES in the literature. All data sets have been analyzed according to a general framework, which can be extended to other complex problems and which captures the nearness concept in the space of MCs. The great number of MCs evaluated has allowed us to explore this space significantly and to show that the correlation between the accuracy and the computational cost of the methods is imperfect, thus justifying a systematic search for the combination of features in an MC that is optimal for dealing with peptides. Regarding the particular MCs studied, the most important conclusion is that the potentially very cost-saving heterolevel approximation is very efficient at describing the whole PES of HCO-L-Ala-NH(2). Finally, we show that, although RHF may be used to calculate the geometry if an MP2 single-point energy calculation follows, pure RHF//RHF homolevels are not recommendable for this problem.

  19. Probabilistic Evaluation of Ecological and Economic Objectives of River Basin Management Reveals a Potential Flaw in the Goal Setting of the EU Water Framework Directive

    NASA Astrophysics Data System (ADS)

    Hjerppe, Turo; Taskinen, Antti; Kotamäki, Niina; Malve, Olli; Kettunen, Juhani

    2017-04-01

    The biological status of European lakes has not improved as expected despite up-to-date legislation and ecological standards. As a result, the realism of objectives and the attainment of related ecological standards are in doubt. This paper gets to the bottom of the river basin management plan of a eutrophic lake in Finland and presents the ecological and economic impacts of environmental and societal drivers and planned management measures. For these purposes, we performed a Monte Carlo simulation of diffuse nutrient load, lake water quality and cost-benefit models. Simulations were integrated into a Bayesian influence diagram that revealed the basic uncertainties. It turned out that the attainment of good ecological status as qualified in the Water Framework Directive of the European Union is unlikely within the given socio-economic constraints. Therefore, management objectives and ecological and economic standards need to be reassessed and reset to provide a realistic goal setting for management. More effort should be put into the evaluation of the total monetary benefits and into the monitoring of lake phosphorus balances, to reduce the uncertainties and the resulting margin of safety, and thus the costs and risks of planned management measures.
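    A Monte Carlo propagation of the kind described can be sketched in a few lines. The load distribution, the toy lake response model, and the 'good status' threshold below are all invented for illustration and have nothing to do with the study's calibrated models.

```python
import random

def prob_good_status(n=100_000, seed=7):
    """Toy Monte Carlo: propagate uncertainty in phosphorus load and in the
    effect of management measures to the probability of meeting a water
    quality target. All distributions and thresholds are invented."""
    random.seed(seed)
    hits = 0
    for _ in range(n):
        load = random.gauss(100.0, 20.0)           # diffuse P load, t/yr (assumed)
        reduction = random.uniform(0.2, 0.4)       # measure effectiveness (assumed)
        conc = 0.3 * load * (1.0 - reduction)      # toy lake response, ug/L
        if conc < 20.0:                            # assumed 'good status' limit
            hits += 1
    return hits / n

p_good = prob_good_status()
```

    A result well below 1 is what the study means by attainment being "unlikely": the target is missed in a large fraction of simulated futures.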

  1. A new method for the level set equation using a hierarchical-gradient truncation and remapping technique

    NASA Astrophysics Data System (ADS)

    Kohno, Haruhiko; Nave, Jean-Christophe

    2013-06-01

    We present a novel numerical method for solving the advection equation for a level set function. The new method uses hierarchical-gradient truncation and remapping (H-GTaR) of the original partial differential equation (PDE). Our strategy reduces the original PDE to a set of decoupled linear ordinary differential equations with constant coefficients. Additionally, we introduce a remapping strategy to periodically guarantee solution accuracy for a deformation problem. The proposed scheme yields a nearly exact solution for rigid body motion with a smooth function whose higher derivatives are vanishingly small, and calculates the gradient of the advected function in a straightforward way. We evaluate our method in one- and two-dimensional domains and present results for several classical benchmark problems.
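    For contrast with the H-GTaR scheme, a baseline first-order upwind discretisation of the level set advection equation phi_t + u*phi_x = 0 (constant u, periodic domain) looks like this; it is a standard scheme, not the method of the paper.

```python
import numpy as np

def advect_upwind(phi, u, dx, dt, steps):
    """First-order upwind advection of a level set function, phi_t + u*phi_x = 0,
    for constant u > 0 on a periodic domain (a standard baseline scheme,
    not the paper's H-GTaR method)."""
    c = u * dt / dx                                  # CFL number, must be <= 1
    for _ in range(steps):
        phi = phi - c * (phi - np.roll(phi, 1))      # upwind difference
    return phi

n = 200
x = np.linspace(0.0, 1.0, n, endpoint=False)
phi0 = np.sin(2 * np.pi * x)        # smooth initial level set function
dx = 1.0 / n
dt = 0.5 * dx                       # CFL = 0.5
phi = advect_upwind(phi0, 1.0, dx, dt, steps=100)   # translates by 0.25
```

    The upwind result slowly loses amplitude through numerical diffusion, which is exactly the kind of error an exact-transport scheme such as H-GTaR is designed to avoid.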

  2. Assessment of serum amyloid A levels in the rehabilitation setting in the Florida manatee (Trichechus manatus latirostris).

    PubMed

    Cray, Carolyn; Dickey, Meranda; Brewer, Leah Brinson; Arheart, Kristopher L

    2013-12-01

    The acute phase protein serum amyloid A (SAA) has previously been shown to have value as a biomarker of inflammation and infection in many species, including manatees (Trichechus manatus latirostris). In the current study, results from an automated assay for SAA were used in a rehabilitation setting. A reference interval was established from clinically normal manatees using the robust method: 0-46 mg/L. More than 30-fold higher mean SAA levels were observed in manatees suffering from cold stress and boat-related trauma. Poor correlations were observed between SAA and total white blood cell count, percentage of neutrophils, albumin, and albumin/globulin ratio. A moderate correlation was observed between SAA and the presence of nucleated red blood cells. The sensitivity of SAA testing was 93% and the specificity was 98%, representing the highest combined values of all the analytes. The results indicate that the automated method for SAA quantitation can provide important clinical data for manatees in a rehabilitation setting.
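    The reported diagnostic performance follows directly from a confusion matrix. A sketch with hypothetical counts chosen only to reproduce the stated 93% sensitivity and 98% specificity; the study's actual case numbers are not given in the abstract.

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts chosen only to reproduce the reported 93% / 98%.
sens, spec = sens_spec(tp=93, fn=7, tn=98, fp=2)
```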

  3. An advanced approach for the generation of complex cellular material representative volume elements using distance fields and level sets

    NASA Astrophysics Data System (ADS)

    Sonon, B.; François, B.; Massart, T. J.

    2015-08-01

    A general and widely tunable method for the generation of representative volume elements for cellular materials, based on distance and level set functions, is presented. The approach is based on random tessellations constructed from random inclusion packings. A general methodology to obtain arbitrary-shaped tessellations to produce disordered foams is presented and illustrated. These tessellations can degenerate either into classical Voronoï tessellations, potentially additively weighted depending on the properties of the initial inclusion packing used, or into Laguerre tessellations through a simple modification of the formulation. A versatile approach to control the particular morphology of the obtained foam is introduced. Specific local features such as concave triangular Plateau borders and non-constant-thickness heterogeneous coatings can be built from the tessellation in a straightforward way and are tuned by a small set of parameters with a clear morphological interpretation.
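    The tessellation-from-distance-functions idea can be sketched on a grid: each point is labelled by the seed minimising a power distance, giving a Laguerre tessellation, or a plain Voronoï one when all radii are zero. This grid-based labelling is an illustrative assumption; the paper builds continuous level set functions on such distance fields rather than pixel labels.

```python
import numpy as np

def power_tessellation(shape, seeds, radii):
    """Label each grid point with the seed minimising the power distance
    d**2 - r**2 (a Laguerre tessellation; all radii zero gives plain
    Voronoi)."""
    ys, xs = np.indices(shape)
    best = np.full(shape, np.inf)
    labels = np.zeros(shape, dtype=int)
    for i, ((sy, sx), r) in enumerate(zip(seeds, radii)):
        power = (ys - sy) ** 2 + (xs - sx) ** 2 - r ** 2
        mask = power < best
        labels[mask] = i
        best[mask] = power[mask]
    return labels

labels = power_tessellation((64, 64),
                            seeds=[(16, 16), (48, 48), (16, 48)],
                            radii=[0.0, 10.0, 0.0])
```

    Giving one seed a nonzero radius enlarges its cell, which is how a Laguerre tessellation encodes inclusion sizes.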

  4. A level-set-based topology optimisation for acoustic-elastic coupled problems with a fast BEM-FEM solver

    NASA Astrophysics Data System (ADS)

    Isakari, Hiroshi; Kondo, Toyohiro; Takahashi, Toru; Matsumoto, Toshiro

    2017-03-01

    This paper presents a structural optimisation method for three-dimensional acoustic-elastic coupled problems. The proposed optimisation method finds an optimal allocation of elastic materials which reduces the sound level at some fixed observation points. In the optimisation process, the configuration of the elastic materials is expressed with a level set function, and the distribution of the level set function is iteratively updated with the help of the topological derivative. The topological derivative is associated with state and adjoint variables which are the solutions of the acoustic-elastic coupled problems. In this paper, the acoustic-elastic coupled problems are solved by a BEM-FEM coupled solver, in which the fast multipole method (FMM) and a multi-frontal solver for sparse matrices are efficiently combined. Along with the detailed formulations for the topological derivative and the BEM-FEM coupled solver, we present some numerical examples of optimal designs of elastic sound scatterers to manipulate sound waves, from which we confirm the effectiveness of the present method.

  5. Simplex Free Adaptive Tree Fast Sweeping and Evolution Methods for Solving Level Set Equations in Arbitrary Dimension

    DTIC Science & Technology

    2005-05-06

    [Abstract available only as fragments.] The report gives timing and node counts for a codimension-n problem of solving the eikonal equation |∇φ| = 1 with a boundary point at xb = (0.5, 0.5, . . . , 0.5), and notes that alternative schemes require a backtracking along characteristics and an interpolation at an arbitrary point within the domain. Cited works include: "... unorganized points using variational level set method," Comp. Vis. and Image Under., 80:295-319, 2000; and Hongkai Zhao, "A fast sweeping method for eikonal equations," Math. Comp., 74(250):603-627, 2005.
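    The eikonal problem mentioned in the fragment, |∇φ| = 1 with a single boundary point, can be solved with Zhao's fast sweeping method, which performs Gauss-Seidel passes in alternating sweep orders. A minimal 2-D sketch on a uniform grid (not the report's simplex-free adaptive-tree variant):

```python
import numpy as np

def fast_sweep_eikonal(n, source, h=1.0, cycles=4):
    """Fast sweeping solution of |grad(phi)| = 1 on an n-by-n grid with one
    zero-valued boundary point, after Zhao's cited method."""
    big = 1e10
    phi = np.full((n, n), big)
    phi[source] = 0.0
    orders = [(range(n), range(n)),
              (range(n - 1, -1, -1), range(n)),
              (range(n), range(n - 1, -1, -1)),
              (range(n - 1, -1, -1), range(n - 1, -1, -1))]
    for _ in range(cycles):
        for iorder, jorder in orders:       # Gauss-Seidel, alternating orders
            for i in iorder:
                for j in jorder:
                    if (i, j) == source:
                        continue
                    a = min(phi[i - 1, j] if i > 0 else big,
                            phi[i + 1, j] if i < n - 1 else big)
                    b = min(phi[i, j - 1] if j > 0 else big,
                            phi[i, j + 1] if j < n - 1 else big)
                    if abs(a - b) >= h:     # one-sided update
                        new = min(a, b) + h
                    else:                   # two-sided (quadratic) update
                        new = 0.5 * (a + b + np.sqrt(2 * h * h - (a - b) ** 2))
                    phi[i, j] = min(phi[i, j], new)
    return phi

phi = fast_sweep_eikonal(21, source=(10, 10))
```

    The solution approximates the distance to the source; because the sweeps resolve characteristics in all four quadrants, a fixed small number of cycles suffices.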

  6. 3-dimensional throat region segmentation from MRI data based on Fourier interpolation and 3-dimensional level set methods.

    PubMed

    Campbell, Sean; Doshi, Trushali; Soraghan, John; Petropoulakis, Lykourgos; Di Caterina, Gaetano; Grose, Derek; MacKenzie, Kenneth

    2015-01-01

    A new algorithm for 3D throat region segmentation from magnetic resonance imaging (MRI) is presented. The proposed algorithm initially pre-processes the MRI data to increase the contrast between the throat region and its surrounding tissues and to reduce artifacts. An isotropic 3D volume is reconstructed using Fourier interpolation. A cube encompassing the throat region is then evolved using a level set method to form a smooth 3D boundary of the throat region. The results of the proposed algorithm on real and synthetic MRI data are used to validate the robustness and accuracy of the algorithm.
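    Fourier interpolation, as used to reconstruct the isotropic volume, amounts to zero-padding the spectrum. A 1-D sketch of the idea (it assumes an integer upsampling factor and a zero Nyquist bin, which holds for the test signal below):

```python
import numpy as np

def fourier_upsample(signal, factor):
    """Upsample a real 1-D signal by zero-padding its rFFT spectrum, the
    idea behind Fourier interpolation of coarsely sampled MRI data."""
    n = len(signal)
    spectrum = np.fft.rfft(signal)
    padded = np.zeros(n * factor // 2 + 1, dtype=complex)
    padded[:len(spectrum)] = spectrum
    return np.fft.irfft(padded, n * factor) * factor  # rescale for new length

x = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
coarse = np.sin(x)
fine = fourier_upsample(coarse, 4)   # 64 samples through the same sinusoid
```

    Applied along the coarsely sampled slice direction of an MRI stack, the same operation yields the isotropic volume the algorithm needs.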

  7. Establishing a Strong Foundation: District and School-Level Supports for Classroom Implementation of the LDC and MDC Frameworks. Executive Summary

    ERIC Educational Resources Information Center

    Reumann-Moore, Rebecca; Lawrence, Nancy; Sanders, Felicia; Christman, Jolley Bruce; Duffy, Mark

    2011-01-01

    The Bill and Melinda Gates Foundation has invested in the development and dissemination of high-quality instructional and formative assessment tools to support teachers' incorporation of the Core Common State Standards (CCSS) into their classroom instruction. Literacy experts have developed a framework and a set of templates that teachers can use…

  8. Towards people-centred health systems: a multi-level framework for analysing primary health care governance in low- and middle-income countries.

    PubMed

    Abimbola, Seye; Negin, Joel; Jan, Stephen; Martiniuk, Alexandra

    2014-09-01

    Although there is evidence that non-government health system actors can individually or collectively develop practical strategies to address primary health care (PHC) challenges in the community, existing frameworks for analysing health system governance largely focus on the role of governments and do not sufficiently account for the broad range of contributions to PHC governance. This is important because governments in low- and middle-income countries (LMICs) tend to be weak. We present a multi-level governance framework for use as a thinking guide in analysing PHC governance in LMICs. This framework has previously been used to analyse the governance of common-pool resources such as community fisheries and irrigation systems. We apply the framework to PHC because, like common-pool resources, PHC facilities in LMICs tend to be commonly owned by the community, such that individual and collective action is often required to avoid the 'tragedy of the commons': destruction and degradation of the resource resulting from lack of concern for its continuous supply. In the multi-level framework, PHC governance is conceptualized at three levels, depending on who influences the supply and demand of PHC services in a community and how: operational governance (individuals and providers within the local health market), collective governance (community coalitions) and constitutional governance (governments at different levels and other distant but influential actors). Using the example of PHC governance in Nigeria, we illustrate how the multi-level governance framework offers a people-centred lens on the governance of PHC in LMICs, with a focus on relations among health system actors within and between levels of governance. We demonstrate the potential impact of health system actors functioning at different levels of governance on PHC delivery, and how governance failure at one level can be assuaged by governance at another level.

  10. Autocatalytic sets and boundaries.

    PubMed

    Hordijk, Wim; Steel, Mike

    Autopoietic systems, chemotons, and autogens are models that aim to explain (the emergence of) life as a functionally closed and self-sustaining system. An essential element in these models is the notion of a boundary containing, maintaining, and being generated by an internal reaction network. The more general concept of collectively autocatalytic sets, formalized as RAF theory, does not explicitly include this notion of a boundary. Here, we argue that (1) the notion of a boundary can also be incorporated in the formal RAF framework, (2) this provides a mechanism for the emergence of higher-level autocatalytic sets, (3) this satisfies a necessary condition for the evolvability of autocatalytic sets, and (4) this enables the RAF framework to formally represent and analyze (at least in part) the other models. We suggest that RAF theory might thus provide a basis for a unifying formal framework for the further development and study of such models. Graphical abstract: The emergence of an autocatalytic (super)set of autocatalytic (sub)sets.
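    The maximal RAF of a reaction network can be found by iterated pruning: repeatedly discard reactions whose reactants or catalyst are not reachable from the food set. A minimal sketch on an invented toy network; following standard RAF theory, the closure applies reactions regardless of catalysis.

```python
def closure(food, reactions):
    """Molecules reachable from the food set, applying reactions regardless
    of catalysis (the closure cl_R(F) of RAF theory)."""
    reachable = set(food)
    changed = True
    while changed:
        changed = False
        for reactants, products, _ in reactions:
            if set(reactants) <= reachable and not set(products) <= reachable:
                reachable |= set(products)
                changed = True
    return reachable

def max_raf(food, reactions):
    """Iteratively prune reactions whose reactants or catalyst are not in
    the closure; the fixed point is the maximal RAF (empty if none exists)."""
    current = list(reactions)
    while True:
        reachable = closure(food, current)
        kept = [r for r in current
                if set(r[0]) <= reachable and r[2] in reachable]
        if len(kept) == len(current):
            return kept
        current = kept

# Invented toy network: (reactants, products, catalyst).
food = {'a', 'b', 'c'}
reactions = [(('a', 'b'), ('ab',), 'ab'),  # catalysed by its own product
             (('c',), ('d',), 'x')]        # catalyst 'x' is never produced
raf = max_raf(food, reactions)
```

    Only the self-catalysing reaction survives the pruning, which is the collectively autocatalytic core the framework isolates.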

  11. 3D Segmentation with an application of level set-method using MRI volumes for image guided surgery.

    PubMed

    Bosnjak, A; Montilla, G; Villegas, R; Jara, I

    2007-01-01

    This paper proposes an innovation for image-guided surgery based on a comparative study of three different segmentation methods. These methods are faster than manual segmentation of images, with the advantage of using the same patient as the anatomical reference, which is more precise than a generic atlas. The new methodology for 3D information extraction is based on a processing chain structured in the following modules: 1) 3D filtering: the purpose is to preserve the contours of the structures and to smooth the homogeneous areas; several filters were tested, and finally an anisotropic diffusion filter was used. 2) 3D segmentation: this module compares three different methods: a region-growing algorithm, hand-assisted cubic splines, and a level set method. It then proposes a level set approach based on front propagation that allows reconstruction of the internal walls of the anatomical structures of the brain. 3) 3D visualization: the new contribution of this work is the visualization of the segmented model and its use in pre-surgical planning.

  12. Comparison of image segmentation of lungs using methods: connected threshold, neighborhood connected, and threshold level set segmentation

    NASA Astrophysics Data System (ADS)

    Amanda, A. R.; Widita, R.

    2016-03-01

    The aim of this research is to compare several image segmentation methods for the lungs based on performance evaluation parameters (Mean Square Error (MSE) and Peak Signal-to-Noise Ratio (PSNR)). The methods compared were connected threshold, neighborhood connected, and threshold level set segmentation, applied to lung images. These three methods require one important parameter, i.e. the threshold. The threshold interval was obtained from the histogram of the original image. The software used to segment the images was InsightToolkit-4.7.0 (ITK). Five lung images were analyzed, and the results were compared using the performance evaluation parameters computed in MATLAB. A segmentation method is considered good if it has the smallest MSE value and the highest PSNR. The results show that connected threshold performs best on four of the sample images, while threshold level set segmentation performs best on the remaining one. It can therefore be concluded that the connected threshold method is better than the other two methods for these cases.
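    The two evaluation parameters are standard and easy to state precisely. A minimal sketch (the 8x8 test images and the 255 peak value are assumptions for illustration):

```python
import numpy as np

def mse_psnr(reference, result, peak=255.0):
    """MSE and PSNR, the two comparison criteria: the better segmentation
    has the smaller MSE and the larger PSNR."""
    mse = np.mean((reference.astype(float) - result.astype(float)) ** 2)
    psnr = 10.0 * np.log10(peak ** 2 / mse) if mse > 0 else float('inf')
    return mse, psnr

a = np.full((8, 8), 100.0)
b = a.copy()
b[0, 0] = 110.0                      # a single 10-unit deviation
mse, psnr = mse_psnr(a, b)
```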

  13. Interfaces and hydrophobic interactions in receptor-ligand systems: A level-set variational implicit solvent approach.

    PubMed

    Cheng, Li-Tien; Wang, Zhongming; Setny, Piotr; Dzubiella, Joachim; Li, Bo; McCammon, J Andrew

    2009-10-14

    A model nanometer-sized hydrophobic receptor-ligand system in aqueous solution is studied by the recently developed level-set variational implicit solvent model (VISM). This approach is compared to all-atom computer simulations. The simulations reveal complex hydration effects within the (concave) receptor pocket, sensitive to the distance of the (convex) approaching ligand. The ligand induces and controls an intermittent switching between dry and wet states of the hosting pocket, which determines the range and magnitude of the pocket-ligand attraction. In the level-set VISM, a geometric free-energy functional of all possible solute-solvent interfaces coupled to the local dispersion potential is minimized numerically. This approach captures the distinct metastable states that correspond to topologically different solute-solvent interfaces, and thereby reproduces the bimodal hydration behavior observed in the all-atom simulation. Geometrical singularities formed during the interface relaxation are found to contribute significantly to the energy barrier between different metastable states. While the hydration phenomena can thus be explained by capillary effects, the explicit inclusion of dispersion and curvature corrections seems to be essential for a quantitative description of hydrophobically confined systems on nanoscales. This study may shed more light onto the tight connection between geometric and energetic aspects of biomolecular hydration and may represent a valuable step toward the proper interpretation of experimental receptor-ligand binding rates.

  14. LV wall segmentation using the variational level set method (LSM) with additional shape constraint for oedema quantification

    NASA Astrophysics Data System (ADS)

    Kadir, K.; Gao, H.; Payne, A.; Soraghan, J.; Berry, C.

    2012-10-01

    In this paper an automatic algorithm for left ventricle (LV) wall segmentation and oedema quantification from T2-weighted cardiac magnetic resonance (CMR) images is presented. The extent of myocardial oedema delineates the ischaemic area-at-risk (AAR) after myocardial infarction (MI). Since the AAR can be used to estimate the amount of salvageable myocardium post-MI, oedema imaging has potential clinical utility in the management of acute MI patients. This paper presents a new scheme based on the variational level set method (LSM) with an additional shape constraint for the segmentation of T2-weighted CMR images. In our approach, shape information of the myocardial wall is utilized to introduce a shape feature of the myocardial wall into the variational level set formulation. The performance of the method is tested using real CMR images (12 patients) and the results of the automatic system are compared to manual segmentation. The mean perpendicular distances between the automatic and manual LV wall boundaries are in the range of 1-2 mm. Bland-Altman analysis of LV wall area indicates there is no consistent bias as a function of LV wall area, with a mean bias of -121 mm² between individual investigator one (IV1) and the LSM, and -122 mm² between individual investigator two (IV2) and the LSM. Furthermore, the oedema quantification demonstrates good correlation when compared to an expert, with an average error of 9.3% for 69 short-axis CMR slices from 12 patients.
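    Bland-Altman analysis reduces to the mean and spread of paired differences. In this sketch the five LV wall areas are invented, chosen only so that the bias reproduces the -121 mm² reported between IV1 and the LSM.

```python
import numpy as np

def bland_altman(a, b):
    """Mean bias and 95% limits of agreement between two measurement sets."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Invented LV wall areas (mm^2), chosen so the bias reproduces -121 mm^2.
manual_iv1 = [1200.0, 1350.0, 1100.0, 1280.0, 1420.0]
lsm = [1325.0, 1470.0, 1205.0, 1400.0, 1555.0]
bias, (lo, hi) = bland_altman(manual_iv1, lsm)
```

    "No consistent bias as a function of LV wall area" means the differences show no trend against the pairwise means, which is what the Bland-Altman plot checks visually.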

  15. Hospitals as complex adaptive systems: A case study of factors influencing priority setting practices at the hospital level in Kenya.

    PubMed

    Barasa, Edwine W; Molyneux, Sassy; English, Mike; Cleary, Susan

    2017-02-01

    There is a dearth of literature on priority setting and resource allocation (PSRA) practices in hospitals, particularly in low and middle income countries (LMICs). Using a case study approach, we examined PSRA practices in 2 public hospitals in coastal Kenya. We collected data through a combination of in-depth interviews with national level policy makers, hospital managers, and frontline practitioners in the case study hospitals (n = 72), review of documents such as hospital plans and budgets, minutes of meetings and accounting records, and non-participant observations of PSRA practices in the case study hospitals over a period of 7 months. In this paper, we apply complex adaptive system (CAS) theory to examine the factors that influence PSRA practices. We found that PSRA practices in the case hospitals were influenced by: 1) inadequate financing levels and poorly designed financing arrangements, 2) limited hospital autonomy and decision space, and 3) inadequate management and leadership capacity in the hospitals. The case study hospitals exhibited properties of complex adaptive systems (CASs) that exist in a dynamic state with multiple interacting agents. Weaknesses in system 'hardware' (resource scarcity) and 'software' (including PSRA guidelines that reduced hospitals' decision space, and poor leadership skills) led to the emergence of undesired properties. The capacity of hospitals to set priorities should be improved across these interacting aspects of the hospital organizational system. Interventions should however recognize that hospitals are CASs: rather than rectifying isolated aspects of the system, they should endeavor to create conditions for productive emergence.

  16. A new non-overlapping concept to improve the Hybrid Particle Level Set method in multi-phase fluid flows

    NASA Astrophysics Data System (ADS)

    Archer, Philip J.; Bai, Wei

    2015-02-01

    A novel non-overlapping concept is added to the Hybrid Particle Level Set (HPLS) method to improve its accuracy and suitability for the modelling of multi-phase fluid flows. The concept addresses shortcomings in the reseeding algorithm, which maintains resolution of the surface at runtime. These shortcomings result in the misplacement of newly seeded particles in the opposite-signed domain and necessitate a restriction on the distance that a particle can escape without deletion, which reduces the effectiveness of the method. The non-overlapping concept judges the suitability of potential new particles based on information already contained within the particle representation of the surface. By preventing the misplacement of particles, it is possible to significantly relax the distance restriction, thereby increasing the accuracy of the HPLS method in multi-phase flows. To demonstrate its robustness and efficiency, the concept is examined with a number of challenging test cases, including both level-set-only simulations and two-phase fluid flows.

  17. Mask pattern recovery by level set method based inverse inspection technology (IIT) and its application on defect auto disposition

    NASA Astrophysics Data System (ADS)

    Park, Jin-Hyung; Chung, Paul D. H.; Jeon, Chan-Uk; Cho, Han Ku; Pang, Linyong; Peng, Danping; Tolani, Vikram; Cecil, Tom; Kim, David; Baik, KiHo

    2009-10-01

    At the most advanced technology nodes, such as 32nm and 22nm, aggressive OPC and Sub-Resolution Assist Features (SRAFs) are required. However, their use results in significantly increased mask complexity, making mask defect disposition more challenging than ever. This paper describes how mask patterns can first be recovered from inspection images by applying patented algorithms based on Level Set Methods. The mask pattern recovery step is then followed by aerial/wafer image simulation, the results of which can be fed into an automated mask defect disposition system based on the aerial/wafer image. The disposition criteria are primarily based on wafer-plane CD variance. The system also connects to a post-OPC lithography verification tool that can provide gauges and CD specs, enabling them to be used in mask defect disposition as well. Results on both programmed defects and production defects collected at the Samsung mask shop are presented to show the accuracy and consistency of using Level Set Methods and aerial/wafer image based automated mask disposition.

  18. A fast and robust level set method for image segmentation using fuzzy clustering and lattice Boltzmann method.

    PubMed

    Balla-Arabé, Souleymane; Gao, Xinbo; Wang, Bin

    2013-06-01

    In recent decades, owing to developments in parallel programming, the lattice Boltzmann method (LBM) has attracted much attention as a fast alternative approach for solving partial differential equations. In this paper, we first design an energy functional based on the fuzzy c-means objective function that incorporates a bias field accounting for the intensity inhomogeneity of real-world images. Using the gradient descent method, we obtain the corresponding level set equation, from which we deduce a fuzzy external force for the LBM solver based on the model by Zhao. The method is fast, robust against noise, independent of the position of the initial contour, effective in the presence of intensity inhomogeneity, highly parallelizable, and able to detect objects with or without edges. Experiments on medical and real-world images demonstrate the performance of the proposed method in terms of speed and efficiency.
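    The fuzzy c-means core that such an energy functional builds on can be sketched as follows. This is plain FCM on a 1-D intensity array, without the paper's bias-field term; `fcm` is an illustrative name.

```python
import numpy as np

def fcm(x, c=2, m=2.0, iters=50, seed=0):
    """Plain fuzzy c-means on a 1-D intensity array: alternate between
    centroid updates (weighted means) and membership updates."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                                 # memberships sum to 1 per pixel
    for _ in range(iters):
        v = (u**m @ x) / (u**m).sum(axis=1)            # cluster centroids
        d = np.abs(x[None, :] - v[:, None]) + 1e-12    # pixel-to-centroid distances
        u = d ** (-2.0 / (m - 1.0))                    # standard FCM membership rule
        u /= u.sum(axis=0)
    return v, u

# Two well-separated intensity clusters at 0.1 and 0.9.
x = np.concatenate([np.full(50, 0.1), np.full(50, 0.9)])
v, u = fcm(x)
print(np.sort(v))  # centroids converge near 0.1 and 0.9
```

    In the paper's formulation this objective is extended with a bias field and minimized through a level set equation rather than alternating updates.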

  19. PCA and level set based non-rigid image registration for MRI and Paxinos-Watson atlas of rat brain

    NASA Astrophysics Data System (ADS)

    Cai, Chao; Liu, Ailing; Ding, Mingyue; Zhou, Chengping

    2007-12-01

    Image registration provides the ability to geometrically align one dataset with another and is a basic task in a great variety of biomedical imaging applications. This paper introduces a novel three-dimensional registration method for Magnetic Resonance Images (MRI) and the Paxinos-Watson Atlas of the rat brain. To handle the large-range, non-linear deformation between the MRI and the atlas with high registration accuracy, we first segment the rat brain, use principal component analysis (PCA) to perform the linear registration automatically, and then apply a level set based non-linear registration to correct the remaining small distortions. We implemented this registration method in a rat brain 3D reconstruction and analysis system. Experiments have demonstrated that the method can successfully register low-resolution, noise-affected MRI with the Paxinos-Watson Atlas of the rat brain.
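    A minimal sketch of PCA-driven linear alignment, assuming two point clouds whose principal axes correspond. The `pca_align` helper is hypothetical; note that covariance eigenvectors are defined only up to sign, which is resolved here using the known correspondences of the synthetic test, something a real registration pipeline cannot do.

```python
import numpy as np

def pca_align(src, dst):
    """Estimate a rotation R and translation t mapping src toward dst by
    matching centroids and principal (covariance) axes of the two clouds."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    _, Vs = np.linalg.eigh(np.cov((src - mu_s).T))   # source principal axes
    _, Vd = np.linalg.eigh(np.cov((dst - mu_d).T))   # target principal axes
    best = None
    # Eigenvectors are sign-ambiguous; try both signs per axis and keep the
    # proper rotation that best matches the known correspondences.
    for s1 in (1, -1):
        for s2 in (1, -1):
            R = Vd @ np.diag([s1, s2]) @ Vs.T
            if np.linalg.det(R) < 0:                 # reject reflections
                continue
            t = mu_d - R @ mu_s
            err = np.abs(src @ R.T + t - dst).mean()
            if best is None or err < best[0]:
                best = (err, R, t)
    return best[1], best[2]

# Synthetic check: rotate an anisotropic cloud by 30 degrees and recover it.
rng = np.random.default_rng(1)
src = rng.normal(size=(200, 2)) * [3.0, 1.0]        # anisotropic so axes are distinct
th = np.deg2rad(30.0)
R_true = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
dst = src @ R_true.T + np.array([5.0, -2.0])
R, t = pca_align(src, dst)
err = np.abs(src @ R.T + t - dst).max()
```

    Because the sample covariance of `dst` is exactly the rotated covariance of `src`, the recovered transform matches the true one to floating-point precision in this synthetic setting.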

  20. A New User Dependent Iris Recognition System Based on an Area Preserving Pointwise Level Set Segmentation Approach

    NASA Astrophysics Data System (ADS)

    Barzegar, Nakissa; Moin, M. Shahram

    2009-12-01

    This paper presents a new user-dependent approach for iris recognition systems. In the proposed method, consistent bits of the iris code are calculated, based on the user's specifications, using the user's mask. Another contribution of our work is in the iris segmentation phase, where a new pointwise level set approach with area preserving is used to determine the inner and outer iris boundaries, both performed exclusively in one step. Thanks to the special properties of this segmentation technique, there is no constraint on the angle of head tilt. Furthermore, we show that the algorithm is robust in noisy situations and can locate irises that are partly occluded by eyelids and eyelashes. Experimental results on three renowned iris databases (CASIA-IrisV3, Bath, and Ubiris) show that our method outperforms some existing methods in terms of both accuracy and response time.

  1. Best Practices for Ethical Sharing of Individual-Level Health Research Data From Low- and Middle-Income Settings.

    PubMed

    Bull, Susan; Cheah, Phaik Yeong; Denny, Spencer; Jao, Irene; Marsh, Vicki; Merson, Laura; Shah More, Neena; Nhan, Le Nguyen Thanh; Osrin, David; Tangseefa, Decha; Wassenaar, Douglas; Parker, Michael

    2015-07-01

    Sharing individual-level data from clinical and public health research is increasingly being seen as a core requirement for effective and efficient biomedical research. This article discusses the results of a systematic review and multisite qualitative study of key stakeholders' perspectives on best practices in ethical data sharing in low- and middle-income settings. Our research suggests that for data sharing to be effective and sustainable, multiple social and ethical requirements need to be met. An effective model of data sharing will be one in which considered judgments will need to be made about how best to achieve scientific progress, minimize risks of harm, promote fairness and reciprocity, and build and sustain trust.

  2. Calculation of contact angles at triple phase boundary in solid oxide fuel cell anode using the level set method

    SciTech Connect

    Sun, Xiaojun; Hasegawa, Yosuke; Kohno, Haruhiko; Jiao, Zhenjun; Hayakawa, Koji; Okita, Kohei; Shikazono, Naoki

    2014-10-15

    A level set method is applied to characterize the three-dimensional structures of the nickel, yttria-stabilized zirconia, and pore phases in a solid oxide fuel cell anode reconstructed by focused ion beam-scanning electron microscopy. A numerical algorithm is developed to evaluate the contact angles at the triple phase boundary based on interfacial normal vectors, which can be calculated from the signed distance functions defined for each of the three phases. Furthermore, the surface tension force is estimated from the contact angles by assuming an interfacial force balance at the triple phase boundary. The average contact angle values of nickel, yttria-stabilized zirconia, and pore are found to be 143°–156°, 83°–138°, and 82°–123°, respectively. The mean contact angles remained nearly unchanged after 100 hours of operation; however, the contact angles just after reduction differ for cells with different sintering temperatures. In addition, the standard deviations of the contact angles are very large, especially for the yttria-stabilized zirconia and pore phases. The surface tension forces calculated from the mean contact angles were close to experimental values found in the literature. Slight increases in the surface tensions of nickel/pore and nickel/yttria-stabilized zirconia were observed after operation. The present data are expected to be used not only for understanding the degradation mechanism, but also for the quantitative prediction of the microstructural temporal evolution of solid oxide fuel cell anodes.
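    The normal-vector construction can be sketched for two signed distance fields on a grid. This is a simplified illustration, not the authors' algorithm: for two planar interfaces meeting at right angles, the recovered angle is 90°.

```python
import numpy as np

def contact_angle(phi_a, phi_b, dx=1.0):
    """Angle (degrees) between the interface normals of two signed
    distance fields, n = grad(phi)/|grad(phi)|, at every grid point."""
    ga = np.array(np.gradient(phi_a, dx))
    gb = np.array(np.gradient(phi_b, dx))
    na = ga / (np.linalg.norm(ga, axis=0) + 1e-12)   # unit normals of phase A
    nb = gb / (np.linalg.norm(gb, axis=0) + 1e-12)   # unit normals of phase B
    cos = np.clip((na * nb).sum(axis=0), -1.0, 1.0)
    return np.degrees(np.arccos(cos))

# Two planar interfaces, phi_a = x - 5 and phi_b = y - 5, meet at 90 degrees.
x, y = np.meshgrid(np.arange(10.0), np.arange(10.0), indexing="ij")
ang = contact_angle(x - 5.0, y - 5.0)
```

    In the paper's setting the angle is evaluated only at triple-phase-boundary voxels of the reconstructed microstructure rather than over the whole grid.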

  3. Segmentation and Analysis of Corpus Callosum in Alzheimer MR Images using Total Variation Based Diffusion Filter and Level Set Method.

    PubMed

    Anandh, K R; Sujatha, C M; Ramakrishnan, S

    2015-01-01

    Alzheimer's Disease (AD) is a common form of dementia that affects the gray and white matter structures of the brain. Manifestation of AD leads to cognitive deficits such as memory impairment, reduced ability to think, and difficulty in performing day-to-day activities. Although the etiology of this disease is unclear, imaging biomarkers are highly useful in the early diagnosis of AD. Magnetic resonance imaging is an indispensable non-invasive imaging modality that reflects both the geometry and pathology of the brain. The Corpus Callosum (CC) is the largest white matter structure, as well as the main inter-hemispheric fiber connection, and undergoes regional alterations due to AD. Therefore, segmentation and feature extraction are essential to characterize CC atrophy. In this work, an attempt has been made to segment the CC using an edge-based level set method. Prior to segmentation, the images are pre-processed using Total Variation (TV) based diffusion filtering to enhance the edge information. Shape-based geometric features are extracted from the segmented CC images to analyze the CC atrophy. Results show that the edge-based level set method is able to segment the CC in both normal and AD images. TV-based diffusion filtering performs uniform, region-specific smoothing, thereby preserving the texture and small-scale details of the image. Consequently, the edge map of the CC in both normal and AD images is sharp and distinct, with continuous boundaries. This facilitates the final contour correctly segmenting the CC from nearby structures. The extracted geometric features, such as area, perimeter, and minor axis, are found to have percentage differences of 5.97%, 22.22%, and 9.52%, respectively, in the demarcation of AD subjects. As callosal atrophy is significant in the diagnosis of AD, this study appears to be clinically useful.

  4. A hybrid semi-automatic method for liver segmentation based on level-set methods using multiple seed points.

    PubMed

    Yang, Xiaopeng; Yu, Hee Chul; Choi, Younggeun; Lee, Wonsup; Wang, Baojian; Yang, Jaedo; Hwang, Hongpil; Kim, Ji Hyun; Song, Jisoo; Cho, Baik Hwan; You, Heecheon

    2014-01-01

    The present study developed a hybrid semi-automatic method to extract the liver from abdominal computed tomography (CT) images. The proposed hybrid method consists of a customized fast-marching level-set method for detection of an optimal initial liver region from multiple seed points selected by the user, and a threshold-based level-set method for extraction of the actual liver region from the initial liver region. The performance of the hybrid method was compared with that of the 2D region growing method implemented in OsiriX using abdominal CT datasets of 15 patients. The hybrid method showed a significantly higher accuracy in liver extraction (similarity index, SI = 97.6 ± 0.5%; false positive error, FPE = 2.2 ± 0.7%; false negative error, FNE = 2.5 ± 0.8%; average symmetric surface distance, ASD = 1.4 ± 0.5 mm) than the 2D region growing method (SI = 94.0 ± 1.9%; FPE = 5.3 ± 1.1%; FNE = 6.5 ± 3.7%; ASD = 6.7 ± 3.8 mm). The total liver extraction time per CT dataset with the hybrid method (77 ± 10 s) is significantly less than with the 2D region growing method (575 ± 136 s), and the user-computer interaction time per CT dataset with the hybrid method (28 ± 4 s) is significantly shorter than with the 2D region growing method (484 ± 126 s). The proposed hybrid method is therefore preferred for liver segmentation in preoperative virtual liver surgery planning.

  5. Implementation and evaluation of the Level Set method: Towards efficient and accurate simulation of wet etching for microengineering applications

    NASA Astrophysics Data System (ADS)

    Montoliu, C.; Ferrando, N.; Gosálvez, M. A.; Cerdá, J.; Colom, R. J.

    2013-10-01

    The use of atomistic methods, such as the Continuous Cellular Automaton (CCA), is currently regarded as a computationally efficient and experimentally accurate approach for the simulation of anisotropic etching of various substrates in the manufacture of Micro-electro-mechanical Systems (MEMS). However, when the features of the chemical process are modified, a time-consuming calibration process is needed to transform the new macroscopic etch rates into a corresponding set of atomistic rates. Furthermore, changing the substrate requires a labor-intensive effort to reclassify most atomistic neighborhoods. In this context, the Level Set (LS) method provides an alternative approach in which the macroscopic forces affecting the front evolution are applied directly at the discrete level, thus avoiding the need for reclassification and/or calibration. Correspondingly, we present a fully operational Sparse Field Method (SFM) implementation of the LS approach, discussing the algorithm in detail and providing a thorough characterization of the computational cost and simulation accuracy, including a comparison to the performance of the most recent CCA model. We conclude that the SFM implementation achieves accuracy similar to the CCA method, with fewer fluctuations in the etch front, while requiring roughly 4 times less memory. Although SFM can be up to 2 times slower than CCA for the simulation of anisotropic etchants, it can also be up to 10 times faster than CCA for isotropic etchants. In addition, we present a parallel, GPU-based implementation (gSFM) and compare it to an optimized, multicore CPU version (cSFM), demonstrating that the SFM algorithm can be successfully parallelized and the simulation times consequently reduced, while keeping the accuracy of the simulations.
Although modern multicore CPUs provide an acceptable option, the massively parallel architecture of modern GPUs is more suitable, as reflected by computational times for gSFM up to 7.4 times faster than
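    The level-set front evolution underlying such etch simulations can be sketched with a first-order update of φ_t + V|∇φ| = 0 for a uniform isotropic etch rate. This is an illustrative dense-grid scheme with central differences; the SFM described above updates only a narrow band around the front and production codes use upwind differencing.

```python
import numpy as np

def evolve(phi, speed, dt, steps, dx=1.0):
    """Advance phi by the level-set equation phi_t + V*|grad phi| = 0
    with a uniform speed V, using central differences on a dense grid."""
    for _ in range(steps):
        gx, gy = np.gradient(phi, dx)
        phi = phi - dt * speed * np.sqrt(gx**2 + gy**2)
    return phi

# Circular cavity (phi < 0 = etched region) expanding at unit rate.
n = 64
x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
phi0 = np.sqrt((x - 32.0)**2 + (y - 32.0)**2) - 5.0  # signed distance, radius 5
phi = evolve(phi0, speed=1.0, dt=0.5, steps=10)       # total simulated time 5
# The zero contour should now sit near radius 10 (~ pi * 10**2 etched cells).
```

    Because the initial field is a signed distance function with |∇φ| ≈ 1, each step simply advances the front by dt·V, so the radius-5 cavity grows to roughly radius 10 after unit-rate etching for time 5.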

  6. The FSHB -211G>T variant attenuates serum FSH levels in the supraphysiological gonadotropin setting of Klinefelter syndrome.

    PubMed

    Busch, Alexander S; Tüttelmann, Frank; Zitzmann, Michael; Kliesch, Sabine; Gromoll, Jörg

    2015-05-01

    Klinefelter syndrome (47,XXY) is the most frequent genetic cause of male infertility, and affected individuals share the endocrine hallmark of hypergonadotropic hypogonadism. Single-nucleotide polymorphisms located within the FSHB/FSHR genes were recently shown to impact serum follicle-stimulating hormone (FSH) levels and other reproductive parameters in men. The objective of this study was to analyse the effect of FSHB -211G>T (c.-280G>T, rs10835638) as well as FSHR c.2039G>A (rs6166) and FSHR c.-29G>A (rs1394205) on endocrine and reproductive parameters in untreated and testosterone-treated Klinefelter patients. Patients were retrospectively selected from the clientele attending a university-based andrology centre. A total of 309 non-mosaic Klinefelter individuals between 18 and 65 years were included and genotyped for the variants by TaqMan assays. The untreated group comprised 248 men, in whom the FSHB -211G>T allele was significantly associated with reduced serum follicle-stimulating hormone levels (-6.5 U/l per T allele, P = 1.3 × 10⁻³). Testosterone treatment (n = 150) abolished the observed association. When analysing patients before and under testosterone treatment (n = 89), gonadotropin levels were similarly suppressed independently of the FSHB genotype. The FSHR polymorphisms did not exhibit any significant influence in any group, on either the endocrine or the reproductive parameters. In conclusion, a hypergonadotropic setting such as Klinefelter syndrome does not mask the effects of the FSHB -211G>T genotype on serum follicle-stimulating hormone levels. The impact was indeed more pronounced compared with normal or infertile men, whereas gonadotropin suppression under testosterone treatment seems to be independent of the genotype. Thus, the FSHB -211G>T genotype is a key determinant in the regulation of gonadotropins in different reproductive-endocrine pathophysiologies.

  7. Linking English-Language Test Scores onto the Common European Framework of Reference: An Application of Standard-Setting Methodology. TOEFL iBT Research Report TOEFL iBt-06. ETS RR-08-34

    ERIC Educational Resources Information Center

    Tannenbaum, Richard J.; Wylie, E. Caroline

    2008-01-01

    The Common European Framework of Reference (CEFR) describes language proficiency in reading, writing, speaking, and listening on a 6-level scale. In this study, English-language experts from across Europe linked CEFR levels to scores on three tests: the TOEFL® iBT test, the TOEIC® assessment, and the TOEIC "Bridge"™ test.…

  8. The application of language-game theory to the analysis of science learning: Developing an interpretive classroom-level learning framework

    NASA Astrophysics Data System (ADS)

    Ahmadibasir, Mohammad

    In this study an interpretive learning framework that aims to measure learning on the classroom level is introduced. In order to develop and evaluate the value of the framework, a theoretical/empirical study is designed. The researcher attempted to illustrate how the proposed framework provides insights on the problem of classroom-level learning. The framework is developed by construction of connections between the current literature on science learning and Wittgenstein's language-game theory. In this framework learning is defined as change of classroom language-game or discourse. In the proposed framework, learning is measured by analysis of classroom discourse. The empirical explanation power of the framework is evaluated by applying the framework in the analysis of learning in a fifth-grade science classroom. The researcher attempted to analyze how students' colloquial discourse changed to a discourse that bears more resemblance to science discourse. The results of the empirical part of the investigation are presented in three parts: first, the gap between what students did and what they were supposed to do was reported. The gap showed that students during the classroom inquiry wanted to do simple comparisons by direct observation, while they were supposed to do tool-assisted observation and procedural manipulation for a complete comparison. Second, it was illustrated that the first attempt to connect the colloquial to science discourse was done by what was immediately intelligible for students and then the teacher negotiated with students in order to help them to connect the old to the new language-game more purposefully. The researcher suggested that these two events in the science classroom are critical in discourse change. Third, it was illustrated that through the academic year, the way that students did the act of comparison was improved and by the end of the year more accurate causal inferences were observable in classroom communication. 
At the end of the

  9. Dynamics of shigellosis epidemics: estimating individual-level transmission and reporting rates from national epidemiologic data sets.

    PubMed

    Joh, Richard I; Hoekstra, Robert M; Barzilay, Ezra J; Bowen, Anna; Mintz, Eric D; Weiss, Howard; Weitz, Joshua S

    2013-10-15

    Shigellosis, a diarrheal disease, is endemic worldwide and is responsible for approximately 15,000 laboratory-confirmed cases in the United States every year. However, patients with shigellosis often do not seek medical care. To estimate the burden of shigellosis, we extended time-series susceptible-infected-recovered models to infer epidemiologic parameters from underreported case data. We applied the time-series susceptible-infected-recovered-based inference schemes to analyze the largest surveillance data set of Shigella sonnei in the United States from 1967 to 2007 with county-level resolution. The dynamics of shigellosis transmission show strong annual and multiyear cycles, as well as seasonality. By using the schemes, we inferred individual-level parameters of shigellosis infection, including seasonal transmissibilities and basic reproductive number (R0). In addition, this study provides quantitative estimates of the reporting rate, suggesting that the shigellosis burden in the United States may be more than 10 times the number of laboratory-confirmed cases. Although the estimated reporting rate is generally under 20%, and R0 is generally under 1.5, there is a strong negative correlation between estimates of the reporting rate and R0. Such negative correlations are likely to pose identifiability problems in underreported diseases. We discuss complementary approaches that might further disentangle the true reporting rate and R0.
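    The underreporting logic can be illustrated with a schematic, deterministic TSIR-style recursion and a constant reporting rate. The paper fits seasonal transmissibilities to stochastic county-level data; all parameter values and the `tsir` helper below are made up for illustration.

```python
def tsir(beta, alpha, rho, S0, I0, N, steps):
    """Deterministic TSIR-style recursion: new infections follow
    beta * S * I**alpha / N, and only a fraction rho is reported."""
    S, I = S0, I0
    observed = []
    for _ in range(steps):
        I = beta * S * I**alpha / N   # incidence in the next interval
        S = max(S - I, 0.0)           # deplete susceptibles
        observed.append(rho * I)      # reported (laboratory-confirmed) cases
    return observed

obs = tsir(beta=1.8, alpha=0.97, rho=0.1, S0=9000.0, I0=10.0, N=10000.0, steps=20)
true_total = sum(obs) / 0.1   # with rho = 0.1, the true burden is 10x the reports
```

    Inverting this relationship is the crux of the paper: when only `obs` is observed, the reporting rate and the transmission parameters must be inferred jointly, which is where the negative correlation between estimates of rho and R0 arises.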

  10. Determinants of symptom profile and severity of conduct disorder in a tertiary level pediatric care set up: A pilot study

    PubMed Central

    Jayaprakash, R.; Rajamohanan, K.; Anil, P.

    2014-01-01

    Background: Conduct disorders (CDs) are among the most common causes for referral to child and adolescent mental health centers. CD varies in its environmental factors, symptom profile, severity, co-morbidity, and functional impairment. Aims: The aim was to analyze the determinants of symptom profile and severity in childhood- and adolescent-onset CD. Settings and Design: Clinic-based study of 60 consecutive children between 6 and 18 years of age satisfying the International Classification of Diseases-10 (ICD-10) Diagnostic Criteria for Research for CD, attending a behavioral pediatrics outpatient unit. Materials and Methods: Family psychopathology, symptom severity, and functional level were assessed using the parent interview schedule, the Revised Behavior Problem Checklist, and the Children's Global Assessment Scale. Statistical Analysis: The correlation and predictive power of the variables were analyzed using SPSS version 16.0. Results: There was a significant male predominance (88.3%), with a boy:girl ratio of 7.5:1. The most common comorbidity was hyperkinetic disorder (45%). The childhood-onset group was predominant (70%). The prevalence of comorbidity was higher in the early-onset group (66.7%) than in the late-onset group (33.3%). Family psychopathology, symptom severity, and functional impairment were significantly higher in the childhood-onset group. Conclusion: The determinants of symptom profile and severity are early onset (childhood-onset CD), the nature and quantity of family psychopathology, the prevalence and type of comorbidity, and the nature of the symptom profile itself. Family psychopathology is positively correlated with symptom severity and negatively correlated with the functional level of children with CD. Symptom severity was negatively correlated with the functional level of the child with CD. PMID:25568472

  11. Three dimensional level set based semiautomatic segmentation of atherosclerotic carotid artery wall volume using 3D ultrasound imaging

    NASA Astrophysics Data System (ADS)

    Hossain, Md. Murad; AlMuhanna, Khalid; Zhao, Limin; Lal, Brajesh K.; Sikdar, Siddhartha

    2014-03-01

    3D segmentation of carotid plaque from ultrasound (US) images is challenging due to image artifacts and poor boundary definition. Semiautomatic segmentation algorithms for calculating vessel wall volume (VWV) have been proposed for the common carotid artery (CCA) but have not been applied to plaques in the internal carotid artery (ICA). In this work, we describe a 3D segmentation algorithm that is robust to shadowing and missing boundaries. Our algorithm uses a distance-regularized level set method with edge- and region-based energies to segment the adventitial wall boundary (AWB) and lumen-intima boundary (LIB) of plaques in the CCA, ICA, and external carotid artery (ECA). The algorithm is initialized by manually placing points on the boundary of a subset of transverse slices with an interslice distance of 4 mm. We propose a novel, user-defined stopping-surface-based energy to prevent leaking of the evolving surface across poorly defined boundaries. Validation was performed against manual segmentation using 3D US volumes acquired from five asymptomatic patients with carotid stenosis using a linear 4D probe. A pseudo gold-standard boundary was formed from manual segmentation by three observers. The Dice similarity coefficient (DSC), Hausdorff distance (HD), and modified HD (MHD) were used to compare the algorithm results against the pseudo gold standard on 1205 cross-sectional slices of five 3D US image sets. The algorithm showed good agreement with the pseudo gold-standard boundary, with mean DSC of 93.3% (AWB) and 89.82% (LIB); mean MHD of 0.34 mm (AWB) and 0.24 mm (LIB); and mean HD of 1.27 mm (AWB) and 0.72 mm (LIB). The proposed 3D semiautomatic segmentation is a first step towards full characterization of 3D plaque progression and longitudinal monitoring.
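    The Dice similarity coefficient used for validation is straightforward to compute from binary masks:

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks:
    DSC = 2 * |A intersect B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

a = np.zeros((10, 10), bool); a[2:8, 2:8] = True   # 36 pixels
b = np.zeros((10, 10), bool); b[3:9, 3:9] = True   # 36 pixels, 25 overlapping
print(dice(a, b))  # 2*25/72 ≈ 0.694
```

    The Hausdorff and modified Hausdorff distances, by contrast, are computed on the extracted boundary contours rather than on the filled masks.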

  12. Public Higher Education Performance Accountability Framework Report: Goal--College Readiness Measure: Levels in English and Mathematics. Commission Report 07-24

    ERIC Educational Resources Information Center

    California Postsecondary Education Commission, 2007

    2007-01-01

    As part of its work in developing a performance accountability framework for higher education, the Commission conducted an analysis of student performance on standardized tests at the high school and middle school levels. National test results show that California is behind most other states in giving its students a high school education of the…

  13. A Research Study Using the Delphi Method to Define Essential Competencies for a High School Game Art and Design Course Framework at the National Level

    ERIC Educational Resources Information Center

    Mack, Nayo Corenus-Geneva

    2011-01-01

    This research study reports the findings of a Delphi study conducted to determine the essential competencies and objectives for a high school Game Art and Design course framework at the national level. The Delphi panel consisted of gaming, industry and educational experts from all over the world who were members of the International Game…

  14. Are Providers More Likely to Contribute to Healthcare Disparities Under High Levels of Cognitive Load? How Features of the Healthcare Setting May Lead to Biases in Medical Decision Making

    PubMed Central

    Burgess, Diana J.

    2014-01-01

    Systematic reviews of healthcare disparities suggest that clinicians’ diagnostic and therapeutic decision making varies by clinically irrelevant characteristics, such as patient race, and that this variation may contribute to healthcare disparities. However, there is little understanding of the particular features of the healthcare setting under which clinicians are most likely to be inappropriately influenced by these characteristics. This study delineates several hypotheses to stimulate future research in this area. It is posited that healthcare settings in which providers experience high levels of cognitive load will increase the likelihood of racial disparities via two pathways. First, providers who experience higher levels of cognitive load are hypothesized to make poorer medical decisions and provide poorer care for all patients, due to lower levels of controlled processing (H1). Second, under greater levels of cognitive load, it is hypothesized that healthcare providers’ medical decisions and interpersonal behaviors will be more likely to be influenced by racial stereotypes, leading to poorer processes and outcomes of care for racial minority patients (H2). It is further hypothesized that certain characteristics of healthcare settings will result in higher levels of cognitive load experienced by providers (H3). Finally, it is hypothesized that minority patients will be disproportionately likely to be treated in healthcare settings in which providers experience greater levels of cognitive load (H4a), which will result in racial disparities due to lower levels of controlled processing by providers (H4b) and the influence of racial stereotypes (H4c). The study concludes with implications for research and practice that flow from this framework. PMID:19726783

  15. Exploring facilitators and barriers to individual and organizational level capacity building: outcomes of participation in a community priority setting workshop.

    PubMed

    Flaman, Laura M; Nykiforuk, Candace I J; Plotnikoff, Ronald C; Raine, Kim

    2010-06-01

    This article explores facilitators and barriers to individual and organizational capacity to address priority strategies for community-level chronic disease prevention. Interviews were conducted with a group of participants who had previously taken part in a community priority-setting workshop held in two Alberta communities. The goal of the workshop was to bring together key community stakeholders to collaboratively identify action strategies for preventing chronic diseases in their communities. While capacity building was not the specific aim of the workshop, it could be considered an unintended byproduct of bringing together community representatives around a specific issue. One purpose of this study was to examine the participants' capacity to take action on the priority strategies identified at the workshop. Eleven one-on-one semi-structured interviews were conducted with workshop participants to examine facilitators and barriers to individual and organizational level capacity building. Findings suggest that several barriers limited participants' capacity to take action on the workshop strategies, specifically: (i) organizations' lack of priorities or competing priorities; (ii) priorities secondary to the organizational mandate; (iii) disconnect between organizational and community priorities; (iv) disconnect between community organization priorities; (v) disconnect between organizations and government/funder priorities; (vi) limited resources (i.e. time, money, and personnel); and (vii) bigger community issues. The primary facilitator of individual capacity to take action on priority strategies was supportive organizations. Recognition of these elements will allow practitioners, organizations, governments/funders, and communities to focus on seeking ways to improve capacity for chronic disease prevention.

  16. Automatic optimal filament segmentation with sub-pixel accuracy using generalized linear models and B-spline level-sets

    PubMed Central

    Xiao, Xun; Geyer, Veikko F.; Bowne-Anderson, Hugo; Howard, Jonathon; Sbalzarini, Ivo F.

    2016-01-01

    Biological filaments, such as actin filaments, microtubules, and cilia, are often imaged using different light-microscopy techniques. Reconstructing the filament curve from the acquired images constitutes the filament segmentation problem. Since filaments have lower dimensionality than the image itself, there is an inherent trade-off between tracing the filament with sub-pixel accuracy and avoiding noise artifacts. Here, we present a globally optimal filament segmentation method based on B-spline vector level-sets and a generalized linear model for the pixel intensity statistics. We show that the resulting optimization problem is convex and can hence be solved with global optimality. We introduce a simple and efficient algorithm to compute such optimal filament segmentations, and provide an open-source implementation as an ImageJ/Fiji plugin. We further derive an information-theoretic lower bound on the filament segmentation error, quantifying how well an algorithm could possibly do given the information in the image. We show that our algorithm asymptotically reaches this bound in the spline coefficients. We validate our method in comprehensive benchmarks, compare with other methods, and show applications from fluorescence, phase-contrast, and dark-field microscopy. PMID:27104582

  17. The Moving Boundary Node Method: A level set-based, finite volume algorithm with applications to cell motility

    PubMed Central

    Wolgemuth, Charles W.; Zajac, Mark

    2010-01-01

    Eukaryotic cell crawling is a highly complex biophysical and biochemical process, where deformation and motion of a cell are driven by internal, biochemical regulation of a poroelastic cytoskeleton. One challenge to building quantitative models that describe crawling cells is solving the reaction-diffusion-advection dynamics for the biochemical and cytoskeletal components of the cell inside its moving and deforming geometry. Here we develop an algorithm that uses the level set method to move the cell boundary and uses information stored in the distance map to construct a finite volume representation of the cell. Our method preserves Cartesian connectivity of nodes in the finite volume representation while resolving the distorted cell geometry. Derivatives approximated using a Taylor series expansion at finite volume interfaces lead to second-order accuracy even on highly distorted quadrilateral elements. A modified, Laplacian-based interpolation scheme is developed that conserves mass while interpolating values onto nodes that join the cell interior as the boundary moves. An implicit time-stepping algorithm is used to maintain stability. We use the algorithm to simulate two simple models for cellular crawling. The first model uses depolymerization of the cytoskeleton to drive cell motility and suggests that the shape of a steady crawling cell is strongly dependent on the adhesion between the cell and the substrate. In the second model, we use a model for chemical signalling during chemotaxis to determine the shape of a crawling cell in a constant gradient and to show cellular response upon gradient reversal. PMID:20689723
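    The core machinery this record builds on is a signed-distance level set whose zero contour is the cell boundary, advanced with a normal speed. A minimal NumPy sketch of that idea (not the authors' finite-volume scheme; grid size, speed, and time step are illustrative):

```python
import numpy as np

# Signed distance map for a circular "cell" of radius 0.3 on a Cartesian grid:
# negative inside, positive outside, zero on the boundary.
n = 201
x = np.linspace(-1.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.sqrt(X ** 2 + Y ** 2) - 0.3

def advance(phi, speed, dt, steps, h):
    """Move the zero level set outward with uniform normal speed.

    Solves d(phi)/dt = -speed * |grad phi| with central-difference
    gradients; for a signed distance function |grad phi| is ~1, so the
    boundary radius grows by speed * t.
    """
    for _ in range(steps):
        gx, gy = np.gradient(phi, h)
        phi = phi - dt * speed * np.sqrt(gx ** 2 + gy ** 2)
    return phi

h = x[1] - x[0]
phi_new = advance(phi, speed=0.1, dt=0.5 * h, steps=40, h=h)
# Total time 40 * 0.5h = 0.2, so the radius should grow from 0.30 to ~0.32.
```

    The distance map is exactly the structure the paper mines to build its finite-volume stencils near the moving boundary.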

  18. Sharp Interface Level Set Method based Study for Evaporation of a Sessile Droplet on Hydrophilic and Hydrophobic Substrates

    NASA Astrophysics Data System (ADS)

    Shaikh, Javed; Sharma, Atul; Bhardwaj, Rajneesh

    2016-11-01

    The evaporation of a sessile droplet is important in many applications, such as hot-spot cooling and surface patterning. An understanding of droplet dynamics in terms of evaporation rate, evaporative cooling, and substrate wettability is important for designing droplet-based devices. Extensive theoretical and experimental research has been conducted on evaporating droplets in recent years; however, the effect of surrounding vapors on the evaporation dynamics of a sessile droplet is not found in the literature. In this work, an in-house sharp-interface level set code based on the Ghost Fluid Method (GFM) is used. Energy, species, and momentum equations are coupled to study the sessile droplet evaporation phenomenon on hydrophilic and hydrophobic substrates. Different modes of droplet evaporation, i.e., constant contact radius (CCR) and constant contact angle (CCA), are observed for the two types of substrates. The coupling of the energy and species equations captures the evaporative-cooling-induced dip in the droplet surface temperature. The effect of surrounding vapors, such as fluorocarbon vapors, on evaporative cooling is presented for a water droplet on hydrophilic and hydrophobic substrates.
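    As a back-of-envelope reference for evaporation timescales (far simpler than the coupled GFM model in this record), the classical d²-law says the squared diameter shrinks linearly in time. A sketch, with an evaporation constant chosen purely for illustration:

```python
import math

def droplet_lifetime(d0_m, k_m2_per_s):
    """Classical d^2-law: D(t)^2 = D0^2 - K*t, so the droplet vanishes
    at t = D0^2 / K.

    d0_m: initial diameter (m); k_m2_per_s: lumped evaporation constant
    (m^2/s) -- an illustrative parameter, not a value from the paper.
    """
    return d0_m ** 2 / k_m2_per_s

def diameter_at(t_s, d0_m, k_m2_per_s):
    """Diameter after time t under the d^2-law (0 once fully evaporated)."""
    d2 = d0_m ** 2 - k_m2_per_s * t_s
    return math.sqrt(d2) if d2 > 0 else 0.0

d0 = 1.0e-3              # 1 mm water droplet
k = 1.0e-9               # illustrative evaporation constant
t_life = droplet_lifetime(d0, k)
```

    Sessile droplets deviate from this law precisely because of the contact-line modes (CCR vs. CCA) and evaporative cooling that the abstract's sharp-interface model resolves.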

  19. Automatic optimal filament segmentation with sub-pixel accuracy using generalized linear models and B-spline level-sets.

    PubMed

    Xiao, Xun; Geyer, Veikko F; Bowne-Anderson, Hugo; Howard, Jonathon; Sbalzarini, Ivo F

    2016-08-01

    Biological filaments, such as actin filaments, microtubules, and cilia, are often imaged using different light-microscopy techniques. Reconstructing the filament curve from the acquired images constitutes the filament segmentation problem. Since filaments have lower dimensionality than the image itself, there is an inherent trade-off between tracing the filament with sub-pixel accuracy and avoiding noise artifacts. Here, we present a globally optimal filament segmentation method based on B-spline vector level-sets and a generalized linear model for the pixel intensity statistics. We show that the resulting optimization problem is convex and can hence be solved with global optimality. We introduce a simple and efficient algorithm to compute such optimal filament segmentations, and provide an open-source implementation as an ImageJ/Fiji plugin. We further derive an information-theoretic lower bound on the filament segmentation error, quantifying how well an algorithm could possibly do given the information in the image. We show that our algorithm asymptotically reaches this bound in the spline coefficients. We validate our method in comprehensive benchmarks, compare with other methods, and show applications from fluorescence, phase-contrast, and dark-field microscopy.

  20. Barrett's Mucosa Segmentation in Endoscopic Images Using a Hybrid Method: Spatial Fuzzy c-mean and Level Set

    PubMed Central

    Yousefi-Banaem, Hossein; Rabbani, Hossein; Adibi, Peyman

    2016-01-01

    Barrett's mucosa is one of the most important diseases of the upper gastrointestinal system and is caused by gastroesophageal reflux. If left untreated, the disease can cause adenocarcinoma of the distal esophagus and gastric cardia. The malignancy risk is very high in short-segment Barrett's mucosa. Therefore, lesion area segmentation can improve the specialist's treatment decisions. In this paper, we propose a fuzzy method combined with active contour models for Barrett's mucosa segmentation. In this study, we applied three methods for segmenting and determining specific areas. For whole disease area segmentation, we applied a hybrid fuzzy-based level set method (LSM). Morphological algorithms were used to determine the gastroesophageal junction, and we discriminated Barrett's mucosa from breaks by applying the Chan-Vese method. Fuzzy c-means and LSMs alone fail to segment this type of medical image due to weak boundaries. In contrast, the fully automatic hybrid method with a correlation approach used in this paper segmented the metaplasia area in endoscopy images with desirable accuracy. The presented approach omits the manual cluster selection step that previously required operator intervention. The results convinced us that this approach is suitable for esophageal metaplasia segmentation. PMID:28028499
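    The fuzzy half of the hybrid pipeline is fuzzy c-means clustering, in which every pixel holds a graded membership in each intensity cluster rather than a hard label. A minimal 1-D sketch of the standard FCM update rules (not the paper's spatial variant; the data are toy intensities, not endoscopy pixels):

```python
def fuzzy_c_means(data, c=2, m=2.0, iters=50):
    """Minimal fuzzy c-means on 1-D data.

    u[i][k] is the membership of point k in cluster i; each point's
    memberships sum to 1, and centers are u^m-weighted means.
    Deterministic initialization: centers at the data extremes.
    """
    centers = [min(data), max(data)][:c]
    eps = 1e-12
    for _ in range(iters):
        # Membership update from distances to current centers.
        u = []
        for i in range(c):
            row = []
            for x in data:
                di = abs(x - centers[i]) + eps
                denom = sum((di / (abs(x - centers[j]) + eps)) ** (2.0 / (m - 1.0))
                            for j in range(c))
                row.append(1.0 / denom)
            u.append(row)
        # Center update as membership-weighted means.
        centers = [sum((u[i][k] ** m) * data[k] for k in range(len(data))) /
                   sum(u[i][k] ** m for k in range(len(data)))
                   for i in range(c)]
    return centers, u

# Two well-separated intensity clusters (illustrative values).
data = [0.1, 0.2, 0.15, 0.9, 0.85, 0.95]
centers, u = fuzzy_c_means(data)
```

    In the hybrid method, such memberships seed the level set, which then resolves the weak boundaries that clustering alone cannot.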

  1. Novel level-set based segmentation method of the lung at HRCT images of diffuse interstitial lung disease (DILD)

    NASA Astrophysics Data System (ADS)

    Lee, Jeongjin; Seo, Joon Beom; Kim, Namkug; Park, Sang Ok; Lee, Ho; Shin, Yeong Gil; Kim, Soo-Hong

    2009-02-01

    In this paper, we propose an algorithm for reliable segmentation of the lung at HRCT of DILD. Our method consists of four main steps. First, the airway and colon are segmented and excluded by thresholding (-974 HU) and connected component analysis. Second, the initial lung region is identified by thresholding (-474 HU). Third, shape propagation outward from the lung is performed on the initial lung region; the actual lung boundaries lie inside the propagated boundaries. Finally, a subsequent shape-modeling level set evolving inward from the propagated boundary identifies the lung boundary when the curvature term is highly weighted. To assess the accuracy of the proposed algorithm, the segmentation results of 54 patients were compared with manual segmentations by an expert radiologist. The value of 1 minus the volumetric overlap is below 5%. The accurate results of our method would be useful in delineating the lung parenchyma at HRCT, which is the essential step for the automatic classification and quantification of diffuse interstitial lung disease.
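    The first two steps of the pipeline (HU thresholding plus connected component analysis) can be sketched in a few lines. This is a generic flood-fill labeling on a toy HU grid, not the authors' code; the cutoff -474 HU matches the abstract, the image values are invented:

```python
from collections import deque

def threshold(image, hu_cutoff):
    """Binary mask: 1 where the voxel value is below the HU cutoff."""
    return [[1 if v < hu_cutoff else 0 for v in row] for row in image]

def connected_components(mask):
    """4-connected component labeling by breadth-first flood fill.

    Returns (labels, sizes): labels[r][c] is 0 for background or a
    positive component id, and sizes maps id -> pixel count.
    """
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    sizes = {}
    next_id = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not labels[r][c]:
                next_id += 1
                q = deque([(r, c)])
                labels[r][c] = next_id
                count = 0
                while q:
                    y, x = q.popleft()
                    count += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = next_id
                            q.append((ny, nx))
                sizes[next_id] = count
    return labels, sizes

# Toy CT slice in HU: two dark (air-like) blobs inside brighter tissue.
slice_hu = [
    [-900, -900,   40,   40,   40],
    [-900, -900,   40, -980,   40],
    [  40,   40,   40, -980,   40],
    [  40,   40,   40,   40,   40],
]
mask = threshold(slice_hu, -474)
labels, sizes = connected_components(mask)
largest = max(sizes, key=sizes.get)   # e.g. keep or discard components by size
```

    In the paper this analysis separates airway and colon from the lung candidates before the level set refines the boundary.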

  2. Mechanical behavior of pathological and normal red blood cells in microvascular flow based on modified level-set method

    NASA Astrophysics Data System (ADS)

    Zhang, XiWen; Ma, FangChao; Hao, PengFei; Yao, ZhaoHui

    2016-01-01

    Research on the motion and deformation of red blood cells (RBCs) is important for revealing the mechanisms of blood diseases. A numerical method has been developed with a level set formulation for an elastic membrane immersed in an incompressible fluid. The numerical model satisfies mass and energy conservation without the leaking problems of the classical Immersed Boundary Method (IBM); at the same time, the computational grid can be much coarser than those generally used in the literature. The motion and deformation of a red blood cell (in both pathological and normal states) in microvascular flow are simulated. It is found that the Reynolds number and the membrane's stiffness play an important role in the transmutation and oscillation of the elastic membrane. The normal biconcave shape of the RBC is more conducive to large deformation than the other, pathological shapes. With reduced viscosity of the interior fluid, both the velocity of the blood and the deformability of the cell are reduced. With increased viscosity of the plasma, both the velocity of the blood and the deformability of the cell are reduced. Tank treading of the RBC membrane is observed at sufficiently low viscosity contrast in shear flow. The fixed inclination angle of the tank-treading cell depends on the shear ratio and viscosity contrast, which compares well with experimental observations.

  3. Barrett's Mucosa Segmentation in Endoscopic Images Using a Hybrid Method: Spatial Fuzzy c-mean and Level Set.

    PubMed

    Yousefi-Banaem, Hossein; Rabbani, Hossein; Adibi, Peyman

    2016-01-01

    Barrett's mucosa is one of the most important diseases of the upper gastrointestinal system and is caused by gastroesophageal reflux. If left untreated, the disease can cause adenocarcinoma of the distal esophagus and gastric cardia. The malignancy risk is very high in short-segment Barrett's mucosa. Therefore, lesion area segmentation can improve the specialist's treatment decisions. In this paper, we propose a fuzzy method combined with active contour models for Barrett's mucosa segmentation. In this study, we applied three methods for segmenting and determining specific areas. For whole disease area segmentation, we applied a hybrid fuzzy-based level set method (LSM). Morphological algorithms were used to determine the gastroesophageal junction, and we discriminated Barrett's mucosa from breaks by applying the Chan-Vese method. Fuzzy c-means and LSMs alone fail to segment this type of medical image due to weak boundaries. In contrast, the fully automatic hybrid method with a correlation approach used in this paper segmented the metaplasia area in endoscopy images with desirable accuracy. The presented approach omits the manual cluster selection step that previously required operator intervention. The results convinced us that this approach is suitable for esophageal metaplasia segmentation.

  4. Joint optimization of segmentation and shape prior from level-set-based statistical shape model, and its application to the automated segmentation of abdominal organs.

    PubMed

    Saito, Atsushi; Nawano, Shigeru; Shimizu, Akinobu

    2016-02-01

    The goal of this study is to provide a theoretical framework for accurately optimizing the segmentation energy considering all of the possible shapes generated from the level-set-based statistical shape model (SSM). The proposed algorithm solves the well-known open problem, in which a shape prior may not be optimal in terms of an objective functional that needs to be minimized during segmentation. The algorithm allows the selection of an optimal shape prior from among all possible shapes generated from an SSM by conducting a branch-and-bound search over an eigenshape space. The proposed algorithm does not require predefined shape templates or the construction of a hierarchical clustering tree before graph-cut segmentation. It jointly optimizes an objective functional in terms of both the shape prior and segmentation labeling, and finds an optimal solution by considering all possible shapes generated from an SSM. We apply the proposed algorithm to both pancreas and spleen segmentation using multiphase computed tomography volumes, and we compare the results obtained with those produced by a conventional algorithm employing a branch-and-bound search over a search tree of predefined shapes, which were sampled discretely from an SSM. The proposed algorithm significantly improves the segmentation performance in terms of the Jaccard index and Dice similarity index. In addition, we compare the results with a state-of-the-art multiple abdominal organ segmentation algorithm, and confirm that the performance of the two algorithms is comparable. We discuss the high computational efficiency of the proposed algorithm, which was determined experimentally using a normalized number of traversed nodes in a search tree, and the extensibility of the proposed algorithm to other SSMs or energy functionals.
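    The eigenshape space searched here comes from a PCA-style statistical shape model: training shapes (e.g. vectorized signed-distance maps) are decomposed into a mean plus orthonormal modes, and any model shape is the mean plus a coefficient vector times those modes. A minimal NumPy sketch of that construction (not the authors' branch-and-bound; the training "shapes" are random toy vectors):

```python
import numpy as np

def build_ssm(shapes):
    """PCA statistical shape model from vectorized shape representations.

    shapes: (n_samples, n_voxels). Returns (mean, eigenshapes), where the
    rows of eigenshapes are orthonormal modes of variation.
    """
    mean = shapes.mean(axis=0)
    centered = shapes - mean
    # SVD of the centered data: right singular vectors are the eigenshapes.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt

def reconstruct(mean, eigenshapes, coeffs):
    """Generate a shape from eigenshape coefficients (a point in shape space)."""
    return mean + coeffs @ eigenshapes

rng = np.random.default_rng(0)
# Toy training set: 5 "shapes" of 12 voxels each (illustrative, not CT data).
shapes = rng.normal(size=(5, 12))
mean, modes = build_ssm(shapes)
coeffs = (shapes[0] - mean) @ modes.T     # project a training shape
recon = reconstruct(mean, modes, coeffs)  # and rebuild it exactly
```

    The paper's contribution is then to bound the segmentation energy over whole regions of this coefficient space, so the branch-and-bound search can discard most of it without sampling discrete templates.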

  5. Assessment of the Coastal Landscape Response to Sea-Level Rise Using a Decision-Support Framework

    NASA Astrophysics Data System (ADS)

    Lentz, E. E.; Thieler, E. R.; Plant, N. G.; Stippa, S.; Horton, R. M.; Gesch, D. B.

    2014-12-01

    Identifying the form and nature of coastal landscape changes that may occur in response to future sea-level rise (SLR) is essential to support decision making for resource allocation that improves climate change resilience. Both natural ecosystems and the built environment are subject to these changes and require associated resilience assessments. Existing assessments of coastal change driven by SLR typically focus on two categories of coastal response: 1) inundation by flooding as the water level rises; and 2) dynamic change resulting from movement of landforms and/or ecosystems. Results from these assessments are not always straightforward to apply in a decision support context, as it can be unclear what the dominant response type may be in a given coastal setting (e.g., barrier island, headland, wetland, forest). Furthermore, an important decision support element is to capture and clearly convey the associated uncertainty of both the underlying datasets (e.g., elevation) and climate drivers (e.g., relative SLR). We developed a Bayesian network model of SLR assessment that uses publicly available geospatial datasets—land cover, elevation, and vertical land movement—and their associated uncertainties to generate probabilistic predictions of those areas likely to inundate versus dynamically respond to various SLR scenarios. SLR projections were generated using multiple sources of information, including Coupled Model Intercomparison Project Phase 5 (CMIP5) models. Model outputs include predictions of potential future land-surface elevation and coastal response type at landscape (>100 km) to local (5-10 km) scales for the Northeastern U.S., commensurate with decision-making needs. The probabilistic approach allows us to objectively and transparently describe prediction certainty to decision makers. From this approach, we are also able to highlight areas in which more data or knowledge may be needed to provide a stronger basis for decision making.
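    The probabilistic core of this assessment is a discrete Bayesian network: land cover, elevation, and similar inputs act as parents of the coastal-response node, and queries marginalize over whatever is unobserved. A toy two-parent sketch of that inference by enumeration (structure and all probability values are invented for illustration, not taken from the study):

```python
def posterior_dynamic(p_low_elev, p_marsh, cpt):
    """P(response = dynamic), marginalized over two binary parents.

    cpt[(low_elev, marsh)] = P(dynamic | parents); all numbers illustrative.
    """
    total = 0.0
    for low in (True, False):
        for marsh in (True, False):
            p_parents = (p_low_elev if low else 1 - p_low_elev) * \
                        (p_marsh if marsh else 1 - p_marsh)
            total += p_parents * cpt[(low, marsh)]
    return total

def posterior_given_marsh(p_low_elev, cpt, marsh=True):
    """P(dynamic | marsh): condition on one parent, marginalize the other."""
    return sum((p_low_elev if low else 1 - p_low_elev) * cpt[(low, marsh)]
               for low in (True, False))

# Illustrative conditional probability table: a dynamic response is
# likelier on low-lying marsh than on uplands.
cpt = {(True, True): 0.9, (True, False): 0.5,
       (False, True): 0.7, (False, False): 0.2}
p = posterior_given_marsh(p_low_elev=0.6, cpt=cpt)  # 0.6*0.9 + 0.4*0.7 = 0.82
```

    Reporting such posteriors, rather than a single predicted class, is what lets the study convey prediction certainty to decision makers and flag where more data would sharpen the answer.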

  6. Framework for combining REACH and national regulations to obtain equal protection levels of human health and the environment in different countries - comparative study of Denmark and Korea.

    PubMed

    Lee, Jihyun; Pedersen, Anders Branth; Thomsen, Marianne

    2013-08-15

    The aim of this paper is to present a conceptual framework for a systems approach to protect the environment and human health by taking into account differences in the cumulative risks of total human exposure in a territorial context. To this end, the measures that are available and that can be included in REACH exposure scenarios in order to obtain territorially relevant chemical safety assessments (CSAs) were explored. The advantage of linking the REACH exposure scenarios with background environmental quality data reported under other national regulations is discussed. The main question is how REACH may be improved to protect the environment and human health inside and outside the EU. This question is exemplified in a comparative case study of two countries, Denmark and Korea, each with its own set of different environmental qualities and national regulations. As a member of the EU, Denmark is obliged to adopt REACH, while Korea implemented REACH to improve the competitiveness of Korean industry within the EU market. It is shown how differences in national regulations and environmental qualities in these two countries affect background human exposure concentrations. With lead chosen as a model compound, territorial differences in background exposure to endocrine- and neurologically-interfering stressors were modelled. It is concluded that the different territorial soil and air lead pollution levels contribute differently to the total childhood lead exposure in the two countries. As such, the probability of the total exposure from air and soil exceeding 10% of the provisional Total Daily Intake (PTDI) is estimated to be 55.3% in Denmark and 8.2% in Korea. The relative contribution from air inhalation and soil ingestion to childhood lead exposure is estimated to be 1-99% in Denmark while it is 83-17% in Korea.
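    Exceedance probabilities of the kind reported here (e.g. "total exposure exceeds 10% of the PTDI") are typically estimated by Monte Carlo sampling of the exposure-route distributions. A bare-bones sketch of that calculation; the lognormal form is a common choice for exposure data, and every numeric parameter below is invented for illustration, not taken from the paper:

```python
import random

def p_exceeds(air_mean, soil_mean, limit, n=20000, seed=42):
    """Monte Carlo probability that air + soil lead intake exceeds a limit.

    Intakes are drawn as (route mean) * lognormal(0, sigma); the means,
    sigma, and limit are illustrative placeholders only.
    """
    rng = random.Random(seed)
    sigma = 0.5  # assumed geometric-spread parameter
    hits = 0
    for _ in range(n):
        air = air_mean * rng.lognormvariate(0.0, sigma)
        soil = soil_mean * rng.lognormvariate(0.0, sigma)
        if air + soil > limit:
            hits += 1
    return hits / n

limit = 0.1 * 3.6   # 10% of an illustrative PTDI (ug/kg bw/day)
p_low = p_exceeds(air_mean=0.05, soil_mean=0.10, limit=limit)
p_high = p_exceeds(air_mean=0.15, soil_mean=0.30, limit=limit)
```

    Running the same calculation with territory-specific background concentrations is what produces country-contrasting figures like the 55.3% vs. 8.2% reported in the abstract.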

  7. Gender Mainstreaming in Education at the Level of Field Operations: The Case of CARE USA's Indicator Framework

    ERIC Educational Resources Information Center

    Miske, Shirley; Meagher, Margaret; DeJaeghere, Joan

    2010-01-01

    Following the adoption of gender mainstreaming at the Beijing Conference for Women in 1995 as a major strategy to promote gender equality and the recognition of gender analysis as central to this process, Gender and Development (GAD) frameworks have provided tools for gender analysis in various sectors. Gender mainstreaming in basic education has…

  8. A Simple Model Framework to Explore the Deeply Uncertain, Local Sea Level Response to Climate Change. A Case Study on New Orleans, Louisiana

    NASA Astrophysics Data System (ADS)

    Bakker, Alexander; Louchard, Domitille; Keller, Klaus

    2016-04-01

    Sea-level rise threatens many coastal areas around the world. The integrated assessment of potential adaptation and mitigation strategies requires a sound understanding of the upper tails and the major drivers of the uncertainties. Global warming causes sea-level to rise, primarily due to thermal expansion of the oceans and mass loss of the major ice sheets, smaller ice caps and glaciers. These components show distinctly different responses to temperature changes with respect to response time, threshold behavior, and local fingerprints. Projections of these different components are deeply uncertain. Projected uncertainty ranges strongly depend on (necessary) pragmatic choices and assumptions; e.g. on the applied climate scenarios, which processes to include and how to parameterize them, and on error structure of the observations. Competing assumptions are very hard to objectively weigh. Hence, uncertainties of sea-level response are hard to grasp in a single distribution function. The deep uncertainty can be better understood by making clear the key assumptions. Here we demonstrate this approach using a relatively simple model framework. We present a mechanistically motivated, but simple model framework that is intended to efficiently explore the deeply uncertain sea-level response to anthropogenic climate change. The model consists of 'building blocks' that represent the major components of sea-level response and its uncertainties, including threshold behavior. The framework's simplicity enables the simulation of large ensembles allowing for an efficient exploration of parameter uncertainty and for the simulation of multiple combined adaptation and mitigation strategies. The model framework can skilfully reproduce earlier major sea level assessments, but due to the modular setup it can also be easily utilized to explore high-end scenarios and the effect of competing assumptions and parameterizations.
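    The "building blocks" described here can be made concrete with a semi-empirical sketch: one block integrates a rate proportional to the temperature anomaly (thermal-expansion-like), and another contributes only above a temperature threshold (ice-sheet-like threshold behavior). All parameter values below are invented for illustration and are not the framework's calibrated values:

```python
def project_slr(temps, a=3.4, t0=0.0, threshold=2.0, ice_rate=5.0, dt=1.0):
    """Sum two sea-level 'building blocks' over a temperature pathway.

    temps: annual global-mean temperature anomalies (degC).
    Block 1 (semi-empirical, thermal-expansion-like): dS/dt = a*(T - t0), mm/yr.
    Block 2 (threshold behavior): adds ice_rate mm/yr only while T > threshold.
    """
    s = 0.0
    for t in temps:
        s += a * (t - t0) * dt
        if t > threshold:
            s += ice_rate * dt
    return s  # total rise in mm

warming_low = [0.01 * y for y in range(100)]    # reaches ~1 degC, no threshold crossing
warming_high = [0.03 * y for y in range(100)]   # reaches ~3 degC, crosses the threshold
rise_low = project_slr(warming_low)
rise_high = project_slr(warming_high)
```

    Because each evaluation is this cheap, very large ensembles over parameter and scenario uncertainty become feasible, which is exactly the design goal the abstract states.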

  9. Sedimentary framework of the southern Maine inner continental shelf: Influence of glaciation and sea-level change

    USGS Publications Warehouse

    Kelley, J.T.; Belknap, D.F.; Shipp, R.C.

    1989-01-01

    Although the tidally influenced shoreline of Maine is longer than that of virtually any other state, almost no research on its geology has been published. In order to go some way towards remedying this, 1500 km of high-resolution seismic reflection data and 800 km of sidescan sonar imagery have been collected. On the basis of these data and observations made during ten submersible dives, more than 800 bottom samples were collected and evaluated for texture and composition. The understanding of the sedimentary framework of the southern Maine shelf and the processes that maintain it are summarized, and future research directions to evaluate the strategic mineral potential are indicated. In the past 14,000 years, the Maine shelf has experienced a deglaciation and two marine transgressions separated by a regression. The deglaciation was accompanied by the first transgression and deposited till interbedded with up to 40 m of glaciomarine sediment (the Presumpscot Formation) across the shelf. The first transgression culminated about 12,500 yrs B.P., and its landward limit is marked by large glaciomarine deltas 50-100 km landward of the present-day coast. Sea level fell until about 9500 yrs B.P., when shorelines were cut at about the 65 m depth and some large "lowstand deltas" were deposited. Sea level has risen since then and in the general absence of modern river sediment input marine processes have reworked the older sediment. Five shelf environments have been defined in terms of their surficial sediment and stratigraphy. Nearshore ramps are sandy regions extending to about 30 m deep offshore of sandy beaches. These may be reworked lowstand deltas, and possess the thickest bodies of sand in the region. Nearshore basins are mud-filled troughs seaward of coastal areas lacking significant river input. Slumping glaciomarine deposits provide most of the Holocene mud that floors these basins. Rocky zones are extensive areas of exposed rock most common in the 30-50 m depth

  10. Self-Compassion: A Mentorship Framework for Counselor Educator Mothers

    ERIC Educational Resources Information Center

    Solomon, Coralis; Barden, Sejal Mehta

    2016-01-01

    Counselor educators experience high levels of stress. Mothers in academia face an additional set of emotional stressors. The authors offer a self-compassion framework for mentors to increase emotional resilience of mothers in counselor education.

  11. The Resilience Activation Framework: A conceptual model of how access to social resources promotes adaptation and rapid recovery in post-disaster settings

    PubMed Central

    Abramson, David M.; Grattan, Lynn M.; Mayer, Brian; Colten, Craig E.; Arosemena, Farah A.; Rung, Ariane; Lichtveld, Maureen

    2014-01-01

    A number of governmental agencies have called for enhancing citizen’s resilience as a means of preparing populations in advance of disasters, and as a counter-balance to social and individual vulnerabilities. This increasing scholarly, policy and programmatic interest in promoting individual and communal resilience presents a challenge to the research and practice communities: to develop a translational framework that can accommodate multi-disciplinary scientific perspectives into a single, applied model. The Resilience Activation Framework provides a basis for testing how access to social resources, such as formal and informal social support and help, promotes positive adaptation or reduced psychopathology among individuals and communities exposed to the acute collective stressors associated with disasters, whether manmade, natural, or technological in origin. Articulating the mechanisms by which access to social resources activate and sustain resilience capacities for optimal mental health outcomes post-disaster can lead to the development of effective preventive and early intervention programs. PMID:24870399

  12. The resilience activation framework: a conceptual model of how access to social resources promotes adaptation and rapid recovery in post-disaster settings.

    PubMed

    Abramson, David M; Grattan, Lynn M; Mayer, Brian; Colten, Craig E; Arosemena, Farah A; Bedimo-Rung, Ariane; Lichtveld, Maureen

    2015-01-01

    A number of governmental agencies have called for enhancing citizens' resilience as a means of preparing populations in advance of disasters, and as a counterbalance to social and individual vulnerabilities. This increasing scholarly, policy, and programmatic interest in promoting individual and communal resilience presents a challenge to the research and practice communities: to develop a translational framework that can accommodate multidisciplinary scientific perspectives into a single, applied model. The Resilience Activation Framework provides a basis for testing how access to social resources, such as formal and informal social support and help, promotes positive adaptation or reduced psychopathology among individuals and communities exposed to the acute collective stressors associated with disasters, whether human-made, natural, or technological in origin. Articulating the mechanisms by which access to social resources activate and sustain resilience capacities for optimal mental health outcomes post-disaster can lead to the development of effective preventive and early intervention programs.

  13. A novel connectionist framework for computation of an approximate convex-hull of a set of planar points, circles and ellipses.

    PubMed

    Pal, Srimanta; Bhattacharya, Sabyasachi; Pal, Nikhil R

    2006-02-01

    We propose a two layer neural network for computation of an approximate convex-hull of a set of points or a set of circles/ellipses of different sizes. The algorithm is based on a very elegant concept - shrinking of a rubber band surrounding the set of planar objects. Logically, a set of neurons is placed on a circle (rubber band) surrounding the objects. Each neuron has a parameter vector associated with it. This may be viewed as the current position of the neuron. The given set of points/objects exerts a force of attraction on every neuron, which determines how its current position will be updated (as if, the force determines the direction of movement of the neuron lying on the rubber band). As the network evolves, the neurons (parameter vectors) approximate the convex-hull more and more accurately. The scheme can be applied to find the convex-hull of a planar set of circles or ellipses or a mixture of the two. Some properties related to the evolution of the algorithm are also presented.
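    The rubber-band intuition can be sketched very loosely as follows: neurons start on an enclosing circle and repeatedly step toward the input point nearest to them, so the band contracts onto the point set and interior points attract no neurons. This is an illustration of the shrinking idea only, not the paper's actual attraction-force update rule:

```python
import math

def rubber_band(points, n_neurons=16, steps=300, eta=0.1):
    """Neurons on an enclosing circle contract toward a planar point set.

    Each neuron takes a small step toward its nearest input point; since
    neurons start outside the hull, they settle onto extreme points,
    tracing an approximate convex hull (loose sketch of the concept).
    """
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    radius = 2.0 * max(math.hypot(p[0] - cx, p[1] - cy) for p in points)
    neurons = [(cx + radius * math.cos(2 * math.pi * i / n_neurons),
                cy + radius * math.sin(2 * math.pi * i / n_neurons))
               for i in range(n_neurons)]
    for _ in range(steps):
        new = []
        for (nx, ny) in neurons:
            tx, ty = min(points, key=lambda p: math.hypot(p[0] - nx, p[1] - ny))
            new.append((nx + eta * (tx - nx), ny + eta * (ty - ny)))
        neurons = new
    return neurons

# Unit square corners plus one interior point (illustrative data).
square = [(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5)]
band = rubber_band(square)
```

    On this input every neuron converges to a corner and none to the interior point, which is the hull-extraction behavior the network is designed for; the paper's scheme generalizes the same idea to circles and ellipses.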

  14. Design of a pseudo-log image transform hardware accelerator in a high-level synthesis-based memory management framework

    NASA Astrophysics Data System (ADS)

    Butt, Shahzad Ahmad; Mancini, Stéphane; Rousseau, Frédéric; Lavagno, Luciano

    2014-09-01

    The pseudo-log image transform belongs to a class of image processing kernels that generate memory references which are nonlinear functions of loop indices. Due to the nonlinearity of the memory references, the usual design methodologies do not allow efficient hardware implementation for nonlinear kernels. For optimized hardware implementation, these kernels require the creation of a customized memory hierarchy and efficient data/memory management strategy. We present the design and real-time hardware implementation of a pseudo-log image transform IP (hardware image processing engine) using a memory management framework. The framework generates a controller which efficiently manages input data movement in the form of tiles between off-chip main memory, on-chip memory, and the core processing unit. The framework can jointly optimize the memory hierarchy and the tile computation schedule to reduce on-chip memory requirements, to maximize throughput, and to increase data reuse for reducing off-chip memory bandwidth requirements. The algorithmic C++ description of the pseudo-log kernel is profiled in the framework to generate an enhanced description with a customized memory hierarchy. The enhanced description of the kernel is then used for high-level synthesis (HLS) to perform architectural design space exploration in order to find an optimal implementation under given performance constraints. The optimized register transfer level implementation of the IP generated after HLS is used for performance estimation. The performance estimation is done in a simulation framework to characterize the IP with different external off-chip memory latencies and a variety of data transfer policies. Experimental results show that the designed IP can be used for real-time implementation and that the generated memory hierarchy is capable of feeding the IP with a sufficiently high bandwidth even in the presence of long external memory latencies.
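    The tiling strategy described here can be illustrated in software: produce the output one tile at a time, first computing the bounding box of input pixels that the nonlinear mapping touches (the window a memory controller would prefetch on-chip), then generating the tile from that window alone. The log-like index function below is a stand-in for the actual pseudo-log transform, and the sizes are arbitrary:

```python
import math

def pseudo_log_coords(i, j, scale):
    """Nonlinear (log-like) input coordinates for output pixel (i, j).

    A stand-in for the pseudo-log mapping: any nonlinear index function
    illustrates why flat row-major prefetching fails for such kernels.
    """
    return int(scale * math.log1p(i)), int(scale * math.log1p(j))

def transform_tiled(src, out_h, out_w, tile=8, scale=10.0):
    """Compute the output tile by tile from per-tile input windows."""
    out = [[0] * out_w for _ in range(out_h)]
    for ti in range(0, out_h, tile):
        for tj in range(0, out_w, tile):
            ih, jh = min(ti + tile, out_h), min(tj + tile, out_w)
            # Bounding box of the input pixels this tile needs.
            coords = [pseudo_log_coords(i, j, scale)
                      for i in range(ti, ih) for j in range(tj, jh)]
            r0, r1 = min(c[0] for c in coords), max(c[0] for c in coords)
            c0, c1 = min(c[1] for c in coords), max(c[1] for c in coords)
            window = [row[c0:c1 + 1] for row in src[r0:r1 + 1]]  # "on-chip" copy
            for i in range(ti, ih):
                for j in range(tj, jh):
                    r, c = pseudo_log_coords(i, j, scale)
                    out[i][j] = window[r - r0][c - c0]
    return out

src = [[(r * 31 + c) % 97 for c in range(64)] for r in range(64)]
out = transform_tiled(src, out_h=32, out_w=32)
```

    The framework in the paper automates exactly this analysis (plus scheduling and reuse optimization) when generating the memory hierarchy for the synthesized IP.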

  15. The role of the basis set and the level of quantum mechanical theory in the prediction of the structure and reactivity of cisplatin.

    PubMed

    Paschoal, Diego; Marcial, Bruna L; Lopes, Juliana Fedoce; De Almeida, Wagner B; Dos Santos, Hélio F

    2012-11-05

    In this article, we conducted an extensive ab initio study on the importance of the level of theory and the basis set for theoretical predictions of the structure and reactivity of cisplatin [cis-diamminedichloroplatinum(II) (cDDP)]. Initially, the role of the basis set for the Pt atom was assessed using 24 different basis sets, including three all-electron basis sets (ABS). In addition, a modified all-electron double zeta polarized basis set (mDZP) was proposed by adding a set of diffuse d functions onto the existing DZP basis set. The energy barrier and the rate constant for the first chloride/water exchange ligand process, namely, the aquation reaction, were taken as benchmarks for which reliable experimental data are available. At the B3LYP/mDZP/6-31+G(d) level (the first basis set is for Pt and the last set is for all of the light atoms), the energy barrier was 22.8 kcal mol⁻¹, which is in agreement with the average experimental value, 22.9 ± 0.4 kcal mol⁻¹. For the other accessible ABS (DZP and ADZP), the corresponding values were 15.4 and 24.5 kcal mol⁻¹, respectively. The ADZP and mDZP are notably similar, raising the importance of diffuse d functions for the prediction of the kinetic properties of cDDP. In this article, we also analyze the ligand basis set and the level of theory effects by considering 36 basis sets at distinct levels of theory, namely, Hartree-Fock, MP2, and several DFT functionals. From a survey of the data, we recommend the mPW1PW91/mDZP/6-31+G(d) or B3PW91/mDZP/6-31+G(d) levels to describe the structure and reactivity of cDDP and its small derivatives. Conversely, for large molecules containing a cisplatin motif (for example, the cDDP-DNA complex), the lower levels B3LYP/LANL2DZ/6-31+G(d) and B3LYP/SBKJC-VDZ/6-31+G(d) are suggested. At these levels of theory, the predicted energy barrier was 26.0 and 25.9 kcal mol⁻¹, respectively, which is only 13% higher than the actual value.
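    The link between the computed barrier and the aquation rate constant used as a benchmark is transition-state theory. A quick Eyring-equation sketch, taking the reported 22.8 kcal mol⁻¹ barrier; note the simplifying assumptions that the barrier approximates the free energy of activation and that T = 298.15 K:

```python
import math

KB = 1.380649e-23     # Boltzmann constant, J/K
H = 6.62607015e-34    # Planck constant, J*s
R_KCAL = 1.987204e-3  # gas constant, kcal/(mol*K)

def eyring_rate(barrier_kcal, temp_k=298.15):
    """Eyring equation: k = (kB*T/h) * exp(-dG / (R*T)).

    Treats the reported energy barrier as the free energy of activation,
    an approximation made here for illustration.
    """
    return (KB * temp_k / H) * math.exp(-barrier_kcal / (R_KCAL * temp_k))

k_aquation = eyring_rate(22.8)  # barrier from the B3LYP/mDZP result, in s^-1
```

    This also makes the sensitivity concrete: because the barrier enters an exponential, the few-kcal spread among basis sets translates into orders of magnitude in the predicted rate, which is why the choice of basis set matters so much for kinetics.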

  16. End of FY10 report - used fuel disposition technical bases and lessons learned : legal and regulatory framework for high-level waste disposition in the United States.

    SciTech Connect

    Weiner, Ruth F.; Blink, James A.; Rechard, Robert Paul; Perry, Frank; Jenkins-Smith, Hank C.; Carter, Joe; Nutt, Mark; Cotton, Tom

    2010-09-01

    This report examines the current policy, legal, and regulatory framework pertaining to used nuclear fuel and high level waste management in the United States. The goal is to identify potential changes that, if made, could add flexibility and possibly improve the chances of successfully implementing technical aspects of a nuclear waste policy. Experience suggests that the regulatory framework should be established prior to initiating future repository development. Concerning specifics of the regulatory framework, reasonable expectation as the standard of proof was successfully implemented and could be retained in the future; yet, the current classification system for radioactive waste, including hazardous constituents, warrants reexamination. Whether or not multiple sites are considered simultaneously in the future, inclusion of mechanisms such as deliberate use of performance assessment to manage site characterization would be wise. Because of experience gained here and abroad, diversity of geologic media is not particularly necessary as a criterion in site selection guidelines for multiple sites. Stepwise development of the repository program that includes flexibility also warrants serious consideration. Furthermore, integration of the waste management system across storage, transportation, and disposition should be examined and would be facilitated by integration of the legal and regulatory framework. Finally, in order to enhance acceptability of future repository development, the national policy should be cognizant of those policy and technical attributes that enhance initial acceptance, and those policy and technical attributes that maintain and broaden credibility.

  17. Multiscale level-set method for accurate modeling of immiscible two-phase flow with deposited thin films on solid surfaces

    NASA Astrophysics Data System (ADS)

    Abu-Al-Saud, Moataz O.; Riaz, Amir; Tchelepi, Hamdi A.

    2017-03-01

    We developed a multiscale sharp-interface level-set method for immiscible two-phase flow with a pre-existing thin film on solid surfaces. The lubrication approximation theory is used to model the thin-film equation efficiently. The incompressible Navier-Stokes, level-set, and thin-film evolution equations are coupled sequentially to capture the dynamics occurring at multiple length scales. The Hamilton-Jacobi level-set reinitialization is employed to construct the signed-distance function, which takes into account the deposited thin film on the solid surface. The proposed multiscale method is validated and shown to match the augmented Young-Laplace equation for a static meniscus in a capillary tube. Viscous bending of the advancing interface over the precursor film is captured by the proposed level-set method and agrees with the Cox-Voinov theory. The advancing bubble surrounded by a wetting film inside a capillary tube is considered, and the predicted film thickness compares well with both theory and experiments. We also demonstrate that the multiscale level-set approach can model immiscible two-phase flow with a capillary number as low as 10^-6.
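    The Hamilton-Jacobi reinitialization step mentioned in this abstract can be illustrated with a minimal NumPy sketch. This is a generic first-order Godunov upwind scheme for the reinitialization PDE phi_t = sign(phi0)(1 - |grad phi|), not the authors' implementation (which additionally accounts for the deposited thin film); the grid, time step, and periodic boundary handling via np.roll are illustrative assumptions.

    ```python
    import numpy as np

    def reinitialize(phi, dx=1.0, iters=100):
        """Drive a level-set function toward a signed-distance function by
        iterating phi_t = sign(phi0) * (1 - |grad phi|) with first-order
        Godunov upwinding. Boundaries are periodic via np.roll, which is
        acceptable here because information flows outward from the interface."""
        phi = phi.astype(float).copy()
        # smoothed sign of the *initial* field, frozen for all iterations
        s = phi / np.sqrt(phi**2 + dx**2)
        dt = 0.5 * dx  # CFL-stable time step for unit propagation speed
        for _ in range(iters):
            a = (phi - np.roll(phi, 1, axis=0)) / dx   # backward diff in x
            b = (np.roll(phi, -1, axis=0) - phi) / dx  # forward diff in x
            c = (phi - np.roll(phi, 1, axis=1)) / dx   # backward diff in y
            d = (np.roll(phi, -1, axis=1) - phi) / dx  # forward diff in y
            # Godunov's scheme picks the upwind one-sided gradient magnitude
            grad_pos = np.sqrt(np.maximum(np.maximum(a, 0)**2, np.minimum(b, 0)**2)
                               + np.maximum(np.maximum(c, 0)**2, np.minimum(d, 0)**2))
            grad_neg = np.sqrt(np.maximum(np.minimum(a, 0)**2, np.maximum(b, 0)**2)
                               + np.maximum(np.minimum(c, 0)**2, np.maximum(d, 0)**2))
            grad = np.where(s > 0, grad_pos, grad_neg)
            phi -= dt * s * (grad - 1.0)
        return phi

    # demo: zero set is a circle of radius 0.5, but phi0 is NOT a distance function
    n = 65
    x = np.linspace(-1.0, 1.0, n)
    X, Y = np.meshgrid(x, x, indexing="ij")
    dx = x[1] - x[0]
    phi0 = X**2 + Y**2 - 0.25
    phi = reinitialize(phi0, dx=dx, iters=100)
    ```

    After reinitialization the field approximates the true signed distance (about -0.5 at the circle's center), and |grad phi| is close to 1 away from the interface.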

  18. The relative noise levels of parallel axis gear sets with various contact ratios and gear tooth forms

    NASA Astrophysics Data System (ADS)

    Drago, Raymond J.; Lenski, Joseph W., Jr.; Spencer, Robert H.; Valco, Mark; Oswald, Fred B.

    1993-12-01

    The real noise-reduction benefit that may be obtained by using one gear tooth form rather than another is an important design parameter for any geared system, especially for helicopters, in which both weight and reliability are very important factors. This paper describes the design and testing of nine sets of gears that are as identical as possible except for their basic tooth geometry. Noise measurements were made at various combinations of load and speed for each gear set so that direct comparisons could be made. The resulting data were analyzed so that valid conclusions could be drawn and interpreted for design use.

  19. The relative noise levels of parallel axis gear sets with various contact ratios and gear tooth forms

    NASA Technical Reports Server (NTRS)

    Drago, Raymond J.; Lenski, Joseph W., Jr.; Spencer, Robert H.; Valco, Mark; Oswald, Fred B.

    1993-01-01

    The real noise-reduction benefit that may be obtained by using one gear tooth form rather than another is an important design parameter for any geared system, especially for helicopters, in which both weight and reliability are very important factors. This paper describes the design and testing of nine sets of gears that are as identical as possible except for their basic tooth geometry. Noise measurements were made at various combinations of load and speed for each gear set so that direct comparisons could be made. The resulting data were analyzed so that valid conclusions could be drawn and interpreted for design use.

  20. An application-dependent framework for the recognition of high-level surgical tasks in the OR.

    PubMed

    Lalys, Florent; Riffaud, Laurent; Bouget, David; Jannin, Pierre

    2011-01-01

    Surgical process analysis and modeling is a recent and important topic aimed at introducing a new generation of computer-assisted surgical systems. Among the techniques already in use for extracting data from the operating room, image video offers a way to automate assistance to the surgeon without altering the surgical routine. In this paper we propose an application-dependent framework that automatically extracts the phases of a surgery using only microscope videos as input data and that can be adapted to different surgical specialties. First, four distinct types of classifiers based on image processing were implemented to extract visual cues from video frames. Each classifier was related to one kind of visual cue: visual cues recognizable through color were detected with a color-histogram approach; for shape-oriented visual cues we trained a Haar classifier; for texture-oriented visual cues we used a bag-of-words approach with SIFT descriptors; and for all other visual cues we used a classical image-classification pipeline comprising feature extraction, feature selection, and supervised classification. The extraction of this semantic vector for each video frame then permitted classification of the time series using either Hidden Markov Models or Dynamic Time Warping. The framework was validated on cataract surgeries, obtaining accuracies of 95%.
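    The Dynamic Time Warping step in this abstract's pipeline can be illustrated with a minimal nearest-template classifier. This is a generic sketch, not the paper's implementation: the labels and 1-D sequences below are hypothetical stand-ins for the per-frame visual-cue vectors the framework actually produces.

    ```python
    import numpy as np

    def dtw_distance(a, b):
        """Dynamic Time Warping distance between two 1-D sequences,
        computed by filling the classic cumulative-cost table."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                # best of match, insertion, or deletion
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    def classify(sequence, templates):
        """Assign the label of the template closest to the sequence under DTW."""
        return min(templates, key=lambda label: dtw_distance(sequence, templates[label]))

    # hypothetical phase templates; a query that rises, with a small tempo variation
    templates = {"rising": [0, 1, 2, 3, 4], "falling": [4, 3, 2, 1, 0]}
    label = classify([0, 1, 1, 2, 3, 4], templates)  # -> "rising"
    ```

    DTW tolerates local stretching or compression in time, which is why it suits surgical phases whose durations vary between interventions.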