Science.gov

Sample records for level set framework

  1. An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs

    PubMed Central

    Mosaliganti, Kishore R.; Gelas, Arnaud; Megason, Sean G.

    2013-01-01

    In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations, which restricts future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK) v4 architecture, we implemented a level-set software design that is flexible to different numerical (continuous, discrete, and sparse) and grid representations (point, mesh, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile-time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g., gradients and Hessians) across multiple terms are eliminated. The framework further enables the evolution of multiple level-sets for multi-object segmentation and processing of large datasets. To do so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level-set in real-time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a developing zebrafish
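
    A minimal Python sketch of the container-of-terms idea described in this record (illustrative only; the function names, weights and the simple explicit update are assumptions, not the ITK v4 interface): each PDE term is a callable held in a list, and the evolution sums the contributions of whatever terms are currently in the container.

      import numpy as np

      def curvature_term(phi, weight=0.2):
          # Smoothing term: a simple Laplacian stands in for mean-curvature flow.
          return weight * (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
                           np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4.0 * phi)

      def propagation_term(phi, speed=0.5):
          # Constant propagation scaled by |grad(phi)|; sign chosen so the zero
          # level set expands outward.
          gy, gx = np.gradient(phi)
          return -speed * np.sqrt(gx**2 + gy**2)

      def evolve(phi, terms, dt=0.1, n_iter=50):
          # Terms can be appended to or removed from the list between calls,
          # mirroring the container-based design described in the abstract.
          for _ in range(n_iter):
              phi = phi + dt * sum(term(phi) for term in terms)
          return phi

      x, y = np.meshgrid(np.linspace(-1, 1, 128), np.linspace(-1, 1, 128))
      phi0 = np.sqrt(x**2 + y**2) - 0.5     # zero level set: circle of radius 0.5
      phi = evolve(phi0, [curvature_term, propagation_term])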

  2. Setting Healthcare Priorities at the Macro and Meso Levels: A Framework for Evaluation

    PubMed Central

    Barasa, Edwine W.; Molyneux, Sassy; English, Mike; Cleary, Susan

    2015-01-01

    Background: Priority setting in healthcare is a key determinant of health system performance. However, there is no widely accepted priority setting evaluation framework. We reviewed literature with the aim of developing and proposing a framework for the evaluation of macro and meso level healthcare priority setting practices. Methods: We systematically searched Econlit, PubMed, CINAHL, and EBSCOhost databases and supplemented this with searches in Google Scholar, relevant websites and reference lists of relevant papers. A total of 31 papers on evaluation of priority setting were identified. These were supplemented by broader theoretical literature related to evaluation of priority setting. A conceptual review of selected papers was undertaken. Results: Based on a synthesis of the selected literature, we propose an evaluative framework that requires that priority setting practices at the macro and meso levels of the health system meet the following conditions: (1) Priority setting decisions should incorporate both efficiency and equity considerations as well as the following outcomes: (a) Stakeholder satisfaction, (b) Stakeholder understanding, (c) Shifted priorities (reallocation of resources), and (d) Implementation of decisions. (2) Priority setting processes should also meet the procedural conditions of (a) Stakeholder engagement, (b) Stakeholder empowerment, (c) Transparency, (d) Use of evidence, (e) Revisions, (f) Enforcement, and (g) Being grounded on community values. Conclusion: Available frameworks for the evaluation of priority setting are mostly grounded on procedural requirements, while few have included outcome requirements. There is, however, increasing recognition of the need to incorporate both consequential and procedural considerations in priority setting practices. In this review, we adopt an integrative approach to develop and propose a framework for the evaluation of priority setting practices at the macro and meso levels that draws from these

  3. A unified variational segmentation framework with a level-set based sparse composite shape prior

    NASA Astrophysics Data System (ADS)

    Liu, Wenyang; Ruan, Dan

    2015-03-01

    Image segmentation plays an essential role in many medical applications. Low SNR conditions and various artifacts make its automation challenging. To achieve robust and accurate segmentation results, a good approach is to introduce proper shape priors. In this study, we present a unified variational segmentation framework that regularizes the target shape with a level-set based sparse composite prior. When the variational problem is solved with a block minimization/descent scheme, the regularizing impact of the sparse composite prior can be observed to adjust to the most recent shape estimate, and may be interpreted as a ‘dynamic’ shape prior, yet without compromising convergence thanks to the unified energy framework. The proposed method was applied to segment corpus callosum from 2D MR images and liver from 3D CT volumes. Its performance was evaluated using Dice Similarity Coefficient and Hausdorff distance, and compared with two benchmark level-set based segmentation methods. The proposed method has achieved statistically significant higher accuracy in both experiments and avoided faulty inclusion/exclusion of surrounding structures with similar intensities, as opposed to the benchmark methods.

  4. A Unified Variational Segmentation Framework with a Level-set based Sparse Composite Shape Prior

    PubMed Central

    Liu, Wenyang; Ruan, Dan

    2015-01-01

    Image segmentation plays an essential role in many medical applications. Low SNR conditions and various artifacts make its automation challenging. To achieve robust and accurate segmentation results, a good approach is to introduce proper shape priors. In this study, we present a unified variational segmentation framework that regularizes the target shape with a level-set based sparse composite prior. When the variational problem is solved with a block minimization/descent scheme, the regularizing impact of the sparse composite prior can be observed to adjust to the most recent shape estimate, and may be interpreted as a “dynamic” shape prior, yet without compromising convergence thanks to the unified energy framework. The proposed method was applied to segment corpus callosum from 2D MR images and liver from 3D CT volumes. Its performance was evaluated using Dice Similarity Coefficient and Hausdorff distance, and compared with two benchmark level-set based segmentation methods. The proposed method has achieved statistically significant higher accuracy in both experiments and avoided faulty inclusion/exclusion of surrounding structures with similar intensities, as opposed to the benchmark methods. PMID:25668234

  5. A multi-phase level set framework for source reconstruction in bioluminescence tomography

    SciTech Connect

    Huang Heyu; Qu Xiaochao; Liang Jimin; He Xiaowei; Chen Xueli; Yang Da'an; Tian Jie

    2010-07-01

    We propose a novel multi-phase level set algorithm for solving the inverse problem of bioluminescence tomography. The distribution of the unknown interior source is considered piecewise constant and is represented using multiple level set functions. The localization of the interior bioluminescence source is implemented by tracing the evolution of the level set functions. An alternate search scheme is incorporated to ensure the global optimality of the reconstruction. Both numerical and physical experiments are performed to evaluate the developed level set reconstruction method. Reconstruction results show that the proposed method can stably resolve the interior source of bioluminescence tomography.
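
    As a hedged illustration of how a piecewise-constant source can be represented with multiple level set functions (written here in the style of standard multi-phase formulations; the abstract does not give the paper's exact parameterization), two level set functions phi_1 and phi_2 together with the Heaviside function H split the domain into up to four constant-valued phases:

      q(x) = c_{11}\,H(\phi_1)\,H(\phi_2) + c_{10}\,H(\phi_1)\,[1-H(\phi_2)]
           + c_{01}\,[1-H(\phi_1)]\,H(\phi_2) + c_{00}\,[1-H(\phi_1)]\,[1-H(\phi_2)]

    so that n level set functions encode up to 2^n constant regions, and tracing the zero level sets of phi_1 and phi_2 localizes the support of the source.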

  6. A coupled level-set framework for bladder wall segmentation with application to MRI-based virtual cystoscopy

    NASA Astrophysics Data System (ADS)

    Duan, Chaijie; Bao, Shanglian; Liang, Zhengrong

    2009-02-01

    In this paper, we propose a coupled level-set framework for segmentation of the bladder wall using T1-weighted magnetic resonance (MR) images. The segmentation results will be used for non-invasive MR-based virtual cystoscopy (VCys). The framework uses two level-set functions to segment the inner and outer borders of the bladder wall, respectively. Based on the Chan-Vese (C-V) model, a local adaptive fitting (LAF) image energy is introduced to capture local intensity contrast. Compared with previous work, our method has the following advantages. First, unlike most other work, which segments only a single bladder boundary rather than the inner and outer borders separately, our method extracts both the inner and the outer border of the bladder wall automatically. Second, we focus on T1-weighted MR images, which decrease the image intensity of the urine and therefore minimize the partial volume effect (PVE) on the bladder wall for detection of abnormalities on the mucosa layer, in contrast to work on CT images and T2-weighted MR images, which enhance the intensity of the urine and suffer from the PVE. In addition, T1-weighted MR images provide the best tissue contrast for detection of the outer border of the bladder wall. Since MR images tend to be inhomogeneous and have ghost artifacts due to motion and other causes, as compared to computed tomography (CT)-based VCys, our framework makes it easy to control the geometric properties of the level-set functions to mitigate the influence of inhomogeneity and ghosts. Finally, a variety of geometric parameters, such as the thickness of the bladder wall, can be measured easily within the level-set framework. These parameters are clinically important for VCys. The segmentation results were evaluated by experienced radiologists, whose feedback strongly demonstrated the usefulness of such a coupled level-set framework for VCys.

  7. Improving district level health planning and priority setting in Tanzania through implementing accountability for reasonableness framework: Perceptions of stakeholders

    PubMed Central

    2010-01-01

    Background In 2006, researchers and decision-makers launched a five-year project - Response to Accountable Priority Setting for Trust in Health Systems (REACT) - to improve planning and priority-setting through implementing the Accountability for Reasonableness framework in Mbarali District, Tanzania. The objective of this paper is to explore the acceptability of Accountability for Reasonableness from the perspectives of the Council Health Management Team, local government officials, health workforce and members of user boards and committees. Methods Individual interviews were carried out with different categories of actors and stakeholders in the district. The interview guide consisted of a series of questions, asking respondents to describe their perceptions regarding each condition of the Accountability for Reasonableness framework in terms of priority setting. Interviews were analysed using thematic framework analysis. Documentary data were used to support, verify and highlight the key issues that emerged. Results Almost all stakeholders viewed Accountability for Reasonableness as an important and feasible approach for improving priority-setting and health service delivery in their context. However, a few aspects of Accountability for Reasonableness were seen as too difficult to implement given the socio-political conditions and traditions in Tanzania. Respondents mentioned: budget ceilings and guidelines, low level of public awareness, unreliable and untimely funding, as well as the limited capacity of the district to generate local resources as the major contextual factors that hampered the full implementation of the framework in their context. Conclusion This study was one of the first assessments of the applicability of Accountability for Reasonableness in health care priority-setting in Tanzania. The analysis, overall, suggests that the Accountability for Reasonableness framework could be an important tool for improving priority-setting processes in the

  8. A fourth-order accurate curvature computation in a level set framework for two-phase flows subjected to surface tension forces

    NASA Astrophysics Data System (ADS)

    Coquerelle, Mathieu; Glockner, Stéphane

    2016-01-01

    We propose an accurate and robust fourth-order curvature extension algorithm in a level set framework for the transport of the interface. The method is based on the Continuum Surface Force approach, and is shown to efficiently calculate surface tension forces for two-phase flows. In this framework, the accuracy of the algorithms mostly relies on the precise computation of the surface curvature which we propose to accomplish using a two-step algorithm: first by computing a reliable fourth-order curvature estimation from the level set function, and second by extending this curvature rigorously in the vicinity of the surface, following the Closest Point principle. The algorithm is easy to implement and to integrate into existing solvers, and can easily be extended to 3D. We propose a detailed analysis of the geometrical and numerical criteria responsible for the appearance of spurious currents, a well-known phenomenon observed in various numerical frameworks. We study the effectiveness of this novel numerical method on state-of-the-art test cases showing that the resulting curvature estimate significantly reduces parasitic currents. In addition, the proposed approach converges at fourth order with respect to the spatial discretization, two orders of convergence higher than algorithms currently available. We also show the necessity of high-order transport methods for the surface by studying the case of the 2D advection of a column at equilibrium, thereby demonstrating the robustness of the proposed approach. The algorithm is further validated on more complex test cases such as a rising bubble.
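
    The quantity being extended here is the interface curvature kappa = div(grad(phi)/|grad(phi)|). Below is a short Python sketch of a baseline second-order finite-difference estimate, for orientation only; the fourth-order estimate and the Closest Point extension described in the abstract are deliberately not reproduced.

      import numpy as np

      def curvature(phi, h, eps=1e-12):
          # kappa = div( grad(phi) / |grad(phi)| ) with second-order central differences.
          dphi_y, dphi_x = np.gradient(phi, h)
          norm = np.sqrt(dphi_x**2 + dphi_y**2) + eps
          nx, ny = dphi_x / norm, dphi_y / norm
          dnx_dy, dnx_dx = np.gradient(nx, h)
          dny_dy, dny_dx = np.gradient(ny, h)
          return dnx_dx + dny_dy

      # Sanity check: for the signed distance of a unit circle, kappa ~ 1 on the interface.
      x, y = np.meshgrid(np.linspace(-2, 2, 201), np.linspace(-2, 2, 201))
      phi = np.sqrt(x**2 + y**2) - 1.0
      print(curvature(phi, h=4.0 / 200)[100, 150])   # point (x=1, y=0): expect ~1.0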

  9. Monitoring street-level spatial-temporal variations of carbon monoxide in urban settings using a wireless sensor network (WSN) framework.

    PubMed

    Wen, Tzai-Hung; Jiang, Joe-Air; Sun, Chih-Hong; Juang, Jehn-Yih; Lin, Tzu-Shiang

    2013-12-01

    Air pollution has become a severe environmental problem due to urbanization and heavy traffic. Monitoring street-level air quality is an important issue, but most official monitoring stations are installed to monitor large-scale air quality conditions, and their limited spatial resolution cannot reflect the detailed variations in air quality that may be induced by traffic jams. By deploying wireless sensors on crossroads and main roads, this study established a pilot framework for a wireless sensor network (WSN)-based real-time monitoring system to understand street-level spatial-temporal changes of carbon monoxide (CO) in urban settings. The system consists of two major components. The first component is the deployment of wireless sensors. We deployed 44 sensor nodes, 40 transmitter nodes and four gateway nodes in this study. Each sensor node includes a signal processing module, a CO sensor and a wireless communication module. In order to capture realistic human exposure to traffic pollutants, all sensors were deployed at a height of 1.5 m on lampposts and traffic signs. The study area covers a total length of 1.5 km of Keelung Road in Taipei City. The other component is a map-based monitoring platform for sensor data visualization and manipulation in time and space. Using this intensive real-time street-level monitoring framework, we compared the spatial-temporal patterns of air pollution in different time periods. Our results capture four CO concentration peaks throughout the day at the monitored location, which was along an arterial road and near a traffic sign. The hourly average could reach 5.3 ppm from 5:00 pm to 7:00 pm due to traffic congestion. The proposed WSN-based framework captures detailed ground information and the potential risk of human exposure to traffic-related air pollution. It also provides street-level insights into real-time monitoring for further early warning of air pollution and urban environmental management. PMID:24287859

  10. Monitoring Street-Level Spatial-Temporal Variations of Carbon Monoxide in Urban Settings Using a Wireless Sensor Network (WSN) Framework

    PubMed Central

    Wen, Tzai-Hung; Jiang, Joe-Air; Sun, Chih-Hong; Juang, Jehn-Yih; Lin, Tzu-Shiang

    2013-01-01

    Air pollution has become a severe environmental problem due to urbanization and heavy traffic. Monitoring street-level air quality is an important issue, but most official monitoring stations are installed to monitor large-scale air quality conditions, and their limited spatial resolution cannot reflect the detailed variations in air quality that may be induced by traffic jams. By deploying wireless sensors on crossroads and main roads, this study established a pilot framework for a wireless sensor network (WSN)-based real-time monitoring system to understand street-level spatial-temporal changes of carbon monoxide (CO) in urban settings. The system consists of two major components. The first component is the deployment of wireless sensors. We deployed 44 sensor nodes, 40 transmitter nodes and four gateway nodes in this study. Each sensor node includes a signal processing module, a CO sensor and a wireless communication module. In order to capture realistic human exposure to traffic pollutants, all sensors were deployed at a height of 1.5 m on lampposts and traffic signs. The study area covers a total length of 1.5 km of Keelung Road in Taipei City. The other component is a map-based monitoring platform for sensor data visualization and manipulation in time and space. Using this intensive real-time street-level monitoring framework, we compared the spatial-temporal patterns of air pollution in different time periods. Our results capture four CO concentration peaks throughout the day at the monitored location, which was along an arterial road and near a traffic sign. The hourly average could reach 5.3 ppm from 5:00 pm to 7:00 pm due to traffic congestion. The proposed WSN-based framework captures detailed ground information and the potential risk of human exposure to traffic-related air pollution. It also provides street-level insights into real-time monitoring for further early warning of air pollution and urban environmental management. PMID:24287859

  11. Optimization from design rules, source and mask, to full chip with a single computational lithography framework: level-set-methods-based inverse lithography technology (ILT)

    NASA Astrophysics Data System (ADS)

    Pang, Linyong; Peng, Danping; Hu, Peter; Chen, Dongxue; Cecil, Tom; He, Lin; Xiao, Guangming; Tolani, Vikram; Dam, Thuc; Baik, Ki-Ho; Gleason, Bob

    2010-04-01

    For semiconductor manufacturers moving toward advanced technology nodes - 32nm, 22nm and below - lithography presents a great challenge, because it is fundamentally constrained by basic principles of optical physics. Because no major lithography hardware improvements are expected over the next couple of years, Computational Lithography has been recognized by the industry as the key technology needed to drive lithographic performance. This implies not only simultaneous co-optimization of all the lithographic enhancement tricks that have been learned over the years, but also that they be pushed to the limit by powerful computational techniques and systems. In this paper, a single computational lithography framework for design, mask, and source co-optimization will be explained in non-mathematical language. A number of memory and logic device results at the 32nm node and below are presented to demonstrate the benefits of Level-Set-Method-based ILT in applications covering design rule optimization, SMO, and full-chip correction.

  12. A framework and a set of tools called Nutting models to estimate retention capacities and loads of nitrogen and phosphorus in rivers at catchment and national level (France)

    NASA Astrophysics Data System (ADS)

    Legeay, Pierre-Louis; Moatar, Florentina; Dupas, Rémi; Gascuel-Odoux, Chantal

    2016-04-01

    The Nutting-N and Nutting-P models (Dupas et al., 2013, 2015) have been developed to estimate nitrogen and phosphorus nonpoint-source emissions to surface water using readily available data. These models were inspired by the US model SPARROW (Smith et al., 1997) and the European model GREEN (Grizzetti et al., 2008), i.e. statistical approaches that link nitrogen and phosphorus surpluses to catchments' land and river characteristics in order to estimate the catchments' relative retention capacities. The nutrient load (L) at the outlet of each catchment is expressed as: L=R*(B*DS+PS) [1], where DS is diffuse sources (i.e. surplus in kg.ha-1.yr-1 for N, and P storage in soil for P), PS is point sources of domestic and industrial origin (kg.ha-1.yr-1), and R and B are the river system and basin reduction factors, respectively; both combine observed variables and calibrated parameters. The model was calibrated on independent catchments for the 2005-2009 and 2008-2012 periods. Variables were selected according to the Bayesian Information Criterion (BIC) in order to optimize the predictive performance of the models. From these basic models, different improvements have been made to build a framework and a set of tools: 1) a routing module has been added in order to improve estimations on 4th- or 5th-order streams, i.e. upscaling the basic Nutting approach; 2) a territorial module, in order to test the models at local scale (from 500 to 5000 km²); 3) a seasonal estimation has been investigated. The basic approach as well as the territorial application will be illustrated. These tools allow water managers to identify areas at risk where high nutrient loads are estimated, as well as areas where retention is potentially high and can buffer high nutrient sources. References Dupas R., Curie F., Gascuel-Odoux C., Moatar F., Delmas M., Parnaudeau, V., Durand P., 2013. Assessing N emissions in surface water at the national level: Comparison of country-wide vs. regionalized models. Science of the Total Environment
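
    A direct transcription of equation [1] above as a small Python helper (the numeric values in the final line are purely illustrative, not calibrated Nutting parameters):

      def nutrient_load(ds, ps, basin_reduction, river_reduction):
          """Load L at the catchment outlet, eq. [1]: L = R * (B * DS + PS).

          ds  -- diffuse sources (N surplus, or P storage in soil), kg.ha-1.yr-1
          ps  -- point sources of domestic and industrial origin, kg.ha-1.yr-1
          basin_reduction -- B, basin reduction factor
          river_reduction -- R, river system reduction factor
          """
          return river_reduction * (basin_reduction * ds + ps)

      print(nutrient_load(ds=35.0, ps=2.0, basin_reduction=0.4, river_reduction=0.8))  # 12.8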

  13. An adaptive level set method

    SciTech Connect

    Milne, R.B.

    1995-12-01

    This thesis describes a new method for the numerical solution of partial differential equations of the parabolic type on an adaptively refined mesh in two or more spatial dimensions. The method is motivated and developed in the context of the level set formulation for the curvature dependent propagation of surfaces in three dimensions. In that setting, it realizes the multiple advantages of decreased computational effort, localized accuracy enhancement, and compatibility with problems containing a range of length scales.

  14. High-Level Application Framework for LCLS

    SciTech Connect

    Chu, P; Chevtsov, S.; Fairley, D.; Larrieu, C.; Rock, J.; Rogind, D.; White, G.; Zalazny, M.; /SLAC

    2008-04-22

    A framework for high-level accelerator application software is being developed for the Linac Coherent Light Source (LCLS). The framework is based on plug-in technology developed by an open source project, Eclipse. Many existing functionalities provided by Eclipse are available to high-level applications written within this framework. The framework also contains static data storage configuration and dynamic data connectivity. Because the framework is Eclipse-based, it is highly compatible with other Eclipse plug-ins. The entire infrastructure of the software framework is presented. Planned applications and plug-ins based on the framework are also presented.

  15. A Framework for Describing Interlanguages in Multilingual Settings.

    ERIC Educational Resources Information Center

    Tenjoh-Okwen, Thomas

    1989-01-01

    Outlines a contrastive analysis model and a non-contrastive analysis model for studying interlanguage in strictly bilingual settings, and suggests a bidimensional framework, including both linguistic and curricular components, for studying interlanguage in multilingual settings. (21 references) (CB)

  16. Towards a Framework for Change Detection in Data Sets

    NASA Astrophysics Data System (ADS)

    Böttcher, Mirko; Nauck, Detlef; Ruta, Dymitr; Spott, Martin

    Since the world with its markets, innovations and customers is changing faster than ever before, the key to survival for businesses is the ability to detect, assess and respond to changing conditions rapidly and intelligently. Discovering changes and reacting to or acting upon them before others do has therefore become a strategic issue for many companies. However, existing data analysis techniques are insufficient for this task since they typically assume that the domain under consideration is stable over time. This paper presents a framework that detects changes within a data set at virtually any level of granularity. The underlying idea is to derive a rule-based description of the data set at different points in time and to subsequently analyse how these rules change. Nevertheless, further techniques are required to assist the data analyst in interpreting and assessing the changes. Therefore the framework also contains methods to discard rules that are non-drivers for change and to assess the interestingness of detected changes.

  17. Level set method for microfabrication simulations

    NASA Astrophysics Data System (ADS)

    Baranski, Maciej; Kasztelanic, Rafal; Albero, Jorge; Nieradko, Lukasz; Gorecki, Christophe

    2010-05-01

    The article describes the application of the level set method to two different microfabrication processes. The first is shape evolution during reflow of a glass structure. The investigated problem was approximated by viscous flow of the material, so the kinetics of the process were known from a physical model. The second problem is isotropic wet etching of silicon, which is much more complicated because the dynamics of the shape evolution are strongly coupled with time and with the history of the geometry. In the etching simulations the level set method is coupled with the finite element method (FEM), which is used to calculate the etching acid concentration that determines the geometry evolution of the structure. The problem arising from working with FEM on time-varying boundaries was solved with a dynamic mesh technique employing the level set formalism of a higher-dimensional function for geometry description. Isotropic etching was investigated in the context of micro-lens fabrication. The model was compared with experimental data obtained by etching the silicon moulds used for micro-lens fabrication.

  18. Setting dietary intake levels: problems and pitfalls.

    PubMed

    Russell, Robert M

    2007-01-01

    Recommended dietary intake levels are the nutrient standards used in designing food assistance programmes, institutional feeding programmes, counselling and teaching. In the USA, the recommended dietary allowances (RDAs) are the basis for setting the poverty threshold and food stamp allotments. In the 1990s, a new paradigm was put forth for estimating nutrient requirements and recommended intake levels. This considered the level of nutrient needed for normal body functioning (versus the amount needed to prevent a deficiency state from occurring). An estimated average requirement (EAR), an RDA and a tolerable upper intake level (UL) were determined for most nutrients. In setting forth these nutrient intake levels (dietary reference intakes, DRIs), a number of data challenges were encountered. For example, it was recognized that for most nutrients there was an absence of dose-response data, and few chronic human or animal studies had been undertaken. In considering how to revise nutrient intake recommendations for populations in the future, the following pitfalls must be overcome: (1) invalid assumption that a threshold level for a requirement will hold for all nutrients; (2) lack of uniform criteria for the selection of the endpoints used (need for evidence-based review, consideration of comparative risk); (3) invalid extrapolations to children for many nutrients; (4) lack of information on variability of responses, and interactions with other nutrients; and (5) lack of understanding in the community of how to use the various DRI numbers. PMID:17913222

  19. Level Set Segmentation of Lumbar Vertebrae Using Appearance Models

    NASA Astrophysics Data System (ADS)

    Fritscher, Karl; Leber, Stefan; Schmölz, Werner; Schubert, Rainer

    For the planning of surgical interventions of the spine, exact knowledge of the 3D shape and the local bone quality of vertebrae is of great importance in order to estimate the anchorage strength of screws or implants. As a prerequisite for quantitative analysis, a method for objective and therefore automated segmentation of vertebrae is needed. In this paper, a framework for the automatic segmentation of vertebrae using 3D appearance models in a level set framework is presented. In this framework, model information as well as gradient information and probabilities of pixel intensities at object edges in the unseen image are used. The method is tested on 29 lumbar vertebrae, leading to accurate results, which can be useful for surgical planning and further analysis of the local bone quality.

  20. Pulmonary lobe segmentation with level sets

    NASA Astrophysics Data System (ADS)

    Schmidt-Richberg, Alexander; Ehrhardt, Jan; Wilms, Matthias; Werner, René; Handels, Heinz

    2012-02-01

    Automatic segmentation of the separate human lung lobes is a crucial task in computer aided diagnostics and intervention planning, and required for example for determination of disease spreading or pulmonary parenchyma quantification. In this work, a novel approach for lobe segmentation based on multi-region level sets is presented. In a first step, interlobular fissures are detected using a supervised enhancement filter. The fissures are then used to compute a cost image, which is incorporated in the level set approach. By this, the segmentation is drawn to the fissures at places where structure information is present in the image. In areas with incomplete fissures (e.g. due to insufficient image quality or anatomical conditions) the smoothing term of the level sets applies and a closed continuation of the fissures is provided. The approach is tested on nine pulmonary CT scans. It is shown that incorporating the additional force term improves the segmentation significantly. On average, 83% of the left fissure is traced correctly; the right oblique and horizontal fissures are properly segmented to 76% and 48%, respectively.

  1. Setting the stage for master's level success

    NASA Astrophysics Data System (ADS)

    Roberts, Donna

    Comprehensive reading, writing, research, and study skills play a critical role in a graduate student's success and ability to contribute to a field of study effectively. The literature indicated a need to support graduate student success in the areas of mentoring and navigation, as well as research and writing. The purpose of this two-phased mixed-methods explanatory study was to examine factors that characterize student success at the Master's level in the fields of education, sociology and social work. The study was grounded in a transformational learning framework which focused on three levels of learning: technical knowledge, practical or communicative knowledge, and emancipatory knowledge. The study included two data collection points. Phase one consisted of a Master's Level Success questionnaire that was sent via Qualtrics to graduate level students at three colleges and universities in the Central Valley of California: a California State University campus, a University of California campus, and a private college campus. The results of the chi-square analysis indicated that seven questionnaire items were significant with p values less than .05. Phase two of the data collection included semi-structured interviews, from which three themes emerged using Dedoose software: (1) the need for more language and writing support at the Master's level, (2) the need for mentoring, especially for second-language learners, and (3) utilizing the strong influence of faculty in student success. It is recommended that institutions continually assess and strengthen their programs to meet the full range of learners and to support students to degree completion.

  2. Etch Profile Simulation Using Level Set Methods

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    Etching and deposition of materials are critical steps in semiconductor processing for device manufacturing. Both etching and deposition may have isotropic and anisotropic components, due to directional sputtering and redeposition of materials, for example. Previous attempts at modeling profile evolution have used so-called "string theory" to simulate the moving solid-gas interface between the semiconductor and the plasma. One complication of this method is that extensive de-looping schemes are required at the profile corners. We will present a 2D profile evolution simulation using level set theory to model the surface. (1) By embedding the location of the interface in a field variable, the need for de-looping schemes is eliminated and profile corners are more accurately modeled. This level set profile evolution model will calculate both isotropic and anisotropic etch and deposition rates of a substrate in low pressure (10s mTorr) plasmas, considering the incident ion energy angular distribution functions and neutral fluxes. We will present etching profiles of Si substrates in Ar/Cl2 discharges for various incident ion energies and trench geometries.
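
    A compact Python sketch of the underlying idea, assuming the standard level set advection form d(phi)/dt + F|grad(phi)| = 0 with a first-order upwind scheme; the grid, etch rate and time step are illustrative, and the simulation described above additionally accounts for ion energy and angular distributions.

      import numpy as np

      def upwind_grad_norm(phi, h):
          # Godunov upwind approximation of |grad(phi)| for a non-negative etch rate F.
          dxm = (phi - np.roll(phi, 1, axis=1)) / h
          dxp = (np.roll(phi, -1, axis=1) - phi) / h
          dym = (phi - np.roll(phi, 1, axis=0)) / h
          dyp = (np.roll(phi, -1, axis=0) - phi) / h
          return np.sqrt(np.maximum(dxm, 0.0)**2 + np.minimum(dxp, 0.0)**2 +
                         np.maximum(dym, 0.0)**2 + np.minimum(dyp, 0.0)**2)

      def etch(phi, rate, h, dt, steps):
          # rate may vary in space (e.g. reduced flux deep inside a trench); the
          # interface is the zero level set of phi, so no de-looping is required.
          for _ in range(steps):
              phi = phi - dt * rate * upwind_grad_norm(phi, h)
          return phi

      # Example: flat surface (phi = y - 3.0) etched at a uniform rate.
      y, x = np.mgrid[0:64, 0:64] * 0.1
      phi = etch(y - 3.0, rate=1.0, h=0.1, dt=0.02, steps=50)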

  3. Level set based structural topology optimization for minimizing frequency response

    NASA Astrophysics Data System (ADS)

    Shu, Lei; Wang, Michael Yu; Fang, Zongde; Ma, Zhengdong; Wei, Peng

    2011-11-01

    For the purpose of structural vibration reduction, a structural topology optimization for minimizing frequency response is proposed based on the level set method. The objective of the present study is to minimize the frequency response at specified points or surfaces on the structure for an excitation frequency or a frequency range, subject to a given amount of material over the admissible design domain. The sensitivity analysis with respect to the structural boundaries is carried out, while the extended finite element method (X-FEM) is employed for solving the state equation and the adjoint equation. The optimal structure with smooth boundaries is obtained by the level set evolution with advection velocity derived from the sensitivity analysis and the optimization algorithm. A number of numerical examples, in both two dimensions (2D) and three dimensions (3D), are presented to demonstrate the feasibility and effectiveness of the proposed approach.

  4. A probabilistic level set formulation for interactive organ segmentation

    NASA Astrophysics Data System (ADS)

    Cremers, Daniel; Fluck, Oliver; Rousson, Mikael; Aharon, Shmuel

    2007-03-01

    Level set methods have become increasingly popular as a framework for image segmentation. Yet when used as a generic segmentation tool, they suffer from an important drawback: current formulations do not allow much user interaction. Upon initialization, boundaries propagate to the final segmentation without the user being able to guide or correct the segmentation. In the present work, we address this limitation by proposing a probabilistic framework for image segmentation which integrates input intensity information and user interaction on an equal footing. The resulting algorithm determines the most likely segmentation given the input image and the user input. In order to allow user interaction in real time during the segmentation, the algorithm is implemented on a graphics card and in a narrow band formulation.

  5. Framework for State-Level Renewable Energy Market Potential Studies

    SciTech Connect

    Kreycik, C.; Vimmerstedt, L.; Doris, E.

    2010-01-01

    State-level policymakers are relying on estimates of the market potential for renewable energy resources as they set goals and develop policies to accelerate the development of these resources. Therefore, accuracy of such estimates should be understood and possibly improved to appropriately support these decisions. This document provides a framework and next steps for state officials who require estimates of renewable energy market potential. The report gives insight into how to conduct a market potential study, including what supporting data are needed and what types of assumptions need to be made. The report distinguishes between goal-oriented studies and other types of studies, and explains the benefits of each.

  6. Chemically Induced Surface Evolutions with Level Sets

    Energy Science and Technology Software Center (ESTSC)

    2006-11-17

    ChISELS is used for the theoretical modeling of detailed surface chemistry and concomitant surface evolutions occurring during microsystem fabrication processes conducted at low pressures. Examples include physical vapor deposition (PVD), low pressure chemical vapor deposition (PECVD), and plasma etching. Evolving interfaces are represented using the level-set method and the evolution equations time integrated using a Semi-Lagrangian approach. A ballistic transport model is employed to solve for the fluxes incident on each of the surface elements. Surface chemistry leading to etching or deposition is computed by either coupling to Surface Chemkin (a commercially available code) or by providing user defined subroutines. The computational meshes used are quad-trees (2-D) and oct-trees (3-D), constructed such that grid refinement is localized to regions near the surface interfaces. As the interface evolves, the mesh is dynamically reconstructed as needed for the grid to remain fine only around the interface. For parallel computation, a domain decomposition scheme with dynamic load balancing is used to distribute the computational work across processors.

  7. Chemically Induced Surface Evolutions with Level Sets

    SciTech Connect

    2006-11-17

    ChISELS is used for the theoretical modeling of detailed surface chemistry and concomitant surface evolutions occurring during microsystem fabrication processes conducted at low pressures. Examples include physical vapor deposition (PVD), low pressure chemical vapor deposition (PECVD), and plasma etching. Evolving interfaces are represented using the level-set method and the evolution equations time integrated using a Semi-Lagrangian approach. A ballistic transport model is employed to solve for the fluxes incident on each of the surface elements. Surface chemistry leading to etching or deposition is computed by either coupling to Surface Chemkin (a commercially available code) or by providing user defined subroutines. The computational meshes used are quad-trees (2-D) and oct-trees (3-D), constructed such that grid refinement is localized to regions near the surface interfaces. As the interface evolves, the mesh is dynamically reconstructed as needed for the grid to remain fine only around the interface. For parallel computation, a domain decomposition scheme with dynamic load balancing is used to distribute the computational work across processors.

  8. Advanced level set segmentation of the right atrium in MR

    NASA Astrophysics Data System (ADS)

    Chen, Siqi; Kohlberger, Timo; Kirchberg, Klaus J.

    2011-03-01

    Atrial fibrillation is a common heart arrhythmia, and can be effectively treated with ablation. Ablation planning requires 3D models of the patient's left atrium (LA) and/or right atrium (RA), therefore an automatic segmentation procedure to retrieve these models is desirable. In this study, we investigate the use of advanced level set segmentation approaches to automatically segment the RA in magnetic resonance angiographic (MRA) volume images. A low contrast-to-noise ratio makes the boundary between the RA and the nearby structures nearly indistinguishable. Therefore, pure data-driven segmentation approaches such as watershed and Chan-Vese methods are bound to fail. Incorporating training shapes through PCA modeling to constrain the segmentation is one popular solution, and is also used in our segmentation framework. The shape parameters from PCA are optimized with a global histogram based energy model. However, since the shape parameters span a much smaller space, they cannot capture fine details of the shape. Therefore, we employ a second refinement step after the shape based segmentation stage, which follows closely recent work on localized appearance model based techniques. The local appearance model is established through a robust point tracking mechanism and is learned through landmarks embedded on the surface of training shapes. The key contribution of our work is the combination of a statistical shape prior and a localized appearance prior for level set segmentation of the right atrium from MRA. We test this two-step segmentation framework on porcine RA to verify the algorithm.

  9. A level set segmentation for computer-aided dental x-ray analysis

    NASA Astrophysics Data System (ADS)

    Li, Shuo; Fevens, Thomas; Krzyzak, Adam; Li, Song

    2005-04-01

    A level-set-based segmentation framework for Computer-Aided Dental X-ray Analysis (CADXA) is proposed. In this framework, we first employ level set methods to segment the dental X-ray image into three regions: Normal Region (NR), Potential Abnormal Region (PAR), and Abnormal and Background Region (ABR). The segmentation results are then used to build uncertainty maps based on a proposed uncertainty measurement method, and an analysis scheme is applied. The level set segmentation method consists of two stages: a training stage and a segmentation stage. During the training stage, manually chosen representative images are segmented using hierarchical level set region detection. The segmentation results are used to train a support vector machine (SVM) classifier. During the segmentation stage, a dental X-ray image is first classified by the trained SVM. The classifier provides an initial contour which is close to the correct boundary for the coupled level set method, which is then used to further segment the image. Different dental X-ray images are used to test the framework. Experimental results show that the proposed framework achieves faster level set segmentation and provides more detailed information and indications of possible problems to the dentist. To the best of our knowledge, this is one of the first results on CADXA using level set methods.

  10. Priority setting in healthcare: towards guidelines for the program budgeting and marginal analysis framework.

    PubMed

    Peacock, Stuart J; Mitton, Craig; Ruta, Danny; Donaldson, Cam; Bate, Angela; Hedden, Lindsay

    2010-10-01

    Economists' approaches to priority setting focus on the principles of opportunity cost, marginal analysis and choice under scarcity. These approaches are based on the premise that it is possible to design a rational priority setting system that will produce legitimate changes in resource allocation. However, beyond issuing guidance at the national level, economic approaches to priority setting have had only a moderate impact in practice. In particular, local health service organizations - such as health authorities, health maintenance organizations, hospitals and healthcare trusts - have had difficulty implementing evidence from economic appraisals. Yet, in the context of making decisions between competing claims on scarce health service resources, economic tools and thinking have much to offer. The purpose of this article is to describe and discuss ten evidence-based guidelines for the successful design and implementation of a program budgeting and marginal analysis (PBMA) priority setting exercise. PBMA is a framework that explicitly recognizes the need to balance pragmatic and ethical considerations with economic rationality when making resource allocation decisions. While the ten guidelines are drawn from the PBMA framework, they may be generalized across a range of economic approaches to priority setting. PMID:20950070

  11. Efficient molecular surface generation using level-set methods.

    PubMed

    Can, Tolga; Chen, Chao-I; Wang, Yuan-Fang

    2006-12-01

    Molecules interact through their surface residues. Calculation of the molecular surface of a protein structure is thus an important step for a detailed functional analysis. One of the main considerations in comparing existing methods for molecular surface computations is their speed. Most of the methods that produce satisfying results for small molecules fail to do so for large complexes. In this article, we present a level-set-based approach to compute and visualize a molecular surface at a desired resolution. The emerging level-set methods have been used for computing evolving boundaries in several application areas from fluid mechanics to computer vision. Our method provides a uniform framework for computing solvent-accessible, solvent-excluded surfaces and interior cavities. The computation is carried out very efficiently even for very large molecular complexes with tens of thousands of atoms. We compared our method to some of the most widely used molecular visualization tools (Swiss-PDBViewer, PyMol, and Chimera) and our results show that we can calculate and display a molecular surface 1.5-3.14 times faster on average than all three of the compared programs. Furthermore, we demonstrate that our method is able to detect all of the interior inaccessible cavities that can accommodate one or more water molecules. PMID:16621636

  12. Decentralized health care priority-setting in Tanzania: evaluating against the accountability for reasonableness framework.

    PubMed

    Maluka, Stephen; Kamuzora, Peter; San Sebastián, Miguel; Byskov, Jens; Olsen, Øystein E; Shayo, Elizabeth; Ndawi, Benedict; Hurtig, Anna-Karin

    2010-08-01

    Priority-setting has become one of the biggest challenges faced by health decision-makers worldwide. Fairness is a key goal of priority-setting and Accountability for Reasonableness has emerged as a guiding framework for fair priority-setting. This paper describes the processes of setting health care priorities in Mbarali district, Tanzania, and evaluates the descriptions against Accountability for Reasonableness. Key informant interviews were conducted with district health managers, local government officials and other stakeholders using a semi-structured interview guide. Relevant documents were also gathered and group priority-setting in the district was observed. The results indicate that, while Tanzania has a decentralized public health care system, the reality of the district level priority-setting process was that it was not nearly as participatory as the official guidelines suggest it should have been. Priority-setting usually occurred in the context of budget cycles and the process was driven by historical allocation. Stakeholders' involvement in the process was minimal. Decisions (but not the reasoning behind them) were publicized through circulars and notice boards, but there were no formal mechanisms in place to ensure that this information reached the public. There were neither formal mechanisms for challenging decisions nor an adequate enforcement mechanism to ensure that decisions were made in a fair and equitable manner. Therefore, priority-setting in Mbarali district did not satisfy all four conditions of Accountability for Reasonableness; namely relevance, publicity, appeals and revision, and enforcement. This paper aims to make two important contributions to this problematic situation. First, it provides empirical analysis of priority-setting at the district level in the contexts of low-income countries. Second, it provides guidance to decision-makers on how to improve fairness, legitimacy, and sustainability of the priority-setting process. PMID

  13. Beyond SMART? A New Framework for Goal Setting

    ERIC Educational Resources Information Center

    Day, Trevor; Tosey, Paul

    2011-01-01

    This article extends currently reported theory and practice in the use of learning goals or targets with students in secondary and further education. Goal-setting and action-planning constructs are employed in personal development plans (PDPs) and personal learning plans (PLPs) and are advocated as practice within the English national policy…

  14. A Framework for Credit. Framework Guidelines 1. Levels, Credit Value and the Award of Credits.

    ERIC Educational Resources Information Center

    Further Education Unit, London (England).

    This document explores the rationale and technical issues underlying the proposal for a common credit framework in Great Britain. This volume, aimed at senior institutional managers, curriculum managers, and practitioners, offers advice on levels, credit value, and award of credit within the framework proposal. A list of terminology is found at…

  15. A contribution to set a legal framework for biofertilisers.

    PubMed

    Malusá, E; Vassilev, N

    2014-08-01

    The extensive research, production and use of microorganisms to improve plant nutrition have resulted in an inconsistent definition of the term "biofertiliser", which, in some cases, is due to the different microbial mechanisms involved. The rationale for adopting the term biofertiliser is that it derives from "biological fertiliser", which, in turn, implies the use of living microorganisms. Here, we propose a definition for this kind of product which distinguishes it from biostimulants or other inorganic and organic fertilisers. Special emphasis is given to microorganism(s) with multifunctional properties and biofertilisers containing more than one microorganism. This definition could be included in legal provisions regulating registration and marketing requirements. A set of rules is also proposed which could guarantee the quality of biofertilisers present on the market and thus foster their use by farmers. PMID:24903811

  16. Tailoring Healthy Workplace Interventions to Local Healthcare Settings: A Complexity Theory-Informed Workplace of Well-Being Framework

    PubMed Central

    Brand, Sarah L.; Fleming, Lora E.; Wyatt, Katrina M.

    2015-01-01

    Many healthy workplace interventions have been developed for healthcare settings to address the consistently low scores of healthcare professionals on assessments of mental and physical well-being. Complex healthcare settings present challenges for the scale-up and spread of successful interventions from one setting to another. Despite general agreement regarding the importance of the local setting in affecting intervention success across different settings, there is no consensus on what it is about a local setting that needs to be taken into account to design healthy workplace interventions appropriate for different local settings. Complexity theory principles were used to understand a workplace as a complex adaptive system and to create a framework of eight domains (system characteristics) that affect the emergence of system-level behaviour. This Workplace of Well-being (WoW) framework is responsive and adaptive to local settings and allows a shared understanding of the enablers and barriers to behaviour change by capturing local information for each of the eight domains. We use the results of applying the WoW framework to one workplace, a UK National Health Service ward, to describe the utility of this approach in informing design of setting-appropriate healthy workplace interventions that create workplaces conducive to healthy behaviour change. PMID:26380358

  17. Levels of racism: a theoretic framework and a gardener's tale.

    PubMed Central

    Jones, C P

    2000-01-01

    The author presents a theoretic framework for understanding racism on 3 levels: institutionalized, personally mediated, and internalized. This framework is useful for raising new hypotheses about the basis of race-associated differences in health outcomes, as well as for designing effective interventions to eliminate those differences. She then presents an allegory about a gardener with 2 flower boxes, rich and poor soil, and red and pink flowers. This allegory illustrates the relationship between the 3 levels of racism and may guide our thinking about how to intervene to mitigate the impacts of racism on health. It may also serve as a tool for starting a national conversation on racism. PMID:10936998

  18. An efficient MRF embedded level set method for image segmentation.

    PubMed

    Yang, Xi; Gao, Xinbo; Tao, Dacheng; Li, Xuelong; Li, Jie

    2015-01-01

    This paper presents a fast and robust level set method for image segmentation. To enhance the robustness against noise, we embed a Markov random field (MRF) energy function into the conventional level set energy function. This MRF energy function builds the correlation of a pixel with its neighbors and encourages them to fall into the same region. To obtain a fast implementation of the MRF-embedded level set model, we explore algebraic multigrid (AMG) and the sparse field method (SFM) to increase the time step and decrease the computation domain, respectively. Both AMG and SFM can be conducted in a parallel fashion, which facilitates the processing of our method for big image databases. By comparing the proposed fast and robust level set method with the standard level set method and its popular variants on noisy synthetic images, synthetic aperture radar (SAR) images, medical images, and natural images, we comprehensively demonstrate that the new method is robust against various kinds of noise. In particular, the new level set method can segment an image of size 500 × 500 within 3 s in MATLAB R2010b on a computer with a 3.30-GHz CPU and 4-GB memory. PMID:25420261
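
    A hedged sketch of the kind of neighborhood coupling described above, written as a Potts-style penalty on the sign of phi (the paper's actual MRF energy, AMG solver and sparse field implementation are not reproduced here):

      import numpy as np

      def mrf_disagreement_energy(phi, beta=1.0):
          # Counts 4-neighbour pixel pairs whose level-set labels (sign of phi)
          # disagree; minimizing such a term encourages neighbouring pixels to
          # fall into the same region. Periodic wrap at the border, for brevity.
          labels = (phi > 0).astype(np.int8)
          diffs = (np.abs(labels - np.roll(labels, 1, axis=0)) +
                   np.abs(labels - np.roll(labels, 1, axis=1)))
          return beta * float(diffs.sum())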

  19. A 3D Level Set Method for Microwave Breast Imaging

    PubMed Central

    Colgan, Timothy J.; Hagness, Susan C.; Van Veen, Barry D.

    2015-01-01

    Objective Conventional inverse-scattering algorithms for microwave breast imaging result in moderate resolution images with blurred boundaries between tissues. Recent 2D numerical microwave imaging studies demonstrate that the use of a level set method preserves dielectric boundaries, resulting in a more accurate, higher resolution reconstruction of the dielectric properties distribution. Previously proposed level set algorithms are computationally expensive and thus impractical in 3D. In this paper we present a computationally tractable 3D microwave imaging algorithm based on level sets. Methods We reduce the computational cost of the level set method using a Jacobian matrix, rather than an adjoint method, to calculate Frechet derivatives. We demonstrate the feasibility of 3D imaging using simulated array measurements from 3D numerical breast phantoms. We evaluate performance by comparing full 3D reconstructions to those from a conventional microwave imaging technique. We also quantitatively assess the efficacy of our algorithm in evaluating breast density. Results Our reconstructions of 3D numerical breast phantoms improve upon those of a conventional microwave imaging technique. The density estimates from our level set algorithm are more accurate than those of conventional microwave imaging, and the accuracy is greater than that reported for mammographic density estimation. Conclusion Our level set method leads to a feasible level of computational complexity for full 3D imaging, and reconstructs the heterogeneous dielectric properties distribution of the breast more accurately than conventional microwave imaging methods. Significance 3D microwave breast imaging using a level set method is a promising low-cost, non-ionizing alternative to current breast imaging techniques. PMID:26011863

  20. Hippocampus segmentation using locally weighted prior based level set

    NASA Astrophysics Data System (ADS)

    Achuthan, Anusha; Rajeswari, Mandava

    2015-12-01

    Segmentation of the hippocampus in the brain is one of the major challenges in medical image segmentation due to its imaging characteristics: its intensity is almost identical to that of adjacent gray matter structures, such as the amygdala. This intensity similarity causes the hippocampus to have weak or fuzzy boundaries. Given this main challenge, a segmentation method that relies on image information alone may not produce accurate segmentation results. Therefore, an assimilation of prior information, such as shape and spatial information, into existing segmentation methods is needed to produce the expected segmentation. Previous studies have widely integrated prior information into segmentation methods. However, the prior information has usually been integrated in a global manner, which does not reflect the real scenario during clinical delineation. Therefore, in this paper, a locally integrated prior within a level set model is presented. This work utilizes a mean shape model to provide automatic initialization for the level set evolution, and this shape model has been integrated as prior information into the level set model. The local integration of edge-based information and prior information is implemented through an edge weighting map that decides, at the voxel level, which information should be observed during the level set evolution. The edge weighting map indicates which voxels have sufficient edge information. Experiments show that the proposed local integration of prior information into a conventional edge-based level set model, known as the geodesic active contour, yields an improvement of 9% in averaged Dice coefficient.
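
    A minimal Python sketch of a voxel-wise edge weighting map of the kind described (the specific weighting function and the constant k are assumptions): where image gradients are strong the edge-based force dominates, and elsewhere the shape-prior force takes over.

      import numpy as np

      def edge_weight(image, k=10.0):
          # Weight in [0, 1): close to 1 on strong edges, close to 0 in flat regions.
          grads = np.gradient(image.astype(float))
          gmag = np.sqrt(sum(g**2 for g in grads))
          return gmag / (gmag + k)

      def combined_force(edge_force, prior_force, weight):
          # Voxel-wise blend deciding which information drives the level set locally.
          return weight * edge_force + (1.0 - weight) * prior_force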

  1. Exploring the level sets of quantum control landscapes

    SciTech Connect

    Rothman, Adam; Ho, Tak-San; Rabitz, Herschel

    2006-05-15

    A quantum control landscape is defined by the value of a physical observable as a functional of the time-dependent control field E(t) for a given quantum-mechanical system. Level sets through this landscape are prescribed by a particular value of the target observable at the final dynamical time T, regardless of the intervening dynamics. We present a technique for exploring a landscape level set, where a scalar variable s is introduced to characterize trajectories along these level sets. The control fields E(s,t) accomplishing this exploration (i.e., that produce the same value of the target observable for a given system) are determined by solving a differential equation over s in conjunction with the time-dependent Schroedinger equation. There is full freedom to traverse a level set, and a particular trajectory is realized by making an a priori choice for a continuous function f(s,t) that appears in the differential equation for the control field. The continuous function f(s,t) can assume an arbitrary form, and thus a level set generally contains a family of controls, where each control takes the quantum system to the same final target value, but produces a distinct control mechanism. In addition, although the observable value remains invariant over the level set, other dynamical properties (e.g., the degree of robustness to control noise) are not specifically preserved and can vary greatly. Examples are presented to illustrate the continuous nature of level-set controls and their associated induced dynamical features, including continuously morphing mechanisms for population control in model quantum systems.
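
    A toy numerical illustration of the level-set exploration idea (a finite-dimensional stand-in for the control field and observable, not the paper's quantum examples): stepping only in directions orthogonal to the landscape gradient keeps the observable value essentially unchanged while the control itself changes freely.

    ```python
    # Finite-dimensional toy: traverse a level set of a scalar landscape J(e) by stepping
    # orthogonally to grad J, so J stays (to first order) constant. J and f are stand-ins.
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(5, 5))

    def J(e):                                  # stand-in "observable" functional
        return float(np.sum(np.tanh(A @ e) ** 2))

    def grad_J(e, h=1e-6):                     # finite-difference gradient
        g = np.zeros_like(e)
        for i in range(e.size):
            d = np.zeros_like(e); d[i] = h
            g[i] = (J(e + d) - J(e - d)) / (2 * h)
        return g

    e = rng.normal(size=5)
    J0 = J(e)
    for _ in range(500):
        g = grad_J(e)
        f = rng.normal(size=e.size)             # free "exploration" direction, like f(s,t)
        de = f - g * (g @ f) / (g @ g + 1e-12)  # project out the gradient component
        e += 1e-3 * de / (np.linalg.norm(de) + 1e-12)
    print(J0, J(e))                             # nearly identical observable values
    ```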

  2. An Expanded Theoretical Framework of Care Coordination Across Transitions in Care Settings.

    PubMed

    Radwin, Laurel E; Castonguay, Denise; Keenan, Carolyn B; Hermann, Cherice

    2016-01-01

    For many patients, high-quality, patient-centered, and cost-effective health care requires coordination among multiple clinicians and settings. Ensuring optimal care coordination requires a clear understanding of how clinician activities and continuity during transitions affect patient-centeredness and quality outcomes. This article describes an expanded theoretical framework to better understand care coordination. The framework provides clear articulation of concepts. Examples are provided of ways to measure the concepts. PMID:26595361

  3. A Conceptual Framework for a Psychometric Theory for Standard Setting with Examples of Its Use for Evaluating the Functioning of Two Standard Setting Methods

    ERIC Educational Resources Information Center

    Reckase, Mark D.

    2006-01-01

    A conceptual framework is proposed for a psychometric theory of standard setting. The framework suggests that participants in a standard setting process (panelists) develop an internal, intended standard as a result of training and the participant's background. The goal of a standard setting process is to convert panelists' intended standards to…

  4. Public Health and Health Promotion Capacity at National and Regional Level: A Review of Conceptual Frameworks

    PubMed Central

    Aluttis, Christoph; den Broucke, Stephan Van; Chiotan, Cristina; Costongs, Caroline; Michelsen, Kai; Brand, Helmut

    2014-01-01

    The concept of capacity building for public health has gained much attention during the last decade. National as well as international organizations increasingly focus their efforts on capacity building to improve performance in the health sector. During the past two decades, a variety of conceptual frameworks have been developed which describe relevant dimensions for public health capacity. Notably, these frameworks differ in design and conceptualization. This paper therefore reviews the existing conceptual frameworks and integrates them into one framework, which contains the most relevant dimensions for public health capacity at the country- or regional level. A comprehensive literature search was performed to identify frameworks addressing public health capacity building at the national or regional level. We content-analysed these frameworks to identify the core dimensions of public health capacity. The dimensions were subsequently synthesized into a set of thematic areas to construct a conceptual framework which describes the most relevant dimensions for capacities at the national- or regional level. The systematic review resulted in the identification of seven core domains for public health capacity: resources, organizational structures, workforce, partnerships, leadership and governance, knowledge development and country-specific context. Accordingly, these dimensions were used to construct a framework, which describes these core domains in more detail. Our research shows that although there is no generally agreed-upon model of public health capacity, a number of key domains for public health and health promotion capacity are consistently recurring in existing frameworks, regardless of their geographical location or thematic area. As little work on the core concepts of public health capacities has yet taken place, this study adds value to the discourse by identifying these consistencies across existing frameworks and by synthesising them into a new

  6. The exchange boundary framework: understanding the evolution of power within collaborative decision-making settings.

    PubMed

    Watson, Erin R; Foster-Fishman, Pennie G

    2013-03-01

    Many community decision-making bodies encounter challenges in creating conditions where stakeholders from disadvantaged populations can authentically participate in ways that give them actual influence over decisions affecting their lives (Foster-Fishman et al., Lessons for the journey: Strategies and suggestions for guiding planning, governance, and sustainability in comprehensive community initiatives. W.K. Kellogg Foundation, Battle Creek, MI, 2004). These challenges are often rooted in asymmetrical power dynamics operating within the settings (Prilleltensky, J Commun Psychol 36:116-136, 2008). In response, this paper presents the Exchange Boundary Framework, a new approach for understanding and promoting authentic, empowered participation within collaborative decision-making settings. The framework expands upon theories currently used in the field of community psychology by focusing on the underlying processes through which power operates in relationships and examining the evolution of power dynamics over time. By integrating concepts from social exchange theory (Emerson, Am Soc Rev 27:31-41, 1962) and social boundaries theory (Hayward, Polity 31(1):1-22, 1998), the framework situates power within parallel processes of resource exchange and social regulation. The framework can be used to understand the conditions leading to power asymmetries within collaborative decision-making processes, and guide efforts to promote more equitable and authentic participation by all stakeholders within these settings. In this paper we describe the Exchange Boundary Framework, apply it to three distinct case studies, and discuss key considerations for its application within collaborative community settings. PMID:22760794

  7. A variational level set approach to multiphase motion

    SciTech Connect

    Zhao, Hong-Kai; Chan, T.; Merriman, B.; Osher, S.

    1996-08-01

    A coupled level set method for the motion of multiple junctions (of, e.g., solid, liquid, and grain boundaries), which follows the gradient flow for an energy functional consisting of surface tension (proportional to length) and bulk energies (proportional to area), is developed. The approach combines the level set method of S. Osher and J. A. Sethian with a theoretical variational formulation of the motion by F. Reitich and H. M. Soner. The resulting method uses as many level set functions as there are regions and the energy functional is evaluated entirely in terms of level set functions. The gradient projection method leads to a coupled system of perturbed (by curvature terms) Hamilton-Jacobi equations. The coupling is enforced using a single Lagrange multiplier associated with a constraint which essentially prevents (a) regions from overlapping and (b) the development of a vacuum. The numerical implementation is relatively simple and the results agree with (and go beyond) the theory as given in. Other applications of this methodology, including the decomposition of a domain into subregions with minimal interface length, are discussed. Finally, some new techniques and results in level set methodology are presented. 18 refs., 10 figs.

  8. A PDE-Based Fast Local Level Set Method

    NASA Astrophysics Data System (ADS)

    Peng, Danping; Merriman, Barry; Osher, Stanley; Zhao, Hongkai; Kang, Myungjoo

    1999-11-01

    We develop a fast method to localize the level set method of Osher and Sethian (1988, J. Comput. Phys.79, 12) and address two important issues that are intrinsic to the level set method: (a) how to extend a quantity that is given only on the interface to a neighborhood of the interface; (b) how to reset the level set function to be a signed distance function to the interface efficiently without appreciably moving the interface. This fast local level set method reduces the computational effort by one order of magnitude, works in as much generality as the original one, and is conceptually simple and easy to implement. Our approach differs from previous related works in that we extract all the information needed from the level set function (or functions in multiphase flow) and do not need to find explicitly the location of the interface in the space domain. The complexity of our method to do tasks such as extension and distance reinitialization is O(N), where N is the number of points in space, not O(N log N) as in works by Sethian (1996, Proc. Nat. Acad. Sci. 93, 1591) and Helmsen and co-workers (1996, SPIE Microlithography IX, p. 253). This complexity estimation is also valid for quite general geometrically based front motion for our localized method.
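
    A small 2D sketch of the reinitialization step discussed above, restricted to a narrow band around the interface (this uses simple central differences and periodic boundaries rather than the upwind discretizations of the original papers, and the band width and iteration count are arbitrary):

    ```python
    # Narrow-band reinitialization: iterate phi_t = sign(phi0) * (1 - |grad phi|) near
    # the interface only. Central differences and np.roll boundaries are simplifications.
    import numpy as np

    def reinitialize_local(phi, dx, band=6.0, iters=60):
        phi = phi.copy()
        sign = phi / np.sqrt(phi**2 + dx**2)          # smoothed sign(phi0)
        mask = np.abs(phi) < band * dx                # narrow band around the zero level
        dt = 0.5 * dx                                 # CFL-limited pseudo-time step
        for _ in range(iters):
            gx = (np.roll(phi, -1, 1) - np.roll(phi, 1, 1)) / (2 * dx)
            gy = (np.roll(phi, -1, 0) - np.roll(phi, 1, 0)) / (2 * dx)
            grad = np.sqrt(gx**2 + gy**2)
            phi[mask] += dt * (sign * (1.0 - grad))[mask]
        return phi

    # Usage: a level set whose zero contour is a circle but which is not a signed
    # distance function (|grad phi| = 5 initially).
    y, x = np.mgrid[-1:1:200j, -1:1:200j]
    phi0 = 5.0 * (np.sqrt(x**2 + y**2) - 0.5)
    phi = reinitialize_local(phi0, dx=2.0 / 199)
    ```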

  9. Total variation and level set methods in image science

    NASA Astrophysics Data System (ADS)

    Tsai, Yen-Hsi Richard; Osher, Stanley

    We review level set methods and the related techniques that are common in many PDE-based image models. Many of these techniques involve minimizing the total variation of the solution and admit regularizations on the curvature of its level sets. We examine the scope of these techniques in image science, in particular in image segmentation, interpolation, and decomposition, and introduce some relevant level set techniques that are useful for this class of applications. Many of the standard problems are formulated as variational models. We observe increasing synergistic progression of new tools and ideas between the inverse problem community and the 'imagers'. We show that image science demands multi-disciplinary knowledge and flexible, but still robust methods. That is why the level set method and total variation methods have become thriving techniques in this field. Our goal is to survey recently developed techniques in various fields of research that are relevant to diverse objectives in image science. We begin by reviewing some typical PDE-based applications in image processing. In typical PDE methods, images are assumed to be continuous functions sampled on a grid. We will show that these methods all share a common feature, which is the emphasis on processing the level lines of the underlying image. The importance of level lines has been known for some time. See, e.g., Alvarez, Guichard, Morel and Lions (1993). This feature places our slightly general definition of the level set method for image science in context. In Section 2 we describe the building blocks of a typical level set method in the continuum setting. Each important task that we need to do is formulated as the solution to certain PDEs. Then, in Section 3, we briefly describe the finite difference methods developed to construct approximate solutions to these PDEs. Some approaches to interpolation into small subdomains of an image are reviewed in Section 4. In Section 5 we describe the Chan
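
    For a concrete instance of the total variation models surveyed here, the ROF-type denoising below uses the Chambolle projection algorithm available in scikit-image; the test image, noise level, and weight are arbitrary choices for illustration:

    ```python
    # Prototypical total variation (ROF-type) denoising via scikit-image.
    import numpy as np
    from skimage import data, img_as_float
    from skimage.restoration import denoise_tv_chambolle

    image = img_as_float(data.camera())
    noisy = image + 0.1 * np.random.default_rng(0).normal(size=image.shape)
    denoised = denoise_tv_chambolle(noisy, weight=0.12)  # larger weight -> stronger smoothing
    ```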

  10. Level-Set Topology Optimization with Aeroelastic Constraints

    NASA Technical Reports Server (NTRS)

    Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia

    2015-01-01

    Level-set topology optimization is used to design a wing considering skin buckling under static aeroelastic trim loading, as well as dynamic aeroelastic stability (flutter). The level-set function is defined over the entire 3D volume of a transport aircraft wing box. Therefore, the approach is not limited by any predefined structure and can explore novel configurations. The Sequential Linear Programming (SLP) level-set method is used to solve the constrained optimization problems. The proposed method is demonstrated using three problems with mass, linear buckling and flutter objective and/or constraints. A constraint aggregation method is used to handle multiple buckling constraints in the wing skins. A continuous flutter constraint formulation is used to handle difficulties arising from discontinuities in the design space caused by a switching of the critical flutter mode.

  11. The adoption of the Reference Framework for diabetes care among primary care physicians in primary care settings

    PubMed Central

    Wong, Martin C.S.; Wang, Harry H.X.; Kwan, Mandy W.M.; Chan, Wai Man; Fan, Carmen K.M.; Liang, Miaoyin; Li, Shannon TS; Fung, Franklin D.H.; Yeung, Ming Sze; Chan, David K.L.; Griffiths, Sian M.

    2016-01-01

    The prevalence of diabetes mellitus has been increasing both globally and locally. Primary care physicians (PCPs) are in a privileged position to provide first contact and continuing care for diabetic patients. A territory-wide Reference Framework for Diabetes Care for Adults was released by the Hong Kong Primary Care Office in 2010 with the aim of further enhancing evidence-based and high quality care for diabetes in the primary care setting through wide adoption of the Reference Framework. A valid questionnaire survey was conducted among PCPs to evaluate the levels of, and the factors associated with, their adoption of the Reference Framework. A total of 414 completed surveys were received, for a response rate of 13.0%. The average adoption score was 3.29 (SD 0.51) out of 4. Approximately 70% of PCPs highly adopted the Reference Framework in their routine practice. Binary logistic regression analysis showed that the PCPs' perceptions of the inclusion of sufficient local information (adjusted odds ratio [aOR] = 4.748, 95%CI 1.597–14.115, P = 0.005) and reduction of professional autonomy of PCPs (aOR = 1.859, 95%CI 1.013–3.411, P = 0.045) were more likely to influence their adoption level of the Reference Framework for diabetes care in daily practice. The overall level of guideline adoption was found to be relatively high among PCPs for adult diabetes in primary care settings. The adoption barriers identified in this study should be addressed in the continuous updating of the Reference Framework. Strategies need to be considered to enhance the guideline adoption and implementation capacity. PMID:27495018

  12. Bi-directional evolutionary level set method for topology optimization

    NASA Astrophysics Data System (ADS)

    Zhu, Benliang; Zhang, Xianmin; Fatikow, Sergej; Wang, Nianfeng

    2015-03-01

    A bi-directional evolutionary level set method for solving topology optimization problems is presented in this article. The proposed method has three main advantages over the standard level set method. First, new holes can be automatically generated in the design domain during the optimization process. Second, the dependency of the obtained optimized configurations upon the initial configurations is eliminated; optimized configurations can be obtained even when starting from a minimal initial guess. Third, the method can be easily implemented and is computationally more efficient. The validity of the proposed method is tested on the mean compliance minimization problem and the compliant mechanism topology optimization problem.

  13. Set-membership identification and fault detection using a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Fernández-Cantí, Rosa M.; Blesa, Joaquim; Puig, Vicenç; Tornil-Sin, Sebastian

    2016-05-01

    This paper deals with the problem of set-membership identification and fault detection using a Bayesian framework. The paper presents how the set-membership model estimation problem can be reformulated from the Bayesian viewpoint in order to, first, determine the feasible parameter set in the identification stage and, second, check the consistency between the measurement data and the model in the fault-detection stage. The paper shows that, assuming uniformly distributed measurement noise and uniform model prior probability distributions, the Bayesian approach leads to the same feasible parameter set as the well-known set-membership technique based on approximating the feasible parameter set using sets. Additionally, it can deal with models that are nonlinear in the parameters. The single-output and multiple-output cases are addressed as well. The procedure and results are illustrated by means of an application to a quadruple-tank process.
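
    A small bounded-error sketch of the two stages described above (feasible-set identification, then consistency checking for fault detection); it uses a brute-force grid rather than the paper's Bayesian reformulation, and all model dimensions and bounds are made up for illustration:

    ```python
    # Bounded-error toy: for y_k = phi_k . theta + e_k with |e_k| <= eps, approximate the
    # feasible parameter set on a grid, then flag a new sample as a fault if no feasible
    # parameter reproduces it within the noise bound.
    import numpy as np

    rng = np.random.default_rng(1)
    theta_true = np.array([1.5, -0.7])
    eps = 0.05
    phi = rng.uniform(-1, 1, size=(40, 2))                      # regressors
    y = phi @ theta_true + rng.uniform(-eps, eps, size=40)      # bounded measurement noise

    # Grid over an assumed prior support (the "uniform prior" box).
    t1, t2 = np.meshgrid(np.linspace(0, 3, 300), np.linspace(-2, 1, 300))
    grid = np.stack([t1.ravel(), t2.ravel()], axis=1)
    residuals = np.abs(y[None, :] - grid @ phi.T)               # |y_k - phi_k . theta|
    feasible = np.all(residuals <= eps, axis=1)                 # consistent with every sample
    print("feasible grid points:", int(feasible.sum()))

    # Fault detection: is a new measurement consistent with any feasible parameter?
    phi_new, y_new = np.array([0.4, 0.9]), 2.0                  # deliberately inconsistent
    consistent = np.any(np.abs(y_new - grid[feasible] @ phi_new) <= eps)
    print("fault detected:", not consistent)
    ```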

  14. The ICF: A Framework for Setting Goals for Children with Speech Impairment

    ERIC Educational Resources Information Center

    McLeod, Sharynne; Bleile, Ken

    2004-01-01

    The International Classification of Functioning, Disability and Health (ICF) (World Health Organization, 2001) is proposed as a framework for integrative goal setting for children with speech impairment. The ICF incorporates both impairment and social factors to consider when selecting appropriate goals to bring about change in the lives of…

  15. Counselors' Job Satisfaction across Education Levels, Settings, and Specialties

    ERIC Educational Resources Information Center

    Gambrell, Crista E.

    2010-01-01

    This study examined counselor satisfaction across education levels (Masters and Doctorate), work settings (private practice and institutions), and specializations (mental health counselors, school counselors, counselor educators, and creative arts/other counselors). Counseling professionals were surveyed across these variables to…

  16. Geologic setting of the low-level burial grounds

    SciTech Connect

    Lindsey, K.A.; Jaeger, G.K.; Slate, J.L.; Swett, K.J.; Mercer, R.B.

    1994-10-13

    This report describes the regional and site-specific geology of the Hanford Site's low-level burial grounds in the 200 East and West Areas. The report incorporates data from boreholes across the entire 200 Areas, integrating the geology of this area into a single framework. Geologic cross-sections, isopach maps, and structure contour maps of all major geological units from the top of the Columbia River Basalt Group to the surface are included. The physical properties and characteristics of the major suprabasalt sedimentary units are also discussed.

  17. Developing a pressure ulcer risk factor minimum data set and risk assessment framework

    PubMed Central

    Coleman, Susanne; Nelson, E Andrea; Keen, Justin; Wilson, Lyn; McGinnis, Elizabeth; Dealey, Carol; Stubbs, Nikki; Muir, Delia; Farrin, Amanda; Dowding, Dawn; Schols, Jos MGA; Cuddigan, Janet; Berlowitz, Dan; Jude, Edward; Vowden, Peter; Bader, Dan L; Gefen, Amit; Oomens, Cees WJ; Schoonhoven, Lisette; Nixon, Jane

    2014-01-01

    Aim To agree a draft pressure ulcer risk factor Minimum Data Set to underpin the development of a new evidenced-based Risk Assessment Framework. Background A recent systematic review identified the need for a pressure ulcer risk factor Minimum Data Set and development and validation of an evidenced-based pressure ulcer Risk Assessment Framework. This was undertaken through the Pressure UlceR Programme Of reSEarch (RP-PG-0407-10056), funded by the National Institute for Health Research and incorporates five phases. This article reports phase two, a consensus study. Design Consensus study. Method A modified nominal group technique based on the Research and Development/University of California at Los Angeles appropriateness method. This incorporated an expert group, review of the evidence and the views of a Patient and Public Involvement service user group. Data were collected December 2010–December 2011. Findings The risk factors and assessment items of the Minimum Data Set (including immobility, pressure ulcer and skin status, perfusion, diabetes, skin moisture, sensory perception and nutrition) were agreed. In addition, a draft Risk Assessment Framework incorporating all Minimum Data Set items was developed, comprising a two stage assessment process (screening and detailed full assessment) and decision pathways. Conclusion The draft Risk Assessment Framework will undergo further design and pre-testing with clinical nurses to assess and improve its usability. It will then be evaluated in clinical practice to assess its validity and reliability. The Minimum Data Set could be used in future for large scale risk factor studies informing refinement of the Risk Assessment Framework. PMID:24845398

  18. High-fidelity interface tracking in compressible flows: Unlimited anchored adaptive level set

    NASA Astrophysics Data System (ADS)

    Nourgaliev, R. R.; Theofanous, T. G.

    2007-06-01

    The interface-capturing-fidelity issue of the level set method is addressed wholly within the Eulerian framework. Our aim is for a practical and efficient way to realize the expected benefits of grid resolution and high order schemes. Based on a combination of structured adaptive mesh refinement (SAMR), rather than quad/octrees, and on high-order spatial discretization, rather than the use of Lagrangian particles, our method is tailored to compressible flows, while it provides a potentially useful alternative to the particle level set (PLS) for incompressible flows. Interesting salient features of our method include (a) avoidance of limiting (in treating the Hamiltonian of the level set equation), (b) anchoring the level set in a manner that ensures no drift and no spurious oscillations of the zero level during PDE-reinitialization, and (c) a non-linear tagging procedure for defining the neighborhood of the interface subject to mesh refinement. Numerous computational results on a set of benchmark problems (strongly deforming, stretching and tearing interfaces) demonstrate that with this approach, implemented up to 11th order accuracy, the level set method becomes essentially free of mass conservation errors and also free of parasitic interfacial oscillations, while it is still highly efficient, and convenient for 3D parallel implementation. In addition, demonstration of performance in fully-coupled simulations is presented for multimode Rayleigh-Taylor instability (low-Mach number regime) and shock-induced, bubble-collapse (highly compressible regime).

  19. A Framework for Translating a High Level Security Policy into Low Level Security Mechanisms

    NASA Astrophysics Data System (ADS)

    Hassan, Ahmed A.; Bahgat, Waleed M.

    2010-01-01

    Security policies have different components; firewalls, active directory, and IDS are some examples. Enforcing network security policies through low level security mechanisms faces some essential difficulties, the major ones being consistency, verification, and maintenance. One approach to overcoming these difficulties is to automate the translation of a high level security policy into low level security mechanisms. This paper introduces a framework for an automated process that translates a high level security policy into low level security mechanisms. The framework is described in terms of three phases. In the first phase, all network assets are categorized according to their roles in network security, and the relations between them are identified to constitute the network security model. The proposed model is based on organization based access control (OrBAC); however, it extends the OrBAC model to include not only access control policy but also other administrative security policies, such as an auditing policy. The proposed model also enables matching of each rule of the high level security policy with the corresponding rules of the low level security policy. In the second phase of the proposed framework, the high level security policy is mapped onto the network security model; this phase can be considered a translation of the high level security policy into an intermediate model level. Finally, the intermediate model level is translated automatically into low level security mechanisms. The paper illustrates the applicability of the proposed approach through an application example.

  20. Settings for health promotion: an analytic framework to guide intervention design and implementation.

    PubMed

    Poland, Blake; Krupa, Gene; McCall, Douglas

    2009-10-01

    Taking a settings approach to health promotion means addressing the contexts within which people live, work, and play and making these the object of inquiry and intervention as well as the needs and capacities of people to be found in different settings. This approach can increase the likelihood of success because it offers opportunities to situate practice in its context. Members of the setting can optimize interventions for specific contextual contingencies, target crucial factors in the organizational context influencing behavior, and render settings themselves more health promoting. A number of attempts have been made to systematize evidence regarding the effectiveness of interventions in different types of settings (e.g., school-based health promotion, community development). Few, if any, attempts have been made to systematically develop a template or framework for analyzing those features of settings that should influence intervention design and delivery. This article lays out the core elements of such a framework in the form of a nested series of questions to guide analysis. Furthermore, it offers advice on additional considerations that should be taken into account when operationalizing a settings approach in the field. PMID:19809004

  1. Conceptual framework for indexing visual information at multiple levels

    NASA Astrophysics Data System (ADS)

    Jaimes, Alejandro; Chang, Shih-Fu

    1999-12-01

    In this paper, we present a conceptual framework for indexing different aspects of visual information. Our framework unifies concepts from the literature in diverse fields such as cognitive psychology, library sciences, art, and the more recent content-based retrieval. We present multiple-level structures for visual and non-visual information. The ten-level visual structure presented provides a systematic way of indexing images based on syntax and semantics, and includes distinctions between general concepts and visual concepts. We define different types of relations at different levels of the visual structure, and also use a semantic information table to summarize important aspects related to an image. While the focus is on the development of a conceptual indexing structure, our aim is also to bring together the knowledge from various fields, unifying the issues that should be considered when building a digital image library. Our analysis stresses the limitations of state-of-the-art content-based retrieval systems and suggests areas in which improvements are necessary.

  2. Variational and Shape Prior-based Level Set Model for Image Segmentation

    SciTech Connect

    Diop, El Hadji S.; Jerbi, Taha; Burdin, Valerie

    2010-09-30

    A new image segmentation model based on the level set approach is presented herein. We deal with radiographic medical images where boundaries are not salient and objects of interest have the same gray level as other structures in the image. Thus, a priori information about the shape we look for is integrated into the level set evolution to obtain good segmentation results. The proposed model also includes a penalization term that forces the level set to remain close to a signed distance function (SDF), which avoids the re-initialization procedure. In addition, a variant and complete Mumford-Shah model is used in our functional; the added Hausdorff measure helps to better handle zones where boundaries are occluded or not salient. Finally, a weighted area term is added to the functional to drive the level set rapidly toward object boundaries. The segmentation model is formulated in a variational framework which, through calculus of variations, yields partial differential equations (PDEs) that guide the level set evolution. Results obtained on both synthetic images and digitally reconstructed radiographs (DRRs) show that the proposed model improves on existing prior and non-prior shape-based image segmentation approaches.
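
    The signed-distance penalization term can be sketched as follows (a generic distance-regularization update in the spirit of the description above, not the authors' exact functional): gradient descent on P(phi) = 1/2 * integral of (|grad phi| - 1)^2 contributes the term Laplacian(phi) - div(grad phi / |grad phi|) to the evolution, keeping phi close to an SDF without re-initialization.

    ```python
    # Generic distance-regularization update term for a level set evolution.
    import numpy as np

    def sdf_penalty_update(phi, dx=1.0, eps=1e-8):
        gx = (np.roll(phi, -1, 1) - np.roll(phi, 1, 1)) / (2 * dx)
        gy = (np.roll(phi, -1, 0) - np.roll(phi, 1, 0)) / (2 * dx)
        mag = np.sqrt(gx**2 + gy**2) + eps
        # divergence of the unit normal field grad(phi)/|grad(phi)|
        div_n = ((np.roll(gx / mag, -1, 1) - np.roll(gx / mag, 1, 1)) +
                 (np.roll(gy / mag, -1, 0) - np.roll(gy / mag, 1, 0))) / (2 * dx)
        lap = (np.roll(phi, -1, 0) + np.roll(phi, 1, 0) +
               np.roll(phi, -1, 1) + np.roll(phi, 1, 1) - 4.0 * phi) / dx**2
        return lap - div_n

    # Inside an evolution loop one would add this term with a small weight, e.g.
    #   phi += dt * (data_driven_term(phi) + mu * sdf_penalty_update(phi))
    # where data_driven_term and the weights dt, mu are problem-specific (hypothetical here).
    ```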

  3. Improvements to Level Set, Immersed Boundary methods for Interface Tracking

    NASA Astrophysics Data System (ADS)

    Vogl, Chris; Leveque, Randy

    2014-11-01

    It is not uncommon to find oneself solving a moving boundary problem under flow in the context of some application. Of particular interest is when the moving boundary exerts a curvature-dependent force on the liquid. Such a force arises when observing a boundary that is resistant to bending or has surface tension. Numerically speaking, stable numerical computation of the curvature can be difficult as it is often described in terms of high-order derivatives of either marker particle positions or of a level set function. To address this issue, the level set method is modified to track not only the position of the boundary, but the curvature as well. The definition of the signed-distance function that is used to modify the level set method is also used to develop an interpolation-free, closest-point method. These improvements are used to simulate a bending-resistant, inextensible boundary under shear flow to highlight area and volume conservation, as well as stable curvature calculation. Funded by a NSF MSPRF grant.

  4. Priority-setting in healthcare: a framework for reasonable clinical judgements.

    PubMed

    Baerøe, K

    2009-08-01

    What are the criteria for reasonable clinical judgements? The reasonableness of macro-level decision-making has been much discussed, but little attention has been paid to the reasonableness of applying guidelines generated at a macro-level to individual cases. This paper considers a framework for reasonable clinical decision-making that will capture cases where relevant guidelines cannot reasonably be followed. There are three main sections. (1) Individual claims on healthcare from the point of view of concerns about equity are analysed. (2) The demands of responsibility and equity on professional clinical performance are discussed, and how the combination of these demands emerges into seven requirements that constitute the framework is explored. Since this framework is developed to assist in reasonable clinical decision-making, practical implications of all these requirements are also suggested. (3) Challenges concerning the framework are discussed. First, a crucial presumption that the framework relies upon is considered, namely clinicians' willingness to justify their decisions as requested. Then how public deliberation may influence clinical decision-making is discussed. Next is a consideration of how clinicians' need to have confidence in their own judgements in order to perform in a manner worthy of trust would be compatible with adherence to the framework supported by public deliberation. It is concluded that fair distribution in the interplay between macro- and micro-level considerations can be secured by legitimising procedures on each level, by ensuring well-organised and continuing public debate and by basing individual clinical judgements upon well-justified and principled normative bases. PMID:19644007

  5. A linear optimal transportation framework for quantifying and visualizing variations in sets of images

    PubMed Central

    Wang, Wei; Slepčev, Dejan; Basu, Saurav; Ozolek, John A.

    2012-01-01

    Transportation-based metrics for comparing images have long been applied to analyze images, especially where one can interpret the pixel intensities (or derived quantities) as a distribution of 'mass' that can be transported without strict geometric constraints. Here we describe a new transportation-based framework for analyzing sets of images. More specifically, we describe a new transportation-related distance between pairs of images, which we denote as linear optimal transportation (LOT). The LOT can be used directly on pixel intensities, and is based on a linearized version of the Kantorovich-Wasserstein metric (an optimal transportation distance, as is the earth mover's distance). The new framework is especially well suited for computing all pairwise distances for a large database of images efficiently, and thus it can be used for pattern recognition in sets of images. In addition, the new LOT framework also allows for an isometric linear embedding, greatly facilitating the ability to visualize discriminant information in different classes of images. We demonstrate the application of the framework to several tasks such as discriminating nuclear chromatin patterns in cancer cells, decoding differences in facial expressions, galaxy morphologies, as well as subcellular protein distributions. PMID:23729991
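
    A 1D toy version of the LOT idea (the paper works with 2D images and a linearized Kantorovich-Wasserstein metric; in 1D the embedding reduces to the quantile function): each histogram is embedded once, and all pairwise transport distances then become ordinary Euclidean distances between embeddings.

    ```python
    # 1-D optimal-transport embedding: one embedding per histogram, then cheap
    # pairwise comparisons in the embedded (Euclidean) space.
    import numpy as np

    def quantile_embedding(hist, n=200):
        """1-D OT embedding: the quantile function of `hist`, sampled at n points."""
        hist = hist.astype(float) + 1e-12          # avoid flat CDF segments
        cdf = np.cumsum(hist) / hist.sum()
        q = (np.arange(n) + 0.5) / n
        return np.interp(q, cdf, np.arange(hist.size))

    rng = np.random.default_rng(0)
    hists = [np.histogram(rng.normal(mu, 1.0, 5000), bins=64, range=(-6, 6))[0]
             for mu in (-1.0, 0.0, 1.5)]
    emb = np.array([quantile_embedding(h) for h in hists])
    # Pairwise transport (Wasserstein-2-like) distances from the embeddings alone.
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1) / np.sqrt(emb.shape[1])
    print(np.round(dists, 2))
    ```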

  6. A level set method for materials with texturally equilibrated pores

    NASA Astrophysics Data System (ADS)

    Ghanbarzadeh, Soheil; Hesse, Marc A.; Prodanović, Maša

    2015-09-01

    Textural equilibrium controls the distribution of the liquid phase in many naturally occurring porous materials such as partially molten rocks and alloys, salt-brine and ice-water systems. In these materials, pore geometry evolves to minimize the solid-liquid interfacial energy while maintaining a constant dihedral angle, θ, at solid-liquid contact lines. We present a level set method to compute an implicit representation of the liquid-solid interface in textural equilibrium with space-filling tessellations of multiple solid grains in three dimensions. Each grain is represented by a separate level set function, and interfacial energy minimization is achieved by evolving the solid-liquid interface under surface diffusion to a constant mean curvature surface. The liquid volume and dihedral angle constraints are added to the formulation using virtual convective and normal velocity terms. This results in an initial value problem for a system of non-linear coupled PDEs governing the evolution of the level sets for each grain, using the implicit representation of the solid grains as initial condition. A domain decomposition scheme is devised to restrict the computational domain of each grain to a few grid points around the grain. The coupling between the interfaces is achieved at a higher level on the original computational domain. The spatial resolution of the discretization is improved through high-order spatial differentiation schemes and localization of computations through domain decomposition. Examples of three-dimensional solutions are also obtained for different grain distribution networks that illustrate the geometric flexibility of the method.

  7. Coupled level set segmentation using a point-based statistical shape model relying on correspondence probabilities

    NASA Astrophysics Data System (ADS)

    Hufnagel, Heike; Ehrhardt, Jan; Pennec, Xavier; Schmidt-Richberg, Alexander; Handels, Heinz

    2010-03-01

    In this article, we propose a unified statistical framework for image segmentation with shape prior information. The approach combines an explicitly parameterized point-based probabilistic statistical shape model (SSM) with a segmentation contour which is implicitly represented by the zero level set of a higher dimensional surface. These two aspects are unified in a Maximum a Posteriori (MAP) estimation where the level set is evolved to converge towards the boundary of the organ to be segmented based on the image information while taking into account the prior given by the SSM information. The optimization of the energy functional obtained by the MAP formulation leads to an alternate update of the level set and an update of the fitting of the SSM. We then adapt the probabilistic SSM for multi-shape modeling and extend the approach to multiple-structure segmentation by introducing a level set function for each structure. During segmentation, the evolution of the different level set functions is coupled by the multi-shape SSM. First experimental evaluations indicate that our method is well suited for the segmentation of topologically complex, non-spherical and multiple-structure shapes. We demonstrate the effectiveness of the method by experiments on kidney segmentation as well as on hip joint segmentation in CT images.

  8. Level set method for image segmentation based on moment competition

    NASA Astrophysics Data System (ADS)

    Min, Hai; Wang, Xiao-Feng; Huang, De-Shuang; Jin, Jing; Wang, Hong-Zhi; Li, Hai

    2015-05-01

    We propose a level set method for image segmentation which introduces the moment competition and weakly supervised information into the energy functional construction. Different from the region-based level set methods which use force competition, the moment competition is adopted to drive the contour evolution. Here, a so-called three-point labeling scheme is proposed to manually label three independent points (weakly supervised information) on the image. Then the intensity differences between the three points and the unlabeled pixels are used to construct the force arms for each image pixel. The corresponding force is generated from the global statistical information of a region-based method and weighted by the force arm. As a result, the moment can be constructed and incorporated into the energy functional to drive the evolving contour to approach the object boundary. In our method, the force arm can take full advantage of the three-point labeling scheme to constrain the moment competition. Additionally, the global statistical information and weakly supervised information are successfully integrated, which makes the proposed method more robust than traditional methods for initial contour placement and parameter setting. Experimental results with performance analysis also show the superiority of the proposed method on segmenting different types of complicated images, such as noisy images, three-phase images, images with intensity inhomogeneity, and texture images.

  9. Toward automatic computer aided dental X-ray analysis using level set method.

    PubMed

    Li, Shuo; Fevens, Thomas; Krzyzak, Adam; Jin, Chao; Li, Song

    2005-01-01

    A Computer Aided Dental X-rays Analysis (CADXA) framework is proposed to semi-automatically detect areas of bone loss and root decay in digital dental X-rays. In this framework, first, a new competitive coupled level set method is proposed to segment the image into three pathologically meaningful regions using two coupled level set functions. Tailored for the dental clinical environment, the segmentation stage uses a trained support vector machine (SVM) classifier to provide initial contours. Then, based on the segmentation results, an analysis scheme is applied. First, the scheme builds an uncertainty map from which those areas with bone loss will be automatically detected. Second, the scheme employs a method based on the SVM and the average intensity profile to isolate the teeth and detect root decay. Experimental results show that our proposed framework is able to automatically detect the areas of bone loss and, when given the orientation of the teeth, it is able to automatically detect the root decay with a seriousness level marked for diagnosis. PMID:16685904

  10. A geometric level set model for ultrasounds analysis

    SciTech Connect

    Sarti, A.; Malladi, R.

    1999-10-01

    We propose a partial differential equation (PDE) for filtering and segmentation of echocardiographic images based on a geometric-driven scheme. The method allows edge-preserving image smoothing and a semi-automatic segmentation of the heart chambers that regularizes the shapes and improves edge fidelity, especially in the presence of distinct gaps in the edge map, as is common in ultrasound imagery. A numerical scheme for solving the proposed PDE is borrowed from level set methods. Results on human in vivo acquired 2D, 2D+time, 3D, and 3D+time echocardiographic images are shown.

  11. Discrete Optimization with Polynomially Detectable Boundaries and Restricted Level Sets

    NASA Astrophysics Data System (ADS)

    Zinder, Yakov; Memar, Julia; Singh, Gaurav

    The paper describes an optimization procedure for a class of discrete optimization problems which is defined by certain properties of the boundary of the feasible region and level sets of the objective function. It is shown that these properties are possessed, for example, by various scheduling problems, including a number of well-known NP-hard problems which play an important role in scheduling theory. For an important particular case the presented optimization procedure is compared with a version of the branch-and-bound algorithm by means of computational experiments.

  12. The Augmented Fast Marching Method for Level Set Reinitialization

    NASA Astrophysics Data System (ADS)

    Salac, David

    2011-11-01

    The modeling of multiphase fluid flows typically requires accurate descriptions of the interface and curvature of the interface. Here a new reinitialization technique based on the fast marching method for gradient-augmented level sets is presented. The method is explained and results in both 2D and 3D are presented. Overall the method is more accurate than reinitialization methods based on similar stencils and the resulting curvature fields are much smoother. The method will also be demonstrated in a sample application investigating the dynamic behavior of vesicles in general fluid flows. Support provided by University at Buffalo - SUNY.
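
    For comparison, plain fast-marching reinitialization (without the gradient augmentation described above) is available in the third-party scikit-fmm package; the elliptical test function below is only an illustration:

    ```python
    # Standard fast-marching reinitialization with scikit-fmm (not the augmented scheme).
    import numpy as np
    import skfmm

    y, x = np.mgrid[-1:1:201j, -1:1:201j]
    phi0 = x**2 + y**2 / 0.5 - 0.3            # not a distance function; zero contour is an ellipse
    phi = skfmm.distance(phi0, dx=2.0 / 200)  # signed distance to the same zero contour
    ```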

  13. Automatic segmentation of right ventricle on ultrasound images using sparse matrix transform and level set

    NASA Astrophysics Data System (ADS)

    Qin, Xulei; Cong, Zhibin; Halig, Luma V.; Fei, Baowei

    2013-03-01

    An automatic framework is proposed to segment the right ventricle in ultrasound images. This method can automatically segment both epicardial and endocardial boundaries from a continuous echocardiography series by combining sparse matrix transform (SMT), a training model, and a localized region based level set. First, the sparse matrix transform extracts the main motion regions of the myocardium as eigenimages by analyzing the statistical information of these images. Second, a training model of the right ventricle is registered to the extracted eigenimages in order to automatically detect the main location of the right ventricle and the corresponding transform relationship between the training model and the SMT-extracted results in the series. Third, the training model is then adjusted as an adapted initialization for the segmentation of each image in the series. Finally, based on the adapted initializations, a localized region based level set algorithm is applied to segment both epicardial and endocardial boundaries of the right ventricle from the whole series. Experimental results from real subject data validated the performance of the proposed framework in segmenting the right ventricle from echocardiography. The mean Dice scores for the epicardial and endocardial boundaries are 89.1%+/-2.3% and 83.6%+/-7.3%, respectively. The automatic segmentation method based on sparse matrix transform and level set can provide a useful tool for quantitative cardiac imaging.

  14. XFEM schemes for level set based structural optimization

    NASA Astrophysics Data System (ADS)

    Li, Li; Wang, Michael Yu; Wei, Peng

    2012-12-01

    In this paper, some elegant extended finite element method (XFEM) schemes for level set method structural optimization are proposed. First, two-dimensional (2D) and three-dimensional (3D) XFEM schemes with a partition integral method are developed, and numerical examples are employed to evaluate their accuracy, which indicate that an accurate analysis result can be obtained on the structural boundary. Furthermore, methods for improving the computational accuracy and efficiency of XFEM are studied, which include an XFEM integral scheme without quadrature sub-cells and a higher-order element XFEM scheme. Numerical examples show that the XFEM scheme without quadrature sub-cells can yield similar structural analysis accuracy while markedly reducing the time cost, and that higher-order XFEM elements can improve the computational accuracy of structural analysis in the boundary elements, but at an increased time cost. Therefore, the trade-off in time cost between FE system scale and element order needs to be considered. Finally, the reliability and advantages of the proposed XFEM schemes are illustrated with several 2D and 3D mean compliance minimization examples that are widely used in the recent literature of structural topology optimization. All numerical results demonstrate that the proposed XFEM is a promising structural analysis approach for structural optimization with the level set method.

  15. Variational level set segmentation for forest based on MCMC sampling

    NASA Astrophysics Data System (ADS)

    Yang, Tie-Jun; Huang, Lin; Jiang, Chuan-xian; Nong, Jian

    2014-11-01

    Environmental protection is one of the themes of today's world. The forest is a recycler of carbon dioxide and a natural oxygen bar. Protecting forests and monitoring forest growth are long-term tasks of environmental protection. Automatically estimating forest coverage from optical remote sensing images with a computer is therefore very important: it allows the status of the forest in an area to be assessed in a timely manner and frees analysts from tedious manual statistics. To address the computational complexity of global optimization via convexification, this paper proposes a level set segmentation method based on Markov chain Monte Carlo (MCMC) sampling and applies it to forest segmentation in remote sensing images. The presented method does not require any convexification of the target energy functional and instead uses MCMC sampling, which has global optimization capability. The local minima that can occur with gradient descent are also avoided. The paper makes three major contributions. First, by using MCMC sampling, convexity of the energy functional is no longer necessary and global optimization can still be achieved. Second, by using the data (texture) and knowledge (a priori color) to guide the construction of the Markov chain, the convergence rate of the Markov chain is improved significantly. Finally, a level set segmentation method for forest that integrates a priori color and texture is proposed. Experiments show that our method can efficiently and accurately segment forest in remote sensing images.

  16. A conceptual framework of computations in mid-level vision

    PubMed Central

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P.

    2014-01-01

    If a picture is worth a thousand words, as an English idiom goes, what should those words—or, rather, descriptors—capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations. PMID:25566044

  17. A level-set method for interfacial flows with surfactant

    NASA Astrophysics Data System (ADS)

    Xu, Jian-Jun; Li, Zhilin; Lowengrub, John; Zhao, Hongkai

    2006-03-01

    A level-set method for the simulation of fluid interfaces with insoluble surfactant is presented in two-dimensions. The method can be straightforwardly extended to three-dimensions and to soluble surfactants. The method couples a semi-implicit discretization for solving the surfactant transport equation recently developed by Xu and Zhao [J. Xu, H. Zhao. An Eulerian formulation for solving partial differential equations along a moving interface, J. Sci. Comput. 19 (2003) 573-594] with the immersed interface method originally developed by LeVeque and Li and [R. LeVeque, Z. Li. The immersed interface method for elliptic equations with discontinuous coefficients and singular sources, SIAM J. Numer. Anal. 31 (1994) 1019-1044] for solving the fluid flow equations and the Laplace-Young boundary conditions across the interfaces. Novel techniques are developed to accurately conserve component mass and surfactant mass during the evolution. Convergence of the method is demonstrated numerically. The method is applied to study the effects of surfactant on single drops, drop-drop interactions and interactions among multiple drops in Stokes flow under a steady applied shear. Due to Marangoni forces and to non-uniform Capillary forces, the presence of surfactant results in larger drop deformations and more complex drop-drop interactions compared to the analogous cases for clean drops. The effects of surfactant are found to be most significant in flows with multiple drops. To our knowledge, this is the first time that the level-set method has been used to simulate fluid interfaces with surfactant.

  18. Statistics of dark matter halos in the excursion set peak framework

    SciTech Connect

    Lapi, A.; Danese, L. E-mail: danese@sissa.it

    2014-07-01

    We derive approximate, yet very accurate, analytical expressions for the abundance and clustering properties of dark matter halos in the excursion set peak framework; the latter relies on the standard excursion set approach, but also includes the effects of a realistic filtering of the density field, a mass-dependent threshold for collapse, and the prescription from peak theory that halos tend to form around density maxima. We find that our approximations work excellently for diverse power spectra, collapse thresholds and density filters. Moreover, when adopting a cold dark matter power spectrum, a top-hat filter and a mass-dependent collapse threshold (supplemented with conceivable scatter), our approximate halo mass function and halo bias reproduce the outcomes of cosmological N-body simulations very well.

  19. Comprehensive evaluation of long-term hydrological data sets: Constraints of the Budyko framework

    NASA Astrophysics Data System (ADS)

    Greve, Peter; Orlowsky, Boris; Seneviratne, Sonia I.

    2013-04-01

    An accurate estimate of the climatological land water balance is essential for a wide range of socio-economic issues. Despite the simplicity of the underlying water balance equation, its individual variables are of complex nature. Global estimates, whether derived from observations or from models, of precipitation (P) and especially evapotranspiration (ET) are characterized by high uncertainties. This leads to inconsistent results in determining conditions related to the land water balance and its components. In this study, we consider the Budyko framework as a constraint to evaluate long-term hydrological data sets within the period from 1984 to 2005. The Budyko framework is a well-established, empirically based relationship between ET/P and Ep/P, with Ep being the potential evaporation. We use estimates of ET associated with the LandFlux-EVAL initiative (Mueller et al., 2012), either derived from observations, CMIP5 models, or land-surface models (LSMs) driven with observation-based forcing or atmospheric reanalyses. Data sets of P comprise all commonly used global observation-based estimates. Ep is determined by methods of differing complexity using recent global temperature and radiation data sets. Based on this comprehensive synthesis of data sets and methods to determine Ep, more than 2000 possible combinations of ET/P in conjunction with Ep/P are created. All combinations are validated against the Budyko curve and against physical limits within the Budyko phase space. For this purpose we develop an error measure based on the root mean square error which combines both constraints. We find that uncertainties are mainly induced by the ET data sets. In particular, reanalysis and CMIP5 data sets are characterized by low realism. Furthermore, the realism of the LSMs is not primarily controlled by the forcing, as different LSMs driven with the same forcing show significantly different error measures. Our comprehensive approach is thus suitable to detect uncertainties
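
    For reference, the classical Budyko curve and the two physical limits used to constrain the data sets can be written down directly; the combined RMSE-style score below is one possible formulation and not necessarily the exact measure developed in the study, and the catchment values are invented for illustration:

    ```python
    # Budyko curve, physical limits, and a combined RMSE-style score (illustrative only).
    import numpy as np

    def budyko_curve(aridity):
        """ET/P predicted from the aridity index Ep/P (Budyko, 1974)."""
        return np.sqrt(aridity * np.tanh(1.0 / aridity) * (1.0 - np.exp(-aridity)))

    def combined_error(aridity, evap_index):
        curve_err = evap_index - budyko_curve(aridity)                    # distance to the curve
        limit_err = np.maximum(evap_index - np.minimum(1.0, aridity), 0)  # water/energy limit violations
        return np.sqrt(np.mean(curve_err**2 + limit_err**2))

    aridity = np.array([0.5, 1.2, 2.5])        # Ep/P for three hypothetical catchments
    evap_index = np.array([0.45, 0.75, 0.95])  # corresponding ET/P estimates
    print(combined_error(aridity, evap_index))
    ```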

  20. Powerful Set-Based Gene-Environment Interaction Testing Framework for Complex Diseases.

    PubMed

    Jiao, Shuo; Peters, Ulrike; Berndt, Sonja; Bézieau, Stéphane; Brenner, Hermann; Campbell, Peter T; Chan, Andrew T; Chang-Claude, Jenny; Lemire, Mathieu; Newcomb, Polly A; Potter, John D; Slattery, Martha L; Woods, Michael O; Hsu, Li

    2015-12-01

    Identification of gene-environment interaction (G × E) is important in understanding the etiology of complex diseases. Based on our previously developed Set Based gene EnviRonment InterAction test (SBERIA), in this paper we propose a powerful framework for enhanced set-based G × E testing (eSBERIA). The major challenge of signal aggregation within a set is how to tell signals from noise. eSBERIA tackles this challenge by adaptively aggregating the interaction signals within a set weighted by the strength of the marginal and correlation screening signals. eSBERIA then combines the screening-informed aggregate test with a variance component test to account for the residual signals. Additionally, we develop a case-only extension for eSBERIA (coSBERIA) and an existing set-based method, which boosts the power not only by exploiting the G-E independence assumption but also by avoiding the need to specify main effects for a large number of variants in the set. Through extensive simulation, we show that coSBERIA and eSBERIA are considerably more powerful than existing methods within the case-only and the case-control method categories across a wide range of scenarios. We conduct a genome-wide G × E search by applying our methods to Illumina HumanExome Beadchip data of 10,446 colorectal cancer cases and 10,191 controls and identify two novel interactions between nonsteroidal anti-inflammatory drugs (NSAIDs) and MINK1 and PTCHD3. PMID:26095235

  1. Implementing accountability for reasonableness framework at district level in Tanzania: a realist evaluation

    PubMed Central

    2011-01-01

    Background Despite the growing importance of the Accountability for Reasonableness (A4R) framework in priority setting worldwide, there is still an inadequate understanding of the processes and mechanisms underlying its influence on legitimacy and fairness, as conceived and reflected in service management processes and outcomes. As a result, the ability to draw scientifically sound lessons for the application of the framework to services and interventions is limited. This paper evaluates the experiences of implementing the A4R approach in Mbarali District, Tanzania, in order to find out how the innovation was shaped, enabled, and constrained by the interaction between contexts, mechanisms and outcomes. Methods This study draws on the principles of realist evaluation -- a largely qualitative approach, chiefly concerned with testing and refining programme theories by exploring the complex interactions of contexts, mechanisms, and outcomes. Mixed methods were used in data collection, including individual interviews, non-participant observation, and document reviews. A thematic framework approach was adopted for the data analysis. Results The study found that while the A4R approach to priority setting was helpful in strengthening transparency, accountability, stakeholder engagement, and fairness, the efforts at integrating it into the current district health system were challenging. Participatory structures under the decentralisation framework, central government's call for partnership in district-level planning and priority setting, perceived needs of stakeholders, as well as active engagement between researchers and decision makers all facilitated the adoption and implementation of the innovation. In contrast, however, limited local autonomy, low level of public awareness, unreliable and untimely funding, inadequate accountability mechanisms, and limited local resources were the major contextual factors that hampered the full implementation. Conclusion This study

  2. Towards a Dynamic Conceptual Framework for English-Medium Education in Multilingual University Settings

    ERIC Educational Resources Information Center

    Dafouz, Emma; Smit, Ute

    2016-01-01

    At a time of increasing internationalization in tertiary education, English-Medium Education in Multilingual University Settings (EMEMUS) has become a common practice. While there is already ample research describing this phenomenon at a local level (Smit and Dafouz 2012a), the theoretical side needs to be elaborated. This article thus aims to…

  3. An Analysis Framework Addressing the Scale and Legibility of Large Scientific Data Sets

    SciTech Connect

    Childs, H R

    2006-11-20

    Much of the previous work in the large data visualization area has solely focused on handling the scale of the data. This task is clearly a great challenge and necessary, but it is not sufficient. Applying standard visualization techniques to large scale data sets often creates complicated pictures where meaningful trends are lost. A second challenge, then, is to also provide algorithms that simplify what an analyst must understand, using either visual or quantitative means. This challenge can be summarized as improving the legibility or reducing the complexity of massive data sets. Fully meeting both of these challenges is the work of many, many PhD dissertations. In this dissertation, we describe some new techniques to address both the scale and legibility challenges, in hope of contributing to the larger solution. In addition to our assumption of simultaneously addressing both scale and legibility, we add an additional requirement that the solutions considered fit well within an interoperable framework for diverse algorithms, because a large suite of algorithms is often necessary to fully understand complex data sets. For scale, we present a general architecture for handling large data, as well as details of a contract-based system for integrating advanced optimizations into a data flow network design. We also describe techniques for volume rendering and performing comparisons at the extreme scale. For legibility, we present several techniques. Most noteworthy are equivalence class functions, a technique to drive visualizations using statistical methods, and line-scan based techniques for characterizing shape.

  4. Profile Evolution Simulation in Etching Systems Using Level Set Methods

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Govindan, T. R.; Meyyappan, M.

    1998-01-01

    Semiconductor device profiles are determined by the characteristics of both etching and deposition processes. In particular, a highly anisotropic etch is required to achieve vertical sidewalls. However, etching comprises both anisotropic and isotropic components, due to ion and neutral fluxes, respectively. In Ar/Cl2 plasmas, for example, neutral chlorine reacts with the Si surfaces to form silicon chlorides. These compounds are then removed by the impinging ion fluxes. Hence the directionality of the ions (and thus the ion angular distribution function, or IAD), as well as the relative fluxes of neutrals and ions, determines the amount of undercutting. One method of modeling device profile evolution is to simulate the moving solid-gas interface between the semiconductor and the plasma as a string of nodes. The velocity of each node is calculated and then the nodes are advanced accordingly. Although this technique appears to be relatively straightforward, extensive looping schemes are required at the profile corners. An alternate method is to use level set theory, which involves embedding the location of the interface in a field variable. The normal speed is calculated at each mesh point, and the field variable is updated. The profile corners are more accurately modeled as the need for looping algorithms is eliminated. The model we have developed is a 2-D Level Set Profile Evolution Simulation (LSPES). The LSPES calculates etch rates of a substrate in low pressure plasmas due to the incident ion and neutral fluxes. For a Si substrate in an Ar/Cl2 gas mixture, for example, the predictions of the LSPES are identical to those from a string evolution model for high neutral fluxes and two different ion angular distributions.(2) In the figure shown, the relative neutral to ion flux in the bulk plasma is 100 to 1. For a moderately isotropic ion angular distribution function as shown in the cases in the left hand column, both the LSPES (top row) and rude's string
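
    The level set update described in this abstract, advancing an embedded field variable with a normal speed computed at each mesh point, amounts to discretizing phi_t + F |grad phi| = 0. Below is a generic first-order upwind step, not the LSPES code; the grid, speed field and time step are placeholders:

        import numpy as np

        def advance_level_set(phi, F, dx, dt):
            """One first-order upwind step of  phi_t + F |grad phi| = 0  on a uniform
            grid, assuming a non-negative normal speed F.  The zero level set of phi
            marks the solid-gas interface (etch front)."""
            p = np.pad(phi, 1, mode="edge")                 # replicate edge values
            dxm = (p[1:-1, 1:-1] - p[1:-1, :-2]) / dx       # backward difference in x
            dxp = (p[1:-1, 2:] - p[1:-1, 1:-1]) / dx        # forward difference in x
            dym = (p[1:-1, 1:-1] - p[:-2, 1:-1]) / dx       # backward difference in y
            dyp = (p[2:, 1:-1] - p[1:-1, 1:-1]) / dx        # forward difference in y
            # Godunov upwind gradient magnitude for F >= 0
            grad = np.sqrt(np.maximum(dxm, 0.0)**2 + np.minimum(dxp, 0.0)**2 +
                           np.maximum(dym, 0.0)**2 + np.minimum(dyp, 0.0)**2)
            return phi - dt * F * grad

        if __name__ == "__main__":
            n, dx = 64, 1.0 / 64
            y, x = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n), indexing="ij")
            phi = y - 0.5                        # flat front at y = 0.5
            F = np.full_like(phi, 1.0)           # uniform etch rate (illustrative)
            for _ in range(10):
                phi = advance_level_set(phi, F, dx, dt=0.4 * dx)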

  5. Automatic segmentation of right ventricular ultrasound images using sparse matrix transform and a level set

    NASA Astrophysics Data System (ADS)

    Qin, Xulei; Cong, Zhibin; Fei, Baowei

    2013-11-01

    An automatic segmentation framework is proposed to segment the right ventricle (RV) in echocardiographic images. The method can automatically segment both epicardial and endocardial boundaries from a continuous echocardiography series by combining sparse matrix transform, a training model, and a localized region-based level set. First, the sparse matrix transform extracts main motion regions of the myocardium as eigen-images by analyzing the statistical information of the images. Second, an RV training model is registered to the eigen-images in order to locate the position of the RV. Third, the training model is adjusted and then serves as an optimized initialization for the segmentation of each image. Finally, based on the initializations, a localized, region-based level set algorithm is applied to segment both epicardial and endocardial boundaries in each echocardiograph. Three evaluation methods were used to validate the performance of the segmentation framework. The Dice coefficient measures the overall agreement between the manual and automatic segmentation. The absolute distance and the Hausdorff distance between the boundaries from manual and automatic segmentation were used to measure the accuracy of the segmentation. Ultrasound images of human subjects were used for validation. For the epicardial and endocardial boundaries, the Dice coefficients were 90.8 ± 1.7% and 87.3 ± 1.9%, the absolute distances were 2.0 ± 0.42 mm and 1.79 ± 0.45 mm, and the Hausdorff distances were 6.86 ± 1.71 mm and 7.02 ± 1.17 mm, respectively. The automatic segmentation method based on a sparse matrix transform and level set can provide a useful tool for quantitative cardiac imaging.
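
    The three validation measures used here are standard; a minimal sketch computing the Dice coefficient and the mean absolute and Hausdorff boundary distances for two binary masks with NumPy/SciPy (the masks and the pixel spacing below are invented):

        import numpy as np
        from scipy.ndimage import binary_erosion, distance_transform_edt

        def dice(a, b):
            """Dice overlap between two boolean masks."""
            a, b = a.astype(bool), b.astype(bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        def boundary_distances(a, b, spacing=1.0):
            """Mean absolute and Hausdorff distance between the boundaries of two masks."""
            def boundary(m):
                return m & ~binary_erosion(m)
            ba, bb = boundary(a.astype(bool)), boundary(b.astype(bool))
            # distance from each boundary pixel of one mask to the other mask's boundary
            da = distance_transform_edt(~bb, sampling=spacing)[ba]
            db = distance_transform_edt(~ba, sampling=spacing)[bb]
            d = np.concatenate([da, db])
            return d.mean(), d.max()

        if __name__ == "__main__":
            manual = np.zeros((64, 64), bool); manual[20:44, 20:44] = True
            auto = np.zeros((64, 64), bool); auto[22:46, 21:45] = True
            print(dice(manual, auto), boundary_distances(manual, auto, spacing=0.5))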

  6. Automatic segmentation of right ventricular ultrasound images using sparse matrix transform and a level set

    PubMed Central

    Qin, Xulei; Cong, Zhibin; Fei, Baowei

    2013-01-01

    An automatic segmentation framework is proposed to segment the right ventricle (RV) in echocardiographic images. The method can automatically segment both epicardial and endocardial boundaries from a continuous echocardiography series by combining sparse matrix transform, a training model, and a localized region-based level set. First, the sparse matrix transform extracts main motion regions of the myocardium as eigen-images by analyzing the statistical information of the images. Second, an RV training model is registered to the eigen-images in order to locate the position of the RV. Third, the training model is adjusted and then serves as an optimized initialization for the segmentation of each image. Finally, based on the initializations, a localized, region-based level set algorithm is applied to segment both epicardial and endocardial boundaries in each echocardiograph. Three evaluation methods were used to validate the performance of the segmentation framework. The Dice coefficient measures the overall agreement between the manual and automatic segmentation. The absolute distance and the Hausdorff distance between the boundaries from manual and automatic segmentation were used to measure the accuracy of the segmentation. Ultrasound images of human subjects were used for validation. For the epicardial and endocardial boundaries, the Dice coefficients were 90.8 ± 1.7% and 87.3 ± 1.9%, the absolute distances were 2.0 ± 0.42 mm and 1.79 ± 0.45 mm, and the Hausdorff distances were 6.86 ± 1.71 mm and 7.02 ± 1.17 mm, respectively. The automatic segmentation method based on a sparse matrix transform and level set can provide a useful tool for quantitative cardiac imaging. PMID:24107618

  7. Some free boundary problems in potential flow regime using a level set based method

    SciTech Connect

    Garzon, M.; Bobillo-Ares, N.; Sethian, J.A.

    2008-12-09

    Recent advances in the field of fluid mechanics with moving fronts are linked to the use of Level Set Methods, a versatile mathematical technique to follow free boundaries which undergo topological changes. A challenging class of problems in this context is those related to the solution of a partial differential equation posed on a moving domain, in which the boundary condition for the PDE solver has to be obtained from a partial differential equation defined on the front. This is the case of potential flow models with moving boundaries. Moreover, the fluid front will possibly be carrying some material substance which will diffuse in the front and be advected by the front velocity, as for example the use of surfactants to lower surface tension. We present a Level Set based methodology to embed these partial differential equations defined on the front in a complete Eulerian framework, fully avoiding the tracking of fluid particles and its known limitations. To show the advantages of this approach in the field of Fluid Mechanics we present in this work one particular application: the numerical approximation of a potential flow model to simulate the evolution and breaking of a solitary wave propagating over a sloping bottom, and compare the level set based algorithm with previous front tracking models.

  8. Distance regularized two level sets for segmentation of left and right ventricles from cine-MRI.

    PubMed

    Liu, Yu; Captur, Gabriella; Moon, James C; Guo, Shuxu; Yang, Xiaoping; Zhang, Shaoxiang; Li, Chunming

    2016-06-01

    This paper presents a new level set method for segmentation of cardiac left and right ventricles. We extend the edge based distance regularized level set evolution (DRLSE) model in Li et al. (2010) to a two-level-set formulation, with the 0-level set and k-level set representing the endocardium and epicardium, respectively. The extraction of endocardium and epicardium is obtained as a result of the interactive curve evolution of the 0 and k level sets derived from the proposed variational level set formulation. The initialization of the level set function in the proposed two-level-set DRLSE model is generated from roughly located endocardium, which can be performed by applying the original DRLSE model. Experimental results have demonstrated the effectiveness of the proposed two-level-set DRLSE model. PMID:26740057

  9. INSTITUTIONALIZING SAFEGUARDS-BY-DESIGN: HIGH-LEVEL FRAMEWORK

    SciTech Connect

    Trond Bjornard PhD; Joseph Alexander; Robert Bean; Brian Castle; Scott DeMuth, Ph.D.; Phillip Durst; Michael Ehinger; Prof. Michael Golay, Ph.D.; Kevin Hase, Ph.D.; David J. Hebditch, DPhil; John Hockert, Ph.D.; Bruce Meppen; James Morgan; Jerry Phillips, Ph.D., PE

    2009-02-01

    The application of a Safeguards-by-Design (SBD) process for new nuclear facilities can reduce proliferation risks. A multi-laboratory team was sponsored in Fiscal Year (FY) 2008 to define a SBD process and determine how it could be incorporated into existing facility design and construction processes. The possibility to significantly influence major design features, such as process selection and plant layout, largely ends with the conceptual design step. Therefore SBD’s principal focus must be on the early inclusion of safeguards requirements and the early identification of beneficial design features. The result could help form the basis for a new international norm for integrating safeguards into facility design. This is an interim report describing progress and project status as of the end of FY08. In this effort, SBD is defined as a structured approach to ensure the timely, efficient, and cost-effective integration of international and national safeguards, physical security, and other nonproliferation objectives into the overall design process for a nuclear facility. A key objective is to ensure that security and nonproliferation issues are considered when weighing facility design alternatives. Central to the work completed in FY08 was a study in which a SBD process was developed in the context of the current DOE facility acquisition process. The DOE study enabled the development of a “SBD design loop” that is suitable for use in any facility design process. It is a graded, iterative process that incorporates safeguards concerns throughout the conceptual, preliminary and final design processes. Additionally, a set of proposed design principles for SBD was developed. A “Generic SBD Process” was then developed. Key features of the process include the initiation of safeguards design activities in the pre-conceptual planning phase, early incorporation of safeguards requirements into the project requirements, early appointment of an SBD team, and

  10. Parallel Computation of the Topology of Level Sets

    SciTech Connect

    Pascucci, V; Cole-McLaughlin, K

    2004-12-16

    This paper introduces two efficient algorithms that compute the Contour Tree of a 3D scalar field F and its augmented version with the Betti numbers of each isosurface. The Contour Tree is a fundamental data structure in scientific visualization that is used to preprocess the domain mesh to allow optimal computation of isosurfaces with minimal overhead storage. The Contour Tree can also be used to build user interfaces reporting the complete topological characterization of a scalar field, as shown in Figure 1. Data exploration time is reduced since the user understands the evolution of level set components with changing isovalue. The Augmented Contour Tree provides even more accurate information, segmenting the range space of the scalar field into portions of invariant topology. The exploration time for a single isosurface is also improved since its genus is known in advance. Our first new algorithm augments any given Contour Tree with the Betti numbers of all possible corresponding isocontours in time linear in the size of the tree. Moreover, we show how to extend the scheme introduced in [3] with the Betti number computation without increasing its complexity. Thus, we improve the time complexity of our previous approach [10] from O(m log m) to O(n log n + m), where m is the number of cells and n is the number of vertices in the domain of F. Our second contribution is a new divide-and-conquer algorithm that computes the Augmented Contour Tree with improved efficiency. The approach computes the output Contour Tree by merging two intermediate Contour Trees and is independent of the interpolant. In this way we confine any knowledge regarding a specific interpolant to an independent function that computes the tree for a single cell. We have implemented this function for the trilinear interpolant and plan to replace it with higher order interpolants when needed. The time complexity is O(n + t log n), where t is the number of critical points of F. For the first time

  11. Setting background nutrient levels for coastal waters with oceanic influences

    NASA Astrophysics Data System (ADS)

    Smith, Alastair F.; Fryer, Rob J.; Webster, Lynda; Berx, Bee; Taylor, Alison; Walsham, Pamela; Turrell, William R.

    2014-05-01

    Nutrient enrichment of coastal water bodies as a result of human activities can lead to ecological changes. As part of a strategy to monitor such changes and detect potential eutrophication, samples were collected during research cruises conducted around the Scottish coast each January over the period 2007-2013. Data were obtained for total oxidised nitrogen (TOxN; nitrite and nitrate), phosphate and silicate, and incorporated into data-driven spatial models. Spatial averages in defined sea areas were calculated for each year in order to study inter-annual variability and systematic trends over time. Variation between some years was found to be significant (p < 0.05), but no evidence was found for any trends over the time period studied. This may have been due to the relatively short time series considered here. Modelled distributions were developed using data from groups of years (2007-2009, 2010-2011 and 2012-2013) and compared to the OSPAR Ecological Quality Objectives (EcoQOs) for dissolved inorganic nitrogen (DIN; the concentration of TOxN and ammonia), the ratio of DIN to dissolved inorganic phosphorus (N/P) and the ratio of DIN to dissolved silicate (N/S). In these three models, TOxN was below the offshore background concentration of 10 μM (12 μM at coastal locations) over more than 50% of the modelled area, while N/S exceeded the upper assessment criterion of 2 over more than 50% of the modelled area. In the 2007-2009 model, N/P was below the background ratio (16) over the entire modelled area. In the 2010-2011 model the N/P ratio exceeded the background in 91% of the modelled area but remained below the upper assessment criterion (24). Scottish shelf sea waters were found to be depleted in TOxN relative to oceanic waters. This was not accounted for in the development of background values for the OSPAR EcoQOs, so new estimates of these background values were derived. The implications of these results for setting reasonable background nutrient levels when

  12. A Relation Extraction Framework for Biomedical Text Using Hybrid Feature Set.

    PubMed

    Muzaffar, Abdul Wahab; Azam, Farooque; Qamar, Usman

    2015-01-01

    The information extraction from unstructured text segments is a complex task. Although manual information extraction often produces the best results, it is harder to manage biomedical data extraction manually because of the exponential increase in data size. Thus, there is a need for automatic tools and techniques for information extraction in biomedical text mining. Relation extraction is a significant area under biomedical information extraction that has gained much importance in the last two decades. A lot of work has been done on biomedical relation extraction focusing on rule-based and machine learning techniques. In the last decade, the focus has changed to hybrid approaches showing better results. This research presents a hybrid feature set for classification of relations between biomedical entities. The main contribution of this research is done in the semantic feature set where verb phrases are ranked using Unified Medical Language System (UMLS) and a ranking algorithm. Support Vector Machine and Naïve Bayes, the two effective machine learning techniques, are used to classify these relations. Our approach has been validated on the standard biomedical text corpus obtained from MEDLINE 2001. Conclusively, it can be articulated that our framework outperforms all state-of-the-art approaches used for relation extraction on the same corpus. PMID:26347797
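
    The final classification step, training Support Vector Machine and Naive Bayes models on the hybrid feature vectors, can be sketched with scikit-learn. The feature matrix and labels below are random placeholders rather than the MEDLINE 2001 corpus, and the UMLS-based semantic ranking is not reproduced:

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.naive_bayes import GaussianNB
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # placeholder hybrid feature vectors (word-level, syntactic and UMLS-ranked
        # semantic features would be concatenated here) and binary relation labels
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 30))
        y = rng.integers(0, 2, size=200)

        svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        nb = GaussianNB()

        for name, clf in [("SVM", svm), ("Naive Bayes", nb)]:
            scores = cross_val_score(clf, X, y, cv=5, scoring="f1")
            print(f"{name}: mean F1 = {scores.mean():.3f}")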

  13. A Relation Extraction Framework for Biomedical Text Using Hybrid Feature Set

    PubMed Central

    Muzaffar, Abdul Wahab; Azam, Farooque; Qamar, Usman

    2015-01-01

    The information extraction from unstructured text segments is a complex task. Although manual information extraction often produces the best results, it is harder to manage biomedical data extraction manually because of the exponential increase in data size. Thus, there is a need for automatic tools and techniques for information extraction in biomedical text mining. Relation extraction is a significant area under biomedical information extraction that has gained much importance in the last two decades. A lot of work has been done on biomedical relation extraction focusing on rule-based and machine learning techniques. In the last decade, the focus has changed to hybrid approaches showing better results. This research presents a hybrid feature set for classification of relations between biomedical entities. The main contribution of this research is done in the semantic feature set where verb phrases are ranked using Unified Medical Language System (UMLS) and a ranking algorithm. Support Vector Machine and Naïve Bayes, the two effective machine learning techniques, are used to classify these relations. Our approach has been validated on the standard biomedical text corpus obtained from MEDLINE 2001. Conclusively, it can be articulated that our framework outperforms all state-of-the-art approaches used for relation extraction on the same corpus. PMID:26347797

  14. A universal surface complexation framework for modeling proton binding onto bacterial surfaces in geologic settings

    USGS Publications Warehouse

    Borrok, D.; Turner, B.F.; Fein, J.B.

    2005-01-01

    Adsorption onto bacterial cell walls can significantly affect the speciation and mobility of aqueous metal cations in many geologic settings. However, a unified thermodynamic framework for describing bacterial adsorption reactions does not exist. This problem originates from the numerous approaches that have been chosen for modeling bacterial surface protonation reactions. In this study, we compile all currently available potentiometric titration datasets for individual bacterial species, bacterial consortia, and bacterial cell wall components. Using a consistent, four discrete site, non-electrostatic surface complexation model, we determine total functional group site densities for all suitable datasets, and present an averaged set of 'universal' thermodynamic proton binding and site density parameters for modeling bacterial adsorption reactions in geologic systems. Modeling results demonstrate that the total concentrations of proton-active functional group sites for the 36 bacterial species and consortia tested are remarkably similar, averaging 3.2 ± 1.0 (1σ) × 10^-4 moles/wet gram. Examination of the uncertainties involved in the development of proton-binding modeling parameters suggests that ignoring factors such as bacterial species, ionic strength, temperature, and growth conditions introduces relatively small error compared to the unavoidable uncertainty associated with the determination of cell abundances in realistic geologic systems. Hence, we propose that reasonable estimates of the extent of bacterial cell wall deprotonation can be made using averaged thermodynamic modeling parameters from all of the experiments that are considered in this study, regardless of bacterial species used, ionic strength, temperature, or growth condition of the experiment. The average site densities for the four discrete sites are 1.1 ± 0.7 × 10^-4, 9.1 ± 3.8 × 10^-5, 5.3 ± 2.1 × 10^-5, and 6.6 ± 3.0 × 10^-5 moles/wet gram bacteria for the sites with pKa values of 3
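
    For a non-electrostatic surface complexation model with discrete monoprotic sites, the deprotonated fraction of each site follows from mass action as Ka/(Ka + [H+]). A minimal sketch that sums the deprotonated site concentrations over four sites; the site densities echo the averages quoted above, but the pKa values are placeholders rather than the paper's fitted parameters:

        import numpy as np

        def deprotonated_concentration(pH, pKa, site_density):
            """Total deprotonated site concentration (moles per wet gram) for a
            non-electrostatic model with discrete monoprotic sites: for each site,
            fraction deprotonated = Ka / (Ka + [H+])."""
            h = 10.0 ** (-np.asarray(pH, dtype=float))
            ka = 10.0 ** (-np.asarray(pKa, dtype=float))
            frac = ka / (ka + h[..., None])                 # broadcast over pH values
            return (frac * np.asarray(site_density)).sum(axis=-1)

        if __name__ == "__main__":
            pKa = [3.0, 4.7, 6.6, 9.0]                      # placeholder acidity constants
            sites = [1.1e-4, 9.1e-5, 5.3e-5, 6.6e-5]        # moles / wet gram (averages above)
            for pH in (3.0, 5.0, 7.0, 9.0):
                print(pH, deprotonated_concentration(pH, pKa, sites))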

  15. Research on classified real-time flood forecasting framework based on K-means cluster and rough set.

    PubMed

    Xu, Wei; Peng, Yong

    2015-01-01

    This research presents a new classified real-time flood forecasting framework. In this framework, historical floods are classified by a K-means cluster according to the spatial and temporal distribution of precipitation, the time variance of precipitation intensity and other hydrological factors. Based on the classified results, a rough set is used to extract the identification rules for real-time flood forecasting. Then, the parameters of different categories within the conceptual hydrological model are calibrated using a genetic algorithm. In real-time forecasting, the corresponding category of parameters is selected for flood forecasting according to the obtained flood information. This research tests the new classified framework on Guanyinge Reservoir and compares the framework with the traditional flood forecasting method. It finds that the performance of the new classified framework is significantly better in terms of accuracy. Furthermore, the framework can be considered in a catchment with fewer historical floods. PMID:26442493
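
    The selection logic described above, grouping historical floods with a K-means cluster and then choosing the calibrated parameter set of the matching category for an incoming flood, can be sketched as follows. The flood descriptors and the parameter table are invented, and the rough-set rule extraction and genetic-algorithm calibration steps are not shown:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # placeholder descriptors of historical floods, e.g. areal-mean rainfall,
        # peak intensity, storm duration, antecedent soil moisture
        rng = np.random.default_rng(1)
        historical = rng.normal(size=(60, 4))

        scaler = StandardScaler().fit(historical)
        km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scaler.transform(historical))

        # one calibrated parameter set per flood category (placeholder values; in the
        # paper these come from genetic-algorithm calibration of the hydrological model)
        params_by_category = {0: {"CN": 70}, 1: {"CN": 80}, 2: {"CN": 90}}

        def select_parameters(new_flood_features):
            """Assign an incoming flood to a category and return that category's parameters."""
            cat = int(km.predict(scaler.transform(np.atleast_2d(new_flood_features)))[0])
            return cat, params_by_category[cat]

        print(select_parameters(rng.normal(size=4)))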

  16. Simultaneous segmentation and reconstruction: A level set method approach for limited view computed tomography

    SciTech Connect

    Yoon, Sungwon; Pineda, Angel R.; Fahrig, Rebecca

    2010-05-15

    Purpose: An iterative tomographic reconstruction algorithm that simultaneously segments and reconstructs the reconstruction domain is proposed and applied to tomographic reconstructions from a sparse number of projection images. Methods: The proposed algorithm uses a two-phase level set method segmentation in conjunction with an iterative tomographic reconstruction to achieve simultaneous segmentation and reconstruction. The simultaneous segmentation and reconstruction is achieved by alternating between level set function evolutions and per-region intensity value updates. To deal with the limited number of projections, a priori information about the reconstruction is enforced via a penalized likelihood function. Specifically, a smooth function within each region (a piecewise smooth function) and bounded intensity values for each region are assumed. Such a priori information is formulated into a quadratic objective function with linear bound constraints. The level set function evolutions are achieved by artificially time evolving the level set function in the negative gradient direction; the intensity value updates are achieved by using the gradient projection conjugate gradient algorithm. Results: The proposed simultaneous segmentation and reconstruction results were compared to "conventional" iterative reconstruction (with no segmentation), iterative reconstruction followed by segmentation, and filtered backprojection. Improvements of 6%-13% in the normalized root mean square error were observed when the proposed algorithm was applied to simulated projections of a numerical phantom and to real fan-beam projections of the Catphan phantom, both of which did not satisfy the a priori assumptions. Conclusions: The proposed simultaneous segmentation and reconstruction resulted in improved reconstruction image quality. The algorithm correctly segments the reconstruction space into regions, preserves sharp edges between different regions, and smoothes the noise

  17. Shared Investment Projects and Forecasting Errors: Setting Framework Conditions for Coordination and Sequencing Data Quality Activities

    PubMed Central

    Leitner, Stephan; Brauneis, Alexander; Rausch, Alexandra

    2015-01-01

    In this paper, we investigate the impact of inaccurate forecasting on the coordination of distributed investment decisions. In particular, by setting up a computational multi-agent model of a stylized firm, we investigate the case of investment opportunities that are mutually carried out by organizational departments. The forecasts of concern pertain to the initial amount of money necessary to launch and operate an investment opportunity, to the expected intertemporal distribution of cash flows, and the departments’ efficiency in operating the investment opportunity at hand. We propose a budget allocation mechanism for coordinating such distributed decisions. The paper provides guidance on how to set framework conditions, in terms of the number of investment opportunities considered in one round of funding and the number of departments operating one investment opportunity, so that the coordination mechanism is highly robust to forecasting errors. Furthermore, we show that—in some setups—a certain extent of misforecasting is desirable from the firm’s point of view as it supports the achievement of the corporate objective of value maximization. We then address the question of how to improve forecasting quality in the best possible way, and provide policy advice on how to sequence activities for improving forecasting quality so that the robustness of the coordination mechanism to errors increases in the best possible way. At the same time, we show that wrong decisions regarding the sequencing can lead to a decrease in robustness. Finally, we conduct a comprehensive sensitivity analysis and prove that—in particular for relatively good forecasters—most of our results are robust to changes in setting the parameters of our multi-agent simulation model. PMID:25803736

  18. Shared investment projects and forecasting errors: setting framework conditions for coordination and sequencing data quality activities.

    PubMed

    Leitner, Stephan; Brauneis, Alexander; Rausch, Alexandra

    2015-01-01

    In this paper, we investigate the impact of inaccurate forecasting on the coordination of distributed investment decisions. In particular, by setting up a computational multi-agent model of a stylized firm, we investigate the case of investment opportunities that are mutually carried out by organizational departments. The forecasts of concern pertain to the initial amount of money necessary to launch and operate an investment opportunity, to the expected intertemporal distribution of cash flows, and the departments' efficiency in operating the investment opportunity at hand. We propose a budget allocation mechanism for coordinating such distributed decisions. The paper provides guidance on how to set framework conditions, in terms of the number of investment opportunities considered in one round of funding and the number of departments operating one investment opportunity, so that the coordination mechanism is highly robust to forecasting errors. Furthermore, we show that-in some setups-a certain extent of misforecasting is desirable from the firm's point of view as it supports the achievement of the corporate objective of value maximization. We then address the question of how to improve forecasting quality in the best possible way, and provide policy advice on how to sequence activities for improving forecasting quality so that the robustness of the coordination mechanism to errors increases in the best possible way. At the same time, we show that wrong decisions regarding the sequencing can lead to a decrease in robustness. Finally, we conduct a comprehensive sensitivity analysis and prove that-in particular for relatively good forecasters-most of our results are robust to changes in setting the parameters of our multi-agent simulation model. PMID:25803736

  19. A Graphical Framework for Specification of Clinical Guidelines at Multiple Representation Levels

    PubMed Central

    Shalom, Erez; Shahar, Yuval

    2005-01-01

    Formalization of a clinical guideline for purposes of automated application and quality assessment mainly involves conversion of its free-text representation into a machine comprehensible representation, i.e., a formal language, thus enabling automated support. The main issues involved in this process are related to the collaboration between the expert physician and the knowledge engineer. We introduce GESHER - a graphical framework for specification of clinical guidelines at multiple representation levels. The GESHER architecture facilitates incremental specification through a set of views adapted to each representation level, enabling this process to proceed smoothly and in a transparent fashion, fostering extensive collaboration among the various types of users. The GESHER framework supports specification of guidelines at multiple representation levels, in more than one specification language, and uses the DeGeL digital guideline library architecture as its knowledge base. The GESHER architecture also uses a temporal abstraction knowledge base to store its declarative knowledge, and a standard medical-vocabularies server for generic specification of key terms, thus enabling reuse of the specification at multiple sites. PMID:16779126

  20. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. Methods The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the

  1. Measuring Afterschool Program Quality Using Setting-Level Observational Approaches

    ERIC Educational Resources Information Center

    Oh, Yoonkyung; Osgood, D. Wayne; Smith, Emilie P.

    2015-01-01

    The importance of afterschool hours for youth development is widely acknowledged, and afterschool settings have recently received increasing attention as an important venue for youth interventions, bringing a growing need for reliable and valid measures of afterschool quality. This study examined the extent to which the two observational tools,…

  2. Measuring afterschool program quality using setting-level observational approaches

    PubMed Central

    Oh, Yoonkyung; Osgood, D. Wayne; Smith, Emilie Phillips

    2016-01-01

    As the importance of afterschool hours for youth development is widely acknowledged, afterschool settings have recently received increasing attention as an important venue for youth interventions. A range of intervention programs have been in place, generally aiming at positive youth development through enhancing the quality of programs. A growing need has thus arisen for reliable and valid measures of afterschool quality. This study examined the extent to which the two observational tools, i.e., Caregiver Interaction Scales (CIS) and Promising Practices Rating Scales (PPRS), could serve as reliable and valid tools for assessing the various dimensions of afterschool setting quality. The study shows the potential promise of the instruments, on the one hand, and suggests future directions for improvement of measurement design and development of the field, on the other hand. In particular, our findings suggest the importance of addressing the effect of day-to-day fluctuations in observed afterschool quality. PMID:26819487

  3. Joint Infrared Target Recognition and Segmentation Using a Shape Manifold-Aware Level Set

    PubMed Central

    Yu, Liangjiang; Fan, Guoliang; Gong, Jiulu; Havlicek, Joseph P.

    2015-01-01

    We propose new techniques for joint recognition, segmentation and pose estimation of infrared (IR) targets. The problem is formulated in a probabilistic level set framework where a shape constrained generative model is used to provide a multi-class and multi-view shape prior and where the shape model involves a couplet of view and identity manifolds (CVIM). A level set energy function is then iteratively optimized under the shape constraints provided by the CVIM. Since both the view and identity variables are expressed explicitly in the objective function, this approach naturally accomplishes recognition, segmentation and pose estimation as joint products of the optimization process. For realistic target chips, we solve the resulting multi-modal optimization problem by adopting a particle swarm optimization (PSO) algorithm and then improve the computational efficiency by implementing a gradient-boosted PSO (GB-PSO). Evaluation was performed using the Military Sensing Information Analysis Center (SENSIAC) ATR database, and experimental results show that both of the PSO algorithms reduce the cost of shape matching during CVIM-based shape inference. Particularly, GB-PSO outperforms other recent ATR algorithms, which require intensive shape matching, either explicitly (with pre-segmentation) or implicitly (without pre-segmentation). PMID:25938202
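
    Particle swarm optimization itself is simple to sketch in generic form; the objective below is a placeholder rather than the CVIM shape-matching energy, and the gradient-boosted variant (GB-PSO) is not shown:

        import numpy as np

        def pso(objective, bounds, n_particles=30, n_iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
            """Minimize `objective` over a box with a basic particle swarm optimizer."""
            rng = np.random.default_rng(seed)
            lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
            x = rng.uniform(lo, hi, size=(n_particles, lo.size))
            v = np.zeros_like(x)
            pbest, pbest_val = x.copy(), np.apply_along_axis(objective, 1, x)
            g = pbest[np.argmin(pbest_val)].copy()
            for _ in range(n_iters):
                r1, r2 = rng.random(x.shape), rng.random(x.shape)
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)   # velocity update
                x = np.clip(x + v, lo, hi)                              # stay inside the box
                val = np.apply_along_axis(objective, 1, x)
                improved = val < pbest_val
                pbest[improved], pbest_val[improved] = x[improved], val[improved]
                g = pbest[np.argmin(pbest_val)].copy()                  # global best
            return g, pbest_val.min()

        if __name__ == "__main__":
            # placeholder objective; in the paper this would be the level-set energy
            # evaluated at a pose/identity hypothesis on the CVIM
            sphere = lambda p: float(np.sum((p - 0.3) ** 2))
            print(pso(sphere, bounds=([-1, -1, -1], [1, 1, 1])))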

  4. An Evaluation of the High Level Architecture (HLA) as a Framework for NASA Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reid, Michael R.; Powers, Edward I. (Technical Monitor)

    2000-01-01

    The High Level Architecture (HLA) is a current US Department of Defense and an industry (IEEE-1516) standard architecture for modeling and simulations. It provides a framework and set of functional rules and common interfaces for integrating separate and disparate simulators into a larger simulation. The goal of the HLA is to reduce software costs by facilitating the reuse of simulation components and by providing a runtime infrastructure to manage the simulations. In order to evaluate the applicability of the HLA as a technology for NASA space mission simulations, a Simulations Group at Goddard Space Flight Center (GSFC) conducted a study of the HLA and developed a simple prototype HLA-compliant space mission simulator. This paper summarizes the prototyping effort and discusses the potential usefulness of the HLA in the design and planning of future NASA space missions with a focus on risk mitigation and cost reduction.

  5. The adoption of the Reference Framework for diabetes care among primary care physicians in primary care settings: A cross-sectional study.

    PubMed

    Wong, Martin C S; Wang, Harry H X; Kwan, Mandy W M; Chan, Wai Man; Fan, Carmen K M; Liang, Miaoyin; Li, Shannon Ts; Fung, Franklin D H; Yeung, Ming Sze; Chan, David K L; Griffiths, Sian M

    2016-08-01

    The prevalence of diabetes mellitus has been increasing both globally and locally. Primary care physicians (PCPs) are in a privileged position to provide first contact and continuing care for diabetic patients. A territory-wide Reference Framework for Diabetes Care for Adults has been released by the Hong Kong Primary Care Office in 2010, with the aim to further enhance evidence-based and high quality care for diabetes in the primary care setting through wide adoption of the Reference Framework. A valid questionnaire survey was conducted among PCPs to evaluate the levels of, and the factors associated with, their adoption of the Reference Framework. A total of 414 completed surveys were received with a response rate of 13.0%. The average adoption score was 3.29 (SD 0.51) out of 4. Approximately 70% of PCPs highly adopted the Reference Framework in their routine practice. Binary logistic regression analysis showed that the PCPs' perceptions of the inclusion of sufficient local information (adjusted odds ratio [aOR] = 4.748, 95%CI 1.597-14.115, P = 0.005) and reduction of professional autonomy of PCPs (aOR = 1.859, 95%CI 1.013-3.411, P = 0.045) were more likely to influence their adoption level of the Reference Framework for diabetes care in daily practices. The overall level of guideline adoption was found to be relatively high among PCPs for adult diabetes in primary care settings. The adoption barriers identified in this study should be addressed in the continuous updating of the Reference Framework. Strategies need to be considered to enhance the guideline adoption and implementation capacity. PMID:27495018

  6. Structural engineering masters level education framework of knowledge for the needs of initial professional practice

    NASA Astrophysics Data System (ADS)

    Balogh, Zsuzsa Enriko

    For at least the last decade, engineering and civil engineering, along with structural engineering as a profession within civil engineering, have faced and continue to face an emerging need for "Raising the Bar" of preparedness of young engineers seeking to become practicing professional engineers. The present consensus of the civil engineering profession is that the increasing need for broad and in-depth knowledge should require the young structural engineers to have at least a Masters-Level education. This study focuses on the Masters-Level preparedness in the structural engineering area within the civil engineering field. It follows much of the methodology used in the American Society of Civil Engineers (ASCE) Body of Knowledge determination for civil engineering and extends this type of study to better define the portion of the young engineer's preparation beyond the undergraduate program for one specialty area of civil engineering. The objective of this research was to create a Framework of Knowledge for the young engineer which identifies and recognizes the needs of the profession, along with the profession's expectations of how those needs can be achieved in the graduate-level academic setting, in the practice environment, and through lifelong learning opportunities, with an emphasis on the initial five years of experience after completion of a Masters program in structural engineering. This study applied a modified Delphi method to obtain the critical information from members of the structural engineering profession. The results provide a Framework of Knowledge which will be useful to several groups seeking to better ensure the preparedness of the future young structural engineers at the Masters-Level.

  7. A novel framework for assessing metadata quality in epidemiological and public health research settings

    PubMed Central

    McMahon, Christiana; Denaxas, Spiros

    2016-01-01

    Metadata are critical in epidemiological and public health research. However, a lack of biomedical metadata quality frameworks and limited awareness of the implications of poor quality metadata renders data analyses problematic. In this study, we created and evaluated a novel framework to assess metadata quality of epidemiological and public health research datasets. We performed a literature review and surveyed stakeholders to enhance our understanding of biomedical metadata quality assessment. The review identified 11 studies and nine quality dimensions; none of which were specifically aimed at biomedical metadata. 96 individuals completed the survey; of those who submitted data, most only assessed metadata quality sometimes, and eight did not at all. Our framework has four sections: a) general information; b) tools and technologies; c) usability; and d) management and curation. We evaluated the framework using three test cases and sought expert feedback. The framework can assess biomedical metadata quality systematically and robustly. PMID:27570670

  8. High-level waste tank farm set point document

    SciTech Connect

    Anthony, J.A. III

    1995-01-15

    Setpoints for nuclear safety-related instrumentation are required for actions determined by the design authorization basis. Minimum requirements need to be established for assuring that setpoints are established and held within specified limits. This document establishes the controlling methodology for changing setpoints of all classifications. The instrumentation under consideration involves the transfer, storage, and volume reduction of radioactive liquid waste in the F- and H-Area High-Level Radioactive Waste Tank Farms. The setpoint document will encompass the PROCESS AREA listed in the Safety Analysis Report (SAR) (DPSTSA-200-10 Sup 18), which includes the diversion box HDB-8 facility. In addition to the PROCESS AREAS listed in the SAR, Building 299-H and the Effluent Transfer Facility (ETF) are also included in the scope.

  9. Telemedicine: what framework, what levels of proof, implementation rules.

    PubMed

    Zannad, Faiez; Maugendre, Philippe; Audry, Antoine; Avril, Carole; Blaise, Lucile; Blin, Olivier; Burnel, Philippe; Falise-Mirat, Béatrice; Girault, Danièle; Giri, Isabelle; Goehrs, Jean-Marie; Lassale, Catherine; Le Meur, Roland; Leurent, Pierre; Ratignier-Carbonneil, Christelle; Rossignol, Patrick; Satonnet, Evelyne; Simon, Pierre; Treluyer, Laurent

    2014-01-01

    The concept of telemedicine was formalised in France in the 2009 "Hospital, patients, health territories" (loi hôpital, patients, santé, territoire) law and the 2010 decree through which it was applied. Many experiments have been carried out and the regulatory institutions (Ministry, Regional Health Agency [Agence régionale de santé, ARS], French National Health Authority [Haute autorité de santé, HAS], etc.) have issued various guidance statements and recommendations on its organisation and on the expectations of its evaluation. With this background, the round table wanted to produce recommendations on different areas of medical telemonitoring (the role of telemonitoring, the regulatory system, the principles for assessment, methods of use and conditions for sustained and seamless deployment). Whilst many studies carried out on new medical telemonitoring approaches have led to the postulate that it offers benefit, both clinically and in terms of patient quality of life, more information is needed to demonstrate its impact on the organisation of healthcare and the associated medico-economic benefit (criteria, methods, resources). Similarly, contractual frameworks for deployment of telemonitoring do exist, although they are complicated and involve many different stakeholders (Director General of the Care Offering [Direction générale de l'offre de soins, DGOS], ARS, HAS, Agency for Shared Health Information Systems [Agence des systèmes d'information partagés de santé, ASIP], French National Data Protection Commission [Commission nationale informatique et libertés, CNIL], French National Medical Council [Conseil national de l'Ordre des médecins, CNOM], etc.) that would benefit from a shared approach and seamless exchange between the partners involved. The current challenge is also to define the conditions required to validate a stable economic model in order to promote organisational change. One topical issue is placing the emphasis on its evaluation and

  10. Investigating the Experience of Outdoor and Adventurous Project Work in an Educational Setting Using a Self-Determination Framework

    ERIC Educational Resources Information Center

    Sproule, John; Martindale, Russell; Wang, John; Allison, Peter; Nash, Christine; Gray, Shirley

    2013-01-01

    The purpose of this study was to carry out a preliminary investigation to explore the use of outdoor and adventurous project work (PW) within an educational setting. Specifically, differences between the PW and normal academic school experiences were examined using a self-determination theory framework integrated with a goal orientation and…

  11. A Conceptual Framework for Educational Design at Modular Level to Promote Transfer of Learning

    ERIC Educational Resources Information Center

    Botma, Yvonne; Van Rensburg, G. H.; Coetzee, I. M.; Heyns, T.

    2015-01-01

    Students bridge the theory-practice gap when they apply in practice what they have learned in class. A conceptual framework was developed that can serve as foundation to design for learning transfer at modular level. The framework is based on an adopted and adapted systemic model of transfer of learning, existing learning theories, constructive…

  12. Toppled television sets and head injuries in the pediatric population: a framework for prevention.

    PubMed

    Cusimano, Michael D; Parker, Nadine

    2016-01-01

    Injuries to children caused by falling televisions have become more frequent during the last decade. These injuries can be severe and even fatal and are likely to become even more common in the future as TVs increase in size and become more affordable. To formulate guidelines for the prevention of these injuries, the authors systematically reviewed the literature on injuries related to toppling televisions. The authors searched MEDLINE, PubMed, Embase, Scopus, CINAHL (Cumulative Index to Nursing and Allied Health Literature), Cochrane Library, and Google Scholar according to the Cochrane guidelines for all studies involving children 0-18 years of age who were injured by toppled TVs. Factors contributing to injury were categorized using Haddon's Matrix, and the public health approach was used as a framework for developing strategies to prevent these injuries. The vast majority (84%) of the injuries occurred in homes and more than three-fourths were unwitnessed by adult caregivers. The TVs were most commonly large and elevated off the ground. Dressers and other furniture not designed to support TVs were commonly involved in the TV-toppling incident. The case fatality rate varies widely, but almost all deaths reported (96%) were due to brain injuries. Toddlers between the ages of 1 and 3 years most frequently suffer injuries to the head and neck, and they are most likely to suffer severe injuries. Many of these injuries require brain imaging and neurosurgical intervention. Prevention of these injuries will require changes in TV design and legislation as well as increases in public education and awareness. Television-toppling injuries can be easily prevented; however, the rates of injury do not reflect a sufficient level of awareness, nor do they reflect an acceptable effort from an injury prevention perspective. PMID:26416669

  13. Bushmeat genetics: setting up a reference framework for the DNA typing of African forest bushmeat.

    PubMed

    Gaubert, Philippe; Njiokou, Flobert; Olayemi, Ayodeji; Pagani, Paolo; Dufour, Sylvain; Danquah, Emmanuel; Nutsuakor, Mac Elikem K; Ngua, Gabriel; Missoup, Alain-Didier; Tedesco, Pablo A; Dernat, Rémy; Antunes, Agostinho

    2015-05-01

    The bushmeat trade in tropical Africa represents illegal, unsustainable off-takes of millions of tons of wild game - mostly mammals - per year. We sequenced four mitochondrial gene fragments (cyt b, COI, 12S, 16S) in >300 bushmeat items representing nine mammalian orders and 59 morphological species from five western and central African countries (Guinea, Ghana, Nigeria, Cameroon and Equatorial Guinea). Our objectives were to assess the efficiency of cross-species PCR amplification and to evaluate the usefulness of our multilocus approach for reliable bushmeat species identification. We provide a straightforward amplification protocol using a single 'universal' primer pair per gene that generally yielded >90% PCR success rates across orders and was robust to different types of meat preprocessing and DNA extraction protocols. For taxonomic identification, we set up a decision pipeline combining similarity- and tree-based approaches with an assessment of taxonomic expertise and coverage of the GENBANK database. Our multilocus approach permitted us to: (i) adjust for existing taxonomic gaps in GENBANK databases, (ii) assign to the species level 67% of the morphological species hypotheses and (iii) successfully identify samples with uncertain taxonomic attribution (preprocessed carcasses and cryptic lineages). High levels of genetic polymorphism across genes and taxa, together with the excellent resolution observed among species-level clusters (neighbour-joining trees and Klee diagrams) advocate the usefulness of our markers for bushmeat DNA typing. We formalize our DNA typing decision pipeline through an expert-curated query database - DNA BUSHMEAT - that shall permit the automated identification of African forest bushmeat items. PMID:25264212

  14. A variational approach to multi-phase motion of gas, liquid and solid based on the level set method

    NASA Astrophysics Data System (ADS)

    Yokoi, Kensuke

    2009-07-01

    We propose a simple and robust numerical algorithm to deal with multi-phase motion of gas, liquid and solid based on the level set method [S. Osher, J.A. Sethian, Fronts propagating with curvature-dependent speed: Algorithms based on Hamilton-Jacobi formulations, J. Comput. Phys. 79 (1988) 12; M. Sussman, P. Smereka, S. Osher, A level set approach for capturing solutions to incompressible two-phase flow, J. Comput. Phys. 114 (1994) 146; J.A. Sethian, Level Set Methods and Fast Marching Methods, Cambridge University Press, 1999; S. Osher, R. Fedkiw, Level Set Methods and Dynamic Implicit Surfaces, Applied Mathematical Sciences, vol. 153, Springer, 2003]. In an Eulerian framework, to simulate interaction between a moving solid object and an interfacial flow, we need to define at least two functions (level set functions) to distinguish three materials. In such simulations, in general the two functions overlap and/or disagree due to numerical errors such as numerical diffusion. In this paper, we resolved the problem using the idea of the active contour model [M. Kass, A. Witkin, D. Terzopoulos, Snakes: active contour models, International Journal of Computer Vision 1 (1988) 321; V. Caselles, R. Kimmel, G. Sapiro, Geodesic active contours, International Journal of Computer Vision 22 (1997) 61; G. Sapiro, Geometric Partial Differential Equations and Image Analysis, Cambridge University Press, 2001; R. Kimmel, Numerical Geometry of Images: Theory, Algorithms, and Applications, Springer-Verlag, 2003] introduced in the field of image processing.

  15. Cooperative fuzzy games approach to setting target levels of ECs in quality function deployment.

    PubMed

    Yang, Zhihui; Chen, Yizeng; Yin, Yunqiang

    2014-01-01

    Quality function deployment (QFD) can provide a means of translating customer requirements (CRs) into engineering characteristics (ECs) for each stage of product development and production. The main objective of QFD-based product planning is to determine the target levels of ECs for a new product or service. QFD is a breakthrough tool which can effectively reduce the gap between CRs and a new product/service. Even though there are conflicts among some ECs, the objective of developing a new product is to maximize the overall customer satisfaction. Therefore, there may be room for cooperation among ECs. A cooperative game framework combined with fuzzy set theory is developed to determine the target levels of the ECs in QFD. The key to developing the model is the formulation of the bargaining function. In the proposed methodology, the players are viewed as the membership functions of ECs to formulate the bargaining function. The solution for the proposed model is Pareto-optimal. An illustrative example is presented to demonstrate the application and performance of the proposed approach. PMID:25097884
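
    As a rough illustration of the bargaining idea, the sketch below (Python) treats each EC's fuzzy membership function as a player and maximizes the product of memberships (a Nash-style bargaining objective) over normalized target levels coupled by a shared budget. The membership functions, the budget constraint and all numbers are hypothetical; the paper's actual formulation may differ.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical linear membership functions mu_j for three ECs, each mapping a
# normalized target level x_j in [0, 1] to a degree of satisfaction.
memberships = [lambda x: x, lambda x: x, lambda x: x]   # "higher is better"

def neg_log_product(x):
    # Nash-style bargaining objective: maximize prod_j mu_j(x_j),
    # i.e. minimize the negative sum of logarithms (with a small floor).
    vals = np.clip([mu(xj) for mu, xj in zip(memberships, x)], 1e-9, 1.0)
    return -np.sum(np.log(vals))

# Conflict between ECs is modelled here as a shared budget on target levels.
budget = {"type": "ineq", "fun": lambda x: 1.5 - np.sum(x)}
res = minimize(neg_log_product, x0=np.full(3, 0.3), method="SLSQP",
               bounds=[(0.0, 1.0)] * 3, constraints=[budget])
print("Illustrative target levels:", res.x.round(3))   # ~[0.5, 0.5, 0.5]
```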

  16. Cooperative Fuzzy Games Approach to Setting Target Levels of ECs in Quality Function Deployment

    PubMed Central

    Yang, Zhihui; Chen, Yizeng; Yin, Yunqiang

    2014-01-01

    Quality function deployment (QFD) can provide a means of translating customer requirements (CRs) into engineering characteristics (ECs) for each stage of product development and production. The main objective of QFD-based product planning is to determine the target levels of ECs for a new product or service. QFD is a breakthrough tool which can effectively reduce the gap between CRs and a new product/service. Even though there are conflicts among some ECs, the objective of developing a new product is to maximize the overall customer satisfaction. Therefore, there may be room for cooperation among ECs. A cooperative game framework combined with fuzzy set theory is developed to determine the target levels of the ECs in QFD. The key to developing the model is the formulation of the bargaining function. In the proposed methodology, the players are viewed as the membership functions of ECs to formulate the bargaining function. The solution for the proposed model is Pareto-optimal. An illustrative example is presented to demonstrate the application and performance of the proposed approach. PMID:25097884

  17. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    SciTech Connect

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-09-30

    We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy, experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters and pore descriptors. On average, PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. The DFT-predicted elastic properties, such as the minimum shear modulus and Young's modulus, can differ by an average of 3 and 9 GPa, respectively, for rigid MOFs such as those in the test set. Moreover, we found that the partial charges calculated by vdW-DF2 deviate the most from those of the other functionals, while there is no significant difference among the partial charges calculated by M06L, PBE, PW91, PBE-D2, and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.

  18. Segmentation of neonatal brain MR images using patch-driven level sets.

    PubMed

    Wang, Li; Shi, Feng; Li, Gang; Gao, Yaozong; Lin, Weili; Gilmore, John H; Shen, Dinggang

    2014-01-01

    The segmentation of neonatal brain MR image into white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF), is challenging due to the low spatial resolution, severe partial volume effect, high image noise, and dynamic myelination and maturation processes. Atlas-based methods have been widely used for guiding neonatal brain segmentation. Existing brain atlases were generally constructed by equally averaging all the aligned template images from a population. However, such population-based atlases might not be representative of a testing subject in the regions with high inter-subject variability and thus often lead to a low capability in guiding segmentation in those regions. Recently, patch-based sparse representation techniques have been proposed to effectively select the most relevant elements from a large group of candidates, which can be used to generate a subject-specific representation with rich local anatomical details for guiding the segmentation. Accordingly, in this paper, we propose a novel patch-driven level set method for the segmentation of neonatal brain MR images by taking advantage of sparse representation techniques. Specifically, we first build a subject-specific atlas from a library of aligned, manually segmented images by using sparse representation in a patch-based fashion. Then, the spatial consistency in the probability maps from the subject-specific atlas is further enforced by considering the similarities of a patch with its neighboring patches. Finally, the probability maps are integrated into a coupled level set framework for more accurate segmentation. The proposed method has been extensively evaluated on 20 training subjects using leave-one-out cross validation, and also on 132 additional testing subjects. Our method achieved a high accuracy of 0.919±0.008 for white matter and 0.901±0.005 for gray matter, respectively, measured by Dice ratio for the overlap between the automated and manual segmentations in the cortical region
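
    The patch-based sparse representation step described above can be sketched as follows: for each target patch, a sparse non-negative code over a library of template patches is estimated, and the code weights are used to fuse the library labels into a subject-specific tissue probability. The sketch (Python, using scikit-learn's Lasso as a generic sparse coder) is a simplification; the patch sizes, regularization strength and data below are hypothetical stand-ins, not the authors' pipeline.

```python
import numpy as np
from sklearn.linear_model import Lasso

def patch_probability(target_patch, library_patches, library_labels, alpha=0.01):
    """Estimate a tissue probability for the centre voxel of `target_patch`.

    library_patches : (n_patches, patch_size) intensity patches from aligned,
                      manually segmented templates.
    library_labels  : (n_patches,) label of each library patch's centre voxel
                      (e.g. 1 = white matter, 0 = not white matter).
    A sparse non-negative code selects the most relevant library patches; the
    probability is the code-weighted average of their labels.
    """
    lasso = Lasso(alpha=alpha, positive=True, max_iter=5000)
    lasso.fit(library_patches.T, target_patch)    # columns act as dictionary atoms
    w = lasso.coef_
    if w.sum() <= 0:
        return float(library_labels.mean())        # fall back to a plain average
    return float(np.dot(w, library_labels) / w.sum())

# Toy usage with random data (stand-ins for real 5x5x5 intensity patches)
rng = np.random.default_rng(0)
library = rng.normal(size=(200, 125))
labels = (rng.random(200) > 0.5).astype(float)
prob_wm = patch_probability(library[3] + 0.05 * rng.normal(size=125), library, labels)
print("P(WM) at this voxel ~", round(prob_wm, 2))
```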

  19. Novel multimodality segmentation using level sets and Jensen-Rényi divergence

    SciTech Connect

    Markel, Daniel; Zaidi, Habib; El Naqa, Issam

    2013-12-15

    Purpose: Positron emission tomography (PET) is playing an increasing role in radiotherapy treatment planning. However, despite progress, robust algorithms for PET and multimodal image segmentation are still lacking, especially if the algorithm were extended to image-guided and adaptive radiotherapy (IGART). This work presents a novel multimodality segmentation algorithm using the Jensen-Rényi divergence (JRD) to evolve the geometric level set contour. The algorithm offers improved noise tolerance which is particularly applicable to segmentation of regions found in PET and cone-beam computed tomography. Methods: A steepest gradient ascent optimization method is used in conjunction with the JRD and a level set active contour to iteratively evolve a contour to partition an image based on statistical divergence of the intensity histograms. The algorithm is evaluated using PET scans of pharyngolaryngeal squamous cell carcinoma with the corresponding histological reference. The multimodality extension of the algorithm is evaluated using 22 PET/CT scans of patients with lung carcinoma and a physical phantom scanned under varying image quality conditions. Results: The average concordance index (CI) of the JRD segmentation of the PET images was 0.56 with an average classification error of 65%. The segmentation of the lung carcinoma images had a maximum diameter relative error of 63%, 19.5%, and 14.8% when using CT, PET, and combined PET/CT images, respectively. The estimated maximal diameters of the gross tumor volume (GTV) showed a high correlation with the macroscopically determined maximal diameters, with an R² value of 0.85 and 0.88 using the PET and PET/CT images, respectively. Results from the physical phantom show that the JRD is more robust to image noise compared to mutual information and region growing. Conclusions: The JRD has shown improved noise tolerance compared to mutual information for the purpose of PET image segmentation. Presented is a flexible
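
    For reference, the Jensen-Rényi divergence of a set of distributions is commonly written as the Rényi entropy of their weighted mixture minus the weighted sum of their Rényi entropies; the contour then evolves (steepest ascent) to maximize this divergence between the intensity histograms inside and outside it. A minimal numpy sketch, with alpha in (0,1) so the divergence is non-negative; the histogram settings and toy data are arbitrary:

```python
import numpy as np

def renyi_entropy(p, alpha=0.5, eps=1e-12):
    """Rényi entropy H_alpha(p) = log(sum p_i^alpha) / (1 - alpha), alpha != 1."""
    p = np.asarray(p, dtype=float)
    p = p / (p.sum() + eps)
    return np.log((p ** alpha).sum() + eps) / (1.0 - alpha)

def jensen_renyi_divergence(hists, weights=None, alpha=0.5):
    """JRD in its commonly used form:
    H_alpha(sum_i w_i p_i) - sum_i w_i H_alpha(p_i)."""
    hists = [np.asarray(h, dtype=float) / max(h.sum(), 1e-12) for h in hists]
    if weights is None:
        weights = np.full(len(hists), 1.0 / len(hists))
    mixture = sum(w * h for w, h in zip(weights, hists))
    return renyi_entropy(mixture, alpha) - sum(
        w * renyi_entropy(h, alpha) for w, h in zip(weights, hists))

# Histograms of "inside" vs "outside" intensities of a toy image region
inside = np.histogram(np.random.normal(5, 1, 1000), bins=32, range=(0, 10))[0]
outside = np.histogram(np.random.normal(2, 1, 1000), bins=32, range=(0, 10))[0]
print("JRD =", round(jensen_renyi_divergence([inside, outside]), 4))
```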

  20. Multi-scale texture-based level-set segmentation of breast B-mode images.

    PubMed

    Lang, Itai; Sklair-Levy, Miri; Spitzer, Hedva

    2016-05-01

    Automatic segmentation of ultrasonographic breast lesions is very challenging, due to the lesions' spiculated nature and the variance in shape and texture of the B-mode ultrasound images. Many studies have tried to answer this challenge by applying a variety of computational methods including: Markov random field, artificial neural networks, and active contours and level-set techniques. These studies focused on creating an automatic contour, with maximal resemblance to a manual contour, delineated by a trained radiologist. In this study, we have developed an algorithm, designed to capture the spiculated boundary of the lesion by using the properties from the corresponding ultrasonic image. This is primarily achieved through a unique multi-scale texture identifier (inspired by visual system models) integrated in a level-set framework. The algorithm's performance has been evaluated quantitatively via contour-based and region-based error metrics. We compared the algorithm-generated contour to a manual contour delineated by an expert radiologist. In addition, we suggest here a new method for performance evaluation where corrections made by the radiologist replace the algorithm-generated (original) result in the correction zones. The resulting corrected contour is then compared to the original version. The evaluation showed: (1) Mean absolute error of 0.5 pixels between the original and the corrected contour; (2) Overlapping area of 99.2% between the lesion regions, obtained by the algorithm and the corrected contour. These results are significantly better than those previously reported. In addition, we have examined the potential of our segmentation results to contribute to the discrimination between malignant and benign lesions. PMID:27010737

  1. An explanatory framework of teachers' perceptions of a positive mealtime environment in a preschool setting.

    PubMed

    Mita, Satoko C; Gray, Samuel A; Goodell, L Suzanne

    2015-07-01

    Attending a preschool center may help preschoolers with growth and development that encourage a healthy lifestyle, including sound eating behaviors. Providing a positive mealtime environment (PME) may be one of the keys to fostering a child's healthy eating habits in the classroom. However, a specific definition of a PME, the components of a PME, or directions on how to create one have not been established. The purpose of this study, therefore, was to explore Head Start teachers' perceptions related to a PME and create a conceptual framework representing these perceptions. To achieve this purpose, researchers conducted 65 in-depth phone interviews with Head Start teachers around the US. Applying principles of grounded theory, researchers developed a conceptual framework depicting teachers' perceptions of PME, consisting of five key components: (1) the people (i.e., teachers, kitchen staff, parent volunteers, and children), (2) positive emotional tone (e.g., relaxed and happy), (3) rules, expectations, and routines (e.g., family-style mealtime), (4) operations of a PME (i.e., eating, socialization, and learning), and (5) both short- and long-term outcomes of a PME. With this PME framework, researchers may be able to enhance the effectiveness of nutrition interventions related to a PME, focusing on the factors in the conceptual framework as well as barriers associated with achieving these factors. PMID:25728886

  2. Validation of the Visitor and Resident Framework in an E-Book Setting

    ERIC Educational Resources Information Center

    Engelsmann, Hazel C.; Greifeneder, Elke; Lauridsen, Nikoline D.; Nielsen, Anja G.

    2014-01-01

    Introduction: By applying the visitor and resident framework on e-book usage, the article explores whether the concepts of a resident and a visitor can help to explain e-book use, and can help to gain a better insight into users' motivations for e-book use. Method: A questionnaire and semi-structured interviews were conducted with users of…

  3. Developing Individualized Education Programs for Children in Inclusive Settings: A Developmentally Appropriate Framework.

    ERIC Educational Resources Information Center

    Edmiaston, Rebecca; Dolezal, Val; Doolittle, Sharon; Erickson, Carol; Merritt, Sandy

    2000-01-01

    Presents a developmentally appropriate framework reflecting the constructivist orientation of early childhood education to guide development of IEP goals and objectives for young children with disabilities. Discusses problems teachers encounter with IEPs, including defining skills too narrowly, not considering the time factor, and isolating the…

  4. A conceptual framework for organizational readiness to implement nutrition and physical activity programs in early childhood education settings.

    PubMed

    Sharma, Shreela V; Upadhyaya, Mudita; Schober, Daniel J; Byrd-Williams, Courtney

    2014-01-01

    Across multiple sectors, organizational readiness predicts the success of program implementation. However, the factors influencing readiness of early childhood education (ECE) organizations for implementation of new nutrition and physical activity programs is poorly understood. This study presents a new conceptual framework to measure organizational readiness to implement nutrition and physical activity programs in ECE centers serving children aged 0 to 5 years. The framework was validated for consensus on relevance and generalizability by conducting focus groups; the participants were managers (16 directors and 2 assistant directors) of ECE centers. The framework theorizes that it is necessary to have "collective readiness," which takes into account such factors as resources, organizational operations, work culture, and the collective attitudes, motivation, beliefs, and intentions of ECE staff. Results of the focus groups demonstrated consensus on the relevance of proposed constructs across ECE settings. Including readiness measures during program planning and evaluation could inform implementation of ECE programs targeting nutrition and physical activity behaviors. PMID:25357258

  5. Cell segmentation using coupled level sets and graph-vertex coloring.

    PubMed

    Nath, Sumit K; Palaniappan, Kannappan; Bunyak, Filiz

    2006-01-01

    Current level-set based approaches for segmenting a large number of objects are computationally expensive since they require a unique level set per object (the N-level set paradigm), or ⌈log₂N⌉ level sets when using a multiphase interface tracking formulation. Incorporating energy-based coupling constraints to control the topological interactions between level sets further increases the computational cost to O(N²). We propose a new approach, with dramatic computational savings, that requires only four, or fewer, level sets for an arbitrary number of similar objects (like cells) using the Delaunay graph to capture spatial relationships. Even more significantly, the coupling constraints (energy-based and topological) are incorporated using just constant O(1) complexity. The explicit topological coupling constraint, based on predicting contour collisions between adjacent level sets, is developed to further prevent false merging or absorption of neighboring cells, and also reduce fragmentation during level set evolution. The proposed four-color level set algorithm is used to efficiently and accurately segment hundreds of individual epithelial cells within a moving monolayer sheet from time-lapse images of in vitro wound healing without any false merging of cells. PMID:17354879
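
    The spatial-relationship step can be sketched as follows: build the Delaunay graph of the cell centroids and color it so that neighbouring cells never share a level set. The sketch below (Python) uses a simple greedy coloring, which may occasionally need more than the four colors the paper's assignment guarantees; it illustrates the idea rather than reproducing the paper's algorithm.

```python
import numpy as np
from scipy.spatial import Delaunay

def delaunay_adjacency(points):
    """Neighbour sets of each cell centroid from its Delaunay triangulation."""
    tri = Delaunay(points)
    neighbours = {i: set() for i in range(len(points))}
    for simplex in tri.simplices:
        for a in simplex:
            for b in simplex:
                if a != b:
                    neighbours[a].add(b)
    return neighbours

def greedy_coloring(neighbours):
    """Assign each cell the smallest colour not used by its neighbours."""
    colour = {}
    for node in sorted(neighbours, key=lambda n: -len(neighbours[n])):
        used = {colour[m] for m in neighbours[node] if m in colour}
        c = 0
        while c in used:
            c += 1
        colour[node] = c
    return colour

centroids = np.random.default_rng(1).uniform(0, 512, size=(300, 2))
colours = greedy_coloring(delaunay_adjacency(centroids))
print("colours (level sets) used:", max(colours.values()) + 1)   # typically a handful
```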

  6. A novel level set model with automated initialization and controlling parameters for medical image segmentation.

    PubMed

    Liu, Qingyi; Jiang, Mingyan; Bai, Peirui; Yang, Guang

    2016-03-01

    In this paper, a level set model without the need of generating initial contour and setting controlling parameters manually is proposed for medical image segmentation. The contribution of this paper is mainly manifested in three points. First, we propose a novel adaptive mean shift clustering method based on global image information to guide the evolution of level set. By simple threshold processing, the results of mean shift clustering can automatically and speedily generate an initial contour of level set evolution. Second, we devise several new functions to estimate the controlling parameters of the level set evolution based on the clustering results and image characteristics. Third, the reaction diffusion method is adopted to supersede the distance regularization term of RSF-level set model, which can improve the accuracy and speed of segmentation effectively with less manual intervention. Experimental results demonstrate the performance and efficiency of the proposed model for medical image segmentation. PMID:26748038
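
    A minimal sketch of the initialization idea: cluster pixel intensities (here with scikit-learn's standard MeanShift rather than the paper's adaptive variant), threshold the cluster map into an initial foreground mask, and convert it to a signed-distance level set with distance transforms. The bandwidth, the "brightest cluster is foreground" rule and the toy image are assumptions for illustration only.

```python
import numpy as np
from sklearn.cluster import MeanShift
from scipy.ndimage import distance_transform_edt

def initial_level_set(image, bandwidth=None):
    """Rough initial signed-distance level set from mean-shift clustering.

    Pixels are clustered by intensity; the brightest cluster is used as the
    initial foreground mask (a simplification of the adaptive scheme in the
    paper), and phi is its signed distance function (negative inside).
    """
    flat = image.reshape(-1, 1).astype(float)
    ms = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit(flat)
    labels = ms.labels_.reshape(image.shape)
    brightest = np.argmax(ms.cluster_centers_.ravel())
    mask = labels == brightest
    return distance_transform_edt(~mask) - distance_transform_edt(mask)

# Toy image: a bright disc on a dark, noisy background
yy, xx = np.mgrid[0:96, 0:96]
img = 50.0 + 150.0 * (np.hypot(xx - 48, yy - 48) < 20) + np.random.normal(0, 5, (96, 96))
phi0 = initial_level_set(img, bandwidth=30.0)
print("initial foreground pixels:", int((phi0 < 0).sum()))
```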

  7. Screening Systems and Decision Making at the Preschool Level: Application of a Comprehensive Validity Framework

    ERIC Educational Resources Information Center

    Kettler, Ryan J.; Feeney-Kettler, Kelly A.

    2011-01-01

    Universal screening is designed to be an efficient method for identifying preschool students with mental health problems, but prior to use, screening systems must be evaluated to determine their appropriateness within a specific setting. In this article, an evidence-based validity framework is applied to four screening systems for identifying…

  8. Conceptual Framework and Levels of Abstraction for a Complex Large-Scale System

    SciTech Connect

    Simpson, Mary J.

    2005-03-23

    A conceptual framework and levels of abstraction are created to apply across all potential threats. Bioterrorism is used as a complex example to describe the general framework. Bioterrorism is unlimited with respect to the use of a specific agent, mode of dissemination, and potential target. Because the threat is open-ended, there is a strong need for a common, systemic understanding of attack scenarios related to bioterrorism. In recognition of this large-scale complex problem, systems are being created to define, design and use the proper level of abstraction and conceptual framework in bioterrorism. The wide variety of biological agents and delivery mechanisms provide an opportunity for dynamic scale changes by the linking or interlinking of existing threat components. Concurrent impacts must be separated and evaluated in terms of a given environment and/or ‘abstraction framework.’

  9. Multireference Level Set for the Characterization of Nuclear Morphology in Glioblastoma Multiforme

    PubMed Central

    Han, Ju; Spellman, Paul T.

    2013-01-01

    Histological tissue sections provide rich information and continue to be the gold standard for the assessment of tissue neoplasm. However, there are a significant amount of technical and biological variations that impede analysis of large histological datasets. In this paper, we have proposed a novel approach for nuclear segmentation in tumor histology sections, which addresses the problem of technical and biological variations by incorporating information from both manually annotated reference patches and the original image. Subsequently, the solution is formulated within a multireference level set framework. This approach has been validated on manually annotated samples and then applied to the TCGA glioblastoma multiforme (GBM) dataset consisting of 440 whole mount tissue sections scanned with either a 20× or 40× objective, in which, each tissue section varies in size from 40k × 40k pixels to 100k × 100k pixels. Experimental results show a superior performance of the proposed method in comparison with present state of art techniques. PMID:22987497

  10. Alternative Frameworks of the Secondary School Students on the Concept of Condensation at Submicroscopic Level

    ERIC Educational Resources Information Center

    Abdullah, Nurdiana; Surif, Johari; Ismail, Syuhaida

    2016-01-01

    The study was carried out to identify the alternative frameworks on the concept of condensation at submicroscopic level among secondary school students (N = 324). Data was collected by using the qualitative method through the Understanding Test on the Concept of Matter at Submicroscopic Level which consisted of 10 open-ended questions. The…

  11. The Agenda Setting Function of the Mass Media at Three Levels of "Information Holding"

    ERIC Educational Resources Information Center

    Benton, Marc; Frazier, P. Jean

    1976-01-01

    Extends the theoretical concept of agenda setting to include awareness of general issues, awareness of proposed solutions, and specific knowledge about the proposals. Examines whether or not agenda setting is operative at these levels and compares findings with previous agenda setting studies. (MH)

  12. A new framework for intrusion detection based on rough set theory

    NASA Astrophysics Data System (ADS)

    Li, Zhijun; Wu, Yu; Wang, Guoyin; Hai, Yongjun; He, Yunpeng

    2004-04-01

    Intrusion detection is an essential component of critical infrastructure protection mechanism. Since many current IDSs are constructed by manual encoding of expert knowledge, it is time-consuming to update their knowledge. In order to solve this problem, an effective method for misuse intrusion detection with low cost and high efficiency is presented. This paper gives an overview of our research in building a detection model for identifying known intrusions, their variations and novel attacks with unknown natures. The method is based on rough set theory and capable of extracting a set of detection rules from network packet features. After getting a decision table through preprocessing raw packet data, rough-set-based reduction and rule generation algorithms are applied, and useful rules for intrusion detection are obtained. In addition, a rough set and rule-tree-based incremental knowledge acquisition algorithm is presented in order to solve problems of updating rule set when new attacks appear. Compared with other methods, our method requires a smaller size of training data set and less effort to collect training data. Experimental results demonstrate that our system is effective and more suitable for online intrusion detection.
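
    The core rough-set operation behind such rule extraction can be sketched as follows: group the (discretized) decision table by the condition attributes and keep the equivalence classes that lie entirely inside the "attack" decision class (its lower approximation); each such class yields a certain detection rule. The toy table, attribute names and values below are made up, and attribute reduction is omitted.

```python
import pandas as pd

# A tiny, made-up discretized decision table of packet features.
table = pd.DataFrame({
    "protocol": ["tcp", "tcp", "udp", "tcp", "udp", "tcp"],
    "flag":     ["SYN", "SYN", "---", "ACK", "---", "SYN"],
    "dst_port": ["low", "low", "high", "high", "high", "low"],
    "decision": ["attack", "attack", "normal", "normal", "normal", "attack"],
})
conditions = ["protocol", "flag", "dst_port"]

def certain_rules(df, conditions, decision="decision", target="attack"):
    """Rules from equivalence classes lying entirely inside the target class
    (the rough-set lower approximation of that class)."""
    rules = []
    for values, group in df.groupby(conditions):
        if (group[decision] == target).all():
            rules.append({**dict(zip(conditions, values)), "=>": target})
    return rules

for rule in certain_rules(table, conditions):
    print(rule)
# e.g. {'protocol': 'tcp', 'flag': 'SYN', 'dst_port': 'low', '=>': 'attack'}
```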

  13. 76 FR 9004 - Public Comment on Setting Achievement Levels in Writing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-16

    ...The National Assessment Governing Board (Governing Board) is soliciting public comments and recommendations to improve the design proposed for setting achievement levels for NAEP in writing. This notice provides opportunity for public comment and submitting recommendations for improving the design proposed for setting achievement levels for the 2011 National Assessment of Educational Progress......

  14. Joint Target Tracking, Recognition and Segmentation for Infrared Imagery Using a Shape Manifold-Based Level Set

    PubMed Central

    Gong, Jiulu; Fan, Guoliang; Yu, Liangjiang; Havlicek, Joseph P.; Chen, Derong; Fan, Ningjun

    2014-01-01

    We propose a new integrated target tracking, recognition and segmentation algorithm, called ATR-Seg, for infrared imagery. ATR-Seg is formulated in a probabilistic shape-aware level set framework that incorporates a joint view-identity manifold (JVIM) for target shape modeling. As a shape generative model, JVIM features a unified manifold structure in the latent space that is embedded with one view-independent identity manifold and infinite identity-dependent view manifolds. In the ATR-Seg algorithm, the ATR problem is formulated as a sequential level-set optimization process over the latent space of JVIM, so that tracking and recognition can be jointly optimized via implicit shape matching where target segmentation is achieved as a by-product without any pre-processing or feature extraction. Experimental results on the recently released SENSIAC ATR database demonstrate the advantages and effectiveness of ATR-Seg over two recent ATR algorithms that involve explicit shape matching. PMID:24919014

  15. A level set approach for left ventricle detection in CT images using shape segmentation and optical flow

    NASA Astrophysics Data System (ADS)

    Brieva, Jorge; Moya-Albor, Ernesto; Escalante-Ramírez, Boris

    2015-01-01

    The left ventricle (LV) segmentation plays an important role in a subsequent process for the functional analysis of the LV. Typical segmentation of the endocardium wall in the ventricle excludes papillary muscles which leads to an incorrect measure of the ejected volume in the LV. In this paper we present a new variational strategy using a 2D level set framework that includes a local term for enhancing the low contrast structures and a 2D shape model. The shape model in the level set method is propagated to all image sequences corresponding to the cardiac cycles through the optical flow approach using the Hermite transform. To evaluate our strategy we use the Dice index and the Hausdorff distance to compare the segmentation results with the manual segmentation carried out by the physician.

  16. Intervention complexity--a conceptual framework to inform priority-setting in health.

    PubMed Central

    Gericke, Christian A.; Kurowski, Christoph; Ranson, M. Kent; Mills, Anne

    2005-01-01

    Health interventions vary substantially in the degree of effort required to implement them. To some extent this is apparent in their financial cost, but the nature and availability of non-financial resources is often of similar importance. In particular, human resource requirements are frequently a major constraint. We propose a conceptual framework for the analysis of interventions according to their degree of technical complexity; this complements the notion of institutional capacity in considering the feasibility of implementing an intervention. Interventions are categorized into four dimensions: characteristics of the basic intervention; characteristics of delivery; requirements on government capacity; and usage characteristics. The analysis of intervention complexity should lead to a better understanding of supply- and demand-side constraints to scaling up, indicate priorities for further research and development, and can point to potential areas for improvement of specific aspects of each intervention to close the gap between the complexity of an intervention and the capacity to implement it. The framework is illustrated using the examples of scaling up condom social marketing programmes, and the DOTS strategy for tuberculosis control in highly resource-constrained countries. The framework could be used as a tool for policy-makers, planners and programme managers when considering the expansion of existing projects or the introduction of new interventions. Intervention complexity thus complements the considerations of burden of disease, cost-effectiveness, affordability and political feasibility in health policy decision-making. Reducing the technical complexity of interventions will be crucial to meeting the health-related Millennium Development Goals. PMID:15868020

  17. An improved variational level set method for MR image segmentation and bias field correction.

    PubMed

    Zhan, Tianming; Zhang, Jun; Xiao, Liang; Chen, Yunjie; Wei, Zhihui

    2013-04-01

    In this paper, we propose an improved variational level set approach to correct the bias and to segment the magnetic resonance (MR) images with inhomogeneous intensity. First, we use a Gaussian distribution with bias field as a local region descriptor in two-phase level set formulation for segmentation and bias field correction of the images with inhomogeneous intensities. By using the information of the local variance in this descriptor, our method is able to obtain accurate segmentation results. Furthermore, we extend this method to three-phase level set formulation for brain MR image segmentation and bias field correction. By using this three-phase level set function to replace the four-phase level set function, we can reduce the number of convolution operations in each iteration and improve the efficiency. Compared with other approaches, this algorithm demonstrates a superior performance. PMID:23219273

  18. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    DOE PAGESBeta

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-09-30

    We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy, experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters and pore descriptors. On average, PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. The DFT-predicted elastic properties, such as the minimum shear modulus and Young's modulus, can differ by an average of 3 and 9 GPa, respectively, for rigid MOFs such as those in the test set. Moreover, we found that the partial charges calculated by vdW-DF2 deviate the most from those of the other functionals, while there is no significant difference among the partial charges calculated by M06L, PBE, PW91, PBE-D2, and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.

  19. Education leadership in the clinical health care setting: A framework for nursing education development.

    PubMed

    Mockett, Lynda; Horsfall, Janine; O'Callaghan, Wendy

    2006-12-01

    This paper describes how a new framework for clinical nursing education was introduced at Counties Manukau District Health Board (CMDHB), New Zealand. The project was initiated in response to the significant legislative and post registration nursing education changes within New Zealand. The journey of change has been a significant undertaking, and has required clear management, strong leadership, perseverance and understanding of the organisation's culture. The approach taken to managing the change had four stages, and reflects various change management models. The first stage, the identification process, identified the impetus for change. Creating the vision is the second stage and identified what the change would look like within the organisation. To ensure success and to guide the process of change a realistic and sustainable vision was developed. Implementing the vision was the third stage, and discusses the communication and pilot phase of implementing the nursing education framework. Stage four, embedding the vision, explores the process and experiences of changing an education culture and embedding the vision into an organisation. The paper concludes by discussing the importance of implementing robust, consistent, strategic and collaborative processes - that reflect and evaluate best educational nursing practice. PMID:19040908

  20. Education leadership in the clinical health care setting: a framework for nursing education development.

    PubMed

    Mockett, Lynda; Horsfall, Janine; O'Callaghan, Wendy

    2006-12-01

    This paper describes how a new framework for clinical nursing education was introduced at Counties Manukau District Health Board (CMDHB), New Zealand. The project was initiated in response to the significant legislative and post registration nursing education changes within New Zealand. The journey of change has been a significant undertaking, and has required clear management, strong leadership, perseverance and understanding of the organisation's culture. The approach taken to managing the change had four stages, and reflects various change management models. The first stage, the identification process, identified the impetus for change. Creating the vision is the second stage and identified what the change would look like within the organisation. To ensure success and to guide the process of change a realistic and sustainable vision was developed. Implementing the vision was the third stage, and discusses the communication and pilot phase of implementing the nursing education framework. Stage four, embedding the vision, explores the process and experiences of changing an education culture and embedding the vision into an organisation. The paper concludes by discussing the importance of implementing robust, consistent, strategic and collaborative processes--that reflect and evaluate best educational nursing practice. PMID:17028073

  1. Holocene sea level variations on the basis of integration of independent data sets

    SciTech Connect

    Sahagian, D.; Berkman, P. . Dept. of Geological Sciences and Byrd Polar Research Center)

    1992-01-01

    Variations in sea level through earth history have occurred at a wide variety of time scales. Sea level researchers have attacked the problem of measuring these sea level changes through a variety of approaches, each relevant only to the time scale in question, and usually only relevant to the specific locality from which a specific type of data is derived. There is a plethora of different data types that can be and have been used (locally) for the measurement of Holocene sea level variations. The problem of merging different data sets for the purpose of constructing a global eustatic sea level curve for the Holocene has not previously been adequately addressed. The authors direct their efforts to that end. Numerous studies have been published regarding Holocene sea level changes. These have involved exposed fossil reef elevations, elevation of tidal deltas, elevation of depth of intertidal peat deposits, caves, tree rings, ice cores, moraines, eolian dune ridges, marine-cut terrace elevations, marine carbonate species, tide gauges, and lake level variations. Each of these data sets is based on a particular set of assumptions, and is valid for a specific set of environments. In order to obtain the most accurate possible sea level curve for the Holocene, these data sets must be merged so that local and other influences can be filtered out of each data set. Since each data set involves very different measurements, each is scaled in order to define the sensitivity of the proxy measurement parameter to sea level, including error bounds. This effectively determines the temporal and spatial resolution of each data set. The level of independence of data sets is also quantified, in order to rule out the possibility of a common non-eustatic factor affecting more than one variety of data. The Holocene sea level curve is considered to be independent of other factors affecting the proxy data, and is taken to represent the relation between global ocean water and basin volumes.

  2. Alternative Dispute Resolution (ADR): A Different Framework for Conflict Resolution in Educational Settings.

    ERIC Educational Resources Information Center

    Turan, Selahattin; Taylor, Charles

    This paper briefly introduces alternative dispute resolution (ADR) processes and their fundamental principles. The paper provides a review of the literature on ADR and discusses its applicability in educational settings. The concept of conflict is explained, along with analysis of the limitations of traditional conflict resolution processes. The…

  3. Intellectual Curiosity in Action: A Framework to Assess First-Year Seminars in Liberal Arts Settings

    ERIC Educational Resources Information Center

    Kolb, Kenneth H.; Longest, Kyle C.; Barnett, Jenna C.

    2014-01-01

    Fostering students' intellectual curiosity is a common goal of first-year seminar programs--especially in liberal arts settings. The authors propose an alternative method to assess this ambiguous, value-laden concept. Relying on data gathered from pre- and posttest in-depth interviews of 34 students enrolled in first-year seminars, they…

  4. Evidence-Based Standard Setting: Establishing a Validity Framework for Cut Scores

    ERIC Educational Resources Information Center

    McClarty, Katie Larsen; Way, Walter D.; Porter, Andrew C.; Beimers, Jennifer N.; Miles, Julie A.

    2013-01-01

    Performance standards are a powerful way to communicate K-12 student achievement (e.g., proficiency) and are the cornerstone of standards-based reform. As education reform shifts the focus to college and career readiness, approaches for setting performance standards need to be revised. We argue that the focus on assessing student readiness can…

  5. Commentary: A Response to Reckase's Conceptual Framework and Examples for Evaluating Standard Setting Methods

    ERIC Educational Resources Information Center

    Schulz, E. Matthew

    2006-01-01

    A look at real data shows that Reckase's psychometric theory for standard setting is not applicable to bookmark and that his simulations cannot explain actual differences between methods. It is suggested that exclusively test-centered, criterion-referenced approaches are too idealized and that a psychophysics paradigm and a theory of group…

  6. Translating evidence into practice: Hong Kong Reference Framework for Preventive Care for Children in Primary Care Settings.

    PubMed

    Siu, Natalie P Y; Too, L C; Tsang, Caroline S H; Young, Betty W Y

    2015-06-01

    There is increasing evidence that supports the close relationship between childhood and adult health. Fostering healthy growth and development of children deserves attention and effort. The Reference Framework for Preventive Care for Children in Primary Care Settings has been published by the Task Force on Conceptual Model and Preventive Protocols under the direction of the Working Group on Primary Care. It aims to promote health and prevent disease in children and is based on the latest research, and contributions of the Clinical Advisory Group that comprises primary care physicians, paediatricians, allied health professionals, and patient groups. This article highlights the comprehensive, continuing, and patient-centred preventive care for children and discusses how primary care physicians can incorporate the evidence-based recommendations into clinical practice. It is anticipated that the adoption of this framework will contribute to improved health and wellbeing of children. PMID:25999033

  7. Level set based vertebra segmentation for the evaluation of Ankylosing Spondylitis

    NASA Astrophysics Data System (ADS)

    Tan, Sovira; Yao, Jianhua; Ward, Michael M.; Yao, Lawrence; Summers, Ronald M.

    2006-03-01

    Ankylosing Spondylitis is a disease of the vertebra where abnormal bone structures (syndesmophytes) grow at intervertebral disk spaces. Because this growth is so slow as to be undetectable on plain radiographs taken over years, it is necessary to resort to computerized techniques to complement qualitative human judgment with precise quantitative measures on 3-D CT images. Very fine segmentation of the vertebral body is required to capture the small structures caused by the pathology. We propose a segmentation algorithm based on a cascade of three level set stages and requiring no training or prior knowledge. First, the noise inside the vertebral body that often blocks the proper evolution of level set surfaces is attenuated by a sigmoid function whose parameters are determined automatically. The 1st level set (geodesic active contour) is designed to roughly segment the interior of the vertebra despite often highly inhomogeneous and even discontinuous boundaries. The result is used as an initial contour for the 2nd level set (Laplacian level set) that closely captures the inner boundary of the cortical bone. The last level set (reversed Laplacian level set) segments the outer boundary of the cortical bone and also corrects small flaws of the previous stage. We carried out extensive tests on 30 vertebrae (5 from each of 6 patients). Two medical experts scored the results at intervertebral disk spaces focusing on end plates and syndesmophytes. Only two minor segmentation errors at vertebral end plates were reported and two syndesmophytes were considered slightly under-segmented.
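
    The sigmoid attenuation step can be sketched as a soft intensity window applied before the first (geodesic active contour) stage. In the sketch below (Python), the window parameters are taken from simple percentile statistics as a stand-in for the paper's automatic estimation, and the CT data is simulated.

```python
import numpy as np

def sigmoid_attenuation(image, center=None, width=None):
    """Soft intensity window: values well below `center` are pushed towards 0,
    values well above towards 1, flattening noise inside the vertebral body.

    The centre/width defaults below are crude percentile-based guesses, not the
    paper's automatically estimated parameters.
    """
    img = image.astype(float)
    if center is None:
        center = np.percentile(img, 75)
    if width is None:
        width = max(np.percentile(img, 90) - np.percentile(img, 50), 1e-6)
    return 1.0 / (1.0 + np.exp(-(img - center) / (0.25 * width)))

# The attenuated image can then drive an edge indicator, e.g. g = 1 / (1 + |grad|^2),
# for the first (geodesic active contour) level set stage.
ct_slice = np.random.normal(300, 80, size=(128, 128))   # stand-in for CT data
g_input = sigmoid_attenuation(ct_slice)
print(g_input.min().round(3), g_input.max().round(3))
```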

  8. A framework for outcome-level evaluation of in-service training of health care workers

    PubMed Central

    2013-01-01

    Background: In-service training is a key strategic approach to addressing the severe shortage of health care workers in many countries. However, there is a lack of evidence linking these health care worker trainings to improved health outcomes. In response, the United States President’s Emergency Plan for AIDS Relief’s Human Resources for Health Technical Working Group initiated a project to develop an outcome-focused training evaluation framework. This paper presents the methods and results of that project. Methods: A general inductive methodology was used for the conceptualization and development of the framework. Fifteen key informant interviews were conducted to explore contextual factors, perceived needs, barriers and facilitators affecting the evaluation of training outcomes. In addition, a thematic analysis of 70 published articles reporting health care worker training outcomes identified key themes and categories. These were integrated, synthesized and compared to several existing training evaluation models. This formed an overall typology which was used to draft a new framework. Finally, the framework was refined and validated through an iterative process of feedback, pilot testing and revision. Results: The inductive process resulted in identification of themes and categories, as well as relationships among several levels and types of outcomes. The resulting framework includes nine distinct types of outcomes that can be evaluated, which are organized within three nested levels: individual, organizational and health system/population. The outcome types are: (1) individual knowledge, attitudes and skills; (2) individual performance; (3) individual patient health; (4) organizational systems; (5) organizational performance; (6) organizational-level patient health; (7) health systems; (8) population-level performance; and (9) population-level health. The framework also addresses contextual factors which may influence the outcomes of training, as well as the

  9. A framework for evaluating safety-net and other community-level factors on access for low-income populations.

    PubMed

    Davidson, Pamela L; Andersen, Ronald M; Wyn, Roberta; Brown, E Richard

    2004-01-01

    The framework presented in this article extends the Andersen behavioral model of health services utilization research to examine the effects of contextual determinants of access. A conceptual framework is suggested for selecting and constructing contextual (or community-level) variables representing the social, economic, structural, and public policy environment that influence low-income people's use of medical care. Contextual variables capture the characteristics of the population that disproportionately relies on the health care safety net, the public policy support for low-income and safety-net populations, and the structure of the health care market and safety-net services within that market. Until recently, the literature in this area has been largely qualitative and descriptive and few multivariate studies comprehensively investigated the contextual determinants of access. The comprehensive and systematic approach suggested by the framework will enable researchers to strengthen the external validity of results by accounting for the influence of a consistent set of contextual factors across locations and populations. A subsequent article in this issue of Inquiry applies the framework to examine access to ambulatory care for low-income adults, both insured and uninsured. PMID:15224958

  10. Level set discrete element method for three-dimensional computations with triaxial case study

    NASA Astrophysics Data System (ADS)

    Kawamoto, Reid; Andò, Edward; Viggiani, Gioacchino; Andrade, José E.

    2016-06-01

    In this paper, we outline the level set discrete element method (LS-DEM) which is a discrete element method variant able to simulate systems of particles with arbitrary shape using level set functions as a geometric basis. This unique formulation allows seamless interfacing with level set-based characterization methods as well as computational ease in contact calculations. We then apply LS-DEM to simulate two virtual triaxial specimens generated from XRCT images of experiments and demonstrate LS-DEM's ability to quantitatively capture and predict stress-strain and volume-strain behavior observed in the experiments.
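
    The contact treatment that makes LS-DEM convenient can be sketched as follows: each particle stores a signed-distance grid and a set of surface nodes, and penetration of particle A into particle B is found by evaluating B's level set at A's nodes (negative values mean overlap, and the level set gradient gives the contact normal). The 2-D, rotation-free sketch below is only illustrative; the class name, grid resolution and disc geometry are assumptions.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

class LevelSetParticle:
    """Minimal LS-DEM particle: a signed-distance grid plus surface nodes
    (2-D, no rotation, purely illustrative)."""
    def __init__(self, grid_x, grid_y, sdf, surface_nodes, position):
        self.interp = RegularGridInterpolator((grid_x, grid_y), sdf,
                                              bounds_error=False, fill_value=1e3)
        self.nodes = surface_nodes              # (n, 2) coords in the particle frame
        self.position = np.asarray(position, dtype=float)

    def penetration_into(self, other):
        """Signed distances of this particle's surface nodes inside `other`
        (negative = penetration -> a contact force would act there)."""
        world_nodes = self.nodes + self.position
        local_in_other = world_nodes - other.position
        return other.interp(local_in_other)

# Two discs of radius 5 whose level sets are exact signed distances.
x = y = np.linspace(-8, 8, 65)
xx, yy = np.meshgrid(x, y, indexing="ij")
sdf = np.hypot(xx, yy) - 5.0
theta = np.linspace(0, 2 * np.pi, 48, endpoint=False)
nodes = 5.0 * np.column_stack([np.cos(theta), np.sin(theta)])

a = LevelSetParticle(x, y, sdf, nodes, position=[0.0, 0.0])
b = LevelSetParticle(x, y, sdf, nodes, position=[8.0, 0.0])   # centres 8 apart, radii 5 + 5
d = a.penetration_into(b)
print("max penetration depth:", round(float(-d.min()), 2))     # ~2 units of overlap
```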

  11. Using a Framework for Three Levels of Sense Making in a Mathematics Classroom

    ERIC Educational Resources Information Center

    Moss, Diana L.; Lamberg, Teruni

    2016-01-01

    This discussion-based lesson is designed to support Year 6 students in their initial understanding of using letters to represent numbers, expressions, and equations in algebra. The three level framework is designed for: (1) making thinking explicit, (2) exploring each other's solutions, and (3) developing new mathematical insights. In each level…

  12. The Harmonizing Outcome Measures for Eczema (HOME) roadmap: a methodological framework to develop core sets of outcome measurements in dermatology.

    PubMed

    Schmitt, Jochen; Apfelbacher, Christian; Spuls, Phyllis I; Thomas, Kim S; Simpson, Eric L; Furue, Masutaka; Chalmers, Joanne; Williams, Hywel C

    2015-01-01

    Core outcome sets (COSs) are consensus-derived minimum sets of outcomes to be assessed in a specific situation. COSs are being increasingly developed to limit outcome-reporting bias, allow comparisons across trials, and strengthen clinical decision making. Despite the increasing interest in outcomes research, methods to develop COSs have not yet been standardized. The aim of this paper is to present the Harmonizing Outcomes Measures for Eczema (HOME) roadmap for the development and implementation of COSs, which was developed on the basis of our experience in the standardization of outcome measurements for atopic eczema. Following the establishment of a panel representing all relevant stakeholders and a research team experienced in outcomes research, the scope and setting of the core set should be defined. The next steps are the definition of a core set of outcome domains such as symptoms or quality of life, followed by the identification or development and validation of appropriate outcome measurement instruments to measure these core domains. Finally, the consented COS needs to be disseminated, implemented, and reviewed. We believe that the HOME roadmap is a useful methodological framework to develop COSs in dermatology, with the ultimate goal of better decision making and promoting patient-centered health care. PMID:25186228

  13. Ice cover, landscape setting, and geological framework of Lake Vostok, East Antarctica

    USGS Publications Warehouse

    Studinger, M.; Bell, R.E.; Karner, G.D.; Tikku, A.A.; Holt, J.W.; Morse, D.L.; David, L.; Richter, T.G.; Kempf, S.D.; Peters, M.E.; Blankenship, D.D.; Sweeney, R.E.; Rystrom, V.L.

    2003-01-01

    Lake Vostok, located beneath more than 4 km of ice in the middle of East Antarctica, is a unique subglacial habitat and may contain microorganisms with distinct adaptations to such an extreme environment. Melting and freezing at the base of the ice sheet, which slowly flows across the lake, controls the flux of water, biota and sediment particles through the lake. The influx of thermal energy, however, is limited to contributions from below. Thus the geological origin of Lake Vostok is a critical boundary condition for the subglacial ecosystem. We present the first comprehensive maps of ice surface, ice thickness and subglacial topography around Lake Vostok. The ice flow across the lake and the landscape setting are closely linked to the geological origin of Lake Vostok. Our data show that Lake Vostok is located along a major geological boundary. Magnetic and gravity data are distinct east and west of the lake, as is the roughness of the subglacial topography. The physiographic setting of the lake has important consequences for the ice flow and thus the melting and freezing pattern and the lake's circulation. Lake Vostok is a tectonically controlled subglacial lake. The tectonic processes provided the space for a unique habitat and recent minor tectonic activity could have the potential to introduce small, but significant amounts of thermal energy into the lake. © 2002 Elsevier Science B.V. All rights reserved.

  14. A rough set based rational clustering framework for determining correlated genes.

    PubMed

    Jeyaswamidoss, Jeba Emilyn; Thangaraj, Kesavan; Ramar, Kadarkarai; Chitra, Muthusamy

    2016-06-01

    Cluster analysis plays a foremost role in identifying groups of genes that show similar behavior under a set of experimental conditions. Several clustering algorithms have been proposed for identifying gene behaviors and to understand their significance. The principal aim of this work is to develop an intelligent rough clustering technique, which will efficiently remove the irrelevant dimensions in a high-dimensional space and obtain appropriate meaningful clusters. This paper proposes a novel biclustering technique that is based on rough set theory. The proposed algorithm uses correlation coefficient as a similarity measure to simultaneously cluster both the rows and columns of a gene expression data matrix and mean squared residue to generate the initial biclusters. Furthermore, the biclusters are refined to form the lower and upper boundaries by determining the membership of the genes in the clusters using mean squared residue. The algorithm is illustrated with yeast gene expression data and the experiment proves the effectiveness of the method. The main advantage is that it overcomes the problem of selection of initial clusters and also the restriction of one object belonging to only one cluster by allowing overlapping of biclusters. PMID:27352972
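
    The mean squared residue used to seed and refine the biclusters is the Cheng-Church coherence score; a small numpy implementation is sketched below (the toy expression matrix is synthetic, not the yeast data used in the paper):

```python
import numpy as np

def mean_squared_residue(expr, rows, cols):
    """Cheng-Church mean squared residue of the bicluster (rows x cols).

    H = mean over (i, j) of (a_ij - a_iJ - a_Ij + a_IJ)^2, where a_iJ / a_Ij
    are the row / column means within the bicluster and a_IJ its overall mean.
    Lower H means more coherent (potentially co-regulated) expression.
    """
    sub = expr[np.ix_(rows, cols)].astype(float)
    row_means = sub.mean(axis=1, keepdims=True)
    col_means = sub.mean(axis=0, keepdims=True)
    residue = sub - row_means - col_means + sub.mean()
    return float((residue ** 2).mean())

# A perfectly additive (shifted) pattern has zero residue; noise raises it.
rng = np.random.default_rng(0)
pattern = np.add.outer(rng.normal(size=6), rng.normal(size=5))   # coherent 6x5 block
data = rng.normal(size=(50, 40))
data[:6, :5] = pattern
print(mean_squared_residue(data, rows=range(6), cols=range(5)))   # ~0.0
```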

  15. Epidemic Reconstruction in a Phylogenetics Framework: Transmission Trees as Partitions of the Node Set

    PubMed Central

    Hall, Matthew; Woolhouse, Mark; Rambaut, Andrew

    2015-01-01

    The use of genetic data to reconstruct the transmission tree of infectious disease epidemics and outbreaks has been the subject of an increasing number of studies, but previous approaches have usually either made assumptions that are not fully compatible with phylogenetic inference, or, where they have based inference on a phylogeny, have employed a procedure that requires this tree to be fixed. At the same time, the coalescent-based models of the pathogen population that are employed in the methods usually used for time-resolved phylogeny reconstruction are a considerable simplification of epidemic process, as they assume that pathogen lineages mix freely. Here, we contribute a new method that is simultaneously a phylogeny reconstruction method for isolates taken from an epidemic, and a procedure for transmission tree reconstruction. We observe that, if one or more samples is taken from each host in an epidemic or outbreak and these are used to build a phylogeny, a transmission tree is equivalent to a partition of the set of nodes of this phylogeny, such that each partition element is a set of nodes that is connected in the full tree and contains all the tips corresponding to samples taken from one and only one host. We then implement a Monte Carlo Markov Chain (MCMC) procedure for simultaneous sampling from the spaces of both trees, utilising a newly-designed set of phylogenetic tree proposals that also respect node partitions. We calculate the posterior probability of these partitioned trees based on a model that acknowledges the population structure of an epidemic by employing an individual-based disease transmission model and a coalescent process taking place within each host. We demonstrate our method, first using simulated data, and then with sequences taken from the H7N7 avian influenza outbreak that occurred in the Netherlands in 2003. We show that it is superior to established coalescent methods for reconstructing the topology and node heights of the

  16. [Head and Neck Tumor Segmentation Based on Augmented Gradient Level Set Method].

    PubMed

    Zhang, Qiongmin; Zhang, Jing; Wang, Mintang; He, Ling; Men, Yi; Wei, Jun; Haung, Hua

    2015-08-01

    To realize the accurate positioning and quantitative volume measurement of tumor in head and neck tumor CT images, we proposed a level set method based on augmented gradient. With the introduction of gradient information in the edge indicator function, our proposed level set model is adaptive to different intensity variation, and achieves accurate tumor segmentation. The segmentation result has been used to calculate tumor volume. In large volume tumor segmentation, the proposed level set method can reduce manual intervention and enhance the segmentation accuracy. Tumor volume calculation results are close to the gold standard. From the experiment results, the augmented gradient based level set method has achieved accurate head and neck tumor segmentation. It can provide useful information to computer aided diagnosis. PMID:26710464

  17. Learning A Superpixel-Driven Speed Function for Level Set Tracking.

    PubMed

    Zhou, Xue; Li, Xi; Hu, Weiming

    2016-07-01

    A key problem in level set tracking is to construct a discriminative speed function for effective contour evolution. In this paper, we propose a level set tracking method based on a discriminative speed function, which produces a superpixel-driven force for effective level set evolution. Based on kernel density estimation and metric learning, the speed function is capable of effectively encoding the discriminative information on object appearance within a feasible metric space. Furthermore, we introduce adaptive object shape modeling into the level set evolution process, which leads to the tracking robustness in complex scenarios. To ensure the efficiency of adaptive object shape modeling, we develop a simple but efficient weighted non-negative matrix factorization method that can online learn an object shape dictionary. Experimental results on a number of challenging video sequences demonstrate the effectiveness and robustness of the proposed tracking method. PMID:26292353
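
    The online shape-dictionary learning mentioned above relies on a weighted non-negative matrix factorization. The sketch below (Python) implements generic weighted NMF with multiplicative updates on a batch of vectorized shape templates; it is a stand-in for, not a reproduction of, the authors' efficient online method, and the data, rank and weights are arbitrary.

```python
import numpy as np

def weighted_nmf(V, M, rank, n_iter=200, eps=1e-9, seed=0):
    """Weighted NMF: minimise || M * (V - W H) ||_F^2 with W, H >= 0,
    where M holds per-entry weights (e.g. confidence in each shape pixel).
    Standard multiplicative updates; a generic stand-in for the paper's
    online shape-dictionary learning."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank))
    H = rng.random((rank, m))
    for _ in range(n_iter):
        WH = W @ H
        H *= (W.T @ (M * V)) / (W.T @ (M * WH) + eps)
        WH = W @ H
        W *= ((M * V) @ H.T) / ((M * WH) @ H.T + eps)
    return W, H

# Each column of V is a vectorised object-shape template.
V = np.random.default_rng(1).random((400, 30))
M = np.ones_like(V)                      # uniform weights for the demo
W, H = weighted_nmf(V, M, rank=5)
print("reconstruction error:", round(float(np.linalg.norm(M * (V - W @ H))), 3))
```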

  18. The Reliability and Validity of the Comfort Level Method of Setting Hearing Aid Gain

    ERIC Educational Resources Information Center

    Walden, Brian E.; And Others

    1977-01-01

    Investigated in a series of experiments with 40 adults (20- to 70-years-old) having bilateral sensorineural hearing impairments was the test-retest reliability of the comfort level method for setting the acoustic gain of hearing aids, and the relationship between the comfort settings utilized in more realistic daily listening situations.…

  19. Basin-scale runoff prediction: An Ensemble Kalman Filter framework based on global hydrometeorological data sets

    NASA Astrophysics Data System (ADS)

    Lorenz, Christof; Tourian, Mohammad J.; Devaraju, Balaji; Sneeuw, Nico; Kunstmann, Harald

    2015-10-01

    In order to cope with the steady decline of the number of in situ gauges worldwide, there is a growing need for alternative methods to estimate runoff. We present an Ensemble Kalman Filter based approach that allows us to conclude on runoff for poorly or irregularly gauged basins. The approach focuses on the application of publicly available global hydrometeorological data sets for precipitation (GPCC, GPCP, CRU, UDEL), evapotranspiration (MODIS, FLUXNET, GLEAM, ERA interim, GLDAS), and water storage changes (GRACE, WGHM, GLDAS, MERRA LAND). Furthermore, runoff data from the GRDC and satellite altimetry derived estimates are used. We follow a least squares prediction that exploits the joint temporal and spatial auto- and cross-covariance structures of precipitation, evapotranspiration, water storage changes and runoff. We further consider time-dependent uncertainty estimates derived from all data sets. Our in-depth analysis comprises 29 large river basins of different climate regions, with which runoff is predicted for a subset of 16 basins. Six configurations are analyzed: the Ensemble Kalman Filter (Smoother) and the hard (soft) Constrained Ensemble Kalman Filter (Smoother). Comparing the predictions to observed monthly runoff shows correlations larger than 0.5, percentage biases lower than ± 20%, and NSE-values larger than 0.5. A modified NSE-metric, stressing the difference to the mean annual cycle, shows an improvement of runoff predictions for 14 of the 16 basins. The proposed method is able to provide runoff estimates for nearly 100 poorly gauged basins covering an area of more than 11,500,000 km² with a freshwater discharge, in volume, of more than 125,000 m³/s.
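
    The core of such an approach is the ensemble Kalman filter analysis step, which corrects an ensemble of water-balance states using sparse runoff observations through the ensemble covariances. A minimal perturbed-observation EnKF sketch (Python) with a toy four-basin state is given below; it does not include the constrained or smoother variants analyzed in the paper, and all numbers are invented.

```python
import numpy as np

def enkf_analysis(ensemble, obs, obs_err_std, H):
    """Perturbed-observation EnKF update.

    ensemble : (n_ens, n_state) prior ensemble
    obs      : (n_obs,) observation vector
    H        : (n_obs, n_state) observation operator
    """
    n_ens = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)                  # anomalies
    P = X.T @ X / (n_ens - 1)                             # ensemble covariance
    R = np.diag(np.atleast_1d(obs_err_std) ** 2)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)          # Kalman gain
    rng = np.random.default_rng(0)
    perturbed = obs + rng.normal(0.0, obs_err_std, size=(n_ens, len(obs)))
    return ensemble + (perturbed - ensemble @ H.T) @ K.T

# Toy example: state = runoff in four basins; only basins 0 and 1 are gauged,
# but spatial correlation in the prior lets the update correct basins 2 and 3 too.
rng = np.random.default_rng(1)
common = rng.normal(0.0, 8.0, size=(64, 1))               # shared regional signal
prior = 30.0 + common + rng.normal(0.0, 3.0, size=(64, 4))
H = np.array([[1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0]])
posterior = enkf_analysis(prior, obs=np.array([45.0, 42.0]),
                          obs_err_std=np.array([2.0, 2.0]), H=H)
print(prior.mean(axis=0).round(1), "->", posterior.mean(axis=0).round(1))
```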

  20. Locally constrained active contour: a region-based level set for ovarian cancer metastasis segmentation

    NASA Astrophysics Data System (ADS)

    Liu, Jianfei; Yao, Jianhua; Wang, Shijun; Linguraru, Marius George; Summers, Ronald M.

    2014-03-01

    Accurate segmentation of ovarian cancer metastases is clinically useful to evaluate tumor growth and determine follow-up treatment. We present a region-based level set algorithm with localization constraints to segment ovarian cancer metastases. Our approach is established on a representative region-based level set, Chan-Vese model, in which an active contour is driven by region competition. To reduce over-segmentation, we constrain the level set propagation within a narrow image band by embedding a dynamic localization function. The metastasis intensity prior is also estimated from image regions within the level set initialization. The localization function and intensity prior force the level set to stop at the desired metastasis boundaries. Our approach was validated on 19 ovarian cancer metastases with radiologist-labeled ground-truth on contrast-enhanced CT scans from 15 patients. The comparison between our algorithm and geodesic active contour indicated that the volume overlap was 75+/-10% vs. 56+/-6%, the Dice coefficient was 83+/-8% vs. 63+/-8%, and the average surface distance was 2.2+/-0.6mm vs. 4.4+/-0.9mm. Experimental results demonstrated that our algorithm outperformed traditional level set algorithms.
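
    A minimal sketch of Chan-Vese style region competition restricted to a narrow band around the zero level set is given below. It omits the curvature regularization and the intensity prior used in the paper, and all names are illustrative.

```python
import numpy as np

def localized_chan_vese_step(phi, image, dt=0.1, band=3.0, eps=1.0):
    """One simplified Chan-Vese region-competition update on a narrow band.

    phi   : level set function (> 0 inside the contour)
    image : intensity image
    band  : half-width of the band around the zero level set where phi may evolve
    """
    inside, outside = phi > 0, phi <= 0
    c1 = image[inside].mean() if inside.any() else 0.0    # mean intensity inside
    c2 = image[outside].mean() if outside.any() else 0.0  # mean intensity outside
    # Smoothed Dirac delta concentrates the update near the zero level set
    delta = (eps / np.pi) / (eps**2 + phi**2)
    # Localization: only pixels within the narrow band are allowed to move
    band_mask = (np.abs(phi) < band).astype(float)
    force = -(image - c1) ** 2 + (image - c2) ** 2        # region competition
    return phi + dt * band_mask * delta * force
```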

  1. A discontinuous Galerkin conservative level set scheme for interface capturing in multiphase flows

    SciTech Connect

    Owkes, Mark; Desjardins, Olivier

    2013-09-15

    The accurate conservative level set (ACLS) method of Desjardins et al. [O. Desjardins, V. Moureau, H. Pitsch, An accurate conservative level set/ghost fluid method for simulating turbulent atomization, J. Comput. Phys. 227 (18) (2008) 8395–8416] is extended by using a discontinuous Galerkin (DG) discretization. DG allows for the scheme to have an arbitrarily high order of accuracy with the smallest possible computational stencil resulting in an accurate method with good parallel scaling. This work includes a DG implementation of the level set transport equation, which moves the level set with the flow field velocity, and a DG implementation of the reinitialization equation, which is used to maintain the shape of the level set profile to promote good mass conservation. A near second order converging interface curvature is obtained by following a height function methodology (common amongst volume of fluid schemes) in the context of the conservative level set. Various numerical experiments are conducted to test the properties of the method and show excellent results, even on coarse meshes. The tests include Zalesak’s disk, two-dimensional deformation of a circle, time evolution of a standing wave, and a study of the Kelvin–Helmholtz instability. Finally, this novel methodology is employed to simulate the break-up of a turbulent liquid jet.
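
    For context, the conservative level set approach that this work extends (Olsson and Kreiss) transports a smeared Heaviside profile psi and periodically reshapes it with a reinitialization equation; a commonly quoted form of the two equations is sketched below. The ACLS variant restricts the diffusion term to the interface-normal direction, and the DG discretization introduced in the paper is not shown here.

```latex
% Transport of the hyperbolic-tangent level set profile \psi by the velocity field u
\frac{\partial \psi}{\partial t} + \nabla \cdot (\mathbf{u}\,\psi) = 0
% Reinitialization in pseudo-time \tau: compression along the interface normal
% \hat{\mathbf{n}}, balanced by diffusion with interface-thickness parameter \varepsilon
\frac{\partial \psi}{\partial \tau}
  + \nabla \cdot \bigl(\psi\,(1-\psi)\,\hat{\mathbf{n}}\bigr)
  = \nabla \cdot \bigl(\varepsilon\,\nabla \psi\bigr)
```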

  2. An adaptive level set approach for incompressible two-phase flows

    SciTech Connect

    Sussman, M.; Almgren, A.S.; Bell, J.B.

    1997-04-01

    In Sussman, Smereka and Osher, a numerical method using the level set approach was formulated for solving incompressible two-phase flow with surface tension. In the level set approach, the interface is represented as the zero level set of a smooth function; this has the effect of replacing the advection of density, which has steep gradients at the interface, with the advection of the level set function, which is smooth. In addition, the interface can merge or break up with no special treatment. The authors maintain the level set function as the signed distance from the interface in order to robustly compute flows with high density ratios and stiff surface tension effects. In this work, they couple the level set scheme to an adaptive projection method for the incompressible Navier-Stokes equations, in order to achieve higher resolution of the interface with a minimum of additional expense. They present two-dimensional axisymmetric and fully three-dimensional results of air bubble and water drop computations.
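
    The signed distance property mentioned here is typically maintained by solving a reinitialization equation to steady state, as in the Sussman, Smereka and Osher approach the abstract refers to; a standard form is shown below. At steady state |∇φ| tends to 1 while the zero level set of φ0 is preserved.

```latex
\frac{\partial \phi}{\partial \tau}
  = \operatorname{sgn}(\phi_0)\,\bigl(1 - \lvert \nabla \phi \rvert\bigr),
\qquad
\phi(\mathbf{x},\, \tau = 0) = \phi_0(\mathbf{x})
```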

  3. [Intellectual development disorders in Latin America: a framework for setting policy priorities for research and care].

    PubMed

    Lazcano-Ponce, Eduardo; Katz, Gregorio; Allen-Leigh, Betania; Magaña Valladares, Laura; Rangel-Eudave, Guillermina; Minoletti, Alberto; Wahlberg, Ernesto; Vásquez, Armando; Salvador-Carulla, Luis

    2013-09-01

    Intellectual development disorders (IDDs) are a set of development disorders characterized by significantly limited cognitive functioning, learning disorders, and disorders related to adaptive skills and behavior. Previously grouped under the term "intellectual disability," this problem has not been widely studied or quantified in Latin America. Those affected are absent from public policy and do not benefit from government social development and poverty reduction strategies. This article offers a critical look at IDDs and describes a new taxonomy; it also proposes recognizing IDDs as a public health issue and promoting the professionalization of care, and suggests an agenda for research and regional action. In Latin America there is no consensus on the diagnostic criteria for IDDs. A small number of rehabilitation programs cover a significant proportion of the people who suffer from IDDs, evidence-based services are not offered, and health care guidelines have not been evaluated. Manuals on psychiatric diagnosis focus heavily on identifying serious IDDs and contribute to underreporting and erroneous classification. The study of these disorders has not been a legal, social science, or public health priority, resulting in a dearth of scientific evidence on them. Specific competencies and professionalization of care for these persons are needed, and interventions must be carried out with a view to prevention, rehabilitation, community integration, and inclusion in the work force. PMID:24233114

  4. Multiphase permittivity imaging using absolute value electrical capacitance tomography data and a level set algorithm.

    PubMed

    Al Hosani, E; Soleimani, M

    2016-06-28

    Multiphase flow imaging is a very challenging and critical topic in industrial process tomography. In this article, simulation and experimental results of reconstructing the permittivity profile of multiphase material from data collected in electrical capacitance tomography (ECT) are presented. A multiphase narrowband level set algorithm is developed to reconstruct the interfaces between three- or four-phase permittivity values. The level set algorithm is capable of imaging multiphase permittivity by using one set of ECT measurement data, so-called absolute value ECT reconstruction, and this is tested with high-contrast and low-contrast multiphase data. Simulation and experimental results showed the superiority of this algorithm over classical pixel-based image reconstruction methods. The multiphase level set algorithm and absolute ECT reconstruction are presented for the first time, to the best of our knowledge, in this paper and critically evaluated. This article is part of the themed issue 'Supersensing through industrial process tomography'. PMID:27185966

  5. Joint inversion of geophysical data using petrophysical clustering and facies deformation with the level set technique

    NASA Astrophysics Data System (ADS)

    Revil, A.

    2015-12-01

    Geological expertise and petrophysical relationships can be brought together to provide prior information while inverting multiple geophysical datasets. The merging of such information can result in a more realistic solution for the distribution of the model parameters, reducing ipso facto the non-uniqueness of the inverse problem. We consider two levels of heterogeneity: facies, described by facies boundaries, and heterogeneities inside each facies, determined by a correlogram. In this presentation, we pose the geophysical inverse problem in terms of Gaussian random fields with mean functions controlled by petrophysical relationships and covariance functions controlled by a prior geological cross-section, including the definition of spatial boundaries for the geological facies. The petrophysical relationship problem is formulated as a regression problem upon each facies. The inversion of the geophysical data is performed in a Bayesian framework. We demonstrate the usefulness of this strategy using a first synthetic case for which we perform a joint inversion of gravity and galvanometric resistivity data with the stations located at the ground surface. The joint inversion is used to recover the density and resistivity distributions of the subsurface. In a second step, we consider the possibility that the facies boundaries are deformable and their shapes are inverted as well. We use the level set approach to perform such deformation, preserving prior topological properties of the facies throughout the inversion. With the help of prior facies petrophysical relationships and the topological characteristics of each facies, we make posterior inferences about multiple geophysical tomograms based on their corresponding geophysical data misfits. The method is applied to a second synthetic case showing that we can recover the heterogeneities inside the facies, the mean values for the petrophysical properties, and, to some extent, the facies boundaries using the 2D joint inversion of

  6. Evolving entities: towards a unified framework for understanding diversity at the species and higher levels.

    PubMed

    Barraclough, Timothy G

    2010-06-12

    Current approaches to studying the evolution of biodiversity differ in their treatment of species and higher level diversity patterns. Species are regarded as the fundamental evolutionarily significant units of biodiversity, both in theory and in practice, and extensive theory explains how they originate and evolve. However, most species are still delimited using qualitative methods that only relate indirectly to the underlying theory. In contrast, higher level patterns of diversity have been subjected to rigorous quantitative study (using phylogenetics), but theory that adequately explains the observed patterns has been lacking. Most evolutionary analyses of higher level diversity patterns have considered non-equilibrium explanations based on rates of diversification (i.e. exponentially growing clades), rather than equilibrium explanations normally used at the species level and below (i.e. constant population sizes). This paper argues that species level and higher level patterns of diversity can be considered within a common framework, based on equilibrium explanations. It shows how forces normally considered in the context of speciation, namely divergent selection and geographical isolation, can generate evolutionarily significant units of diversity above the level of reproductively isolated species. Prospects for the framework to answer some unresolved questions about higher level diversity patterns are discussed. PMID:20439282

  7. Evolving entities: towards a unified framework for understanding diversity at the species and higher levels

    PubMed Central

    Barraclough, Timothy G.

    2010-01-01

    Current approaches to studying the evolution of biodiversity differ in their treatment of species and higher level diversity patterns. Species are regarded as the fundamental evolutionarily significant units of biodiversity, both in theory and in practice, and extensive theory explains how they originate and evolve. However, most species are still delimited using qualitative methods that only relate indirectly to the underlying theory. In contrast, higher level patterns of diversity have been subjected to rigorous quantitative study (using phylogenetics), but theory that adequately explains the observed patterns has been lacking. Most evolutionary analyses of higher level diversity patterns have considered non-equilibrium explanations based on rates of diversification (i.e. exponentially growing clades), rather than equilibrium explanations normally used at the species level and below (i.e. constant population sizes). This paper argues that species level and higher level patterns of diversity can be considered within a common framework, based on equilibrium explanations. It shows how forces normally considered in the context of speciation, namely divergent selection and geographical isolation, can generate evolutionarily significant units of diversity above the level of reproductively isolated species. Prospects for the framework to answer some unresolved questions about higher level diversity patterns are discussed. PMID:20439282

  8. Options for future effective water management in Lombok: A multi-level nested framework

    NASA Astrophysics Data System (ADS)

    Sjah, Taslim; Baldwin, Claudia

    2014-11-01

    Previous research on water use in Lombok identified reduced water available in springs and limits on seasonal water availability. It foreshadowed increasing competition for water resources in critical areas of Lombok. This study examines preliminary information on local social-institutional arrangements for water allocation in the context of Ostrom's rules for self-governing institutions. We identify robust customary mechanisms for decision-making about water sharing and rules at a local level and suggest areas of further investigation for strengthening multi-level networked and nested frameworks, in collaboration with higher levels of government.

  9. Ontological Problem-Solving Framework for Assigning Sensor Systems and Algorithms to High-Level Missions

    PubMed Central

    Qualls, Joseph; Russomanno, David J.

    2011-01-01

    The lack of knowledge models to represent sensor systems, algorithms, and missions makes opportunistically discovering a synthesis of systems and algorithms that can satisfy high-level mission specifications impractical. A novel ontological problem-solving framework has been designed that leverages knowledge models describing sensors, algorithms, and high-level missions to facilitate automated inference of assigning systems to subtasks that may satisfy a given mission specification. To demonstrate the efficacy of the ontological problem-solving architecture, a family of persistence surveillance sensor systems and algorithms has been instantiated in a prototype environment to demonstrate the assignment of systems to subtasks of high-level missions. PMID:22164081

  10. Breast mass segmentation in digital mammography based on pulse coupled neural network and level set method

    NASA Astrophysics Data System (ADS)

    Xie, Weiying; Ma, Yide; Li, Yunsong

    2015-05-01

    A novel approach to mammographic image segmentation, termed the PCNN-based level set algorithm, is presented in this paper. As its name implies, the method combines a pulse coupled neural network (PCNN) with the variational level set method for medical image segmentation. To date, little work has been done on detecting the initial zero level set contours with a PCNN algorithm for subsequent level set evolution. When all the pixels of the input image are fired by the PCNN, smaller pixel values yield a more refined segmentation. In a mammographic image, the breast tumor presents large pixel values while the remaining region is predominantly dark, so we first take the negative of the mammographic image (excluding the breast tumor) before all the pixels of the input image are fired by the PCNN. The PCNN algorithm is therefore employed to achieve mammary-specific, initial mass contour detection. After that, the initial contours are extracted and defined as the initial zero level set contours for automatic mass segmentation by the variational level set in mammographic image analysis. Furthermore, the proposed algorithm improves the external energy of the variational level set method for low-contrast mammographic images. Because the gray scale of the mass region in a mammographic image is higher than that of the surrounding region, the Laplace operator is used to modify the external energy, making the bright spot appear much brighter than the surrounding pixels in the image. A preliminary evaluation of the proposed method is performed on a known public database, namely MIAS, rather than on synthetic images. The experimental results demonstrate that our proposed approach can potentially obtain better mass detection results in terms of sensitivity and specificity. Ultimately, this algorithm could lead to increases in both the sensitivity and specificity of the physicians' interpretation of

  11. A framework for sea level rise vulnerability assessment for southwest U.S. military installations

    USGS Publications Warehouse

    Chadwick, B.; Flick, Reinhard; Helly, J.; Nishikawa, T.; Pei, Fang Wang; O'Reilly, W.; Guza, R.; Bromirski, Peter; Young, A.; Crampton, W.; Wild, B.; Canner, I.

    2011-01-01

    We describe an analysis framework to determine military installation vulnerabilities under increases in local mean sea level as projected over the next century. The effort is in response to an increasing recognition of potential climate change ramifications for national security and recommendations that DoD conduct assessments of the impact on U.S. military installations of climate change. Results of the effort described here focus on development of a conceptual framework for sea level rise vulnerability assessment at coastal military installations in the southwest U.S. We introduce the vulnerability assessment in the context of a risk assessment paradigm that incorporates sources in the form of future sea level conditions, pathways of impact including inundation, flooding, erosion and intrusion, and a range of military installation specific receptors such as critical infrastructure and training areas. A unique aspect of the methodology is the capability to develop wave climate projections from GCM outputs and transform these to future wave conditions at specific coastal sites. Future sea level scenarios are considered in the context of installation sensitivity curves which reveal response thresholds specific to each installation, pathway and receptor. In the end, our goal is to provide a military-relevant framework for assessment of accelerated SLR vulnerability, and develop the best scientifically-based scenarios of waves, tides and storms and their implications for DoD installations in the southwestern U.S. ?? 2011 MTS.

  12. A distributed decision framework for building clusters with different heterogeneity settings

    DOE PAGES Beta

    Jafari-Marandi, Ruholla; Omitaomu, Olufemi A.; Hu, Mengqi

    2016-01-05

    In the past few decades, extensive research has been conducted to develop operation and control strategies for smart buildings with the purpose of reducing energy consumption. Besides studies of single buildings, it is envisioned that next generation buildings can freely connect with one another to share energy and exchange information in the context of the smart grid. It has been demonstrated that a network of connected buildings (aka building clusters) can significantly reduce primary energy consumption and improve environmental sustainability and buildings' resilience capability. However, an analytic tool to determine which types of buildings should form a cluster, and what the impact of building cluster heterogeneity (based on energy profiles) is on the energy performance of building clusters, is missing. To bridge these research gaps, we propose a self-organizing map clustering algorithm to divide multiple buildings into different clusters based on their energy profiles, and a homogeneity index to evaluate the heterogeneity of different building cluster configurations. In addition, a bi-level distributed decision model is developed to study the energy sharing in the building clusters. To demonstrate the effectiveness of the proposed clustering algorithm and decision model, we employ a dataset including monthly energy consumption data for 30 buildings where the data is collected every 15 min. It is demonstrated that the proposed decision model can achieve at least 13% cost savings for building clusters. Furthermore, the results show that the heterogeneity of energy profiles is an important factor in selecting batteries and renewable energy sources for building clusters, and that shared batteries and renewable energy are preferred for more heterogeneous building clusters.
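
    A minimal self-organizing map clustering loop of the kind proposed for grouping buildings by energy profile is sketched below. Parameters and names are illustrative assumptions; the homogeneity index and the bi-level decision model are not shown.

```python
import numpy as np

def train_som(profiles, grid=(3, 3), iters=2000, lr0=0.5, sigma0=1.5, seed=0):
    """Minimal self-organizing map for clustering building energy profiles.

    profiles : (n_buildings, n_features) array, e.g. normalized monthly loads
    Returns the map weights and the winning node index for each building.
    """
    rng = np.random.default_rng(seed)
    n_nodes = grid[0] * grid[1]
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], float)
    W = profiles[rng.integers(0, len(profiles), n_nodes)].astype(float)  # init from data
    for t in range(iters):
        lr = lr0 * np.exp(-t / iters)            # decaying learning rate
        sigma = sigma0 * np.exp(-t / iters)      # decaying neighborhood radius
        x = profiles[rng.integers(len(profiles))]
        bmu = np.argmin(((W - x) ** 2).sum(axis=1))          # best matching unit
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)       # grid distance to the BMU
        h = np.exp(-d2 / (2 * sigma**2))                     # neighborhood function
        W += lr * h[:, None] * (x - W)
    assignment = np.argmin(((profiles[:, None, :] - W[None]) ** 2).sum(-1), axis=1)
    return W, assignment
```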

  13. Target Detection in SAR Images Based on a Level Set Approach

    SciTech Connect

    Marques, Regis C.P.; Medeiros, Fatima N.S.; Ushizima, Daniela M.

    2008-09-01

    This paper introduces a new framework for point target detection in synthetic aperture radar (SAR) images. We focus on the task of locating reflective small regions using a level set based algorithm. Unlike most of the approaches in image segmentation, we address an algorithm which incorporates speckle statistics instead of empirical parameters and also discards speckle filtering. The curve evolves according to speckle statistics, initially propagating with a maximum upward velocity in homogeneous areas. Our approach is validated by a series of tests on synthetic and real SAR images and compared with three other segmentation algorithms, demonstrating that it constitutes a novel and efficient method for target detection purposes.

  14. Estimations of a global sea level trend: limitations from the structure of the PSMSL global sea level data set

    NASA Astrophysics Data System (ADS)

    Gröger, M.; Plag, H.-P.

    1993-08-01

    Among the possible impacts on environmental conditions of a global warming expected as a consequence of the increasing release of CO2 and various other greenhouse gases into the atmosphere, a predicted rise in global sea level is considered to be of high importance. Thus, quite a number of recent studies have focused on detecting the "global sea level rise" or even an acceleration of this trend. A brief review of these studies is presented, showing, however, that the results are not conclusive, though most of the studies have been based on a single global data set of coastal tide gauge data provided by the Permanent Service for Mean Sea Level (PSMSL). A detailed discussion of a thoroughly revised subset reveals that the PSMSL data set suffers from three severe limitations: (1) the geographical distribution of reliable tide gauge stations is rather uneven, with pronounced concentrations in some areas of the northern hemisphere (Europe, North America, Japan) and far fewer stations in the southern hemisphere, where particularly few stations are located in Africa and in Antarctica; (2) the number of stations recording simultaneously at any time is far less than the total number of stations, with the maximum within the interval between 1958 and 1988; (3) the number of long records is extremely small and almost all of them originate from a few regions of the northern hemisphere. The sensitivity of the median of the local trends to these temporal and spatial limitations is discussed by restricting the data set in both the spatial and temporal distribution. It is shown that the data base is insufficient for determining an integral value of the global rise in relative sea level. The effect of polar motion on sea level is modelled and it turns out to be locally of the order of 0.5 mm/yr, affecting regional trends to an order of 0.1 mm/yr. Thus, this effect can be neglected on time scales of decades to a hundred years. Though the data set is insufficient for determining an
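
    The "median of the local trends" statistic discussed here can be illustrated with a short sketch that fits a linear trend to each tide gauge record and takes the median over stations passing a record-length screen. The data layout and the length threshold are assumptions for illustration only, not the study's actual screening criteria.

```python
import numpy as np

def median_local_trend(records, min_years=30):
    """Median of per-station relative sea level trends (illustrative sketch).

    records : dict mapping station name -> (years, levels) arrays, levels in mm
    Only stations with at least `min_years` of valid data contribute, mimicking
    the record-length screening discussed in the text.
    """
    trends = []
    for years, levels in records.values():
        ok = np.isfinite(levels)
        if ok.sum() >= min_years:
            slope, _ = np.polyfit(years[ok], levels[ok], 1)  # trend in mm per year
            trends.append(slope)
    return np.median(trends), len(trends)
```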

  15. A High-Level Framework for Distributed Processing of Large-Scale Graphs

    NASA Astrophysics Data System (ADS)

    Krepska, Elzbieta; Kielmann, Thilo; Fokkink, Wan; Bal, Henri

    Distributed processing of real-world graphs is challenging due to their size and the inherent irregular structure of graph computations. We present hipg, a distributed framework that facilitates high-level programming of parallel graph algorithms by expressing them as a hierarchy of distributed computations executed independently and managed by the user. hipg programs are in general short and elegant; they achieve good portability, memory utilization and performance.

  16. A Multi-Level Approach for Promoting HIV Testing Within African American Church Settings

    PubMed Central

    2015-01-01

    Abstract The African American church is a community-based organization that is integral to the lives, beliefs, and behaviors of the African American community. Engaging this vital institution as a primary setting for HIV testing and referral would significantly impact the epidemic. The disproportionately high HIV incidence rate among African Americans dictates the national priority for promotion of early and routine HIV testing, and suggests engaging community-based organizations in this endeavor. However, few multilevel HIV testing frameworks have been developed, tested, and evaluated within the African American church. This article proposes one such framework for promoting HIV testing and referral within African American churches. A qualitative study was employed to examine the perceptions, beliefs, knowledge, and behaviors related to understanding involvement in church-based HIV testing. A total of four focus groups with church leaders and four in-depth interviews with pastors, were conducted between November 2012 and June 2013 to identify the constructs most important to supporting Philadelphia churches' involvement in HIV testing, referral, and linkage to care. The data generated from this study were analyzed using a grounded theory approach and used to develop and refine a multilevel framework for identifying factors impacting church-based HIV testing and referral and to ultimately support capacity building among African American churches to promote HIV testing and linkage to care. PMID:25682887

  17. Issues related to setting exemption levels for oil and gas NORM

    SciTech Connect

    Blunt, D. L.; Gooden, D. S.; Smith, K. P.

    1999-11-12

    In the absence of any federal regulations that specifically address the handling and disposal of wastes containing naturally occurring radioactive material (NORM), individual states have taken responsibility for developing their own regulatory programs for NORM. A key issue in developing NORM rules is defining exemption levels--specific levels or concentrations that determine which waste materials are subject to controlled management. In general, states have drawn upon existing standards and guidelines for similar waste types in establishing exemption levels for NORM. Simply adopting these standards may not be appropriate for oil and gas NORM for several reasons. The Interstate Oil and Gas Compact Commission's NORM Subcommittee has summarized the issues involved in setting exemption levels in a report titled "Naturally Occurring Radioactive Materials (NORM): Issues from the Oil and Gas Point of View". The committee has also recommended a set of exemption levels for controlled practices and for remediation activities on the basis of the issues discussed.

  18. A Variational Level Set Approach to Segmentation and Bias Correction of Images with Intensity Inhomogeneity

    PubMed Central

    Huang, Rui; Ding, Zhaohua; Gatenby, Chris; Metaxas, Dimitris; Gore, John

    2009-01-01

    This paper presents a variational level set approach to joint segmentation and bias correction of images with intensity inhomogeneity. Our method is based on the observation that intensities in a relatively small local region are separable, despite the inseparability of the intensities in the whole image caused by the intensity inhomogeneity. We first define a weighted K-means clustering objective function for image intensities in a neighborhood around each point, with the cluster centers having a multiplicative factor that estimates the bias within the neighborhood. The objective function is then integrated over the entire domain and incorporated into a variational level set formulation. The energy minimization is performed via a level set evolution process. Our method is able to estimate bias of quite general profiles. Moreover, it is robust to initialization, and therefore allows automatic applications. The proposed method has been used for images of various modalities with promising results. PMID:18982712
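
    A local clustering objective of the kind described, with K shared cluster centers c_i and a multiplicative bias b(y) estimated per neighborhood N(y), might be written as below. The notation is illustrative and not necessarily the authors' exact formulation; u_i denote the membership functions encoded by the level set and w the neighborhood weighting kernel.

```latex
E(\mathbf{y}) = \sum_{i=1}^{K} \int_{\mathcal{N}(\mathbf{y})}
    w(\mathbf{x}-\mathbf{y})\,
    \bigl|\, I(\mathbf{x}) - b(\mathbf{y})\, c_i \,\bigr|^{2}\, u_i(\mathbf{x})\, d\mathbf{x},
\qquad
E_{\mathrm{total}} = \int_{\Omega} E(\mathbf{y})\, d\mathbf{y}
```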

  19. Aerostructural Level Set Topology Optimization for a Common Research Model Wing

    NASA Technical Reports Server (NTRS)

    Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia

    2014-01-01

    The purpose of this work is to use level set topology optimization to improve the design of a representative wing box structure for the NASA common research model. The objective is to minimize the total compliance of the structure under aerodynamic and body force loading, where the aerodynamic loading is coupled to the structural deformation. A taxi bump case was also considered, where only body force loads were applied. The trim condition that aerodynamic lift must balance the total weight of the aircraft is enforced by allowing the root angle of attack to change. The level set optimization method is implemented on an unstructured three-dimensional grid, so that the method can optimize a wing box with arbitrary geometry. Fast matching and upwind schemes are developed for an unstructured grid, which make the level set method robust and efficient. The adjoint method is used to obtain the coupled shape sensitivities required to perform aerostructural optimization of the wing box structure.

  20. Numerical Schemes for the Hamilton-Jacobi and Level Set Equations on Triangulated Domains

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.; Sethian, James A.

    2006-01-01

    Borrowing from techniques developed for conservation law equations, we have developed both monotone and higher order accurate numerical schemes which discretize the Hamilton-Jacobi and level set equations on triangulated domains. The use of unstructured meshes containing triangles (2D) and tetrahedra (3D) easily accommodates mesh adaptation to resolve disparate level set feature scales with a minimal number of solution unknowns. The minisymposium talk will discuss these algorithmic developments and present sample calculations using our adaptive triangulation algorithm applied to various moving interface problems such as etching, deposition, and curvature flow.

  1. Setting the Direction Framework

    ERIC Educational Resources Information Center

    Alberta Education, 2009

    2009-01-01

    Alberta has a long and proud history of meeting the educational needs of students with disabilities and diverse needs. The province serves many thousand students with behavioural, communicational and intellectual needs; as well as students with mental health challenges, learning or physical disabilities and students who are gifted and talented.…

  2. An integrated framework for high level design of high performance signal processing circuits on FPGAs

    NASA Astrophysics Data System (ADS)

    Benkrid, K.; Belkacemi, S.; Sukhsawas, S.

    2005-06-01

    This paper proposes an integrated framework for the high level design of high performance signal processing algorithm implementations on FPGAs. The framework emerged from a constant need to rapidly implement increasingly complicated algorithms on FPGAs while maintaining the high performance needed in many real time digital signal processing applications. This is particularly important for application developers who often rely on iterative and interactive development methodologies. The central idea behind the proposed framework is to dynamically integrate high performance structural hardware description languages with higher level hardware languages in order to help satisfy the dual requirement of high level design and high performance implementation. The paper illustrates this by integrating two environments: Celoxica's Handel-C language, and HIDE, a structural hardware environment developed at the Queen's University of Belfast. On the one hand, Handel-C has been proven to be very useful in the rapid design and prototyping of FPGA circuits, especially control intensive ones. On the other hand, HIDE has been used extensively, and successfully, in the generation of highly optimised parameterisable FPGA cores. In this paper, this is illustrated in the construction of a scalable and fully parameterisable core for image algebra's five core neighbourhood operations, where fully floorplanned efficient FPGA configurations, in the form of EDIF netlists, are generated automatically for instances of the core. In the proposed combined framework, highly optimised data paths are invoked dynamically from within Handel-C, and are synthesized using HIDE. Although the idea might seem simple prima facie, it could have serious implications for the design of future generations of hardware description languages.

  3. Individual-and Setting-Level Correlates of Secondary Traumatic Stress in Rape Crisis Center Staff.

    PubMed

    Dworkin, Emily R; Sorell, Nicole R; Allen, Nicole E

    2016-02-01

    Secondary traumatic stress (STS) is an issue of significant concern among providers who work with survivors of sexual assault. Although STS has been studied in relation to individual-level characteristics of a variety of types of trauma responders, less research has focused specifically on rape crisis centers as environments that might convey risk or protection from STS, and no research to our knowledge has modeled setting-level variation in correlates of STS. The current study uses a sample of 164 staff members representing 40 rape crisis centers across a single Midwestern state to investigate the staff member- and agency-level correlates of STS. Results suggest that correlates exist at both levels of analysis. Younger age and greater severity of sexual assault history were statistically significant individual-level predictors of increased STS. Greater frequency of supervision was more strongly related to secondary stress for non-advocates than for advocates. At the setting level, lower levels of supervision and higher client loads agency-wide accounted for unique variance in staff members' STS. These findings suggest that characteristics of both providers and their settings are important to consider when understanding their STS. PMID:25381285

  4. Accurate Adaptive Level Set Method and Sharpening Technique for Three Dimensional Deforming Interfaces

    NASA Technical Reports Server (NTRS)

    Kim, Hyoungin; Liou, Meng-Sing

    2011-01-01

    In this paper, we demonstrate improved accuracy of the level set method for resolving deforming interfaces by proposing two key elements: (1) accurate level set solutions on adapted Cartesian grids by judiciously choosing interpolation polynomials in regions of different grid levels and (2) enhanced reinitialization by an interface sharpening procedure. The level set equation is solved using a fifth order WENO scheme or a second order central differencing scheme depending on the availability of uniform stencils at each grid point. Grid adaptation criteria are determined so that the Hamiltonian functions at nodes adjacent to interfaces are always calculated by the fifth order WENO scheme. This selective usage between the fifth order WENO and second order central differencing schemes is confirmed to give more accurate results compared to those in the literature for standard test problems. In order to further improve accuracy, especially near thin filaments, we suggest an artificial sharpening method, which is similar in form to the conventional re-initialization method but utilizes the sign of curvature instead of the sign of the level set function. Consequently, volume loss due to numerical dissipation on thin filaments is remarkably reduced for the test problems.
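
    The sharpening procedure described relies on the sign of the interface curvature rather than the sign of the level set function. A standard central-difference computation of the curvature kappa = div(grad(phi)/|grad(phi)|) for a 2-D level set field is sketched below; it is illustrative only, not the authors' adaptive-grid implementation.

```python
import numpy as np

def level_set_curvature(phi, h=1.0, eps=1e-12):
    """Mean curvature kappa = div(grad(phi)/|grad(phi)|) by central differences."""
    phi_y, phi_x = np.gradient(phi, h)           # first derivatives (rows = y, cols = x)
    norm = np.sqrt(phi_x**2 + phi_y**2) + eps    # avoid division by zero
    nx, ny = phi_x / norm, phi_y / norm          # unit normal components
    dny_dy, _ = np.gradient(ny, h)
    _, dnx_dx = np.gradient(nx, h)
    return dnx_dx + dny_dy                       # divergence of the unit normal

# The sharpening idea in the text replaces sgn(phi) by sgn(kappa) in a
# reinitialization-like update, e.g. sign = np.sign(level_set_curvature(phi)).
```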

  5. Nonparametric intensity priors for level set segmentation of low contrast structures.

    PubMed

    Makrogiannis, Sokratis; Bhotika, Rahul; Miller, James V; Skinner, John; Vass, Melissa

    2009-01-01

    Segmentation of low contrast objects is an important task in clinical applications like lesion analysis and vascular wall remodeling analysis. Several solutions to low contrast segmentation that exploit high-level information have been previously proposed, such as shape priors and generative models. In this work, we incorporate a priori distributions of intensity and low-level image information into a nonparametric dissimilarity measure that defines a local indicator function for the likelihood of belonging to a foreground object. We then integrate the indicator function into a level set formulation for segmenting low contrast structures. We apply the technique to the clinical problem of positive remodeling of the vessel wall in cardiac CT angiography images. We present results on a dataset of twenty five patient scans, showing improvement over conventional gradient-based level sets. PMID:20425993

  6. Demons versus level-set motion registration for coronary 18F-sodium fluoride PET

    NASA Astrophysics Data System (ADS)

    Rubeaux, Mathieu; Joshi, Nikhil; Dweck, Marc R.; Fletcher, Alison; Motwani, Manish; Thomson, Louise E.; Germano, Guido; Dey, Damini; Berman, Daniel S.; Newby, David E.; Slomka, Piotr J.

    2016-03-01

    Ruptured coronary atherosclerotic plaques commonly cause acute myocardial infarction. It has been recently shown that active microcalcification in the coronary arteries, one of the features that characterizes vulnerable plaques at risk of rupture, can be imaged using cardiac gated 18F-sodium fluoride (18F-NaF) PET. We have shown in previous work that a motion correction technique applied to cardiac-gated 18F-NaF PET images can enhance image quality and improve uptake estimates. In this study, we further investigated the applicability of different algorithms for registration of the coronary artery PET images. In particular, we aimed to compare demons vs. level-set nonlinear registration techniques applied for the correction of cardiac motion in coronary 18F-NaF PET. To this end, fifteen patients underwent 18F-NaF PET and prospective coronary CT angiography (CCTA). PET data were reconstructed in 10 ECG gated bins; subsequently these gated bins were registered using demons and level-set methods guided by the extracted coronary arteries from CCTA, to eliminate the effect of cardiac motion on PET images. Noise levels, target-to-background ratios (TBR) and global motion were compared to assess image quality. Compared to the reference standard of using only diastolic PET image (25% of the counts from PET acquisition), cardiac motion registration using either level-set or demons techniques almost halved image noise due to the use of counts from the full PET acquisition and increased TBR difference between 18F-NaF positive and negative lesions. The demons method produces smoother deformation fields, exhibiting no singularities (which reflects how physically plausible the registration deformation is), as compared to the level-set method, which presents between 4 and 8% of singularities, depending on the coronary artery considered. In conclusion, the demons method produces smoother motion fields as compared to the level-set method, with a motion that is physiologically

  7. Demons versus Level-Set motion registration for coronary 18F-sodium fluoride PET

    PubMed Central

    Rubeaux, Mathieu; Joshi, Nikhil; Dweck, Marc R.; Fletcher, Alison; Motwani, Manish; Thomson, Louise E.; Germano, Guido; Dey, Damini; Berman, Daniel S.; Newby, David E.; Slomka, Piotr J.

    2016-01-01

    Ruptured coronary atherosclerotic plaques commonly cause acute myocardial infarction. It has been recently shown that active microcalcification in the coronary arteries, one of the features that characterizes vulnerable plaques at risk of rupture, can be imaged using cardiac gated 18F-sodium fluoride (18F-NaF) PET. We have shown in previous work that a motion correction technique applied to cardiac-gated 18F-NaF PET images can enhance image quality and improve uptake estimates. In this study, we further investigated the applicability of different algorithms for registration of the coronary artery PET images. In particular, we aimed to compare demons vs. level-set nonlinear registration techniques applied for the correction of cardiac motion in coronary 18F-NaF PET. To this end, fifteen patients underwent 18F-NaF PET and prospective coronary CT angiography (CCTA). PET data were reconstructed in 10 ECG gated bins; subsequently these gated bins were registered using demons and level-set methods guided by the extracted coronary arteries from CCTA, to eliminate the effect of cardiac motion on PET images. Noise levels, target-to-background ratios (TBR) and global motion were compared to assess image quality. Compared to the reference standard of using only diastolic PET image (25% of the counts from PET acquisition), cardiac motion registration using either level-set or demons techniques almost halved image noise due to the use of counts from the full PET acquisition and increased TBR difference between 18F-NaF positive and negative lesions. The demons method produces smoother deformation fields, exhibiting no singularities (which reflects how physically plausible the registration deformation is), as compared to the level-set method, which presents between 4 and 8% of singularities, depending on the coronary artery considered. In conclusion, the demons method produces smoother motion fields as compared to the level-set method, with a motion that is physiologically

  8. Weld defect detection on digital radiographic image using level set method

    NASA Astrophysics Data System (ADS)

    Halim, Suhaila Abd; Petrus, Bertha Trissan; Ibrahim, Arsmah; Manurung, Yupiter HP; Jayes, Mohd Idris

    2013-09-01

    Segmentation is the most critical task and is widely used to obtain useful information in image processing. In this study, a level set based on the Chan-Vese method is explored and applied to define weld defects on digital radiographic images, and its accuracy is evaluated to measure its performance. A set of images with regions of interest (ROI) that contain defects is used as input. The ROI images are pre-processed to improve their quality for better detection. Then, each image is segmented using the level set method, implemented in MATLAB R2009a. The accuracy of the method is evaluated using the Receiver Operating Characteristic (ROC) curve. Experimental results show that the method generated an area under the ROC curve of 0.7 on the set of images, and the operating point reached corresponds to a sensitivity of 0.6 and a specificity of 0.8. The application of a segmentation technique such as the Chan-Vese level set is able to assist radiographers in detecting defects on digital radiographic images accurately.
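
    The sensitivity, specificity and ROC-area figures quoted here can be computed from pixel-wise scores and a ground-truth mask roughly as sketched below. This is an illustrative evaluation snippet with assumed inputs, not the study's own code.

```python
import numpy as np

def sensitivity_specificity(pred_mask, truth_mask):
    """Pixel-wise sensitivity and specificity of a binary segmentation (boolean masks)."""
    tp = np.sum(pred_mask & truth_mask)
    tn = np.sum(~pred_mask & ~truth_mask)
    fp = np.sum(pred_mask & ~truth_mask)
    fn = np.sum(~pred_mask & truth_mask)
    return tp / (tp + fn), tn / (tn + fp)

def roc_auc(scores, truth_mask, n_thresholds=101):
    """Empirical area under the ROC curve by sweeping a threshold on `scores`."""
    thresholds = np.linspace(scores.min(), scores.max(), n_thresholds)
    pts = [sensitivity_specificity(scores >= t, truth_mask) for t in thresholds]
    tpr = np.array([p[0] for p in pts])
    fpr = 1.0 - np.array([p[1] for p in pts])
    order = np.argsort(fpr)
    tpr, fpr = tpr[order], fpr[order]
    # Trapezoidal integration of TPR over FPR
    return np.sum((fpr[1:] - fpr[:-1]) * (tpr[1:] + tpr[:-1]) / 2.0)
```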

  9. A Measurement Framework for Team Level Assessment of Innovation Capability in Early Requirements Engineering

    NASA Astrophysics Data System (ADS)

    Regnell, Björn; Höst, Martin; Nilsson, Fredrik; Bengtsson, Henrik

    When developing software-intensive products for a market-place it is important for a development organisation to create innovative features for coming releases in order to achieve advantage over competitors. This paper focuses on assessment of innovation capability at team level in relation to the requirements engineering that is taking place before the actual product development projects are decided, when new business models, technology opportunities and intellectual property rights are created and investigated through e.g. prototyping and concept development. The result is a measurement framework focusing on four areas: innovation elicitation, selection, impact and ways-of-working. For each area, candidate measurements were derived from interviews to be used as inspiration in the development of a tailored measurement program. The framework is based on interviews with participants of a software team with specific innovation responsibilities and validated through cross-case analysis and feedback from practitioners.

  10. An investigation of children's levels of inquiry in an informal science setting

    NASA Astrophysics Data System (ADS)

    Clark-Thomas, Beth Anne

    Elementary school students' understanding of both science content and processes is enhanced by the higher level thinking associated with inquiry-based science investigations. Informal science setting personnel, elementary school teachers, and curriculum specialists charged with designing inquiry-based investigations would be well served by an understanding of the varying influence of certain factors upon the students' willingness and ability to delve into such higher level inquiries. This study examined young children's use of inquiry-based materials and factors which may influence the level of inquiry they engaged in during informal science activities. An informal science setting was selected as the context for the examination of student inquiry behaviors because of the rich inquiry-based environment present at the site and the benefits previously noted in the research regarding the impact of informal science settings upon the construction of knowledge in science. The study revealed several patterns of behavior among children when they are engaged in inquiry-based activities at informal science exhibits. These repeated behaviors varied in the children's apparent purposeful use of the materials at the exhibits. These levels of inquiry behavior were taxonomically defined as high/medium/low within this study utilizing a researcher-developed tool. Furthermore, in this study adult interventions, questions, or prompting were found to impact the level of inquiry engaged in by the children. This study revealed that higher levels of inquiry were preceded by task directed and physical feature prompts. Moreover, the levels of inquiry behaviors were halted, even lowered, when preceded by a prompt that focused on a science content or concept question. Results of this study have implications for the enhancement of inquiry-based science activities in elementary schools as well as in informal science settings. These findings have significance for all science educators

  11. Analysis of Forensic Autopsy in 120 Cases of Medical Disputes Among Different Levels of Institutional Settings.

    PubMed

    Yu, Lin-Sheng; Ye, Guang-Hua; Fan, Yan-Yan; Li, Xing-Biao; Feng, Xiang-Ping; Han, Jun-Ge; Lin, Ke-Zhi; Deng, Miao-Wu; Li, Feng

    2015-09-01

    Despite advances in medical science, the causes of death can sometimes only be determined by pathologists after a complete autopsy. Few studies have investigated the importance of forensic autopsy in medically disputed cases among different levels of institutional settings. Our study aimed to analyze forensic autopsy in 120 cases of medical disputes among five levels of institutional settings between 2001 and 2012 in Wenzhou, China. The results showed an overall concordance rate of 55%. Of the 39% of clinically missed diagnoses, cardiovascular pathology comprises 55.32%, while respiratory pathology accounts for the remaining 44.68%. Factors that increased the likelihood of missed diagnoses were private clinics, community settings, and county hospitals. These results support that autopsy remains an important tool in establishing causes of death in medically disputed cases, which may directly determine or exclude fault in medical care and therefore help in resolving these cases. PMID:25929602

  12. Level set segmentation for greenbelts by integrating wavelet texture and priori color knowledge

    NASA Astrophysics Data System (ADS)

    Yang, Tie-jun; Song, Zhi-hui; Jiang, Chuan-xian; Huang, Lin

    2013-09-01

    Segmenting greenbelts quickly and accurately in remote sensing images is an economical and effective method for estimating the green coverage rate (GCR). To address the problem of over-reliance on prior knowledge in the traditional level set segmentation model based on the max-flow/min-cut Graph Cut principle and weighted Total Variation (GCTV), this paper proposes a level set segmentation method that combines regional texture features with prior color knowledge and applies it to greenbelt segmentation in urban remote sensing images. Because the color of greenbelts alone is not reliable for segmentation, the Gabor wavelet transform is used to extract image texture features. We then integrate the extracted features into the GCTV model, which contains only prior color knowledge, and use both the prior knowledge and the targets' texture to constrain the evolution of the level set, which alleviates the over-reliance on prior knowledge. Meanwhile, the convexity of the corresponding energy functional is ensured by using a relaxation and thresholding method, and a primal-dual algorithm with global relabeling is used to accelerate the evolution of the level set. The experiments show that our method can effectively reduce the dependence of GCTV on prior knowledge and yields more accurate greenbelt segmentation results.
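
    A single real-valued Gabor kernel of the kind used to extract wavelet texture features, together with a small filter bank over orientations, is sketched below. The filter parameters are assumptions for illustration, since the paper's bank settings are not given in the abstract.

```python
import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(size=15, wavelength=6.0, theta=0.0, sigma=3.0, gamma=0.5):
    """Real part of a 2-D Gabor filter (Gaussian envelope times a cosine carrier)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_t = x * np.cos(theta) + y * np.sin(theta)     # rotate coordinates by theta
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + (gamma * y_t)**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * x_t / wavelength)
    return envelope * carrier

def gabor_texture_features(image, orientations=(0, np.pi/4, np.pi/2, 3*np.pi/4)):
    """Stack of filter responses, one channel per orientation."""
    return np.stack([convolve(image, gabor_kernel(theta=t)) for t in orientations])
```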

  13. Physical Activity Levels in Coeducational and Single-Gender High School Physical Education Settings

    ERIC Educational Resources Information Center

    Hannon, James; Ratliffe, Thomas

    2005-01-01

    The purpose of this study was to investigate the effects of coeducational (coed) and single-gender game-play settings on the activity levels of Caucasian and African American high school physical education students. Students participated in flag football, ultimate Frisbee, and soccer units. Classes were as follows: there were two coed classes, two…

  14. Re-Setting the Concentration Levels of Students in Higher Education: An Exploratory Study

    ERIC Educational Resources Information Center

    Burke, Lisa A.; Ray, Ruth

    2008-01-01

    Evidence suggests that college students' concentration levels are limited and hard to maintain. Even though relevant in higher education, scant empirical research exists on interventions to "re-set" their concentration during a college lecture. Using a within-subjects design, four active learning interventions are administered across two…

  15. Energy-optimal path planning by stochastic dynamically orthogonal level-set optimization

    NASA Astrophysics Data System (ADS)

    Subramani, Deepak N.; Lermusiaux, Pierre F. J.

    2016-04-01

    A stochastic optimization methodology is formulated for computing energy-optimal paths from among time-optimal paths of autonomous vehicles navigating in a dynamic flow field. Based on partial differential equations, the methodology rigorously leverages the level-set equation that governs time-optimal reachability fronts for a given relative vehicle-speed function. To set up the energy optimization, the relative vehicle-speed and headings are considered to be stochastic and new stochastic Dynamically Orthogonal (DO) level-set equations are derived. Their solution provides the distribution of time-optimal reachability fronts and corresponding distribution of time-optimal paths. An optimization is then performed on the vehicle's energy-time joint distribution to select the energy-optimal paths for each arrival time, among all stochastic time-optimal paths for that arrival time. Numerical schemes to solve the reduced stochastic DO level-set equations are obtained, and accuracy and efficiency considerations are discussed. These reduced equations are first shown to be efficient at solving the governing stochastic level-sets, in part by comparisons with direct Monte Carlo simulations. To validate the methodology and illustrate its accuracy, comparisons with semi-analytical energy-optimal path solutions are then completed. In particular, we consider the energy-optimal crossing of a canonical steady front and set up its semi-analytical solution using an energy-time nested nonlinear double-optimization scheme. We then showcase the inner workings and nuances of the energy-optimal path planning, considering different mission scenarios. Finally, we study and discuss results of energy-optimal missions in a wind-driven barotropic quasi-geostrophic double-gyre ocean circulation.
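
    In the deterministic path-planning literature this work builds on, the time-optimal reachability fronts referred to here are governed by a Hamilton-Jacobi level set equation of roughly the following form, where F is the relative vehicle speed, v the dynamic flow field, and the zero level set of phi is the reachability front. The stochastic DO equations derived in the paper generalize this and are not reproduced here.

```latex
\frac{\partial \phi}{\partial t}
  + F(\mathbf{x},t)\,\lvert \nabla \phi \rvert
  + \mathbf{v}(\mathbf{x},t) \cdot \nabla \phi = 0,
\qquad
\phi(\mathbf{x}, 0) = \lvert \mathbf{x} - \mathbf{x}_{\mathrm{start}} \rvert
```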

  16. Large Code Set for Double User Capacity and Low PAPR Level in Multicarrier Systems

    NASA Astrophysics Data System (ADS)

    Anwar, Khoirul; Saito, Masato; Hara, Takao; Okada, Minoru

    In this paper, a new large spreading code set with uniformly low cross-correlation is proposed. The proposed code set is capable of (1) increasing the number of assigned users (capacity) in a multicarrier code division multiple access (MC-CDMA) system and (2) reducing the peak-to-average power ratio (PAPR) of an orthogonal frequency division multiplexing (OFDM) system. We derive the new code set and present an example to demonstrate the performance improvements for OFDM and MC-CDMA systems. Our proposed code set with code length N provides K = 2N+1 codes, supporting up to 2N+1 users, and exhibits lower cross-correlation than the existing spreading code sets. Our results with N = 16 subcarriers confirm that the proposed code set outperforms the current pseudo-orthogonal carrier interferometry (POCI) code set with a gain of 5 dB at a bit-error-rate (BER) level of 10^-4 in the additive white Gaussian noise (AWGN) channel and a gain of more than 3.6 dB in a multipath fading channel.
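
    The PAPR figure of merit targeted by the proposed code set can be measured for any spreading vector as sketched below, by forming the oversampled OFDM time-domain symbol and comparing peak power to mean power. This is a generic illustration, not the proposed code construction itself.

```python
import numpy as np

def papr_db(frequency_symbols, oversample=4):
    """Peak-to-average power ratio (in dB) of one OFDM symbol.

    frequency_symbols : complex subcarrier amplitudes (e.g. data spread by a code)
    oversample        : zero-padding factor for a better estimate of the analog peak
    """
    n = len(frequency_symbols)
    padded = np.concatenate([frequency_symbols, np.zeros((oversample - 1) * n)])
    x = np.fft.ifft(padded)                       # oversampled time-domain OFDM waveform
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

# Example: all-ones spreading (coherent, worst-case peak) vs. random subcarrier phases
print(papr_db(np.ones(16)), papr_db(np.exp(2j * np.pi * np.random.rand(16))))
```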

  17. A GPU Accelerated Discontinuous Galerkin Conservative Level Set Method for Simulating Atomization

    NASA Astrophysics Data System (ADS)

    Jibben, Zechariah J.

    This dissertation describes a process for interface capturing via an arbitrary-order, nearly quadrature free, discontinuous Galerkin (DG) scheme for the conservative level set method (Olsson et al., 2005, 2008). The DG numerical method is utilized to solve both advection and reinitialization, and executed on a refined level set grid (Herrmann, 2008) for effective use of processing power. Computation is executed in parallel utilizing both CPU and GPU architectures to make the method feasible at high order. Finally, a sparse data structure is implemented to take full advantage of parallelism on the GPU, where performance relies on well-managed memory operations. With solution variables projected into a kth order polynomial basis, a k + 1 order convergence rate is found for both advection and reinitialization tests using the method of manufactured solutions. Other standard test cases, such as Zalesak's disk and deformation of columns and spheres in periodic vortices are also performed, showing several orders of magnitude improvement over traditional WENO level set methods. These tests also show the impact of reinitialization, which often increases shape and volume errors as a result of level set scalar trapping by normal vectors calculated from the local level set field. Accelerating advection via GPU hardware is found to provide a 30x speedup factor comparing a 2.0GHz Intel Xeon E5-2620 CPU in serial vs. a Nvidia Tesla K20 GPU, with speedup factors increasing with polynomial degree until shared memory is filled. A similar algorithm is implemented for reinitialization, which relies on heavier use of shared and global memory and as a result fills them more quickly and produces smaller speedups of 18x.

  18. Online monitoring of oil film using electrical capacitance tomography and level set method

    SciTech Connect

    Xue, Q. Ma, M.; Sun, B. Y.; Cui, Z. Q.; Wang, H. X.

    2015-08-15

    In the application of oil-air lubrication system, electrical capacitance tomography (ECT) provides a promising way for monitoring oil film in the pipelines by reconstructing cross sectional oil distributions in real time. While in the case of small diameter pipe and thin oil film, the thickness of the oil film is hard to be observed visually since the interface of oil and air is not obvious in the reconstructed images. And the existence of artifacts in the reconstructions has seriously influenced the effectiveness of image segmentation techniques such as level set method. Besides, level set method is also unavailable for online monitoring due to its low computation speed. To address these problems, a modified level set method is developed: a distance regularized level set evolution formulation is extended to image two-phase flow online using an ECT system, a narrowband image filter is defined to eliminate the influence of artifacts, and considering the continuity of the oil distribution variation, the detected oil-air interface of a former image can be used as the initial contour for the detection of the subsequent frame; thus, the propagation from the initial contour to the boundary can be greatly accelerated, making it possible for real time tracking. To testify the feasibility of the proposed method, an oil-air lubrication facility with 4 mm inner diameter pipe is measured in normal operation using an 8-electrode ECT system. Both simulation and experiment results indicate that the modified level set method is capable of visualizing the oil-air interface accurately online.

  19. Online monitoring of oil film using electrical capacitance tomography and level set method

    NASA Astrophysics Data System (ADS)

    Xue, Q.; Sun, B. Y.; Cui, Z. Q.; Ma, M.; Wang, H. X.

    2015-08-01

    In oil-air lubrication systems, electrical capacitance tomography (ECT) provides a promising way of monitoring the oil film in pipelines by reconstructing cross-sectional oil distributions in real time. However, for small-diameter pipes and thin oil films, the film thickness is hard to observe visually because the oil-air interface is not distinct in the reconstructed images. Artifacts in the reconstructions also seriously degrade the effectiveness of image segmentation techniques such as the level set method. Moreover, the level set method is unsuitable for online monitoring because of its low computation speed. To address these problems, a modified level set method is developed: a distance regularized level set evolution formulation is extended to image two-phase flow online using an ECT system, a narrowband image filter is defined to eliminate the influence of artifacts, and, exploiting the continuity of the oil distribution over time, the oil-air interface detected in one frame is used as the initial contour for the subsequent frame; thus, the propagation from the initial contour to the boundary is greatly accelerated, making real-time tracking possible. To verify the feasibility of the proposed method, an oil-air lubrication facility with a 4 mm inner-diameter pipe is measured during normal operation using an 8-electrode ECT system. Both simulation and experimental results indicate that the modified level set method is capable of visualizing the oil-air interface accurately online.

  20. Online monitoring of oil film using electrical capacitance tomography and level set method.

    PubMed

    Xue, Q; Sun, B Y; Cui, Z Q; Ma, M; Wang, H X

    2015-08-01

    In oil-air lubrication systems, electrical capacitance tomography (ECT) provides a promising way of monitoring the oil film in pipelines by reconstructing cross-sectional oil distributions in real time. However, for small-diameter pipes and thin oil films, the film thickness is hard to observe visually because the oil-air interface is not distinct in the reconstructed images. Artifacts in the reconstructions also seriously degrade the effectiveness of image segmentation techniques such as the level set method. Moreover, the level set method is unsuitable for online monitoring because of its low computation speed. To address these problems, a modified level set method is developed: a distance regularized level set evolution formulation is extended to image two-phase flow online using an ECT system, a narrowband image filter is defined to eliminate the influence of artifacts, and, exploiting the continuity of the oil distribution over time, the oil-air interface detected in one frame is used as the initial contour for the subsequent frame; thus, the propagation from the initial contour to the boundary is greatly accelerated, making real-time tracking possible. To verify the feasibility of the proposed method, an oil-air lubrication facility with a 4 mm inner-diameter pipe is measured during normal operation using an 8-electrode ECT system. Both simulation and experimental results indicate that the modified level set method is capable of visualizing the oil-air interface accurately online. PMID:26329232

  1. Loosely coupled level sets for retinal layers and drusen segmentation in subjects with dry age-related macular degeneration

    NASA Astrophysics Data System (ADS)

    Novosel, Jelena; Wang, Ziyuan; de Jong, Henk; Vermeer, Koenraad A.; van Vliet, Lucas J.

    2016-03-01

    Optical coherence tomography (OCT) is used to produce high-resolution three-dimensional images of the retina, which permit the investigation of retinal irregularities. In dry age-related macular degeneration (AMD), a chronic eye disease that causes central vision loss, disruptions such as drusen and changes in retinal layer thickness occur, which could be used as biomarkers for disease monitoring and diagnosis. Due to the topology-disrupting pathology, existing segmentation methods often fail. Here, we present a solution for the segmentation of retinal layers in dry AMD subjects by extending our previously presented loosely coupled level sets framework, which operates on attenuation coefficients. In eyes affected by AMD, Bruch's membrane becomes visible only below the drusen, and our segmentation framework is adapted to delineate such a partially discernible interface. Furthermore, the initialization stage, which tentatively segments five interfaces, is modified to accommodate the appearance of drusen. This stage is based on Dijkstra's algorithm and combines prior knowledge on the shape of the interface, the gradient, and the attenuation coefficient in the newly proposed cost function. This prior knowledge is incorporated by varying the weights for horizontal, diagonal and vertical edges. Finally, quantitative evaluation of the accuracy shows a good agreement between manual and automated segmentation.
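
    A minimal sketch of the initialization idea described above: Dijkstra's algorithm on a pixel graph whose horizontal, diagonal, and vertical edges carry different weights, so that prior knowledge on the interface shape can be encoded. The cost image and weight values below are illustrative assumptions, not the paper's cost function.

```python
# Layered shortest-path initialization with direction-dependent edge weights.
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import dijkstra

def interface_path(cost, w_h=1.0, w_d=1.4, w_v=2.0):
    """Cheapest left-to-right path through `cost` (H x W); a smaller weight
    favours steps in that direction."""
    H, W = cost.shape
    rows, cols, data = [], [], []
    steps = [(0, 1, w_h), (-1, 1, w_d), (1, 1, w_d), (-1, 0, w_v), (1, 0, w_v)]
    for r in range(H):
        for c in range(W):
            for dr, dc, w in steps:
                nr, nc = r + dr, c + dc
                if 0 <= nr < H and 0 <= nc < W:
                    rows.append(r * W + c)
                    cols.append(nr * W + nc)
                    data.append(w * 0.5 * (cost[r, c] + cost[nr, nc]))
    graph = coo_matrix((data, (rows, cols)), shape=(H * W, H * W))
    sources = [r * W for r in range(H)]              # every pixel in column 0
    targets = [r * W + (W - 1) for r in range(H)]    # every pixel in column W-1
    dist, pred = dijkstra(graph, directed=True, indices=sources,
                          return_predecessors=True)
    i, j = np.unravel_index(np.argmin(dist[:, targets]), (len(sources), len(targets)))
    node, path = targets[j], []
    while node >= 0:                                 # trace back to the source
        path.append((node // W, node % W))
        node = pred[i, node]
    return path[::-1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cost = rng.random((20, 40))
    cost[10, :] = 0.01        # a cheap "interface" row the path should follow
    print(interface_path(cost)[:5])
```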

  2. Brain extraction from cerebral MRI volume using a hybrid level set based active contour neighborhood model

    PubMed Central

    2013-01-01

    Background The extraction of brain tissue from cerebral MRI volumes is an important pre-processing step for neuroimage analyses. The authors have developed an accurate and robust brain extraction method using a hybrid level set based active contour neighborhood model. Methods The method uses a nonlinear speed function in the hybrid level set model to eliminate boundary leakage. When using the new hybrid level set model, an active contour neighborhood model is applied iteratively in the neighborhood of the brain boundary. A slice-by-slice contour initialization method is proposed to obtain the neighborhood of the brain boundary. The method was applied to the brain MRI data provided by the Internet Brain Segmentation Repository (IBSR). Results In testing, a mean Dice similarity coefficient of 0.95±0.02 and a mean Hausdorff distance of 12.4±4.5 were obtained when applying our method across the IBSR data set (18 × 1.5 mm scans). The results obtained using our method were very similar to those produced by manual segmentation and achieved the smallest mean Hausdorff distance on the IBSR data. Conclusions An automatic method of brain extraction from cerebral MRI volumes was achieved and produced competitively accurate results. PMID:23587217
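
    The two metrics reported above can be computed as sketched below for a pair of binary masks; the toy volumes and the voxel spacing are assumptions for illustration only.

```python
# Dice similarity coefficient and a spacing-aware Hausdorff distance for two
# binary segmentation masks.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def hausdorff(a, b, spacing=(1.5, 1.0, 1.0)):
    """Symmetric Hausdorff distance (mm) between the voxel sets of two masks."""
    pa = np.argwhere(a) * np.asarray(spacing)
    pb = np.argwhere(b) * np.asarray(spacing)
    return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])

if __name__ == "__main__":
    truth = np.zeros((8, 32, 32), dtype=bool)
    truth[2:6, 8:24, 8:24] = True                 # "manual" segmentation
    auto = np.roll(truth, shift=2, axis=2)        # automated result, shifted 2 voxels
    print(f"Dice = {dice(truth, auto):.3f}, HD = {hausdorff(truth, auto):.1f} mm")
```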

  3. Use of a General Level Framework to Facilitate Performance Improvement in Hospital Pharmacists in Singapore

    PubMed Central

    Wong, Camilla; Coombes, Ian; Cardiff, Lynda; Duggan, Catherine; Yee, Mei-Ling; Wee Lim, Kiat; Bates, Ian

    2012-01-01

    Objective. To evaluate the acceptability and validity of an adapted version of the General Level Framework (GLF) as a tool to facilitate and evaluate performance development in general pharmacist practitioners (those with less than 3 years of experience) in a Singapore hospital. Method. Observational evaluations during daily clinical activities were prospectively recorded for 35 pharmacists using the GLF at 2 time points over an average of 9 months. Feedback was provided to the pharmacists and then individualized learning plans were formulated. Results. Pharmacists’ mean competency cluster scores improved in all 3 clusters, and significant improvement was seen in all but 8 of the 63 behavioral descriptors (p ≤ 0.05). Nonsignificant improvements were attributed to the highest level of performance having been attained upon initial evaluation. Feedback indicated that the GLF process was a positive experience, prompting reflection on practice and culminating in needs-based learning and ultimately improved patient care. Conclusions. The General Level Framework was an acceptable tool for the facilitation and evaluation of performance development in general pharmacist practitioners in a Singapore hospital. PMID:22919083

  4. Level set algorithms comparison for multi-slice CT left ventricle segmentation

    NASA Astrophysics Data System (ADS)

    Medina, Ruben; La Cruz, Alexandra; Ordoñes, Andrés; Pesántez, Daniel; Morocho, Villie; Vanegas, Pablo

    2015-12-01

    The comparison of several Level Set algorithms is performed with respect to 2D left ventricle segmentation in Multi-Slice CT images. Five algorithms are compared by calculating the Dice coefficient between the resulting segmentation contour and a reference contour traced by a cardiologist. The algorithms are also tested on images contaminated with Gaussian noise for several values of PSNR. Additionally, an algorithm for providing the initialization shape is proposed. This algorithm is based on a combination of mathematical morphology tools with watershed and region growing algorithms. Results on the set of test images are promising and suggest the extension to 3-D MSCT database segmentation.
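
    A minimal sketch of the noise-robustness test described above: Gaussian noise calibrated to a target PSNR is added to an image and a segmentation is scored against the reference contour with the Dice coefficient. The synthetic image and the threshold "segmentation" are placeholders for the five level set algorithms compared in the paper.

```python
# Contaminate an image with zero-mean Gaussian noise at a target PSNR, then
# compare a segmentation against the reference with the Dice coefficient.
import numpy as np

def add_gaussian_noise(img, psnr_db, peak=1.0, rng=None):
    """Additive noise with sigma chosen so that 10*log10(peak^2/sigma^2) = psnr_db."""
    rng = np.random.default_rng() if rng is None else rng
    sigma = peak / (10.0 ** (psnr_db / 20.0))
    return img + rng.normal(0.0, sigma, img.shape)

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * (a & b).sum() / (a.sum() + b.sum())

if __name__ == "__main__":
    yy, xx = np.mgrid[:128, :128]
    reference = (yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2   # "cardiologist" contour
    image = reference.astype(float)
    for psnr in (30, 20, 10):
        noisy = add_gaussian_noise(image, psnr, rng=np.random.default_rng(0))
        seg = noisy > 0.5                                   # stand-in segmentation
        print(f"PSNR {psnr:2d} dB -> Dice {dice(seg, reference):.3f}")
```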

  5. A variational level set method for the topology optimization of steady-state Navier-Stokes flow

    NASA Astrophysics Data System (ADS)

    Zhou, Shiwei; Li, Qing

    2008-12-01

    The smoothness of topological interfaces often largely affects the fluid optimization and sometimes makes the density-based approaches, though well established in structural designs, inadequate. This paper presents a level-set method for topology optimization of steady-state Navier-Stokes flow subject to a specific fluid volume constraint. The solid-fluid interface is implicitly characterized by a zero-level contour of a higher-order scalar level set function and can be naturally transformed to other configurations as its host moves. A variational form of the cost function is constructed based upon the adjoint variable and Lagrangian multiplier techniques. To satisfy the volume constraint effectively, the Lagrangian multiplier derived from the first-order approximation of the cost function is amended by the bisection algorithm. The procedure allows an initial design to evolve toward an optimal shape and/or topology by solving the Hamilton-Jacobi equation. Two classes of benchmarking examples are presented in this paper: (1) periodic microstructural material design for the maximum permeability; and (2) topology optimization of flow channels for minimizing energy dissipation. A number of 2D and 3D examples demonstrate the feasibility and advantages of the level-set method in solving fluid-solid shape and topology optimization problems.
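
    The Hamilton-Jacobi evolution mentioned above can be sketched with a first-order upwind (Godunov) update of phi_t + V_n |grad phi| = 0; the uniform normal speed used below is a placeholder, since in the paper V_n comes from the adjoint sensitivity of the flow cost function.

```python
# First-order upwind (Godunov) Hamilton-Jacobi update for level set evolution
# under a prescribed normal speed field.
import numpy as np

def evolve(phi, vn, dx, dt, steps):
    for _ in range(steps):
        # one-sided differences
        dmx = (phi - np.roll(phi, 1, axis=1)) / dx
        dpx = (np.roll(phi, -1, axis=1) - phi) / dx
        dmy = (phi - np.roll(phi, 1, axis=0)) / dx
        dpy = (np.roll(phi, -1, axis=0) - phi) / dx
        # Godunov upwinding for positive and negative normal speeds
        grad_p = np.sqrt(np.maximum(dmx, 0) ** 2 + np.minimum(dpx, 0) ** 2 +
                         np.maximum(dmy, 0) ** 2 + np.minimum(dpy, 0) ** 2)
        grad_m = np.sqrt(np.minimum(dmx, 0) ** 2 + np.maximum(dpx, 0) ** 2 +
                         np.minimum(dmy, 0) ** 2 + np.maximum(dpy, 0) ** 2)
        phi = phi - dt * (np.maximum(vn, 0) * grad_p + np.minimum(vn, 0) * grad_m)
    return phi

if __name__ == "__main__":
    n, dx = 128, 1.0 / 128
    y, x = np.mgrid[:n, :n] * dx
    phi = np.sqrt((x - 0.5) ** 2 + (y - 0.5) ** 2) - 0.25   # signed distance to a circle
    vn = -0.5 * np.ones_like(phi)                           # uniform shrinking speed
    phi = evolve(phi, vn, dx, dt=0.5 * dx, steps=40)
    print("solid (phi < 0) area fraction:", float((phi < 0).mean()))
```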

  6. A patient-centered pharmacy services model of HIV patient care in community pharmacy settings: a theoretical and empirical framework.

    PubMed

    Kibicho, Jennifer; Owczarzak, Jill

    2012-01-01

    Reflecting trends in health care delivery, pharmacy practice has shifted from a drug-specific to a patient-centered model of care, aimed at improving the quality of patient care and reducing health care costs. In this article, we outline a theoretical model of patient-centered pharmacy services (PCPS), based on in-depth, qualitative interviews with a purposive sample of 28 pharmacists providing care to HIV-infected patients in specialty, semispecialty, and nonspecialty pharmacy settings. Data analysis was an interactive process informed by pharmacists' interviews and a review of the general literature on patient-centered care, including Medication Therapy Management (MTM) services. Our main finding was that the current models of pharmacy services, including MTM, do not capture the range of pharmacy services in excess of mandated drug dispensing services. In this article, we propose a theoretical PCPS model that reflects the actual services pharmacists provide. The model includes five elements: (1) addressing patients as whole, contextualized persons; (2) customizing interventions to unique patient circumstances; (3) empowering patients to take responsibility for their own health care; (4) collaborating with clinical and nonclinical providers to address patient needs; and (5) developing sustained relationships with patients. The overarching goal of PCPS is to empower patients to take responsibility for their own health care and self-manage their HIV infection. Our findings provide the foundation for future studies regarding how widespread these practices are in diverse community settings, the validity of the proposed PCPS model, the potential for standardizing pharmacist practices, and the feasibility of a PCPS framework to reimburse pharmacists' services. PMID:22149903

  7. Geological repository for nuclear high level waste in France from feasibility to design within a legal framework

    SciTech Connect

    Voizard, Patrice; Mayer, Stefan; Ouzounian, Gerald

    2007-07-01

    Over the past 15 years, the French program on deep geologic disposal of high level and long-lived radioactive waste has benefited from a clear legal framework as the result of the December 30, 1991 French Waste Act. To fulfil its obligations stipulated in this law, ANDRA has submitted the 'Dossier 2005 Argile' (clay) and 'Dossier 2005 Granite' to the French Government. The first of those reports presents a concept for the underground disposal of nuclear waste at a specific clay site and focuses on a feasibility study. Knowledge of the host rock characteristics is based on the investigations carried out at the Meuse/Haute-Marne Underground Research Laboratory. The repository concept addresses various issues, the most important of which relate to the large amount of waste, the clay host rock and the reversibility requirement. This phase ended with the review and evaluation of the 'Dossier 2005' by different organisations, including the National Review Board, the National Safety Authority and the NEA International Review Team. By passing the new Planning Act of June 28, 2006 on the sustainable management of radioactive materials and waste, the French parliament has further defined a clear legal framework for future work. This Planning Act thus sets a schedule and defines the objectives for the next phase of repository design by requesting the submission of a construction authorization application by 2015. The law calls for the repository program to be in a position to commission disposal installations by 2025. (authors)

  8. Not Your Basic Base Levels: Simulations of Erosion and Deposition With Fluctuating Water Levels in Coastal and Enclosed Basin Settings

    NASA Astrophysics Data System (ADS)

    Howard, A. D.; Matsubara, Y.; Lloyd, H.

    2006-12-01

    The DELIM landform evolution model has been adapted to investigate erosional and depositional landforms in two settings with fluctuating base levels. The first is erosion and wave planation of terraced landscapes in Coastal Plain sediments along the estuarine Potomac River. The last 3.5 million years of erosion is simulated with base level fluctuations based upon the long-term oceanic δ18O record, eustatic sea level changes during the last 120 ka, estimates of the history of tectonic uplift in the region, and maximum depths of incision of the Potomac River during sea-level lowstands. Inhibition of runoff erosion by vegetation has been a crucial factor allowing persistence of uplands in the soft coastal plain bedrock. The role of vegetation is simulated as a contributing-area-dependent critical shear stress. Development of wave-cut terraces is simulated by episodic planation of the landscape during base-level highstands. Although low base level excursions are infrequent and of short duration, the total amount of erosion is largely controlled by the depth and frequency of lowstands. The model has also been adapted to account for flow routing and accompanying erosion and sedimentation in landscapes with multiple enclosed depressions. The hydrological portion of the model has been calibrated and tested in the Great Basin and Mojave regions of the southwestern U.S. In such a setting, runoff, largely from mountains, may flow through several lacustrine basins, each with evaporative losses. An iterative approach determines the size and depth of lakes, including overflow (or not) that balances runoff and evaporation. The model utilizes information on temperatures, rainfall, runoff, and evaporation within the region to parameterize evaporation and runoff as functions of latitude, mean annual temperature, precipitation, and elevation. The model is successful in predicting the location of modern perennial lakes in the region as well as that of lakes during the last

  9. Level set segmentation of bovine corpora lutea in ex situ ovarian ultrasound images

    PubMed Central

    Rusnell, Brennan J; Pierson, Roger A; Singh, Jaswant; Adams, Gregg P; Eramian, Mark G

    2008-01-01

    Background The objective of this study was to investigate the viability of level set image segmentation methods for the detection of corpora lutea (corpus luteum, CL) boundaries in ultrasonographic ovarian images. It was hypothesized that bovine CL boundaries could be located within 1–2 mm by a level set image segmentation methodology. Methods Level set methods embed a 2D contour in a 3D surface and evolve that surface over time according to an image-dependent speed function. A speed function suitable for segmentation of CLs in ovarian ultrasound images was developed. An initial contour was manually placed and contour evolution was allowed to proceed until the rate of change of the area was sufficiently small. The method was tested on ovarian ultrasonographic images (n = 8) obtained ex situ. An expert in ovarian ultrasound interpretation delineated CL boundaries manually to serve as a "ground truth". Accuracy of the level set segmentation algorithm was determined by comparing semi-automatically determined contours with ground truth contours using the mean absolute difference (MAD), root mean squared difference (RMSD), Hausdorff distance (HD), sensitivity, and specificity metrics. Results and discussion The mean MAD was 0.87 mm (sigma = 0.36 mm), RMSD was 1.1 mm (sigma = 0.47 mm), and HD was 3.4 mm (sigma = 2.0 mm) indicating that, on average, boundaries were accurate within 1–2 mm; however, deviations in excess of 3 mm from the ground truth were observed, indicating under- or over-expansion of the contour. Mean sensitivity and specificity were 0.814 (sigma = 0.171) and 0.990 (sigma = 0.00786), respectively, indicating that CLs were consistently undersegmented but rarely did the contour interior include pixels that were judged by the human expert not to be part of the CL. It was observed that in localities where gradient magnitudes within the CL were strong due to high contrast speckle, contour expansion stopped too early. Conclusion The hypothesis that level set

  10. A Cartesian Adaptive Level Set Method for Two-Phase Flows

    NASA Technical Reports Server (NTRS)

    Ham, F.; Young, Y.-N.

    2003-01-01

    In the present contribution we develop a level set method based on local anisotropic Cartesian adaptation as described in Ham et al. (2002). Such an approach should allow for the smallest possible Cartesian grid capable of resolving a given flow. The remainder of the paper is organized as follows. In section 2 the level set formulation for free surface calculations is presented and its strengths and weaknesses relative to the other free surface methods are reviewed. In section 3 the collocated numerical method is described. In section 4 the method is validated by solving the 2D and 3D drop oscillation problem. In section 5 we present some results from more complex cases including the 3D drop breakup in an impulsively accelerated free stream, and the 3D immiscible Rayleigh-Taylor instability. Conclusions are given in section 6.

  11. Parallel computation of level set method for 500 Hz visual servo control

    NASA Astrophysics Data System (ADS)

    Fei, Xianfeng; Igarashi, Yasunobu; Hashimoto, Koichi

    2008-11-01

    We propose a 2D microorganism tracking system using a parallel level set method and a column parallel vision system (CPV). This system keeps a single microorganism in the middle of the visual field under a microscope by visual servoing of an automated stage. We propose a new energy function for the level set method. This function constrains the amount of light intensity inside the detected object contour to control the number of detected objects. This algorithm is implemented in the CPV system, and the computational time for each frame is approximately 2 ms. A tracking experiment of about 25 s is demonstrated. We also demonstrate that a single paramecium can be kept under tracking even if other paramecia appear in the visual field and contact the tracked paramecium.

  12. Segmentation of cardiac cine-MR images and myocardial deformation assessment using level set methods.

    PubMed

    Chenoune, Y; Deléchelle, E; Petit, E; Goissen, T; Garot, J; Rahmouni, A

    2005-12-01

    In this paper, we present an original method to assess the deformations of the left ventricular myocardium on cardiac cine-MRI. First, a segmentation process, based on a level set method is directly applied on a 2D + t dataset to detect endocardial contours. Second, the successive segmented contours are matched using a procedure of global alignment, followed by a morphing process based on a level set approach. Finally, local measurements of myocardial deformations are derived from the previously determined matched contours. The validation step is realized by comparing our results to the measurements achieved on the same patients by an expert using the semi-automated HARP reference method on tagged MR images. PMID:16290086

  13. A semi-implicit level set method for multiphase flows and fluid-structure interaction problems

    NASA Astrophysics Data System (ADS)

    Cottet, Georges-Henri; Maitre, Emmanuel

    2016-06-01

    In this paper we present a novel semi-implicit time-discretization of the level set method introduced in [8] for fluid-structure interaction problems. The idea stems from a linear stability analysis derived on a simplified one-dimensional problem. The semi-implicit scheme relies on a simple filter operating as a pre-processing step on the level set function. It applies to multiphase flows driven by surface tension as well as to fluid-structure interaction problems. The semi-implicit scheme avoids the stability constraints that explicit schemes need to satisfy and significantly reduces the computational cost. It is validated through comparisons with the original explicit scheme and refinement studies on two-dimensional benchmarks.
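
    A loose sketch of the idea described above, under stated assumptions: the level set function is smoothed by a simple filter before each transport step, standing in here for the paper's specific stabilizing filter, followed by a first-order upwind advection step.

```python
# Filter-then-advect sketch: a Gaussian filter smooths the level set function
# before an explicit first-order upwind transport step.
import numpy as np
from scipy.ndimage import gaussian_filter

def semi_implicit_step(phi, u, v, dt, dx, sigma_px=0.7):
    # 1) pre-processing filter on the level set function (stabilizing smoothing)
    phi = gaussian_filter(phi, sigma=sigma_px, mode="nearest")
    # 2) explicit first-order upwind advection by the velocity field (u, v)
    dmx = (phi - np.roll(phi, 1, axis=1)) / dx
    dpx = (np.roll(phi, -1, axis=1) - phi) / dx
    dmy = (phi - np.roll(phi, 1, axis=0)) / dx
    dpy = (np.roll(phi, -1, axis=0) - phi) / dx
    dphidx = np.where(u > 0, dmx, dpx)
    dphidy = np.where(v > 0, dmy, dpy)
    return phi - dt * (u * dphidx + v * dphidy)

if __name__ == "__main__":
    n, dx, dt = 128, 1.0 / 128, 2e-3
    y, x = np.mgrid[:n, :n] * dx
    phi = np.sqrt((x - 0.5) ** 2 + (y - 0.3) ** 2) - 0.15   # circle, centre y = 0.3
    u, v = np.zeros_like(phi), np.ones_like(phi)            # uniform upward transport
    for _ in range(50):
        phi = semi_implicit_step(phi, u, v, dt, dx)
    print("interface centre is now near y =", round(float(y[phi < 0].mean()), 2))
```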

  14. Feasibility of level-set analysis of enface OCT retinal images in diabetic retinopathy

    PubMed Central

    Mohammad, Fatimah; Ansari, Rashid; Wanek, Justin; Francis, Andrew; Shahidi, Mahnaz

    2015-01-01

    Pathology segmentation in retinal images of patients with diabetic retinopathy is important to help better understand disease processes. We propose an automated level-set method with Fourier descriptor-based shape priors. A cost function measures the difference between the current and expected output. We applied our method to enface images generated for seven retinal layers and determined the correspondence of pathologies between retinal layers. We compared our method to a distance-regularized level set method and showed the advantages of using well-defined shape priors. The results obtained allow us to observe pathologies across multiple layers and to obtain metrics that measure the co-localization of pathologies in different layers. PMID:26137390
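
    The Fourier descriptor shape prior mentioned above can be sketched as follows: a closed contour is encoded as complex samples, its low-order Fourier coefficients are normalized, and the distance to a prior shape's descriptors serves as the shape cost. The circle prior and the number of harmonics kept are illustrative assumptions.

```python
# Fourier descriptors of a closed contour as a simple shape-prior cost.
import numpy as np

def fourier_descriptors(contour_xy, n_harmonics=8):
    """contour_xy: (N, 2) ordered points on a closed contour."""
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]
    coeffs = np.fft.fft(z)
    coeffs[0] = 0.0                         # discard translation
    coeffs = coeffs / np.abs(coeffs[1])     # normalize scale by the first harmonic
    idx = np.r_[1:n_harmonics + 1, -n_harmonics:0]
    return np.abs(coeffs[idx])              # magnitudes: rotation/start-point invariant

def shape_cost(contour_xy, prior_xy, n_harmonics=8):
    d = fourier_descriptors(contour_xy, n_harmonics)
    p = fourier_descriptors(prior_xy, n_harmonics)
    return float(np.sum((d - p) ** 2))

if __name__ == "__main__":
    t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
    circle = np.c_[np.cos(t), np.sin(t)]
    blob = np.c_[(1 + 0.3 * np.cos(5 * t)) * np.cos(t),
                 (1 + 0.3 * np.cos(5 * t)) * np.sin(t)]
    print("circle vs circle prior:", shape_cost(circle, circle))
    print("blob   vs circle prior:", shape_cost(blob, circle))
```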

  15. Automatic segmentation of Leishmania parasite in microscopic images using a modified CV level set method

    NASA Astrophysics Data System (ADS)

    Farahi, Maria; Rabbani, Hossein; Talebi, Ardeshir; Sarrafzadeh, Omid; Ensafi, Shahab

    2015-12-01

    Visceral leishmaniasis is a parasitic disease that affects the liver, spleen and bone marrow. According to the World Health Organization, definitive diagnosis is possible only by direct observation of the Leishman body in microscopic images taken from bone marrow samples. We utilize morphological operations and the CV level set method to segment Leishman bodies in digital color microscopic images captured from bone marrow samples. A linear contrast stretching method is used for image enhancement, and a morphological method is applied to determine the parasite regions and remove unwanted objects. Modified global and local CV level set methods are proposed for segmentation, and a shape-based stopping factor is used to hasten the algorithm. Manual segmentation is considered as ground truth to evaluate the proposed method. The method is tested on 28 samples and achieves a mean segmentation error of 10.90% for the global model and 9.76% for the local model.
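
    A minimal sketch of the pre-processing stages described above: linear contrast stretching followed by thresholding, morphological opening, and small-object removal to isolate candidate parasite regions before the level set stage. The structuring-element size and area threshold are assumptions for illustration.

```python
# Contrast stretching plus morphological cleanup of candidate regions.
import numpy as np
from skimage import exposure, filters, morphology

def candidate_regions(gray, min_area=30):
    # linear contrast stretching between the 2nd and 98th intensity percentiles
    p2, p98 = np.percentile(gray, (2, 98))
    stretched = exposure.rescale_intensity(gray, in_range=(p2, p98))
    # threshold, then clean up with morphology to remove unwanted objects
    mask = stretched > filters.threshold_otsu(stretched)
    mask = morphology.opening(mask, morphology.disk(2))
    mask = morphology.remove_small_objects(mask, min_size=min_area)
    return mask

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = 0.2 * rng.random((128, 128))
    img[40:48, 40:48] += 0.7        # a bright blob standing in for a Leishman body
    img[90:92, 90:92] += 0.7        # a small speck that should be removed
    mask = candidate_regions(img)
    print("candidate pixels:", int(mask.sum()))
```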

  16. An image-set for identifying multiple regions/levels of interest in digital images

    NASA Astrophysics Data System (ADS)

    Jaber, Mustafa; Bailly, Mark; Wang, Yuqiong; Saber, Eli

    2011-09-01

    In the field of identifying regions-of-interest (ROI) in digital images, several image-sets are referenced in the literature; the open-source ones typically present a single main object (usually located at or near the image center as a pop-out). In this paper, we present a comprehensive image-set (with its ground-truth) which will be made publicly available. The database consists of images that demonstrate multiple-regions-of-interest (MROI) or multiple-levels-of-interest (MLOI). The former term signifies that the scene has a group of subjects/objects (not necessarily spatially connected regions) that share the same level of perceptual priority to the human observer, while the latter indicates that the scene is complex enough to have primary, secondary, and background objects. The methodology for developing the proposed image-set is described. A psychophysical experiment to identify MROI and MLOI was conducted, the results of which are also presented. The image-set has been developed to be used in training and evaluation of ROI detection algorithms. Applications include image compression, thumbnailing, summarization, and mobile phone imagery.

  17. Therapeutic and diagnostic set for irradiation of cell lines in low level laser therapy

    NASA Astrophysics Data System (ADS)

    Gryko, Lukasz; Zajac, Andrzej; Gilewski, Marian; Szymanska, Justyna; Goralczyk, Krzysztof

    2014-05-01

    This paper presents an optoelectronic diagnostic set for standardizing the biostimulation procedures performed on cell lines. The basic functional components of the therapeutic set are two digitally controlled illuminators. They are composed of sets of semiconductor emitters - medium-power laser diodes and high-power LEDs emitting radiation in a wide spectral range from 600 nm to 1000 nm. The emitters are coupled to the applicator by fibre-optic and optical systems that provide uniform irradiation of the vessel containing the cell culture samples. An integrated spectrometer and optical power meter allow the energy and spectral parameters of the electromagnetic radiation to be controlled during the Low Level Light Therapy procedure. Dedicated power supplies and a digital control system allow each emitter to be powered independently. An active temperature stabilization system was developed to thermally tune the spectral line of the emitted radiation for more efficient matching with the absorption spectra of biological acceptors. Using the set for controlled irradiation and for measuring the absorption spectrum of the biological medium, it is possible to objectively assess the impact of exposure parameters on the state of cells subjected to Low Level Light Therapy. This procedure allows the biological responses of cell lines to be compared after irradiation with variable spectral and energy parameters. Experiments were carried out on vascular endothelial cell lines. Cell proliferation was examined after irradiation with LEDs at 645 nm, 680 nm, 740 nm, 780 nm, 830 nm, 870 nm, 890 nm and 970 nm, and with lasers at 650 nm and 830 nm.

  18. MOVE: a multi-level ontology-based visualization and exploration framework for genomic networks.

    PubMed

    Bosman, Diederik W J; Blom, Evert-Jan; Ogao, Patrick J; Kuipers, Oscar P; Roerdink, Jos B T M

    2007-01-01

    Among the various research areas that comprise bioinformatics, systems biology is gaining increasing attention. An important goal of systems biology is the unraveling of dynamic interactions between components of living cells (e.g., proteins, genes). These interactions exist, among others, at the genomic, transcriptomic, proteomic and metabolomic levels. The levels themselves are heavily interconnected, resulting in complex networks of different interacting biological entities. Currently, various bioinformatics tools exist which are able to perform a particular analysis on a particular type of network. Unfortunately, each tool has its own disadvantages that hamper its consistent use across different types of networks or analytical methods. This paper describes the conceptual development of an open source extensible software framework that supports visualization and exploration of highly complex genomic networks, like metabolic or gene regulatory networks. The focus is on the conceptual foundations, starting from requirements, a description of the state of the art of network visualization systems, and an analysis of their shortcomings. We describe the implementation of some initial modules of the framework and apply them to a biological test case in bacterial regulation, which shows the relevance and feasibility of the proposed approach. PMID:17688427

  19. A Bayesian framework for cell-level protein network analysis for multivariate proteomics image data

    NASA Astrophysics Data System (ADS)

    Kovacheva, Violet N.; Sirinukunwattana, Korsuk; Rajpoot, Nasir M.

    2014-03-01

    The recent development of multivariate imaging techniques, such as the Toponome Imaging System (TIS), has facilitated the analysis of multiple co-localisation of proteins. This could hold the key to understanding complex phenomena such as protein-protein interaction in cancer. In this paper, we propose a Bayesian framework for cell level network analysis allowing the identification of several protein pairs having significantly higher co-expression levels in cancerous tissue samples when compared to normal colon tissue. It involves segmenting the DAPI-labeled image into cells and determining the cell phenotypes according to their protein-protein dependence profile. The cells are phenotyped using Gaussian Bayesian hierarchical clustering (GBHC) after feature selection is performed. The phenotypes are then analysed using Difference in Sums of Weighted cO-dependence Profiles (DiSWOP), which detects differences in the co-expression patterns of protein pairs. We demonstrate that the pairs highlighted by the proposed framework have high concordance with recent results using a different phenotyping method. This demonstrates that the results are independent of the clustering method used. In addition, the highlighted protein pairs are further analysed via protein interaction pathway databases and by considering the localization of high protein-protein dependence within individual samples. This suggests that the proposed approach could identify potentially functional protein complexes active in cancer progression and cell differentiation.

  20. Probabilistic framework for assessing the ice sheet contribution to sea level change

    PubMed Central

    Little, Christopher M.; Urban, Nathan M.; Oppenheimer, Michael

    2013-01-01

    Previous sea level rise (SLR) assessments have excluded the potential for dynamic ice loss over much of Greenland and Antarctica, and recently proposed “upper bounds” on Antarctica’s 21st-century SLR contribution are derived principally from regions where present-day mass loss is concentrated (basin 15, or B15, drained largely by Pine Island, Thwaites, and Smith glaciers). Here, we present a probabilistic framework for assessing the ice sheet contribution to sea level change that explicitly accounts for mass balance uncertainty over an entire ice sheet. Applying this framework to Antarctica, we find that ongoing mass imbalances in non-B15 basins give an SLR contribution by 2100 that: (i) is comparable to projected changes in B15 discharge and Antarctica’s surface mass balance, and (ii) varies widely depending on the subset of basins and observational dataset used in projections. Increases in discharge uncertainty, or decreases in the exceedance probability used to define an upper bound, increase the fractional contribution of non-B15 basins; even weak spatial correlations in future discharge growth rates markedly enhance this sensitivity. Although these projections rely on poorly constrained statistical parameters, they may be updated with observations and/or models at many spatial scales, facilitating a more comprehensive account of uncertainty that, if implemented, will improve future assessments. PMID:23404697
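
    The basin-level aggregation described above can be sketched as a Monte Carlo calculation in which each basin's discharge growth rate is sampled, optionally with spatial correlation between basins, and an upper bound is read off as an exceedance quantile; all numbers below are invented for illustration and are not the paper's observational estimates.

```python
# Monte Carlo aggregation of per-basin sea level contributions with an
# exceedance-based upper bound.
import numpy as np

def slr_quantile(mean_rates, sd_rates, corr=0.0, years=85, n=100_000,
                 exceedance=0.05, rng=None):
    """mean_rates, sd_rates: per-basin growth of the SLR contribution rate
    (mm/yr per yr); returns the (1 - exceedance) quantile of total SLR (mm)."""
    rng = np.random.default_rng() if rng is None else rng
    k = len(mean_rates)
    corr_matrix = np.full((k, k), corr) + (1.0 - corr) * np.eye(k)
    cov = corr_matrix * np.outer(sd_rates, sd_rates)
    rates = rng.multivariate_normal(mean_rates, cov, size=n)    # (n, k)
    # a linearly growing rate integrates to a quadratic contribution in time
    contrib_mm = 0.5 * rates.sum(axis=1) * years ** 2
    return np.quantile(contrib_mm, 1.0 - exceedance)

if __name__ == "__main__":
    mean = np.array([0.010, 0.002, 0.002, 0.001])   # "B15-like" basin listed first
    sd = np.array([0.004, 0.002, 0.002, 0.001])
    for rho in (0.0, 0.5):
        q = slr_quantile(mean, sd, corr=rho, rng=np.random.default_rng(0))
        print(f"corr = {rho}: 95th-percentile contribution {q:.0f} mm")
```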

  1. Systems Science and Obesity Policy: A Novel Framework for Analyzing and Rethinking Population-Level Planning

    PubMed Central

    Matteson, Carrie L.; Finegood, Diane T.

    2014-01-01

    Objectives. We demonstrate the use of a systems-based framework to assess solutions to complex health problems such as obesity. Methods. We coded 12 documents published between 2004 and 2013 aimed at influencing obesity planning for complex systems design (9 reports from US and Canadian governmental or health authorities, 1 Cochrane review, and 2 Institute of Medicine reports). We sorted data using the intervention-level framework (ILF), a novel solutions-oriented approach to complex problems. An in-depth comparison of 3 documents provides further insight into complexity and systems design in obesity policy. Results. The majority of strategies focused mainly on changing the determinants of energy imbalance (food intake and physical activity). ILF analysis brings to the surface actions aimed at higher levels of system function and points to a need for more innovative policy design. Conclusions. Although many policymakers acknowledge obesity as a complex problem, many strategies stem from the paradigm of individual choice and are limited in scope. The ILF provides a template to encourage natural systems thinking and more strategic policy design grounded in complexity science. PMID:24832406

  2. A Framework for Lab Work Management in Mass Courses. Application to Low Level Input/Output without Hardware

    ERIC Educational Resources Information Center

    Rodriguez, Santiago; Zamorano, Juan; Rosales, Francisco; Dopico, Antonio Garcia; Pedraza, Jose Luis

    2007-01-01

    This paper describes a complete lab work management framework designed and developed in the authors' department to help teachers to manage the small projects that students are expected to complete as lab assignments during their graduate-level computer engineering studies. The paper focuses on an application example of the framework to a specific…

  3. A level set simulation for ordering of quantum dots via cleaved-edge overgrowth

    NASA Astrophysics Data System (ADS)

    Niu, X. B.; Uccelli, E.; Fontcuberta i Morral, A.; Ratsch, C.

    2009-07-01

    Cleaved-edge overgrowth (CEO) is a promising technique to obtain ordered arrays of quantum dots, where the size and position of the dots can be controlled very well. We present level set simulations for CEO. Our simulations illustrate how the quality of the CEO technique depends on the potential energy surface (PES) for adatom diffusion, and thus suggest how variations of the PES can potentially improve the uniformity of quantum dot arrays.

  4. Stochastic level-set variational implicit-solvent approach to solute-solvent interfacial fluctuations.

    PubMed

    Zhou, Shenggao; Sun, Hui; Cheng, Li-Tien; Dzubiella, Joachim; Li, Bo; McCammon, J Andrew

    2016-08-01

    Recent years have seen the initial success of a variational implicit-solvent model (VISM), implemented with a robust level-set method, in efficiently capturing different hydration states and providing quantitatively good estimation of solvation free energies of biomolecules. The level-set minimization of the VISM solvation free-energy functional of all possible solute-solvent interfaces or dielectric boundaries predicts an equilibrium biomolecular conformation that is often close to an initial guess. In this work, we develop a theory in the form of Langevin geometrical flow to incorporate solute-solvent interfacial fluctuations into the VISM. Such fluctuations are crucial to biomolecular conformational changes and binding processes. We also develop a stochastic level-set method to numerically implement such a theory. We describe the interfacial fluctuation through the "normal velocity" that is the solute-solvent interfacial force, derive the corresponding stochastic level-set equation in the sense of Stratonovich so that the surface representation is independent of the choice of implicit function, and develop numerical techniques for solving such an equation and processing the numerical data. We apply our computational method to study the dewetting transition in the system of two hydrophobic plates and a hydrophobic cavity of a synthetic host molecule cucurbit[7]uril. Numerical simulations demonstrate that our approach can describe an underlying system jumping out of a local minimum of the free-energy functional and can capture dewetting transitions of hydrophobic systems. In the case of two hydrophobic plates, we find that the wavelength of interfacial fluctuations has a strong influence on the dewetting transition. In addition, we find that the estimated energy barrier of the dewetting transition scales quadratically with the inter-plate distance, agreeing well with existing studies of molecular dynamics simulations. Our work is a first step toward the inclusion of

  5. Stochastic level-set variational implicit-solvent approach to solute-solvent interfacial fluctuations

    NASA Astrophysics Data System (ADS)

    Zhou, Shenggao; Sun, Hui; Cheng, Li-Tien; Dzubiella, Joachim; Li, Bo; McCammon, J. Andrew

    2016-08-01

    Recent years have seen the initial success of a variational implicit-solvent model (VISM), implemented with a robust level-set method, in efficiently capturing different hydration states and providing quantitatively good estimation of solvation free energies of biomolecules. The level-set minimization of the VISM solvation free-energy functional of all possible solute-solvent interfaces or dielectric boundaries predicts an equilibrium biomolecular conformation that is often close to an initial guess. In this work, we develop a theory in the form of Langevin geometrical flow to incorporate solute-solvent interfacial fluctuations into the VISM. Such fluctuations are crucial to biomolecular conformational changes and binding processes. We also develop a stochastic level-set method to numerically implement such a theory. We describe the interfacial fluctuation through the "normal velocity" that is the solute-solvent interfacial force, derive the corresponding stochastic level-set equation in the sense of Stratonovich so that the surface representation is independent of the choice of implicit function, and develop numerical techniques for solving such an equation and processing the numerical data. We apply our computational method to study the dewetting transition in the system of two hydrophobic plates and a hydrophobic cavity of a synthetic host molecule cucurbit[7]uril. Numerical simulations demonstrate that our approach can describe an underlying system jumping out of a local minimum of the free-energy functional and can capture dewetting transitions of hydrophobic systems. In the case of two hydrophobic plates, we find that the wavelength of interfacial fluctuations has a strong influence on the dewetting transition. In addition, we find that the estimated energy barrier of the dewetting transition scales quadratically with the inter-plate distance, agreeing well with existing studies of molecular dynamics simulations. Our work is a first step toward the inclusion of

  6. Segmentation of the liver from abdominal MR images: a level-set approach

    NASA Astrophysics Data System (ADS)

    Abdalbari, Anwar; Huang, Xishi; Ren, Jing

    2015-03-01

    The use of prior knowledge in the segmentation of abdominal MR images enables more accurate and comprehensive interpretation of the organ to be segmented. Prior knowledge about abdominal organs, such as the liver vessels, can be employed to obtain an accurate segmentation of the liver that leads to an accurate diagnosis or treatment plan. In this paper, a new method for segmenting the liver from abdominal MR images using the liver vessels as prior knowledge is proposed. The paper employs the level set method to segment the liver from abdominal MR images. The speed image used in the level set method is responsible for propagating and stopping region growing at boundaries. The poor contrast of the MR images between the liver and the surrounding organs, i.e. the stomach, kidneys, and heart, causes the segmented liver to leak into those organs, which leads to inaccurate or incorrect segmentation. For that reason, a second speed image is developed, as an extra term in the level set, to control the front propagation at weak edges with the help of the original speed image. The basic idea of the proposed approach is to use the second speed image as a boundary surface which is approximately orthogonal to the area of the leak. The aim of the new speed image is to slow down the level set propagation and prevent the leak in the regions close to the liver boundary. The new speed image is a surface created by filling holes to reconstruct the liver surface. These holes are formed as a result of the exit and entry of the liver vessels, and are considered the main cause of the segmentation leak. The results of the proposed method show superior outcomes compared with other methods in the literature.

  7. A self-adaptive oriented particles Level-Set method for tracking interfaces

    NASA Astrophysics Data System (ADS)

    Ianniello, S.; Di Mascio, A.

    2010-02-01

    A new method for tracking evolving interfaces by Lagrangian particles in conjunction with a Level-Set approach is introduced. This numerical technique is based on the use of time evolution equations for fundamental vector and tensor quantities defined on the front and represents a new and convenient way to couple the advantages of the Eulerian description given by a Level-Set function ϕ with the use of Lagrangian massless particles. The term oriented points out that the information advected by the particles concerns not only the spatial location, but also the local (outward) normal vector n to the interface Γ and the second fundamental tensor (the shape operator) ∇n. The particles are located exactly upon Γ and provide all the required information for tracking the interface on their own. In addition, a self-adaptive mechanism suitably modifies, at each time step, the marker distribution in the numerical domain: each particle behaves both as a potential seeder of new markers on Γ (so as to guarantee an accurate reconstruction of the interface) and as a de-seeder (to avoid any useless gathering of markers and to limit the computational effort). The algorithm is conceived to avoid any transport equation for ϕ and to confine the Level-Set function to the role of a mere post-processing tool; thus, all the numerical diffusion problems usually affecting the Level-Set methodology are removed. The method has been tested on both 2D and 3D configurations; it carries out a fast reconstruction of the interface and its accuracy is only limited by the spatial resolution of the mesh.

  8. Hydrological drivers of record-setting water level rise on Earth's largest lake system

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Bruxer, J.; Durnford, D.; Smith, J. P.; Clites, A. H.; Seglenieks, F.; Qian, S. S.; Hunter, T. S.; Fortin, V.

    2016-05-01

    Between January 2013 and December 2014, water levels on Lake Superior and Lake Michigan-Huron, the two largest lakes on Earth by surface area, rose at the highest rate ever recorded for a 2 year period beginning in January and ending in December of the following year. This historic event coincided with below-average air temperatures and extensive winter ice cover across the Great Lakes. It also brought an end to a 15 year period of persistently below-average water levels on Lakes Superior and Michigan-Huron that included several months of record-low water levels. To differentiate hydrological drivers behind the recent water level rise, we developed a Bayesian Markov chain Monte Carlo (MCMC) routine for inferring historical estimates of the major components of each lake's water budget. Our results indicate that, in 2013, the water level rise on Lake Superior was driven by increased spring runoff and over-lake precipitation. In 2014, reduced over-lake evaporation played a more significant role in Lake Superior's water level rise. The water level rise on Lake Michigan-Huron in 2013 was also due to above-average spring runoff and persistent over-lake precipitation, while in 2014, it was due to a rare combination of below-average evaporation, above-average runoff and precipitation, and very high inflow rates from Lake Superior through the St. Marys River. We expect, in future research, to apply our new framework across the other Laurentian Great Lakes, and to Earth's other large freshwater basins as well.

  9. Dynamic multi-source X-ray tomography using a spacetime level set method

    NASA Astrophysics Data System (ADS)

    Niemi, Esa; Lassas, Matti; Kallonen, Aki; Harhanen, Lauri; Hämäläinen, Keijo; Siltanen, Samuli

    2015-06-01

    A novel variant of the level set method is introduced for dynamic X-ray tomography. The target is allowed to change in time while being imaged by one or several source-detector pairs at a relatively high frame-rate. The algorithmic approach is motivated by the results in [22], showing that the modified level set method can tolerate highly incomplete projection data in stationary tomography. Furthermore, defining the level set function in spacetime enforces temporal continuity in the dynamic tomography context considered here. The tomographic reconstruction is found as a minimizer of a nonlinear functional. The functional contains a regularization term penalizing the L2 norms of up to n derivatives of the reconstruction. The case n = 1 is shown to be equivalent to a convex Tikhonov problem that has a unique minimizer. For n ≥ 2 the existence of a minimizer is proved under certain assumptions on the signal-to-noise ratio and the size of the regularization parameter. Numerical examples with both simulated and measured dynamic X-ray data are included, and the proposed method is found to yield reconstructions superior to standard methods such as FBP or non-negativity constrained Tikhonov regularization and favorably comparable to those of total variation regularization. Furthermore, the methodology can be adapted to a wide range of measurement arrangements with one or more X-ray sources.

  10. A Real-Time Algorithm for the Approximation of Level-Set-Based Curve Evolution

    PubMed Central

    Shi, Yonggang; Karl, William Clem

    2010-01-01

    In this paper, we present a complete and practical algorithm for the approximation of level-set-based curve evolution suitable for real-time implementation. In particular, we propose a two-cycle algorithm to approximate level-set-based curve evolution without the need of solving partial differential equations (PDEs). Our algorithm is applicable to a broad class of evolution speeds that can be viewed as composed of a data-dependent term and a curve smoothness regularization term. We achieve curve evolution corresponding to such evolution speeds by separating the evolution process into two different cycles: one cycle for the data-dependent term and a second cycle for the smoothness regularization. The smoothing term is derived from a Gaussian filtering process. In both cycles, the evolution is realized through a simple element switching mechanism between two linked lists, that implicitly represents the curve using an integer valued level-set function. By careful construction, all the key evolution steps require only integer operations. A consequence is that we obtain significant computation speedups compared to exact PDE-based approaches while obtaining excellent agreement with these methods for problems of practical engineering interest. In particular, the resulting algorithm is fast enough for use in real-time video processing applications, which we demonstrate through several image segmentation and video tracking experiments. PMID:18390371

  11. Vascular Tree Segmentation in Medical Images Using Hessian-Based Multiscale Filtering and Level Set Method

    PubMed Central

    Jin, Jiaoying; Yang, Linjun; Zhang, Xuming

    2013-01-01

    Vascular segmentation plays an important role in medical image analysis. A novel technique for the automatic extraction of vascular trees from 2D medical images is presented, which combines Hessian-based multiscale filtering and a modified level set method. In the proposed algorithm, the morphological top-hat transformation is firstly adopted to attenuate background. Then Hessian-based multiscale filtering is used to enhance vascular structures by combining Hessian matrix with Gaussian convolution to tune the filtering response to the specific scales. Because Gaussian convolution tends to blur vessel boundaries, which makes scale selection inaccurate, an improved level set method is finally proposed to extract vascular structures by introducing an external constrained term related to the standard deviation of Gaussian function into the traditional level set. Our approach was tested on synthetic images with vascular-like structures and 2D slices extracted from real 3D abdomen magnetic resonance angiography (MRA) images along the coronal plane. The segmentation rates for synthetic images are above 95%. The results for MRA images demonstrate that the proposed method can extract most of the vascular structures successfully and accurately in visualization. Therefore, the proposed method is effective for the vascular tree extraction in medical images. PMID:24348738
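
    A minimal sketch of the enhancement stages described above: a white top-hat to attenuate the background, Hessian-based multiscale (Frangi) filtering to enhance tubular structures, and a simple threshold standing in for the modified level set extraction step. The scales and structuring-element size are illustrative assumptions.

```python
# Top-hat background suppression followed by Hessian-based (Frangi) vesselness.
import numpy as np
from skimage import morphology, filters

def enhance_vessels(image, sigmas=range(1, 5)):
    background_suppressed = morphology.white_tophat(image, morphology.disk(7))
    vesselness = filters.frangi(background_suppressed, sigmas=sigmas, black_ridges=False)
    return vesselness

if __name__ == "__main__":
    # Synthetic angiogram: a bright sinusoidal "vessel" on a smooth background.
    yy, xx = np.mgrid[:128, :128]
    vessel = np.abs(yy - (64 + 15 * np.sin(xx / 12.0))) < 2
    image = (0.3 * (xx / 128.0) + 0.7 * vessel
             + 0.02 * np.random.default_rng(0).random((128, 128)))
    v = enhance_vessels(image)
    mask = v > filters.threshold_otsu(v)   # stand-in for the level set extraction
    print("vessel pixels detected:", int(mask.sum()), "of", int(vessel.sum()))
```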

  12. Numerical Simulation of Dynamic Contact Angles and Contact Lines in Multiphase Flows using Level Set Method

    NASA Astrophysics Data System (ADS)

    Pendota, Premchand

    Many physical phenomena and industrial applications involve multiphase fluid flows and hence it is of high importance to be able to simulate various aspects of these flows accurately. The Dynamic Contact Angles (DCA) and the contact lines at the wall boundaries are a couple of such important aspects. In the past few decades, many mathematical models were developed for predicting the contact angles of the inter-face with the wall boundary under various flow conditions. These models are used to incorporate the physics of DCA and contact line motion in numerical simulations using various interface capturing/tracking techniques. In the current thesis, a simple approach to incorporate the static and dynamic contact angle boundary conditions using the level set method is developed and implemented in multiphase CFD codes, LIT (Level set Interface Tracking) (Herrmann (2008)) and NGA (flow solver) (Desjardins et al (2008)). Various DCA models and associated boundary conditions are reviewed. In addition, numerical aspects such as the occurrence of a stress singularity at the contact lines and grid convergence of macroscopic interface shape are dealt with in the context of the level set approach.

  13. On the geometry of two-dimensional slices of irregular level sets in turbulent flows

    SciTech Connect

    Catrakis, H.J.; Cook, A.W.; Dimotakis, P.E.; Patton, J.M.

    1998-03-20

    Isoscalar surfaces in turbulent flows are found to be more complex than (self-similar) fractals, in both the far field of liquid-phase turbulent jets and in a realization of Rayleigh-Taylor-instability flow. In particular, they exhibit a scale-dependent coverage dimension, D_2(λ), for 2-D slices of scalar level sets, that increases with scale, from unity, at small scales, to 2, at large scales. For the jet flow and Reynolds numbers investigated, the isoscalar-surface geometry is both scalar-threshold- and Re-dependent; the level-set (coverage) length decreases with increasing Re, indicating enhanced mixing with increasing Reynolds number; and the size distribution of closed regions is well described by lognormal statistics at small scales. A similar D_2(λ) behavior is found for level-set data of 3-D density-interface behavior in recent direct numerical-simulation studies of Rayleigh-Taylor-instability flow. A comparison of (spatial) spectral and isoscalar coverage statistics will be discussed.
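
    The scale-dependent coverage dimension D_2(λ) discussed above can be estimated by box counting, taking the local logarithmic slope D_2 = -d log N / d log λ of the number of covering boxes; the smooth test circle below should stay near D_2 ≈ 1 over the resolved scales, unlike the turbulent level sets in the record.

```python
# Box-counting estimate of a scale-dependent coverage dimension for a 2-D level set.
import numpy as np

def coverage_dimension(points, box_sizes):
    """points: (N, 2) coordinates on the level set, in [0, 1)^2."""
    counts = []
    for lam in box_sizes:
        boxes = np.unique(np.floor(points / lam).astype(int), axis=0)
        counts.append(len(boxes))
    log_lam = np.log(np.asarray(box_sizes, dtype=float))
    log_n = np.log(np.asarray(counts, dtype=float))
    # local slope between successive scales gives the scale-dependent dimension
    d2 = -(np.diff(log_n) / np.diff(log_lam))
    lam_mid = np.sqrt(np.asarray(box_sizes[1:]) * np.asarray(box_sizes[:-1]))
    return lam_mid, d2

if __name__ == "__main__":
    t = np.linspace(0, 2 * np.pi, 20000, endpoint=False)
    circle = 0.5 + 0.4 * np.c_[np.cos(t), np.sin(t)]
    box_sizes = np.logspace(-3, -0.5, 12)
    for lam, d in zip(*coverage_dimension(circle, box_sizes)):
        print(f"lambda = {lam:.4f}   D2 = {d:.2f}")
```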

  14. A level set simulation of dendritic solidification of multi-component alloys

    NASA Astrophysics Data System (ADS)

    Tan, Lijian; Zabaras, Nicholas

    2007-01-01

    A level set method combining features of front tracking methods and fixed domain methods is presented to model microstructure evolution in the solidification of multi-component alloys. Phase boundaries are tracked by solving the multi-phase level set equations. Diffused interfaces are constructed from these tracked phase boundaries using the level set functions. Based on the assumed diffused interfaces, volume-averaging techniques are applied for energy, species and momentum transport. Microstructure evolution in multi-component alloy systems is predicted using realistic material parameters. The methodology avoids the difficulty of parameter identification needed in other diffused interface models, and allows easy application to various practical alloy systems. Techniques including fast marching, narrow band computing and adaptive meshing are utilized to speed up computations. Several numerical examples are considered to validate the method and examine its potential for modeling solidification of practical alloy systems. These examples include two- and three-dimensional solidification of a binary alloy in an undercooled melt, a study of planar/cellular/dendritic transition in the solidification of a Ni-Cu alloy, and eutectic and peritectic solidification of an Fe-C system. Adaptive mesh refinement in the rapidly varying interface region makes the method practical for coupling the microstructure evolution at the meso-scale with buoyancy driven flow in the macro-scale, which is shown in the solidification of a Ni-Al-Ta ternary alloy.

  15. A predictive coding framework for rapid neural dynamics during sentence-level language comprehension.

    PubMed

    Lewis, Ashley G; Bastiaansen, Marcel

    2015-07-01

    There is a growing literature investigating the relationship between oscillatory neural dynamics measured using electroencephalography (EEG) and/or magnetoencephalography (MEG), and sentence-level language comprehension. Recent proposals have suggested a strong link between predictive coding accounts of the hierarchical flow of information in the brain, and oscillatory neural dynamics in the beta and gamma frequency ranges. We propose that findings relating beta and gamma oscillations to sentence-level language comprehension might be unified under such a predictive coding account. Our suggestion is that oscillatory activity in the beta frequency range may reflect both the active maintenance of the current network configuration responsible for representing the sentence-level meaning under construction, and the top-down propagation of predictions to hierarchically lower processing levels based on that representation. In addition, we suggest that oscillatory activity in the low and middle gamma range reflect the matching of top-down predictions with bottom-up linguistic input, while evoked high gamma might reflect the propagation of bottom-up prediction errors to higher levels of the processing hierarchy. We also discuss some of the implications of this predictive coding framework, and we outline ideas for how these might be tested experimentally. PMID:25840879

  16. Development of a hydrogeologic framework using tidally influenced groundwater levels, Hawaii

    NASA Astrophysics Data System (ADS)

    Rotzoll, K.; Oki, D. S.; El-Kadi, A. I.

    2013-12-01

    Aquifer hydraulic properties can be estimated from commonly available water-level data from tidally influenced wells because the tidal signal attenuation depends on the aquifer's regional hydraulic diffusivity. Estimates of hydraulic properties are required for models that are used to manage groundwater availability and quality. A few localized studies of tidal attenuation in Hawaii have been published, but many water-level records have not been analyzed and no regional synthesis of tidal attenuation information in Hawaii exists. Therefore, we estimate aquifer properties from tidal attenuation for Hawaii using groundwater-level records from more than 350 wells. Filtering methods to separate water-level fluctuations caused by ocean tides from other environmental stresses such as barometric pressure and long-period ocean-level variations are explored. For short-term records, several approaches to identify tidal components are examined. The estimated aquifer properties are combined in a regional context with respect to the hydrogeologic framework of each island. The results help to better understand conceptual models of groundwater flow in Hawaii aquifers and facilitate the development of regional numerical groundwater flow and transport models aimed at sustainable water-resource management.
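
    The abstract does not state the estimation formulas, but a common starting point for this kind of analysis is the Ferris (1951) solution for a homogeneous confined aquifer, in which the tidal amplitude decays exponentially with distance from the coast. Under that assumption the regional hydraulic diffusivity T/S follows directly from an observed amplitude ratio, as in the purely illustrative sketch below (the names and worked numbers are assumptions, not values from the study).

        import numpy as np

        def diffusivity_from_attenuation(amplitude_ratio, distance_m, tidal_period_s):
            """Hydraulic diffusivity D = T/S from tidal attenuation (Ferris, 1951):
            A/A0 = exp(-x * sqrt(pi * S / (T * tau)))  =>  D = pi x^2 / (tau * ln(A0/A)^2)."""
            ln_ratio = np.log(1.0 / amplitude_ratio)
            return np.pi * distance_m**2 / (tidal_period_s * ln_ratio**2)

        # Hypothetical example: a well 500 m inland whose M2 amplitude is 20% of the ocean tide
        print(diffusivity_from_attenuation(0.2, 500.0, 44700.0))  # m^2/s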

  17. Statistical criteria to set alarm levels for continuous measurements of ground contamination.

    PubMed

    Brandl, A; Jimenez, A D Herrera

    2008-08-01

    In the course of the decommissioning of the ASTRA research reactor at the site of the Austrian Research Centers at Seibersdorf, the operator and licensee, Nuclear Engineering Seibersdorf, conducted an extensive site survey and characterization to demonstrate compliance with regulatory site release criteria. This survey included radiological characterization of approximately 400,000 m² of open land on the Austrian Research Centers premises. Part of this survey was conducted using a mobile large-area gas proportional counter, continuously recording measurements while it was moved at a speed of 0.5 m s⁻¹. In order to set reasonable investigation levels, two alarm levels based on statistical considerations were developed. This paper describes the derivation of these alarm levels and the operational experience gained by detector deployment in the field. PMID:18617795
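
    The statistical criteria themselves are not given in the abstract; a common construction for such alarm levels sets the threshold at the background mean count plus a multiple of its standard deviation, with the multiplier chosen for a target false-alarm probability. The sketch below is illustrative only and is not the derivation used at Seibersdorf.

        import numpy as np
        from scipy.stats import norm

        def alarm_level(background_counts, false_alarm_prob=0.001):
            """Illustrative alarm level: background mean plus k standard deviations,
            with k chosen so that background readings exceed the level only with
            probability `false_alarm_prob` (Gaussian approximation)."""
            mean = np.mean(background_counts)
            sigma = np.std(background_counts, ddof=1)
            k = norm.ppf(1.0 - false_alarm_prob)   # ~3.09 for a 0.1% false-alarm rate
            return mean + k * sigma

        # Hypothetical background readings from clean reference ground
        background = np.random.poisson(lam=400, size=600)
        print(alarm_level(background))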

  18. Improved inhalation technology for setting safe exposure levels for workplace chemicals

    NASA Technical Reports Server (NTRS)

    Stuart, Bruce O.

    1993-01-01

    Threshold Limit Values recommended as allowable air concentrations of a chemical in the workplace are often based upon a no-observable-effect-level (NOEL) determined by experimental inhalation studies using rodents. A 'safe level' for human exposure must then be estimated by the use of generalized safety factors in attempts to extrapolate from experimental rodents to man. The recent development of chemical-specific physiologically-based toxicokinetics makes use of measured physiological, biochemical, and metabolic parameters to construct a validated model that is able to 'scale-up' rodent response data to predict the behavior of the chemical in man. This procedure is made possible by recent advances in personal computer software and the emergence of appropriate biological data, and provides an analytical tool for much more reliable risk evaluation and airborne chemical exposure level setting for humans.
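
    Full physiologically based models track many tissue compartments with chemical-specific partition coefficients and metabolic rates; the toy one-compartment inhalation model below only illustrates the kind of mass-balance ODE that such models integrate before scaling from rodent to human parameters. All names and parameter values are illustrative assumptions.

        import numpy as np
        from scipy.integrate import odeint

        def one_compartment_inhalation(conc_air, hours, ventilation=0.5,
                                       absorbed_fraction=0.7, volume_dist=40.0,
                                       k_elim=0.1):
            """Toy model: dC/dt = uptake / Vd - k_elim * C, where
            uptake (mg/h) = air concentration (mg/m^3) * ventilation (m^3/h) * absorbed fraction."""
            uptake = conc_air * ventilation * absorbed_fraction

            def dcdt(c, t):
                return uptake / volume_dist - k_elim * c

            t = np.linspace(0.0, hours, 200)
            return t, odeint(dcdt, 0.0, t).ravel()

        # Hypothetical 8-hour exposure at 10 mg/m^3
        t, c = one_compartment_inhalation(10.0, 8.0)
        print(c[-1])   # body-compartment concentration (mg/L) at end of shift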

  19. Implementation of E.U. Water Framework Directive: source assessment of metallic substances at catchment levels.

    PubMed

    Chon, Ho-Sik; Ohandja, Dieudonne-Guy; Voulvoulis, Nikolaos

    2010-01-01

    The E.U. Water Framework Directive (WFD) aims to prevent deterioration of water quality and to phase out or reduce the concentrations of priority substances at catchment levels. It requires changes in water management from a local scale to a river basin scale, and establishes Environmental Quality Standards (EQS) as a guideline for the chemical status of receiving waters. Under the Directive, the standards and scope of investigation for water management are more stringent and broader than in the past, and this change also applies to restoring the levels of metals in water bodies. The aim of this study was to identify anthropogenic emission sources of metallic substances at catchment levels. Potential sources providing substantial amounts of such substances to receiving waters included stormwater, industrial effluents, treated effluents, agricultural drainage, sediments, mining drainage and landfill leachates. Metallic substances have more emission sources than other dangerous substances at catchment levels; source assessment for these substances therefore deserves greater attention if their chemical status is to be restored in the context of the WFD. To improve the quality of source assessment, research is needed on the role of societal and environmental parameters and on the contribution of each source to the chemical distribution in receiving waters. PMID:20081997

  20. Differential optimal dopamine levels for set-shifting and working memory in Parkinson's disease.

    PubMed

    Fallon, Sean James; Smulders, Katrijn; Esselink, Rianne A; van de Warrenburg, Bart P; Bloem, Bastiaan R; Cools, Roshan

    2015-10-01

    Parkinson's disease (PD) is an important model for the role of dopamine in supporting human cognition. However, despite the uniformity of midbrain dopamine depletion, only some patients experience cognitive impairment. The neurocognitive mechanisms of this heterogeneity remain unclear. A genetic polymorphism in the catechol O-methyltransferase (COMT) enzyme, predominantly thought to exert its cognitive effect through acting on prefrontal cortex (PFC) dopamine transmission, provides us with an experimental window onto dopamine's role in cognitive performance in PD. In a large cohort of PD patients (n=372), we examined the association of COMT genotype with two tasks known to implicate prefrontal dopamine (spatial working memory and attentional set-shifting) and with a task less sensitive to prefrontal dopamine (paired associates learning). Consistent with the known neuroanatomical locus of its effects, differences between the COMT genotype groups were observed on dopamine-dependent tasks, but not the paired associates learning task. However, COMT genotype had differential effects on the two prefrontal dopamine tasks. Putative prefrontal dopamine levels influenced spatial working memory in an 'Inverted-U'-shaped fashion, whereas a linear, dose-dependent pattern was observed for attentional set-shifting. Cumulatively, these results revise our understanding of when COMT genotype modulates cognitive functioning in PD patients by showing that the behavioural consequences of genetic variation vary according to task demands, presumably because set-shifting and working memory have different optimal dopamine levels. PMID:26239947
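
    The contrast drawn here is between an 'inverted-U' (quadratic, negative curvature) and a linear relation between a putative prefrontal dopamine proxy and task performance. The sketch below is a generic least-squares comparison of the two shapes on hypothetical group data; it is not the statistical analysis reported in the study.

        import numpy as np

        def compare_dose_response(dopamine_proxy, performance):
            """Fit linear and quadratic curves to performance as a function of a
            dopamine proxy (e.g. COMT genotype coded 0, 1, 2 Met alleles) and return
            the residual sums of squares plus the quadratic curvature; a markedly
            better quadratic fit with negative curvature is the inverted-U signature."""
            x = np.asarray(dopamine_proxy, dtype=float)
            y = np.asarray(performance, dtype=float)
            lin = np.polyfit(x, y, 1)
            quad = np.polyfit(x, y, 2)
            rss_lin = np.sum((y - np.polyval(lin, x)) ** 2)
            rss_quad = np.sum((y - np.polyval(quad, x)) ** 2)
            return rss_lin, rss_quad, quad[0]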

  1. Study of burn scar extraction automatically based on level set method using remote sensing data.

    PubMed

    Liu, Yang; Dai, Qin; Liu, Jianbo; Liu, ShiBin; Yang, Jin

    2014-01-01

    Burn scar extraction using remote sensing data is an efficient way to precisely evaluate burn area and measure vegetation recovery. Traditional burn scar extraction methods perform poorly on burn scar images with blurred and irregular edges. To address these issues, this paper proposes an automatic method to extract burn scars based on the Level Set Method (LSM). This method exploits the different features available in remote sensing images, while also considering the practical need to extract the burn scar rapidly and automatically. The approach integrates Change Vector Analysis (CVA), the Normalized Difference Vegetation Index (NDVI) and the Normalized Burn Ratio (NBR) to obtain a difference image, and modifies the conventional Chan-Vese (C-V) level set model with a new initial curve derived from a binary image obtained by applying the K-means method to the fitting errors of two near-infrared band images. Landsat 5 TM and Landsat 8 OLI data sets are used to validate the proposed method. Comparisons with the conventional C-V model, the Otsu algorithm, and the Fuzzy C-means (FCM) algorithm show that the proposed approach can extract the outline of the fire burn scar effectively and accurately. The method achieves higher extraction accuracy and lower algorithmic complexity than the conventional C-V model. PMID:24503563
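
    The sketch below illustrates only the difference-image and K-means initialization idea in simplified form: the Normalized Burn Ratio is computed before and after the fire, its difference (dNBR) is clustered into two classes, and the resulting binary mask seeds a Chan-Vese evolution. It clusters dNBR directly rather than the near-infrared fitting errors used in the paper, and all band inputs and names are illustrative assumptions.

        import numpy as np
        from sklearn.cluster import KMeans

        def burn_scar_initial_mask(nir_pre, swir_pre, nir_post, swir_post):
            """Binary initialization for a Chan-Vese level set from pre- and post-fire
            NIR and SWIR bands (2-D arrays of reflectance)."""
            eps = 1e-6
            nbr_pre = (nir_pre - swir_pre) / (nir_pre + swir_pre + eps)
            nbr_post = (nir_post - swir_post) / (nir_post + swir_post + eps)
            dnbr = nbr_pre - nbr_post                       # high over burned area

            labels = KMeans(n_clusters=2, n_init=10).fit_predict(dnbr.reshape(-1, 1))
            labels = labels.reshape(dnbr.shape)

            # K-means label order is arbitrary: call the higher-dNBR cluster "burned".
            burned = int(dnbr[labels == 1].mean() > dnbr[labels == 0].mean())
            return labels == burned    # use as the initial curve/region for the C-V model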

  2. Modelling calving front dynamics using a level-set method: application to Jakobshavn Isbræ, West Greenland

    NASA Astrophysics Data System (ADS)

    Bondzio, Johannes H.; Seroussi, Hélène; Morlighem, Mathieu; Kleiner, Thomas; Rückamp, Martin; Humbert, Angelika; Larour, Eric Y.

    2016-03-01

    Calving is a major mechanism of ice discharge of the Antarctic and Greenland ice sheets, and a change in calving front position affects the entire stress regime of marine terminating glaciers. The representation of calving front dynamics in a 2-D or 3-D ice sheet model remains non-trivial. Here, we present the theoretical and technical framework for a level-set method, an implicit boundary tracking scheme, which we implement into the Ice Sheet System Model (ISSM). This scheme allows us to study the dynamic response of a drainage basin to user-defined calving rates. We apply the method to Jakobshavn Isbræ, a major marine terminating outlet glacier of the West Greenland Ice Sheet. The model robustly reproduces the high sensitivity of the glacier to calving, and we find that enhanced calving triggers significant acceleration of the ice stream. Upstream acceleration is sustained through a combination of mechanisms. However, both lateral stress and ice influx stabilize the ice stream. This study provides new insights into the ongoing changes occurring at Jakobshavn Isbræ and emphasizes that the incorporation of moving boundaries and dynamic lateral effects, not captured in flow-line models, is key for realistic model projections of sea level rise on centennial timescales.
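
    At its core such a scheme transports the level set function with the ice velocity minus the prescribed calving rate along the front normal. The 1-D flow-line sketch below illustrates that transport with simple first-order upwinding; variable names are illustrative, and the actual ISSM implementation works in 2-D/3-D with the calving rate applied along the computed front normal.

        import numpy as np

        def advance_calving_front(phi, ice_velocity, calving_rate, dx, dt):
            """One step of  d(phi)/dt + (v - c) d(phi)/dx = 0  on a flow line, where
            phi < 0 marks ice, phi > 0 ocean, and the zero level set is the calving front."""
            w = ice_velocity - calving_rate             # signed front speed along x
            dphi_b = (phi - np.roll(phi, 1)) / dx       # backward difference
            dphi_f = (np.roll(phi, -1) - phi) / dx      # forward difference
            upwind = np.where(w > 0, dphi_b, dphi_f)    # take the derivative upstream of w
            return phi - dt * w * upwind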

  3. Street Level Hydrology: An Urban Application of the WRF-Hydro Framework in Denver, Colorado

    NASA Astrophysics Data System (ADS)

    Read, L.; Hogue, T. S.; Salas, F. R.; Gochis, D.

    2015-12-01

    Urban flood modeling at the watershed scale carries unique challenges in routing complexity, data resolution, social and political issues, and land surface - infrastructure interactions. The ability to accurately trace and predict the flow of water through the urban landscape enables better emergency response management, floodplain mapping, and data for future urban infrastructure planning and development. These services are of growing importance as urban population is expected to continue increasing by 1.84% per year for the next 25 years, increasing the vulnerability of urban regions to damages and loss of life from floods. Although a range of watershed-scale models have been applied in specific urban areas to examine these issues, there is a trend towards national scale hydrologic modeling enabled by supercomputing resources to understand larger system-wide hydrologic impacts and feedbacks. As such it is important to address how urban landscapes can be represented in large scale modeling processes. The current project investigates how coupling terrain and infrastructure routing can improve flow prediction and flooding events over the urban landscape. We utilize the WRF-Hydro modeling framework and a high-resolution terrain routing grid with the goal of compiling standard data needs necessary for fine scale urban modeling and dynamic flood forecasting in the urban setting. The city of Denver is selected as a case study, as it has experienced several large flooding events in the last five years and has an urban annual population growth rate of 1.5%, one of the highest in the U.S. Our work highlights the hydro-informatic challenges associated with linking channel networks and drainage infrastructure in an urban area using the WRF-Hydro modeling framework and high resolution urban models for short-term flood prediction.

  4. GPU-Based Visualization of 3D Fluid Interfaces using Level Set Methods

    NASA Astrophysics Data System (ADS)

    Kadlec, B. J.

    2009-12-01

    We model a simple 3D fluid-interface problem using the level set method and visualize the interface as a dynamic surface. Level set methods allow implicit handling of complex topologies deformed by evolutions where sharp changes and cusps are present without destroying the representation. We present a highly optimized visualization and computation algorithm that is implemented in CUDA to run on the NVIDIA GeForce 295 GTX. CUDA is a general purpose parallel computing architecture that allows the NVIDIA GPU to be treated like a data parallel supercomputer in order to solve many computational problems in a fraction of the time required on a CPU. CUDA is compared to the new OpenCL™ (Open Computing Language), which is designed to run on heterogeneous computing environments but does not take advantage of low-level features in NVIDIA hardware that provide significant speedups. Therefore, our technique is implemented using CUDA and results are compared to a single CPU implementation to show the benefits of using the GPU and CUDA for visualizing fluid-interface problems. We solve a 1024^3 problem and experience significant speedup using the NVIDIA GeForce 295 GTX. Implementation details for mapping the problem to the GPU architecture are described as well as discussion on porting the technique to heterogeneous devices (AMD, Intel, IBM) using OpenCL. The results present a new interactive system for computing and visualizing the evolution of fluid interface problems on the GPU.

  5. Computer aided root lesion detection using level set and complex wavelets

    NASA Astrophysics Data System (ADS)

    Li, Shuo; Fevens, Thomas; Krzyżak, Adam; Jin, Chao; Li, Song

    2007-03-01

    A computer aided root lesion detection method for digital dental X-rays is proposed using level set and complex wavelets. The detection method consists of two stages: preprocessing and root lesion detection. During preprocessing, a level set segmentation is applied to separate the teeth from the background. Tailored for the dental clinical environment, a clinical acceleration scheme for the segmentation is applied by using a support vector machine (SVM) classifier and individual principal component analysis (PCA) to provide an initial contour. Then, based on the segmentation result, root lesion detection is performed. Firstly, the teeth are isolated by the average intensity profile. Secondly, a center-line zero-crossing based candidate generation is applied to generate the possible root lesion areas. Thirdly, the Dual-Tree Complex Wavelet Transform (DT-CWT) is used to further remove false positives. Lastly, when a root lesion is detected, its area is automatically marked with a color indicating the level of severity. 150 real dental X-rays with various degrees of root lesions are used to test the proposed method. The results were validated by the dentist. Experimental results show that the proposed method is able to successfully detect root lesions and provide visual assistance to the dentist.

  6. Framework for leadership and training of Biosafety Level 4 laboratory workers.

    PubMed

    Le Duc, James W; Anderson, Kevin; Bloom, Marshall E; Estep, James E; Feldmann, Heinz; Geisbert, Joan B; Geisbert, Thomas W; Hensley, Lisa; Holbrook, Michael; Jahrling, Peter B; Ksiazek, Thomas G; Korch, George; Patterson, Jean; Skvorak, John P; Weingartl, Hana

    2008-11-01

    Construction of several new Biosafety Level 4 (BSL-4) laboratories and expansion of existing operations have created an increased international demand for well-trained staff and facility leaders. Directors of most North American BSL-4 laboratories met and agreed upon a framework for leadership and training of biocontainment research and operations staff. They agreed on essential preparation and training that includes theoretical consideration of biocontainment principles, practical hands-on training, and mentored on-the-job experiences relevant to positional responsibilities as essential preparation before a person's independent access to a BSL-4 facility. They also agreed that the BSL-4 laboratory director is the key person most responsible for ensuring that staff members are appropriately prepared for BSL-4 operations. Although standardized certification of training does not formally exist, the directors agreed that facility-specific, time-limited documentation to recognize specific skills and experiences of trained persons is needed. PMID:18976549

  7. Physical therapy for young children diagnosed with autism spectrum disorders-clinical frameworks model in an Israeli setting.

    PubMed

    Atun-Einy, Osnat; Lotan, Meir; Harel, Yael; Shavit, Efrat; Burstein, Shimshon; Kempner, Gali

    2013-01-01

    Recent research findings suggest that many children with Autism Spectrum Disorders (ASD) demonstrate delayed and atypical motor achievements. It has now become clear that a more holistic, integrative and multi-disciplinary intervention is required to effectively address the motor-related impairments of this population. It is also crucial to ensure that this group of clients has access to early physical therapy (PT) interventions. Despite accumulating research on physical interventions, little is known about intervention models for implementation at a national level. This report introduces a model that uniquely illustrates implementation of PT services for a large number of children with ASD. The model has been operating for the past 2 years in one country (Israel), and includes an optional implementation model of PT practice settings for young children diagnosed with ASD. The Israeli setting offers a unique opportunity for implementing PT services for a multitude of children with ASD on a regular basis as an accepted/needed service. The initial outcomes of the present implementation suggest that an intensive PT intervention program might enhance therapeutic outcomes for this population, and contribute to our knowledge on the potential of PT for individuals with ASD. PMID:24400265

  8. Non-Rigid Object Contour Tracking via a Novel Supervised Level Set Model.

    PubMed

    Sun, Xin; Yao, Hongxun; Zhang, Shengping; Li, Dong

    2015-11-01

    In this paper we present a novel approach to non-rigid object contour tracking based on a supervised level set model (SLSM). In contrast to most existing trackers that use a bounding box to specify the tracked target, the proposed method extracts the accurate contours of the target as tracking output, which achieves a better description of non-rigid objects while reducing background pollution of the target model. Moreover, conventional level set models only emphasize regional intensity consistency and incorporate no priors. In contrast, the curve evolution of the proposed SLSM is object-oriented and supervised by specific knowledge of the targets we want to track. Therefore, the SLSM can ensure a more accurate convergence to the exact targets in tracking applications. In particular, we first construct the appearance model for the target in an online boosting manner owing to its strong discriminative power between the object and the background. Then, the learnt target model is incorporated to model the probabilities of the level set contour in a Bayesian manner, leading the curve to converge to the candidate region with the maximum likelihood of being the target. Finally, the accurate target region qualifies the samples fed to the boosting procedure as well as the target model prepared for the next time step. We first describe the proposed mechanism of the two-phase SLSM for single-target tracking, then give its generalized multi-phase version for multi-target tracking cases. A positive decrease rate is used to adjust the learning pace over time, enabling tracking to continue under partial and total occlusion. Experimental results on a number of challenging sequences validate the effectiveness of the proposed method. PMID:26099142

  9. A three-dimensional coupled Nitsche and level set method for electrohydrodynamic potential flows in moving domains

    NASA Astrophysics Data System (ADS)

    Johansson, A.; Garzon, M.; Sethian, J. A.

    2016-03-01

    In this paper we present a new algorithm for computing three-dimensional electrohydrodynamic flow in moving domains which can undergo topological changes. We consider a non-viscous, irrotational, perfectly conducting fluid and introduce a way to model the electrically charged flow with an embedded potential approach. To numerically solve the resulting system, we combine a level set method to track both the free boundary and the surface velocity potential with a Nitsche finite element method for solving the Laplace equations. This results in an algorithmic framework that does not require body-conforming meshes, works in three dimensions, and seamlessly tracks topological change. Assembling this coupled system requires care: while convergence and stability properties of Nitsche's methods have been well studied for static problems, they have rarely been considered for moving domains or for obtaining the gradients of the solution on the embedded boundary. We therefore investigate the performance of the symmetric and non-symmetric Nitsche formulations, as well as two different stabilization techniques. The global algorithm and in particular the coupling between the Nitsche solver and the level set method are also analyzed in detail. Finally we present numerical results for several time-dependent problems, each one designed to achieve a specific objective: (a) The oscillation of a perturbed sphere, which is used for convergence studies and the examination of the Nitsche methods; (b) The break-up of a two lobe droplet with axial symmetry, which tests the capability of the algorithm to go past flow singularities such as topological changes and preservation of an axi-symmetric flow, and compares results to previous axi-symmetric calculations; (c) The electrohydrodynamical deformation of a thin film and subsequent jet ejection, which will account for the presence of electrical forces in a non-axi-symmetric geometry.

  10. Springback assessment based on level set interpolation and shape manifolds in deep drawing

    NASA Astrophysics Data System (ADS)

    Le Quilliec, Guenhael; Raghavan, Balaji; Breitkopf, Piotr; Rassineux, Alain; Villon, Pierre; Roelandt, Jean-Marc

    2013-12-01

    In this paper, we introduce an original shape representation approach for automatic springback characterization. It is based on the generation of parameterized Level Set functions. The central idea is the concept of the shape manifold representing the design domain in the reduced-order shape-space. Performing Proper Orthogonal Decomposition on the shapes followed by using the Diffuse Approximation allows us to efficiently reduce the problem dimensionality and to interpolate uniquely between admissible input shapes, while also determining the smallest number of parameters needed to characterize the final formed shape. We apply this methodology to the problem of springback assessment for the deep drawing operation of metal sheets.
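
    A minimal sketch of the reduced-order step is given below: sampled level-set representations of admissible shapes are stacked as snapshots, Proper Orthogonal Decomposition (here via an SVD) extracts the dominant modes, and each shape reduces to a few modal coefficients on which an interpolation such as the Diffuse Approximation can then operate. Function and variable names are illustrative assumptions.

        import numpy as np

        def shape_manifold_pod(level_set_snapshots, energy=0.99):
            """POD of sampled level-set shape functions. Returns the snapshot mean,
            the retained POD modes, and the coefficients of each snapshot in mode space."""
            X = np.stack([s.ravel() for s in level_set_snapshots], axis=1)
            mean = X.mean(axis=1, keepdims=True)
            U, s, _ = np.linalg.svd(X - mean, full_matrices=False)

            cumulative_energy = np.cumsum(s**2) / np.sum(s**2)
            k = int(np.searchsorted(cumulative_energy, energy) + 1)   # number of retained modes

            modes = U[:, :k]
            coefficients = modes.T @ (X - mean)
            return mean, modes, coefficients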