Science.gov

Sample records for level set framework

  1. An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs

    PubMed Central

    Mosaliganti, Kishore R.; Gelas, Arnaud; Megason, Sean G.

    2013-01-01

    In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations, which restricts future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK) v4 architecture, we implemented a level-set software design that is flexible to different numerical (continuous, discrete, and sparse) and grid representations (point, mesh, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile-time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g., gradients and Hessians) across multiple terms are eliminated. The framework further enables the evolution of multiple level-sets for multi-object segmentation and processing of large datasets. To do so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level-set in real-time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a developing zebrafish
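The container-based design described above, in which a generic level-set PDE is a summation of pluggable terms, can be sketched in a few lines. This is an illustrative NumPy sketch, not the ITK v4 API; the term names, signatures, and update scheme are our simplification.

```python
import numpy as np

def curvature_term(phi):
    # Mean-curvature smoothing: kappa * |grad phi|, via central differences.
    gy, gx = np.gradient(phi)
    norm = np.sqrt(gx**2 + gy**2) + 1e-12
    nxx = np.gradient(gx / norm, axis=1)
    nyy = np.gradient(gy / norm, axis=0)
    return (nxx + nyy) * norm

def expansion_term(phi, speed=1.0):
    # Constant outward/inward motion: speed * |grad phi|.
    gy, gx = np.gradient(phi)
    return speed * np.sqrt(gx**2 + gy**2)

def evolve(phi, terms, dt=0.1, steps=10):
    # The "container": terms is a plain list of callables, so PDE terms can
    # be added or removed at any point, mirroring the linked-container idea.
    for _ in range(steps):
        phi = phi + dt * sum(term(phi) for term in terms)
    return phi
```

With `phi` initialized as a signed distance function (negative inside the contour), evolving with `[expansion_term]` raises `phi` and therefore shrinks the interior; any mix of terms can be passed in the same list.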

  2. Setting Healthcare Priorities at the Macro and Meso Levels: A Framework for Evaluation

    PubMed Central

    Barasa, Edwine W.; Molyneux, Sassy; English, Mike; Cleary, Susan

    2015-01-01

    Background: Priority setting in healthcare is a key determinant of health system performance. However, there is no widely accepted priority setting evaluation framework. We reviewed literature with the aim of developing and proposing a framework for the evaluation of macro and meso level healthcare priority setting practices. Methods: We systematically searched Econlit, PubMed, CINAHL, and EBSCOhost databases and supplemented this with searches in Google Scholar, relevant websites and reference lists of relevant papers. A total of 31 papers on evaluation of priority setting were identified. These were supplemented by broader theoretical literature related to evaluation of priority setting. A conceptual review of selected papers was undertaken. Results: Based on a synthesis of the selected literature, we propose an evaluative framework that requires that priority setting practices at the macro and meso levels of the health system meet the following conditions: (1) Priority setting decisions should incorporate both efficiency and equity considerations as well as the following outcomes; (a) Stakeholder satisfaction, (b) Stakeholder understanding, (c) Shifted priorities (reallocation of resources), and (d) Implementation of decisions. (2) Priority setting processes should also meet the procedural conditions of (a) Stakeholder engagement, (b) Stakeholder empowerment, (c) Transparency, (d) Use of evidence, (e) Revisions, (f) Enforcement, and (g) Being grounded on community values. Conclusion: Available frameworks for the evaluation of priority setting are mostly grounded on procedural requirements, while few have included outcome requirements. There is, however, increasing recognition of the need to incorporate both consequential and procedural considerations in priority setting practices. 
In this review, we adopt an integrative approach to develop and propose a framework for the evaluation of priority setting practices at the macro and meso levels that draws from these

  3. A unified variational segmentation framework with a level-set based sparse composite shape prior

    NASA Astrophysics Data System (ADS)

    Liu, Wenyang; Ruan, Dan

    2015-03-01

    Image segmentation plays an essential role in many medical applications. Low SNR conditions and various artifacts make its automation challenging. To achieve robust and accurate segmentation results, a good approach is to introduce proper shape priors. In this study, we present a unified variational segmentation framework that regularizes the target shape with a level-set based sparse composite prior. When the variational problem is solved with a block minimization/descent scheme, the regularizing impact of the sparse composite prior can be observed to adjust to the most recent shape estimate, and may be interpreted as a ‘dynamic’ shape prior, yet without compromising convergence thanks to the unified energy framework. The proposed method was applied to segment the corpus callosum from 2D MR images and the liver from 3D CT volumes. Its performance was evaluated using the Dice Similarity Coefficient and Hausdorff distance, and compared with two benchmark level-set based segmentation methods. The proposed method achieved statistically significantly higher accuracy in both experiments and avoided faulty inclusion/exclusion of surrounding structures with similar intensities, as opposed to the benchmark methods.

  4. A Unified Variational Segmentation Framework with a Level-set based Sparse Composite Shape Prior

    PubMed Central

    Liu, Wenyang; Ruan, Dan

    2015-01-01

    Image segmentation plays an essential role in many medical applications. Low SNR conditions and various artifacts make its automation challenging. To achieve robust and accurate segmentation results, a good approach is to introduce proper shape priors. In this study, we present a unified variational segmentation framework that regularizes the target shape with a level-set based sparse composite prior. When the variational problem is solved with a block minimization/descent scheme, the regularizing impact of the sparse composite prior can be observed to adjust to the most recent shape estimate, and may be interpreted as a “dynamic” shape prior, yet without compromising convergence thanks to the unified energy framework. The proposed method was applied to segment the corpus callosum from 2D MR images and the liver from 3D CT volumes. Its performance was evaluated using the Dice Similarity Coefficient and Hausdorff distance, and compared with two benchmark level-set based segmentation methods. The proposed method achieved statistically significantly higher accuracy in both experiments and avoided faulty inclusion/exclusion of surrounding structures with similar intensities, as opposed to the benchmark methods. PMID:25668234
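The block minimization/descent idea behind the sparse composite prior can be illustrated with a toy alternation: one block takes a soft-thresholded (ISTA-style) step on sparse coefficients over a shape dictionary, the other pulls the shape estimate toward the resulting composite, which is what makes the prior behave "dynamically". All names, defaults, and update details here are our simplification, not the authors' formulation.

```python
import numpy as np

def soft_threshold(x, lam):
    # Proximal operator of the l1 norm, which induces sparsity.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def block_descent(shape, D, lam=0.01, mu=0.5, iters=20, step=0.1):
    # shape: flattened shape/level-set vector; D: columns are training shapes.
    w = np.zeros(D.shape[1])
    for _ in range(iters):
        # Block 1: ISTA step on 0.5*||D w - shape||^2 + lam*||w||_1.
        grad = D.T @ (D @ w - shape)
        w = soft_threshold(w - step * grad, step * lam)
        # Block 2: pull the shape toward the sparse composite prior D w.
        shape = (1 - mu) * shape + mu * (D @ w)
    return shape, w
```

Because both blocks decrease a single unified energy, the alternation does not compromise convergence, which is the point the abstract makes about the "dynamic" interpretation.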

  5. A multi-phase level set framework for source reconstruction in bioluminescence tomography

    SciTech Connect

    Huang Heyu; Qu Xiaochao; Liang Jimin; He Xiaowei; Chen Xueli; Yang Da'an; Tian Jie

    2010-07-01

    We propose a novel multi-phase level set algorithm for solving the inverse problem of bioluminescence tomography. The distribution of the unknown interior source is considered piecewise constant and represented by multiple level set functions. The localization of the interior bioluminescence source is implemented by tracing the evolution of the level set functions. An alternate search scheme is incorporated to ensure a globally optimal reconstruction. Both numerical and physical experiments are performed to evaluate the developed level set reconstruction method. The reconstruction results show that the proposed method can stably resolve the interior source in bioluminescence tomography.
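The piecewise-constant representation via multiple level-set functions works as in the standard multi-phase construction: with two functions, the four sign combinations index up to four constant-valued regions. A minimal sketch (the region constants are invented example values):

```python
import numpy as np

def piecewise_constant(phi1, phi2, c):
    # c = (c_pp, c_pn, c_np, c_nn): one constant per sign combination of
    # (phi1, phi2), giving up to four regions from two level-set functions.
    h1, h2 = phi1 > 0, phi2 > 0
    return np.where(h1 & h2, c[0],
           np.where(h1 & ~h2, c[1],
           np.where(~h1 & h2, c[2], c[3])))
```

In a reconstruction loop, the evolution of `phi1` and `phi2` moves the region boundaries, while the constants are re-estimated from the current partition.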

  6. A coupled level-set framework for bladder wall segmentation with application to MRI-based virtual cystoscopy

    NASA Astrophysics Data System (ADS)

    Duan, Chaijie; Bao, Shanglian; Liang, Zhengrong

    2009-02-01

    In this paper, we propose a coupled level-set framework for segmentation of the bladder wall using T1-weighted magnetic resonance (MR) images. The segmentation results will be used for non-invasive MR-based virtual cystoscopy (VCys). The framework uses two level-set functions to segment the inner and outer borders of the bladder wall, respectively. Based on the Chan-Vese (C-V) model, a local adaptive fitting (LAF) image energy is introduced to capture local intensity contrast. Compared with previous work, our method has the following advantages. First of all, unlike most other work, which segments only the boundary of the bladder rather than the inner and outer borders separately, our method extracts both the inner and outer borders of the bladder wall automatically. Secondly, we focus on T1-weighted MR images, which decrease the image intensity of the urine and therefore minimize the partial volume effect (PVE) on the bladder wall for detection of abnormalities on the mucosa layer, in contrast to work on CT images and T2-weighted MR images, which enhance the intensity of the urine and encounter the PVE. In addition, T1-weighted MR images provide the best tissue contrast for detection of the outer border of the bladder wall. Since MR images tend to be inhomogeneous and to have ghost artifacts due to motion and other causes, as compared to computed tomography (CT)-based VCys, our framework makes it easy to control the geometric properties of the level-set functions to mitigate the influence of inhomogeneity and ghosts. Finally, a variety of geometric parameters, such as the thickness of the bladder wall, can be measured easily within the level-set framework. These parameters are clinically important for VCys. The segmentation results were evaluated by experienced radiologists, whose feedback strongly demonstrated the usefulness of the coupled level-set framework for VCys.

  7. Improving district level health planning and priority setting in Tanzania through implementing accountability for reasonableness framework: Perceptions of stakeholders

    PubMed Central

    2010-01-01

    Background In 2006, researchers and decision-makers launched a five-year project - Response to Accountable Priority Setting for Trust in Health Systems (REACT) - to improve planning and priority-setting through implementing the Accountability for Reasonableness framework in Mbarali District, Tanzania. The objective of this paper is to explore the acceptability of Accountability for Reasonableness from the perspectives of the Council Health Management Team, local government officials, health workforce and members of user boards and committees. Methods Individual interviews were carried out with different categories of actors and stakeholders in the district. The interview guide consisted of a series of questions, asking respondents to describe their perceptions regarding each condition of the Accountability for Reasonableness framework in terms of priority setting. Interviews were analysed using thematic framework analysis. Documentary data were used to support, verify and highlight the key issues that emerged. Results Almost all stakeholders viewed Accountability for Reasonableness as an important and feasible approach for improving priority-setting and health service delivery in their context. However, a few aspects of Accountability for Reasonableness were seen as too difficult to implement given the socio-political conditions and traditions in Tanzania. Respondents mentioned: budget ceilings and guidelines, low level of public awareness, unreliable and untimely funding, as well as the limited capacity of the district to generate local resources as the major contextual factors that hampered the full implementation of the framework in their context. Conclusion This study was one of the first assessments of the applicability of Accountability for Reasonableness in health care priority-setting in Tanzania. The analysis, overall, suggests that the Accountability for Reasonableness framework could be an important tool for improving priority-setting processes in the

  8. A fourth-order accurate curvature computation in a level set framework for two-phase flows subjected to surface tension forces

    NASA Astrophysics Data System (ADS)

    Coquerelle, Mathieu; Glockner, Stéphane

    2016-01-01

    We propose an accurate and robust fourth-order curvature extension algorithm in a level set framework for the transport of the interface. The method is based on the Continuum Surface Force approach, and is shown to efficiently calculate surface tension forces for two-phase flows. In this framework, the accuracy of the algorithms mostly relies on the precise computation of the surface curvature which we propose to accomplish using a two-step algorithm: first by computing a reliable fourth-order curvature estimation from the level set function, and second by extending this curvature rigorously in the vicinity of the surface, following the Closest Point principle. The algorithm is easy to implement and to integrate into existing solvers, and can easily be extended to 3D. We propose a detailed analysis of the geometrical and numerical criteria responsible for the appearance of spurious currents, a well known phenomenon observed in various numerical frameworks. We study the effectiveness of this novel numerical method on state-of-the-art test cases showing that the resulting curvature estimate significantly reduces parasitic currents. In addition, the proposed approach converges to fourth-order regarding spatial discretization, which is two orders of magnitude better than algorithms currently available. We also show the necessity for high-order transport methods for the surface by studying the case of the 2D advection of a column at equilibrium thereby proving the robustness of the proposed approach. The algorithm is further validated on more complex test cases such as a rising bubble.
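The central quantity in the abstract above is the interface curvature computed from the level set function, κ = ∇·(∇φ/|∇φ|), which feeds the Continuum Surface Force surface-tension term. The sketch below is the standard second-order finite-difference estimate, not the paper's fourth-order scheme or its Closest Point extension; it shows the baseline being improved upon. For a signed distance function of a circle of radius 0.5, the estimate near the interface should approach 1/0.5 = 2.

```python
import numpy as np

def curvature(phi, h):
    # kappa = div( grad(phi) / |grad(phi)| ) on a uniform grid with
    # spacing h, using second-order central differences throughout.
    gy, gx = np.gradient(phi, h)
    norm = np.sqrt(gx**2 + gy**2) + 1e-12
    div_x = np.gradient(gx / norm, h, axis=1)
    div_y = np.gradient(gy / norm, h, axis=0)
    return div_x + div_y
```

Errors in exactly this estimate are a main source of the spurious ("parasitic") currents the paper analyzes, which is why a higher-order estimate plus a rigorous extension off the interface pays off.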

  9. Monitoring Street-Level Spatial-Temporal Variations of Carbon Monoxide in Urban Settings Using a Wireless Sensor Network (WSN) Framework

    PubMed Central

    Wen, Tzai-Hung; Jiang, Joe-Air; Sun, Chih-Hong; Juang, Jehn-Yih; Lin, Tzu-Shiang

    2013-01-01

    Air pollution has become a severe environmental problem due to urbanization and heavy traffic. Monitoring street-level air quality is an important issue, but most official monitoring stations are installed to monitor large-scale air quality conditions, and their limited spatial resolution cannot reflect the detailed variations in air quality that may be induced by traffic jams. By deploying wireless sensors on crossroads and main roads, this study established a pilot framework for a wireless sensor network (WSN)-based real-time monitoring system to understand street-level spatial-temporal changes of carbon monoxide (CO) in urban settings. The system consists of two major components. The first component is the deployment of wireless sensors. We deployed 44 sensor nodes, 40 transmitter nodes and four gateway nodes in this study. Each sensor node includes a signal processing module, a CO sensor and a wireless communication module. In order to capture realistic human exposure to traffic pollutants, all sensors were deployed at a height of 1.5 m on lampposts and traffic signs. The study area covers a total length of 1.5 km of Keelung Road in Taipei City. The other component is a map-based monitoring platform for sensor data visualization and manipulation in time and space. Using this intensive real-time street-level monitoring framework, we compared the spatial-temporal patterns of air pollution in different time periods. Our results capture four CO concentration peaks throughout the day at one location, situated along an arterial road near a traffic sign. The hourly average could reach 5.3 ppm from 5:00 pm to 7:00 pm due to traffic congestion. The proposed WSN-based framework captures detailed ground information and the potential risk of human exposure to traffic-related air pollution. It also provides street-level insights into real-time monitoring for further early warning of air pollution and urban environmental management. PMID:24287859

  10. Monitoring street-level spatial-temporal variations of carbon monoxide in urban settings using a wireless sensor network (WSN) framework.

    PubMed

    Wen, Tzai-Hung; Jiang, Joe-Air; Sun, Chih-Hong; Juang, Jehn-Yih; Lin, Tzu-Shiang

    2013-12-01

    Air pollution has become a severe environmental problem due to urbanization and heavy traffic. Monitoring street-level air quality is an important issue, but most official monitoring stations are installed to monitor large-scale air quality conditions, and their limited spatial resolution cannot reflect the detailed variations in air quality that may be induced by traffic jams. By deploying wireless sensors on crossroads and main roads, this study established a pilot framework for a wireless sensor network (WSN)-based real-time monitoring system to understand street-level spatial-temporal changes of carbon monoxide (CO) in urban settings. The system consists of two major components. The first component is the deployment of wireless sensors. We deployed 44 sensor nodes, 40 transmitter nodes and four gateway nodes in this study. Each sensor node includes a signal processing module, a CO sensor and a wireless communication module. In order to capture realistic human exposure to traffic pollutants, all sensors were deployed at a height of 1.5 m on lampposts and traffic signs. The study area covers a total length of 1.5 km of Keelung Road in Taipei City. The other component is a map-based monitoring platform for sensor data visualization and manipulation in time and space. Using this intensive real-time street-level monitoring framework, we compared the spatial-temporal patterns of air pollution in different time periods. Our results capture four CO concentration peaks throughout the day at one location, situated along an arterial road near a traffic sign. The hourly average could reach 5.3 ppm from 5:00 pm to 7:00 pm due to traffic congestion. The proposed WSN-based framework captures detailed ground information and the potential risk of human exposure to traffic-related air pollution. It also provides street-level insights into real-time monitoring for further early warning of air pollution and urban environmental management. PMID:24287859

  11. Optimization from design rules, source and mask, to full chip with a single computational lithography framework: level-set-methods-based inverse lithography technology (ILT)

    NASA Astrophysics Data System (ADS)

    Pang, Linyong; Peng, Danping; Hu, Peter; Chen, Dongxue; Cecil, Tom; He, Lin; Xiao, Guangming; Tolani, Vikram; Dam, Thuc; Baik, Ki-Ho; Gleason, Bob

    2010-04-01

    For semiconductor manufacturers moving toward advanced technology nodes - 32nm, 22nm and below - lithography presents a great challenge, because it is fundamentally constrained by basic principles of optical physics. Because no major lithography hardware improvements are expected over the next couple of years, Computational Lithography has been recognized by the industry as the key technology needed to drive lithographic performance. This implies not only simultaneous co-optimization of all the lithographic enhancement tricks that have been learned over the years, but also that they be pushed to the limit by powerful computational techniques and systems. In this paper a single computational lithography framework for design, mask, and source co-optimization is explained in non-mathematical language. A number of memory and logic device results at the 32nm node and below are presented to demonstrate the benefits of Level-Set-Method-based ILT in applications covering design rule optimization, SMO, and full-chip correction.

  12. A framework and a set of tools called Nutting models to estimate retention capacities and loads of nitrogen and phosphorus in rivers at catchment and national level (France)

    NASA Astrophysics Data System (ADS)

    Legeay, Pierre-Louis; Moatar, Florentina; Dupas, Rémi; Gascuel-Odoux, Chantal

    2016-04-01

    The Nutting-N and Nutting-P models (Dupas et al., 2013, 2015) have been developed to estimate nitrogen and phosphorus nonpoint-source emissions to surface water using readily available data. These models were inspired by the US model SPARROW (Smith et al., 1997) and the European model GREEN (Grizzetti et al., 2008), i.e. statistical approaches that link nitrogen and phosphorus surplus to catchment land and river characteristics in order to estimate the catchments' relative retention capacities. The nutrient load (L) at the outlet of each catchment is expressed as: L = R*(B*DS + PS) [1] where DS is diffuse sources (i.e. surplus in kg.ha-1.yr-1 for N, P storage in soil for P), PS is point sources of domestic and industrial origin (kg.ha-1.yr-1), and R and B are the river system and basin reduction factors, respectively; they combine observed variables and calibrated parameters. The model was calibrated on independent catchments for the 2005-2009 and 2008-2012 periods. Variables were selected according to the Bayesian Information Criterion (BIC) in order to optimize the predictive performance of the models. From these basic models, different improvements have been realized to build a framework and a set of tools: 1) a routing module has been added in order to improve estimations on 4th- or 5th-order streams, i.e. upscaling the basic Nutting approach; 2) a territorial module, in order to test the models at local scale (from 500 to 5000 km²); 3) a seasonal estimation has been investigated. The basic approach as well as the territorial application will be illustrated. These tools allow water managers to identify areas at risk where high nutrient loads are estimated, as well as areas where retention is potentially high and can buffer high nutrient sources. References Dupas R., Curie F., Gascuel-Odoux C., Moatar F., Delmas M., Parnaudeau, V., Durand P., 2013. Assessing N emissions in surface water at the national level: Comparison of country-wide vs. regionalized models. 
Science of the Total Environment
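Equation [1] above is a one-line computation; with invented example values for the inputs, it reads:

```python
def nutrient_load(ds, ps, b, r):
    # L = R * (B * DS + PS)  -- equation [1] of the abstract:
    # ds: diffuse sources (kg.ha-1.yr-1), ps: point sources (kg.ha-1.yr-1),
    # b: basin reduction factor, r: river system reduction factor.
    return r * (b * ds + ps)
```

The calibration problem the abstract describes is then to fit B and R (as functions of catchment characteristics) so that predicted loads match the loads observed at monitored outlets.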

  13. An adaptive level set method

    SciTech Connect

    Milne, R.B.

    1995-12-01

    This thesis describes a new method for the numerical solution of partial differential equations of the parabolic type on an adaptively refined mesh in two or more spatial dimensions. The method is motivated and developed in the context of the level set formulation for the curvature dependent propagation of surfaces in three dimensions. In that setting, it realizes the multiple advantages of decreased computational effort, localized accuracy enhancement, and compatibility with problems containing a range of length scales.

  14. High-Level Application Framework for LCLS

    SciTech Connect

    Chu, P; Chevtsov, S.; Fairley, D.; Larrieu, C.; Rock, J.; Rogind, D.; White, G.; Zalazny, M.; /SLAC

    2008-04-22

    A framework for high level accelerator application software is being developed for the Linac Coherent Light Source (LCLS). The framework is based on plug-in technology developed by an open source project, Eclipse. Many existing functionalities provided by Eclipse are available to high-level applications written within this framework. The framework also contains static data storage configuration and dynamic data connectivity. Because the framework is Eclipse-based, it is highly compatible with any other Eclipse plug-ins. The entire infrastructure of the software framework will be presented. Planned applications and plug-ins based on the framework are also presented.

  15. A Framework for Describing Interlanguages in Multilingual Settings.

    ERIC Educational Resources Information Center

    Tenjoh-Okwen, Thomas

    1989-01-01

    Outlines a contrastive analysis model and a non-contrastive analysis model for studying interlanguage in strictly bilingual settings, and suggests a bidimensional framework, including both linguistic and curricular components, for studying interlanguage in multilingual settings. (21 references) (CB)

  16. Towards a Framework for Change Detection in Data Sets

    NASA Astrophysics Data System (ADS)

    Böttcher, Mirko; Nauck, Detlef; Ruta, Dymitr; Spott, Martin

    Since the world with its markets, innovations and customers is changing faster than ever before, the key to survival for businesses is the ability to detect, assess and respond to changing conditions rapidly and intelligently. Discovering changes and reacting to or acting upon them before others do has therefore become a strategic issue for many companies. However, existing data analysis techniques are insufficient for this task since they typically assume that the domain under consideration is stable over time. This paper presents a framework that detects changes within a data set at virtually any level of granularity. The underlying idea is to derive a rule-based description of the data set at different points in time and to subsequently analyse how these rules change. Nevertheless, further techniques are required to assist the data analyst in interpreting and assessing these changes. Therefore the framework also contains methods to discard rules that are non-drivers for change and to assess the interestingness of detected changes.

  17. Level set method for microfabrication simulations

    NASA Astrophysics Data System (ADS)

    Baranski, Maciej; Kasztelanic, Rafal; Albero, Jorge; Nieradko, Lukasz; Gorecki, Christophe

    2010-05-01

    The article describes the application of the Level Set method to two different microfabrication processes. The first is the shape evolution of a glass structure during reflow. The investigated problem was approximated by viscous flow of the material, so the kinetics of the process were known from a physical model. The second problem is isotropic wet etching of silicon, which is much more complicated because the dynamics of the shape evolution are strongly coupled with time and with the history of the geometry. In the etching simulations the Level Set method is coupled with the Finite Element Method (FEM), which is used to calculate the etching acid concentration that determines the geometry evolution of the structure. The problem arising from working with FEM on time-varying boundaries was solved with a dynamic mesh technique employing the Level Set formalism of a higher-dimensional function for geometry description. Isotropic etching was investigated in the context of micro-lens fabrication. The model was compared with experimental data obtained in etching of the silicon moulds used for micro-lens fabrication.

  18. Setting dietary intake levels: problems and pitfalls.

    PubMed

    Russell, Robert M

    2007-01-01

    Recommended dietary intake levels are the nutrient standards used in designing food assistance programmes, institutional feeding programmes, counselling and teaching. In the USA, the recommended dietary allowances (RDAs) are the basis for setting the poverty threshold and food stamp allotments. In the 1990s, a new paradigm was put forth for estimating nutrient requirements and recommended intake levels. This considered the level of nutrient needed for normal body functioning (versus the amount needed to prevent a deficiency state from occurring). An estimated average requirement (EAR), an RDA and a tolerable upper intake level (UL) were determined for most nutrients. In setting forth these nutrient intake levels (dietary reference intakes, DRIs), a number of data challenges were encountered. For example, it was recognized that for most nutrients there was an absence of dose-response data, and few chronic human or animal studies had been undertaken. In considering how to revise nutrient intake recommendations for populations in the future, the following pitfalls must be overcome: (1) invalid assumption that a threshold level for a requirement will hold for all nutrients; (2) lack of uniform criteria for the selection of the endpoints used (need for evidence-based review, consideration of comparative risk); (3) invalid extrapolations to children for many nutrients; (4) lack of information on variability of responses, and interactions with other nutrients; and (5) lack of understanding in the community of how to use the various DRI numbers. PMID:17913222

  19. Level Set Segmentation of Lumbar Vertebrae Using Appearance Models

    NASA Astrophysics Data System (ADS)

    Fritscher, Karl; Leber, Stefan; Schmölz, Werner; Schubert, Rainer

    For the planning of surgical interventions of the spine, exact knowledge of the 3D shape and the local bone quality of vertebrae is of great importance in order to estimate the anchorage strength of screws or implants. As a prerequisite for quantitative analysis, a method for objective and therefore automated segmentation of vertebrae is needed. In this paper a framework for the automatic segmentation of vertebrae using 3D appearance models in a level set framework is presented. In this framework, model information as well as gradient information and probabilities of pixel intensities at object edges in the unseen image are used. The method is tested on 29 lumbar vertebrae, leading to accurate results, which can be useful for surgical planning and further analysis of the local bone quality.

  20. Pulmonary lobe segmentation with level sets

    NASA Astrophysics Data System (ADS)

    Schmidt-Richberg, Alexander; Ehrhardt, Jan; Wilms, Matthias; Werner, René; Handels, Heinz

    2012-02-01

    Automatic segmentation of the separate human lung lobes is a crucial task in computer aided diagnostics and intervention planning, and required for example for determination of disease spreading or pulmonary parenchyma quantification. In this work, a novel approach for lobe segmentation based on multi-region level sets is presented. In a first step, interlobular fissures are detected using a supervised enhancement filter. The fissures are then used to compute a cost image, which is incorporated in the level set approach. By this, the segmentation is drawn to the fissures at places where structure information is present in the image. In areas with incomplete fissures (e.g. due to insufficient image quality or anatomical conditions) the smoothing term of the level sets applies and a closed continuation of the fissures is provided. The approach is tested on nine pulmonary CT scans. It is shown that incorporating the additional force term improves the segmentation significantly. On average, 83% of the left fissure is traced correctly; the right oblique and horizontal fissures are properly segmented to 76% and 48%, respectively.

  1. Setting the stage for master's level success

    NASA Astrophysics Data System (ADS)

    Roberts, Donna

    Comprehensive reading, writing, research, and study skills play a critical role in a graduate student's success and ability to contribute to a field of study effectively. The literature indicated a need to support graduate student success in the areas of mentoring and navigation, as well as research and writing. The purpose of this two-phased mixed methods explanatory study was to examine factors that characterize student success at the Master's level in the fields of education, sociology and social work. The study was grounded in a transformational learning framework which focused on three levels of learning: technical knowledge, practical or communicative knowledge, and emancipatory knowledge. The study included two data collection points. Phase one consisted of a Master's Level Success questionnaire that was sent via Qualtrics to graduate level students at three colleges and universities in the Central Valley of California: a California State University campus, a University of California campus, and a private college campus. The results of the chi-square test indicated that seven questionnaire items were significant with p values less than .05. Phase two of the data collection included semi-structured interviews, from which three themes emerged using Dedoose software: (1) the need for more language and writing support at the Master's level, (2) the need for mentoring, especially for second-language learners, and (3) utilizing the strong influence of faculty in student success. It is recommended that institutions continually assess and strengthen their programs to meet the full range of learners and to support students to degree completion.

  2. Etch Profile Simulation Using Level Set Methods

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    Etching and deposition of materials are critical steps in semiconductor processing for device manufacturing. Both etching and deposition may have isotropic and anisotropic components, due, for example, to directional sputtering and redeposition of materials. Previous attempts at modeling profile evolution have used so-called "string theory" to simulate the moving solid-gas interface between the semiconductor and the plasma. One complication of this method is that extensive de-looping schemes are required at the profile corners. We will present a 2D profile evolution simulation using level set theory to model the surface. By embedding the location of the interface in a field variable, the need for de-looping schemes is eliminated and profile corners are more accurately modeled. This level set profile evolution model will calculate both isotropic and anisotropic etch and deposition rates of a substrate in low-pressure (tens of mTorr) plasmas, considering the incident ion energy and angular distribution functions and neutral fluxes. We will present etching profiles of Si substrates in Ar/Cl2 discharges for various incident ion energies and trench geometries.
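The isotropic part of such a profile evolution can be sketched as a first-order upwind (Godunov) update of a level-set field. This is a generic illustration of the technique, not the authors' simulator; the grid, speed, and all names are our own assumptions.

```python
import numpy as np

def evolve_level_set(phi, speed, dt, steps, h=1.0):
    """Advance phi_t + speed*|grad phi| = 0 with a first-order upwind scheme.

    speed > 0 moves the zero level set outward along its normal
    (isotropic etch/deposition); a directional term for anisotropic
    sputtering could be added analogously."""
    for _ in range(steps):
        # One-sided differences; Godunov upwinding for speed > 0.
        dxm = (phi - np.roll(phi, 1, axis=0)) / h
        dxp = (np.roll(phi, -1, axis=0) - phi) / h
        dym = (phi - np.roll(phi, 1, axis=1)) / h
        dyp = (np.roll(phi, -1, axis=1) - phi) / h
        grad = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                       np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
        phi = phi - dt * speed * grad
    return phi

# Example: a circular interface of radius 10 expanding at unit speed.
n = 100
x, y = np.meshgrid(np.arange(n) - n / 2, np.arange(n) - n / 2, indexing="ij")
phi0 = np.sqrt(x**2 + y**2) - 10.0      # signed distance to a circle
phi = evolve_level_set(phi0, speed=1.0, dt=0.5, steps=20)
# After t = 10 time units the interface radius should be near 20.
radius = np.sqrt(x**2 + y**2)[np.abs(phi) < 0.5].mean()
```

Because the interface lives implicitly in `phi`, topology changes and corner formation need no de-looping logic, which is the point the abstract makes.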

  3. Level set based structural topology optimization for minimizing frequency response

    NASA Astrophysics Data System (ADS)

    Shu, Lei; Wang, Michael Yu; Fang, Zongde; Ma, Zhengdong; Wei, Peng

    2011-11-01

    For the purpose of structural vibration reduction, a structural topology optimization for minimizing frequency response is proposed based on the level set method. The objective of the present study is to minimize the frequency response at specified points or surfaces on the structure, for an excitation frequency or a frequency range, subject to a given amount of material over the admissible design domain. The sensitivity analysis with respect to the structural boundaries is carried out, while the extended finite element method (X-FEM) is employed for solving the state equation and the adjoint equation. The optimal structure, with smooth boundaries, is obtained by level set evolution with an advection velocity derived from the sensitivity analysis and the optimization algorithm. A number of numerical examples, in both two dimensions (2D) and three dimensions (3D), are presented to demonstrate the feasibility and effectiveness of the proposed approach.

  4. A probabilistic level set formulation for interactive organ segmentation

    NASA Astrophysics Data System (ADS)

    Cremers, Daniel; Fluck, Oliver; Rousson, Mikael; Aharon, Shmuel

    2007-03-01

    Level set methods have become increasingly popular as a framework for image segmentation. Yet when used as a generic segmentation tool, they suffer from an important drawback: current formulations do not allow much user interaction. Upon initialization, boundaries propagate to the final segmentation without the user being able to guide or correct the result. In the present work, we address this limitation by proposing a probabilistic framework for image segmentation which integrates input intensity information and user input on an equal footing. The resulting algorithm determines the most likely segmentation given the input image and the user input. In order to allow user interaction in real time during the segmentation, the algorithm is implemented on a graphics card and in a narrow band formulation.

  5. Framework for State-Level Renewable Energy Market Potential Studies

    SciTech Connect

    Kreycik, C.; Vimmerstedt, L.; Doris, E.

    2010-01-01

    State-level policymakers are relying on estimates of the market potential for renewable energy resources as they set goals and develop policies to accelerate the development of these resources. Therefore, accuracy of such estimates should be understood and possibly improved to appropriately support these decisions. This document provides a framework and next steps for state officials who require estimates of renewable energy market potential. The report gives insight into how to conduct a market potential study, including what supporting data are needed and what types of assumptions need to be made. The report distinguishes between goal-oriented studies and other types of studies, and explains the benefits of each.

  6. Chemically Induced Surface Evolutions with Level Sets

    SciTech Connect

    2006-11-17

    ChISELS is used for the theoretical modeling of detailed surface chemistry and concomitant surface evolution occurring during microsystem fabrication processes conducted at low pressures. Examples include physical vapor deposition (PVD), plasma-enhanced chemical vapor deposition (PECVD), and plasma etching. Evolving interfaces are represented using the level-set method, and the evolution equations are time-integrated using a semi-Lagrangian approach. A ballistic transport model is employed to solve for the fluxes incident on each of the surface elements. Surface chemistry leading to etching or deposition is computed either by coupling to Surface Chemkin (a commercially available code) or through user-defined subroutines. The computational meshes used are quadtrees (2D) and octrees (3D), constructed such that grid refinement is localized to regions near the surface interfaces. As the interface evolves, the mesh is dynamically reconstructed as needed so that the grid remains fine only around the interface. For parallel computation, a domain decomposition scheme with dynamic load balancing is used to distribute the computational work across processors.

  7. Chemically Induced Surface Evolutions with Level Sets

    Energy Science and Technology Software Center (ESTSC)

    2006-11-17

    ChISELS is used for the theoretical modeling of detailed surface chemistry and concomitant surface evolution occurring during microsystem fabrication processes conducted at low pressures. Examples include physical vapor deposition (PVD), plasma-enhanced chemical vapor deposition (PECVD), and plasma etching. Evolving interfaces are represented using the level-set method, and the evolution equations are time-integrated using a semi-Lagrangian approach. A ballistic transport model is employed to solve for the fluxes incident on each of the surface elements. Surface chemistry leading to etching or deposition is computed either by coupling to Surface Chemkin (a commercially available code) or through user-defined subroutines. The computational meshes used are quadtrees (2D) and octrees (3D), constructed such that grid refinement is localized to regions near the surface interfaces. As the interface evolves, the mesh is dynamically reconstructed as needed so that the grid remains fine only around the interface. For parallel computation, a domain decomposition scheme with dynamic load balancing is used to distribute the computational work across processors.

  8. Advanced level set segmentation of the right atrium in MR

    NASA Astrophysics Data System (ADS)

    Chen, Siqi; Kohlberger, Timo; Kirchberg, Klaus J.

    2011-03-01

    Atrial fibrillation is a common heart arrhythmia and can be effectively treated with ablation. Ablation planning requires 3D models of the patient's left atrium (LA) and/or right atrium (RA); therefore, an automatic segmentation procedure to retrieve these models is desirable. In this study, we investigate the use of advanced level set segmentation approaches to automatically segment the RA in magnetic resonance angiographic (MRA) volume images. A low contrast-to-noise ratio makes the boundary between the RA and the nearby structures nearly indistinguishable, so purely data-driven segmentation approaches such as the watershed and Chan-Vese methods are bound to fail. Incorporating training shapes through PCA modeling to constrain the segmentation is one popular solution, and is also used in our segmentation framework. The shape parameters from PCA are optimized with a global histogram-based energy model. However, since the shape parameters span a much smaller space, they cannot capture fine details of the shape. Therefore, we employ a second refinement step after the shape-based segmentation stage, which follows closely the recent work on localized appearance model based techniques. The local appearance model is established through a robust point tracking mechanism and is learned through landmarks embedded on the surface of training shapes. The key contribution of our work is the combination of a statistical shape prior and a localized appearance prior for level set segmentation of the right atrium from MRA. We test this two-step segmentation framework on porcine RA to verify the algorithm.

  9. A level set segmentation for computer-aided dental x-ray analysis

    NASA Astrophysics Data System (ADS)

    Li, Shuo; Fevens, Thomas; Krzyzak, Adam; Li, Song

    2005-04-01

    A level-set-based segmentation framework for Computer Aided Dental X-rays Analysis (CADXA) is proposed. In this framework, we first employ level set methods to segment the dental X-ray image into three regions: Normal Region (NR), Potential Abnormal Region (PAR), and Abnormal and Background Region (ABR). The segmentation results are then used to build uncertainty maps based on a proposed uncertainty measurement method, and an analysis scheme is applied. The level set segmentation method consists of two stages: a training stage and a segmentation stage. During the training stage, manually chosen representative images are segmented using hierarchical level set region detection. The segmentation results are used to train a support vector machine (SVM) classifier. During the segmentation stage, a dental X-ray image is first classified by the trained SVM. The classifier provides an initial contour, close to the correct boundary, for the coupled level set method, which is then used to further segment the image. Different dental X-ray images are used to test the framework. Experimental results show that the proposed framework achieves faster level set segmentation and provides more detailed information and indications of possible problems to the dentist. To the best of our knowledge, this is one of the first results on CADXA using level set methods.

  10. Priority setting in healthcare: towards guidelines for the program budgeting and marginal analysis framework.

    PubMed

    Peacock, Stuart J; Mitton, Craig; Ruta, Danny; Donaldson, Cam; Bate, Angela; Hedden, Lindsay

    2010-10-01

    Economists' approaches to priority setting focus on the principles of opportunity cost, marginal analysis and choice under scarcity. These approaches are based on the premise that it is possible to design a rational priority setting system that will produce legitimate changes in resource allocation. However, beyond issuing guidance at the national level, economic approaches to priority setting have had only a moderate impact in practice. In particular, local health service organizations - such as health authorities, health maintenance organizations, hospitals and healthcare trusts - have had difficulty implementing evidence from economic appraisals. Yet, in the context of making decisions between competing claims on scarce health service resources, economic tools and thinking have much to offer. The purpose of this article is to describe and discuss ten evidence-based guidelines for the successful design and implementation of a program budgeting and marginal analysis (PBMA) priority setting exercise. PBMA is a framework that explicitly recognizes the need to balance pragmatic and ethical considerations with economic rationality when making resource allocation decisions. While the ten guidelines are drawn from the PBMA framework, they may be generalized across a range of economic approaches to priority setting. PMID:20950070

  11. Efficient molecular surface generation using level-set methods.

    PubMed

    Can, Tolga; Chen, Chao-I; Wang, Yuan-Fang

    2006-12-01

    Molecules interact through their surface residues. Calculation of the molecular surface of a protein structure is thus an important step for a detailed functional analysis. One of the main considerations in comparing existing methods for molecular surface computations is their speed. Most of the methods that produce satisfying results for small molecules fail to do so for large complexes. In this article, we present a level-set-based approach to compute and visualize a molecular surface at a desired resolution. The emerging level-set methods have been used for computing evolving boundaries in several application areas from fluid mechanics to computer vision. Our method provides a uniform framework for computing solvent-accessible, solvent-excluded surfaces and interior cavities. The computation is carried out very efficiently even for very large molecular complexes with tens of thousands of atoms. We compared our method to some of the most widely used molecular visualization tools (Swiss-PDBViewer, PyMol, and Chimera) and our results show that we can calculate and display a molecular surface 1.5-3.14 times faster on average than all three of the compared programs. Furthermore, we demonstrate that our method is able to detect all of the interior inaccessible cavities that can accommodate one or more water molecules. PMID:16621636
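The union-of-balls construction that underlies such molecular surfaces is straightforward to express as a level-set function: inflate each atomic radius by the solvent probe radius and take the pointwise minimum of the per-atom signed distances. This is a toy sketch of the solvent-accessible surface only, not the paper's algorithm; the coordinates, radii, and probe default are illustrative.

```python
import numpy as np

def molecular_surface_phi(centers, radii, grid, probe=1.4):
    """Level-set function whose zero level approximates the
    solvent-accessible surface (SAS) of a set of atoms.
    `grid` is a tuple of coordinate arrays of equal shape."""
    phi = np.full(grid[0].shape, np.inf)
    for c, r in zip(centers, radii):
        d = np.sqrt(sum((g - ci) ** 2 for g, ci in zip(grid, c))) - (r + probe)
        phi = np.minimum(phi, d)  # union of balls = pointwise minimum
    return phi

# Two carbon-like atoms 3 A apart with van der Waals radius 1.7 A.
ax = np.linspace(-6.0, 9.0, 60)
grid = np.meshgrid(ax, ax, ax, indexing="ij")
phi = molecular_surface_phi([(0, 0, 0), (3, 0, 0)], [1.7, 1.7], grid)
```

Interior cavities then appear as connected components of the region `phi < 0` that are not reachable from the domain boundary, which is how a level-set representation makes cavity detection uniform with surface extraction.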

  12. Decentralized health care priority-setting in Tanzania: evaluating against the accountability for reasonableness framework.

    PubMed

    Maluka, Stephen; Kamuzora, Peter; San Sebastián, Miguel; Byskov, Jens; Olsen, Øystein E; Shayo, Elizabeth; Ndawi, Benedict; Hurtig, Anna-Karin

    2010-08-01

    Priority-setting has become one of the biggest challenges faced by health decision-makers worldwide. Fairness is a key goal of priority-setting and Accountability for Reasonableness has emerged as a guiding framework for fair priority-setting. This paper describes the processes of setting health care priorities in Mbarali district, Tanzania, and evaluates the descriptions against Accountability for Reasonableness. Key informant interviews were conducted with district health managers, local government officials and other stakeholders using a semi-structured interview guide. Relevant documents were also gathered and group priority-setting in the district was observed. The results indicate that, while Tanzania has a decentralized public health care system, the reality of the district level priority-setting process was that it was not nearly as participatory as the official guidelines suggest it should have been. Priority-setting usually occurred in the context of budget cycles and the process was driven by historical allocation. Stakeholders' involvement in the process was minimal. Decisions (but not the reasoning behind them) were publicized through circulars and notice boards, but there were no formal mechanisms in place to ensure that this information reached the public. There were neither formal mechanisms for challenging decisions nor an adequate enforcement mechanism to ensure that decisions were made in a fair and equitable manner. Therefore, priority-setting in Mbarali district did not satisfy all four conditions of Accountability for Reasonableness; namely relevance, publicity, appeals and revision, and enforcement. This paper aims to make two important contributions to this problematic situation. First, it provides empirical analysis of priority-setting at the district level in the contexts of low-income countries. Second, it provides guidance to decision-makers on how to improve fairness, legitimacy, and sustainability of the priority-setting process. 

  13. Beyond SMART? A New Framework for Goal Setting

    ERIC Educational Resources Information Center

    Day, Trevor; Tosey, Paul

    2011-01-01

    This article extends currently reported theory and practice in the use of learning goals or targets with students in secondary and further education. Goal-setting and action-planning constructs are employed in personal development plans (PDPs) and personal learning plans (PLPs) and are advocated as practice within the English national policy…

  14. A Framework for Credit. Framework Guidelines 1. Levels, Credit Value and the Award of Credits.

    ERIC Educational Resources Information Center

    Further Education Unit, London (England).

    This document explores the rationale and technical issues underlying the proposal for a common credit framework in Great Britain. This volume, aimed at senior institutional managers, curriculum managers, and practitioners, offers advice on levels, credit value, and award of credit within the framework proposal. A list of terminology is found at…

  15. A contribution to set a legal framework for biofertilisers.

    PubMed

    Malusá, E; Vassilev, N

    2014-08-01

    The extensive research, production, and use of microorganisms to improve plant nutrition have resulted in an inconsistent definition of the term "biofertiliser", which, in some cases, is due to the different microbial mechanisms involved. The rationale for adopting the term biofertiliser is that it derives from "biological fertiliser", which, in turn, implies the use of living microorganisms. Here, we propose a definition for this kind of product that distinguishes it from biostimulants and from other inorganic and organic fertilisers. Special emphasis is given to microorganisms with multifunctional properties and to biofertilisers containing more than one microorganism. This definition could be included in legal provisions regulating registration and marketing requirements. A set of rules is also proposed which could guarantee the quality of biofertilisers on the market and thus foster their use by farmers. PMID:24903811

  16. Tailoring Healthy Workplace Interventions to Local Healthcare Settings: A Complexity Theory-Informed Workplace of Well-Being Framework

    PubMed Central

    Brand, Sarah L.; Fleming, Lora E.; Wyatt, Katrina M.

    2015-01-01

    Many healthy workplace interventions have been developed for healthcare settings to address the consistently low scores of healthcare professionals on assessments of mental and physical well-being. Complex healthcare settings present challenges for the scale-up and spread of successful interventions from one setting to another. Despite general agreement regarding the importance of the local setting in affecting intervention success across different settings, there is no consensus on what it is about a local setting that needs to be taken into account to design healthy workplace interventions appropriate for different local settings. Complexity theory principles were used to understand a workplace as a complex adaptive system and to create a framework of eight domains (system characteristics) that affect the emergence of system-level behaviour. This Workplace of Well-being (WoW) framework is responsive and adaptive to local settings and allows a shared understanding of the enablers and barriers to behaviour change by capturing local information for each of the eight domains. We use the results of applying the WoW framework to one workplace, a UK National Health Service ward, to describe the utility of this approach in informing design of setting-appropriate healthy workplace interventions that create workplaces conducive to healthy behaviour change. PMID:26380358

  17. Levels of racism: a theoretic framework and a gardener's tale.

    PubMed Central

    Jones, C P

    2000-01-01

    The author presents a theoretic framework for understanding racism on 3 levels: institutionalized, personally mediated, and internalized. This framework is useful for raising new hypotheses about the basis of race-associated differences in health outcomes, as well as for designing effective interventions to eliminate those differences. She then presents an allegory about a gardener with 2 flower boxes, rich and poor soil, and red and pink flowers. This allegory illustrates the relationship between the 3 levels of racism and may guide our thinking about how to intervene to mitigate the impacts of racism on health. It may also serve as a tool for starting a national conversation on racism. PMID:10936998

  18. An efficient MRF embedded level set method for image segmentation.

    PubMed

    Yang, Xi; Gao, Xinbo; Tao, Dacheng; Li, Xuelong; Li, Jie

    2015-01-01

    This paper presents a fast and robust level set method for image segmentation. To enhance the robustness against noise, we embed a Markov random field (MRF) energy function into the conventional level set energy function. This MRF energy function builds the correlation of a pixel with its neighbors and encourages them to fall into the same region. To obtain a fast implementation of the MRF embedded level set model, we explore algebraic multigrid (AMG) and the sparse field method (SFM) to increase the time step and decrease the computation domain, respectively. Both AMG and SFM can be conducted in a parallel fashion, which facilitates applying our method to large image databases. By comparing the proposed fast and robust level set method with the standard level set method and its popular variants on noisy synthetic images, synthetic aperture radar (SAR) images, medical images, and natural images, we comprehensively demonstrate that the new method is robust against various kinds of noise. In particular, the new level set method can segment an image of size 500 × 500 within 3 s in MATLAB R2010b on a computer with a 3.30-GHz CPU and 4-GB memory. PMID:25420261
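The core idea of coupling a per-pixel data term with an MRF neighborhood term can be illustrated with a minimal iterated-conditional-modes (ICM) labeling loop. This sketches only the coupling, not the paper's level set evolution or its AMG/sparse-field machinery, and all parameter values are assumptions.

```python
import numpy as np

def mrf_segment(img, c_in, c_out, beta=0.5, iters=10):
    """Two-region labeling minimizing a data term plus a Potts MRF
    prior by iterated conditional modes (ICM). beta weights agreement
    with the 4-neighborhood; labels are 1 (inside) or 0 (outside)."""
    label = (np.abs(img - c_in) < np.abs(img - c_out)).astype(int)
    for _ in range(iters):
        # Number of 4-neighbors currently labeled "inside".
        nin = sum(np.roll(label, s, axis=a) for s in (1, -1) for a in (0, 1))
        e_in = (img - c_in) ** 2 + beta * (4 - nin)   # disagreements if labeled "in"
        e_out = (img - c_out) ** 2 + beta * nin       # disagreements if labeled "out"
        label = (e_in < e_out).astype(int)
    return label

# Noisy square: the neighborhood term removes isolated misclassified pixels.
rng = np.random.default_rng(0)
img = np.zeros((40, 40))
img[10:30, 10:30] = 1.0
img += rng.normal(0.0, 0.3, img.shape)
label = mrf_segment(img, c_in=1.0, c_out=0.0)
```

The neighbor-count term is exactly what makes pixels "fall into the same region" as their neighbors: flipping an isolated noisy pixel saves up to `4 * beta` of prior energy at a small data cost.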

  19. A 3D Level Set Method for Microwave Breast Imaging

    PubMed Central

    Colgan, Timothy J.; Hagness, Susan C.; Van Veen, Barry D.

    2015-01-01

    Objective Conventional inverse-scattering algorithms for microwave breast imaging result in moderate resolution images with blurred boundaries between tissues. Recent 2D numerical microwave imaging studies demonstrate that the use of a level set method preserves dielectric boundaries, resulting in a more accurate, higher resolution reconstruction of the dielectric properties distribution. Previously proposed level set algorithms are computationally expensive and thus impractical in 3D. In this paper we present a computationally tractable 3D microwave imaging algorithm based on level sets. Methods We reduce the computational cost of the level set method using a Jacobian matrix, rather than an adjoint method, to calculate Frechet derivatives. We demonstrate the feasibility of 3D imaging using simulated array measurements from 3D numerical breast phantoms. We evaluate performance by comparing full 3D reconstructions to those from a conventional microwave imaging technique. We also quantitatively assess the efficacy of our algorithm in evaluating breast density. Results Our reconstructions of 3D numerical breast phantoms improve upon those of a conventional microwave imaging technique. The density estimates from our level set algorithm are more accurate than those of conventional microwave imaging, and the accuracy is greater than that reported for mammographic density estimation. Conclusion Our level set method leads to a feasible level of computational complexity for full 3D imaging, and reconstructs the heterogeneous dielectric properties distribution of the breast more accurately than conventional microwave imaging methods. Significance 3D microwave breast imaging using a level set method is a promising low-cost, non-ionizing alternative to current breast imaging techniques. PMID:26011863

  20. Hippocampus segmentation using locally weighted prior based level set

    NASA Astrophysics Data System (ADS)

    Achuthan, Anusha; Rajeswari, Mandava

    2015-12-01

    Segmentation of the hippocampus is a major challenge in medical image analysis due to its imaging characteristics: its intensity is almost identical to that of adjacent gray matter structures, such as the amygdala. This intensity similarity causes the hippocampus to have weak or fuzzy boundaries. Given this challenge, a segmentation method that relies on image information alone may not produce accurate results; prior information, such as shape and spatial information, must therefore be assimilated into the segmentation method to produce the expected segmentation. Previous studies have widely integrated prior information into segmentation methods. However, the prior information has been integrated in a global manner, which does not reflect the real scenario during clinical delineation. Therefore, in this paper, prior information locally integrated into a level set model is presented. This work utilizes a mean shape model, incorporated as prior information into the level set model, to provide automatic initialization for the level set evolution. The local integration of edge-based information and prior information is implemented through an edge weighting map that decides, at the voxel level, which information should be observed during the level set evolution. The edge weighting map indicates which voxels carry sufficient edge information. Experiments show that the proposed local integration of prior information into a conventional edge-based level set model, the geodesic active contour, yields an improvement of 9% in the averaged Dice coefficient.
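The per-voxel gating can be sketched as a soft weight map on gradient magnitude: high weight where edges exist, low weight where the shape prior should dominate. This illustrates the locally weighted idea only, not the authors' implementation; `threshold` and both function names are our own assumptions.

```python
import numpy as np

def edge_weight_map(img, threshold=0.1):
    """Soft per-voxel gate in [0, 1]: near 1 where the image carries
    sufficient edge information, near 0 where the shape prior should
    take over. `threshold` is an assumed tuning parameter."""
    grads = np.gradient(img.astype(float))
    mag = np.sqrt(sum(g ** 2 for g in grads))
    return mag / (mag + threshold)

def combined_speed(edge_speed, prior_speed, w):
    # Voxelwise blend: edge-driven where w ~ 1, prior-driven where w ~ 0.
    return w * edge_speed + (1.0 - w) * prior_speed

# A step edge: the map is high on the boundary and zero in flat regions.
img = np.zeros((20, 20))
img[:, 10:] = 1.0
w = edge_weight_map(img)
```

Plugging `combined_speed` into a level set evolution would let weak-boundary voxels follow the mean-shape prior while strong-boundary voxels follow the image, which is the decision the abstract attributes to the edge weighting map.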

  1. Exploring the level sets of quantum control landscapes

    SciTech Connect

    Rothman, Adam; Ho, Tak-San; Rabitz, Herschel

    2006-05-15

    A quantum control landscape is defined by the value of a physical observable as a functional of the time-dependent control field E(t) for a given quantum-mechanical system. Level sets through this landscape are prescribed by a particular value of the target observable at the final dynamical time T, regardless of the intervening dynamics. We present a technique for exploring a landscape level set, where a scalar variable s is introduced to characterize trajectories along these level sets. The control fields E(s,t) accomplishing this exploration (i.e., that produce the same value of the target observable for a given system) are determined by solving a differential equation over s in conjunction with the time-dependent Schroedinger equation. There is full freedom to traverse a level set, and a particular trajectory is realized by making an a priori choice for a continuous function f(s,t) that appears in the differential equation for the control field. The continuous function f(s,t) can assume an arbitrary form, and thus a level set generally contains a family of controls, where each control takes the quantum system to the same final target value, but produces a distinct control mechanism. In addition, although the observable value remains invariant over the level set, other dynamical properties (e.g., the degree of robustness to control noise) are not specifically preserved and can vary greatly. Examples are presented to illustrate the continuous nature of level-set controls and their associated induced dynamical features, including continuously morphing mechanisms for population control in model quantum systems.
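The traversal idea, moving the control while removing the component of motion that would change the observable, can be illustrated on a toy landscape in a finite-dimensional control space. This is a first-order sketch under our own assumptions (a simple algebraic observable, Euler stepping), not the paper's method for the full time-dependent dynamics; the function f plays the role of the free exploration direction f(s,t).

```python
import numpy as np

def traverse_level_set(E0, grad_J, f, ds=0.01, steps=100):
    """Move a control vector E along a level set of an observable J:
    at each step, project an exploration direction f(s, E) onto the
    tangent space (orthogonal complement of grad J) so J stays fixed."""
    E = E0.copy()
    for k in range(steps):
        g = grad_J(E)
        d = f(k * ds, E)
        d = d - g * (g @ d) / (g @ g)   # remove the component that changes J
        E = E + ds * d
    return E

# Toy landscape: J(E) = |E|^2, whose level sets are spheres.
J = lambda E: E @ E
grad_J = lambda E: 2.0 * E
f = lambda s, E: np.ones_like(E)        # an arbitrary choice of exploration direction
E0 = np.array([1.0, 0.0, 0.0])
E1 = traverse_level_set(E0, grad_J, f)
```

Different choices of f trace out different trajectories on the same level set, mirroring the abstract's point that a level set contains a family of controls producing the same observable value through distinct mechanisms.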

  2. An Expanded Theoretical Framework of Care Coordination Across Transitions in Care Settings.

    PubMed

    Radwin, Laurel E; Castonguay, Denise; Keenan, Carolyn B; Hermann, Cherice

    2016-01-01

    For many patients, high-quality, patient-centered, and cost-effective health care requires coordination among multiple clinicians and settings. Ensuring optimal care coordination requires a clear understanding of how clinician activities and continuity during transitions affect patient-centeredness and quality outcomes. This article describes an expanded theoretical framework to better understand care coordination. The framework provides clear articulation of concepts. Examples are provided of ways to measure the concepts. PMID:26595361

  3. A Conceptual Framework for a Psychometric Theory for Standard Setting with Examples of Its Use for Evaluating the Functioning of Two Standard Setting Methods

    ERIC Educational Resources Information Center

    Reckase, Mark D.

    2006-01-01

    A conceptual framework is proposed for a psychometric theory of standard setting. The framework suggests that participants in a standard setting process (panelists) develop an internal, intended standard as a result of training and the participant's background. The goal of a standard setting process is to convert panelists' intended standards to…

  4. The exchange boundary framework: understanding the evolution of power within collaborative decision-making settings.

    PubMed

    Watson, Erin R; Foster-Fishman, Pennie G

    2013-03-01

    Many community decision-making bodies encounter challenges in creating conditions where stakeholders from disadvantaged populations can authentically participate in ways that give them actual influence over decisions affecting their lives (Foster-Fishman et al., Lessons for the journey: Strategies and suggestions for guiding planning, governance, and sustainability in comprehensive community initiatives. W.K. Kellogg Foundation, Battle Creek, MI, 2004). These challenges are often rooted in asymmetrical power dynamics operating within the settings (Prilleltensky, J Commun Psychol 36:116-136, 2008). In response, this paper presents the Exchange Boundary Framework, a new approach for understanding and promoting authentic, empowered participation within collaborative decision-making settings. The framework expands upon theories currently used in the field of community psychology by focusing on the underlying processes through which power operates in relationships and examining the evolution of power dynamics over time. By integrating concepts from social exchange theory (Emerson, Am Soc Rev 27:31-41, 1962) and social boundaries theory (Hayward, Polity 31(1):1-22, 1998), the framework situates power within parallel processes of resource exchange and social regulation. The framework can be used to understand the conditions leading to power asymmetries within collaborative decision-making processes, and to guide efforts to promote more equitable and authentic participation by all stakeholders within these settings. In this paper we describe the Exchange Boundary Framework, apply it to three distinct case studies, and discuss key considerations for its application within collaborative community settings. PMID:22760794

  5. A variational level set approach to multiphase motion

    SciTech Connect

    Zhao, Hong-Kai; Chan, T.; Merriman, B.; Osher, S.

    1996-08-01

    A coupled level set method for the motion of multiple junctions (of, e.g., solid, liquid, and grain boundaries), which follows the gradient flow for an energy functional consisting of surface tension (proportional to length) and bulk energies (proportional to area), is developed. The approach combines the level set method of S. Osher and J. A. Sethian with a theoretical variational formulation of the motion by F. Reitich and H. M. Soner. The resulting method uses as many level set functions as there are regions, and the energy functional is evaluated entirely in terms of level set functions. The gradient projection method leads to a coupled system of perturbed (by curvature terms) Hamilton-Jacobi equations. The coupling is enforced using a single Lagrange multiplier associated with a constraint which essentially prevents (a) regions from overlapping and (b) the development of a vacuum. The numerical implementation is relatively simple and the results agree with (and go beyond) the theory as given in. Other applications of this methodology, including the decomposition of a domain into subregions with minimal interface length, are discussed. Finally, some new techniques and results in level set methodology are presented. 18 refs., 10 figs.

  6. A PDE-Based Fast Local Level Set Method

    NASA Astrophysics Data System (ADS)

    Peng, Danping; Merriman, Barry; Osher, Stanley; Zhao, Hongkai; Kang, Myungjoo

    1999-11-01

    We develop a fast method to localize the level set method of Osher and Sethian (1988, J. Comput. Phys. 79, 12) and address two important issues that are intrinsic to the level set method: (a) how to extend a quantity that is given only on the interface to a neighborhood of the interface; (b) how to reset the level set function to be a signed distance function to the interface efficiently without appreciably moving the interface. This fast local level set method reduces the computational effort by one order of magnitude, works in as much generality as the original one, and is conceptually simple and easy to implement. Our approach differs from previous related works in that we extract all the information needed from the level set function (or functions in multiphase flow) and do not need to find explicitly the location of the interface in the space domain. The complexity of our method to do tasks such as extension and distance reinitialization is O(N), where N is the number of points in space, not O(N log N) as in works by Sethian (1996, Proc. Nat. Acad. Sci. 93, 1591) and Helmsen and co-workers (1996, SPIE Microlithography IX, p. 253). This complexity estimation is also valid for quite general geometrically based front motion for our localized method.
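
    The reinitialization task described here, resetting the level set to a signed distance function without appreciably moving the interface, can be approximated in a few lines. The sketch below uses SciPy's Euclidean distance transform rather than the paper's PDE-based approach, and preserves the zero level set only to grid accuracy:

```python
import numpy as np
from scipy import ndimage

def reinitialize(phi, dx=1.0):
    """Reset phi to an approximate signed distance function to its zero
    level set, via distance transforms of the inside/outside masks
    (a common shortcut; the paper itself uses a PDE-based method)."""
    inside = phi < 0
    d_out = ndimage.distance_transform_edt(~inside) * dx  # >0 outside
    d_in = ndimage.distance_transform_edt(inside) * dx    # >0 inside
    return d_out - d_in

# Example: a badly scaled level set for a circle of radius 10.
x, y = np.meshgrid(np.arange(64), np.arange(64))
phi0 = 5.0 * (np.sqrt((x - 32.0)**2 + (y - 32.0)**2) - 10.0)  # |grad| = 5
phi = reinitialize(phi0)  # |grad| ~ 1, same zero level set
```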

  7. Public health and health promotion capacity at national and regional level: a review of conceptual frameworks.

    PubMed

    Aluttis, Christoph; den Broucke, Stephan Van; Chiotan, Cristina; Costongs, Caroline; Michelsen, Kai; Brand, Helmut

    2014-03-26

    The concept of capacity building for public health has gained much attention during the last decade. National as well as international organizations increasingly focus their efforts on capacity building to improve performance in the health sector. During the past two decades, a variety of conceptual frameworks have been developed which describe relevant dimensions for public health capacity. Notably, these frameworks differ in design and conceptualization. This paper therefore reviews the existing conceptual frameworks and integrates them into one framework, which contains the most relevant dimensions for public health capacity at the country or regional level. A comprehensive literature search was performed to identify frameworks addressing public health capacity building at the national or regional level. We content-analysed these frameworks to identify the core dimensions of public health capacity. The dimensions were subsequently synthesized into a set of thematic areas to construct a conceptual framework which describes the most relevant dimensions for capacities at the national or regional level. The systematic review resulted in the identification of seven core domains for public health capacity: resources, organizational structures, workforce, partnerships, leadership and governance, knowledge development, and country-specific context. Accordingly, these dimensions were used to construct a framework which describes these core domains in more detail. Our research shows that although there is no generally agreed-upon model of public health capacity, a number of key domains for public health and health promotion capacity recur consistently in existing frameworks, regardless of their geographical location or thematic area. As little work on the core concepts of public health capacities has yet taken place, this study adds value to the discourse by identifying these consistencies across existing frameworks and by synthesising them into a new

  8. Public Health and Health Promotion Capacity at National and Regional Level: A Review of Conceptual Frameworks

    PubMed Central

    Aluttis, Christoph; den Broucke, Stephan Van; Chiotan, Cristina; Costongs, Caroline; Michelsen, Kai; Brand, Helmut

    2014-01-01

    The concept of capacity building for public health has gained much attention during the last decade. National as well as international organizations increasingly focus their efforts on capacity building to improve performance in the health sector. During the past two decades, a variety of conceptual frameworks have been developed which describe relevant dimensions for public health capacity. Notably, these frameworks differ in design and conceptualization. This paper therefore reviews the existing conceptual frameworks and integrates them into one framework, which contains the most relevant dimensions for public health capacity at the country or regional level. A comprehensive literature search was performed to identify frameworks addressing public health capacity building at the national or regional level. We content-analysed these frameworks to identify the core dimensions of public health capacity. The dimensions were subsequently synthesized into a set of thematic areas to construct a conceptual framework which describes the most relevant dimensions for capacities at the national or regional level. The systematic review resulted in the identification of seven core domains for public health capacity: resources, organizational structures, workforce, partnerships, leadership and governance, knowledge development, and country-specific context. Accordingly, these dimensions were used to construct a framework which describes these core domains in more detail. Our research shows that although there is no generally agreed-upon model of public health capacity, a number of key domains for public health and health promotion capacity recur consistently in existing frameworks, regardless of their geographical location or thematic area. As little work on the core concepts of public health capacities has yet taken place, this study adds value to the discourse by identifying these consistencies across existing frameworks and by synthesising them into a new

  9. Total variation and level set methods in image science

    NASA Astrophysics Data System (ADS)

    Tsai, Yen-Hsi Richard; Osher, Stanley

    We review level set methods and the related techniques that are common in many PDE-based image models. Many of these techniques involve minimizing the total variation of the solution and admit regularizations on the curvature of its level sets. We examine the scope of these techniques in image science, in particular in image segmentation, interpolation, and decomposition, and introduce some relevant level set techniques that are useful for this class of applications. Many of the standard problems are formulated as variational models. We observe an increasing synergistic progression of new tools and ideas between the inverse problem community and the `imagers'. We show that image science demands multi-disciplinary knowledge and flexible, but still robust methods. That is why the level set method and total variation methods have become thriving techniques in this field. Our goal is to survey recently developed techniques in various fields of research that are relevant to diverse objectives in image science. We begin by reviewing some typical PDE-based applications in image processing. In typical PDE methods, images are assumed to be continuous functions sampled on a grid. We will show that these methods all share a common feature, which is the emphasis on processing the level lines of the underlying image. The importance of level lines has been known for some time. See, e.g., Alvarez, Guichard, Morel and Lions (1993). This feature places our slightly general definition of the level set method for image science in context. In Section 2 we describe the building blocks of a typical level set method in the continuum setting. Each important task that we need to do is formulated as the solution to certain PDEs. Then, in Section 3, we briefly describe the finite difference methods developed to construct approximate solutions to these PDEs. Some approaches to interpolation into small subdomains of an image are reviewed in Section 4. In Section 5 we describe the Chan

  10. Level-Set Topology Optimization with Aeroelastic Constraints

    NASA Technical Reports Server (NTRS)

    Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia

    2015-01-01

    Level-set topology optimization is used to design a wing considering skin buckling under static aeroelastic trim loading, as well as dynamic aeroelastic stability (flutter). The level-set function is defined over the entire 3D volume of a transport aircraft wing box. Therefore, the approach is not limited by any predefined structure and can explore novel configurations. The Sequential Linear Programming (SLP) level-set method is used to solve the constrained optimization problems. The proposed method is demonstrated using three problems with mass, linear buckling and flutter objective and/or constraints. A constraint aggregation method is used to handle multiple buckling constraints in the wing skins. A continuous flutter constraint formulation is used to handle difficulties arising from discontinuities in the design space caused by a switching of the critical flutter mode.
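
    A standard way to realize the constraint aggregation mentioned here is the Kreisselmeier-Steinhauser (KS) function, which collapses many constraints g_i <= 0 into one smooth, conservative constraint; whether this matches the authors' exact aggregation is an assumption. A minimal sketch:

```python
import numpy as np

def ks_aggregate(g, rho=50.0):
    """Kreisselmeier-Steinhauser aggregation of constraints g_i <= 0 into a
    single smooth constraint. The result upper-bounds max(g) and approaches
    it as rho grows; the max-shift keeps the exponentials from overflowing."""
    gmax = np.max(g)
    return gmax + np.log(np.sum(np.exp(rho * (g - gmax)))) / rho

# Example: three buckling margins, all satisfied (negative).
g = np.array([-0.5, -0.2, -0.1])
ks = ks_aggregate(g, rho=50.0)  # slightly above max(g) = -0.1
```

Larger `rho` tracks the true maximum more tightly but makes the aggregate stiffer to optimize, which is the usual trade-off when a gradient-based (e.g., SLP) optimizer consumes the aggregated constraint.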

  11. The adoption of the Reference Framework for diabetes care among primary care physicians in primary care settings

    PubMed Central

    Wong, Martin C.S.; Wang, Harry H.X.; Kwan, Mandy W.M.; Chan, Wai Man; Fan, Carmen K.M.; Liang, Miaoyin; Li, Shannon TS; Fung, Franklin D.H.; Yeung, Ming Sze; Chan, David K.L.; Griffiths, Sian M.

    2016-01-01

    The prevalence of diabetes mellitus has been increasing both globally and locally. Primary care physicians (PCPs) are in a privileged position to provide first contact and continuing care for diabetic patients. A territory-wide Reference Framework for Diabetes Care for Adults was released by the Hong Kong Primary Care Office in 2010, with the aim to further enhance evidence-based and high quality care for diabetes in the primary care setting through wide adoption of the Reference Framework. A valid questionnaire survey was conducted among PCPs to evaluate the levels of, and the factors associated with, their adoption of the Reference Framework. A total of 414 completed surveys were received with a response rate of 13.0%. The average adoption score was 3.29 (SD 0.51) out of 4. Approximately 70% of PCPs highly adopted the Reference Framework in their routine practice. Binary logistic regression analysis showed that the PCPs' perceptions of the inclusion of sufficient local information (adjusted odds ratio [aOR] = 4.748, 95%CI 1.597–14.115, P = 0.005) and reduction of professional autonomy of PCPs (aOR = 1.859, 95%CI 1.013–3.411, P = 0.045) were more likely to influence their adoption level of the Reference Framework for diabetes care in daily practice. The overall level of guideline adoption was found to be relatively high among PCPs for adult diabetes in primary care settings. The adoption barriers identified in this study should be addressed in the continuous updating of the Reference Framework. Strategies need to be considered to enhance the guideline adoption and implementation capacity. PMID:27495018

  12. Bi-directional evolutionary level set method for topology optimization

    NASA Astrophysics Data System (ADS)

    Zhu, Benliang; Zhang, Xianmin; Fatikow, Sergej; Wang, Nianfeng

    2015-03-01

    A bi-directional evolutionary level set method for solving topology optimization problems is presented in this article. The proposed method has three main advantages over the standard level set method. First, new holes can be automatically generated in the design domain during the optimization process. Second, the dependency of the obtained optimized configurations upon the initial configurations is eliminated. Optimized configurations can be obtained even when starting from a minimal initial guess. Third, the method can be easily implemented and is computationally more efficient. The validity of the proposed method is tested on the mean compliance minimization problem and the compliant mechanisms topology optimization problem.

  13. Set-membership identification and fault detection using a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Fernández-Cantí, Rosa M.; Blesa, Joaquim; Puig, Vicenç; Tornil-Sin, Sebastian

    2016-05-01

    This paper deals with the problem of set-membership identification and fault detection using a Bayesian framework. The paper presents how the set-membership model estimation problem can be reformulated from the Bayesian viewpoint in order to, first, determine the feasible parameter set in the identification stage and, second, check the consistency between the measurement data and the model in the fault-detection stage. The paper shows that, assuming uniformly distributed measurement noise and uniform model prior probability distributions, the Bayesian approach leads to the same feasible parameter set as the well-known set-membership technique based on approximating the feasible parameter set using sets. Additionally, it can deal with models that are nonlinear in the parameters. The single-output and multiple-output cases are addressed as well. The procedure and results are illustrated by means of the application to a quadruple-tank process.
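
    The equivalence between the uniform-noise Bayesian posterior and the feasible parameter set is easiest to see in a toy example: a parameter is feasible exactly when every residual respects the noise bound, which is exactly where a uniform-noise likelihood is nonzero. The model, bounds, and data below are invented for illustration:

```python
import numpy as np

# Toy linear-in-parameters model y = a*x + b with bounded noise |e_k| <= eps.
rng = np.random.default_rng(0)
a_true, b_true, eps = 2.0, 1.0, 0.1
x = np.linspace(0.0, 1.0, 20)
y = a_true * x + b_true + rng.uniform(-eps, eps, x.size)

def feasible(a, b):
    """Membership test for the feasible parameter set: every residual must
    lie within the noise bound -- the support of the uniform-noise posterior."""
    return bool(np.all(np.abs(y - (a * x + b)) <= eps))
```

In the fault-detection stage the same test is rerun on new data: an empty feasible set (no parameter passes) signals inconsistency between measurements and model.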

  14. The ICF: A Framework for Setting Goals for Children with Speech Impairment

    ERIC Educational Resources Information Center

    McLeod, Sharynne; Bleile, Ken

    2004-01-01

    The International Classification of Functioning, Disability and Health (ICF) (World Health Organization, 2001) is proposed as a framework for integrative goal setting for children with speech impairment. The ICF incorporates both impairment and social factors to consider when selecting appropriate goals to bring about change in the lives of…

  15. Counselors' Job Satisfaction across Education Levels, Settings, and Specialties

    ERIC Educational Resources Information Center

    Gambrell, Crista E.

    2010-01-01

    This study examined counselor satisfaction across education levels (Masters and Doctorate), work settings (private practice and institutions), and specializations (mental health counselors, school counselors, counselor educators, and creative arts/other counselors). Counseling professionals were surveyed across these variables to…

  16. Geologic setting of the low-level burial grounds

    SciTech Connect

    Lindsey, K.A.; Jaeger, G.K.; Slate, J.L.; Swett, K.J.; Mercer, R.B.

    1994-10-13

    This report describes the regional and site-specific geology of the Hanford Site's low-level burial grounds in the 200 East and West Areas. The report incorporates data from boreholes across the entire 200 Areas, integrating the geology of this area into a single framework. Geologic cross-sections, isopach maps, and structure contour maps of all major geological units from the top of the Columbia River Basalt Group to the surface are included. The physical properties and characteristics of the major suprabasalt sedimentary units also are discussed.

  17. Developing a pressure ulcer risk factor minimum data set and risk assessment framework

    PubMed Central

    Coleman, Susanne; Nelson, E Andrea; Keen, Justin; Wilson, Lyn; McGinnis, Elizabeth; Dealey, Carol; Stubbs, Nikki; Muir, Delia; Farrin, Amanda; Dowding, Dawn; Schols, Jos MGA; Cuddigan, Janet; Berlowitz, Dan; Jude, Edward; Vowden, Peter; Bader, Dan L; Gefen, Amit; Oomens, Cees WJ; Schoonhoven, Lisette; Nixon, Jane

    2014-01-01

    Aim To agree a draft pressure ulcer risk factor Minimum Data Set to underpin the development of a new evidenced-based Risk Assessment Framework. Background A recent systematic review identified the need for a pressure ulcer risk factor Minimum Data Set and development and validation of an evidenced-based pressure ulcer Risk Assessment Framework. This was undertaken through the Pressure UlceR Programme Of reSEarch (RP-PG-0407-10056), funded by the National Institute for Health Research and incorporates five phases. This article reports phase two, a consensus study. Design Consensus study. Method A modified nominal group technique based on the Research and Development/University of California at Los Angeles appropriateness method. This incorporated an expert group, review of the evidence and the views of a Patient and Public Involvement service user group. Data were collected December 2010–December 2011. Findings The risk factors and assessment items of the Minimum Data Set (including immobility, pressure ulcer and skin status, perfusion, diabetes, skin moisture, sensory perception and nutrition) were agreed. In addition, a draft Risk Assessment Framework incorporating all Minimum Data Set items was developed, comprising a two stage assessment process (screening and detailed full assessment) and decision pathways. Conclusion The draft Risk Assessment Framework will undergo further design and pre-testing with clinical nurses to assess and improve its usability. It will then be evaluated in clinical practice to assess its validity and reliability. The Minimum Data Set could be used in future for large scale risk factor studies informing refinement of the Risk Assessment Framework. PMID:24845398

  18. High-fidelity interface tracking in compressible flows: Unlimited anchored adaptive level set

    NASA Astrophysics Data System (ADS)

    Nourgaliev, R. R.; Theofanous, T. G.

    2007-06-01

    The interface-capturing-fidelity issue of the level set method is addressed wholly within the Eulerian framework. Our aim is for a practical and efficient way to realize the expected benefits of grid resolution and high order schemes. Based on a combination of structured adaptive mesh refinement (SAMR), rather than quad/octrees, and on high-order spatial discretization, rather than the use of Lagrangian particles, our method is tailored to compressible flows, while it provides a potentially useful alternative to the particle level set (PLS) for incompressible flows. Interesting salient features of our method include (a) avoidance of limiting (in treating the Hamiltonian of the level set equation), (b) anchoring the level set in a manner that ensures no drift and no spurious oscillations of the zero level during PDE-reinitialization, and (c) a non-linear tagging procedure for defining the neighborhood of the interface subject to mesh refinement. Numerous computational results on a set of benchmark problems (strongly deforming, stretching and tearing interfaces) demonstrate that with this approach, implemented up to 11th order accuracy, the level set method becomes essentially free of mass conservation errors and also free of parasitic interfacial oscillations, while it is still highly efficient, and convenient for 3D parallel implementation. In addition, demonstration of performance in fully-coupled simulations is presented for multimode Rayleigh-Taylor instability (low-Mach number regime) and shock-induced, bubble-collapse (highly compressible regime).

  19. A Framework for Translating a High Level Security Policy into Low Level Security Mechanisms

    NASA Astrophysics Data System (ADS)

    Hassan, Ahmed A.; Bahgat, Waleed M.

    2010-01-01

    Security policies have different components; firewall, active directory, and IDS are some examples of these components. Enforcement of network security policies to low level security mechanisms faces some essential difficulties. Consistency, verification, and maintenance are the major ones of these difficulties. One approach to overcome these difficulties is to automate the process of translation of high level security policy into low level security mechanisms. This paper introduces a framework of an automation process that translates a high level security policy into low level security mechanisms. The framework is described in terms of three phases; in the first phase all network assets are categorized according to their roles in the network security and relations between them are identified to constitute the network security model. This proposed model is based on organization based access control (OrBAC). However, the proposed model extends the OrBAC model to include not only access control policy but also some other administrative security policies like auditing policy. In addition, the proposed model enables matching of each rule of the high level security policy with the corresponding ones of the low level security policy. Through the second phase of the proposed framework, the high level security policy is mapped into the network security model. The second phase could be considered as a translation of the high level security policy into an intermediate model level. Finally, the intermediate model level is translated automatically into low level security mechanisms. The paper illustrates the applicability of proposed approach through an application example.
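
    The translation idea can be caricatured in a few lines: a role-based high-level rule is resolved against a network security model and emitted as low-level filter entries. All names, the rule format, and the output syntax below are invented for illustration; the paper's OrBAC-based model is far richer:

```python
# Hypothetical network security model: roles mapped to concrete assets.
network_model = {
    "web_servers": ["10.0.1.10", "10.0.1.11"],
    "admins": ["10.0.9.5"],
}

def translate(rule):
    """Expand one high-level, role-based rule into low-level filter lines
    (an invented ACL-like syntax standing in for a real firewall config)."""
    lines = []
    for src in network_model[rule["from_role"]]:
        for dst in network_model[rule["to_role"]]:
            lines.append(f"permit tcp {src} -> {dst}:{rule['port']}")
    return lines

rules = translate({"from_role": "admins", "to_role": "web_servers", "port": 22})
```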

  20. Settings for health promotion: an analytic framework to guide intervention design and implementation.

    PubMed

    Poland, Blake; Krupa, Gene; McCall, Douglas

    2009-10-01

    Taking a settings approach to health promotion means addressing the contexts within which people live, work, and play and making these the object of inquiry and intervention as well as the needs and capacities of people to be found in different settings. This approach can increase the likelihood of success because it offers opportunities to situate practice in its context. Members of the setting can optimize interventions for specific contextual contingencies, target crucial factors in the organizational context influencing behavior, and render settings themselves more health promoting. A number of attempts have been made to systematize evidence regarding the effectiveness of interventions in different types of settings (e.g., school-based health promotion, community development). Few, if any, attempts have been made to systematically develop a template or framework for analyzing those features of settings that should influence intervention design and delivery. This article lays out the core elements of such a framework in the form of a nested series of questions to guide analysis. Furthermore, it offers advice on additional considerations that should be taken into account when operationalizing a settings approach in the field. PMID:19809004

  1. Conceptual framework for indexing visual information at multiple levels

    NASA Astrophysics Data System (ADS)

    Jaimes, Alejandro; Chang, Shih-Fu

    1999-12-01

    In this paper, we present a conceptual framework for indexing different aspects of visual information. Our framework unifies concepts from the literature in diverse fields such as cognitive psychology, library sciences, art, and the more recent content-based retrieval. We present multi-level structures for visual and non-visual information. The ten-level visual structure presented provides a systematic way of indexing images based on syntax and semantics, and includes distinctions between general concepts and visual concepts. We define different types of relations at different levels of the visual structure, and also use a semantic information table to summarize important aspects related to an image. While the focus is on the development of a conceptual indexing structure, our aim is also to bring together the knowledge from various fields, unifying the issues that should be considered when building a digital image library. Our analysis stresses the limitations of state-of-the-art content-based retrieval systems and suggests areas in which improvements are necessary.

  2. Variational and Shape Prior-based Level Set Model for Image Segmentation

    SciTech Connect

    Diop, El Hadji S.; Jerbi, Taha; Burdin, Valerie

    2010-09-30

    A new image segmentation model based on the level sets approach is presented herein. We deal with radiographic medical images where boundaries are not salient, and objects of interest have the same gray level as other structures in the image. Thus, a priori information about the shape we look for is integrated in the level set evolution for good segmentation results. The proposed model also includes a penalization term that forces the level set to be close to a signed distance function (SDF), which then avoids the re-initialization procedure. In addition, a variant and complete Mumford-Shah model is used in our functional; the added Hausdorff measure helps to better handle zones where boundaries are occluded or not salient. Finally, a weighted area term is added to the functional to make the level set drive rapidly toward object boundaries. The segmentation model is formulated in a variational framework, which, thanks to calculus of variations, yields partial differential equations (PDEs) to guide the level set evolution. Results obtained on both synthetic images and digitally reconstructed radiographs (DRR) show that the proposed model improves on existing prior- and non-prior-shape-based image segmentation.
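
    The penalization term that keeps the level set close to a signed distance function is typically the integral of (|∇φ| − 1)²/2, the generic form of the re-initialization-avoiding term described here; the exact weighting used in this paper is an assumption. A minimal sketch:

```python
import numpy as np

def sdf_penalty(phi, dx=1.0):
    """Deviation of phi from a signed distance function:
    0.5 * sum over the grid of (|grad phi| - 1)^2, by central differences.
    Zero exactly when |grad phi| = 1 everywhere."""
    gy = np.gradient(phi, dx, axis=0)
    gx = np.gradient(phi, dx, axis=1)
    gmag = np.sqrt(gx**2 + gy**2)
    return 0.5 * np.sum((gmag - 1.0)**2) * dx * dx

# A plane with unit slope is already a signed distance function: penalty = 0.
x, y = np.meshgrid(np.arange(32, dtype=float), np.arange(32, dtype=float))
plane = x - 10.0
```

Adding the gradient flow of this penalty to the evolution keeps the level set well conditioned, which is what removes the need for periodic re-initialization.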

  3. Improvements to Level Set, Immersed Boundary methods for Interface Tracking

    NASA Astrophysics Data System (ADS)

    Vogl, Chris; Leveque, Randy

    2014-11-01

    It is not uncommon to find oneself solving a moving boundary problem under flow in the context of some application. Of particular interest is when the moving boundary exerts a curvature-dependent force on the liquid. Such a force arises when observing a boundary that is resistant to bending or has surface tension. Numerically speaking, stable numerical computation of the curvature can be difficult as it is often described in terms of high-order derivatives of either marker particle positions or of a level set function. To address this issue, the level set method is modified to track not only the position of the boundary, but the curvature as well. The definition of the signed-distance function that is used to modify the level set method is also used to develop an interpolation-free, closest-point method. These improvements are used to simulate a bending-resistant, inextensible boundary under shear flow to highlight area and volume conservation, as well as stable curvature calculation. Funded by an NSF MSPRF grant.
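
    For reference, the curvature such methods must compute is κ = div(∇φ/|∇φ|), which involves second derivatives of the level set function. A naive central-difference implementation, representative of the baseline these improvements address rather than the authors' method, looks like:

```python
import numpy as np

def curvature(phi, dx=1.0):
    """kappa = div(grad phi / |grad phi|) by central differences.
    For a signed distance function to a circle of radius r, kappa = 1/r
    on the interface."""
    gy = np.gradient(phi, dx, axis=0)
    gx = np.gradient(phi, dx, axis=1)
    gmag = np.sqrt(gx**2 + gy**2) + 1e-12  # avoid division by zero
    return np.gradient(gx / gmag, dx, axis=1) + np.gradient(gy / gmag, dx, axis=0)

# Signed distance to a circle of radius 20: curvature on the interface ~ 1/20.
x, y = np.meshgrid(np.arange(128, dtype=float), np.arange(128, dtype=float))
phi = np.sqrt((x - 64.0)**2 + (y - 64.0)**2) - 20.0
kappa = curvature(phi)
```

The division by |∇φ| and the stacked differencing are what amplify noise when φ drifts from a signed distance function, motivating curvature tracking as a separate evolved quantity.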

  4. Priority-setting in healthcare: a framework for reasonable clinical judgements.

    PubMed

    Baerøe, K

    2009-08-01

    What are the criteria for reasonable clinical judgements? The reasonableness of macro-level decision-making has been much discussed, but little attention has been paid to the reasonableness of applying guidelines generated at a macro-level to individual cases. This paper considers a framework for reasonable clinical decision-making that will capture cases where relevant guidelines cannot reasonably be followed. There are three main sections. (1) Individual claims on healthcare from the point of view of concerns about equity are analysed. (2) The demands of responsibility and equity on professional clinical performance are discussed, and how the combination of these demands emerges into seven requirements that constitute the framework is explored. Since this framework is developed to assist in reasonable clinical decision-making, practical implications of all these requirements are also suggested. (3) Challenges concerning the framework are discussed. First, a crucial presumption that the framework relies upon is considered, namely clinicians' willingness to justify their decisions as requested. Then how public deliberation may influence clinical decision-making is discussed. Next is a consideration of how clinicians' need to have confidence in their own judgements in order to perform in a manner worthy of trust would be compatible with adherence to the framework supported by public deliberation. It is concluded that fair distribution in the interplay between macro- and micro-level considerations can be secured by legitimising procedures on each level, by ensuring well-organised and continuing public debate and by basing individual clinical judgements upon well-justified and principled normative bases. PMID:19644007

  5. A linear optimal transportation framework for quantifying and visualizing variations in sets of images

    PubMed Central

    Wang, Wei; Slepčev, Dejan; Basu, Saurav; Ozolek, John A.

    2012-01-01

    Transportation-based metrics for comparing images have long been applied to analyze images, especially where one can interpret the pixel intensities (or derived quantities) as a distribution of ‘mass’ that can be transported without strict geometric constraints. Here we describe a new transportation-based framework for analyzing sets of images. More specifically, we describe a new transportation-related distance between pairs of images, which we denote as linear optimal transportation (LOT). The LOT can be used directly on pixel intensities, and is based on a linearized version of the Kantorovich-Wasserstein metric (an optimal transportation distance, as is the earth mover’s distance). The new framework is especially well suited for computing all pairwise distances for a large database of images efficiently, and thus it can be used for pattern recognition in sets of images. In addition, the new LOT framework also allows for an isometric linear embedding, greatly facilitating the ability to visualize discriminant information in different classes of images. We demonstrate the application of the framework to several tasks such as discriminating nuclear chromatin patterns in cancer cells, decoding differences in facial expressions, galaxy morphologies, as well as subcellular protein distributions. PMID:23729991
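
    The transport metric underlying LOT is easiest to see in one dimension, where the Wasserstein-1 (earth mover's) distance reduces to an integral of the CDF difference. The sketch below illustrates only this underlying metric, not the paper's linearized, multi-dimensional LOT embedding:

```python
import numpy as np

def w1_1d(p, q, positions):
    """1-D Wasserstein-1 distance between two mass distributions sampled on
    a common sorted grid, via the CDF formula W1 = integral |P(x) - Q(x)| dx."""
    p = p / p.sum()  # normalize to unit mass
    q = q / q.sum()
    cdf_diff = np.cumsum(p - q)
    widths = np.diff(positions)
    return np.sum(np.abs(cdf_diff[:-1]) * widths)

# Moving a unit of mass from position 0 to position 3 costs 3.
p = np.array([1.0, 0.0, 0.0, 0.0])
q = np.array([0.0, 0.0, 0.0, 1.0])
pos = np.array([0.0, 1.0, 2.0, 3.0])
d = w1_1d(p, q, pos)  # -> 3.0
```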

  6. A level set method for materials with texturally equilibrated pores

    NASA Astrophysics Data System (ADS)

    Ghanbarzadeh, Soheil; Hesse, Marc A.; Prodanović, Maša

    2015-09-01

    Textural equilibrium controls the distribution of the liquid phase in many naturally occurring porous materials such as partially molten rocks and alloys, salt-brine and ice-water systems. In these materials, pore geometry evolves to minimize the solid-liquid interfacial energy while maintaining a constant dihedral angle, θ, at solid-liquid contact lines. We present a level set method to compute an implicit representation of the liquid-solid interface in textural equilibrium with space-filling tessellations of multiple solid grains in three dimensions. Each grain is represented by a separate level set function and interfacial energy minimization is achieved by evolving the solid-liquid interface under surface diffusion to a constant mean curvature surface. The liquid volume and dihedral angle constraints are added to the formulation using virtual convective and normal velocity terms. This results in an initial value problem for a system of non-linear coupled PDEs governing the evolution of the level sets for each grain, using the implicit representation of the solid grains as initial condition. A domain decomposition scheme is devised to restrict the computational domain of each grain to a few grid points around the grain. The coupling between the interfaces is achieved at a higher level on the original computational domain. The spatial resolution of the discretization is improved through high-order spatial differentiation schemes and localization of computations through domain decomposition. Examples of three-dimensional solutions are also obtained for different grain distribution networks that illustrate the geometric flexibility of the method.

  7. Coupled level set segmentation using a point-based statistical shape model relying on correspondence probabilities

    NASA Astrophysics Data System (ADS)

    Hufnagel, Heike; Ehrhardt, Jan; Pennec, Xavier; Schmidt-Richberg, Alexander; Handels, Heinz

    2010-03-01

    In this article, we propose a unified statistical framework for image segmentation with shape prior information. The approach combines an explicitly parameterized point-based probabilistic statistical shape model (SSM) with a segmentation contour which is implicitly represented by the zero level set of a higher dimensional surface. These two aspects are unified in a Maximum a Posteriori (MAP) estimation where the level set is evolved to converge towards the boundary of the organ to be segmented based on the image information while taking into account the prior given by the SSM information. The optimization of the energy functional obtained by the MAP formulation leads to an alternate update of the level set and an update of the fitting of the SSM. We then adapt the probabilistic SSM for multi-shape modeling and extend the approach to multiple-structure segmentation by introducing a level set function for each structure. During segmentation, the evolution of the different level set functions is coupled by the multi-shape SSM. First experimental evaluations indicate that our method is well suited for the segmentation of topologically complex, non-spherical and multiple-structure shapes. We demonstrate the effectiveness of the method by experiments on kidney segmentation as well as on hip joint segmentation in CT images.

  8. Level set method for image segmentation based on moment competition

    NASA Astrophysics Data System (ADS)

    Min, Hai; Wang, Xiao-Feng; Huang, De-Shuang; Jin, Jing; Wang, Hong-Zhi; Li, Hai

    2015-05-01

    We propose a level set method for image segmentation which introduces moment competition and weakly supervised information into the construction of the energy functional. Unlike region-based level set methods that use force competition, moment competition is adopted to drive the contour evolution. Here, a so-called three-point labeling scheme is proposed to manually label three independent points (the weakly supervised information) on the image. The intensity differences between the three points and the unlabeled pixels are then used to construct a force arm for each image pixel. The corresponding force is generated from the global statistical information of a region-based method and weighted by the force arm. As a result, a moment can be constructed and incorporated into the energy functional to drive the evolving contour toward the object boundary. In our method, the force arm takes full advantage of the three-point labeling scheme to constrain the moment competition. Additionally, the global statistical information and the weakly supervised information are successfully integrated, which makes the proposed method more robust than traditional methods with respect to initial contour placement and parameter setting. Experimental results with performance analysis also show the superiority of the proposed method in segmenting different types of complicated images, such as noisy images, three-phase images, images with intensity inhomogeneity, and texture images.

  9. Toward automatic computer aided dental X-ray analysis using level set method.

    PubMed

    Li, Shuo; Fevens, Thomas; Krzyzak, Adam; Jin, Chao; Li, Song

    2005-01-01

    A Computer Aided Dental X-rays Analysis (CADXA) framework is proposed to semi-automatically detect areas of bone loss and root decay in digital dental X-rays. In this framework, a novel competitive coupled level set method is first proposed to segment the image into three pathologically meaningful regions using two coupled level set functions. Tailored for the dental clinical environment, the segmentation stage uses a trained support vector machine (SVM) classifier to provide initial contours. Then, based on the segmentation results, an analysis scheme is applied. First, the scheme builds an uncertainty map from which areas of bone loss are automatically detected. Second, the scheme employs a method based on the SVM and the average intensity profile to isolate the teeth and detect root decay. Experimental results show that our proposed framework is able to automatically detect the areas of bone loss and, when given the orientation of the teeth, is able to automatically detect root decay with a seriousness level marked for diagnosis. PMID:16685904

  10. Discrete Optimization with Polynomially Detectable Boundaries and Restricted Level Sets

    NASA Astrophysics Data System (ADS)

    Zinder, Yakov; Memar, Julia; Singh, Gaurav

    The paper describes an optimization procedure for a class of discrete optimization problems which is defined by certain properties of the boundary of the feasible region and level sets of the objective function. It is shown that these properties are possessed, for example, by various scheduling problems, including a number of well-known NP-hard problems which play an important role in scheduling theory. For an important particular case the presented optimization procedure is compared with a version of the branch-and-bound algorithm by means of computational experiments.

  11. A geometric level set model for ultrasounds analysis

    SciTech Connect

    Sarti, A.; Malladi, R.

    1999-10-01

    We propose a partial differential equation (PDE) for the filtering and segmentation of echocardiographic images based on a geometric-driven scheme. The method allows edge-preserving image smoothing and a semi-automatic segmentation of the heart chambers that regularizes the shapes and improves edge fidelity, especially in the presence of distinct gaps in the edge map, as is common in ultrasound imagery. A numerical scheme for solving the proposed PDE is borrowed from level set methods. Results on human in vivo acquired 2D, 2D+time, 3D, and 3D+time echocardiographic images are shown.
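
    The flavor of such geometric-driven smoothing can be illustrated with plain mean-curvature motion, u_t = κ|∇u|, which shrinks noisy wiggles in the level lines while leaving straight edges in place. A sketch under that simplification (the authors' PDE additionally weights the motion by an edge-stopping term):

```python
import numpy as np

def curvature_smooth(img, dt=0.1, steps=20, eps=1e-8):
    """Edge-preserving smoothing by mean-curvature motion, u_t = kappa*|grad u|.
    A generic geometric scheme, not the paper's exact PDE."""
    u = img.astype(float).copy()
    for _ in range(steps):
        uy, ux = np.gradient(u)
        uxx = np.gradient(ux, axis=1)
        uyy = np.gradient(uy, axis=0)
        uxy = np.gradient(ux, axis=0)
        # kappa*|grad u| = (uxx*uy^2 - 2*ux*uy*uxy + uyy*ux^2) / |grad u|^2
        u += dt * (uxx * uy**2 - 2 * ux * uy * uxy + uyy * ux**2) \
                / (ux**2 + uy**2 + eps)
    return u

# Noisy vertical step edge: noise in the flat regions shrinks, the edge stays put
rng = np.random.default_rng(0)
img = (np.arange(64)[None, :] >= 32) + 0.1 * rng.standard_normal((64, 64))
out = curvature_smooth(img)
```

    Because straight level lines have zero curvature, the step edge itself is (to first order) untouched while the speckle-like fluctuations are flattened.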

  12. The Augmented Fast Marching Method for Level Set Reinitialization

    NASA Astrophysics Data System (ADS)

    Salac, David

    2011-11-01

    The modeling of multiphase fluid flows typically requires accurate descriptions of the interface and curvature of the interface. Here a new reinitialization technique based on the fast marching method for gradient-augmented level sets is presented. The method is explained and results in both 2D and 3D are presented. Overall the method is more accurate than reinitialization methods based on similar stencils and the resulting curvature fields are much smoother. The method will also be demonstrated in a sample application investigating the dynamic behavior of vesicles in general fluid flows. Support provided by University at Buffalo - SUNY.

  13. Automatic segmentation of right ventricle on ultrasound images using sparse matrix transform and level set

    NASA Astrophysics Data System (ADS)

    Qin, Xulei; Cong, Zhibin; Halig, Luma V.; Fei, Baowei

    2013-03-01

    An automatic framework is proposed to segment the right ventricle in ultrasound images. The method can automatically segment both epicardial and endocardial boundaries from a continuous echocardiography series by combining sparse matrix transform (SMT), a training model, and a localized region-based level set. First, the sparse matrix transform extracts the main motion regions of the myocardium as eigenimages by analyzing the statistical information of the images. Second, a training model of the right ventricle is registered to the extracted eigenimages in order to automatically detect the location of the right ventricle and the corresponding transform relationship between the training model and the SMT-extracted results in the series. Third, the training model is then adjusted as an adapted initialization for the segmentation of each image in the series. Finally, based on the adapted initializations, a localized region-based level set algorithm is applied to segment both epicardial and endocardial boundaries of the right ventricle from the whole series. Experimental results from real subject data validated the performance of the proposed framework in segmenting the right ventricle from echocardiography. The mean Dice scores for the epicardial and endocardial boundaries are 89.1%+/-2.3% and 83.6%+/-7.3%, respectively. The automatic segmentation method based on the sparse matrix transform and level set can provide a useful tool for quantitative cardiac imaging.
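
    The Dice score quoted above has a simple closed form; a small sketch of the metric (the two masks are hypothetical stand-ins for the manual and automatic contours):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks:
    2*|A & B| / (|A| + |B|), the overlap metric reported in the abstract."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

auto   = np.zeros((10, 10), bool); auto[2:8, 2:8] = True     # 36 pixels
manual = np.zeros((10, 10), bool); manual[3:9, 3:9] = True   # 36 pixels, 25 shared
print(dice(auto, manual))  # → 50/72 ≈ 0.694
```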

  14. XFEM schemes for level set based structural optimization

    NASA Astrophysics Data System (ADS)

    Li, Li; Wang, Michael Yu; Wei, Peng

    2012-12-01

    In this paper, some elegant extended finite element method (XFEM) schemes for level set based structural optimization are proposed. Firstly, two-dimensional (2D) and three-dimensional (3D) XFEM schemes with a partition integral method are developed, and numerical examples are employed to evaluate their accuracy, indicating that an accurate analysis result can be obtained on the structural boundary. Furthermore, methods for improving the computational accuracy and efficiency of XFEM are studied, including an XFEM integral scheme without quadrature sub-cells and a higher order element XFEM scheme. Numerical examples show that the XFEM scheme without quadrature sub-cells yields similar structural analysis accuracy while markedly reducing the time cost, and that higher order XFEM elements improve the computational accuracy of structural analysis in the boundary elements, though at increased time cost. The trade-off in time cost between the FE system scale and the element order therefore needs to be weighed. Finally, the reliability and advantages of the proposed XFEM schemes are illustrated with several 2D and 3D mean compliance minimization examples that are widely used in the recent literature of structural topology optimization. All numerical results demonstrate that the proposed XFEM is a promising structural analysis approach for structural optimization with the level set method.

  15. Variational level set segmentation for forest based on MCMC sampling

    NASA Astrophysics Data System (ADS)

    Yang, Tie-Jun; Huang, Lin; Jiang, Chuan-xian; Nong, Jian

    2014-11-01

    Environmental protection is one of the themes of today's world. Forests recycle carbon dioxide and serve as a natural oxygen source, and protecting forests and monitoring their growth is a long-term task of environmental protection. It is therefore important to estimate forest coverage automatically from optical remote sensing images, so that the status of the forest in an area can be assessed in a timely manner without tedious manual statistics. To address the computational complexity of global optimization via convexification, this paper proposes a level set segmentation method based on Markov chain Monte Carlo (MCMC) sampling and applies it to forest segmentation in remote sensing images. The presented method requires no convexity transformation of the target energy functional, using instead an MCMC sampling method with global optimization capability. The local minima that can occur with gradient descent are thereby avoided. There are three major contributions in the paper. Firstly, by using MCMC sampling, the convexity of the energy functional is no longer necessary and global optimization can still be achieved. Secondly, by taking advantage of the data (texture) and knowledge (a priori color) to guide the construction of the Markov chain, the convergence rate of the Markov chain is improved significantly. Finally, a level set segmentation method for forest that integrates a priori color and texture is proposed. The experiments show that our method can efficiently and accurately segment forest in remote sensing images.
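
    The core idea, sampling the energy landscape instead of descending its gradient, can be shown on a toy 1-D analogue: a piecewise-constant (Chan-Vese-like) energy over a signal, minimized by a Metropolis chain over the split point. Everything here (signal, temperature, proposal) is illustrative, not the paper's construction:

```python
import math
import random

def energy(data, t):
    """Piecewise-constant two-region energy: sum of squared deviations from
    each region's mean when the signal is split at index t."""
    e = 0.0
    for seg in (data[:t], data[t:]):
        if seg:
            m = sum(seg) / len(seg)
            e += sum((v - m) ** 2 for v in seg)
    return e

def mcmc_split(data, iters=2000, temp=0.5, seed=0):
    """Metropolis sampling over the split point: a toy analogue of minimizing a
    non-convex segmentation energy by MCMC rather than gradient descent."""
    rng = random.Random(seed)
    t = len(data) // 2
    e = energy(data, t)
    best_t, best_e = t, e
    for _ in range(iters):
        t2 = min(max(t + rng.choice((-1, 1)), 1), len(data) - 1)
        e2 = energy(data, t2)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if e2 < e or rng.random() < math.exp((e - e2) / temp):
            t, e = t2, e2
            if e < best_e:
                best_t, best_e = t, e
    return best_t

signal = [0.1, 0.0, 0.2, 0.9, 1.0, 1.1, 0.95, 1.05]
print(mcmc_split(signal))  # the chain settles on the jump at index 3
```

    Because uphill moves are occasionally accepted, the chain can escape shallow local minima that would trap a purely descent-based update.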

  16. A conceptual framework of computations in mid-level vision

    PubMed Central

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P.

    2014-01-01

    If a picture is worth a thousand words, as an English idiom goes, what should those words—or, rather, descriptors—capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into the underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and re-estimated via recurrent loops according to the task demands. Finally, we suggest using datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations. PMID:25566044

  17. A level-set method for interfacial flows with surfactant

    NASA Astrophysics Data System (ADS)

    Xu, Jian-Jun; Li, Zhilin; Lowengrub, John; Zhao, Hongkai

    2006-03-01

    A level-set method for the simulation of fluid interfaces with insoluble surfactant is presented in two dimensions. The method can be straightforwardly extended to three dimensions and to soluble surfactants. The method couples a semi-implicit discretization for solving the surfactant transport equation recently developed by Xu and Zhao [J. Xu, H. Zhao, An Eulerian formulation for solving partial differential equations along a moving interface, J. Sci. Comput. 19 (2003) 573-594] with the immersed interface method originally developed by LeVeque and Li [R. LeVeque, Z. Li, The immersed interface method for elliptic equations with discontinuous coefficients and singular sources, SIAM J. Numer. Anal. 31 (1994) 1019-1044] for solving the fluid flow equations and the Laplace-Young boundary conditions across the interfaces. Novel techniques are developed to accurately conserve component mass and surfactant mass during the evolution. Convergence of the method is demonstrated numerically. The method is applied to study the effects of surfactant on single drops, drop-drop interactions, and interactions among multiple drops in Stokes flow under a steady applied shear. Due to Marangoni forces and to non-uniform capillary forces, the presence of surfactant results in larger drop deformations and more complex drop-drop interactions compared to the analogous cases for clean drops. The effects of surfactant are found to be most significant in flows with multiple drops. To our knowledge, this is the first time that the level-set method has been used to simulate fluid interfaces with surfactant.

  18. Statistics of dark matter halos in the excursion set peak framework

    SciTech Connect

    Lapi, A.; Danese, L. E-mail: danese@sissa.it

    2014-07-01

    We derive approximate, yet very accurate, analytical expressions for the abundance and clustering properties of dark matter halos in the excursion set peak framework; the latter relies on the standard excursion set approach, but also includes the effects of a realistic filtering of the density field, a mass-dependent threshold for collapse, and the prescription from peak theory that halos tend to form around density maxima. We find that our approximations work excellently for diverse power spectra, collapse thresholds, and density filters. Moreover, when adopting a cold dark matter power spectrum, a top-hat filtering, and a mass-dependent collapse threshold (supplemented with conceivable scatter), our approximated halo mass function and halo bias represent very well the outcomes of cosmological N-body simulations.
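
    For orientation, the baseline that the excursion set peak framework refines is the sharp-k random walk with a constant barrier, whose first-crossing distribution in the peak height ν = δc/σ is half-normal. A quick numerical check of its normalization:

```python
import math

def first_crossing(nu):
    """Half-normal first-crossing distribution of the standard excursion set
    approach (sharp-k filter, constant barrier), in peak height nu = delta_c/sigma.
    Excursion set peaks replace this baseline with realistic filters, a
    mass-dependent barrier, and the peak constraint."""
    return math.sqrt(2.0 / math.pi) * math.exp(-0.5 * nu ** 2)

# every trajectory eventually crosses the barrier, so the distribution integrates to 1
d = 1e-4
total = sum(first_crossing(i * d) * d for i in range(200000))  # integrate to nu = 20
print(round(total, 4))  # → 1.0
```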

  19. Comprehensive evaluation of long-term hydrological data sets: Constraints of the Budyko framework

    NASA Astrophysics Data System (ADS)

    Greve, Peter; Orlowsky, Boris; Seneviratne, Sonia I.

    2013-04-01

    An accurate estimate of the climatological land water balance is essential for a wide range of socio-economic issues. Despite the simplicity of the underlying water balance equation, its individual variables are of a complex nature. Global estimates, either derived from observations or from models, of precipitation (P) and especially evapotranspiration (ET) are characterized by high uncertainties. This leads to inconsistent results in determining conditions related to the land water balance and its components. In this study, we consider the Budyko framework as a constraint to evaluate long-term hydrological data sets within the period from 1984 to 2005. The Budyko framework is a well-established, empirically based relationship between ET/P and Ep/P, with Ep being the potential evaporation. We use estimates of ET associated with the LandFlux-EVAL initiative (Mueller et al., 2012), either derived from observations, CMIP5 models, or land-surface models (LSMs) driven with observation-based forcing or atmospheric reanalyses. Data sets of P comprise all commonly used global observation-based estimates. Ep is determined by methods of differing complexity with recent global temperature and radiation data sets. Based on this comprehensive synthesis of data sets and methods to determine Ep, more than 2000 possible combinations of ET/P in conjunction with Ep/P are created. All combinations are validated against the Budyko curve and against physical limits within the Budyko phase space. For this purpose we develop an error measure based on the root mean square error which combines both constraints. We find that uncertainties are mainly induced by the ET data sets. In particular, reanalysis and CMIP5 data sets are characterized by low realism. The realism of LSMs is, furthermore, not primarily controlled by the forcing, as different LSMs driven with the same forcing show significantly different error measures. Our comprehensive approach is thus suitable to detect uncertainties
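
    As a concrete instance of the constraint, one classical closed form of the Budyko curve (the abstract does not specify which variant the authors use) maps the aridity index φ = Ep/P to the evaporative index ET/P, and every admissible data combination must also respect the water limit ET/P ≤ 1 and the energy limit ET/P ≤ φ:

```python
import math

def budyko(phi):
    """One classical closed form of the Budyko curve, mapping the aridity index
    phi = Ep/P to the evaporative index ET/P. Shown for illustration; the
    paper's exact curve variant is an assumption here."""
    return math.sqrt(phi * math.tanh(1.0 / phi) * (1.0 - math.exp(-phi)))

# Any physically consistent (ET/P, Ep/P) pair must lie under both limits:
for phi in (0.25, 0.5, 1.0, 2.0, 4.0):
    assert budyko(phi) <= min(1.0, phi)   # water limit ET <= P, energy limit ET <= Ep
print(round(budyko(1.0), 3))  # → 0.694
```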

  20. Powerful Set-Based Gene-Environment Interaction Testing Framework for Complex Diseases.

    PubMed

    Jiao, Shuo; Peters, Ulrike; Berndt, Sonja; Bézieau, Stéphane; Brenner, Hermann; Campbell, Peter T; Chan, Andrew T; Chang-Claude, Jenny; Lemire, Mathieu; Newcomb, Polly A; Potter, John D; Slattery, Martha L; Woods, Michael O; Hsu, Li

    2015-12-01

    Identification of gene-environment interaction (G × E) is important in understanding the etiology of complex diseases. Based on our previously developed Set Based gene EnviRonment InterAction test (SBERIA), in this paper we propose a powerful framework for enhanced set-based G × E testing (eSBERIA). The major challenge of signal aggregation within a set is how to tell signals from noise. eSBERIA tackles this challenge by adaptively aggregating the interaction signals within a set weighted by the strength of the marginal and correlation screening signals. eSBERIA then combines the screening-informed aggregate test with a variance component test to account for the residual signals. Additionally, we develop a case-only extension for eSBERIA (coSBERIA) and an existing set-based method, which boosts the power not only by exploiting the G-E independence assumption but also by avoiding the need to specify main effects for a large number of variants in the set. Through extensive simulation, we show that coSBERIA and eSBERIA are considerably more powerful than existing methods within the case-only and the case-control method categories across a wide range of scenarios. We conduct a genome-wide G × E search by applying our methods to Illumina HumanExome Beadchip data of 10,446 colorectal cancer cases and 10,191 controls and identify two novel interactions between nonsteroidal anti-inflammatory drugs (NSAIDs) and MINK1 and PTCHD3. PMID:26095235

  1. Implementing accountability for reasonableness framework at district level in Tanzania: a realist evaluation

    PubMed Central

    2011-01-01

    Background Despite the growing importance of the Accountability for Reasonableness (A4R) framework in priority setting worldwide, there is still an inadequate understanding of the processes and mechanisms underlying its influence on legitimacy and fairness, as conceived and reflected in service management processes and outcomes. As a result, the ability to draw scientifically sound lessons for the application of the framework to services and interventions is limited. This paper evaluates the experiences of implementing the A4R approach in Mbarali District, Tanzania, in order to find out how the innovation was shaped, enabled, and constrained by the interaction between contexts, mechanisms and outcomes. Methods This study draws on the principles of realist evaluation -- a largely qualitative approach, chiefly concerned with testing and refining programme theories by exploring the complex interactions of contexts, mechanisms, and outcomes. Mixed methods were used in data collection, including individual interviews, non-participant observation, and document reviews. A thematic framework approach was adopted for the data analysis. Results The study found that while the A4R approach to priority setting was helpful in strengthening transparency, accountability, stakeholder engagement, and fairness, the efforts at integrating it into the current district health system were challenging. Participatory structures under the decentralisation framework, central government's call for partnership in district-level planning and priority setting, perceived needs of stakeholders, as well as active engagement between researchers and decision makers all facilitated the adoption and implementation of the innovation. In contrast, however, limited local autonomy, low level of public awareness, unreliable and untimely funding, inadequate accountability mechanisms, and limited local resources were the major contextual factors that hampered the full implementation. Conclusion This study

  2. Towards a Dynamic Conceptual Framework for English-Medium Education in Multilingual University Settings

    ERIC Educational Resources Information Center

    Dafouz, Emma; Smit, Ute

    2016-01-01

    At a time of increasing internationalization in tertiary education, English-Medium Education in Multilingual University Settings (EMEMUS) has become a common practice. While there is already ample research describing this phenomenon at a local level (Smit and Dafouz 2012a), the theoretical side needs to be elaborated. This article thus aims to…

  3. An Analysis Framework Addressing the Scale and Legibility of Large Scientific Data Sets

    SciTech Connect

    Childs, H R

    2006-11-20

    Much of the previous work in the large data visualization area has solely focused on handling the scale of the data. This task is clearly a great challenge and necessary, but it is not sufficient. Applying standard visualization techniques to large scale data sets often creates complicated pictures where meaningful trends are lost. A second challenge, then, is to also provide algorithms that simplify what an analyst must understand, using either visual or quantitative means. This challenge can be summarized as improving the legibility or reducing the complexity of massive data sets. Fully meeting both of these challenges is the work of many, many PhD dissertations. In this dissertation, we describe some new techniques to address both the scale and legibility challenges, in hope of contributing to the larger solution. In addition to our assumption of simultaneously addressing both scale and legibility, we add an additional requirement that the solutions considered fit well within an interoperable framework for diverse algorithms, because a large suite of algorithms is often necessary to fully understand complex data sets. For scale, we present a general architecture for handling large data, as well as details of a contract-based system for integrating advanced optimizations into a data flow network design. We also describe techniques for volume rendering and performing comparisons at the extreme scale. For legibility, we present several techniques. Most noteworthy are equivalence class functions, a technique to drive visualizations using statistical methods, and line-scan based techniques for characterizing shape.

  4. Profile Evolution Simulation in Etching Systems Using Level Set Methods

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Govindan, T. R.; Meyyappan, M.

    1998-01-01

    Semiconductor device profiles are determined by the characteristics of both etching and deposition processes. In particular, a highly anisotropic etch is required to achieve vertical sidewalls. However, etching comprises both anisotropic and isotropic components, due to ion and neutral fluxes, respectively. In Ar/Cl2 plasmas, for example, neutral chlorine reacts with the Si surfaces to form silicon chlorides. These compounds are then removed by the impinging ion fluxes. Hence the directionality of the ions (and thus the ion angular distribution function, or IAD), as well as the relative fluxes of neutrals and ions, determines the amount of undercutting. One method of modeling device profile evolution is to simulate the moving solid-gas interface between the semiconductor and the plasma as a string of nodes. The velocity of each node is calculated and then the nodes are advanced accordingly. Although this technique appears to be relatively straightforward, extensive looping schemes are required at the profile corners. An alternate method is to use level set theory, which involves embedding the location of the interface in a field variable. The normal speed is calculated at each mesh point, and the field variable is updated. The profile corners are more accurately modeled, as the need for looping algorithms is eliminated. The model we have developed is a 2-D Level Set Profile Evolution Simulation (LSPES). The LSPES calculates etch rates of a substrate in low pressure plasmas due to the incident ion and neutral fluxes. For a Si substrate in an Ar/Cl2 gas mixture, for example, the predictions of the LSPES are identical to those from a string evolution model for high neutral fluxes and two different ion angular distributions [2]. In the figure shown, the relative neutral to ion flux in the bulk plasma is 100 to 1. For a moderately isotropic ion angular distribution function, as shown in the cases in the left hand column, both the LSPES (top row) and rude's string
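
    The node-free update that makes the level set approach attractive here is the equation φ_t + F|∇φ| = 0, with F the local etch rate set by the ion and neutral fluxes. A first-order Godunov upwind step for a front moving at uniform speed (a generic sketch, not the LSPES discretization):

```python
import numpy as np

def advance(phi, F, h, dt):
    """One explicit step of phi_t + F*|grad phi| = 0 with F >= 0, using the
    first-order Godunov upwind gradient. np.roll wraps periodically at the box
    edges; the interior points inspected below are unaffected over these steps."""
    dxm = (phi - np.roll(phi, 1, axis=1)) / h    # backward difference in x
    dxp = (np.roll(phi, -1, axis=1) - phi) / h   # forward difference in x
    dym = (phi - np.roll(phi, 1, axis=0)) / h
    dyp = (np.roll(phi, -1, axis=0) - phi) / h
    grad = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                   np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
    return phi - dt * F * grad

# Planar front phi = y - 1: the zero level set starts at y = 1 and moves along
# its normal (+y) at unit speed, reaching y = 1.5 after t = 0.5.
h, dt = 0.1, 0.05
y, x = np.mgrid[0:4:41j, 0:4:41j]
phi = y - 1.0
for _ in range(10):
    phi = advance(phi, F=1.0, h=h, dt=dt)
print(phi[15, 20])  # grid point at y = 1.5: ~0
```

    Corners pose no special difficulty: the upwind gradient selects the correct one-sided differences automatically, which is exactly the advantage over string-of-nodes looping schemes that the abstract describes.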

  5. Automatic segmentation of right ventricular ultrasound images using sparse matrix transform and a level set

    NASA Astrophysics Data System (ADS)

    Qin, Xulei; Cong, Zhibin; Fei, Baowei

    2013-11-01

    An automatic segmentation framework is proposed to segment the right ventricle (RV) in echocardiographic images. The method can automatically segment both epicardial and endocardial boundaries from a continuous echocardiography series by combining sparse matrix transform, a training model, and a localized region-based level set. First, the sparse matrix transform extracts main motion regions of the myocardium as eigen-images by analyzing the statistical information of the images. Second, an RV training model is registered to the eigen-images in order to locate the position of the RV. Third, the training model is adjusted and then serves as an optimized initialization for the segmentation of each image. Finally, based on the initializations, a localized, region-based level set algorithm is applied to segment both epicardial and endocardial boundaries in each echocardiograph. Three evaluation methods were used to validate the performance of the segmentation framework. The Dice coefficient measures the overall agreement between the manual and automatic segmentation. The absolute distance and the Hausdorff distance between the boundaries from manual and automatic segmentation were used to measure the accuracy of the segmentation. Ultrasound images of human subjects were used for validation. For the epicardial and endocardial boundaries, the Dice coefficients were 90.8 ± 1.7% and 87.3 ± 1.9%, the absolute distances were 2.0 ± 0.42 mm and 1.79 ± 0.45 mm, and the Hausdorff distances were 6.86 ± 1.71 mm and 7.02 ± 1.17 mm, respectively. The automatic segmentation method based on a sparse matrix transform and level set can provide a useful tool for quantitative cardiac imaging.

  6. Automatic segmentation of right ventricular ultrasound images using sparse matrix transform and a level set

    PubMed Central

    Qin, Xulei; Cong, Zhibin; Fei, Baowei

    2013-01-01

    An automatic segmentation framework is proposed to segment the right ventricle (RV) in echocardiographic images. The method can automatically segment both epicardial and endocardial boundaries from a continuous echocardiography series by combining sparse matrix transform, a training model, and a localized region-based level set. First, the sparse matrix transform extracts main motion regions of the myocardium as eigen-images by analyzing the statistical information of the images. Second, an RV training model is registered to the eigen-images in order to locate the position of the RV. Third, the training model is adjusted and then serves as an optimized initialization for the segmentation of each image. Finally, based on the initializations, a localized, region-based level set algorithm is applied to segment both epicardial and endocardial boundaries in each echocardiograph. Three evaluation methods were used to validate the performance of the segmentation framework. The Dice coefficient measures the overall agreement between the manual and automatic segmentation. The absolute distance and the Hausdorff distance between the boundaries from manual and automatic segmentation were used to measure the accuracy of the segmentation. Ultrasound images of human subjects were used for validation. For the epicardial and endocardial boundaries, the Dice coefficients were 90.8 ± 1.7% and 87.3 ± 1.9%, the absolute distances were 2.0 ± 0.42 mm and 1.79 ± 0.45 mm, and the Hausdorff distances were 6.86 ± 1.71 mm and 7.02 ± 1.17 mm, respectively. The automatic segmentation method based on a sparse matrix transform and level set can provide a useful tool for quantitative cardiac imaging. PMID:24107618

  7. Some free boundary problems in potential flow regime using a level set based method

    SciTech Connect

    Garzon, M.; Bobillo-Ares, N.; Sethian, J.A.

    2008-12-09

    Recent advances in the field of fluid mechanics with moving fronts are linked to the use of Level Set Methods, a versatile mathematical technique to follow free boundaries which undergo topological changes. A challenging class of problems in this context comprises those related to the solution of a partial differential equation posed on a moving domain, in which the boundary condition for the PDE solver has to be obtained from a partial differential equation defined on the front. This is the case for potential flow models with moving boundaries. Moreover, the fluid front may carry some material substance which diffuses on the front and is advected by the front velocity, as when, for example, surfactants are used to lower surface tension. We present a Level Set based methodology to embed these partial differential equations defined on the front in a complete Eulerian framework, fully avoiding the tracking of fluid particles and its known limitations. To show the advantages of this approach in the field of fluid mechanics, we present in this work one particular application: the numerical approximation of a potential flow model to simulate the evolution and breaking of a solitary wave propagating over a sloping bottom, and we compare the level set based algorithm with previous front tracking models.

  8. Distance regularized two level sets for segmentation of left and right ventricles from cine-MRI.

    PubMed

    Liu, Yu; Captur, Gabriella; Moon, James C; Guo, Shuxu; Yang, Xiaoping; Zhang, Shaoxiang; Li, Chunming

    2016-06-01

    This paper presents a new level set method for segmentation of cardiac left and right ventricles. We extend the edge based distance regularized level set evolution (DRLSE) model in Li et al. (2010) to a two-level-set formulation, with the 0-level set and k-level set representing the endocardium and epicardium, respectively. The extraction of endocardium and epicardium is obtained as a result of the interactive curve evolution of the 0 and k level sets derived from the proposed variational level set formulation. The initialization of the level set function in the proposed two-level-set DRLSE model is generated from roughly located endocardium, which can be performed by applying the original DRLSE model. Experimental results have demonstrated the effectiveness of the proposed two-level-set DRLSE model. PMID:26740057
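
    The distance regularization in DRLSE comes from a double-well potential p(|∇φ|) with minima at 0 and 1, which drives |∇φ| toward 1 near the front and toward 0 far away, removing the need for periodic reinitialization. A direct transcription of the potential from Li et al. (2010):

```python
import math

def dw_potential(s):
    """Double-well potential p(s) from DRLSE (Li et al., 2010), with minima at
    s = 0 and s = 1; s stands for the gradient magnitude |grad phi|."""
    if s <= 1.0:
        return (1.0 - math.cos(2.0 * math.pi * s)) / (2.0 * math.pi) ** 2
    return 0.5 * (s - 1.0) ** 2

# both wells sit at zero energy; values between and beyond the wells are positive
assert dw_potential(0.0) == 0.0 and abs(dw_potential(1.0)) < 1e-12
assert dw_potential(0.5) > 0 and dw_potential(2.0) > 0
```

    The two-level-set extension above applies this same regularizer to a single function whose 0- and k-level sets track the endocardium and epicardium, respectively.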

  9. INSTITUTIONALIZING SAFEGUARDS-BY-DESIGN: HIGH-LEVEL FRAMEWORK

    SciTech Connect

    Trond Bjornard PhD; Joseph Alexander; Robert Bean; Brian Castle; Scott DeMuth, Ph.D.; Phillip Durst; Michael Ehinger; Prof. Michael Golay, Ph.D.; Kevin Hase, Ph.D.; David J. Hebditch, DPhil; John Hockert, Ph.D.; Bruce Meppen; James Morgan; Jerry Phillips, Ph.D., PE

    2009-02-01

    The application of a Safeguards-by-Design (SBD) process for new nuclear facilities can reduce proliferation risks. A multi-laboratory team was sponsored in Fiscal Year (FY) 2008 to define an SBD process and determine how it could be incorporated into existing facility design and construction processes. The opportunity to significantly influence major design features, such as process selection and plant layout, largely ends with the conceptual design step. Therefore, SBD’s principal focus must be on the early inclusion of safeguards requirements and the early identification of beneficial design features. The result could help form the basis for a new international norm for integrating safeguards into facility design. This is an interim report describing progress and project status as of the end of FY08. In this effort, SBD is defined as a structured approach to ensure the timely, efficient, and cost-effective integration of international and national safeguards, physical security, and other nonproliferation objectives into the overall design process for a nuclear facility. A key objective is to ensure that security and nonproliferation issues are considered when weighing facility design alternatives. Central to the work completed in FY08 was a study in which an SBD process was developed in the context of the current DOE facility acquisition process. The DOE study enabled the development of a “SBD design loop” that is suitable for use in any facility design process. It is a graded, iterative process that incorporates safeguards concerns throughout the conceptual, preliminary and final design processes. Additionally, a set of proposed design principles for SBD was developed. A “Generic SBD Process” was then developed. Key features of the process include the initiation of safeguards design activities in the pre-conceptual planning phase, early incorporation of safeguards requirements into the project requirements, early appointment of an SBD team, and

  10. Setting background nutrient levels for coastal waters with oceanic influences

    NASA Astrophysics Data System (ADS)

    Smith, Alastair F.; Fryer, Rob J.; Webster, Lynda; Berx, Bee; Taylor, Alison; Walsham, Pamela; Turrell, William R.

    2014-05-01

    Nutrient enrichment of coastal water bodies as a result of human activities can lead to ecological changes. As part of a strategy to monitor such changes and detect potential eutrophication, samples were collected during research cruises conducted around the Scottish coast each January over the period 2007-2013. Data were obtained for total oxidised nitrogen (TOxN; nitrite and nitrate), phosphate and silicate, and incorporated into data-driven spatial models. Spatial averages in defined sea areas were calculated for each year in order to study inter-annual variability and systematic trends over time. Variation between some years was found to be significant (p < 0.05) but no evidence was found for any trends over the time period studied. This may have been due to the relatively short time series considered here. Modelled distributions were developed using data from groups of years (2007-2009, 2010-2011 and 2012-2013) and compared to the OSPAR Ecological Quality Objectives (EcoQOs) for dissolved inorganic nitrogen (DIN; the concentration of TOxN and ammonia), the ratio of DIN to dissolved inorganic phosphorous (N/P) and the ratio of DIN to dissolved silicate (N/S). In these three models, TOxN was below the offshore background concentration of 10 μM (12 μM at coastal locations) over more than 50% of the modelled area while N/S exceeded the upper assessment criterion of 2 over more than 50% of the modelled area. In the 2007-2009 model, N/P was below the background ratio (16) over the entire modelled area. In the 2010-2011 model the N/P ratio exceeded the background in 91% of the modelled area but remained below the upper assessment criterion (24). Scottish shelf sea waters were found to be depleted in TOxN relative to oceanic waters. This was not accounted for in the development of background values for the OSPAR EcoQOs so new estimates of these background values were derived. The implications of these results for setting reasonable background nutrient levels when
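
    The assessment criteria quoted above reduce to simple arithmetic on the measured concentrations. A hedged sketch (only the thresholds come from the abstract; the sample concentrations below are invented):

```python
# Hypothetical nutrient assessment against the criteria quoted in the abstract:
# offshore TOxN background of 10 uM, N/P background 16 with upper criterion 24,
# and N/S upper criterion 2. The sample concentrations are invented.

TOXN_BACKGROUND = 10.0            # uM, offshore
NP_BACKGROUND, NP_UPPER = 16.0, 24.0
NS_UPPER = 2.0

def assess(toxn, ammonia, phosphate, silicate):
    din = toxn + ammonia          # dissolved inorganic nitrogen (DIN)
    return {
        "toxn_below_background": toxn < TOXN_BACKGROUND,
        "np_ratio": din / phosphate,
        "np_exceeds_upper": din / phosphate > NP_UPPER,
        "ns_ratio": din / silicate,
        "ns_exceeds_upper": din / silicate > NS_UPPER,
    }

result = assess(toxn=7.2, ammonia=0.8, phosphate=0.5, silicate=3.0)
print(result)   # TOxN below background; N/P exactly at 16; N/S above its criterion
```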

  11. Parallel Computation of the Topology of Level Sets

    SciTech Connect

    Pascucci, V; Cole-McLaughlin, K

    2004-12-16

    This paper introduces two efficient algorithms that compute the Contour Tree of a 3D scalar field F and its augmented version with the Betti numbers of each isosurface. The Contour Tree is a fundamental data structure in scientific visualization that is used to preprocess the domain mesh to allow optimal computation of isosurfaces with minimal overhead storage. The Contour Tree can also be used to build user interfaces reporting the complete topological characterization of a scalar field, as shown in Figure 1. Data exploration time is reduced since the user understands the evolution of level set components with changing isovalue. The Augmented Contour Tree provides even more accurate information, segmenting the range space of the scalar field into portions of invariant topology. The exploration time for a single isosurface is also improved since its genus is known in advance. Our first new algorithm augments any given Contour Tree with the Betti numbers of all possible corresponding isocontours in time linear in the size of the tree. Moreover, we show how to extend the scheme introduced in [3] with the Betti number computation without increasing its complexity. Thus, we improve the time complexity of our previous approach [10] from O(m log m) to O(n log n + m), where m is the number of cells and n is the number of vertices in the domain of F. Our second contribution is a new divide-and-conquer algorithm that computes the Augmented Contour Tree with improved efficiency. The approach computes the output Contour Tree by merging two intermediate Contour Trees and is independent of the interpolant. In this way we confine any knowledge regarding a specific interpolant to an independent function that computes the tree for a single cell. We have implemented this function for the trilinear interpolant and plan to replace it with higher order interpolants when needed. The time complexity is O(n + t log n), where t is the number of critical points of F. For the first time
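
    The central object of the abstract, the evolution of level set components with changing isovalue, can be illustrated with a much simpler cousin of the Contour Tree: a sublevel-set sweep over a 1-D scalar field using union-find. This is a toy sketch, not the paper's algorithm, and the input values are made up:

```python
# Sublevel-set sweep on a 1D scalar field, in the spirit of contour tree
# construction: vertices are added in order of function value and merged with
# already-present neighbours via union-find, so the component count of each
# sublevel set (its 0-th Betti number) falls out of the sweep directly.

def sublevel_components(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    parent = list(range(len(values)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i

    active, count, history = set(), 0, []
    for i in order:
        active.add(i)
        count += 1                       # a new component appears at a minimum
        for j in (i - 1, i + 1):         # 1D neighbourhood
            if j in active:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj
                    count -= 1           # two components merge
        history.append((values[i], count))
    return history

# f has three local minima; sweeping upward, components appear at minima and
# merge as interior maxima are passed.
f = [0.0, 2.0, 1.0, 3.0, 0.5]
print(sublevel_components(f))
# -> [(0.0, 1), (0.5, 2), (1.0, 3), (2.0, 2), (3.0, 1)]
```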

  12. A Relation Extraction Framework for Biomedical Text Using Hybrid Feature Set.

    PubMed

    Muzaffar, Abdul Wahab; Azam, Farooque; Qamar, Usman

    2015-01-01

    Information extraction from unstructured text is a complex task. Although manual information extraction often produces the best results, the exponential increase in data size makes manual extraction of biomedical data hard to manage. Thus, there is a need for automatic tools and techniques for information extraction in biomedical text mining. Relation extraction is a significant area within biomedical information extraction that has gained much importance in the last two decades. A lot of work has been done on biomedical relation extraction using rule-based and machine learning techniques. In the last decade, the focus has shifted to hybrid approaches, which show better results. This research presents a hybrid feature set for classification of relations between biomedical entities. The main contribution of this research lies in the semantic feature set, where verb phrases are ranked using the Unified Medical Language System (UMLS) and a ranking algorithm. Support Vector Machine and Naïve Bayes, two effective machine learning techniques, are used to classify these relations. Our approach has been validated on the standard biomedical text corpus obtained from MEDLINE 2001. We conclude that our framework outperforms all state-of-the-art approaches used for relation extraction on the same corpus. PMID:26347797
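
    One half of the paper's classifier pair can be sketched with a tiny bag-of-words Naïve Bayes model. This is purely illustrative: the sentences, labels, and features below are invented and much poorer than the paper's UMLS-ranked semantic features:

```python
# Toy Naive Bayes relation classifier: decide whether a sentence expresses a
# treatment relation between biomedical entities. Training data is made up.
import math
from collections import Counter, defaultdict

train = [
    ("aspirin treats headache", "relation"),
    ("drug inhibits enzyme activity", "relation"),
    ("patient admitted to hospital", "no_relation"),
    ("study describes patient cohort", "no_relation"),
]

class_counts = Counter(label for _, label in train)
word_counts = defaultdict(Counter)
vocab = set()
for text, label in train:
    for w in text.split():
        word_counts[label][w] += 1
        vocab.add(w)

def predict(text):
    scores = {}
    for label in class_counts:
        total = sum(word_counts[label].values())
        score = math.log(class_counts[label] / len(train))   # log prior
        for w in text.split():
            # Laplace smoothing over the shared vocabulary
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

print(predict("drug treats disease"))       # -> relation
```

    The paper combines such probabilistic scoring with an SVM; swapping the classifier leaves the feature-engineering question, which is where the semantic feature set contributes, untouched.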

  13. A Relation Extraction Framework for Biomedical Text Using Hybrid Feature Set

    PubMed Central

    Muzaffar, Abdul Wahab; Azam, Farooque; Qamar, Usman

    2015-01-01

    Information extraction from unstructured text is a complex task. Although manual information extraction often produces the best results, the exponential increase in data size makes manual extraction of biomedical data hard to manage. Thus, there is a need for automatic tools and techniques for information extraction in biomedical text mining. Relation extraction is a significant area within biomedical information extraction that has gained much importance in the last two decades. A lot of work has been done on biomedical relation extraction using rule-based and machine learning techniques. In the last decade, the focus has shifted to hybrid approaches, which show better results. This research presents a hybrid feature set for classification of relations between biomedical entities. The main contribution of this research lies in the semantic feature set, where verb phrases are ranked using the Unified Medical Language System (UMLS) and a ranking algorithm. Support Vector Machine and Naïve Bayes, two effective machine learning techniques, are used to classify these relations. Our approach has been validated on the standard biomedical text corpus obtained from MEDLINE 2001. We conclude that our framework outperforms all state-of-the-art approaches used for relation extraction on the same corpus. PMID:26347797

  14. A universal surface complexation framework for modeling proton binding onto bacterial surfaces in geologic settings

    USGS Publications Warehouse

    Borrok, D.; Turner, B.F.; Fein, J.B.

    2005-01-01

    Adsorption onto bacterial cell walls can significantly affect the speciation and mobility of aqueous metal cations in many geologic settings. However, a unified thermodynamic framework for describing bacterial adsorption reactions does not exist. This problem originates from the numerous approaches that have been chosen for modeling bacterial surface protonation reactions. In this study, we compile all currently available potentiometric titration datasets for individual bacterial species, bacterial consortia, and bacterial cell wall components. Using a consistent, four discrete site, non-electrostatic surface complexation model, we determine total functional group site densities for all suitable datasets, and present an averaged set of 'universal' thermodynamic proton binding and site density parameters for modeling bacterial adsorption reactions in geologic systems. Modeling results demonstrate that the total concentrations of proton-active functional group sites for the 36 bacterial species and consortia tested are remarkably similar, averaging 3.2 ± 1.0 (1σ) × 10⁻⁴ moles/wet gram. Examination of the uncertainties involved in the development of proton-binding modeling parameters suggests that ignoring factors such as bacterial species, ionic strength, temperature, and growth conditions introduces relatively small error compared to the unavoidable uncertainty associated with the determination of cell abundances in realistic geologic systems. Hence, we propose that reasonable estimates of the extent of bacterial cell wall deprotonation can be made using averaged thermodynamic modeling parameters from all of the experiments that are considered in this study, regardless of bacterial species used, ionic strength, temperature, or growth condition of the experiment. The average site densities for the four discrete sites are 1.1 ± 0.7 × 10⁻⁴, 9.1 ± 3.8 × 10⁻⁵, 5.3 ± 2.1 × 10⁻⁵, and 6.6 ± 3.0 × 10⁻⁵ moles/wet gram bacteria for the sites with pKa values of 3
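
    The discrete-site model underlying these parameters treats each site as an independent monoprotic acid, so the deprotonated fraction at a given pH follows directly from the site's pKa. A sketch under assumed values (the pKa values and site densities below are placeholders in the spirit of the averaged parameters, not the paper's fitted numbers):

```python
# Degree of deprotonation for discrete pKa sites, as in a non-electrostatic
# surface complexation model: for R-AH <-> R-A(-) + H(+), the deprotonated
# fraction at a given pH is 1 / (1 + 10**(pKa - pH)). The four (pKa, density)
# pairs below are hypothetical placeholders, in mol per wet gram.

def deprotonated_fraction(pka, ph):
    return 1.0 / (1.0 + 10.0 ** (pka - ph))

sites = [(3.1, 1.1e-4), (4.7, 9.1e-5), (6.6, 5.3e-5), (9.0, 6.6e-5)]

ph = 7.0
deprotonated = sum(density * deprotonated_fraction(pka, ph)
                   for pka, density in sites)
total = sum(density for _, density in sites)
print(f"{deprotonated:.2e} of {total:.2e} mol/wet g deprotonated at pH {ph}")
```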

  15. Research on classified real-time flood forecasting framework based on K-means cluster and rough set.

    PubMed

    Xu, Wei; Peng, Yong

    2015-01-01

    This research presents a new classified real-time flood forecasting framework. In this framework, historical floods are classified by a K-means cluster according to the spatial and temporal distribution of precipitation, the time variance of precipitation intensity and other hydrological factors. Based on the classified results, a rough set is used to extract the identification rules for real-time flood forecasting. Then, the parameters of different categories within the conceptual hydrological model are calibrated using a genetic algorithm. In real-time forecasting, the corresponding category of parameters is selected for flood forecasting according to the obtained flood information. This research tests the new classified framework on Guanyinge Reservoir and compares the framework with the traditional flood forecasting method. It finds that the performance of the new classified framework is significantly better in terms of accuracy. Furthermore, the framework remains applicable in catchments with few historical flood records. PMID:26442493
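
    The classification step can be sketched with a minimal K-means implementation. This is a toy, not the paper's setup: the two features and the flood records below are invented stand-ins for the precipitation statistics the framework actually uses:

```python
# Minimal K-means sketch: historical floods, each described by a feature vector
# (here two made-up features, e.g. mean precipitation intensity and storm
# duration), are grouped into categories; each category would then get its own
# calibrated hydrological-model parameters.
import math
import random

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)                       # initial centers
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                                  # assignment step
            j = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[j].append(p)
        centers = [                                       # update step
            tuple(sum(x) / len(cl) for x in zip(*cl)) if cl else centers[j]
            for j, cl in enumerate(clusters)
        ]
    return centers, clusters

floods = [(2.1, 5.0), (2.3, 4.8), (2.0, 5.2),     # short, intense events
          (0.6, 18.0), (0.7, 20.0), (0.5, 19.5)]  # long, low-intensity events
centers, clusters = kmeans(floods, k=2)
print(sorted(len(c) for c in clusters))           # -> [3, 3]
```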

  16. Simultaneous segmentation and reconstruction: A level set method approach for limited view computed tomography

    SciTech Connect

    Yoon, Sungwon; Pineda, Angel R.; Fahrig, Rebecca

    2010-05-15

    Purpose: An iterative tomographic reconstruction algorithm that simultaneously segments and reconstructs the reconstruction domain is proposed and applied to tomographic reconstructions from a sparse number of projection images. Methods: The proposed algorithm uses a two-phase level set method segmentation in conjunction with an iterative tomographic reconstruction to achieve simultaneous segmentation and reconstruction. The simultaneous segmentation and reconstruction is achieved by alternating between level set function evolutions and per-region intensity value updates. To deal with the limited number of projections, a priori information about the reconstruction is enforced via a penalized likelihood function. Specifically, the reconstruction is assumed to be smooth within each region (piecewise smooth) and to have bounded intensity values in each region. This a priori information is formulated into a quadratic objective function with linear bound constraints. The level set function evolutions are achieved by artificially time evolving the level set function in the negative gradient direction; the intensity value updates are achieved by using the gradient projection conjugate gradient algorithm. Results: The proposed simultaneous segmentation and reconstruction results were compared to "conventional" iterative reconstruction (with no segmentation), iterative reconstruction followed by segmentation, and filtered backprojection. Improvements of 6%-13% in the normalized root mean square error were observed when the proposed algorithm was applied to simulated projections of a numerical phantom and to real fan-beam projections of the Catphan phantom, neither of which satisfied the a priori assumptions. Conclusions: The proposed simultaneous segmentation and reconstruction resulted in improved reconstruction image quality. The algorithm correctly segments the reconstruction space into regions, preserves sharp edges between different regions, and smoothes the noise
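
    The bound-constrained quadratic subproblem mentioned above can be illustrated with plain projected gradient descent, a simpler relative of the gradient projection conjugate gradient method named in the abstract. The matrix, data, and bounds are toy values:

```python
# Sketch of the bound-constrained quadratic step: minimise ||A x - b||^2
# subject to lo <= x <= hi by projected gradient descent. Each iteration takes
# a gradient step and then projects the iterate back into the box.

def clip(v, lo, hi):
    return max(lo, min(hi, v))

def projected_gradient(A, b, lo, hi, step=0.05, iters=2000):
    n = len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = A x - b, gradient g = 2 A^T r
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(len(A))]
        g = [2 * sum(A[i][j] * r[i] for i in range(len(A))) for j in range(n)]
        x = [clip(x[j] - step * g[j], lo, hi) for j in range(n)]
    return x

A = [[1.0, 0.0], [0.0, 1.0]]
b = [2.5, -1.0]        # unconstrained optimum at (2.5, -1.0)
print(projected_gradient(A, b, lo=0.0, hi=2.0))   # constrained optimum (2.0, 0.0)
```

    The conjugate gradient variant used in the paper converges faster on large tomographic systems, but the projection-onto-the-box idea is the same.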

  17. Shared Investment Projects and Forecasting Errors: Setting Framework Conditions for Coordination and Sequencing Data Quality Activities

    PubMed Central

    Leitner, Stephan; Brauneis, Alexander; Rausch, Alexandra

    2015-01-01

    In this paper, we investigate the impact of inaccurate forecasting on the coordination of distributed investment decisions. In particular, by setting up a computational multi-agent model of a stylized firm, we investigate the case of investment opportunities that are mutually carried out by organizational departments. The forecasts of concern pertain to the initial amount of money necessary to launch and operate an investment opportunity, to the expected intertemporal distribution of cash flows, and the departments’ efficiency in operating the investment opportunity at hand. We propose a budget allocation mechanism for coordinating such distributed decisions. The paper provides guidance on how to set framework conditions, in terms of the number of investment opportunities considered in one round of funding and the number of departments operating one investment opportunity, so that the coordination mechanism is highly robust to forecasting errors. Furthermore, we show that—in some setups—a certain extent of misforecasting is desirable from the firm’s point of view as it supports the achievement of the corporate objective of value maximization. We then address the question of how to improve forecasting quality in the best possible way, and provide policy advice on how to sequence activities for improving forecasting quality so that the robustness of the coordination mechanism to errors increases in the best possible way. At the same time, we show that wrong decisions regarding the sequencing can lead to a decrease in robustness. Finally, we conduct a comprehensive sensitivity analysis and show that—in particular for relatively good forecasters—most of our results are robust to changes in setting the parameters of our multi-agent simulation model. PMID:25803736
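
    The paper's core mechanism, distributed choices made on noisy forecasts, can be mimicked with a tiny Monte Carlo experiment (all numbers hypothetical): the firm funds the opportunity with the highest forecast value, and growing forecast error lowers the chance of funding the truly best one.

```python
# Toy illustration: a headquarters allocates the budget to the investment
# opportunity with the highest *forecast* value; forecast noise degrades the
# probability of funding the truly best opportunity. All values are made up.
import random

def funding_accuracy(true_values, noise_sd, trials=20000, seed=1):
    rng = random.Random(seed)
    best = max(range(len(true_values)), key=lambda i: true_values[i])
    hits = 0
    for _ in range(trials):
        forecasts = [v + rng.gauss(0.0, noise_sd) for v in true_values]
        if max(range(len(forecasts)), key=lambda i: forecasts[i]) == best:
            hits += 1
    return hits / trials

true_values = [100.0, 95.0, 90.0]   # true values of three opportunities
for sd in (1.0, 5.0, 20.0):
    print(sd, funding_accuracy(true_values, sd))
```

    The paper's finding that some misforecasting can even help relies on richer department-level dynamics than this sketch captures; the sketch only shows the baseline degradation effect.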

  18. Shared investment projects and forecasting errors: setting framework conditions for coordination and sequencing data quality activities.

    PubMed

    Leitner, Stephan; Brauneis, Alexander; Rausch, Alexandra

    2015-01-01

    In this paper, we investigate the impact of inaccurate forecasting on the coordination of distributed investment decisions. In particular, by setting up a computational multi-agent model of a stylized firm, we investigate the case of investment opportunities that are mutually carried out by organizational departments. The forecasts of concern pertain to the initial amount of money necessary to launch and operate an investment opportunity, to the expected intertemporal distribution of cash flows, and the departments' efficiency in operating the investment opportunity at hand. We propose a budget allocation mechanism for coordinating such distributed decisions. The paper provides guidance on how to set framework conditions, in terms of the number of investment opportunities considered in one round of funding and the number of departments operating one investment opportunity, so that the coordination mechanism is highly robust to forecasting errors. Furthermore, we show that, in some setups, a certain extent of misforecasting is desirable from the firm's point of view as it supports the achievement of the corporate objective of value maximization. We then address the question of how to improve forecasting quality in the best possible way, and provide policy advice on how to sequence activities for improving forecasting quality so that the robustness of the coordination mechanism to errors increases in the best possible way. At the same time, we show that wrong decisions regarding the sequencing can lead to a decrease in robustness. Finally, we conduct a comprehensive sensitivity analysis and show that, in particular for relatively good forecasters, most of our results are robust to changes in setting the parameters of our multi-agent simulation model. PMID:25803736

  19. A Graphical Framework for Specification of Clinical Guidelines at Multiple Representation Levels

    PubMed Central

    Shalom, Erez; Shahar, Yuval

    2005-01-01

    Formalization of a clinical guideline for purposes of automated application and quality assessment mainly involves conversion of its free-text representation into a machine comprehensible representation, i.e., a formal language, thus enabling automated support. The main issues involved in this process are related to the collaboration between the expert physician and the knowledge engineer. We introduce GESHER - a graphical framework for specification of clinical guidelines at multiple representation levels. The GESHER architecture facilitates incremental specification through a set of views adapted to each representation level, enabling this process to proceed smoothly and in a transparent fashion, fostering extensive collaboration among the various types of users. The GESHER framework supports specification of guidelines at multiple representation levels, in more than one specification language, and uses the DeGeL digital guideline library architecture as its knowledge base. The GESHER architecture also uses a temporal abstraction knowledge base to store its declarative knowledge, and a standard medical-vocabularies server for generic specification of key terms, thus enabling reuse of the specification at multiple sites. PMID:16779126

  20. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. Methods The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the

  1. Measuring Afterschool Program Quality Using Setting-Level Observational Approaches

    ERIC Educational Resources Information Center

    Oh, Yoonkyung; Osgood, D. Wayne; Smith, Emilie P.

    2015-01-01

    The importance of afterschool hours for youth development is widely acknowledged, and afterschool settings have recently received increasing attention as an important venue for youth interventions, bringing a growing need for reliable and valid measures of afterschool quality. This study examined the extent to which the two observational tools,…

  2. Measuring afterschool program quality using setting-level observational approaches

    PubMed Central

    Oh, Yoonkyung; Osgood, D. Wayne; Smith, Emilie Phillips

    2016-01-01

    As the importance of afterschool hours for youth development is widely acknowledged, afterschool settings have recently received increasing attention as an important venue for youth interventions. A range of intervention programs have been in place, generally aiming at positive youth development through enhancing the quality of programs. A growing need has thus arisen for reliable and valid measures of afterschool quality. This study examined the extent to which the two observational tools, i.e., Caregiver Interaction Scales (CIS) and Promising Practices Rating Scales (PPRS), could serve as reliable and valid tools for assessing the various dimensions of afterschool setting quality. The study shows the potential promise of the instruments, on the one hand, and suggests future directions for improvement of measurement design and development of the field, on the other hand. In particular, our findings suggest the importance of addressing the effect of day-to-day fluctuations in observed afterschool quality. PMID:26819487

  3. Joint Infrared Target Recognition and Segmentation Using a Shape Manifold-Aware Level Set

    PubMed Central

    Yu, Liangjiang; Fan, Guoliang; Gong, Jiulu; Havlicek, Joseph P.

    2015-01-01

    We propose new techniques for joint recognition, segmentation and pose estimation of infrared (IR) targets. The problem is formulated in a probabilistic level set framework where a shape constrained generative model is used to provide a multi-class and multi-view shape prior and where the shape model involves a couplet of view and identity manifolds (CVIM). A level set energy function is then iteratively optimized under the shape constraints provided by the CVIM. Since both the view and identity variables are expressed explicitly in the objective function, this approach naturally accomplishes recognition, segmentation and pose estimation as joint products of the optimization process. For realistic target chips, we solve the resulting multi-modal optimization problem by adopting a particle swarm optimization (PSO) algorithm and then improve the computational efficiency by implementing a gradient-boosted PSO (GB-PSO). Evaluation was performed using the Military Sensing Information Analysis Center (SENSIAC) ATR database, and experimental results show that both of the PSO algorithms reduce the cost of shape matching during CVIM-based shape inference. Particularly, GB-PSO outperforms other recent ATR algorithms, which require intensive shape matching, either explicitly (with pre-segmentation) or implicitly (without pre-segmentation). PMID:25938202
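
    The optimization step can be sketched with a plain particle swarm optimizer (not the paper's gradient-boosted GB-PSO). The cost function below is a stand-in for the CVIM shape-matching energy, and the hyperparameters are common textbook defaults:

```python
# Minimal particle swarm optimisation sketch: a swarm of candidate solutions
# is attracted toward each particle's personal best and the global best, which
# suits the multi-modal energies that arise in shape matching.
import random

def pso(cost, dim=2, n_particles=20, iters=100, seed=0):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=cost)[:]
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
                if cost(pbest[i]) < cost(gbest):
                    gbest = pbest[i][:]
    return gbest

def sphere(p):                          # toy cost with minimum at (1, 1)
    return sum((x - 1.0) ** 2 for x in p)

print(pso(sphere))                      # converges near (1, 1)
```

    The gradient-boosted variant in the paper additionally exploits local gradient information to cut down the number of expensive shape-matching evaluations.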

  4. An Evaluation of the High Level Architecture (HLA) as a Framework for NASA Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reid, Michael R.; Powers, Edward I. (Technical Monitor)

    2000-01-01

    The High Level Architecture (HLA) is a current US Department of Defense and an industry (IEEE-1516) standard architecture for modeling and simulations. It provides a framework and set of functional rules and common interfaces for integrating separate and disparate simulators into a larger simulation. The goal of the HLA is to reduce software costs by facilitating the reuse of simulation components and by providing a runtime infrastructure to manage the simulations. In order to evaluate the applicability of the HLA as a technology for NASA space mission simulations, a Simulations Group at Goddard Space Flight Center (GSFC) conducted a study of the HLA and developed a simple prototype HLA-compliant space mission simulator. This paper summarizes the prototyping effort and discusses the potential usefulness of the HLA in the design and planning of future NASA space missions with a focus on risk mitigation and cost reduction.

  5. The adoption of the Reference Framework for diabetes care among primary care physicians in primary care settings: A cross-sectional study.

    PubMed

    Wong, Martin C S; Wang, Harry H X; Kwan, Mandy W M; Chan, Wai Man; Fan, Carmen K M; Liang, Miaoyin; Li, Shannon Ts; Fung, Franklin D H; Yeung, Ming Sze; Chan, David K L; Griffiths, Sian M

    2016-08-01

    The prevalence of diabetes mellitus has been increasing both globally and locally. Primary care physicians (PCPs) are in a privileged position to provide first contact and continuing care for diabetic patients. A territory-wide Reference Framework for Diabetes Care for Adults was released by the Hong Kong Primary Care Office in 2010, with the aim of further enhancing evidence-based, high quality care for diabetes in the primary care setting through wide adoption of the Reference Framework. A validated questionnaire survey was conducted among PCPs to evaluate the levels of, and the factors associated with, their adoption of the Reference Framework. A total of 414 completed surveys were received, with a response rate of 13.0%. The average adoption score was 3.29 (SD 0.51) out of 4. Approximately 70% of PCPs highly adopted the Reference Framework in their routine practice. Binary logistic regression analysis showed that PCPs' perceptions of the inclusion of sufficient local information (adjusted odds ratio [aOR] = 4.748, 95%CI 1.597-14.115, P = 0.005) and of reduced professional autonomy of PCPs (aOR = 1.859, 95%CI 1.013-3.411, P = 0.045) were more likely to influence their adoption level of the Reference Framework for diabetes care in daily practice. The overall level of guideline adoption was found to be relatively high among PCPs for adult diabetes in primary care settings. The adoption barriers identified in this study should be addressed in the continuous updating of the Reference Framework. Strategies need to be considered to enhance guideline adoption and implementation capacity. PMID:27495018

  6. Structural engineering masters level education framework of knowledge for the needs of initial professional practice

    NASA Astrophysics Data System (ADS)

    Balogh, Zsuzsa Enriko

    For at least the last decade, engineering, civil engineering, and structural engineering as a profession within civil engineering have faced and continue to face an emerging need for "Raising the Bar" of preparedness of young engineers seeking to become practicing professional engineers. The present consensus of the civil engineering profession is that the increasing need for broad and in-depth knowledge requires young structural engineers to have at least a Masters-level education. This study focuses on Masters-level preparedness in the structural engineering area within the civil engineering field. It follows much of the methodology used in the American Society of Civil Engineers (ASCE) Body of Knowledge determination for civil engineering and extends this type of study to better define the portion of the young engineer's preparation beyond the undergraduate program for one specialty area of civil engineering. The objective of this research was to create a Framework of Knowledge for the young engineer which identifies the needs of the profession, along with the profession's expectations of how those needs can be achieved in the graduate-level academic setting, in the practice environment, and through lifelong learning opportunities, with an emphasis on the initial five years of experience after completion of a Masters program in structural engineering. This study applied a modified Delphi method to obtain the critical information from members of the structural engineering profession. The results provide a Framework of Knowledge which will be useful to several groups seeking to better ensure the preparedness of future young structural engineers at the Masters level.

  7. A novel framework for assessing metadata quality in epidemiological and public health research settings

    PubMed Central

    McMahon, Christiana; Denaxas, Spiros

    2016-01-01

    Metadata are critical in epidemiological and public health research. However, a lack of biomedical metadata quality frameworks and limited awareness of the implications of poor quality metadata render data analyses problematic. In this study, we created and evaluated a novel framework to assess the metadata quality of epidemiological and public health research datasets. We performed a literature review and surveyed stakeholders to enhance our understanding of biomedical metadata quality assessment. The review identified 11 studies and nine quality dimensions, none of which were specifically aimed at biomedical metadata. In total, 96 individuals completed the survey; of those who submitted data, most assessed metadata quality only sometimes, and eight did not at all. Our framework has four sections: a) general information; b) tools and technologies; c) usability; and d) management and curation. We evaluated the framework using three test cases and sought expert feedback. The framework can assess biomedical metadata quality systematically and robustly. PMID:27570670

  8. High-level waste tank farm set point document

    SciTech Connect

    Anthony, J.A. III

    1995-01-15

    Setpoints for nuclear safety-related instrumentation are required for actions determined by the design authorization basis. Minimum requirements need to be established for assuring that setpoints are established and held within specified limits. This document establishes the controlling methodology for changing setpoints of all classifications. The instrumentation under consideration involves the transfer, storage, and volume reduction of radioactive liquid waste in the F- and H-Area High-Level Radioactive Waste Tank Farms. The setpoint document will encompass the PROCESS AREAS listed in the Safety Analysis Report (SAR) (DPSTSA-200-10 Sup 18), which include the diversion box HDB-8 facility. In addition to the PROCESS AREAS listed in the SAR, Building 299-H and the Effluent Transfer Facility (ETF) are also included in the scope.

  9. Telemedicine: what framework, what levels of proof, implementation rules.

    PubMed

    Zannad, Faiez; Maugendre, Philippe; Audry, Antoine; Avril, Carole; Blaise, Lucile; Blin, Olivier; Burnel, Philippe; Falise-Mirat, Béatrice; Girault, Danièle; Giri, Isabelle; Goehrs, Jean-Marie; Lassale, Catherine; Le Meur, Roland; Leurent, Pierre; Ratignier-Carbonneil, Christelle; Rossignol, Patrick; Satonnet, Evelyne; Simon, Pierre; Treluyer, Laurent

    2014-01-01

    The concept of telemedicine was formalised in France in the 2009 "Hospital, patients, health territories" (loi hôpital, patients, santé, territoire) law and the 2010 decree through which it was applied. Many experiments have been carried out and the regulatory institutions (Ministry, Regional Health Agency [Agence régionale de santé, ARS], French National Health Authority [Haute autorité de santé, HAS], etc.) have issued various guidance statements and recommendations on its organisation and on the expectations of its evaluation. With this background, the round table wanted to produce recommendations on different areas of medical telemonitoring (the role of telemonitoring, the regulatory system, the principles for assessment, methods of use and conditions for sustained and seamless deployment). Whilst many studies carried out on new medical telemonitoring approaches have led to the postulate that it offers benefit, both clinically and in terms of patient quality of life, more information is needed to demonstrate its impact on the organisation of healthcare and the associated medico-economic benefit (criteria, methods, resources). Similarly, contractual frameworks for deployment of telemonitoring do exist, although they are complicated and involve many different stakeholders (Director General of the Care Offering [Direction générale de l'offre de soins, DGOS], ARS, HAS, Agency for Shared Health Information Systems [Agence des systèmes d'information partagés de santé, ASIP], French National Data Protection Commission [Commission nationale informatique et libertés, CNIL], French National Medical Council [Conseil national de l'Ordre des médecins, CNOM], etc.) that would benefit from a shared approach and seamless exchange between the partners involved. The current challenge is also to define the conditions required to validate a stable economic model in order to promote organisational change. One topical issue is placing the emphasis on its evaluation and

  10. Investigating the Experience of Outdoor and Adventurous Project Work in an Educational Setting Using a Self-Determination Framework

    ERIC Educational Resources Information Center

    Sproule, John; Martindale, Russell; Wang, John; Allison, Peter; Nash, Christine; Gray, Shirley

    2013-01-01

    The purpose of this study was to carry out a preliminary investigation to explore the use of outdoor and adventurous project work (PW) within an educational setting. Specifically, differences between the PW and normal academic school experiences were examined using a self-determination theory framework integrated with a goal orientation and…

  11. A Conceptual Framework for Educational Design at Modular Level to Promote Transfer of Learning

    ERIC Educational Resources Information Center

    Botma, Yvonne; Van Rensburg, G. H.; Coetzee, I. M.; Heyns, T.

    2015-01-01

    Students bridge the theory-practice gap when they apply in practice what they have learned in class. A conceptual framework was developed that can serve as foundation to design for learning transfer at modular level. The framework is based on an adopted and adapted systemic model of transfer of learning, existing learning theories, constructive…

  12. Toppled television sets and head injuries in the pediatric population: a framework for prevention.

    PubMed

    Cusimano, Michael D; Parker, Nadine

    2016-01-01

    Injuries to children caused by falling televisions have become more frequent during the last decade. These injuries can be severe and even fatal and are likely to become even more common in the future as TVs increase in size and become more affordable. To formulate guidelines for the prevention of these injuries, the authors systematically reviewed the literature on injuries related to toppling televisions. The authors searched MEDLINE, PubMed, Embase, Scopus, CINAHL (Cumulative Index to Nursing and Allied Health Literature), Cochrane Library, and Google Scholar according to the Cochrane guidelines for all studies involving children 0-18 years of age who were injured by toppled TVs. Factors contributing to injury were categorized using Haddon's Matrix, and the public health approach was used as a framework for developing strategies to prevent these injuries. The vast majority (84%) of the injuries occurred in homes and more than three-fourths were unwitnessed by adult caregivers. The TVs were most commonly large and elevated off the ground. Dressers and other furniture not designed to support TVs were commonly involved in the TV-toppling incident. The case fatality rate varies widely, but almost all deaths reported (96%) were due to brain injuries. Toddlers between the ages of 1 and 3 years most frequently suffer injuries to the head and neck, and they are most likely to suffer severe injuries. Many of these injuries require brain imaging and neurosurgical intervention. Prevention of these injuries will require changes in TV design and legislation as well as increases in public education and awareness. Television-toppling injuries can be easily prevented; however, the rates of injury do not reflect a sufficient level of awareness, nor do they reflect an acceptable effort from an injury prevention perspective. PMID:26416669

  13. Bushmeat genetics: setting up a reference framework for the DNA typing of African forest bushmeat.

    PubMed

    Gaubert, Philippe; Njiokou, Flobert; Olayemi, Ayodeji; Pagani, Paolo; Dufour, Sylvain; Danquah, Emmanuel; Nutsuakor, Mac Elikem K; Ngua, Gabriel; Missoup, Alain-Didier; Tedesco, Pablo A; Dernat, Rémy; Antunes, Agostinho

    2015-05-01

    The bushmeat trade in tropical Africa represents illegal, unsustainable off-takes of millions of tons of wild game - mostly mammals - per year. We sequenced four mitochondrial gene fragments (cyt b, COI, 12S, 16S) in >300 bushmeat items representing nine mammalian orders and 59 morphological species from five western and central African countries (Guinea, Ghana, Nigeria, Cameroon and Equatorial Guinea). Our objectives were to assess the efficiency of cross-species PCR amplification and to evaluate the usefulness of our multilocus approach for reliable bushmeat species identification. We provide a straightforward amplification protocol using a single 'universal' primer pair per gene that generally yielded >90% PCR success rates across orders and was robust to different types of meat preprocessing and DNA extraction protocols. For taxonomic identification, we set up a decision pipeline combining similarity- and tree-based approaches with an assessment of taxonomic expertise and coverage of the GENBANK database. Our multilocus approach permitted us to: (i) adjust for existing taxonomic gaps in GENBANK databases, (ii) assign to the species level 67% of the morphological species hypotheses and (iii) successfully identify samples with uncertain taxonomic attribution (preprocessed carcasses and cryptic lineages). High levels of genetic polymorphism across genes and taxa, together with the excellent resolution observed among species-level clusters (neighbour-joining trees and Klee diagrams) advocate the usefulness of our markers for bushmeat DNA typing. We formalize our DNA typing decision pipeline through an expert-curated query database - DNA BUSHMEAT - that shall permit the automated identification of African forest bushmeat items. PMID:25264212

  14. A variational approach to multi-phase motion of gas, liquid and solid based on the level set method

    NASA Astrophysics Data System (ADS)

    Yokoi, Kensuke

    2009-07-01

    We propose a simple and robust numerical algorithm to deal with multi-phase motion of gas, liquid and solid based on the level set method [S. Osher, J.A. Sethian, Fronts propagating with curvature-dependent speed: Algorithms based on Hamilton-Jacobi formulations, J. Comput. Phys. 79 (1988) 12; M. Sussman, P. Smereka, S. Osher, A level set approach for computing solutions to incompressible two-phase flow, J. Comput. Phys. 114 (1994) 146; J.A. Sethian, Level Set Methods and Fast Marching Methods, Cambridge University Press, 1999; S. Osher, R. Fedkiw, Level Set Methods and Dynamic Implicit Surfaces, Applied Mathematical Sciences, vol. 153, Springer, 2003]. In an Eulerian framework, to simulate interaction between a moving solid object and an interfacial flow, we need to define at least two functions (level set functions) to distinguish three materials. In such simulations, in general the two functions overlap and/or disagree due to numerical errors such as numerical diffusion. In this paper, we resolve this problem using the idea of the active contour model [M. Kass, A. Witkin, D. Terzopoulos, Snakes: active contour models, International Journal of Computer Vision 1 (1988) 321; V. Caselles, R. Kimmel, G. Sapiro, Geodesic active contours, International Journal of Computer Vision 22 (1997) 61; G. Sapiro, Geometric Partial Differential Equations and Image Analysis, Cambridge University Press, 2001; R. Kimmel, Numerical Geometry of Images: Theory, Algorithms, and Applications, Springer-Verlag, 2003] introduced in the field of image processing.
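
    The core difficulty described here is easy to reproduce: when two level-set functions share one grid and evolve independently, the phases they delimit can overlap. The sketch below uses toy geometry and a naive repair step (not the paper's variational correction) to illustrate the representation:

```python
import numpy as np

# Two level-set functions on one grid distinguish three phases:
# solid (phi_s < 0), liquid (phi_l < 0 outside the solid), gas (elsewhere).
n = 128
y, x = np.mgrid[0:n, 0:n] / (n - 1.0)

# Signed distance to a circular solid object and to a liquid drop.
phi_s = np.hypot(x - 0.3, y - 0.5) - 0.15   # negative inside solid
phi_l = np.hypot(x - 0.6, y - 0.5) - 0.22   # negative inside liquid

# Independent evolution can make the functions disagree: here the
# drop overlaps the solid, so some cells claim to be both phases.
overlap = (phi_s < 0) & (phi_l < 0)
print("overlapping cells before repair:", int(overlap.sum()))

# A crude repair step (not the paper's method): push the liquid
# interface out of the solid region.
phi_l = np.maximum(phi_l, -phi_s)
print("overlapping cells after repair:",
      int(((phi_s < 0) & (phi_l < 0)).sum()))
```

    The repair above is the simplest possible projection; the paper's contribution is a principled, variational way to keep the functions consistent during evolution.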

  15. Cooperative fuzzy games approach to setting target levels of ECs in quality function deployment.

    PubMed

    Yang, Zhihui; Chen, Yizeng; Yin, Yunqiang

    2014-01-01

    Quality function deployment (QFD) can provide a means of translating customer requirements (CRs) into engineering characteristics (ECs) for each stage of product development and production. The main objective of QFD-based product planning is to determine the target levels of ECs for a new product or service. QFD is a breakthrough tool which can effectively reduce the gap between CRs and a new product/service. Even though there are conflicts among some ECs, the objective of developing a new product is to maximize the overall customer satisfaction. Therefore, there may be room for cooperation among ECs. A cooperative game framework combined with fuzzy set theory is developed to determine the target levels of the ECs in QFD. The key to developing the model is the formulation of the bargaining function. In the proposed methodology, the players are viewed as the membership functions of ECs to formulate the bargaining function. The solution for the proposed model is Pareto-optimal. An illustrative example is presented to demonstrate the application and performance of the proposed approach. PMID:25097884
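
    The bargaining idea can be sketched in a few lines. In this illustrative toy (two ECs with linear membership functions, which are assumptions rather than the paper's formulation), the cooperative solution maximizes the product of the memberships, a Nash-bargaining-style objective whose maximizer is Pareto-optimal:

```python
import numpy as np

# Each EC's satisfaction with a target level x in [0, 1] is a fuzzy
# membership function; conflicting ECs pull x in opposite directions.
def mu_cost(x):      # cost-driven EC: prefers a low target level
    return np.clip(1.0 - x, 0.0, 1.0)

def mu_perf(x):      # performance-driven EC: prefers a high target level
    return np.clip(x, 0.0, 1.0)

# Cooperative (bargaining) objective: product of memberships.
x = np.linspace(0.0, 1.0, 1001)
bargain = mu_cost(x) * mu_perf(x)
x_star = x[np.argmax(bargain)]
print("compromise target level:", round(float(x_star), 3))  # prints 0.5
```

    With symmetric linear memberships the compromise lands at 0.5; asymmetric memberships would shift the target toward the EC whose satisfaction falls off faster.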

  17. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    SciTech Connect

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-09-30

    We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy, experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters, and pore descriptors. On average PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. DFT-predicted elastic properties such as the minimum shear modulus and Young's modulus can differ by averages of 3 and 9 GPa, respectively, for rigid MOFs such as those in the test set. Moreover, we found that the partial charges calculated by vdW-DF2 deviate the most from those of the other functionals, while there is no significant difference among the partial charges calculated by M06L, PBE, PW91, PBE-D2, and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.

  18. Segmentation of neonatal brain MR images using patch-driven level sets.

    PubMed

    Wang, Li; Shi, Feng; Li, Gang; Gao, Yaozong; Lin, Weili; Gilmore, John H; Shen, Dinggang

    2014-01-01

    The segmentation of neonatal brain MR images into white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF) is challenging due to the low spatial resolution, severe partial volume effect, high image noise, and dynamic myelination and maturation processes. Atlas-based methods have been widely used for guiding neonatal brain segmentation. Existing brain atlases were generally constructed by equally averaging all the aligned template images from a population. However, such population-based atlases might not be representative of a testing subject in the regions with high inter-subject variability and thus often lead to a low capability in guiding segmentation in those regions. Recently, patch-based sparse representation techniques have been proposed to effectively select the most relevant elements from a large group of candidates, which can be used to generate a subject-specific representation with rich local anatomical details for guiding the segmentation. Accordingly, in this paper, we propose a novel patch-driven level set method for the segmentation of neonatal brain MR images by taking advantage of sparse representation techniques. Specifically, we first build a subject-specific atlas from a library of aligned, manually segmented images by using sparse representation in a patch-based fashion. Then, the spatial consistency in the probability maps from the subject-specific atlas is further enforced by considering the similarities of a patch with its neighboring patches. Finally, the probability maps are integrated into a coupled level set framework for more accurate segmentation. The proposed method has been extensively evaluated on 20 training subjects using leave-one-out cross validation, and also on 132 additional testing subjects. Our method achieved a high accuracy of 0.919±0.008 for white matter and 0.901±0.005 for gray matter, measured by Dice ratio for the overlap between the automated and manual segmentations in the cortical region
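
    The patch-based sparse-representation step can be illustrated as follows. This is a toy sketch with random data, not the authors' pipeline: a target patch is decomposed over a library of labeled patches by a simple orthogonal matching pursuit, and the selected patches' labels are fused using the sparse weights to produce a tissue probability:

```python
import numpy as np

# Hypothetical library of 50 flattened 5x5 patches with toy tissue labels.
rng = np.random.default_rng(0)
library = rng.normal(size=(50, 25))          # rows are library patches
labels = rng.integers(0, 2, size=50)         # 0 = GM, 1 = WM (toy labels)
target = 0.7 * library[3] + 0.3 * library[17]

def omp(D, y, k=3):
    """Orthogonal matching pursuit: greedily pick k library patches."""
    idx, residual = [], y.copy()
    for _ in range(k):
        scores = np.abs(D @ residual)        # correlation with residual
        scores[idx] = -np.inf                # never re-pick a patch
        idx.append(int(np.argmax(scores)))
        coef, *_ = np.linalg.lstsq(D[idx].T, y, rcond=None)
        residual = y - D[idx].T @ coef
    return idx, coef

idx, coef = omp(library, target, k=2)
# Fuse the selected patches' labels with the positive sparse weights.
w = np.clip(coef, 0.0, None)
prob_wm = float(w @ labels[idx] / w.sum())
print("selected patches:", sorted(idx), "WM probability:", round(prob_wm, 2))
```

    In the paper these patch-wise probabilities form subject-specific probability maps that are then smoothed across neighboring patches and fed into the coupled level sets.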

  19. Multi-scale texture-based level-set segmentation of breast B-mode images.

    PubMed

    Lang, Itai; Sklair-Levy, Miri; Spitzer, Hedva

    2016-05-01

    Automatic segmentation of ultrasonographic breast lesions is very challenging, due to the lesions' spiculated nature and the variance in shape and texture of the B-mode ultrasound images. Many studies have tried to answer this challenge by applying a variety of computational methods including: Markov random field, artificial neural networks, and active contours and level-set techniques. These studies focused on creating an automatic contour, with maximal resemblance to a manual contour, delineated by a trained radiologist. In this study, we have developed an algorithm, designed to capture the spiculated boundary of the lesion by using the properties from the corresponding ultrasonic image. This is primarily achieved through a unique multi-scale texture identifier (inspired by visual system models) integrated in a level-set framework. The algorithm's performance has been evaluated quantitatively via contour-based and region-based error metrics. We compared the algorithm-generated contour to a manual contour delineated by an expert radiologist. In addition, we suggest here a new method for performance evaluation where corrections made by the radiologist replace the algorithm-generated (original) result in the correction zones. The resulting corrected contour is then compared to the original version. The evaluation showed: (1) Mean absolute error of 0.5 pixels between the original and the corrected contour; (2) Overlapping area of 99.2% between the lesion regions, obtained by the algorithm and the corrected contour. These results are significantly better than those previously reported. In addition, we have examined the potential of our segmentation results to contribute to the discrimination between malignant and benign lesions. PMID:27010737
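
    The two families of error metrics mentioned (contour-based and region-based) are straightforward to compute. The sketch below uses toy binary masks and standard definitions (Dice overlap and mean absolute contour distance), which may differ in detail from the study's exact metrics:

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

# Toy "automatic" and "manual" lesion masks: two circles offset by 2 px.
n = 100
y, x = np.mgrid[0:n, 0:n]
auto = (x - 50) ** 2 + (y - 50) ** 2 <= 30 ** 2
ref = (x - 52) ** 2 + (y - 50) ** 2 <= 30 ** 2

# Region-based metric: Dice overlap between the two lesion regions.
dice = 2.0 * (auto & ref).sum() / (auto.sum() + ref.sum())

# Contour-based metric: mean absolute distance from each automatic
# contour pixel to the nearest reference contour pixel.
contour_a = auto & ~binary_erosion(auto)
contour_r = ref & ~binary_erosion(ref)
dist_to_r = distance_transform_edt(~contour_r)  # 0 on the ref contour
mad = float(dist_to_r[contour_a].mean())

print("Dice overlap:", round(float(dice), 3), "mean abs. distance (px):", round(mad, 2))
```

    A 2-pixel misalignment of a 30-pixel-radius lesion still yields a Dice overlap above 0.9, which is why contour-based distances are reported alongside region overlap.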

  20. Novel multimodality segmentation using level sets and Jensen-Rényi divergence

    SciTech Connect

    Markel, Daniel; Zaidi, Habib; El Naqa, Issam

    2013-12-15

    Purpose: Positron emission tomography (PET) is playing an increasing role in radiotherapy treatment planning. However, despite progress, robust algorithms for PET and multimodal image segmentation are still lacking, especially if the algorithm were extended to image-guided and adaptive radiotherapy (IGART). This work presents a novel multimodality segmentation algorithm using the Jensen-Rényi divergence (JRD) to evolve the geometric level set contour. The algorithm offers improved noise tolerance which is particularly applicable to segmentation of regions found in PET and cone-beam computed tomography. Methods: A steepest gradient ascent optimization method is used in conjunction with the JRD and a level set active contour to iteratively evolve a contour to partition an image based on statistical divergence of the intensity histograms. The algorithm is evaluated using PET scans of pharyngolaryngeal squamous cell carcinoma with the corresponding histological reference. The multimodality extension of the algorithm is evaluated using 22 PET/CT scans of patients with lung carcinoma and a physical phantom scanned under varying image quality conditions. Results: The average concordance index (CI) of the JRD segmentation of the PET images was 0.56 with an average classification error of 65%. The segmentation of the lung carcinoma images had a maximum diameter relative error of 63%, 19.5%, and 14.8% when using CT, PET, and combined PET/CT images, respectively. The estimated maximal diameters of the gross tumor volume (GTV) showed a high correlation with the macroscopically determined maximal diameters, with an R^2 value of 0.85 and 0.88 using the PET and PET/CT images, respectively. Results from the physical phantom show that the JRD is more robust to image noise compared to mutual information and region growing. Conclusions: The JRD has shown improved noise tolerance compared to mutual information for the purpose of PET image segmentation. Presented is a flexible
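
    For readers unfamiliar with the JRD, a minimal sketch on two intensity histograms follows. The order alpha = 2 and the equal weights are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

def renyi_entropy(p, alpha=2.0):
    """Renyi entropy of a histogram (normalized internally)."""
    p = p / p.sum()
    return np.log((p ** alpha).sum()) / (1.0 - alpha)

def jensen_renyi(p, q, alpha=2.0, w=0.5):
    """JRD: entropy of the weighted mixture minus the mixed entropies."""
    mix = w * p / p.sum() + (1 - w) * q / q.sum()
    return (renyi_entropy(mix, alpha)
            - w * renyi_entropy(p, alpha)
            - (1 - w) * renyi_entropy(q, alpha))

inside = np.array([80.0, 15.0, 5.0])    # e.g., histogram inside the contour
outside = np.array([5.0, 15.0, 80.0])   # histogram outside the contour
print("JRD, distinct regions:", round(jensen_renyi(inside, outside), 3))
print("JRD, identical regions:", round(jensen_renyi(inside, inside), 3))
```

    The divergence is zero when the two histograms coincide and grows as they separate, which is why driving the contour by gradient ascent on the JRD partitions the image into statistically distinct regions.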

  1. Validation of the Visitor and Resident Framework in an E-Book Setting

    ERIC Educational Resources Information Center

    Engelsmann, Hazel C.; Greifeneder, Elke; Lauridsen, Nikoline D.; Nielsen, Anja G.

    2014-01-01

    Introduction: By applying the visitor and resident framework on e-book usage, the article explores whether the concepts of a resident and a visitor can help to explain e-book use, and can help to gain a better insight into users' motivations for e-book use. Method: A questionnaire and semi-structured interviews were conducted with users of…

  2. An explanatory framework of teachers' perceptions of a positive mealtime environment in a preschool setting.

    PubMed

    Mita, Satoko C; Gray, Samuel A; Goodell, L Suzanne

    2015-07-01

    Attending a preschool center may help preschoolers with growth and development that encourage a healthy lifestyle, including sound eating behaviors. Providing a positive mealtime environment (PME) may be one of the keys to fostering a child's healthy eating habits in the classroom. However, a specific definition of a PME, the components of a PME, or directions on how to create one have not been established. The purpose of this study, therefore, was to explore Head Start teachers' perceptions related to a PME and create a conceptual framework representing these perceptions. To achieve this purpose, researchers conducted 65 in-depth phone interviews with Head Start teachers around the US. Applying principles of grounded theory, researchers developed a conceptual framework depicting teachers' perceptions of PME, consisting of five key components: (1) the people (i.e., teachers, kitchen staff, parent volunteers, and children), (2) positive emotional tone (e.g., relaxed and happy), (3) rules, expectations, and routines (e.g., family-style mealtime), (4) operations of a PME (i.e., eating, socialization, and learning), and (5) both short- and long-term outcomes of a PME. With this PME framework, researchers may be able to enhance the effectiveness of nutrition interventions related to a PME, focusing on the factors in the conceptual framework as well as barriers associated with achieving these factors. PMID:25728886

  3. Developing Individualized Education Programs for Children in Inclusive Settings: A Developmentally Appropriate Framework.

    ERIC Educational Resources Information Center

    Edmiaston, Rebecca; Dolezal, Val; Doolittle, Sharon; Erickson, Carol; Merritt, Sandy

    2000-01-01

    Presents a developmentally appropriate framework reflecting the constructivist orientation of early childhood education to guide development of IEP goals and objectives for young children with disabilities. Discusses problems teachers encounter with IEPs, including defining skills too narrowly, not considering the time factor, and isolating the…

  4. A conceptual framework for organizational readiness to implement nutrition and physical activity programs in early childhood education settings.

    PubMed

    Sharma, Shreela V; Upadhyaya, Mudita; Schober, Daniel J; Byrd-Williams, Courtney

    2014-01-01

    Across multiple sectors, organizational readiness predicts the success of program implementation. However, the factors influencing the readiness of early childhood education (ECE) organizations to implement new nutrition and physical activity programs are poorly understood. This study presents a new conceptual framework to measure organizational readiness to implement nutrition and physical activity programs in ECE centers serving children aged 0 to 5 years. The framework was validated for consensus on relevance and generalizability by conducting focus groups; the participants were managers (16 directors and 2 assistant directors) of ECE centers. The framework theorizes that it is necessary to have "collective readiness," which takes into account such factors as resources, organizational operations, work culture, and the collective attitudes, motivation, beliefs, and intentions of ECE staff. Results of the focus groups demonstrated consensus on the relevance of proposed constructs across ECE settings. Including readiness measures during program planning and evaluation could inform implementation of ECE programs targeting nutrition and physical activity behaviors. PMID:25357258

  5. Cell segmentation using coupled level sets and graph-vertex coloring.

    PubMed

    Nath, Sumit K; Palaniappan, Kannappan; Bunyak, Filiz

    2006-01-01

    Current level-set based approaches for segmenting a large number of objects are computationally expensive since they require a unique level set per object (the N-level set paradigm), or ⌈log2 N⌉ level sets when using a multiphase interface tracking formulation. Incorporating energy-based coupling constraints to control the topological interactions between level sets further increases the computational cost to O(N^2). We propose a new approach, with dramatic computational savings, that requires only four, or fewer, level sets for an arbitrary number of similar objects (like cells) using the Delaunay graph to capture spatial relationships. Even more significantly, the coupling constraints (energy-based and topological) are incorporated using just constant O(1) complexity. The explicit topological coupling constraint, based on predicting contour collisions between adjacent level sets, is developed to further prevent false merging or absorption of neighboring cells, and also reduce fragmentation during level set evolution. The proposed four-color level set algorithm is used to efficiently and accurately segment hundreds of individual epithelial cells within a moving monolayer sheet from time-lapse images of in vitro wound healing without any false merging of cells. PMID:17354879
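
    The four-color idea can be sketched with a Delaunay graph over toy cell centroids and a coloring pass: cells that touch get different colors, and each color class shares one level-set function. Note that the paper's scheme guarantees four level sets; the plain greedy coloring below (chosen for brevity) may occasionally need a few more colors:

```python
import numpy as np
from scipy.spatial import Delaunay

# Toy cell centroids; in the paper these come from detected cells.
rng = np.random.default_rng(1)
pts = rng.random((60, 2))
tri = Delaunay(pts)

# Build the adjacency graph from triangle edges.
adj = {i: set() for i in range(len(pts))}
for a, b, c in tri.simplices:
    for u, v in ((a, b), (b, c), (a, c)):
        adj[u].add(v)
        adj[v].add(u)

# Greedy coloring: give each vertex the smallest color unused by its
# already-colored neighbors (highest-degree vertices first).
color = {}
for v in sorted(adj, key=lambda v: -len(adj[v])):
    used = {color[u] for u in adj[v] if u in color}
    color[v] = next(k for k in range(len(pts)) if k not in used)

n_colors = max(color.values()) + 1
print("cells:", len(pts), "level sets needed:", n_colors)
```

    Because neighboring cells never share a color, contours in the same level-set function cannot collide, which is what lets a handful of level sets segment hundreds of cells.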

  6. A novel level set model with automated initialization and controlling parameters for medical image segmentation.

    PubMed

    Liu, Qingyi; Jiang, Mingyan; Bai, Peirui; Yang, Guang

    2016-03-01

    In this paper, a level set model is proposed for medical image segmentation that requires neither manually generating an initial contour nor manually setting controlling parameters. The contribution of this paper is mainly manifested in three points. First, we propose a novel adaptive mean shift clustering method based on global image information to guide the evolution of the level set. By simple threshold processing, the results of mean shift clustering can automatically and speedily generate an initial contour for the level set evolution. Second, we devise several new functions to estimate the controlling parameters of the level set evolution based on the clustering results and image characteristics. Third, the reaction diffusion method is adopted to supersede the distance regularization term of the RSF level set model, which can improve the accuracy and speed of segmentation effectively with less manual intervention. Experimental results demonstrate the performance and efficiency of the proposed model for medical image segmentation. PMID:26748038
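
    The automated-initialization step can be illustrated as follows, with a plain global intensity threshold standing in for the paper's adaptive mean shift clustering: a rough foreground mask is converted into a signed distance function, which serves directly as the initial level set with no manual seeding:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt, gaussian_filter

# Synthetic "medical" image: noisy background plus one bright structure.
rng = np.random.default_rng(0)
img = rng.normal(0.2, 0.05, (128, 128))
yy, xx = np.mgrid[0:128, 0:128]
img[(xx - 64) ** 2 + (yy - 64) ** 2 < 20 ** 2] += 0.6
img = gaussian_filter(img, 2)

# Crude stand-in for the clustering + thresholding step.
mask = img > img.mean() + img.std()

# Signed distance function: negative inside the detected object.
phi0 = distance_transform_edt(~mask) - distance_transform_edt(mask)

inside = int((phi0 < 0).sum())
print("initial-contour interior pixels:", inside)
```

    The zero level set of phi0 already hugs the bright structure, so the subsequent evolution only has to refine the boundary rather than find it from scratch.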

  7. Screening Systems and Decision Making at the Preschool Level: Application of a Comprehensive Validity Framework

    ERIC Educational Resources Information Center

    Kettler, Ryan J.; Feeney-Kettler, Kelly A.

    2011-01-01

    Universal screening is designed to be an efficient method for identifying preschool students with mental health problems, but prior to use, screening systems must be evaluated to determine their appropriateness within a specific setting. In this article, an evidence-based validity framework is applied to four screening systems for identifying…

  8. Conceptual Framework and Levels of Abstraction for a Complex Large-Scale System

    SciTech Connect

    Simpson, Mary J.

    2005-03-23

    A conceptual framework and levels of abstraction are created to apply across all potential threats. Bioterrorism is used as a complex example to describe the general framework. Bioterrorism is unlimited with respect to the use of a specific agent, mode of dissemination, and potential target. Because the threat is open-ended, there is a strong need for a common, systemic understanding of attack scenarios related to bioterrorism. In recognition of this large-scale complex problem, systems are being created to define, design and use the proper level of abstraction and conceptual framework in bioterrorism. The wide variety of biological agents and delivery mechanisms provide an opportunity for dynamic scale changes by the linking or interlinking of existing threat components. Concurrent impacts must be separated and evaluated in terms of a given environment and/or ‘abstraction framework.’

  9. Multireference Level Set for the Characterization of Nuclear Morphology in Glioblastoma Multiforme

    PubMed Central

    Han, Ju; Spellman, Paul T.

    2013-01-01

    Histological tissue sections provide rich information and continue to be the gold standard for the assessment of tissue neoplasm. However, there are a significant amount of technical and biological variations that impede analysis of large histological datasets. In this paper, we have proposed a novel approach for nuclear segmentation in tumor histology sections, which addresses the problem of technical and biological variations by incorporating information from both manually annotated reference patches and the original image. Subsequently, the solution is formulated within a multireference level set framework. This approach has been validated on manually annotated samples and then applied to the TCGA glioblastoma multiforme (GBM) dataset consisting of 440 whole mount tissue sections scanned with either a 20× or 40× objective, in which, each tissue section varies in size from 40k × 40k pixels to 100k × 100k pixels. Experimental results show a superior performance of the proposed method in comparison with present state of art techniques. PMID:22987497

  10. Alternative Frameworks of the Secondary School Students on the Concept of Condensation at Submicroscopic Level

    ERIC Educational Resources Information Center

    Abdullah, Nurdiana; Surif, Johari; Ismail, Syuhaida

    2016-01-01

    The study was carried out to identify the alternative frameworks on the concept of condensation at submicroscopic level among secondary school students (N = 324). Data was collected by using the qualitative method through the Understanding Test on the Concept of Matter at Submicroscopic Level which consisted of 10 open-ended questions. The…

  11. The Agenda Setting Function of the Mass Media at Three Levels of "Information Holding"

    ERIC Educational Resources Information Center

    Benton, Marc; Frazier, P. Jean

    1976-01-01

    Extends the theoretical concept of agenda setting to include awareness of general issues, awareness of proposed solutions, and specific knowledge about the proposals. Examines whether or not agenda setting is operative at these levels and compares findings with previous agenda setting studies. (MH)

  12. A new framework for intrusion detection based on rough set theory

    NASA Astrophysics Data System (ADS)

    Li, Zhijun; Wu, Yu; Wang, Guoyin; Hai, Yongjun; He, Yunpeng

    2004-04-01

    Intrusion detection is an essential component of critical infrastructure protection mechanisms. Since many current IDSs are constructed by manually encoding expert knowledge, updating their knowledge is time-consuming. To solve this problem, an effective method for misuse intrusion detection with low cost and high efficiency is presented. This paper gives an overview of our research in building a detection model for identifying known intrusions, their variations, and novel attacks of an unknown nature. The method is based on rough set theory and is capable of extracting a set of detection rules from network packet features. After obtaining a decision table by preprocessing raw packet data, rough-set-based reduction and rule generation algorithms are applied to derive useful rules for intrusion detection. In addition, a rough-set and rule-tree-based incremental knowledge acquisition algorithm is presented to solve the problem of updating the rule set when new attacks appear. Compared with other methods, ours requires a smaller training data set and less effort to collect training data. Experimental results demonstrate that our system is effective and more suitable for online intrusion detection.
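
    The rough-set machinery this abstract relies on (equivalence classes under condition attributes, and lower/upper approximations of a decision class) can be sketched on a toy decision table. This is a generic illustration, not the paper's algorithm; the attribute names and rows below are hypothetical.

```python
from collections import defaultdict

def rough_approximations(rows, condition_attrs, decision_attr, target_class):
    """Lower/upper approximations of a decision class under the
    indiscernibility relation induced by the condition attributes."""
    # Group objects into equivalence classes by condition-attribute values.
    blocks = defaultdict(set)
    for i, row in enumerate(rows):
        blocks[tuple(row[a] for a in condition_attrs)].add(i)
    target = {i for i, row in enumerate(rows) if row[decision_attr] == target_class}
    lower, upper = set(), set()
    for block in blocks.values():
        if block <= target:   # block lies entirely inside the class
            lower |= block
        if block & target:    # block overlaps the class
            upper |= block
    return lower, upper

# Toy "packet" table: two condition attributes and an attack/normal decision.
table = [
    {"proto": "tcp", "flag": "S0", "label": "attack"},
    {"proto": "tcp", "flag": "S0", "label": "attack"},
    {"proto": "tcp", "flag": "SF", "label": "normal"},
    {"proto": "udp", "flag": "SF", "label": "attack"},
    {"proto": "udp", "flag": "SF", "label": "normal"},
]
lo, up = rough_approximations(table, ["proto", "flag"], "label", "attack")
print(sorted(lo), sorted(up))
```

    Objects in the lower approximation yield certain detection rules; objects only in the upper approximation are merely possible members of the attack class, which is where the reduction and rule-generation stages do their work.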

  13. 76 FR 9004 - Public Comment on Setting Achievement Levels in Writing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-16

    ...The National Assessment Governing Board (Governing Board) is soliciting public comments and recommendations to improve the design proposed for setting achievement levels for NAEP in writing. This notice provides opportunity for public comment and submitting recommendations for improving the design proposed for setting achievement levels for the 2011 National Assessment of Educational Progress......

  14. Intervention complexity--a conceptual framework to inform priority-setting in health.

    PubMed Central

    Gericke, Christian A.; Kurowski, Christoph; Ranson, M. Kent; Mills, Anne

    2005-01-01

    Health interventions vary substantially in the degree of effort required to implement them. To some extent this is apparent in their financial cost, but the nature and availability of non-financial resources is often of similar importance. In particular, human resource requirements are frequently a major constraint. We propose a conceptual framework for the analysis of interventions according to their degree of technical complexity; this complements the notion of institutional capacity in considering the feasibility of implementing an intervention. Interventions are categorized into four dimensions: characteristics of the basic intervention; characteristics of delivery; requirements on government capacity; and usage characteristics. The analysis of intervention complexity should lead to a better understanding of supply- and demand-side constraints to scaling up, indicate priorities for further research and development, and can point to potential areas for improvement of specific aspects of each intervention to close the gap between the complexity of an intervention and the capacity to implement it. The framework is illustrated using the examples of scaling up condom social marketing programmes, and the DOTS strategy for tuberculosis control in highly resource-constrained countries. The framework could be used as a tool for policy-makers, planners and programme managers when considering the expansion of existing projects or the introduction of new interventions. Intervention complexity thus complements the considerations of burden of disease, cost-effectiveness, affordability and political feasibility in health policy decision-making. Reducing the technical complexity of interventions will be crucial to meeting the health-related Millennium Development Goals. PMID:15868020

  15. A level set approach for left ventricle detection in CT images using shape segmentation and optical flow

    NASA Astrophysics Data System (ADS)

    Brieva, Jorge; Moya-Albor, Ernesto; Escalante-Ramírez, Boris

    2015-01-01

    Segmentation of the left ventricle (LV) plays an important role in the subsequent functional analysis of the LV. Typical segmentation of the endocardial wall excludes the papillary muscles, which leads to an incorrect measure of the ejected volume of the LV. In this paper we present a new variational strategy using a 2D level set framework that includes a local term for enhancing low-contrast structures and a 2D shape model. The shape model in the level set method is propagated to all images in the sequence corresponding to the cardiac cycle through an optical flow approach using the Hermite transform. To evaluate our strategy we use the Dice index and the Hausdorff distance to compare the segmentation results with the manual segmentation carried out by the physician.
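
    A basic building block of variational level set schemes like the one described here is explicit evolution of an embedding function. The sketch below runs plain mean-curvature flow on a synthetic circle; it deliberately omits the paper's local contrast term, shape model, and optical flow.

```python
import numpy as np

def curvature_flow_step(phi, dt=0.1, eps=1e-8):
    """One explicit step of mean-curvature motion of the zero level set."""
    gy, gx = np.gradient(phi)
    norm = np.sqrt(gx**2 + gy**2) + eps
    nx, ny = gx / norm, gy / norm
    # Curvature is the divergence of the unit normal field.
    kappa = np.gradient(nx, axis=1) + np.gradient(ny, axis=0)
    return phi + dt * kappa * norm

# Signed distance to a circle of radius 20; curvature flow shrinks it.
y, x = np.mgrid[0:64, 0:64]
phi = np.sqrt((x - 32.0)**2 + (y - 32.0)**2) - 20.0
area0 = int((phi < 0).sum())
for _ in range(50):
    phi = curvature_flow_step(phi)
area1 = int((phi < 0).sum())
print(area0, area1)  # the enclosed region has contracted
```

    Real segmentation adds data-driven terms to this geometric one, so the contour stops on image structures instead of collapsing.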

  16. Joint Target Tracking, Recognition and Segmentation for Infrared Imagery Using a Shape Manifold-Based Level Set

    PubMed Central

    Gong, Jiulu; Fan, Guoliang; Yu, Liangjiang; Havlicek, Joseph P.; Chen, Derong; Fan, Ningjun

    2014-01-01

    We propose a new integrated target tracking, recognition and segmentation algorithm, called ATR-Seg, for infrared imagery. ATR-Seg is formulated in a probabilistic shape-aware level set framework that incorporates a joint view-identity manifold (JVIM) for target shape modeling. As a shape generative model, JVIM features a unified manifold structure in the latent space that is embedded with one view-independent identity manifold and infinite identity-dependent view manifolds. In the ATR-Seg algorithm, the ATR problem is formulated as a sequential level-set optimization process over the latent space of JVIM, so that tracking and recognition can be jointly optimized via implicit shape matching, where target segmentation is achieved as a by-product without any pre-processing or feature extraction. Experimental results on the recently released SENSIAC ATR database demonstrate the advantages and effectiveness of ATR-Seg over two recent ATR algorithms that involve explicit shape matching. PMID:24919014

  17. An improved variational level set method for MR image segmentation and bias field correction.

    PubMed

    Zhan, Tianming; Zhang, Jun; Xiao, Liang; Chen, Yunjie; Wei, Zhihui

    2013-04-01

    In this paper, we propose an improved variational level set approach to correct the bias and to segment the magnetic resonance (MR) images with inhomogeneous intensity. First, we use a Gaussian distribution with bias field as a local region descriptor in two-phase level set formulation for segmentation and bias field correction of the images with inhomogeneous intensities. By using the information of the local variance in this descriptor, our method is able to obtain accurate segmentation results. Furthermore, we extend this method to three-phase level set formulation for brain MR image segmentation and bias field correction. By using this three-phase level set function to replace the four-phase level set function, we can reduce the number of convolution operations in each iteration and improve the efficiency. Compared with other approaches, this algorithm demonstrates a superior performance. PMID:23219273
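
    The two-phase piecewise-constant idea underlying formulations like this one can be illustrated without the bias field or the local variance descriptor. The sketch below evolves a level set with only the classic region-competition data term; it is a simplification for illustration, not the authors' model.

```python
import numpy as np

def two_phase_evolve(img, phi, steps=200, dt=0.2):
    """Piecewise-constant two-phase evolution: the data term of the
    Chan-Vese model, without the curvature regulariser or bias field."""
    for _ in range(steps):
        inside = phi < 0
        c1 = img[inside].mean() if inside.any() else 0.0
        c2 = img[~inside].mean() if (~inside).any() else 0.0
        # Pixels closer to the inside mean c1 are pushed to negative phi.
        phi = phi + dt * ((img - c1)**2 - (img - c2)**2)
    return phi

# Bright disk on a dark background, segmented from a box initialisation.
y, x = np.mgrid[0:64, 0:64]
disk = (x - 32)**2 + (y - 32)**2 < 15**2
img = np.where(disk, 1.0, 0.0)
phi0 = np.where((np.abs(x - 32) < 25) & (np.abs(y - 32) < 25), -1.0, 1.0)
phi = two_phase_evolve(img, phi0)
seg = phi < 0
iou = (seg & disk).sum() / (seg | disk).sum()
print(round(float(iou), 3))
```

    The bias-field extension in the paper replaces the global means c1, c2 with locally varying ones, which is what makes the method robust to MR intensity inhomogeneity.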

  18. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    DOE PAGES Beta

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-09-30

    We compiled a test set of chemically and topologically diverse Metal-Organic Frameworks (MOFs) with high-accuracy, experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters and pore descriptors. On average PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. DFT-predicted elastic properties such as the minimum shear modulus and Young's modulus can differ by an average of 3 and 9 GPa, respectively, for rigid MOFs such as those in the test set. Moreover, we found that the partial charges calculated by vdW-DF2 deviate the most from the other functionals, while there is no significant difference between the partial charges calculated by M06L, PBE, PW91, PBE-D2 and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.

  19. Education leadership in the clinical health care setting: A framework for nursing education development.

    PubMed

    Mockett, Lynda; Horsfall, Janine; O'Callaghan, Wendy

    2006-12-01

    This paper describes how a new framework for clinical nursing education was introduced at Counties Manukau District Health Board (CMDHB), New Zealand. The project was initiated in response to the significant legislative and post registration nursing education changes within New Zealand. The journey of change has been a significant undertaking, and has required clear management, strong leadership, perseverance and understanding of the organisation's culture. The approach taken to managing the change had four stages, and reflects various change management models. The first stage, the identification process, identified the impetus for change. Creating the vision is the second stage and identified what the change would look like within the organisation. To ensure success and to guide the process of change a realistic and sustainable vision was developed. Implementing the vision was the third stage, and discusses the communication and pilot phase of implementing the nursing education framework. Stage four, embedding the vision, explores the process and experiences of changing an education culture and embedding the vision into an organisation. The paper concludes by discussing the importance of implementing robust, consistent, strategic and collaborative processes - that reflect and evaluate best educational nursing practice. PMID:19040908

  1. Holocene sea level variations on the basis of integration of independent data sets

    SciTech Connect

    Sahagian, D.; Berkman, P. . Dept. of Geological Sciences and Byrd Polar Research Center)

    1992-01-01

    Variations in sea level through Earth history have occurred at a wide variety of time scales. Sea level researchers have attacked the problem of measuring these changes through a variety of approaches, each relevant only to the time scale in question, and usually only to the specific locality from which a specific type of data is derived. There is a plethora of data types that can be, and have been, used locally for the measurement of Holocene sea level variations. The problem of merging different data sets for the purpose of constructing a global eustatic sea level curve for the Holocene has not previously been adequately addressed; the authors direct their efforts to that end. Numerous studies have been published regarding Holocene sea level changes. These have involved exposed fossil reef elevations, elevations of tidal deltas, depths of intertidal peat deposits, caves, tree rings, ice cores, moraines, eolian dune ridges, marine-cut terrace elevations, marine carbonate species, tide gauges, and lake level variations. Each of these data sets is based on a particular set of assumptions and is valid for a specific set of environments. In order to obtain the most accurate possible sea level curve for the Holocene, these data sets must be merged so that local and other influences can be filtered out of each data set. Since each data set involves very different measurements, each is scaled in order to define the sensitivity of the proxy measurement parameter to sea level, including error bounds. This effectively determines the temporal and spatial resolution of each data set. The level of independence of the data sets is also quantified, in order to rule out the possibility of a common non-eustatic factor affecting more than one variety of data. The resulting Holocene sea level curve is considered to be independent of other factors affecting the proxy data, and is taken to represent the relation between global ocean water and basin volumes.
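
    One standard way to merge proxy estimates that carry different error bounds, in the spirit of the scaling and merging described above, is inverse-variance weighting. The readings below are hypothetical numbers, not data from the study.

```python
import numpy as np

def merge_proxies(estimates, sigmas):
    """Inverse-variance weighted combination of independent proxy
    estimates of sea level at one time slice, with the combined error."""
    est = np.asarray(estimates, dtype=float)
    w = 1.0 / np.asarray(sigmas, dtype=float)**2   # weight = 1 / sigma^2
    combined = (w * est).sum() / w.sum()
    combined_sigma = np.sqrt(1.0 / w.sum())
    return float(combined), float(combined_sigma)

# Hypothetical proxy readings (metres relative to present) and 1-sigma errors,
# e.g. a peat deposit, a fossil reef, and a terrace elevation.
level, err = merge_proxies([-2.1, -1.8, -2.4], [0.3, 0.5, 0.4])
print(round(level, 2), round(err, 2))
```

    The combined uncertainty is always smaller than the best single proxy's, which is the statistical payoff of integrating independent data sets.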

  2. Intellectual Curiosity in Action: A Framework to Assess First-Year Seminars in Liberal Arts Settings

    ERIC Educational Resources Information Center

    Kolb, Kenneth H.; Longest, Kyle C.; Barnett, Jenna C.

    2014-01-01

    Fostering students' intellectual curiosity is a common goal of first-year seminar programs--especially in liberal arts settings. The authors propose an alternative method to assess this ambiguous, value-laden concept. Relying on data gathered from pre- and posttest in-depth interviews of 34 students enrolled in first-year seminars, they…

  3. Alternative Dispute Resolution (ADR): A Different Framework for Conflict Resolution in Educational Settings.

    ERIC Educational Resources Information Center

    Turan, Selahattin; Taylor, Charles

    This paper briefly introduces alternative dispute resolution (ADR) processes and their fundamental principles. The paper provides a review of the literature on ADR and discusses its applicability in educational settings. The concept of conflict is explained, along with analysis of the limitations of traditional conflict resolution processes. The…

  4. Evidence-Based Standard Setting: Establishing a Validity Framework for Cut Scores

    ERIC Educational Resources Information Center

    McClarty, Katie Larsen; Way, Walter D.; Porter, Andrew C.; Beimers, Jennifer N.; Miles, Julie A.

    2013-01-01

    Performance standards are a powerful way to communicate K-12 student achievement (e.g., proficiency) and are the cornerstone of standards-based reform. As education reform shifts the focus to college and career readiness, approaches for setting performance standards need to be revised. We argue that the focus on assessing student readiness can…

  5. Commentary: A Response to Reckase's Conceptual Framework and Examples for Evaluating Standard Setting Methods

    ERIC Educational Resources Information Center

    Schulz, E. Matthew

    2006-01-01

    A look at real data shows that Reckase's psychometric theory for standard setting is not applicable to bookmark and that his simulations cannot explain actual differences between methods. It is suggested that exclusively test-centered, criterion-referenced approaches are too idealized and that a psychophysics paradigm and a theory of group…

  6. Translating evidence into practice: Hong Kong Reference Framework for Preventive Care for Children in Primary Care Settings.

    PubMed

    Siu, Natalie P Y; Too, L C; Tsang, Caroline S H; Young, Betty W Y

    2015-06-01

    There is increasing evidence that supports the close relationship between childhood and adult health. Fostering healthy growth and development of children deserves attention and effort. The Reference Framework for Preventive Care for Children in Primary Care Settings has been published by the Task Force on Conceptual Model and Preventive Protocols under the direction of the Working Group on Primary Care. It aims to promote health and prevent disease in children and is based on the latest research, and contributions of the Clinical Advisory Group that comprises primary care physicians, paediatricians, allied health professionals, and patient groups. This article highlights the comprehensive, continuing, and patient-centred preventive care for children and discusses how primary care physicians can incorporate the evidence-based recommendations into clinical practice. It is anticipated that the adoption of this framework will contribute to improved health and wellbeing of children. PMID:25999033

  7. Level set based vertebra segmentation for the evaluation of Ankylosing Spondylitis

    NASA Astrophysics Data System (ADS)

    Tan, Sovira; Yao, Jianhua; Ward, Michael M.; Yao, Lawrence; Summers, Ronald M.

    2006-03-01

    Ankylosing Spondylitis is a disease of the vertebra where abnormal bone structures (syndesmophytes) grow at intervertebral disk spaces. Because this growth is so slow as to be undetectable on plain radiographs taken over years, it is necessary to resort to computerized techniques to complement qualitative human judgment with precise quantitative measures on 3-D CT images. Very fine segmentation of the vertebral body is required to capture the small structures caused by the pathology. We propose a segmentation algorithm based on a cascade of three level set stages and requiring no training or prior knowledge. First, the noise inside the vertebral body that often blocks the proper evolution of level set surfaces is attenuated by a sigmoid function whose parameters are determined automatically. The 1st level set (geodesic active contour) is designed to roughly segment the interior of the vertebra despite often highly inhomogeneous and even discontinuous boundaries. The result is used as an initial contour for the 2nd level set (Laplacian level set) that closely captures the inner boundary of the cortical bone. The last level set (reversed Laplacian level set) segments the outer boundary of the cortical bone and also corrects small flaws of the previous stage. We carried out extensive tests on 30 vertebrae (5 from each of 6 patients). Two medical experts scored the results at intervertebral disk spaces focusing on end plates and syndesmophytes. Only two minor segmentation errors at vertebral end plates were reported and two syndesmophytes were considered slightly under-segmented.
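
    The sigmoid intensity attenuation mentioned above can be sketched as follows. The automatic parameter choice here (median for the centre, a scaled standard deviation for the width) is a plausible stand-in, not the authors' actual estimation procedure.

```python
import numpy as np

def sigmoid_attenuate(img, alpha=None, beta=None):
    """Sigmoid intensity remap used to suppress noise inside a structure
    before level set evolution; alpha/beta are estimated from the data
    when not given (a hypothetical stand-in for the automatic choice)."""
    img = np.asarray(img, dtype=float)
    if beta is None:
        beta = np.median(img)            # centre of the transition
    if alpha is None:
        alpha = img.std() / 4 or 1.0     # width of the transition
    return 1.0 / (1.0 + np.exp(-(img - beta) / alpha))

out = sigmoid_attenuate(np.array([0.0, 5.0, 10.0]))
print(out.round(3))
```

    Intensities well below the centre are squashed toward 0 and those above toward 1, flattening interior noise while keeping the boundary contrast the subsequent geodesic active contour needs.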

  8. A framework for outcome-level evaluation of in-service training of health care workers

    PubMed Central

    2013-01-01

    Background In-service training is a key strategic approach to addressing the severe shortage of health care workers in many countries. However, there is a lack of evidence linking these health care worker trainings to improved health outcomes. In response, the United States President’s Emergency Plan for AIDS Relief’s Human Resources for Health Technical Working Group initiated a project to develop an outcome-focused training evaluation framework. This paper presents the methods and results of that project. Methods A general inductive methodology was used for the conceptualization and development of the framework. Fifteen key informant interviews were conducted to explore contextual factors, perceived needs, barriers and facilitators affecting the evaluation of training outcomes. In addition, a thematic analysis of 70 published articles reporting health care worker training outcomes identified key themes and categories. These were integrated, synthesized and compared to several existing training evaluation models. This formed an overall typology which was used to draft a new framework. Finally, the framework was refined and validated through an iterative process of feedback, pilot testing and revision. Results The inductive process resulted in identification of themes and categories, as well as relationships among several levels and types of outcomes. The resulting framework includes nine distinct types of outcomes that can be evaluated, which are organized within three nested levels: individual, organizational and health system/population. The outcome types are: (1) individual knowledge, attitudes and skills; (2) individual performance; (3) individual patient health; (4) organizational systems; (5) organizational performance; (6) organizational-level patient health; (7) health systems; (8) population-level performance; and (9) population-level health. The framework also addresses contextual factors which may influence the outcomes of training, as well as the

  9. A framework for evaluating safety-net and other community-level factors on access for low-income populations.

    PubMed

    Davidson, Pamela L; Andersen, Ronald M; Wyn, Roberta; Brown, E Richard

    2004-01-01

    The framework presented in this article extends the Andersen behavioral model of health services utilization research to examine the effects of contextual determinants of access. A conceptual framework is suggested for selecting and constructing contextual (or community-level) variables representing the social, economic, structural, and public policy environment that influence low-income people's use of medical care. Contextual variables capture the characteristics of the population that disproportionately relies on the health care safety net, the public policy support for low-income and safety-net populations, and the structure of the health care market and safety-net services within that market. Until recently, the literature in this area has been largely qualitative and descriptive and few multivariate studies comprehensively investigated the contextual determinants of access. The comprehensive and systematic approach suggested by the framework will enable researchers to strengthen the external validity of results by accounting for the influence of a consistent set of contextual factors across locations and populations. A subsequent article in this issue of Inquiry applies the framework to examine access to ambulatory care for low-income adults, both insured and uninsured. PMID:15224958

  10. Level set discrete element method for three-dimensional computations with triaxial case study

    NASA Astrophysics Data System (ADS)

    Kawamoto, Reid; Andò, Edward; Viggiani, Gioacchino; Andrade, José E.

    2016-06-01

    In this paper, we outline the level set discrete element method (LS-DEM) which is a discrete element method variant able to simulate systems of particles with arbitrary shape using level set functions as a geometric basis. This unique formulation allows seamless interfacing with level set-based characterization methods as well as computational ease in contact calculations. We then apply LS-DEM to simulate two virtual triaxial specimens generated from XRCT images of experiments and demonstrate LS-DEM's ability to quantitatively capture and predict stress-strain and volume-strain behavior observed in the experiments.
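
    The signed-distance lookup at the heart of LS-DEM contact detection can be sketched in 2D: a boundary node of one particle is in contact wherever the other particle's interpolated level set value is negative. This is an illustrative fragment under simplified assumptions, not the LS-DEM implementation.

```python
import numpy as np

def bilinear(grid, p):
    """Bilinear interpolation of a 2-D level set grid at point p = (x, y)."""
    x, y = p
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    fx, fy = x - x0, y - y0
    g = grid[y0:y0 + 2, x0:x0 + 2]
    return (g[0, 0] * (1 - fx) * (1 - fy) + g[0, 1] * fx * (1 - fy)
            + g[1, 0] * (1 - fx) * fy + g[1, 1] * fx * fy)

def contacts(level_set, boundary_nodes):
    """Nodes of one particle that penetrate another particle, detected by
    a signed-distance lookup (negative = inside)."""
    return [p for p in boundary_nodes if bilinear(level_set, p) < 0.0]

# Level set of a disk-shaped particle of radius 8 (negative inside).
y, x = np.mgrid[0:32, 0:32]
phi = np.sqrt((x - 16.0)**2 + (y - 16.0)**2) - 8.0
nodes = [(16.0, 16.0), (16.0, 3.0), (22.5, 16.0)]
print(contacts(phi, nodes))
```

    Because penetration depth and contact normal both fall out of the interpolated value and its gradient, arbitrary particle shapes cost no more per contact test than spheres, which is the computational ease the abstract refers to.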

  11. The Harmonizing Outcome Measures for Eczema (HOME) roadmap: a methodological framework to develop core sets of outcome measurements in dermatology.

    PubMed

    Schmitt, Jochen; Apfelbacher, Christian; Spuls, Phyllis I; Thomas, Kim S; Simpson, Eric L; Furue, Masutaka; Chalmers, Joanne; Williams, Hywel C

    2015-01-01

    Core outcome sets (COSs) are consensus-derived minimum sets of outcomes to be assessed in a specific situation. COSs are being increasingly developed to limit outcome-reporting bias, allow comparisons across trials, and strengthen clinical decision making. Despite the increasing interest in outcomes research, methods to develop COSs have not yet been standardized. The aim of this paper is to present the Harmonizing Outcomes Measures for Eczema (HOME) roadmap for the development and implementation of COSs, which was developed on the basis of our experience in the standardization of outcome measurements for atopic eczema. Following the establishment of a panel representing all relevant stakeholders and a research team experienced in outcomes research, the scope and setting of the core set should be defined. The next steps are the definition of a core set of outcome domains such as symptoms or quality of life, followed by the identification or development and validation of appropriate outcome measurement instruments to measure these core domains. Finally, the consented COS needs to be disseminated, implemented, and reviewed. We believe that the HOME roadmap is a useful methodological framework to develop COSs in dermatology, with the ultimate goal of better decision making and promoting patient-centered health care. PMID:25186228

  12. Ice cover, landscape setting, and geological framework of Lake Vostok, East Antarctica

    USGS Publications Warehouse

    Studinger, M.; Bell, R.E.; Karner, G.D.; Tikku, A.A.; Holt, J.W.; Morse, D.L.; David, L.; Richter, T.G.; Kempf, S.D.; Peters, M.E.; Blankenship, D.D.; Sweeney, R.E.; Rystrom, V.L.

    2003-01-01

    Lake Vostok, located beneath more than 4 km of ice in the middle of East Antarctica, is a unique subglacial habitat and may contain microorganisms with distinct adaptations to such an extreme environment. Melting and freezing at the base of the ice sheet, which slowly flows across the lake, controls the flux of water, biota and sediment particles through the lake. The influx of thermal energy, however, is limited to contributions from below. Thus the geological origin of Lake Vostok is a critical boundary condition for the subglacial ecosystem. We present the first comprehensive maps of ice surface, ice thickness and subglacial topography around Lake Vostok. The ice flow across the lake and the landscape setting are closely linked to the geological origin of Lake Vostok. Our data show that Lake Vostok is located along a major geological boundary. Magnetic and gravity data are distinct east and west of the lake, as is the roughness of the subglacial topography. The physiographic setting of the lake has important consequences for the ice flow and thus the melting and freezing pattern and the lake's circulation. Lake Vostok is a tectonically controlled subglacial lake. The tectonic processes provided the space for a unique habitat and recent minor tectonic activity could have the potential to introduce small, but significant amounts of thermal energy into the lake. ?? 2002 Elsevier Science B.V. All rights reserved.

  13. A rough set based rational clustering framework for determining correlated genes.

    PubMed

    Jeyaswamidoss, Jeba Emilyn; Thangaraj, Kesavan; Ramar, Kadarkarai; Chitra, Muthusamy

    2016-06-01

    Cluster analysis plays a foremost role in identifying groups of genes that show similar behavior under a set of experimental conditions. Several clustering algorithms have been proposed for identifying gene behaviors and to understand their significance. The principal aim of this work is to develop an intelligent rough clustering technique, which will efficiently remove the irrelevant dimensions in a high-dimensional space and obtain appropriate meaningful clusters. This paper proposes a novel biclustering technique that is based on rough set theory. The proposed algorithm uses correlation coefficient as a similarity measure to simultaneously cluster both the rows and columns of a gene expression data matrix and mean squared residue to generate the initial biclusters. Furthermore, the biclusters are refined to form the lower and upper boundaries by determining the membership of the genes in the clusters using mean squared residue. The algorithm is illustrated with yeast gene expression data and the experiment proves the effectiveness of the method. The main advantage is that it overcomes the problem of selection of initial clusters and also the restriction of one object belonging to only one cluster by allowing overlapping of biclusters. PMID:27352972
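
    The mean squared residue used above to generate and refine biclusters is the Cheng and Church coherence score; a minimal implementation (the toy matrices are illustrative, not gene expression data):

```python
import numpy as np

def mean_squared_residue(sub):
    """Mean squared residue of a bicluster: zero for a perfectly additive
    row-plus-column pattern, large for incoherent submatrices."""
    sub = np.asarray(sub, dtype=float)
    row_means = sub.mean(axis=1, keepdims=True)
    col_means = sub.mean(axis=0, keepdims=True)
    residue = sub - row_means - col_means + sub.mean()
    return float((residue**2).mean())

# An additive pattern (row effect + column effect) has residue ~0;
# perturbing one entry makes it incoherent.
additive = np.add.outer([0.0, 1.0, 3.0], [10.0, 12.0, 15.0])
noisy = additive.copy()
noisy[1, 1] += 2.0
print(mean_squared_residue(additive), mean_squared_residue(noisy) > 0.01)
```

    The rough-set layer in the paper then assigns genes whose residue is unambiguous to the lower approximation of a bicluster and borderline genes to its upper approximation, which is what permits overlapping clusters.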

  14. Using a Framework for Three Levels of Sense Making in a Mathematics Classroom

    ERIC Educational Resources Information Center

    Moss, Diana L.; Lamberg, Teruni

    2016-01-01

    This discussion-based lesson is designed to support Year 6 students in their initial understanding of using letters to represent numbers, expressions, and equations in algebra. The three level framework is designed for: (1) making thinking explicit, (2) exploring each other's solutions, and (3) developing new mathematical insights. In each level…

  15. Epidemic Reconstruction in a Phylogenetics Framework: Transmission Trees as Partitions of the Node Set

    PubMed Central

    Hall, Matthew; Woolhouse, Mark; Rambaut, Andrew

    2015-01-01

    The use of genetic data to reconstruct the transmission tree of infectious disease epidemics and outbreaks has been the subject of an increasing number of studies, but previous approaches have usually either made assumptions that are not fully compatible with phylogenetic inference, or, where they have based inference on a phylogeny, have employed a procedure that requires this tree to be fixed. At the same time, the coalescent-based models of the pathogen population that are employed in the methods usually used for time-resolved phylogeny reconstruction are a considerable simplification of the epidemic process, as they assume that pathogen lineages mix freely. Here, we contribute a new method that is simultaneously a phylogeny reconstruction method for isolates taken from an epidemic, and a procedure for transmission tree reconstruction. We observe that, if one or more samples are taken from each host in an epidemic or outbreak and these are used to build a phylogeny, a transmission tree is equivalent to a partition of the set of nodes of this phylogeny, such that each partition element is a set of nodes that is connected in the full tree and contains all the tips corresponding to samples taken from one and only one host. We then implement a Markov chain Monte Carlo (MCMC) procedure for simultaneous sampling from the spaces of both trees, utilising a newly-designed set of phylogenetic tree proposals that also respect node partitions. We calculate the posterior probability of these partitioned trees based on a model that acknowledges the population structure of an epidemic by employing an individual-based disease transmission model and a coalescent process taking place within each host. We demonstrate our method, first using simulated data, and then with sequences taken from the H7N7 avian influenza outbreak that occurred in the Netherlands in 2003. We show that it is superior to established coalescent methods for reconstructing the topology and node heights of the
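
    The central observation here, that a transmission tree is a partition of the phylogeny's nodes in which each element is connected in the tree and holds the tips of exactly one host, can be checked mechanically. The tiny rooted tree below is a hypothetical example, not data from the study.

```python
def valid_partition(parent, tip_host, partition):
    """Check that every partition element is connected in the rooted tree
    and contains the tips of exactly one host."""
    elem_of = {n: i for i, elem in enumerate(partition) for n in elem}
    for elem in partition:
        # All tips in this element must come from a single host.
        hosts = {tip_host[n] for n in elem if n in tip_host}
        if len(hosts) != 1:
            return False
        # Connectedness in a rooted tree: exactly one "local root", i.e.
        # one node whose parent lies outside the element (or is absent).
        local_roots = [n for n in elem
                       if parent.get(n) is None
                       or elem_of.get(parent[n]) != elem_of[n]]
        if len(local_roots) != 1:
            return False
    return True

# Phylogeny: root r, internal node u; tips a, b sampled from host A, c from host B.
parent = {"r": None, "u": "r", "a": "u", "b": "u", "c": "r"}
tip_host = {"a": "A", "b": "A", "c": "B"}
ok = valid_partition(parent, tip_host, [{"r", "u", "a", "b"}, {"c"}])
bad = valid_partition(parent, tip_host, [{"r", "a"}, {"u", "b", "c"}])
print(ok, bad)
```

    The MCMC in the paper samples only over partitions passing this kind of constraint, using tree proposals designed to preserve it.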

  16. [Head and Neck Tumor Segmentation Based on Augmented Gradient Level Set Method].

    PubMed

    Zhang, Qiongmin; Zhang, Jing; Wang, Mintang; He, Ling; Men, Yi; Wei, Jun; Haung, Hua

    2015-08-01

    To realize accurate positioning and quantitative volume measurement of tumors in head and neck CT images, we propose a level set method based on an augmented gradient. By introducing gradient information into the edge indicator function, the proposed level set model adapts to different intensity variations and achieves accurate tumor segmentation. The segmentation result is then used to calculate tumor volume. For large tumors, the proposed method reduces manual intervention and enhances segmentation accuracy, and the calculated volumes are close to the gold standard. The experimental results show that the augmented-gradient level set method achieves accurate head and neck tumor segmentation and can provide useful information for computer-aided diagnosis. PMID:26710464
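    The abstract does not define its "augmented" gradient, but the baseline edge indicator function that gradient-based level set models build on can be sketched as follows (illustrative only; the weighting and any augmentation are assumptions, not the paper's formulation):

```python
import numpy as np

def edge_indicator(image, weight=1.0):
    # Classical edge-stopping function g = 1 / (1 + w * |grad I|^2):
    # g is near 1 in homogeneous regions (the contour moves freely)
    # and near 0 at strong edges (the contour slows and stops).
    gy, gx = np.gradient(image.astype(float))
    grad_mag_sq = gx**2 + gy**2
    return 1.0 / (1.0 + weight * grad_mag_sq)

# A vertical step edge: the indicator should drop at the boundary.
img = np.zeros((5, 6))
img[:, 3:] = 10.0
g = edge_indicator(img)
```

On this toy image, g stays at 1.0 in the flat left region and falls well below 1 at the step, which is what halts the evolving contour at a tumor boundary.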

  17. Learning A Superpixel-Driven Speed Function for Level Set Tracking.

    PubMed

    Zhou, Xue; Li, Xi; Hu, Weiming

    2016-07-01

    A key problem in level set tracking is to construct a discriminative speed function for effective contour evolution. In this paper, we propose a level set tracking method based on a discriminative speed function, which produces a superpixel-driven force for effective level set evolution. Based on kernel density estimation and metric learning, the speed function is capable of effectively encoding the discriminative information on object appearance within a feasible metric space. Furthermore, we introduce adaptive object shape modeling into the level set evolution process, which leads to the tracking robustness in complex scenarios. To ensure the efficiency of adaptive object shape modeling, we develop a simple but efficient weighted non-negative matrix factorization method that can online learn an object shape dictionary. Experimental results on a number of challenging video sequences demonstrate the effectiveness and robustness of the proposed tracking method. PMID:26292353

  18. The Reliability and Validity of the Comfort Level Method of Setting Hearing Aid Gain

    ERIC Educational Resources Information Center

    Walden, Brian E.; And Others

    1977-01-01

    Investigated in a series of experiments with 40 adults (20 to 70 years old) having bilateral sensorineural hearing impairments were the test-retest reliability of the comfort level method for setting the acoustic gain of hearing aids, and the relationship between the comfort settings utilized in more realistic daily listening situations.…

  19. Basin-scale runoff prediction: An Ensemble Kalman Filter framework based on global hydrometeorological data sets

    NASA Astrophysics Data System (ADS)

    Lorenz, Christof; Tourian, Mohammad J.; Devaraju, Balaji; Sneeuw, Nico; Kunstmann, Harald

    2015-10-01

    In order to cope with the steady decline of the number of in situ gauges worldwide, there is a growing need for alternative methods to estimate runoff. We present an Ensemble Kalman Filter based approach that allows us to infer runoff for poorly or irregularly gauged basins. The approach focuses on the application of publicly available global hydrometeorological data sets for precipitation (GPCC, GPCP, CRU, UDEL), evapotranspiration (MODIS, FLUXNET, GLEAM, ERA interim, GLDAS), and water storage changes (GRACE, WGHM, GLDAS, MERRA LAND). Furthermore, runoff data from the GRDC and satellite altimetry derived estimates are used. We follow a least squares prediction that exploits the joint temporal and spatial auto- and cross-covariance structures of precipitation, evapotranspiration, water storage changes and runoff. We further consider time-dependent uncertainty estimates derived from all data sets. Our in-depth analysis comprises 29 large river basins of different climate regions, with which runoff is predicted for a subset of 16 basins. Six configurations are analyzed: the Ensemble Kalman Filter (Smoother) and the hard (soft) Constrained Ensemble Kalman Filter (Smoother). Comparing the predictions to observed monthly runoff shows correlations larger than 0.5, percentage biases lower than ± 20%, and NSE-values larger than 0.5. A modified NSE-metric, stressing the difference to the mean annual cycle, shows an improvement of runoff predictions for 14 of the 16 basins. The proposed method is able to provide runoff estimates for nearly 100 poorly gauged basins covering an area of more than 11,500,000 km2 with a freshwater discharge, in volume, of more than 125,000 m3/s.
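    The NSE and the modified, annual-cycle-referenced NSE mentioned above can be sketched as follows. The standard NSE is well defined; the cycle-referenced variant is our reading of "stressing the difference to the mean annual cycle", not the paper's exact formula:

```python
import numpy as np

def nse(sim, obs):
    # Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model
    # is no better than predicting the mean of the observations.
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs)**2) / np.sum((obs - obs.mean())**2)

def nse_vs_annual_cycle(sim, obs, months):
    # Variant benchmarked against the mean annual cycle rather than the
    # flat long-term mean, so reproducing mere seasonality scores 0.
    sim, obs, months = (np.asarray(a) for a in (sim, obs, months))
    cycle = {m: obs[months == m].mean() for m in np.unique(months)}
    bench = np.array([cycle[m] for m in months])
    return 1.0 - np.sum((sim - obs)**2) / np.sum((obs - bench)**2)
```

Predicting the observed mean everywhere yields NSE = 0, which is why the abstract treats 0.5 as a meaningful skill threshold.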

  20. A discontinuous Galerkin conservative level set scheme for interface capturing in multiphase flows

    SciTech Connect

    Owkes, Mark Desjardins, Olivier

    2013-09-15

    The accurate conservative level set (ACLS) method of Desjardins et al. [O. Desjardins, V. Moureau, H. Pitsch, An accurate conservative level set/ghost fluid method for simulating turbulent atomization, J. Comput. Phys. 227 (18) (2008) 8395–8416] is extended by using a discontinuous Galerkin (DG) discretization. DG allows for the scheme to have an arbitrarily high order of accuracy with the smallest possible computational stencil resulting in an accurate method with good parallel scaling. This work includes a DG implementation of the level set transport equation, which moves the level set with the flow field velocity, and a DG implementation of the reinitialization equation, which is used to maintain the shape of the level set profile to promote good mass conservation. A near second order converging interface curvature is obtained by following a height function methodology (common amongst volume of fluid schemes) in the context of the conservative level set. Various numerical experiments are conducted to test the properties of the method and show excellent results, even on coarse meshes. The tests include Zalesak’s disk, two-dimensional deformation of a circle, time evolution of a standing wave, and a study of the Kelvin–Helmholtz instability. Finally, this novel methodology is employed to simulate the break-up of a turbulent liquid jet.
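    The conservative level set referenced above replaces the signed distance function with a bounded hyperbolic-tangent profile, which is what the reinitialization equation maintains. A minimal sketch of that mapping (following the profile form used by Desjardins et al.; parameter names are ours):

```python
import numpy as np

def conservative_profile(signed_dist, eps):
    # Map a signed distance d to psi = 0.5 * (tanh(d / (2*eps)) + 1):
    # psi is bounded in (0, 1), the interface is the psi = 0.5
    # isosurface, and eps sets the profile thickness that
    # reinitialization keeps fixed to promote mass conservation.
    d = np.asarray(signed_dist, float)
    return 0.5 * (np.tanh(d / (2.0 * eps)) + 1.0)

# Points well inside, on, and well outside the interface.
psi = conservative_profile([-10.0, 0.0, 10.0], eps=0.5)
```

Because psi saturates at 0 and 1 away from the interface, transporting it does not create the large over- and undershoots that plague advection of a discontinuous density field directly.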

  1. Locally constrained active contour: a region-based level set for ovarian cancer metastasis segmentation

    NASA Astrophysics Data System (ADS)

    Liu, Jianfei; Yao, Jianhua; Wang, Shijun; Linguraru, Marius George; Summers, Ronald M.

    2014-03-01

    Accurate segmentation of ovarian cancer metastases is clinically useful to evaluate tumor growth and determine follow-up treatment. We present a region-based level set algorithm with localization constraints to segment ovarian cancer metastases. Our approach is established on a representative region-based level set, Chan-Vese model, in which an active contour is driven by region competition. To reduce over-segmentation, we constrain the level set propagation within a narrow image band by embedding a dynamic localization function. The metastasis intensity prior is also estimated from image regions within the level set initialization. The localization function and intensity prior force the level set to stop at the desired metastasis boundaries. Our approach was validated on 19 ovarian cancer metastases with radiologist-labeled ground-truth on contrast-enhanced CT scans from 15 patients. The comparison between our algorithm and geodesic active contour indicated that the volume overlap was 75+/-10% vs. 56+/-6%, the Dice coefficient was 83+/-8% vs. 63+/-8%, and the average surface distance was 2.2+/-0.6mm vs. 4.4+/-0.9mm. Experimental results demonstrated that our algorithm outperformed traditional level set algorithms.
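    The region competition that drives the Chan-Vese model above can be sketched as follows (sign conventions for inside/outside vary between implementations; this is an illustrative sketch, not the authors' code, and it omits the localization band and intensity prior):

```python
import numpy as np

def chan_vese_force(image, phi, lam1=1.0, lam2=1.0):
    # Region-competition term of the Chan-Vese model:
    #   F = lam1 * (I - c1)^2 - lam2 * (I - c2)^2
    # where c1 and c2 are the mean intensities inside (phi > 0) and
    # outside (phi <= 0) the current contour. Evolving the contour to
    # reduce F minimizes the within-region intensity variance.
    c1 = image[phi > 0].mean()
    c2 = image[phi <= 0].mean()
    return lam1 * (image - c1)**2 - lam2 * (image - c2)**2

# Toy image: left half 0, right half 10; contour encloses the right half.
img = np.zeros((4, 4))
img[:, 2:] = 10.0
phi = np.tile(np.array([-1.0, -1.0, 1.0, 1.0]), (4, 1))
F = chan_vese_force(img, phi)
```

Pixels that match the inside mean are pushed to stay inside (negative force) and mismatched pixels are pushed out, which is the competition the localization function then confines to a narrow band.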

  2. An adaptive level set approach for incompressible two-phase flows

    SciTech Connect

    Sussman, M.; Almgren, A.S.; Bell, J.B.

    1997-04-01

    In Sussman, Smereka and Osher, a numerical method using the level set approach was formulated for solving incompressible two-phase flow with surface tension. In the level set approach, the interface is represented as the zero level set of a smooth function; this has the effect of replacing the advection of density, which has steep gradients at the interface, with the advection of the level set function, which is smooth. In addition, the interface can merge or break up with no special treatment. The authors maintain the level set function as the signed distance from the interface in order to robustly compute flows with high density ratios and stiff surface tension effects. In this work, they couple the level set scheme to an adaptive projection method for the incompressible Navier-Stokes equations, in order to achieve higher resolution of the interface with a minimum of additional expense. They present two-dimensional axisymmetric and fully three-dimensional results of air bubble and water drop computations.
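    The signed distance representation described above can be illustrated for a circular interface: the interface is the zero level set, the function is smooth where the density would jump, and its gradient magnitude stays near 1. A sketch (not the paper's solver):

```python
import numpy as np

def circle_sdf(shape, center, radius):
    # Signed distance to a circle: negative inside, positive outside.
    # The interface is recovered as the zero level set phi = 0, and
    # |grad phi| ~ 1 everywhere, which is the property the authors
    # maintain to handle high density ratios robustly.
    ys, xs = np.mgrid[:shape[0], :shape[1]]
    return np.hypot(xs - center[1], ys - center[0]) - radius

phi = circle_sdf((64, 64), (32, 32), 10.0)
```

Advecting phi with the flow velocity, instead of advecting the density itself, is then a transport of a smooth field; merging or breakup of interfaces needs no special treatment because topology is implicit in the sign of phi.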

  3. [Intellectual development disorders in Latin America: a framework for setting policy priorities for research and care].

    PubMed

    Lazcano-Ponce, Eduardo; Katz, Gregorio; Allen-Leigh, Betania; Magaña Valladares, Laura; Rangel-Eudave, Guillermina; Minoletti, Alberto; Wahlberg, Ernesto; Vásquez, Armando; Salvador-Carulla, Luis

    2013-09-01

    Intellectual development disorders (IDDs) are a set of development disorders characterized by significantly limited cognitive functioning, learning disorders, and disorders related to adaptive skills and behavior. Previously grouped under the term "intellectual disability," this problem has not been widely studied or quantified in Latin America. Those affected are absent from public policy and do not benefit from government social development and poverty reduction strategies. This article offers a critical look at IDDs and describes a new taxonomy; it also proposes recognizing IDDs as a public health issue and promoting the professionalization of care, and suggests an agenda for research and regional action. In Latin America there is no consensus on the diagnostic criteria for IDDs. A small number of rehabilitation programs cover a significant proportion of the people who suffer from IDDs, evidence-based services are not offered, and health care guidelines have not been evaluated. Manuals on psychiatric diagnosis focus heavily on identifying serious IDDs and contribute to underreporting and erroneous classification. The study of these disorders has not been a legal, social science, or public health priority, resulting in a dearth of scientific evidence on them. Specific competencies and professionalization of care for these persons are needed, and interventions must be carried out with a view to prevention, rehabilitation, community integration, and inclusion in the work force. PMID:24233114

  4. Multiphase permittivity imaging using absolute value electrical capacitance tomography data and a level set algorithm.

    PubMed

    Al Hosani, E; Soleimani, M

    2016-06-28

    Multiphase flow imaging is a very challenging and critical topic in industrial process tomography. In this article, simulation and experimental results of reconstructing the permittivity profile of multiphase material from data collected in electrical capacitance tomography (ECT) are presented. A multiphase narrowband level set algorithm is developed to reconstruct the interfaces between three- or four-phase permittivity values. The level set algorithm is capable of imaging multiphase permittivity by using one set of ECT measurement data, so-called absolute value ECT reconstruction, and this is tested with high-contrast and low-contrast multiphase data. Simulation and experimental results showed the superiority of this algorithm over classical pixel-based image reconstruction methods. The multiphase level set algorithm and absolute ECT reconstruction are presented for the first time, to the best of our knowledge, in this paper and critically evaluated. This article is part of the themed issue 'Supersensing through industrial process tomography'. PMID:27185966

  5. Joint inversion of geophysical data using petrophysical clustering and facies deformation with the level set technique

    NASA Astrophysics Data System (ADS)

    Revil, A.

    2015-12-01

    Geological expertise and petrophysical relationships can be brought together to provide prior information while inverting multiple geophysical datasets. The merging of such information can result in more realistic solutions for the distribution of the model parameters, reducing ipso facto the non-uniqueness of the inverse problem. We consider two levels of heterogeneity: facies, described by facies boundaries, and heterogeneities inside each facies, determined by a correlogram. In this presentation, we pose the geophysical inverse problem in terms of Gaussian random fields with mean functions controlled by petrophysical relationships and covariance functions controlled by a prior geological cross-section, including the definition of spatial boundaries for the geological facies. The petrophysical relationship problem is formulated as a regression problem upon each facies. The inversion of the geophysical data is performed in a Bayesian framework. We demonstrate the usefulness of this strategy using a first synthetic case for which we perform a joint inversion of gravity and galvanometric resistivity data with the stations located at the ground surface. The joint inversion is used to recover the density and resistivity distributions of the subsurface. In a second step, we consider the possibility that the facies boundaries are deformable and their shapes are inverted as well. We use the level set approach to perform such deformation, preserving prior topological properties of the facies throughout the inversion. With the help of prior facies petrophysical relationships and the topological characteristics of each facies, we make posterior inferences about multiple geophysical tomograms based on their corresponding geophysical data misfits. The method is applied to a second synthetic case showing that we can recover the heterogeneities inside the facies, the mean values for the petrophysical properties, and, to some extent, the facies boundaries using the 2D joint inversion of

  6. Evolving entities: towards a unified framework for understanding diversity at the species and higher levels.

    PubMed

    Barraclough, Timothy G

    2010-06-12

    Current approaches to studying the evolution of biodiversity differ in their treatment of species and higher level diversity patterns. Species are regarded as the fundamental evolutionarily significant units of biodiversity, both in theory and in practice, and extensive theory explains how they originate and evolve. However, most species are still delimited using qualitative methods that only relate indirectly to the underlying theory. In contrast, higher level patterns of diversity have been subjected to rigorous quantitative study (using phylogenetics), but theory that adequately explains the observed patterns has been lacking. Most evolutionary analyses of higher level diversity patterns have considered non-equilibrium explanations based on rates of diversification (i.e. exponentially growing clades), rather than equilibrium explanations normally used at the species level and below (i.e. constant population sizes). This paper argues that species level and higher level patterns of diversity can be considered within a common framework, based on equilibrium explanations. It shows how forces normally considered in the context of speciation, namely divergent selection and geographical isolation, can generate evolutionarily significant units of diversity above the level of reproductively isolated species. Prospects for the framework to answer some unresolved questions about higher level diversity patterns are discussed. PMID:20439282

  7. Evolving entities: towards a unified framework for understanding diversity at the species and higher levels

    PubMed Central

    Barraclough, Timothy G.

    2010-01-01

    Current approaches to studying the evolution of biodiversity differ in their treatment of species and higher level diversity patterns. Species are regarded as the fundamental evolutionarily significant units of biodiversity, both in theory and in practice, and extensive theory explains how they originate and evolve. However, most species are still delimited using qualitative methods that only relate indirectly to the underlying theory. In contrast, higher level patterns of diversity have been subjected to rigorous quantitative study (using phylogenetics), but theory that adequately explains the observed patterns has been lacking. Most evolutionary analyses of higher level diversity patterns have considered non-equilibrium explanations based on rates of diversification (i.e. exponentially growing clades), rather than equilibrium explanations normally used at the species level and below (i.e. constant population sizes). This paper argues that species level and higher level patterns of diversity can be considered within a common framework, based on equilibrium explanations. It shows how forces normally considered in the context of speciation, namely divergent selection and geographical isolation, can generate evolutionarily significant units of diversity above the level of reproductively isolated species. Prospects for the framework to answer some unresolved questions about higher level diversity patterns are discussed. PMID:20439282

  8. Options for future effective water management in Lombok: A multi-level nested framework

    NASA Astrophysics Data System (ADS)

    Sjah, Taslim; Baldwin, Claudia

    2014-11-01

    Previous research on water use in Lombok identified reduced water available in springs and limits on seasonal water availability. It foreshadowed increasing competition for water resources in critical areas of Lombok. This study examines preliminary information on local social-institutional arrangements for water allocation in the context of Ostrom's rules for self-governing institutions. We identify robust customary mechanisms for decision-making about water sharing and rules at a local level and suggest areas of further investigation for strengthening multi-level networked and nested frameworks, in collaboration with higher levels of government.

  9. Ontological Problem-Solving Framework for Assigning Sensor Systems and Algorithms to High-Level Missions

    PubMed Central

    Qualls, Joseph; Russomanno, David J.

    2011-01-01

    The lack of knowledge models to represent sensor systems, algorithms, and missions makes opportunistically discovering a synthesis of systems and algorithms that can satisfy high-level mission specifications impractical. A novel ontological problem-solving framework has been designed that leverages knowledge models describing sensors, algorithms, and high-level missions to facilitate automated inference of assigning systems to subtasks that may satisfy a given mission specification. To demonstrate the efficacy of the ontological problem-solving architecture, a family of persistent-surveillance sensor systems and algorithms has been instantiated in a prototype environment to demonstrate the assignment of systems to subtasks of high-level missions. PMID:22164081

  10. Breast mass segmentation in digital mammography based on pulse coupled neural network and level set method

    NASA Astrophysics Data System (ADS)

    Xie, Weiying; Ma, Yide; Li, Yunsong

    2015-05-01

    A novel approach to mammographic image segmentation, termed the PCNN-based level set algorithm, is presented in this paper. As its name implies, the method combines a pulse coupled neural network (PCNN) with the variational level set method for medical image segmentation. To date, little work has been done on detecting the initial zero level set contours with a PCNN for subsequent level set evolution. In a mammographic image the breast tumor appears as a high-intensity region in an otherwise predominantly dark image, so we first take the negative of the image before all of its pixels are fired by the PCNN; the PCNN is thereby used to achieve mammary-specific initial mass contour detection. The extracted contours are then defined as the initial zero level set contours for automatic mass segmentation by the variational level set method. Moreover, the proposed algorithm improves the external energy of the variational level set method for low-contrast mammographic images: since the gray level of the mass region is higher than that of its surroundings, the Laplace operator is used to modify the external energy, making bright spots brighter than the surrounding pixels. A preliminary evaluation of the proposed method was performed on a known public database, MIAS, rather than on synthetic images. The experimental results demonstrate that our approach can obtain better mass detection results in terms of sensitivity and specificity. 
    Ultimately, this algorithm could increase both the sensitivity and specificity of the physicians' interpretation of
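    The Laplace-operator idea described above, making bright mass regions stand out from their surroundings, can be sketched with a plain 5-point Laplacian subtraction (illustrative only; the paper's exact external-energy term is not given here):

```python
import numpy as np

def laplacian_sharpen(image, alpha=1.0):
    # 5-point discrete Laplacian; subtracting it sharpens the image so
    # that a bright spot becomes brighter relative to its neighbors,
    # which is the role the abstract assigns to the Laplace operator.
    img = image.astype(float)
    lap = np.zeros_like(img)
    lap[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1]
                       + img[1:-1, :-2] + img[1:-1, 2:]
                       - 4.0 * img[1:-1, 1:-1])
    return img - alpha * lap

# A single bright pixel should gain contrast after sharpening.
img = np.zeros((5, 5))
img[2, 2] = 10.0
out = laplacian_sharpen(img)
```

The Laplacian is strongly negative at an isolated bright pixel, so subtracting it raises that pixel and depresses its immediate neighbors, boosting local contrast in an otherwise low-contrast mammogram.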

  11. A distributed decision framework for building clusters with different heterogeneity settings

    DOE PAGESBeta

    Jafari-Marandi, Ruholla; Omitaomu, Olufemi A.; Hu, Mengqi

    2016-01-05

    In the past few decades, extensive research has been conducted to develop operation and control strategies for smart buildings with the purpose of reducing energy consumption. Beyond studies of single buildings, it is envisioned that next-generation buildings will freely connect with one another to share energy and exchange information in the context of the smart grid. It has been demonstrated that a network of connected buildings (aka building clusters) can significantly reduce primary energy consumption and improve environmental sustainability and the buildings' resilience capability. However, an analytic tool to determine which types of buildings should form a cluster, and what impact the heterogeneity of a cluster's energy profiles has on its energy performance, has been missing. To bridge these research gaps, we propose a self-organizing map clustering algorithm to divide multiple buildings into clusters based on their energy profiles, and a homogeneity index to evaluate the heterogeneity of different building cluster configurations. In addition, a bi-level distributed decision model is developed to study energy sharing in the building clusters. To demonstrate the effectiveness of the proposed clustering algorithm and decision model, we employ a dataset of monthly energy consumption data for 30 buildings, collected every 15 min. It is demonstrated that the proposed decision model can achieve at least 13% cost savings for building clusters. Furthermore, the results show that the heterogeneity of energy profiles is an important factor in selecting batteries and renewable energy sources for building clusters, and that shared batteries and renewable energy are preferred for more heterogeneous building clusters.
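    The abstract does not define its homogeneity index; one plausible, purely hypothetical stand-in is the mean pairwise correlation of the buildings' energy profiles within a cluster:

```python
import numpy as np

def homogeneity_index(profiles):
    # Hypothetical homogeneity index for a building cluster: the mean
    # pairwise Pearson correlation of the members' energy profiles.
    # 1 means identically shaped profiles (a homogeneous cluster);
    # lower values indicate a more heterogeneous cluster. This is an
    # illustrative definition, not the paper's.
    profiles = np.asarray(profiles, float)
    corr = np.corrcoef(profiles)
    iu = np.triu_indices(len(profiles), k=1)
    return corr[iu].mean()
```

Under this definition, scaling a profile does not change the index (correlation is scale-invariant), so it measures the shape heterogeneity that matters for complementary energy sharing rather than absolute consumption levels.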

  12. A framework for sea level rise vulnerability assessment for southwest U.S. military installations

    USGS Publications Warehouse

    Chadwick, B.; Flick, Reinhard; Helly, J.; Nishikawa, T.; Pei, Fang Wang; O'Reilly, W.; Guza, R.; Bromirski, Peter; Young, A.; Crampton, W.; Wild, B.; Canner, I.

    2011-01-01

    We describe an analysis framework to determine military installation vulnerabilities under increases in local mean sea level as projected over the next century. The effort is in response to an increasing recognition of potential climate change ramifications for national security and recommendations that DoD conduct assessments of the impact on U.S. military installations of climate change. Results of the effort described here focus on development of a conceptual framework for sea level rise vulnerability assessment at coastal military installations in the southwest U.S. We introduce the vulnerability assessment in the context of a risk assessment paradigm that incorporates sources in the form of future sea level conditions, pathways of impact including inundation, flooding, erosion and intrusion, and a range of military installation specific receptors such as critical infrastructure and training areas. A unique aspect of the methodology is the capability to develop wave climate projections from GCM outputs and transform these to future wave conditions at specific coastal sites. Future sea level scenarios are considered in the context of installation sensitivity curves which reveal response thresholds specific to each installation, pathway and receptor. In the end, our goal is to provide a military-relevant framework for assessment of accelerated SLR vulnerability, and develop the best scientifically-based scenarios of waves, tides and storms and their implications for DoD installations in the southwestern U.S. ?? 2011 MTS.

  13. Target Detection in SAR Images Based on a Level Set Approach

    SciTech Connect

    Marques, Regis C.P.; Medeiros, Fatima N.S.; Ushizima, Daniela M.

    2008-09-01

    This paper introduces a new framework for point target detection in synthetic aperture radar (SAR) images. We focus on the task of locating reflective small regions using a level set based algorithm. Unlike most approaches to image segmentation, we present an algorithm that incorporates speckle statistics instead of empirical parameters and also discards speckle filtering. The curve evolves according to speckle statistics, initially propagating with a maximum upward velocity in homogeneous areas. Our approach is validated by a series of tests on synthetic and real SAR images and compared with three other segmentation algorithms, demonstrating that it constitutes a novel and efficient method for target detection.

  14. Estimations of a global sea level trend: limitations from the structure of the PSMSL global sea level data set

    NASA Astrophysics Data System (ADS)

    Gröger, M.; Plag, H.-P.

    1993-08-01

    Among the possible impacts on environmental conditions of a global warming expected as a consequence of the increasing release of CO2 and various other greenhouse gases into the atmosphere, a predicted rise in global sea level is considered to be of high importance. Thus, quite a number of recent studies have focused on detecting the "global sea level rise" or even an acceleration of this trend. A brief review of these studies is presented, showing, however, that the results are not conclusive, though most of the studies have been based on a single global data set of coastal tide gauge data provided by the Permanent Service for Mean Sea Level (PSMSL). A detailed discussion of a thoroughly revised subset reveals that the PSMSL data set suffers from three severe limitations: (1) the geographical distribution of reliable tide gauge stations is rather uneven, with pronounced concentrations in some areas of the northern hemisphere (Europe, North America, Japan) and much fewer stations in the southern hemisphere, where particularly few stations are located in Africa and in Antarctica; (2) the number of stations recording simultaneously at any time is far less than the total number of stations, with the maximum within the interval between 1958 and 1988; (3) the number of long records is extremely small and almost all of them originate from a few regions of the northern hemisphere. The sensitivity of the median of the local trends to these temporal and spatial limitations is discussed by restricting the data set in both the spatial and temporal distribution. It is shown that the data base is insufficient for determining an integral value of the global rise in relative sea level. The effect of polar motion on sea level is modelled and it turns out to be locally of the order of 0.5 mm/yr, affecting regional trends to an order of 0.1 mm/yr. Thus, this effect can be neglected on time scales of decades to a hundred years. Though the data set is insufficient for determining an
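    The "median of the local trends" statistic analysed above can be sketched as follows, assuming a hypothetical record format of (years, annual-mean relative sea level in mm) arrays per station:

```python
import numpy as np

def station_trend(years, level_mm):
    # Least-squares linear trend (mm/yr) of a single tide-gauge record.
    slope, _intercept = np.polyfit(years, level_mm, 1)
    return slope

def median_regional_trend(records):
    # Median of the local trends: less sensitive than the mean to the
    # few long, geographically clustered records that dominate the
    # PSMSL data set, but still exposed to its spatial sampling bias.
    return float(np.median([station_trend(y, h) for y, h in records]))

yrs = np.arange(1950, 1990)
records = [(yrs, 2.0 * yrs + 30.0), (yrs, 1.0 * yrs - 5.0)]
```

Restricting `records` in space or time, as the authors do, directly changes which local slopes enter the median, which is exactly the sensitivity the abstract examines.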

  15. A High-Level Framework for Distributed Processing of Large-Scale Graphs

    NASA Astrophysics Data System (ADS)

    Krepska, Elzbieta; Kielmann, Thilo; Fokkink, Wan; Bal, Henri

    Distributed processing of real-world graphs is challenging due to their size and the inherent irregular structure of graph computations. We present hipg, a distributed framework that facilitates high-level programming of parallel graph algorithms by expressing them as a hierarchy of distributed computations executed independently and managed by the user. hipg programs are in general short and elegant; they achieve good portability, memory utilization and performance.

  16. A Multi-Level Approach for Promoting HIV Testing Within African American Church Settings

    PubMed Central

    2015-01-01

    Abstract The African American church is a community-based organization that is integral to the lives, beliefs, and behaviors of the African American community. Engaging this vital institution as a primary setting for HIV testing and referral would significantly impact the epidemic. The disproportionately high HIV incidence rate among African Americans dictates the national priority for promotion of early and routine HIV testing, and suggests engaging community-based organizations in this endeavor. However, few multilevel HIV testing frameworks have been developed, tested, and evaluated within the African American church. This article proposes one such framework for promoting HIV testing and referral within African American churches. A qualitative study was employed to examine the perceptions, beliefs, knowledge, and behaviors related to understanding involvement in church-based HIV testing. A total of four focus groups with church leaders and four in-depth interviews with pastors, were conducted between November 2012 and June 2013 to identify the constructs most important to supporting Philadelphia churches' involvement in HIV testing, referral, and linkage to care. The data generated from this study were analyzed using a grounded theory approach and used to develop and refine a multilevel framework for identifying factors impacting church-based HIV testing and referral and to ultimately support capacity building among African American churches to promote HIV testing and linkage to care. PMID:25682887

  17. Issues related to setting exemption levels for oil and gas NORM

    SciTech Connect

    Blunt, D. L.; Gooden, D. S.; Smith, K. P.

    1999-11-12

    In the absence of any federal regulations that specifically address the handling and disposal of wastes containing naturally occurring radioactive material (NORM), individual states have taken responsibility for developing their own regulatory programs for NORM. A key issue in developing NORM rules is defining exemption levels: specific levels or concentrations that determine which waste materials are subject to controlled management. In general, states have drawn upon existing standards and guidelines for similar waste types in establishing exemption levels for NORM. Simply adopting these standards may not be appropriate for oil and gas NORM for several reasons. The Interstate Oil and Gas Compact Commission's NORM Subcommittee has summarized the issues involved in setting exemption levels in a report titled "Naturally Occurring Radioactive Materials (NORM): Issues from the Oil and Gas Point of View". The committee has also recommended a set of exemption levels for controlled practices and for remediation activities on the basis of the issues discussed.

  18. Aerostructural Level Set Topology Optimization for a Common Research Model Wing

    NASA Technical Reports Server (NTRS)

    Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia

    2014-01-01

    The purpose of this work is to use level set topology optimization to improve the design of a representative wing box structure for the NASA common research model. The objective is to minimize the total compliance of the structure under aerodynamic and body force loading, where the aerodynamic loading is coupled to the structural deformation. A taxi bump case was also considered, where only body force loads were applied. The trim condition that aerodynamic lift must balance the total weight of the aircraft is enforced by allowing the root angle of attack to change. The level set optimization method is implemented on an unstructured three-dimensional grid, so that the method can optimize a wing box with arbitrary geometry. Fast matching and upwind schemes are developed for an unstructured grid, which make the level set method robust and efficient. The adjoint method is used to obtain the coupled shape sensitivities required to perform aerostructural optimization of the wing box structure.

  19. A Variational Level Set Approach to Segmentation and Bias Correction of Images with Intensity Inhomogeneity

    PubMed Central

    Huang, Rui; Ding, Zhaohua; Gatenby, Chris; Metaxas, Dimitris; Gore, John

    2009-01-01

    This paper presents a variational level set approach to joint segmentation and bias correction of images with intensity inhomogeneity. Our method is based on the observation that intensities in a relatively small local region are separable, despite the inseparability of the intensities in the whole image caused by the intensity inhomogeneity. We first define a weighted K-means clustering objective function for image intensities in a neighborhood around each point, with the cluster centers having a multiplicative factor that estimates the bias within the neighborhood. The objective function is then integrated over the entire domain and incorporated into a variational level set formulation. The energy minimization is performed via a level set evolution process. Our method is able to estimate bias of quite general profiles. Moreover, it is robust to initialization, and therefore allows automated applications. The proposed method has been used for images of various modalities with promising results. PMID:18982712
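The weighted K-means objective described above is commonly written in the following form for this family of methods (a hedged reconstruction; K is the kernel window, b the bias field, c_i the cluster centers, and the partition {Ω_i} is encoded by the level set function):

```latex
% Local clustering criterion at a point x, integrated over the image domain:
\mathcal{E}_x \;=\; \sum_{i=1}^{K}\int_{\Omega_i} K(y-x)\,\bigl|I(y)-b(x)\,c_i\bigr|^{2}\,dy,
\qquad
E \;=\; \int_{\Omega}\mathcal{E}_x\,dx .
```

Minimization then alternates between level set evolution for the partition and closed-form updates of b and the c_i.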

  20. Numerical Schemes for the Hamilton-Jacobi and Level Set Equations on Triangulated Domains

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.; Sethian, James A.

    2006-01-01

    Borrowing from techniques developed for conservation law equations, we have developed both monotone and higher order accurate numerical schemes which discretize the Hamilton-Jacobi and level set equations on triangulated domains. The use of unstructured meshes containing triangles (2D) and tetrahedra (3D) easily accommodates mesh adaptation to resolve disparate level set feature scales with a minimal number of solution unknowns. The minisymposium talk will discuss these algorithmic developments and present sample calculations using our adaptive triangulation algorithm applied to various moving interface problems such as etching, deposition, and curvature flow.
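For intuition about the monotone schemes mentioned above, here is a minimal first-order Godunov upwind discretization of the level set equation phi_t + F|phi_x| = 0 on a uniform 1-D grid (an illustrative sketch only; the schemes in this work operate on triangulated 2-D/3-D domains):

```python
# Solves phi_t + F * |phi_x| = 0 with first-order Godunov upwinding.

def evolve_level_set(phi, dx, speed, dt, steps):
    """Advance the level set function with a monotone Godunov scheme."""
    phi = list(phi)
    n = len(phi)
    for _ in range(steps):
        new = phi[:]
        for i in range(1, n - 1):
            dminus = (phi[i] - phi[i - 1]) / dx  # backward difference
            dplus = (phi[i + 1] - phi[i]) / dx   # forward difference
            if speed > 0:  # Godunov's upwind selection
                grad = max(max(dminus, 0.0) ** 2, min(dplus, 0.0) ** 2) ** 0.5
            else:
                grad = max(min(dminus, 0.0) ** 2, max(dplus, 0.0) ** 2) ** 0.5
            new[i] = phi[i] - dt * speed * grad
        new[0], new[-1] = new[1], new[-2]  # crude boundary extrapolation
        phi = new
    return phi

# A front at x = 0.2 moving right at speed 1 reaches x ~ 0.5 at t = 0.3.
dx, dt = 0.01, 0.005
xs = [i * dx for i in range(101)]
phi = evolve_level_set([x - 0.2 for x in xs], dx, speed=1.0, dt=dt, steps=60)
crossing = next(xs[i] for i in range(100) if phi[i] <= 0.0 < phi[i + 1])
```

The Godunov switch is what makes the scheme monotone; higher-order accuracy (as in the WENO-type schemes discussed in the talk) replaces the one-sided differences with wider reconstructions.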

  1. Setting the Direction Framework

    ERIC Educational Resources Information Center

    Alberta Education, 2009

    2009-01-01

    Alberta has a long and proud history of meeting the educational needs of students with disabilities and diverse needs. The province serves many thousand students with behavioural, communicational and intellectual needs; as well as students with mental health challenges, learning or physical disabilities and students who are gifted and talented.…

  2. An integrated framework for high level design of high performance signal processing circuits on FPGAs

    NASA Astrophysics Data System (ADS)

    Benkrid, K.; Belkacemi, S.; Sukhsawas, S.

    2005-06-01

    This paper proposes an integrated framework for the high level design of high performance signal processing algorithms' implementations on FPGAs. The framework emerged from a constant need to rapidly implement increasingly complicated algorithms on FPGAs while maintaining the high performance needed in many real time digital signal processing applications. This is particularly important for application developers who often rely on iterative and interactive development methodologies. The central idea behind the proposed framework is to dynamically integrate high performance structural hardware description languages with higher level hardware languages in order to help satisfy the dual requirement of high level design and high performance implementation. The paper illustrates this by integrating two environments: Celoxica's Handel-C language, and HIDE, a structural hardware environment developed at the Queen's University of Belfast. On the one hand, Handel-C has been proven to be very useful in the rapid design and prototyping of FPGA circuits, especially control intensive ones. On the other hand, HIDE has been used extensively, and successfully, in the generation of highly optimised parameterisable FPGA cores. In this paper, this is illustrated in the construction of a scalable and fully parameterisable core for image algebra's five core neighbourhood operations, where fully floorplanned efficient FPGA configurations, in the form of EDIF netlists, are generated automatically for instances of the core. In the proposed combined framework, highly optimised data paths are invoked dynamically from within Handel-C, and are synthesized using HIDE. Although the idea might seem simple prima facie, it could have serious implications for the design of future generations of hardware description languages.

  3. Individual- and Setting-Level Correlates of Secondary Traumatic Stress in Rape Crisis Center Staff.

    PubMed

    Dworkin, Emily R; Sorell, Nicole R; Allen, Nicole E

    2016-02-01

    Secondary traumatic stress (STS) is an issue of significant concern among providers who work with survivors of sexual assault. Although STS has been studied in relation to individual-level characteristics of a variety of types of trauma responders, less research has focused specifically on rape crisis centers as environments that might convey risk or protection from STS, and no research to our knowledge has modeled setting-level variation in correlates of STS. The current study uses a sample of 164 staff members representing 40 rape crisis centers across a single Midwestern state to investigate the staff member- and agency-level correlates of STS. Results suggest that correlates exist at both levels of analysis. Younger age and greater severity of sexual assault history were statistically significant individual-level predictors of increased STS. Greater frequency of supervision was more strongly related to secondary traumatic stress for non-advocates than for advocates. At the setting level, lower levels of supervision and higher client loads agency-wide accounted for unique variance in staff members' STS. These findings suggest that characteristics of both providers and their settings are important to consider when understanding their STS. PMID:25381285

  4. Accurate Adaptive Level Set Method and Sharpening Technique for Three Dimensional Deforming Interfaces

    NASA Technical Reports Server (NTRS)

    Kim, Hyoungin; Liou, Meng-Sing

    2011-01-01

    In this paper, we demonstrate improved accuracy of the level set method for resolving deforming interfaces by proposing two key elements: (1) accurate level set solutions on adapted Cartesian grids by judiciously choosing interpolation polynomials in regions of different grid levels and (2) enhanced reinitialization by an interface sharpening procedure. The level set equation is solved using a fifth order WENO scheme or a second order central differencing scheme depending on the availability of uniform stencils at each grid point. Grid adaptation criteria are determined so that the Hamiltonian functions at nodes adjacent to interfaces are always calculated by the fifth order WENO scheme. This selective usage of the fifth order WENO and second order central differencing schemes is confirmed to give more accurate results than those in the literature for standard test problems. In order to further improve accuracy, especially near thin filaments, we suggest an artificial sharpening method, which is similar in form to the conventional re-initialization method but utilizes the sign of curvature instead of the sign of the level set function. Consequently, volume loss due to numerical dissipation on thin filaments is remarkably reduced for the test problems.
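The conventional PDE reinitialization that the sharpening procedure is modeled on, phi_t = sign(phi0)(1 - |grad phi|), can be sketched in 1-D as follows (an assumed textbook form using a smoothed sign function and first-order Godunov upwinding, not this paper's fifth-order WENO implementation):

```python
# Relax a distorted level set function toward a signed distance function.

def reinitialize(phi, dx, dt, steps):
    """Drive |phi_x| toward 1 while (approximately) fixing the zero set."""
    sign0 = [p / (p * p + dx * dx) ** 0.5 for p in phi]  # smoothed sign(phi0)
    phi = list(phi)
    n = len(phi)
    for _ in range(steps):
        new = phi[:]
        for i in range(1, n - 1):
            dminus = (phi[i] - phi[i - 1]) / dx
            dplus = (phi[i + 1] - phi[i]) / dx
            s = sign0[i]
            if s > 0:  # upwind away from the interface
                grad = max(max(dminus, 0.0) ** 2, min(dplus, 0.0) ** 2) ** 0.5
            else:
                grad = max(min(dminus, 0.0) ** 2, max(dplus, 0.0) ** 2) ** 0.5
            new[i] = phi[i] + dt * s * (1.0 - grad)
        new[0], new[-1] = new[1], new[-2]
        phi = new
    return phi

# A level set function with slope 5 relaxes to unit slope, keeping its zero.
dx = 0.02
xs = [i * dx for i in range(51)]
phi = reinitialize([5.0 * (x - 0.5) for x in xs], dx, dt=0.5 * dx, steps=200)
slope = (phi[30] - phi[20]) / (10 * dx)  # ~1 after reinitialization
```

The sharpening variant described in the abstract keeps this structure but replaces sign(phi0) with the sign of the local curvature.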

  5. Nonparametric intensity priors for level set segmentation of low contrast structures.

    PubMed

    Makrogiannis, Sokratis; Bhotika, Rahul; Miller, James V; Skinner, John; Vass, Melissa

    2009-01-01

    Segmentation of low contrast objects is an important task in clinical applications like lesion analysis and vascular wall remodeling analysis. Several solutions to low contrast segmentation that exploit high-level information have been previously proposed, such as shape priors and generative models. In this work, we incorporate a priori distributions of intensity and low-level image information into a nonparametric dissimilarity measure that defines a local indicator function for the likelihood of belonging to a foreground object. We then integrate the indicator function into a level set formulation for segmenting low contrast structures. We apply the technique to the clinical problem of positive remodeling of the vessel wall in cardiac CT angiography images. We present results on a dataset of twenty five patient scans, showing improvement over conventional gradient-based level sets. PMID:20425993

  6. Demons versus level-set motion registration for coronary 18F-sodium fluoride PET

    NASA Astrophysics Data System (ADS)

    Rubeaux, Mathieu; Joshi, Nikhil; Dweck, Marc R.; Fletcher, Alison; Motwani, Manish; Thomson, Louise E.; Germano, Guido; Dey, Damini; Berman, Daniel S.; Newby, David E.; Slomka, Piotr J.

    2016-03-01

    Ruptured coronary atherosclerotic plaques commonly cause acute myocardial infarction. It has been recently shown that active microcalcification in the coronary arteries, one of the features that characterizes vulnerable plaques at risk of rupture, can be imaged using cardiac gated 18F-sodium fluoride (18F-NaF) PET. We have shown in previous work that a motion correction technique applied to cardiac-gated 18F-NaF PET images can enhance image quality and improve uptake estimates. In this study, we further investigated the applicability of different algorithms for registration of the coronary artery PET images. In particular, we aimed to compare demons vs. level-set nonlinear registration techniques applied for the correction of cardiac motion in coronary 18F-NaF PET. To this end, fifteen patients underwent 18F-NaF PET and prospective coronary CT angiography (CCTA). PET data were reconstructed in 10 ECG gated bins; subsequently these gated bins were registered using demons and level-set methods guided by the extracted coronary arteries from CCTA, to eliminate the effect of cardiac motion on PET images. Noise levels, target-to-background ratios (TBR) and global motion were compared to assess image quality. Compared to the reference standard of using only diastolic PET image (25% of the counts from PET acquisition), cardiac motion registration using either level-set or demons techniques almost halved image noise due to the use of counts from the full PET acquisition and increased TBR difference between 18F-NaF positive and negative lesions. The demons method produces smoother deformation fields, exhibiting no singularities (which reflects how physically plausible the registration deformation is), as compared to the level-set method, which presents between 4 and 8% of singularities, depending on the coronary artery considered. In conclusion, the demons method produces smoother motion fields as compared to the level-set method, with a motion that is physiologically

  7. Demons versus Level-Set motion registration for coronary 18F-sodium fluoride PET

    PubMed Central

    Rubeaux, Mathieu; Joshi, Nikhil; Dweck, Marc R.; Fletcher, Alison; Motwani, Manish; Thomson, Louise E.; Germano, Guido; Dey, Damini; Berman, Daniel S.; Newby, David E.; Slomka, Piotr J.

    2016-01-01

    Ruptured coronary atherosclerotic plaques commonly cause acute myocardial infarction. It has been recently shown that active microcalcification in the coronary arteries, one of the features that characterizes vulnerable plaques at risk of rupture, can be imaged using cardiac gated 18F-sodium fluoride (18F-NaF) PET. We have shown in previous work that a motion correction technique applied to cardiac-gated 18F-NaF PET images can enhance image quality and improve uptake estimates. In this study, we further investigated the applicability of different algorithms for registration of the coronary artery PET images. In particular, we aimed to compare demons vs. level-set nonlinear registration techniques applied for the correction of cardiac motion in coronary 18F-NaF PET. To this end, fifteen patients underwent 18F-NaF PET and prospective coronary CT angiography (CCTA). PET data were reconstructed in 10 ECG gated bins; subsequently these gated bins were registered using demons and level-set methods guided by the extracted coronary arteries from CCTA, to eliminate the effect of cardiac motion on PET images. Noise levels, target-to-background ratios (TBR) and global motion were compared to assess image quality. Compared to the reference standard of using only diastolic PET image (25% of the counts from PET acquisition), cardiac motion registration using either level-set or demons techniques almost halved image noise due to the use of counts from the full PET acquisition and increased TBR difference between 18F-NaF positive and negative lesions. The demons method produces smoother deformation fields, exhibiting no singularities (which reflects how physically plausible the registration deformation is), as compared to the level-set method, which presents between 4 and 8% of singularities, depending on the coronary artery considered. In conclusion, the demons method produces smoother motion fields as compared to the level-set method, with a motion that is physiologically

  8. Weld defect detection on digital radiographic image using level set method

    NASA Astrophysics Data System (ADS)

    Halim, Suhaila Abd; Petrus, Bertha Trissan; Ibrahim, Arsmah; Manurung, Yupiter HP; Jayes, Mohd Idris

    2013-09-01

    Segmentation is a critical and widely used task for obtaining useful information in image processing. In this study, a level set method based on the Chan-Vese model is explored and applied to delineate weld defects on digital radiographic images, and its accuracy is evaluated. A set of images with a region of interest (ROI) containing a defect is used as input. The ROI images are pre-processed to improve their quality for better detection. Then, each image is segmented using the level set method, implemented in MATLAB R2009a. The accuracy of the method is evaluated using Receiver Operating Characteristic (ROC) analysis. Experimental results show that the method achieved an area under the ROC curve of 0.7 on the image set, with an operating point corresponding to a sensitivity of 0.6 and a specificity of 0.8. A segmentation technique such as the Chan-Vese level set can thus assist radiographers in accurately detecting defects on digital radiographic images.
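The core of the Chan-Vese model used here is region competition between the mean intensities inside and outside the contour. A minimal 1-D sketch (curvature regularization and the paper's MATLAB implementation are omitted; the toy "defect" signal is an assumption):

```python
# Two-phase piecewise-constant (Chan-Vese style) segmentation, 1-D.

def chan_vese_1d(image, steps=200, dt=0.2):
    """Region competition between mean intensities c1 (inside) and c2 (outside)."""
    n = len(image)
    phi = [(i - n / 2.0) / n for i in range(n)]   # crude initialization
    for _ in range(steps):
        inside = [v for v, p in zip(image, phi) if p < 0]
        outside = [v for v, p in zip(image, phi) if p >= 0]
        c1 = sum(inside) / max(len(inside), 1)    # mean intensity inside
        c2 = sum(outside) / max(len(outside), 1)  # mean intensity outside
        # phi moves down where the pixel matches c1 better, up where c2 wins
        phi = [p + dt * ((v - c1) ** 2 - (v - c2) ** 2)
               for p, v in zip(phi, image)]
    return [1 if p < 0 else 0 for p in phi]       # 1 = inside the contour

signal = [0.1] * 10 + [0.9] * 5 + [0.1] * 10     # bright "defect" at indices 10-14
mask = chan_vese_1d(signal)
```

Because the force is purely pointwise here, the result converges to the optimal two-level partition; the curvature term in the full model additionally smooths the contour.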

  9. A Measurement Framework for Team Level Assessment of Innovation Capability in Early Requirements Engineering

    NASA Astrophysics Data System (ADS)

    Regnell, Björn; Höst, Martin; Nilsson, Fredrik; Bengtsson, Henrik

    When developing software-intensive products for a market-place it is important for a development organisation to create innovative features for coming releases in order to achieve advantage over competitors. This paper focuses on assessment of innovation capability at team level in relation to the requirements engineering that is taking place before the actual product development projects are decided, when new business models, technology opportunities and intellectual property rights are created and investigated through e.g. prototyping and concept development. The result is a measurement framework focusing on four areas: innovation elicitation, selection, impact and ways-of-working. For each area, candidate measurements were derived from interviews to be used as inspiration in the development of a tailored measurement program. The framework is based on interviews with participants of a software team with specific innovation responsibilities and validated through cross-case analysis and feedback from practitioners.

  10. An investigation of children's levels of inquiry in an informal science setting

    NASA Astrophysics Data System (ADS)

    Clark-Thomas, Beth Anne

    Elementary school students' understanding of both science content and processes is enhanced by the higher level thinking associated with inquiry-based science investigations. Informal science setting personnel, elementary school teachers, and curriculum specialists charged with designing inquiry-based investigations would be well served by an understanding of the varying influence of certain factors upon students' willingness and ability to delve into such higher level inquiries. This study examined young children's use of inquiry-based materials and factors which may influence the level of inquiry they engaged in during informal science activities. An informal science setting was selected as the context for the examination of student inquiry behaviors because of the rich inquiry-based environment present at the site and the benefits previously noted in the research regarding the impact of informal science settings upon the construction of knowledge in science. The study revealed several patterns of behavior among children engaged in inquiry-based activities at informal science exhibits. These repeated behaviors varied in the children's apparently purposeful use of the materials at the exhibits. These levels of inquiry behavior were taxonomically defined as high/medium/low within this study utilizing a researcher-developed tool. Furthermore, in this study adult interventions, questions, or prompting were found to impact the level of inquiry engaged in by the children. This study revealed that higher levels of inquiry were preceded by task-directed and physical-feature prompts. Moreover, the levels of inquiry behaviors were halted, even lowered, when preceded by a prompt that focused on a science content or concept question. Results of this study have implications for the enhancement of inquiry-based science activities in elementary schools as well as in informal science settings. These findings have significance for all science educators.

  11. Analysis of Forensic Autopsy in 120 Cases of Medical Disputes Among Different Levels of Institutional Settings.

    PubMed

    Yu, Lin-Sheng; Ye, Guang-Hua; Fan, Yan-Yan; Li, Xing-Biao; Feng, Xiang-Ping; Han, Jun-Ge; Lin, Ke-Zhi; Deng, Miao-Wu; Li, Feng

    2015-09-01

    Despite advances in medical science, the causes of death can sometimes only be determined by pathologists after a complete autopsy. Few studies have investigated the importance of forensic autopsy in medically disputed cases among different levels of institutional settings. Our study aimed to analyze forensic autopsy in 120 cases of medical disputes among five levels of institutional settings between 2001 and 2012 in Wenzhou, China. The results showed an overall concordance rate of 55%. Of the 39% of clinically missed diagnoses, cardiovascular pathology comprises 55.32%, while respiratory pathology accounts for the remaining 44.68%. Factors that increased the likelihood of missed diagnoses were private clinics, community settings, and county hospitals. These results support that autopsy remains an important tool in establishing the cause of death in medically disputed cases, which may directly establish or exclude fault in medical care and thereby help resolve these cases. PMID:25929602

  12. Physical Activity Levels in Coeducational and Single-Gender High School Physical Education Settings

    ERIC Educational Resources Information Center

    Hannon, James; Ratliffe, Thomas

    2005-01-01

    The purpose of this study was to investigate the effects of coeducational (coed) and single-gender game-play settings on the activity levels of Caucasian and African American high school physical education students. Students participated in flag football, ultimate Frisbee, and soccer units. Classes were as follows: there were two coed classes, two…

  13. Re-Setting the Concentration Levels of Students in Higher Education: An Exploratory Study

    ERIC Educational Resources Information Center

    Burke, Lisa A.; Ray, Ruth

    2008-01-01

    Evidence suggests that college students' concentration levels are limited and hard to maintain. Even though relevant in higher education, scant empirical research exists on interventions to "re-set" their concentration during a college lecture. Using a within-subjects design, four active learning interventions are administered across two…

  14. Level set segmentation for greenbelts by integrating wavelet texture and priori color knowledge

    NASA Astrophysics Data System (ADS)

    Yang, Tie-jun; Song, Zhi-hui; Jiang, Chuan-xian; Huang, Lin

    2013-09-01

    Segmenting greenbelts quickly and accurately in remote sensing images is an economic and effective method for the statistics of green coverage rate (GCR). To address the over-reliance on prior knowledge of the traditional level set segmentation model based on the max-flow/min-cut Graph Cut principle and weighted Total Variation (GCTV), this paper proposes a level set segmentation method combining regional texture features with prior color knowledge and applies it to greenbelt segmentation in urban remote sensing images. Because the color of greenbelts alone is not reliable for segmentation, the Gabor wavelet transform is used to extract image texture features. We then integrate the extracted features into the GCTV model, which contains only prior color knowledge, and use both the prior knowledge and the targets' texture to constrain the evolution of the level set, which mitigates the over-reliance on prior knowledge. Meanwhile, the convexity of the corresponding energy functional is ensured by relaxation and thresholding, and a primal-dual algorithm with global relabeling is used to accelerate the evolution of the level set. The experiments show that our method effectively reduces the dependence of GCTV on prior knowledge and yields more accurate greenbelt segmentation results.

  15. Energy-optimal path planning by stochastic dynamically orthogonal level-set optimization

    NASA Astrophysics Data System (ADS)

    Subramani, Deepak N.; Lermusiaux, Pierre F. J.

    2016-04-01

    A stochastic optimization methodology is formulated for computing energy-optimal paths from among time-optimal paths of autonomous vehicles navigating in a dynamic flow field. Based on partial differential equations, the methodology rigorously leverages the level-set equation that governs time-optimal reachability fronts for a given relative vehicle-speed function. To set up the energy optimization, the relative vehicle-speed and headings are considered to be stochastic and new stochastic Dynamically Orthogonal (DO) level-set equations are derived. Their solution provides the distribution of time-optimal reachability fronts and corresponding distribution of time-optimal paths. An optimization is then performed on the vehicle's energy-time joint distribution to select the energy-optimal paths for each arrival time, among all stochastic time-optimal paths for that arrival time. Numerical schemes to solve the reduced stochastic DO level-set equations are obtained, and accuracy and efficiency considerations are discussed. These reduced equations are first shown to be efficient at solving the governing stochastic level-sets, in part by comparisons with direct Monte Carlo simulations. To validate the methodology and illustrate its accuracy, comparisons with semi-analytical energy-optimal path solutions are then completed. In particular, we consider the energy-optimal crossing of a canonical steady front and set up its semi-analytical solution using an energy-time nested nonlinear double-optimization scheme. We then showcase the inner workings and nuances of the energy-optimal path planning, considering different mission scenarios. Finally, we study and discuss results of energy-optimal missions in a wind-driven barotropic quasi-geostrophic double-gyre ocean circulation.
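For intuition, the time-optimal reachability computation that the level-set equation performs continuously can be mimicked by shortest travel times on a grid graph. This discrete stand-in (a 4-neighbor graph with averaged edge speeds; both are assumptions, not the paper's methodology) looks like:

```python
# Earliest arrival times through a spatially varying speed field via Dijkstra.
import heapq

def arrival_times(speed, dx):
    """Earliest arrival time from cell (0, 0) through a speed field."""
    rows, cols = len(speed), len(speed[0])
    times = [[float("inf")] * cols for _ in range(rows)]
    times[0][0] = 0.0
    heap = [(0.0, 0, 0)]
    while heap:
        t, r, c = heapq.heappop(heap)
        if t > times[r][c]:
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # travel time of one edge at the average local speed
                cost = dx / (0.5 * (speed[r][c] + speed[nr][nc]))
                if t + cost < times[nr][nc]:
                    times[nr][nc] = t + cost
                    heapq.heappush(heap, (t + cost, nr, nc))
    return times

field = [[1.0] * 5 for _ in range(5)]   # uniform unit speed
t = arrival_times(field, dx=1.0)
```

The level curves of these arrival times are the discrete analog of the reachability fronts; the continuous level-set formulation avoids the grid-direction bias this graph approximation suffers from.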

  16. Large Code Set for Double User Capacity and Low PAPR Level in Multicarrier Systems

    NASA Astrophysics Data System (ADS)

    Anwar, Khoirul; Saito, Masato; Hara, Takao; Okada, Minoru

    In this paper, a new large spreading code set with uniformly low cross-correlation is proposed. The proposed code set is capable of (1) increasing the number of assigned users (capacity) in a multicarrier code division multiple access (MC-CDMA) system and (2) reducing the peak-to-average power ratio (PAPR) of an orthogonal frequency division multiplexing (OFDM) system. We derive the new code set and present an example to demonstrate the performance improvements of OFDM and MC-CDMA systems. Our proposed code set with code length N provides K=2N+1 codes, supporting up to (2N+1) users, and exhibits lower cross-correlation than existing spreading code sets. Our results with N=16 subcarriers confirm that the proposed code set outperforms the current pseudo-orthogonal carrier interferometry (POCI) code set with a gain of 5 dB at a bit-error-rate (BER) of 10^-4 in the additive white Gaussian noise (AWGN) channel and a gain of more than 3.6 dB in a multipath fading channel.
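The PAPR metric at the center of claim (2) can be made concrete. The sketch below computes the PAPR of an OFDM symbol via a pure-Python IDFT; the proposed spreading-code construction itself is not reproduced here:

```python
# PAPR of an OFDM symbol computed from its time-domain IDFT signal.
import cmath
import math

def papr_db(symbols):
    """Peak-to-average power ratio (dB) of the time-domain OFDM signal."""
    n = len(symbols)
    time_signal = [sum(symbols[k] * cmath.exp(2j * cmath.pi * k * t / n)
                       for k in range(n)) / n for t in range(n)]
    powers = [abs(x) ** 2 for x in time_signal]
    return 10.0 * math.log10(max(powers) / (sum(powers) / n))

# Identical data on all 16 subcarriers is the worst case: PAPR = N,
# i.e. about 12 dB for N = 16; well-designed code sets stay far below this.
worst_case = papr_db([1.0] * 16)
```

Spreading codes reduce PAPR by ensuring the subcarrier phases never align coherently, which caps the peak of the summed time-domain signal.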

  17. Online monitoring of oil film using electrical capacitance tomography and level set method

    SciTech Connect

    Xue, Q. Ma, M.; Sun, B. Y.; Cui, Z. Q.; Wang, H. X.

    2015-08-15

    In the application of oil-air lubrication systems, electrical capacitance tomography (ECT) provides a promising way to monitor the oil film in pipelines by reconstructing cross-sectional oil distributions in real time. However, in the case of a small-diameter pipe and a thin oil film, the thickness of the oil film is hard to observe visually since the interface of oil and air is not obvious in the reconstructed images. The existence of artifacts in the reconstructions also seriously undermines the effectiveness of image segmentation techniques such as the level set method. Besides, the level set method is unsuitable for online monitoring due to its low computation speed. To address these problems, a modified level set method is developed: a distance regularized level set evolution formulation is extended to image two-phase flow online using an ECT system; a narrowband image filter is defined to eliminate the influence of artifacts; and, exploiting the continuity of the oil distribution variation, the detected oil-air interface of a former image is used as the initial contour for the detection of the subsequent frame. Thus, the propagation from the initial contour to the boundary is greatly accelerated, making real-time tracking possible. To test the feasibility of the proposed method, an oil-air lubrication facility with a 4 mm inner diameter pipe is measured in normal operation using an 8-electrode ECT system. Both simulation and experiment results indicate that the modified level set method is capable of visualizing the oil-air interface accurately online.
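The distance regularized level set evolution (DRLSE) formulation referenced here is commonly written as the following energy (after Li et al.; the exact variant used in this work may differ):

```latex
% mu: distance-regularization weight; p: double-well potential keeping |\nabla\phi| near 1;
% g: edge indicator; \delta, H: Dirac and Heaviside functions; \lambda, \alpha: term weights.
\mathcal{E}(\phi) \;=\; \mu\int_{\Omega} p\bigl(|\nabla\phi|\bigr)\,dx
\;+\; \lambda\int_{\Omega} g\,\delta(\phi)\,|\nabla\phi|\,dx
\;+\; \alpha\int_{\Omega} g\,H(-\phi)\,dx .
```

The regularization term is what removes the need for periodic reinitialization, which matters for the online-tracking use described above.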

  18. Online monitoring of oil film using electrical capacitance tomography and level set method

    NASA Astrophysics Data System (ADS)

    Xue, Q.; Sun, B. Y.; Cui, Z. Q.; Ma, M.; Wang, H. X.

    2015-08-01

    In the application of oil-air lubrication systems, electrical capacitance tomography (ECT) provides a promising way to monitor the oil film in pipelines by reconstructing cross-sectional oil distributions in real time. However, in the case of a small-diameter pipe and a thin oil film, the thickness of the oil film is hard to observe visually since the interface of oil and air is not obvious in the reconstructed images. The existence of artifacts in the reconstructions also seriously undermines the effectiveness of image segmentation techniques such as the level set method. Besides, the level set method is unsuitable for online monitoring due to its low computation speed. To address these problems, a modified level set method is developed: a distance regularized level set evolution formulation is extended to image two-phase flow online using an ECT system; a narrowband image filter is defined to eliminate the influence of artifacts; and, exploiting the continuity of the oil distribution variation, the detected oil-air interface of a former image is used as the initial contour for the detection of the subsequent frame. Thus, the propagation from the initial contour to the boundary is greatly accelerated, making real-time tracking possible. To test the feasibility of the proposed method, an oil-air lubrication facility with a 4 mm inner diameter pipe is measured in normal operation using an 8-electrode ECT system. Both simulation and experiment results indicate that the modified level set method is capable of visualizing the oil-air interface accurately online.

  19. A GPU Accelerated Discontinuous Galerkin Conservative Level Set Method for Simulating Atomization

    NASA Astrophysics Data System (ADS)

    Jibben, Zechariah J.

    This dissertation describes a process for interface capturing via an arbitrary-order, nearly quadrature free, discontinuous Galerkin (DG) scheme for the conservative level set method (Olsson et al., 2005, 2008). The DG numerical method is utilized to solve both advection and reinitialization, and executed on a refined level set grid (Herrmann, 2008) for effective use of processing power. Computation is executed in parallel utilizing both CPU and GPU architectures to make the method feasible at high order. Finally, a sparse data structure is implemented to take full advantage of parallelism on the GPU, where performance relies on well-managed memory operations. With solution variables projected into a kth order polynomial basis, a k + 1 order convergence rate is found for both advection and reinitialization tests using the method of manufactured solutions. Other standard test cases, such as Zalesak's disk and deformation of columns and spheres in periodic vortices, are also performed, showing several orders of magnitude improvement over traditional WENO level set methods. These tests also show the impact of reinitialization, which often increases shape and volume errors as a result of level set scalar trapping by normal vectors calculated from the local level set field. Accelerating advection via GPU hardware is found to provide a 30x speedup when comparing serial execution on a 2.0 GHz Intel Xeon E5-2620 CPU against an Nvidia Tesla K20 GPU, with speedup factors increasing with polynomial degree until shared memory is filled. A similar algorithm is implemented for reinitialization, which relies more heavily on shared and global memory and as a result fills them more quickly, producing a smaller speedup of 18x.
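The conservative level set method referenced above (Olsson et al.) replaces the signed distance function with a smeared Heaviside function psi in [0, 1]; its reinitialization step, in its commonly cited form, balances interface compression against diffusion along the interface normal:

```latex
% Reinitialization of the conservative level set function \psi \in [0,1]:
% compression of \psi(1-\psi) along the normal \hat{n}, balanced by
% diffusion of strength \epsilon acting normal to the interface.
\frac{\partial \psi}{\partial \tau}
\;+\; \nabla\!\cdot\!\bigl(\psi\,(1-\psi)\,\hat{n}\bigr)
\;=\; \epsilon\,\nabla\!\cdot\!\bigl((\nabla\psi\cdot\hat{n})\,\hat{n}\bigr).
```

Because this equation is in conservation form, discretizing it with a DG scheme (as done here) preserves mass up to machine precision, which is the method's main advantage over reinitializing a signed distance function.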

  20. Online monitoring of oil film using electrical capacitance tomography and level set method.

    PubMed

    Xue, Q; Sun, B Y; Cui, Z Q; Ma, M; Wang, H X

    2015-08-01

In oil-air lubrication systems, electrical capacitance tomography (ECT) provides a promising way to monitor the oil film in pipelines by reconstructing cross-sectional oil distributions in real time. For small-diameter pipes and thin oil films, however, the film thickness is difficult to observe visually because the oil-air interface is not obvious in the reconstructed images, and artifacts in the reconstructions seriously degrade the effectiveness of image segmentation techniques such as the level set method. Moreover, the standard level set method is too slow for online monitoring. To address these problems, a modified level set method is developed: a distance regularized level set evolution formulation is extended to image two-phase flow online using an ECT system, a narrowband image filter is defined to eliminate the influence of artifacts, and, exploiting the continuity of the oil distribution over time, the oil-air interface detected in one frame is used as the initial contour for the subsequent frame; the propagation from the initial contour to the boundary is thus greatly accelerated, making real-time tracking possible. To verify the feasibility of the proposed method, an oil-air lubrication facility with a 4 mm inner diameter pipe was measured in normal operation using an 8-electrode ECT system. Both simulation and experimental results indicate that the modified level set method is capable of visualizing the oil-air interface accurately online. PMID:26329232
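The warm-start idea in this abstract, seeding frame t+1 with the contour found in frame t, can be sketched as below. The "evolution" here is a toy morphological front propagation, not the authors' distance regularized level set evolution, and the synthetic frames and names are purely illustrative:

```python
import numpy as np

# Toy front propagation: grow a mask one pixel per step (4-neighbourhood),
# constrained to pixels above an intensity threshold, until it stabilises.
def evolve(image, init_mask, thresh=0.5, max_iter=500):
    """Return (final mask, number of steps to converge)."""
    mask = init_mask.copy()
    for step in range(1, max_iter + 1):
        grown = mask.copy()
        grown[1:, :]  |= mask[:-1, :]
        grown[:-1, :] |= mask[1:, :]
        grown[:, 1:]  |= mask[:, :-1]
        grown[:, :-1] |= mask[:, 1:]
        new = grown & (image > thresh)
        if np.array_equal(new, mask):
            return mask, step
        mask = new
    return mask, max_iter

# Frame t: an "oil film" disc; cold start from a single seed pixel.
yy, xx = np.mgrid[0:64, 0:64]
frame_t = ((xx - 32)**2 + (yy - 32)**2 < 15**2).astype(float)
seed = np.zeros((64, 64), bool); seed[32, 32] = True
mask_t, cold_steps = evolve(frame_t, seed)

# Frame t+1: the film has shifted slightly; warm start from mask_t.
frame_t1 = ((xx - 34)**2 + (yy - 32)**2 < 15**2).astype(float)
mask_t1, warm_steps = evolve(frame_t1, mask_t & (frame_t1 > 0.5))
print(warm_steps < cold_steps)  # True: warm start needs far fewer steps
```

The same economics apply to a real level set evolution: when the previous frame's interface is already close to the new boundary, only a few update steps per frame are needed, which is what makes online tracking feasible.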

  1. Loosely coupled level sets for retinal layers and drusen segmentation in subjects with dry age-related macular degeneration

    NASA Astrophysics Data System (ADS)

    Novosel, Jelena; Wang, Ziyuan; de Jong, Henk; Vermeer, Koenraad A.; van Vliet, Lucas J.

    2016-03-01

    Optical coherence tomography (OCT) is used to produce high-resolution three-dimensional images of the retina, which permit the investigation of retinal irregularities. In dry age-related macular degeneration (AMD), a chronic eye disease that causes central vision loss, disruptions such as drusen and changes in retinal layer thicknesses occur which could be used as biomarkers for disease monitoring and diagnosis. Due to the topology disrupting pathology, existing segmentation methods often fail. Here, we present a solution for the segmentation of retinal layers in dry AMD subjects by extending our previously presented loosely coupled level sets framework which operates on attenuation coefficients. In eyes affected by AMD, Bruch's membrane becomes visible only below the drusen and our segmentation framework is adapted to delineate such a partially discernible interface. Furthermore, the initialization stage, which tentatively segments five interfaces, is modified to accommodate the appearance of drusen. This stage is based on Dijkstra's algorithm and combines prior knowledge on the shape of the interface, gradient and attenuation coefficient in the newly proposed cost function. This prior knowledge is incorporated by varying the weights for horizontal, diagonal and vertical edges. Finally, quantitative evaluation of the accuracy shows a good agreement between manual and automated segmentation.

  2. Brain extraction from cerebral MRI volume using a hybrid level set based active contour neighborhood model

    PubMed Central

    2013-01-01

Background The extraction of brain tissue from cerebral MRI volume is an important pre-procedure for neuroimage analyses. The authors have developed an accurate and robust brain extraction method using a hybrid level set based active contour neighborhood model. Methods The method uses a nonlinear speed function in the hybrid level set model to eliminate boundary leakage. When using the new hybrid level set model, an active contour neighborhood model is applied iteratively in the neighborhood of the brain boundary. A slice-by-slice contour initialization method is proposed to obtain the neighborhood of the brain boundary. The method was applied to the brain MRI data provided by the Internet Brain Segmentation Repository (IBSR). Results In testing, a mean Dice similarity coefficient of 0.95±0.02 and a mean Hausdorff distance of 12.4±4.5 were obtained when performing our method across the IBSR data set (18 × 1.5 mm scans). The results obtained using our method were very similar to those produced using manual segmentation and achieved the smallest mean Hausdorff distance on the IBSR data. Conclusions An automatic method of brain extraction from cerebral MRI volume was achieved and produced competitively accurate results. PMID:23587217

  3. Use of a General Level Framework to Facilitate Performance Improvement in Hospital Pharmacists in Singapore

    PubMed Central

    Wong, Camilla; Coombes, Ian; Cardiff, Lynda; Duggan, Catherine; Yee, Mei-Ling; Wee Lim, Kiat; Bates, Ian

    2012-01-01

    Objective. To evaluate the acceptability and validity of an adapted version of the General Level Framework (GLF) as a tool to facilitate and evaluate performance development in general pharmacist practitioners (those with less than 3 years of experience) in a Singapore hospital. Method. Observational evaluations during daily clinical activities were prospectively recorded for 35 pharmacists using the GLF at 2 time points over an average of 9 months. Feedback was provided to the pharmacists and then individualized learning plans were formulated. Results. Pharmacists’ mean competency cluster scores improved in all 3 clusters, and significant improvement was seen in all but 8 of the 63 behavioral descriptors (p ≤ 0.05). Nonsignificant improvements were attributed to the highest level of performance having been attained upon initial evaluation. Feedback indicated that the GLF process was a positive experience, prompting reflection on practice and culminating in needs-based learning and ultimately improved patient care. Conclusions. The General Level Framework was an acceptable tool for the facilitation and evaluation of performance development in general pharmacist practitioners in a Singapore hospital. PMID:22919083

  4. Level set algorithms comparison for multi-slice CT left ventricle segmentation

    NASA Astrophysics Data System (ADS)

    Medina, Ruben; La Cruz, Alexandra; Ordoñes, Andrés.; Pesántez, Daniel; Morocho, Villie; Vanegas, Pablo

    2015-12-01

The comparison of several Level Set algorithms is performed with respect to 2D left ventricle segmentation in Multi-Slice CT images. Five algorithms are compared by calculating the Dice coefficient between the resulting segmentation contour and a reference contour traced by a cardiologist. The algorithms are also tested on images contaminated with Gaussian noise for several values of PSNR. Additionally an algorithm for providing the initialization shape is proposed. This algorithm is based on a combination of mathematical morphology tools with watershed and region growing algorithms. Results on the set of test images are promising and suggest the extension to 3D MSCT database segmentation.
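The Dice coefficient used for the comparison above can be computed directly from binary masks; a minimal sketch with synthetic masks standing in for the algorithm output and the cardiologist's reference:

```python
import numpy as np

# Dice = 2|A ∩ B| / (|A| + |B|), in [0, 1]; 1 means perfect overlap.
def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * (a & b).sum() / denom if denom else 1.0

seg = np.zeros((8, 8), int); seg[2:6, 2:6] = 1   # 16 px segmentation
ref = np.zeros((8, 8), int); ref[3:7, 3:7] = 1   # 16 px reference, 9 px overlap
print(dice(seg, ref))  # 2*9/32 = 0.5625
```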

  5. A variational level set method for the topology optimization of steady-state Navier Stokes flow

    NASA Astrophysics Data System (ADS)

    Zhou, Shiwei; Li, Qing

    2008-12-01

    The smoothness of topological interfaces often largely affects the fluid optimization and sometimes makes the density-based approaches, though well established in structural designs, inadequate. This paper presents a level-set method for topology optimization of steady-state Navier-Stokes flow subject to a specific fluid volume constraint. The solid-fluid interface is implicitly characterized by a zero-level contour of a higher-order scalar level set function and can be naturally transformed to other configurations as its host moves. A variational form of the cost function is constructed based upon the adjoint variable and Lagrangian multiplier techniques. To satisfy the volume constraint effectively, the Lagrangian multiplier derived from the first-order approximation of the cost function is amended by the bisection algorithm. The procedure allows evolving initial design to an optimal shape and/or topology by solving the Hamilton-Jacobi equation. Two classes of benchmarking examples are presented in this paper: (1) periodic microstructural material design for the maximum permeability; and (2) topology optimization of flow channels for minimizing energy dissipation. A number of 2D and 3D examples well demonstrated the feasibility and advantage of the level-set method in solving fluid-solid shape and topology optimization problems.
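The volume-constraint step described above, amending the Lagrangian multiplier by bisection, can be sketched as a one-dimensional root find: choose the multiplier so that the updated design occupies the prescribed volume fraction. The fields, update rule, and names below are illustrative stand-ins, not the paper's discretization:

```python
import numpy as np

# Sketch: find lam by bisection so that after the update
# phi_new = phi - dt*(sens + lam), the fluid phase {phi_new > 0}
# occupies target_frac of the domain. Larger lam shrinks the phase.
def bisect_multiplier(phi, sens, dt, target_frac, tol=1e-4):
    lo, hi = sens.min() - 1.0, sens.max() + 1.0
    for _ in range(100):
        lam = 0.5 * (lo + hi)
        frac = np.mean(phi - dt * (sens + lam) > 0)
        if abs(frac - target_frac) < tol:
            break
        if frac > target_frac:
            lo = lam      # phase too large: raise the multiplier
        else:
            hi = lam      # phase too small: lower it
    return lam

rng = np.random.default_rng(0)
phi = rng.normal(size=(64, 64))    # current level set values (illustrative)
sens = rng.normal(size=(64, 64))   # adjoint-based shape sensitivity (illustrative)
lam = bisect_multiplier(phi, sens, dt=0.5, target_frac=0.4)
frac = np.mean(phi - 0.5 * (sens + lam) > 0)
print(abs(frac - 0.4) < 0.01)  # True: volume fraction hits the target
```

Bisection works here because the updated volume fraction is monotone decreasing in the multiplier, so the constraint has a unique root inside the bracketing interval.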

  6. A patient-centered pharmacy services model of HIV patient care in community pharmacy settings: a theoretical and empirical framework.

    PubMed

    Kibicho, Jennifer; Owczarzak, Jill

    2012-01-01

Reflecting trends in health care delivery, pharmacy practice has shifted from a drug-specific to a patient-centered model of care, aimed at improving the quality of patient care and reducing health care costs. In this article, we outline a theoretical model of patient-centered pharmacy services (PCPS), based on in-depth, qualitative interviews with a purposive sample of 28 pharmacists providing care to HIV-infected patients in specialty, semispecialty, and nonspecialty pharmacy settings. Data analysis was an interactive process informed by pharmacists' interviews and a review of the general literature on patient-centered care, including Medication Therapy Management (MTM) services. Our main finding was that the current models of pharmacy services, including MTM, do not capture the range of pharmacy services in excess of mandated drug dispensing services. In this article, we propose a theoretical PCPS model that reflects the actual services pharmacists provide. The model includes five elements: (1) addressing patients as whole, contextualized persons; (2) customizing interventions to unique patient circumstances; (3) empowering patients to take responsibility for their own health care; (4) collaborating with clinical and nonclinical providers to address patient needs; and (5) developing sustained relationships with patients. The overarching goal of PCPS is to empower patients to take responsibility for their own health care and self-manage their HIV infection. Our findings provide the foundation for future studies regarding how widespread these practices are in diverse community settings, the validity of the proposed PCPS model, the potential for standardizing pharmacist practices, and the feasibility of a PCPS framework to reimburse pharmacists' services. PMID:22149903

  7. Not Your Basic Base Levels: Simulations of Erosion and Deposition With Fluctuating Water Levels in Coastal and Enclosed Basin Settings

    NASA Astrophysics Data System (ADS)

    Howard, A. D.; Matsubara, Y.; Lloyd, H.

    2006-12-01

The DELIM landform evolution model has been adapted to investigate erosional and depositional landforms in two settings with fluctuating base levels. The first is erosion and wave planation of terraced landscapes in Coastal Plain sediments along the estuarine Potomac River. The last 3.5 million years of erosion is simulated with base level fluctuations based upon the long-term oceanic δ18O record, eustatic sea level changes during the last 120 ka, estimates of the history of tectonic uplift in the region, and maximum depths of incision of the Potomac River during sea-level lowstands. Inhibition of runoff erosion by vegetation has been a crucial factor allowing persistence of uplands in the soft coastal plain bedrock. The role of vegetation is simulated as a contributing-area-dependent critical shear stress. Development of wave-cut terraces is simulated by episodic planation of the landscape during base-level highstands. Although low base level excursions are infrequent and of short duration, the total amount of erosion is largely controlled by the depth and frequency of lowstands. The model has also been adapted to account for flow routing and accompanying erosion and sedimentation in landscapes with multiple enclosed depressions. The hydrological portion of the model has been calibrated and tested in the Great Basin and Mojave regions of the southwestern U.S. In such a setting, runoff, largely from mountains, may flow through several lacustrine basins, each with evaporative losses. An iterative approach determines the size and depth of lakes, including overflow (or not) that balances runoff and evaporation. The model utilizes information on temperatures, rainfall, runoff, and evaporation within the region to parameterize evaporation and runoff as functions of latitude, mean annual temperature, precipitation, and elevation. The model is successful in predicting the location of modern perennial lakes in the region as well as that of lakes during the last

  8. Geological repository for nuclear high level waste in France from feasibility to design within a legal framework

    SciTech Connect

    Voizard, Patrice; Mayer, Stefan; Ouzounian, Gerald

    2007-07-01

    Over the past 15 years, the French program on deep geologic disposal of high level and long-lived radioactive waste has benefited from a clear legal framework as the result of the December 30, 1991 French Waste Act. To fulfil its obligations stipulated in this law, ANDRA has submitted the 'Dossier 2005 Argile' (clay) and 'Dossier 2005 Granite' to the French Government. The first of those reports presents a concept for the underground disposal of nuclear waste at a specific clay site and focuses on a feasibility study. Knowledge of the host rock characteristics is based on the investigations carried out at the Meuse/Haute Marne Underground Research Laboratory. The repository concept addresses various issues, the most important of which relates to the large amount of waste, the clay host rock and the reversibility requirement. This phase has ended upon review and evaluation of the 'Dossier 2005' made by different organisations including the National Review Board, the National Safety Authority and the NEA International Review Team. By passing the 'new', June 28, 2006 Planning Act on the sustainable management of radioactive materials and waste, the French parliament has further defined a clear legal framework for future work. This June 28 Planning Act thus sets a schedule and defines the objectives for the next phase of repository design in requesting the submission of a construction authorization application by 2015. The law calls for the repository program to be in a position to commission disposal installations by 2025. (authors)

  9. Level set segmentation of bovine corpora lutea in ex situ ovarian ultrasound images

    PubMed Central

    Rusnell, Brennan J; Pierson, Roger A; Singh, Jaswant; Adams, Gregg P; Eramian, Mark G

    2008-01-01

Background The objective of this study was to investigate the viability of level set image segmentation methods for the detection of corpora lutea (corpus luteum, CL) boundaries in ultrasonographic ovarian images. It was hypothesized that bovine CL boundaries could be located within 1–2 mm by a level set image segmentation methodology. Methods Level set methods embed a 2D contour in a 3D surface and evolve that surface over time according to an image-dependent speed function. A speed function suitable for segmentation of CLs in ovarian ultrasound images was developed. An initial contour was manually placed and contour evolution was allowed to proceed until the rate of change of the area was sufficiently small. The method was tested on ovarian ultrasonographic images (n = 8) obtained ex situ. An expert in ovarian ultrasound interpretation delineated CL boundaries manually to serve as a "ground truth". Accuracy of the level set segmentation algorithm was determined by comparing semi-automatically determined contours with ground truth contours using the mean absolute difference (MAD), root mean squared difference (RMSD), Hausdorff distance (HD), sensitivity, and specificity metrics. Results and discussion The mean MAD was 0.87 mm (sigma = 0.36 mm), RMSD was 1.1 mm (sigma = 0.47 mm), and HD was 3.4 mm (sigma = 2.0 mm) indicating that, on average, boundaries were accurate within 1–2 mm; however, deviations in excess of 3 mm from the ground truth were observed, indicating under- or over-expansion of the contour. Mean sensitivity and specificity were 0.814 (sigma = 0.171) and 0.990 (sigma = 0.00786), respectively, indicating that CLs were consistently undersegmented but rarely did the contour interior include pixels that were judged by the human expert not to be part of the CL. It was observed that in localities where gradient magnitudes within the CL were strong due to high contrast speckle, contour expansion stopped too early. Conclusion The hypothesis that level set
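The boundary metrics reported above (MAD, RMSD, HD) can be computed between two point-sampled contours; a minimal sketch on synthetic concentric circles (real evaluations would use the expert-traced and algorithm contours, sampled much more densely):

```python
import numpy as np

# Symmetric contour metrics from nearest-point distances in both directions.
def contour_metrics(a, b):
    # pairwise Euclidean distances between the two point sets
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    da, db = d.min(axis=1), d.min(axis=0)   # nearest-point distances each way
    mad = 0.5 * (da.mean() + db.mean())                      # mean absolute difference
    rmsd = np.sqrt(0.5 * ((da**2).mean() + (db**2).mean()))  # root mean squared difference
    hd = max(da.max(), db.max())                             # Hausdorff distance
    return mad, rmsd, hd

t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
truth = np.c_[10 * np.cos(t), 10 * np.sin(t)]   # radius-10 reference contour
auto  = np.c_[ 9 * np.cos(t),  9 * np.sin(t)]   # undersegmented by 1 unit
mad, rmsd, hd = contour_metrics(truth, auto)
print(round(mad, 2), round(hd, 2))  # 1.0 1.0
```

For these uniformly offset circles all three metrics agree; in real data HD exceeds MAD wherever the contour locally under- or over-expands, which is exactly the failure mode the abstract reports.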

  10. A semi-implicit level set method for multiphase flows and fluid-structure interaction problems

    NASA Astrophysics Data System (ADS)

    Cottet, Georges-Henri; Maitre, Emmanuel

    2016-06-01

In this paper we present a novel semi-implicit time-discretization of the level set method introduced in [8] for fluid-structure interaction problems. The idea stems from a linear stability analysis derived on a simplified one-dimensional problem. The semi-implicit scheme relies on a simple filter operating as a pre-processing on the level set function. It applies to multiphase flows driven by surface tension as well as to fluid-structure interaction problems. The semi-implicit scheme avoids the stability constraints that explicit schemes need to satisfy and significantly reduces the computational cost. It is validated through comparisons with the original explicit scheme and refinement studies on two-dimensional benchmarks.

  11. A Cartesian Adaptive Level Set Method for Two-Phase Flows

    NASA Technical Reports Server (NTRS)

    Ham, F.; Young, Y.-N.

    2003-01-01

In the present contribution we develop a level set method based on local anisotropic Cartesian adaptation as described in Ham et al. (2002). Such an approach should allow for the smallest possible Cartesian grid capable of resolving a given flow. The remainder of the paper is organized as follows. In section 2 the level set formulation for free surface calculations is presented and its strengths and weaknesses relative to the other free surface methods reviewed. In section 3 the collocated numerical method is described. In section 4 the method is validated by solving the 2D and 3D drop oscillation problem. In section 5 we present some results from more complex cases including the 3D drop breakup in an impulsively accelerated free stream, and the 3D immiscible Rayleigh-Taylor instability. Conclusions are given in section 6.

  12. Parallel computation of level set method for 500 Hz visual servo control

    NASA Astrophysics Data System (ADS)

    Fei, Xianfeng; Igarashi, Yasunobu; Hashimoto, Koichi

    2008-11-01

We propose a 2D microorganism tracking system using a parallel level set method and a column parallel vision system (CPV). This system keeps a single microorganism in the middle of the visual field under a microscope by visual servoing an automated stage. We propose a new energy function for the level set method. This function constrains the amount of light intensity inside the detected object contour to control the number of detected objects. The algorithm is implemented in the CPV system and the computation time per frame is approximately 2 ms. A tracking experiment of about 25 s is demonstrated, and we show that a single paramecium remains tracked even when other paramecia appear in the visual field and contact the tracked one.

  13. Segmentation of cardiac cine-MR images and myocardial deformation assessment using level set methods.

    PubMed

    Chenoune, Y; Deléchelle, E; Petit, E; Goissen, T; Garot, J; Rahmouni, A

    2005-12-01

    In this paper, we present an original method to assess the deformations of the left ventricular myocardium on cardiac cine-MRI. First, a segmentation process, based on a level set method is directly applied on a 2D + t dataset to detect endocardial contours. Second, the successive segmented contours are matched using a procedure of global alignment, followed by a morphing process based on a level set approach. Finally, local measurements of myocardial deformations are derived from the previously determined matched contours. The validation step is realized by comparing our results to the measurements achieved on the same patients by an expert using the semi-automated HARP reference method on tagged MR images. PMID:16290086

  14. Feasibility of level-set analysis of enface OCT retinal images in diabetic retinopathy

    PubMed Central

    Mohammad, Fatimah; Ansari, Rashid; Wanek, Justin; Francis, Andrew; Shahidi, Mahnaz

    2015-01-01

    Pathology segmentation in retinal images of patients with diabetic retinopathy is important to help better understand disease processes. We propose an automated level-set method with Fourier descriptor-based shape priors. A cost function measures the difference between the current and expected output. We applied our method to enface images generated for seven retinal layers and determined correspondence of pathologies between retinal layers. We compared our method to a distance-regularized level set method and show the advantages of using well-defined shape priors. Results obtained allow us to observe pathologies across multiple layers and to obtain metrics that measure the co-localization of pathologies in different layers. PMID:26137390

  15. Automatic segmentation of Leishmania parasite in microscopic images using a modified CV level set method

    NASA Astrophysics Data System (ADS)

    Farahi, Maria; Rabbani, Hossein; Talebi, Ardeshir; Sarrafzadeh, Omid; Ensafi, Shahab

    2015-12-01

Visceral Leishmaniasis is a parasitic disease that affects liver, spleen and bone marrow. According to the World Health Organization report, definitive diagnosis is possible just by direct observation of the Leishman body in the microscopic image taken from bone marrow samples. We utilize morphological operations and the Chan-Vese (CV) level set method to segment Leishman bodies in digital color microscopic images captured from bone marrow samples. Linear contrast stretching is used for image enhancement, and the morphological method is applied to determine the parasite regions and remove unwanted objects. Modified global and local CV level set methods are proposed for segmentation, and a shape-based stopping factor is used to accelerate the algorithm. Manual segmentation is considered as ground truth to evaluate the proposed method. The method is tested on 28 samples and achieves a mean segmentation error of 10.90% for the global model and 9.76% for the local model.
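One ingredient of the Chan-Vese (CV) level set model used here is the per-iteration update of the region mean intensities c1 (inside the contour) and c2 (outside); the front then moves to reduce the fit errors (I - c1)^2 inside and (I - c2)^2 outside. A minimal sketch of the region-mean step on a synthetic bright-object image (all values illustrative, not the authors' pipeline):

```python
import numpy as np

# Chan-Vese region means: c1 over {phi > 0}, c2 over {phi <= 0}.
def cv_region_means(image, phi):
    inside = phi > 0
    c1 = image[inside].mean() if inside.any() else 0.0
    c2 = image[~inside].mean() if (~inside).any() else 0.0
    return c1, c2

yy, xx = np.mgrid[0:64, 0:64]
# Bright "parasite" disc (0.9) on a dark background (0.1).
image = np.where((xx - 32)**2 + (yy - 32)**2 < 10**2, 0.9, 0.1)
phi = 12.0 - np.hypot(xx - 32, yy - 32)   # initial contour: disc of radius 12
c1, c2 = cv_region_means(image, phi)
print(c1 > 0.5 > c2)  # True: means already separate object from background
```

Because the CV energy depends on region statistics rather than edges, it tolerates the weak or noisy boundaries typical of microscopy, which is why it suits this application.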

  16. An image-set for identifying multiple regions/levels of interest in digital images

    NASA Astrophysics Data System (ADS)

    Jaber, Mustafa; Bailly, Mark; Wang, Yuqiong; Saber, Eli

    2011-09-01

In the field of identifying regions-of-interest (ROI) in digital images, several image-sets are referenced in the literature; the open-source ones typically present a single main object (usually located at or near the image center as a pop-out). In this paper, we present a comprehensive image-set (with its ground-truth) which will be made publicly available. The database consists of images that demonstrate multiple-regions-of-interest (MROI) or multiple-levels-of-interest (MLOI). The former terminology signifies that the scene has a group of subjects/objects (not necessarily spatially-connected regions) that share the same level of perceptual priority to the human observer while the latter indicates that the scene is complex enough to have primary, secondary, and background objects. The methodology for developing the proposed image-set is described. A psychophysical experiment to identify MROI and MLOI was conducted, the results of which are also presented. The image-set has been developed to be used in training and evaluation of ROI detection algorithms. Applications include image compression, thumbnailing, summarization, and mobile phone imagery.

  17. Therapeutic and diagnostic set for irradiation the cell lines in low level laser therapy

    NASA Astrophysics Data System (ADS)

    Gryko, Lukasz; Zajac, Andrzej; Gilewski, Marian; Szymanska, Justyna; Goralczyk, Krzysztof

    2014-05-01

This paper presents an optoelectronic diagnostic set for standardizing biostimulation procedures performed on cell lines. The basic functional components of the therapeutic set are two digitally controlled illuminators. They are composed of sets of semiconductor emitters, medium-power laser diodes and high-power LEDs, emitting radiation in a wide spectral range from 600 nm to 1000 nm. The emitters are coupled to the applicator by fibre-optic and optical systems that provide uniform irradiation of the vessel holding the cell culture samples. An integrated spectrometer and optical power meter allow the energy and spectral parameters of the electromagnetic radiation to be controlled during the Low Level Light Therapy procedure. Dedicated power supplies and a digital control system allow each emitter to be powered independently. An active temperature stabilization system was developed to thermally tune the emitted spectral line for more efficient overlap with the absorption spectra of biological acceptors. Using the set for controlled irradiation while measuring the absorption spectrum of the biological medium, the impact of the exposure parameters on the state of cells subjected to Low Level Light Therapy can be assessed objectively. This procedure allows comparing the biological response of cell lines after irradiation with radiation of variable spectral and energetic parameters. Experiments were carried out on vascular endothelial cell lines. Cell proliferation was examined after irradiation with LEDs at 645 nm, 680 nm, 740 nm, 780 nm, 830 nm, 870 nm, 890 nm and 970 nm and with lasers at 650 nm and 830 nm.

  18. A Bayesian framework for cell-level protein network analysis for multivariate proteomics image data

    NASA Astrophysics Data System (ADS)

    Kovacheva, Violet N.; Sirinukunwattana, Korsuk; Rajpoot, Nasir M.

    2014-03-01

    The recent development of multivariate imaging techniques, such as the Toponome Imaging System (TIS), has facilitated the analysis of multiple co-localisation of proteins. This could hold the key to understanding complex phenomena such as protein-protein interaction in cancer. In this paper, we propose a Bayesian framework for cell level network analysis allowing the identification of several protein pairs having significantly higher co-expression levels in cancerous tissue samples when compared to normal colon tissue. It involves segmenting the DAPI-labeled image into cells and determining the cell phenotypes according to their protein-protein dependence profile. The cells are phenotyped using Gaussian Bayesian hierarchical clustering (GBHC) after feature selection is performed. The phenotypes are then analysed using Difference in Sums of Weighted cO-dependence Profiles (DiSWOP), which detects differences in the co-expression patterns of protein pairs. We demonstrate that the pairs highlighted by the proposed framework have high concordance with recent results using a different phenotyping method. This demonstrates that the results are independent of the clustering method used. In addition, the highlighted protein pairs are further analysed via protein interaction pathway databases and by considering the localization of high protein-protein dependence within individual samples. This suggests that the proposed approach could identify potentially functional protein complexes active in cancer progression and cell differentiation.

  19. MOVE: a multi-level ontology-based visualization and exploration framework for genomic networks.

    PubMed

    Bosman, Diederik W J; Blom, Evert-Jan; Ogao, Patrick J; Kuipers, Oscar P; Roerdink, Jos B T M

    2007-01-01

Among the various research areas that comprise bioinformatics, systems biology is gaining increasing attention. An important goal of systems biology is the unraveling of dynamic interactions between components of living cells (e.g., proteins, genes). These interactions exist among others on genomic, transcriptomic, proteomic and metabolomic levels. The levels themselves are heavily interconnected, resulting in complex networks of different interacting biological entities. Currently, various bioinformatics tools exist which are able to perform a particular analysis on a particular type of network. Unfortunately, each tool has its own disadvantages that prevent it from being used consistently across different types of networks or analytical methods. This paper describes the conceptual development of an open source extensible software framework that supports visualization and exploration of highly complex genomic networks, like metabolic or gene regulatory networks. The focus is on the conceptual foundations, starting from requirements, a description of the state of the art of network visualization systems, and an analysis of their shortcomings. We describe the implementation of some initial modules of the framework and apply them to a biological test case in bacterial regulation, which shows the relevance and feasibility of the proposed approach. PMID:17688427

  20. Probabilistic framework for assessing the ice sheet contribution to sea level change

    PubMed Central

    Little, Christopher M.; Urban, Nathan M.; Oppenheimer, Michael

    2013-01-01

    Previous sea level rise (SLR) assessments have excluded the potential for dynamic ice loss over much of Greenland and Antarctica, and recently proposed “upper bounds” on Antarctica’s 21st-century SLR contribution are derived principally from regions where present-day mass loss is concentrated (basin 15, or B15, drained largely by Pine Island, Thwaites, and Smith glaciers). Here, we present a probabilistic framework for assessing the ice sheet contribution to sea level change that explicitly accounts for mass balance uncertainty over an entire ice sheet. Applying this framework to Antarctica, we find that ongoing mass imbalances in non-B15 basins give an SLR contribution by 2100 that: (i) is comparable to projected changes in B15 discharge and Antarctica’s surface mass balance, and (ii) varies widely depending on the subset of basins and observational dataset used in projections. Increases in discharge uncertainty, or decreases in the exceedance probability used to define an upper bound, increase the fractional contribution of non-B15 basins; even weak spatial correlations in future discharge growth rates markedly enhance this sensitivity. Although these projections rely on poorly constrained statistical parameters, they may be updated with observations and/or models at many spatial scales, facilitating a more comprehensive account of uncertainty that, if implemented, will improve future assessments. PMID:23404697
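The basin-aggregation idea described above can be sketched as a toy Monte Carlo: sample each basin's contribution from its own distribution, sum across basins, and read an "upper bound" off a chosen exceedance probability. The basin names, distributions, and parameters below are invented for illustration and bear no relation to the paper's data:

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples = 100_000

# (mean, sigma) of log-contribution (cm by 2100) for hypothetical basins;
# "B15" echoes the abstract's label but its numbers here are made up.
basins = {"B15": (1.5, 0.6), "B16": (0.3, 0.5), "B17": (0.1, 0.5)}

# Sum independent per-basin samples into a total-contribution distribution.
total = np.zeros(n_samples)
for mu, sigma in basins.values():
    total += rng.lognormal(mu, sigma, n_samples)

def upper_bound(samples, exceedance_p):
    """Value exceeded with probability exceedance_p."""
    return np.quantile(samples, 1.0 - exceedance_p)

# Lowering the exceedance probability raises the quoted upper bound,
# mirroring the sensitivity discussed in the abstract.
print(upper_bound(total, 0.05) < upper_bound(total, 0.01))  # True
```

A full treatment would add spatial correlation between basin growth rates, which, as the abstract notes, markedly inflates the tail of the summed distribution relative to the independent case sketched here.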

  1. Systems Science and Obesity Policy: A Novel Framework for Analyzing and Rethinking Population-Level Planning

    PubMed Central

    Matteson, Carrie L.; Finegood, Diane T.

    2014-01-01

    Objectives. We demonstrate the use of a systems-based framework to assess solutions to complex health problems such as obesity. Methods. We coded 12 documents published between 2004 and 2013 aimed at influencing obesity planning for complex systems design (9 reports from US and Canadian governmental or health authorities, 1 Cochrane review, and 2 Institute of Medicine reports). We sorted data using the intervention-level framework (ILF), a novel solutions-oriented approach to complex problems. An in-depth comparison of 3 documents provides further insight into complexity and systems design in obesity policy. Results. The majority of strategies focused mainly on changing the determinants of energy imbalance (food intake and physical activity). ILF analysis brings to the surface actions aimed at higher levels of system function and points to a need for more innovative policy design. Conclusions. Although many policymakers acknowledge obesity as a complex problem, many strategies stem from the paradigm of individual choice and are limited in scope. The ILF provides a template to encourage natural systems thinking and more strategic policy design grounded in complexity science. PMID:24832406

  2. A Framework for Lab Work Management in Mass Courses. Application to Low Level Input/Output without Hardware

    ERIC Educational Resources Information Center

    Rodriguez, Santiago; Zamorano, Juan; Rosales, Francisco; Dopico, Antonio Garcia; Pedraza, Jose Luis

    2007-01-01

    This paper describes a complete lab work management framework designed and developed in the authors' department to help teachers to manage the small projects that students are expected to complete as lab assignments during their graduate-level computer engineering studies. The paper focuses on an application example of the framework to a specific…

  3. Stochastic level-set variational implicit-solvent approach to solute-solvent interfacial fluctuations.

    PubMed

    Zhou, Shenggao; Sun, Hui; Cheng, Li-Tien; Dzubiella, Joachim; Li, Bo; McCammon, J Andrew

    2016-08-01

    Recent years have seen the initial success of a variational implicit-solvent model (VISM), implemented with a robust level-set method, in capturing efficiently different hydration states and providing quantitatively good estimation of solvation free energies of biomolecules. The level-set minimization of the VISM solvation free-energy functional of all possible solute-solvent interfaces or dielectric boundaries predicts an equilibrium biomolecular conformation that is often close to an initial guess. In this work, we develop a theory in the form of Langevin geometrical flow to incorporate solute-solvent interfacial fluctuations into the VISM. Such fluctuations are crucial to biomolecular conformational changes and binding processes. We also develop a stochastic level-set method to numerically implement such a theory. We describe the interfacial fluctuation through the "normal velocity" that is the solute-solvent interfacial force, derive the corresponding stochastic level-set equation in the sense of Stratonovich so that the surface representation is independent of the choice of implicit function, and develop numerical techniques for solving such an equation and processing the numerical data. We apply our computational method to study the dewetting transition in the system of two hydrophobic plates and a hydrophobic cavity of a synthetic host molecule cucurbit[7]uril. Numerical simulations demonstrate that our approach can describe an underlying system jumping out of a local minimum of the free-energy functional and can capture dewetting transitions of hydrophobic systems. In the case of two hydrophobic plates, we find that the wavelength of interfacial fluctuations has a strong influence on the dewetting transition. In addition, we find that the estimated energy barrier of the dewetting transition scales quadratically with the inter-plate distance, agreeing well with existing studies of molecular dynamics simulations. Our work is a first step toward the inclusion of

  4. Stochastic level-set variational implicit-solvent approach to solute-solvent interfacial fluctuations

    NASA Astrophysics Data System (ADS)

    Zhou, Shenggao; Sun, Hui; Cheng, Li-Tien; Dzubiella, Joachim; Li, Bo; McCammon, J. Andrew

    2016-08-01

    Recent years have seen the initial success of a variational implicit-solvent model (VISM), implemented with a robust level-set method, in capturing efficiently different hydration states and providing quantitatively good estimation of solvation free energies of biomolecules. The level-set minimization of the VISM solvation free-energy functional of all possible solute-solvent interfaces or dielectric boundaries predicts an equilibrium biomolecular conformation that is often close to an initial guess. In this work, we develop a theory in the form of Langevin geometrical flow to incorporate solute-solvent interfacial fluctuations into the VISM. Such fluctuations are crucial to biomolecular conformational changes and binding processes. We also develop a stochastic level-set method to numerically implement such a theory. We describe the interfacial fluctuation through the "normal velocity" that is the solute-solvent interfacial force, derive the corresponding stochastic level-set equation in the sense of Stratonovich so that the surface representation is independent of the choice of implicit function, and develop numerical techniques for solving such an equation and processing the numerical data. We apply our computational method to study the dewetting transition in the system of two hydrophobic plates and a hydrophobic cavity of a synthetic host molecule cucurbit[7]uril. Numerical simulations demonstrate that our approach can describe an underlying system jumping out of a local minimum of the free-energy functional and can capture dewetting transitions of hydrophobic systems. In the case of two hydrophobic plates, we find that the wavelength of interfacial fluctuations has a strong influence on the dewetting transition. In addition, we find that the estimated energy barrier of the dewetting transition scales quadratically with the inter-plate distance, agreeing well with existing studies of molecular dynamics simulations. Our work is a first step toward the inclusion of

  5. Segmentation of the liver from abdominal MR images: a level-set approach

    NASA Astrophysics Data System (ADS)

    Abdalbari, Anwar; Huang, Xishi; Ren, Jing

    2015-03-01

    Using prior knowledge in the segmentation of abdominal MR images enables a more accurate and comprehensive interpretation of the organ to be segmented. Prior knowledge about abdominal structures such as the liver vessels can be employed to obtain an accurate segmentation of the liver, which in turn supports an accurate diagnosis or treatment plan. In this paper, a new method for segmenting the liver from abdominal MR images using liver vessels as prior knowledge is proposed. The approach employs a level set method to segment the liver from abdominal MR images. The speed image used in the level set method is responsible for propagating and stopping region growing at boundaries. Because of the poor contrast in MR images between the liver and the surrounding organs (i.e., stomach, kidneys, and heart), the segmented liver can leak into those organs, leading to inaccurate or incorrect segmentation. For that reason, a second speed image is developed, as an extra term in the level set, to control front propagation at weak edges with the help of the original speed image. The basic idea of the proposed approach is to use the second speed image as a boundary surface that is approximately orthogonal to the area of the leak. The aim of the new speed image is to slow down the level set propagation and prevent the leak in regions close to the liver boundary. The new speed image is a surface created by filling holes to reconstruct the liver surface. These holes are formed by the exit and entry of the liver vessels and are considered the main cause of the segmentation leak. Results show that the proposed method outperforms other methods in the literature.
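The edge-stopping behavior of a speed image can be conveyed with a minimal sketch. The following is an illustration only, not the paper's implementation: it assumes a simple gradient-based speed F = 1/(1 + α|∇I|²) evaluated on a synthetic image, where the parameter α is a made-up tuning constant.

```python
import numpy as np
from scipy import ndimage

def speed_image(img, sigma=1.0, alpha=100.0):
    """Edge-stopping speed: ~1 in flat regions, ~0 at strong gradients."""
    smoothed = ndimage.gaussian_filter(img.astype(float), sigma)
    gy, gx = np.gradient(smoothed)
    grad_mag = np.hypot(gx, gy)
    return 1.0 / (1.0 + alpha * grad_mag**2)

# Synthetic "organ": a bright disk on a dark background.
y, x = np.mgrid[0:64, 0:64]
img = ((x - 32)**2 + (y - 32)**2 < 15**2).astype(float)

F = speed_image(img)
print(F[32, 32])  # interior: close to 1, so the front propagates freely
print(F[32, 17])  # disk boundary: small, so the front slows down
```

A second, leak-blocking speed term as described in the abstract would enter as an additional multiplicative factor that vanishes on the reconstructed liver surface.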

  6. A self-adaptive oriented particles Level-Set method for tracking interfaces

    NASA Astrophysics Data System (ADS)

    Ianniello, S.; Di Mascio, A.

    2010-02-01

    A new method for tracking evolving interfaces by Lagrangian particles in conjunction with a Level-Set approach is introduced. This numerical technique is based on the use of time evolution equations for fundamental vector and tensor quantities defined on the front and represents a new and convenient way to couple the advantages of the Eulerian description given by a Level-Set function ϕ with the use of Lagrangian massless particles. The term oriented points out that the information advected by the particles concerns not only the spatial location, but also the local (outward) normal vector n to the interface Γ and the second fundamental tensor (the shape operator) ∇n. The particles are located exactly upon Γ and provide all the required information for tracking the interface on their own. In addition, a self-adaptive mechanism suitably modifies, at each time step, the marker distribution in the numerical domain: each particle behaves both as a potential seeder of new markers on Γ (so as to guarantee an accurate reconstruction of the interface) and as a de-seeder (to avoid any useless gathering of markers and to limit the computational effort). The algorithm is conceived to avoid any transport equation for ϕ and to confine the Level-Set function to the role of a mere post-processing tool; thus, all the numerical diffusion problems usually affecting the Level-Set methodology are removed. The method has been tested on both 2D and 3D configurations; it carries out a fast reconstruction of the interface and its accuracy is limited only by the spatial resolution of the mesh.

  7. A level set simulation for ordering of quantum dots via cleaved-edge overgrowth

    NASA Astrophysics Data System (ADS)

    Niu, X. B.; Uccelli, E.; Fontcuberta i Morral, A.; Ratsch, C.

    2009-07-01

    Cleaved-edge overgrowth (CEO) is a promising technique to obtain ordered arrays of quantum dots, where the size and position of the dots can be controlled very well. We present level set simulations for CEO. Our simulations illustrate how the quality of the CEO technique depends on the potential energy surface (PES) for adatom diffusion, and thus suggest how variations of the PES can potentially improve the uniformity of quantum dot arrays.

  8. Hydrological drivers of record-setting water level rise on Earth's largest lake system

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Bruxer, J.; Durnford, D.; Smith, J. P.; Clites, A. H.; Seglenieks, F.; Qian, S. S.; Hunter, T. S.; Fortin, V.

    2016-05-01

    Between January 2013 and December 2014, water levels on Lake Superior and Lake Michigan-Huron, the two largest lakes on Earth by surface area, rose at the highest rate ever recorded for a 2 year period beginning in January and ending in December of the following year. This historic event coincided with below-average air temperatures and extensive winter ice cover across the Great Lakes. It also brought an end to a 15 year period of persistently below-average water levels on Lakes Superior and Michigan-Huron that included several months of record-low water levels. To differentiate hydrological drivers behind the recent water level rise, we developed a Bayesian Markov chain Monte Carlo (MCMC) routine for inferring historical estimates of the major components of each lake's water budget. Our results indicate that, in 2013, the water level rise on Lake Superior was driven by increased spring runoff and over-lake precipitation. In 2014, reduced over-lake evaporation played a more significant role in Lake Superior's water level rise. The water level rise on Lake Michigan-Huron in 2013 was also due to above-average spring runoff and persistent over-lake precipitation, while in 2014, it was due to a rare combination of below-average evaporation, above-average runoff and precipitation, and very high inflow rates from Lake Superior through the St. Marys River. We expect, in future research, to apply our new framework across the other Laurentian Great Lakes, and to Earth's other large freshwater basins as well.

  9. On the geometry of two-dimensional slices of irregular level sets in turbulent flows

    SciTech Connect

    Catrakis, H.J.; Cook, A.W.; Dimotakis, P.E.; Patton, J.M.

    1998-03-20

    Isoscalar surfaces in turbulent flows are found to be more complex than (self-similar) fractals, in both the far field of liquid-phase turbulent jets and in a realization of Rayleigh-Taylor-instability flow. In particular, they exhibit a scale-dependent coverage dimension, D₂(λ), for 2-D slices of scalar level sets, that increases with scale, from unity at small scales to 2 at large scales. For the jet flow and Reynolds numbers investigated, the isoscalar-surface geometry is both scalar-threshold- and Re-dependent; the level-set (coverage) length decreases with increasing Re, indicating enhanced mixing with increasing Reynolds number; and the size distribution of closed regions is well described by lognormal statistics at small scales. A similar D₂(λ) behavior is found for level-set data of 3-D density-interface behavior in recent direct numerical-simulation studies of Rayleigh-Taylor-instability flow. A comparison of (spatial) spectral and isoscalar coverage statistics will be discussed.

  10. Vascular Tree Segmentation in Medical Images Using Hessian-Based Multiscale Filtering and Level Set Method

    PubMed Central

    Jin, Jiaoying; Yang, Linjun; Zhang, Xuming

    2013-01-01

    Vascular segmentation plays an important role in medical image analysis. A novel technique for the automatic extraction of vascular trees from 2D medical images is presented, which combines Hessian-based multiscale filtering and a modified level set method. In the proposed algorithm, the morphological top-hat transformation is first applied to attenuate the background. Then Hessian-based multiscale filtering is used to enhance vascular structures by combining the Hessian matrix with Gaussian convolution to tune the filtering response to specific scales. Because Gaussian convolution tends to blur vessel boundaries, which makes scale selection inaccurate, an improved level set method is finally proposed to extract vascular structures by introducing an external constraint term related to the standard deviation of the Gaussian function into the traditional level set. Our approach was tested on synthetic images with vascular-like structures and on 2D slices extracted from real 3D abdomen magnetic resonance angiography (MRA) images along the coronal plane. The segmentation rates for synthetic images are above 95%. The results for MRA images demonstrate that the proposed method can extract most of the vascular structures successfully and accurately. Therefore, the proposed method is effective for vascular tree extraction in medical images. PMID:24348738
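A single-scale, 2-D flavor of Hessian-based vessel enhancement (along the lines of Frangi's vesselness measure, which is a common choice for this filtering step) can be sketched as follows. The parameter values and the synthetic test image are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from scipy import ndimage

def vesselness_2d(img, sigma=2.0, beta=0.5, c=0.5):
    """Frangi-style 2-D vesselness from Hessian eigenvalues at one scale."""
    # Scale-normalized second derivatives via Gaussian convolution.
    Hxx = sigma**2 * ndimage.gaussian_filter(img, sigma, order=(0, 2))
    Hyy = sigma**2 * ndimage.gaussian_filter(img, sigma, order=(2, 0))
    Hxy = sigma**2 * ndimage.gaussian_filter(img, sigma, order=(1, 1))
    # Eigenvalues of [[Hxx, Hxy], [Hxy, Hyy]], sorted so |l1| <= |l2|.
    tmp = np.sqrt((Hxx - Hyy)**2 + 4 * Hxy**2)
    l1 = 0.5 * (Hxx + Hyy + tmp)
    l2 = 0.5 * (Hxx + Hyy - tmp)
    swap = np.abs(l1) > np.abs(l2)
    l1, l2 = np.where(swap, l2, l1), np.where(swap, l1, l2)
    Rb = np.abs(l1) / (np.abs(l2) + 1e-12)   # blobness vs. tube-likeness
    S = np.sqrt(l1**2 + l2**2)               # second-order structureness
    v = np.exp(-Rb**2 / (2 * beta**2)) * (1 - np.exp(-S**2 / (2 * c**2)))
    v[l2 > 0] = 0.0  # bright tubes on dark background have lambda2 < 0
    return v

# Synthetic bright horizontal "vessel" on a dark background.
img = np.zeros((64, 64))
img[30:34, :] = 1.0
v = vesselness_2d(img)
print(v[32, 32] > v[10, 32])  # stronger response on the vessel
```

A multiscale version would take the pixelwise maximum of the response over a range of sigma values.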

  11. Numerical Simulation of Dynamic Contact Angles and Contact Lines in Multiphase Flows using Level Set Method

    NASA Astrophysics Data System (ADS)

    Pendota, Premchand

    Many physical phenomena and industrial applications involve multiphase fluid flows, and hence it is of high importance to be able to simulate various aspects of these flows accurately. The Dynamic Contact Angles (DCA) and the contact lines at the wall boundaries are a couple of such important aspects. In the past few decades, many mathematical models have been developed for predicting the contact angles of the interface with the wall boundary under various flow conditions. These models are used to incorporate the physics of DCA and contact line motion in numerical simulations using various interface capturing/tracking techniques. In the current thesis, a simple approach to incorporate the static and dynamic contact angle boundary conditions using the level set method is developed and implemented in the multiphase CFD codes LIT (Level set Interface Tracking) (Herrmann (2008)) and NGA (flow solver) (Desjardins et al (2008)). Various DCA models and associated boundary conditions are reviewed. In addition, numerical aspects such as the occurrence of a stress singularity at the contact lines and grid convergence of the macroscopic interface shape are dealt with in the context of the level set approach.

  12. A Real-Time Algorithm for the Approximation of Level-Set-Based Curve Evolution

    PubMed Central

    Shi, Yonggang; Karl, William Clem

    2010-01-01

    In this paper, we present a complete and practical algorithm for the approximation of level-set-based curve evolution suitable for real-time implementation. In particular, we propose a two-cycle algorithm to approximate level-set-based curve evolution without the need of solving partial differential equations (PDEs). Our algorithm is applicable to a broad class of evolution speeds that can be viewed as composed of a data-dependent term and a curve smoothness regularization term. We achieve curve evolution corresponding to such evolution speeds by separating the evolution process into two different cycles: one cycle for the data-dependent term and a second cycle for the smoothness regularization. The smoothing term is derived from a Gaussian filtering process. In both cycles, the evolution is realized through a simple element switching mechanism between two linked lists that implicitly represent the curve using an integer-valued level-set function. By careful construction, all the key evolution steps require only integer operations. A consequence is that we obtain significant computation speedups compared to exact PDE-based approaches while obtaining excellent agreement with these methods for problems of practical engineering interest. In particular, the resulting algorithm is fast enough for use in real-time video processing applications, which we demonstrate through several image segmentation and video tracking experiments. PMID:18390371
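The flavor of PDE-free, list-based curve evolution can be conveyed with a much-simplified stand-in: boundary pixels switch between inside and outside according to the sign of a data speed. This sketch uses morphological neighborhoods in place of the paper's explicit linked lists and omits the smoothing cycle, so it is an illustration of the idea rather than the published algorithm.

```python
import numpy as np
from scipy import ndimage

def evolve_region(img, init_mask, n_iter=200, thresh=0.5):
    """Grow/shrink a region by switching boundary pixels, no PDE solve."""
    mask = init_mask.astype(bool).copy()
    F = img - thresh  # data-dependent speed (sign drives the switching)
    for _ in range(n_iter):
        # "Switch in": outside pixels touching the boundary with F > 0.
        add = ndimage.binary_dilation(mask) & ~mask & (F > 0)
        # "Switch out": inside boundary pixels with F < 0.
        drop = mask & ~ndimage.binary_erosion(mask) & (F < 0)
        if not (add.any() or drop.any()):
            break  # stationary: no boundary pixel wants to move
        mask = (mask | add) & ~drop
    return mask

# Bright square target with a small seed region inside it.
img = np.zeros((40, 40))
img[10:30, 10:30] = 1.0
seed = np.zeros((40, 40), bool)
seed[18:22, 18:22] = True
out = evolve_region(img, seed)
print(out.sum())  # 400: the full 20x20 bright square is recovered
```

The published two-cycle algorithm achieves the same effect far more efficiently by maintaining the inner and outer boundary pixels in two linked lists and updating an integer level-set function in place.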

  13. Dynamic multi-source X-ray tomography using a spacetime level set method

    NASA Astrophysics Data System (ADS)

    Niemi, Esa; Lassas, Matti; Kallonen, Aki; Harhanen, Lauri; Hämäläinen, Keijo; Siltanen, Samuli

    2015-06-01

    A novel variant of the level set method is introduced for dynamic X-ray tomography. The target is allowed to change in time while being imaged by one or several source-detector pairs at a relatively high frame-rate. The algorithmic approach is motivated by the results in [22], showing that the modified level set method can tolerate highly incomplete projection data in stationary tomography. Furthermore, defining the level set function in spacetime enforces temporal continuity in the dynamic tomography context considered here. The tomographic reconstruction is found as a minimizer of a nonlinear functional. The functional contains a regularization term penalizing the L2 norms of up to n derivatives of the reconstruction. The case n = 1 is shown to be equivalent to a convex Tikhonov problem that has a unique minimizer. For n ≥ 2 the existence of a minimizer is proved under certain assumptions on the signal-to-noise ratio and the size of the regularization parameter. Numerical examples with both simulated and measured dynamic X-ray data are included, and the proposed method is found to yield reconstructions superior to standard methods such as FBP or non-negativity constrained Tikhonov regularization and favorably comparable to those of total variation regularization. Furthermore, the methodology can be adapted to a wide range of measurement arrangements with one or more X-ray sources.
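For the n = 1 case mentioned above, the reconstruction reduces to a convex Tikhonov problem with a first-derivative penalty. A generic toy version is sketched below; the matrices are hypothetical stand-ins (a random dense operator rather than the paper's tomography operator), solved via the normal equations.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((30, n))            # hypothetical measurement operator
f_true = np.zeros(n)
f_true[20:30] = 1.0                         # piecewise-constant target
m = A @ f_true + 0.01 * rng.standard_normal(30)

# Penalize the L2 norm of the first derivative (the n = 1 case):
D = np.diff(np.eye(n), axis=0)              # first-difference matrix
alpha = 1.0                                 # regularization parameter
f = np.linalg.solve(A.T @ A + alpha * D.T @ D, A.T @ m)
print(f.shape)  # (50,)
```

Uniqueness follows because the penalty removes the null-space ambiguity of the underdetermined operator; higher-order penalties (n ≥ 2) replace D with higher-difference matrices.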

  14. A level set simulation of dendritic solidification of multi-component alloys

    NASA Astrophysics Data System (ADS)

    Tan, Lijian; Zabaras, Nicholas

    2007-01-01

    A level set method combining features of front tracking methods and fixed domain methods is presented to model microstructure evolution in the solidification of multi-component alloys. Phase boundaries are tracked by solving the multi-phase level set equations. Diffused interfaces are constructed from these tracked phase boundaries using the level set functions. Based on the assumed diffused interfaces, volume-averaging techniques are applied for energy, species and momentum transport. Microstructure evolution in multi-component alloy systems is predicted using realistic material parameters. The methodology avoids the difficulty of parameter identification needed in other diffused interface models, and allows easy application to various practical alloy systems. Techniques including fast marching, narrow band computing and adaptive meshing are utilized to speed up computations. Several numerical examples are considered to validate the method and examine its potential for modeling solidification of practical alloy systems. These examples include two- and three-dimensional solidification of a binary alloy in an undercooled melt, a study of planar/cellular/dendritic transition in the solidification of a Ni-Cu alloy, and eutectic and peritectic solidification of an Fe-C system. Adaptive mesh refinement in the rapidly varying interface region makes the method practical for coupling the microstructure evolution at the meso-scale with buoyancy driven flow in the macro-scale, which is shown in the solidification of a Ni-Al-Ta ternary alloy.

  15. Development of a hydrogeologic framework using tidally influenced groundwater levels, Hawaii

    NASA Astrophysics Data System (ADS)

    Rotzoll, K.; Oki, D. S.; El-Kadi, A. I.

    2013-12-01

    Aquifer hydraulic properties can be estimated from commonly available water-level data from tidally influenced wells because the tidal signal attenuation depends on the aquifer's regional hydraulic diffusivity. Estimates of hydraulic properties are required for models that are used to manage groundwater availability and quality. A few localized studies of tidal attenuation in Hawaii have been published, but many water-level records have not been analyzed and no regional synthesis of tidal attenuation information in Hawaii exists. Therefore, we estimate aquifer properties from tidal attenuation for Hawaii using groundwater-level records from more than 350 wells. Filtering methods to separate water-level fluctuations caused by ocean tides from other environmental stresses such as barometric pressure and long-period ocean-level variations are explored. For short-term records, several approaches to identify tidal components are examined. The estimated aquifer properties are combined in a regional context with respect to the hydrogeologic framework of each island. The results help to better understand conceptual models of groundwater flow in Hawaii aquifers and facilitate the development of regional numerical groundwater flow and transport models aimed at sustainable water-resource management.
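The link between tidal signal attenuation and hydraulic diffusivity is often expressed through the classical Ferris (1952) solution for a sinusoidal boundary fluctuation in a confined aquifer. The sketch below uses that textbook relation with illustrative parameter values, not estimates from the Hawaii dataset.

```python
import math

def tidal_attenuation(x, S, T, t0):
    """Ferris (1952): amplitude ratio and time lag at distance x inland.

    x: distance from the coast [m], S: storativity [-],
    T: transmissivity [m^2/d], t0: tidal period [d].
    """
    ratio = math.exp(-x * math.sqrt(math.pi * S / (t0 * T)))  # attenuation
    lag = x * math.sqrt(t0 * S / (4.0 * math.pi * T))         # lag [d]
    return ratio, lag

# Hypothetical high-diffusivity aquifer, M2 tidal period (12.42 h):
ratio, lag = tidal_attenuation(x=1000.0, S=0.005, T=50000.0, t0=0.5175)
print(round(ratio, 3), round(lag * 24, 2))  # amplitude ratio, lag in hours
```

Inverting this relation, an observed amplitude ratio and lag at a well of known distance from the coast constrain the regional diffusivity T/S, which is the basis of the analysis described in the abstract.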

  16. A predictive coding framework for rapid neural dynamics during sentence-level language comprehension.

    PubMed

    Lewis, Ashley G; Bastiaansen, Marcel

    2015-07-01

    There is a growing literature investigating the relationship between oscillatory neural dynamics measured using electroencephalography (EEG) and/or magnetoencephalography (MEG), and sentence-level language comprehension. Recent proposals have suggested a strong link between predictive coding accounts of the hierarchical flow of information in the brain, and oscillatory neural dynamics in the beta and gamma frequency ranges. We propose that findings relating beta and gamma oscillations to sentence-level language comprehension might be unified under such a predictive coding account. Our suggestion is that oscillatory activity in the beta frequency range may reflect both the active maintenance of the current network configuration responsible for representing the sentence-level meaning under construction, and the top-down propagation of predictions to hierarchically lower processing levels based on that representation. In addition, we suggest that oscillatory activity in the low and middle gamma range reflect the matching of top-down predictions with bottom-up linguistic input, while evoked high gamma might reflect the propagation of bottom-up prediction errors to higher levels of the processing hierarchy. We also discuss some of the implications of this predictive coding framework, and we outline ideas for how these might be tested experimentally. PMID:25840879

  17. Statistical criteria to set alarm levels for continuous measurements of ground contamination.

    PubMed

    Brandl, A; Jimenez, A D Herrera

    2008-08-01

    In the course of the decommissioning of the ASTRA research reactor at the site of the Austrian Research Centers at Seibersdorf, the operator and licensee, Nuclear Engineering Seibersdorf, conducted an extensive site survey and characterization to demonstrate compliance with regulatory site release criteria. This survey included radiological characterization of approximately 400,000 m² of open land on the Austrian Research Centers premises. Part of this survey was conducted using a mobile large-area gas proportional counter, continuously recording measurements while it was moved at a speed of 0.5 m s⁻¹. In order to set reasonable investigation levels, two alarm levels based on statistical considerations were developed. This paper describes the derivation of these alarm levels and the operational experience gained by detector deployment in the field. PMID:18617795
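The paper's specific alarm levels are not reproduced here, but a standard counting-statistics decision threshold (a Currie-style decision limit, assuming paired background subtraction and a Gaussian approximation to Poisson counts) illustrates the general idea of setting an alarm level from background statistics; the rate and integration time below are made-up values.

```python
import math

def alarm_level(background_rate, count_time, k=1.645):
    """Gross-count alarm level for a moving detector measurement.

    Currie-style decision threshold for paired background subtraction:
    net counts above background must exceed L_c = k * sqrt(2 * B),
    where B is the expected background count and k = 1.645 gives a
    ~5% false-alarm probability per measurement.
    """
    B = background_rate * count_time     # expected background counts
    L_c = k * math.sqrt(2.0 * B)         # net-count decision level
    return B + L_c                       # alarm level in gross counts

# Hypothetical 100 cps background with a 1 s integration interval:
print(round(alarm_level(100.0, 1.0), 1))
```

In continuous scanning, the integration time per recorded measurement is set by the detector speed, so slower scans lower the relative alarm threshold.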

  18. Improved inhalation technology for setting safe exposure levels for workplace chemicals

    NASA Technical Reports Server (NTRS)

    Stuart, Bruce O.

    1993-01-01

    Threshold Limit Values recommended as allowable air concentrations of a chemical in the workplace are often based upon a no-observable-effect-level (NOEL) determined by experimental inhalation studies using rodents. A 'safe level' for human exposure must then be estimated by the use of generalized safety factors in attempts to extrapolate from experimental rodents to man. The recent development of chemical-specific physiologically-based toxicokinetics makes use of measured physiological, biochemical, and metabolic parameters to construct a validated model that is able to 'scale-up' rodent response data to predict the behavior of the chemical in man. This procedure is made possible by recent advances in personal computer software and the emergence of appropriate biological data, and provides an analytical tool for much more reliable risk evaluation and airborne chemical exposure level setting for humans.

  19. Implementation of E.U. Water Framework Directive: source assessment of metallic substances at catchment levels.

    PubMed

    Chon, Ho-Sik; Ohandja, Dieudonne-Guy; Voulvoulis, Nikolaos

    2010-01-01

    The E.U. Water Framework Directive (WFD) aims to prevent deterioration of water quality and to phase out or reduce the concentrations of priority substances at catchment levels. It requires changes in water management from a local scale to a river basin scale, and establishes Environmental Quality Standards (EQS) as a guideline for the chemical status of receiving waters. Under the Directive, the standards and the scope of investigation for water management are more stringent and expansive than in the past, and this change also applies to restoring acceptable levels of metals in water bodies. The aim of this study was to identify anthropogenic emission sources of metallic substances at catchment levels. Potential sources providing substantial amounts of such substances to receiving waters included stormwater, industrial effluents, treated effluents, agricultural drainage, sediments, mining drainage, and landfill leachates. Metallic substances have more emission sources than other dangerous substances at catchment levels; source assessment for these substances therefore warrants greater consideration in restoring their chemical status in the context of the WFD. To improve the quality of source assessment, research on the role of societal and environmental parameters and on the contribution of each source to the chemical distribution in receiving waters needs to be carried out. PMID:20081997

  20. Study of burn scar extraction automatically based on level set method using remote sensing data.

    PubMed

    Liu, Yang; Dai, Qin; Liu, Jianbo; Liu, ShiBin; Yang, Jin

    2014-01-01

    Burn scar extraction using remote sensing data is an efficient way to precisely evaluate burn area and measure vegetation recovery. Traditional burn scar extraction methodologies perform poorly on images with blurred and irregular edges. To address these issues, this paper proposes an automatic method to extract burn scars based on the Level Set Method (LSM). This method utilizes the advantages of the different features in remote sensing images, and considers the practical need to extract the burn scar rapidly and automatically. The approach integrates Change Vector Analysis (CVA), the Normalized Difference Vegetation Index (NDVI), and the Normalized Burn Ratio (NBR) to obtain a difference image, and modifies the conventional Chan-Vese (C-V) level set model with a new initial curve obtained from a binary image produced by applying the K-means method to the fitting errors of two near-infrared band images. Landsat 5 TM and Landsat 8 OLI data sets are used to validate the proposed method. Comparisons with the conventional C-V model, Otsu's algorithm, and the Fuzzy C-means (FCM) algorithm show that the proposed approach can extract the outline curve of a fire burn scar effectively and accurately. The method has higher extraction accuracy and lower algorithmic complexity than the conventional C-V model. PMID:24503563
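The Normalized Burn Ratio that feeds the difference image can be illustrated with a toy computation; the reflectance values below are made up for demonstration and are not Landsat data.

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio from near-infrared and shortwave-infrared bands."""
    return (nir - swir) / (nir + swir + 1e-12)

# Two toy pixels, pre- and post-fire: pixel 1 unburned, pixel 2 burned
# (burning lowers NIR reflectance and raises SWIR reflectance).
pre_nir,  pre_swir  = np.array([0.45, 0.45]), np.array([0.10, 0.10])
post_nir, post_swir = np.array([0.45, 0.15]), np.array([0.10, 0.30])

dnbr = nbr(pre_nir, pre_swir) - nbr(post_nir, post_swir)
print(np.round(dnbr, 3))  # near 0 for the unburned pixel, large for the burned one
```

In the proposed pipeline, such a difference image (combined with CVA and NDVI) supplies the data term that the modified C-V level set then segments.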

  1. Differential optimal dopamine levels for set-shifting and working memory in Parkinson's disease.

    PubMed

    Fallon, Sean James; Smulders, Katrijn; Esselink, Rianne A; van de Warrenburg, Bart P; Bloem, Bastiaan R; Cools, Roshan

    2015-10-01

    Parkinson's disease (PD) is an important model for the role of dopamine in supporting human cognition. However, despite the relative uniformity of midbrain dopamine depletion, only some patients experience cognitive impairment. The neurocognitive mechanisms of this heterogeneity remain unclear. A genetic polymorphism in the catechol O-methyltransferase (COMT) enzyme, predominantly thought to exert its cognitive effect by acting on prefrontal cortex (PFC) dopamine transmission, provides us with an experimental window onto dopamine's role in cognitive performance in PD. In a large cohort of PD patients (n=372), we examined the association between COMT genotype and two tasks known to implicate prefrontal dopamine (spatial working memory and attentional set-shifting) and a task less sensitive to prefrontal dopamine (paired associates learning). Consistent with the known neuroanatomical locus of its effects, differences between the COMT genotype groups were observed on the dopamine-dependent tasks, but not the paired associates learning task. However, COMT genotype had differential effects on the two prefrontal dopamine tasks. Putative prefrontal dopamine levels influenced spatial working memory in an 'Inverted-U'-shaped fashion, whereas a linear, dose-dependent pattern was observed for attentional set-shifting. Cumulatively, these results revise our understanding of when COMT genotype modulates cognitive functioning in PD patients by showing that the behavioural consequences of genetic variation vary according to task demands, presumably because set-shifting and working memory have different optimal dopamine levels. PMID:26239947

  2. Modelling calving front dynamics using a level-set method: application to Jakobshavn Isbræ, West Greenland

    NASA Astrophysics Data System (ADS)

    Bondzio, Johannes H.; Seroussi, Hélène; Morlighem, Mathieu; Kleiner, Thomas; Rückamp, Martin; Humbert, Angelika; Larour, Eric Y.

    2016-03-01

    Calving is a major mechanism of ice discharge of the Antarctic and Greenland ice sheets, and a change in calving front position affects the entire stress regime of marine terminating glaciers. The representation of calving front dynamics in a 2-D or 3-D ice sheet model remains non-trivial. Here, we present the theoretical and technical framework for a level-set method, an implicit boundary tracking scheme, which we implement into the Ice Sheet System Model (ISSM). This scheme allows us to study the dynamic response of a drainage basin to user-defined calving rates. We apply the method to Jakobshavn Isbræ, a major marine terminating outlet glacier of the West Greenland Ice Sheet. The model robustly reproduces the high sensitivity of the glacier to calving, and we find that enhanced calving triggers significant acceleration of the ice stream. Upstream acceleration is sustained through a combination of mechanisms. However, both lateral stress and ice influx stabilize the ice stream. This study provides new insights into the ongoing changes occurring at Jakobshavn Isbræ and emphasizes that the incorporation of moving boundaries and dynamic lateral effects, not captured in flow-line models, is key for realistic model projections of sea level rise on centennial timescales.

  3. GPU-Based Visualization of 3D Fluid Interfaces using Level Set Methods

    NASA Astrophysics Data System (ADS)

    Kadlec, B. J.

    2009-12-01

    We model a simple 3D fluid-interface problem using the level set method and visualize the interface as a dynamic surface. Level set methods allow implicit handling of complex topologies deformed by evolutions where sharp changes and cusps are present, without destroying the representation. We present a highly optimized visualization and computation algorithm that is implemented in CUDA to run on the NVIDIA GeForce GTX 295. CUDA is a general-purpose parallel computing architecture that allows the NVIDIA GPU to be treated like a data-parallel supercomputer in order to solve many computational problems in a fraction of the time required on a CPU. CUDA is compared to the new OpenCL™ (Open Computing Language), which is designed to run on heterogeneous computing environments but does not take advantage of low-level features in NVIDIA hardware that provide significant speedups. Therefore, our technique is implemented using CUDA and results are compared to a single-CPU implementation to show the benefits of using the GPU and CUDA for visualizing fluid-interface problems. We solve a 1024^3 problem and experience significant speedup using the NVIDIA GeForce GTX 295. Implementation details for mapping the problem to the GPU architecture are described, as well as a discussion on porting the technique to heterogeneous devices (AMD, Intel, IBM) using OpenCL. The results present a new interactive system for computing and visualizing the evolution of fluid-interface problems on the GPU.

  4. Computer aided root lesion detection using level set and complex wavelets

    NASA Astrophysics Data System (ADS)

    Li, Shuo; Fevens, Thomas; Krzyżak, Adam; Jin, Chao; Li, Song

    2007-03-01

    A computer aided root lesion detection method for digital dental X-rays is proposed using level set and complex wavelet methods. The detection method consists of two stages: preprocessing and root lesion detection. During preprocessing, level set segmentation is applied to separate the teeth from the background. Tailored for the dental clinical environment, a clinical acceleration scheme is applied to the segmentation by using a support vector machine (SVM) classifier and individual principal component analysis (PCA) to provide an initial contour. Then, based on the segmentation result, root lesion detection is performed. Firstly, the teeth are isolated by the average intensity profile. Secondly, a center-line zero-crossing based candidate generation is applied to generate possible root lesion areas. Thirdly, the Dual-Tree Complex Wavelet Transform (DT-CWT) is used to further remove false positives. Lastly, when a root lesion is detected, its area is automatically marked with a color indication representing different levels of seriousness. The method was tested on 150 real dental X-rays with various degrees of root lesions, and the results were validated by a dentist. Experimental results show that the proposed method is able to successfully detect root lesions and provide visual assistance to the dentist.
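    The tooth-isolation step can be sketched as follows, under the simplifying assumption that gaps between teeth show up as local minima of the column-averaged intensity profile (teeth bright, gaps dark). The synthetic image, tooth positions, and smoothing width are illustrative, not from the paper.

```python
import numpy as np

def tooth_boundaries(image, smooth=5):
    """Locate vertical gaps between teeth as local minima of the
    column-averaged intensity profile."""
    profile = image.mean(axis=0)
    # moving-average smoothing to suppress pixel noise
    kernel = np.ones(smooth) / smooth
    profile = np.convolve(profile, kernel, mode="same")
    # interior local minima of the smoothed profile
    return [i for i in range(1, len(profile) - 1)
            if profile[i] < profile[i - 1] and profile[i] <= profile[i + 1]]

# Synthetic radiograph strip: three bright "teeth" separated by dark gaps
img = np.zeros((40, 90))
for start in (5, 35, 65):            # hypothetical tooth positions
    img[:, start:start + 20] = 1.0
cuts = tooth_boundaries(img)         # one cut lands in each inter-tooth gap
```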

  5. Street Level Hydrology: An Urban Application of the WRF-Hydro Framework in Denver, Colorado

    NASA Astrophysics Data System (ADS)

    Read, L.; Hogue, T. S.; Salas, F. R.; Gochis, D.

    2015-12-01

    Urban flood modeling at the watershed scale carries unique challenges in routing complexity, data resolution, social and political issues, and land surface - infrastructure interactions. The ability to accurately trace and predict the flow of water through the urban landscape enables better emergency response management, floodplain mapping, and data for future urban infrastructure planning and development. These services are of growing importance as urban population is expected to continue increasing by 1.84% per year for the next 25 years, increasing the vulnerability of urban regions to damages and loss of life from floods. Although a range of watershed-scale models have been applied in specific urban areas to examine these issues, there is a trend towards national scale hydrologic modeling enabled by supercomputing resources to understand larger system-wide hydrologic impacts and feedbacks. As such it is important to address how urban landscapes can be represented in large scale modeling processes. The current project investigates how coupling terrain and infrastructure routing can improve flow prediction and flooding events over the urban landscape. We utilize the WRF-Hydro modeling framework and a high-resolution terrain routing grid with the goal of compiling standard data needs necessary for fine scale urban modeling and dynamic flood forecasting in the urban setting. The city of Denver is selected as a case study, as it has experienced several large flooding events in the last five years and has an urban annual population growth rate of 1.5%, one of the highest in the U.S. Our work highlights the hydro-informatic challenges associated with linking channel networks and drainage infrastructure in an urban area using the WRF-Hydro modeling framework and high resolution urban models for short-term flood prediction.

  6. Framework for leadership and training of Biosafety Level 4 laboratory workers.

    PubMed

    Le Duc, James W; Anderson, Kevin; Bloom, Marshall E; Estep, James E; Feldmann, Heinz; Geisbert, Joan B; Geisbert, Thomas W; Hensley, Lisa; Holbrook, Michael; Jahrling, Peter B; Ksiazek, Thomas G; Korch, George; Patterson, Jean; Skvorak, John P; Weingartl, Hana

    2008-11-01

    Construction of several new Biosafety Level 4 (BSL-4) laboratories and expansion of existing operations have created an increased international demand for well-trained staff and facility leaders. Directors of most North American BSL-4 laboratories met and agreed upon a framework for leadership and training of biocontainment research and operations staff. They agreed on essential preparation and training that includes theoretical consideration of biocontainment principles, practical hands-on training, and mentored on-the-job experiences relevant to positional responsibilities as essential preparation before a person's independent access to a BSL-4 facility. They also agreed that the BSL-4 laboratory director is the key person most responsible for ensuring that staff members are appropriately prepared for BSL-4 operations. Although standardized certification of training does not formally exist, the directors agreed that facility-specific, time-limited documentation to recognize specific skills and experiences of trained persons is needed. PMID:18976549

  7. Physical therapy for young children diagnosed with autism spectrum disorders: clinical frameworks model in an Israeli setting.

    PubMed

    Atun-Einy, Osnat; Lotan, Meir; Harel, Yael; Shavit, Efrat; Burstein, Shimshon; Kempner, Gali

    2013-01-01

    Recent research findings suggest that many children with Autism Spectrum Disorders (ASD) demonstrate delayed and atypical motor achievements. It has now become clear that a more holistic, integrative and multi-disciplinary intervention is required to effectively address the motor-related impairments of this population. It is also crucial to ensure that this group of clients has access to early physical therapy (PT) interventions. Despite accumulating research on physical interventions, little is known about intervention models for implementation at a national level. This report introduces a model that uniquely illustrates implementation of PT services for a large number of children with ASD. The model has been operating for the past 2 years in one country (Israel), and includes an optional implementation model of PT practice settings for young children diagnosed with ASD. The Israeli setting offers a unique opportunity for implementing PT services for a multitude of children with ASD on a regular basis as an accepted/needed service. The initial outcomes of the present implementation suggest that an intensive PT intervention program might enhance therapeutic outcomes for this population, and contribute to our knowledge of the potential of PT for individuals with ASD. PMID:24400265

  8. Non-Rigid Object Contour Tracking via a Novel Supervised Level Set Model.

    PubMed

    Sun, Xin; Yao, Hongxun; Zhang, Shengping; Li, Dong

    2015-11-01

    We present a novel approach to non-rigid object contour tracking in this paper based on a supervised level set model (SLSM). In contrast to most existing trackers that use a bounding box to specify the tracked target, the proposed method extracts the accurate contours of the target as tracking output, which achieves a better description of non-rigid objects while reducing background pollution of the target model. Moreover, conventional level set models only emphasize regional intensity consistency and consider no priors. In contrast, the curve evolution of the proposed SLSM is object-oriented and supervised by specific knowledge of the targets we want to track. Therefore, the SLSM can ensure a more accurate convergence to the exact targets in tracking applications. In particular, we first construct the appearance model for the target in an online boosting manner due to its strong discriminative power between the object and the background. Then, the learnt target model is incorporated to model the probabilities of the level set contour in a Bayesian manner, leading the curve to converge to the candidate region with the maximum likelihood of being the target. Finally, the accurate target region qualifies the samples fed to the boosting procedure as well as the target model prepared for the next time step. We first describe the proposed mechanism of the two-phase SLSM for single-target tracking, then give its generalized multi-phase version for dealing with multi-target tracking cases. A positive decrease rate is used to adjust the learning pace over time, enabling tracking to continue under partial and total occlusion. Experimental results on a number of challenging sequences validate the effectiveness of the proposed method. PMID:26099142

  9. A three-dimensional coupled Nitsche and level set method for electrohydrodynamic potential flows in moving domains

    NASA Astrophysics Data System (ADS)

    Johansson, A.; Garzon, M.; Sethian, J. A.

    2016-03-01

    In this paper we present a new algorithm for computing three-dimensional electrohydrodynamic flow in moving domains which can undergo topological changes. We consider a non-viscous, irrotational, perfect conducting fluid and introduce a way to model the electrically charged flow with an embedded potential approach. To numerically solve the resulting system, we combine a level set method to track both the free boundary and the surface velocity potential with a Nitsche finite element method for solving the Laplace equations. This results in an algorithmic framework that does not require body-conforming meshes, works in three dimensions, and seamlessly tracks topological change. Assembling this coupled system requires care: while convergence and stability properties of Nitsche's methods have been well studied for static problems, they have rarely been considered for moving domains or for obtaining the gradients of the solution on the embedded boundary. We therefore investigate the performance of the symmetric and non-symmetric Nitsche formulations, as well as two different stabilization techniques. The global algorithm and in particular the coupling between the Nitsche solver and the level set method are also analyzed in detail. Finally we present numerical results for several time-dependent problems, each one designed to achieve a specific objective: (a) The oscillation of a perturbed sphere, which is used for convergence studies and the examination of the Nitsche methods; (b) The break-up of a two lobe droplet with axial symmetry, which tests the capability of the algorithm to go past flow singularities such as topological changes and preservation of an axi-symmetric flow, and compares results to previous axi-symmetric calculations; (c) The electrohydrodynamical deformation of a thin film and subsequent jet ejection, which will account for the presence of electrical forces in a non-axi-symmetric geometry.
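    As a minimal illustration of the boundary treatment involved, the sketch below applies the symmetric Nitsche formulation to a 1-D Poisson problem with weakly imposed homogeneous Dirichlet conditions. The paper's setting is 3-D and coupled to a level set; the mesh size, penalty parameter, and manufactured right-hand side here are assumptions for demonstration only.

```python
import numpy as np

n = 50                      # linear elements on [0, 1]
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
gamma = 10.0                # Nitsche penalty (must be large enough)

# Standard P1 stiffness matrix for -u'' = f
A = np.zeros((n + 1, n + 1))
for e in range(n):
    A[e:e + 2, e:e + 2] += np.array([[1.0, -1.0], [-1.0, 1.0]]) / h

# Boundary terms of the symmetric Nitsche form enforcing u(0) = u(1) = 0:
# consistency  +u'(0)v(0) - u'(1)v(1),  symmetry  +v'(0)u(0) - v'(1)u(1),
# penalty      (gamma/h) * (u(0)v(0) + u(1)v(1))
A[0, 0] += -1.0 / h;  A[0, 1] += 1.0 / h      # u'(0) ~ (u1 - u0)/h
A[0, 0] += -1.0 / h;  A[1, 0] += 1.0 / h      # v'(0) ~ (v1 - v0)/h
A[n, n] += -1.0 / h;  A[n, n - 1] += 1.0 / h  # -u'(1)
A[n, n] += -1.0 / h;  A[n - 1, n] += 1.0 / h  # -v'(1)
A[0, 0] += gamma / h
A[n, n] += gamma / h

# Lumped load for f(x) = pi^2 sin(pi x); exact solution u = sin(pi x)
f = np.pi**2 * np.sin(np.pi * x)
b = h * f
b[0] *= 0.5
b[-1] *= 0.5

u = np.linalg.solve(A, b)
err = np.max(np.abs(u - np.sin(np.pi * x)))   # O(h^2) nodal error
```

    The boundary conditions are satisfied without constraining any degrees of freedom, which is what makes Nitsche's method attractive on non-body-conforming meshes.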

  10. Springback assessment based on level set interpolation and shape manifolds in deep drawing

    NASA Astrophysics Data System (ADS)

    Le Quilliec, Guenhael; Raghavan, Balaji; Breitkopf, Piotr; Rassineux, Alain; Villon, Pierre; Roelandt, Jean-Marc

    2013-12-01

    In this paper, we introduce an original shape representation approach for automatic springback characterization. It is based on the generation of parameterized Level Set functions. The central idea is the concept of the shape manifold representing the design domain in the reduced-order shape-space. Performing Proper Orthogonal Decomposition on the shapes followed by using the Diffuse Approximation allows us to efficiently reduce the problem dimensionality and to interpolate uniquely between admissible input shapes, while also determining the smallest number of parameters needed to characterize the final formed shape. We apply this methodology to the problem of springback assessment for the deep drawing operation of metal sheets.
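    The dimensionality-reduction step can be sketched with a plain SVD-based POD of shape snapshots (the Diffuse Approximation interpolation stage is omitted here). The snapshot family below, signed-distance profiles of circles parameterized by a single radius, is a hypothetical stand-in for formed-sheet shapes; its shape manifold is one-dimensional by construction.

```python
import numpy as np

# Hypothetical snapshots: signed-distance profiles whose only design
# parameter is the radius, so the intrinsic dimension is one.
x = np.linspace(-2.0, 2.0, 200)
radii = np.linspace(0.5, 1.5, 30)
snapshots = np.stack([np.abs(x) - r for r in radii], axis=1)   # (200, 30)

# POD via thin SVD of the mean-centred snapshot matrix
mean = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)

energy = s[0]**2 / np.sum(s**2)        # energy captured by mode 1

# Reduced coordinates: a single coefficient describes each shape
alpha = U[:, :1].T @ (snapshots - mean)          # (1, 30)
recon = mean + U[:, :1] @ alpha                  # rank-1 reconstruction
```

    For this family the leading mode captures essentially all the energy, so one coefficient suffices; a real springback study would retain however many modes the singular-value decay justifies.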

  11. A finite element implementation of a level set interface tracking technique for viscoelastic filling

    SciTech Connect

    Montalbano, E.; Tullock, D.L.; Guell, D.C.

    1996-10-01

    In order to effectively model filling processes encountered in injection molding, an accurate and computationally efficient method for tracking the fluid interface is essential. In this work, several methods for interface tracking and mold filling are discussed in reference to three dimensional viscoelastic filling problems. This discussion emphasizes the interface tracking techniques in the context of the finite element method. A detailed outline of a level set method is presented. This method entails solving for the advection of a continuum variable and its subsequent reinitialization as a distance function.
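    The reinitialization step mentioned above, recovering a signed-distance function while leaving the zero level set in place, can be sketched in 1-D with the standard PDE phi_tau = sign(phi0) * (1 - |phi_x|) and Godunov upwinding. The initial field and iteration counts below are illustrative.

```python
import numpy as np

def reinitialize(phi0, dx, iters=600):
    """Iterate phi_tau = sign(phi0) * (1 - |phi_x|) to steady state,
    recovering a signed-distance function with the same zero crossing.
    First-order Godunov discretization in 1-D."""
    dtau = 0.5 * dx
    phi = phi0.copy()
    s = np.sign(phi0)
    for _ in range(iters):
        dm = np.diff(phi, prepend=phi[0]) / dx   # backward difference
        dp = np.diff(phi, append=phi[-1]) / dx   # forward difference
        # Godunov's scheme for |phi_x|, upwinded by the sign of phi0
        grad_pos = np.sqrt(np.maximum(np.maximum(dm, 0.0)**2,
                                      np.minimum(dp, 0.0)**2))
        grad_neg = np.sqrt(np.maximum(np.minimum(dm, 0.0)**2,
                                      np.maximum(dp, 0.0)**2))
        grad = np.where(s > 0, grad_pos, grad_neg)
        phi = phi + dtau * s * (1.0 - grad)
    return phi

# A badly scaled level set with its zero crossing at x = 0
x = np.linspace(-1.0, 1.0, 201)
phi0 = 5.0 * x**3 + 0.5 * x
phi = reinitialize(phi0, dx=x[1] - x[0])   # converges toward phi = x
```

    The zero crossing stays fixed while the slopes relax to unit magnitude, which is exactly the "reinitialization as a distance function" described in the abstract.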

  12. Topology-optimized multiple-disk resonators obtained using level set expression incorporating surface effects.

    PubMed

    Fujii, Garuda; Ueta, Tsuyoshi; Mizuno, Mamoru; Nakamura, Masayuki

    2015-05-01

    Topology-optimized designs of multiple-disk resonators are presented using level-set expression that incorporates surface effects. Effects from total internal reflection at the surfaces of the dielectric disks are precisely simulated by modeling clearly defined dielectric boundaries during topology optimization. The electric field intensity in optimal resonators increases to more than four and a half times the initial intensity in a resonant state, whereas in some cases the Q factor increases by three and a half times that for the initial state. Wavelength-scale link structures between neighboring disks improve the performance of the multiple-disk resonators. PMID:25969226

  13. Building a Community Framework for Adaptation to Sea Level Rise and Inundation

    NASA Astrophysics Data System (ADS)

    Culver, M. E.; Schubel, J.; Davidson, M. A.; Haines, J.

    2010-12-01

    Sea level rise and inundation pose a substantial risk to many coastal communities, and the risk is projected to increase because of continued development, changes in the frequency and intensity of inundation events, and acceleration in the rate of sea-level rise. Calls for action at all levels acknowledge that a viable response must engage federal, state, and local expertise, perspectives, and resources in a coordinated and collaborative effort. Representatives from a variety of these agencies and organizations have developed a shared framework to help coastal communities structure and facilitate community-wide adaptation processes and to help agencies determine where investments should be made to enable states and local governments to assess impacts and initiate adaptation strategies over the next decade. For sea level rise planning and implementation, the requirements for high-quality data and information are vast and the availability is limited. Participants stressed the importance of data interoperability to ensure that users are able to apply data from a variety of sources and to improve availability and confidence in the data. Participants prioritized the following six categories of data needed to support future sea level rise planning and implementation:
    - Data to understand land forms and where and how water will flow
    - Monitoring data and environmental drivers
    - Consistent sea level rise scenarios and projections across agencies to support local planning
    - Data to characterize vulnerabilities and impacts of sea level rise
    - Community characteristics
    - Legal frameworks and administrative structure
    To develop a meaningful and effective sea level rise adaptation plan, state and local planners must understand how the availability, scale, and uncertainty of these types of data will impact new guidelines or adaptation measures. The tools necessary to carry out the adaptation planning process need to be understood in terms of data requirements.

  14. Vessel Segmentation and Blood Flow Simulation Using Level-Sets and Embedded Boundary Methods

    SciTech Connect

    Deschamps, T; Schwartz, P; Trebotich, D; Colella, P; Saloner, D; Malladi, R

    2004-12-09

    In this article we address the problem of blood flow simulation in realistic vascular objects. The anatomical surfaces are extracted by means of Level-Sets methods that accurately model the complex and varying surfaces of pathological objects such as aneurysms and stenoses. The surfaces obtained are defined at the sub-pixel level where they intersect the Cartesian grid of the image domain. It is therefore straightforward to construct embedded boundary representations of these objects on the same grid, for which recent work has enabled discretization of the Navier-Stokes equations for incompressible fluids. While most classical techniques require construction of a structured mesh that approximates the surface in order to extrapolate a 3D finite-element gridding of the whole volume, our method directly simulates the blood-flow inside the extracted surface without losing any complicated details and without building additional grids.

  15. On the Relationship between Variational Level Set-Based and SOM-Based Active Contours

    PubMed Central

    Abdelsamea, Mohammed M.; Gnecco, Giorgio; Gaber, Mohamed Medhat; Elyan, Eyad

    2015-01-01

    Most Active Contour Models (ACMs) deal with the image segmentation problem as a functional optimization problem, as they work on dividing an image into several regions by optimizing a suitable functional. Among ACMs, variational level set methods have been used to build an active contour with the aim of modeling arbitrarily complex shapes. Moreover, they can also handle topological changes of the contours. Self-Organizing Maps (SOMs) have attracted the attention of many computer vision scientists, particularly in modeling an active contour based on the idea of utilizing the prototypes (weights) of a SOM to control the evolution of the contour. SOM-based models have been proposed in general with the aim of exploiting the specific ability of SOMs to learn the edge-map information via their topology preservation property and of overcoming some drawbacks of other ACMs, such as becoming trapped in local minima of the image energy functional to be minimized in such models. In this survey, we illustrate the main concepts of variational level set-based ACMs, SOM-based ACMs, and their relationship, and review in a comprehensive fashion the development of their state-of-the-art models from a machine learning perspective, with a focus on their strengths and weaknesses. PMID:25960736

  16. A mass conserving level set method for detailed numerical simulation of liquid atomization

    SciTech Connect

    Luo, Kun; Shao, Changxiao; Yang, Yue; Fan, Jianren

    2015-10-01

    An improved mass conserving level set method for detailed numerical simulations of liquid atomization is developed to address the issue of mass loss in the existing level set method. This method introduces a mass remedy procedure based on the local curvature at the interface, and, in principle, can ensure the absolute mass conservation of the liquid phase in the computational domain. Three benchmark cases, including Zalesak's disk, a drop deforming in a vortex field, and the binary drop head-on collision, are simulated to validate the present method, and excellent agreement with exact solutions or experimental results is achieved. It is shown that the present method is able to capture the complex interface with second-order accuracy and negligible additional computational cost. The present method is then applied to study more complex flows, such as a drop impacting on a liquid film and swirling liquid sheet atomization, which again demonstrates the advantages of mass conservation and the capability to represent the interface accurately.
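    For illustration only, the sketch below restores the liquid mass with a single global constant shift of the level set, found by bisection; the paper's remedy is local and curvature-weighted, which this simplification does not capture. The grid, drop geometry, and assumed mass-loss perturbation are hypothetical.

```python
import numpy as np

def liquid_area(phi, dx):
    """Area of the liquid phase, taken as the region where phi < 0."""
    return np.count_nonzero(phi < 0) * dx * dx

def restore_mass(phi, target_area, dx):
    """Shift phi by a constant, found by bisection, so that the enclosed
    area matches target_area. The interval [-1, 1] is assumed to
    bracket the required correction."""
    lo, hi = -1.0, 1.0
    for _ in range(60):
        c = 0.5 * (lo + hi)
        if liquid_area(phi + c, dx) > target_area:
            lo = c        # still too much liquid: raise phi further
        else:
            hi = c
    return phi + 0.5 * (lo + hi)

# A circular drop on a 2-D grid; a hypothetical numerical error has
# uniformly shrunk it (phi shifted up by 0.05, i.e. radius reduced).
n, dx = 200, 0.01
coords = (np.arange(n) - n / 2 + 0.5) * dx
X, Y = np.meshgrid(coords, coords, indexing="ij")
phi = np.sqrt(X**2 + Y**2) - 0.5
target = liquid_area(phi, dx)
restored = restore_mass(phi + 0.05, target, dx)
```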

  17. A level set method for determining critical curvatures for drainage and imbibition.

    PubMed

    Prodanović, Masa; Bryant, Steven L

    2006-12-15

    An accurate description of the mechanics of pore level displacement of immiscible fluids could significantly improve the predictions from pore network models of capillary pressure-saturation curves, interfacial areas and relative permeability in real porous media. If we assume quasi-static displacement, at constant pressure and surface tension, pore scale interfaces are modeled as constant mean curvature surfaces, which are not easy to calculate. Moreover, the extremely irregular geometry of natural porous media makes it difficult to evaluate surface curvature values and corresponding geometric configurations of two fluids. Finally, accounting for the topological changes of the interface, such as splitting or merging, is nontrivial. We apply the level set method for tracking and propagating interfaces in order to robustly handle topological changes and to obtain geometrically correct interfaces. We describe a simple but robust model for determining critical curvatures for throat drainage and pore imbibition. The model is set up for quasi-static displacements but it nevertheless captures both reversible and irreversible behavior (Haines jump, pore body imbibition). The pore scale grain boundary conditions are extracted from model porous media and from imaged geometries in real rocks. The method gives quantitative agreement with measurements and with other theories and computational approaches. PMID:17027812

  18. Time-optimal path planning in dynamic flows using level set equations: theory and schemes

    NASA Astrophysics Data System (ADS)

    Lolla, Tapovan; Lermusiaux, Pierre F. J.; Ueckermann, Mattheus P.; Haley, Patrick J.

    2014-09-01

    We develop an accurate partial differential equation-based methodology that predicts the time-optimal paths of autonomous vehicles navigating in any continuous, strong, and dynamic ocean currents, obviating the need for heuristics. The goal is to predict a sequence of steering directions so that vehicles can best utilize or avoid currents to minimize their travel time. Inspired by the level set method, we derive and demonstrate that a modified level set equation governs the time-optimal path in any continuous flow. We show that our algorithm is computationally efficient and apply it to a number of experiments. First, we validate our approach through a simple benchmark application in a Rankine vortex flow for which an analytical solution is available. Next, we apply our methodology to more complex, simulated flow fields such as unsteady double-gyre flows driven by wind stress and flows behind a circular island. These examples show that time-optimal paths for multiple vehicles can be planned even in the presence of complex flows in domains with obstacles. Finally, we present and support through illustrations several remarks that describe specific features of our methodology.

  20. Density and level set-XFEM schemes for topology optimization of 3-D structures

    NASA Astrophysics Data System (ADS)

    Villanueva, Carlos H.; Maute, Kurt

    2014-07-01

    As the capabilities of additive manufacturing techniques increase, topology optimization provides a promising approach to design geometrically sophisticated structures. Traditional topology optimization methods aim at finding conceptual designs, but they often do not resolve sufficiently the geometry and the structural response such that the optimized designs can be directly used for manufacturing. To overcome these limitations, this paper studies the viability of the extended finite element method (XFEM) in combination with the level-set method (LSM) for topology optimization of three dimensional structures. The LSM describes the geometry by defining the nodal level set values via explicit functions of the optimization variables. The structural response is predicted by a generalized version of the XFEM. The LSM-XFEM approach is compared against results from a traditional Solid Isotropic Material with Penalization method for two-phase "solid-void" and "solid-solid" problems. The numerical results demonstrate that the LSM-XFEM approach describes crisply the geometry and predicts the structural response with acceptable accuracy even on coarse meshes.

  1. Time-optimal path planning in dynamic flows using level set equations: realistic applications

    NASA Astrophysics Data System (ADS)

    Lolla, Tapovan; Haley, Patrick J.; Lermusiaux, Pierre F. J.

    2014-09-01

    The level set methodology for time-optimal path planning is employed to predict collision-free and fastest-time trajectories for swarms of underwater vehicles deployed in the Philippine Archipelago region. To simulate the multiscale ocean flows in this complex region, a data-assimilative primitive-equation ocean modeling system is employed with telescoping domains that are interconnected by implicit two-way nesting. These data-driven multiresolution simulations provide a realistic flow environment, including variable large-scale currents, strong jets, eddies, wind-driven currents, and tides. The properties and capabilities of the rigorous level set methodology are illustrated and assessed quantitatively for several vehicle types and mission scenarios. Feasibility studies of all-to-all broadcast missions, leading to minimal time transmission between source and receiver locations, are performed using a large number of vehicles. The results with gliders and faster propelled vehicles are compared. Reachability studies, i.e., determining the boundaries of regions that can be reached by vehicles for exploratory missions, are then exemplified and analyzed. Finally, the methodology is used to determine the optimal strategies for fastest-time pick up of deployed gliders by means of underway surface vessels or stationary platforms. The results highlight the complex effects of multiscale flows on the optimal paths, the need to utilize the ocean environment for more efficient autonomous missions, and the benefits of including ocean forecasts in the planning of time-optimal paths.

  4. Comparisons and Limitations of Gradient Augmented Level Set and Algebraic Volume of Fluid Methods

    NASA Astrophysics Data System (ADS)

    Anumolu, Lakshman; Ryddner, Douglas; Trujillo, Mario

    2014-11-01

    Recent numerical methods for implicit interface transport are generally presented as enjoying higher-order spatio-temporal convergence compared to classical or less sophisticated approaches. However, when applied to test cases designed to simulate practical industrial conditions, a significant reduction in convergence is observed in the higher-order methods, whereas the less sophisticated approaches retain their expected convergence order but exhibit growth in the error norms. This provides an opportunity to understand the underlying issues that cause the loss of accuracy in both types of methods. As an example, we consider the Gradient Augmented Level Set method (GALS) and a variant of the Volume of Fluid (VoF) method in our study. Results show that while both methods suffer a loss of accuracy, the higher-order method suffers more. The implication is a significant reduction in the performance advantage of the GALS method over the VoF scheme. The reasons lie in the behavior of the higher-order derivatives, particularly in situations where the level set field is highly distorted. For the VoF approach, serious spurious deformations of the interface are observed, albeit with a deceptive zero loss of mass.
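    The convergence comparison described above rests on computing the observed order of accuracy from error norms on successively refined grids. A minimal sketch (the error values below are made-up placeholders, not results from the paper):

    ```python
    import math

    def observed_order(e_coarse, e_fine, refinement=2.0):
        """Observed order of accuracy from error norms on two grids whose
        spacings differ by `refinement`: p = log(e_c / e_f) / log(r)."""
        return math.log(e_coarse / e_fine) / math.log(refinement)

    # A scheme that is 3rd order on a smooth test but degrades toward
    # 1st order on a highly distorted field would show, schematically:
    p_smooth = observed_order(1.0e-3, 1.25e-4)    # error drops 8x per halving
    p_distorted = observed_order(1.0e-3, 5.0e-4)  # error drops only 2x
    ```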

  5. Soybean fruit development and set at the node level under combined photoperiod and radiation conditions

    PubMed Central

    Nico, Magalí; Mantese, Anita I.; Miralles, Daniel J.; Kantolic, Adriana G.

    2016-01-01

    In soybean, long days during post-flowering increase seed number. This positive photoperiodic effect on seed number has been previously associated with increments in the amount of radiation accumulated during the crop cycle because long days extend the duration of the crop cycle. However, evidence of intra-nodal processes independent of the availability of assimilates suggests that photoperiodic effects at the node level might also contribute to pod set. This work aims to identify the main mechanisms responsible for the increase in pod number per node in response to long days; including the dynamics of flowering, pod development, growth and set at the node level. Long days increased pods per node on the main stems, by increasing pods on lateral racemes (usually dominated positions) at some main stem nodes. Long days lengthened the flowering period and thereby increased the number of opened flowers on lateral racemes. The flowering period was prolonged under long days because effective seed filling was delayed on primary racemes (dominant positions). Long days also delayed the development of flowers into pods with filling seeds, delaying the initiation of pod elongation without modifying pod elongation rate. The embryo development matched the external pod length irrespective of the pod’s chronological age. These results suggest that long days during post-flowering enhance pod number per node through a relief of the competition between pods of different hierarchy within the node. The photoperiodic effect on the development of dominant pods, delaying their elongation and therefore postponing their active growth, extends flowering and allows pod set at positions that are usually dominated. PMID:26512057

  7. What Makes Deeply Encoded Items Memorable? Insights into the Levels of Processing Framework from Neuroimaging and Neuromodulation

    PubMed Central

    Galli, Giulia

    2014-01-01

    When we form new memories, their mnestic fate largely depends upon the cognitive operations set in train during encoding. A typical observation in experimental as well as everyday life settings is that if we learn an item using semantic or “deep” operations, such as attending to its meaning, memory will be better than if we learn the same item using more “shallow” operations, such as attending to its structural features. In the psychological literature, this phenomenon has been conceptualized within the “levels of processing” framework and has been consistently replicated since its original proposal by Craik and Lockhart in 1972. However, the exact mechanisms underlying the memory advantage for deeply encoded items are not yet entirely understood. A cognitive neuroscience perspective can add to this field by clarifying the nature of the processes involved in effective deep and shallow encoding and how they are instantiated in the brain, but so far there has been little work to systematically integrate findings from the literature. This work aims to fill this gap by reviewing, first, some of the key neuroimaging findings on the neural correlates of deep and shallow episodic encoding and second, emerging evidence from studies using neuromodulatory approaches such as psychopharmacology and non-invasive brain stimulation. Taken together, these studies help further our understanding of levels of processing. In addition, by showing that deep encoding can be modulated by acting upon specific brain regions or systems, the reviewed studies pave the way for selective enhancements of episodic encoding processes. PMID:24904444

  8. Whole Abdominal Wall Segmentation using Augmented Active Shape Models (AASM) with Multi-Atlas Label Fusion and Level Set

    PubMed Central

    Xu, Zhoubing; Baucom, Rebeccah B.; Abramson, Richard G.; Poulose, Benjamin K.; Landman, Bennett A.

    2016-01-01

    The abdominal wall is an important structure differentiating subcutaneous and visceral compartments and intimately involved with maintaining abdominal structure. Segmentation of the whole abdominal wall on routinely acquired computed tomography (CT) scans remains challenging due to variations and complexities of the wall and surrounding tissues. In this study, we propose a slice-wise augmented active shape model (AASM) approach to robustly segment both the outer and inner surfaces of the abdominal wall. Multi-atlas label fusion (MALF) and level set (LS) techniques are integrated into the traditional ASM framework. The AASM approach globally optimizes the landmark updates in the presence of complicated underlying local anatomical contexts. The proposed approach was validated on 184 axial slices of 20 CT scans. The Hausdorff distance against the manual segmentation was significantly reduced using the proposed approach compared to ASM, MALF, and LS individually. Our segmentation of the whole abdominal wall enables subcutaneous and visceral fat measurement, with high correlation to the measurement derived from manual segmentation. This study presents the first generic algorithm that combines ASM, MALF, and LS, and demonstrates practical application for automatically capturing visceral and subcutaneous fat volumes. PMID:27127333

  9. Modelling Molecular Mechanisms: A Framework of Scientific Reasoning to Construct Molecular-Level Explanations for Cellular Behaviour

    ERIC Educational Resources Information Center

    van Mil, Marc H. W.; Boerwinkel, Dirk Jan; Waarlo, Arend Jan

    2013-01-01

    Although molecular-level details are part of the upper-secondary biology curriculum in most countries, many studies report that students fail to connect molecular knowledge to phenomena at the level of cells, organs and organisms. Recent studies suggest that students lack a framework to reason about complex systems to make this connection. In this…

  10. Critical Factors to Consider in Evaluating Standard-Setting Studies to Map Language Test Scores to Frameworks of Language Proficiency

    ERIC Educational Resources Information Center

    Tannenbaum, Richard J.; Cho, Yeonsuk

    2014-01-01

    In this article, we consolidate and present in one place what is known about quality indicators for setting standards so that stakeholders may be able to recognize the signs of standard-setting quality. We use the context of setting standards to associate English language test scores with language proficiency descriptions such as those presented…

  11. Levels and confounders of morning cortisol collected from adolescents in a naturalistic (school) setting.

    PubMed

    Kelly, Shona J; Young, Robert; Sweeting, Helen; Fischer, Joachim E; West, Patrick

    2008-10-01

    Salivary cortisol is widely used in research but little is known about the typical, or expected, functioning of the HPA-axis in adolescents in naturalistic settings, nor whether the extensive array of confounders documented in the literature is applicable in this situation. In a school-based study, 2995 15-year-old pupils provided two saliva samples, 30 min apart, in morning sessions timed to capture peak cortisol decline. The collection protocol was a balance between the large sample size obtainable in a school situation and a limited number of samples, constrained by the school timetable. In addition, pupils completed a questionnaire containing items previously shown to be associated with cortisol levels (e.g. time since awakening and life events), and their height and weight were measured. Outcome measures were cortisol levels at Times 1 and 2, and change (per minute) in cortisol between the two time points. Median (IQR) cortisol levels for males and females were 10.5 (8.1) and 11.6 (9.3) nmol/L at Time 1, and 8.2 (6.0) and 8.1 (6.5) nmol/L at Time 2. 73% had a decline in cortisol level of more than 10% across the two time points, compatible with the expected diurnal pattern. In bivariate analyses, cortisol sampled on Monday, time of measurement, time since awakening, prior smoking and several life events were associated with cortisol levels at Times 1 and 2 in both sexes. However, in multivariate analysis, few of these variables remained after controlling for time of measurement and time since awakening and, in addition, the final models differed between the sexes. Two events (friend dying and splitting with a boy/girlfriend) predicted cortisol levels in both sexes while age, maturity, recent eating and smoking were predictors only in males. Several factors associated with cortisol change differed from those observed for absolute levels. Further adjustment for school clustering affected some associations, particularly time of measurement. This study managed many of

  12. High-Order Discontinuous Galerkin Level Set Method for Interface Tracking and Re-Distancing on Unstructured Meshes

    NASA Astrophysics Data System (ADS)

    Greene, Patrick; Nourgaliev, Robert; Schofield, Sam

    2015-11-01

    A new sharp high-order interface tracking method for multi-material flow problems on unstructured meshes is presented. The method combines the marker-tracking algorithm with a discontinuous Galerkin (DG) level set method to implicitly track interfaces. DG projection is used to provide a mapping from the Lagrangian marker field to the Eulerian level set field. For the level set re-distancing, we developed a novel marching method that takes advantage of the unique features of the DG representation of the level set. The method efficiently marches outward from the zero level set with values in the new cells being computed solely from cell neighbors. Results are presented for a number of different interface geometries including ones with sharp corners and multiple hierarchical level sets. The method can robustly handle the level set discontinuities without explicit utilization of solution limiters. Results show that the expected high order (3rd and higher) of convergence for the DG representation of the level set is obtained for smooth solutions on unstructured meshes. High-order re-distancing on irregular meshes is a must for applications where the interfacial curvature is important for underlying physics, such as surface tension, wetting and detonation shock dynamics. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. Information management release number LLNL-ABS-675636.
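    The goal of re-distancing is to rebuild the level set as a signed distance to its own zero set. A brute-force 1-D stand-in (the function name is illustrative; it replaces the paper's DG marching with direct zero-crossing location and distance computation):

    ```python
    import numpy as np

    def redistance_1d(phi, x):
        """Re-distance a 1-D level set field: locate zero crossings by
        linear interpolation, then rebuild phi as the signed distance to
        the nearest crossing. Assumes phi has at least one sign change."""
        crossings = []
        for k in range(len(phi) - 1):
            a, b = phi[k], phi[k + 1]
            if a == 0.0:
                crossings.append(x[k])
            elif a * b < 0.0:  # sign change: interpolate the zero
                crossings.append(x[k] - a * (x[k + 1] - x[k]) / (b - a))
        if phi[-1] == 0.0:
            crossings.append(x[-1])
        crossings = np.asarray(crossings)
        dist = np.min(np.abs(x[:, None] - crossings[None, :]), axis=1)
        return np.sign(phi) * dist

    x = np.linspace(-1.0, 1.0, 201)
    phi = 3.0 * x                    # distorted field, zero crossing at x = 0
    phi_sd = redistance_1d(phi, x)   # recovers the signed distance, i.e. ~x
    ```

    The key property, preserved by the paper's marching method, is that the zero level set does not move: only the values away from the interface are replaced.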

  13. 3-D surface rendering of myocardial SPECT images segmented by level set technique.

    PubMed

    Lee, Hwun-Jae; Lee, Sangbock

    2012-06-01

    SPECT(single photon emission computed tomography) myocardial imaging is a diagnosis technique that images the region of interest and examines any change induced by disease using a computer after injects intravenously a radiopharmaceutical drug emitting gamma ray and the drug has dispersed evenly in the heart . Myocardial perfusion imaging, which contains functional information, is useful for non-invasive diagnosis of myocardial disease but noises caused by physical factors and low resolution give difficulty in reading the images. In order to help reading myocardial images, this study proposed a method that segments myocardial images and reconstructs the segmented region into a 3D image. To resolve difficulty in reading, we segmented the left ventricle, the region of interest, using a level set and modeled the segmented region into a 3D image. PMID:20839037

  14. A GPU-accelerated adaptive discontinuous Galerkin method for level set equation

    NASA Astrophysics Data System (ADS)

    Karakus, A.; Warburton, T.; Aksel, M. H.; Sert, C.

    2016-01-01

    This paper presents a GPU-accelerated nodal discontinuous Galerkin method for the solution of two- and three-dimensional level set (LS) equation on unstructured adaptive meshes. Using adaptive mesh refinement, computations are localised mostly near the interface location to reduce the computational cost. Small global time step size resulting from the local adaptivity is avoided by local time-stepping based on a multi-rate Adams-Bashforth scheme. Platform independence of the solver is achieved with an extensible multi-threading programming API that allows runtime selection of different computing devices (GPU and CPU) and different threading interfaces (CUDA, OpenCL and OpenMP). Overall, a highly scalable, accurate and mass conservative numerical scheme that preserves the simplicity of LS formulation is obtained. Efficiency, performance and local high-order accuracy of the method are demonstrated through distinct numerical test cases.
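    The single-rate building block of the multi-rate scheme mentioned above is the standard second-order Adams-Bashforth (AB2) integrator; the multi-rate variant advances refined cells with proportionally smaller steps. A minimal single-rate sketch (the demo ODE is an illustrative stand-in for the semi-discrete LS operator):

    ```python
    def ab2(f, u0, dt, nsteps):
        """Second-order Adams-Bashforth:
        u_{n+1} = u_n + dt * (3/2 f_n - 1/2 f_{n-1}),
        bootstrapped with one forward-Euler step."""
        u = u0
        f_prev = f(u)
        u = u + dt * f_prev              # Euler startup step
        for _ in range(nsteps - 1):
            f_curr = f(u)
            u = u + dt * (1.5 * f_curr - 0.5 * f_prev)
            f_prev = f_curr
        return u

    # Exponential decay du/dt = -u, u(0) = 1; AB2 approximates exp(-1).
    u_end = ab2(lambda u: -u, 1.0, 0.001, 1000)
    ```

    Because AB2 only needs the previous right-hand-side evaluation, it maps well onto GPU kernels: each step is a single fused update over all nodes.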

  15. Initialisation of 3D level set for hippocampus segmentation from volumetric brain MR images

    NASA Astrophysics Data System (ADS)

    Hajiesmaeili, Maryam; Dehmeshki, Jamshid; Bagheri Nakhjavanlo, Bashir; Ellis, Tim

    2014-04-01

    Shrinkage of the hippocampus is a primary biomarker for Alzheimer's disease and can be measured through accurate segmentation of brain MR images. The paper describes the problem of initialising a 3D level set algorithm for hippocampus segmentation, which must cope with several challenging characteristics, such as the structure's small size, wide range of intensities, narrow width, and shape variation. In addition, MR images require bias correction to account for the inhomogeneity associated with the scanner technology. Due to these inhomogeneities, using a single initialisation seed region inside the hippocampus is prone to failure. Alternative initialisation strategies are explored, such as using multiple initialisations in different sections (the head, body and tail) of the hippocampus. The Dice metric is used to validate our segmentation results with respect to ground truth for a dataset of 25 MR images. Experimental results indicate significant improvement in segmentation performance using the multiple-initialisation techniques, yielding more accurate segmentation results for the hippocampus.
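    A multiple-seed initialisation can be built as the pointwise minimum of signed distances to several spherical seeds, so the initial front already spans the head, body and tail regions. A minimal sketch (function name, seed positions and radius are illustrative, not from the paper):

    ```python
    import numpy as np

    def multi_seed_init(shape, seeds, radius):
        """Initial level set as the union of spherical seeds: at each voxel,
        phi = min over seeds of (distance to seed centre - radius),
        so phi < 0 inside any seed and phi > 0 outside all of them."""
        grid = np.indices(shape).astype(float)       # (ndim, *shape)
        phi = np.full(shape, np.inf)
        for centre in seeds:
            c = np.array(centre, dtype=float).reshape(-1, *([1] * len(shape)))
            d = np.sqrt(((grid - c) ** 2).sum(axis=0)) - radius
            phi = np.minimum(phi, d)                 # union of seed regions
        return phi

    # Three seeds along one axis, mimicking head / body / tail placements.
    phi = multi_seed_init((32, 32, 32),
                          [(8, 16, 16), (16, 16, 16), (24, 16, 16)], 3.0)
    ```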

  16. Numerical Schemes for the Hamilton-Jacobi and Level Set Equations on Triangulated Domains

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.; Sethian, James A.

    1997-01-01

    Borrowing from techniques developed for conservation law equations, numerical schemes which discretize the Hamilton-Jacobi (H-J), level set, and Eikonal equations on triangulated domains are presented. The first scheme is a provably monotone discretization for certain forms of the H-J equations. Unfortunately, the basic scheme lacks proper Lipschitz continuity of the numerical Hamiltonian. By employing a virtual edge flipping technique, Lipschitz continuity of the numerical flux is restored on acute triangulations. Next, schemes are introduced and developed based on the weaker concept of positive coefficient approximations for homogeneous Hamiltonians. These schemes possess a discrete maximum principle on arbitrary triangulations and naturally exhibit proper Lipschitz continuity of the numerical Hamiltonian. Finally, a class of Petrov-Galerkin approximations are considered. These schemes are stabilized via a least-squares bilinear form. The Petrov-Galerkin schemes do not possess a discrete maximum principle but generalize to high order accuracy.
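    The paper works on triangulated domains; the same monotone upwind idea is easiest to see on a structured grid. A sketch of the first-order Godunov discretisation of the Eikonal equation |grad T| = 1/f, solved by Gauss-Seidel sweeps in alternating directions (the function and grid here are illustrative assumptions):

    ```python
    import numpy as np

    def eikonal_sweep(f, src, h=1.0, nsweeps=8):
        """First-order Godunov upwind Eikonal solver on a Cartesian grid.
        Each cell is updated from its smallest upwind neighbours; sweeps
        alternate direction so characteristics from any angle converge."""
        ny, nx = f.shape
        T = np.full((ny, nx), np.inf)
        T[src] = 0.0
        orders = [(range(ny), range(nx)),
                  (range(ny - 1, -1, -1), range(nx)),
                  (range(ny), range(nx - 1, -1, -1)),
                  (range(ny - 1, -1, -1), range(nx - 1, -1, -1))]
        for _ in range(nsweeps):
            for ys, xs in orders:
                for i in ys:
                    for j in xs:
                        if (i, j) == src:
                            continue
                        a = min(T[i - 1, j] if i > 0 else np.inf,
                                T[i + 1, j] if i < ny - 1 else np.inf)
                        b = min(T[i, j - 1] if j > 0 else np.inf,
                                T[i, j + 1] if j < nx - 1 else np.inf)
                        if np.isinf(a) and np.isinf(b):
                            continue          # no upwind information yet
                        rhs = h / f[i, j]
                        if abs(a - b) >= rhs:  # update from one direction only
                            t = min(a, b) + rhs
                        else:                  # genuinely two-dimensional update
                            t = 0.5 * (a + b + np.sqrt(2.0 * rhs ** 2 - (a - b) ** 2))
                        T[i, j] = min(T[i, j], t)
        return T

    T = eikonal_sweep(np.ones((9, 9)), (0, 0))  # distance-like field from a corner
    ```

    The `min` in each update is what makes the scheme monotone, the discrete analogue of the maximum-principle property the paper establishes on triangulations.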

  17. Curvelet initialized level set cell segmentation for touching cells in low contrast images.

    PubMed

    Kaur, Sarabpreet; Sahambi, J S

    2016-04-01

    Cell segmentation is an important element of automatic cell analysis. This paper proposes a method to extract the cell nuclei and the cell boundaries of touching cells in low contrast images. First, the contrast of the low contrast cell images is improved by a combination of a multiscale top-hat filter and h-maxima. Then, a curvelet-initialized level set method is proposed to detect the cell nuclei and the boundaries. The image enhancement results have been verified using PSNR (peak signal-to-noise ratio) and the segmentation results have been verified using accuracy, sensitivity and precision metrics. The results show improved values of the performance metrics with the proposed method. PMID:26922612
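    The multiscale top-hat step can be sketched with grey-scale morphology: a white top-hat (image minus its opening) isolates bright details smaller than the structuring element, and adding the strongest response across scales boosts their contrast. A minimal sketch using SciPy (the h-maxima step from the paper is omitted, and the function name and scales are illustrative):

    ```python
    import numpy as np
    from scipy import ndimage

    def multiscale_tophat_enhance(img, sizes=(3, 5, 7)):
        """Contrast enhancement by multiscale white top-hat: add the
        strongest top-hat response across structuring-element sizes."""
        tophats = [img - ndimage.grey_opening(img, size=s) for s in sizes]
        return img + np.max(tophats, axis=0)

    # A single bright pixel on a flat background doubles in intensity,
    # while the background is left untouched.
    img = np.zeros((15, 15))
    img[7, 7] = 1.0
    enhanced = multiscale_tophat_enhance(img)
    ```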

  18. Wave breaking over sloping beaches using a coupled boundary integral-level set method

    SciTech Connect

    Garzon, M.; Adalsteinsson, D.; Gray, L.; Sethian, J.A.

    2003-12-08

    We present a numerical method for tracking breaking waves over sloping beaches. We use a fully non-linear potential model for incompressible, irrotational and inviscid flow, and consider the effects of beach topography on breaking waves. The algorithm uses a Boundary Element Method (BEM) to compute the velocity at the interface, coupled to a Narrow Band Level Set Method to track the evolving air/water interface, and an associated extension equation to update the velocity potential both on and off the interface. The formulation of the algorithm is applicable to two- and three-dimensional breaking waves; in this paper, we concentrate on two-dimensional results showing wave breaking and rollup, and perform numerical convergence studies and comparison with previous techniques.

  19. Prostate ultrasound image segmentation using level set-based region flow with shape guidance

    NASA Astrophysics Data System (ADS)

    Gong, Lixin; Ng, Lydia; Pathak, Sayan D.; Tutar, Ismail; Cho, Paul S.; Haynor, David R.; Kim, Yongmin

    2005-04-01

    Prostate segmentation in ultrasound images is a clinically important and technically challenging task. Despite several research attempts, few effective methods are available. One problem is the limited algorithmic robustness to common artifacts in clinical data sets. To improve the robustness, we have developed a hybrid level set method, which incorporates shape constraints into a region-based curve evolution process. The online segmentation method alternates between two steps, namely, shape model estimation (ME) and curve evolution (CE). The prior shape information is encoded in an implicit parametric model derived offline from manually outlined training data. Utilizing this prior shape information, the ME step tries to compute the maximum a posteriori estimate of the model parameters. The estimated shape is then used to guide the CE step, which in turn provides a new model initialization for the ME step. The process stops automatically when the curve locks onto the specific prostate shape. The ME and the CE steps complement each other to capture both global and local shape details. With shape guidance, this algorithm is less sensitive to initial contour placement and more robust even in the presence of large boundary gaps and strong clutter. Promising results are demonstrated on both synthetic and real prostate ultrasound images.

  20. An abdominal aortic aneurysm segmentation method: Level set with region and statistical information

    SciTech Connect

    Zhuge Feng; Rubin, Geoffrey D.; Sun Shaohua; Napel, Sandy

    2006-05-15

    We present a system for segmenting the human aortic aneurysm in CT angiograms (CTA), which, in turn, allows measurements of volume and morphological aspects useful for treatment planning. The system estimates a rough 'initial surface', and then refines it using a level set segmentation scheme augmented with two external analyzers: The global region analyzer, which incorporates a priori knowledge of the intensity, volume, and shape of the aorta and other structures, and the local feature analyzer, which uses voxel location, intensity, and texture features to train and drive a support vector machine classifier. Each analyzer outputs a value that corresponds to the likelihood that a given voxel is part of the aneurysm, which is used during level set iteration to control the evolution of the surface. We tested our system using a database of 20 CTA scans of patients with aortic aneurysms. The mean and worst case values of volume overlap, volume error, mean distance error, and maximum distance error relative to human tracing were 95.3% ± 1.4% (s.d.; worst case = 92.9%), 3.5% ± 2.5% (s.d.; worst case = 7.0%), 0.6 ± 0.2 mm (s.d.; worst case = 1.0 mm), and 5.2 ± 2.3 mm (s.d.; worst case = 9.6 mm), respectively. When implemented on a 2.8 GHz Pentium IV personal computer, the mean time required for segmentation was 7.4 ± 3.6 min (s.d.). We also performed experiments that suggest that our method is insensitive to parameter changes within 10% of their experimentally determined values. This preliminary study proves feasibility for an accurate, precise, and robust system for segmentation of the abdominal aneurysm from CTA data, and may be of benefit to patients with aortic aneurysms.
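    Blending the two analyzer likelihoods into a signed speed is the essential coupling step: the front should expand where both analyzers agree a voxel is aneurysm and retreat elsewhere. A minimal sketch (the weighting and the mapping to [-1, 1] are illustrative choices, not the paper's exact formulation):

    ```python
    def combined_speed(p_region, p_local, w=0.5, base=1.0):
        """Blend two likelihoods in [0, 1] (higher = more likely aneurysm)
        into a level set speed: positive speed expands the surface,
        negative speed contracts it."""
        p = w * p_region + (1.0 - w) * p_local
        return base * (2.0 * p - 1.0)

    # Full agreement "aneurysm" drives expansion, full agreement
    # "background" drives contraction, ambiguity stalls the front.
    s_in = combined_speed(1.0, 1.0)
    s_out = combined_speed(0.0, 0.0)
    s_mid = combined_speed(0.5, 0.5)
    ```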

  1. Texture analysis improves level set segmentation of the anterior abdominal wall

    SciTech Connect

    Xu, Zhoubing; Allen, Wade M.; Baucom, Rebeccah B.; Poulose, Benjamin K.; Landman, Bennett A.

    2013-12-15

    Purpose: The treatment of ventral hernias (VH) has been a challenging problem for medical care. Repair of these hernias is fraught with failure; recurrence rates ranging from 24% to 43% have been reported, even with the use of biocompatible mesh. Currently, computed tomography (CT) is used to guide intervention through expert, but qualitative, clinical judgments; notably, quantitative metrics based on image-processing are not used. The authors propose that image segmentation methods to capture the three-dimensional structure of the abdominal wall and its abnormalities will provide a foundation on which to measure geometric properties of hernias and surrounding tissues and, therefore, to optimize intervention. Methods: In this study with 20 clinically acquired CT scans on postoperative patients, the authors demonstrated a novel approach to geometric classification of the abdominal wall. The authors' approach uses a texture analysis based on Gabor filters to extract feature vectors and follows a fuzzy c-means clustering method to estimate voxelwise probability memberships for eight clusters. The memberships estimated from the texture analysis are helpful to identify anatomical structures with inhomogeneous intensities. The membership was used to guide the level set evolution, as well as to derive an initial start close to the abdominal wall. Results: Segmentation results on abdominal walls were both quantitatively and qualitatively validated with surface errors based on manually labeled ground truth. Using texture, mean surface errors for the outer surface of the abdominal wall were less than 2 mm, with 91% of the outer surface less than 5 mm away from the manual tracings; errors were significantly greater (2–5 mm) for methods that did not use the texture. Conclusions: The authors' approach establishes a baseline for characterizing the abdominal wall for improving VH care. Inherent texture patterns in CT scans are helpful to the tissue classification, and texture
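    The voxelwise memberships come from the standard fuzzy c-means update, where each sample's membership in a cluster falls off with its relative distance to that cluster's centre. A minimal sketch of the membership formula (function name and toy data are illustrative; the paper feeds Gabor feature vectors into this step):

    ```python
    import numpy as np

    def fcm_memberships(X, centers, m=2.0):
        """Fuzzy c-means membership update:
        u[k, i] = 1 / sum_j (d[k, i] / d[k, j]) ** (2 / (m - 1))
        for samples X of shape (n, d) and cluster centres (c, d).
        Each row of the result sums to 1."""
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)  # (n, c)
        d = np.maximum(d, 1e-12)                  # avoid division by zero
        ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
        return 1.0 / ratio.sum(axis=2)

    # Two well-separated 1-D clusters: memberships are near-crisp.
    X = np.array([[0.0], [1.0], [10.0]])
    C = np.array([[0.0], [10.0]])
    U = fcm_memberships(X, C)
    ```

    In the full algorithm this update alternates with recomputing the centres as membership-weighted means until convergence; the resulting soft memberships are what seed and steer the level set.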

  2. Teachers' Lives in Context: A Framework for Understanding Barriers to High Quality Teaching within Resource Deprived Settings

    ERIC Educational Resources Information Center

    Schwartz, Kate; Cappella, Elise; Aber, J. Lawrence

    2016-01-01

    Within low-income communities in low- and high-resource countries, there is a profound need for more effective schools that are better able to foster child and youth development and support student learning. This paper presents a theoretical framework for understanding the role of teacher ecology in influencing teacher effectiveness and, through…

  3. Information Seen as Part of the Development of Living Intelligence: the Five-Leveled Cybersemiotic Framework for FIS

    NASA Astrophysics Data System (ADS)

    Brier, Soren

    2003-06-01

    It is argued that a true transdisciplinary information science going from physical information to phenomenological understanding needs a metaphysical framework. Three different kinds of causality are implied: efficient, formal and final. And at least five different levels of existence are needed: 1. The quantum vacuum fields with entangled causation. 2. The physical level with its energy- and force-based efficient causation. 3. The informational-chemical level with its formal causation based on pattern fitting. 4. The biological-semiotic level with its non-conscious final causation and 5. The social-linguistic level of self-consciousness with its conscious goal-oriented final causation. To integrate these consistently in an evolutionary theory as emergent levels, neither mechanical determinism nor complexity theory is sufficient, because they cannot be a foundation for a theory of lived meaning. C. S. Peirce's triadic semiotic philosophy combined with a cybernetic and systemic view, like N. Luhmann's, could create the framework I call Cybersemiotics.

  4. A Framework for the Study of Multiple Realizations: The Importance of Levels of Analysis

    PubMed Central

    Overgaard, Morten; Mogensen, Jesper

    2010-01-01

    The brain may undergo functional reorganizations. Selective loss of sensory input or training within a restricted part of a modality cause “shifts” within for instance somatotopic or tonotopic maps. Cross-modal plasticity occurs when input within a modality is absent – e.g., in the congenitally blind. Reorganizations are also found in functional recovery after brain injury. Focusing on such reorganizations, it may be studied whether a cognitive or conscious process can exclusively be mediated by one neural substrate – or may be associated with multiple neural representations. This is typically known as the problem of multiple realization – an essentially empirical issue with wide theoretical implications. This issue may appear to have a simple solution. When, for instance, the symptoms associated with brain injury disappear and the recovery is associated with increased activities within spared regions of the brain, it is tempting to conclude that the processes originally associated with the injured part of the brain are now mediated by an alternative neural substrate. Such a conclusion is, however, not a simple matter. Without a more thorough analysis, it cannot be concluded that a functional recovery of for instance language or attention is necessarily associated with a novel representation of the processes lost to injury. Alternatively, for instance, the recovery may reflect that apparently similar surface phenomena are obtained via dissimilar cognitive mechanisms. In this paper we propose a theoretical framework, which we believe can guide the design and interpretations of studies of post-traumatic recovery. It is essential to distinguish between a number of levels of analysis – including a differentiation between the surface phenomena and the underlying information processing – when addressing, for instance, whether a pre-traumatic and post-traumatically recovered cognitive or conscious process are actually the same. We propose a (somewhat

  5. Landscape response to base-level fall in extensional settings: Amargosa River, Basin and Range, USA

    NASA Astrophysics Data System (ADS)

    Smith, J.; Brocklehurst, S. H.; Gawthorpe, R. L.; Finch, E.

    2012-12-01

    Studies examining transient landscapes within rift basins generally focus on settings where changes in boundary conditions are driven by active tectonics. However, the effect of drainage network re-organisation on landscape development and sediment routing has received significantly less attention. Within active rift settings it is common for drainage networks to become fragmented as uplift rates overcome the erosive potential of streams, while subsidence generates under-filled basins. On a regional scale this results in poorly integrated drainage systems consisting of numerous internally drained basins. Integration can occur through the filling of sub-basins, lake over-spill, or drainage capture. This may dramatically affect base-level, catchment size, sediment flux and fluvial geomorphology, providing a natural experiment in fluvial response to changing boundary conditions, as well as representing a fundamental control on the ultimate preservation of sediments. We combine field and remote mapping with the available dating to investigate an example of late Pleistocene drainage integration in the southern Basin and Range, where drainage integration has resulted in a base-level fall and rejuvenation of the upstream landscape, triggering further drainage rearrangement. The Amargosa River was previously part of an internally-drained basin, feeding the former Lake Tecopa. Drainage capture at 150-200 ka caused the Amargosa River to flow into Death Valley, carving the Amargosa Canyon through the Sperry Hills. The canyon itself has experienced aggradation as well as incision, with both terraces and fans representing levels above the current river. Upstream of the Amargosa Canyon, incision is reflected by minor knickpoints, and gullying along tributaries. For the now westwards-flowing Willow Wash, the net incision of Amargosa Canyon has resulted in spectacular headward erosion, dissecting fan surfaces which previously graded northwest to Lake Tecopa. The Willow Wash

  6. Generalized cost-effectiveness analysis for national-level priority-setting in the health sector

    PubMed Central

    Hutubessy, Raymond; Chisholm, Dan; Edejer, Tessa Tan-Torres

    2003-01-01

    Cost-effectiveness analysis (CEA) is potentially an important aid to public health decision-making but, with some notable exceptions, its use and impact at the level of individual countries is limited. A number of potential reasons may account for this, among them technical shortcomings associated with the generation of current economic evidence, political expediency, social preferences and systemic barriers to implementation. As a form of sectoral CEA, Generalized CEA sets out to overcome a number of these barriers to the appropriate use of cost-effectiveness information at the regional and country level. Its application via WHO-CHOICE provides a new economic evidence base, as well as underlying methodological developments, concerning the cost-effectiveness of a range of health interventions for leading causes of, and risk factors for, disease. The estimated sub-regional costs and effects of different interventions provided by WHO-CHOICE can readily be tailored to the specific context of individual countries, for example by adjustment to the quantity and unit prices of intervention inputs (costs) or the coverage, efficacy and adherence rates of interventions (effectiveness). The potential usefulness of this information for health policy and planning is in assessing if current intervention strategies represent an efficient use of scarce resources, and which of the potential additional interventions that are not yet implemented, or not implemented fully, should be given priority on the grounds of cost-effectiveness. Health policy-makers and programme managers can use results from WHO-CHOICE as a valuable input into the planning and prioritization of services at national level, as well as a starting point for additional analyses of the trade-off between the efficiency of interventions in producing health and their impact on other key outcomes such as reducing inequalities and improving the health of the poor. PMID:14687420
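
The kind of priority ranking the abstract describes, comparing interventions by cost per unit of health gain against a null scenario, can be sketched as follows. All intervention names and numbers here are illustrative placeholders, not WHO-CHOICE estimates.

```python
# Toy ranking of interventions by average cost-effectiveness ratio (ACER),
# i.e. cost per DALY averted relative to doing nothing, in the spirit of
# generalized CEA. Figures are invented for illustration only.
interventions = {
    "intervention_A": {"cost": 120_000.0, "dalys_averted": 1_500.0},
    "intervention_B": {"cost": 300_000.0, "dalys_averted": 2_000.0},
    "intervention_C": {"cost":  90_000.0, "dalys_averted":   400.0},
}

ranked = sorted(interventions,
                key=lambda k: interventions[k]["cost"] /
                              interventions[k]["dalys_averted"])
# cost per DALY averted: A = 80, B = 150, C = 225
```

A real analysis would also adjust unit prices and coverage to the country context, as the abstract notes, before ranking.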

  7. Standard Setting in Relation to the Common European Framework of Reference for Languages: The Case of the State Examination of Dutch as a Second Language

    ERIC Educational Resources Information Center

    Bechger, Timo M.; Kuijper, Henk; Maris, Gunter

    2009-01-01

    This article reports on two related studies carried out to link the State examination of Dutch as a second language to the Common European Framework of Reference for languages (CEFR). In the first study, key persons from institutions for higher education were asked to determine the minimally required language level of beginning students. In the…

  8. An ecofeminist conceptual framework to explore gendered environmental health inequities in urban settings and to inform healthy public policy.

    PubMed

    Chircop, Andrea

    2008-06-01

    This theoretical exploration is an attempt to conceptualize the link between gender and urban environmental health. The proposed ecofeminist framework enables an understanding of the link between the urban physical and social environments and health inequities mediated by gender and socioeconomic status. This framework is proposed as a theoretical magnifying glass to reveal the underlying logic that connects environmental exploitation on the one hand, and gendered health inequities on the other. Ecofeminism has the potential to reveal an inherent, normative conceptual analysis and argumentative justification of western society that permits the oppression of women and the exploitation of the environment. This insight will contribute to a better understanding of the mechanisms underlying gendered environmental health inequities and inform healthy public policy that is supportive of urban environmental health, particularly for low-income mothers. PMID:18476856

  9. 5-SPICE: the application of an original framework for community health worker program design, quality improvement and research agenda setting

    PubMed Central

    Palazuelos, Daniel; DaEun Im, Dana; Peckarsky, Matthew; Schwarz, Dan; Farmer, Didi Bertrand; Dhillon, Ranu; Johnson, Ari; Orihuela, Claudia; Hackett, Jill; Bazile, Junior; Berman, Leslie; Ballard, Madeleine; Panjabi, Raj; Ternier, Ralph; Slavin, Sam; Lee, Scott; Selinsky, Steve; Mitnick, Carole Diane

    2013-01-01

    Introduction Despite decades of experience with community health workers (CHWs) in a wide variety of global health projects, there is no established conceptual framework that structures how implementers and researchers can understand, study and improve their respective programs based on lessons learned by other CHW programs. Objective To apply an original, non-linear framework and case study method, 5-SPICE, to multiple sister projects of a large, international non-governmental organization (NGO), and to other CHW projects. Design Engaging a large group of implementers, researchers and the best available literature, the 5-SPICE framework was refined and then applied to a selection of CHW programs. Insights gleaned from the case study method were summarized in a tabular format named the ‘5×5-SPICE chart’. This format graphically lists the ways in which essential CHW program elements interact, both positively and negatively, in the implementation field. Results The 5×5-SPICE charts reveal a variety of insights that come from a more complex understanding of how essential CHW program elements interact and influence each other in their unique context. Some have been well described in the literature previously, while others are exclusive to this article. An analysis of how best to compensate CHWs is also offered as an example of the type of insight that this method may yield. Conclusions The 5-SPICE framework is a novel instrument that can be used to guide discussions about CHW projects. Insights from this process can help guide quality improvement efforts, or be used as hypotheses that form the basis of a program's research agenda. Recent experience with research protocols embedded into successfully implemented projects demonstrates how such hypotheses can be rigorously tested. PMID:23561023

  10. Setting priorities for the adoption of health technologies on a national level -- the Israeli experience.

    PubMed

    Shani, S; Siebzehner, M I; Luxenburg, O; Shemer, J

    2000-12-01

    its recommended list with minor changes within a limited timeframe. In conclusion, we propose a practical and pragmatic model for the inclusion of new health technologies at a national level, based on health technology assessment and explicit priority setting. PMID:11154787

  11. An efficient, interface-preserving level set redistancing algorithm and its application to interfacial incompressible fluid flow

    SciTech Connect

    Sussman, M. (Dept. of Mathematics); Fatemi, E.

    1999-04-01

    In Sussman, Smereka, and Osher, a numerical scheme was presented for computing incompressible air-water flows using the level set method. Crucial to the above method was a new iteration method for maintaining the level set function as the signed distance from the zero level set. In this paper the authors implement a constraint along with higher order difference schemes in order to make the iteration method more accurate and efficient. Accuracy is measured in terms of the new computed signed distance function and the original level set function having the same zero level set. The authors apply the redistancing scheme to incompressible flows with noticeably better resolved results at reduced cost. They validate the results with experiment and theory. They show that the distance level set scheme with the added constraint competes well with available interface tracking schemes for basic advection of an interface. They perform basic accuracy checks and more stringent tests involving complicated interfacial structures. As with all level set schemes, the method is easy to implement.
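
The core redistancing idea, iterating the level set function toward the signed distance from its zero level set, can be sketched in one dimension with a first-order Godunov upwind scheme. This is an illustrative simplification; the paper's constrained, higher-order scheme is more elaborate.

```python
import numpy as np

def redistance(phi0, dx, iterations=400):
    """Iterate d(phi)/d(tau) = sign(phi0) * (1 - |grad phi|) so that phi
    approaches the signed distance to the zero level set of phi0.
    First-order Godunov upwind sketch in 1-D."""
    dtau = 0.5 * dx                        # CFL-stable pseudo-time step
    phi = phi0.astype(float).copy()
    S = phi0 / np.sqrt(phi0**2 + dx**2)    # smoothed sign function
    for _ in range(iterations):
        dm = np.diff(phi, prepend=phi[0]) / dx   # backward difference
        dp = np.diff(phi, append=phi[-1]) / dx   # forward difference
        # Godunov upwinding for |grad phi|, branch on the sign of phi0
        grad_pos = np.sqrt(np.maximum(np.maximum(dm, 0.0)**2,
                                      np.minimum(dp, 0.0)**2))
        grad_neg = np.sqrt(np.maximum(np.minimum(dm, 0.0)**2,
                                      np.maximum(dp, 0.0)**2))
        grad = np.where(S > 0, grad_pos, grad_neg)
        phi = phi + dtau * S * (1.0 - grad)
    return phi

x = np.linspace(-1.0, 1.0, 201)
phi0 = (x - 0.2) * (2.0 + np.cos(8.0 * x))  # same zero crossing as x - 0.2
phi = redistance(phi0, dx=x[1] - x[0])
# away from the boundaries, |d(phi)/dx| should now be close to 1
```

The accuracy criterion in the abstract, that the redistanced function keep the same zero level set as the original, corresponds here to the zero crossing staying at x = 0.2.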

  12. A Patient-Centred Redesign Framework to Support System-Level Process Changes for Multimorbidities and Chronic Conditions.

    PubMed

    Sampalli, Tara; Edwards, Lynn; Christian, Erin; Kohler, Graeme; Bedford, Lisa; Demmons, Jillian; Verma, Jennifer; Gibson, Rick; Carson, Shannon Ryan

    2015-01-01

    Recent trends show an increase in the prevalence and costs associated with managing individuals with multimorbidities. Enabling better care for these individuals requires system-level changes such as the shift from a focus on a single disease or single service to multimorbidities and integrated systems of care. In this paper, a novel patient-centred redesign framework that was developed to support system-level process changes in four service areas has been discussed. The novelty of this framework is that it is embedded in patient perspectives and in the chronic care model as the theoretical foundation. The aims of this paper are to present an application of the framework in the context of four chronic disease prevention and management services, and to discuss early results from the pilot initiative along with an overview of the spread opportunities for this initiative. PMID:26718252

  13. Low-level 14C methane oxidation rate measurements modified for remote field settings

    NASA Astrophysics Data System (ADS)

    Pack, M. A.; Pohlman, J.; Ruppel, C. D.; Xu, X.

    2012-12-01

    Aerobic methane oxidation limits atmospheric methane emissions from degraded subsea permafrost and dissociated methane hydrates in high latitude oceans. Methane oxidation rate measurements are a crucial tool for investigating the efficacy of this process, but are logistically challenging when working on small research vessels in remote settings. We modified a low-level 14C-CH4 oxidation rate measurement for use in the Beaufort Sea above hydrate-bearing sediments during August 2012. Application of the more common 3H-CH4 rate measurement, which uses 10⁶ times more radioactivity, was not practical because the R/V Ukpik cannot accommodate a radiation van. The low-level 14C measurement does not require a radiation van, but careful isolation of the 14C label is essential to avoid contaminating natural-abundance 14C measurements. We used 14C-CH4 with a total activity of 1.1 μCi, which is far below the 100 μCi permitting level. In addition, we modified field procedures to simplify and shorten sample processing. The original low-level 14C-CH4 method requires 6 steps in the field: (1) collect water samples in glass serum bottles, (2) inject 14C-CH4 into the bottles, (3) incubate for 24 hours, (4) filter to separate the methanotrophic bacterial cells from the aqueous sample, (5) kill the filtrate with sodium hydroxide (NaOH), and (6) purge with nitrogen to remove unused 14C-CH4. Onshore, the 14C-CH4 respired to carbon dioxide or incorporated into cell material by methanotrophic bacteria during incubation is quantified by accelerator mass spectrometry (AMS). We conducted an experiment to test the possibility of storing samples for purging and filtering back onshore (steps 4 and 6). We subjected a series of water samples to steps 1-3 and 5, preserving with mercuric chloride (HgCl2) instead of NaOH because HgCl2 is less likely to break down cell material during storage. The 14C content of the carbon dioxide in samples preserved with HgCl2 and stored for up to 2 weeks was stable.

  14. Detection of colonic polyp candidates with level set-based thickness mapping over the colon wall

    NASA Astrophysics Data System (ADS)

    Han, Hao; Li, Lihong; Duan, Chaijie; Zhao, Yang; Wang, Huafeng; Liang, Zhengrong

    2015-03-01

    Further improvement of computer-aided detection (CADe) of colonic polyps is vital to advance computed tomographic colonography (CTC) toward a screening modality. The detection of flat polyps is especially challenging because limited image features can be extracted from them, and traditional geometric features-based CADe methods usually fail to detect such polyps. In this paper, we present a novel pipeline to automatically detect initial polyp candidates (IPCs), especially flat polyps, from CTC images. First, the colon wall mucosa was extracted via a partial volume segmentation approach as a volumetric layer, where the inner border of the colon wall can be obtained by shrinking the volumetric layer using level set based adaptive convolution. Then the outer border of the colon wall (the colon wall serosa) was segmented via a combined implementation of geodesic active contour and the Mumford-Shah functional in a coarse-to-fine manner. Finally, the wall thickness was estimated along a unique path between the segmented inner and outer borders with consideration of the volumetric layers and was mapped onto a patient-specific three-dimensional (3D) colon wall model. The IPC detection results can usually be better visualized in a 2D image flattened from the 3D model, where abnormalities were detected by Z-score transformation of the thickness values. The proposed IPC detection approach was validated on 11 patients with 22 CTC scans, where each scan has at least one flat polyp annotation. The presented pipeline was effective in detecting some flat polyps that were missed by our CADe system while keeping false detections at a relatively low level. This preliminary study indicates that the presented pipeline can be incorporated into an existing CADe system to enhance the polyp detection power, especially for flat polyps.
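
The final detection step above, flagging abnormal wall thickness by Z-score transformation, reduces to a few lines. This is a simplified stand-in for the paper's patient-specific thickness mapping; the function name and threshold are illustrative choices.

```python
import numpy as np

def zscore_candidates(thickness, z_thresh=2.0):
    """Flag wall-thickness outliers as initial polyp candidates (IPCs)
    via Z-score transformation of the thickness values."""
    t = np.asarray(thickness, dtype=float)
    z = (t - t.mean()) / t.std()      # standardize per patient
    return z, z > z_thresh            # scores and candidate mask

# illustrative thickness samples (mm) with one abnormally thick spot
thick = np.array([1.0, 1.1, 0.9, 1.0, 1.2, 3.5, 1.0, 1.1])
z, mask = zscore_candidates(thick)
```

In the paper the thickness values live on a flattened 2D colon wall image; the same transformation applies per pixel.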

  15. A New Ghost Cell/Level Set Method for Moving Boundary Problems: Application to Tumor Growth

    PubMed Central

    Macklin, Paul

    2011-01-01

    In this paper, we present a ghost cell/level set method for the evolution of interfaces whose normal velocity depends upon the solutions of linear and nonlinear quasi-steady reaction-diffusion equations with curvature-dependent boundary conditions. Our technique includes a ghost cell method that accurately discretizes normal-derivative jump boundary conditions without smearing jumps in the tangential derivative; a new iterative method for solving linear and nonlinear quasi-steady reaction-diffusion equations; an adaptive discretization to compute the curvature and normal vectors; and a new discrete approximation to the Heaviside function. We present numerical examples that demonstrate better than 1.5-order convergence for problems where traditional ghost cell methods either fail to converge or attain at best sub-linear accuracy. We apply our techniques to a model of tumor growth in complex, heterogeneous tissues that consists of a nonlinear nutrient equation and a pressure equation with geometry-dependent jump boundary conditions. We simulate the growth of glioblastoma (an aggressive brain tumor) into a large, 1 cm square of brain tissue that includes heterogeneous nutrient delivery and varied biomechanical characteristics (white matter, gray matter, cerebrospinal fluid, and bone), and we observe growth morphologies that are highly dependent upon the variations of the tissue characteristics, an effect observed in real tumor growth. PMID:21331304
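
Two ingredients the abstract names, a smoothed Heaviside approximation and curvature computed from the level set function, can be illustrated with plain central differences. This is a generic sketch, not the paper's adaptive or ghost cell discretizations.

```python
import numpy as np

def curvature(phi, dx):
    """kappa = div( grad(phi) / |grad(phi)| ) via central differences."""
    py, px = np.gradient(phi, dx)           # gradients along rows (y), cols (x)
    norm = np.sqrt(px**2 + py**2) + 1e-12   # avoid division by zero
    return (np.gradient(py / norm, dx, axis=0) +
            np.gradient(px / norm, dx, axis=1))

def heaviside(phi, eps):
    """A smoothed Heaviside approximation, H = (1 + (2/pi) atan(phi/eps)) / 2."""
    return 0.5 * (1.0 + (2.0 / np.pi) * np.arctan(phi / eps))

x = np.linspace(-2.0, 2.0, 201)
dx = x[1] - x[0]
X, Y = np.meshgrid(x, x)
phi = np.sqrt(X**2 + Y**2) - 1.0   # signed distance to the unit circle
kappa = curvature(phi, dx)
# on the interface (e.g. at the point (1, 0)), kappa should approximate 1/R = 1
```

For a circle of radius R the curvature of the zero level set is 1/R, which gives a quick sanity check on any discretization.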

  16. Two-phase electro-hydrodynamic flow modeling by a conservative level set model.

    PubMed

    Lin, Yuan

    2013-03-01

    The principles of electro-hydrodynamic (EHD) flow have been known for more than a century and have been adopted for various industrial applications, for example, fluid mixing and demixing. Analytical solutions of such EHD flow exist only in a limited number of scenarios, for example, predicting a small deformation of a single droplet in a uniform electric field. Numerical modeling of such phenomena can provide significant insights into EHD multiphase flows. During the last decade, many numerical results have been reported, providing novel and useful tools for studying multiphase EHD flow. Based on a conservative level set method, the proposed model is able to simulate large deformations of a droplet under a steady electric field, beyond the regime of theoretical prediction. The model is validated for both leaky dielectrics and perfect dielectrics, and is found to be in excellent agreement with existing analytical solutions and numerical studies in the literature. Furthermore, simulations of the deformation of a water droplet in decyl alcohol under a steady electric field match published experimental data better than the theoretical prediction for large deformations. The proposed model can therefore serve as a practical and accurate tool for simulating two-phase EHD flow. PMID:23161380

  17. Cerebral Arteries Extraction using Level Set Segmentation and Adaptive Tracing for CT Angiography

    SciTech Connect

    Zhang Yong; Zhou Xiaobo; Srinivasan, Ranga; Wong, Stephen T. C.; Young, Geoff

    2007-11-02

    We propose an approach for extracting cerebral arteries from partial Computed Tomography Angiography (CTA). The challenges of extracting cerebral arteries from CTA come from the fact that arteries are usually surrounded by bones and veins in the lower portion of a CTA volume. There is strong intensity-value overlap between vessels and surrounding objects. Besides, it is inappropriate to assume that the 2D cross sections of arteries are circles or ellipses, especially for abnormal vessels. The direction of the arteries can change suddenly in 3D space. In this paper, a method based on level set segmentation is proposed to address this challenging problem. For the lower portion of a CTA volume, we use a geodesic active contour method to detect cross sections of arteries in 2D space. The medial axis of the artery is obtained by adaptively tracking along its path. This is done by finding the minimal cross section obtained by cutting the arteries at different angles in 3D spherical space. The method is highly automated, requiring minimal user input: only the starting point and initial direction of the arteries of interest.

  18. Measurement of thermally ablated lesions in sonoelastographic images using level set methods

    NASA Astrophysics Data System (ADS)

    Castaneda, Benjamin; Tamez-Pena, Jose Gerardo; Zhang, Man; Hoyt, Kenneth; Bylund, Kevin; Christensen, Jared; Saad, Wael; Strang, John; Rubens, Deborah J.; Parker, Kevin J.

    2008-03-01

    The capability of sonoelastography to detect lesions based on elasticity contrast can be applied to monitor the creation of thermally ablated lesions. Currently, segmentation of lesions depicted in sonoelastographic images is performed manually, which can be a time-consuming process prone to significant intra- and inter-observer variability. This work presents a semi-automated segmentation algorithm for sonoelastographic data. The user starts by planting a seed in the perceived center of the lesion. Fast marching methods use this information to create an initial estimate of the lesion. Subsequently, level set methods refine its final shape by attaching the segmented contour to edges in the image while maintaining smoothness. The algorithm is applied to in vivo sonoelastographic images from twenty-five thermally ablated lesions created in porcine livers. The estimated area is compared to results from manual segmentation and gross pathology images. Results show that the algorithm outperforms manual segmentation in accuracy and inter- and intra-observer variability. The processing time per image is significantly reduced.
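
The two-stage pipeline above (seed, initial estimate, smooth refinement) can be caricatured in a few lines. A breadth-first region grow stands in for fast marching, and morphological closing stands in for the level set smoothing; both substitutions, and all parameter values, are illustrative assumptions rather than the authors' algorithm.

```python
import numpy as np
from collections import deque
from scipy import ndimage

def segment_lesion(img, seed, tol=0.2, smooth_iters=2):
    """Seeded two-stage segmentation sketch: BFS region grow from the user
    seed (stand-in for fast marching), then morphological smoothing of the
    boundary (stand-in for the level set refinement)."""
    h, w = img.shape
    ref = img[seed]
    mask = np.zeros_like(img, dtype=bool)
    q = deque([seed])
    mask[seed] = True
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(img[ny, nx] - ref) < tol):
                mask[ny, nx] = True
                q.append((ny, nx))
    # smooth the boundary while keeping it attached to the lesion edge
    return ndimage.binary_closing(mask, iterations=smooth_iters)

# synthetic "lesion": a bright disk on a dark background, seed in its center
yy, xx = np.mgrid[0:64, 0:64]
img = (((yy - 32)**2 + (xx - 32)**2) < 10**2).astype(float)
mask = segment_lesion(img, seed=(32, 32))
```

On real sonoelastographic data the elasticity contrast is far noisier, which is exactly why the edge-attached level set refinement matters.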

  19. Cell spreading analysis with directed edge profile-guided level set active contours.

    PubMed

    Ersoy, I; Bunyak, F; Palaniappan, K; Sun, M; Forgacs, G

    2008-01-01

    Cell adhesion and spreading within the extracellular matrix (ECM) play an important role in cell motility, cell growth and tissue organization. Measuring cell spreading dynamics enables the investigation of cell mechanosensitivity to external mechanical stimuli, such as substrate rigidity. A common approach to measure cell spreading dynamics is to take time-lapse images and quantify cell size and perimeter as a function of time. In our experiments, differences in cell characteristics between different treatments are subtle and require accurate measurements of cell parameters across a large population of cells to ensure an adequate sample size for statistical hypothesis testing. This paper presents a new approach to estimate accurate cell boundaries with complex shapes by applying a modified geodesic active contour level set method that directly utilizes the halo effect typically seen in phase contrast microscopy. Contour evolution is guided by edge profiles in the perpendicular direction to ensure convergence to the correct cell boundary. The proposed approach is tested on bovine aortic endothelial cell images under different treatments, and demonstrates accurate segmentation for a wide range of cell sizes and shapes compared to manual ground truth. PMID:18979769

  20. A level set method for solid-liquid interface tracking in texturally equilibrated pore networks

    NASA Astrophysics Data System (ADS)

    Ghanbarzadeh, Soheil; Hesse, Marc; Prodanovic, Masa

    2015-04-01

    The properties of some porous media are determined by their evolution towards textural equilibrium. Melt drainage from temperate glacier ice and the accumulation of hydrocarbons beneath rock salt are two examples in natural systems. In these materials, pore geometry evolves to minimize the solid-liquid interfacial energy while maintaining the dihedral angle, θ, at solid-liquid contact lines. In this work we present the first computations of 3-D texturally equilibrated pore networks using a novel level set method. Interfacial energy minimization is achieved by evolving the interface under surface diffusion toward a constant mean curvature surface. The porosity and dihedral angle constraints are added to the formulation using virtual velocity terms. A domain decomposition scheme is devised to restrict the computational domain, and the coupling between the interfaces is achieved on the original computational domain. For the last 30 years, explicit representation of the interfaces has limited such computations to highly idealized geometries. The presented model overcomes these limitations and opens the door to the exploration of the physics of these materials in realistic systems. For example, our results show that fully wetted grain boundaries exist even for θ>0, which reconciles the theory with experimental observations. This work is sponsored by the Statoil Fellows Program at The University of Texas.

  1. Coupled Segmentation of Nuclear and Membrane-bound Macromolecules through Voting and Multiphase Level Set

    PubMed Central

    Wen, Quan

    2014-01-01

    Membrane-bound macromolecules play an important role in tissue architecture and cell-cell communication, and are regulated by almost one-third of the genome. At the optical scale, one group of membrane proteins expresses itself as linear structures along cell surface boundaries, while others are sequestered; this paper targets the former group. Segmentation of these membrane proteins on a cell-by-cell basis enables the quantitative assessment of localization for comparative analysis. However, such membrane proteins typically lack continuity, and their intensity distributions are often very heterogeneous; moreover, nuclei can form large clumps, which further impedes the quantification of membrane signals on a cell-by-cell basis. To tackle these problems, we introduce a three-step process to (i) regularize the membrane signal through iterative tangential voting, (ii) constrain the location of surface proteins by nuclear features, where clumps of nuclei are segmented through a Delaunay triangulation approach, and (iii) assign membrane-bound macromolecules to individual cells through an application of multiphase geodesic level sets. We have validated our method using both synthetic data and a dataset of 200 images, and demonstrate the efficacy of our approach with superior performance. PMID:25530633
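
Step (iii) above, assigning membrane signal to individual cells, can be approximated by nearest-nucleus assignment once nuclei centroids are known from the clump-splitting step. This is a simplified Voronoi-style stand-in for the paper's multiphase geodesic level set partition; all coordinates are invented for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

# nuclei centroids, e.g. from a Delaunay-based clump-splitting step
nuclei = np.array([[10.0, 10.0], [30.0, 12.0], [20.0, 30.0]])

# membrane-signal pixel coordinates to be assigned cell-by-cell
membrane_px = np.array([[11.0, 9.0], [29.0, 13.0], [19.0, 31.0], [15.0, 18.0]])

# nearest-nucleus lookup: owner[i] is the index of the cell that
# membrane pixel i is assigned to
tree = cKDTree(nuclei)
_, owner = tree.query(membrane_px)
```

A geodesic level set partition would additionally respect image intensities and cell boundaries rather than pure Euclidean distance, which is why the paper's approach outperforms such a naive assignment on clumped tissue.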

  2. Semi-Automated Detection of Surface Degradation on Bridges Based on a Level Set Method

    NASA Astrophysics Data System (ADS)

    Masiero, A.; Guarnieri, A.; Pirotti, F.; Vettore, A.

    2015-08-01

    Due to the effects of climatic factors, natural phenomena and human usage, buildings and infrastructures are subject to progressive degradation. The deterioration of these structures has to be monitored in order to avoid hazards to human beings and to the natural environment in their neighborhood. Hence, on the one hand, monitoring such infrastructures is of primary importance. On the other hand, this monitoring is currently mostly done by expert and skilled personnel, who follow the overall data acquisition, analysis and result reporting process, making the whole monitoring procedure quite expensive for public (and private) agencies. This paper proposes a partially user-assisted procedure in order to reduce the monitoring cost and to make the obtained result less subjective. The developed method relies on images acquired with standard cameras by even inexperienced personnel. Deterioration on the infrastructure surface is detected by image segmentation based on a level set method. The results of the semi-automated analysis procedure are remapped onto a 3D model of the infrastructure obtained by means of a terrestrial laser scanning acquisition. The proposed method has been successfully tested on a portion of a road bridge in Perarolo di Cadore (BL), Italy.

  3. Tackling rare diseases at European level: why do we need a harmonized framework?

    PubMed

    Taruscio, Domenica; Trama, Annalisa; Stefanov, Rumen

    2007-01-01

    Since 1999 the European Commission has gradually developed a proactive approach towards rare diseases (RD). Despite the progress made over the last years, a comprehensive and evidence-based approach is still missing in many EU Member States (MS), leading to an incomplete and often inadequate framework for addressing rare diseases. Healthcare systems in EU MS differ to a great extent among countries with respect to their organization and funding. In general, they are not ready to face the specific problems and needs of people with rare diseases regarding possible prevention, timely diagnosis, adequate treatment and rehabilitation. Access to new advanced treatments and orphan drugs approved by the EMEA is also a big challenge for many MS. A public health approach is needed to properly tackle rare diseases. The idea of a comprehensive approach addressing the different challenges of rare diseases has been under discussion for a while. In our opinion, the first step in building a comprehensive approach is to properly plan the activities to undertake according to the needs, gaps and resources available in a country. It is therefore important to develop a strategic plan. Adopting a strategic planning approach to rare diseases implies taking advantage of ongoing actions and building on them to adjust, re-orient or expand the response. So far only France has developed a national strategic plan for rare diseases; Bulgaria is in the process of approving its national plan for RD and Spain is in the process of developing one. In this context, considering the importance of developing national plans for RD, it would be very useful to develop recommendations for RD national plan development in order to provide an instrument to support countries in designing their national plans.
The three MS initiatives presented in this paper confirmed the availability of great experience and expertise among many EU MS and supported the idea that all these different experiences available at the EU level should form the

  4. A Generic System-Level Framework for Self-Serve Health Monitoring System through Internet of Things (IoT).

    PubMed

    Ahmed, Mobyen Uddin; Björkman, Mats; Lindén, Maria

    2015-01-01

    Sensor data travel from sensors to a remote server, data are analyzed remotely in a distributed manner, and the health status of a user is presented in real time. This paper presents a generic system-level framework for a self-served health monitoring system through the Internet of Things (IoT) to facilitate efficient sensor data management. PMID:25980888

  5. A General Framework for Power Analysis to Detect the Moderator Effects in Two- and Three-Level Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Dong, Nianbo; Spybrook, Jessaca; Kelcey, Ben

    2016-01-01

    The purpose of this study is to propose a general framework for power analyses to detect the moderator effects in two- and three-level cluster randomized trials (CRTs). The study specifically aims to: (1) develop the statistical formulations for calculating statistical power, minimum detectable effect size (MDES) and its confidence interval to…
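
The quantities named above, statistical power and the minimum detectable effect size (MDES), can be illustrated for the main treatment effect in a simple two-level CRT. This uses the standard formulation from the CRT power literature with no covariates; the paper's moderator-effect formulas extend this form, and the parameter values below are illustrative.

```python
from scipy.stats import t

def mdes_2level_crt(J, n, rho, alpha=0.05, power=0.80, P=0.5):
    """Minimum detectable effect size (in SD units) for the main treatment
    effect in a 2-level cluster randomized trial with J clusters of size n,
    intraclass correlation rho, and proportion P of clusters treated."""
    df = J - 2                                    # degrees of freedom
    M = t.ppf(1 - alpha / 2, df) + t.ppf(power, df)  # multiplier
    var = (rho / (P * (1 - P) * J)
           + (1 - rho) / (P * (1 - P) * J * n))   # variance of the estimate
    return M * var ** 0.5

# e.g. 40 clusters of 20 students, ICC = 0.10, two-tailed test at 5%
es = mdes_2level_crt(J=40, n=20, rho=0.10)
```

Raising J shrinks both variance terms, while raising n only shrinks the second; this is the usual reason CRT design advice favors more clusters over larger ones.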

  6. Expected frontiers: Incorporating weather uncertainty into a policy analysis using an integrated bi-level multi-objective optimization framework

    EPA Science Inventory

    Weather is the main driver in both plant use of nutrients and fate and transport of nutrients in the environment. In previous work, we evaluated a green tax for control of agricultural nutrients in a bi-level optimization framework that linked deterministic models. In this study,...

  7. How Multi-Levels of Individual and Team Learning Interact in a Public Healthcare Organisation: A Conceptual Framework

    ERIC Educational Resources Information Center

    Doyle, Louise; Kelliher, Felicity; Harrington, Denis

    2016-01-01

    The aim of this paper is to review the relevant literature on organisational learning and offer a preliminary conceptual framework as a basis to explore how the multi-levels of individual learning and team learning interact in a public healthcare organisation. The organisational learning literature highlights a need for further understanding of…

  8. The Existence of Alternative Framework in Students' Scientific Imagination on the Concept of Matter at Submicroscopic Level: Macro Imagination

    ERIC Educational Resources Information Center

    Abdullah, Nurdiana; Surif, Johari

    2015-01-01

    This study was conducted with the purpose of identifying the alternative framework contained in students' imagination of the concept of matter at the submicroscopic level. Through a purposive sampling design, a total of 15 students were interviewed to obtain the data. Data from document analysis were used to strengthen the interviews.…

  9. Modeling Primary Breakup: A Three-Dimensional Eulerian Level Set/Vortex Sheet Method for Two-Phase Interface Dynamics

    NASA Technical Reports Server (NTRS)

    Herrmann, M.

    2003-01-01

    This paper is divided into four parts. First, the level set/vortex sheet method for three-dimensional two-phase interface dynamics is presented. Second, the LSS model for the primary breakup of turbulent liquid jets and sheets is outlined and all terms requiring subgrid modeling are identified. Then, preliminary three-dimensional results of the level set/vortex sheet method are presented and discussed. Finally, conclusions are drawn and an outlook to future work is given.

  10. Analysis of adequacy levels for human resources improvement within primary health care framework in Africa.

    PubMed

    Parent, Florence; Fromageot, Audrey; Coppieters, Yves; Lejeune, Colette; Lemenu, Dominique; Garant, Michèle; Piette, Danielle; Levêque, Alain; De Ketele, Jean-Marie

    2005-12-01

    Human resources in sub-Saharan African health care systems generally show a lack of adequacy between the skills expected of professionals and the health care needs expressed by the populations. It is, however, possible to analyse these various adequacy gaps in human resource management, and their determinants, to enhance the effectiveness of the health care system. Drawing on two projects focused on nursing professionals within the health care system in Central Africa, we present an analytic grid for adequacy levels covering the following aspects: adequacy between skills-based profiles for health system professionals, quality of care and service delivery (health care system/medical standards), and the needs and expectations of the populations; adequacy between the allocation of health system professionals, quality of care and services delivered, and the needs and expectations of the populations; adequacy between human resource management within the health care system and medical standards; adequacy between human resource management within education/teaching/training and the needs of the health care and education sectors; adequacy between basic and ongoing education and the realities of the tasks expected of and implemented by different categories of professionals within the health care system; and adequacy between the intentions of initial and ongoing training and teaching programmes in health sciences for trainers (teachers/supervisors/health care system professionals/directors of schools...). This tool is necessary for decision-makers as well as for health care system professionals who share common objectives for change at each level of intervention within the health system. Establishing this adequacy implies interdisciplinary and participative approaches for the actors concerned, in order to provide an overall vision of a system broader than the health district, a small island with its own rationality, and in which they

  11. Toward accurate tooth segmentation from computed tomography images using a hybrid level set model

    SciTech Connect

    Gan, Yangzhou; Zhao, Qunfei; Xia, Zeyang E-mail: jing.xiong@siat.ac.cn; Hu, Ying; Xiong, Jing E-mail: jing.xiong@siat.ac.cn; Zhang, Jianwei

    2015-01-15

    Purpose: A three-dimensional (3D) model of the teeth provides important information for orthodontic diagnosis and treatment planning. Tooth segmentation is an essential step in generating the 3D digital model from computed tomography (CT) images. The aim of this study is to develop an accurate and efficient tooth segmentation method for CT images. Methods: The 3D dental CT volumetric images are segmented slice by slice in a two-dimensional (2D) transverse plane. The 2D segmentation is composed of a manual initialization step and an automatic slice-by-slice segmentation step. In the manual initialization step, the user picks a starting slice and selects a seed point for each tooth in this slice. In the automatic segmentation step, the developed hybrid level set model is applied to segment the tooth contours in each slice, and a tooth contour propagation strategy is employed to initialize the level set function automatically. Cone beam CT (CBCT) images of two subjects were used to tune the parameters. Images of 16 additional subjects were used to validate the performance of the method. Volume overlap metrics and surface distance metrics were adopted to assess the segmentation accuracy quantitatively. The volume overlap metrics were volume difference (VD, mm³) and Dice similarity coefficient (DSC, %). The surface distance metrics were average symmetric surface distance (ASSD, mm), root mean square symmetric surface distance (RMSSSD, mm), and maximum symmetric surface distance (MSSD, mm). Computation time was recorded to assess the efficiency. The performance of the proposed method was compared with two state-of-the-art methods. Results: For the tested CBCT images, the VD, DSC, ASSD, RMSSSD, and MSSD for the incisor were 38.16 ± 12.94 mm³, 88.82 ± 2.14%, 0.29 ± 0.03 mm, 0.32 ± 0.08 mm, and 1.25 ± 0.58 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the canine were 49.12 ± 9.33 mm³, 91.57 ± 0.82%, 0.27 ± 0.02 mm, 0
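
    The volume overlap metrics named above are straightforward to compute. A minimal NumPy sketch, in which toy 2D masks stand in for the CT tooth segmentations and the function names are illustrative, not from the paper:

```python
import numpy as np

def dice_coefficient(seg, ref):
    """Dice similarity coefficient (%) between two binary masks."""
    seg = seg.astype(bool)
    ref = ref.astype(bool)
    intersection = np.logical_and(seg, ref).sum()
    return 200.0 * intersection / (seg.sum() + ref.sum())

def volume_difference(seg, ref, voxel_volume_mm3=1.0):
    """Absolute volume difference (mm^3) between two binary masks."""
    return abs(int(seg.sum()) - int(ref.sum())) * voxel_volume_mm3

# Toy example: two overlapping 25-voxel masks sharing a 16-voxel core.
seg = np.zeros((10, 10), dtype=bool)
ref = np.zeros((10, 10), dtype=bool)
seg[2:7, 2:7] = True
ref[3:8, 3:8] = True
print(round(dice_coefficient(seg, ref), 1))  # 64.0 (= 200*16/50)
print(volume_difference(seg, ref))           # 0.0 (equal volumes)
```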

  12. Colonic wall thickness using level sets for CT virtual colonoscopy visual assessment and polyp detection

    NASA Astrophysics Data System (ADS)

    Van Uitert, Robert L.; Summers, Ronald M.

    2007-03-01

    The detection of polyps in virtual colonoscopy is an active area of research. One of the critical elements in detecting cancerous polyps using virtual colonoscopy, especially in conjunction with computer-aided detection, is the accurate segmentation of the colon wall. The large CT attenuation difference between the lumen and inner, mucosal layer of the colon wall makes the segmentation of the lumen easily performed by traditional threshold segmentation techniques. However, determining the location of the colon outer wall is often difficult due to the low contrast difference between the colon wall's outer serosal layer and the fat surrounding the colon. We have developed an automatic, level set based method to determine from a CT colonography scan the location of the colon inner boundary and the colon outer wall boundary. From the location of the inner and outer colon wall boundaries, the wall thickness throughout the colon can be computed. Color mapping of the wall thickness on the colon surface allows for easy visual determination of potential regions of interest. Since the colon wall tends to be thicker at polyp locations, potential polyps also can be detected automatically at sites of increased colon wall thickness. This method was validated on several CT colonography scans containing optical colonoscopy-proven polyps. The method accurately determined thicker colonic wall regions in areas where polyps are present in the ground truth datasets and detected the polyps at a false positive rate between 44.4% and 82.8% lower than a state-of-the-art curvature-based method for initial polyp detection.
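
    Once the inner and outer wall boundaries are available, the thickness computation reduces to a distance query. A hedged sketch on a synthetic ring cross-section, where SciPy's Euclidean distance transform stands in for the paper's level set machinery and all names are illustrative:

```python
import numpy as np
from scipy import ndimage

# Synthetic 2D colon cross-section: lumen radius 5, outer wall radius 8,
# so the true wall thickness is about 3 pixels.
y, x = np.ogrid[-15:16, -15:16]
r = np.sqrt(x**2 + y**2)
lumen = r <= 5
outer = r <= 8          # lumen + wall (everything inside the outer boundary)

# Distance of every interior voxel to the nearest exterior voxel.
dist_to_exterior = ndimage.distance_transform_edt(outer)

# Inner boundary: lumen voxels adjacent to the wall.
inner_boundary = lumen & ~ndimage.binary_erosion(lumen)

# Wall thickness estimate sampled along the inner boundary; on real data
# this map would be color-coded onto the colon surface.
thickness = dist_to_exterior[inner_boundary]
print(round(thickness.mean(), 1))  # close to the nominal 3-pixel thickness
```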

  13. CT liver volumetry using geodesic active contour segmentation with a level-set algorithm

    NASA Astrophysics Data System (ADS)

    Suzuki, Kenji; Epstein, Mark L.; Kohlbrenner, Ryan; Obajuluwa, Ademola; Xu, Jianwu; Hori, Masatoshi; Baron, Richard

    2010-03-01

    Automatic liver segmentation on CT images is challenging because the liver often abuts other organs of similar density. Our purpose was to develop an accurate automated liver segmentation scheme for measuring liver volumes. We developed an automated volumetry scheme for the liver in CT based on five steps. First, an anisotropic smoothing filter was applied to portal-venous-phase CT images to remove noise while preserving the liver structure, followed by an edge enhancer to enhance the liver boundary. Using the boundary-enhanced image as a speed function, a fast-marching algorithm generated an initial surface that roughly estimated the liver shape. A geodesic active contour segmentation algorithm coupled with level-set contour evolution refined the initial surface to fit the liver boundary more precisely. The liver volume was calculated from the refined liver surface. Hepatic CT scans of eighteen prospective liver donors were obtained under a liver transplant protocol with a multi-detector CT system. Automated liver volumes were compared with those manually traced by a radiologist, used as the "gold standard." The mean liver volume obtained with our scheme was 1,520 cc, whereas the mean manual volume was 1,486 cc, with a mean absolute difference of 104 cc (7.0%). CT liver volumetry based on the automated scheme agreed excellently with the gold-standard manual volumetry (intra-class correlation coefficient 0.95), with no statistically significant difference (p(F<=f)=0.32), and required substantially less completion time. Our automated scheme provides an efficient and accurate way of measuring liver volumes.
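
    The role of the boundary-enhanced image as a speed function can be illustrated with a common edge-based speed term. A hedged sketch: Gaussian smoothing stands in for the paper's anisotropic filter, and the 1/(1+|∇I|²) form is one standard choice, not necessarily the one used in the study:

```python
import numpy as np
from scipy import ndimage

def edge_speed(image, sigma=2.0):
    """Edge-based speed function: ~1 in homogeneous regions, ~0 at strong
    boundaries, so a fast-marching front slows down at the organ edge.
    Gaussian smoothing is an illustrative stand-in for anisotropic filtering."""
    smoothed = ndimage.gaussian_filter(image.astype(float), sigma)
    gy, gx = np.gradient(smoothed)
    grad_mag = np.hypot(gx, gy)
    return 1.0 / (1.0 + grad_mag**2)

# Synthetic image: a bright "organ" on a dark background.
img = np.zeros((64, 64))
img[16:48, 16:48] = 100.0
F = edge_speed(img)
print(F.max() > 0.99)  # front moves freely in flat regions
print(F.min() < 0.1)   # front nearly stalls on the boundary
```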

  14. Free surface flow through rock-fill dams analyzed by FEM with level set approach

    NASA Astrophysics Data System (ADS)

    Sharif, N. H.; Wiberg, N.-E.; Levenstam, M.

    A stabilized finite element formulation is coupled with a level set technique to compute incompressible non-linear flow with interfaces between two immiscible fluids. An interface-capturing formulation (ICF) for non-linear free-surface seepage flow in rock-fill dams is proposed. The formulation is derived for two- and three-dimensional flow within a fixed mesh domain; the result is general and applicable to various steady and transient two-phase flow problems. FE refinement is performed over the entire fixed mesh domain. A general solver for large, non-symmetric, non-positive-definite linear systems of equations, using the GMRES update technique based on a Newton iterative method, is also reviewed. The computational procedure has been implemented in MATLAB. The 2-D test problem computed on coarse and refined meshes is compared with proposed analytical solutions for non-linear seepage flow with a free surface in rock-fill dams. An extension of the 2-D program code to 3-D for a rectangular rock-fill dam has also been developed and simulated in MATLAB. The performance of the 3-D computations is very promising and opens the way to industrial applications of the same simple technique. Computations for a simple 3-D free-surface seepage flow problem in a rock-fill dam are included in the present paper. A general mesh generator and solver for large-scale, complex 3-D flow problems in a real embankment dam is also under development in C++.
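
    The interface-capturing idea rests on transporting a level set function with the flow, i.e. solving φ_t + u·∇φ = 0 on a fixed mesh. A minimal finite-difference sketch, in which first-order upwinding stands in for the paper's stabilized FEM discretization and the setup is purely illustrative:

```python
import numpy as np

def advect_level_set(phi, u, v, dt, dx=1.0):
    """One explicit first-order upwind step of phi_t + u*phi_x + v*phi_y = 0,
    the level set transport equation that captures the free surface."""
    # One-sided differences chosen by the sign of the velocity (upwinding).
    phi_xm = (phi - np.roll(phi, 1, axis=1)) / dx   # backward in x
    phi_xp = (np.roll(phi, -1, axis=1) - phi) / dx  # forward in x
    phi_ym = (phi - np.roll(phi, 1, axis=0)) / dx
    phi_yp = (np.roll(phi, -1, axis=0) - phi) / dx
    phi_x = np.where(u > 0, phi_xm, phi_xp)
    phi_y = np.where(v > 0, phi_ym, phi_yp)
    return phi - dt * (u * phi_x + v * phi_y)

# Signed-distance interface at x = 10, advected right with unit speed.
n = 32
x = np.arange(n, dtype=float)
phi = np.tile(x - 10.0, (n, 1))          # zero level set at column 10
u = np.ones((n, n)); v = np.zeros((n, n))
for _ in range(5):
    phi = advect_level_set(phi, u, v, dt=1.0)
# With dt = dx = 1 the upwind scheme is exact: the interface is at x = 15.
print(phi[0, 15])  # 0.0
```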

  15. Multiatlas segmentation of thoracic and abdominal anatomy with level set-based local search.

    PubMed

    Schreibmann, Eduard; Marcus, David M; Fox, Tim

    2014-01-01

    Segmentation of organs at risk (OARs) remains one of the most time-consuming tasks in radiotherapy treatment planning. Atlas-based segmentation methods using single templates have emerged as a practical approach for automating the process for brain or head and neck anatomy, but pose significant challenges in regions with large interpatient variation. We show that significant changes are needed to autosegment thoracic and abdominal datasets, combining multi-atlas deformable registration with a level set-based local search. Segmentation is hierarchical: a first stage detects the bulk organ location, and a second step adapts the segmentation to the fine details present in the patient scan. The first stage warps multiple presegmented templates to the new patient anatomy using a multimodality deformable registration algorithm able to cope with changes in scanning conditions and artifacts. These segmentations are compacted into a probabilistic map of organ shape using the STAPLE algorithm. The final segmentation is obtained by adjusting the probability map for each organ type, using customized combinations of delineation filters that exploit prior knowledge of organ characteristics. Validation was performed by comparing automated and manual segmentation using the Dice coefficient, measured at an average of 0.971 for the aorta, 0.869 for the trachea, 0.958 for the lungs, 0.788 for the heart, 0.912 for the liver, 0.884 for the kidneys, 0.888 for the vertebrae, 0.863 for the spleen, and 0.740 for the spinal cord. Accurate atlas segmentation of abdominal and thoracic regions can thus be achieved using a multi-atlas and per-structure refinement strategy. To improve clinical workflow and efficiency, the algorithm was embedded in a software service that applies it automatically to acquired scans without any user interaction. PMID:25207393
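
    The fusion of warped atlas segmentations into a probability map can be illustrated with a simple per-voxel vote. In this sketch, plain averaging stands in for the EM-based STAPLE algorithm, and the toy masks and names are illustrative:

```python
import numpy as np

def fuse_atlas_labels(masks, threshold=0.5):
    """Fuse binary segmentations from several warped atlases into a
    probability map, then threshold it. A per-voxel vote is used here as a
    simple stand-in for STAPLE, which also estimates rater reliabilities."""
    prob = np.mean(np.stack(masks).astype(float), axis=0)
    return prob, prob >= threshold

# Three toy atlas segmentations of the same organ, slightly shifted,
# mimicking imperfect deformable registrations.
base = np.zeros((8, 8), dtype=bool)
base[2:6, 2:6] = True
atlases = [np.roll(base, s, axis=1) for s in (-1, 0, 1)]
prob, fused = fuse_atlas_labels(atlases)
print(prob[3, 3])       # 1.0 where all atlases agree
print(fused.sum() > 0)  # the consensus region is non-empty
```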

  16. 3D mapping of airway wall thickening in asthma with MSCT: a level set approach

    NASA Astrophysics Data System (ADS)

    Fetita, Catalin; Brillet, Pierre-Yves; Hartley, Ruth; Grenier, Philippe A.; Brightling, Christopher

    2014-03-01

    Assessing airway wall thickness in multislice computed tomography (MSCT) as an image marker for airway disease phenotyping, such as asthma and COPD, is a current trend and challenge for the scientific community working in lung imaging. This paper addresses the problem from a different point of view: taking the expected wall-thickness-to-lumen-radius ratio of a normal subject as known and constant throughout the whole airway tree, the aim is to build a 3D map of airway wall regions of larger thickness and to define an overall score able to highlight a pathological status. To this end, the local dimension (caliber) of the previously segmented airway lumen is obtained at each point using the granulometry morphological operator. A level set function defined from this caliber information and the expected wall thickness ratio yields a good estimate of the airway wall throughout all segmented lumen generations. Next, regions in contact with vessels (or mediastinal dense tissue) are automatically detected and excluded from the analysis. For the remaining airway wall border points, the real wall thickness is estimated from a tissue density analysis along the airway radial direction; thick wall points are highlighted on a 3D representation of the airways and several quantification scores are defined. The proposed approach is fully automatic and was evaluated (proof of concept) on a patient selection drawn from different databases, including mild and severe asthmatics and normal cases. This preliminary evaluation confirms the discriminative power of the approach with respect to different phenotypes, and the evaluation is currently being extended to larger cohorts.
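
    The granulometry-based caliber estimate can be sketched with morphological openings at increasing radii: a point's caliber is the largest structuring element that still covers it after opening. A toy 2D illustration with SciPy; the function name, the synthetic "airway" mask, and the parameters are assumptions, not from the paper:

```python
import numpy as np
from scipy import ndimage

def local_caliber(mask, max_radius=10):
    """Granulometry-style caliber map: for each foreground pixel, the largest
    disk radius whose morphological opening still covers it."""
    caliber = np.zeros(mask.shape, dtype=int)
    for r in range(1, max_radius + 1):
        y, x = np.ogrid[-r:r + 1, -r:r + 1]
        disk = x**2 + y**2 <= r**2
        opened = ndimage.binary_opening(mask, structure=disk)
        caliber[opened] = r  # overwritten while ever-larger disks still fit
    return caliber

# Two synthetic "airway" cross-sections: a wide tube and a narrow one.
mask = np.zeros((20, 40), dtype=bool)
mask[4:14, :] = True    # wide branch, 10 pixels across
mask[16:19, :] = True   # narrow branch, 3 pixels across
cal = local_caliber(mask)
print(cal[9, 20] > cal[17, 20])  # the wide branch has the larger caliber
```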

  17. Volume analysis of treatment response of head and neck lesions using 3D level set segmentation

    NASA Astrophysics Data System (ADS)

    Hadjiiski, Lubomir; Street, Ethan; Sahiner, Berkman; Gujar, Sachin; Ibrahim, Mohannad; Chan, Heang-Ping; Mukherji, Suresh K.

    2008-03-01

    A computerized system for segmenting lesions in head and neck CT scans was developed to assist radiologists in estimation of the response to treatment of malignant lesions. The system performs 3D segmentations based on a level set model and uses as input an approximate bounding box for the lesion of interest. In this preliminary study, CT scans from a pre-treatment exam and a post one-cycle chemotherapy exam of 13 patients containing head and neck neoplasms were used. A radiologist marked 35 temporal pairs of lesions. 13 pairs were primary site cancers and 22 pairs were metastatic lymph nodes. For all lesions, a radiologist outlined a contour on the best slice on both the pre- and post-treatment scans. For the 13 primary lesion pairs, full 3D contours were also extracted by a radiologist. The average pre- and post-treatment areas on the best slices for all lesions were 4.5 and 2.1 cm², respectively. For the 13 primary site pairs the average pre- and post-treatment primary lesion volumes were 15.4 and 6.7 cm³, respectively. The correlation between the automatic and manual estimates for the pre-to-post-treatment change in area for all 35 pairs was r=0.97, while the correlation for the percent change in area was r=0.80. The correlation for the change in volume for the 13 primary site pairs was r=0.89, while the correlation for the percent change in volume was r=0.79. The average signed percent error between the automatic and manual areas for all 70 lesions was 11.0 ± 20.6%. The average signed percent error between the automatic and manual volumes for all 26 primary lesions was 37.8 ± 42.1%. The preliminary results indicate that the automated segmentation system can reliably estimate tumor size change in response to treatment relative to radiologist's hand segmentation.
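
    The treatment-response measure reported here is a plain percent change. A one-line check against the mean primary-lesion volumes quoted in the abstract (the function name is illustrative):

```python
def percent_change(pre, post):
    """Pre- to post-treatment percent change used to assess response."""
    return 100.0 * (post - pre) / pre

# Mean pre/post volumes reported for the 13 primary-site pairs (cm^3).
print(round(percent_change(15.4, 6.7), 1))  # -56.5
```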

  18. Initial condition for efficient mapping of level set algorithms on many-core architectures

    NASA Astrophysics Data System (ADS)

    Tornai, Gábor János; Cserey, György

    2014-12-01

    In this paper, we investigated the effect of adding more small curves to the initial condition, which determines the required number of iterations of a fast level set (LS) evolution. As a result, we discovered two new theorems and proved a bound on the worst-case number of required iterations. Furthermore, we found that these kinds of initial conditions map well onto many-core architectures. To show this, we include two case studies on different platforms: one runs on a graphical processing unit (GPU) and the other is executed on a cellular nonlinear network universal machine (CNN-UM). With the new initial conditions, the steady-state solution of the LS is reached in fewer than eight iterations, depending on the granularity of the initial condition. As the two case studies show, these dense iterations can be computed very quickly on many-core platforms. With the proposed dense initial condition on GPU, there is a significant speedup over the sparse initial condition in all cases, since the dense initial condition together with the algorithm exploits the properties of the underlying architecture; a performance gain of up to 18 times over the sparse initial condition on GPU is achieved. Additionally, we validated our concept against numerically approximated LS evolution of standard flows (mean curvature, Chan-Vese, geodesic active regions). The Dice indices between the fast LS evolutions and the evolutions of the numerically approximated partial differential equations are in the range 0.99 ± 0.003.
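
    The dense-initial-condition idea can be sketched directly: seeding the level set with a regular grid of small circles bounds the worst-case distance any front must travel by the seed spacing rather than the image size. A toy illustration in which the spacing and radius are arbitrary choices, not the paper's:

```python
import numpy as np
from scipy import ndimage

def dense_initial_condition(shape, spacing=8, radius=2):
    """Dense initial condition: a regular grid of small seed circles.
    No image point is farther than ~spacing from a seed, so a fast level set
    front reaches steady state in a few iterations regardless of image size."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    init = np.zeros(shape, dtype=bool)
    for cy in range(spacing // 2, shape[0], spacing):
        for cx in range(spacing // 2, shape[1], spacing):
            init |= (yy - cy)**2 + (xx - cx)**2 <= radius**2
    return init

init = dense_initial_condition((64, 64))
# Worst-case distance from any pixel to the nearest seed is bounded by the
# seed spacing, not by the 64-pixel image size.
dist = ndimage.distance_transform_edt(~init)
print(dist.max() < 8)  # True for spacing=8
```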

  19. Birth choices in Timor-Leste: a framework for understanding the use of maternal health services in low resource settings.

    PubMed

    Wild, Kayli; Barclay, Lesley; Kelly, Paul; Martins, Nelson

    2010-12-01

    The high rate of maternal mortality in Timor-Leste is a persistent problem which has been exacerbated by the long history of military occupation and ongoing political crises since independence in 1999. It is similar to other developing countries where there have been slow declines in maternal mortality despite 20 years of Safe Motherhood interventions. The national Ministry of Health, United Nations (UN) agencies and non-government organisations (NGOs) have attempted to reduce maternal mortality by enacting policies and interventions to increase the number of births in health centres and hospitals. Despite considerable effort in promoting facility-based delivery, most Timorese women birth at home and the lack of midwives means few women have access to a skilled birth attendant. This paper investigates factors influencing access to and use of maternal health services in rural areas of Timor-Leste. It draws on 21 interviews and 11 group discussions with Timorese women and their families collected over two periods of fieldwork, one month in September 2006 and five months from July to December 2007. Theoretical concepts from anthropology and health social science are used to explore individual, social, political and health system issues which affect the way in which maternal health services are utilised. In drawing together a range of theories this paper aims to extend explanations around access to maternal health services in developing countries. An empirically informed framework is proposed which illustrates the complex factors that influence women's birth choices. This framework can be used by policy-makers, practitioners, donors and researchers to think critically about policy decisions and where investments can have the most impact for improving maternal health in Timor-Leste and elsewhere. PMID:20971540

  20. Expert Consensus on the Rehabilitation Framework Guiding a Model of Care for People Living With HIV in a South African Setting.

    PubMed

    Chetty, Verusia; Hanass-Hancock, Jill; Myezwa, Hellen

    2016-01-01

    Disabilities and treatments related to HIV are a focus for rehabilitation professionals in HIV-endemic countries, yet these countries lack guidance to integrate rehabilitation into a model of care for people living with HIV. We asked HIV and rehabilitation experts in South Africa to engage in a modified Delphi survey based on findings from (a) an enquiry into stakeholder perspectives of a context-specific rehabilitation framework at a semi-rural setting and (b) an analysis of international models of care-guiding rehabilitation. Consensus was determined by an a priori threshold of 70% of agreement and interquartile range (≤ 1 on criterion) to be included as essential or useful in the model of care framework. Experts agreed that improving access to care, optimal communication between stakeholders, education and training for health care workers, and home-based rehabilitation were essential for the model. Furthermore, task shifting and evidence-based practice were seen as fundamental for optimal care. PMID:26585032

  1. Fostering Multirepresentational Levels of Chemical Concepts: A Framework to Develop Educational Software

    ERIC Educational Resources Information Center

    Marson, Guilherme A.; Torres, Bayardo B.

    2011-01-01

    This work presents a convenient framework for developing interactive chemical education software to facilitate the integration of macroscopic, microscopic, and symbolic dimensions of chemical concepts--specifically, via the development of software for gel permeation chromatography. The instructional role of the software was evaluated in a study…

  2. A conceptual framework for advanced practice nursing in a pediatric tertiary care setting: the SickKids' experience.

    PubMed

    LeGrow, Karen; Hubley, Pam; McAllister, Mary

    2010-05-01

    Advanced practice nurses (APNs) at The Hospital for Sick Children (SickKids) are pediatric healthcare providers who integrate principles and theories of advanced nursing with specialty knowledge to provide autonomous, independent, accountable, ethical and developmentally appropriate care in complex, often ambiguous and rapidly changing healthcare environments. Caring for children and adolescents requires culturally sensitive and family-centred approaches to care that incorporate a unique body of knowledge. Family-centred care is an approach to planning, delivery and evaluation of healthcare that is governed by the establishment of mutually beneficial partnerships among APNs, health professionals and children/families. The cornerstone of APN practice at SickKids is the recognition of "family" as the recipients of care. By valuing and developing relationships with families, APNs promote excellence in healthcare across the care continuum to optimize the child's and family's physical, emotional, social, psychological and spiritual well-being. This paper outlines the evolution of advanced practice nursing at SickKids, beginning with the introduction of APN roles in the 1970s and culminating in the current critical mass of APNs who have been integrated throughout the hospital's infrastructure. We describe the process used to create a common vision and a framework to guide pediatric advanced nursing practice. PMID:20530994

  3. Strengthening fairness, transparency and accountability in health care priority setting at district level in Tanzania

    PubMed Central

    Maluka, Stephen Oswald

    2011-01-01

    Health care systems are faced with the challenge of resource scarcity and have insufficient resources to respond to all health problems and target groups simultaneously. Hence, priority setting is an inevitable aspect of every health system. However, priority setting is complex and difficult because the process is frequently influenced by political, institutional and managerial factors that are not considered by conventional priority-setting tools. In a five-year EU-supported project, which started in 2006, ways of strengthening fairness and accountability in priority setting in district health management were studied. This review is based on a PhD thesis that aimed to analyse health care organisation and management systems, and to explore the potential and challenges of implementing the Accountability for Reasonableness (A4R) approach to priority setting in Tanzania. A qualitative case study in Mbarali district formed the basis for exploring the sociopolitical and institutional contexts within which health care decision making takes place. The study also explores how the A4R intervention was shaped, enabled and constrained by these contexts. Key informant interviews were conducted, relevant documents were gathered, and group priority-setting processes in the district were observed. The study revealed that, despite the obvious national rhetoric on decentralisation, actual practice in the district involved little community participation. The assumption that devolution to local government promotes transparency, accountability and community participation is far from reality. The study also found that while the A4R approach was perceived to be helpful in strengthening transparency, accountability and stakeholder engagement, integrating the innovation into the district health system was challenging. This study underscores the idea that greater involvement and accountability among local actors may increase the legitimacy and fairness of priority-setting decisions. 
A broader

  4. Strengthening fairness, transparency and accountability in health care priority setting at district level in Tanzania.

    PubMed

    Maluka, Stephen Oswald

    2011-01-01

    Health care systems are faced with the challenge of resource scarcity and have insufficient resources to respond to all health problems and target groups simultaneously. Hence, priority setting is an inevitable aspect of every health system. However, priority setting is complex and difficult because the process is frequently influenced by political, institutional and managerial factors that are not considered by conventional priority-setting tools. In a five-year EU-supported project, which started in 2006, ways of strengthening fairness and accountability in priority setting in district health management were studied. This review is based on a PhD thesis that aimed to analyse health care organisation and management systems, and to explore the potential and challenges of implementing the Accountability for Reasonableness (A4R) approach to priority setting in Tanzania. A qualitative case study in Mbarali district formed the basis for exploring the sociopolitical and institutional contexts within which health care decision making takes place. The study also explores how the A4R intervention was shaped, enabled and constrained by these contexts. Key informant interviews were conducted, relevant documents were gathered, and group priority-setting processes in the district were observed. The study revealed that, despite the obvious national rhetoric on decentralisation, actual practice in the district involved little community participation. The assumption that devolution to local government promotes transparency, accountability and community participation is far from reality. The study also found that while the A4R approach was perceived to be helpful in strengthening transparency, accountability and stakeholder engagement, integrating the innovation into the district health system was challenging. This study underscores the idea that greater involvement and accountability among local actors may increase the legitimacy and fairness of priority-setting decisions. 
A broader

  5. SET overexpression in HEK293 cells regulates mitochondrial uncoupling proteins levels within a mitochondrial fission/reduced autophagic flux scenario

    SciTech Connect

    Almeida, Luciana O.; Goto, Renata N.; Neto, Marinaldo P.C.; Sousa, Lucas O.; Curti, Carlos; Leopoldino, Andréia M.

    2015-03-06

    We hypothesized that SET, a protein accumulated in some cancer types and Alzheimer disease, is involved in cell death through mitochondrial mechanisms. We addressed the mRNA and protein levels of the mitochondrial uncoupling proteins UCP1, UCP2 and UCP3 (S and L isoforms) by quantitative real-time PCR and immunofluorescence as well as other mitochondrial involvements, in HEK293 cells overexpressing the SET protein (HEK293/SET), either in the presence or absence of oxidative stress induced by the pro-oxidant t-butyl hydroperoxide (t-BHP). SET overexpression in HEK293 cells decreased UCP1 and increased UCP2 and UCP3 (S/L) mRNA and protein levels, whilst also preventing lipid peroxidation and decreasing the content of cellular ATP. SET overexpression also (i) decreased the area of mitochondria and increased the number of organelles and lysosomes, (ii) increased mitochondrial fission, as demonstrated by increased FIS1 mRNA and FIS-1 protein levels, an apparent accumulation of DRP-1 protein, and an increase in the VDAC protein level, and (iii) reduced autophagic flux, as demonstrated by a decrease in LC3B lipidation (LC3B-II) in the presence of chloroquine. Therefore, SET overexpression in HEK293 cells promotes mitochondrial fission and reduces autophagic flux in apparent association with up-regulation of UCP2 and UCP3; this implies a potential involvement in cellular processes that are deregulated such as in Alzheimer's disease and cancer. - Highlights: • SET, UCPs and autophagy prevention are correlated. • SET action has mitochondrial involvement. • UCP2/3 may reduce ROS and prevent autophagy. • SET protects cell from ROS via UCP2/3.

  6. Modelling Molecular Mechanisms: A Framework of Scientific Reasoning to Construct Molecular-Level Explanations for Cellular Behaviour

    NASA Astrophysics Data System (ADS)

    van Mil, Marc H. W.; Boerwinkel, Dirk Jan; Waarlo, Arend Jan

    2013-01-01

    Although molecular-level details are part of the upper-secondary biology curriculum in most countries, many studies report that students fail to connect molecular knowledge to phenomena at the level of cells, organs and organisms. Recent studies suggest that students lack a framework to reason about complex systems to make this connection. In this paper, we present a framework that could help students to reason back and forth between cells and molecules. It represents both the general type of explanation in molecular biology and the research strategies scientists use to find these explanations. We base this framework on recent work in the philosophy of science that characterizes explanations in molecular biology as mechanistic explanations. Mechanistic explanations describe a phenomenon in terms of the entities involved, the activities displayed and the way these entities and activities are organized. We conclude that to describe cellular phenomena scientists use entities and activities at multiple levels between cells and molecules. In molecular biological research, scientists use heuristics based on these intermediate levels to construct mechanistic explanations. They subdivide a cellular activity into hypothetical lower-level activities (top-down approaches) and they predict and test the organization of macromolecules into functional modules that play a role in higher-level activities (bottom-up approaches). We suggest including molecular mechanistic reasoning in biology education and we identify criteria for designing such education. Education using molecular mechanistic reasoning can build on common intuitive reasoning about mechanisms. The heuristics that scientists use can help students to apply this intuitive notion to the levels in between molecules and cells.

  7. A conditioned level-set method with block-division strategy to flame front extraction based on OH-PLIF measurements

    NASA Astrophysics Data System (ADS)

    Han, Yue; Cai, Guo-Biao; Xu, Xu; Renou, Bruno; Boukhalfa, Abdelkrim

    2014-05-01

    A novel approach to extracting flame fronts, called the conditioned level-set method with block division (CLSB), has been developed. Based on a two-phase level-set formulation, conditioned initialization and region-lock optimization improve the efficiency and accuracy of flame contour identification. The block-division strategy makes the approach unsupervised: local self-adaptive threshold values are calculated autonomously before binarization. The CLSB approach has been applied to a large set of experimental data involving swirl-stabilized premixed combustion in diluted regimes at atmospheric pressure, with OH-PLIF measurements carried out in this framework. The resulting images therefore feature lower signal-to-noise ratios (SNRs) than the ideal image; relatively complex flame structures lead to significant non-uniformity in the OH signal intensity; and the magnitude of the maximum OH gradient observed along the flame front can vary with the flow or local stoichiometry. Compared with conventional edge detection operators, the CLSB method demonstrates a good ability to handle OH-PLIF images at low SNR and with multiple scales of both OH intensity and OH gradient. Its robustness to noise and intensity inhomogeneity has been evaluated across a range of experimental images of diluted flames, as well as against a circle test as ground truth (GT).
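
    The block-division thresholding step can be sketched as follows. This is a minimal version in which the per-block mean stands in for the CLSB threshold rule (which the abstract does not specify); the synthetic image and all parameters are illustrative:

```python
import numpy as np

def block_threshold(image, block=32):
    """Block-division thresholding: split the image into blocks and apply a
    self-adaptive threshold per block, so a non-uniform background does not
    defeat a single global threshold."""
    h, w = image.shape
    binary = np.zeros_like(image, dtype=bool)
    for y0 in range(0, h, block):
        for x0 in range(0, w, block):
            tile = image[y0:y0 + block, x0:x0 + block]
            binary[y0:y0 + block, x0:x0 + block] = tile > tile.mean()
    return binary

# Synthetic OH-PLIF-like image: a bright flame ridge over a background
# whose intensity drifts across the frame, plus noise.
rng = np.random.default_rng(0)
img = np.linspace(0, 50, 64)[None, :] * np.ones((64, 1))  # intensity drift
img[28:36, :] += 100.0                                    # flame front
img += rng.normal(0, 2.0, img.shape)                      # noise
mask = block_threshold(img, block=32)
print(mask[30, 10] and mask[30, 60])  # front found on dim and bright sides
```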

  8. Farm Level--Setting Up and Using the Tripod Level, Staking Out Foundations, Differential Leveling, and Staking Out Fence Lines. Student Materials. V.A. III. V-E-1, V-E-2.

    ERIC Educational Resources Information Center

    Texas A and M Univ., College Station. Vocational Instructional Services.

    Designed for use by individuals enrolled in vocational agricultural classes, these student materials deal with setting up and using a tripod level, staking out foundations, differential leveling, and staking out fence lines. Topics covered in the unit are different kinds of tripod levels, the parts of a tripod level, transporting a tripod level,…

  9. Basis set limit geometries for ammonia at the SCF and MP2 levels of theory

    NASA Technical Reports Server (NTRS)

    Defrees, D. J.; Mclean, A. D.

    1984-01-01

    The controversy over the Hartree-Fock bond angle of NH3 is resolved, and the convergence of the molecule's geometry as the basis set is systematically improved is examined with both SCF and correlated MP2 wave functions. The results of the geometry optimizations, carried out in four stages with a series of uncontracted basis sets, are shown. The structure obtained for NH3 supports the finding of Radom and Rodwell (1980) that the Hartree-Fock limit angle is significantly greater than was previously believed.

  10. Comparing Panelists' Understanding of Standard Setting across Multiple Levels of an Alternate Science Assessment

    ERIC Educational Resources Information Center

    Hansen, Mary A.; Lyon, Steven R.; Heh, Peter; Zigmond, Naomi

    2013-01-01

    Large-scale assessment programs, including alternate assessments based on alternate achievement standards (AA-AAS), must provide evidence of technical quality and validity. This study provides information about the technical quality of one AA-AAS by evaluating the standard setting for the science component. The assessment was designed to have…

  11. Optimal Design in Three-Level Block Randomized Designs with Two Levels of Nesting: An ANOVA Framework with Random Effects

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2013-01-01

    Large-scale experiments that involve nested structures may assign treatment conditions either to subgroups such as classrooms or to individuals such as students within subgroups. Key aspects of the design of such experiments include knowledge of the variance structure in higher levels and the sample sizes necessary to reach sufficient power to…

  12. Introduction of new technologies and decision making processes: a framework to adapt a Local Health Technology Decision Support Program for other local settings

    PubMed Central

    Poulin, Paule; Austen, Lea; Scott, Catherine M; Poulin, Michelle; Gall, Nadine; Seidel, Judy; Lafrenière, René

    2013-01-01

    Purpose Introducing new health technologies, including medical devices, into a local setting in a safe, effective, and transparent manner is a complex process, involving many disciplines and players within an organization. Decision making should be systematic, consistent, and transparent. It should involve translating and integrating scientific evidence, such as health technology assessment (HTA) reports, with context-sensitive evidence to develop recommendations on whether and under what conditions a new technology will be introduced. However, the development of a program to support such decision making can require considerable time and resources. An alternative is to adapt a preexisting program to the new setting. Materials and methods We describe a framework for adapting the Local HTA Decision Support Program, originally developed by the Department of Surgery and Surgical Services (Calgary, AB, Canada), for use by other departments. The framework consists of six steps: 1) development of a program review and adaptation manual, 2) education and readiness assessment of interested departments, 3) evaluation of the program by individual departments, 4) joint evaluation via retreats, 5) synthesis of feedback and program revision, and 6) evaluation of the adaptation process. Results Nine departments revised the Local HTA Decision Support Program and expressed strong satisfaction with the adaptation process. Key elements for success were identified. Conclusion Adaptation of a preexisting program may reduce duplication of effort, save resources, raise the health care providers’ awareness of HTA, and foster constructive stakeholder engagement, which enhances the legitimacy of evidence-informed recommendations for introducing new health technologies. We encourage others to use this framework for program adaptation and to report their experiences. PMID:24273415

  13. Contextual Effects in an Educational Setting: An Example of Level Three Research.

    ERIC Educational Resources Information Center

    Sears, Constance; Husak, William S.

    A systematic three-level ("Level 3") approach to research in the motor behavior area was used to investigate the influence of varying degrees of contextual interference in the acquisition of volleyball serving skills. One hundred and twenty-eight middle school subjects learned three volleyball serves during a 3-week long unit in a physical…

  14. Setting Us Free? Building Meaningful Models of Progression for a "Post-Levels" World

    ERIC Educational Resources Information Center

    Ford, Alex

    2014-01-01

    Alex Ford was thrilled by the prospect of freedom offered to history departments in England by the abolition of level descriptions within the National Curriculum. After analysing the range of competing purposes that the level descriptions were previously forced to serve, Ford argues that the three distinct tasks of measuring current attainment,…

  15. Identification of framework residues in a secreted recombinant antibody fragment that control production level and localization in Escherichia coli.

    PubMed

    Forsberg, G; Forsgren, M; Jaki, M; Norin, M; Sterky, C; Enhörning, A; Larsson, K; Ericsson, M; Björk, P

    1997-05-01

    The monoclonal antibody 5T4, directed against a human tumor-associated antigen, was expressed as a secreted Fab superantigen fusion protein in Escherichia coli. The product is a putative agent for immunotherapy of non-small cell lung cancer. During fermentation, most of the fusion protein leaked out from the periplasm to the growth medium at a level of approximately 40 mg/liter. This level was notably low compared with similar products containing identical CH1, CL, and superantigen moieties, and the Fv framework was therefore engineered. Using hybrid molecules, the light chain was found to limit high expression levels. Substituting five residues in VL increased the level almost 15 times, exceeding 500 mg/liter in the growth medium. Here, the substitutions Phe-10 --> Ser, Thr-45 --> Lys, Thr-77 --> Ser, and Leu-78 --> Val were most powerful. In addition, replacing four VH residues diminished cell lysis during fermentation. Thereby the product was preferentially located in the periplasm instead of the growth medium, and the total yield was more than 700 mg/liter. All engineered products retained a high affinity for the tumor-associated antigen. It is suggested that at least some of the identified framework residues generally have to be replaced to obtain high level production of recombinant Fab products in E. coli. PMID:9139690

  16. A framework for leveling informatics content across four years of a Bachelor of Science in Nursing (BSN) curriculum.

    PubMed

    Frisch, Noreen; Borycki, Elizabeth

    2013-01-01

    While there are several published statements of the nursing informatics competencies needed by the Bachelor of Science in Nursing (BSN) graduate, faculty at schools of nursing have little guidance on how to incorporate the teaching of such competencies into curricula that are already overloaded with required content. The authors present a framework for addressing nursing informatics content within teaching plans that already exist in virtually all BSN programs. The framework is based on an organization of curriculum content that moves the learner from elementary to complex nursing concepts and ideas as a means to level the content. Further, the framework is organized around four broad content areas included in all curricula: professional responsibility, care delivery, community and population-based nursing, and leadership/management. Examples of informatics content to be addressed at each level and content area are provided. Lastly, a practice-appraisal tool, the UVIC Informatics Practice Appraisal - BSN, is presented as a means to track student learning and outcomes across the four years of a BSN program. PMID:23388314

  17. Effects of Facility Developments and Encounter Levels on Perceptions of Settings, Crowding, and Norms in a Korean Park

    NASA Astrophysics Data System (ADS)

    Kim, Sang-Oh; Shelby, Bo; Needham, Mark D.

    2014-02-01

    This article examines potential effects of two physical developments (presence or absence of an aerial tramway, a road vs. a trail) and one social variable (increasing encounters with other people) on individuals' perceptions of settings (i.e., perceived settings), crowding, and acceptance of encounters (i.e., norms) in Mudeungsan Provincial Park in South Korea, where there have been proposals for a new aerial tramway. Data were obtained from 241 students at Chonnam National University, almost all of whom had previously visited this park (e.g., 66 % visited at least one of the two study locations in this park, 55 % visited this park in the past 12 months). Simulated photographs showed encounter levels (1 or 15 hikers), the presence or absence of a tramway, and a road versus a trail. Respondents encountering low numbers of other people felt less crowded, considered these use levels to be more acceptable, and perceived the area as more pristine and less developed. Locations containing an aerial tramway were perceived as more developed and less natural, and higher encounter levels were considered to be more acceptable at these locations. Whether settings contained a road or a trail did not influence perceived settings, crowding, or norms. Implications of these findings for future research and management of parks and related outdoor settings are discussed.

  18. Clinical-Education-Setting Standards Are Helpful in the Professional Preparation of Employed, Entry-Level Certified Athletic Trainers.

    PubMed

    Laurent, Tim; Weidner, Thomas G

    2002-12-01

    OBJECTIVE: To determine the helpfulness of clinical-education-setting standards in the professional preparation of entry-level certified athletic trainers. DESIGN AND SETTING: We developed a 22-item questionnaire based on the 12 standards presented by Weidner and Laurent. Subjects used a Likert scale (0 = no help, 5 = very helpful) to indicate their perceptions of the helpfulness of each standard in preparing them for their roles and responsibilities as certified athletic trainers. SUBJECTS: We surveyed employed, entry-level certified athletic trainers who recently completed Commission on Accreditation of Allied Health Education Programs-accredited athletic training education programs. MEASUREMENTS: Percentage means were computed for the helpfulness ratings of each standard. A percentage mean was computed for the overall contribution of clinical education to professional development. Chi-square analyses were used to assess the differences in helpfulness ratings among respondents. RESULTS: The overall mean score across all standards was 4.17. No significant differences in the helpfulness ratings of any of the respondents were noted regardless of sex, ethnicity, number of clinical-education hours, total semesters of clinical education, settings in which students gained clinical experience, or current employment. CONCLUSIONS: Clinical-education-setting standards are helpful and should be applied to all settings. Varying standards do not need to be imposed on our different athletic training clinical-education settings. PMID:12937553

  19. CMS software architecture. Software framework, services and persistency in high level trigger, reconstruction and analysis

    NASA Astrophysics Data System (ADS)

    Innocente, V.; Silvestris, L.; Stickland, D.; CMS Software Group

    2001-10-01

    This paper describes the design of a resilient and flexible software architecture that has been developed to satisfy the data processing requirements of a large HEP experiment, CMS, currently being constructed at the LHC machine at CERN. We describe various components of a software framework that allows integration of physics modules and which can be easily adapted for use in different processing environments both real-time (online trigger) and offline (event reconstruction and analysis). Features such as the mechanisms for scheduling algorithms, configuring the application and managing the dependences among modules are described in detail. In particular, a major effort has been placed on providing a service for managing persistent data and the experience using a commercial ODBMS (Objectivity/DB) is therefore described in detail.

  20. Robust Framework to Combine Diverse Classifiers Assigning Distributed Confidence to Individual Classifiers at Class Level

    PubMed Central

    Arshad, Sannia; Rho, Seungmin

    2014-01-01

    We present a classification framework that combines multiple heterogeneous classifiers in the presence of class label noise. An extension of m-Mediods based modeling is presented that generates models of various classes while identifying and filtering noisy training data. This noise-free data is then used to learn models for other classifiers such as GMM and SVM. A weight learning method is introduced to learn weights on each class for the different classifiers to construct an ensemble. For this purpose, we apply a genetic algorithm to search for an optimal weight vector on which the classifier ensemble is expected to give the best accuracy. The proposed approach is evaluated on a variety of real-life datasets and compared with existing standard ensemble techniques such as AdaBoost, Bagging, and Random Subspace Methods. Experimental results show the superiority of the proposed ensemble method over its competitors, especially in the presence of class label noise and imbalanced classes. PMID:25295302
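
    The class-level weighting described above can be sketched as a weighted combination of classifier posteriors, where each classifier carries one weight per class. The weights below are hypothetical placeholders for the GA-learned weight vector, and the posteriors are synthetic.

```python
import numpy as np

def weighted_ensemble(probas, W):
    """Combine classifier posteriors with per-class weights.

    probas: (n_classifiers, n_samples, n_classes) class posteriors
    W:      (n_classifiers, n_classes) per-class weights (e.g. GA-learned)
    Predicted class = argmax_k sum_c W[c, k] * P_c(k | x)."""
    scores = np.einsum('cnk,ck->nk', probas, W)
    return scores.argmax(axis=1)

probas = np.array([
    [[0.9, 0.1], [0.4, 0.6]],   # classifier 0 (e.g. m-Mediods based)
    [[0.2, 0.8], [0.3, 0.7]],   # classifier 1 (e.g. GMM or SVM)
])
W = np.array([[1.0, 0.5],       # classifier 0 trusted more on class 0
              [0.5, 1.0]])      # classifier 1 trusted more on class 1
pred = weighted_ensemble(probas, W)
```

A genetic algorithm would then search over W (one gene per classifier-class pair) for the weight matrix maximizing ensemble accuracy on a validation set.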

  1. Embracing a Common Focus: A Framework for Middle Level Teacher Preparation

    ERIC Educational Resources Information Center

    Faulkner, Shawn A.; Howell, Penny B.; Cook, Chris M.

    2013-01-01

    As more and more states make a commitment to specialized middle level teacher preparation, teacher education programs across the country must make the necessary adjustments to ensure middle level teachers are prepared to be successful. Unfortunately, individual state and institutional requirements often make this challenging and can result in…

  2. Homelessness Outcome Reporting Normative Framework: Systems-Level Evaluation of Progress in Ending Homelessness

    ERIC Educational Resources Information Center

    Austen, Tyrone; Pauly, Bernie

    2012-01-01

    Homelessness is a serious and growing issue. Evaluations of systemic-level changes are needed to determine progress in reducing or ending homelessness. The report card methodology is one means of systems-level assessment. Rather than solely establishing an enumeration, homelessness report cards can capture pertinent information about structural…

  3. Wave energy level and geographic setting correlate with Florida beach water quality.

    PubMed

    Feng, Zhixuan; Reniers, Ad; Haus, Brian K; Solo-Gabriele, Helena M; Kelly, Elizabeth A

    2016-03-15

    Many recreational beaches suffer from elevated levels of microorganisms, resulting in beach advisories and closures due to lack of compliance with Environmental Protection Agency guidelines. We conducted the first statewide beach water quality assessment by analyzing decadal records of fecal indicator bacteria (enterococci and fecal coliform) levels at 262 Florida beaches. The objectives were to depict synoptic patterns of beach water quality exceedance along the entire Florida shoreline and to evaluate their relationships with wave condition and geographic location. Percent exceedances based on enterococci and fecal coliform were negatively correlated with both long-term mean wave energy and beach slope. Also, Gulf of Mexico beaches exceeded the thresholds significantly more than Atlantic Ocean ones, perhaps partially due to the lower wave energy. A possible linkage between wave energy level and water quality is beach sand, a pervasive nonpoint source that tends to harbor more bacteria in the low-wave-energy environment. PMID:26892203
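
    The reported relationship — percent exceedance falling as long-term mean wave energy rises — can be illustrated with a simple correlation computation. The numbers below are synthetic, not the study's data.

```python
import numpy as np

# Toy illustration of a negative correlation between wave energy and
# water-quality exceedance; values are invented for demonstration only.
wave_energy = np.array([0.5, 1.0, 2.0, 3.5, 5.0, 8.0])   # arbitrary units
pct_exceed  = np.array([28., 24., 18., 12., 7., 2.])      # % of samples
r = np.corrcoef(wave_energy, pct_exceed)[0, 1]            # Pearson r, strongly negative
```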

  4. Simulation of Heterogeneous Atom Probe Tip Shapes Evolution during Field Evaporation Using a Level Set Method and Different Evaporation Models

    SciTech Connect

    Xu, Zhijie; Li, Dongsheng; Xu, Wei; Devaraj, Arun; Colby, Robert J.; Thevuthasan, Suntharampillai; Geiser, B. P.; Larson, David J.

    2015-04-01

    In atom probe tomography (APT), accurate reconstruction of the spatial positions of field-evaporated ions from measured detector patterns depends upon a correct understanding of the dynamic tip shape evolution and the evaporation laws of the component atoms. Artifacts in APT reconstructions of heterogeneous materials can be attributed to the assumption of homogeneous evaporation of all elements in the material, in addition to the assumption of a steady-state hemispherical dynamic tip shape. A specimen shape evolution model based on a level set method is developed in this study to simulate the evaporation of synthetic layered-structure APT tips. The shape evolution simulated by the level set model qualitatively agrees with the finite element method and with literature data obtained using the finite difference method. The asymmetric evolving shape predicted by the level set model demonstrates the complex evaporation behavior of a heterogeneous tip, and the interface curvature can potentially lead to artifacts in the APT reconstruction of such materials. Compared with other APT simulation methods, the new method provides a smoother interface representation with the aid of intrinsic sub-grid accuracy. Two evaporation models (linear and exponential evaporation laws) are implemented in the level set simulations, and the effect of the evaporation law on tip shape evolution is also presented.
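
    The underlying level-set update, φ_t + F|∇φ| = 0 for a front moving with normal speed F, can be sketched on a toy circular interface. This is a generic first-order upwind sketch, not the paper's model: here F is constant and the geometry synthetic, whereas the actual simulation makes F depend on the local field through the linear or exponential evaporation law (with the opposite sign for a receding tip surface).

```python
import numpy as np

# Minimal level-set evolution sketch: phi_t + F * |grad phi| = 0 on a toy
# circular interface, first-order Godunov upwind scheme, F > 0 (front
# advances along the outward normal; an evaporating tip uses the opposite
# sign with the upwind switches reversed).
n, dx, dt, F = 64, 1.0, 0.4, 1.0
y, x = np.mgrid[0:n, 0:n]
phi = np.hypot(x - n // 2, y - n // 2) - 20.0   # signed distance to a circle

def upwind_grad_mag(phi, dx):
    # Godunov upwind |grad phi| for F > 0
    dxm = (phi - np.roll(phi, 1, axis=1)) / dx
    dxp = (np.roll(phi, -1, axis=1) - phi) / dx
    dym = (phi - np.roll(phi, 1, axis=0)) / dx
    dyp = (np.roll(phi, -1, axis=0) - phi) / dx
    return np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                   np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)

for _ in range(10):
    phi = phi - dt * F * upwind_grad_mag(phi, dx)
# The zero level set (the interface) has advanced by roughly F * dt * 10 = 4 cells.
```

The implicit representation is what gives the method its sub-grid accuracy: the interface is recovered as the zero contour of φ rather than tracked by explicit marker points.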

  5. The Daily Events and Emotions of Master's-Level Family Therapy Trainees in Off-Campus Practicum Settings

    ERIC Educational Resources Information Center

    Edwards, Todd M.; Patterson, Jo Ellen

    2012-01-01

    The Day Reconstruction Method (DRM) was used to assess the daily events and emotions of one program's master's-level family therapy trainees in off-campus practicum settings. This study examines the DRM reports of 35 family therapy trainees in the second year of their master's program in marriage and family therapy. Four themes emerged from the…

  6. County-Level Poverty Is Equally Associated with Unmet Health Care Needs in Rural and Urban Settings

    ERIC Educational Resources Information Center

    Peterson, Lars E.; Litaker, David G.

    2010-01-01

    Context: Regional poverty is associated with reduced access to health care. Whether this relationship is equally strong in both rural and urban settings or is affected by the contextual and individual-level characteristics that distinguish these areas, is unclear. Purpose: Compare the association between regional poverty with self-reported unmet…

  7. An on-line learning tracking of non-rigid target combining multiple-instance boosting and level set

    NASA Astrophysics Data System (ADS)

    Chen, Mingming; Cai, Jingju

    2013-10-01

    Visual tracking algorithms based on online boosting generally use a rectangular bounding box to represent the position of the target, while the actual shape of the target is usually irregular. This causes the classifier to learn features of the non-target parts of the rectangular region, reducing its performance and leading to drift. To avoid the limitations of the bounding box, we propose a novel tracking-by-detection algorithm incorporating level set segmentation, which ensures the classifier learns only the features of the true target area within the tracking box. Because the shape of the target changes only slightly between adjacent frames, and the current level set algorithm avoids re-initialization of the signed distance function, only a few iterations are needed to converge to the target contour in the next frame. We also improve the level set energy function so that the zero level set is less likely to converge to a false contour. In addition, we use gradient boosting to improve the original multiple-instance learning (MIL) algorithm, as in the WMILtracker, which greatly speeds up the tracker. Our algorithm outperforms the original MILtracker in both speed and precision. Compared with the WMILtracker, our algorithm runs at almost the same speed, but avoids the drift caused by background learning, so its precision is better.

  8. Inversion and classification studies of live-site production-level MetalMapper data sets

    NASA Astrophysics Data System (ADS)

    Shubitidze, F.; Fernández, J. P.; Miller, J.; Keranen, J.; Barrowes, B. E.; Bijamov, A.

    2012-06-01

    This paper illustrates the discrimination performance of a set of advanced models at an actual UXO live site. The suite of methods, which combines the orthonormalized volume magnetic source (ONVMS) model, a data-preprocessing technique based on joint diagonalization (JD), and differential evolution (DE) minimization, among others, was tested at the former Camp Beale in California. The data for the study were collected independently by two UXO production teams from Parsons and CH2M HILL using the MetalMapper (MM) sensor in cued mode; each set of data was also processed independently. Initially all data were inverted using a multi-target version of the combined ONVMS-DE algorithm, which provided intrinsic parameters (the total ONVMS amplitudes) that were then used to perform classification after having been inspected by an expert. Classification of the Parsons data was conducted by a Sky Research production team using a fingerprinting approach; analysis of the CH2M HILL data was performed by a Sky/Dartmouth R&D team using unsupervised clustering. During the classification stage the analysts requested the ground truth for selected anomalies typical of the different clusters; this was then used to classify them using a probability function. This paper reviews the data inversion, processing, and discrimination schemes involving the advanced EMI methods and presents the classification results obtained for both the CH2M HILL and the Parsons data. Independent scoring by the Institute for Defense Analyses reveals superb all-around classification performance.
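
    Differential evolution (DE), one of the minimizers named above, can be sketched compactly. The objective below is a toy quadratic; the actual inversion minimizes the misfit between modeled (ONVMS) and measured EMI responses, and all parameter values here are illustrative.

```python
import numpy as np

def de_minimize(f, bounds, pop=20, gens=100, F=0.7, CR=0.9, seed=0):
    """Bare-bones differential evolution (rand/1/bin strategy).

    f: objective to minimize; bounds: list of (lo, hi) per dimension.
    F is the mutation scale, CR the crossover rate."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    X = rng.uniform(lo, hi, (pop, len(bounds)))      # initial population
    fx = np.array([f(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            # Mutate: combine three distinct members other than i
            a, b, c = X[rng.choice([j for j in range(pop) if j != i], 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover with the current member
            cross = rng.random(len(bounds)) < CR
            trial = np.where(cross, mutant, X[i])
            ft = f(trial)
            if ft < fx[i]:                           # greedy selection
                X[i], fx[i] = trial, ft
    return X[fx.argmin()], fx.min()

# Toy misfit with known minimum at (1, -2)
best, val = de_minimize(lambda p: (p[0] - 1)**2 + (p[1] + 2)**2,
                        [(-5, 5), (-5, 5)])
```

DE's appeal for EMI inversion is that it needs no gradients of the misfit and tolerates the multimodal objectives that arise with multi-target anomalies.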

  9. Language Ready Exercises. "Ready-Set-ABE" To Ease Students' Transition into ABE Level Studies.

    ERIC Educational Resources Information Center

    Molek, Carol

    This booklet is intended to assist tutors in helping transitional and low-level adult basic education (ABE) students acquire the language skills required to make a successful adjustment to regular ABE classes. The exercises provided are intended primarily for use in student-tutor learning teams, with students gradually completing greater portions…

  10. The Integrated Behavioural Model for Water, Sanitation, and Hygiene: a systematic review of behavioural models and a framework for designing and evaluating behaviour change interventions in infrastructure-restricted settings

    PubMed Central

    2013-01-01

    Background Promotion and provision of low-cost technologies that enable improved water, sanitation, and hygiene (WASH) practices are seen as viable solutions for reducing high rates of morbidity and mortality due to enteric illnesses in low-income countries. A number of theoretical models, explanatory frameworks, and decision-making models have emerged which attempt to guide behaviour change interventions related to WASH. The design and evaluation of such interventions would benefit from a synthesis of this body of theory informing WASH behaviour change and maintenance. Methods We completed a systematic review of existing models and frameworks through a search of related articles available in PubMed and in the grey literature. Information on the organization of behavioural determinants was extracted from the references that fulfilled the selection criteria and synthesized. Results from this synthesis were combined with other relevant literature, and from feedback through concurrent formative and pilot research conducted in the context of two cluster-randomized trials on the efficacy of WASH behaviour change interventions to inform the development of a framework to guide the development and evaluation of WASH interventions: the Integrated Behavioural Model for Water, Sanitation, and Hygiene (IBM-WASH). Results We identified 15 WASH-specific theoretical models, behaviour change frameworks, or programmatic models, of which 9 addressed our review questions. Existing models under-represented the potential role of technology in influencing behavioural outcomes, focused on individual-level behavioural determinants, and had largely ignored the role of the physical and natural environment. IBM-WASH attempts to correct this by acknowledging three dimensions (Contextual Factors, Psychosocial Factors, and Technology Factors) that operate on five-levels (structural, community, household, individual, and habitual). Conclusions A number of WASH-specific models and frameworks

  11. Three stage level set segmentation of mass core, periphery, and spiculations for automated image analysis of digital mammograms

    NASA Astrophysics Data System (ADS)

    Ball, John Eugene

    In this dissertation, level set methods are employed to segment masses in digital mammographic images and to classify land cover classes in hyperspectral data. For the mammography computer aided diagnosis (CAD) application, level set-based segmentation methods are designed and validated for mass-periphery segmentation, spiculation segmentation, and core segmentation. The proposed periphery segmentation uses the narrowband level set method in conjunction with an adaptive speed function based on a measure of the boundary complexity in the polar domain. The boundary complexity term is shown to be beneficial for delineating challenging masses with ill-defined and irregularly shaped borders. The proposed method is shown to outperform periphery segmentation methods currently reported in the literature. The proposed mass spiculation segmentation uses a generalized form of the Dixon and Taylor Line Operator along with narrowband level sets using a customized speed function. The resulting spiculation features are shown to be very beneficial for classifying the mass as benign or malignant. For example, when using patient age and texture features combined with a maximum likelihood (ML) classifier, the spiculation segmentation method increases the overall accuracy to 92% with 2 false negatives as compared to 87% with 4 false negatives when using periphery segmentation approaches. The proposed mass core segmentation uses the Chan-Vese level set method with a minimal variance criterion. The resulting core features are shown to be effective and comparable to periphery features, and are shown to reduce the number of false negatives in some cases. Most mammographic CAD systems use only a periphery segmentation, so those systems could potentially benefit from core features.
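
    The Chan-Vese minimal-variance criterion used for the core segmentation can be illustrated with a bare-bones region-competition sketch: each pixel is pulled toward the region (inside or outside the zero level set) whose mean intensity it is closer to. The narrowband machinery, curvature regularization, and polar-domain speed terms are omitted, and the synthetic image and all parameters are assumptions.

```python
import numpy as np

def chan_vese_step(phi, img, dt=0.5):
    """One region-competition update of the Chan-Vese minimal-variance
    criterion (no curvature regularization, for illustration only)."""
    inside = phi < 0
    c_in, c_out = img[inside].mean(), img[~inside].mean()
    # Positive force -> pixel closer to the inside mean -> pull inside
    force = (img - c_out)**2 - (img - c_in)**2
    return phi - dt * force / (np.abs(force).max() + 1e-12)

# Synthetic "mammogram": a bright mass core on a darker background
rng = np.random.default_rng(1)
img = rng.normal(0.2, 0.05, (64, 64))
img[20:44, 20:44] = rng.normal(0.8, 0.05, (24, 24))

y, x = np.mgrid[0:64, 0:64]
phi = np.hypot(x - 32, y - 32) - 10.0   # small initial contour inside the core
for _ in range(50):
    phi = chan_vese_step(phi, img)
seg = phi < 0                            # final core segmentation
```

The variance-minimizing property is what makes the method attractive for cores: it needs no edge gradient, so it tolerates the diffuse borders common in mammographic masses.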

  12. High performance in healthcare priority setting and resource allocation: A literature- and case study-based framework in the Canadian context.

    PubMed

    Smith, Neale; Mitton, Craig; Hall, William; Bryan, Stirling; Donaldson, Cam; Peacock, Stuart; Gibson, Jennifer L; Urquhart, Bonnie

    2016-08-01

    Priority setting and resource allocation, or PSRA, are key functions of executive teams in healthcare organizations. Yet decision-makers often base their choices on historical patterns of resource distribution or political pressures. Our aim was to provide leaders with guidance on how to improve PSRA practice, by creating organizational contexts which enable high performance. We carried out in-depth case studies of six Canadian healthcare organizations to obtain from healthcare leaders their understanding of the concept of high performance in PSRA and the factors which contribute to its achievement. Individual and group interviews were carried out (n = 62) with senior managers, middle managers and Board members. Site observations and document review were used to assist researchers in interpreting the interview data. Qualitative data were analyzed iteratively with the literature on empirical examples of PSRA practice, in order to develop a framework of high performance in PSRA. The framework consists of four domains - structures, processes, attitudes and behaviours, and outcomes - within which are 19 specific elements. The emergent themes derive from case studies in different kinds of health organizations (urban/rural, small/large) across Canada. The elements can serve as a checklist for 'high performance' in PSRA. This framework provides a means by which decision-makers in healthcare might assess their practice and identify key areas for improvement. The findings are likely generalizable, certainly within Canada but also across countries. This work constitutes, to our knowledge, the first attempt to present a full package of elements comprising high performance in health care PSRA. PMID:27367899

  13. Gaining entry-level clinical competence outside of the acute care setting.

    PubMed

    Lordly, Daphne; Taper, Janette

    2008-01-01

    Traditionally, an emphasis has been placed on dietetic interns' attainment of entry-level clinical competence in acute care facilities. The perceived risks and benefits of acquiring entry-level clinical competence within long-term and acute care clinical environments were examined. The study included a purposive sample of recent graduates and dietitians (n=14) involved in an integrated internship program. Study subjects participated in in-depth individual interviews. Data were thematically analyzed with the support of data management software QSR N6. Perceived risks and benefits were associated with receiving clinical training exclusively in either environment; risks in one area surfaced as benefits in the other. Themes that emerged included philosophy of care, approach to practice, working environment, depth and breadth of experience, relationships (both client and professional), practice outcomes, employment opportunities, and attitude. Entry-level clinical competence is achievable in both acute and long-term care environments; however, attention must be paid to identified risks. Interns who consider gaining clinical competence exclusively in one area can reduce risks and better position themselves for employment in either practice area by incorporating an affiliation in the other area into their internship program. PMID:18334052

  14. Building a conceptual framework to culturally adapt health promotion and prevention programs at the deep structural level.

    PubMed

    Wang-Schweig, Meme; Kviz, Frederick J; Altfeld, Susan J; Miller, Arlene M; Miller, Brenda A

    2014-07-01

    The debate continues over whether culturally adapting health promotion and prevention programs is effective and merits the time, effort, and resources involved. This may be due, in large part, to the lack of theory in commonly used methods for matching programmatic content and delivery to the culture of a population, particularly at the deep structural level. This paper asserts that, prior to the cultural adaptation of prevention programs, it is necessary first to develop a conceptual framework. We propose a multiphase approach that addresses key challenges in the science of cultural adaptation by first identifying and exploring relevant cultural factors that may affect the targeted health-related behavior, before proceeding through the steps of a stage model. The first phase involves developing an underlying conceptual framework that integrates cultural factors to ground this process. The second phase employs the steps of a stage model. For Phase I of our approach, we offer four key steps and use our research study as an example of how these steps were applied to build a framework for the cultural adaptation of a family-based intervention to prevent adolescent alcohol use, Guiding Good Choices (GGC), to Chinese American families. We then summarize the preliminary evidence from a few key relationships tested among our sample, with the greater purpose of discussing how these findings might be used to culturally adapt GGC. PMID:24396122

  15. Income Level and Drug Related Harm among People Who Use Injection Drugs in a Canadian Setting

    PubMed Central

    Long, Cathy; DeBeck, Kora; Feng, Cindy; Montaner, Julio; Wood, Evan; Kerr, Thomas

    2014-01-01

    Background Higher income is generally associated with better health outcomes; however, among people who inject drugs (IDU) income generation frequently involves activities, such as sex work and drug dealing, which pose significant health risks. Therefore, we sought to examine the relationship between level of income and specific drug use patterns and related health risks. Methods This study involved IDU participating in a prospective cohort study in Vancouver, Canada. Monthly income was categorized based on non-fixed quartiles at each follow-up with the lowest level serving as the reference category in generalized linear mixed-effects regression. Results Among our sample of 1,032 IDU, the median average monthly income over the study follow-up was $1050 [Interquartile range=785–2000]. In multivariate analysis, the highest income category was significantly associated with sex work (Adjusted Odds Ratio [AOR]=7.65), drug dealing (AOR=5.06), daily heroin injection (AOR=2.97), daily cocaine injection (AOR=1.65), daily crack smoking (AOR=2.48), binge drug use (AOR=1.57) and unstable housing (AOR=1.67). The high income category was negatively associated with being female (AOR=0.61) and accessing addiction treatment (AOR=0.64), (all p < 0.05). In addition, higher income was strongly associated with higher monthly expenditure on drugs (>$400) (OR=97.8). Conclusion Among IDU in Vancouver, average monthly income levels were low and higher total monthly income was linked to high-risk income generation strategies as well as a range of drug use patterns characteristic of higher intensity addiction and HIV risk. These findings underscore the need for interventions that provide economic empowerment and address high intensity addiction, especially for female IDU. PMID:24380808
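
The income categorization described above ("non-fixed quartiles at each follow-up") can be sketched as follows. The data, variable names, and group sizes are illustrative assumptions, not the study's cohort data:

```python
import numpy as np
import pandas as pd

# Hypothetical panel of follow-up visits; the study followed 1,032 IDU,
# the numbers below are synthetic.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "visit": np.repeat([1, 2, 3], 50),
    "income": rng.gamma(shape=2.0, scale=800.0, size=150),
})

# "Non-fixed quartiles": cut points are recomputed within each follow-up
# visit, so the reference category is always the lowest quartile at that
# visit rather than a fixed dollar band across the whole study.
df["income_q"] = (
    df.groupby("visit", group_keys=False)["income"]
      .apply(lambda s: pd.qcut(s, 4, labels=False) + 1)
)
print(df.groupby(["visit", "income_q"]).size())
```

Per-visit quartiles keep the comparison relative: a monthly income that counts as "highest quartile" at one follow-up may not at another.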

  16. Soil lead abatement and children's blood lead levels in an urban setting.

    PubMed Central

    Farrell, K P; Brophy, M C; Chisolm, J J; Rohde, C A; Strauss, W J

    1998-01-01

    OBJECTIVES: The effect of abating soil lead was assessed among Baltimore children. The hypothesis was that a reduction of 1000 parts per million would reduce children's blood lead levels by 0.14 to 0.29 mumol/L (3-6 micrograms/dL). METHODS: In 2 neighborhoods (study and control), 187 children completed the protocol. In the study area, contaminated soil was replaced with clean soil. RESULTS: Soil lead abatement in this study did not lower children's blood lead. CONCLUSIONS: Although it did not show an effect in this study, soil lead abatement may be useful in certain areas. PMID:9842383

  17. Optical coupling devices to a broadband low level laser therapy set

    NASA Astrophysics Data System (ADS)

    Gryko, L.; Zajac, A.

    2011-10-01

    Precise knowledge of the spatial distribution of optical radiation in the biological medium is required in all medical laser procedures, but for low-energy interactions that influence photochemical processes (biostimulation treatments) it has not yet been precisely controlled. The variety of procedures and trial results motivates the search for unambiguous laser-radiation parameters that, both in vitro and in vivo, will accelerate cell proliferation and yield the expected therapeutic efficacy. Objective diagnostic tests of tissues are needed during treatment, using a laser measuring system that analyzes the state of the tissue (its optical properties) during therapeutic exposure. This requires an illuminator that provides a homogeneous distribution of spectral power density and spatial power density on the test surface. The illumination set is composed of over a dozen LEDs emitting in the therapeutic window of biological tissue (range 600-1000 nm). This paper presents the optical couplers that enable this purpose: a conical coupler and a multimode (MM) planar fiber.

  18. Comparing of goal setting strategy with group education method to increase physical activity level: A randomized trial

    PubMed Central

    Jiryaee, Nasrin; Siadat, Zahra Dana; Zamani, Ahmadreza; Taleban, Roya

    2015-01-01

    Background: Designing an intervention to increase physical activity is important to be based on the health care settings resources and be acceptable by the subject group. This study was designed to assess and compare the effect of the goal setting strategy with a group education method on increasing the physical activity of mothers of children aged 1 to 5. Materials and Methods: Mothers who had at least one child of 1-5 years were randomized into two groups. The effect of 1) goal-setting strategy and 2) group education method on increasing physical activity was assessed and compared 1 month and 3 months after the intervention. Also, the weight, height, body mass index (BMI), waist and hip circumference, and well-being were compared between the two groups before and after the intervention. Results: Physical activity level increased significantly after the intervention in the goal-setting group and it was significantly different between the two groups after intervention (P < 0.05). BMI, waist circumference, hip circumference, and well-being score were significantly different in the goal-setting group after the intervention. In the group education method, only the well-being score improved significantly (P < 0.05). Conclusion: Our study presented the effects of using the goal-setting strategy to boost physical activity, improving the state of well-being and decreasing BMI, waist, and hip circumference. PMID:26929765

  19. Levels of Cognitive Complexity: A Framework for the Measurement of Thinking.

    ERIC Educational Resources Information Center

    McDaniel, Ernest

    Some theoretical background is presented for the proposition that thinking processes can be measured by determining the levels of cognitive complexity apparent in written interpretations of complex situations. The rationale for scoring interpretations is presented, and some illustrative data are discussed. The approach to measurement of thinking…

  20. Optimal Sampling of Units in Three-Level Cluster Randomized Designs: An Ancova Framework

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2011-01-01

    Field experiments with nested structures assign entire groups such as schools to treatment and control conditions. Key aspects of such cluster randomized experiments include knowledge of the intraclass correlation structure and the sample sizes necessary to achieve adequate power to detect the treatment effect. The units at each level of the…

  1. Transmembrane proteoglycans control stretch-activated channels to set cytosolic calcium levels

    PubMed Central

    Gopal, Sandeep; Søgaard, Pernille; Multhaupt, Hinke A.B.; Pataki, Csilla; Okina, Elena; Xian, Xiaojie; Pedersen, Mikael E.; Stevens, Troy; Griesbeck, Oliver; Park, Pyong Woo; Pocock, Roger

    2015-01-01

    Transmembrane heparan sulfate proteoglycans regulate multiple aspects of cell behavior, but the molecular basis of their signaling is unresolved. The major family of transmembrane proteoglycans is the syndecans, present in virtually all nucleated cells, but with mostly unknown functions. Here, we show that syndecans regulate transient receptor potential canonical (TRPCs) channels to control cytosolic calcium equilibria and consequent cell behavior. In fibroblasts, ligand interactions with heparan sulfate of syndecan-4 recruit cytoplasmic protein kinase C to target serine714 of TRPC7 with subsequent control of the cytoskeleton and the myofibroblast phenotype. In epidermal keratinocytes a syndecan–TRPC4 complex controls adhesion, adherens junction composition, and early differentiation in vivo and in vitro. In Caenorhabditis elegans, the TRPC orthologues TRP-1 and -2 genetically complement the loss of syndecan by suppressing neuronal guidance and locomotory defects related to increases in neuronal calcium levels. The widespread and conserved syndecan–TRPC axis therefore fine tunes cytoskeletal organization and cell behavior. PMID:26391658

  2. Comparison of Unit-Level Patient Turnover Measures in Acute Care Hospital Settings.

    PubMed

    Park, Shin Hye; Dunton, Nancy; Blegen, Mary A

    2016-06-01

    High patient turnover is a critical factor increasing nursing workload. Despite the growing number of studies on patient turnover, no consensus about how to measure turnover has been achieved. This study was designed to assess the correlation among patient turnover measures commonly used in recent studies and to examine the degree of agreement among the measures for classifying units with different levels of patient turnover. Using unit-level data collected for this study from 292 units in 88 hospitals participating in the National Database of Nursing Quality Indicators®, we compared four patient turnover measures: the inverse of length of stay (1/LOS); admissions, discharges, and transfers per daily census (ADTC); ADTC with short-stay adjustment; and the number of ADTs and short-stay patients divided by the total number of treated patients, or Unit Activity Index (UAI). We assessed the measures' agreement on turnover quartile classifications, using percent agreement and Cohen's kappa statistic (weighted and unweighted). Pearson correlation coefficients also were calculated. ADTC with or without adjustment for short-stay patients had high correlations and substantial agreement with the measure of 1/LOS (κ = .62 to .91; r = .90 to .95). The UAI measure required data less commonly collected by participating hospital units and showed only moderate correlations and fair agreement with the other measures (κ = .23 to .39; r = .41 to .45). The UAI may not be comparable and interchangeable with other patient turnover measures when data are obtained from multiple units and hospitals. © 2016 Wiley Periodicals, Inc. PMID:26998744
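
Two of the turnover measures and the quartile-agreement statistic can be sketched as below. The formulas are our reading of the definitions (simplified to one month of data), and all unit-level values are synthetic placeholders, not NDNQI data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 292  # unit count, matching the study sample; values below are synthetic

patient_days = rng.uniform(300.0, 1200.0, n)   # census-days per unit-month
discharges   = rng.uniform(30.0, 200.0, n)
admissions   = discharges * rng.uniform(0.9, 1.1, n)
transfers    = rng.uniform(5.0, 60.0, n)
daily_census = patient_days / 30.0

# Two of the compared measures (no short-stay adjustment in this sketch):
inv_los = discharges / patient_days                        # 1/LOS
adtc = (admissions + discharges + transfers) / daily_census

def quartile_class(x):
    """Assign each unit to a turnover quartile (0-3)."""
    return np.searchsorted(np.quantile(x, [0.25, 0.5, 0.75]), x, side="right")

def cohen_kappa(a, b, k=4):
    """Unweighted Cohen's kappa for two k-class labelings."""
    cm = np.zeros((k, k))
    np.add.at(cm, (a, b), 1)                   # confusion matrix
    po = np.trace(cm) / cm.sum()               # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / cm.sum() ** 2
    return (po - pe) / (1.0 - pe)              # chance-corrected agreement

kappa = cohen_kappa(quartile_class(inv_los), quartile_class(adtc))
r = np.corrcoef(inv_los, adtc)[0, 1]
print(round(kappa, 2), round(r, 2))
```

Because 1/LOS and ADTC share the same patient-day denominator, they classify most units into the same quartile, which is the pattern of substantial agreement the study reports.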

  3. Synthesis of magnetic framework composites for the discrimination of Escherichia coli at the strain level.

    PubMed

    Wei, Ji-Ping; Qiao, Bin; Song, Wen-Jun; Chen, Tao; Li, Fei; Li, Bo-Zhi; Wang, Jin; Han, Ye; Huang, Yan-Feng; Zhou, Zhi-Jiang

    2015-04-01

    Rapid and efficient characterization and identification of pathogens at the strain level is of key importance for epidemiologic investigations, yet it remains a challenge. In this work, Fe3O4-COOH@MIL-101 composites were fabricated solvothermally by an in situ crystallization approach. The composites combine the excellent properties of both chromium(III) terephthalate (MIL-101) and carboxylic-functionalized magnetite (Fe3O4-COOH) particles and possess efficient peptide/protein enrichment properties and magnetic responsiveness. Fe3O4-COOH@MIL-101 composites were used as magnetic solid-phase extraction materials to increase the discriminatory power of MALDI-TOF MS profiles. BSA tryptic peptides at a concentration as low as 0.25 fmol μL(-1) could be detected by MALDI-TOF MS. In addition, the composites were successfully applied to the selective enrichment of protein biomarkers from bacterial cell lysates and the discrimination of Escherichia coli at the strain level. This work opens the possibility of wide application of magnetic MOFs to discriminate pathogens below the species level. PMID:25813232

  4. Framework for DOE mixed low-level waste disposal: Site fact sheets

    SciTech Connect

    Gruebel, M.M.; Waters, R.D.; Hospelhorn, M.B.; Chu, M.S.Y.

    1994-11-01

    The Department of Energy (DOE) is required to prepare and submit Site Treatment Plans (STPs) pursuant to the Federal Facility Compliance Act (FFCAct). Although the FFCAct does not require that disposal be addressed in the STPs, the DOE and the States recognize that treatment of mixed low-level waste will result in residues that will require disposal in either low-level waste or mixed low-level waste disposal facilities. As a result, the DOE is working with the States to define and develop a process for evaluating disposal-site suitability in concert with the FFCAct and development of the STPs. Forty-nine potential disposal sites were screened; preliminary screening criteria reduced the number of sites for consideration to twenty-six. The DOE then prepared fact sheets for the remaining sites. These fact sheets provided additional site-specific information for understanding the strengths and weaknesses of the twenty-six sites as potential disposal sites. The information also provided the basis for discussion among affected States and the DOE in recommending sites for more detailed evaluation.

  5. Using the World Health Organization's 4S-Framework to Strengthen National Strategies, Policies and Services to Address Mental Health Problems in Adolescents in Resource-Constrained Settings

    PubMed Central

    2011-01-01

    Background Most adolescents live in resource-constrained countries and their mental health has been less well recognised than other aspects of their health. The World Health Organization's 4-S Framework provides a structure for national initiatives to improve adolescent health through: gathering and using strategic information; developing evidence-informed policies; scaling up provision and use of health services; and strengthening linkages with other government sectors. The aim of this paper is to discuss how the findings of a recent systematic review of mental health problems in adolescents in resource-constrained settings might be applied using the 4-S Framework. Method Analysis of the implications of the findings of a systematic search of the English-language literature for national strategies, policies, services and cross-sectoral linkages to improve the mental health of adolescents in resource-constrained settings. Results Data are available for only 33/112 [29%] resource-constrained countries, but in all where data are available, non-psychotic mental health problems in adolescents are identifiable, prevalent and associated with reduced quality of life, impaired participation and compromised development. In the absence of evidence about effective interventions in these settings expert opinion is that a broad public policy response which addresses direct strategies for prevention, early intervention and treatment; health service and health workforce requirements; social inclusion of marginalised groups of adolescents; and specific education is required. Specific endorsed strategies include public education, parent education, training for teachers and primary healthcare workers, psycho-educational curricula, identification through periodic screening of the most vulnerable and referral for care, and the availability of counsellors or other identified trained staff members in schools from whom adolescents can seek assistance for personal, peer and family

  6. Towards tributyltin quantification in natural water at the Environmental Quality Standard level required by the Water Framework Directive.

    PubMed

    Alasonati, Enrica; Fettig, Ina; Richter, Janine; Philipp, Rosemarie; Milačič, Radmila; Sčančar, Janez; Zuliani, Tea; Tunç, Murat; Bilsel, Mine; Gören, Ahmet Ceyhan; Fisicaro, Paola

    2016-11-01

    The European Union (EU) has included tributyltin (TBT) and its compounds in the list of priority water pollutants. Quality standards demanded by the EU Water Framework Directive (WFD) require determination of TBT at such a low concentration level that chemical analysis is still difficult, and further research is needed to improve the sensitivity, accuracy, and precision of existing methodologies. Within the frame of the joint research project "Traceable measurements for monitoring critical pollutants under the European Water Framework Directive" in the European Metrology Research Programme (EMRP), four metrological and designated institutes have developed a primary method to quantify TBT in natural water using liquid-liquid extraction (LLE) and species-specific isotope dilution mass spectrometry (SSIDMS). The procedure has been validated at the Environmental Quality Standard (EQS) level (0.2 ng L(-1) as cation) and at the WFD-required limit of quantification (LOQ) (0.06 ng L(-1) as cation). The LOQ of the methodology was 0.06 ng L(-1) and the average measurement uncertainty at the LOQ was 36%, which agreed with WFD requirements. The analytical difficulties of the method, namely the presence of TBT in blanks and the sources of measurement uncertainty, as well as the interlaboratory comparison results, are discussed in detail. PMID:27591644

  7. Hydrogeologic setting east of a low-level radioactive-waste disposal site near Sheffield, Illinois

    USGS Publications Warehouse

    Foster, J.B.; Garklavs, George; Mackey, G.W.

    1984-01-01

    Core samples from 45 test wells and 4 borings were used to describe the glacial geology of the area east of the low-level radioactive-waste disposal site near Sheffield, Bureau County, Illinois. Previous work has shown that shallow ground water beneath the disposal site flows east through a pebbly-sand unit of the Toulon Member of the Glasford Formation. The pebbly sand was found in core samples from wells in an area extending northeast from the waste-disposal site to a strip-mine lake and east along the south side of the lake. Other stratigraphic units identified in the study area are correlated with units found on the disposal site. The pebbly-sand unit of the Toulon Member grades from a pebbly sand on site into a coarse gravel with sand and pebbles towards the lake. The Hulick Till Member, a key bed, underlies the Toulon Member throughout most of the study area. A narrow channel-like depression in the Hulick Till is filled with coarse gravelly sand of the Toulon Member. The filled depression extends eastward from near the northeast corner of the waste-disposal site to the strip-mine lake. (USGS)

  8. An automatic method of brain tumor segmentation from MRI volume based on the symmetry of brain and level set method

    NASA Astrophysics Data System (ADS)

    Li, Xiaobing; Qiu, Tianshuang; Lebonvallet, Stephane; Ruan, Su

    2010-02-01

    This paper presents a brain tumor segmentation method that automatically segments tumors from human brain MRI image volumes. The presented model is based on the symmetry of the human brain and the level set method. Firstly, the midsagittal plane of an MRI volume is found, the slices of the volume that potentially contain tumor are identified according to their symmetry, and an initial boundary of the tumor is determined by watershed and morphological algorithms in the slice in which the tumor is largest; secondly, the level set method is applied to the initial boundary to drive the curve to evolve and stop at the appropriate tumor boundary; lastly, the tumor boundary is projected slice by slice to its adjacent slices as initial boundaries through the volume to segment the whole tumor. The experimental results are compared with hand tracing by an expert and show relatively good agreement.
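
The level-set stage of the pipeline above can be sketched with a minimal region-based (Chan-Vese-style) evolution in pure NumPy. The synthetic image, initialization, and step size are illustrative assumptions; the paper's full method (midsagittal-plane search, watershed initialization, slice-by-slice projection) is not reproduced here:

```python
import numpy as np

# Synthetic axial "slice": a bright lesion on a dark background.
y, x = np.mgrid[0:64, 0:64]
img = (((x - 40) ** 2 + (y - 24) ** 2) < 10 ** 2).astype(float)
img += 0.1 * np.random.default_rng(2).standard_normal(img.shape)

# Initial boundary (the paper derives it via watershed/morphology): a
# small disk inside the suspected tumor, encoded as a signed
# level-set function phi (phi < 0 inside the curve).
phi = np.sqrt((x - 40.0) ** 2 + (y - 24.0) ** 2) - 4.0

# Region-based evolution: move the zero level set so that the inside and
# outside mean intensities best explain the image (no curvature term).
for _ in range(200):
    inside = phi < 0
    c1, c2 = img[inside].mean(), img[~inside].mean()
    force = (img - c2) ** 2 - (img - c1) ** 2   # > 0 where pixel fits "inside"
    phi -= 0.2 * force                          # gradient-descent step
    phi = np.clip(phi, -5.0, 5.0)               # crude stabilization

seg = phi < 0
print(seg.sum(), "pixels segmented")
```

The evolution stops, in effect, where neither region model explains a pixel better, which is the "curve evolving and stopping at the tumor boundary" behavior the abstract describes.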

  9. Interaction of a Two-Level Atom with the Morse Potential in the Framework of Jaynes-Cummings Model

    NASA Astrophysics Data System (ADS)

    Setare R., M.; Sh., Barzanjeh

    2009-09-01

    A theoretical study of the dynamical behavior of the interaction between a two-level atom and a Morse potential in the framework of the Jaynes-Cummings model (JCM) is presented. We show that this system is equivalent to an intensity-dependent coupling between the two-level atom and the non-deformed single-mode radiation field in the presence of an additional nonlinear interaction. We study the dynamical properties of the system, such as the atomic population inversion, the probability distribution of the cavity field, the Mandel parameter, and atomic dipole squeezing. It is shown how the depth of the Morse potential affects the non-classical properties of the system. Moreover, the temporal evolution of the Husimi distribution function is explored.
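
The intensity-dependent coupling the abstract refers to can be written schematically in JCM form. The deformation function f(n̂) below is a placeholder for whatever function the Morse-potential construction yields; this is a sketch of the standard intensity-dependent JCM, not the paper's exact Hamiltonian:

```latex
% Schematic only: f(\hat{n}) is the deformation induced by the Morse
% potential; its explicit form depends on the paper's construction.
\begin{align}
\hat{H} &= \hbar\omega\,\hat{a}^{\dagger}\hat{a}
         + \frac{\hbar\omega_{0}}{2}\,\hat{\sigma}_{z}
         + \hbar\lambda\left(\hat{A}^{\dagger}\hat{\sigma}_{-}
         + \hat{A}\,\hat{\sigma}_{+}\right), \\
\hat{A} &= \hat{a}\,f(\hat{n}), \qquad
\hat{A}^{\dagger} = f(\hat{n})\,\hat{a}^{\dagger}, \qquad
\hat{n} = \hat{a}^{\dagger}\hat{a}.
\end{align}
```

Setting f(n̂) = 1 recovers the ordinary JCM; a nontrivial f makes the atom-field coupling depend on the photon number, which is the "intensity-dependent coupling" equivalence the abstract states.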

  10. Exploring a morphodynamic modeling framework for reef island evolution under sea-level rise

    NASA Astrophysics Data System (ADS)

    Lorenzo Trueba, J.; Ashton, A. D.; Donnelly, J. P.

    2013-12-01

    Global sea-level rise rates have increased over the last century, with dramatic rate increases expected over the coming century and beyond. Not only are rates projected to approach those of the previous deglaciation, the actual increase in elevation by the end of the century (potentially 1 m or more) will be significant in terms of the elevations of low-lying coastal landforms. Coral reef islands, often called 'cays' or 'motus', which generally comprise the subaerial portion of atolls, are particularly sensitive to sea-level rise. These landforms are typically low-lying (on the order of meters high) and are formed of wave-transported detrital sediment perched atop coralline rock. As opposed to barrier islands, which can be supplied by offshore sediment from the shoreface, breakdown of corals and the shallow offshore lithology can serve as a source of sediment to reef islands, which can help build these islands as sea level rises. Here, we present a morphodynamic model to explore the combined effects of sea-level rise, sediment supply, and overwash processes on the evolution of reef islands. Model results demonstrate how reef islands are particularly sensitive to the offshore generation of sediment. When this onshore sediment supply is low, islands migrate lagoonward via storm overwash, crossing the proximal lagoonward regions, which tend to include a shallow (~2 m) platform, until they reach the edge of a typically very deep lagoon (up to 60 m or more). At the lagoon edge, reef islands stop their migration and eventually drown as overwash sediment flux is lost to the lagoon. In contrast, a high supply of offshore sediment can bulwark reef islands before they reach the lagoon edge. One possibility is that the island attains a 'static equilibrium' in which the overwash flux fills the top-of-island accommodation created by sea-level rise, and the island surface area is maintained. When the sediment supply is very high, however, the island can undergo rapid

  11. Classification of Normal and Apoptotic Cells from Fluorescence Microscopy Images Using Generalized Polynomial Chaos and Level Set Function.

    PubMed

    Du, Yuncheng; Budman, Hector M; Duever, Thomas A

    2016-06-01

    Accurate automated quantitative analysis of living cells based on fluorescence microscopy images can be very useful for fast evaluation of experimental outcomes and cell culture protocols. In this work, an algorithm is developed for fast differentiation of normal and apoptotic viable Chinese hamster ovary (CHO) cells. For effective segmentation of cell images, a stochastic segmentation algorithm is developed by combining a generalized polynomial chaos expansion with a level set function-based segmentation algorithm. This approach provides a probabilistic description of the segmented cellular regions along the boundary, from which it is possible to calculate morphological changes related to apoptosis, i.e., the curvature and length of a cell's boundary. These features are then used as inputs to a support vector machine (SVM) classifier that is trained to distinguish between normal and apoptotic viable states of CHO cell images. Using morphological features obtained from the stochastic level set segmentation of cell images, in combination with the trained SVM classifier, yields higher differentiation accuracy than the original deterministic level set method. PMID:27142234
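
The curvature and boundary-length features described above can be illustrated on synthetic cell outlines. The shapes, the "blebbing" ripple, and the discrete feature definitions are illustrative assumptions; the paper derives its boundaries from the stochastic level-set segmentation and feeds the features to an SVM, both of which are omitted here:

```python
import numpy as np

# Synthetic outlines: a "normal" cell is smooth and round; an "apoptotic"
# one is modeled (purely for illustration) as the same circle with a
# radial ripple mimicking membrane blebbing.
t = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)

def boundary(r0, wobble):
    r = r0 * (1.0 + wobble * np.sin(8 * t))
    return np.c_[r * np.cos(t), r * np.sin(t)]

def features(pts):
    """Boundary length plus mean/std of discrete curvature."""
    d = np.diff(np.vstack([pts, pts[:1]]), axis=0)   # closed-polygon edges
    seg = np.linalg.norm(d, axis=1)
    length = seg.sum()
    ang = np.arctan2(d[:, 1], d[:, 0])               # edge directions
    # turning angle per edge, wrapped to (-pi, pi], as |curvature| * ds
    turn = np.abs((np.diff(ang, append=ang[0]) + np.pi) % (2 * np.pi) - np.pi)
    curv = turn / seg
    return length, curv.mean(), curv.std()

normal = features(boundary(10.0, 0.0))
apopt = features(boundary(10.0, 0.2))
print("normal:", normal)
print("apoptotic:", apopt)
```

The blebbed outline is both longer and far more variable in curvature than the smooth one, which is why (length, curvature statistics) separate the two classes well enough to train a classifier on.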

  12. Examining Screening-Level Multimedia Models Through a Comparison Framework for Landfill Management

    NASA Astrophysics Data System (ADS)

    Asif, Zunaira; Chen, Zhi

    2016-01-01

    Two models for evaluating the transport and fate of benzene were studied and compared in this paper. A fugacity model and an analytical environmental multimedia model (AEMM) were used to reconcile the fate and mass transfer of benzene observed at a landfill site. The comparison of the two models was based on average concentrations and the partitioning behavior of benzene among three phases: air, soil, and groundwater. In the fugacity study, about 99.6% of the total benzene flux was distributed into air from the landfill source. According to the AEMM, diffusive gas flux was likewise the predominant mechanism for benzene released from the landfill, and advection of gas and liquid was the second most dominant transport mechanism at steady-state conditions. Overall, the fugacity modeling (Levels I and II) confirms the fate and transport mechanisms of benzene released from the landfill when compared with the AEMM. However, the predicted concentrations and the advective and diffusive fluxes of benzene from the fugacity model differed from the AEMM results due to variation in input parameters. In comparison with experimental observations, the fugacity model showed a larger error than the AEMM, because the fugacity model treats the site as a single-unit box model. This study confirms that the fugacity model is a screening-level tool to be used in conjunction with detailed remediation, followed by the AEMM, which can serve at the strategic decision-making stage.
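
The Level I fugacity calculation referenced above reduces to a simple equilibrium partitioning computation, sketched below. The compartment volumes and fugacity capacities (Z values) are illustrative placeholders, not the study's calibrated landfill inputs; benzene's Henry's law constant is taken as roughly 557 Pa·m³/mol:

```python
# Level I fugacity sketch for benzene in a 3-compartment environment
# (air, water, soil). All volumes and the soil sorption factor are
# hypothetical; only the method is the point here.
R, T = 8.314, 298.15          # gas constant J/(mol K), temperature K
H = 557.0                     # benzene Henry's constant, Pa m3/mol (approx.)

volumes = {"air": 1e9, "water": 1e6, "soil": 1e5}   # m3 (hypothetical)
Z = {
    "air": 1.0 / (R * T),     # fugacity capacity, mol/(m3 Pa)
    "water": 1.0 / H,
    "soil": 5.0 / H,          # crude sorption enhancement (assumed)
}

M = 1000.0                    # mol benzene released (hypothetical)
# At equilibrium all compartments share one fugacity f = M / sum(V_i Z_i).
f = M / sum(volumes[c] * Z[c] for c in volumes)
amounts = {c: f * volumes[c] * Z[c] for c in volumes}
shares = {c: amounts[c] / M for c in volumes}
print({c: round(s, 4) for c, s in shares.items()})
```

Because air has by far the largest V·Z product, nearly all of the benzene partitions into air, which is qualitatively the ~99.6% air distribution the abstract reports.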

  13. Examining Screening-Level Multimedia Models Through a Comparison Framework for Landfill Management.

    PubMed

    Asif, Zunaira; Chen, Zhi

    2016-01-01

    Two models for evaluating the transport and fate of benzene were studied and compared in this paper. A fugacity model and an analytical environmental multimedia model (AEMM) were used to reconcile the fate and mass transfer of benzene observed at a landfill site. The comparison of the two models was based on average concentrations and the partitioning behavior of benzene among three phases: air, soil, and groundwater. In the fugacity study, about 99.6% of the total benzene flux was distributed into air from the landfill source. According to the AEMM, diffusive gas flux was likewise the predominant mechanism for benzene released from the landfill, and advection of gas and liquid was the second most dominant transport mechanism at steady-state conditions. Overall, the fugacity modeling (Levels I and II) confirms the fate and transport mechanisms of benzene released from the landfill when compared with the AEMM. However, the predicted concentrations and the advective and diffusive fluxes of benzene from the fugacity model differed from the AEMM results due to variation in input parameters. In comparison with experimental observations, the fugacity model showed a larger error than the AEMM, because the fugacity model treats the site as a single-unit box model. This study confirms that the fugacity model is a screening-level tool to be used in conjunction with detailed remediation, followed by the AEMM, which can serve at the strategic decision-making stage. PMID:26342953

  14. Levels of 8-OxodG Predict Hepatobiliary Pathology in Opisthorchis viverrini Endemic Settings in Thailand

    PubMed Central

    Jariwala, Amar R.; Sithithaworn, Jiraporn; Sripa, Banchob; Brindley, Paul J.; Laha, Thewarach; Mairiang, Eimorn; Pairojkul, Chawalit; Khuntikeo, Narong; Mulvenna, Jason; Sithithaworn, Paiboon; Bethony, Jeffrey M.

    2015-01-01

    Opisthorchis viverrini is distinct among helminth infections as it drives a chronic inflammatory response in the intrahepatic bile duct that progresses from advanced periductal fibrosis (APF) to cholangiocarcinoma (CCA). Extensive research shows that oxidative stress (OS) plays a critical role in the transition from chronic O. viverrini infection to CCA. OS also results in the excision of a modified DNA lesion (8-oxodG) into urine, the levels of which can be detected by immunoassay. Herein, we measured concentrations of urine 8-oxodG by immunoassay from the following four groups in the Khon Kaen Cancer Cohort study: (1) O. viverrini negative individuals, (2) O. viverrini positive individuals with no APF as determined by abdominal ultrasound, (3) O. viverrini positive individuals with APF as determined by abdominal ultrasound, and (4) O. viverrini induced cases of CCA. A logistic regression model was used to evaluate the utility of creatinine-adjusted urinary 8-oxodG among these groups, along with demographic, behavioral, and immunological risk factors. Receiver operating characteristic (ROC) curve analysis was used to evaluate the predictive accuracy of urinary 8-oxodG for APF and CCA. Elevated concentrations of 8-oxodG in urine positively associated with APF and CCA in a strongly dose-dependent manner. Urinary 8-oxodG concentrations also accurately predicted whether an individual presented with APF or CCA compared to O. viverrini infected individuals without these pathologies. In conclusion, urinary 8-oxodG is a robust ‘candidate’ biomarker of the progression of APF and CCA from chronic opisthorchiasis, which is indicative of the critical role that OS plays in both of these advanced hepatobiliary pathologies. The findings also confirm our previous observations that severe liver pathology occurs early and asymptomatically in residents of O. viverrini endemic regions, where individuals are infected for years (often decades) with this food-borne pathogen. These

  15. Development of a Software Framework for System-Level Carbon Sequestration Risk Assessment

    SciTech Connect

    Miller, R.

    2013-02-28

    The overall purpose of this project was to identify, evaluate, select, develop, and test a suite of enhancements to the GoldSim software program, in order to make it a better tool for use in support of Carbon Capture and Sequestration (CCS) projects. The GoldSim software is a foundational tool used by scientists at NETL and at other laboratories and research institutions to evaluate system-level risks of proposed CCS projects. The primary product of the project was a series of successively improved versions of the GoldSim software, supported by an extensive User’s Guide. All of the enhancements were tested by scientists at Los Alamos National Laboratory, and several of the enhancements have already been incorporated into the CO{sub 2}-PENS sequestration model.

  16. Selective removal of cesium and strontium using porous frameworks from high level nuclear waste.

    PubMed

    Aguila, Briana; Banerjee, Debasis; Nie, Zimin; Shin, Yongsoon; Ma, Shengqian; Thallapally, Praveen K

    2016-05-01

    Efficient and cost-effective removal of radioactive (137)Cs and (90)Sr found in spent fuel is an important step for safe, long-term storage of nuclear waste. Solid-state materials such as resins and titanosilicate zeolites have been assessed for the removal of Cs and Sr from aqueous solutions, but there is room for improvement in terms of capacity and selectivity. Herein, we report the Cs(+) and Sr(2+) exchange potential of an ultra stable MOF, namely, MIL-101-SO3H, as a function of different contact times, concentrations, pH levels, and in the presence of competing ions. Our preliminary results suggest that MOFs with suitable ion exchange groups can be promising alternate materials for cesium and strontium removal. PMID:27055254

  17. Fish welfare assurance system: initial steps to set up an effective tool to safeguard and monitor farmed fish welfare at a company level.

    PubMed

    van de Vis, J W; Poelman, M; Lambooij, E; Bégout, M-L; Pilarczyk, M

    2012-02-01

    The objective was to take a first step in the development of a process-oriented quality assurance (QA) system for monitoring and safeguarding fish welfare at a company level. A process-oriented approach is focused on preventing hazards and involves establishing critical steps in a process that require careful control. The seven principles of the Hazard Analysis Critical Control Points (HACCP) concept were used as a framework to establish the QA system. HACCP is an internationally agreed approach for the management of food safety, which was adapted for the purpose of safeguarding and monitoring the welfare of farmed fish. As the main focus of this QA system is farmed fish welfare assurance at a company level, it was named the Fish Welfare Assurance System (FWAS). In this paper we present the initial steps of setting up FWAS for the on-growing of sea bass (Dicentrarchus labrax), carp (Cyprinus carpio) and European eel (Anguilla anguilla). Four major hazards were selected, which were fish species dependent. Critical Control Points (CCPs) that need to be controlled to minimize or avoid the four hazards are presented. For FWAS, monitoring of CCPs at a farm level is essential. For monitoring purposes, Operational Welfare Indicators (OWIs) are needed to establish whether critical biotic, abiotic, managerial and environmental factors are controlled. For the OWIs we present critical limits/target values. A critical limit is the maximum or minimum value to which a factor must be controlled at a critical control point to prevent, eliminate or reduce a hazard to an acceptable level. For managerial factors, target levels are more appropriate than critical limits. Regarding the international trade of farmed fish products, we propose that FWAS needs to be standardized in aquaculture chains. For this standardization, a consensus on the concept of fish welfare, methods to assess welfare objectively, and knowledge of the needs of farmed fish are required. PMID:22278705
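
The CCP/OWI/critical-limit structure described above is essentially a monitoring table checked against readings. A minimal sketch follows; the hazards, indicators, and limit values are invented examples, not the FWAS specification from the paper:

```python
from dataclasses import dataclass

# Illustrative CCP monitoring against critical limits; all names and
# numbers below are hypothetical examples, not the paper's FWAS values.
@dataclass
class CCP:
    name: str    # critical control point
    owi: str     # operational welfare indicator monitored at this CCP
    low: float   # critical limits: minimum and maximum acceptable value
    high: float

    def check(self, value: float) -> bool:
        return self.low <= value <= self.high

ccps = [
    CCP("water quality", "dissolved oxygen (mg/L)", 5.0, 12.0),
    CCP("stocking", "density (kg/m3)", 0.0, 25.0),
]
readings = {"dissolved oxygen (mg/L)": 4.2, "density (kg/m3)": 18.0}

# A reading outside its critical limits flags the CCP for corrective action.
alarms = [c.name for c in ccps if not c.check(readings[c.owi])]
print(alarms)
```

Here the low oxygen reading trips the "water quality" CCP while stocking density stays within limits, mirroring the hazard-prevention logic of the HACCP principles the system is built on.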

  18. Providing a navigable route for acute medicine nurses to advance their practice: a framework of ascending levels of practice.

    PubMed

    Lees-Deutsch, Liz; Christian, Jan; Setchfield, Ian

    2016-01-01

    This article conveys concerns raised by delegates at the International SAM Conference (Manchester, 2015) regarding how to advance nursing practice in acute medicine. It endeavors to capture the essence of 'how to advance practice' and 'how to integrate advanced practice' within the workforce structures of an acute medicine unit (AMU). It addresses the production of tacit knowledge and its recognition and integration into the development of the nursing workforce. The current context of NHS efficiencies and recruitment issues emphasizes the value of retaining tacit knowledge. Uniquely, this article offers an early conceptual framework through which levels of advancement and potential transition points to advance nursing practice in acute medicine are articulated. Determining how to advance requires identification of prior accomplishments such as tacit knowledge, experiential learning, CPD, specialist courses and management experience. This requires nurses to make judicious decisions to advance their practice and to distinguish between 'amassing experience' and 'career progression'. It aims to stimulate thinking around the practicalities of advancement, the value of tacit knowledge and its potential realization through the framework trajectory. PMID:27441313

  19. Acute and Chronic Toxicity of Nitrate to Early Life Stages of Zebrafish--Setting Nitrate Safety Levels for Zebrafish Rearing.

    PubMed

    Learmonth, Cândida; Carvalho, António Paulo

    2015-08-01

    Recirculating aquaculture systems (RAS) have been widely used for zebrafish rearing, allowing holding of many thousands of fish at high densities. Water quality in RAS largely depends on biofilters that ultimately convert the extremely toxic ammonia excreted by fish into the much less toxic nitrate. However, when water renewal is minimal in RAS, nitrate can accumulate to high enough levels to negatively impact fish welfare and performance. Therefore, the setting of safety levels of nitrate for zebrafish should be a priority to avoid unwanted effects in both the intensive production of this species and research outputs. The present study aimed to define nitrate safety levels for zebrafish based on acute and chronic toxicity bioassays in early life stages of this species. Acute bioassays revealed ontogenetic changes in response to high nitrate levels. Based on NOEC (no observed effect concentration) values, safety levels should be set at 1450, 1855, and 1075 mg/L NO3(-)-N to prevent acute lethal effects in embryos, newly-hatched larvae, and swim-up larvae, respectively. In the chronic bioassay, larvae were exposed to nitrate concentrations of 50, 100, 200, and 400 mg/L NO3(-)-N during the entire larval period (23 days). No negative effects were observed either on larval performance or condition at concentrations up to 200 mg/L NO3(-)-N. However, at 400 mg/L NO3(-)-N, survival drastically decreased and fish showed reduced growth and evidence of morphological abnormalities. Accordingly, a safety level of 200 mg/L NO3(-)-N is recommended during the larval rearing of zebrafish to prevent negative impacts on juvenile production. PMID:25996778

  20. Inverting Glacial Isostatic Adjustment with Paleo Sea Level Records using Bayesian Framework and Burgers Rheology

    NASA Astrophysics Data System (ADS)

    Caron, L.; Metivier, L.; Greff-Lefftz, M.; Fleitout, L.; Rouby, H.

    2015-12-01

    Glacial Isostatic Adjustment models most often assume a mantle with a viscoelastic Maxwell rheology and a given ice history model. Here we use a Bayesian Monte Carlo with Markov Chains formalism to invert the global GIA signal simultaneously for the mechanical properties of the mantle and for the volume of the various ice-sheets, using as starting ice models two distinct previously published ice histories. Burgers as well as Maxwell rheologies are considered. The fitted data consist of 5720 paleo sea level records from the last 35 kyr, with a world-wide distribution. Our ambition is to present not only the best fitting model, but also the range of possible solutions (within the explored space of parameters) with their respective probability of explaining the data, and thus reveal the trade-off effects and range of uncertainty affecting the parameters. Our a posteriori probability maps exhibit in all cases two distinct peaks: both are characterized by an upper mantle viscosity around 5 × 10^20 Pa·s, but one of the peaks features a lower mantle viscosity around 3 × 10^21 Pa·s while the other indicates a lower mantle viscosity of more than 1 × 10^22 Pa·s. The global maximum depends upon the starting ice history and the chosen rheology: the first peak (P1) has the highest probability only in the case with a Maxwell rheology and an ice history based on ICE-5G, while the second peak (P2) is favored when using an ANU-based ice history or a Burgers rheology, and is our preferred solution as it is also consistent with long-term geodynamics and gravity gradient anomalies over Laurentide. P2 is associated with larger volumes for the Laurentian and Fennoscandian ice-sheets and, as a consequence of total ice volume balance, smaller volumes for the Antarctic ice-sheet. This last point interferes with the estimate of present-day ice-melting in Antarctica from GRACE data. Finally, we find that P2 with a Burgers rheology favors the existence of a tectosphere, i.e. a viscous sublithospheric layer.
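    The sampler behind these posterior maps is not specified beyond "Bayesian Monte Carlo with Markov Chains"; as an illustration only, here is a minimal random-walk Metropolis sketch over a toy one-dimensional log10(viscosity) parameter, with a bimodal target loosely echoing the two peaks described. All functions and numbers below are invented for the sketch, not taken from the study.

```python
import math
import random

def log_posterior(log10_visc):
    # Toy bimodal target standing in for the two posterior peaks in
    # lower-mantle viscosity described in the abstract (illustrative only).
    p1 = math.exp(-0.5 * ((log10_visc - 21.5) / 0.2) ** 2)
    p2 = math.exp(-0.5 * ((log10_visc - 22.2) / 0.3) ** 2)
    return math.log(p1 + 1.5 * p2 + 1e-300)

def metropolis(n_steps, x0=21.0, step=0.1, seed=0):
    """Random-walk Metropolis: propose a Gaussian step, accept with
    probability min(1, posterior ratio)."""
    rng = random.Random(seed)
    x, lp = x0, log_posterior(x0)
    chain = []
    for _ in range(n_steps):
        xp = x + rng.gauss(0.0, step)
        lpp = log_posterior(xp)
        if rng.random() < math.exp(min(0.0, lpp - lp)):
            x, lp = xp, lpp
        chain.append(x)
    return chain

chain = metropolis(20000)
```

    A histogram of such a chain is exactly the kind of a posteriori probability map the abstract refers to, here in one dimension instead of the full parameter space.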

  1. Efficient model chemistries for peptides. I. General framework and a study of the heterolevel approximation in RHF and MP2 with Pople split-valence basis sets.

    PubMed

    Echenique, Pablo; Alonso, José Luis

    2008-07-15

    We present an exhaustive study of more than 250 ab initio potential energy surfaces (PESs) of the model dipeptide HCO-L-Ala-NH(2). The model chemistries (MCs) investigated are constructed as homo- and heterolevels involving possibly different RHF and MP2 calculations for the geometry and the energy. The basis sets used belong to a sample of 39 representatives from Pople's split-valence families, ranging from the small 3-21G to the large 6-311++G(2df,2pd). The reference PES to which the rest are compared is the MP2/6-311++G(2df,2pd) homolevel, which, as far as we are aware, is the most accurate PES in the literature. All data sets have been analyzed according to a general framework, which can be extended to other complex problems and which captures the nearness concept in the space of MCs. The great number of MCs evaluated has allowed us to significantly explore this space and show that the correlation between accuracy and computational cost of the methods is imperfect, thus justifying a systematic search for the combination of features in an MC that is optimal for dealing with peptides. Regarding the particular MCs studied, the most important conclusion is that the potentially very cost-saving heterolevel approximation is a very efficient one for describing the whole PES of HCO-L-Ala-NH(2). Finally, we show that, although RHF may be used to calculate the geometry if an MP2 single-point energy calculation follows, pure RHF//RHF homolevels are not recommended for this problem. PMID:18270966

  2. Cervical cancer screening in low-resource settings: A cost-effectiveness framework for valuing tradeoffs between test performance and program coverage.

    PubMed

    Campos, Nicole G; Castle, Philip E; Wright, Thomas C; Kim, Jane J

    2015-11-01

    As cervical cancer screening programs are implemented in low-resource settings, protocols are needed to maximize health benefits under operational constraints. Our objective was to develop a framework for examining health and economic tradeoffs between screening test sensitivity, population coverage and follow-up of screen-positive women, to help decision makers identify where program investments yield the greatest value. As an illustrative example, we used an individual-based Monte Carlo simulation model of the natural history of human papillomavirus (HPV) and cervical cancer calibrated to epidemiologic data from Uganda. We assumed once in a lifetime screening at age 35 with two-visit HPV DNA testing or one-visit visual inspection with acetic acid (VIA). We assessed the health and economic tradeoffs that arise between (i) test sensitivity and screening coverage; (ii) test sensitivity and loss to follow-up (LTFU) of screen-positive women; and (iii) test sensitivity, screening coverage and LTFU simultaneously. The decline in health benefits associated with sacrificing HPV DNA test sensitivity by 20% (e.g., shifting from provider- to self-collection of specimens) could be offset by gains in coverage if coverage increased by at least 20%. When LTFU was 10%, two-visit HPV DNA testing with 80-90% sensitivity was more effective and more cost-effective than one-visit VIA with 40% sensitivity and yielded greater health benefits than VIA even as VIA sensitivity increased to 60% and HPV test sensitivity declined to 70%. As LTFU increased, two-visit HPV DNA testing became more costly and less effective than one-visit VIA. Setting-specific data on achievable test sensitivity, coverage, follow-up rates and programmatic costs are needed to guide decision making for cervical cancer screening. PMID:25943074

  3. An advanced approach for the generation of complex cellular material representative volume elements using distance fields and level sets

    NASA Astrophysics Data System (ADS)

    Sonon, B.; François, B.; Massart, T. J.

    2015-08-01

    A general and widely tunable method for the generation of representative volume elements for cellular materials based on distance and level set functions is presented. The approach is based on random tessellations constructed from random inclusion packings. A general methodology to obtain arbitrary-shaped tessellations to produce disordered foams is presented and illustrated. These tessellations can degenerate either into classical Voronoï tessellations, potentially additively weighted depending on the properties of the initial inclusion packing used, or into Laguerre tessellations through a simple modification of the formulation. A versatile approach to control the particular morphology of the obtained foam is introduced. Specific local features such as concave triangular Plateau borders and non-constant thickness heterogeneous coatings can be built from the tessellation in a straightforward way and are tuned by a small set of parameters with a clear morphological interpretation.
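    As an illustration of the distance-field idea (not the authors' implementation), a Voronoï-type tessellation can be expressed through the level set function phi = d1 - d2, the difference between the distances to the nearest and second-nearest seed points; subtracting a per-seed weight from each distance gives the additively weighted variant. All sizes and seeds below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
seeds = rng.uniform(0.0, 1.0, size=(8, 2))  # random "inclusion" centres
weights = np.zeros(8)                        # nonzero -> additively weighted cells

# Sample phi = d1 - d2 on a regular grid, where d1 and d2 are the (weighted)
# distances to the nearest and second-nearest seeds; phi <= 0 everywhere and
# its zero level set traces the cell boundaries of the tessellation.
n = 128
ys, xs = np.mgrid[0:n, 0:n] / (n - 1.0)
pts = np.stack([xs.ravel(), ys.ravel()], axis=1)
d = np.linalg.norm(pts[:, None, :] - seeds[None, :, :], axis=2) - weights[None, :]
d.sort(axis=1)
phi = (d[:, 0] - d[:, 1]).reshape(n, n)

# A thin band around the zero level set approximates the foam's cell walls.
edges = np.abs(phi) < 2.0 / n
```

    Thickening or smoothing this band is one simple way such a distance-based formulation can produce non-constant-thickness walls.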

  4. Preliminary analysis of acceleration of sea level rise through the twentieth century using extended tide gauge data sets (August 2014)

    NASA Astrophysics Data System (ADS)

    Hogarth, Peter

    2014-11-01

    This work explores the potential for extending tide gauge time series from the Permanent Service for Mean Sea Level (PSMSL) using historical documents, PSMSL ancillary data, and by developing additional composite time series using near-neighbor tide gauges. The aim was to increase the number, completeness, and geographical extent of records covering most or all of the twentieth century. The number of at least 75% complete century-scale time series has been approximately doubled over the original PSMSL data set. In total, over 4800 station years have been added, with 294 of these added to 10 long Southern Hemisphere records. Individual century-scale acceleration values derived from this new extended data set tend to converge on a value of 0.01 ± 0.008 mm/yr^2. This result agrees closely with recent work and is statistically significant at the 1 sigma level. Possible causes of acceleration and errors are briefly discussed. Results confirm the importance of current data archeology projects involving digitization of the remaining archives of hard-copy tide gauge data for sea level and climate studies.
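    A century-scale acceleration of this kind is, in the usual convention, twice the quadratic coefficient of a least-squares fit to the record. A minimal sketch on synthetic data (the record and all its parameters are invented for illustration, not taken from the paper):

```python
import numpy as np

# Synthetic century-long annual mean sea level record (mm): a 1.7 mm/yr trend
# plus an acceleration of 0.01 mm/yr^2 and noise (all values invented).
rng = np.random.default_rng(0)
t = np.arange(1900, 2001, dtype=float)
tc = t - t.mean()                  # centre time to decorrelate the coefficients
msl = 1.7 * tc + 0.5 * 0.01 * tc**2 + rng.normal(0.0, 5.0, t.size)

# Least-squares quadratic msl ~ c2*tc^2 + c1*tc + c0; acceleration = 2*c2.
c2, c1, c0 = np.polyfit(tc, msl, 2)
acceleration = 2.0 * c2            # mm/yr^2
```

    Longer, more complete records shrink the uncertainty on c2, which is why doubling the number of century-scale series matters for detecting an acceleration this small.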

  5. Fictitious domains and level sets for moving boundary problems. Applications to the numerical simulation of tumor growth

    NASA Astrophysics Data System (ADS)

    Carmen Calzada, M.; Camacho, Gema; Fernández-Cara, Enrique; Marín, Mercedes

    2011-02-01

    In this work we present a new strategy for solving numerically a (relatively simple) model of tumor growth. In principle, the model describes avascular growth although, by choosing the parameters appropriately, it can also give an idea of the behavior after vascularization. The numerical methods rely on fictitious domain and level set techniques, with a combination of quadratic finite elements and finite difference approximations. We present a collection of numerical results that essentially coincide with others previously obtained with other techniques.

  6. 3-dimensional throat region segmentation from MRI data based on Fourier interpolation and 3-dimensional level set methods.

    PubMed

    Campbell, Sean; Doshi, Trushali; Soraghan, John; Petropoulakis, Lykourgos; Di Caterina, Gaetano; Grose, Derek; MacKenzie, Kenneth

    2015-08-01

    A new algorithm for 3D throat region segmentation from magnetic resonance imaging (MRI) is presented. The proposed algorithm initially pre-processes the MRI data to increase the contrast between the throat region and its surrounding tissues and to reduce artifacts. An isotropic 3D volume is reconstructed using Fourier interpolation. Furthermore, a cube encompassing the throat region is evolved using the level set method to form a smooth 3D boundary of the throat region. The results of the proposed algorithm on real and synthetic MRI data are used to validate the robustness and accuracy of the algorithm. PMID:26736782
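    Fourier interpolation of an anisotropic MRI stack amounts to zero-padding the spectrum along the coarsely sampled axis. A minimal, generic sketch (not the paper's code; the even-length Nyquist bin is handled crudely by taking the real part):

```python
import numpy as np

def fourier_upsample(vol, axis, factor):
    """Band-limited (Fourier) interpolation: zero-pad the spectrum along one
    axis, so an anisotropic stack becomes closer to an isotropic volume."""
    n = vol.shape[axis]
    spec = np.fft.fftshift(np.fft.fft(vol, axis=axis), axes=axis)
    pad = [(0, 0)] * vol.ndim
    extra = n * (factor - 1)
    pad[axis] = (extra // 2, extra - extra // 2)
    spec = np.pad(spec, pad)
    out = np.fft.ifft(np.fft.ifftshift(spec, axes=axis), axis=axis)
    return factor * out.real  # rescale; .real discards the small Nyquist residue

# Example: a 4-slice stack interpolated to 8 slices along the slice axis.
vol = np.random.default_rng(0).normal(size=(4, 16, 16))
iso = fourier_upsample(vol, axis=0, factor=2)
```

    The DC bin is preserved, so mean intensity is unchanged, which is one reason band-limited interpolation is preferred over naive resampling before segmentation.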

  7. The l1-l2 regularization framework unmasks the hypoxia signature hidden in the transcriptome of a set of heterogeneous neuroblastoma cell lines

    PubMed Central

    Fardin, Paolo; Barla, Annalisa; Mosci, Sofia; Rosasco, Lorenzo; Verri, Alessandro; Varesio, Luigi

    2009-01-01

    Background Gene expression signatures are clusters of genes discriminating different statuses of the cells, and their definition is critical for understanding the molecular bases of diseases. The identification of a gene signature is complicated by the high dimensional nature of the data and by the genetic heterogeneity of the responding cells. The l1-l2 regularization is an embedded feature selection technique that fulfills all the desirable properties of a variable selection algorithm and has the potential to generate a specific signature even in biologically complex settings. We studied the application of this algorithm to detect the signature characterizing the transcriptional response of neuroblastoma tumor cell lines to hypoxia, a condition of low oxygen tension that occurs in the tumor microenvironment. Results We determined the gene expression profile of 9 neuroblastoma cell lines cultured under normoxic and hypoxic conditions. We studied a heterogeneous set of neuroblastoma cell lines to mimic the in vivo situation and to test the robustness and validity of the l1-l2 regularization with double optimization. Analysis by hierarchical, spectral, and k-means clustering or a supervised approach based on t-test analysis divided the cell lines on the basis of genetic differences. However, the disturbance of this strong transcriptional response completely masked the detection of the more subtle response to hypoxia. Different results were obtained when we applied the l1-l2 regularization framework. The algorithm distinguished the normoxic and hypoxic statuses, defining signatures comprising 3 to 38 probesets, with a leave-one-out error of 17%. A consensus hypoxia signature was established by setting the frequency score at 50% and the correlation parameter ε equal to 100. This signature is composed of 11 probesets representing 8 well-characterized genes known to be modulated by hypoxia. Conclusion We demonstrate that l1-l2 regularization outperforms more conventional
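    The paper's l1-l2 regularization with double optimization is a specific pipeline; as a generic illustration of how a combined l1-l2 (elastic-net-style) penalty yields a sparse signature from many probesets, here is a minimal proximal-gradient solver on invented data. All sizes and parameters are hypothetical, not from the study.

```python
import numpy as np

def elastic_net(X, y, alpha=0.1, l1_ratio=0.5, n_iter=2000):
    """Minimal proximal-gradient (ISTA) solver for the l1-l2 penalty:
    ||y - Xw||^2/(2n) + alpha*(l1_ratio*||w||_1 + (1-l1_ratio)/2*||w||_2^2)."""
    n, p = X.shape
    w = np.zeros(p)
    lam1 = alpha * l1_ratio
    lam2 = alpha * (1.0 - l1_ratio)
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n + lam2)  # 1 / Lipschitz bound
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n + lam2 * w          # smooth part
        w = w - step * grad
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam1, 0.0)  # soft threshold
    return w

# Toy "expression matrix": 40 samples x 50 probesets, only 3 truly informative.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 50))
y = X[:, 0] - 2.0 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0.0, 0.1, 40)
w = elastic_net(X, y)
selected = np.flatnonzero(np.abs(w) > 1e-3)  # the sparse "signature"
```

    The l1 term produces exact zeros (the sparse signature) while the l2 term stabilizes selection among correlated probesets, which is the property exploited on the heterogeneous cell lines.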

  8. Level set simulation of coupled advection-diffusion and pore structure evolution due to mineral precipitation in porous media

    SciTech Connect

    Xiaoyi Li; Hai Huang; Paul Meakin

    2008-09-01

    The nonlinear coupling of fluid flow, reactive chemical transport and pore structure changes due to mineral precipitation (or dissolution) in porous media plays a key role in a wide variety of processes of scientific interest and practical importance. Significant examples include the evolution of fracture apertures in the subsurface, acid fracturing stimulation for enhanced oil recovery and immobilization of radionuclides and heavy metals in contaminated groundwater. We have developed a pore-scale simulation technique for modeling coupled reactive flow and structure evolution in porous media and fracture apertures. Advection, diffusion, and mineral precipitation resulting in changes in pore geometries are treated simultaneously by solving fully coupled fluid momentum and reactive solute transport equations. In this model, the reaction-induced evolution of solid grain surfaces is captured using a level set method. A sub-grid representation of the interface, based on the level set approach, is used instead of the pixel representations of the interface often used in cellular-automata and most lattice-Boltzmann methods. The model is validated against analytical solutions for simplified geometries. Precipitation processes were simulated under various flow conditions and reaction rates, and the resulting pore geometry changes are discussed. Quantitative relationships between permeability and porosity under various flow conditions and reaction rates are reported.

  9. LV wall segmentation using the variational level set method (LSM) with additional shape constraint for oedema quantification

    NASA Astrophysics Data System (ADS)

    Kadir, K.; Gao, H.; Payne, A.; Soraghan, J.; Berry, C.

    2012-10-01

    In this paper an automatic algorithm for left ventricle (LV) wall segmentation and oedema quantification from T2-weighted cardiac magnetic resonance (CMR) images is presented. The extent of myocardial oedema delineates the ischaemic area-at-risk (AAR) after myocardial infarction (MI). Since the AAR can be used to estimate the amount of salvageable myocardium post-MI, oedema imaging has potential clinical utility in the management of acute MI patients. This paper presents a new scheme based on the variational level set method (LSM) with an additional shape constraint for the segmentation of T2-weighted CMR images. In our approach, shape information of the myocardial wall is utilized to introduce a shape feature of the myocardial wall into the variational level set formulation. The performance of the method is tested using real CMR images (12 patients) and the results of the automatic system are compared to manual segmentation. The mean perpendicular distances between the automatic and manual LV wall boundaries are in the range of 1-2 mm. Bland-Altman analysis on LV wall area indicates there is no consistent bias as a function of LV wall area, with a mean bias of -121 mm^2 between individual investigator one (IV1) and LSM, and -122 mm^2 between individual investigator two (IV2) and LSM. Furthermore, the oedema quantification demonstrates good correlation when compared to an expert, with an average error of 9.3% for 69 slices of short-axis CMR images from 12 patients.

  10. Mask pattern recovery by level set method based inverse inspection technology (IIT) and its application on defect auto disposition

    NASA Astrophysics Data System (ADS)

    Park, Jin-Hyung; Chung, Paul D. H.; Jeon, Chan-Uk; Cho, Han Ku; Pang, Linyong; Peng, Danping; Tolani, Vikram; Cecil, Tom; Kim, David; Baik, KiHo

    2009-10-01

    At the most advanced technology nodes, such as 32nm and 22nm, aggressive OPC and Sub-Resolution Assist Features (SRAFs) are required. However, their use results in significantly increased mask complexity, making mask defect disposition more challenging than ever. This paper describes how mask patterns can first be recovered from the inspection images by applying patented algorithms using Level Set Methods. The mask pattern recovery step is then followed by aerial/wafer image simulation, the results of which can be plugged into an automated mask defect disposition system based on aerial/wafer image. The disposition criteria are primarily based on wafer-plane CD variance. The system also connects to a post-OPC lithography verification tool that can provide gauges and CD specs, thereby enabling them to be used in mask defect disposition as well. Results on both programmed defects and production defects collected at Samsung mask shop are presented to show the accuracy and consistency of using the Level Set Methods and aerial/wafer image based automated mask disposition.

  11. Level-set reconstruction algorithm for ultrafast limited-angle X-ray computed tomography of two-phase flows

    PubMed Central

    Bieberle, M.; Hampel, U.

    2015-01-01

    Tomographic image reconstruction is based on recovering an object distribution from its projections, which have been acquired from all angular views around the object. If the angular range is limited to less than 180° of parallel projections, typical reconstruction artefacts arise when using standard algorithms. To compensate for this, specialized algorithms using a priori information about the object need to be applied. The application behind this work is ultrafast limited-angle X-ray computed tomography of two-phase flows. Here, only a binary distribution of the two phases needs to be reconstructed, which reduces the complexity of the inverse problem. To solve it, a new reconstruction algorithm (LSR) based on the level-set method is proposed. It includes one force function term accounting for matching the projection data and one incorporating a curvature-dependent smoothing of the phase boundary. The algorithm has been validated using simulated as well as measured projections of known structures, and its performance has been compared to the algebraic reconstruction technique and a binary derivative of it. The validation as well as the application of the level-set reconstruction on a dynamic two-phase flow demonstrated its applicability and its advantages over other reconstruction algorithms. PMID:25939623
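    Of the two force terms in the LSR, the curvature-dependent smoothing term can be illustrated in isolation: evolving phi_t = kappa |grad phi| regularizes the phase boundary. A minimal 2-D sketch on an invented noisy circular interface (grid, time step and shape are not from the paper):

```python
import numpy as np

def curvature(phi):
    """kappa = div(grad phi / |grad phi|) by central differences (unit grid)."""
    px, py = np.gradient(phi)
    norm = np.sqrt(px**2 + py**2) + 1e-12
    return np.gradient(px / norm)[0] + np.gradient(py / norm)[1]

def smooth_interface(phi, n_steps=50, dt=0.1):
    """Explicit time stepping of phi_t = kappa * |grad phi|: curvature-
    dependent smoothing of the zero-level-set boundary."""
    for _ in range(n_steps):
        px, py = np.gradient(phi)
        phi = phi + dt * curvature(phi) * np.sqrt(px**2 + py**2)
    return phi

# A noisy circular interface: curvature flow removes the wiggles first.
n = 64
ys, xs = np.mgrid[0:n, 0:n]
phi0 = np.sqrt((xs - 32.0)**2 + (ys - 32.0)**2) - 20.0
phi0 += np.random.default_rng(0).normal(0.0, 0.5, phi0.shape)
phi = smooth_interface(phi0)
```

    In the full algorithm this smoothing competes with a data-matching force driven by the projections, so the reconstructed binary phase boundary stays both smooth and consistent with the measurements.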

  13. A new non-overlapping concept to improve the Hybrid Particle Level Set method in multi-phase fluid flows

    NASA Astrophysics Data System (ADS)

    Archer, Philip J.; Bai, Wei

    2015-02-01

    A novel non-overlapping concept is incorporated into the Hybrid Particle Level Set (HPLS) method to improve its accuracy and suitability for the modelling of multi-phase fluid flows. The concept addresses shortcomings in the reseeding algorithm, which maintains resolution of the surface at runtime. These shortcomings result in the misplacement of newly seeded particles in the opposite-signed domain and necessitate a restriction on the distance that a particle can escape without deletion, which reduces the effectiveness of the method. The non-overlapping concept judges the suitability of potential new particles based on information already contained within the particle representation of the surface. By preventing the misplacement of particles it is possible to significantly relax the distance restriction, thereby increasing the accuracy of the HPLS method in multi-phase flows. To demonstrate its robustness and efficiency, the concept is examined with a number of challenging test cases, including both level-set-only simulations and two-phase fluid flows.

  14. Comparison of image segmentation of lungs using methods: connected threshold, neighborhood connected, and threshold level set segmentation

    NASA Astrophysics Data System (ADS)

    Amanda, A. R.; Widita, R.

    2016-03-01

    The aim of this research is to compare some image segmentation methods for lungs based on performance evaluation parameters (Mean Square Error (MSE) and Peak Signal-to-Noise Ratio (PSNR)). In this study, the methods compared were connected threshold, neighborhood connected, and threshold level set segmentation on images of the lungs. These three methods require one important parameter, i.e., the threshold. The threshold interval was obtained from the histogram of the original image. The software used to segment the images here was InsightToolkit-4.7.0 (ITK). This research used 5 lung images to be analyzed. Then, the results were compared using the performance evaluation parameters determined using MATLAB. A segmentation method is said to have good quality if it has the smallest MSE value and the highest PSNR. The results show that connected threshold gives the best result for four of the sample images, while threshold level set segmentation does so for the remaining one. Therefore, it can be concluded that the connected threshold method is better than the other two methods for these cases.
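    The two evaluation parameters are standard and easy to state precisely: MSE is the mean squared pixel difference, and PSNR is 10 log10(peak^2 / MSE) in dB. A quick sketch for 8-bit images, applied to an invented ground-truth/segmentation pair (not the study's data):

```python
import numpy as np

def mse(a, b):
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))

def psnr(a, b, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB; higher is better (infinite for a
    perfect match), so the best method has smallest MSE and largest PSNR."""
    m = mse(a, b)
    return float("inf") if m == 0.0 else 10.0 * np.log10(peak**2 / m)

# Invented 8-bit "ground truth" mask vs. a slightly wrong "segmentation".
truth = np.zeros((64, 64), dtype=np.uint8)
truth[16:48, 16:48] = 255
seg = truth.copy()
seg[16:20, 16:48] = 0  # a 4x32-pixel segmentation error
err = mse(truth, seg)
quality = psnr(truth, seg)
```

    Note the cast to float before differencing: subtracting uint8 arrays directly would wrap around and corrupt both metrics.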

  15. Intervening at the Setting Level to Prevent Behavioral Incidents in Residential Child Care: Efficacy of the CARE Program Model.

    PubMed

    Izzo, Charles V; Smith, Elliott G; Holden, Martha J; Norton, Catherine I; Nunno, Michael A; Sellers, Deborah E

    2016-07-01

    The current study examined the impact of a setting-level intervention on the prevention of aggressive or dangerous behavioral incidents involving youth living in group care environments. Eleven group care agencies implemented Children and Residential Experiences (CARE), a principle-based program that helps agencies use a set of evidence-informed principles to guide programming and enrich the relational dynamics throughout the agency. All agencies served mostly youth referred from child welfare. The 3-year implementation of CARE involved intensive agency-wide training and on-site consultation to agency leaders and managers around supporting and facilitating day-to-day application of the principles in both childcare and staff management arenas. Agencies provided data over 48 months on the monthly frequency of behavioral incidents most related to program objectives. Using multiple baseline interrupted time series analysis to assess program effects, we tested whether trends during the program implementation period declined significantly compared to the 12 months before implementation. Results showed significant program effects on incidents involving youth aggression toward adult staff, property destruction, and running away. Effects on aggression toward peers and self-harm were also found but were less consistent. Staff ratings of positive organizational social context (OSC) predicted fewer incidents, but there was no clear relationship between OSC and observed program effects. Findings support the potential efficacy of the CARE model and illustrate that intervening "upstream" at the setting level may help to prevent coercive caregiving patterns and increase opportunities for healthy social interactions. PMID:27138932

  16. Towards people-centred health systems: a multi-level framework for analysing primary health care governance in low- and middle-income countries.

    PubMed

    Abimbola, Seye; Negin, Joel; Jan, Stephen; Martiniuk, Alexandra

    2014-09-01

    Although there is evidence that non-government health system actors can individually or collectively develop practical strategies to address primary health care (PHC) challenges in the community, existing frameworks for analysing health system governance largely focus on the role of governments, and do not sufficiently account for the broad range of contributions to PHC governance. This is important because of the tendency for weak governments in low- and middle-income countries (LMICs). We present a multi-level governance framework for use as a thinking guide in analysing PHC governance in LMICs. This framework has previously been used to analyse the governance of common-pool resources such as community fisheries and irrigation systems. We apply the framework to PHC because, like common-pool resources, PHC facilities in LMICs tend to be commonly owned by the community such that individual and collective action is often required to avoid the 'tragedy of the commons'-destruction and degradation of the resource resulting from lack of concern for its continuous supply. In the multi-level framework, PHC governance is conceptualized at three levels, depending on who influences the supply and demand of PHC services in a community and how: operational governance (individuals and providers within the local health market), collective governance (community coalitions) and constitutional governance (governments at different levels and other distant but influential actors). Using the example of PHC governance in Nigeria, we illustrate how the multi-level governance framework offers a people-centred lens on the governance of PHC in LMICs, with a focus on relations among health system actors within and between levels of governance. We demonstrate the potential impact of health system actors functioning at different levels of governance on PHC delivery, and how governance failure at one level can be assuaged by governance at another level. PMID:25274638

  18. Establishing a Strong Foundation: District and School-Level Supports for Classroom Implementation of the LDC and MDC Frameworks. Executive Summary

    ERIC Educational Resources Information Center

    Reumann-Moore, Rebecca; Lawrence, Nancy; Sanders, Felicia; Christman, Jolley Bruce; Duffy, Mark

    2011-01-01

    The Bill and Melinda Gates Foundation has invested in the development and dissemination of high-quality instructional and formative assessment tools to support teachers' incorporation of the Common Core State Standards (CCSS) into their classroom instruction. Literacy experts have developed a framework and a set of templates that teachers can use…

  19. Practical Recommendations for Robot-Assisted Treadmill Therapy (Lokomat) in Children with Cerebral Palsy: Indications, Goal Setting, and Clinical Implementation within the WHO-ICF Framework.

    PubMed

    Aurich-Schuler, Tabea; Warken, Birgit; Graser, Judith V; Ulrich, Thilo; Borggraefe, Ingo; Heinen, Florian; Meyer-Heim, Andreas; van Hedel, Hubertus J A; Schroeder, A Sebastian

    2015-08-01

    Active participation and the highest level of independence during daily living are primary goals in neurorehabilitation. Therefore, standing and walking are key factors in many rehabilitation programs. Despite inconclusive evidence concerning the best application and efficacy of robotic tools in the field of pediatric neurorehabilitation, robotic technologies have been implemented to complement conventional therapies in recent years. A group of experienced therapists and physicians joined in an "expert panel." They compared their clinical application protocols, discussed recurring open questions, and developed experience-based recommendations for robot-assisted treadmill therapy (exemplified by the Lokomat, Hocoma, Volketswil, Switzerland) with a focus on children with cerebral palsy. Specific indications and therapeutic goals were defined considering the severity of motor impairments and the International Classification of Functioning, Disability and Health (ICF) framework. After five meetings, consensus was reached and recommendations for the implementation of robot-assisted treadmill therapy, including postsurgery rehabilitation, were proposed. This article aims to provide a comprehensive overview of therapeutic applications in a fast-developing field of medicine, where scientific evidence is still scarce. These recommendations can help physicians and therapists plan a child's individual therapy protocol for robot-assisted treadmill therapy. PMID:26011438

  20. Best Practices for Ethical Sharing of Individual-Level Health Research Data From Low- and Middle-Income Settings

    PubMed Central

    Cheah, Phaik Yeong; Denny, Spencer; Jao, Irene; Marsh, Vicki; Merson, Laura; Shah More, Neena; Nhan, Le Nguyen Thanh; Osrin, David; Tangseefa, Decha; Wassenaar, Douglas; Parker, Michael

    2015-01-01

    Sharing individual-level data from clinical and public health research is increasingly being seen as a core requirement for effective and efficient biomedical research. This article discusses the results of a systematic review and multisite qualitative study of key stakeholders’ perspectives on best practices in ethical data sharing in low- and middle-income settings. Our research suggests that for data sharing to be effective and sustainable, multiple social and ethical requirements need to be met. An effective model of data sharing will be one in which considered judgments will need to be made about how best to achieve scientific progress, minimize risks of harm, promote fairness and reciprocity, and build and sustain trust. PMID:26297751

  1. Best Practices for Ethical Sharing of Individual-Level Health Research Data From Low- and Middle-Income Settings.

    PubMed

    Bull, Susan; Cheah, Phaik Yeong; Denny, Spencer; Jao, Irene; Marsh, Vicki; Merson, Laura; Shah More, Neena; Nhan, Le Nguyen Thanh; Osrin, David; Tangseefa, Decha; Wassenaar, Douglas; Parker, Michael

    2015-07-01

    Sharing individual-level data from clinical and public health research is increasingly being seen as a core requirement for effective and efficient biomedical research. This article discusses the results of a systematic review and multisite qualitative study of key stakeholders' perspectives on best practices in ethical data sharing in low- and middle-income settings. Our research suggests that for data sharing to be effective and sustainable, multiple social and ethical requirements need to be met. An effective model of data sharing will be one in which considered judgments will need to be made about how best to achieve scientific progress, minimize risks of harm, promote fairness and reciprocity, and build and sustain trust. PMID:26297751

  2. Assessment of serum amyloid A levels in the rehabilitation setting in the Florida manatee (Trichechus manatus latirostris).

    PubMed

    Cray, Carolyn; Dickey, Meranda; Brewer, Leah Brinson; Arheart, Kristopher L

    2013-12-01

    The acute phase protein serum amyloid A (SAA) has previously been shown to have value as a biomarker of inflammation and infection in many species, including manatees (Trichechus manatus latirostris). In the current study, results from an automated assay for SAA were used in a rehabilitation setting. Reference intervals were established from clinically normal manatees using the robust method: 0-46 mg/L. More than 30-fold higher mean SAA levels were observed in manatees suffering from cold stress and boat-related trauma. Poor correlations were observed between SAA and total white blood cell count, percentage of neutrophils, albumin, and albumin/globulin ratio. A moderate correlation was observed between SAA and the presence of nucleated red blood cells. The sensitivity of SAA testing was 93% and the specificity was 98%, representing the highest combined values of all the analytes. The results indicate that the automated method for SAA quantitation can provide important clinical data for manatees in a rehabilitation setting. PMID:24450049
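
    As a quick illustration of how sensitivity and specificity figures like those above are derived, here is a minimal sketch in Python. The confusion-matrix counts are hypothetical, chosen only so the rates land near the reported 93%/98%; the abstract does not give the raw counts.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity (true-positive rate) and specificity (true-negative
    rate) from confusion-matrix counts."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for illustration only (not from the study).
sens, spec = sensitivity_specificity(tp=28, fn=2, tn=49, fp=1)
```

    With these counts, sensitivity is 28/30 ≈ 0.93 and specificity is 49/50 = 0.98.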

  3. Calculation of contact angles at triple phase boundary in solid oxide fuel cell anode using the level set method

    SciTech Connect

    Sun, Xiaojun; Hasegawa, Yosuke; Kohno, Haruhiko; Jiao, Zhenjun; Hayakawa, Koji; Okita, Kohei; Shikazono, Naoki

    2014-10-15

    A level set method is applied to characterize the three-dimensional structures of nickel, yttria stabilized zirconia and pore phases in a solid oxide fuel cell anode reconstructed by focused ion beam-scanning electron microscope. A numerical algorithm is developed to evaluate the contact angles at the triple phase boundary based on interfacial normal vectors which can be calculated from the signed distance functions defined for each of the three phases. Furthermore, surface tension force is estimated from the contact angles by assuming the interfacial force balance at the triple phase boundary. The average contact angle values of nickel, yttria stabilized zirconia and pore are found to be 143°–156°, 83°–138° and 82°–123°, respectively. The mean contact angles remained nearly unchanged after 100 hours of operation. However, the contact angles just after reduction are different for cells with different sintering temperatures. In addition, standard deviations of the contact angles are very large, especially for the yttria stabilized zirconia and pore phases. The surface tension forces calculated from mean contact angles were close to the experimental values found in the literature. Slight increases in the surface tensions of the nickel/pore and nickel/yttria stabilized zirconia interfaces were observed after operation. Present data are expected to be used not only for the understanding of the degradation mechanism, but also for the quantitative prediction of the microstructural temporal evolution of solid oxide fuel cell anode. - Highlights: • A level set method is applied to characterize the 3D structures of SOFC anode. • A numerical algorithm is developed to evaluate the contact angles at the TPB. • Surface tension force is estimated from the contact angles. • The average contact angle values are found to be 143°–156°, 83°–138° and 82°–123°. • Present data are expected to help understand degradation and predict the evolution of SOFC anodes.
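
    The core geometric step described above (interface normals from the gradient of a signed distance function, and angles between those normals) can be sketched in a few lines of NumPy. This is an illustrative reconstruction of the idea, not the authors' code; the planar test fields below are hypothetical.

```python
import numpy as np

def interface_normal(phi, spacing=1.0):
    """Unit normal field n = grad(phi)/|grad(phi)| of a signed
    distance function phi sampled on a regular grid."""
    g = np.stack(np.gradient(phi, spacing), axis=-1)
    norm = np.linalg.norm(g, axis=-1, keepdims=True)
    return g / np.maximum(norm, 1e-12)

def angle_between(n1, n2):
    """Point-wise angle in degrees between two unit normal fields."""
    cos = np.clip(np.sum(n1 * n2, axis=-1), -1.0, 1.0)
    return np.degrees(np.arccos(cos))

# Two hypothetical planar interfaces, one along x and one along y:
x, y = np.meshgrid(np.arange(8.0), np.arange(8.0), indexing="ij")
phi_a, phi_b = x - 3.5, y - 3.5
theta = angle_between(interface_normal(phi_a), interface_normal(phi_b))
```

    For the two orthogonal planes the computed angle is 90° everywhere; in the paper's setting the same dot-product computation would be evaluated at voxels adjacent to the triple phase boundary.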

  4. A level set method for image segmentation in the presence of intensity inhomogeneities with application to MRI.

    PubMed

    Li, Chunming; Huang, Rui; Ding, Zhaohua; Gatenby, J Chris; Metaxas, Dimitris N; Gore, John C

    2011-07-01

    Intensity inhomogeneity often occurs in real-world images, which presents a considerable challenge in image segmentation. The most widely used image segmentation algorithms are region-based and typically rely on the homogeneity of the image intensities in the regions of interest, which often fail to provide accurate segmentation results due to the intensity inhomogeneity. This paper proposes a novel region-based method for image segmentation, which is able to deal with intensity inhomogeneities in the segmentation. First, based on the model of images with intensity inhomogeneities, we derive a local intensity clustering property of the image intensities, and define a local clustering criterion function for the image intensities in a neighborhood of each point. This local clustering criterion function is then integrated with respect to the neighborhood center to give a global criterion of image segmentation. In a level set formulation, this criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, by minimizing this energy, our method is able to simultaneously segment the image and estimate the bias field, and the estimated bias field can be used for intensity inhomogeneity correction (or bias correction). Our method has been validated on synthetic images and real images of various modalities, with desirable performance in the presence of intensity inhomogeneities. Experiments show that our method is more robust to initialization, faster and more accurate than the well-known piecewise smooth model. As an application, our method has been used for segmentation and bias correction of magnetic resonance (MR) images with promising results. PMID:21518662

  5. Microporous Cd(II) metal-organic framework as fluorescent sensor for nitroaromatic explosives at the sub-ppm level

    NASA Astrophysics Data System (ADS)

    Wang, Xing-Po; Han, Lu-Lu; Wang, Zhi; Guo, Ling-Yu; Sun, Di

    2016-03-01

    A novel Cd(II) metal-organic framework (MOF) based on a rigid biphenyltetracarboxylic acid, [Cd4(bptc)2(DMA)4(H2O)2·4DMA] (1), was successfully synthesized under solvothermal conditions and characterized by single-crystal X-ray diffraction, and further consolidated by elemental analyses, powder X-ray diffraction (PXRD), infrared spectra (IR) and luminescent measurements. Single crystal X-ray diffraction analysis reveals that compound 1 is a 4-connected PtS (point symbol: {4²·8⁴}) network based on [Cd2(COO)4] secondary building units (SBUs). Its inherent porous and emissive characteristics make it a suitable fluorescent probe to sense small solvent molecules and nitroaromatic explosives. Compound 1 shows obvious solvent-dependent emissive behavior, especially for acetone, with a very high fluorescence quenching effect. Moreover, compound 1 displays excellent sensing of nitroaromatic explosives at the sub-ppm level, giving detection limits of 0.43 ppm and 0.37 ppm for nitrobenzene (NB) and p-nitrotoluene (PNT), respectively. These results show that this Cd(II) MOF can be used as a fluorescence probe for the detection of nitroaromatic explosives.

  6. Determinants of symptom profile and severity of conduct disorder in a tertiary level pediatric care set up: A pilot study

    PubMed Central

    Jayaprakash, R.; Rajamohanan, K.; Anil, P.

    2014-01-01

    Background: Conduct disorders (CDs) are one of the most common causes for referral to child and adolescent mental health centers. CD varies in its environmental factors, symptom profile, severity, co-morbidity, and functional impairment. Aims: The aim was to analyze the determinants of symptom profile and severity among childhood and adolescent onset CD. Settings and Design: Clinic-based study with 60 consecutive children between 6 and 18 years of age satisfying International Classification of Diseases-10 Diagnostic Criteria for Research guidelines for CD, attending a behavioral pediatrics unit outpatient clinic. Materials and Methods: The family psychopathology, symptom severity, and functional level were assessed using the parent interview schedule, revised behavioral problem checklist and Children's Global Assessment Scale. Statistical Analysis: The correlation and predictive power of the variables were analyzed using SPSS version 16.0. Results: There was significant male dominance (88.3%), with a boy:girl ratio of 7.5:1. The most common comorbidity noticed was hyperkinetic disorders (45%). The childhood onset group was more predominant (70%). Prevalence of comorbidity was higher in the early onset group (66.7%) than in the late-onset group (33.3%). The family psychopathology, symptom severity, and functional impairment were significantly higher in the childhood onset group. Conclusion: The determinants of symptom profile and severity are early onset (childhood onset CD), the nature and quantity of family psychopathology, the prevalence and type of comorbidity, and the nature of the symptom profile itself. The family psychopathology is positively correlated with the symptom severity and negatively correlated with the functional level of the children with CD. The symptom severity was negatively correlated with the functional level of the child with CD. PMID:25568472

  7. Three dimensional level set based semiautomatic segmentation of atherosclerotic carotid artery wall volume using 3D ultrasound imaging

    NASA Astrophysics Data System (ADS)

    Hossain, Md. Murad; AlMuhanna, Khalid; Zhao, Limin; Lal, Brajesh K.; Sikdar, Siddhartha

    2014-03-01

    3D segmentation of carotid plaque from ultrasound (US) images is challenging due to image artifacts and poor boundary definition. Semiautomatic segmentation algorithms for calculating vessel wall volume (VWV) have been proposed for the common carotid artery (CCA) but they have not been applied on plaques in the internal carotid artery (ICA). In this work, we describe a 3D segmentation algorithm that is robust to shadowing and missing boundaries. Our algorithm uses a distance regularized level set method with edge and region based energy to segment the adventitial wall boundary (AWB) and lumen-intima boundary (LIB) of plaques in the CCA, ICA and external carotid artery (ECA). The algorithm is initialized by manually placing points on the boundary of a subset of transverse slices with an interslice distance of 4 mm. We propose a novel user-defined stopping surface based energy to prevent leaking of the evolving surface across poorly defined boundaries. Validation was performed against manual segmentation using 3D US volumes acquired from five asymptomatic patients with carotid stenosis using a linear 4D probe. A pseudo gold-standard boundary was formed from manual segmentation by three observers. The Dice similarity coefficient (DSC), Hausdorff distance (HD) and modified HD (MHD) were used to compare the algorithm results against the pseudo gold-standard on 1205 cross-sectional slices of five 3D US image sets. The algorithm showed good agreement with the pseudo gold-standard boundary, with a mean DSC of 93.3% (AWB) and 89.82% (LIB); mean MHD of 0.34 mm (AWB) and 0.24 mm (LIB); mean HD of 1.27 mm (AWB) and 0.72 mm (LIB). The proposed 3D semiautomatic segmentation is the first step towards full characterization of 3D plaque progression and longitudinal monitoring.
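
    The Dice similarity coefficient and Hausdorff distance used for validation above are standard overlap and boundary metrics; a minimal NumPy sketch (the tiny masks and point sets are hypothetical, for illustration only):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient 2|A∩B| / (|A|+|B|) of two masks."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def hausdorff(pts_a, pts_b):
    """Symmetric Hausdorff distance between two (N, d) point sets,
    via the full pairwise distance matrix (fine for small contours)."""
    d = np.linalg.norm(pts_a[:, None, :] - pts_b[None, :, :], axis=-1)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

# Hypothetical ground-truth and algorithm masks on a 3x3 slice:
gt = np.array([[1, 1, 0], [1, 1, 0], [0, 0, 0]])
seg = np.array([[1, 1, 0], [1, 0, 0], [0, 0, 0]])
overlap = dice(seg, gt)

# Hypothetical boundary point sets:
contour_a = np.array([[0.0, 0.0], [3.0, 0.0]])
contour_b = np.array([[0.0, 0.0]])
hd = hausdorff(contour_a, contour_b)
```

    The modified HD replaces the inner `max` with a mean over boundary points, which makes the metric less sensitive to single outlier points.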

  8. Evaluation of stopping criteria for level set segmentation of breast masses in contrast-enhanced dedicated breast CT

    NASA Astrophysics Data System (ADS)

    Kuo, H.; Giger, M. L.; Reiser, I.; Boone, J. M.; Lindfors, K. K.; Yang, K.; Edwards, A.

    2012-03-01

    Dedicated breast CT (bCT) is an emerging technology that produces 3D images of the breast, thus allowing radiologists to detect and evaluate breast lesions in 3D. However, assessing potential cancers in the bCT volume can prove time consuming and difficult. Thus, we are developing automated 3D lesion segmentation methods to aid in the interpretation of bCT images. Based on previous studies using a 3D radial-gradient index (RGI) method [1], we are investigating whether active contour segmentation can be applied in 3D to capture additional details of the lesion margin. Our data set includes 40 contrast-enhanced bCT scans. Based on a radiologist-marked lesion center for each mass, an initial RGI contour is obtained that serves as the input to an active contour segmentation method. In this study, active contour level set segmentation, an iterative segmentation technique, is extended to 3D. Three stopping criteria are compared, based on 1) the change of volume (ΔV/V), 2) the mean value of the increased volume at each iteration (dμ/dt), and 3) the changing rate of intensity inside and outside the lesion (Δvw). Lesion segmentation was evaluated by determining the overlap ratio between computer-determined segmentations and manually-drawn lesion outlines. For a given lesion, the overlap ratio was averaged across coronal, sagittal, and axial planes. The average overlap ratios for the three stopping criteria were found to be 0.66 (ΔV/V), 0.68 (dμ/dt), and 0.69 (Δvw).
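
    The first stopping criterion (ΔV/V) amounts to halting the iteration once the segmented volume stabilizes between iterations. A minimal sketch of such a check (the tolerance value is illustrative, not the one used in the study):

```python
def should_stop(volumes, tol=1e-3):
    """ΔV/V criterion: stop once the relative change of segmented
    volume between successive iterations drops below tol."""
    if len(volumes) < 2:
        return False  # need at least two iterations to compare
    v_prev, v_curr = volumes[-2], volumes[-1]
    return abs(v_curr - v_prev) / max(v_prev, 1e-12) < tol
```

    In an evolution loop one would append the current volume after each iteration and break as soon as `should_stop` returns `True`; the dμ/dt and Δvw criteria plug into the same loop with different statistics.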

  9. The application of language-game theory to the analysis of science learning: Developing an interpretive classroom-level learning framework

    NASA Astrophysics Data System (ADS)

    Ahmadibasir, Mohammad

    In this study an interpretive learning framework that aims to measure learning on the classroom level is introduced. In order to develop and evaluate the value of the framework, a theoretical/empirical study is designed. The researcher attempted to illustrate how the proposed framework provides insights on the problem of classroom-level learning. The framework is developed by construction of connections between the current literature on science learning and Wittgenstein's language-game theory. In this framework learning is defined as change of classroom language-game or discourse. In the proposed framework, learning is measured by analysis of classroom discourse. The empirical explanation power of the framework is evaluated by applying the framework in the analysis of learning in a fifth-grade science classroom. The researcher attempted to analyze how students' colloquial discourse changed to a discourse that bears more resemblance to science discourse. The results of the empirical part of the investigation are presented in three parts: first, the gap between what students did and what they were supposed to do was reported. The gap showed that students during the classroom inquiry wanted to do simple comparisons by direct observation, while they were supposed to do tool-assisted observation and procedural manipulation for a complete comparison. Second, it was illustrated that the first attempt to connect the colloquial to science discourse was done by what was immediately intelligible for students and then the teacher negotiated with students in order to help them to connect the old to the new language-game more purposefully. The researcher suggested that these two events in the science classroom are critical in discourse change. Third, it was illustrated that through the academic year, the way that students did the act of comparison was improved and by the end of the year more accurate causal inferences were observable in classroom communication. 
At the end of the

  10. Are Providers More Likely to Contribute to Healthcare Disparities Under High Levels of Cognitive Load? How Features of the Healthcare Setting May Lead to Biases in Medical Decision Making

    PubMed Central

    Burgess, Diana J.

    2014-01-01

    Systematic reviews of healthcare disparities suggest that clinicians' diagnostic and therapeutic decision making varies by clinically irrelevant characteristics, such as patient race, and that this variation may contribute to healthcare disparities. However, there is little understanding of the particular features of the healthcare setting under which clinicians are most likely to be inappropriately influenced by these characteristics. This study delineates several hypotheses to stimulate future research in this area. It is posited that healthcare settings in which providers experience high levels of cognitive load will increase the likelihood of racial disparities via 2 pathways. First, providers who experience higher levels of cognitive load are hypothesized to make poorer medical decisions and provide poorer care for all patients, due to lower levels of controlled processing (H1). Second, under greater levels of cognitive load, it is hypothesized that healthcare providers' medical decisions and interpersonal behaviors will be more likely to be influenced by racial stereotypes, leading to poorer processes and outcomes of care for racial minority patients (H2). It is further hypothesized that certain characteristics of healthcare settings will result in higher levels of cognitive load experienced by providers (H3). Finally, it is hypothesized that minority patients will be disproportionately likely to be treated in healthcare settings in which providers experience greater levels of cognitive load (H4a), which will result in racial disparities due to lower levels of controlled processing by providers (H4b) and the influence of racial stereotypes (H4c). The study concludes with implications for research and practice that flow from this framework. PMID:19726783

  11. Health sector priority setting at meso-level in lower and middle income countries: lessons learned, available options and suggested steps.

    PubMed

    Hipgrave, David B; Alderman, Katarzyna Bolsewicz; Anderson, Ian; Soto, Eliana Jimenez

    2014-02-01

    Setting priorities for health programming and budget allocation is an important issue, but there is little consensus on the related processes. It is particularly relevant in low-resource settings and at province- and district- or "meso-level", where contextual influences may be greater, information scarce and capacity lower. Although recent changes in disease epidemiology and health financing suggest an even greater need to allocate resources effectively, the literature is relatively silent on evidence-based priority-setting in low and middle income countries (LMICs). We conducted a comprehensive review of the peer-reviewed and grey literature on health resource priority-setting in LMICs, focussing on meso-level and the evidence-based priority-setting processes (PSPs) piloted or suggested there. Our objective was to assess PSPs according to whether they have influenced resource allocation and impacted the outcome indicators prioritised. An exhaustive search of the peer-reviewed and grey literature published in the last decade yielded 57 background articles and 75 reports related to priority-setting at meso-level in LMICs. Although proponents of certain PSPs still advocate their use, other experts instead suggest broader elements to guide priority-setting. We conclude that currently no process can be confidently recommended for such settings. We also assessed the common reasons for failure at all levels of priority-setting and concluded further that local authorities should additionally consider contextual and systems limitations likely to prevent a satisfactory process and outcomes, particularly at meso-level. Recent literature proposes a list of related attributes and warning signs, and facilitated our preparation of a simple decision-tree or roadmap to help determine whether or not health systems issues should be improved in parallel to support for needed priority-setting; what elements of the PSP need improving; and monitoring and evaluation. Health priority-setting at

  12. Optical flow 3D segmentation and interpretation: a variational method with active curve evolution and level sets.

    PubMed

    Mitiche, Amar; Sekkati, Hicham

    2006-11-01

    This study investigates a variational, active curve evolution method for dense three-dimensional (3D) segmentation and interpretation of optical flow in an image sequence of a scene containing moving rigid objects viewed by a possibly moving camera. This method jointly performs 3D motion segmentation, 3D interpretation (recovery of 3D structure and motion), and optical flow estimation. The objective functional contains two data terms for each segmentation region, one based on the motion-only equation, which relates the essential parameters of 3D rigid body motion to optical flow, and the other on the Horn and Schunck optical flow constraint. It also contains two regularization terms for each region, one for optical flow, the other for the region boundary. The necessary conditions for a minimum of the functional result in concurrent 3D-motion segmentation, by active curve evolution via level sets, and linear estimation of each region's essential parameters and optical flow. Subsequently, the screw of 3D motion and regularized relative depth are recovered analytically for each region from the estimated essential parameters and optical flow. Examples are provided which verify the method and its implementation. PMID:17063686
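
    The Horn and Schunck constraint mentioned above leads, on its own, to a simple iterative flow estimator. A minimal single-region sketch follows (no segmentation or 3D interpretation, just the classical Horn–Schunck iteration on two hypothetical synthetic frames); this is an illustration of the constraint, not the authors' joint method:

```python
import numpy as np

def horn_schunck(I1, I2, alpha=0.5, n_iter=200):
    """Classical Horn-Schunck optical flow between two grayscale frames.
    alpha weights the smoothness term; larger alpha gives smoother flow."""
    Ix = np.gradient(I1, axis=1)       # spatial derivatives
    Iy = np.gradient(I1, axis=0)
    It = I2 - I1                       # temporal derivative
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    for _ in range(n_iter):
        # 4-neighbour means approximate the smoothness (Laplacian) term
        u_bar = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                 np.roll(u, 1, 1) + np.roll(u, -1, 1)) / 4.0
        v_bar = (np.roll(v, 1, 0) + np.roll(v, -1, 0) +
                 np.roll(v, 1, 1) + np.roll(v, -1, 1)) / 4.0
        num = Ix * u_bar + Iy * v_bar + It
        den = alpha ** 2 + Ix ** 2 + Iy ** 2
        u = u_bar - Ix * num / den
        v = v_bar - Iy * num / den
    return u, v

# Hypothetical test: a Gaussian blob translated one pixel along x.
yy, xx = np.mgrid[0:32, 0:32]
I1 = np.exp(-((xx - 15) ** 2 + (yy - 16) ** 2) / 20.0)
I2 = np.exp(-((xx - 16) ** 2 + (yy - 16) ** 2) / 20.0)
u, v = horn_schunck(I1, I2)
```

    In the blob region the recovered `u` is positive (rightward motion) while `v` stays near zero, matching the imposed translation.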

  13. Developmental Screening Tools: Feasibility of Use at Primary Healthcare Level in Low- and Middle-income Settings

    PubMed Central

    Morris, Jodi; Martines, José

    2014-01-01

    An estimated 150 million children have a disability. Early identification of developmental disabilities is a high priority for the World Health Organization, to allow action to reduce impairments through its mental health Gap Action Programme (mhGAP). The study assessed the feasibility of using developmental screening and monitoring tools for children aged 0-3 year(s) by non-specialist primary healthcare providers in low-resource settings. A systematic review of the literature was conducted to identify the tools, assess their psychometric properties, and examine their feasibility of use in low- and middle-income countries (LMICs). Key indicators to examine feasibility in LMICs were derived from a consultation with 23 international experts. We identified 426 studies, from which 14 tools used in LMICs were extracted for further examination. Three tools reported adequate psychometric properties and met most of the feasibility criteria. These three tools appear promising for use in identifying and monitoring young children with disabilities at the primary healthcare level in LMICs. Further research and development are needed to optimize these tools. PMID:25076668

  14. Using computerised patient-level costing data for setting DRG weights: the Victorian (Australia) cost weight studies.

    PubMed

    Jackson, T

    2001-05-01

    Casemix-funding systems for hospital inpatient care require a set of resource weights which will not inadvertently distort patterns of patient care. Few health systems have very good sources of cost information, and specific studies to derive empirical cost relativities are themselves costly. This paper reports a 5-year program of research into the use of data from hospital management information systems (clinical costing systems) to estimate resource relativities for inpatient hospital care used in Victoria's DRG-based payment system. The paper briefly describes international approaches to cost weight estimation. It describes the architecture of clinical costing systems, and contrasts process and job costing approaches to cost estimation. Techniques of data validation and reliability testing developed in the conduct of four of the first five Victorian Cost Weight Studies (1993-1998) are described. Improvements in sampling, data validity and reliability are documented over the course of the research program, and the advantages of patient-level data are highlighted. The usefulness of these byproduct data for estimation of relative resource weights and other policy applications may be an important factor in hospital and health system decisions to invest in clinical costing technology. PMID:11275303
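
    The basic arithmetic of relative resource weights from patient-level costs is straightforward: the mean cost per DRG divided by the casemix-wide mean cost per episode. A minimal sketch (the DRG codes and costs below are hypothetical, and real weight studies apply trimming and standardization steps not shown here):

```python
from collections import defaultdict

def drg_cost_weights(episodes):
    """Relative cost weights from (drg, cost) patient-level episodes:
    mean cost per DRG divided by the overall mean cost per episode."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for drg, cost in episodes:
        totals[drg] += cost
        counts[drg] += 1
    overall_mean = sum(totals.values()) / sum(counts.values())
    return {d: (totals[d] / counts[d]) / overall_mean for d in totals}

# Hypothetical episodes: (DRG code, observed episode cost).
weights = drg_cost_weights([("F62A", 1000.0), ("F62A", 1400.0),
                            ("G07B", 3200.0), ("G07B", 2400.0)])
```

    Here the overall mean cost is 2000, so the cheaper DRG gets weight 0.6 and the dearer one 1.4; payment per episode is then the weight times a base price.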

  15. A particle-level set-based sharp interface cartesian grid method for impact, penetration, and void collapse

    NASA Astrophysics Data System (ADS)

    Tran, L. B.; Udaykumar, H. S.

    2004-01-01

    An Eulerian, sharp interface, Cartesian grid method is developed for the numerical simulation of the response of materials to impact, shocks and detonations. The mass, momentum, and energy equations are solved along with evolution equations for deviatoric stresses and equivalent plastic strain. These equations are cast in Eulerian conservation law form. The Mie-Grüneisen equation of state is used to obtain pressure, and the material is modeled as a Johnson-Cook solid. The ENO scheme is employed to capture shocks, in combination with a hybrid particle level set technique to evolve sharp immersed boundaries. The numerical technique is able to handle collisions between multiple materials and can accurately compute the dynamics of the immersed boundaries. Results of calculations for axisymmetric Taylor bar impact and penetration of a tungsten rod into a steel plate show good agreement with moving finite element solutions and experimental results. Qualitative agreement with theory is shown for the void collapse phenomenon in an impacted material containing a spherical void.
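
    The level-set transport underlying schemes like the one above can be illustrated with a first-order upwind step in 1D. The paper uses higher-order ENO stencils, but the upwind bias against the flow direction is the shared core idea; this sketch, with a hypothetical V-shaped profile, is purely illustrative:

```python
import numpy as np

def advect_upwind(phi, vel, dx, dt):
    """One first-order upwind step of phi_t + vel * phi_x = 0 on a
    periodic 1D grid: the one-sided difference is chosen against the
    flow direction so information propagates stably."""
    dminus = (phi - np.roll(phi, 1)) / dx      # backward difference
    dplus = (np.roll(phi, -1) - phi) / dx      # forward difference
    dphi = np.where(vel > 0, dminus, dplus)
    return phi - dt * vel * dphi

# A V-shaped level-set profile advected right by 5 cells (20 steps,
# CFL number 0.25); its minimum should end up near index 10.
phi = np.abs(np.arange(20.0) - 5.0)
for _ in range(20):
    phi = advect_upwind(phi, vel=1.0, dx=1.0, dt=0.25)
```

    First-order upwinding smears the kink (numerical diffusion), which is exactly why ENO-type reconstructions are preferred in practice for sharp interfaces.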

  16. Automatic optimal filament segmentation with sub-pixel accuracy using generalized linear models and B-spline level-sets.

    PubMed

    Xiao, Xun; Geyer, Veikko F; Bowne-Anderson, Hugo; Howard, Jonathon; Sbalzarini, Ivo F

    2016-08-01

    Biological filaments, such as actin filaments, microtubules, and cilia, are often imaged using different light-microscopy techniques. Reconstructing the filament curve from the acquired images constitutes the filament segmentation problem. Since filaments have lower dimensionality than the image itself, there is an inherent trade-off between tracing the filament with sub-pixel accuracy and avoiding noise artifacts. Here, we present a globally optimal filament segmentation method based on B-spline vector level-sets and a generalized linear model for the pixel intensity statistics. We show that the resulting optimization problem is convex and can hence be solved with global optimality. We introduce a simple and efficient algorithm to compute such optimal filament segmentations, and provide an open-source implementation as an ImageJ/Fiji plugin. We further derive an information-theoretic lower bound on the filament segmentation error, quantifying how well an algorithm could possibly do given the information in the image. We show that our algorithm asymptotically reaches this bound in the spline coefficients. We validate our method in comprehensive benchmarks, compare with other methods, and show applications from fluorescence, phase-contrast, and dark-field microscopy. PMID:27104582

  17. Correction to ``Extracting Man-Made Objects From High Spatial Resolution Remote Sensing Images via Fast Level Set Evolutions''

    NASA Astrophysics Data System (ADS)

    Li, Zhongbin; Shi, Wenzhong; Wang, Qunming; Miao, Zelang

    2015-10-01

    Object extraction from remote sensing images has long been an intensive research topic in the field of surveying and mapping. Most existing methods are devoted to handling just one type of object, and little attention has been paid to improving the computational efficiency. In recent years, level set evolution (LSE) has been shown to be very promising for object extraction in the community of image processing and computer vision because it can handle topological changes automatically while achieving high accuracy. However, the application of state-of-the-art LSEs is compromised by laborious parameter tuning and expensive computation. In this paper, we propose two fast LSEs for man-made object extraction from high spatial resolution remote sensing images. The traditional mean curvature-based regularization term is replaced by a Gaussian kernel, a replacement that is mathematically sound. Thus a larger time step can be used in the numerical scheme to expedite the proposed LSEs. In contrast to existing methods, the proposed LSEs are significantly faster. Most importantly, they involve far fewer parameters while achieving better performance. The advantages of the proposed LSEs over other state-of-the-art approaches have been verified by a range of experiments.
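
    The idea of replacing curvature regularization with Gaussian smoothing of the level-set function can be sketched compactly: evolve the front, then smooth φ after each step. This is an illustrative reconstruction of the general technique, not the authors' code, and the circular test field and parameter values are hypothetical:

```python
import numpy as np

def gaussian_kernel(sigma):
    """1D normalized Gaussian kernel of width ~3*sigma."""
    r = max(1, int(3 * sigma))
    x = np.arange(-r, r + 1, dtype=float)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def gaussian_smooth(phi, sigma):
    """Separable Gaussian smoothing of phi: the stand-in for the
    mean curvature-based regularization term."""
    k = gaussian_kernel(sigma)
    out = np.apply_along_axis(np.convolve, 0, phi, k, mode="same")
    return np.apply_along_axis(np.convolve, 1, out, k, mode="same")

def evolve(phi, speed, dt=0.5, sigma=1.0, n_iter=10):
    """phi_t = speed * |grad phi|; smoothing after each update keeps
    the front regular and permits the comparatively large time step."""
    for _ in range(n_iter):
        gy, gx = np.gradient(phi)
        phi = phi + dt * speed * np.hypot(gx, gy)
        phi = gaussian_smooth(phi, sigma)
    return phi

# Hypothetical initial front: a circle of radius 8 (inside < 0).
yy, xx = np.mgrid[0:41, 0:41]
phi0 = np.hypot(xx - 20.0, yy - 20.0) - 8.0
phi1 = evolve(phi0, speed=-1.0)
```

    With a negative (outward) speed the region enclosed by the zero level set grows, while the per-step smoothing plays the regularizing role that curvature diffusion would otherwise have.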

  18. Cardiac Multi-detector CT Segmentation Based on Multiscale Directional Edge Detector and 3D Level Set.

    PubMed

    Antunes, Sofia; Esposito, Antonio; Palmisano, Anna; Colantoni, Caterina; Cerutti, Sergio; Rizzo, Giovanna

    2016-05-01

    Extraction of the cardiac surfaces of interest from multi-detector computed tomographic (MDCT) data is a prerequisite step for cardiac analysis, as well as for image guidance procedures. Most existing methods need manual corrections, which is time-consuming. We present a fully automatic segmentation technique for the extraction of the right ventricle, left ventricular endocardium and epicardium from MDCT images. The method consists of a 3D level set surface evolution approach coupled to a new stopping function based on a multiscale directional second derivative Gaussian filter, which is able to stop propagation precisely on the real boundary of the structures of interest. We validated the segmentation method on 18 MDCT volumes from healthy and pathologic subjects using manual segmentation performed by a team of expert radiologists as the gold standard. Segmentation errors were assessed for each structure, resulting in a surface-to-surface mean error below 0.5 mm and a percentage of surface distance with errors less than 1 mm above 80%. Moreover, in comparison to other segmentation approaches proposed in previous work, our method presented improved accuracy (the percentage of surface distance with errors less than 1 mm increased by 8-20% for all structures). The obtained results suggest that our approach is accurate and effective for the segmentation of ventricular cavities and myocardium from MDCT images. PMID:26319010

  19. Mechanical behavior of pathological and normal red blood cells in microvascular flow based on modified level-set method

    NASA Astrophysics Data System (ADS)

    Zhang, XiWen; Ma, FangChao; Hao, PengFei; Yao, ZhaoHui

    2016-01-01

    Research on the motion and deformation of red blood cells (RBCs) is important for revealing the mechanisms of blood diseases. A numerical method has been developed with a level set formulation for an elastic membrane immersed in incompressible fluid. The numerical model satisfies mass and energy conservation without the leaking problems of the classical Immersed Boundary Method (IBM); at the same time, the computational grid can be much coarser than those generally reported in the literature. The motion and deformation of a red blood cell (in both pathological and normal states) in microvascular flow are simulated. It is found that the Reynolds number and the membrane's stiffness play an important role in the transmutation and oscillation of the elastic membrane. The normal biconcave shape of the RBC undergoes larger deformation than the pathological shapes. With reduced viscosity of the interior fluid, both the velocity of the blood and the deformability of the cell are reduced; the same holds with increased viscosity of the plasma. Tank treading of the RBC membrane is observed at sufficiently low viscosity contrast in shear flow. The fixed inclination angle of the tank-treading cell depends on the shear ratio and viscosity contrast, in good agreement with experimental observation.

  20. On the use of the resting potential and level set methods for identifying ischemic heart disease: An inverse problem

    NASA Astrophysics Data System (ADS)

    Nielsen, Bjørn Fredrik; Lysaker, Marius; Tveito, Aslak

    2007-01-01

    The electrical activity in the heart is modeled by a complex, nonlinear, fully coupled system of differential equations. Several scientists have studied how this model, referred to as the bidomain model, can be modified to incorporate the effect of heart infarctions on simulated ECG (electrocardiogram) recordings. We are concerned with the associated inverse problem: how can we use ECG recordings and mathematical models to identify the position, size and shape of heart infarctions? Due to the extreme CPU effort needed to solve the bidomain equations, this model, in its full complexity, is not well suited for this kind of problem. In this paper we show how biological knowledge about the resting potential in the heart and level set techniques can be combined to derive a suitable stationary model, expressed in terms of an elliptic PDE, for such applications. This approach leads to a nonlinear ill-posed minimization problem, which we propose to regularize and solve with a simple iterative scheme. Finally, our theoretical findings are illuminated through a series of computer simulations for an experimental setup involving a realistic heart-in-torso geometry. More specifically, experiments with synthetic ECG recordings, produced by solving the bidomain model, indicate that our method manages to identify the physical characteristics of the ischemic region(s) in the heart. Furthermore, the ill-posed nature of this inverse problem is explored through several quantitative assessments of our scheme.
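    The regularize-and-iterate strategy described above can be illustrated in miniature with a linear analogue: Landweber iteration on a smoothing forward operator, where early stopping plays the role of the regularization. The operator, noise level, and iteration count below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
A = np.tril(np.ones((n, n))) / n                 # smoothing (ill-conditioned) forward map
x_true = np.zeros(n)
x_true[15:30] = 1.0                              # indicator of a hypothetical ischemic region
y = A @ x_true + 1e-3 * rng.standard_normal(n)   # noisy "recordings"

# Landweber iteration: x_{k+1} = x_k + tau * A^T (y - A x_k);
# stopping after a finite number of steps regularizes the ill-posed problem
tau = 1.0 / np.linalg.norm(A, 2) ** 2
x = np.zeros(n)
for _ in range(200):
    x = x + tau * A.T @ (y - A @ x)
```

    Running the loop much longer would start fitting the noise; the stopping index is the regularization parameter here.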

  1. A hybrid smoothed extended finite element/level set method for modeling equilibrium shapes of nano-inhomogeneities

    NASA Astrophysics Data System (ADS)

    Zhao, Xujun; Bordas, Stéphane P. A.; Qu, Jianmin

    2013-12-01

    Interfacial energy plays an important role in equilibrium morphologies of nanosized microstructures of solid materials due to the high interface-to-volume ratio, and can no longer be neglected as it does in conventional mechanics analysis. When designing nanodevices and to understand the behavior of materials at the nano-scale, this interfacial energy must therefore be taken into account. The present work develops an effective numerical approach by means of a hybrid smoothed extended finite element/level set method to model nanoscale inhomogeneities with interfacial energy effect, in which the finite element mesh can be completely independent of the interface geometry. The Gurtin-Murdoch surface elasticity model is used to account for the interface stress effect and the Wachspress interpolants are used for the first time to construct the shape functions in the smoothed extended finite element method. Selected numerical results are presented to study the accuracy and efficiency of the proposed method as well as the equilibrium shapes of misfit particles in elastic solids. The presented results compare very well with those obtained from theoretical solutions and experimental observations, and the computational efficiency of the method is shown to be superior to that of its most advanced competitor.

  2. A Research Study Using the Delphi Method to Define Essential Competencies for a High School Game Art and Design Course Framework at the National Level

    ERIC Educational Resources Information Center

    Mack, Nayo Corenus-Geneva

    2011-01-01

    This research study reports the findings of a Delphi study conducted to determine the essential competencies and objectives for a high school Game Art and Design course framework at the national level. The Delphi panel consisted of gaming, industry and educational experts from all over the world who were members of the International Game…

  3. Joint optimization of segmentation and shape prior from level-set-based statistical shape model, and its application to the automated segmentation of abdominal organs.

    PubMed

    Saito, Atsushi; Nawano, Shigeru; Shimizu, Akinobu

    2016-02-01

    The goal of this study is to provide a theoretical framework for accurately optimizing the segmentation energy considering all of the possible shapes generated from the level-set-based statistical shape model (SSM). The proposed algorithm solves the well-known open problem in which a shape prior may not be optimal in terms of an objective functional that needs to be minimized during segmentation. The algorithm allows the selection of an optimal shape prior from among all possible shapes generated from an SSM by conducting a branch-and-bound search over an eigenshape space. The proposed algorithm does not require predefined shape templates or the construction of a hierarchical clustering tree before graph-cut segmentation. It jointly optimizes an objective functional in terms of both the shape prior and segmentation labeling, and finds an optimal solution by considering all possible shapes generated from an SSM. We apply the proposed algorithm to both pancreas and spleen segmentation using multiphase computed tomography volumes, and we compare the results obtained with those produced by a conventional algorithm employing a branch-and-bound search over a search tree of predefined shapes, which were sampled discretely from an SSM. The proposed algorithm significantly improves the segmentation performance in terms of the Jaccard index and Dice similarity index. In addition, we compare the results with the state-of-the-art multiple abdominal organ segmentation algorithm, and confirm that the performances of the two algorithms are comparable. We discuss the high computational efficiency of the proposed algorithm, which was determined experimentally using a normalized number of traversed nodes in a search tree, and the extensibility of the proposed algorithm to other SSMs or energy functionals. PMID:26716720
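    The essential mechanism, a branch-and-bound search that prunes regions of the shape-coefficient space which provably cannot beat the incumbent, can be sketched in one dimension. The objective and Lipschitz bound below are toy stand-ins for a segmentation energy over a single eigenshape coefficient, not the paper's functional:

```python
def branch_and_bound(f, lo, hi, lipschitz, tol=1e-4):
    """Minimize f on [lo, hi]: split intervals, prune via a Lipschitz lower bound."""
    best_val, best_x = min((f(lo), lo), (f(hi), hi))
    stack = [(lo, hi)]
    while stack:
        a, b = stack.pop()
        m = (a + b) / 2
        fm = f(m)
        if fm < best_val:
            best_val, best_x = fm, m
        # lower bound on f over [a, b]; split only if the interval could
        # still contain a solution better than the incumbent
        if fm - lipschitz * (b - a) / 2 < best_val - tol:
            stack += [(a, m), (m, b)]
    return best_x, best_val

# toy "energy" over one shape coefficient, minimized at c = 0.7
c_opt, e_opt = branch_and_bound(lambda c: (c - 0.7) ** 2, -2.0, 2.0, lipschitz=6.0)
```

    The returned value is within tol of the global minimum, mirroring the global-optimality property the algorithm obtains over its (much larger) eigenshape space.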

  4. Assessment of the Coastal Landscape Response to Sea-Level Rise Using a Decision-Support Framework

    NASA Astrophysics Data System (ADS)

    Lentz, E. E.; Thieler, E. R.; Plant, N. G.; Stippa, S.; Horton, R. M.; Gesch, D. B.

    2014-12-01

    Identifying the form and nature of coastal landscape changes that may occur in response to future sea-level rise (SLR) is essential to support decision making for resource allocation that improves climate change resilience. Both natural ecosystems and the built environment are subject to these changes and require associated resilience assessments. Existing assessments of coastal change driven by SLR typically focus on two categories of coastal response: 1) inundation by flooding as the water level rises; and 2) dynamic change resulting from movement of landforms and/or ecosystems. Results from these assessments are not always straightforward to apply in a decision support context, as it can be unclear what the dominant response type may be in a given coastal setting (e.g., barrier island, headland, wetland, forest). Furthermore, an important decision support element is to capture and clearly convey the associated uncertainty of both the underlying datasets (e.g., elevation) and climate drivers (e.g., relative SLR). We developed a Bayesian network model of SLR assessment that uses publicly available geospatial datasets—land cover, elevation, and vertical land movement—and their associated uncertainties to generate probabilistic predictions of those areas likely to inundate versus dynamically respond to various SLR scenarios. SLR projections were generated using multiple sources of information, including Coupled Model Intercomparison Project Phase 5 (CMIP5) models. Model outputs include predictions of potential future land-surface elevation and coastal response type at landscape (>100 km) to local (5-10 km) scales for the Northeastern U.S., commensurate with decision-making needs. The probabilistic approach allows us to objectively and transparently describe prediction certainty to decision makers. From this approach, we are also able to highlight areas in which more data or knowledge may be needed to provide a stronger basis for decision making.
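    The probabilistic machinery can be illustrated with a deliberately tiny network: one uncertain elevation input and one conditional probability table for coastal response. All numbers are invented for illustration and bear no relation to the model's calibrated values:

```python
import numpy as np

# P(elevation bin): low, mid, high -- encodes elevation-data uncertainty
p_elev = np.array([0.2, 0.5, 0.3])

# P(inundate | elevation bin, SLR scenario); rows = elevation bin,
# columns = (moderate SLR, high SLR); 1 - P gives "dynamic response"
p_inundate = np.array([[0.90, 0.95],
                       [0.50, 0.70],
                       [0.10, 0.30]])

def prob_inundate(scenario):
    """Marginalize elevation uncertainty out of the conditional table."""
    return float(p_elev @ p_inundate[:, scenario])

moderate, high = prob_inundate(0), prob_inundate(1)
```

    The output is a probability rather than a yes/no label, which is what allows prediction certainty to be conveyed transparently to decision makers.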

  5. From papers to practices: district level priority setting processes and criteria for family planning, maternal, newborn and child health interventions in Tanzania

    PubMed Central

    2011-01-01

    Background Successful priority setting is increasingly known to be an important aspect in achieving better family planning, maternal, newborn and child health (FMNCH) outcomes in developing countries. However, far too little attention has been paid to capturing and analysing the priority setting processes and criteria for FMNCH at district level. This paper seeks to capture and analyse the priority setting processes and criteria for FMNCH at district level in Tanzania. Specifically, we assess the FMNCH actors' engagement and understanding, the criteria used in decision making and the way criteria are identified, and the information or evidence and tools used to prioritize FMNCH interventions at district level in Tanzania. Methods We conducted an exploratory study mixing both qualitative and quantitative methods to capture and analyse the priority setting for FMNCH at district level, and identify the criteria for priority setting. We purposively sampled the participants to be included in the study. We collected the data using the nominal group technique (NGT), in-depth interviews (IDIs) with key informants and documentary review. We analysed the collected data using both content analysis for qualitative data and correlation analysis for quantitative data. Results We found a number of shortfalls in the district's priority setting processes and criteria which may lead to inefficient and unfair priority setting decisions in FMNCH. In addition, participants identified the priority setting criteria and established the perceived relative importance of the identified criteria. However, we noted that differences exist in judging the relative importance attached to the criteria by different stakeholders in the districts. Conclusions In Tanzania, FMNCH contents in both general development policies and sector policies are well articulated. However, the current priority setting processes for FMNCH at district level are wanting in several respects, rendering the priority setting process for…

  6. The Resilience Activation Framework: A conceptual model of how access to social resources promotes adaptation and rapid recovery in post-disaster settings

    PubMed Central

    Abramson, David M.; Grattan, Lynn M.; Mayer, Brian; Colten, Craig E.; Arosemena, Farah A.; Rung, Ariane; Lichtveld, Maureen

    2014-01-01

    A number of governmental agencies have called for enhancing citizen’s resilience as a means of preparing populations in advance of disasters, and as a counter-balance to social and individual vulnerabilities. This increasing scholarly, policy and programmatic interest in promoting individual and communal resilience presents a challenge to the research and practice communities: to develop a translational framework that can accommodate multi-disciplinary scientific perspectives into a single, applied model. The Resilience Activation Framework provides a basis for testing how access to social resources, such as formal and informal social support and help, promotes positive adaptation or reduced psychopathology among individuals and communities exposed to the acute collective stressors associated with disasters, whether manmade, natural, or technological in origin. Articulating the mechanisms by which access to social resources activate and sustain resilience capacities for optimal mental health outcomes post-disaster can lead to the development of effective preventive and early intervention programs. PMID:24870399

  7. A Simple Model Framework to Explore the Deeply Uncertain, Local Sea Level Response to Climate Change. A Case Study on New Orleans, Louisiana

    NASA Astrophysics Data System (ADS)

    Bakker, Alexander; Louchard, Domitille; Keller, Klaus

    2016-04-01

    Sea-level rise threatens many coastal areas around the world. The integrated assessment of potential adaptation and mitigation strategies requires a sound understanding of the upper tails and the major drivers of the uncertainties. Global warming causes sea-level to rise, primarily due to thermal expansion of the oceans and mass loss of the major ice sheets, smaller ice caps and glaciers. These components show distinctly different responses to temperature changes with respect to response time, threshold behavior, and local fingerprints. Projections of these different components are deeply uncertain. Projected uncertainty ranges strongly depend on (necessary) pragmatic choices and assumptions; e.g. on the applied climate scenarios, which processes to include and how to parameterize them, and on error structure of the observations. Competing assumptions are very hard to objectively weigh. Hence, uncertainties of sea-level response are hard to grasp in a single distribution function. The deep uncertainty can be better understood by making clear the key assumptions. Here we demonstrate this approach using a relatively simple model framework. We present a mechanistically motivated, but simple model framework that is intended to efficiently explore the deeply uncertain sea-level response to anthropogenic climate change. The model consists of 'building blocks' that represent the major components of sea-level response and its uncertainties, including threshold behavior. The framework's simplicity enables the simulation of large ensembles allowing for an efficient exploration of parameter uncertainty and for the simulation of multiple combined adaptation and mitigation strategies. The model framework can skilfully reproduce earlier major sea level assessments, but due to the modular setup it can also be easily utilized to explore high-end scenarios and the effect of competing assumptions and parameterizations.
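    The building-block idea can be sketched as a sum of component responses with sampled, deeply uncertain parameters, including a simple ice-sheet threshold. The component forms and parameter ranges below are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10_000                       # ensemble size

def thermal(T, a):               # thermal expansion: quasi-linear in warming
    return a * T

def glaciers(T, b):              # glaciers and small ice caps: saturating response
    return b * np.sqrt(np.maximum(T, 0.0))

def ice_sheet(T, c, T_crit):     # major ice sheet: contributes only past a threshold
    return np.where(T > T_crit, c * (T - T_crit), 0.0)

T = 3.0                          # warming (deg C) for one scenario
a = rng.uniform(0.10, 0.30, N)   # illustrative, uncalibrated priors
b = rng.uniform(0.05, 0.15, N)
c = rng.uniform(0.00, 0.50, N)
T_crit = rng.uniform(1.5, 4.0, N)

slr = thermal(T, a) + glaciers(T, b) + ice_sheet(T, c, T_crit)
p5, p95 = np.percentile(slr, [5, 95])
```

    Because each ensemble member is cheap, competing assumptions can be swapped in block by block and rerun, which is the exploration of deep uncertainty described above.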

  8. Gender Mainstreaming in Education at the Level of Field Operations: The Case of CARE USA's Indicator Framework

    ERIC Educational Resources Information Center

    Miske, Shirley; Meagher, Margaret; DeJaeghere, Joan

    2010-01-01

    Following the adoption of gender mainstreaming at the Beijing Conference for Women in 1995 as a major strategy to promote gender equality and the recognition of gender analysis as central to this process, Gender and Development (GAD) frameworks have provided tools for gender analysis in various sectors. Gender mainstreaming in basic education has…

  9. Self-Compassion: A Mentorship Framework for Counselor Educator Mothers

    ERIC Educational Resources Information Center

    Solomon, Coralis; Barden, Sejal Mehta

    2016-01-01

    Counselor educators experience high levels of stress. Mothers in academia face an additional set of emotional stressors. The authors offer a self-compassion framework for mentors to increase emotional resilience of mothers in counselor education.

  10. The Development of a Resource for Physically Active School Settings

    ERIC Educational Resources Information Center

    Bradley, Vicki R.; O'Connor, Justen P.

    2009-01-01

    This project describes the development of a resource designed to facilitate the exploration of factors influencing physical activity within school settings across multiple levels. Using a socio-ecological framework, the study draws upon factors across three domains that potentially impact physical activity levels within school settings: The…

  11. The relative noise levels of parallel axis gear sets with various contact ratios and gear tooth forms

    NASA Technical Reports Server (NTRS)

    Drago, Raymond J.; Lenski, Joseph W., Jr.; Spencer, Robert H.; Valco, Mark; Oswald, Fred B.

    1993-01-01

    The real noise reduction benefit that may be obtained through the use of one gear tooth form as compared to another is an important design parameter for any geared system, especially for helicopters, in which both weight and reliability are very important factors. This paper describes the design and testing of nine sets of gears that are as identical as possible except for their basic tooth geometry. Noise measurements were made at various combinations of load and speed for each gear set so that direct comparisons could be made. The resulting data were analyzed so that valid conclusions could be drawn and interpreted for design use.

  12. Robust Systems Test Framework

    SciTech Connect

    Ballance, Robert A.

    2003-01-01

    The Robust Systems Test Framework (RSTF) provides a means of specifying and running test programs on various computation platforms. RSTF provides a level of specification above standard scripting languages. During a set of runs, standard timing information is collected. The RSTF specification can also gather job-specific information, and can include ways to classify test outcomes. All results and scripts can be stored into and retrieved from an SQL database for later data analysis. RSTF also provides operations for managing the script and result files, and for compiling applications and gathering compilation information such as optimization flags.
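    The description above (scripted test runs, timing collection, outcome classification, and SQL storage) can be sketched in a few lines. This is a hypothetical miniature of the idea, not RSTF's actual interface:

```python
import sqlite3
import time

def run_tests(tests, db_path=":memory:"):
    """Run named test callables, recording wall-clock time and a
    classified outcome in an SQL table for later analysis."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS results "
                "(name TEXT, seconds REAL, outcome TEXT)")
    for name, fn in tests.items():
        t0 = time.perf_counter()
        try:
            fn()
            outcome = "pass"
        except Exception:
            outcome = "fail"
        con.execute("INSERT INTO results VALUES (?, ?, ?)",
                    (name, time.perf_counter() - t0, outcome))
    con.commit()
    return con

con = run_tests({"ok": lambda: None, "boom": lambda: 1 / 0})
rows = con.execute("SELECT name, outcome FROM results ORDER BY name").fetchall()
```

    Storing results in SQL, as RSTF does, makes later cross-platform comparisons a simple query rather than a log-scraping exercise.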

  13. Robust Systems Test Framework

    Energy Science and Technology Software Center (ESTSC)

    2003-01-01

    The Robust Systems Test Framework (RSTF) provides a means of specifying and running test programs on various computation platforms. RSTF provides a level of specification above standard scripting languages. During a set of runs, standard timing information is collected. The RSTF specification can also gather job-specific information, and can include ways to classify test outcomes. All results and scripts can be stored into and retrieved from an SQL database for later data analysis. RSTF also provides operations for managing the script and result files, and for compiling applications and gathering compilation information such as optimization flags.

  14. Pedometer-Based Physical Activity Level and Body Composition among Minority Children in a Physical Activity Setting

    ERIC Educational Resources Information Center

    Agbuga, Bulent

    2011-01-01

    Most studies focusing on the relationship between physical activity and obesity have been conducted in middle class Caucasian adults and children and few such studies are available concerning minority children in physical activity settings (Johnson, Kulinna, Tudor-Locke, Darst, & Pangrazi, 2007; Rowlands et al., 1999; Tudor-Locke, Lee, Morgan,…

  15. Critical Skill Sets of Entry-Level IT Professionals: An Empirical Examination of Perceptions from Field Personnel

    ERIC Educational Resources Information Center

    McMurtrey, Mark E.; Downey, James P.; Zeltmann, Steven M.; Friedman, William H.

    2008-01-01

    Understanding the skill sets required of IT personnel is a critical endeavor for both business organizations and academic or training institutions. Companies spend crucial resources training personnel, particularly new IT employees, and educational institutions must know what skills are essential in order to plan an effective curriculum. Rapid…

  16. End of FY10 report - used fuel disposition technical bases and lessons learned : legal and regulatory framework for high-level waste disposition in the United States.

    SciTech Connect

    Weiner, Ruth F.; Blink, James A.; Rechard, Robert Paul; Perry, Frank; Jenkins-Smith, Hank C.; Carter, Joe; Nutt, Mark; Cotton, Tom

    2010-09-01

    This report examines the current policy, legal, and regulatory framework pertaining to used nuclear fuel and high level waste management in the United States. The goal is to identify potential changes that if made could add flexibility and possibly improve the chances of successfully implementing technical aspects of a nuclear waste policy. Experience suggests that the regulatory framework should be established prior to initiating future repository development. Concerning specifics of the regulatory framework, reasonable expectation as the standard of proof was successfully implemented and could be retained in the future; yet, the current classification system for radioactive waste, including hazardous constituents, warrants reexamination. Whether or not multiple sites are considered simultaneously in the future, inclusion of mechanisms such as deliberate use of performance assessment to manage site characterization would be wise. Because of experience gained here and abroad, diversity of geologic media is not particularly necessary as a criterion in site selection guidelines for multiple sites. Stepwise development of the repository program that includes flexibility also warrants serious consideration. Furthermore, integration of the waste management system across storage, transportation, and disposition should be examined and would be facilitated by integration of the legal and regulatory framework. Finally, in order to enhance acceptability of future repository development, the national policy should be cognizant of those policy and technical attributes that enhance initial acceptance, and those policy and technical attributes that maintain and broaden credibility.

  17. Infodemiology and Infoveillance: Framework for an Emerging Set of Public Health Informatics Methods to Analyze Search, Communication and Publication Behavior on the Internet

    PubMed Central

    2009-01-01

    Infodemiology can be defined as the science of distribution and determinants of information in an electronic medium, specifically the Internet, or in a population, with the ultimate aim to inform public health and public policy. Infodemiology data can be collected and analyzed in near real time. Examples for infodemiology applications include: the analysis of queries from Internet search engines to predict disease outbreaks (eg. influenza); monitoring peoples' status updates on microblogs such as Twitter for syndromic surveillance; detecting and quantifying disparities in health information availability; identifying and monitoring of public health relevant publications on the Internet (eg. anti-vaccination sites, but also news articles or expert-curated outbreak reports); automated tools to measure information diffusion and knowledge translation, and tracking the effectiveness of health marketing campaigns. Moreover, analyzing how people search and navigate the Internet for health-related information, as well as how they communicate and share this information, can provide valuable insights into health-related behavior of populations. Seven years after the infodemiology concept was first introduced, this paper revisits the emerging fields of infodemiology and infoveillance and proposes an expanded framework, introducing some basic metrics such as information prevalence, concept occurrence ratios, and information incidence. The framework distinguishes supply-based applications (analyzing what is being published on the Internet, eg. on Web sites, newsgroups, blogs, microblogs and social media) from demand-based methods (search and navigation behavior), and further distinguishes passive from active infoveillance methods. Infodemiology metrics follow population health relevant events or predict them. Thus, these metrics and methods are potentially useful for public health practice and research, and should be further developed and standardized. PMID:19329408
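    One of the proposed metrics, the concept occurrence ratio, lends itself to a compact sketch: count how often a concept term appears in a corpus relative to a reference concept. The tokenization and the toy corpus below are simplified assumptions:

```python
def concept_occurrence_ratio(documents, concept, reference):
    """Occurrences of `concept` per occurrence of `reference`
    across a set of documents (naive whitespace tokenization)."""
    c = sum(doc.lower().split().count(concept) for doc in documents)
    r = sum(doc.lower().split().count(reference) for doc in documents)
    return c / r if r else float("inf")

posts = ["flu shot flu season",
         "got my flu vaccine",
         "vaccine appointment today"]
ratio = concept_occurrence_ratio(posts, "flu", "vaccine")
```

    Tracking such a ratio over time in supply-based sources (web pages, microblogs) or demand-based ones (search logs) is the infoveillance use case described above.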

  18. Infodemiology and infoveillance: framework for an emerging set of public health informatics methods to analyze search, communication and publication behavior on the Internet.

    PubMed

    Eysenbach, Gunther

    2009-01-01

    Infodemiology can be defined as the science of distribution and determinants of information in an electronic medium, specifically the Internet, or in a population, with the ultimate aim to inform public health and public policy. Infodemiology data can be collected and analyzed in near real time. Examples for infodemiology applications include the analysis of queries from Internet search engines to predict disease outbreaks (eg. influenza), monitoring peoples' status updates on microblogs such as Twitter for syndromic surveillance, detecting and quantifying disparities in health information availability, identifying and monitoring of public health relevant publications on the Internet (eg. anti-vaccination sites, but also news articles or expert-curated outbreak reports), automated tools to measure information diffusion and knowledge translation, and tracking the effectiveness of health marketing campaigns. Moreover, analyzing how people search and navigate the Internet for health-related information, as well as how they communicate and share this information, can provide valuable insights into health-related behavior of populations. Seven years after the infodemiology concept was first introduced, this paper revisits the emerging fields of infodemiology and infoveillance and proposes an expanded framework, introducing some basic metrics such as information prevalence, concept occurrence ratios, and information incidence. The framework distinguishes supply-based applications (analyzing what is being published on the Internet, eg. on Web sites, newsgroups, blogs, microblogs and social media) from demand-based methods (search and navigation behavior), and further distinguishes passive from active infoveillance methods. Infodemiology metrics follow population health relevant events or predict them. Thus, these metrics and methods are potentially useful for public health practice and research, and should be further developed and standardized. PMID:19329408

  19. Critical Review: Building on the HIV Cascade: A Complementary "HIV States and Transitions" Framework for Describing HIV Diagnosis, Care, and Treatment at the Population Level.

    PubMed

    Powers, Kimberly A; Miller, William C

    2015-07-01

    The HIV cascade--often referred to as "the HIV continuum"--provides a valuable framework for population-level representations of engagement with the HIV healthcare system. The importance and appeal of this framework are evidenced by a large body of scientific literature, as well as by the adoption of cascade-related indicators by medical and public health organizations worldwide. Despite its centrality in the fields of HIV treatment and prevention, however, the traditional cascade provides limited description of the processes affecting the numbers it represents. Representations that describe these processes and capture the dynamic nature of HIV-infected persons' pathways through the healthcare system are essential for monitoring and predicting intervention effects and epidemic trends. We propose here a complementary schema--termed the "HIV States and Transitions" framework--designed to maintain key strengths of the traditional cascade while addressing key limitations and more fully describing the dynamic aspects of HIV testing, care, and treatment at the population level. PMID:25835604
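    The "states and transitions" view is naturally expressed as a discrete-time Markov model over care states; the cross-sectional cascade is then just the population distribution at a given time. The states and monthly transition probabilities below are purely illustrative:

```python
import numpy as np

# states: undiagnosed, diagnosed, in care, on ART, virally suppressed
P = np.array([
    [0.97, 0.03, 0.00, 0.00, 0.00],
    [0.00, 0.90, 0.10, 0.00, 0.00],
    [0.00, 0.05, 0.80, 0.15, 0.00],   # includes falling back to "diagnosed"
    [0.00, 0.00, 0.05, 0.75, 0.20],
    [0.00, 0.00, 0.00, 0.05, 0.95],   # suppression can be lost
])

dist = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # everyone starts undiagnosed
for _ in range(120):                        # ten years of monthly steps
    dist = dist @ P
```

    Unlike static cascade bars, the matrix makes backward transitions (falling out of care, losing suppression) explicit, which is one of the dynamic aspects the framework aims to capture.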

  20. The Simple View of Reading as a Framework for National Literacy Initiatives: A Hierarchical Model of Pupil-Level and Classroom-Level Factors

    ERIC Educational Resources Information Center

    Savage, Robert; Burgos, Giovani; Wood, Eileen; Piquette, Noella

    2015-01-01

    The Simple View of Reading (SVR) describes Reading Comprehension as the product of distinct child-level variance in decoding (D) and linguistic comprehension (LC) component abilities. When used as a model for educational policy, distinct classroom-level influences of each of the components of the SVR model have been assumed, but have not yet been…

  1. Integrating Frequency-Based Mathematics Instruction with a Multi-Level Assessment System to Enhance Response to Intervention Frameworks

    ERIC Educational Resources Information Center

    Moors, Alison; Weisenburgh-Snyder, Amy; Robbins, Joanne

    2010-01-01

    The American government set new standards mandating States to demonstrate adequate yearly progress for all students with the inception of the No Child Left Behind Act. To be eligible for the more recent Race to the Top funds, states must show, in part, a commitment to "building data systems that measure student growth and success, and inform…

  2. European Qualifications Framework and the Comparison of Academically-Oriented and Professionally-Oriented Master's Degrees

    ERIC Educational Resources Information Center

    Isopahkala-Bouret, Ulpukka; Rantanen, Teemu; Raij, Katariina; Järveläinen, Eeva

    2011-01-01

    Under the moderating influence of the European Qualifications Framework (EQF), European higher education has for some time been moving in a competence-oriented direction in some educational systems. The EQF is a competence framework that defines standards for all levels of qualification. The framework is set up to facilitate comparison of…

  3. A New Kernel-Based Fuzzy Level Set Method for Automated Segmentation of Medical Images in the Presence of Intensity Inhomogeneity

    PubMed Central

    Shanbehzadeh, Jamshid

    2014-01-01

    Researchers have recently applied integrative approaches to automated medical image segmentation, combining the benefits of available methods while eliminating their disadvantages. Intensity inhomogeneity is a challenging and open problem in this area that has received comparatively little attention from such approaches, yet it considerably affects segmentation accuracy. This paper proposes a new kernel-based fuzzy level set algorithm that takes an integrative approach to this problem. The level set can evolve directly from the initial level set obtained by Gaussian Kernel-Based Fuzzy C-Means (GKFCM), and the controlling parameters of the level set evolution are also estimated from the GKFCM results. Moreover, the proposed algorithm is enhanced with locally regularized evolution based on an image model that describes the composition of real-world images, in which intensity inhomogeneity is treated as a component of the image. These improvements make level set manipulation easier and lead to more robust segmentation under intensity inhomogeneity. The proposed algorithm offers valuable benefits, including automation, invariance to intensity inhomogeneity, and high accuracy. Performance evaluation was carried out on medical images from different modalities; the results confirm the algorithm's effectiveness for medical image segmentation. PMID:24624225
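
    As a rough illustration of the pipeline described above (not the authors' implementation), the sketch below initializes a level set from two-class Gaussian-kernel fuzzy c-means memberships and then smooths it with a simple heat-flow regularizer standing in for the curvature term of a full level-set evolution. The toy image, cluster centres, and all parameter values are illustrative assumptions.

```python
import numpy as np

def gk_fcm(img, c1, c2, sigma=1.0, iters=20):
    """Two-class Gaussian-kernel fuzzy c-means on a grayscale image.
    Returns the membership map of class 2 (foreground)."""
    x = img.astype(float)
    for _ in range(iters):
        # kernel-induced distances to the two cluster centres
        k1 = np.exp(-(x - c1) ** 2 / (2 * sigma ** 2))
        k2 = np.exp(-(x - c2) ** 2 / (2 * sigma ** 2))
        d1, d2 = 1 - k1, 1 - k2
        u2 = d1 / (d1 + d2 + 1e-12)        # fuzzy membership of class 2
        u1 = 1 - u2
        # centre updates with kernel weights (fuzzifier m = 2)
        c1 = np.sum(u1**2 * k1 * x) / (np.sum(u1**2 * k1) + 1e-12)
        c2 = np.sum(u2**2 * k2 * x) / (np.sum(u2**2 * k2) + 1e-12)
    return u2

def evolve(phi, steps=50, dt=0.1):
    """Very simple level-set smoothing: explicit heat flow on phi,
    standing in for the curvature-regularized evolution."""
    for _ in range(steps):
        lap = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
               np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4 * phi)
        phi = phi + dt * lap
    return phi

# toy image: bright square on a dark, slightly inhomogeneous background
img = np.zeros((32, 32)) + np.linspace(0.0, 0.2, 32)   # intensity ramp
img[8:24, 8:24] += 1.0
u_fg = gk_fcm(img, c1=0.1, c2=1.0)
phi = evolve(u_fg - 0.5)        # zero level marks the boundary
seg = phi > 0
```

    Thresholding the evolved function at its zero level yields the segmentation; in the full algorithm the GKFCM results would also supply the evolution parameters rather than the fixed constants used here.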

  4. Characteristics of Students Receiving Special Education Services in a Central Minnesota School District According to Setting, Classification, and Level of Service.

    ERIC Educational Resources Information Center

    Ittenbach, Richard F.; And Others

    The records of 1,231 preschool, elementary, and secondary students receiving special education services in a central Minnesota school district were evaluated to provide information on differences according to setting, classification, and level of service. Data were analyzed within the context of four broad domains: demographics (age, race, gender,…

  5. Entry-Level Athletic Trainers' Self-Confidence in Clinical Skill Preparedness for Treating Athletic and Emergent Settings Populations

    ERIC Educational Resources Information Center

    Morin, Gary E.; Misasi, Sharon; Davis, Charles; Hannah, Corey; Rothbard, Matthew

    2014-01-01

    Context: Clinical education is an important component of athletic training education. Concern exists regarding whether clinical experience adequately prepares students to perform professional skills after graduation, particularly with patients in emerging settings. Objective: To determine the confidence levels of athletic training graduates in…

  6. A Ghost Fluid/Level Set Method for boiling flows and liquid evaporation: Application to the Leidenfrost effect

    NASA Astrophysics Data System (ADS)

    Rueda Villegas, Lucia; Alis, Romain; Lepilliez, Mathieu; Tanguy, Sébastien

    2016-07-01

    The development of numerical methods for the direct numerical simulation of two-phase flows with phase change, in the framework of interface-capturing or interface-tracking methods, is the main topic of this study. We propose a novel numerical method that allows dealing with both evaporation and boiling at the interface between a liquid and a gas. Indeed, in some specific situations involving very heterogeneous thermodynamic conditions at the interface, the distinction between boiling and evaporation is not always possible. For instance, this can occur for a Leidenfrost droplet: a water drop levitating above a hot plate whose temperature is much higher than the boiling temperature. In this case, boiling occurs in the film of saturated vapor entrapped between the bottom of the drop and the plate, whereas the top of the water droplet evaporates in contact with ambient air. The situation can also be ambiguous for a superheated droplet, or at the contact line between a liquid and a hot wall whose temperature is higher than the saturation temperature of the liquid. In these situations, the interface temperature can locally reach the saturation temperature (boiling point), for instance near a contact line, while remaining cooler elsewhere. Thus, boiling and evaporation can occur simultaneously on different regions of the same liquid interface, or occur successively at different times in the history of an evaporating droplet. Standard numerical methods are not able to perform computations in these transient regimes; therefore, we propose in this paper a novel numerical method to achieve this challenging task. Finally, we present several accuracy validations against theoretical solutions and experimental results to strengthen the relevance of this new method.
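
    The key point, that boiling and evaporation can coexist on the same liquid interface, can be illustrated with a trivial hedged sketch (made-up temperature samples, water at atmospheric pressure): each discretised interface sample is flagged as boiling where the local temperature reaches saturation, and as evaporating elsewhere.

```python
# Minimal illustration of the regime distinction described above.
# The temperature samples are invented; t_sat is water at 1 atm.
t_sat = 373.15                               # saturation temperature (K)
interface_temp = [330.0, 371.9, 373.15, 374.2, 373.15, 352.6]
regime = ["boiling" if t >= t_sat else "evaporation" for t in interface_temp]
n_boiling = regime.count("boiling")          # regions needing the boiling model
```

    A unified method must switch between (or blend) the two mass-transfer closures per interface region, which is what makes the transient regimes hard for standard schemes.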

  7. "Notice the Similarities between the Two Sets …": Imperative Usage in a Corpus of Upper-Level Student Papers

    ERIC Educational Resources Information Center

    Neiderhiser, Justine A.; Kelley, Patrick; Kennedy, Kohlee M.; Swales, John M.; Vergaro, Carla

    2016-01-01

    The sparse literature on the use of imperatives in research papers suggests that they are relatively common in a small number of disciplines, but rare, if used at all, in others. The present study addresses the use of imperatives in a corpus of upper-level A-graded student papers from 16 disciplines. A total of 822 papers collected within the past…

  8. Splitting of the Low Landau Levels into a Set of Positive Lebesgue Measure under Small Periodic Perturbations

    NASA Astrophysics Data System (ADS)

    Dinaburg, E. I.; Sinai, Ya. G.; Soshnikov, A. B.

    We study the spectral properties of a two-dimensional Schrödinger operator with a uniform magnetic field and a small external periodic field: where and , are small parameters. Representing as the direct integral of one-dimensional quasi-periodic difference operators with long-range potential, and employing recent results of E. I. Dinaburg on Anderson localization for such operators (we assume to be typical irrational), we construct the full set of generalised eigenfunctions for the low Landau bands. We also show that the Lebesgue measure of the low bands is positive and proportional in the main order to .

  9. Selection of the sub-noise gain level for acquisition of VOCAL data sets: a reliability study.

    PubMed

    Sanderson, Jennifer; Wu, Linda; Mahajan, Aditi; Meriki, Neama; Henry, Amanda; Welsh, Alec W

    2014-03-01

    This study was aimed at assessing the intra-observer and inter-observer repeatability of selecting the sub-noise gain (SNG) level when acquiring placental volumes with 3-D power Doppler for analysis using virtual organ computer-aided analysis (VOCAL). Sixty women with uncomplicated singleton pregnancies between 20 and 38 wk of gestation were recruited. Two women were excluded for flash artifact noted during image analysis. Two blinded observers independently adjusted gain to their perceived SNG level before acquiring a static 3-D volume of the placenta at the cord insertion; observers alternated after each acquisition until each had acquired two volumes. A single observer operated the probe at all times. During offline analysis, SNG levels were recorded and VOCAL indices were calculated. SNG exhibited excellent intra-observer and inter-observer reliability. Intra-observer intra-class correlation coefficients (95% confidence intervals) were 0.98 (0.97-0.99) and 0.98 (0.98-0.99) for observers 1 and 2, respectively. The inter-observer intra-class correlation coefficient was 0.96 (0.93-0.98). Despite its perceived inherent subjectivity, the excellent intra-class correlation coefficients obtained in this study support SNG as a promising tool for future research using 3-D power Doppler. PMID:24361225
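
    For readers unfamiliar with the statistic, here is a minimal sketch of the two-way random-effects, absolute-agreement intra-class correlation ICC(2,1) in the Shrout and Fleiss formulation, applied to simulated sub-noise gain readings. The data, the 0.2 dB observer bias, and the noise level are invented for illustration and are not the study's data.

```python
import numpy as np

def icc2_1(Y):
    """Two-way random-effects, absolute-agreement ICC(2,1)
    (Shrout & Fleiss) for an n-subjects x k-raters matrix."""
    n, k = Y.shape
    grand = Y.mean()
    row = Y.mean(axis=1, keepdims=True)      # subject means
    col = Y.mean(axis=0, keepdims=True)      # rater means
    msr = k * ((row - grand) ** 2).sum() / (n - 1)              # subjects
    msc = n * ((col - grand) ** 2).sum() / (k - 1)              # raters
    mse = ((Y - row - col + grand) ** 2).sum() / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# simulated sub-noise gain readings for 10 placentas by 2 observers:
# observer 2 reads 0.2 dB higher on average, plus small random noise
rng = np.random.default_rng(0)
truth = rng.uniform(20.0, 40.0, size=(10, 1))
Y = np.hstack([truth + rng.normal(0.0, 0.5, (10, 1)),
               truth + 0.2 + rng.normal(0.0, 0.5, (10, 1))])
icc = icc2_1(Y)
```

    Because between-subject variation dwarfs the simulated observer noise and bias, the ICC comes out close to 1, mirroring the "excellent reliability" pattern reported in the abstract.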

  10. Levels of Reconstruction as Complementarity in Mixed Methods Research: A Social Theory-Based Conceptual Framework for Integrating Qualitative and Quantitative Research

    PubMed Central

    Carroll, Linda J.; Rothe, J. Peter

    2010-01-01

    Like other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed-methods) is beginning to assume a more prominent role in public health studies. Using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson’s metaphysical work on the ‘ways of knowing’. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions. PMID:20948937

  11. The impact of justice climate and justice orientation on work outcomes: a cross-level multifoci framework.

    PubMed

    Liao, Hui; Rupp, Deborah E

    2005-03-01

    In this article, which takes a person-situation approach, the authors propose and test a cross-level multifoci model of workplace justice. They crossed 3 types of justice (procedural, informational, and interpersonal) with 2 foci (organization and supervisor) and aggregated to the group level to create 6 distinct justice climate variables. They then tested for the effects of these variables on either organization-directed or supervisor-directed commitment, satisfaction, and citizenship behavior. The authors also tested justice orientation as a moderator of these relationships. The results, based on 231 employees constituting 44 work groups representing multiple organizations and occupations, revealed that 4 forms of justice climate (organization-focused procedural and informational justice climate and supervisor-focused procedural and interpersonal justice climate) were significantly related to various work outcomes after controlling for corresponding individual-level justice perceptions. In addition, some moderation effects were found. Implications for organizations and future research are discussed. PMID:15769235

  12. The politics of agenda setting at the global level: key informant interviews regarding the International Labour Organization Decent Work Agenda

    PubMed Central

    2014-01-01

    Background Global labour markets continue to undergo significant transformations resulting from socio-political instability combined with rises in structural inequality, employment insecurity, and poor working conditions. Confronted by these challenges, global institutions are providing policy guidance to protect and promote the health and well-being of workers. This article provides an account of how the International Labour Organization’s Decent Work Agenda contributes to the work policy agendas of the World Health Organization and the World Bank. Methods This qualitative study involved semi-structured interviews with representatives from three global institutions – the International Labour Organization (ILO), the World Health Organization and the World Bank. Of the 25 key informants invited to participate, 16 took part in the study. Analysis for key themes was followed by interpretation using selected agenda setting theories. Results Interviews indicated that through the Decent Work Agenda, the International Labour Organization is shaping the global policy narrative about work among UN agencies, and that the pursuit of decent work and the Agenda were perceived as important goals with the potential to promote just policies. The Agenda was closely linked to the World Health Organization’s conception of health as a human right. However, decent work was consistently identified by World Bank informants as ILO terminology in contrast to terms such as job creation and job access. The limited evidence base and its conceptual nature were offered as partial explanations for why the Agenda has yet to fully influence other global institutions. Catalytic events such as the economic crisis were identified as creating the enabling conditions to influence global work policy agendas. Conclusions Our evidence aids our understanding of how an issue like decent work enters and stays on the policy agendas of global institutions, using the Decent Work Agenda as an illustrative

  13. War exposure, daily stressors, and mental health in conflict and post-conflict settings: bridging the divide between trauma-focused and psychosocial frameworks.

    PubMed

    Miller, Kenneth E; Rasmussen, Andrew

    2010-01-01

    This paper seeks to bridge the divisive split between advocates of trauma-focused and psychosocial approaches to understanding and addressing mental health needs in conflict and post-conflict settings by emphasizing the role that daily stressors play in mediating direct war exposure and mental health outcomes. The authors argue that trauma-focused advocates tend to overemphasize the impact of direct war exposure on mental health, and fail to consider the contribution of stressful social and material conditions (daily stressors). Drawing on the findings of recent studies that have examined the relationship of both war exposure and daily stressors to mental health status, a model is proposed in which daily stressors partially mediate the relationship of war exposure to mental health. Based on that model, and on the growing body of research that supports it, an integrative, sequenced approach to intervention is proposed in which daily stressors are first addressed, and specialized interventions are then provided for individuals whose distress does not abate with the repair of the social ecology. PMID:19854552

  14. Developing a Leveling Framework of Mathematical Belief and Mathematical Knowledge for Teaching of Indonesian Pre-Service Teachers

    ERIC Educational Resources Information Center

    Novikasari, Ifada; Darhim, Didi Suryadi

    2015-01-01

    This study explored the characteristics of pre-service primary teachers (PSTs) as influenced by their mathematical beliefs and mathematical knowledge for teaching (MKT). A qualitative approach was used to investigate the PSTs' levels of mathematical belief and MKT. The two research instruments used in this study were an interview-based task and a…

  15. Assessing the Potential for Openness: A Framework for Examining Course-Level OER Implementation in Higher Education

    ERIC Educational Resources Information Center

    Judith, Kate; Bull, David

    2016-01-01

    The implementation of open educational resources (OER) at the course level in higher education poses numerous challenges to education practitioners--ranging from discoverability challenges to the lack of knowledge on how to best localize and utilize OER as courseware. Drawing on case studies of OER initiatives globally, the article discusses…

  16. toyLIFE: a computational framework to study the multi-level organisation of the genotype-phenotype map

    PubMed Central

    Arias, Clemente F.; Catalán, Pablo; Manrubia, Susanna; Cuesta, José A.

    2014-01-01

    The genotype-phenotype map is essential for understanding organismal complexity and adaptability. However, its experimental characterisation is a daunting task. Thus, simple models have been proposed and investigated. They have revealed that genotypes differ in their robustness to mutations; phenotypes are represented by a broadly varying number of genotypes, and simple point mutations suffice to navigate the space of genotypes while maintaining a phenotype. Nonetheless, most current models focus only on one level of the map (folded molecules, gene regulatory networks, or networks of metabolic reactions), so that many relevant questions cannot be addressed. Here we introduce toyLIFE, a multi-level model for the genotype-phenotype map based on simple genomes and interaction rules from which complex behaviour emerges at upper levels: remarkably plastic gene regulatory networks and metabolism. toyLIFE is a tool that permits the investigation of how different levels are coupled, in particular how and where mutations affect phenotype or how the presence of certain metabolites determines the dynamics of toyLIFE gene regulatory networks. The model can easily incorporate evolution through more complex mutations, recombination, or gene duplication and deletion, thus opening an avenue to explore extended genotype-phenotype maps. PMID:25520296
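
    The properties listed above (phenotypes backed by very different numbers of genotypes, and genotype-dependent mutational robustness) can be demonstrated with a deliberately tiny genotype-phenotype map. This is an illustrative stand-in, not the toyLIFE model itself: 6-bit genomes map to a binary phenotype via a majority rule.

```python
from itertools import product

L = 6  # genome length

def phenotype(g):
    """Majority-rule phenotype: 1 if at least half the bits are set."""
    return int(sum(g) >= L // 2)

def robustness(g):
    """Fraction of the L point mutants that keep the phenotype."""
    keep = 0
    for i in range(L):
        mut = list(g)
        mut[i] ^= 1                      # single point mutation
        keep += phenotype(mut) == phenotype(g)
    return keep / L

# enumerate the whole genotype space and group it by phenotype
sizes = {}
for g in product([0, 1], repeat=L):
    sizes.setdefault(phenotype(g), []).append(g)
```

    Even in this toy map, phenotype 0 is backed by 22 genotypes and phenotype 1 by 42, and robustness ranges from 1.0 for the all-zero genome down to 0.5 at the phenotype boundary.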

  17. Local-level mortality surveillance in resource-limited settings: a case study of Cape Town highlights disparities in health

    PubMed Central

    Bradshaw, Debbie; Daniels, Johann; Zinyakatira, Nesbert; Matzopoulos, Richard; Bourne, David; Shaikh, Najma; Naledi, Tracey

    2010-01-01

    Abstract Objective To identify the leading causes of mortality and premature mortality in Cape Town, South Africa, and its subdistricts, and to compare levels of mortality between subdistricts. Methods Cape Town mortality data for the period 2001–2006 were analysed by age, cause of death and sex. Cause-of-death codes were aggregated into three main cause groups: (i) pre-transitional causes (e.g. communicable diseases, maternal causes, perinatal conditions and nutritional deficiencies), (ii) noncommunicable diseases and (iii) injuries. Premature mortality was calculated in years of life lost (YLLs). Population estimates for the Cape Town Metro district were used to calculate age-specific rates per 100 000 population, which were then age-standardized and compared across subdistricts. Findings The pattern of mortality in Cape Town reflects the quadruple burden of disease observed in the national cause-of-death profile, with HIV/AIDS, other infectious diseases, injuries and noncommunicable diseases all accounting for a significant proportion of deaths. HIV/AIDS has replaced homicide as the leading cause of death. HIV/AIDS, homicide, tuberculosis and road traffic injuries accounted for 44% of all premature mortality. Khayelitsha, the poorest subdistrict, had the highest levels of mortality for all main cause groups. Conclusion Local mortality surveillance highlights the differential needs of the population of Cape Town and provides a wealth of data to inform planning and implementation of targeted interventions. Multisectoral interventions will be required to reduce the burden of disease. PMID:20539858
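
    The two summary measures used here, age-standardised rates per 100 000 population and premature mortality in years of life lost (YLL), can be sketched with made-up numbers; none of the figures below are the Cape Town data.

```python
# Illustrative direct age-standardisation and YLL calculation.
age_bands   = ["0-14", "15-44", "45-64", "65+"]
deaths      = [120, 480, 900, 1500]           # deaths per band (invented)
population  = [300000, 550000, 250000, 90000] # band populations (invented)
std_weights = [0.26, 0.43, 0.21, 0.10]        # standard population shares
life_exp    = [70.0, 45.0, 22.0, 8.0]         # remaining life expectancy

# age-specific rates per 100 000, then weight by the standard population
rates = [100000 * d / p for d, p in zip(deaths, population)]
asr = sum(r * w for r, w in zip(rates, std_weights))  # age-standardised rate
# YLL: each death contributes the remaining life expectancy at that age
yll = sum(d * le for d, le in zip(deaths, life_exp))
```

    Standardising with a common weight set is what makes mortality comparable across subdistricts with different age structures, as done between the Cape Town subdistricts above.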

  18. The Complete Set of Genes Encoding Major Intrinsic Proteins in Arabidopsis Provides a Framework for a New Nomenclature for Major Intrinsic Proteins in Plants1

    PubMed Central

    Johanson, Urban; Karlsson, Maria; Johansson, Ingela; Gustavsson, Sofia; Sjövall, Sara; Fraysse, Laure; Weig, Alfons R.; Kjellbom, Per

    2001-01-01

    Major intrinsic proteins (MIPs) facilitate the passive transport of small polar molecules across membranes. MIPs constitute a very old family of proteins and different forms have been found in all kinds of living organisms, including bacteria, fungi, animals, and plants. In the genomic sequence of Arabidopsis, we have identified 35 different MIP-encoding genes. Based on sequence similarity, these 35 proteins are divided into four different subfamilies: plasma membrane intrinsic proteins, tonoplast intrinsic proteins, NOD26-like intrinsic proteins also called NOD26-like MIPs, and the recently discovered small basic intrinsic proteins. In Arabidopsis, there are 13 plasma membrane intrinsic proteins, 10 tonoplast intrinsic proteins, nine NOD26-like intrinsic proteins, and three small basic intrinsic proteins. The gene structure in general is conserved within each subfamily, although there is a tendency to lose introns. Based on phylogenetic comparisons of maize (Zea mays) and Arabidopsis MIPs (AtMIPs), it is argued that the general intron patterns in the subfamilies were formed before the split of monocotyledons and dicotyledons. Although the gene structure is unique for each subfamily, there is a common pattern in how transmembrane helices are encoded on the exons in three of the subfamilies. The nomenclature for plant MIPs varies widely between different species but also between subfamilies in the same species. Based on the phylogeny of all AtMIPs, a new and more consistent nomenclature is proposed. The complete set of AtMIPs, together with the new nomenclature, will facilitate the isolation, classification, and labeling of plant MIPs from other species. PMID:11500536

  19. Resource allocation in integrated delivery systems and healthcare networks: a proposed framework to guide ethical thinking.

    PubMed

    Macdonald, M

    1999-01-01

    Drawing on a management perspective and the literature, this article suggests an ethical framework to be used at the meso or community level of resource allocation in a Canadian setting. The suggested framework enlarges on the program-level framework developed by Meslin et al primarily by building in stakeholder inclusiveness and public accountability, both of which are essential to resource allocation at the population-based level. PMID:10788068

  20. A possible approach for setting a mercury risk-based action level based on tribal fish ingestion rates.

    PubMed

    Harper, Barbara L; Harris, Stuart G

    2008-05-01

    Risks from mercury and other contaminants in fish for a large Columbia River dataset are evaluated in this paper for a range of consumption rates. Extensive ethnohistorical and nutritional documentation, recent ethnographic surveys, and other sources were reviewed to confirm previous determinations that the traditional subsistence fish consumption rate is 500 pounds per capita annually, or 620 g per day (gpd). Lower contemporary consumption rates for other population subsets are also discussed. The causes of the current suppression of fish consumption are discussed and the cultural, educational, social, and trade and economic impacts of the loss of fish are considered. Action levels for mercury for riverine Tribes in the Columbia Basin are suggested at 0.1 ppm or less based on the combined risk from mercury plus other contaminants, the higher fish consumption rates, the existing cultural deficit due to loss of salmon and other stressors, the health benefits of fish, and the cultural and economic importance of fish. The goal of fish advisories is to reduce fish consumption even further, which shifts the burden of avoiding risk to the very people who already bear the burdens of contaminant exposure, socio-economic impacts and cultural loss. However, because Tribal communities often do not have the choice of giving up more food, income, religion, culture, and heritage in order to avoid contamination, they are forced into choosing between culture and health. Many tribal members choose to incur chemical risk rather than giving up their culture and religion. We believe that lowering the action level for mercury is part of the federal fiduciary responsibility to American Indian Tribes. PMID:17631290

  1. Hospital admissions as a function of temperature, other weather phenomena and pollution levels in an urban setting in China

    PubMed Central

    Goggins, William B; Yue, Janice SK; Lee, Poyi

    2013-01-01

    Abstract Objective To explore the relationship between weather phenomena and pollution levels and daily hospital admissions (as an approximation to morbidity patterns) in Hong Kong Special Administrative Region (SAR), China, in 1998–2009. Methods Generalized additive models and lag models were constructed with data from official sources on hospital admissions and on mean daily temperature, mean daily wind speed, mean relative humidity, daily total global solar radiation, total daily rainfall and daily pollution levels. Findings During the hot season, admissions increased by 4.5% for every increase of 1 °C above 29 °C; during the cold season, admissions increased by 1.4% for every decrease of 1 °C within the 8.2–26.9 °C range. In subgroup analyses, admissions for respiratory and infectious diseases increased during extreme heat and cold, but cardiovascular disease admissions increased only during cold temperatures. For every increase of 1 °C above 29 °C, admissions for unintentional injuries increased by 1.9%. During the cold season, for every decrease of 1 °C within the 8.2–26.9 °C range, admissions for cardiovascular diseases and intentional injuries rose by 2.1% and 2.4%, respectively. Admission patterns were not sensitive to sex. Admissions for respiratory diseases rose during hot and cold temperatures among children but only during cold temperatures among the elderly. In people aged 75 years or older, admissions for infectious diseases rose during both temperature extremes. Conclusion In Hong Kong SAR, hospitalizations rise during extreme temperatures. Public health interventions should be developed to protect children, the elderly and other vulnerable groups from excessive heat and cold. PMID:23940405
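
    A reported effect such as "4.5% per 1 °C above 29 °C" is most naturally read as multiplicative on the admission rate. Assuming that log-linear form (an assumption on our part; the paper fits generalized additive models), the implied relative risk at a given temperature can be computed as:

```python
def heat_rr(temp_c, threshold=29.0, pct_per_deg=4.5):
    """Relative admission risk above the hot-season threshold,
    assuming a multiplicative pct_per_deg effect per excess degree."""
    excess = max(0.0, temp_c - threshold)
    return (1 + pct_per_deg / 100) ** excess

rr_33 = heat_rr(33.0)   # four degrees above the 29 deg C threshold
```

    Under this reading, a 33 °C day carries roughly a 19% excess in admissions relative to the threshold; below the threshold the function returns 1.0 by construction.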

  2. Qualifications Frameworks: Implementation and Impact in Botswana

    ERIC Educational Resources Information Center

    Tau, Daniel; Modesto, Stanslaus T.

    2011-01-01

    A growing number of countries are introducing qualification frameworks (QFs) following a common definition of outcomes, level descriptors, and a set of occupational or knowledge fields. Botswana has been no exception to this trend. The passing of the Vocational Training Act (1998) led to the creation of the Botswana National Vocational…

  3. Single chirped pulse control of hyperfine states population in Rb atom in the framework of the four-level system

    NASA Astrophysics Data System (ADS)

    Zakharov, Vladislav; Malinovskaya, Svetlana

    2012-06-01

    Electron population dynamics within the hyperfine structure in the Rb atom induced by a single ns pulse is theoretically investigated. The aim is to develop a methodology for the implementation of linearly chirped laser pulses for the desired excitations in Rb atoms, resulting in the creation of predetermined non-equilibrium states. A semi-classical model of laser pulse interaction with a four-level system representing the hyperfine energy levels of the Rb atom involved in the dynamics has been developed. The equations for the probability amplitudes were obtained from the Schrödinger equation with the Hamiltonian that described the time evolution of the population of the four states in the field interaction representation. A code was written in Fortran for a numerical analysis of the time evolution of probability amplitudes as a function of the field parameters. The dependence of the quantum yield on the pulse duration, the linear chirp parameter, and the Rabi frequency was studied to reveal the conditions for complete population transfer to the upper hyperfine state of the 5S1/2 electronic level. The results may provide a robust tool for quantum operations in the alkali atoms.
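
    A hedged numerical sketch of this kind of calculation: the rotating-frame Schrödinger equation i dc/dt = H(t) c for a four-level ladder is integrated with a hand-rolled 4th-order Runge-Kutta scheme under a chirped Gaussian pulse. The detunings, Rabi frequency, chirp rate, and coupling pattern below are illustrative stand-ins, not the actual Rb hyperfine parameters.

```python
import numpy as np

def hamiltonian(t, omega0=1.0, tau=10.0, alpha=0.05,
                deltas=(0.0, 0.3, 0.5, 0.9)):
    """RWA Hamiltonian of a four-level ladder driven by a pulse with
    Gaussian envelope; the linear chirp shows up as time-dependent
    rotating-frame detunings (all parameters illustrative)."""
    env = omega0 * np.exp(-(t / tau) ** 2)
    det = np.array(deltas) + alpha * t * np.arange(4)
    H = np.diag(det).astype(complex)
    for k in range(3):                     # nearest-neighbour couplings
        H[k, k + 1] = H[k + 1, k] = env / 2
    return H

def propagate(t0=-30.0, t1=30.0, dt=0.01):
    c = np.array([1, 0, 0, 0], dtype=complex)   # start in the ground state
    t = t0
    while t < t1:
        # RK4 step for i dc/dt = H(t) c
        k1 = -1j * hamiltonian(t) @ c
        k2 = -1j * hamiltonian(t + dt / 2) @ (c + dt / 2 * k1)
        k3 = -1j * hamiltonian(t + dt / 2) @ (c + dt / 2 * k2)
        k4 = -1j * hamiltonian(t + dt) @ (c + dt * k3)
        c = c + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += dt
    return np.abs(c) ** 2                       # level populations

pops = propagate()
```

    Scanning the chirp rate and pulse duration in such a model is how one would map out the conditions for complete transfer to the target state; the norm of the amplitude vector is conserved up to integration error.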

  4. Using the Consolidated Framework for Implementation Research to Identify Barriers and Facilitators for the Implementation of an Internet-Based Patient-Provider Communication Service in Five Settings: A Qualitative Study

    PubMed Central

    Varsi, Cecilie; Ekstedt, Mirjam; Gammon, Deede

    2015-01-01

    Background Although there is growing evidence of the positive effects of Internet-based patient-provider communication (IPPC) services for both patients and health care providers, their implementation into clinical practice continues to be a challenge. Objective The 3 aims of this study were to (1) identify and compare barriers and facilitators influencing the implementation of an IPPC service in 5 hospital units using the Consolidated Framework for Implementation Research (CFIR), (2) assess the ability of the different constructs of CFIR to distinguish between high and low implementation success, and (3) compare our findings with those from other studies that used the CFIR to discriminate between high and low implementation success. Methods This study was based on individual interviews with 10 nurses, 6 physicians, and 1 nutritionist who had used the IPPC to answer messages from patients. Results Of the 36 CFIR constructs, 28 were addressed in the interviews, of which 12 distinguished between high and low implementation units. Most of the distinguishing constructs were related to the inner setting domain of CFIR, indicating that institutional factors were particularly important for successful implementation. Health care providers’ beliefs in the intervention as useful for themselves and their patients as well as the implementation process itself were also important. A comparison of constructs across ours and 2 other studies that also used the CFIR to discriminate between high and low implementation success showed that 24 CFIR constructs distinguished between high and low implementation units in at least 1 study; 11 constructs distinguished in 2 studies. However, only 2 constructs (patient need and resources and available resources) distinguished consistently between high and low implementation units in all 3 studies. Conclusions The CFIR is a helpful framework for illuminating barriers and facilitators influencing IPPC implementation. However, CFIR’s strength

  5. Investigation into the validity of extrapolation in setting maximum residue levels for pesticides in crops of similar morphology.

    PubMed

    Reynolds, S L; Fussell, R J; MacArthur, R

    2005-01-01

    Field trials were initiated to investigate whether extrapolation procedures, which were adopted to limit the costs of pesticide registration for minor crops, are valid. Three pairs of crops of similar morphology; carrots/swedes, cauliflower/calabrese (broccoli) and French beans/edible-podded peas; were grown in parallel at four different geographical locations within the UK. The crops were treated with both systemic and non-systemic pesticides under maximum registered use conditions, i.e. the maximum permitted application rates and the minimum harvest intervals. Once mature, the crops were harvested and analysed for residues of the applied pesticides. The limits of quantification were in the range 0.005-0.02 mg kg(-1). Analysis of variance and bootstrap estimates showed that in general, the mean residue concentrations for the individual pesticides were significantly different between crop pairs grown on each site. Similarly, the mean residue concentrations of most of the pesticides in each crop across sites were significantly different. These findings demonstrate that the extrapolations of residue levels for most of the selected pesticide/crop combinations investigated; chlorfenvinphos and iprodione from carrots to swedes; carbendazim, chlorpyrifos, diflubenzuron and dimethoate from cauliflower to calabrese; and malathion, metalaxyl and pirimicarb from French beans to edible-podded peas; appear invalid. PMID:15895609
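
    A bootstrap comparison of mean residues between a crop pair, of the kind mentioned above, can be sketched as follows. The residue concentrations (mg/kg) are invented, not the trial data; a 95% interval for the mean difference that excludes zero indicates the crop means differ, so extrapolating from one crop to the other would be unsafe.

```python
import random
import statistics

random.seed(1)
carrots = [0.021, 0.034, 0.028, 0.019, 0.031, 0.026]   # mg/kg, invented
swedes  = [0.052, 0.047, 0.061, 0.058, 0.044, 0.049]   # mg/kg, invented

def boot_diff_ci(a, b, n=5000):
    """Percentile bootstrap 95% CI for the difference in mean residue."""
    diffs = []
    for _ in range(n):
        ra = [random.choice(a) for _ in a]   # resample with replacement
        rb = [random.choice(b) for _ in b]
        diffs.append(statistics.mean(ra) - statistics.mean(rb))
    diffs.sort()
    return diffs[int(0.025 * n)], diffs[int(0.975 * n)]

lo, hi = boot_diff_ci(carrots, swedes)
```

    With these invented numbers every resampled difference is negative, so the whole interval lies below zero and the two crop means are clearly distinct.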

  6. Spatial gradients of protein-level time delays set the pace of the traveling segmentation clock waves

    PubMed Central

    Ay, Ahmet; Holland, Jack; Sperlea, Adriana; Devakanmalai, Gnanapackiam Sheela; Knierer, Stephan; Sangervasi, Sebastian; Stevenson, Angel; Özbudak, Ertuğrul M.

    2014-01-01

    The vertebrate segmentation clock is a gene expression oscillator controlling rhythmic segmentation of the vertebral column during embryonic development. The period of oscillations becomes longer as cells are displaced along the posterior to anterior axis, which results in traveling waves of clock gene expression sweeping in the unsegmented tissue. Although various hypotheses necessitating the inclusion of additional regulatory genes into the core clock network at different spatial locations have been proposed, the mechanism underlying traveling waves has remained elusive. Here, we combined molecular-level computational modeling and quantitative experimentation to solve this puzzle. Our model predicts the existence of an increasing gradient of gene expression time delays along the posterior to anterior direction to recapitulate spatiotemporal profiles of the traveling segmentation clock waves in different genetic backgrounds in zebrafish. We validated this prediction by measuring an increased time delay of oscillatory Her1 protein production along the unsegmented tissue. Our results refuted the need for spatial expansion of the core feedback loop to explain the occurrence of traveling waves. Spatial regulation of gene expression time delays is a novel way of creating dynamic patterns; this is the first report demonstrating such a control mechanism in any tissue and future investigations will explore the presence of analogous examples in other biological systems. PMID:25336742
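A minimal caricature of the delay mechanism described above, assuming a generic delayed negative-feedback loop rather than the authors' fitted her1 model (all parameter values are illustrative):

```python
def delayed_repressor(tau_steps, n_steps=4000, dt=0.01,
                      k=100.0, d=1.0, p0=10.0, hill_n=4):
    """Euler integration of dp/dt = k / (1 + (p(t-tau)/p0)^n) - d*p,
    a delayed negative-feedback loop with Hill-type self-repression."""
    p = [0.0] * (tau_steps + 1)  # constant zero history before t = 0
    for _ in range(n_steps):
        delayed = p[-tau_steps - 1]          # protein level tau ago
        dpdt = k / (1.0 + (delayed / p0) ** hill_n) - d * p[-1]
        p.append(p[-1] + dt * dpdt)
    return p

# A sufficiently long delay sustains oscillations whose period grows with
# the delay; a short delay lets the loop settle to a steady state. A spatial
# gradient of delays therefore produces a gradient of periods, i.e. waves.
long_delay = delayed_repressor(tau_steps=200)   # tau = 2 time units
short_delay = delayed_repressor(tau_steps=10)   # tau = 0.1 time units
```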

  7. Quantitative characterization of metastatic disease in the spine. Part I. Semiautomated segmentation using atlas-based deformable registration and the level set method

    SciTech Connect

    Hardisty, M.; Gordon, L.; Agarwal, P.; Skrinskas, T.; Whyne, C.

    2007-08-15

    Quantitative assessment of metastatic disease in bone is often considered immeasurable and, as such, patients with skeletal metastases are often excluded from clinical trials. In order to effectively quantify the impact of metastatic tumor involvement in the spine, accurate segmentation of the vertebra is required. Manual segmentation can be accurate but involves extensive and time-consuming user interaction. Potential solutions to automating segmentation of metastatically involved vertebrae are demons deformable image registration and level set methods. The purpose of this study was to develop a semiautomated method to accurately segment tumor-bearing vertebrae using the aforementioned techniques. By maintaining morphology of an atlas, the demons-level set composite algorithm was able to accurately differentiate between trans-cortical tumors and surrounding soft tissue of identical intensity. The algorithm successfully segmented both the vertebral body and trabecular centrum of tumor-involved and healthy vertebrae. This work validates our approach as equivalent in accuracy to an experienced user.

  8. Short communication: A low-cost method for analyzing nevirapine levels in hair as a marker of adherence in resource-limited settings.

    PubMed

    Gandhi, Monica; Yang, Qiyun; Bacchetti, Peter; Huang, Yong

    2014-01-01

    The measurement of antiretroviral concentrations in hair is emerging as an important technology to objectively quantify adherence to combination antiretroviral therapy. Hair levels of antiretrovirals are the strongest independent predictor of virologic success in large prospective cohorts of HIV-infected patients and surpass self-report in predicting outcomes. Hair is easy to collect and store, but validated methods to analyze antiretroviral levels in hair using liquid chromatography tandem mass spectrometry (LC-MS/MS) are expensive. We report here on the development of a thin-layer chromatography (TLC) assay for the semiquantitative analysis of nevirapine in hair. TLC assay results from 11 samples were consistent with results using LC-MS/MS [Spearman correlation coefficient 0.99 (95% CI 0.95-0.996)]. This simple, low-cost method of analyzing nevirapine concentrations in hair may provide a novel monitoring tool for antiretroviral adherence in resource-limited settings and merits further study in clinical settings. PMID:24164410
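The Spearman correlation used to compare the TLC assay against LC-MS/MS can be computed from scratch as the Pearson correlation of ranks; the paired measurements below are hypothetical, not the study's 11 samples:

```python
def rank(xs):
    """1-based average ranks, with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        for k in range(i, j + 1):
            ranks[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical paired nevirapine levels from the two assays:
tlc = [1.2, 0.4, 3.1, 2.2, 0.9, 2.8]
lcms = [1.3, 0.5, 3.3, 1.1, 1.0, 2.9]
rho = spearman(tlc, lcms)
print(round(rho, 3))  # → 0.943
```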

  9. Can we avoid high levels of dose escalation for high-risk prostate cancer in the setting of androgen deprivation?

    PubMed Central

    Shakespeare, Thomas P; Wilcox, Shea W; Aherne, Noel J

    2016-01-01

    Aim Both dose-escalated external beam radiotherapy (DE-EBRT) and androgen deprivation therapy (ADT) improve outcomes in patients with high-risk prostate cancer. However, there is little evidence specifically evaluating DE-EBRT for patients with high-risk prostate cancer receiving ADT, particularly for EBRT doses >74 Gy. We aimed to determine whether DE-EBRT >74 Gy improves outcomes for patients with high-risk prostate cancer receiving long-term ADT. Patients and methods Patients with high-risk prostate cancer were treated on an institutional protocol prescribing 3–6 months neoadjuvant ADT and DE-EBRT, followed by 2 years of adjuvant ADT. Between 2006 and 2012, EBRT doses were escalated from 74 Gy to 76 Gy and then to 78 Gy. We interrogated our electronic medical record to identify these patients and analyzed our results by comparing dose levels. Results In all, 479 patients were treated with a 68-month median follow-up. The 5-year biochemical disease-free survivals for the 74 Gy, 76 Gy, and 78 Gy groups were 87.8%, 86.9%, and 91.6%, respectively. The metastasis-free survivals were 95.5%, 94.5%, and 93.9%, respectively, and the prostate cancer-specific survivals were 100%, 94.4%, and 98.1%, respectively. Dose escalation had no impact on any outcome in either univariate or multivariate analysis. Conclusion There was no benefit of DE-EBRT >74 Gy in our cohort of high-risk prostate patients treated with long-term ADT. As dose escalation has higher risks of radiotherapy-induced toxicity, it may be feasible to omit dose escalation beyond 74 Gy in this group of patients. Randomized studies evaluating dose escalation for high-risk patients receiving ADT should be considered. PMID:27274277

  10. Modeling the effect of climate change on U.S. state-level buildings energy demands in an integrated assessment framework

    SciTech Connect

    Zhou, Yuyu; Clarke, Leon E.; Eom, Jiyong; Kyle, G. Page; Patel, Pralit L.; Kim, Son H.; Dirks, James A.; Jensen, Erik A.; Liu, Ying; Rice, Jennie S.; Schmidt, Laurel C.; Seiple, Timothy E.

    2014-01-01

As long-term socioeconomic transformation and energy service expansion show large spatial heterogeneity, an advanced understanding of climate impacts on building energy use at the sub-national level will offer useful insights into climate policy and regional energy system planning. In this study, we presented a detailed building energy model with a U.S. state-level representation, nested in the GCAM integrated assessment framework. We projected state-level building energy demand and its spatial pattern over the century, considering the impact of climate change based on estimates of heating and cooling degree days derived from downscaled USGS CASCaDE temperature data. The results indicate that climate change has a large impact on heating and cooling building energy and fuel use at the state level, exhibiting large spatial heterogeneity across states (ranging from -10% to +10%). The sensitivity analysis reveals that building energy demand is subject to multiple key factors, such as the magnitude of climate change, the choice of climate models, and the growth of population and GDP, and that their relative contributions vary greatly across space. The scale impact in building energy use modeling highlights the importance of constructing a building energy model with a spatially-explicit representation of socioeconomics, energy system development, and climate change. These findings will inform climate-related policy decisions and energy system planning, especially utility planning for the building sector, at the U.S. state and regional levels under potential climate change.
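The heating and cooling degree days that carry the climate signal into such models can be computed from daily mean temperatures; the 18 °C base below is a conventional choice, not necessarily GCAM's:

```python
def degree_days(daily_mean_temps_c, base_c=18.0):
    """Heating and cooling degree days from daily mean temperatures.
    HDD accumulates on days colder than the base temperature,
    CDD on days warmer than it."""
    hdd = sum(max(base_c - t, 0.0) for t in daily_mean_temps_c)
    cdd = sum(max(t - base_c, 0.0) for t in daily_mean_temps_c)
    return hdd, cdd

# Five illustrative daily mean temperatures (degrees C):
temps = [2.0, 10.0, 18.0, 25.0, 30.0]
print(degree_days(temps))  # → (24.0, 19.0)
```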

  11. An adaptive spectral/DG method for a reduced phase-space based level set approach to geometrical optics on curved elements

    NASA Astrophysics Data System (ADS)

    Cockburn, Bernardo; Kao, Chiu-Yen; Reitich, Fernando

    2014-02-01

We present an adaptive spectral/discontinuous Galerkin (DG) method on curved elements to simulate high-frequency wavefronts within a reduced phase-space formulation of geometrical optics. Following recent work, the approach is based on the use of level sets defined by functions satisfying the Liouville equations in reduced phase-space and, in particular, it relies on the smoothness of these functions to represent them by rapidly convergent spectral expansions in the phase variables. The resulting (hyperbolic) system of equations for the coefficients in these expansions is then amenable to a high-order accurate treatment via DG approximations. In the present work, we significantly expand on the applicability and efficiency of the approach by incorporating mechanisms that allow for its use in scattering simulations and for a reduced overall computational cost. With regard to the former, we demonstrate that the incorporation of curved elements is necessary to attain any kind of accuracy in calculations that involve scattering off non-flat interfaces. With regard to efficiency, on the other hand, we also show that the level-set formulation allows for a p-adaptive scheme in space that under-resolves the level-set functions away from the wavefront without incurring a loss of accuracy in the approximation of its location. As we show, these improvements enable simulations that are beyond the capabilities of previous implementations of these numerical procedures.

  12. Level structure of the Ge, Se, and Kr (N =52 , 53 ) isotopes within the framework of the interacting boson model

    NASA Astrophysics Data System (ADS)

    Al-Khudair, Falih H.

    2015-05-01

The level structure and electromagnetic transitions of the ⁸⁴,⁸⁵Ge, ⁸⁶,⁸⁷Se, and ⁸⁸,⁸⁹Kr isotopes have been investigated in the interacting boson model (IBM-2), including the interacting boson-fermion model (IBFM-2), as a first application of the model. In the calculation, the fermions are allowed to excite to the 2d5/2, 1g7/2, 2d3/2, 3s1/2, and 1h11/2 single-particle orbitals. The results of the model calculation have been found to be in good agreement with both shell-model and available experimental data. Also, the strong staggering of the yrast band has been found to be well described.

  13. Effect of liner design, pulsator setting, and vacuum level on bovine teat tissue changes and milking characteristics as measured by ultrasonography.

    PubMed

    Gleeson, David E; O'Callaghan, Edmond J; Rath, Myles V

    2004-01-01

Friesian-type dairy cows were milked with different machine settings to determine the effect of these settings on teat tissue reaction and on milking characteristics. Three teat-cup liner designs were used with varying upper barrel dimensions (wide-bore WB = 31.6 mm; narrow-bore NB = 21.0 mm; narrow-bore NB1 = 25.0 mm). These liners were tested with alternate and simultaneous pulsation patterns, pulsator ratios (60:40 and 67:33) and three system vacuum levels (40, 44 and 50 kPa). Teat tissue was measured using ultrasonography, before milking and directly after milking. The measurements recorded were teat canal length (TCL), teat diameter (TD), cistern diameter (CD) and teat wall thickness (TWT). Teat tissue changes were similar with a system vacuum level of either 50 kPa (mid-level) or 40 kPa (low-level). Widening the liner upper barrel bore dimension from 21.0 mm (P < 0.01) or 25.0 mm (P < 0.001) to 31.6 mm increased the magnitude of changes in TD and TWT after machine milking. Milk yield per cow was significantly (P < 0.05) higher and cluster-on time was reduced (P < 0.01) with the WB cluster as compared to the NB1 cluster. Minimum changes in teat tissue parameters were achieved with system vacuum levels of 40 kPa and 50 kPa using NB and WB clusters, respectively. Similar changes in teat tissue and milk yield per cow were observed with alternate and simultaneous pulsation patterns. Widening the pulsator ratio from 60:40 to 67:33 did not have negative effects on teat tissue changes and had a positive effect on milk yield and milking time. Milk liner design had a bigger effect on teat tissue changes and milking characteristics than pulsation settings. PMID:21851658

  14. Estimating and validating disability-adjusted life years at the global level: a methodological framework for cancer

    PubMed Central

    2012-01-01

Background Disability-adjusted life years (DALYs) link data on disease occurrence to health outcomes, and they are a useful aid in establishing country-specific agendas regarding cancer control. The variables required to compute DALYs are, however, multiple and not readily available in many countries. We propose a methodology that derives global DALYs and validate variables and DALYs based on data from various cancer registries. Methods We estimated DALYs for four countries (Norway, Bulgaria, India and Uganda) within each category of the human development index (HDI). The following sources (indicators) were used: Globocan2008 (incidence and mortality), various cancer registries (proportion cured, proportion treated and duration of disease), treatment guidelines (duration of treatment), specific burden of disease studies (sequelae and disability weights), alongside expert opinion. We obtained country-specific population estimates and identified resource levels using the HDI; DALYs are computed as the sum of years of life lost and years lived with disabilities. Results Using mortality:incidence ratios to estimate country-specific survival, and by applying the human development index, we derived country-specific estimates of the proportion cured and the proportion treated. The fit between the estimates and observed data from the cancer registries was relatively good. The final DALY estimates were similar to those computed using observed values in Norway, and in WHO’s earlier global burden of disease study. Marked cross-country differences in the patterns of DALYs by cancer sites were observed. In Norway and Bulgaria, breast, colorectal, prostate and lung cancer were the main contributors to DALYs, representing 54% and 45%, respectively, of the totals. These cancers contributed only 27% and 18%, respectively, of total DALYs in India and Uganda. 
Conclusions Our approach resulted in a series of variables that can be used to estimate country-specific DALYs, enabling global
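The core DALY arithmetic (years of life lost plus years lived with disability, undiscounted) can be sketched as follows; the input numbers are illustrative placeholders for the registry-derived indicators listed in the abstract:

```python
def daly(deaths, life_expectancy_at_death,
         incident_cases, avg_duration_years, disability_weight):
    """Undiscounted DALYs = YLL + YLD for one cancer site."""
    yll = deaths * life_expectancy_at_death               # years of life lost
    yld = incident_cases * avg_duration_years * disability_weight
    return yll + yld, yll, yld

total, yll, yld = daly(deaths=100, life_expectancy_at_death=20,
                       incident_cases=300, avg_duration_years=5,
                       disability_weight=0.3)
print(total)  # → 2450.0
```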

  15. Preparing for the SWOT mission by evaluating the simulations of river water levels within a regional-scale hydrometeorological modeling framework

    NASA Astrophysics Data System (ADS)

    Häfliger, Vincent; Martin, Eric; Boone, Aaron; Habets, Florence; David, Cédric H.; Garambois, Pierre-André; Roux, Hélène; Ricci, Sophie

    2014-05-01

The upcoming Surface Water Ocean Topography (SWOT) mission will provide unprecedented observations of water elevation in rivers and lakes. The vertical accuracy of SWOT measurements is expected to be around 10 cm for rivers of width greater than 50-100 m. Over France, new observations will be available every 5 days. Such observations will allow new opportunities for validation of hydrological models and for data assimilation within these models. The objective of the proposed work is to evaluate the quality of simulated river water levels in the Garonne River Basin (55,000 km²) located in Southwestern France. The simulations are produced using a distributed regional-scale hydrometeorological modeling framework composed of a land surface model (ISBA), a hydrogeological model (MODCOU) and a river network model (RAPID). The modeling framework had initially been calibrated over France, although this study focuses on the smaller Garonne Basin, and the proposed research emphasizes modifications made to RAPID. First, the existing RAPID parameters (i.e. temporally-constant but spatially-variable Muskingum parameters) were updated in the Garonne River Basin based on estimations made using a lagged cross-correlation method applied to observed hydrographs. Second, the model code was modified to allow for the use of a kinematic or a kinematic-diffusive wave equation for routing, both allowing for temporally and spatially variable wave celerities. This modification required prescribing the values of hydraulic parameters of the river channel. Initial results show that the variable flow velocity scheme is advantageous for discharge computations when compared to the original Muskingum method in RAPID. Additionally, water level computations led to root mean square errors of 50-60 cm in the improved Muskingum method and 40-50 cm in the kinematic-diffusive wave method. Discharge computations were also shown to be comparable to those obtained with high-resolution models solving the
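The classic Muskingum routing scheme whose K and X parameters RAPID uses can be sketched in a few lines; the hydrograph and parameter values below are illustrative, not the Garonne configuration:

```python
def muskingum_route(inflow, K, X, dt, initial_outflow=None):
    """Route a hydrograph through one reach with the classic Muskingum
    scheme, O2 = c0*I2 + c1*I1 + c2*O1, using temporally constant K and X.
    K and dt must share the same time unit."""
    denom = 2.0 * K * (1.0 - X) + dt
    c0 = (dt - 2.0 * K * X) / denom
    c1 = (dt + 2.0 * K * X) / denom
    c2 = (2.0 * K * (1.0 - X) - dt) / denom   # c0 + c1 + c2 == 1 (mass balance)
    out = [inflow[0] if initial_outflow is None else initial_outflow]
    for i in range(1, len(inflow)):
        out.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * out[-1])
    return out

# Hypothetical triangular inflow hydrograph (m³/s), K = 12 h, X = 0.2, dt = 6 h:
inflow = [10, 30, 60, 90, 70, 50, 30, 20, 10, 10]
outflow = muskingum_route(inflow, K=12.0, X=0.2, dt=6.0)
# The routed peak is attenuated and arrives later than the inflow peak.
```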

  16. Improved Fuzzy C-Means based Particle Swarm Optimization (PSO) initialization and outlier rejection with level set methods for MR brain image segmentation.

    PubMed

    Mekhmoukh, Abdenour; Mokrani, Karim

    2015-11-01

In this paper, a new image segmentation method based on Particle Swarm Optimization (PSO) and outlier rejection combined with level sets is proposed. A traditional approach to the segmentation of Magnetic Resonance (MR) images is the Fuzzy C-Means (FCM) clustering algorithm. The membership function of this conventional algorithm is sensitive to outliers and does not integrate the spatial information in the image. The algorithm is very sensitive to noise and inhomogeneities in the image; moreover, it depends on the initialization of the cluster centers. To improve the outlier rejection and to reduce the noise sensitivity of the conventional FCM clustering algorithm, a novel extended FCM algorithm for image segmentation is presented. In general, the initial cluster centers in the FCM algorithm are chosen randomly; with the help of the PSO algorithm, the cluster centers are chosen optimally. Our algorithm also takes into consideration the spatial neighborhood information. This a priori information is used in the cost function to be optimized. For MR images, the resulting fuzzy clustering is used to set the initial level set contour. The results confirm the effectiveness of the proposed algorithm. PMID:26299609
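The core FCM alternating updates that the paper extends can be sketched on 1-D intensities; this omits the PSO seeding, spatial neighborhood term, and outlier rejection, which modify the initialization and the objective:

```python
def fcm(data, centers, m=2.0, n_iter=50):
    """Plain Fuzzy C-Means on 1-D intensities: alternate between the
    membership update u_i(x) = 1 / sum_j (d_i/d_j)^(2/(m-1)) and the
    center update (weighted mean with weights u^m)."""
    p = 2.0 / (m - 1.0)
    u = []
    for _ in range(n_iter):
        u = []
        for x in data:
            d = [abs(x - c) + 1e-12 for c in centers]   # avoid divide-by-zero
            u.append([1.0 / sum((d[i] / d[j]) ** p for j in range(len(d)))
                      for i in range(len(d))])
        centers = [
            sum(u[k][i] ** m * data[k] for k in range(len(data))) /
            sum(u[k][i] ** m for k in range(len(data)))
            for i in range(len(centers))
        ]
    return centers, u

data = [0.1, 0.15, 0.2, 0.8, 0.85, 0.9]   # two clear intensity clusters
centers, u = fcm(data, centers=[0.0, 1.0])
# centers converge near the cluster means, roughly 0.15 and 0.85
```

In the paper's variant, the random `centers=[0.0, 1.0]` seed is replaced by PSO-optimized centers, and the converged clustering then initializes the level set contour.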

  17. Change in Vitamin D Levels Occurs Early after Antiretroviral Therapy Initiation and Depends on Treatment Regimen in Resource-Limited Settings

    PubMed Central

    Havers, Fiona P.; Detrick, Barbara; Cardoso, Sandra W.; Berendes, Sima; Lama, Javier R.; Sugandhavesa, Patcharaphan; Mwelase, Noluthando H.; Campbell, Thomas B.; Gupta, Amita

    2014-01-01

    Study Background Vitamin D has wide-ranging effects on the immune system, and studies suggest that low serum vitamin D levels are associated with worse clinical outcomes in HIV. Recent studies have identified an interaction between antiretrovirals used to treat HIV and reduced serum vitamin D levels, but these studies have been done in North American and European populations. Methods Using a prospective cohort study design nested in a multinational clinical trial, we examined the effect of three combination antiretroviral (cART) regimens on serum vitamin D levels in 270 cART-naïve, HIV-infected adults in nine diverse countries, (Brazil, Haiti, Peru, Thailand, India, Malawi, South Africa, Zimbabwe and the United States). We evaluated the change between baseline serum vitamin D levels and vitamin D levels 24 and 48 weeks after cART initiation. Results Serum vitamin D levels decreased significantly from baseline to 24 weeks among those randomized to efavirenz/lamivudine/zidovudine (mean change: −7.94 [95% Confidence Interval (CI) −10.42, −5.54] ng/ml) and efavirenz/emtricitabine/tenofovir-DF (mean change: −6.66 [95% CI −9.40, −3.92] ng/ml) when compared to those randomized to atazanavir/emtricitabine/didanosine-EC (mean change: −2.29 [95% CI –4.83, 0.25] ng/ml). Vitamin D levels did not change significantly between week 24 and 48. Other factors that significantly affected serum vitamin D change included country (p<0.001), season (p<0.001) and baseline vitamin D level (p<0.001). Conclusion Efavirenz-containing cART regimens adversely affected vitamin D levels in patients from economically, geographically and racially diverse resource-limited settings. This effect was most pronounced early after cART initiation. Research is needed to define the role of Vitamin D supplementation in HIV care. PMID:24752177

  18. The Preconception Stress and Resiliency Pathways Model: a multi-level framework on maternal, paternal, and child health disparities derived by community-based participatory research.

    PubMed

    Ramey, Sharon Landesman; Schafer, Peter; DeClerque, Julia L; Lanzi, Robin G; Hobel, Calvin; Shalowitz, Madeleine; Chinchilli, Vern; Raju, Tonse N K

    2015-04-01

Emerging evidence supports the theoretical and clinical importance of the preconception period in influencing pregnancy outcomes and child health. Collectively, this evidence affirms the need for a novel, integrative theoretical framework to design future investigations, integrate new findings, and identify promising, evidence-informed interventions to improve intergenerational health and reduce disparities. This article presents a transdisciplinary framework developed by the NIH Community Child Health Network (CCHN) through community-based participatory research processes. CCHN developed a Preconception Stress and Resiliency Pathways (PSRP) model by building local and multi-site community-academic participatory partnerships that established guidelines for research planning and decision-making; reviewed relevant findings from diverse disciplinary and community perspectives; and identified the major themes of stress and resilience within the context of families and communities. The PSRP model focuses on inter-relating the multiple, complex, and dynamic biosocial influences theoretically linked to family health disparities. The PSRP model borrowed from and then added original constructs relating to developmental origins of lifelong health, epigenetics, and neighborhood and community influences on pregnancy outcome and family functioning (cf. MCHJ 2014). Novel elements include the centrality of the preconception/inter-conception period, the role of fathers and the parental relationship, maternal allostatic load (a composite biomarker index of cumulative wear-and-tear of stress), resilience resources of parents, and local neighborhood and community level influences (e.g., employment, housing, education, health care, and stability of basic necessities). CCHN's integrative framework embraces new ways of thinking about how to improve outcomes for future generations, by starting before conception, by including all family members, and by engaging the community vigorously at multiple

  19. Toward Realistic Simulation of low-Level Clouds Using a Multiscale Modeling Framework With a Third-Order Turbulence Closure in its Cloud-Resolving Model Component

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man; Cheng, Anning

    2010-01-01

    This study presents preliminary results from a multiscale modeling framework (MMF) with an advanced third-order turbulence closure in its cloud-resolving model (CRM) component. In the original MMF, the Community Atmosphere Model (CAM3.5) is used as the host general circulation model (GCM), and the System for Atmospheric Modeling with a first-order turbulence closure is used as the CRM for representing cloud processes in each grid box of the GCM. The results of annual and seasonal means and diurnal variability are compared between the modified and original MMFs and the CAM3.5. The global distributions of low-level cloud amounts and precipitation and the amounts of low-level clouds in the subtropics and middle-level clouds in mid-latitude storm track regions in the modified MMF show substantial improvement relative to the original MMF when both are compared to observations. Some improvements can also be seen in the diurnal variability of precipitation.

  20. Influence of anisotropic grain boundary properties on the evolution of grain boundary character distribution during grain growth—a 2D level set study

    NASA Astrophysics Data System (ADS)

    Hallberg, Håkan

    2014-12-01

The present study extends a recently established 2D level set model of polycrystal microstructures by adding the influence of anisotropic grain boundary energy and mobility on microstructure evolution. The new model is used to trace the evolution of grain boundary character distribution during grain growth. The employed level set formulation conveniently allows the grain boundary characteristics to be quantified in terms of coincidence site lattice (CSL) type per unit of grain boundary length, providing a measure of the distribution of such boundaries. In the model, both the mobility and energy of the grain boundaries are allowed to vary with misorientation. In addition, the influence of initial polycrystal texture is studied by comparing results obtained from a polycrystal with random initial texture against results from a polycrystal that initially has a cube texture. It is shown that the proposed level set formulation can readily incorporate anisotropic grain boundary properties, and the simulation results further show that anisotropic grain boundary properties have only a minor influence on the evolution of the CSL boundary distribution during grain growth. As anisotropic boundary properties are considered, the most prominent changes in the CSL distributions are an increase of general low-angle Σ1 boundaries as well as a more stable presence of Σ3 boundaries. These observations also hold for the case of an initially cube-textured polycrystal; the presence of this kind of texture has little influence over the evolution of the CSL distribution. Taking into consideration the anisotropy of grain boundary properties, grain growth alone does not seem to be sufficient to promote any significantly increased overall presence of CSL boundaries.
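Classifying a boundary as a given CSL type is commonly done with the Brandon criterion; a minimal check, assuming the angular deviation from the ideal CSL rotation is already known (the paper does not state which criterion it uses):

```python
def is_csl_boundary(deviation_deg, sigma, theta0_deg=15.0):
    """Brandon criterion: a boundary counts as CSL type Σ if its deviation
    from the exact CSL misorientation is within Δθ = θ0 · Σ^(-1/2),
    with θ0 = 15° (the conventional low-angle limit)."""
    return deviation_deg <= theta0_deg * sigma ** -0.5

# Σ3 boundaries tolerate up to 15/√3 ≈ 8.66° of deviation:
print(is_csl_boundary(5.0, 3))   # → True
print(is_csl_boundary(10.0, 3))  # → False
```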

  1. A new watershed assessment framework for Nova Scotia: A high-level, integrated approach for regions without a dense network of monitoring stations

    NASA Astrophysics Data System (ADS)

    Sterling, Shannon M.; Garroway, Kevin; Guan, Yue; Ambrose, Sarah M.; Horne, Peter; Kennedy, Gavin W.

    2014-11-01

    High-level, integrated watershed assessments are a basic requirement for freshwater planning, as they create regional summaries of multiple environmental stressors for the prioritization of watershed conservation, restoration, monitoring, and mitigation. There is a heightened need for a high-level, integrated watershed assessment in Nova Scotia as it faces pressing watershed issues relating to acidification, soil erosion, acid rock drainage, eutrophication, and water withdrawals related to potential shale gas development. But because of the relative sparseness of the on-the-ground effects-based data, for example on water quality or fish assemblages, previously created approaches for integrated watershed assessment cannot be used. In a government/university collaboration, we developed a new approach that relies solely on easier-to-collect and more available exposure-based variables to perform the first high-level watershed assessment in Nova Scotia. In this assessment, a total of 295 watershed units were studied. We used Geographic Information Systems (GIS) to map and analyze 13 stressor variables that represent risks to aquatic environment (e.g., road/stream crossing density, acid rock drainage risk, surface water withdrawals, human land use, and dam density). We developed a model to link stressors with impacts to aquatic systems to serve as a basis for a watershed threat ranking system. Resource management activities performed by government and other stakeholders were also included in this analysis. Our assessment identifies the most threatened watersheds, enables informed comparisons among watersheds, and indicates where to focus resource management and monitoring efforts. Stakeholder communication tools produced by the NSWAP include a watershed atlas to communicate the assessment results to a broader audience, including policy makers and public stakeholders. 
This new framework for high-level watershed assessments provides a resource for other regions that also
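A toy version of the weighted multi-stressor ranking underlying such an exposure-based assessment; the stressor names echo the abstract, but the values, weights, and scoring scheme are hypothetical:

```python
def rank_watersheds(stressors, weights):
    """Weighted-sum threat score: min-max normalize each stressor across
    watersheds, combine with weights, and rank from most to least threatened.
    `stressors` maps watershed -> {stressor_name: raw value}."""
    names = list(next(iter(stressors.values())))
    lo = {n: min(v[n] for v in stressors.values()) for n in names}
    hi = {n: max(v[n] for v in stressors.values()) for n in names}
    scores = {}
    for w, vals in stressors.items():
        scores[w] = sum(weights[n] * (vals[n] - lo[n]) / (hi[n] - lo[n] or 1.0)
                        for n in names)
    return sorted(scores, key=scores.get, reverse=True)

stressors = {
    "A": {"road_crossings": 12, "withdrawals": 0.8, "land_use": 0.6},
    "B": {"road_crossings": 3, "withdrawals": 0.1, "land_use": 0.2},
    "C": {"road_crossings": 8, "withdrawals": 0.9, "land_use": 0.9},
}
weights = {"road_crossings": 1, "withdrawals": 1, "land_use": 1}
print(rank_watersheds(stressors, weights))  # → ['C', 'A', 'B']
```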

  2. On the Diurnal Cycle of Deep Convection, High-Level Cloud, and Upper Troposphere Water Vapor in the Multiscale Modeling Framework

    SciTech Connect

    Zhang, Yunyan; Klein, Stephen A.; Liu, Chuntao; Tian, Baijun; Marchand, Roger T.; Haynes, J. M.; McCoy, Renata; Zhang, Yuying; Ackerman, Thomas P.

    2008-08-22

    The Multiscale Modeling Framework (MMF), also called ‘‘superparameterization’’, embeds a cloud-resolving model (CRM) at each grid column of a general circulation model to replace traditional parameterizations of moist convection and large-scale condensation. This study evaluates the diurnal cycle of deep convection, high-level clouds, and upper troposphere water vapor by applying an infrared (IR) brightness temperature (Tb) and a precipitation radar (PR) simulator to the CRM column data. Simulator results are then compared with IR radiances from geostationary satellites and PR reflectivities from the Tropical Rainfall Measuring Mission (TRMM). While the actual surface precipitation rate in the MMF has a reasonable diurnal phase and amplitude when compared with TRMM observations, the IR simulator results indicate an inconsistency in the diurnal anomalies of high-level clouds between the model and the geostationary satellite data. Primarily because of its excessive high-level clouds, the MMF overestimates the simulated precipitation index (PI) and fails to reproduce the observed diurnal cycle phase relationships among PI, high-level clouds, and upper troposphere relative humidity. The PR simulator results show that over the tropical oceans, the occurrence fraction of reflectivity in excess of 20 dBZ is almost 1 order of magnitude larger than the TRMM data especially at altitudes above 6 km. Both results suggest that the MMF oceanic convection is overactive and possible reasons for this bias are discussed. However, the joint distribution of simulated IR Tb and PR reflectivity indicates that the most intense deep convection is found more often over tropical land than ocean, in agreement with previous observational studies.

  3. Use of the reciprocal calculation procedure for setting workplace emergency action levels for hydrocarbon mixtures and their relationship to lower explosive limits.

    PubMed

    Gardner, Ron

    2012-04-01

This paper proposes a novel use of the reciprocal calculation procedure (RCP) to calculate workplace emergency action levels (WEALs) for accidental releases of hydrocarbon mixtures. WEALs are defined here as the concentration in air at which area monitors should alarm to provide adequate warning and be sufficiently protective of health to allow at least enough time to don respiratory protective equipment (RPE) and escape. The rationale for the approach is analysed, and ways of defining suitable substance group guidance values (GVs) for input into the RCP are considered and compared. WEAL GVs could be based on: 3× RCP GVs (i.e. using the 3× rule), 5× RCP GVs (i.e. using the 5× rule for calculating ceiling values), emergency exposure limits, or immediately dangerous to life or health values (IDLHs). Of these, the method of choice is to base WEAL GVs on health-based IDLH values, which were developed for emergency situations in the workplace. However, IDLHs have only been set for 11 hydrocarbons, so the choice of GVs is also informed by comparison with possible GVs based on the other approaches. Using the proposed GVs, WEALs were calculated for various hydrocarbon mixtures, and the way they vary with the composition of the mixture was examined. Also, the level of health protection given by the current practice of setting emergency area alarms in the oil and gas industry at 10% of the lower explosive limit (LEL) was tested by comparing this with the WEAL. In the event of an accidental release, this comparison suggests that, provided that aromatics constitute <50% of the mixture, an alarm set at 10% LEL should provide adequate warning and be sufficiently protective of health to allow at least enough time to don RPE and escape. In the absence of better information or specific acute toxicity concerns (such as the presence of hydrogen sulphide), it is proposed that the WEALs be used as a guide for assessing the adequacy of area alarm levels in respect of warning
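The RCP itself is a simple reciprocal mixture rule; a sketch with illustrative (non-regulatory) group guidance values:

```python
def rcp_guidance_value(fractions, group_gvs):
    """Reciprocal calculation procedure: the mixture guidance value is the
    reciprocal of the fraction-weighted sum of reciprocal group GVs,
    GV_mix = 1 / Σ(F_i / GV_i), with mass fractions F_i summing to 1."""
    assert abs(sum(fractions.values()) - 1.0) < 1e-9
    return 1.0 / sum(f / group_gvs[g] for g, f in fractions.items())

# Hypothetical mixture: 70% aliphatics (GV 1200 mg/m³), 30% aromatics (GV 200 mg/m³).
# These GV numbers are made up for illustration, not values from the paper.
gv = rcp_guidance_value({"aliphatic": 0.7, "aromatic": 0.3},
                        {"aliphatic": 1200.0, "aromatic": 200.0})
print(round(gv))  # → 480
```

The aromatic fraction dominates the result because its lower GV contributes the larger reciprocal term, which mirrors the paper's caveat about mixtures with high aromatic content.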

  4. Pore-scale simulation of coupled reactive transport and dissolution in fractures and porous media using the level set interface tracking method

    SciTech Connect

    Hai Huang; Xiaoyi Li

    2011-01-01

A level set simulation methodology developed for modeling coupled reactive transport and structure evolution has been applied to dissolution in fracture apertures and porous media. Coupled processes such as fluid flow, reactant transport, and dissolution at the solid-liquid interfaces are handled simultaneously. The reaction-induced evolution of solid-liquid interfaces is captured using the level set method, which has the advantage of representing the interface with sub-grid-scale resolution. The coupled processes are simulated for several geometric models of fractures and porous media under various flow conditions and reaction rates. Quantitative relationships between permeability and porosity are obtained from some of the simulation results and compared with analytical constitutive relations (i.e., the conventional cubic law and the Carman-Kozeny law) based on simplified pore-space geometries and reaction-induced geometric evolution. The drastic deviation of the simulation results from these analytical theories is explained by the large local concentration gradients of reactants that develop within fracture apertures and individual pores, and by the consequently complex geometric evolution of fracture apertures and pores under mineral dissolution. The simulation results support the argument that traditional constitutive relations based on simplified geometries and conditions have limited applicability in predicting field-scale reactive transport, and that incorporation of micro-scale physics is necessary.
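The core level-set idea, tracking the moving solid-liquid interface as the zero contour of a function phi, can be illustrated with a toy 1-D upwind scheme. This is a sketch under the assumption of a constant dissolution speed; the study's actual solver additionally couples flow, reactant transport, and surface reaction in multi-dimensional geometries:

```python
# Toy 1-D illustration of level-set interface tracking for a dissolving
# solid: phi > 0 marks solid, phi < 0 marks fluid, and the zero level set
# is the solid-liquid interface. A constant dissolution speed v moves the
# interface into the solid via phi_t + v*|dphi/dx| = 0 (upwind scheme).
# This sketch is NOT the paper's coupled flow/transport solver.

def evolve(phi, dx, v, dt, steps):
    for _ in range(steps):
        new = phi[:]
        for i in range(1, len(phi) - 1):
            # Godunov upwind gradient magnitude for an advancing front (v > 0)
            dminus = (phi[i] - phi[i - 1]) / dx
            dplus = (phi[i + 1] - phi[i]) / dx
            grad = max(max(dminus, 0.0), max(-dplus, 0.0))
            new[i] = phi[i] - dt * v * grad
        phi = new
    return phi

# initial signed distance: interface at x = 0.5 on [0, 1], dx = 0.1,
# solid occupying x > 0.5
phi = [i * 0.1 - 0.5 for i in range(11)]
phi = evolve(phi, dx=0.1, v=1.0, dt=0.05, steps=4)
# the zero crossing has moved from x = 0.5 to x = 0.7 (speed 1 * time 0.2),
# i.e. the solid has dissolved back by 0.2
```

The sub-grid resolution mentioned in the abstract comes from the fact that the interface position is recovered from the zero crossing of phi between grid points, not from flagging whole cells as solid or fluid.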

  5. Impact of dose rate on accuracy of intensity modulated radiation therapy plan delivery using the pretreatment portal dosimetry quality assurance and setting up the workflow at hospital levels

    PubMed Central

    Kaviarasu, Karunakaran; Raj, N. Arunai Nambi; Murthy, K. Krishna; Babu, A. Ananda Giri; Prasad, Bhaskar Laxman Durga

    2015-01-01

The aim of this study was to examine the impact of dose rate on the accuracy of intensity modulated radiation therapy (IMRT) plan delivery by comparing the gamma agreement between calculated and measured portal doses in pretreatment quality assurance (QA) using electronic portal imaging device dosimetry, and to create a workflow for pretreatment IMRT QA at the hospital level. Because improved gamma agreement raises the quality of IMRT treatment delivery, gamma evaluation was carried out for the calculated and measured portal images using criteria of 3% dose difference and 3 mm distance-to-agreement (DTA). Three gamma parameters were analyzed: maximum gamma, average gamma, and the percentage of the field area with a gamma value > 1.0. These parameters were evaluated for 40 IMRT plans (315 IMRT fields) calculated for a 400 monitor units (MU)/min dose rate and a maximum multileaf collimator (MLC) speed of 2.5 cm/s. Gamma parameters for all 315 fields were within the acceptable limits set at our center. To further improve the gamma results, we set an action level for this study using the mean and standard deviation (SD) values from the 315 fields studied. Forty of the 315 IMRT fields showed low gamma agreement (gamma parameters > 2 SD, per the study's action level). These parameters were recalculated and reanalyzed for dose rates of 300, 400, and 500 MU/min. Lowering the dose rate improved the gamma agreement between calculated and measured portal doses of complicated fields, which may be attributed to the less complex motion of the MLC over the time and MU of the field/segment. An IMRT QA workflow was prepared that will help improve the quality of IMRT delivery. PMID:26865759
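The gamma index behind the 3%/3 mm criterion combines dose difference and distance-to-agreement into a single metric: a point passes if some nearby evaluated point is close enough in both dose and space. A simplified 1-D global-gamma sketch (clinical portal dosimetry uses 2-D dose images and vendor-specific normalization; the profile values below are made up for illustration):

```python
# Minimal 1-D sketch of the gamma index with 3% dose difference and
# 3 mm distance-to-agreement (DTA). Illustrative only: real gamma
# analysis is 2-D/3-D with global/local normalization options.

def gamma_1d(ref_dose, eval_dose, spacing_mm, dd=0.03, dta_mm=3.0):
    """Return the per-point gamma for two equally sampled dose profiles.

    Dose difference is normalized to the reference maximum (global gamma).
    """
    d_max = max(ref_dose)
    gammas = []
    for i, dr in enumerate(ref_dose):
        best = float("inf")
        for j, de in enumerate(eval_dose):
            dist = (i - j) * spacing_mm            # spatial separation, mm
            ddiff = (de - dr) / d_max              # relative dose difference
            cap = (dist / dta_mm) ** 2 + (ddiff / dd) ** 2
            best = min(best, cap ** 0.5)
        gammas.append(best)
    return gammas

# made-up calculated vs. measured profiles, 1 mm sample spacing
ref = [0.2, 0.5, 1.0, 0.5, 0.2]
meas = [0.21, 0.52, 0.99, 0.48, 0.2]
g = gamma_1d(ref, meas, spacing_mm=1.0)
pass_rate = sum(v <= 1.0 for v in g) / len(g)
```

A field "passes" when gamma <= 1.0 at (nearly) every point; the study's three parameters correspond to max(g), the mean of g, and the fraction of points with g > 1.0.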

  6. Aggregating Hydrometeorological Data from International Monitoring Networks Across Earth's Largest Lake System to Quantify Uncertainty in Historical Water Budget Records, Improve Regional Water Budget Projections, and Differentiate Drivers Behind a Recent Record-Setting Surge in Water Levels

    NASA Astrophysics Data System (ADS)

    Gronewold, A.; Bruxer, J.; Smith, J.; Hunter, T.; Fortin, V.; Clites, A. H.; Durnford, D.; Qian, S.; Seglenieks, F.

    2015-12-01

Resolving and projecting the water budget of the North American Great Lakes basin (Earth's largest lake system) requires aggregation of data from a complex array of in situ monitoring and remote sensing products that cross an international border (leading to potential sources of bias and other inconsistencies) and are relatively sparse over the surfaces of the lakes themselves. Data scarcity over the lake surfaces is a particularly significant problem because, unlike Earth's other large freshwater basins, the Great Lakes basin water budget consists (on annual scales) of relatively equal contributions from runoff, over-lake precipitation, and over-lake evaporation. Consequently, understanding the drivers behind changes in regional water storage and water levels requires a data management framework that can reconcile uncertainties associated with data scarcity and bias, and propagate those uncertainties into regional water budget projections and historical records. Here, we assess the development of a historical hydrometeorological database for the entire Great Lakes basin with records dating back to the late 1800s, and describe improvements specifically intended to differentiate the hydrological, climatological, and anthropogenic drivers behind recent extreme changes in Great Lakes water levels. Our assessment includes a detailed analysis of the extent to which the extreme cold winters in central North America in 2013-2014 (caused by anomalous meridional upper-air flow, commonly referred to in the public media as the "polar vortex" phenomenon) altered the thermal and hydrologic regimes of the Great Lakes and led to a record-setting surge in water levels between January 2014 and December 2015.

  7. Hair mercury and urinary cadmium levels in Belgian children and their mothers within the framework of the COPHES/DEMOCOPHES projects.

    PubMed

    Pirard, Catherine; Koppen, Gudrun; De Cremer, Koen; Van Overmeire, Ilse; Govarts, Eva; Dewolf, Marie-Christine; Van De Mieroop, Els; Aerts, Dominique; Biot, Pierre; Casteleyn, Ludwine; Kolossa-Gehring, Marike; Schwedler, Gerda; Angerer, Jürgen; Koch, Holger M; Schindler, Birgit K; Castaño, Argelia; Esteban, Marta; Schoeters, Greet; Den Hond, Elly; Sepai, Ovnair; Exley, Karen; Horvat, Milena; Bloemen, Louis; Knudsen, Lisbeth E; Joas, Reinhard; Joas, Anke; Van Loco, Joris; Charlier, Corinne

    2014-02-15

A harmonized human biomonitoring pilot study was set up within the frame of the European projects DEMOCOPHES and COPHES. In 17 European countries, biomarkers of selected environmental pollutants, including urinary cadmium and hair mercury, were measured in children and their mothers in order to obtain European-wide comparison values for these chemicals. The Belgian participant population consisted of 129 school children (6-11 years) and their mothers (≤ 45 years) living in urban or rural areas of Belgium. The geometric mean mercury levels in hair were 0.383 μg/g for mothers and 0.204 μg/g for children. Cadmium in mothers' and children's urine was detected at geometric mean concentrations of 0.21 and 0.04 μg/l, respectively. For both biomarkers, levels measured in mothers and their children were correlated. While urinary cadmium levels increased with age, no trend was found for hair mercury content, except that mothers had higher levels than children. Hair mercury content increased significantly with the number of dental amalgam fillings, which partially explains the mothers' higher levels through their greater prevalence of amalgams compared with children. Fish and seafood consumption was the other main determinant of hair mercury levels. No relationship was found between smoking status and cadmium or mercury levels, although the studied population included very few smokers. Urinary cadmium levels were higher in both mothers and children living in urban areas, while for mercury this difference was significant only for children. Our small population showed urinary cadmium and hair mercury levels lower than the health-based guidelines suggested by the WHO or the JECFA (Joint FAO/WHO Expert Committee on Food Additives). Only 1% had cadmium levels slightly above the German HBM-I value (1 μg/l for adults), and 9% exceeded the 1 μg mercury/g hair suggested by the US EPA. PMID:24333995

  8. Computer-aided measurement of liver volumes in CT by means of geodesic active contour segmentation coupled with level-set algorithms

    SciTech Connect

    Suzuki, Kenji; Kohlbrenner, Ryan; Epstein, Mark L.; Obajuluwa, Ademola M.; Xu Jianwu; Hori, Masatoshi

    2010-05-15

Purpose: Computerized liver extraction from hepatic CT images is challenging because the liver often abuts other organs of similar density. The purpose of this study was to develop a computer-aided measurement of liver volumes in hepatic CT. Methods: The authors developed a computerized liver extraction scheme based on geodesic active contour segmentation coupled with level-set contour evolution. First, an anisotropic diffusion filter was applied to portal-venous-phase CT images for noise reduction while preserving the liver structure, followed by a scale-specific gradient magnitude filter to enhance the liver boundaries. A nonlinear grayscale converter then enhanced the contrast of the liver parenchyma. Using the liver-parenchyma-enhanced image as a speed function, a fast-marching level-set algorithm generated an initial contour that roughly estimated the liver shape. A geodesic active contour segmentation algorithm coupled with level-set contour evolution refined the initial contour to define the liver boundaries more precisely, and the liver volume was calculated from these refined boundaries. Hepatic CT scans of 15 prospective liver donors were obtained under a liver transplant protocol with a multidetector CT system. The liver volumes extracted by the computerized scheme were compared to those traced manually by a radiologist, used as the "gold standard". Results: The mean liver volume obtained with our scheme was 1504 cc, whereas the mean gold-standard manual volume was 1457 cc, resulting in a mean absolute difference of 105 cc (7.2%). The computer-estimated liver volumetrics agreed excellently with the gold-standard manual volumetrics (intraclass correlation coefficient, 0.95) with no statistically significant difference (F=0.77; p(F≤f)=0.32). The average accuracy, sensitivity, specificity, and percent volume error were 98.4%, 91.1%, 99.1%, and 7.2%, respectively. Computerized CT liver volumetry would require substantially less completion time
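Once the level-set segmentation yields a binary liver mask, the volumetry step reduces to counting segmented voxels and scaling by the voxel size. A minimal sketch with a made-up toy mask (function names, mask, and spacing values are illustrative, not from the paper):

```python
# Back-of-envelope sketch of the volumetry step: volume equals the
# number of segmented voxels times the physical voxel volume. The mask
# and spacing below are toy values for illustration only.

def mask_volume_cc(mask, spacing_mm):
    """mask: nested lists of 0/1 (slices x rows x cols);
    spacing_mm: (dz, dy, dx) voxel spacing in millimetres."""
    voxels = sum(v for plane in mask for row in plane for v in row)
    dz, dy, dx = spacing_mm
    return voxels * dz * dy * dx / 1000.0  # mm^3 -> cc

def percent_volume_error(auto_cc, manual_cc):
    """Percent volume error of the automatic result vs. the manual one."""
    return abs(auto_cc - manual_cc) / manual_cc * 100.0

# 2x2x2 toy mask with 5 "liver" voxels, 1 mm isotropic spacing
mask = [[[1, 1], [1, 0]], [[1, 1], [0, 0]]]
vol = mask_volume_cc(mask, (1.0, 1.0, 1.0))  # 5 voxels -> 0.005 cc
```

The study's 7.2% figure is this kind of percent volume error averaged over the 15 donor cases, with the radiologist's manual tracing as the reference.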

  9. Evaluation of the Serum Levels of Nitric Oxide among Diabetic Patients and its Correlation with Lipid Profile as well as Oxidative Stress in North Indian Setting

    PubMed Central

    Trivedi, Arvind; Verma, Neetu; Panwar, Ajay; Kumar, Pradeep

    2016-01-01

Introduction Diabetes mellitus is a disease with a rapidly increasing prevalence that demands continued research into novel methods to both prevent and treat this dis