Science.gov

Sample records for level set framework

  1. An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs

    PubMed Central

    Mosaliganti, Kishore R.; Gelas, Arnaud; Megason, Sean G.

    2013-01-01

In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations, which restricts future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK) v4 architecture, we implemented a level-set software design that is flexible with respect to different numerical representations (continuous, discrete, and sparse) and grid representations (point-, mesh-, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile-time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g., gradients and Hessians) across multiple terms are eliminated. The framework further enables the evolution of multiple level-sets for multi-object segmentation and processing of large datasets. To do so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level-set in real-time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a developing zebrafish
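The container-based design this abstract describes, in which a level-set update is the sum of independently pluggable PDE terms, can be sketched as follows. All names here (`TermContainer`, the toy terms) are illustrative stand-ins, not ITK's actual v4 API.

```python
# Sketch of a container of level-set PDE terms: the update at a point
# is the sum of whatever terms are currently registered, so terms can
# be added or removed at any stage of the evolution.
from typing import Callable, List


class TermContainer:
    """Holds PDE terms; the update at a point is their sum."""

    def __init__(self) -> None:
        self._terms: List[Callable[[float], float]] = []

    def add_term(self, term: Callable[[float], float]) -> None:
        self._terms.append(term)

    def evaluate(self, phi: float) -> float:
        # Each term sees the current level-set value and contributes
        # its share of the update independently.
        return sum(term(phi) for term in self._terms)


container = TermContainer()
container.add_term(lambda phi: -0.5 * phi)  # e.g. a propagation-like term
container.add_term(lambda phi: 0.1)         # e.g. a constant advection term

print(container.evaluate(2.0))  # -0.9
```

A real implementation would also cache shared quantities (gradients, Hessians) so that multiple terms reuse them, as the abstract notes.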

  2. Setting Healthcare Priorities at the Macro and Meso Levels: A Framework for Evaluation

    PubMed Central

    Barasa, Edwine W.; Molyneux, Sassy; English, Mike; Cleary, Susan

    2015-01-01

    Background: Priority setting in healthcare is a key determinant of health system performance. However, there is no widely accepted priority setting evaluation framework. We reviewed literature with the aim of developing and proposing a framework for the evaluation of macro and meso level healthcare priority setting practices. Methods: We systematically searched Econlit, PubMed, CINAHL, and EBSCOhost databases and supplemented this with searches in Google Scholar, relevant websites and reference lists of relevant papers. A total of 31 papers on evaluation of priority setting were identified. These were supplemented by broader theoretical literature related to evaluation of priority setting. A conceptual review of selected papers was undertaken. Results: Based on a synthesis of the selected literature, we propose an evaluative framework that requires that priority setting practices at the macro and meso levels of the health system meet the following conditions: (1) Priority setting decisions should incorporate both efficiency and equity considerations as well as the following outcomes; (a) Stakeholder satisfaction, (b) Stakeholder understanding, (c) Shifted priorities (reallocation of resources), and (d) Implementation of decisions. (2) Priority setting processes should also meet the procedural conditions of (a) Stakeholder engagement, (b) Stakeholder empowerment, (c) Transparency, (d) Use of evidence, (e) Revisions, (f) Enforcement, and (g) Being grounded on community values. Conclusion: Available frameworks for the evaluation of priority setting are mostly grounded on procedural requirements, while few have included outcome requirements. There is, however, increasing recognition of the need to incorporate both consequential and procedural considerations in priority setting practices. 
In this review, we adopt an integrative approach to develop and propose a framework for the evaluation of priority setting practices at the macro and meso levels that draws from these

  3. A unified variational segmentation framework with a level-set based sparse composite shape prior

    NASA Astrophysics Data System (ADS)

    Liu, Wenyang; Ruan, Dan

    2015-03-01

Image segmentation plays an essential role in many medical applications. Low SNR conditions and various artifacts make its automation challenging. To achieve robust and accurate segmentation results, a good approach is to introduce proper shape priors. In this study, we present a unified variational segmentation framework that regularizes the target shape with a level-set based sparse composite prior. When the variational problem is solved with a block minimization/descent scheme, the regularizing impact of the sparse composite prior can be observed to adjust to the most recent shape estimate, and may be interpreted as a ‘dynamic’ shape prior, yet without compromising convergence thanks to the unified energy framework. The proposed method was applied to segment the corpus callosum from 2D MR images and the liver from 3D CT volumes. Its performance was evaluated using the Dice Similarity Coefficient and Hausdorff distance, and compared with two benchmark level-set based segmentation methods. The proposed method achieved statistically significantly higher accuracy in both experiments and avoided faulty inclusion/exclusion of surrounding structures with similar intensities, as opposed to the benchmark methods.

  4. A Unified Variational Segmentation Framework with a Level-set based Sparse Composite Shape Prior

    PubMed Central

    Liu, Wenyang; Ruan, Dan

    2015-01-01

Image segmentation plays an essential role in many medical applications. Low SNR conditions and various artifacts make its automation challenging. To achieve robust and accurate segmentation results, a good approach is to introduce proper shape priors. In this study, we present a unified variational segmentation framework that regularizes the target shape with a level-set based sparse composite prior. When the variational problem is solved with a block minimization/descent scheme, the regularizing impact of the sparse composite prior can be observed to adjust to the most recent shape estimate, and may be interpreted as a “dynamic” shape prior, yet without compromising convergence thanks to the unified energy framework. The proposed method was applied to segment the corpus callosum from 2D MR images and the liver from 3D CT volumes. Its performance was evaluated using the Dice Similarity Coefficient and Hausdorff distance, and compared with two benchmark level-set based segmentation methods. The proposed method achieved statistically significantly higher accuracy in both experiments and avoided faulty inclusion/exclusion of surrounding structures with similar intensities, as opposed to the benchmark methods. PMID:25668234

  5. A multi-phase level set framework for source reconstruction in bioluminescence tomography

    SciTech Connect

    Huang Heyu; Qu Xiaochao; Liang Jimin; He Xiaowei; Chen Xueli; Yang Da'an; Tian Jie

    2010-07-01

We propose a novel multi-phase level set algorithm for solving the inverse problem of bioluminescence tomography. The distribution of the unknown interior source is considered piecewise constant and is represented using multiple level set functions. The localization of the interior bioluminescence source is implemented by tracing the evolution of the level set functions. An alternate search scheme is incorporated to ensure a globally optimal reconstruction. Both numerical and physical experiments are performed to evaluate the developed level set reconstruction method. Reconstruction results show that the proposed method can stably resolve the interior source of bioluminescence tomography.
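The multi-phase representation, in which sign combinations of several level set functions encode a piecewise-constant source, can be sketched as follows; the region constants here are illustrative.

```python
# With two level-set functions, the four sign combinations
# (-,-), (-,+), (+,-), (+,+) partition the domain into up to four
# regions, each assigned its own constant value.
import numpy as np


def multiphase_value(phi1, phi2, c):
    """Map sign combinations of (phi1, phi2) to region constants c[0..3]."""
    idx = 2 * (phi1 >= 0).astype(int) + (phi2 >= 0).astype(int)
    return np.asarray(c)[idx]


# Four sample points, one in each sign quadrant.
phi1 = np.array([-1.0, -1.0, 1.0, 1.0])
phi2 = np.array([-1.0, 1.0, -1.0, 1.0])
print(multiphase_value(phi1, phi2, c=[0.0, 1.0, 2.0, 3.0]))  # [0. 1. 2. 3.]
```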

  6. A coupled level-set framework for bladder wall segmentation with application to MRI-based virtual cystoscopy

    NASA Astrophysics Data System (ADS)

    Duan, Chaijie; Bao, Shanglian; Liang, Zhengrong

    2009-02-01

In this paper, we propose a coupled level-set framework for segmentation of the bladder wall using T1-weighted magnetic resonance (MR) images. The segmentation results will be used for non-invasive MR-based virtual cystoscopy (VCys). The framework uses two level-set functions to segment the inner and outer borders of the bladder wall respectively. Based on the Chan-Vese (C-V) model, a local adaptive fitting (LAF) image energy is introduced to capture local intensity contrast. Compared with previous work, our method has the following advantages. First, unlike most other work, which segments only a single bladder boundary, our method automatically extracts both the inner and the outer border of the bladder wall. Secondly, we focus on T1-weighted MR images, which decrease the image intensity of the urine and therefore minimize the partial volume effect (PVE) on the bladder wall for detection of abnormalities on the mucosa layer, in contrast to work on CT images and T2-weighted MR images, which enhance the intensity of the urine and encounter the PVE. In addition, T1-weighted MR images provide the best tissue contrast for detection of the outer border of the bladder wall. Since MR images tend to be inhomogeneous and to exhibit ghost artifacts due to motion and other causes, as compared to computed tomography (CT)-based VCys, our framework makes it easy to control the geometric properties of the level-set functions to mitigate the influence of inhomogeneity and ghosts. Finally, a variety of geometric parameters, such as the thickness of the bladder wall, can be measured easily under the level-set framework. These parameters are clinically important for VCys. The segmentation results were evaluated by experienced radiologists, whose feedback strongly demonstrated the usefulness of such a coupled level-set framework for VCys.

  7. Improving district level health planning and priority setting in Tanzania through implementing accountability for reasonableness framework: Perceptions of stakeholders

    PubMed Central

    2010-01-01

    Background In 2006, researchers and decision-makers launched a five-year project - Response to Accountable Priority Setting for Trust in Health Systems (REACT) - to improve planning and priority-setting through implementing the Accountability for Reasonableness framework in Mbarali District, Tanzania. The objective of this paper is to explore the acceptability of Accountability for Reasonableness from the perspectives of the Council Health Management Team, local government officials, health workforce and members of user boards and committees. Methods Individual interviews were carried out with different categories of actors and stakeholders in the district. The interview guide consisted of a series of questions, asking respondents to describe their perceptions regarding each condition of the Accountability for Reasonableness framework in terms of priority setting. Interviews were analysed using thematic framework analysis. Documentary data were used to support, verify and highlight the key issues that emerged. Results Almost all stakeholders viewed Accountability for Reasonableness as an important and feasible approach for improving priority-setting and health service delivery in their context. However, a few aspects of Accountability for Reasonableness were seen as too difficult to implement given the socio-political conditions and traditions in Tanzania. Respondents mentioned: budget ceilings and guidelines, low level of public awareness, unreliable and untimely funding, as well as the limited capacity of the district to generate local resources as the major contextual factors that hampered the full implementation of the framework in their context. Conclusion This study was one of the first assessments of the applicability of Accountability for Reasonableness in health care priority-setting in Tanzania. The analysis, overall, suggests that the Accountability for Reasonableness framework could be an important tool for improving priority-setting processes in the

  8. Monitoring street-level spatial-temporal variations of carbon monoxide in urban settings using a wireless sensor network (WSN) framework.

    PubMed

    Wen, Tzai-Hung; Jiang, Joe-Air; Sun, Chih-Hong; Juang, Jehn-Yih; Lin, Tzu-Shiang

    2013-12-01

Air pollution has become a severe environmental problem due to urbanization and heavy traffic. Monitoring street-level air quality is an important issue, but most official monitoring stations are installed to monitor large-scale air quality conditions, and their limited spatial resolution cannot reflect the detailed variations in air quality that may be induced by traffic jams. By deploying wireless sensors on crossroads and main roads, this study established a pilot framework for a wireless sensor network (WSN)-based real-time monitoring system to understand street-level spatial-temporal changes of carbon monoxide (CO) in urban settings. The system consists of two major components. The first component is the deployment of wireless sensors. We deployed 44 sensor nodes, 40 transmitter nodes and four gateway nodes in this study. Each sensor node includes a signal processing module, a CO sensor and a wireless communication module. In order to capture realistic human exposure to traffic pollutants, all sensors were deployed at a height of 1.5 m on lampposts and traffic signs. The study area covers a total length of 1.5 km of Keelung Road in Taipei City. The other component is a map-based monitoring platform for sensor data visualization and manipulation in time and space. Using this intensive real-time street-level monitoring framework, we compared the spatial-temporal patterns of air pollution in different time periods. Our results capture four CO concentration peaks throughout the day at a location along an arterial road near a traffic sign. The hourly average could reach 5.3 ppm from 5:00 pm to 7:00 pm due to traffic congestion. The proposed WSN-based framework captures detailed ground information and the potential risk of human exposure to traffic-related air pollution. It also provides street-level insights into real-time monitoring for further early warning of air pollution and urban environmental management.
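The hourly averages quoted above come from aggregating raw sensor readings; a minimal sketch of that step, with invented readings, might look like:

```python
# Reduce per-reading CO values (ppm) from a sensor node to hourly means.
from collections import defaultdict


def hourly_means(readings):
    """readings: iterable of (hour_of_day, ppm) pairs -> {hour: mean ppm}."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for hour, ppm in readings:
        sums[hour] += ppm
        counts[hour] += 1
    return {hour: sums[hour] / counts[hour] for hour in sums}


# Invented readings around the evening rush hour.
data = [(17, 5.0), (17, 5.6), (18, 4.8), (18, 5.2)]
print(hourly_means(data))  # {17: 5.3, 18: 5.0}
```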

  9. Monitoring Street-Level Spatial-Temporal Variations of Carbon Monoxide in Urban Settings Using a Wireless Sensor Network (WSN) Framework

    PubMed Central

    Wen, Tzai-Hung; Jiang, Joe-Air; Sun, Chih-Hong; Juang, Jehn-Yih; Lin, Tzu-Shiang

    2013-01-01

Air pollution has become a severe environmental problem due to urbanization and heavy traffic. Monitoring street-level air quality is an important issue, but most official monitoring stations are installed to monitor large-scale air quality conditions, and their limited spatial resolution cannot reflect the detailed variations in air quality that may be induced by traffic jams. By deploying wireless sensors on crossroads and main roads, this study established a pilot framework for a wireless sensor network (WSN)-based real-time monitoring system to understand street-level spatial-temporal changes of carbon monoxide (CO) in urban settings. The system consists of two major components. The first component is the deployment of wireless sensors. We deployed 44 sensor nodes, 40 transmitter nodes and four gateway nodes in this study. Each sensor node includes a signal processing module, a CO sensor and a wireless communication module. In order to capture realistic human exposure to traffic pollutants, all sensors were deployed at a height of 1.5 m on lampposts and traffic signs. The study area covers a total length of 1.5 km of Keelung Road in Taipei City. The other component is a map-based monitoring platform for sensor data visualization and manipulation in time and space. Using this intensive real-time street-level monitoring framework, we compared the spatial-temporal patterns of air pollution in different time periods. Our results capture four CO concentration peaks throughout the day at a location along an arterial road near a traffic sign. The hourly average could reach 5.3 ppm from 5:00 pm to 7:00 pm due to traffic congestion. The proposed WSN-based framework captures detailed ground information and the potential risk of human exposure to traffic-related air pollution. It also provides street-level insights into real-time monitoring for further early warning of air pollution and urban environmental management. PMID:24287859

  10. A framework and a set of tools called Nutting models to estimate retention capacities and loads of nitrogen and phosphorus in rivers at catchment and national level (France)

    NASA Astrophysics Data System (ADS)

    Legeay, Pierre-Louis; Moatar, Florentina; Dupas, Rémi; Gascuel-Odoux, Chantal

    2016-04-01

The Nutting-N and Nutting-P models (Dupas et al., 2013, 2015) have been developed to estimate nitrogen and phosphorus nonpoint-source emissions to surface water, using readily available data. These models were inspired by the US model SPARROW (Smith et al., 1997) and the European model GREEN (Grizzetti et al., 2008), i.e. statistical approaches that link nitrogen and phosphorus surplus to catchment land and river characteristics in order to estimate relative catchment retention capacities. The nutrient load (L) at the outlet of each catchment is expressed as: L = R*(B*DS + PS) [1] where DS is diffuse sources (i.e. surplus in kg ha-1 yr-1 for N, P storage in soil for P), PS is point sources of domestic and industrial origin (kg ha-1 yr-1), and R and B are the river-system and basin reduction factors, respectively; both combine observed variables and calibrated parameters. The model was calibrated on independent catchments for the 2005-2009 and 2008-2012 periods. Variables were selected according to the Bayesian Information Criterion (BIC) in order to optimize the predictive performance of the models. From these basic models, different improvements have been realized to build a framework and a set of tools: 1) a routing module has been added in order to improve estimations for 4th- or 5th-order streams, i.e. upscaling the basic Nutting approach; 2) a territorial module, in order to test the models at local scale (from 500 to 5000 km²); 3) a seasonal estimation has been investigated. The basic approach as well as the territorial application will be illustrated. These tools allow water managers to identify at-risk areas where high nutrient loads are estimated, as well as areas where retention is potentially high and can buffer high nutrient sources. References Dupas R., Curie F., Gascuel-Odoux C., Moatar F., Delmas M., Parnaudeau, V., Durand P., 2013. Assessing N emissions in surface water at the national level: Comparison of country-wide vs. regionalized models. Science of the Total Environment
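Equation [1] above, L = R*(B*DS + PS), is simple enough to transcribe directly; the parameter values below are arbitrary placeholders, not calibrated Nutting parameters.

```python
def nutrient_load(ds, ps, r, b):
    """Load at the catchment outlet from diffuse (ds) and point (ps)
    sources, with basin (b) and river-system (r) reduction factors,
    per equation [1]: L = R * (B * DS + PS)."""
    return r * (b * ds + ps)


# Placeholder values: half the diffuse surplus reaches the river,
# and half of the combined input survives in-stream retention.
print(nutrient_load(ds=20.0, ps=2.0, r=0.5, b=0.5))  # 6.0
```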

  11. Conceptual frameworks for setting environmental standards.

    PubMed

    Philipp, R

    1996-01-01

    Following the Second European Conference on Environment and Health, held from 20 to 22 June 1994 in Helsinki, the World Health Organization (WHO) established a National Environmental Health Action Plan pilot project. During 1995, and as part of its work for this project with the WHO European Environmental Health Committee, the UK Royal Commission on Environmental Pollution began to seek evidence for the basis of setting environmental standards and to ask if a more consistent and robust basis can be found for establishing them. This paper explores the conceptual frameworks needed to help establish policy and address practical questions associated with different pollutants, exposures and environmental settings. It addresses sustainable development, inter-generational equity and environmental quality, the European Charter on Environment and Health, the Treaty of Maastricht, economic, educational and training issues, risk assessment, the role of environmental epidemiology, and definitions of environmental quality objectives, environmental health indicators, environmental epidemiology and environmental impact assessment.

  12. An adaptive level set method

    SciTech Connect

    Milne, R.B.

    1995-12-01

    This thesis describes a new method for the numerical solution of partial differential equations of the parabolic type on an adaptively refined mesh in two or more spatial dimensions. The method is motivated and developed in the context of the level set formulation for the curvature dependent propagation of surfaces in three dimensions. In that setting, it realizes the multiple advantages of decreased computational effort, localized accuracy enhancement, and compatibility with problems containing a range of length scales.
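The adaptivity criterion implied here, refining only near the moving interface, can be sketched in one function; the band width and the sample level-set values are illustrative.

```python
# Refine only the grid cells near the zero level set, where the
# localized accuracy enhancement pays off.
import numpy as np


def cells_to_refine(phi, band=1.5):
    """Return indices of cells within `band` of the interface (phi == 0)."""
    return np.flatnonzero(np.abs(phi) < band)


# Signed-distance-like samples along a 1-D strip of cells.
phi = np.array([-4.0, -2.0, -1.0, 0.5, 2.0, 5.0])
print(cells_to_refine(phi))  # [2 3]
```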

  13. High-Level Application Framework for LCLS

    SciTech Connect

    Chu, P; Chevtsov, S.; Fairley, D.; Larrieu, C.; Rock, J.; Rogind, D.; White, G.; Zalazny, M.; /SLAC

    2008-04-22

    A framework for high level accelerator application software is being developed for the Linac Coherent Light Source (LCLS). The framework is based on plug-in technology developed by an open source project, Eclipse. Many existing functionalities provided by Eclipse are available to high-level applications written within this framework. The framework also contains static data storage configuration and dynamic data connectivity. Because the framework is Eclipse-based, it is highly compatible with any other Eclipse plug-ins. The entire infrastructure of the software framework will be presented. Planned applications and plug-ins based on the framework are also presented.

  14. Standard Setting to an International Reference Framework: Implications for Theory and Practice

    ERIC Educational Resources Information Center

    Lim, Gad S.; Geranpayeh, Ardeshir; Khalifa, Hanan; Buckendahl, Chad W.

    2013-01-01

    Standard setting theory has largely developed with reference to a typical situation, determining a level or levels of performance for one exam for one context. However, standard setting is now being used with international reference frameworks, where some parameters and assumptions of classical standard setting do not hold. We consider the…

  15. Can frameworks inform knowledge about health policy processes? Reviewing health policy papers on agenda setting and testing them against a specific priority-setting framework.

    PubMed

    Walt, Gill; Gilson, Lucy

    2014-12-01

    This article systematically reviews a set of health policy papers on agenda setting and tests them against a specific priority-setting framework. The article applies the Shiffman and Smith framework in extracting and synthesizing data from an existing set of papers, purposively identified for their relevance and systematically reviewed. Its primary aim is to assess how far the component parts of the framework help to identify the factors that influence the agenda setting stage of the policy process at global and national levels. It seeks to advance the field and inform the development of theory in health policy by examining the extent to which the framework offers a useful approach for organizing and analysing data. Applying the framework retrospectively to the selected set of papers, it aims to explore influences on priority setting and to assess how far the framework might gain from further refinement or adaptation, if used prospectively. In pursuing its primary aim, the article also demonstrates how the approach of framework synthesis can be used in health policy analysis research.

  16. Towards a Framework for Change Detection in Data Sets

    NASA Astrophysics Data System (ADS)

    Böttcher, Mirko; Nauck, Detlef; Ruta, Dymitr; Spott, Martin

Since the world with its markets, innovations and customers is changing faster than ever before, the key to survival for businesses is the ability to detect, assess and respond to changing conditions rapidly and intelligently. Discovering changes and reacting to or acting upon them before others do has therefore become a strategic issue for many companies. However, existing data analysis techniques are insufficient for this task since they typically assume that the domain under consideration is stable over time. This paper presents a framework that detects changes within a data set at virtually any level of granularity. The underlying idea is to derive a rule-based description of the data set at different points in time and to subsequently analyse how these rules change. Nevertheless, further techniques are required to assist the data analyst in interpreting and assessing these changes. Therefore the framework also contains methods to discard rules that are non-drivers for change and to assess the interestingness of detected changes.

  17. Fast Sparse Level Sets on Graphics Hardware.

    PubMed

    Jalba, Andrei C; van der Laan, Wladimir J; Roerdink, Jos B T M

    2013-01-01

    The level-set method is one of the most popular techniques for capturing and tracking deformable interfaces. Although level sets have demonstrated great potential in visualization and computer graphics applications, such as surface editing and physically based modeling, their use for interactive simulations has been limited due to the high computational demands involved. In this paper, we address this computational challenge by leveraging the increased computing power of graphics processors, to achieve fast simulations based on level sets. Our efficient, sparse GPU level-set method is substantially faster than other state-of-the-art, parallel approaches on both CPU and GPU hardware. We further investigate its performance through a method for surface reconstruction, based on GPU level sets. Our novel multiresolution method for surface reconstruction from unorganized point clouds compares favorably with recent, existing techniques and other parallel implementations. Finally, we point out that both level-set computations and rendering of level-set surfaces can be performed at interactive rates, even on large volumetric grids. Therefore, many applications based on level sets can benefit from our sparse level-set method.
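The sparsity that makes such GPU methods fast comes from updating only a narrow band of points near the zero level set. A minimal CPU sketch of that idea (not the paper's GPU implementation):

```python
# Narrow-band level-set update: advance phi only where |phi| is small,
# leaving the (typically much larger) rest of the grid untouched.
import numpy as np


def active_mask(phi, width=2.0):
    """Mask of points near the zero crossing (the narrow band)."""
    return np.abs(phi) <= width


def sparse_step(phi, speed, dt=0.1, width=2.0):
    """Advance phi by -dt * speed, but only inside the narrow band."""
    out = phi.copy()
    band = active_mask(phi, width)
    out[band] -= dt * speed
    return out


phi = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(sparse_step(phi, speed=1.0))  # only the middle three values move
```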

  18. International Review of Frameworks for Standard Setting & Labeling Development

    SciTech Connect

    Zhou, Nan; Khanna, Nina Zheng; Fridley, David; Romankiewicz, John

    2012-09-01

As appliance energy efficiency standards and labeling (S&L) programs reach a broader geographic and product scope, a series of sophisticated and complex technical and economic analyses have been adopted by different countries in the world to support and enhance these growing S&L programs. The initial supporting techno-economic and impact analyses for S&L development make up a defined framework and process for setting and developing appropriate appliance efficiency standards and labeling programs. This report reviews in depth the existing frameworks for standard setting and label development in the well-established programs of the U.S., Australia and the EU to identify and evaluate major trends in how and why key analyses are undertaken and to understand major similarities and differences between each of the frameworks.

  19. Level Set Method for Positron Emission Tomography

    PubMed Central

    Chan, Tony F.; Li, Hongwei; Lysaker, Marius; Tai, Xue-Cheng

    2007-01-01

    In positron emission tomography (PET), a radioactive compound is injected into the body to promote a tissue-dependent emission rate. Expectation maximization (EM) reconstruction algorithms are iterative techniques which estimate the concentration coefficients that provide the best fitted solution, for example, a maximum likelihood estimate. In this paper, we combine the EM algorithm with a level set approach. The level set method is used to capture the coarse scale information and the discontinuities of the concentration coefficients. An intrinsic advantage of the level set formulation is that anatomical information can be efficiently incorporated and used in an easy and natural way. We utilize a multiple level set formulation to represent the geometry of the objects in the scene. The proposed algorithm can be applied to any PET configuration, without major modifications. PMID:18354724
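The EM reconstruction step that the level-set formulation is combined with follows the standard ML-EM update, x_{k+1} = x_k * A^T(y / (A x_k)) / (A^T 1). A toy sketch on an invented system (not the paper's implementation):

```python
# One ML-EM iteration for emission tomography: A is the system matrix,
# y the measured counts, x the current concentration estimate.
import numpy as np


def em_step(x, A, y):
    """Multiplicative EM update preserving nonnegativity of x."""
    forward = A @ x                        # expected counts
    ratio = y / forward                    # measured / expected
    return x * (A.T @ ratio) / (A.T @ np.ones_like(y))


# Tiny 2-detector / 2-voxel system; values are illustrative.
A = np.array([[1.0, 0.0],
              [0.0, 1.0]])
y = np.array([4.0, 9.0])
x = np.ones(2)
for _ in range(5):
    x = em_step(x, A, y)
print(x)  # [4. 9.] -- the identity system converges immediately
```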

  20. An information-aware framework for exploring multivariate data sets.

    PubMed

    Biswas, Ayan; Dutta, Soumya; Shen, Han-Wei; Woodring, Jonathan

    2013-12-01

    Information theory provides a theoretical framework for measuring information content for an observed variable, and has attracted much attention from visualization researchers for its ability to quantify saliency and similarity among variables. In this paper, we present a new approach towards building an exploration framework based on information theory to guide the users through the multivariate data exploration process. In our framework, we compute the total entropy of the multivariate data set and identify the contribution of individual variables to the total entropy. The variables are classified into groups based on a novel graph model where a node represents a variable and the links encode the mutual information shared between the variables. The variables inside the groups are analyzed for their representativeness and an information based importance is assigned. We exploit specific information metrics to analyze the relationship between the variables and use the metrics to choose isocontours of selected variables. For a chosen group of points, parallel coordinates plots (PCP) are used to show the states of the variables and provide an interface for the user to select values of interest. Experiments with different data sets reveal the effectiveness of our proposed framework in depicting the interesting regions of the data sets taking into account the interaction among the variables.
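The entropy and mutual-information quantities underlying this framework are straightforward to compute from histograms; a minimal sketch with an invented joint distribution:

```python
# Shannon entropy of a variable and mutual information between two
# variables, from (already normalized) probability tables.
import numpy as np


def entropy(p):
    """Shannon entropy (bits) of a normalized probability vector."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))


# Joint distribution of two perfectly coupled binary variables.
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
px, py = joint.sum(axis=1), joint.sum(axis=0)
mi = entropy(px) + entropy(py) - entropy(joint.ravel())
print(mi)  # 1.0 -- knowing one variable fully determines the other
```

In the framework above, pairwise quantities like `mi` would form the link weights of the variable graph.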

  1. Priority setting: what constitutes success? A conceptual framework for successful priority setting

    PubMed Central

    Sibbald, Shannon L; Singer, Peter A; Upshur, Ross; Martin, Douglas K

    2009-01-01

    Background The sustainability of healthcare systems worldwide is threatened by a growing demand for services and expensive innovative technologies. Decision makers struggle in this environment to set priorities appropriately, particularly because they lack consensus about which values should guide their decisions. One way to approach this problem is to determine what all relevant stakeholders understand successful priority setting to mean. The goal of this research was to develop a conceptual framework for successful priority setting. Methods Three separate empirical studies were completed using qualitative data collection methods (one-on-one interviews with healthcare decision makers from across Canada; focus groups with representation of patients, caregivers and policy makers; and Delphi study including scholars and decision makers from five countries). Results This paper synthesizes the findings from three studies into a framework of ten separate but interconnected elements germane to successful priority setting: stakeholder understanding, shifted priorities/reallocation of resources, decision making quality, stakeholder acceptance and satisfaction, positive externalities, stakeholder engagement, use of explicit process, information management, consideration of values and context, and revision or appeals mechanism. Conclusion The ten elements specify both quantitative and qualitative dimensions of priority setting and relate to both process and outcome components. To our knowledge, this is the first framework that describes successful priority setting. The ten elements identified in this research provide guidance for decision makers and a common language to discuss priority setting success and work toward improving priority setting efforts. PMID:19265518

  2. Novelty detection using level set methods.

    PubMed

    Ding, Xuemei; Li, Yuhua; Belatreche, Ammar; Maguire, Liam P

    2015-03-01

    This paper presents a level set boundary description (LSBD) approach for novelty detection that treats the nonlinear boundary directly in the input space. The proposed approach consists of level set function (LSF) construction, boundary evolution, and termination of the training process. It employs kernel density estimation to construct the LSF of the initial boundary for the training data set. Then, an algorithm based on the sign of the LSF is proposed to evolve the boundary and fit it more tightly to the data distribution. The training process terminates when an expected fraction of rejected normal data is reached. The evolution process utilizes the signs of the LSF values at all training data points to decide whether to expand or shrink the boundary. Extensive experiments are conducted on benchmark data sets to evaluate the proposed LSBD method and compare it against four representative novelty detection methods. The experimental results demonstrate that the novelty detector modeled with the proposed LSBD can effectively detect anomalies. PMID:25720011
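
    The LSF-construction step described above can be sketched in a few lines. This is an illustrative reading only, using a plain Gaussian KDE and a hypothetical function name (`kde_level_set`), not the authors' implementation:

```python
import numpy as np

def kde_level_set(train, grid, bandwidth=0.5, fraction=0.05):
    """Construct an initial level set function (LSF) from training data
    via Gaussian kernel density estimation: the zero level is the density
    contour that rejects `fraction` of the normal training points."""
    def density(points):
        diff = points[:, None, :] - train[None, :, :]        # (m, n, d)
        sq = (diff ** 2).sum(axis=2) / (2.0 * bandwidth ** 2)
        return np.exp(-sq).mean(axis=1)                      # unnormalized KDE

    threshold = np.quantile(density(train), fraction)
    return density(grid) - threshold   # > 0 inside the boundary

# Usage: the origin is normal, a far-away point is a novelty
rng = np.random.default_rng(0)
normal_data = rng.normal(0.0, 1.0, size=(200, 2))
phi = kde_level_set(normal_data, np.array([[0.0, 0.0], [6.0, 6.0]]))
```

    Points with a positive LSF value lie inside the learned boundary; the `fraction` parameter plays the role of the expected rejection rate that terminates training.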

  3. Level Set Segmentation of Lumbar Vertebrae Using Appearance Models

    NASA Astrophysics Data System (ADS)

    Fritscher, Karl; Leber, Stefan; Schmölz, Werner; Schubert, Rainer

    For the planning of surgical interventions of the spine, exact knowledge of the 3D shape and the local bone quality of vertebrae is of great importance in order to estimate the anchorage strength of screws or implants. As a prerequisite for quantitative analysis, a method for objective and therefore automated segmentation of vertebrae is needed. In this paper a framework for the automatic segmentation of vertebrae using 3D appearance models in a level set framework is presented. In this framework, model information as well as gradient information and probabilities of pixel intensities at object edges in the unseen image are used. The method is tested on 29 lumbar vertebrae, leading to accurate results, which can be useful for surgical planning and further analysis of the local bone quality.

  4. Setting the stage for master's level success

    NASA Astrophysics Data System (ADS)

    Roberts, Donna

    Comprehensive reading, writing, research, and study skills play a critical role in a graduate student's success and ability to contribute to a field of study effectively. The literature indicated a need to support graduate student success in the areas of mentoring and navigation, as well as research and writing. The purpose of this two-phased mixed methods explanatory study was to examine factors that characterize student success at the Master's level in the fields of education, sociology and social work. The study was grounded in a transformational learning framework which focused on three levels of learning: technical knowledge, practical or communicative knowledge, and emancipatory knowledge. The study included two data collection points. Phase one consisted of a Master's Level Success questionnaire that was sent via Qualtrics to graduate level students at three colleges and universities in the Central Valley of California: a California State University campus, a University of California campus, and a private college campus. The results of the chi-square indicated that seven questionnaire items were significant with p values less than .05. Phase two of the data collection consisted of semi-structured interviews; analysis using Dedoose software yielded three themes: (1) the need for more language and writing support at the Master's level, (2) the need for mentoring, especially for second-language learners, and (3) utilizing the strong influence of faculty in student success. It is recommended that institutions continually assess and strengthen their programs to meet the full range of learners and to support students to degree completion.

  5. Etch Profile Simulation Using Level Set Methods

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    Etching and deposition of materials are critical steps in semiconductor processing for device manufacturing. Both etching and deposition may have isotropic and anisotropic components, due to directional sputtering and redeposition of materials, for example. Previous attempts at modeling profile evolution have used so-called "string theory" to simulate the moving solid-gas interface between the semiconductor and the plasma. One complication of this method is that extensive de-looping schemes are required at the profile corners. We will present a 2D profile evolution simulation using level set theory to model the surface. (1) By embedding the location of the interface in a field variable, the need for de-looping schemes is eliminated and profile corners are more accurately modeled. This level set profile evolution model will calculate both isotropic and anisotropic etch and deposition rates of a substrate in low pressure (tens of mTorr) plasmas, considering the incident ion energy and angular distribution functions and neutral fluxes. We will present etching profiles of Si substrates in Ar/Cl2 discharges for various incident ion energies and trench geometries.
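
    The level set update behind such profile-evolution simulations is commonly written as the PDE φ_t + F|∇φ| = 0, where F is the local etch (or deposition) speed. A minimal first-order Godunov upwind sketch of this update is shown below; it assumes a uniform grid and a spatially uniform, non-negative F, and is not the simulation described in the abstract:

```python
import numpy as np

def evolve_level_set(phi, speed, dx, dt, steps):
    """Advance phi_t + F|grad phi| = 0 with a first-order Godunov
    upwind scheme, for a spatially uniform, non-negative speed F
    (the interface moves along its outward normal, as in etching)."""
    for _ in range(steps):
        # one-sided differences in each grid direction
        dxm = (phi - np.roll(phi, 1, axis=0)) / dx
        dxp = (np.roll(phi, -1, axis=0) - phi) / dx
        dym = (phi - np.roll(phi, 1, axis=1)) / dx
        dyp = (np.roll(phi, -1, axis=1) - phi) / dx
        # Godunov approximation of |grad phi| for F >= 0
        grad = np.sqrt(np.maximum(dxm, 0.0) ** 2 + np.minimum(dxp, 0.0) ** 2
                       + np.maximum(dym, 0.0) ** 2 + np.minimum(dyp, 0.0) ** 2)
        phi = phi - dt * speed * grad
    return phi

# Usage: a flat interface at y = 0.5 advancing at unit speed
x = np.linspace(0.0, 1.0, 50)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = evolve_level_set(Y - 0.5, speed=1.0, dx=x[1] - x[0], dt=0.005, steps=20)
# away from the (wrapped) grid edges the zero level is now at y = 0.6
```

    Because the interface lives implicitly in φ, corners and topology changes are handled without the de-looping schemes that string models require.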

  6. Simulation of Etching Profiles Using Level Sets

    NASA Technical Reports Server (NTRS)

    Hwang, Helen; Govindan, T. R.; Meyyappan, M.; Arnold, James O. (Technical Monitor)

    1998-01-01

    Using plasma discharges to etch trenches and via holes in substrates is an important process in semiconductor manufacturing. Ion-enhanced etching involves both neutral fluxes, which are isotropic, and ion fluxes, which are anisotropic. The angular distributions of the ions determine the degree of vertical etch, while the magnitude of the neutral fluxes determines the etch rate. We have developed a 2D profile evolution simulation which uses level set methods to model the plasma-substrate interface. Using level sets instead of traditional string models avoids the use of complicated de-looping algorithms. The simulation calculates the etch rate based on the fluxes and distribution functions of both ions and neutrals. We will present etching profiles of Si substrates in low pressure (tens of mTorr) Ar/Cl2 discharges for a variety of incident ion angular distributions. Both ion and neutral re-emission fluxes are included in the calculation of the etch rate, and their contributions to the total etch profile will be demonstrated. In addition, we will show RIE lag effects as a function of different trench aspect ratios. (For sample profiles, please see http://www.ipt.arc.nasa.gov/hwangfig1.html)

  7. Level set based structural topology optimization for minimizing frequency response

    NASA Astrophysics Data System (ADS)

    Shu, Lei; Wang, Michael Yu; Fang, Zongde; Ma, Zhengdong; Wei, Peng

    2011-11-01

    For the purpose of structural vibration reduction, a structural topology optimization for minimizing frequency response is proposed based on the level set method. The objective of the present study is to minimize the frequency response at specified points or surfaces on the structure for an excitation frequency or a frequency range, subject to a given amount of material over the admissible design domain. The sensitivity analysis with respect to the structural boundaries is carried out, while the extended finite element method (X-FEM) is employed for solving the state equation and the adjoint equation. The optimal structure with smooth boundaries is obtained by level set evolution with an advection velocity derived from the sensitivity analysis and the optimization algorithm. A number of numerical examples, in both two dimensions (2D) and three dimensions (3D), are presented to demonstrate the feasibility and effectiveness of the proposed approach.

  8. A probabilistic level set formulation for interactive organ segmentation

    NASA Astrophysics Data System (ADS)

    Cremers, Daniel; Fluck, Oliver; Rousson, Mikael; Aharon, Shmuel

    2007-03-01

    Level set methods have become increasingly popular as a framework for image segmentation. Yet when used as a generic segmentation tool, they suffer from an important drawback: current formulations do not allow much user interaction. Upon initialization, boundaries propagate to the final segmentation without the user being able to guide or correct the segmentation. In the present work, we address this limitation by proposing a probabilistic framework for image segmentation which integrates input intensity information and user interaction on an equal footing. The resulting algorithm determines the most likely segmentation given the input image and the user input. In order to allow user interaction in real time during the segmentation, the algorithm is implemented on a graphics card and in a narrow band formulation.

  9. Framework for State-Level Renewable Energy Market Potential Studies

    SciTech Connect

    Kreycik, C.; Vimmerstedt, L.; Doris, E.

    2010-01-01

    State-level policymakers are relying on estimates of the market potential for renewable energy resources as they set goals and develop policies to accelerate the development of these resources. Therefore, accuracy of such estimates should be understood and possibly improved to appropriately support these decisions. This document provides a framework and next steps for state officials who require estimates of renewable energy market potential. The report gives insight into how to conduct a market potential study, including what supporting data are needed and what types of assumptions need to be made. The report distinguishes between goal-oriented studies and other types of studies, and explains the benefits of each.

  10. Chemically Induced Surface Evolutions with Level Sets

    2006-11-17

    ChISELS is used for the theoretical modeling of detailed surface chemistry and concomitant surface evolutions occurring during microsystem fabrication processes conducted at low pressures. Examples include physical vapor deposition (PVD), plasma-enhanced chemical vapor deposition (PECVD), and plasma etching. Evolving interfaces are represented using the level-set method and the evolution equations are time-integrated using a semi-Lagrangian approach. A ballistic transport model is employed to solve for the fluxes incident on each of the surface elements. Surface chemistry leading to etching or deposition is computed either by coupling to Surface Chemkin (a commercially available code) or by providing user-defined subroutines. The computational meshes used are quad-trees (2-D) and oct-trees (3-D), constructed such that grid refinement is localized to regions near the surface interfaces. As the interface evolves, the mesh is dynamically reconstructed as needed for the grid to remain fine only around the interface. For parallel computation, a domain decomposition scheme with dynamic load balancing is used to distribute the computational work across processors.

  11. Advanced level set segmentation of the right atrium in MR

    NASA Astrophysics Data System (ADS)

    Chen, Siqi; Kohlberger, Timo; Kirchberg, Klaus J.

    2011-03-01

    Atrial fibrillation is a common heart arrhythmia, and can be effectively treated with ablation. Ablation planning requires 3D models of the patient's left atrium (LA) and/or right atrium (RA), therefore an automatic segmentation procedure to retrieve these models is desirable. In this study, we investigate the use of advanced level set segmentation approaches to automatically segment the RA in magnetic resonance angiographic (MRA) volume images. A low contrast-to-noise ratio makes the boundary between the RA and the nearby structures nearly indistinguishable. Therefore, pure data-driven segmentation approaches such as watershed and Chan-Vese methods are bound to fail. Incorporating training shapes through PCA modeling to constrain the segmentation is one popular solution, and is also used in our segmentation framework. The shape parameters from PCA are optimized with a global histogram-based energy model. However, since the shape parameters span a much smaller space, they cannot capture fine details of the shape. Therefore, we employ a second refinement step after the shape-based segmentation stage, which follows closely the recent work of localized appearance model based techniques. The local appearance model is established through a robust point tracking mechanism and is learned through landmarks embedded on the surface of training shapes. The key contribution of our work is the combination of a statistical shape prior and a localized appearance prior for level set segmentation of the right atrium from MRA. We test this two-step segmentation framework on porcine RA to verify the algorithm.
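
    A statistical shape prior of the kind described here is often built by PCA over signed distance functions of the training shapes; the segmentation then optimizes only a handful of mode coefficients. The sketch below is a minimal illustration under that assumption, with hypothetical function names, not the authors' implementation:

```python
import numpy as np

def pca_shape_model(training_sdfs, n_modes=2):
    """Build a PCA shape model from training shapes represented as
    signed distance functions (each flattened to a vector)."""
    X = np.stack([s.ravel() for s in training_sdfs])   # (n_shapes, n_voxels)
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_modes]                          # mean shape, modes

def synthesize(mean, modes, coeffs):
    """Reconstruct a shape from its low-dimensional PCA coefficients."""
    return mean + coeffs @ modes
```

    Because only the few coefficients are optimized, fine shape detail is lost, which is precisely why a localized refinement step like the one in this paper is needed afterwards.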

  12. Efficient molecular surface generation using level-set methods.

    PubMed

    Can, Tolga; Chen, Chao-I; Wang, Yuan-Fang

    2006-12-01

    Molecules interact through their surface residues. Calculation of the molecular surface of a protein structure is thus an important step for a detailed functional analysis. One of the main considerations in comparing existing methods for molecular surface computations is their speed. Most of the methods that produce satisfying results for small molecules fail to do so for large complexes. In this article, we present a level-set-based approach to compute and visualize a molecular surface at a desired resolution. The emerging level-set methods have been used for computing evolving boundaries in several application areas from fluid mechanics to computer vision. Our method provides a uniform framework for computing solvent-accessible, solvent-excluded surfaces and interior cavities. The computation is carried out very efficiently even for very large molecular complexes with tens of thousands of atoms. We compared our method to some of the most widely used molecular visualization tools (Swiss-PDBViewer, PyMol, and Chimera) and our results show that we can calculate and display a molecular surface 1.5-3.14 times faster on average than all three of the compared programs. Furthermore, we demonstrate that our method is able to detect all of the interior inaccessible cavities that can accommodate one or more water molecules. PMID:16621636
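
    One simple way to obtain such a surface as a level set, sketched here as an assumption rather than the authors' exact formulation: the solvent-accessible surface is the zero contour of the signed distance to the union of probe-inflated atom spheres, and the union of spheres is just a pointwise minimum of per-atom distances.

```python
import numpy as np

def sas_level_set(points, centers, radii, probe=1.4):
    """Level set whose zero contour is the solvent-accessible surface:
    signed distance to the union of atom spheres inflated by the probe
    radius (1.4 Angstrom is a common water-probe choice). Negative inside."""
    d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    return (d - (radii + probe)[None, :]).min(axis=1)   # union = min
```

    Evaluating this on a grid and extracting the zero level set yields the surface at whatever resolution the grid provides, which is the resolution/speed trade-off the abstract discusses.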

  13. Non-Euclidean basis function based level set segmentation with statistical shape prior.

    PubMed

    Ruiz, Esmeralda; Reisert, Marco; Bai, Li

    2013-01-01

    We present a new framework for image segmentation with statistical shape model enhanced level sets represented as a linear combination of non-Euclidean radial basis functions (RBFs). The shape prior for the level set is represented as a probabilistic map created from the training data and registered with the target image. The new framework has the following advantages: 1) the explicit RBF representation of the level set allows the level set evolution to be expressed as ordinary differential equations, and reinitialization is no longer required; 2) the non-Euclidean distance RBFs make it possible to incorporate image information into the basis functions, which results in more accurate and topologically more flexible solutions. Experimental results are presented to demonstrate the advantages of the method, as well as a critical analysis of level sets versus the combination of both methods.
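
    The explicit RBF representation of a level set can be sketched as below, using ordinary Gaussian (Euclidean-distance) RBFs for simplicity; the paper's point is precisely that a non-Euclidean, image-informed distance can be substituted into the same form. The names are illustrative, not the authors' code:

```python
import numpy as np

def rbf_level_set(centers, alphas, sigma=1.0):
    """Level set function represented explicitly as a linear combination
    of Gaussian RBFs. Evolving the coefficients `alphas` by ordinary
    differential equations replaces PDE evolution of a grid function,
    which is why no reinitialization is needed."""
    def phi(points):
        d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * sigma ** 2)) @ alphas
    return phi
```
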

  14. Beyond SMART? A New Framework for Goal Setting

    ERIC Educational Resources Information Center

    Day, Trevor; Tosey, Paul

    2011-01-01

    This article extends currently reported theory and practice in the use of learning goals or targets with students in secondary and further education. Goal-setting and action-planning constructs are employed in personal development plans (PDPs) and personal learning plans (PLPs) and are advocated as practice within the English national policy…

  15. A Systematic Framework for Addressing Treatment Integrity in School Settings

    ERIC Educational Resources Information Center

    Kupzyk, Sara; Shriver, Mark D.

    2016-01-01

    School psychologists are tasked with ensuring treatment integrity because the level of intervention implementation affects decisions about student progress. Treatment integrity includes multiple dimensions that may impact the effectiveness of an intervention including adherence, dosage, quality, and engagement. Unfortunately, treatment integrity…

  17. A contribution to set a legal framework for biofertilisers.

    PubMed

    Malusá, E; Vassilev, N

    2014-08-01

    The extensive research, production and use of microorganisms to improve plant nutrition have resulted in an inconsistent definition of the term "biofertiliser" which, in some cases, is due to the different microbial mechanisms involved. The rationale for adopting the term biofertiliser is that it derives from "biological fertiliser", that, in turn, implies the use of living microorganisms. Here, we propose a definition for this kind of products which is distinguishing them from biostimulants or other inorganic and organic fertilisers. Special emphasis is given to microorganism(s) with multifunctional properties and biofertilisers containing more than one microorganism. This definition could be included in legal provisions regulating registration and marketing requirements. A set of rules is also proposed which could guarantee the quality of biofertilisers present on the market and thus foster their use by farmers. PMID:24903811

  18. Tailoring Healthy Workplace Interventions to Local Healthcare Settings: A Complexity Theory-Informed Workplace of Well-Being Framework.

    PubMed

    Brand, Sarah L; Fleming, Lora E; Wyatt, Katrina M

    2015-01-01

    Many healthy workplace interventions have been developed for healthcare settings to address the consistently low scores of healthcare professionals on assessments of mental and physical well-being. Complex healthcare settings present challenges for the scale-up and spread of successful interventions from one setting to another. Despite general agreement regarding the importance of the local setting in affecting intervention success across different settings, there is no consensus on what it is about a local setting that needs to be taken into account to design healthy workplace interventions appropriate for different local settings. Complexity theory principles were used to understand a workplace as a complex adaptive system and to create a framework of eight domains (system characteristics) that affect the emergence of system-level behaviour. This Workplace of Well-being (WoW) framework is responsive and adaptive to local settings and allows a shared understanding of the enablers and barriers to behaviour change by capturing local information for each of the eight domains. We use the results of applying the WoW framework to one workplace, a UK National Health Service ward, to describe the utility of this approach in informing design of setting-appropriate healthy workplace interventions that create workplaces conducive to healthy behaviour change. PMID:26380358

  19. Tailoring Healthy Workplace Interventions to Local Healthcare Settings: A Complexity Theory-Informed Workplace of Well-Being Framework

    PubMed Central

    Brand, Sarah L.; Fleming, Lora E.; Wyatt, Katrina M.

    2015-01-01

    Many healthy workplace interventions have been developed for healthcare settings to address the consistently low scores of healthcare professionals on assessments of mental and physical well-being. Complex healthcare settings present challenges for the scale-up and spread of successful interventions from one setting to another. Despite general agreement regarding the importance of the local setting in affecting intervention success across different settings, there is no consensus on what it is about a local setting that needs to be taken into account to design healthy workplace interventions appropriate for different local settings. Complexity theory principles were used to understand a workplace as a complex adaptive system and to create a framework of eight domains (system characteristics) that affect the emergence of system-level behaviour. This Workplace of Well-being (WoW) framework is responsive and adaptive to local settings and allows a shared understanding of the enablers and barriers to behaviour change by capturing local information for each of the eight domains. We use the results of applying the WoW framework to one workplace, a UK National Health Service ward, to describe the utility of this approach in informing design of setting-appropriate healthy workplace interventions that create workplaces conducive to healthy behaviour change. PMID:26380358

  1. A new level set model for multimaterial flows

    SciTech Connect

    Starinshak, David P.; Karni, Smadar; Roe, Philip L.

    2014-01-08

    We present a new level set model for representing multimaterial flows in multiple space dimensions. Instead of associating a level set function with a specific fluid material, the function is associated with a pair of materials and the interface that separates them. A voting algorithm collects sign information from all level sets and determines material designations. M(M−1)/2 level set functions might be needed to represent a general M-material configuration; problems of practical interest use far fewer functions, since not all pairs of materials share an interface. The new model is less prone to producing indeterminate material states, i.e. regions claimed by more than one material (overlaps) or no material at all (vacuums). It outperforms existing material-based level set models without the need for reinitialization schemes, thereby avoiding additional computational costs and preventing excessive numerical diffusion.
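
    The voting step can be sketched as follows. This is an illustrative reading with an assumed sign convention (φ < 0 on material i's side of the (i, j) interface), not the authors' implementation:

```python
import numpy as np

def vote_material(level_sets, n_materials):
    """Voting step of a pairwise multimaterial level set model.
    `level_sets` maps a material pair (i, j) to an array of phi values,
    with phi < 0 on material i's side of the (i, j) interface. Each pair
    casts one vote per point; the most-voted material claims the point,
    which suppresses overlaps and vacuums."""
    shape = next(iter(level_sets.values())).shape
    votes = np.zeros((n_materials,) + shape)
    for (i, j), phi in level_sets.items():
        votes[i] += (phi < 0)
        votes[j] += (phi >= 0)
    return votes.argmax(axis=0)
```

    Note that only the pairs sharing an interface need a level set function, which is why far fewer than M(M−1)/2 functions suffice in practice.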

  2. Levels of racism: a theoretic framework and a gardener's tale.

    PubMed

    Jones, C P

    2000-08-01

    The author presents a theoretic framework for understanding racism on 3 levels: institutionalized, personally mediated, and internalized. This framework is useful for raising new hypotheses about the basis of race-associated differences in health outcomes, as well as for designing effective interventions to eliminate those differences. She then presents an allegory about a gardener with 2 flower boxes, rich and poor soil, and red and pink flowers. This allegory illustrates the relationship between the 3 levels of racism and may guide our thinking about how to intervene to mitigate the impacts of racism on health. It may also serve as a tool for starting a national conversation on racism.

  3. An efficient MRF embedded level set method for image segmentation.

    PubMed

    Yang, Xi; Gao, Xinbo; Tao, Dacheng; Li, Xuelong; Li, Jie

    2015-01-01

    This paper presents a fast and robust level set method for image segmentation. To enhance the robustness against noise, we embed a Markov random field (MRF) energy function into the conventional level set energy function. This MRF energy function builds the correlation of a pixel with its neighbors and encourages them to fall into the same region. To obtain a fast implementation of the MRF embedded level set model, we explore algebraic multigrid (AMG) and the sparse field method (SFM) to increase the time step and decrease the computation domain, respectively. Both AMG and SFM can be conducted in a parallel fashion, which facilitates the processing of our method for big image databases. By comparing the proposed fast and robust level set method with the standard level set method and its popular variants on noisy synthetic images, synthetic aperture radar (SAR) images, medical images, and natural images, we comprehensively demonstrate that the new method is robust against various kinds of noise. In particular, the new level set method can segment an image of size 500 × 500 within 3 s on MATLAB R2010b installed in a computer with 3.30-GHz CPU and 4-GB memory.

  4. Fine Level Set Structure of Flat Isometric Immersions

    NASA Astrophysics Data System (ADS)

    Hornung, Peter

    2011-03-01

    A result by Pogorelov asserts that C¹ isometric immersions u of a bounded domain S ⊂ ℝ² into ℝ³ whose normal takes values in a set of zero area enjoy the following regularity property: the gradient f := ∇u is `developable' in the sense that the nondegenerate level sets of f consist of straight line segments intersecting the boundary of S at both endpoints. Motivated by applications in nonlinear elasticity, we study the level set structure of such f when S is an arbitrary bounded Lipschitz domain. We show that f can be approximated by uniformly bounded maps with a simplified level set structure. We also show that the domain S can be decomposed (up to a controlled remainder) into finitely many subdomains, each of which admits a global line of curvature parametrization.

  5. An Expanded Theoretical Framework of Care Coordination Across Transitions in Care Settings.

    PubMed

    Radwin, Laurel E; Castonguay, Denise; Keenan, Carolyn B; Hermann, Cherice

    2016-01-01

    For many patients, high-quality, patient-centered, and cost-effective health care requires coordination among multiple clinicians and settings. Ensuring optimal care coordination requires a clear understanding of how clinician activities and continuity during transitions affect patient-centeredness and quality outcomes. This article describes an expanded theoretical framework to better understand care coordination. The framework provides clear articulation of concepts. Examples are provided of ways to measure the concepts.

  6. A Conceptual Framework for a Psychometric Theory for Standard Setting with Examples of Its Use for Evaluating the Functioning of Two Standard Setting Methods

    ERIC Educational Resources Information Center

    Reckase, Mark D.

    2006-01-01

    A conceptual framework is proposed for a psychometric theory of standard setting. The framework suggests that participants in a standard setting process (panelists) develop an internal, intended standard as a result of training and the participant's background. The goal of a standard setting process is to convert panelists' intended standards to…

  7. Hippocampus segmentation using locally weighted prior based level set

    NASA Astrophysics Data System (ADS)

    Achuthan, Anusha; Rajeswari, Mandava

    2015-12-01

    Segmentation of the hippocampus is one of the major challenges in medical image segmentation due to its imaging characteristics: its intensity is almost identical to that of adjacent gray matter structures such as the amygdala. This intensity similarity causes the hippocampus to have weak or fuzzy boundaries. Given this challenge, a segmentation method that relies on image information alone may not produce accurate segmentation results; prior information such as shape and spatial information needs to be assimilated into the segmentation method to produce the expected segmentation. Previous studies have widely integrated prior information into segmentation methods. However, the prior information has typically been integrated in a global manner, which does not reflect the real scenario during clinical delineation. Therefore, in this paper, prior information is integrated locally into a level set model. This work utilizes a mean shape model to provide automatic initialization for level set evolution, and the model is integrated as prior information into the level set model. The local integration of edge-based information and prior information is implemented through an edge weighting map that decides, at the voxel level, which information should be observed during level set evolution; the map indicates which voxels carry sufficient edge information. Experiments show that the proposed local integration of prior information into a conventional edge-based level set model, the geodesic active contour, yields an improvement of 9% in averaged Dice coefficient.
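
    The voxel-wise edge weighting idea can be sketched as follows; the particular weighting function and names here are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def edge_weight_map(grad_mag, k=10.0):
    """Voxel-wise weight in [0, 1]: near 1 where the image gradient is
    strong (trust edge information), near 0 in flat regions (defer to
    the shape prior). Uses a standard edge-stopping function."""
    return 1.0 - 1.0 / (1.0 + k * grad_mag ** 2)

def combined_force(edge_force, prior_force, w):
    """Blend the edge-driven and prior-driven level set speeds voxel-wise."""
    return w * edge_force + (1.0 - w) * prior_force
```

    In a fuzzy-boundary region the weight collapses toward the prior term, which is how the shape model keeps the contour from leaking into the amygdala.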

  8. Public Health and Health Promotion Capacity at National and Regional Level: A Review of Conceptual Frameworks

    PubMed Central

    Aluttis, Christoph; den Broucke, Stephan Van; Chiotan, Cristina; Costongs, Caroline; Michelsen, Kai; Brand, Helmut

    2014-01-01

    The concept of capacity building for public health has gained much attention during the last decade. National as well as international organizations increasingly focus their efforts on capacity building to improve performance in the health sector. During the past two decades, a variety of conceptual frameworks have been developed which describe relevant dimensions for public health capacity. Notably, these frameworks differ in design and conceptualization. This paper therefore reviews the existing conceptual frameworks and integrates them into one framework, which contains the most relevant dimensions for public health capacity at the country- or regional level. A comprehensive literature search was performed to identify frameworks addressing public health capacity building at the national or regional level. We content-analysed these frameworks to identify the core dimensions of public health capacity. The dimensions were subsequently synthesized into a set of thematic areas to construct a conceptual framework which describes the most relevant dimensions for capacities at the national- or regional level. The systematic review resulted in the identification of seven core domains for public health capacity: resources, organizational structures, workforce, partnerships, leadership and governance, knowledge development and country specific context. Accordingly, these dimensions were used to construct a framework, which describes these core domains more in detail. Our research shows that although there is no generally agreed-upon model of public health capacity, a number of key domains for public health and health promotion capacity are consistently recurring in existing frameworks, regardless of their geographical location or thematic area. As only little work on the core concepts of public health capacities has yet taken place, this study adds value to the discourse by identifying these consistencies across existing frameworks and by synthesising them into a new

  9. Public health and health promotion capacity at national and regional level: a review of conceptual frameworks.

    PubMed

    Aluttis, Christoph; den Broucke, Stephan Van; Chiotan, Cristina; Costongs, Caroline; Michelsen, Kai; Brand, Helmut

    2014-03-26

    The concept of capacity building for public health has gained much attention during the last decade. National as well as international organizations increasingly focus their efforts on capacity building to improve performance in the health sector. During the past two decades, a variety of conceptual frameworks have been developed which describe relevant dimensions for public health capacity. Notably, these frameworks differ in design and conceptualization. This paper therefore reviews the existing conceptual frameworks and integrates them into one framework, which contains the most relevant dimensions for public health capacity at the country or regional level. A comprehensive literature search was performed to identify frameworks addressing public health capacity building at the national or regional level. We content-analysed these frameworks to identify the core dimensions of public health capacity. The dimensions were subsequently synthesized into a set of thematic areas to construct a conceptual framework which describes the most relevant dimensions for capacities at the national or regional level. The systematic review resulted in the identification of seven core domains for public health capacity: resources, organizational structures, workforce, partnerships, leadership and governance, knowledge development, and country-specific context. Accordingly, these dimensions were used to construct a framework which describes these core domains in more detail. Our research shows that although there is no generally agreed-upon model of public health capacity, a number of key domains for public health and health promotion capacity are consistently recurring in existing frameworks, regardless of their geographical location or thematic area. As little work on the core concepts of public health capacities has yet taken place, this study adds value to the discourse by identifying these consistencies across existing frameworks and by synthesising them into a new

  10. The exchange boundary framework: understanding the evolution of power within collaborative decision-making settings.

    PubMed

    Watson, Erin R; Foster-Fishman, Pennie G

    2013-03-01

    Many community decision-making bodies encounter challenges in creating conditions where stakeholders from disadvantaged populations can authentically participate in ways that give them actual influence over decisions affecting their lives (Foster-Fishman et al., Lessons for the journey: Strategies and suggestions for guiding planning, governance, and sustainability in comprehensive community initiatives. W.K. Kellogg Foundation, Battle Creek, MI, 2004). These challenges are often rooted in asymmetrical power dynamics operating within the settings (Prilleltensky, J Commun Psychol 36:116-136, 2008). In response, this paper presents the Exchange Boundary Framework, a new approach for understanding and promoting authentic, empowered participation within collaborative decision-making settings. The framework expands upon theories currently used in the field of community psychology by focusing on the underlying processes through which power operates in relationships and examining the evolution of power dynamics over time. By integrating concepts from social exchange theory (Emerson, Am Soc Rev 27:31-41, 1962) and social boundaries theory (Hayward, Polity 31(1):1-22, 1998), the framework situates power within parallel processes of resource exchange and social regulation. The framework can be used to understand the conditions leading to power asymmetries within collaborative decision-making processes, and to guide efforts to promote more equitable and authentic participation by all stakeholders within these settings. In this paper we describe the Exchange Boundary Framework, apply it to three distinct case studies, and discuss key considerations for its application within collaborative community settings.

  11. The adoption of the Reference Framework for diabetes care among primary care physicians in primary care settings

    PubMed Central

    Wong, Martin C.S.; Wang, Harry H.X.; Kwan, Mandy W.M.; Chan, Wai Man; Fan, Carmen K.M.; Liang, Miaoyin; Li, Shannon TS; Fung, Franklin D.H.; Yeung, Ming Sze; Chan, David K.L.; Griffiths, Sian M.

    2016-01-01

    Abstract The prevalence of diabetes mellitus has been increasing both globally and locally. Primary care physicians (PCPs) are in a privileged position to provide first contact and continuing care for diabetic patients. A territory-wide Reference Framework for Diabetes Care for Adults was released by the Hong Kong Primary Care Office in 2010, with the aim of further enhancing evidence-based and high quality care for diabetes in the primary care setting through wide adoption of the Reference Framework. A valid questionnaire survey was conducted among PCPs to evaluate the levels of, and the factors associated with, their adoption of the Reference Framework. A total of 414 completed surveys were received, with a response rate of 13.0%. The average adoption score was 3.29 (SD 0.51) out of 4. Approximately 70% of PCPs highly adopted the Reference Framework in their routine practice. Binary logistic regression analysis showed that the PCPs' perceptions of the inclusion of sufficient local information (adjusted odds ratio [aOR] = 4.748, 95%CI 1.597–14.115, P = 0.005) and of the reduction of PCPs' professional autonomy (aOR = 1.859, 95%CI 1.013–3.411, P = 0.045) were more likely to influence their adoption of the Reference Framework for diabetes care in daily practice. The overall level of guideline adoption was found to be relatively high among PCPs for adult diabetes in primary care settings. The adoption barriers identified in this study should be addressed in the continuous updating of the Reference Framework. Strategies need to be considered to enhance the guideline adoption and implementation capacity. PMID:27495018

  12. Level-Set Topology Optimization with Aeroelastic Constraints

    NASA Technical Reports Server (NTRS)

    Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia

    2015-01-01

    Level-set topology optimization is used to design a wing considering skin buckling under static aeroelastic trim loading, as well as dynamic aeroelastic stability (flutter). The level-set function is defined over the entire 3D volume of a transport aircraft wing box. Therefore, the approach is not limited by any predefined structure and can explore novel configurations. The Sequential Linear Programming (SLP) level-set method is used to solve the constrained optimization problems. The proposed method is demonstrated using three problems with mass, linear buckling, and flutter objectives and/or constraints. A constraint aggregation method is used to handle multiple buckling constraints in the wing skins. A continuous flutter constraint formulation is used to handle difficulties arising from discontinuities in the design space caused by a switching of the critical flutter mode.

  13. An adaptive level set segmentation on a triangulated mesh.

    PubMed

    Xu, Meihe; Thompson, Paul M; Toga, Arthur W

    2004-02-01

    Level set methods offer highly robust and accurate methods for detecting interfaces of complex structures. Efficient techniques are required to transform an interface to a globally defined level set function. In this paper, a novel level set method based on an adaptive triangular mesh is proposed for segmentation of medical images. Special attention is paid to an adaptive mesh refinement and redistancing technique for level set propagation, in order to achieve higher resolution at the interface with minimum expense. First, a narrow band around the interface is built in an upwind fashion. An active square technique is used to determine the shortest distance correspondence (SDC) for each grid vertex. Simultaneously, we also give an efficient approach for signing the distance field. Then, an adaptive improvement algorithm is proposed, which essentially combines two basic techniques: a long-edge-based vertex insertion strategy, and a local improvement. These guarantee that the refined triangulation is related to features along the front and has elements with appropriate size and shape, which fit the front well. We propose a short-edge elimination scheme to coarsen the refined triangular mesh, in order to reduce the extra storage. Finally, we reformulate the general evolution equation by updating 1) the velocities and 2) the gradient of level sets on the triangulated mesh. We give an approach for tracing contours from the level set on the triangulated mesh. Given a two-dimensional image with N grids along a side, the proposed algorithms run in O(kN) time at each iteration. Quantitative analysis shows that our algorithm is of first-order accuracy, and when the interface-fitted property is involved in the mesh refinement, both the convergence speed and numerical accuracy are greatly improved. We also analyze the effect of redistancing frequency upon convergence speed and accuracy. Numerical examples include the extraction of inner and outer surfaces of the cerebral cortex
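
    The narrow-band idea above (restricting level set updates to grid points near the interface) can be sketched in a few lines. This is an illustrative NumPy fragment on a regular grid, not the authors' triangulated-mesh implementation; the band width of three cells is an arbitrary choice:

```python
import numpy as np

# Signed-distance level set for a circle of radius 0.5
n = 128
x = np.linspace(-1.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.sqrt(X**2 + Y**2) - 0.5

# Narrow band: points within ~3 cells of the zero level set.
# Updates are applied only where band is True, so the per-step work
# scales with the interface length rather than the full grid.
band = np.abs(phi) < 3 * (x[1] - x[0])

frac = band.mean()  # fraction of the grid inside the band (small)
```

In 2-D this reduces the work per iteration from O(N^2) grid points to O(N) band points, which is the source of the O(kN) per-iteration cost quoted in the abstract.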

  14. An improved level set method for vertebra CT image segmentation

    PubMed Central

    2013-01-01

    Background Clinical diagnosis and therapy for lumbar disc herniation require accurate vertebra segmentation. The complex anatomical structure and the degenerative deformations of the vertebrae make their segmentation challenging. Methods An improved level set method, namely an edge- and region-based level set (ERBLS) method, is proposed for vertebra CT image segmentation. By considering the gradient information and local region characteristics of images, the proposed model can efficiently segment images with intensity inhomogeneity and blurry or discontinuous boundaries. To reduce the dependency on manual initialization common to many active contour models and to enable automatic segmentation, a simple initialization method for the level set function is built, which utilizes the Otsu threshold. In addition, the need for the costly re-initialization procedure is completely eliminated. Results Experimental results on both synthetic and real images demonstrated that the proposed ERBLS model is very robust and efficient. Compared with the well-known local binary fitting (LBF) model, our method is much more computationally efficient and much less sensitive to the initial contour. The proposed method has also been applied to 56 patient data sets and produced very promising results. Conclusions An improved level set method suitable for vertebra CT image segmentation is proposed. It has the flexibility to segment vertebra CT images with blurry or discontinuous edges and internal inhomogeneity, with no need for re-initialization. PMID:23714300
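
    The Otsu-based initialization described above can be sketched as follows. This is a generic illustration (the function names and the step height `c` are assumptions), not the ERBLS code itself: the level set function is simply set to +c inside the thresholded region and -c outside, avoiding any signed-distance computation:

```python
import numpy as np

def otsu_threshold(image, nbins=256):
    """Otsu's threshold: maximize the between-class variance."""
    hist, edges = np.histogram(image.ravel(), bins=nbins)
    hist = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(hist)                 # class-0 probability
    w1 = 1.0 - w0                        # class-1 probability
    mu0 = np.cumsum(hist * centers)
    mu_total = mu0[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        m0 = mu0 / w0
        m1 = (mu_total - mu0) / w1
        between = w0 * w1 * (m0 - m1) ** 2
    return centers[np.nanargmax(between)]

def init_level_set(image, c=2.0):
    """Binary-step initialization: +c inside the bright region, -c outside."""
    t = otsu_threshold(image)
    return np.where(image > t, c, -c)

# Toy image: a bright square on a dark background
img = np.zeros((64, 64))
img[20:40, 20:40] = 200.0
phi = init_level_set(img)
```

A binary-step initialization like this is only usable with level set formulations that, like ERBLS, do not require the level set function to remain a signed distance function.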

  15. Barker's Behavior Setting Theory: A Useful Conceptual Framework for Research on Educational Administration.

    ERIC Educational Resources Information Center

    Eklund, S. J.; Scott, M. M.

    1985-01-01

    Research in educational administration needs a coherent empirical base for a comprehensive, ecologically valid theory of administration. This paper describes Roger Barker's Behavior Setting Theory and promotes it as a broad-based conceptual framework for research on educational administration. (Author/TE)

  16. The ICF: A Framework for Setting Goals for Children with Speech Impairment

    ERIC Educational Resources Information Center

    McLeod, Sharynne; Bleile, Ken

    2004-01-01

    The International Classification of Functioning, Disability and Health (ICF) (World Health Organization, 2001) is proposed as a framework for integrative goal setting for children with speech impairment. The ICF incorporates both impairment and social factors to consider when selecting appropriate goals to bring about change in the lives of…

  17. Geologic setting of the low-level burial grounds

    SciTech Connect

    Lindsey, K.A.; Jaeger, G.K.; Slate, J.L.; Swett, K.J.; Mercer, R.B.

    1994-10-13

    This report describes the regional and site-specific geology of the Hanford Site's low-level burial grounds in the 200 East and West Areas. The report incorporates data from boreholes across the entire 200 Areas, integrating the geology of this area into a single framework. Geologic cross-sections, isopach maps, and structure contour maps of all major geological units from the top of the Columbia River Basalt Group to the surface are included. The physical properties and characteristics of the major suprabasalt sedimentary units are also discussed.

  18. A Quadrature Free Discontinuous Galerkin Conservative Level Set Scheme

    NASA Astrophysics Data System (ADS)

    Czajkowski, Mark; Desjardins, Olivier

    2010-11-01

    In an effort to improve the scalability and accuracy of the Accurate Conservative Level Set (ACLS) scheme [Desjardins et al., J COMPUT PHYS 227 (2008)], a scheme based on the quadrature free discontinuous Galerkin (DG) methodology has been developed. ACLS relies on a hyperbolic tangent level set function that is transported and reinitialized using conservative schemes in order to alleviate the mass conservation issues known to plague level set methods. DG allows for an arbitrarily high order representation of the interface by using a basis of high order polynomials while only using data from the faces of neighboring cells. The small stencil allows DG to have excellent parallel scalability. The diffusion term present in the conservative reinitialization equation is handled using the local DG method [Cockburn et al., SIAM J NUMER ANAL 39, (2001)], while the normals are computed from a limited form of the level set function in order to avoid spurious oscillations. The resulting scheme is shown to be robust, accurate, and highly scalable, making it a method of choice for large-scale simulations of multiphase flows with complex interfacial topology.
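
    For readers unfamiliar with conservative level sets, the hyperbolic-tangent profile at the heart of ACLS can be illustrated in one dimension. The grid size and profile thickness below are arbitrary illustrative choices:

```python
import numpy as np

# 1-D grid and a signed-distance function with the interface at x = 0
x = np.linspace(-1.0, 1.0, 201)
phi = x.copy()                     # signed distance to the interface
eps = 2.0 * (x[1] - x[0])          # profile thickness, tied to the mesh size

# Conservative level set: a hyperbolic-tangent profile bounded in [0, 1];
# the interface is the psi = 0.5 iso-contour.
psi = 0.5 * (np.tanh(phi / (2.0 * eps)) + 1.0)

# The "mass" of the bounded phase is the integral of psi, which conservative
# transport and reinitialization schemes are constructed to preserve
# (trapezoidal rule here).
mass = float(np.sum(0.5 * (psi[1:] + psi[:-1]) * np.diff(x)))
```

Because psi is bounded and integrates to the phase volume, transporting it in conservative form keeps the mass error small, unlike a signed-distance level set.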

  19. Skull-stripping magnetic resonance brain images using a model-based level set.

    PubMed

    Zhuang, Audrey H; Valentino, Daniel J; Toga, Arthur W

    2006-08-01

    The segmentation of brain tissue from nonbrain tissue in magnetic resonance (MR) images, commonly referred to as skull stripping, is an important image processing step in many neuroimage studies. A new mathematical algorithm, a model-based level set (MLS), was developed for controlling the evolution of the zero level curve that is implicitly embedded in the level set function. The evolution of the curve was controlled using two terms in the level set equation, whose values represented the forces that determined the speed of the evolving curve. The first force was derived from the mean curvature of the curve, and the second was designed to model the intensity characteristics of the cortex in MR images. The combination of these forces in a level set framework pushed or pulled the curve toward the brain surface. Quantitative evaluation of the MLS algorithm was performed by comparing the results of the MLS algorithm to those obtained using expert segmentation in 29 sets of pediatric brain MR images and 20 sets of young adult MR images. Another 48 sets of elderly adult MR images were used for qualitatively evaluating the algorithm. The MLS algorithm was also compared to two existing methods, the brain extraction tool (BET) and the brain surface extractor (BSE), using the data from the Internet brain segmentation repository (IBSR). The MLS algorithm provides robust skull-stripping results, making it a promising tool for use in large, multi-institutional, population-based neuroimaging studies.
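
    A generic two-force level set update of the kind described above, with a curvature term plus an image-driven term, can be sketched with NumPy. The weights, the constant speed image, and the discretization here are illustrative assumptions, not the MLS algorithm itself:

```python
import numpy as np

def grad_norm(phi):
    gx, gy = np.gradient(phi)
    return np.sqrt(gx**2 + gy**2) + 1e-12  # avoid division by zero

def curvature(phi):
    """kappa = div(grad(phi) / |grad(phi)|)."""
    gx, gy = np.gradient(phi)
    mag = np.sqrt(gx**2 + gy**2) + 1e-12
    nxx, _ = np.gradient(gx / mag)   # d/dx of the x normal component
    _, nyy = np.gradient(gy / mag)   # d/dy of the y normal component
    return nxx + nyy

def evolve(phi, speed_image, alpha=0.2, beta=1.0, dt=0.1, steps=10):
    """Explicit update of phi_t = (alpha*kappa + beta*F(I)) * |grad phi|."""
    for _ in range(steps):
        F = alpha * curvature(phi) + beta * speed_image
        phi = phi + dt * F * grad_norm(phi)
    return phi

# Toy example: a circular front expanding under a constant outward speed
n = 64
x = np.linspace(-10.0, 10.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
phi0 = 4.0 - np.sqrt(X**2 + Y**2)   # positive inside a radius-4 circle
phi1 = evolve(phi0, np.ones((n, n)))
```

In a real skull-stripping setting the constant speed image would be replaced by an intensity-derived term that is positive inside brain tissue and negative outside, so the front is pushed or pulled toward the brain surface.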

  20. High-fidelity interface tracking in compressible flows: Unlimited anchored adaptive level set

    NASA Astrophysics Data System (ADS)

    Nourgaliev, R. R.; Theofanous, T. G.

    2007-06-01

    The interface-capturing-fidelity issue of the level set method is addressed wholly within the Eulerian framework. Our aim is a practical and efficient way to realize the expected benefits of grid resolution and high order schemes. Based on a combination of structured adaptive mesh refinement (SAMR), rather than quad/octrees, and on high-order spatial discretization, rather than the use of Lagrangian particles, our method is tailored to compressible flows, while it provides a potentially useful alternative to the particle level set (PLS) for incompressible flows. Salient features of our method include (a) avoidance of limiting (in treating the Hamiltonian of the level set equation), (b) anchoring the level set in a manner that ensures no drift and no spurious oscillations of the zero level during PDE-reinitialization, and (c) a non-linear tagging procedure for defining the neighborhood of the interface subject to mesh refinement. Numerous computational results on a set of benchmark problems (strongly deforming, stretching and tearing interfaces) demonstrate that with this approach, implemented up to 11th order accuracy, the level set method becomes essentially free of mass conservation errors and also free of parasitic interfacial oscillations, while it is still highly efficient and convenient for 3D parallel implementation. In addition, demonstration of performance in fully-coupled simulations is presented for multimode Rayleigh-Taylor instability (low-Mach number regime) and shock-induced bubble collapse (highly compressible regime).

  1. A Framework for Translating a High Level Security Policy into Low Level Security Mechanisms

    NASA Astrophysics Data System (ADS)

    Hassan, Ahmed A.; Bahgat, Waleed M.

    2010-01-01

    Security policies have different components; firewalls, active directory, and IDS are some examples of these components. Enforcement of network security policies through low level security mechanisms faces some essential difficulties; consistency, verification, and maintenance are the major ones. One approach to overcoming these difficulties is to automate the process of translating a high level security policy into low level security mechanisms. This paper introduces a framework for an automation process that translates a high level security policy into low level security mechanisms. The framework is described in terms of three phases. In the first phase, all network assets are categorized according to their roles in network security, and the relations between them are identified to constitute the network security model. This proposed model is based on organization based access control (OrBAC). However, the proposed model extends the OrBAC model to include not only access control policy but also other administrative security policies, such as auditing policy. In addition, the proposed model enables matching of each rule of the high level security policy with the corresponding rules of the low level security policy. In the second phase of the proposed framework, the high level security policy is mapped into the network security model. The second phase can be considered a translation of the high level security policy into an intermediate model level. Finally, the intermediate model level is translated automatically into low level security mechanisms. The paper illustrates the applicability of the proposed approach through an application example.
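
    The flavor of such a translation can be conveyed with a deliberately simplified sketch: a high-level rule expressed over roles is expanded into firewall-style low-level rules over concrete hosts. The role-to-host mapping, the rule tuple, and the output syntax are all hypothetical illustrations, not the paper's OrBAC model:

```python
# Hypothetical role -> asset mapping (first phase: categorize network assets)
ROLE_TO_HOSTS = {
    "web_server": ["10.0.0.10"],
    "db_server": ["10.0.1.20"],
}

SERVICE_PORTS = {"http": 80, "mysql": 3306}

def translate(rule):
    """Expand (action, source_role, dest_role, service) into low-level rules."""
    action, src_role, dst_role, service = rule
    port = SERVICE_PORTS[service]
    out = []
    for s in ROLE_TO_HOSTS[src_role]:
        for d in ROLE_TO_HOSTS[dst_role]:
            out.append(f"{action.upper()} src={s} dst={d} dport={port}")
    return out

# High-level rule: web servers may talk to database servers over MySQL
rules = translate(("permit", "web_server", "db_server", "mysql"))
```

The value of automating this expansion is that consistency and maintenance are handled at the role level: when a host is added to a role, the low-level rule set is regenerated rather than edited by hand.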

  2. Skull defect reconstruction based on a new hybrid level set.

    PubMed

    Zhang, Ziqun; Zhang, Ran; Song, Zhijian

    2014-01-01

    Skull defect reconstruction is an important aspect of surgical repair. Historically, a skull defect prosthesis was created by the mirroring technique, surface fitting, or formed templates. These methods are not based on the anatomy of the individual patient's skull, and therefore, the prosthesis cannot precisely correct the defect. This study presented a new hybrid level set model, taking into account both the global optimization region information and the local accuracy edge information, while avoiding re-initialization during the evolution of the level set function. Based on the new method, a skull defect was reconstructed, and the skull prosthesis was produced by rapid prototyping technology. This resulted in a skull defect prosthesis that well matched the skull defect with excellent individual adaptation.

  3. A new level set model for cell image segmentation

    NASA Astrophysics Data System (ADS)

    Ma, Jing-Feng; Hou, Kai; Bao, Shang-Lian; Chen, Chun

    2011-02-01

    In this paper we first determine three phases of cell images: background, cytoplasm and nucleolus, according to the general physical characteristics of cell images, and then develop a variational model based on these characteristics to segment the nucleolus and cytoplasm from their relatively complicated backgrounds. In addition, information obtained by preprocessing the cell images with the Otsu algorithm is used to initialize the level set function in the model, which speeds up the segmentation and yields satisfactory results in cell image processing.

  4. Improvements to Level Set, Immersed Boundary methods for Interface Tracking

    NASA Astrophysics Data System (ADS)

    Vogl, Chris; Leveque, Randy

    2014-11-01

    It is not uncommon to find oneself solving a moving boundary problem under flow in the context of some application. Of particular interest is when the moving boundary exerts a curvature-dependent force on the liquid. Such a force arises when observing a boundary that is resistant to bending or has surface tension. Stable numerical computation of the curvature can be difficult, as the curvature is often described in terms of high-order derivatives of either marker particle positions or of a level set function. To address this issue, the level set method is modified to track not only the position of the boundary, but the curvature as well. The definition of the signed-distance function that is used to modify the level set method is also used to develop an interpolation-free, closest-point method. These improvements are used to simulate a bending-resistant, inextensible boundary under shear flow to highlight area and volume conservation, as well as stable curvature calculation. Funded by an NSF MSPRF grant.

  5. A linear optimal transportation framework for quantifying and visualizing variations in sets of images

    PubMed Central

    Wang, Wei; Slepčev, Dejan; Basu, Saurav; Ozolek, John A.

    2012-01-01

    Transportation-based metrics for comparing images have long been applied to analyze images, especially where one can interpret the pixel intensities (or derived quantities) as a distribution of ‘mass’ that can be transported without strict geometric constraints. Here we describe a new transportation-based framework for analyzing sets of images. More specifically, we describe a new transportation-related distance between pairs of images, which we denote as linear optimal transportation (LOT). The LOT can be used directly on pixel intensities, and is based on a linearized version of the Kantorovich-Wasserstein metric (an optimal transportation distance, as is the earth mover’s distance). The new framework is especially well suited for efficiently computing all pairwise distances for a large database of images, and thus it can be used for pattern recognition in sets of images. In addition, the new LOT framework also allows for an isometric linear embedding, greatly facilitating the ability to visualize discriminant information in different classes of images. We demonstrate the application of the framework to several tasks such as discriminating nuclear chromatin patterns in cancer cells, decoding differences in facial expressions, galaxy morphologies, as well as subcellular protein distributions. PMID:23729991
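
    LOT linearizes the Kantorovich-Wasserstein metric; as a self-contained illustration of the underlying transportation distance (not of LOT's linearization itself), the 1-D Wasserstein-1 distance between two mass distributions has a closed form in terms of their cumulative distributions:

```python
import numpy as np

def wasserstein1_1d(p, q, positions):
    """W1 between two 1-D mass distributions sampled on shared grid positions."""
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    p = p / p.sum()                     # normalize to unit mass
    q = q / q.sum()
    cdf_gap = np.cumsum(p - q)          # pointwise CDF difference
    widths = np.diff(positions)
    # W1 = integral of |CDF_p - CDF_q| over the line
    return float(np.sum(np.abs(cdf_gap[:-1]) * widths))

x = np.arange(5.0)                      # grid points 0..4
a = np.array([1.0, 0, 0, 0, 0])         # all mass at position 0
b = np.array([0, 0, 0, 0, 1.0])         # all mass at position 4
d = wasserstein1_1d(a, b, x)            # moving unit mass a distance of 4
```

In higher dimensions no such closed form exists, which is one motivation for linearizations like LOT when all pairwise distances in a large image database are needed.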

  6. Priority-setting in healthcare: a framework for reasonable clinical judgements.

    PubMed

    Baerøe, K

    2009-08-01

    What are the criteria for reasonable clinical judgements? The reasonableness of macro-level decision-making has been much discussed, but little attention has been paid to the reasonableness of applying guidelines generated at a macro-level to individual cases. This paper considers a framework for reasonable clinical decision-making that will capture cases where relevant guidelines cannot reasonably be followed. There are three main sections. (1) Individual claims on healthcare from the point of view of concerns about equity are analysed. (2) The demands of responsibility and equity on professional clinical performance are discussed, and how the combination of these demands gives rise to the seven requirements that constitute the framework is explored. Since this framework is developed to assist in reasonable clinical decision-making, practical implications of all these requirements are also suggested. (3) Challenges concerning the framework are discussed. First, a crucial presumption that the framework relies upon is considered, namely clinicians' willingness to justify their decisions as requested. Then how public deliberation may influence clinical decision-making is discussed. Next is a consideration of how clinicians' need to have confidence in their own judgements in order to perform in a manner worthy of trust would be compatible with adherence to the framework supported by public deliberation. It is concluded that fair distribution in the interplay between macro- and micro-level considerations can be secured by legitimising procedures on each level, by ensuring well-organised and continuing public debate and by basing individual clinical judgements upon well-justified and principled normative bases. PMID:19644007

  7. Online adaptive decision fusion framework based on projections onto convex sets with application to wildfire detection in video

    NASA Astrophysics Data System (ADS)

    Günay, Osman; Töreyin, Behcet Uǧur; Çetin, Ahmet Enis

    2011-07-01

    In this paper, an online adaptive decision fusion framework is developed for image analysis and computer vision applications. In this framework, it is assumed that the compound algorithm consists of several sub-algorithms, each of which yields its own decision as a real number centered around zero, representing the confidence level of that particular sub-algorithm. Decision values are linearly combined with weights that are updated online according to an active fusion method based on performing orthogonal projections onto convex sets describing sub-algorithms. It is assumed that there is an oracle, who is usually a human operator, providing feedback to the decision fusion method. A video-based wildfire detection system is developed to evaluate the performance of the algorithm in handling the problems where data arrives sequentially. In this case, the oracle is the security guard of the forest lookout tower verifying the decision of the combined algorithm. Simulation results are presented.
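
    The projection-based weight update described above can be illustrated as an orthogonal projection of the weight vector onto the hyperplane consistent with the oracle's decision. This is a minimal sketch; the exact convex sets and normalization used in the paper may differ:

```python
import numpy as np

def update_weights(w, d, y):
    """Orthogonally project w onto the hyperplane {w : w . d = y},
    where d holds the sub-algorithms' decisions and y is the oracle label."""
    return w + (y - w @ d) / (d @ d) * d

def fuse(w, d):
    """Linearly combine sub-algorithm decisions with the current weights."""
    return float(w @ d)

w = np.array([0.5, 0.5, 0.5])    # initial fusion weights
d = np.array([0.8, -0.2, 0.4])   # sub-algorithm confidences centered at zero
y = 1.0                          # oracle (e.g., the security guard) says "fire"
w = update_weights(w, d, y)      # after the update, fuse(w, d) matches y
```

Each oracle feedback thus moves the weights the minimum distance needed to agree with the supervisor, which is the defining property of an orthogonal projection onto a convex set.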

  8. A geometric level set model for ultrasounds analysis

    SciTech Connect

    Sarti, A.; Malladi, R.

    1999-10-01

    We propose a partial differential equation (PDE) for filtering and segmentation of echocardiographic images based on a geometric-driven scheme. The method allows edge-preserving image smoothing and a semi-automatic segmentation of the heart chambers, that regularizes the shapes and improves edge fidelity especially in presence of distinct gaps in the edge map as is common in ultrasound imagery. A numerical scheme for solving the proposed PDE is borrowed from level set methods. Results on human in vivo acquired 2D, 2D+time,3D, 3D+time echocardiographic images are shown.

  9. Multiregion level-set partitioning of synthetic aperture radar images.

    PubMed

    Ben Ayed, Ismail; Mitiche, Amar; Belhadj, Ziad

    2005-05-01

    The purpose of this study is to investigate Synthetic Aperture Radar (SAR) image segmentation into a given but arbitrary number of gamma homogeneous regions via active contours and level sets. The segmentation of SAR images is a difficult problem due to the presence of speckle which can be modeled as strong, multiplicative noise. The proposed algorithm consists of evolving simple closed planar curves within an explicit correspondence between the interiors of curves and regions of segmentation to minimize a criterion containing a term of conformity of data to a speckle model of noise and a term of regularization. Results are shown on both synthetic and real images.

  10. Microarray missing data imputation based on a set theoretic framework and biological knowledge.

    PubMed

    Gan, Xiangchao; Liew, Alan Wee-Chung; Yan, Hong

    2006-01-01

    Gene expressions measured using microarrays usually suffer from the missing value problem. However, in many data analysis methods, a complete data matrix is required. Although existing missing value imputation algorithms have shown good performance in dealing with missing values, they also have their limitations. For example, some algorithms perform well only when strong local correlation exists in the data, while others provide the best estimates when the data is dominated by global structure. In addition, these algorithms do not take into account any biological constraints in their imputation. In this paper, we propose a set theoretic framework based on projection onto convex sets (POCS) for missing data imputation. POCS allows us to incorporate different types of a priori knowledge about missing values into the estimation process. The main idea of POCS is to formulate every piece of prior knowledge into a corresponding convex set and then use a convergence-guaranteed iterative procedure to obtain a solution in the intersection of all these sets. In this work, we design several convex sets, taking into consideration the biological characteristics of the data: the first set mainly exploits the local correlation structure among genes in microarray data, while the second set captures the global correlation structure among arrays. The third set (actually a series of sets) exploits the biological phenomenon of synchronization loss in microarray experiments. In cyclic systems, synchronization loss is a common phenomenon and we construct a series of sets based on this phenomenon for our POCS imputation algorithm. Experiments show that our algorithm can achieve a significant reduction of error compared to the KNNimpute, SVDimpute and LSimpute methods.

  11. A conceptual framework of computations in mid-level vision

    PubMed Central

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P.

    2014-01-01

    If a picture is worth a thousand words, as an English idiom goes, what should those words—or, rather, descriptors—capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations. PMID:25566044

  12. Interface Surface Area Tracking for the Conservative Level Set Method

    NASA Astrophysics Data System (ADS)

    Firehammer, Stephanie; Desjardins, Olivier

    2015-11-01

    One key question in liquid-gas flows is how to model the interface between phases in a way that is mass, momentum, and energy conserving. The accurate conservative level set (ACLS) method of Desjardins et al. provides a tool for tracking a liquid-gas interface with minimal mass conservation issues; however, it does not explicitly compute the interface surface area, and thus nothing can be said a priori about the balance between kinetic energy and surface energy. This work examines an equation for the transport of interface surface area density, which can be written in terms of the gradient of the volume fraction. Furthermore, this presentation will outline a numerical method for jointly transporting a conservative level set and surface area density. Finally, we will explore opportunities for energy conservation via the accurate exchange of energy between the flow field and the interface through surface tension, with test cases to show the results of our extended ACLS method. Funding from the National Science Foundation is gratefully acknowledged.
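    The key quantity here — surface area density written in terms of the gradient of the volume fraction — can be illustrated numerically: for a smoothed spherical volume fraction field, integrating |∇α| over the domain recovers the sphere's surface area. This is a standalone sketch, not the ACLS transport scheme itself:

```python
import numpy as np

N, r0, L = 64, 0.5, 1.0
h = 2 * L / N                                   # grid spacing
x = np.linspace(-L + h / 2, L - h / 2, N)       # cell centers
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
r = np.sqrt(X**2 + Y**2 + Z**2)

eps = 3 * h                                     # interface half-thickness
alpha = 0.5 * (1 + np.tanh((r0 - r) / eps))     # smoothed volume fraction

gx, gy, gz = np.gradient(alpha, h)
sigma = np.sqrt(gx**2 + gy**2 + gz**2)          # surface area density |grad(alpha)|
area = sigma.sum() * h**3                       # integrate over the domain
# close to the exact sphere area 4*pi*r0**2
```

The small overshoot relative to 4πr0² comes from the finite interface thickness, which is exactly the kind of discretization effect a joint level-set/area transport must control.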

  13. Variational level set segmentation for forest based on MCMC sampling

    NASA Astrophysics Data System (ADS)

    Yang, Tie-Jun; Huang, Lin; Jiang, Chuan-xian; Nong, Jian

    2014-11-01

    Environmental protection is one of the themes of today's world. The forest is a recycler of carbon dioxide and a natural oxygen bar. Protecting forests and monitoring forest growth are long-term tasks of environmental protection. Automatically estimating forest coverage from optical remote sensing images by computer is very important: it provides timely knowledge of the state of the forest in an area and frees analysts from tedious manual statistics. To address the computational complexity of global optimization via convexification, this paper proposes a level set segmentation method based on Markov chain Monte Carlo (MCMC) sampling and applies it to forest segmentation in remote sensing images. The presented method requires no convexity transformation of the target energy functional; instead, it uses an MCMC sampling method with global optimization capability. This also avoids the local minima that can arise with the gradient descent method. There are three major contributions in the paper. Firstly, by using MCMC sampling, convexity of the energy functional is no longer necessary and global optimization can still be achieved. Secondly, by exploiting data (texture) and knowledge (a priori color) to guide the construction of the Markov chain, the convergence rate of the Markov chain is improved significantly. Finally, a level set segmentation method for forests integrating a priori color and texture is proposed. The experiments show that our method can efficiently and accurately segment forest in remote sensing images.
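    The idea of replacing gradient descent on a possibly non-convex energy with MCMC sampling can be sketched on a 1D toy: a two-region piecewise-constant "image" with noise, a Chan-Vese-like energy with fixed region means (a simplification; the paper additionally uses texture and a priori color to guide the chain), and Metropolis single-site flips with annealing:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
truth = (np.arange(n) >= n // 2).astype(int)
img = truth + rng.normal(0.0, 0.2, n)        # noisy two-region "image"

means = np.array([0.0, 1.0])                 # fixed region means (simplification)
lam = 0.1                                    # boundary-length penalty

def energy(lab):
    data = np.sum((img - means[lab]) ** 2)   # region fidelity term
    length = np.sum(lab[1:] != lab[:-1])     # number of region boundaries
    return data + lam * length

lab = rng.integers(0, 2, n)                  # random initial labeling
E = energy(lab)
for it in range(20000):
    T = max(0.01, 0.999 ** it)               # annealing schedule
    i = rng.integers(n)
    cand = lab.copy()
    cand[i] ^= 1                             # propose flipping one site
    Ec = energy(cand)
    # Metropolis rule: always accept improvements, sometimes accept worse moves
    if Ec < E or rng.random() < np.exp((E - Ec) / T):
        lab, E = cand, Ec

accuracy = np.mean(lab == truth)             # agreement with the true regions
```

Because worse moves are occasionally accepted at high temperature, the sampler can escape local minima that would trap a pure descent scheme.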

  14. XFEM schemes for level set based structural optimization

    NASA Astrophysics Data System (ADS)

    Li, Li; Wang, Michael Yu; Wei, Peng

    2012-12-01

    In this paper, some elegant extended finite element method (XFEM) schemes for level set method structural optimization are proposed. Firstly, two-dimensional (2D) and three-dimensional (3D) XFEM schemes with a partition integral method are developed, and numerical examples are employed to evaluate their accuracy, indicating that an accurate analysis result can be obtained on the structural boundary. Furthermore, methods for improving the computational accuracy and efficiency of XFEM are studied, including an XFEM integral scheme without quadrature sub-cells and a higher-order element XFEM scheme. Numerical examples show that the XFEM scheme without quadrature sub-cells yields similar structural analysis accuracy while prominently reducing the time cost, and that higher-order XFEM elements improve the computational accuracy of structural analysis in the boundary elements, though at increased time cost. Therefore, the balance of time cost between the finite element system scale and the element order needs to be considered. Finally, the reliability and advantages of the proposed XFEM schemes are illustrated with several 2D and 3D mean compliance minimization examples that are widely used in the recent literature on structural topology optimization. All numerical results demonstrate that the proposed XFEM is a promising structural analysis approach for structural optimization with the level set method.

  15. PET image reconstruction with anatomical edge guided level set prior

    NASA Astrophysics Data System (ADS)

    Cheng-Liao, Jinxiu; Qi, Jinyi

    2011-11-01

    Acquiring both anatomical and functional images during one scan, PET/CT systems improve the ability to detect and localize abnormal uptakes. In addition, CT images provide anatomical boundary information that can be used to regularize positron emission tomography (PET) images. Here we propose a new approach to maximum a posteriori reconstruction of PET images with a level set prior guided by anatomical edges. The image prior models both the smoothness of PET images and the similarity between functional boundaries in PET and anatomical boundaries in CT. Level set functions (LSFs) are used to represent smooth and closed functional boundaries. The proposed method does not assume an exact match between PET and CT boundaries. Instead, it encourages similarity between the two boundaries, while allowing different region definitions in PET images to accommodate possible signal and position mismatch between functional and anatomical images. While the functional boundaries are guaranteed to be closed by the LSFs, the proposed method does not require closed anatomical boundaries and can utilize incomplete edges obtained from an automatic edge detection algorithm. We conducted computer simulations to evaluate the performance of the proposed method. Two digital phantoms were constructed based on the Digimouse data and a human CT image, respectively. Anatomical edges were extracted automatically from the CT images. Tumors were simulated in the PET phantoms with different mismatched anatomical boundaries. Compared with existing methods, the new method achieved better bias-variance performance. The proposed method was also applied to real mouse data and achieved higher contrast than other methods.

  16. Crossing levels in systems ergonomics: a framework to support 'mesoergonomic' inquiry.

    PubMed

    Karsh, Ben-Tzion; Waterson, Patrick; Holden, Richard J

    2014-01-01

    In this paper we elaborate and articulate the need for what has been termed 'mesoergonomics'. In particular, we argue that the concept has the potential to bridge the gap between, and integrate, established work within the domains of micro- and macroergonomics. Mesoergonomics is defined as an open systems approach to human factors and ergonomics (HFE) theory and research whereby the relationship between variables in at least two different system levels or echelons is studied, and where the dependent variables are human factors and ergonomic constructs. We present a framework which can be used to structure a set of questions for future work and prompt further empirical and conceptual inquiry. The framework consists of four steps: (1) establishing the purpose of the mesoergonomic investigation; (2) selecting human factors and ergonomics variables; (3) selecting a specific type of mesoergonomic investigation; and (4) establishing relationships between system levels. In addition, we describe two case studies which illustrate the workings of the framework and the value of adopting a mesoergonomic perspective within HFE. The paper concludes with a set of issues which could form part of a future agenda for research within systems ergonomics.

  17. Two-way coupled SPH and particle level set fluid simulation.

    PubMed

    Losasso, Frank; Talton, Jerry; Kwatra, Nipun; Fedkiw, Ronald

    2008-01-01

    Grid-based methods have difficulty resolving features on or below the scale of the underlying grid. Although adaptive methods (e.g. RLE, octrees) can alleviate this to some degree, separate techniques are still required for simulating small-scale phenomena such as spray and foam, especially since these more diffuse materials typically behave quite differently than their denser counterparts. In this paper, we propose a two-way coupled simulation framework that uses the particle level set method to efficiently model dense liquid volumes and a smoothed particle hydrodynamics (SPH) method to simulate diffuse regions such as sprays. Our novel SPH method allows us to simulate both dense and diffuse water volumes, fully incorporates the particles that are automatically generated by the particle level set method in under-resolved regions, and allows for two way mixing between dense SPH volumes and grid-based liquid representations.
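    The SPH side of such a coupling starts from kernel density summation over particles. A minimal 1D sketch with the standard cubic spline kernel (illustrative only; the paper's method is 3D and two-way coupled to a particle level set):

```python
import numpy as np

def cubic_spline_W(r, h):
    """Standard 1D cubic spline SPH kernel with smoothing length h."""
    q = np.abs(r) / h
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0))
    return (2.0 / (3.0 * h)) * w             # 1D normalisation factor

n, dx = 100, 0.1
xp = np.arange(n) * dx                       # evenly spaced particles
m = 1.0 * dx                                 # particle mass for target rho = 1
h = 1.2 * dx                                 # smoothing length
rho = np.array([np.sum(m * cubic_spline_W(xp - xi, h)) for xi in xp])
# interior densities are close to 1; they fall off near the free ends
```

The density deficit at the ends of the particle line is the 1D analogue of the free-surface deficiency that makes diffuse regions such as spray behave differently from dense volumes.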

  18. Level Set Segmentation of Medical Images Based on Local Region Statistics and Maximum a Posteriori Probability

    PubMed Central

    Wang, Yi; Lei, Tao; Fan, Yangyu; Feng, Yan

    2013-01-01

    This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. Then this local objective function is integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show desirable performances of our method. PMID:24302974

  19. A level-set method for interfacial flows with surfactant

    NASA Astrophysics Data System (ADS)

    Xu, Jian-Jun; Li, Zhilin; Lowengrub, John; Zhao, Hongkai

    2006-03-01

    A level-set method for the simulation of fluid interfaces with insoluble surfactant is presented in two dimensions. The method can be straightforwardly extended to three dimensions and to soluble surfactants. The method couples a semi-implicit discretization for solving the surfactant transport equation recently developed by Xu and Zhao [J. Xu, H. Zhao. An Eulerian formulation for solving partial differential equations along a moving interface, J. Sci. Comput. 19 (2003) 573-594] with the immersed interface method originally developed by LeVeque and Li [R. LeVeque, Z. Li. The immersed interface method for elliptic equations with discontinuous coefficients and singular sources, SIAM J. Numer. Anal. 31 (1994) 1019-1044] for solving the fluid flow equations and the Laplace-Young boundary conditions across the interfaces. Novel techniques are developed to accurately conserve component mass and surfactant mass during the evolution. Convergence of the method is demonstrated numerically. The method is applied to study the effects of surfactant on single drops, drop-drop interactions and interactions among multiple drops in Stokes flow under a steady applied shear. Due to Marangoni forces and to non-uniform capillary forces, the presence of surfactant results in larger drop deformations and more complex drop-drop interactions compared to the analogous cases for clean drops. The effects of surfactant are found to be most significant in flows with multiple drops. To our knowledge, this is the first time that the level-set method has been used to simulate fluid interfaces with surfactant.

  20. Statistics of dark matter halos in the excursion set peak framework

    SciTech Connect

    Lapi, A.; Danese, L. E-mail: danese@sissa.it

    2014-07-01

    We derive approximated, yet very accurate analytical expressions for the abundance and clustering properties of dark matter halos in the excursion set peak framework; the latter relies on the standard excursion set approach, but also includes the effects of a realistic filtering of the density field, a mass-dependent threshold for collapse, and the prescription from peak theory that halos tend to form around density maxima. We find that our approximations work excellently for diverse power spectra, collapse thresholds and density filters. Moreover, when adopting a cold dark matter power spectrum, top-hat filtering and a mass-dependent collapse threshold (supplemented with conceivable scatter), our approximated halo mass function and halo bias represent very well the outcomes of cosmological N-body simulations.

  1. Comprehensive evaluation of long-term hydrological data sets: Constraints of the Budyko framework

    NASA Astrophysics Data System (ADS)

    Greve, Peter; Orlowsky, Boris; Seneviratne, Sonia I.

    2013-04-01

    An accurate estimate of the climatological land water balance is essential for a wide range of socio-economic issues. Despite the simplicity of the underlying water balance equation, its individual variables are of complex nature. Global estimates, either derived from observations or from models, of precipitation (P) and especially evapotranspiration (ET) are characterized by high uncertainties. This leads to inconsistent results in determining conditions related to the land water balance and its components. In this study, we consider the Budyko framework as a constraint to evaluate long-term hydrological data sets within the period from 1984 to 2005. The Budyko framework is a well established, empirically based relationship between ET/P and Ep/P, with Ep being the potential evaporation. We use estimates of ET associated with the LandFlux-EVAL initiative (Mueller et al., 2012), either derived from observations, CMIP5 models or land-surface models (LSMs) driven with observation-based forcing or atmospheric reanalyses. Data sets of P comprise all commonly used global observation-based estimates. Ep is determined by methods of differing complexity with recent global temperature and radiation data sets. Based on this comprehensive synthesis of data sets and methods to determine Ep, more than 2000 possible combinations of ET/P in conjunction with Ep/P are created. All combinations are validated against the Budyko curve and against physical limits within the Budyko phase space. For this purpose we develop an error measure based on the root mean square error which combines both constraints. We find that uncertainties are mainly induced by the ET data sets. In particular, reanalysis and CMIP5 data sets are characterized by low realism. The realism of LSMs is, moreover, not primarily controlled by the forcing, as different LSMs driven with the same forcing show significantly different error measures. Our comprehensive approach is thus suitable to detect uncertainties
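    The Budyko framework constrains the evaporative index ET/P by the aridity index Ep/P. A sketch using one common closed form of the curve (Budyko, 1974) together with the two physical limits used to screen data-set combinations; the error measure below is a plain RMSE to the curve, simpler than the combined measure the abstract describes, and the data points are hypothetical:

```python
import numpy as np

def budyko(phi):
    """Evaporative index ET/P as a function of the aridity index
    phi = Ep/P, using the classic Budyko (1974) closed form."""
    phi = np.asarray(phi, dtype=float)
    return np.sqrt(phi * np.tanh(1.0 / phi) * (1.0 - np.exp(-phi)))

def violates_limits(phi, et_p):
    # Physical limits: ET cannot exceed P (water) or Ep (energy).
    return (et_p > 1.0) | (et_p > phi)

phi = np.array([0.5, 1.0, 2.0, 4.0])               # aridity indices
est = np.array([0.42, 0.60, 0.85, 1.10])           # hypothetical ET/P estimates
err = np.sqrt(np.mean((est - budyko(phi)) ** 2))   # RMSE to the Budyko curve
bad = violates_limits(phi, est)                    # last point breaks the water limit
```

A combination of P, ET, and Ep data sets scoring a low `err` with no limit violations would count as realistic under this screening.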

  2. Implementing accountability for reasonableness framework at district level in Tanzania: a realist evaluation

    PubMed Central

    2011-01-01

    Background: Despite the growing importance of the Accountability for Reasonableness (A4R) framework in priority setting worldwide, there is still an inadequate understanding of the processes and mechanisms underlying its influence on legitimacy and fairness, as conceived and reflected in service management processes and outcomes. As a result, the ability to draw scientifically sound lessons for the application of the framework to services and interventions is limited. This paper evaluates the experiences of implementing the A4R approach in Mbarali District, Tanzania, in order to find out how the innovation was shaped, enabled, and constrained by the interaction between contexts, mechanisms and outcomes.

    Methods: This study draws on the principles of realist evaluation -- a largely qualitative approach, chiefly concerned with testing and refining programme theories by exploring the complex interactions of contexts, mechanisms, and outcomes. Mixed methods were used in data collection, including individual interviews, non-participant observation, and document reviews. A thematic framework approach was adopted for the data analysis.

    Results: The study found that while the A4R approach to priority setting was helpful in strengthening transparency, accountability, stakeholder engagement, and fairness, the efforts at integrating it into the current district health system were challenging. Participatory structures under the decentralisation framework, central government's call for partnership in district-level planning and priority setting, perceived needs of stakeholders, as well as active engagement between researchers and decision makers all facilitated the adoption and implementation of the innovation. In contrast, however, limited local autonomy, low level of public awareness, unreliable and untimely funding, inadequate accountability mechanisms, and limited local resources were the major contextual factors that hampered the full implementation.

    Conclusion: This study

  3. Powerful Set-Based Gene-Environment Interaction Testing Framework for Complex Diseases.

    PubMed

    Jiao, Shuo; Peters, Ulrike; Berndt, Sonja; Bézieau, Stéphane; Brenner, Hermann; Campbell, Peter T; Chan, Andrew T; Chang-Claude, Jenny; Lemire, Mathieu; Newcomb, Polly A; Potter, John D; Slattery, Martha L; Woods, Michael O; Hsu, Li

    2015-12-01

    Identification of gene-environment interaction (G × E) is important in understanding the etiology of complex diseases. Based on our previously developed Set Based gene EnviRonment InterAction test (SBERIA), in this paper we propose a powerful framework for enhanced set-based G × E testing (eSBERIA). The major challenge of signal aggregation within a set is how to tell signals from noise. eSBERIA tackles this challenge by adaptively aggregating the interaction signals within a set weighted by the strength of the marginal and correlation screening signals. eSBERIA then combines the screening-informed aggregate test with a variance component test to account for the residual signals. Additionally, we develop a case-only extension for eSBERIA (coSBERIA) and an existing set-based method, which boosts the power not only by exploiting the G-E independence assumption but also by avoiding the need to specify main effects for a large number of variants in the set. Through extensive simulation, we show that coSBERIA and eSBERIA are considerably more powerful than existing methods within the case-only and the case-control method categories across a wide range of scenarios. We conduct a genome-wide G × E search by applying our methods to Illumina HumanExome Beadchip data of 10,446 colorectal cancer cases and 10,191 controls and identify two novel interactions between nonsteroidal anti-inflammatory drugs (NSAIDs) and MINK1 and PTCHD3.

  5. Device for timing and power level setting for microwave applications

    NASA Astrophysics Data System (ADS)

    Ursu, M.-P.; Buidoş, T.

    2016-08-01

    Nowadays, microwaves are widely used for various technological processes. The microwaves are emitted by magnetrons, which have strict requirements concerning the power supplies for the anode and filament cathode, the intensity of the magnetic field, cooling and electromagnetic shielding. Magnetrons do not tolerate any alteration of their required voltages, currents and magnetic fields, which means that their output microwave power is fixed, so the only way to alter the power level is to use time-division, turning the magnetron on and off in repetitive time patterns. In order to attain accurate and reproducible results, as well as correct and safe operation of the microwave device, all these requirements must be fulfilled. Safe, correct and reproducible operation of the microwave appliance can be achieved by means of a specially built electronic device, which ensures accurate and reproducible exposure times, interlocking of the commands and automatic switch-off when abnormal operating conditions occur. This driving device, designed and realized during the completion of Mr. Ursu's doctoral thesis, consists of a quartz time base, several programmable frequency and duration dividers, LED displays, sensors and interlocking gates. The active and passive electronic components are placed on custom-made PCBs, designed and made by means of computer-aided applications and machines. The driving commands of the electronic device are delivered to the magnetron power supplies by means of optical zero-crossing relays. The inputs of the electronic driving device can sense the status of the microwave appliance. The user is able to enter the total exposure time, the division factor that sets the output power level and, as a novelty, the clock frequency of the time divider.
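    The power-setting principle described — a fixed-power magnetron whose average output is set purely by time-division — reduces to choosing a duty cycle. A toy sketch of the idea (the actual device uses hardware frequency dividers driven by a quartz time base, not software):

```python
def on_off_pattern(power_fraction, period_slots=16):
    """Repetitive on/off slot pattern whose duty cycle approximates the
    requested fraction of full magnetron power."""
    on_slots = round(power_fraction * period_slots)
    return [1] * on_slots + [0] * (period_slots - on_slots)

pattern = on_off_pattern(0.25)               # request 25% average power
average_power = sum(pattern) / len(pattern)  # fraction of time the tube is on
```

With 16 slots per period, power is settable in steps of 1/16 of full power; a finer divider (or a higher clock frequency, as the device allows) gives finer steps.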

  6. Parallel level-set methods on adaptive tree-based grids

    NASA Astrophysics Data System (ADS)

    Mirzadeh, Mohammad; Guittet, Arthur; Burstedde, Carsten; Gibou, Frederic

    2016-10-01

    We present scalable algorithms for the level-set method on dynamic, adaptive Quadtree and Octree Cartesian grids. The algorithms are fully parallelized and implemented using the MPI standard and the open-source p4est library. We solve the level set equation with a semi-Lagrangian method which, similar to its serial implementation, is free of any time-step restrictions. This is achieved by introducing a scalable global interpolation scheme on adaptive tree-based grids. Moreover, we present a simple parallel reinitialization scheme using the pseudo-time transient formulation. Both parallel algorithms scale on the Stampede supercomputer, where we are currently using up to 4096 CPU cores, the limit of our current account. Finally, a relevant application of the algorithms is presented in modeling a crystallization phenomenon by solving a Stefan problem, illustrating a level of detail that would be impossible to achieve without a parallel adaptive strategy. We believe that the algorithms presented in this article will be of interest and useful to researchers working with the level-set framework and modeling multi-scale physics in general.
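    The semi-Lagrangian step that frees the method from time-step restrictions traces each grid point back along the velocity field and interpolates there. A serial 1D sketch (the article's implementation is parallel, adaptive, and tree-based, with a scalable global interpolation scheme):

```python
import numpy as np

def semi_lagrangian_step(phi, u, dt, dx):
    """One semi-Lagrangian step for phi_t + u phi_x = 0: trace each grid
    point back to its departure point and interpolate. Stable even when
    the CFL number u*dt/dx exceeds 1."""
    x = np.arange(len(phi)) * dx
    xd = x - u * dt                          # departure points
    return np.interp(xd, x, phi)             # linear interpolation (clamped)

n, dx = 200, 0.01
x = np.arange(n) * dx
phi = x - 1.0                                # signed distance; interface at x = 1
u, dt = 1.0, 0.05                            # CFL = 5, no time-step restriction
phi = semi_lagrangian_step(phi, u, dt, dx)
# the zero level set has moved from x = 1.00 to x = 1.05
```

In the parallel setting, the departure point may land on another process's subdomain, which is exactly why a scalable global interpolation scheme is needed.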

  7. Towards a Dynamic Conceptual Framework for English-Medium Education in Multilingual University Settings

    ERIC Educational Resources Information Center

    Dafouz, Emma; Smit, Ute

    2016-01-01

    At a time of increasing internationalization in tertiary education, English-Medium Education in Multilingual University Settings (EMEMUS) has become a common practice. While there is already ample research describing this phenomenon at a local level (Smit and Dafouz 2012a), the theoretical side needs to be elaborated. This article thus aims to…

  8. An Analysis Framework Addressing the Scale and Legibility of Large Scientific Data Sets

    SciTech Connect

    Childs, Hank R.

    2006-01-01

    Much of the previous work in the large data visualization area has solely focused on handling the scale of the data. This task is clearly a great challenge and necessary, but it is not sufficient. Applying standard visualization techniques to large scale data sets often creates complicated pictures where meaningful trends are lost. A second challenge, then, is to also provide algorithms that simplify what an analyst must understand, using either visual or quantitative means. This challenge can be summarized as improving the legibility or reducing the complexity of massive data sets. Fully meeting both of these challenges is the work of many, many PhD dissertations. In this dissertation, we describe some new techniques to address both the scale and legibility challenges, in hope of contributing to the larger solution. In addition to our assumption of simultaneously addressing both scale and legibility, we add an additional requirement that the solutions considered fit well within an interoperable framework for diverse algorithms, because a large suite of algorithms is often necessary to fully understand complex data sets. For scale, we present a general architecture for handling large data, as well as details of a contract-based system for integrating advanced optimizations into a data flow network design. We also describe techniques for volume rendering and performing comparisons at the extreme scale. For legibility, we present several techniques. Most noteworthy are equivalence class functions, a technique to drive visualizations using statistical methods, and line-scan based techniques for characterizing shape.

  9. A framework for automatic construction of 3D PDM from segmented volumetric neuroradiological data sets.

    PubMed

    Fu, Yili; Gao, Wenpeng; Xiao, Yongfei; Liu, Jimin

    2010-03-01

    3D point distribution models (PDMs) of subcortical structures can be applied in medical image analysis by providing a priori knowledge. However, accurate shape representation and point correspondence are still challenging for building 3D PDMs. This paper presents a novel framework for the automated construction of 3D PDMs from a set of segmented volumetric images. First, a template shape is generated according to the spatial overlap. Then the corresponding landmarks among shapes are automatically identified by a novel hierarchical global-to-local approach, which combines iterative closest point based global registration and active surface model based local deformation to transform the template shape to all other shapes. Finally, a 3D PDM is constructed. Experimental results on four subcortical structures show that the proposed method is able to construct 3D PDMs of high quality in terms of compactness, generalization and specificity, and is more efficient and effective than state-of-the-art methods such as MDL and SPHARM. PMID:19631401
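    The global-registration stage of such a pipeline is iterative closest point (ICP): alternate nearest-neighbour correspondence with a least-squares rigid fit. A 2D toy sketch (the paper works on 3D surfaces and follows registration with active-surface local deformation):

```python
import numpy as np

def best_rigid_transform(A, B):
    """Least-squares rotation R and translation t mapping points A onto B
    (Kabsch / Procrustes)."""
    ca, cb = A.mean(0), B.mean(0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T       # guard against reflections
    t = cb - R @ ca
    return R, t

def icp(template, target, n_iter=20):
    """Register template to target; return the index of the matched
    target point for each template point and the aligned template."""
    src = template.copy()
    for _ in range(n_iter):
        d2 = ((src[:, None, :] - target[None, :, :]) ** 2).sum(-1)
        idx = d2.argmin(1)                   # nearest-neighbour correspondences
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t
    return idx, src

theta = np.pi / 18                           # toy target: template rotated 10 degrees
Rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
template = np.array([[0.0, 0.0], [1, 0], [1, 1], [0, 1], [0.5, 1.5]])
target = template @ Rot.T
idx, aligned = icp(template, target)         # recovers point correspondences
```

Once the template is globally registered like this, its landmarks carry a consistent labelling onto every target shape, which is what the PDM construction needs.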

  10. Choosing a framework for ethical analysis in advanced practice settings: the case for casuistry.

    PubMed

    Artnak, K E; Dimmitt, J H

    1996-02-01

    The need for advanced practice nurses to incorporate ethical analysis into case management is becoming more apparent -- particularly for the increasingly independent practice settings of psychiatric and mental health nursing. The nursing literature contains many articles dealing with the more abstract treatment of clinical ethics, but for the practitioner there is unfortunately little information available that uses ethical principles in a practical framework addressing the concrete reality of daily, difficult clinical decision making. This article applies a model of reasoned analysis to an actual case study using the concepts of casuistry, or case-based reasoning. This method offers an alternative to the more popular paradigm of principlism. Complicated by the existence of violence and abuse, the case examines several ethical issues including patient privacy and legitimate breaches of patient confidentiality.

  11. Profile Evolution Simulation in Etching Systems Using Level Set Methods

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Govindan, T. R.; Meyyappan, M.

    1998-01-01

    Semiconductor device profiles are determined by the characteristics of both etching and deposition processes. In particular, a highly anisotropic etch is required to achieve vertical sidewalls. However, etching is comprised of both anisotropic and isotropic components, due to ion and neutral fluxes, respectively. In Ar/Cl2 plasmas, for example, neutral chlorine reacts with the Si surfaces to form silicon chlorides. These compounds are then removed by the impinging ion fluxes. Hence the directionality of the ions (and thus the ion angular distribution function, or IAD), as well as the relative fluxes of neutrals and ions, determines the amount of undercutting. One method of modeling device profile evolution is to simulate the moving solid-gas interface between the semiconductor and the plasma as a string of nodes. The velocity of each node is calculated and then the nodes are advanced accordingly. Although this technique appears to be relatively straightforward, extensive looping schemes are required at the profile corners. An alternate method is to use level set theory, which involves embedding the location of the interface in a field variable. The normal speed is calculated at each mesh point, and the field variable is updated. The profile corners are more accurately modeled as the need for looping algorithms is eliminated. The model we have developed is a 2-D Level Set Profile Evolution Simulation (LSPES). The LSPES calculates etch rates of a substrate in low pressure plasmas due to the incident ion and neutral fluxes. For a Si substrate in an Ar/Cl2 gas mixture, for example, the predictions of the LSPES are identical to those from a string evolution model for high neutral fluxes and two different ion angular distributions [2]. In the figure shown, the relative neutral to ion flux in the bulk plasma is 100 to 1. For a moderately isotropic ion angular distribution function as shown in the cases in the left hand column, both the LSPES (top row) and rude's string

  12. NPN fuzzy sets and NPN qualitative algebra: a computational framework for bipolar cognitive modeling and multiagent decision analysis.

    PubMed

    Zhang, W R

    1996-01-01

    An NPN (Negative-Positive-Neutral) fuzzy set theory and an NPN qualitative algebra (Q-algebra) are proposed which form a computational framework for bipolar cognitive modeling and multiagent decision analysis. First a 6-valued NPN logic is introduced which extends the usual 4-valued Q-algebra (S, ≈, ⊕, ⊗) with S = {+, −, 0, ?} by adding one more level of specification; then a real-valued NPN fuzzy logic is introduced which extends the 6-valued model to the real space {(x, y) | (x, y) ∈ [−1, 0] × [0, 1]} and adds infinite levels of specification. As a generalization, a fuzzy set theory is presented that allows β-level fuzzy-number-based NPN variables (x, y) to be substituted into (S, ≈, ⊕, ⊗), where ⊗ stands for any NPN T-norm, ⊕ stands for disjunction (∨) or union (∪), and β is the number of α-cuts.

  13. INSTITUTIONALIZING SAFEGUARDS-BY-DESIGN: HIGH-LEVEL FRAMEWORK

    SciTech Connect

    Trond Bjornard PhD; Joseph Alexander; Robert Bean; Brian Castle; Scott DeMuth, Ph.D.; Phillip Durst; Michael Ehinger; Prof. Michael Golay, Ph.D.; Kevin Hase, Ph.D.; David J. Hebditch, DPhil; John Hockert, Ph.D.; Bruce Meppen; James Morgan; Jerry Phillips, Ph.D., PE

    2009-02-01

    The application of a Safeguards-by-Design (SBD) process for new nuclear facilities can reduce proliferation risks. A multi-laboratory team was sponsored in Fiscal Year (FY) 2008 to define a SBD process and determine how it could be incorporated into existing facility design and construction processes. The possibility to significantly influence major design features, such as process selection and plant layout, largely ends with the conceptual design step. Therefore SBD’s principal focus must be on the early inclusion of safeguards requirements and the early identification of beneficial design features. The result could help form the basis for a new international norm for integrating safeguards into facility design. This is an interim report describing progress and project status as of the end of FY08. In this effort, SBD is defined as a structured approach to ensure the timely, efficient, and cost-effective integration of international and national safeguards, physical security, and other nonproliferation objectives into the overall design process for a nuclear facility. A key objective is to ensure that security and nonproliferation issues are considered when weighing facility design alternatives. Central to the work completed in FY08 was a study in which a SBD process was developed in the context of the current DOE facility acquisition process. The DOE study enabled the development of a “SBD design loop” that is suitable for use in any facility design process. It is a graded, iterative process that incorporates safeguards concerns throughout the conceptual, preliminary and final design processes. Additionally, a set of proposed design principles for SBD was developed. A “Generic SBD Process” was then developed. Key features of the process include the initiation of safeguards design activities in the pre-conceptual planning phase, early incorporation of safeguards requirements into the project requirements, early appointment of an SBD team, and

  14. Some free boundary problems in potential flow regime using a level set based method

    SciTech Connect

    Garzon, M.; Bobillo-Ares, N.; Sethian, J.A.

    2008-12-09

    Recent advances in the field of fluid mechanics with moving fronts are linked to the use of Level Set Methods, a versatile mathematical technique to follow free boundaries which undergo topological changes. A challenging class of problems in this context involves the solution of a partial differential equation posed on a moving domain, in which the boundary condition for the PDE solver has to be obtained from a partial differential equation defined on the front. This is the case of potential flow models with moving boundaries. Moreover, the fluid front may carry some material substance which diffuses along the front and is advected by the front velocity, as for example when surfactants are used to lower surface tension. We present a Level Set based methodology to embed these partial differential equations defined on the front in a complete Eulerian framework, fully avoiding the tracking of fluid particles and its known limitations. To show the advantages of this approach in the field of Fluid Mechanics we present in this work one particular application: the numerical approximation of a potential flow model to simulate the evolution and breaking of a solitary wave propagating over a sloping bottom, and compare the level set based algorithm with previous front tracking models.

  15. A Relation Extraction Framework for Biomedical Text Using Hybrid Feature Set

    PubMed Central

    Muzaffar, Abdul Wahab; Azam, Farooque; Qamar, Usman

    2015-01-01

    Information extraction from unstructured text segments is a complex task. Although manual information extraction often produces the best results, it is harder to manage biomedical data extraction manually because of the exponential increase in data size. Thus, there is a need for automatic tools and techniques for information extraction in biomedical text mining. Relation extraction is a significant area under biomedical information extraction that has gained much importance in the last two decades. A lot of work has been done on biomedical relation extraction focusing on rule-based and machine learning techniques. In the last decade, the focus has changed to hybrid approaches showing better results. This research presents a hybrid feature set for classification of relations between biomedical entities. The main contribution of this research lies in the semantic feature set, where verb phrases are ranked using the Unified Medical Language System (UMLS) and a ranking algorithm. Support Vector Machine and Naïve Bayes, two effective machine learning techniques, are used to classify these relations. Our approach has been validated on the standard biomedical text corpus obtained from MEDLINE 2001. We conclude that our framework outperforms all state-of-the-art approaches used for relation extraction on the same corpus. PMID:26347797

  16. A Relation Extraction Framework for Biomedical Text Using Hybrid Feature Set.

    PubMed

    Muzaffar, Abdul Wahab; Azam, Farooque; Qamar, Usman

    2015-01-01

    Information extraction from unstructured text segments is a complex task. Although manual information extraction often produces the best results, it is harder to manage biomedical data extraction manually because of the exponential increase in data size. Thus, there is a need for automatic tools and techniques for information extraction in biomedical text mining. Relation extraction is a significant area under biomedical information extraction that has gained much importance in the last two decades. A lot of work has been done on biomedical relation extraction focusing on rule-based and machine learning techniques. In the last decade, the focus has changed to hybrid approaches showing better results. This research presents a hybrid feature set for classification of relations between biomedical entities. The main contribution of this research lies in the semantic feature set, where verb phrases are ranked using the Unified Medical Language System (UMLS) and a ranking algorithm. Support Vector Machine and Naïve Bayes, two effective machine learning techniques, are used to classify these relations. Our approach has been validated on the standard biomedical text corpus obtained from MEDLINE 2001. We conclude that our framework outperforms all state-of-the-art approaches used for relation extraction on the same corpus.
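
    As a toy illustration of the classification step in entries 15-16, here is a stdlib-only multinomial Naive Bayes over string-valued features. The feature names (e.g. "verb=", "umls=") are invented placeholders, not the paper's actual hybrid feature set, and a production system would use a mature SVM/NB library:

```python
from collections import defaultdict
import math

class TinyNB:
    """Multinomial Naive Bayes with Laplace smoothing over bags of
    feature strings, sketching relation classification between
    biomedical entity pairs."""
    def __init__(self, alpha=1.0):
        self.alpha = alpha
        self.class_counts = defaultdict(int)
        self.feat_counts = defaultdict(lambda: defaultdict(int))
        self.vocab = set()

    def fit(self, samples):
        # samples: iterable of (feature_list, label)
        for feats, label in samples:
            self.class_counts[label] += 1
            for f in feats:
                self.feat_counts[label][f] += 1
                self.vocab.add(f)
        return self

    def predict(self, feats):
        total = sum(self.class_counts.values())
        best, best_lp = None, -math.inf
        for c, n in self.class_counts.items():
            lp = math.log(n / total)  # class prior
            denom = sum(self.feat_counts[c].values()) + self.alpha * len(self.vocab)
            for f in feats:
                lp += math.log((self.feat_counts[c][f] + self.alpha) / denom)
            if lp > best_lp:
                best, best_lp = c, lp
        return best
```

    Training on a handful of labeled feature bags and predicting the relation class for a new pair takes two calls: `TinyNB().fit(train)` then `predict(features)`.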

  17. A universal surface complexation framework for modeling proton binding onto bacterial surfaces in geologic settings

    USGS Publications Warehouse

    Borrok, D.; Turner, B.F.; Fein, J.B.

    2005-01-01

    Adsorption onto bacterial cell walls can significantly affect the speciation and mobility of aqueous metal cations in many geologic settings. However, a unified thermodynamic framework for describing bacterial adsorption reactions does not exist. This problem originates from the numerous approaches that have been chosen for modeling bacterial surface protonation reactions. In this study, we compile all currently available potentiometric titration datasets for individual bacterial species, bacterial consortia, and bacterial cell wall components. Using a consistent, four discrete site, non-electrostatic surface complexation model, we determine total functional group site densities for all suitable datasets, and present an averaged set of 'universal' thermodynamic proton binding and site density parameters for modeling bacterial adsorption reactions in geologic systems. Modeling results demonstrate that the total concentrations of proton-active functional group sites for the 36 bacterial species and consortia tested are remarkably similar, averaging 3.2 ± 1.0 (1σ) × 10⁻⁴ moles/wet gram. Examination of the uncertainties involved in the development of proton-binding modeling parameters suggests that ignoring factors such as bacterial species, ionic strength, temperature, and growth conditions introduces relatively small error compared to the unavoidable uncertainty associated with the determination of cell abundances in realistic geologic systems. Hence, we propose that reasonable estimates of the extent of bacterial cell wall deprotonation can be made using averaged thermodynamic modeling parameters from all of the experiments that are considered in this study, regardless of bacterial species used, ionic strength, temperature, or growth condition of the experiment. The average site densities for the four discrete sites are 1.1 ± 0.7 × 10⁻⁴, 9.1 ± 3.8 × 10⁻⁵, 5.3 ± 2.1 × 10⁻⁵, and 6.6 ± 3.0 × 10⁻⁵ moles/wet gram bacteria for the sites with pKa values of 3
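
    The averaged-parameter idea above reduces to simple mass-action arithmetic at any pH. The sketch below computes deprotonated site concentrations for a four-site non-electrostatic model; the pKa/site-density pairs used in the usage example are made-up illustrations, not the paper's fitted values:

```python
def deprotonated_fraction(pKa, pH):
    """Fraction of a single proton-active site in deprotonated form,
    from the mass-action law: f = 1 / (1 + 10**(pKa - pH))."""
    return 1.0 / (1.0 + 10.0 ** (pKa - pH))

def total_deprotonated(sites, pH):
    """Total deprotonated site concentration (moles/wet gram) for a
    multi-site non-electrostatic surface complexation model.
    sites: list of (pKa, site_density) pairs."""
    return sum(density * deprotonated_fraction(pKa, pH)
               for pKa, density in sites)
```

    At pH equal to a site's pKa the site is exactly half deprotonated, and at high pH the total approaches the sum of the site densities.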

  18. Research on classified real-time flood forecasting framework based on K-means cluster and rough set.

    PubMed

    Xu, Wei; Peng, Yong

    2015-01-01

    This research presents a new classified real-time flood forecasting framework. In this framework, historical floods are classified by K-means clustering according to the spatial and temporal distribution of precipitation, the time variance of precipitation intensity, and other hydrological factors. Based on the classification results, a rough set is used to extract the identification rules for real-time flood forecasting. Then, the parameters of different categories within the conceptual hydrological model are calibrated using a genetic algorithm. In real-time forecasting, the corresponding category of parameters is selected for flood forecasting according to the obtained flood information. This research tests the new classified framework on Guanyinge Reservoir and compares the framework with the traditional flood forecasting method. It finds that the performance of the new classified framework is significantly better in terms of accuracy. Furthermore, the framework can be applied in catchments with relatively few historical floods. PMID:26442493

  19. Research on classified real-time flood forecasting framework based on K-means cluster and rough set.

    PubMed

    Xu, Wei; Peng, Yong

    2015-01-01

    This research presents a new classified real-time flood forecasting framework. In this framework, historical floods are classified by K-means clustering according to the spatial and temporal distribution of precipitation, the time variance of precipitation intensity, and other hydrological factors. Based on the classification results, a rough set is used to extract the identification rules for real-time flood forecasting. Then, the parameters of different categories within the conceptual hydrological model are calibrated using a genetic algorithm. In real-time forecasting, the corresponding category of parameters is selected for flood forecasting according to the obtained flood information. This research tests the new classified framework on Guanyinge Reservoir and compares the framework with the traditional flood forecasting method. It finds that the performance of the new classified framework is significantly better in terms of accuracy. Furthermore, the framework can be applied in catchments with relatively few historical floods.
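
    The classification step of entries 18-19 can be sketched with plain Lloyd's K-means on a single hydrological feature (stdlib only). The feature choice and values are invented for illustration; the rough-set rule extraction and genetic-algorithm calibration stages are not shown:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Lloyd's algorithm on 1-D feature values, e.g. a flood's peak
    precipitation intensity. Returns (centers, clusters)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # assign each point to its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda j: (p - centers[j]) ** 2)
            clusters[j].append(p)
        # move each center to the mean of its cluster (keep it if empty)
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return centers, clusters
```

    A historical flood would then be assigned to the category whose center is nearest, and that category's calibrated parameter set used for forecasting.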

  20. Parallel Computation of the Topology of Level Sets

    SciTech Connect

    Pascucci, V; Cole-McLaughlin, K

    2004-12-16

    This paper introduces two efficient algorithms that compute the Contour Tree of a 3D scalar field F and its augmented version with the Betti numbers of each isosurface. The Contour Tree is a fundamental data structure in scientific visualization that is used to preprocess the domain mesh to allow optimal computation of isosurfaces with minimal overhead storage. The Contour Tree can also be used to build user interfaces reporting the complete topological characterization of a scalar field, as shown in Figure 1. Data exploration time is reduced since the user understands the evolution of level set components with changing isovalue. The Augmented Contour Tree provides even more accurate information, segmenting the range space of the scalar field into portions of invariant topology. The exploration time for a single isosurface is also improved since its genus is known in advance. Our first new algorithm augments any given Contour Tree with the Betti numbers of all possible corresponding isocontours in time linear in the size of the tree. Moreover, we show how to extend the scheme introduced in [3] with the Betti number computation without increasing its complexity. Thus, we improve the time complexity of our previous approach [10] from O(m log m) to O(n log n + m), where m is the number of cells and n is the number of vertices in the domain of F. Our second contribution is a new divide-and-conquer algorithm that computes the Augmented Contour Tree with improved efficiency. The approach computes the output Contour Tree by merging two intermediate Contour Trees and is independent of the interpolant. In this way we confine any knowledge regarding a specific interpolant to an independent function that computes the tree for a single cell. We have implemented this function for the trilinear interpolant and plan to replace it with higher order interpolants when needed. The time complexity is O(n + t log n), where t is the number of critical points of F. For the first time
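
    The "evolution of level set components with changing isovalue" that the Contour Tree encodes can be illustrated with a union-find sweep. This toy counts connected components of a superlevel set on a mesh given as vertex values plus edges; it is a building block of such algorithms, not the paper's method:

```python
def superlevel_components(values, edges, isovalue):
    """Count connected components of {v : F(v) >= isovalue} using
    union-find with path compression. Sweeping the isovalue from high
    to low traces how level-set components appear and merge."""
    parent = list(range(len(values)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    alive = [v >= isovalue for v in values]
    for a, b in edges:
        if alive[a] and alive[b]:
            parent[find(a)] = find(b)
    return len({find(i) for i in range(len(values)) if alive[i]})
```

    On a path graph with two separated peaks, a high isovalue sees two components, while lowering it past the saddle merges them into one, which is exactly the join structure a Contour Tree records.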

  1. Shared Investment Projects and Forecasting Errors: Setting Framework Conditions for Coordination and Sequencing Data Quality Activities

    PubMed Central

    Leitner, Stephan; Brauneis, Alexander; Rausch, Alexandra

    2015-01-01

    In this paper, we investigate the impact of inaccurate forecasting on the coordination of distributed investment decisions. In particular, by setting up a computational multi-agent model of a stylized firm, we investigate the case of investment opportunities that are mutually carried out by organizational departments. The forecasts of concern pertain to the initial amount of money necessary to launch and operate an investment opportunity, to the expected intertemporal distribution of cash flows, and the departments’ efficiency in operating the investment opportunity at hand. We propose a budget allocation mechanism for coordinating such distributed decisions. The paper provides guidance on how to set framework conditions, in terms of the number of investment opportunities considered in one round of funding and the number of departments operating one investment opportunity, so that the coordination mechanism is highly robust to forecasting errors. Furthermore, we show that—in some setups—a certain extent of misforecasting is desirable from the firm’s point of view as it supports the achievement of the corporate objective of value maximization. We then address the question of how to improve forecasting quality in the best possible way, and provide policy advice on how to sequence activities for improving forecasting quality so that the robustness of the coordination mechanism to errors increases in the best possible way. At the same time, we show that wrong decisions regarding the sequencing can lead to a decrease in robustness. Finally, we conduct a comprehensive sensitivity analysis and prove that—in particular for relatively good forecasters—most of our results are robust to changes in setting the parameters of our multi-agent simulation model. PMID:25803736

  2. Shared investment projects and forecasting errors: setting framework conditions for coordination and sequencing data quality activities.

    PubMed

    Leitner, Stephan; Brauneis, Alexander; Rausch, Alexandra

    2015-01-01

    In this paper, we investigate the impact of inaccurate forecasting on the coordination of distributed investment decisions. In particular, by setting up a computational multi-agent model of a stylized firm, we investigate the case of investment opportunities that are mutually carried out by organizational departments. The forecasts of concern pertain to the initial amount of money necessary to launch and operate an investment opportunity, to the expected intertemporal distribution of cash flows, and the departments' efficiency in operating the investment opportunity at hand. We propose a budget allocation mechanism for coordinating such distributed decisions. The paper provides guidance on how to set framework conditions, in terms of the number of investment opportunities considered in one round of funding and the number of departments operating one investment opportunity, so that the coordination mechanism is highly robust to forecasting errors. Furthermore, we show that, in some setups, a certain extent of misforecasting is desirable from the firm's point of view as it supports the achievement of the corporate objective of value maximization. We then address the question of how to improve forecasting quality in the best possible way, and provide policy advice on how to sequence activities for improving forecasting quality so that the robustness of the coordination mechanism to errors increases in the best possible way. At the same time, we show that wrong decisions regarding the sequencing can lead to a decrease in robustness. Finally, we conduct a comprehensive sensitivity analysis and prove that, in particular for relatively good forecasters, most of our results are robust to changes in setting the parameters of our multi-agent simulation model. PMID:25803736

  3. Shared investment projects and forecasting errors: setting framework conditions for coordination and sequencing data quality activities.

    PubMed

    Leitner, Stephan; Brauneis, Alexander; Rausch, Alexandra

    2015-01-01

    In this paper, we investigate the impact of inaccurate forecasting on the coordination of distributed investment decisions. In particular, by setting up a computational multi-agent model of a stylized firm, we investigate the case of investment opportunities that are mutually carried out by organizational departments. The forecasts of concern pertain to the initial amount of money necessary to launch and operate an investment opportunity, to the expected intertemporal distribution of cash flows, and the departments' efficiency in operating the investment opportunity at hand. We propose a budget allocation mechanism for coordinating such distributed decisions. The paper provides guidance on how to set framework conditions, in terms of the number of investment opportunities considered in one round of funding and the number of departments operating one investment opportunity, so that the coordination mechanism is highly robust to forecasting errors. Furthermore, we show that, in some setups, a certain extent of misforecasting is desirable from the firm's point of view as it supports the achievement of the corporate objective of value maximization. We then address the question of how to improve forecasting quality in the best possible way, and provide policy advice on how to sequence activities for improving forecasting quality so that the robustness of the coordination mechanism to errors increases in the best possible way. At the same time, we show that wrong decisions regarding the sequencing can lead to a decrease in robustness. Finally, we conduct a comprehensive sensitivity analysis and prove that, in particular for relatively good forecasters, most of our results are robust to changes in setting the parameters of our multi-agent simulation model.

  4. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background: Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective: The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. Methods: The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results: The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the

  5. Measuring Afterschool Program Quality Using Setting-Level Observational Approaches

    ERIC Educational Resources Information Center

    Oh, Yoonkyung; Osgood, D. Wayne; Smith, Emilie P.

    2015-01-01

    The importance of afterschool hours for youth development is widely acknowledged, and afterschool settings have recently received increasing attention as an important venue for youth interventions, bringing a growing need for reliable and valid measures of afterschool quality. This study examined the extent to which the two observational tools,…

  6. Measuring afterschool program quality using setting-level observational approaches

    PubMed Central

    Oh, Yoonkyung; Osgood, D. Wayne; Smith, Emilie Phillips

    2016-01-01

    As the importance of afterschool hours for youth development is widely acknowledged, afterschool settings have recently received increasing attention as an important venue for youth interventions. A range of intervention programs have been in place, generally aiming at positive youth development through enhancing the quality of programs. A growing need has thus arisen for reliable and valid measures of afterschool quality. This study examined the extent to which the two observational tools, i.e., Caregiver Interaction Scales (CIS) and Promising Practices Rating Scales (PPRS), could serve as reliable and valid tools for assessing the various dimensions of afterschool setting quality. The study shows the potential promise of the instruments, on the one hand, and suggests future directions for improvement of measurement design and development of the field, on the other hand. In particular, our findings suggest the importance of addressing the effect of day-to-day fluctuations in observed afterschool quality. PMID:26819487

  7. Threshold estimation based on a p-value framework in dose-response and regression settings.

    PubMed

    Mallik, A; Sen, B; Banerjee, M; Michailidis, G

    2011-12-01

    We use p-values to identify the threshold level at which a regression function leaves its baseline value, a problem motivated by applications in toxicological and pharmacological dose-response studies and environmental statistics. We study the problem in two sampling settings: one where multiple responses can be obtained at a number of different covariate levels, and the other the standard regression setting involving limited number of response values at each covariate. Our procedure involves testing the hypothesis that the regression function is at its baseline at each covariate value and then computing the potentially approximate p-value of the test. An estimate of the threshold is obtained by fitting a piecewise constant function with a single jump discontinuity, known as a stump, to these observed p-values, as they behave in markedly different ways on the two sides of the threshold. The estimate is shown to be consistent and its finite sample properties are studied through simulations. Our approach is computationally simple and extends to the estimation of the baseline value of the regression function, heteroscedastic errors and to time series. It is illustrated on some real data applications. PMID:23049132

  8. Threshold estimation based on a p-value framework in dose-response and regression settings

    PubMed Central

    Mallik, A.; Sen, B.; Banerjee, M.; Michailidis, G.

    2011-01-01

    Summary We use p-values to identify the threshold level at which a regression function leaves its baseline value, a problem motivated by applications in toxicological and pharmacological dose-response studies and environmental statistics. We study the problem in two sampling settings: one where multiple responses can be obtained at a number of different covariate levels, and the other the standard regression setting involving limited number of response values at each covariate. Our procedure involves testing the hypothesis that the regression function is at its baseline at each covariate value and then computing the potentially approximate p-value of the test. An estimate of the threshold is obtained by fitting a piecewise constant function with a single jump discontinuity, known as a stump, to these observed p-values, as they behave in markedly different ways on the two sides of the threshold. The estimate is shown to be consistent and its finite sample properties are studied through simulations. Our approach is computationally simple and extends to the estimation of the baseline value of the regression function, heteroscedastic errors and to time series. It is illustrated on some real data applications. PMID:23049132
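
    The stump-fitting step of entries 7-8 has a direct least-squares sketch: try every split point and keep the one minimizing the within-piece squared error of the observed p-values. This illustrates only the fitting idea; the papers' p-value computation and consistency analysis are not reproduced:

```python
def fit_stump(xs, ps):
    """Fit a piecewise-constant function with a single jump (a 'stump')
    to p-values ps observed at increasing covariate levels xs, by
    exhaustive least squares over split points. Returns the estimated
    threshold covariate, i.e. where the regression function is judged
    to leave its baseline."""
    best_x, best_err = xs[0], float("inf")
    for j in range(1, len(xs)):
        left, right = ps[:j], ps[j:]
        ml = sum(left) / len(left)    # mean p-value before the jump
        mr = sum(right) / len(right)  # mean p-value after the jump
        err = (sum((p - ml) ** 2 for p in left)
               + sum((p - mr) ** 2 for p in right))
        if err < best_err:
            best_x, best_err = xs[j], err
    return best_x
```

    Below the threshold the null holds, so p-values scatter around 0.5; above it they collapse toward 0, which is exactly the two-level structure the stump captures.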

  9. Marker ReDistancing/Level Set Method for High-Fidelity Implicit Interface Tracking

    SciTech Connect

    Robert Nourgaliev; Samet Kadioglu; Vincent Mousseau; Dana Knoll

    2010-02-01

    A hybrid of the Front-Tracking (FT) and the Level-Set (LS) methods is introduced, combining advantages and removing drawbacks of both methods. The kinematics of the interface is treated in a Lagrangian (FT) manner, by tracking markers placed at the interface. The markers are not connected – instead, the interface topology is resolved in an Eulerian (LS) framework, by wrapping a signed distance function around Lagrangian markers each time the markers move. For accuracy and efficiency, we have developed a high-order “anchoring” algorithm and an implicit PDE-based re-distancing. We have demonstrated that the method is 3rd-order accurate in space, near the markers, and therefore 1st-order convergent in curvature; in contrast to traditional PDE-based re-initialization algorithms, which tend to slightly relocate the zero Level Set and can be shown to be non-convergent in curvature. The implicit pseudo-time discretization of the re-distancing equation is implemented within the Jacobian-Free Newton Krylov (JFNK) framework combined with ILU(k) preconditioning. We have demonstrated that the steady-state solutions in pseudo-time can be achieved very efficiently, with iterations (CFL ), in contrast to the explicit re-distancing which requires 100s of iterations with CFL . The most cost-effective algorithm is found to be a hybrid of explicit and implicit discretizations, in which we apply first 10-15 iterations with explicit discretization (to bring the initial guess to the ball of convergence for the Newton’s method) and then finishing with 2-3 implicit steps, bringing the re-distancing equation to a complete steady-state. The eigenscopy of the JFNK-ILU(k) demonstrates the efficiency of the ILU(k) preconditioner, which effectively cluster eigenvalues of the otherwise extremely ill-conditioned Jacobian matrices, thereby enabling the Krylov (GMRES) method to converge with iterations, with only a few levels of ILU fill-ins. Importantly, due to the Level Set localization
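
    The "wrapping a signed distance function around Lagrangian markers" step above can be sketched brute-force: magnitude from the nearest marker, sign inherited from the previous level-set field. The real method uses a high-order anchoring algorithm and an implicit PDE-based re-distancing, neither of which is shown here:

```python
import math

def redistance(markers, grid, phi_old):
    """Rebuild a signed-distance function on grid points from interface
    markers (brute-force nearest-marker distance; O(grid * markers)).
    markers: list of (x, y) interface points.
    grid:    list of (x, y) evaluation points.
    phi_old: previous level-set values, used only for the sign."""
    phi = []
    for (x, y), old in zip(grid, phi_old):
        d = min(math.hypot(x - mx, y - my) for mx, my in markers)
        phi.append(math.copysign(d, old))
    return phi
```

    For markers sampled on the unit circle, the rebuilt field is -1 at the origin (inside) and +1 at distance 2 on the axis (outside), as a signed distance should be.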

  10. An Evaluation of the High Level Architecture (HLA) as a Framework for NASA Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reid, Michael R.; Powers, Edward I. (Technical Monitor)

    2000-01-01

    The High Level Architecture (HLA) is a current US Department of Defense and an industry (IEEE-1516) standard architecture for modeling and simulations. It provides a framework and set of functional rules and common interfaces for integrating separate and disparate simulators into a larger simulation. The goal of the HLA is to reduce software costs by facilitating the reuse of simulation components and by providing a runtime infrastructure to manage the simulations. In order to evaluate the applicability of the HLA as a technology for NASA space mission simulations, a Simulations Group at Goddard Space Flight Center (GSFC) conducted a study of the HLA and developed a simple prototype HLA-compliant space mission simulator. This paper summarizes the prototyping effort and discusses the potential usefulness of the HLA in the design and planning of future NASA space missions with a focus on risk mitigation and cost reduction.

  11. Joint Infrared Target Recognition and Segmentation Using a Shape Manifold-Aware Level Set

    PubMed Central

    Yu, Liangjiang; Fan, Guoliang; Gong, Jiulu; Havlicek, Joseph P.

    2015-01-01

    We propose new techniques for joint recognition, segmentation and pose estimation of infrared (IR) targets. The problem is formulated in a probabilistic level set framework where a shape constrained generative model is used to provide a multi-class and multi-view shape prior and where the shape model involves a couplet of view and identity manifolds (CVIM). A level set energy function is then iteratively optimized under the shape constraints provided by the CVIM. Since both the view and identity variables are expressed explicitly in the objective function, this approach naturally accomplishes recognition, segmentation and pose estimation as joint products of the optimization process. For realistic target chips, we solve the resulting multi-modal optimization problem by adopting a particle swarm optimization (PSO) algorithm and then improve the computational efficiency by implementing a gradient-boosted PSO (GB-PSO). Evaluation was performed using the Military Sensing Information Analysis Center (SENSIAC) ATR database, and experimental results show that both of the PSO algorithms reduce the cost of shape matching during CVIM-based shape inference. Particularly, GB-PSO outperforms other recent ATR algorithms, which require intensive shape matching, either explicitly (with pre-segmentation) or implicitly (without pre-segmentation). PMID:25938202
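
    The PSO driver in entry 11 is a generic global optimizer; here is a minimal 1-D version of the standard inertia + cognitive/social velocity update. The coefficients are conventional textbook values, and the quadratic cost in the usage example is a stand-in for the paper's CVIM shape-matching energy:

```python
import random

def pso(f, lo, hi, n_particles=20, iters=60, seed=1):
    """Minimal particle swarm optimization of a 1-D cost f on [lo, hi].
    Velocity update: inertia w plus pulls toward each particle's best
    (cognitive, c1) and the swarm's best (social, c2)."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]
    gbest = min(xs, key=f)
    w, c1, c2 = 0.7, 1.5, 1.5
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vs[i] = (w * vs[i] + c1 * r1 * (pbest[i] - xs[i])
                     + c2 * r2 * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))  # clamp to bounds
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
            if f(xs[i]) < f(gbest):
                gbest = xs[i]
    return gbest
```

    Gradient-boosted variants such as the paper's GB-PSO add a local gradient step to this update; only the plain swarm is sketched here.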

  12. Variational level-set segmentation and tracking of left ventricle using field prior

    NASA Astrophysics Data System (ADS)

    Afshin, Mariam; Ben Ayed, Ismail; Islam, Ali; Ross, Ian; Peters, Terry; Li, Shuo

    2011-03-01

    This study investigates a novel method of tracking the Left Ventricle (LV) boundary in Magnetic Resonance (MR) sequences. The method focuses on energy minimization by level-set curve evolution. The level-set framework allows prior knowledge of the field to be introduced into the solution. The segmentation at each time point relies not only on the current image but also on the segmented image from the previous phase. The field prior is defined based on the experimental fact that the mean logarithm of intensity inside the endo- and epicardium is approximately constant during a cardiac cycle. The solution is obtained by evolving two curves following the Euler-Lagrange minimization of a functional containing a field constraint. The functional measures the consistency of the field prior over a cardiac sequence. Our preliminary results show that the obtained segmentations correlate very well with those manually obtained by experts. Furthermore, we observed that the proposed field prior speeds up curve evolution significantly and reduces the computational load.

  13. The adoption of the Reference Framework for diabetes care among primary care physicians in primary care settings: A cross-sectional study.

    PubMed

    Wong, Martin C S; Wang, Harry H X; Kwan, Mandy W M; Chan, Wai Man; Fan, Carmen K M; Liang, Miaoyin; Li, Shannon Ts; Fung, Franklin D H; Yeung, Ming Sze; Chan, David K L; Griffiths, Sian M

    2016-08-01

The prevalence of diabetes mellitus has been increasing both globally and locally. Primary care physicians (PCPs) are in a privileged position to provide first contact and continuing care for diabetic patients. A territory-wide Reference Framework for Diabetes Care for Adults was released by the Hong Kong Primary Care Office in 2010, with the aim of further enhancing evidence-based, high-quality diabetes care in the primary care setting through wide adoption of the Reference Framework. A validated questionnaire survey was conducted among PCPs to evaluate the levels of, and the factors associated with, their adoption of the Reference Framework. A total of 414 completed surveys were received, a response rate of 13.0%. The average adoption score was 3.29 (SD 0.51) out of 4. Approximately 70% of PCPs highly adopted the Reference Framework in their routine practice. Binary logistic regression analysis showed that PCPs' perceptions of the inclusion of sufficient local information (adjusted odds ratio [aOR] = 4.748, 95%CI 1.597-14.115, P = 0.005) and of the reduction of PCPs' professional autonomy (aOR = 1.859, 95%CI 1.013-3.411, P = 0.045) were more likely to influence their level of adoption of the Reference Framework for diabetes care in daily practice. The overall level of guideline adoption was found to be relatively high among PCPs for adult diabetes in primary care settings. The adoption barriers identified in this study should be addressed in the continuous updating of the Reference Framework. Strategies need to be considered to enhance the guideline adoption and implementation capacity.

  14. Setting Student Performance Standards: The Role of Achievement Level Descriptions in the Standard Setting Process.

    ERIC Educational Resources Information Center

    Bourque, Mary Lyn

    This paper looks at using descriptions of subject matter content to assist in the development and interpretation of student performance on the National Assessment of Educational Progress (NAEP). These descriptions of content, called achievement level descriptions (ALDs), were initially conceptualized as exemplary statements of the knowledge and…

  15. A novel framework for assessing metadata quality in epidemiological and public health research settings

    PubMed Central

    McMahon, Christiana; Denaxas, Spiros

    2016-01-01

    Metadata are critical in epidemiological and public health research. However, a lack of biomedical metadata quality frameworks and limited awareness of the implications of poor quality metadata renders data analyses problematic. In this study, we created and evaluated a novel framework to assess metadata quality of epidemiological and public health research datasets. We performed a literature review and surveyed stakeholders to enhance our understanding of biomedical metadata quality assessment. The review identified 11 studies and nine quality dimensions; none of which were specifically aimed at biomedical metadata. 96 individuals completed the survey; of those who submitted data, most only assessed metadata quality sometimes, and eight did not at all. Our framework has four sections: a) general information; b) tools and technologies; c) usability; and d) management and curation. We evaluated the framework using three test cases and sought expert feedback. The framework can assess biomedical metadata quality systematically and robustly. PMID:27570670

  16. Automatic segmentation of the lungs using robust level sets.

    PubMed

    Silveira, Margarida; Nascimento, Jacinto; Marques, Jorge

    2007-01-01

This paper presents a method for the automatic segmentation of the lungs in X-ray computed tomography (CT) images. The proposed technique is based on a robust geometric active contour that is initialized around the lungs, automatically splits in two, and performs outlier rejection during curve evolution. The technique starts with grey-level thresholding of the images, followed by edge detection. The edge-connected points are then organized into strokes and classified as valid or invalid. A confidence degree (weight) is assigned to each stroke and updated during the evolution process, with valid strokes receiving a high confidence degree and the confidence degrees of outlier strokes tending to zero. These weights depend on the distance between the stroke points and the curve, and also on the stroke size. Initialization of the curve is fully automatic. Experimental results show the effectiveness of the proposed technique.
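The stroke-confidence idea above can be sketched as a simple weighting rule: confidence decays with a stroke's distance from the evolving curve and grows with its size. The exponential form and all parameter values below are hypothetical, chosen only to illustrate outlier down-weighting, not the paper's exact update.

```python
import math

def stroke_weights(dists, sizes, sigma=5.0):
    """Normalised confidence per stroke: large, nearby strokes dominate,
    while far-away (outlier) strokes get weights tending to zero."""
    raw = [s * math.exp(-d / sigma) for d, s in zip(dists, sizes)]
    total = sum(raw)
    return [r / total for r in raw]

# Two strokes near the curve and one distant outlier.
weights = stroke_weights(dists=[1.0, 2.0, 40.0], sizes=[30, 25, 10])
```

In the paper these weights would be recomputed at every evolution step as stroke-to-curve distances change.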

  17. High-level waste tank farm set point document

    SciTech Connect

    Anthony, J.A. III

    1995-01-15

Setpoints for nuclear safety-related instrumentation are required for actions determined by the design authorization basis. Minimum requirements must be established to ensure that setpoints are set and held within specified limits. This document establishes the controlling methodology for changing setpoints of all classifications. The instrumentation under consideration involves the transfer, storage, and volume reduction of radioactive liquid waste in the F- and H-Area High-Level Radioactive Waste Tank Farms. The setpoint document will encompass the PROCESS AREAS listed in the Safety Analysis Report (SAR) (DPSTSA-200-10 Sup 18), which include the diversion box HDB-8 facility. In addition to the PROCESS AREAS listed in the SAR, Building 299-H and the Effluent Transfer Facility (ETF) are also included in the scope.

  18. Telemedicine: what framework, what levels of proof, implementation rules.

    PubMed

    Zannad, Faiez; Maugendre, Philippe; Audry, Antoine; Avril, Carole; Blaise, Lucile; Blin, Olivier; Burnel, Philippe; Falise-Mirat, Béatrice; Girault, Danièle; Giri, Isabelle; Goehrs, Jean-Marie; Lassale, Catherine; Le Meur, Roland; Leurent, Pierre; Ratignier-Carbonneil, Christelle; Rossignol, Patrick; Satonnet, Evelyne; Simon, Pierre; Treluyer, Laurent

    2014-01-01

The concept of telemedicine was formalised in France in the 2009 "Hospital, patients, health territories" (loi hôpital, patients, santé, territoire) law and the 2010 decree through which it was applied. Many experiments have been carried out and the regulatory institutions (Ministry, Regional Health Agency [Agence régionale de santé, ARS], French National Health Authority [Haute autorité de santé, HAS], etc.) have issued various guidance statements and recommendations on its organisation and on the expectations of its evaluation. With this background, the round table wanted to produce recommendations on different areas of medical telemonitoring (the role of telemonitoring, the regulatory system, the principles for assessment, methods of use and conditions for sustained and seamless deployment). Whilst many studies carried out on new medical telemonitoring approaches have led to the postulate that it offers benefit, both clinically and in terms of patient quality of life, more information is needed to demonstrate its impact on the organisation of healthcare and the associated medico-economic benefit (criteria, methods, resources). Similarly, contractual frameworks for deployment of telemonitoring do exist, although they are complicated and involve many different stakeholders (Director General of the Care Offering [Direction générale de l'offre de soins, DGOS], ARS, HAS, Agency for Shared Health Information Systems [Agence des systèmes d'information partagés de santé, ASIP], French National Data Protection Commission [Commission nationale informatique et libertés, CNIL], French National Medical Council [Conseil national de l'Ordre des médecins, CNOM], etc.) that would benefit from a shared approach and seamless exchange between the partners involved. The current challenge is also to define the conditions required to validate a stable economic model in order to promote organisational change. One topical issue is placing the emphasis on its evaluation and

  20. Investigating the Experience of Outdoor and Adventurous Project Work in an Educational Setting Using a Self-Determination Framework

    ERIC Educational Resources Information Center

    Sproule, John; Martindale, Russell; Wang, John; Allison, Peter; Nash, Christine; Gray, Shirley

    2013-01-01

    The purpose of this study was to carry out a preliminary investigation to explore the use of outdoor and adventurous project work (PW) within an educational setting. Specifically, differences between the PW and normal academic school experiences were examined using a self-determination theory framework integrated with a goal orientation and…

  1. A Conceptual Framework for Educational Design at Modular Level to Promote Transfer of Learning

    ERIC Educational Resources Information Center

    Botma, Yvonne; Van Rensburg, G. H.; Coetzee, I. M.; Heyns, T.

    2015-01-01

    Students bridge the theory-practice gap when they apply in practice what they have learned in class. A conceptual framework was developed that can serve as foundation to design for learning transfer at modular level. The framework is based on an adopted and adapted systemic model of transfer of learning, existing learning theories, constructive…

  2. Level set segmentation of the heart from 4D phase contrast MRI

    NASA Astrophysics Data System (ADS)

    Kainmuller, Dagmar; Unterhinninghofen, Roland; Ley, Sebastian; Dillmann, Rüdiger

    2008-03-01

Blood flow properties in the heart can be examined non-invasively by means of Phase Contrast MRI (PC MRI), an imaging technique that provides not only morphology images but also velocity information. We present a novel feature combination for level set segmentation of the heart's cavities in multidirectional 4D PC MRI data. The challenge in performing the segmentation task successfully in this context is first and foremost the poor image quality compared to classical MRI. As generally in heart segmentation, the intra- and inter-subject variability of the heart has to be coped with as well. The central idea of our approach is to integrate a set of essentially differing sources of information into the segmentation process to make it capable of handling qualitatively poor and highly varying data. To the best of our knowledge, our system is the first to concurrently incorporate a flow measure as well as a priori shape knowledge into a level set framework in addition to the commonly used edge and curvature information. The flow measure is derived from PC MRI velocity data. As shape knowledge we use a 3D shape of the respective cavity. We validated our system design by a series of qualitative performance tests. The combined use of shape knowledge and a flow measure increases segmentation quality compared to results obtained by using only one of those features. A first clinical study was performed on two 4D datasets, from which we segmented the left ventricle and atrium. The segmentation results were examined by an expert and judged suitable for use in clinical practice.

  3. Toppled television sets and head injuries in the pediatric population: a framework for prevention.

    PubMed

    Cusimano, Michael D; Parker, Nadine

    2016-01-01

    Injuries to children caused by falling televisions have become more frequent during the last decade. These injuries can be severe and even fatal and are likely to become even more common in the future as TVs increase in size and become more affordable. To formulate guidelines for the prevention of these injuries, the authors systematically reviewed the literature on injuries related to toppling televisions. The authors searched MEDLINE, PubMed, Embase, Scopus, CINAHL (Cumulative Index to Nursing and Allied Health Literature), Cochrane Library, and Google Scholar according to the Cochrane guidelines for all studies involving children 0-18 years of age who were injured by toppled TVs. Factors contributing to injury were categorized using Haddon's Matrix, and the public health approach was used as a framework for developing strategies to prevent these injuries. The vast majority (84%) of the injuries occurred in homes and more than three-fourths were unwitnessed by adult caregivers. The TVs were most commonly large and elevated off the ground. Dressers and other furniture not designed to support TVs were commonly involved in the TV-toppling incident. The case fatality rate varies widely, but almost all deaths reported (96%) were due to brain injuries. Toddlers between the ages of 1 and 3 years most frequently suffer injuries to the head and neck, and they are most likely to suffer severe injuries. Many of these injuries require brain imaging and neurosurgical intervention. Prevention of these injuries will require changes in TV design and legislation as well as increases in public education and awareness. Television-toppling injuries can be easily prevented; however, the rates of injury do not reflect a sufficient level of awareness, nor do they reflect an acceptable effort from an injury prevention perspective. PMID:26416669

  5. Bushmeat genetics: setting up a reference framework for the DNA typing of African forest bushmeat.

    PubMed

    Gaubert, Philippe; Njiokou, Flobert; Olayemi, Ayodeji; Pagani, Paolo; Dufour, Sylvain; Danquah, Emmanuel; Nutsuakor, Mac Elikem K; Ngua, Gabriel; Missoup, Alain-Didier; Tedesco, Pablo A; Dernat, Rémy; Antunes, Agostinho

    2015-05-01

The bushmeat trade in tropical Africa represents illegal, unsustainable off-takes of millions of tons of wild game - mostly mammals - per year. We sequenced four mitochondrial gene fragments (cyt b, COI, 12S, 16S) in >300 bushmeat items representing nine mammalian orders and 59 morphological species from five western and central African countries (Guinea, Ghana, Nigeria, Cameroon and Equatorial Guinea). Our objectives were to assess the efficiency of cross-species PCR amplification and to evaluate the usefulness of our multilocus approach for reliable bushmeat species identification. We provide a straightforward amplification protocol using a single 'universal' primer pair per gene that generally yielded >90% PCR success rates across orders and was robust to different types of meat preprocessing and DNA extraction protocols. For taxonomic identification, we set up a decision pipeline combining similarity- and tree-based approaches with an assessment of taxonomic expertise and coverage of the GenBank database. Our multilocus approach permitted us to: (i) adjust for existing taxonomic gaps in GenBank databases, (ii) assign to the species level 67% of the morphological species hypotheses and (iii) successfully identify samples with uncertain taxonomic attribution (preprocessed carcasses and cryptic lineages). High levels of genetic polymorphism across genes and taxa, together with the excellent resolution observed among species-level clusters (neighbour-joining trees and Klee diagrams), advocate the usefulness of our markers for bushmeat DNA typing. We formalize our DNA typing decision pipeline through an expert-curated query database - DNA BUSHMEAT - that shall permit the automated identification of African forest bushmeat items.

  6. Cooperative fuzzy games approach to setting target levels of ECs in quality function deployment.

    PubMed

    Yang, Zhihui; Chen, Yizeng; Yin, Yunqiang

    2014-01-01

Quality function deployment (QFD) provides a means of translating customer requirements (CRs) into engineering characteristics (ECs) at each stage of product development and production. The main objective of QFD-based product planning is to determine the target levels of the ECs for a new product or service. QFD is a breakthrough tool that can effectively reduce the gap between CRs and a new product/service. Even though there are conflicts among some ECs, the objective of developing a new product is to maximize overall customer satisfaction, so there may be room for cooperation among ECs. A cooperative game framework combined with fuzzy set theory is developed to determine the target levels of the ECs in QFD. The key to developing the model is the formulation of the bargaining function. In the proposed methodology, the players are viewed as the membership functions of the ECs in formulating the bargaining function. The solution of the proposed model is Pareto-optimal. An illustrative example demonstrates the application and performance of the proposed approach.
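The bargaining idea can be sketched with two hypothetical ECs sharing a single normalised design budget: each EC's membership function maps its share to a satisfaction degree in [0, 1], and a Nash-style product of memberships is maximised. The linear memberships and grid search below are illustrative assumptions, not the paper's formulation.

```python
# Two hypothetical engineering characteristics share one design budget x in [0, 1].

def mu_ec1(x):
    """Satisfaction of EC1 grows with its budget share (illustrative)."""
    return x

def mu_ec2(x):
    """EC2 receives the remaining budget, so its satisfaction falls with x."""
    return 1.0 - x

def bargain(step=0.001):
    """Grid-search the split maximising the Nash product of memberships."""
    grid = [i * step for i in range(int(1 / step) + 1)]
    return max(grid, key=lambda x: mu_ec1(x) * mu_ec2(x))

split = bargain()   # budget share assigned to EC1
```

With these symmetric memberships the bargaining solution is the even split, and the maximiser of the product is Pareto-optimal by construction: any move that raises one membership lowers the other.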

  9. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    SciTech Connect

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-09-30

We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy, experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters and pore descriptors. On average, PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. The DFT-predicted elastic properties such as minimum shear modulus and Young's modulus can differ by an average of 3 and 9 GPa, respectively, for rigid MOFs such as those in the test set. Moreover, we found that the partial charges calculated by vdW-DF2 deviate the most from the other functionals, while there is no significant difference between the partial charges calculated by M06L, PBE, PW91, PBE-D2 and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.

  10. A global health delivery framework approach to epilepsy care in resource-limited settings.

    PubMed

    Cochran, Maggie F; Berkowitz, Aaron L

    2015-11-15

    The Global Health Delivery (GHD) framework (Farmer, Kim, and Porter, Lancet 2013;382:1060-69) allows for the analysis of health care delivery systems along four axes: a care delivery value chain that incorporates prevention, diagnosis, and treatment of a medical condition; shared delivery infrastructure that integrates care within existing healthcare delivery systems; alignment of care delivery with local context; and generation of economic growth and social development through the health care delivery system. Here, we apply the GHD framework to epilepsy care in rural regions of low- and middle-income countries (LMIC) where there are few or no neurologists.

  11. Novel multimodality segmentation using level sets and Jensen-Rényi divergence

    SciTech Connect

    Markel, Daniel; Zaidi, Habib; El Naqa, Issam

    2013-12-15

Purpose: Positron emission tomography (PET) is playing an increasing role in radiotherapy treatment planning. However, despite progress, robust algorithms for PET and multimodal image segmentation are still lacking, especially if the algorithm were extended to image-guided and adaptive radiotherapy (IGART). This work presents a novel multimodality segmentation algorithm using the Jensen-Rényi divergence (JRD) to evolve the geometric level set contour. The algorithm offers improved noise tolerance which is particularly applicable to segmentation of regions found in PET and cone-beam computed tomography. Methods: A steepest gradient ascent optimization method is used in conjunction with the JRD and a level set active contour to iteratively evolve a contour to partition an image based on statistical divergence of the intensity histograms. The algorithm is evaluated using PET scans of pharyngolaryngeal squamous cell carcinoma with the corresponding histological reference. The multimodality extension of the algorithm is evaluated using 22 PET/CT scans of patients with lung carcinoma and a physical phantom scanned under varying image quality conditions. Results: The average concordance index (CI) of the JRD segmentation of the PET images was 0.56 with an average classification error of 65%. The segmentation of the lung carcinoma images had a maximum diameter relative error of 63%, 19.5%, and 14.8% when using CT, PET, and combined PET/CT images, respectively. The estimated maximal diameters of the gross tumor volume (GTV) showed a high correlation with the macroscopically determined maximal diameters, with R² values of 0.85 and 0.88 using the PET and PET/CT images, respectively. Results from the physical phantom show that the JRD is more robust to image noise compared to mutual information and region growing. Conclusions: The JRD has shown improved noise tolerance compared to mutual information for the purpose of PET image segmentation. Presented is a flexible
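For reference, the Jensen-Rényi divergence between discrete intensity histograms can be sketched directly from its definition: the Rényi entropy of the weighted mixture minus the weighted sum of the individual entropies. Order α = 0.5 is used here because Rényi entropy is concave there, making the divergence non-negative; the paper's choice of α and its coupling to the level-set contour are not reproduced.

```python
import math

def renyi_entropy(p, alpha=0.5):
    """Rényi entropy of a discrete distribution (alpha != 1)."""
    return math.log(sum(x ** alpha for x in p)) / (1.0 - alpha)

def jensen_renyi(hists, weights, alpha=0.5):
    """H_alpha(weighted mixture) minus weighted sum of H_alpha(histogram)."""
    mix = [sum(w * h[i] for w, h in zip(weights, hists))
           for i in range(len(hists[0]))]
    return renyi_entropy(mix, alpha) - sum(
        w * renyi_entropy(h, alpha) for w, h in zip(weights, hists))

# Identical histograms give zero divergence; dissimilar ones a positive value.
same = jensen_renyi([[0.5, 0.5], [0.5, 0.5]], [0.5, 0.5])
diff = jensen_renyi([[0.9, 0.1], [0.1, 0.9]], [0.5, 0.5])
```

In the segmentation setting, the two histograms would come from the pixels inside and outside the current contour, and the contour would be evolved to increase their divergence.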

  12. Multi-scale texture-based level-set segmentation of breast B-mode images.

    PubMed

    Lang, Itai; Sklair-Levy, Miri; Spitzer, Hedva

    2016-05-01

Automatic segmentation of ultrasonographic breast lesions is very challenging, due to the lesions' spiculated nature and the variance in shape and texture of the B-mode ultrasound images. Many studies have tried to answer this challenge by applying a variety of computational methods including: Markov random field, artificial neural networks, and active contours and level-set techniques. These studies focused on creating an automatic contour, with maximal resemblance to a manual contour, delineated by a trained radiologist. In this study, we have developed an algorithm, designed to capture the spiculated boundary of the lesion by using the properties from the corresponding ultrasonic image. This is primarily achieved through a unique multi-scale texture identifier (inspired by visual system models) integrated in a level-set framework. The algorithm's performance has been evaluated quantitatively via contour-based and region-based error metrics. We compared the algorithm-generated contour to a manual contour delineated by an expert radiologist. In addition, we suggest here a new method for performance evaluation where corrections made by the radiologist replace the algorithm-generated (original) result in the correction zones. The resulting corrected contour is then compared to the original version. The evaluation showed: (1) Mean absolute error of 0.5 pixels between the original and the corrected contour; (2) Overlapping area of 99.2% between the lesion regions, obtained by the algorithm and the corrected contour. These results are significantly better than those previously reported. In addition, we have examined the potential of our segmentation results to contribute to the discrimination between malignant and benign lesions. PMID:27010737

  13. An explanatory framework of teachers' perceptions of a positive mealtime environment in a preschool setting.

    PubMed

    Mita, Satoko C; Gray, Samuel A; Goodell, L Suzanne

    2015-07-01

Attending a preschool center may support preschoolers' growth and development and encourage a healthy lifestyle, including sound eating behaviors. Providing a positive mealtime environment (PME) may be one of the keys to fostering a child's healthy eating habits in the classroom. However, a specific definition of a PME, the components of a PME, or directions on how to create one have not been established. The purpose of this study, therefore, was to explore Head Start teachers' perceptions related to a PME and create a conceptual framework representing these perceptions. To achieve this purpose, researchers conducted 65 in-depth phone interviews with Head Start teachers around the US. Applying principles of grounded theory, researchers developed a conceptual framework depicting teachers' perceptions of a PME, consisting of five key components: (1) the people (i.e., teachers, kitchen staff, parent volunteers, and children), (2) positive emotional tone (e.g., relaxed and happy), (3) rules, expectations, and routines (e.g., family-style mealtime), (4) operations of a PME (i.e., eating, socialization, and learning), and (5) both short- and long-term outcomes of a PME. With this PME framework, researchers may be able to enhance the effectiveness of nutrition interventions related to a PME, focusing on the factors in the conceptual framework as well as barriers associated with achieving these factors.

  14. Validation of the Visitor and Resident Framework in an E-Book Setting

    ERIC Educational Resources Information Center

    Engelsmann, Hazel C.; Greifeneder, Elke; Lauridsen, Nikoline D.; Nielsen, Anja G.

    2014-01-01

    Introduction: By applying the visitor and resident framework on e-book usage, the article explores whether the concepts of a resident and a visitor can help to explain e-book use, and can help to gain a better insight into users' motivations for e-book use. Method: A questionnaire and semi-structured interviews were conducted with users of…

  16. A new kind of fuzzy n-ary hypergroups in the framework of soft set theory.

    PubMed

    Li, Hongjie; Yin, Yunqiang

    2014-01-01

    Maji et al. introduced the concept of fuzzy soft sets as a generalization of the standard soft sets and presented an application of fuzzy soft sets in a decision making problem. The aim of this paper is to apply the concept of fuzzy soft sets to n-ary hypergroup theory. The concepts of (∈(γ), ∈(γ) ∨ q(δ))-fuzzy soft (invertible) n-ary subhypergroups over a commutative n-ary hypergroup are introduced and some related properties and characterizations are obtained. The homomorphism properties of (∈(γ), ∈(γ) ∨ q(δ))-fuzzy soft (invertible) n-ary subhypergroups are also derived.

  17. Conflict and HIV: A framework for risk assessment to prevent HIV in conflict-affected settings in Africa

    PubMed Central

    Mock, Nancy B; Duale, Sambe; Brown, Lisanne F; Mathys, Ellen; O'Maonaigh, Heather C; Abul-Husn, Nina KL; Elliott, Sterling

    2004-01-01

    In sub-Saharan Africa, HIV/AIDS and violent conflict interact to shape population health and development in dramatic ways. HIV/AIDS can create conditions conducive to conflict. Conflict can affect the epidemiology of HIV/AIDS. Conflict is generally understood to accelerate HIV transmission, but this view is simplistic and disregards complex interrelationships between factors that can inhibit and accelerate the spread of HIV in conflict and post-conflict settings. This paper provides a framework for understanding these factors and discusses their implications for policy formulation and program planning in conflict-affected settings. PMID:15679919

  18. The Tutelkan SPI Framework for Small Settings: A Methodology Transfer Vehicle

    NASA Astrophysics Data System (ADS)

    Valdes, Gonzalo; Astudillo, Hernán; Visconti, Marcello; López, Claudia

    Software organizations aim to improve their processes to increase their productivity, competitiveness and performance. Although numerous standards and models have been proposed, their adoption among small organizations is hard due to size mismatches and a lack of experienced process engineers, which forces these organizations to hire expensive external consultants. This article describes the Tutelkan SPI Framework, which proposes a three-fold approach to this problem: (1) providing a library of reusable process assets, (2) offering composition tools to describe small organizations' processes using these assets, and (3) systematically training consultants focused on small organizations in the use of this library and toolset. The framework has been successfully piloted with several small Chilean companies, and the library and tools are open and freely available.

  19. Conceptual Framework and Levels of Abstraction for a Complex Large-Scale System

    SciTech Connect

    Simpson, Mary J.

    2005-03-23

    A conceptual framework and levels of abstraction are created to apply across all potential threats. Bioterrorism is used as a complex example to describe the general framework. Bioterrorism is unlimited with respect to the use of a specific agent, mode of dissemination, and potential target. Because the threat is open-ended, there is a strong need for a common, systemic understanding of attack scenarios related to bioterrorism. In recognition of this large-scale complex problem, systems are being created to define, design and use the proper level of abstraction and conceptual framework in bioterrorism. The wide variety of biological agents and delivery mechanisms provide an opportunity for dynamic scale changes by the linking or interlinking of existing threat components. Concurrent impacts must be separated and evaluated in terms of a given environment and/or ‘abstraction framework.’

  20. Alternative Frameworks of the Secondary School Students on the Concept of Condensation at Submicroscopic Level

    ERIC Educational Resources Information Center

    Abdullah, Nurdiana; Surif, Johari; Ismail, Syuhaida

    2016-01-01

    The study was carried out to identify the alternative frameworks on the concept of condensation at submicroscopic level among secondary school students (N = 324). Data was collected by using the qualitative method through the Understanding Test on the Concept of Matter at Submicroscopic Level which consisted of 10 open-ended questions. The…

  1. Model-driven, probabilistic level set based segmentation of magnetic resonance images of the brain.

    PubMed

    Verma, Nishant; Muralidhar, Gautam S; Bovik, Alan C; Cowperthwaite, Matthew C; Markey, Mia K

    2011-01-01

    Accurate segmentation of magnetic resonance (MR) images of the brain to differentiate features such as soft tissue, tumor, edema and necrosis is critical for both diagnosis and treatment purposes. Region-based formulations of geometric active contour models are popular choices for segmentation of MR and other medical images. Most of the traditional region-based formulations model local region intensity by assuming a piecewise constant approximation. However, the piecewise constant approximation rarely holds true for medical images such as MR images due to the presence of noise and bias field, which invariably results in a poor segmentation of the image. To overcome this problem, we have developed a probabilistic region-based active contour model for automatic segmentation of MR images of the brain. In our approach, a mixture of Gaussian distributions is used to accurately model the arbitrarily shaped local region intensity distribution. Prior spatial information derived from probabilistic atlases is also integrated into the level set evolution framework for guiding the segmentation process. Our experiments with a series of publicly available brain MR images show that the proposed active contour model gives stable and accurate segmentation results when compared to the traditional region based formulations. PMID:22254928
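    The mixture-of-Gaussians intensity model described above can be illustrated with a minimal 1-D EM fit. This is a generic NumPy sketch, not the authors' implementation; the function name `fit_gmm_1d` and the two synthetic intensity populations are illustrative assumptions:

```python
import numpy as np

def fit_gmm_1d(x, k=2, iters=50):
    """EM fit of a k-component 1-D Gaussian mixture to intensities x.

    Sketch of the abstract's idea: model an arbitrarily shaped local
    region intensity distribution as a mixture of Gaussians instead of
    a single piecewise-constant value.
    """
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))  # spread initial means
    var = np.full(k, x.var() + 1e-6)
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        d = (x[:, None] - mu) ** 2
        p = w * np.exp(-0.5 * d / var) / np.sqrt(2.0 * np.pi * var)
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        n = r.sum(axis=0)
        w = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n + 1e-6
    return w, mu, var

# Synthetic example: two intensity populations (e.g. tissue vs. lesion)
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(40, 5, 500), rng.normal(120, 8, 500)])
w, mu, var = fit_gmm_1d(x)
```

    In the full model, the fitted component densities would drive the region terms of the level set evolution rather than piecewise-constant means.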

  2. The Agenda Setting Function of the Mass Media at Three Levels of "Information Holding"

    ERIC Educational Resources Information Center

    Benton, Marc; Frazier, P. Jean

    1976-01-01

    Extends the theoretical concept of agenda setting to include awareness of general issues, awareness of proposed solutions, and specific knowledge about the proposals. Examines whether or not agenda setting is operative at these levels and compares findings with previous agenda setting studies. (MH)

  3. Intervention complexity--a conceptual framework to inform priority-setting in health.

    PubMed Central

    Gericke, Christian A.; Kurowski, Christoph; Ranson, M. Kent; Mills, Anne

    2005-01-01

    Health interventions vary substantially in the degree of effort required to implement them. To some extent this is apparent in their financial cost, but the nature and availability of non-financial resources is often of similar importance. In particular, human resource requirements are frequently a major constraint. We propose a conceptual framework for the analysis of interventions according to their degree of technical complexity; this complements the notion of institutional capacity in considering the feasibility of implementing an intervention. Interventions are categorized into four dimensions: characteristics of the basic intervention; characteristics of delivery; requirements on government capacity; and usage characteristics. The analysis of intervention complexity should lead to a better understanding of supply- and demand-side constraints to scaling up, indicate priorities for further research and development, and can point to potential areas for improvement of specific aspects of each intervention to close the gap between the complexity of an intervention and the capacity to implement it. The framework is illustrated using the examples of scaling up condom social marketing programmes, and the DOTS strategy for tuberculosis control in highly resource-constrained countries. The framework could be used as a tool for policy-makers, planners and programme managers when considering the expansion of existing projects or the introduction of new interventions. Intervention complexity thus complements the considerations of burden of disease, cost-effectiveness, affordability and political feasibility in health policy decision-making. Reducing the technical complexity of interventions will be crucial to meeting the health-related Millennium Development Goals. PMID:15868020

  4. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    DOE PAGES

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-09-30

    We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy, experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters and pore descriptors. On average PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. The DFT-predicted elastic properties such as minimum shear modulus and Young's modulus can differ by an average of 3 and 9 GPa, respectively, for rigid MOFs such as those in the test set. Moreover, we find that the partial charges calculated by vdW-DF2 deviate the most from those of the other functionals, while there is no significant difference among the partial charges calculated by M06L, PBE, PW91, PBE-D2 and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.

  5. The reliability and validity of the comfort level method of setting hearing aid gain.

    PubMed

    Walden, B E; Schuchman, G I; Sedge, R K

    1977-11-01

    The comfort level method (Carhart, 1946) probably is the most widely used procedure for setting the acoustic gain of hearing aids. A series of experiments were conducted to determine the test-retest reliability of the comfort level method and the relationship between the comfort settings established in a clinical test suite and the comfort settings utilized in more realistic daily listening situations. Adults with bilateral sensorineural hearing impairments were subjects. The results suggest that the comfort level method has good test-retest reliability for most clinical purposes. Further, clinically established comfort settings may accurately represent typical daily-use settings if the input level used to establish the comfort settings in the clinical environment is 70 dB SPL.

  6. A level set approach for left ventricle detection in CT images using shape segmentation and optical flow

    NASA Astrophysics Data System (ADS)

    Brieva, Jorge; Moya-Albor, Ernesto; Escalante-Ramírez, Boris

    2015-01-01

    The left ventricle (LV) segmentation plays an important role in a subsequent process for the functional analysis of the LV. Typical segmentation of the endocardium wall in the ventricle excludes papillary muscles which leads to an incorrect measure of the ejected volume in the LV. In this paper we present a new variational strategy using a 2D level set framework that includes a local term for enhancing the low contrast structures and a 2D shape model. The shape model in the level set method is propagated to all image sequences corresponding to the cardiac cycles through the optical flow approach using the Hermite transform. To evaluate our strategy we use the Dice index and the Hausdorff distance to compare the segmentation results with the manual segmentation carried out by the physician.
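    The two evaluation measures named above have simple closed forms; the following NumPy sketch computes the Dice index and the symmetric Hausdorff distance for a pair of toy square masks (illustrative data, not the paper's):

```python
import numpy as np

def dice_index(a, b):
    """Dice overlap 2|A∩B| / (|A| + |B|) between two boolean masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def hausdorff(pts_a, pts_b):
    """Symmetric Hausdorff distance between two (N, 2) point sets
    (brute force; fine for small masks)."""
    d = np.sqrt(((pts_a[:, None, :] - pts_b[None, :, :]) ** 2).sum(-1))
    return max(d.min(axis=1).max(), d.min(axis=0).max())

# Toy example: two overlapping 30x30 square "segmentations"
a = np.zeros((50, 50), bool); a[10:40, 10:40] = True
b = np.zeros((50, 50), bool); b[12:42, 10:40] = True
dice = dice_index(a, b)   # 2*840/1800 = 0.933...
haus = hausdorff(np.argwhere(a).astype(float),
                 np.argwhere(b).astype(float))  # masks differ by 2 rows
```

    Dice rewards region overlap while the Hausdorff distance penalizes the single worst boundary disagreement, which is why the two are often reported together.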

  7. Education leadership in the clinical health care setting: a framework for nursing education development.

    PubMed

    Mockett, Lynda; Horsfall, Janine; O'Callaghan, Wendy

    2006-12-01

    This paper describes how a new framework for clinical nursing education was introduced at Counties Manukau District Health Board (CMDHB), New Zealand. The project was initiated in response to the significant legislative and post registration nursing education changes within New Zealand. The journey of change has been a significant undertaking, and has required clear management, strong leadership, perseverance and understanding of the organisation's culture. The approach taken to managing the change had four stages, and reflects various change management models. The first stage, the identification process, identified the impetus for change. Creating the vision is the second stage and identified what the change would look like within the organisation. To ensure success and to guide the process of change a realistic and sustainable vision was developed. Implementing the vision was the third stage, and discusses the communication and pilot phase of implementing the nursing education framework. Stage four, embedding the vision, explores the process and experiences of changing an education culture and embedding the vision into an organisation. The paper concludes by discussing the importance of implementing robust, consistent, strategic and collaborative processes that reflect and evaluate best educational nursing practice.

  8. A Framework for Testing and Promoting Expanded Dissemination of Promising Preventive Interventions that are Being Implemented in Community Settings

    PubMed Central

    Mason, W. Alex; Fleming, Charles B.; Thompson, Ronald W.; Haggerty, Kevin P.; Snyder, James J.

    2013-01-01

    Many evidence-based preventive interventions have been developed in recent years but few are widely used. With the current focus on efficacy trials, widespread dissemination and implementation of evidence-based interventions are often afterthoughts. One potential strategy for reversing this trend is to find a promising program with a strong delivery vehicle in place and improve and test the program’s efficacy through rigorous evaluation. If the program is supported by evidence, the dissemination vehicle is already in place and potentially can be expanded. This strategy has been used infrequently and has met with limited success to date, in part, because the field lacks a framework for guiding such research. To address this gap, we outline a framework for moving promising preventive interventions that are currently being implemented in community settings through a process of rigorous testing and, if needed, program modification in order to promote expanded dissemination. The framework is guided by RE-AIM (Reach, Efficacy/Effectiveness, Adoption, Implementation, and Maintenance; Glasgow, Vogt, & Boles, 1999), which focuses attention on external as well as internal validity in program tests, and is illustrated with examples. Challenges, such as responding to negative and null results, and opportunities inherent in the framework are discussed. PMID:23807241

  9. Commentary: A Response to Reckase's Conceptual Framework and Examples for Evaluating Standard Setting Methods

    ERIC Educational Resources Information Center

    Schulz, E. Matthew

    2006-01-01

    A look at real data shows that Reckase's psychometric theory for standard setting is not applicable to bookmark and that his simulations cannot explain actual differences between methods. It is suggested that exclusively test-centered, criterion-referenced approaches are too idealized and that a psychophysics paradigm and a theory of group…

  10. Design of the control set in the framework of variational data assimilation

    NASA Astrophysics Data System (ADS)

    Gejadze, I. Yu.; Malaterre, P.-O.

    2016-11-01

    Solving data assimilation problems under uncertainty in basic model parameters and in source terms may require a careful design of the control set. The task is to avoid such combinations of the control variables which may either lead to ill-posedness of the control problem formulation or compromise the robustness of the solution procedure. We suggest a method for quantifying the performance of a control set which is formed as a subset of the full set of uncertainty-bearing model inputs. Based on this quantity one can decide if the chosen 'safe' control set is sufficient in terms of the prediction accuracy. Technically, the method presents a certain generalization of the 'variational' uncertainty quantification method for observed systems. It is implemented as a matrix-free method, thus allowing high-dimensional applications. Moreover, if Automatic Differentiation is utilized for computing the tangent linear and adjoint mappings, then it can be applied to any multi-input 'black-box' system. As an application example we consider the full Saint-Venant hydraulic network model SIC2, which describes the flow dynamics in river and canal networks. The developed methodology seems useful in the context of the future SWOT satellite mission, which will provide observations of river systems whose properties are known with limited precision.

  11. Intellectual Curiosity in Action: A Framework to Assess First-Year Seminars in Liberal Arts Settings

    ERIC Educational Resources Information Center

    Kolb, Kenneth H.; Longest, Kyle C.; Barnett, Jenna C.

    2014-01-01

    Fostering students' intellectual curiosity is a common goal of first-year seminar programs--especially in liberal arts settings. The authors propose an alternative method to assess this ambiguous, value-laden concept. Relying on data gathered from pre- and posttest in-depth interviews of 34 students enrolled in first-year seminars, they…

  12. Alternative Dispute Resolution (ADR): A Different Framework for Conflict Resolution in Educational Settings.

    ERIC Educational Resources Information Center

    Turan, Selahattin; Taylor, Charles

    This paper briefly introduces alternative dispute resolution (ADR) processes and their fundamental principles. The paper provides a review of the literature on ADR and discusses its applicability in educational settings. The concept of conflict is explained, along with analysis of the limitations of traditional conflict resolution processes. The…

  13. Holocene sea level variations on the basis of integration of independent data sets

    SciTech Connect

    Sahagian, D.; Berkman, P. . Dept. of Geological Sciences and Byrd Polar Research Center)

    1992-01-01

    Variations in sea level through earth history have occurred at a wide variety of time scales. Sea level researchers have attacked the problem of measuring these sea level changes through a variety of approaches, each relevant only to the time scale in question, and usually only to the specific locality from which a specific type of data are derived. There is a plethora of different data types that can be, and have been, used locally for the measurement of Holocene sea level variations. The problem of merging different data sets for the purpose of constructing a global eustatic sea level curve for the Holocene has not previously been adequately addressed. The authors direct their efforts to that end. Numerous studies have been published regarding Holocene sea level changes. These have involved exposed fossil reef elevations, elevations of tidal deltas, depths of intertidal peat deposits, caves, tree rings, ice cores, moraines, eolian dune ridges, marine-cut terrace elevations, marine carbonate species, tide gauges, and lake level variations. Each of these data sets is based on a particular set of assumptions, and is valid for a specific set of environments. In order to obtain the most accurate possible sea level curve for the Holocene, these data sets must be merged so that local and other influences can be filtered out of each data set. Since each data set involves very different measurements, each is scaled in order to define the sensitivity of the proxy measurement parameter to sea level, including error bounds. This effectively determines the temporal and spatial resolution of each data set. The level of independence of the data sets is also quantified, in order to rule out the possibility of a common non-eustatic factor affecting more than one variety of data. The Holocene sea level curve is considered to be independent of other factors affecting the proxy data, and is taken to represent the relation between global ocean water and basin volumes.
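    One standard way to combine independently scaled proxy records that carry error bounds, in the spirit of the merging described above, is inverse-variance weighting on a common time grid. The sketch below uses hypothetical data and is a generic illustration, not the cited authors' method:

```python
import numpy as np

def merge_records(t_grid, records):
    """Inverse-variance weighted merge of proxy sea-level records.

    Each record is (t, value, sigma): observation times, sea-level
    estimates, and 1-sigma error bounds. Records are interpolated to
    t_grid and combined with weights 1/sigma**2, so precise proxies
    dominate; the merged 1-sigma error is 1/sqrt(sum of weights).
    """
    est = np.zeros_like(t_grid, dtype=float)
    wsum = np.zeros_like(t_grid, dtype=float)
    for t, v, s in records:
        vi = np.interp(t_grid, t, v)   # proxy value on the common grid
        si = np.interp(t_grid, t, s)   # its error bound on the grid
        w = 1.0 / si ** 2
        est += w * vi
        wsum += w
    return est / wsum, 1.0 / np.sqrt(wsum)

# Hypothetical example: two proxies of the same trend, unequal precision
t = np.linspace(0.0, 10.0, 11)                    # kyr before present
rec1 = (t, 0.5 * t, np.full_like(t, 0.2))         # precise proxy
rec2 = (t, 0.5 * t + 0.3, np.full_like(t, 1.0))   # offset, noisy proxy
curve, err = merge_records(t, [rec1, rec2])
```

    With weights of 25 and 1, the merged curve sits 0.3/26 above the precise proxy, illustrating how the weighting suppresses, but does not erase, a biased record.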

  14. Motivation and engagement in music and sport: testing a multidimensional framework in diverse performance settings.

    PubMed

    Martin, Andrew J

    2008-02-01

    The present study assessed the application of a multidimensional model of motivation and engagement (the Motivation and Engagement Wheel) and its accompanying instrumentation (the Motivation and Engagement Scale) to the music and sport domains. Participants were 463 young classical musicians (N=224) and sportspeople (N=239). In both music and sport samples, the data confirmed the good fit of the four hypothesized higher-order dimensions and their 11 first-order dimensions: adaptive cognitions (self-efficacy, valuing, mastery orientation), adaptive behaviors (planning, task management, persistence), impeding/maladaptive cognitions (uncertain control, anxiety, failure avoidance), and maladaptive behaviors (self-handicapping, disengagement). Multigroup tests of factor invariance showed that, in terms of underlying motivational constructs and the composition of and relationships among these constructs, key subsamples are not substantially different. Moreover, and of particular relevance to the generalizability of the framework, the factor structure for the music and sport samples was predominantly invariant.

  15. Disseminating hypnosis to health care settings: Applying the RE-AIM framework

    PubMed Central

    Yeh, Vivian M.; Schnur, Julie B.; Montgomery, Guy H.

    2014-01-01

    Hypnosis is a brief intervention ready for wider dissemination in medical contexts. Overall, hypnosis remains underused despite evidence supporting its beneficial clinical impact. This review will evaluate the evidence supporting hypnosis for dissemination using guidelines formulated by Glasgow and colleagues (1999). Five dissemination dimensions will be considered: Reach, Efficacy, Adoption, Implementation, and Maintenance (RE-AIM). Reach: In medical settings, hypnosis is capable of helping a diverse range of individuals with a wide variety of problems. Efficacy: There is evidence supporting the use of hypnosis for chronic pain, acute pain and emotional distress arising from medical procedures and conditions, cancer treatment-related side-effects and irritable bowel syndrome. Adoption: Although hypnosis is currently not a part of mainstream clinical practices, evidence suggests that patients and healthcare providers are open to trying hypnosis, and may become more so when educated about what hypnosis can do. Implementation: Hypnosis is a brief intervention capable of being administered effectively by healthcare providers. Maintenance: Given the low resource needs of hypnosis, opportunities for reimbursement, and the ability of the intervention to potentially help medical settings reduce costs, the intervention has the qualities necessary to be integrated into routine care in a self-sustaining way in medical settings. In sum, hypnosis is a promising candidate for further dissemination. PMID:25267941

  16. Improving adolescent health policy: incorporating a framework for assessing state-level policies.

    PubMed

    Brindis, Claire D; Moore, Kristin

    2014-01-01

    Many US policies that affect health are made at the state, not the federal, level. Identifying state-level policies and data to analyze how different policies affect outcomes may help policy makers ascertain the usefulness of their public policies and funding decisions in improving the health of adolescent populations. A framework for describing and assessing the role of federal and state policies on adolescent health and well-being is proposed; an example of how the framework might be applied to the issue of teen childbearing is included. Such a framework can also help inform analyses of whether and how state and federal policies contribute to the variation across states in meeting adolescent health needs. A database on state policies, contextual variables, and health outcomes data can further enable researchers and policy makers to examine how these factors are associated with behaviors they aim to impact.

  17. Level set based vertebra segmentation for the evaluation of Ankylosing Spondylitis

    NASA Astrophysics Data System (ADS)

    Tan, Sovira; Yao, Jianhua; Ward, Michael M.; Yao, Lawrence; Summers, Ronald M.

    2006-03-01

    Ankylosing Spondylitis is a disease of the vertebra where abnormal bone structures (syndesmophytes) grow at intervertebral disk spaces. Because this growth is so slow as to be undetectable on plain radiographs taken over years, it is necessary to resort to computerized techniques to complement qualitative human judgment with precise quantitative measures on 3-D CT images. Very fine segmentation of the vertebral body is required to capture the small structures caused by the pathology. We propose a segmentation algorithm based on a cascade of three level set stages and requiring no training or prior knowledge. First, the noise inside the vertebral body that often blocks the proper evolution of level set surfaces is attenuated by a sigmoid function whose parameters are determined automatically. The 1st level set (geodesic active contour) is designed to roughly segment the interior of the vertebra despite often highly inhomogeneous and even discontinuous boundaries. The result is used as an initial contour for the 2nd level set (Laplacian level set) that closely captures the inner boundary of the cortical bone. The last level set (reversed Laplacian level set) segments the outer boundary of the cortical bone and also corrects small flaws of the previous stage. We carried out extensive tests on 30 vertebrae (5 from each of 6 patients). Two medical experts scored the results at intervertebral disk spaces focusing on end plates and syndesmophytes. Only two minor segmentation errors at vertebral end plates were reported and two syndesmophytes were considered slightly under-segmented.
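    The sigmoid preprocessing step in the pipeline above can be sketched as follows. The automatic parameter rule here (median as the window centre, a fraction of the intensity spread as the width) is an assumption for illustration only, since the abstract does not give the authors' estimation rule:

```python
import numpy as np

def sigmoid_attenuate(img, alpha=None, beta=None):
    """Map intensities through a sigmoid to suppress noise inside a
    region before level set evolution.

    beta centres the transition, alpha controls its width; both default
    to simple image statistics (hypothetical choices, not the paper's).
    """
    beta = float(np.median(img)) if beta is None else beta
    alpha = float(img.std() / 4.0 + 1e-9) if alpha is None else alpha
    return 1.0 / (1.0 + np.exp(-(img - beta) / alpha))

img = np.array([[10.0, 12.0, 11.0],
                [50.0, 52.0, 49.0],
                [90.0, 91.0, 88.0]])
out = sigmoid_attenuate(img)   # monotone mapping of img into (0, 1)
```

    Compressing intensities this way flattens small fluctuations far from the transition band, which is what keeps noise inside the vertebral body from blocking the evolving surface.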

  18. A combined watershed and level set method for segmentation of brightfield cell images

    NASA Astrophysics Data System (ADS)

    Tse, Shutong; Bradbury, Laura; Wan, Justin W. L.; Djambazian, Haig; Sladek, Robert; Hudson, Thomas

    2009-02-01

    Segmentation of brightfield cell images from microscopy is challenging in several ways. The contrast between cells and the background is low. Cells are usually surrounded by "halo", an optical artifact common in brightfield images. Also, cell divisions occur frequently, which raises the issue of topological change to segmentation. In this paper, we present a robust segmentation method based on the watershed and level set methods. Instead of heuristically locating where the initial markers for watershed should be, we apply a multiphase level set marker extraction to determine regions inside a cell. In contrast with the standard level set segmentation where only one level set function is used, we apply multiple level set functions (usually 3) to capture the different intensity levels in a cell image. This is particularly important for distinguishing regions of similar but different intensity levels in low contrast images. All the pixels obtained will be used as initial markers for watershed. The region growing process of watershed will capture the rest of the cell until it hits the halo, which serves as a "wall" to stop the expansion. By using this relatively large number of points as markers together with watershed, we show that the low contrast cell boundary can be captured correctly. Furthermore, we present a technique for watershed and level set to detect cell division automatically with no special human attention. Finally, we present segmentation results of C2C12 cells in brightfield images to illustrate the effectiveness of our method.
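    As a simplified stand-in for the multiphase marker extraction described above, intensities can be split into a few classes at quantile thresholds. This NumPy sketch (synthetic three-level image, not the authors' code) shows how pixels of similar but different intensity fall into distinct marker classes:

```python
import numpy as np

def extract_markers(img, n_levels=3):
    """Partition an image into n_levels intensity classes at quantile
    thresholds, yielding per-pixel labels 0..n_levels-1 that could seed
    a marker-based watershed."""
    qs = np.quantile(img, np.linspace(0.0, 1.0, n_levels + 1)[1:-1])
    return np.digitize(img, qs)

# Synthetic low-contrast image with three intensity populations
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(m, 1.0, 100) for m in (10, 20, 30)])
img = img.reshape(10, 30)
labels = extract_markers(img, 3)   # classes 0, 1, 2
```

    In the actual method the classes come from evolving multiple level set functions, which adds spatial regularity that a pure intensity threshold lacks.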

  19. Level set discrete element method for three-dimensional computations with triaxial case study

    NASA Astrophysics Data System (ADS)

    Kawamoto, Reid; Andò, Edward; Viggiani, Gioacchino; Andrade, José E.

    2016-06-01

    In this paper, we outline the level set discrete element method (LS-DEM) which is a discrete element method variant able to simulate systems of particles with arbitrary shape using level set functions as a geometric basis. This unique formulation allows seamless interfacing with level set-based characterization methods as well as computational ease in contact calculations. We then apply LS-DEM to simulate two virtual triaxial specimens generated from XRCT images of experiments and demonstrate LS-DEM's ability to quantitatively capture and predict stress-strain and volume-strain behavior observed in the experiments.
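    The contact calculation that a level set basis makes easy can be sketched for discs: evaluate one particle's signed distance function at the other particle's boundary nodes, and negative values indicate penetration. Function names and the two-particle setup below are illustrative only, not LS-DEM itself:

```python
import numpy as np

def circle_sdf(center, radius):
    """Signed distance level set of a disc (negative inside)."""
    c = np.asarray(center, dtype=float)
    return lambda p: float(np.linalg.norm(np.asarray(p, dtype=float) - c)) - radius

def contact(sdf, boundary_pts):
    """LS-DEM style contact test: evaluate one particle's level set at
    the other's boundary nodes; phi < 0 means interpenetration."""
    phi = np.array([sdf(p) for p in boundary_pts])
    return bool(phi.min() < 0.0), max(-phi.min(), 0.0)

# Particle A as a level set; particle B as boundary nodes of a nearby disc
phi_a = circle_sdf((0.0, 0.0), 1.0)
theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
b_pts = np.c_[1.8 + 0.9 * np.cos(theta), 0.9 * np.sin(theta)]
hit, depth = contact(phi_a, b_pts)   # discs overlap by ~0.1 on the x-axis
```

    For arbitrary particle shapes the signed distance field would be tabulated on a grid (e.g. from XRCT images) and interpolated, but the contact test stays this simple, which is the computational ease the abstract refers to.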

  20. Implementing the New State Framework for History-Social Studies of (Tenth Grade Level).

    ERIC Educational Resources Information Center

    Leavey, Don

    1990-01-01

    Describes experience of implementing new California History Social Science Framework at the tenth grade level at Edison High School, Huntington Beach, California. Discusses the anxieties felt by teachers as they omitted areas of world history to teach selected topics in greater depth. Presents the world history course structure that was developed…

  1. Using a Framework for Three Levels of Sense Making in a Mathematics Classroom

    ERIC Educational Resources Information Center

    Moss, Diana L.; Lamberg, Teruni

    2016-01-01

    This discussion-based lesson is designed to support Year 6 students in their initial understanding of using letters to represent numbers, expressions, and equations in algebra. The three level framework is designed for: (1) making thinking explicit, (2) exploring each other's solutions, and (3) developing new mathematical insights. In each level…

  2. Ice cover, landscape setting, and geological framework of Lake Vostok, East Antarctica

    USGS Publications Warehouse

    Studinger, M.; Bell, R.E.; Karner, G.D.; Tikku, A.A.; Holt, J.W.; Morse, D.L.; David, L.; Richter, T.G.; Kempf, S.D.; Peters, M.E.; Blankenship, D.D.; Sweeney, R.E.; Rystrom, V.L.

    2003-01-01

    Lake Vostok, located beneath more than 4 km of ice in the middle of East Antarctica, is a unique subglacial habitat and may contain microorganisms with distinct adaptations to such an extreme environment. Melting and freezing at the base of the ice sheet, which slowly flows across the lake, controls the flux of water, biota and sediment particles through the lake. The influx of thermal energy, however, is limited to contributions from below. Thus the geological origin of Lake Vostok is a critical boundary condition for the subglacial ecosystem. We present the first comprehensive maps of ice surface, ice thickness and subglacial topography around Lake Vostok. The ice flow across the lake and the landscape setting are closely linked to the geological origin of Lake Vostok. Our data show that Lake Vostok is located along a major geological boundary. Magnetic and gravity data are distinct east and west of the lake, as is the roughness of the subglacial topography. The physiographic setting of the lake has important consequences for the ice flow and thus the melting and freezing pattern and the lake's circulation. Lake Vostok is a tectonically controlled subglacial lake. The tectonic processes provided the space for a unique habitat and recent minor tectonic activity could have the potential to introduce small, but significant amounts of thermal energy into the lake. © 2002 Elsevier Science B.V. All rights reserved.

  3. A Conceptual Framework for Organizational Readiness to Implement Nutrition and Physical Activity Programs in Early Childhood Education Settings

    PubMed Central

    Upadhyaya, Mudita; Schober, Daniel J.; Byrd-Williams, Courtney

    2014-01-01

    Across multiple sectors, organizational readiness predicts the success of program implementation. However, the factors influencing readiness of early childhood education (ECE) organizations for implementation of new nutrition and physical activity programs are poorly understood. This study presents a new conceptual framework to measure organizational readiness to implement nutrition and physical activity programs in ECE centers serving children aged 0 to 5 years. The framework was validated for consensus on relevance and generalizability by conducting focus groups; the participants were managers (16 directors and 2 assistant directors) of ECE centers. The framework theorizes that it is necessary to have “collective readiness,” which takes into account such factors as resources, organizational operations, work culture, and the collective attitudes, motivation, beliefs, and intentions of ECE staff. Results of the focus groups demonstrated consensus on the relevance of proposed constructs across ECE settings. Including readiness measures during program planning and evaluation could inform implementation of ECE programs targeting nutrition and physical activity behaviors. PMID:25357258

  4. Basin-scale runoff prediction: An Ensemble Kalman Filter framework based on global hydrometeorological data sets

    NASA Astrophysics Data System (ADS)

    Lorenz, Christof; Tourian, Mohammad J.; Devaraju, Balaji; Sneeuw, Nico; Kunstmann, Harald

    2015-10-01

    In order to cope with the steady decline in the number of in situ gauges worldwide, there is a growing need for alternative methods to estimate runoff. We present an Ensemble Kalman Filter based approach that allows us to estimate runoff for poorly or irregularly gauged basins. The approach focuses on the application of publicly available global hydrometeorological data sets for precipitation (GPCC, GPCP, CRU, UDEL), evapotranspiration (MODIS, FLUXNET, GLEAM, ERA interim, GLDAS), and water storage changes (GRACE, WGHM, GLDAS, MERRA LAND). Furthermore, runoff data from the GRDC and satellite altimetry derived estimates are used. We follow a least squares prediction that exploits the joint temporal and spatial auto- and cross-covariance structures of precipitation, evapotranspiration, water storage changes and runoff. We further consider time-dependent uncertainty estimates derived from all data sets. Our in-depth analysis comprises 29 large river basins of different climate regions, with which runoff is predicted for a subset of 16 basins. Six configurations are analyzed: the Ensemble Kalman Filter (Smoother) and the hard (soft) Constrained Ensemble Kalman Filter (Smoother). Comparing the predictions to observed monthly runoff shows correlations larger than 0.5, percentage biases lower than ± 20%, and NSE-values larger than 0.5. A modified NSE-metric, stressing the difference to the mean annual cycle, shows an improvement of runoff predictions for 14 of the 16 basins. The proposed method is able to provide runoff estimates for nearly 100 poorly gauged basins covering an area of more than 11,500,000 km2 with a freshwater discharge, in volume, of more than 125,000 m3/s.
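    The Ensemble Kalman Filter analysis step at the heart of such a framework can be sketched for a single scalar runoff state. This is a toy illustration with hypothetical names, not the authors' constrained filter/smoother variants: each forecast member is nudged toward a perturbed observation with gain K = P/(P + R), where P is the ensemble variance and R the observation error variance.

```python
import random

def enkf_update(ensemble, obs, obs_var, rng):
    """Scalar EnKF analysis step with perturbed observations:
    returns the analysis ensemble given a forecast ensemble,
    an observation, and its error variance."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    P = sum((x - mean) ** 2 for x in ensemble) / (n - 1)  # forecast spread
    K = P / (P + obs_var)                                 # Kalman gain
    return [x + K * (obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]
```

    With a forecast ensemble far below the observation, the analysis mean moves most of the way toward the observation because the forecast spread dominates the observation error.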

  5. Basin-scale runoff prediction: An Ensemble Kalman Filter framework based on global hydrometeorological data sets

    NASA Astrophysics Data System (ADS)

    Kunstmann, Harald; Lorenz, Christof; Tourian, Mohammad; Devaraju, Balaji; Sneeuw, Nico

    2016-04-01

    In order to cope with the steady decline in the number of in situ gauges worldwide, there is a growing need for alternative methods to estimate runoff. We present an Ensemble Kalman Filter based approach that allows us to estimate runoff for poorly or irregularly gauged basins. The approach focuses on the application of publicly available global hydrometeorological data sets for precipitation (GPCC, GPCP, CRU, UDEL), evapotranspiration (MODIS, FLUXNET, GLEAM, ERA interim, GLDAS), and water storage changes (GRACE, WGHM, GLDAS, MERRA LAND). Furthermore, runoff data from the GRDC and satellite altimetry derived estimates are used. We follow a least squares prediction that exploits the joint temporal and spatial auto- and cross-covariance structures of precipitation, evapotranspiration, water storage changes and runoff. We further consider time-dependent uncertainty estimates derived from all data sets. Our in-depth analysis comprises 29 large river basins of different climate regions, with which runoff is predicted for a subset of 16 basins. Six configurations are analyzed: the Ensemble Kalman Filter (Smoother) and the hard (soft) Constrained Ensemble Kalman Filter (Smoother). Comparing the predictions to observed monthly runoff shows correlations larger than 0.5, percentage biases lower than ± 20%, and NSE-values larger than 0.5. A modified NSE-metric, stressing the difference to the mean annual cycle, shows an improvement of runoff predictions for 14 of the 16 basins. The proposed method is able to provide runoff estimates for nearly 100 poorly gauged basins covering an area of more than 11,500,000 km2 with a freshwater discharge, in volume, of more than 125,000 m3/s.

  6. The Gabor-Based Tensor Level Set Method for Multiregional Image Segmentation

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Gao, Xinbo; Tao, Dacheng; Li, Xuelong; Li, Jie

    This paper presents a new level set method for multiregional image segmentation. It employs the Gabor filter bank to extract local geometrical features and builds a pixel tensor representation whose dimensionality is reduced using offline tensor analysis. Multiphase level set functions are then evolved in the tensor field to detect the boundaries in the corresponding image. The proposed method has three main advantages. Firstly, by employing the Gabor filter bank, the model is more robust against salt-and-pepper noise. Secondly, the pixel tensor representation comprehensively depicts the information of pixels, which results in better performance on non-homogeneous image segmentation. Thirdly, the model provides a uniform equation for multiphase level set functions, making it more practical. We apply the proposed method to synthetic and medical images respectively, and the results indicate that it is superior to the typical region-based level set method.
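    The first stage, per-pixel responses of a Gabor filter bank, can be sketched as follows. This is a minimal illustration of the feature-extraction idea only (the real part of the Gabor kernel, a small bank of orientations), not the paper's tensor construction; names and parameter values are illustrative.

```python
import math

def gabor_kernel(size, theta, lam, sigma):
    """Real part of a Gabor kernel at orientation theta: a Gaussian
    envelope times a cosine wave of wavelength lam along the rotated
    x-axis."""
    half = size // 2
    k = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            row.append(math.exp(-(xr * xr + yr * yr) / (2 * sigma * sigma))
                       * math.cos(2 * math.pi * xr / lam))
        k.append(row)
    return k

def pixel_features(img, y, x, bank):
    """Per-pixel feature vector: the response of every kernel in the
    bank at (y, x), the raw material of a tensor representation."""
    half = len(bank[0]) // 2
    feats = []
    for k in bank:
        s = 0.0
        for j in range(-half, half + 1):
            for i in range(-half, half + 1):
                s += k[j + half][i + half] * img[y + j][x + i]
        feats.append(s)
    return feats
```

    On an image of vertical stripes, the kernel oriented across the stripes responds much more strongly than the one oriented along them.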

  7. Learning A Superpixel-Driven Speed Function for Level Set Tracking.

    PubMed

    Zhou, Xue; Li, Xi; Hu, Weiming

    2016-07-01

    A key problem in level set tracking is to construct a discriminative speed function for effective contour evolution. In this paper, we propose a level set tracking method based on a discriminative speed function, which produces a superpixel-driven force for effective level set evolution. Based on kernel density estimation and metric learning, the speed function is capable of effectively encoding the discriminative information on object appearance within a feasible metric space. Furthermore, we introduce adaptive object shape modeling into the level set evolution process, which improves tracking robustness in complex scenarios. To ensure the efficiency of adaptive object shape modeling, we develop a simple but efficient weighted non-negative matrix factorization method that learns an object shape dictionary online. Experimental results on a number of challenging video sequences demonstrate the effectiveness and robustness of the proposed tracking method. PMID:26292353

  8. The Reliability and Validity of the Comfort Level Method of Setting Hearing Aid Gain

    ERIC Educational Resources Information Center

    Walden, Brian E.; And Others

    1977-01-01

    Investigated in a series of experiments with 40 adults (20- to 70-years-old) having bilateral sensorineural hearing impairments was the test-retest reliability of the comfort level method for setting the acoustic gain of hearing aids, and the relationship between the comfort settings utilized in more realistic daily listening situations.…

  9. An accurate conservative level set/ghost fluid method for simulating turbulent atomization

    SciTech Connect

    Desjardins, Olivier Moureau, Vincent; Pitsch, Heinz

    2008-09-10

    This paper presents a novel methodology for simulating incompressible two-phase flows by combining an improved version of the conservative level set technique introduced in [E. Olsson, G. Kreiss, A conservative level set method for two phase flow, J. Comput. Phys. 210 (2005) 225-246] with a ghost fluid approach. By employing a hyperbolic tangent level set function that is transported and re-initialized using fully conservative numerical schemes, mass conservation issues that are known to affect level set methods are greatly reduced. In order to improve the accuracy of the conservative level set method, high order numerical schemes are used. The overall robustness of the numerical approach is increased by computing the interface normals from a signed distance function reconstructed from the hyperbolic tangent level set by a fast marching method. The convergence of the curvature calculation is ensured by using a least squares reconstruction. The ghost fluid technique provides a way of handling the interfacial forces and large density jumps associated with two-phase flows with good accuracy, while avoiding artificial spreading of the interface. Since the proposed approach relies on partial differential equations, its implementation is straightforward in all coordinate systems, and it benefits from high parallel efficiency. The robustness and efficiency of the approach are further improved by using implicit schemes for the interface transport and re-initialization equations, as well as for the momentum solver. The performance of the method is assessed through both classical level set transport tests and simple two-phase flow examples including topology changes. It is then applied to simulate turbulent atomization of a liquid Diesel jet at Re=3000. The conservation errors associated with the accurate conservative level set technique are shown to remain small even for this complex case.
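    The hyperbolic tangent profile underlying the conservative level set, and its relation to the signed distance used for normals, can be written down directly. A common form is psi = 0.5*(tanh(d/(2*eps)) + 1), bounded in (0, 1) with the interface at psi = 0.5; the sketch below (illustrative names, a simple analytic inverse rather than the paper's fast marching reconstruction) shows the forward and inverse maps.

```python
import math

def to_conservative(d, eps):
    """Map a signed distance d to the hyperbolic tangent profile
    psi = 0.5*(tanh(d/(2*eps)) + 1); eps sets the interface
    thickness, and psi = 0.5 marks the interface."""
    return 0.5 * (math.tanh(d / (2.0 * eps)) + 1.0)

def to_signed_distance(psi, eps):
    """Analytic inverse of the profile, d = eps*ln(psi/(1-psi)),
    recovering a signed distance (useful e.g. for well-behaved
    interface normals)."""
    return eps * math.log(psi / (1.0 - psi))
```

    The interface condition psi(0) = 0.5 and the round trip d -> psi -> d hold exactly for this pair of maps.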

  10. A discontinuous Galerkin conservative level set scheme for interface capturing in multiphase flows

    NASA Astrophysics Data System (ADS)

    Owkes, Mark; Desjardins, Olivier

    2013-09-01

    The accurate conservative level set (ACLS) method of Desjardins et al. [O. Desjardins, V. Moureau, H. Pitsch, An accurate conservative level set/ghost fluid method for simulating turbulent atomization, J. Comput. Phys. 227 (18) (2008) 8395-8416] is extended by using a discontinuous Galerkin (DG) discretization. DG allows for the scheme to have an arbitrarily high order of accuracy with the smallest possible computational stencil resulting in an accurate method with good parallel scaling. This work includes a DG implementation of the level set transport equation, which moves the level set with the flow field velocity, and a DG implementation of the reinitialization equation, which is used to maintain the shape of the level set profile to promote good mass conservation. A near second order converging interface curvature is obtained by following a height function methodology (common amongst volume of fluid schemes) in the context of the conservative level set. Various numerical experiments are conducted to test the properties of the method and show excellent results, even on coarse meshes. The tests include Zalesak’s disk, two-dimensional deformation of a circle, time evolution of a standing wave, and a study of the Kelvin-Helmholtz instability. Finally, this novel methodology is employed to simulate the break-up of a turbulent liquid jet.

  11. A discontinuous Galerkin conservative level set scheme for interface capturing in multiphase flows

    SciTech Connect

    Owkes, Mark Desjardins, Olivier

    2013-09-15

    The accurate conservative level set (ACLS) method of Desjardins et al. [O. Desjardins, V. Moureau, H. Pitsch, An accurate conservative level set/ghost fluid method for simulating turbulent atomization, J. Comput. Phys. 227 (18) (2008) 8395–8416] is extended by using a discontinuous Galerkin (DG) discretization. DG allows for the scheme to have an arbitrarily high order of accuracy with the smallest possible computational stencil resulting in an accurate method with good parallel scaling. This work includes a DG implementation of the level set transport equation, which moves the level set with the flow field velocity, and a DG implementation of the reinitialization equation, which is used to maintain the shape of the level set profile to promote good mass conservation. A near second order converging interface curvature is obtained by following a height function methodology (common amongst volume of fluid schemes) in the context of the conservative level set. Various numerical experiments are conducted to test the properties of the method and show excellent results, even on coarse meshes. The tests include Zalesak’s disk, two-dimensional deformation of a circle, time evolution of a standing wave, and a study of the Kelvin–Helmholtz instability. Finally, this novel methodology is employed to simulate the break-up of a turbulent liquid jet.

  12. Locally constrained active contour: a region-based level set for ovarian cancer metastasis segmentation

    NASA Astrophysics Data System (ADS)

    Liu, Jianfei; Yao, Jianhua; Wang, Shijun; Linguraru, Marius George; Summers, Ronald M.

    2014-03-01

    Accurate segmentation of ovarian cancer metastases is clinically useful to evaluate tumor growth and determine follow-up treatment. We present a region-based level set algorithm with localization constraints to segment ovarian cancer metastases. Our approach is established on a representative region-based level set, the Chan-Vese model, in which an active contour is driven by region competition. To reduce over-segmentation, we constrain the level set propagation within a narrow image band by embedding a dynamic localization function. The metastasis intensity prior is also estimated from image regions within the level set initialization. The localization function and intensity prior force the level set to stop at the desired metastasis boundaries. Our approach was validated on 19 ovarian cancer metastases with radiologist-labeled ground-truth on contrast-enhanced CT scans from 15 patients. The comparison between our algorithm and geodesic active contour indicated that the volume overlap was 75 ± 10% vs. 56 ± 6%, the Dice coefficient was 83 ± 8% vs. 63 ± 8%, and the average surface distance was 2.2 ± 0.6 mm vs. 4.4 ± 0.9 mm. Experimental results demonstrated that our algorithm outperformed traditional level set algorithms.
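    The Chan-Vese region competition that drives such a contour, restricted to a narrow band, can be sketched with a toy image. This is a minimal illustration under assumed names, not the authors' algorithm: for each band pixel the force (I - c_out)^2 - (I - c_in)^2 claims the pixel for whichever region's mean it better matches.

```python
def chan_vese_forces(img, inside, band):
    """img: dict pixel -> intensity; inside: pixels currently inside
    the contour; band: pixels where the level set may move (the
    localization constraint). Returns pixel -> force, positive when
    the pixel better matches the inside mean."""
    ins = [img[p] for p in img if p in inside]
    out = [img[p] for p in img if p not in inside]
    c_in = sum(ins) / len(ins)    # mean intensity inside the contour
    c_out = sum(out) / len(out)   # mean intensity outside
    return {p: (img[p] - c_out) ** 2 - (img[p] - c_in) ** 2
            for p in band}
```

    With a bright object and dark background, a dark band pixel gets a negative force (stays outside) and a bright one a positive force (is claimed by the object); pixels outside the band are never touched, which is the localization idea.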

  13. An adaptive level set approach for incompressible two-phase flows

    SciTech Connect

    Sussman, M.; Almgren, A.S.; Bell, J.B.

    1997-04-01

    In Sussman, Smereka and Osher, a numerical method using the level set approach was formulated for solving incompressible two-phase flow with surface tension. In the level set approach, the interface is represented as the zero level set of a smooth function; this has the effect of replacing the advection of density, which has steep gradients at the interface, with the advection of the level set function, which is smooth. In addition, the interface can merge or break up with no special treatment. The authors maintain the level set function as the signed distance from the interface in order to robustly compute flows with high density ratios and stiff surface tension effects. In this work, they couple the level set scheme to an adaptive projection method for the incompressible Navier-Stokes equations, in order to achieve higher resolution of the interface with a minimum of additional expense. They present two-dimensional axisymmetric and fully three-dimensional results of air bubble and water drop computations.

  14. Aorta segmentation with a 3D level set approach and quantification of aortic calcifications in non-contrast chest CT.

    PubMed

    Kurugol, Sila; San Jose Estepar, Raul; Ross, James; Washko, George R

    2012-01-01

    Automatic aorta segmentation in thoracic computed tomography (CT) scans is important for aortic calcification quantification and to guide the segmentation of other central vessels. We propose an aorta segmentation algorithm consisting of an initial boundary detection step followed by 3D level set segmentation for refinement. Our algorithm exploits aortic cross-sectional circularity: we first detect aorta boundaries with a circular Hough transform on axial slices to detect ascending and descending aorta regions, and we apply the Hough transform on oblique slices to detect the aortic arch. The centers and radii of circles detected by Hough transform are fitted to smooth cubic spline functions using least-squares fitting. From these center and radius spline functions, we reconstruct an initial aorta surface using the Frenet frame. This reconstructed tubular surface is further refined with 3D level set evolutions. The level set framework we employ optimizes a functional that depends on both edge strength and smoothness terms and evolves the surface to the position of nearby edge location corresponding to the aorta wall. After aorta segmentation, we first detect the aortic calcifications with thresholding applied to the segmented aorta region. We then filter out the false positive regions due to nearby high intensity structures. We tested the algorithm on 45 CT scans and obtained a closest point mean error of 0.52 ± 0.10 mm between the manually and automatically segmented surfaces. The true positive detection rate of the calcification algorithm was 0.96 over all CT scans. PMID:23366394
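    The circular Hough transform used for the initial boundary detection can be sketched in accumulator form. This is a toy voting implementation (brute force over sampled angles, hypothetical names), not the paper's code: every edge point votes for the centers of all candidate circles passing through it, and the accumulator peak gives (cx, cy, r).

```python
import math
from collections import Counter

def hough_circle(edge_points, radii, steps=64):
    """Circular Hough transform over a list of (x, y) edge points and
    candidate radii; returns the (cx, cy, r) bin with the most votes."""
    acc = Counter()
    for (x, y) in edge_points:
        for r in radii:
            for k in range(steps):
                t = 2 * math.pi * k / steps
                # Center candidates lie at distance r from the edge point.
                cx = round(x - r * math.cos(t))
                cy = round(y - r * math.sin(t))
                acc[(cx, cy, r)] += 1
    return acc.most_common(1)[0][0]
```

    Feeding it points sampled from a circle of radius 5 about (10, 10) recovers that circle from among the candidate radii.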

  15. Evolving entities: towards a unified framework for understanding diversity at the species and higher levels

    PubMed Central

    Barraclough, Timothy G.

    2010-01-01

    Current approaches to studying the evolution of biodiversity differ in their treatment of species and higher level diversity patterns. Species are regarded as the fundamental evolutionarily significant units of biodiversity, both in theory and in practice, and extensive theory explains how they originate and evolve. However, most species are still delimited using qualitative methods that only relate indirectly to the underlying theory. In contrast, higher level patterns of diversity have been subjected to rigorous quantitative study (using phylogenetics), but theory that adequately explains the observed patterns has been lacking. Most evolutionary analyses of higher level diversity patterns have considered non-equilibrium explanations based on rates of diversification (i.e. exponentially growing clades), rather than equilibrium explanations normally used at the species level and below (i.e. constant population sizes). This paper argues that species level and higher level patterns of diversity can be considered within a common framework, based on equilibrium explanations. It shows how forces normally considered in the context of speciation, namely divergent selection and geographical isolation, can generate evolutionarily significant units of diversity above the level of reproductively isolated species. Prospects for the framework to answer some unresolved questions about higher level diversity patterns are discussed. PMID:20439282

  16. Multiphase permittivity imaging using absolute value electrical capacitance tomography data and a level set algorithm.

    PubMed

    Al Hosani, E; Soleimani, M

    2016-06-28

    Multiphase flow imaging is a very challenging and critical topic in industrial process tomography. In this article, simulation and experimental results of reconstructing the permittivity profile of multiphase material from data collected in electrical capacitance tomography (ECT) are presented. A multiphase narrowband level set algorithm is developed to reconstruct the interfaces between three- or four-phase permittivity values. The level set algorithm is capable of imaging multiphase permittivity by using one set of ECT measurement data, so-called absolute value ECT reconstruction, and this is tested with high-contrast and low-contrast multiphase data. Simulation and experimental results showed the superiority of this algorithm over classical pixel-based image reconstruction methods. The multiphase level set algorithm and absolute ECT reconstruction are presented for the first time, to the best of our knowledge, in this paper and critically evaluated. This article is part of the themed issue 'Supersensing through industrial process tomography'. PMID:27185966

  17. Modeling shear modulus distribution in magnetic resonance elastography with piecewise constant level sets.

    PubMed

    Li, Bing Nan; Chui, Chee Kong; Ong, Sim Heng; Numano, Tomokazu; Washio, Toshikatsu; Homma, Kazuhiro; Chang, Stephen; Venkatesh, Sudhakar; Kobayashi, Etsuko

    2012-04-01

    Magnetic resonance elastography (MRE) is designed for imaging the mechanical properties of soft tissues. However, the interpretation of shear modulus distribution is often confusing and cumbersome. For reliable evaluation, a common practice is to specify the regions of interest and consider regional elasticity. Such an experience-dependent protocol is susceptible to intrapersonal and interpersonal variability. In this study we propose to remodel shear modulus distribution with piecewise constant level sets by referring to the corresponding magnitude image. Optimal segmentation and registration are achieved by a new hybrid level set model comprised of alternating global and local region competitions. Experimental results on the simulated MRE data sets show that the mean error of elasticity reconstruction is 11.33% for local frequency estimation and 18.87% for algebraic inversion of differential equation. Piecewise constant level set modeling is effective to improve the quality of shear modulus distribution, and facilitates MRE analysis and interpretation.

  19. GIST: an interactive, GPU-based level set segmentation tool for 3D medical images.

    PubMed

    Cates, Joshua E; Lefohn, Aaron E; Whitaker, Ross T

    2004-09-01

    While level sets have demonstrated great potential for 3D medical image segmentation, their usefulness has been limited by two problems. First, 3D level sets are relatively slow to compute. Second, their formulation usually entails several free parameters which can be very difficult to correctly tune for specific applications. The second problem is compounded by the first. This paper describes a new tool for 3D segmentation that addresses these problems by computing level-set surface models at interactive rates. This tool employs two important, novel technologies. First is the mapping of a 3D level-set solver onto a commodity graphics card (GPU). This mapping relies on a novel mechanism for GPU memory management. The interactive-rate level-set PDE solver gives the user immediate feedback on parameter settings, and thus users can tune free parameters and control the shape of the model in real time. The second technology is the use of intensity-based speed functions, which allow a user to quickly and intuitively specify the behavior of the deformable model. We have found that the combination of these interactive tools enables users to produce good, reliable segmentations. To support this observation, this paper presents qualitative results from several different datasets as well as a quantitative evaluation from a study of brain tumor segmentations. PMID:15450217
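    A common intensity-based speed term of the kind this abstract describes is D(I) = eps - |I - T|: positive (the contour expands) for intensities within eps of a target T, negative (it contracts) outside that window, with T and eps being exactly the kind of user-tuned free parameters the tool exposes. The sketch below assumes this form and blends it with a curvature term; it is illustrative, not the tool's implementation.

```python
def intensity_speed(I, T, eps):
    """Intensity-based speed: expand where I is within eps of the
    target intensity T, contract elsewhere."""
    return eps - abs(I - T)

def speed(I, curvature, T, eps, alpha):
    """Blend the intensity term with mean curvature; alpha trades off
    data attraction against surface smoothing."""
    return alpha * intensity_speed(I, T, eps) + (1 - alpha) * curvature
```

    At I = T the intensity term is maximal (+eps); far from T it turns negative, so the model shrinks away from off-target intensities, which is what makes the parameters intuitive to tune interactively.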

  20. Joint inversion of geophysical data using petrophysical clustering and facies deformation with the level set technique

    NASA Astrophysics Data System (ADS)

    Revil, A.

    2015-12-01

    Geological expertise and petrophysical relationships can be brought together to provide prior information while inverting multiple geophysical datasets. The merging of such information can result in more realistic solutions for the distribution of the model parameters, reducing ipso facto the non-uniqueness of the inverse problem. We consider two levels of heterogeneity: facies, described by facies boundaries, and heterogeneities inside each facies, determined by a correlogram. In this presentation, we pose the geophysical inverse problem in terms of Gaussian random fields with mean functions controlled by petrophysical relationships and covariance functions controlled by a prior geological cross-section, including the definition of spatial boundaries for the geological facies. The petrophysical relationship problem is formulated as a regression problem upon each facies. The inversion of the geophysical data is performed in a Bayesian framework. We demonstrate the usefulness of this strategy using a first synthetic case for which we perform a joint inversion of gravity and galvanometric resistivity data with the stations located at the ground surface. The joint inversion is used to recover the density and resistivity distributions of the subsurface. In a second step, we consider the possibility that the facies boundaries are deformable and their shapes are inverted as well. We use the level set approach to perform such deformation, preserving prior topological properties of the facies throughout the inversion. With the help of prior facies petrophysical relationships and the topological characteristics of each facies, we make posterior inference about multiple geophysical tomograms based on their corresponding geophysical data misfits.
The method is applied to a second synthetic case showing that we can recover the heterogeneities inside the facies, the mean values for the petrophysical properties, and, to some extent, the facies boundaries using the 2D joint inversion of

  1. A three-tier framework for monitoring antiretroviral therapy in high HIV burden settings

    PubMed Central

    Osler, Meg; Hilderbrand, Katherine; Hennessey, Claudine; Arendse, Juanita; Goemaere, Eric; Ford, Nathan; Boulle, Andrew

    2014-01-01

    The provision of antiretroviral therapy (ART) in low and middle-income countries is a chronic disease intervention of unprecedented magnitude and is the dominant health systems challenge for high-burden countries, many of which rank among the poorest in the world. Substantial external investment, together with the requirement for service evolution to adapt to changing needs, including the constant shift to earlier ART initiation, makes outcome monitoring and reporting particularly important. However, there is growing concern at the inability of many high-burden countries to report on the outcomes of patients who have been in care for various durations, or even the number of patients in care at a particular point in time. In many instances, countries can only report on the number of patients ever started on ART. Despite paper register systems coming under increasing strain, the evolution from paper directly to complex electronic medical record solutions is not viable in many contexts. Implementing a bridging solution, such as a simple offline electronic version of the paper register, can be a pragmatic alternative. This paper describes and recommends a three-tiered monitoring approach in low- and middle-income countries based on the experience implementing such a system in the Western Cape province of South Africa. A three-tier approach allows Ministries of Health to strategically implement one of the tiers in each facility offering ART services. Each tier produces the same nationally required monthly enrolment and quarterly cohort reports so that outputs from the three tiers can be aggregated into a single database at any level of the health system. The choice of tier is based on context and resources at the time of implementation. As resources and infrastructure improve, more facilities will transition to the next highest and more technologically sophisticated tier. 
Implementing a three-tier monitoring system at country level for pre-antiretroviral wellness, ART

  2. Ontological Problem-Solving Framework for Assigning Sensor Systems and Algorithms to High-Level Missions

    PubMed Central

    Qualls, Joseph; Russomanno, David J.

    2011-01-01

    The lack of knowledge models to represent sensor systems, algorithms, and missions makes opportunistically discovering a synthesis of systems and algorithms that can satisfy high-level mission specifications impractical. A novel ontological problem-solving framework has been designed that leverages knowledge models describing sensors, algorithms, and high-level missions to facilitate automated inference of assigning systems to subtasks that may satisfy a given mission specification. To demonstrate the efficacy of the ontological problem-solving architecture, a family of persistence surveillance sensor systems and algorithms has been instantiated in a prototype environment to demonstrate the assignment of systems to subtasks of high-level missions. PMID:22164081

  3. A framework for sea level rise vulnerability assessment for southwest U.S. military installations

    USGS Publications Warehouse

    Chadwick, B.; Flick, Reinhard; Helly, J.; Nishikawa, T.; Pei, Fang Wang; O'Reilly, W.; Guza, R.; Bromirski, Peter; Young, A.; Crampton, W.; Wild, B.; Canner, I.

    2011-01-01

    We describe an analysis framework to determine military installation vulnerabilities under increases in local mean sea level as projected over the next century. The effort responds to an increasing recognition of potential climate change ramifications for national security, and to recommendations that the DoD assess the impacts of climate change on U.S. military installations. Results of the effort described here focus on development of a conceptual framework for sea level rise vulnerability assessment at coastal military installations in the southwest U.S. We introduce the vulnerability assessment in the context of a risk assessment paradigm that incorporates sources in the form of future sea level conditions, pathways of impact including inundation, flooding, erosion and intrusion, and a range of military installation specific receptors such as critical infrastructure and training areas. A unique aspect of the methodology is the capability to develop wave climate projections from GCM outputs and transform these to future wave conditions at specific coastal sites. Future sea level scenarios are considered in the context of installation sensitivity curves, which reveal response thresholds specific to each installation, pathway and receptor. In the end, our goal is to provide a military-relevant framework for assessment of accelerated SLR vulnerability, and to develop the best scientifically based scenarios of waves, tides and storms and their implications for DoD installations in the southwestern U.S. © 2011 MTS.

  4. The Effects on Motor Performance of Setting an Overt Level of Aspiration by Mentally Retarded Students.

    ERIC Educational Resources Information Center

    Kozar, Bill

    This study investigates the effects of setting an overt level of aspiration on the standing long jump performance of mildly and moderately retarded institutionalized children. Thirty-three mildly retarded and seven moderately retarded students were randomly assigned to either an overt level of aspiration (OLA) group or a control group. Each…

  5. A distributed decision framework for building clusters with different heterogeneity settings

    DOE PAGES

    Jafari-Marandi, Ruholla; Omitaomu, Olufemi A.; Hu, Mengqi

    2016-01-05

    In the past few decades, extensive research has been conducted to develop operation and control strategies for smart buildings with the purpose of reducing energy consumption. Beyond the study of single buildings, it is envisioned that next-generation buildings will freely connect with one another to share energy and exchange information in the context of the smart grid. It has been demonstrated that a network of connected buildings (aka building clusters) can significantly reduce primary energy consumption, improve environmental sustainability, and improve buildings' resilience capability. However, an analytic tool to determine which types of buildings should form a cluster, and what impact a building cluster's heterogeneity in energy profile has on its energy performance, is missing. To bridge these research gaps, we propose a self-organizing map clustering algorithm to divide multiple buildings into clusters based on their energy profiles, and a homogeneity index to evaluate the heterogeneity of different building cluster configurations. In addition, a bi-level distributed decision model is developed to study energy sharing in the building clusters. To demonstrate the effectiveness of the proposed clustering algorithm and decision model, we employ a dataset of monthly energy consumption data for 30 buildings, collected every 15 min. It is demonstrated that the proposed decision model can achieve at least 13% cost savings for building clusters. Furthermore, the results show that the heterogeneity of energy profiles is an important factor in selecting batteries and renewable energy sources for building clusters, and that shared batteries and renewable energy are preferred for more heterogeneous building clusters.
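
    A self-organizing map of the kind used above to group buildings by energy profile can be sketched minimally as follows. This is a generic toy SOM, not the authors' implementation; the two-month profiles, map size, learning rate, and seeding of prototypes from the data are all illustrative assumptions.

```python
import numpy as np

def som_1d(data, init, epochs=20, lr=0.3):
    """Tiny 1-D self-organizing map: the best-matching node moves toward
    each sample (its immediate neighbor too, during the first half of
    training, after which the neighborhood shrinks to pure competition)."""
    W = init.astype(float).copy()
    n_nodes = len(W)
    for epoch in range(epochs):
        radius = 1 if epoch < epochs // 2 else 0
        for xvec in data:
            bmu = int(np.argmin(((W - xvec) ** 2).sum(axis=1)))
            for j in range(n_nodes):
                if abs(j - bmu) <= radius:
                    W[j] += lr * (xvec - W[j])
    return W

# Two obvious groups of toy energy profiles; prototypes are seeded from
# one member of each group to keep the sketch deterministic.
data = np.array([[1.0, 1.0], [1.1, 0.9], [5.0, 5.0], [5.1, 4.9]])
W = som_1d(data, init=data[[0, 2]])
labels = [int(np.argmin(((W - v) ** 2).sum(axis=1))) for v in data]
```

    After training, buildings with similar profiles map to the same node, which is the clustering step the homogeneity index would then be computed over.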

  6. Breast mass segmentation in digital mammography based on pulse coupled neural network and level set method

    NASA Astrophysics Data System (ADS)

    Xie, Weiying; Ma, Yide; Li, Yunsong

    2015-05-01

    A novel approach to mammographic image segmentation, termed the PCNN-based level set algorithm, is presented in this paper. As the name implies, the method combines a pulse coupled neural network (PCNN) with the variational level set method for medical image segmentation. To date, little work has been done on detecting initial zero level set contours with a PCNN for subsequent level set evolution. When all pixels of the input image have been fired by the PCNN, regions of small pixel value yield a more refined segmentation. In mammographic images the breast tumor has large pixel values while the rest of the image is predominantly dark, so we first take the negative of the mammographic image before all pixels are fired by the PCNN. The PCNN is thus employed to achieve mammary-specific initial mass contour detection, and the extracted contours are used as the initial zero level set contours for automatic mass segmentation by the variational level set method. Furthermore, the proposed algorithm improves the external energy of the variational level set method for low-contrast mammographic images: since the gray level of the mass region is higher than that of its surroundings, the Laplace operator is used to modify the external energy, making bright spots brighter relative to the surrounding pixels. A preliminary evaluation of the proposed method was performed on the public MIAS database rather than on synthetic images. The experimental results demonstrate that our approach can obtain better mass detection results in terms of sensitivity and specificity. Ultimately, this algorithm could increase both the sensitivity and specificity of physicians' interpretation.
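
    The Laplacian-based brightening of mass regions described above can be sketched as follows. This is a minimal unsharp-masking illustration on a toy array, not the paper's exact external-energy term; the kernel and weight `alpha` are assumptions.

```python
import numpy as np

def laplacian(img):
    """5-point discrete Laplacian (periodic boundaries via np.roll,
    which is fine for this toy example with an interior bright spot)."""
    return (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
            np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img)

def sharpen(img, alpha=0.5):
    """Subtract a scaled Laplacian so bright regions become brighter
    relative to their surroundings (unsharp masking)."""
    return img - alpha * laplacian(img)

# Toy image: a bright 'mass' on a dark background.
img = np.zeros((8, 8))
img[3:5, 3:5] = 1.0
enhanced = sharpen(img)
```

    The corner pixel of the bright block gains intensity (its Laplacian is negative), which is the effect used to strengthen the external energy at mass boundaries.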

  7. Target Detection in SAR Images Based on a Level Set Approach

    SciTech Connect

    Marques, Regis C.P.; Medeiros, Fatima N.S.; Ushizima, Daniela M.

    2008-09-01

    This paper introduces a new framework for point target detection in synthetic aperture radar (SAR) images. We focus on the task of locating reflective small regions using a level set based algorithm. Unlike most approaches to image segmentation, our algorithm incorporates speckle statistics instead of empirical parameters and also discards speckle filtering. The curve evolves according to speckle statistics, initially propagating with a maximum upward velocity in homogeneous areas. Our approach is validated by a series of tests on synthetic and real SAR images and compared with three other segmentation algorithms, demonstrating that it constitutes a novel and efficient method for target detection.
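
    The curve evolution underlying this kind of method can be sketched with a first-order upwind update of the level set equation φ_t + F|∇φ| = 0. The speed map below is a uniform placeholder, not the speckle-derived statistic of the paper; grid size and time step are illustrative.

```python
import numpy as np

def evolve(phi, F, dt=0.5, steps=20):
    """First-order Godunov upwind update for phi_t + F * |grad phi| = 0,
    assuming F >= 0 (outward motion) and grid spacing h = 1."""
    for _ in range(steps):
        dxm = phi - np.roll(phi, 1, 0); dxp = np.roll(phi, -1, 0) - phi
        dym = phi - np.roll(phi, 1, 1); dyp = np.roll(phi, -1, 1) - phi
        grad = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                       np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
        phi = phi - dt * F * grad
    return phi

# Signed distance to a small circle, expanding with unit speed.
y, x = np.mgrid[0:32, 0:32]
phi0 = np.sqrt((x - 16.0)**2 + (y - 16.0)**2) - 4.0
phi = evolve(phi0, F=np.ones_like(phi0))
```

    In the paper's setting, F would instead be derived from local speckle statistics, so the front accelerates through homogeneous clutter and stops at reflective targets.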

  8. A Hybrid Method for Pancreas Extraction from CT Image Based on Level Set Methods

    PubMed Central

    Tan, Hanqing; Fujita, Hiroshi

    2013-01-01

    This paper proposes a novel semiautomatic method to extract the pancreas from abdominal CT images. Traditional level set and region growing methods, which require locating the initial contour near the final object boundary, suffer from leakage into the tissues neighboring the pancreas. The proposed method consists of a customized fast-marching level set method, which generates an optimal initial pancreas region to overcome the level set method's sensitivity to initial contour location, and a modified distance-regularized level set method, which extracts the pancreas accurately. The novelty of our method lies in the proper selection and combination of level set methods; furthermore, an energy-decrement algorithm and an energy-tune algorithm are proposed to reduce the negative impact of the bonding force caused by connected tissue whose intensity is similar to that of the pancreas. As a result, our method overcomes oversegmentation at weak boundaries and can accurately extract the pancreas from CT images. The proposed method is compared to five other state-of-the-art medical image segmentation methods on a CT dataset containing abdominal images from 10 patients. The results demonstrate that our method outperforms the others, achieving higher accuracy and less false segmentation in pancreas extraction. PMID:24066016

  9. A hybrid method for pancreas extraction from CT image based on level set methods.

    PubMed

    Jiang, Huiyan; Tan, Hanqing; Fujita, Hiroshi

    2013-01-01

    This paper proposes a novel semiautomatic method to extract the pancreas from abdominal CT images. Traditional level set and region growing methods, which require locating the initial contour near the final object boundary, suffer from leakage into the tissues neighboring the pancreas. The proposed method consists of a customized fast-marching level set method, which generates an optimal initial pancreas region to overcome the level set method's sensitivity to initial contour location, and a modified distance-regularized level set method, which extracts the pancreas accurately. The novelty of our method lies in the proper selection and combination of level set methods; furthermore, an energy-decrement algorithm and an energy-tune algorithm are proposed to reduce the negative impact of the bonding force caused by connected tissue whose intensity is similar to that of the pancreas. As a result, our method overcomes oversegmentation at weak boundaries and can accurately extract the pancreas from CT images. The proposed method is compared to five other state-of-the-art medical image segmentation methods on a CT dataset containing abdominal images from 10 patients. The results demonstrate that our method outperforms the others, achieving higher accuracy and less false segmentation in pancreas extraction.
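
    Fast marching, used above to generate an initial region, can be sketched with a priority queue. This toy version uses a Dijkstra-style 4-neighbor update (a simplification of the true upwind Eikonal update) on a uniform grid with constant speed; the grid and seed are illustrative.

```python
import heapq

def fast_marching(seeds, shape, speed=1.0):
    """Front propagation from seed cells: returns the arrival time of the
    front at every cell of a `shape`-sized grid (Dijkstra-like update)."""
    rows, cols = shape
    INF = float("inf")
    T = [[INF] * cols for _ in range(rows)]
    heap = []
    for r, c in seeds:
        T[r][c] = 0.0
        heapq.heappush(heap, (0.0, r, c))
    while heap:
        t, r, c = heapq.heappop(heap)
        if t > T[r][c]:
            continue  # stale heap entry, already improved
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nt = t + 1.0 / speed
                if nt < T[nr][nc]:
                    T[nr][nc] = nt
                    heapq.heappush(heap, (nt, nr, nc))
    return T

T = fast_marching(seeds=[(0, 0)], shape=(5, 5))
```

    In a segmentation setting, `speed` would be an image-derived map that slows the front at organ boundaries; thresholding the arrival times then yields an initial region for the subsequent level set refinement.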

  10. A multi-level approach for promoting HIV testing within African American church settings.

    PubMed

    Stewart, Jennifer M

    2015-02-01

    The African American church is a community-based organization that is integral to the lives, beliefs, and behaviors of the African American community. Engaging this vital institution as a primary setting for HIV testing and referral would significantly impact the epidemic. The disproportionately high HIV incidence rate among African Americans dictates the national priority for promotion of early and routine HIV testing, and suggests engaging community-based organizations in this endeavor. However, few multilevel HIV testing frameworks have been developed, tested, and evaluated within the African American church. This article proposes one such framework for promoting HIV testing and referral within African American churches. A qualitative study was employed to examine the perceptions, beliefs, knowledge, and behaviors related to involvement in church-based HIV testing. A total of four focus groups with church leaders and four in-depth interviews with pastors were conducted between November 2012 and June 2013 to identify the constructs most important to supporting Philadelphia churches' involvement in HIV testing, referral, and linkage to care. The data generated from this study were analyzed using a grounded theory approach and used to develop and refine a multilevel framework for identifying factors impacting church-based HIV testing and referral, and ultimately to support capacity building among African American churches to promote HIV testing and linkage to care.

  11. Estimations of a global sea level trend: limitations from the structure of the PSMSL global sea level data set

    NASA Astrophysics Data System (ADS)

    Gröger, M.; Plag, H.-P.

    1993-08-01

    Among the possible impacts on environmental conditions of a global warming expected as a consequence of the increasing release of CO2 and various other greenhouse gases into the atmosphere, a predicted rise in global sea level is considered to be of high importance. Thus, quite a number of recent studies have focused on detecting the "global sea level rise" or even an acceleration of this trend. A brief review of these studies is presented, showing, however, that the results are not conclusive, though most of the studies have been based on a single global data set of coastal tide gauge data provided by the Permanent Service for Mean Sea Level (PSMSL). A detailed discussion of a thoroughly revised subset reveals that the PSMSL data set suffers from three severe limitations: (1) the geographical distribution of reliable tide gauge stations is rather uneven, with pronounced concentrations in some areas of the northern hemisphere (Europe, North America, Japan) and far fewer stations in the southern hemisphere, with particularly few in Africa and Antarctica; (2) the number of stations recording simultaneously at any time is far less than the total number of stations, with the maximum within the interval between 1958 and 1988; (3) the number of long records is extremely small, and almost all of them originate from a few regions of the northern hemisphere. The sensitivity of the median of the local trends to these temporal and spatial limitations is discussed by restricting the data set in both the spatial and temporal distribution. It is shown that the data base is insufficient for determining an integral value of the global rise in relative sea level. The effect of polar motion on sea level is modelled and turns out to be locally of the order of 0.5 mm/yr, affecting regional trends to an order of 0.1 mm/yr. Thus, this effect can be neglected on time scales of decades to a century. Though the data set is insufficient for determining an

  12. [Narrow band multi-region level set method for remote sensing image].

    PubMed

    Fang, Jiang-Xiong; Tu, En-Mei; Yang, Jie; Jia, Zhen-Hong; Nikola, Kasabov

    2011-11-01

    Massive redundant contours arise when the classical Chan-Vese (C-V) model is used to segment remote sensing images with interlaced edges. Moreover, the model cannot segment homogeneous objects composed of multiple regions. To overcome these limitations of the C-V model, a narrow band multi-region level set method is proposed. Segmentation of N regions requires N-1 curves, each curve representing one region. First, establishing an independent level set model for each region eliminates the redundant contours and avoids the problems of vacuum and overlap. Then, a narrow band approach to the level set method reduces the computational cost. Experimental results on remote sensing images verify that our model is efficient and accurate.
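
    The narrow band idea above restricts computation to cells near the zero level set rather than the whole grid. A minimal sketch of selecting the band, assuming a signed distance function on a regular grid (the bandwidth value is illustrative):

```python
import numpy as np

def narrow_band_mask(phi, bandwidth=3.0):
    """Cells within `bandwidth` of the zero level set; only these cells
    are updated each iteration, instead of the full grid."""
    return np.abs(phi) < bandwidth

# Signed distance to a circle of radius 10 on a 64x64 grid.
y, x = np.mgrid[0:64, 0:64]
phi = np.sqrt((x - 32.0)**2 + (y - 32.0)**2) - 10.0
band = narrow_band_mask(phi)
frac = band.mean()  # fraction of the grid actually updated
```

    Since the band is a thin annulus around the interface, only a small fraction of cells is touched per iteration, which is the source of the speedup claimed above.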

  13. A Variational Level Set Approach to Segmentation and Bias Correction of Images with Intensity Inhomogeneity

    PubMed Central

    Huang, Rui; Ding, Zhaohua; Gatenby, Chris; Metaxas, Dimitris; Gore, John

    2009-01-01

    This paper presents a variational level set approach to joint segmentation and bias correction of images with intensity inhomogeneity. Our method is based on the observation that intensities in a relatively small local region are separable, despite the inseparability of the intensities in the whole image caused by the intensity inhomogeneity. We first define a weighted K-means clustering objective function for image intensities in a neighborhood around each point, with the cluster centers having a multiplicative factor that estimates the bias within the neighborhood. The objective function is then integrated over the entire domain and incorporated into a variational level set formulation. The energy minimization is performed via a level set evolution process. Our method is able to estimate bias of quite general profiles. Moreover, it is robust to initialization and therefore allows automated application. The proposed method has been used for images of various modalities with promising results. PMID:18982712
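
    The local weighted K-means objective with a multiplicative bias factor can be sketched in 1-D for a single neighborhood. The variable names (`b` for the bias estimate, `c` for cluster centers) and the toy values are illustrative, not the paper's notation or data.

```python
import numpy as np

def local_objective(I, w, b, c, labels):
    """Weighted K-means energy: sum over clusters k of
    sum_i w_i * (I_i - b * c_k)^2 for the points assigned to cluster k."""
    return sum(np.sum(w[labels == k] * (I[labels == k] - b * ck) ** 2)
               for k, ck in enumerate(c))

# Toy neighborhood: two intensity clusters, both scaled by a bias of 1.5.
I = np.array([1.5, 1.6, 3.0, 3.1])
w = np.ones_like(I)
labels = np.array([0, 0, 1, 1])
E_biased   = local_objective(I, w, b=1.5, c=[1.0, 2.0], labels=labels)
E_unbiased = local_objective(I, w, b=1.0, c=[1.0, 2.0], labels=labels)
```

    Including the correct bias factor drops the energy sharply, which is why minimizing this objective over `b` recovers the bias field while the same centers `c` remain valid across neighborhoods.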

  14. Aerostructural Level Set Topology Optimization for a Common Research Model Wing

    NASA Technical Reports Server (NTRS)

    Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia

    2014-01-01

    The purpose of this work is to use level set topology optimization to improve the design of a representative wing box structure for the NASA common research model. The objective is to minimize the total compliance of the structure under aerodynamic and body force loading, where the aerodynamic loading is coupled to the structural deformation. A taxi bump case was also considered, where only body force loads were applied. The trim condition that aerodynamic lift must balance the total weight of the aircraft is enforced by allowing the root angle of attack to change. The level set optimization method is implemented on an unstructured three-dimensional grid, so that the method can optimize a wing box with arbitrary geometry. Fast matching and upwind schemes are developed for an unstructured grid, which make the level set method robust and efficient. The adjoint method is used to obtain the coupled shape sensitivities required to perform aerostructural optimization of the wing box structure.

  15. Numerical Schemes for the Hamilton-Jacobi and Level Set Equations on Triangulated Domains

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.; Sethian, James A.

    2006-01-01

    Borrowing from techniques developed for conservation law equations, we have developed both monotone and higher order accurate numerical schemes which discretize the Hamilton-Jacobi and level set equations on triangulated domains. The use of unstructured meshes containing triangles (2D) and tetrahedra (3D) easily accommodates mesh adaptation to resolve disparate level set feature scales with a minimal number of solution unknowns. The minisymposium talk will discuss these algorithmic developments and present sample calculations using our adaptive triangulation algorithm applied to various moving interface problems such as etching, deposition, and curvature flow.
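
    A monotone first-order scheme of the kind borrowed from conservation-law methods can be shown in its simplest setting: the 1-D linear level set advection φ_t + a φ_x = 0 on a uniform grid (the triangulated-domain machinery of the paper is beyond a short sketch; grid size and CFL number here are illustrative).

```python
import numpy as np

def upwind_advect(phi, a=1.0, dt=0.5, steps=10):
    """First-order upwind step for phi_t + a * phi_x = 0 with a > 0,
    grid spacing h = 1 and periodic boundaries; the scheme is monotone
    (creates no new extrema) when the CFL condition a*dt/h <= 1 holds."""
    for _ in range(steps):
        phi = phi - a * dt * (phi - np.roll(phi, 1))
    return phi

phi0 = np.where(np.arange(20) < 10, -1.0, 1.0)  # a 1-D 'interface' at x = 10
phi = upwind_advect(phi0)
```

    The interface drifts rightward at speed `a` while the solution stays bounded between the initial extrema, which is the monotonicity property the higher-order schemes in the paper preserve on unstructured meshes.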

  16. Segmentation of the left ventricle using distance regularized two-layer level set approach.

    PubMed

    Feng, Chaolu; Li, Chunming; Zhao, Dazhe; Davatzikos, Christos; Litt, Harold

    2013-01-01

    We propose a novel two-layer level set approach for segmentation of the left ventricle (LV) from cardiac magnetic resonance (CMR) short-axis images. In our method, endocardium and epicardium are represented by two specified level contours of a level set function. Segmentation of the LV is formulated as a problem of optimizing the level set function such that these two level contours best fit the epicardium and endocardium. More importantly, a distance regularization (DR) constraint on the level contours is introduced to preserve smoothly varying distance between them. This DR constraint leads to a desirable interaction between the level contours that contributes to maintain the anatomical geometry of the endocardium and epicardium. The negative influence of intensity inhomogeneities on image segmentation are overcome by using a data term derived from a local intensity clustering property. Our method is quantitatively validated by experiments on the datasets for the MICCAI grand challenge on left ventricular segmentation, which demonstrates the advantages of our method in terms of segmentation accuracy and consistency with anatomical geometry.
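
    The core representational idea above, two nested boundaries as two level contours of one function, can be sketched by thresholding a single φ at two levels. The circle geometry and the level values 0 and 5 are illustrative, not the paper's optimized function.

```python
import numpy as np

# One level set function: signed distance to a circle of radius 10.
y, x = np.mgrid[0:64, 0:64]
phi = np.sqrt((x - 32.0)**2 + (y - 32.0)**2) - 10.0

endo = phi < 0.0          # endocardium: interior of the 0-level contour
epi  = phi < 5.0          # epicardium: interior of the 5-level contour
myocardium = epi & ~endo  # the ring between the two level contours
```

    Because both contours come from one function, the distance between them is controlled by a single field, which is what the distance regularization constraint exploits to keep the myocardial wall thickness smooth.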

  17. Individual- and Setting-Level Correlates of Secondary Traumatic Stress in Rape Crisis Center Staff.

    PubMed

    Dworkin, Emily R; Sorell, Nicole R; Allen, Nicole E

    2016-02-01

    Secondary traumatic stress (STS) is an issue of significant concern among providers who work with survivors of sexual assault. Although STS has been studied in relation to individual-level characteristics of a variety of types of trauma responders, less research has focused specifically on rape crisis centers as environments that might convey risk or protection from STS, and no research to our knowledge has modeled setting-level variation in correlates of STS. The current study uses a sample of 164 staff members representing 40 rape crisis centers across a single Midwestern state to investigate the staff member- and agency-level correlates of STS. Results suggest that correlates exist at both levels of analysis. Younger age and greater severity of sexual assault history were statistically significant individual-level predictors of increased STS. Greater frequency of supervision was more strongly related to secondary stress for non-advocates than for advocates. At the setting level, lower levels of supervision and higher client loads agency-wide accounted for unique variance in staff members' STS. These findings suggest that characteristics of both providers and their settings are important to consider when understanding their STS.

  18. Accurate Adaptive Level Set Method and Sharpening Technique for Three Dimensional Deforming Interfaces

    NASA Technical Reports Server (NTRS)

    Kim, Hyoungin; Liou, Meng-Sing

    2011-01-01

    In this paper, we demonstrate improved accuracy of the level set method for resolving deforming interfaces by proposing two key elements: (1) accurate level set solutions on adapted Cartesian grids, obtained by judiciously choosing interpolation polynomials in regions of different grid levels, and (2) enhanced reinitialization through an interface sharpening procedure. The level set equation is solved using a fifth-order WENO scheme or a second-order central differencing scheme, depending on the availability of uniform stencils at each grid point. Grid adaptation criteria are chosen so that the Hamiltonian functions at nodes adjacent to interfaces are always calculated by the fifth-order WENO scheme. This selective use of the fifth-order WENO and second-order central differencing schemes is confirmed to give more accurate results than those in the literature for standard test problems. To further improve accuracy, especially near thin filaments, we suggest an artificial sharpening method, similar in form to the conventional reinitialization method but utilizing the sign of curvature instead of the sign of the level set function. Consequently, volume loss due to numerical dissipation on thin filaments is remarkably reduced for the test problems.

  19. A Measurement Framework for Team Level Assessment of Innovation Capability in Early Requirements Engineering

    NASA Astrophysics Data System (ADS)

    Regnell, Björn; Höst, Martin; Nilsson, Fredrik; Bengtsson, Henrik

    When developing software-intensive products for a market-place it is important for a development organisation to create innovative features for coming releases in order to achieve advantage over competitors. This paper focuses on assessment of innovation capability at team level in relation to the requirements engineering that is taking place before the actual product development projects are decided, when new business models, technology opportunities and intellectual property rights are created and investigated through e.g. prototyping and concept development. The result is a measurement framework focusing on four areas: innovation elicitation, selection, impact and ways-of-working. For each area, candidate measurements were derived from interviews to be used as inspiration in the development of a tailored measurement program. The framework is based on interviews with participants of a software team with specific innovation responsibilities and validated through cross-case analysis and feedback from practitioners.

  20. Demons versus Level-Set motion registration for coronary 18F-sodium fluoride PET

    PubMed Central

    Rubeaux, Mathieu; Joshi, Nikhil; Dweck, Marc R.; Fletcher, Alison; Motwani, Manish; Thomson, Louise E.; Germano, Guido; Dey, Damini; Berman, Daniel S.; Newby, David E.; Slomka, Piotr J.

    2016-01-01

    Ruptured coronary atherosclerotic plaques commonly cause acute myocardial infarction. It has been recently shown that active microcalcification in the coronary arteries, one of the features that characterizes vulnerable plaques at risk of rupture, can be imaged using cardiac gated 18F-sodium fluoride (18F-NaF) PET. We have shown in previous work that a motion correction technique applied to cardiac-gated 18F-NaF PET images can enhance image quality and improve uptake estimates. In this study, we further investigated the applicability of different algorithms for registration of the coronary artery PET images. In particular, we aimed to compare demons vs. level-set nonlinear registration techniques applied for the correction of cardiac motion in coronary 18F-NaF PET. To this end, fifteen patients underwent 18F-NaF PET and prospective coronary CT angiography (CCTA). PET data were reconstructed in 10 ECG gated bins; subsequently these gated bins were registered using demons and level-set methods guided by the extracted coronary arteries from CCTA, to eliminate the effect of cardiac motion on PET images. Noise levels, target-to-background ratios (TBR) and global motion were compared to assess image quality. Compared to the reference standard of using only diastolic PET image (25% of the counts from PET acquisition), cardiac motion registration using either level-set or demons techniques almost halved image noise due to the use of counts from the full PET acquisition and increased TBR difference between 18F-NaF positive and negative lesions. The demons method produces smoother deformation fields, exhibiting no singularities (which reflects how physically plausible the registration deformation is), as compared to the level-set method, which presents between 4 and 8% of singularities, depending on the coronary artery considered. In conclusion, the demons method produces smoother motion fields as compared to the level-set method, with a motion that is physiologically

  1. Demons versus level-set motion registration for coronary 18F-sodium fluoride PET

    NASA Astrophysics Data System (ADS)

    Rubeaux, Mathieu; Joshi, Nikhil; Dweck, Marc R.; Fletcher, Alison; Motwani, Manish; Thomson, Louise E.; Germano, Guido; Dey, Damini; Berman, Daniel S.; Newby, David E.; Slomka, Piotr J.

    2016-03-01

    Ruptured coronary atherosclerotic plaques commonly cause acute myocardial infarction. It has been recently shown that active microcalcification in the coronary arteries, one of the features that characterizes vulnerable plaques at risk of rupture, can be imaged using cardiac gated 18F-sodium fluoride (18F-NaF) PET. We have shown in previous work that a motion correction technique applied to cardiac-gated 18F-NaF PET images can enhance image quality and improve uptake estimates. In this study, we further investigated the applicability of different algorithms for registration of the coronary artery PET images. In particular, we aimed to compare demons vs. level-set nonlinear registration techniques applied for the correction of cardiac motion in coronary 18F-NaF PET. To this end, fifteen patients underwent 18F-NaF PET and prospective coronary CT angiography (CCTA). PET data were reconstructed in 10 ECG gated bins; subsequently these gated bins were registered using demons and level-set methods guided by the extracted coronary arteries from CCTA, to eliminate the effect of cardiac motion on PET images. Noise levels, target-to-background ratios (TBR) and global motion were compared to assess image quality. Compared to the reference standard of using only diastolic PET image (25% of the counts from PET acquisition), cardiac motion registration using either level-set or demons techniques almost halved image noise due to the use of counts from the full PET acquisition and increased TBR difference between 18F-NaF positive and negative lesions. The demons method produces smoother deformation fields, exhibiting no singularities (which reflects how physically plausible the registration deformation is), as compared to the level-set method, which presents between 4 and 8% of singularities, depending on the coronary artery considered. In conclusion, the demons method produces smoother motion fields as compared to the level-set method, with a motion that is physiologically
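
    The classic demons force (Thirion's update), which demons registration of the kind compared above builds on, can be sketched in 1-D. This is the textbook formula with toy Gaussian profiles, not the CCTA-guided coronary implementation described in the abstracts; the images, `alpha`, and single-step usage are illustrative.

```python
import numpy as np

def demons_step(fixed, moving, u, alpha=1.0):
    """One demons update of the displacement field u (1-D):
    du = (moving - fixed) * grad(fixed) / (|grad(fixed)|^2 + alpha*(moving - fixed)^2),
    with the denominator guarded where it vanishes."""
    g = np.gradient(fixed)
    diff = moving - fixed
    denom = g ** 2 + alpha * diff ** 2
    du = np.where(denom > 1e-12, diff * g / denom, 0.0)
    return u + du

x = np.arange(64, dtype=float)
fixed  = np.exp(-(x - 32.0)**2 / 50.0)  # a bump at x = 32
moving = np.exp(-(x - 35.0)**2 / 50.0)  # the same bump shifted by 3
u = demons_step(fixed, moving, np.zeros_like(x))
```

    In practice the update is iterated with Gaussian smoothing of `u` between steps; that smoothing is what gives demons its characteristically smooth deformation fields, consistent with the singularity comparison reported above.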

  2. Analysis of Forensic Autopsy in 120 Cases of Medical Disputes Among Different Levels of Institutional Settings.

    PubMed

    Yu, Lin-Sheng; Ye, Guang-Hua; Fan, Yan-Yan; Li, Xing-Biao; Feng, Xiang-Ping; Han, Jun-Ge; Lin, Ke-Zhi; Deng, Miao-Wu; Li, Feng

    2015-09-01

    Despite advances in medical science, the causes of death can sometimes only be determined by pathologists after a complete autopsy. Few studies have investigated the importance of forensic autopsy in medically disputed cases among different levels of institutional settings. Our study aimed to analyze forensic autopsy in 120 cases of medical disputes among five levels of institutional settings between 2001 and 2012 in Wenzhou, China. The results showed an overall concordance rate of 55%. Of the 39% of clinically missed diagnoses, cardiovascular pathology comprises 55.32%, while respiratory pathology accounts for the remaining 44.68%. Factors that increase the likelihood of missed diagnoses were private clinics, community settings, and county hospitals. These results support that autopsy remains an important tool in establishing causes of death in medically disputed cases, which may directly establish or exclude fault in medical care and thereby help resolve these cases.

  3. An investigation of children's levels of inquiry in an informal science setting

    NASA Astrophysics Data System (ADS)

    Clark-Thomas, Beth Anne

    Elementary school students' understanding of both science content and processes is enhanced by the higher level thinking associated with inquiry-based science investigations. Informal science setting personnel, elementary school teachers, and curriculum specialists charged with designing inquiry-based investigations would be well served by an understanding of the varying influence of certain factors upon students' willingness and ability to delve into such higher level inquiries. This study examined young children's use of inquiry-based materials and factors which may influence the level of inquiry they engaged in during informal science activities. An informal science setting was selected as the context for the examination of student inquiry behaviors because of the rich inquiry-based environment present at the site and the benefits previously noted in the research regarding the impact of informal science settings upon the construction of knowledge in science. The study revealed several patterns of behavior among children engaged in inquiry-based activities at informal science exhibits. These repeated behaviors varied in the children's apparently purposeful use of the materials at the exhibits. These levels of inquiry behavior were taxonomically defined as high/medium/low within this study utilizing a researcher-developed tool. Furthermore, in this study adult interventions, questions, or prompting were found to impact the level of inquiry engaged in by the children. This study revealed that higher levels of inquiry were preceded by task-directed and physical feature prompts. Moreover, the levels of inquiry behaviors were halted, even lowered, when preceded by a prompt that focused on a science content or concept question. Results of this study have implications for the enhancement of inquiry-based science activities in elementary schools as well as in informal science settings.
These findings have significance for all science educators

  4. Re-Setting the Concentration Levels of Students in Higher Education: An Exploratory Study

    ERIC Educational Resources Information Center

    Burke, Lisa A.; Ray, Ruth

    2008-01-01

    Evidence suggests that college students' concentration levels are limited and hard to maintain. Even though relevant in higher education, scant empirical research exists on interventions to "re-set" their concentration during a college lecture. Using a within-subjects design, four active learning interventions are administered across two…

  5. Physical Activity Levels in Coeducational and Single-Gender High School Physical Education Settings

    ERIC Educational Resources Information Center

    Hannon, James; Ratliffe, Thomas

    2005-01-01

    The purpose of this study was to investigate the effects of coeducational (coed) and single-gender game-play settings on the activity levels of Caucasian and African American high school physical education students. Students participated in flag football, ultimate Frisbee, and soccer units. Classes were as follows: there were two coed classes, two…

  6. Stabilised finite-element methods for solving the level set equation with mass conservation

    NASA Astrophysics Data System (ADS)

    Kabirou Touré, Mamadou; Fahsi, Adil; Soulaïmani, Azzeddine

    2016-01-01

    Finite-element methods are studied for solving moving interface flow problems using the level set approach and the stabilised variational formulation proposed in Touré and Soulaïmani (2012; to appear in 2016), coupled with a level set correction method. The correction is intended to improve satisfaction of mass conservation. The stabilised variational formulation constrains the level set function to remain close to the signed distance function, while the correction step enforces the mass balance. The eXtended finite-element method (XFEM) is used to account for the discontinuities of material properties within an element, and is applied to solve the Navier-Stokes equations for two-phase flows. The numerical methods are evaluated on several test cases: a time-reversed vortex flow, rigid-body rotation of Zalesak's disc, sloshing flow in a tank, a dam break over a bed, and a rising bubble subjected to buoyancy. The numerical results show the importance of satisfying global mass conservation to accurately capture the interface position.
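
    The mass-balance correction step can be pictured with a simple global variant: shift the level set function by a constant so that the area enclosed by its zero contour matches a target value, locating the shift by bisection. This is only an illustrative sketch under that simplifying assumption, not the stabilised finite-element scheme of the paper; all names are hypothetical.

```python
def enclosed_area(phi, cell_area):
    """Area of the region {phi < 0}, approximated by counting grid cells."""
    return sum(cell_area for row in phi for v in row if v < 0.0)

def mass_correct(phi, target_area, cell_area, bracket=1.0, max_iter=100):
    """Shift phi by a constant c (found by bisection) so that the area
    enclosed by the zero level set matches target_area."""
    lo, hi = -bracket, bracket  # assumes the required shift lies in this range
    c = 0.0
    for _ in range(max_iter):
        c = 0.5 * (lo + hi)
        area = enclosed_area([[v + c for v in row] for row in phi], cell_area)
        # raising phi shrinks the region {phi < 0}, so area decreases as c grows
        if area > target_area:
            lo = c
        else:
            hi = c
    return [[v + c for v in row] for row in phi]
```

    Run on a signed distance function to a circle, the corrected field encloses the requested area up to grid-counting error.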

  7. 76 FR 9004 - Public Comment on Setting Achievement Levels in Writing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-16

    ... WestEd to assist in gathering feedback on the design document. Additional information on the Governing... kinds of information that the Governing Board is seeking to obtain regarding the Design ] Document, and... recommendations to improve the design proposed for setting achievement levels for NAEP in writing. This...

  8. A fast level set method for synthetic aperture radar ocean image segmentation.

    PubMed

    Huang, Xiaoxia; Huang, Bo; Li, Hongga

    2009-01-01

    Segmentation of high-noise imagery like Synthetic Aperture Radar (SAR) images is still one of the most challenging tasks in image processing. While the level set approach, based on the analysis of the motion of an interface, can be used to address this challenge, its cell-based iterations may make image segmentation remarkably slow, especially for large images. For this reason fast level set algorithms such as narrow band and fast marching have been developed. Building upon these, this paper presents an improved fast level set method for SAR ocean image segmentation. The method depends on both an intensity-driven speed and curvature flow, which together produce a stable and smooth boundary. Notably, it is optimized to track moving interfaces, keeping up with the point-wise boundary propagation using a single list and a fast upwind-scheme iteration. The list facilitates efficient insertion and deletion of pixels on the propagation front, while the local upwind scheme is used to update the motion of the curvature front instead of solving partial differential equations. Experiments on the extraction of surface slick features from ERS-2 SAR images substantiate the efficacy of the proposed fast level set method.
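
    The two ingredients named above, a narrow band of active cells around the front and an upwind update of its motion, can be sketched in one dimension. This is a generic narrow-band upwind step for the front-propagation equation φ_t + F|φ_x| = 0, not the paper's optimized single-list algorithm; the function name and band width are illustrative.

```python
def upwind_step(phi, speed, dx, dt, band=10):
    """One explicit step of phi_t + speed * |phi_x| = 0 in 1D, updating only
    cells within `band` cells of the zero crossing (the narrow band)."""
    n = len(phi)
    active = set()
    for i in range(n - 1):
        if phi[i] * phi[i + 1] <= 0.0:        # the front lies between i and i+1
            active.update(range(max(0, i - band), min(n, i + band + 2)))
    new = list(phi)
    for i in active:
        dm = (phi[i] - phi[i - 1]) / dx if i > 0 else 0.0
        dp = (phi[i + 1] - phi[i]) / dx if i < n - 1 else 0.0
        if speed > 0.0:   # Godunov upwind selection for an expanding front
            grad = max(max(dm, 0.0), -min(dp, 0.0))
        else:
            grad = max(-min(dm, 0.0), max(dp, 0.0))
        new[i] = phi[i] - dt * speed * grad
    return new
```

    Starting from a 1D signed distance function, repeated steps move the zero crossing at the prescribed speed while leaving cells far from the front untouched.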

  9. Level set segmentation for greenbelts by integrating wavelet texture and priori color knowledge

    NASA Astrophysics Data System (ADS)

    Yang, Tie-jun; Song, Zhi-hui; Jiang, Chuan-xian; Huang, Lin

    2013-09-01

    Segmenting greenbelts quickly and accurately in remote sensing images is an economical and effective method for computing the green coverage rate (GCR). To address the over-reliance on prior knowledge of the traditional level set segmentation model based on the max-flow/min-cut Graph Cut principle and weighted Total Variation (GCTV), this paper proposes a level set segmentation method combining regional texture features with prior color knowledge and applies it to greenbelt segmentation in urban remote sensing images. Because the color of greenbelts alone is not reliable for segmentation, the Gabor wavelet transform is used to extract image texture features. These features are integrated into the GCTV model, which otherwise contains only prior color knowledge, so that both the prior knowledge and the targets' texture constrain the evolution of the level set; this mitigates the over-reliance on prior knowledge. Meanwhile, the convexity of the corresponding energy functional is ensured by a relaxation and thresholding method, and a primal-dual algorithm with global relabeling is used to accelerate the evolution of the level set. Experiments show that the method effectively reduces the dependence of GCTV on prior knowledge and yields more accurate greenbelt segmentation results.
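
    The texture-extraction step can be illustrated with a standard real-valued Gabor kernel, a Gaussian envelope modulating an oriented cosine carrier. This is the generic construction, not the paper's specific filter bank; the parameter values below are illustrative.

```python
import math

def gabor_kernel(ksize, sigma, theta, lambd, psi=0.0, gamma=1.0):
    """Real Gabor kernel of size ksize x ksize: a Gaussian envelope (width
    sigma, aspect ratio gamma) modulating a cosine of wavelength lambd,
    oriented at angle theta with phase offset psi."""
    half = ksize // 2
    kern = [[0.0] * ksize for _ in range(ksize)]
    for y in range(-half, half + 1):
        for x in range(-half, half + 1):
            # rotate coordinates into the filter's orientation
            xp = x * math.cos(theta) + y * math.sin(theta)
            yp = -x * math.sin(theta) + y * math.cos(theta)
            env = math.exp(-(xp * xp + gamma * gamma * yp * yp) / (2.0 * sigma * sigma))
            kern[y + half][x + half] = env * math.cos(2.0 * math.pi * xp / lambd + psi)
    return kern
```

    Convolving an image with a bank of such kernels at several orientations and wavelengths yields the per-pixel texture responses that a segmentation energy can then use.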

  10. Energy-optimal path planning by stochastic dynamically orthogonal level-set optimization

    NASA Astrophysics Data System (ADS)

    Subramani, Deepak N.; Lermusiaux, Pierre F. J.

    2016-04-01

    A stochastic optimization methodology is formulated for computing energy-optimal paths from among time-optimal paths of autonomous vehicles navigating in a dynamic flow field. Based on partial differential equations, the methodology rigorously leverages the level-set equation that governs time-optimal reachability fronts for a given relative vehicle-speed function. To set up the energy optimization, the relative vehicle-speed and headings are considered to be stochastic and new stochastic Dynamically Orthogonal (DO) level-set equations are derived. Their solution provides the distribution of time-optimal reachability fronts and corresponding distribution of time-optimal paths. An optimization is then performed on the vehicle's energy-time joint distribution to select the energy-optimal paths for each arrival time, among all stochastic time-optimal paths for that arrival time. Numerical schemes to solve the reduced stochastic DO level-set equations are obtained, and accuracy and efficiency considerations are discussed. These reduced equations are first shown to be efficient at solving the governing stochastic level-sets, in part by comparisons with direct Monte Carlo simulations. To validate the methodology and illustrate its accuracy, comparisons with semi-analytical energy-optimal path solutions are then completed. In particular, we consider the energy-optimal crossing of a canonical steady front and set up its semi-analytical solution using an energy-time nested nonlinear double-optimization scheme. We then showcase the inner workings and nuances of the energy-optimal path planning, considering different mission scenarios. Finally, we study and discuss results of energy-optimal missions in a wind-driven barotropic quasi-geostrophic double-gyre ocean circulation.

  11. A GPU Accelerated Discontinuous Galerkin Conservative Level Set Method for Simulating Atomization

    NASA Astrophysics Data System (ADS)

    Jibben, Zechariah J.

    This dissertation describes a process for interface capturing via an arbitrary-order, nearly quadrature-free, discontinuous Galerkin (DG) scheme for the conservative level set method (Olsson et al., 2005, 2008). The DG numerical method is utilized to solve both advection and reinitialization, and executed on a refined level set grid (Herrmann, 2008) for effective use of processing power. Computation is executed in parallel utilizing both CPU and GPU architectures to make the method feasible at high order. Finally, a sparse data structure is implemented to take full advantage of parallelism on the GPU, where performance relies on well-managed memory operations. With solution variables projected into a kth-order polynomial basis, a (k+1)-order convergence rate is found for both advection and reinitialization tests using the method of manufactured solutions. Other standard test cases, such as Zalesak's disk and deformation of columns and spheres in periodic vortices, are also performed, showing several orders of magnitude improvement over traditional WENO level set methods. These tests also show the impact of reinitialization, which often increases shape and volume errors as a result of level set scalar trapping by normal vectors calculated from the local level set field. Accelerating advection via GPU hardware is found to provide a 30x speedup for an Nvidia Tesla K20 GPU over a single core of a 2.0 GHz Intel Xeon E5-2620 CPU, with speedup factors increasing with polynomial degree until shared memory is filled. A similar algorithm is implemented for reinitialization, which relies more heavily on shared and global memory and as a result fills them more quickly and produces a smaller speedup of 18x.
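
    The conservative level set of Olsson et al. replaces the signed distance function with a smeared Heaviside profile and keeps it sharp via a compressive reinitialization equation. Below is a minimal 1D finite-difference sketch of one reinitialization step; central differences and a fixed +x interface normal are simplifying assumptions (the dissertation uses a DG discretization), and the names are illustrative.

```python
def reinit_step(psi, dx, dt, eps):
    """One explicit step of 1D conservative level set reinitialization,
    psi_t + d/dx[psi*(1 - psi)] = eps * psi_xx, assuming the interface
    normal points in +x. Its steady state is a tanh profile of width eps."""
    def g(v):                      # compressive flux that sharpens the profile
        return v * (1.0 - v)
    new = list(psi)
    for i in range(1, len(psi) - 1):
        dflux = (g(psi[i + 1]) - g(psi[i - 1])) / (2.0 * dx)
        diff = (psi[i + 1] - 2.0 * psi[i] + psi[i - 1]) / (dx * dx)
        new[i] = psi[i] + dt * (eps * diff - dflux)
    return new
```

    Applied to the exact tanh steady state, a step leaves the profile essentially unchanged, which is the defining property of the reinitialization.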

  12. Online monitoring of oil film using electrical capacitance tomography and level set method

    SciTech Connect

    Xue, Q.; Ma, M.; Sun, B. Y.; Cui, Z. Q.; Wang, H. X.

    2015-08-15

    In oil-air lubrication systems, electrical capacitance tomography (ECT) provides a promising way to monitor the oil film in pipelines by reconstructing cross-sectional oil distributions in real time. However, for small-diameter pipes carrying thin oil films, the film thickness is hard to observe visually because the oil-air interface is not distinct in the reconstructed images, and artifacts in the reconstructions seriously degrade the effectiveness of image segmentation techniques such as the level set method. Moreover, the standard level set method is unsuitable for online monitoring because of its low computation speed. To address these problems, a modified level set method is developed: a distance-regularized level set evolution formulation is extended to image two-phase flow online using an ECT system; a narrowband image filter is defined to eliminate the influence of artifacts; and, exploiting the continuity of the oil distribution over time, the oil-air interface detected in one frame is used as the initial contour for the next frame, which greatly accelerates the propagation from the initial contour to the boundary and makes real-time tracking possible. To verify the feasibility of the proposed method, an oil-air lubrication facility with a 4 mm inner-diameter pipe is measured in normal operation using an 8-electrode ECT system. Both simulation and experimental results indicate that the modified level set method can visualize the oil-air interface accurately online.

  13. A localized re-initialization equation for the conservative level set method

    NASA Astrophysics Data System (ADS)

    McCaslin, Jeremy O.; Desjardins, Olivier

    2014-04-01

    The conservative level set methodology for interface transport is modified to allow for localized level set re-initialization. This approach is suited to applications in which there is a significant amount of spatial variability in level set transport. The steady-state solution of the modified re-initialization equation matches that of the original conservative level set provided an additional Eikonal equation is solved, which can be done efficiently through a fast marching method (FMM). Implemented within the context of the accurate conservative level set method (ACLS) (Desjardins et al., 2008, [6]), the FMM solution of this Eikonal equation comes at no additional cost. A metric for the appropriate amount of local re-initialization is proposed based on estimates of local flow deformation and numerical diffusion. The method is compared to standard global re-initialization for two test cases, yielding the expected results that minor differences are observed for Zalesak's disk, and improvements in both mass conservation and interface topology are seen for a drop deforming in a vortex. Finally, the method is applied to simulation of a viscously damped standing wave and a three-dimensional drop impacting on a shallow pool. Negligible differences are observed for the standing wave, as expected. For the last case, results suggest that spatially varying re-initialization provides a reduction in spurious interfacial corrugations, improvements in the prediction of radial growth of the splashing lamella, and a reduction in conservation errors, as well as a reduction in overall computational cost that comes from improved conditioning of the pressure Poisson equation due to the removal of spurious corrugations.
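
    An FMM solves an Eikonal equation by accepting grid cells in increasing order of arrival time from a priority queue, giving O(N log N) complexity. The sketch below solves |∇T| = 1 with first-order upwind updates; it is a generic FMM, not the ACLS implementation, and the names are illustrative.

```python
import heapq

def fast_marching(n, dx, sources):
    """First-order fast marching solution of |grad T| = 1 on an n-by-n grid,
    with T = 0 on the given source cells; T approximates distance to the sources."""
    INF = float("inf")
    T = [[INF] * n for _ in range(n)]
    done = [[False] * n for _ in range(n)]
    heap = []
    for i, j in sources:
        T[i][j] = 0.0
        heapq.heappush(heap, (0.0, i, j))
    while heap:
        t, i, j = heapq.heappop(heap)
        if done[i][j]:
            continue                        # stale heap entry
        done[i][j] = True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if not (0 <= a < n and 0 <= b < n) or done[a][b]:
                continue
            # smallest neighbour value in each grid direction (upwinding)
            tx = min(T[a - 1][b] if a > 0 else INF, T[a + 1][b] if a < n - 1 else INF)
            ty = min(T[a][b - 1] if b > 0 else INF, T[a][b + 1] if b < n - 1 else INF)
            lo, hi = min(tx, ty), max(tx, ty)
            if hi == INF or hi - lo >= dx:        # one direction dominates
                cand = lo + dx
            else:                                 # two-sided quadratic update
                cand = 0.5 * (tx + ty + (2.0 * dx * dx - (tx - ty) ** 2) ** 0.5)
            if cand < T[a][b]:
                T[a][b] = cand
                heapq.heappush(heap, (cand, a, b))
    return T
```

    Along grid axes the solution is exact; along the diagonal it overestimates the Euclidean distance by the scheme's first-order error.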

  15. Online monitoring of oil film using electrical capacitance tomography and level set method.

    PubMed

    Xue, Q; Sun, B Y; Cui, Z Q; Ma, M; Wang, H X

    2015-08-01

    In oil-air lubrication systems, electrical capacitance tomography (ECT) provides a promising way to monitor the oil film in pipelines by reconstructing cross-sectional oil distributions in real time. However, for small-diameter pipes carrying thin oil films, the film thickness is hard to observe visually because the oil-air interface is not distinct in the reconstructed images, and artifacts in the reconstructions seriously degrade the effectiveness of image segmentation techniques such as the level set method. Moreover, the standard level set method is unsuitable for online monitoring because of its low computation speed. To address these problems, a modified level set method is developed: a distance-regularized level set evolution formulation is extended to image two-phase flow online using an ECT system; a narrowband image filter is defined to eliminate the influence of artifacts; and, exploiting the continuity of the oil distribution over time, the oil-air interface detected in one frame is used as the initial contour for the next frame, which greatly accelerates the propagation from the initial contour to the boundary and makes real-time tracking possible. To verify the feasibility of the proposed method, an oil-air lubrication facility with a 4 mm inner-diameter pipe is measured in normal operation using an 8-electrode ECT system. Both simulation and experimental results indicate that the modified level set method can visualize the oil-air interface accurately online. PMID:26329232

  17. Pull-push level sets: a new term to encode prior knowledge for the segmentation of teeth images

    NASA Astrophysics Data System (ADS)

    de Luis Garcia, Rodrigo; San Jose Estepar, Raul; Alberola-Lopez, Carlos

    2005-04-01

    This paper presents a novel level set method for contour detection in multiple-object scenarios, applied to the segmentation of teeth images. Teeth segmentation from 2D images of dental plaster cast models is a difficult problem because it is necessary to independently segment several objects whose mutual borders are very poorly defined. Contour detection methods that rely only on image information cannot successfully segment such structures, so prior knowledge about the problem domain is required. Current approaches in the literature, however, are limited to extracting shape information of individual objects, whereas the key factor in this problem is the relative positions of the different objects composing the anatomical structure. We therefore propose a novel method for introducing such information into a level set framework. This results in a new energy term, interpretable as a regional term, that takes into account the relative positions of the different objects and consequently creates an attraction or repulsion force favoring a particular configuration. The proposed method is compared with balloon and GVF snakes, as well as with the Geodesic Active Regions model, showing accurate results.

  18. Loosely coupled level sets for retinal layers and drusen segmentation in subjects with dry age-related macular degeneration

    NASA Astrophysics Data System (ADS)

    Novosel, Jelena; Wang, Ziyuan; de Jong, Henk; Vermeer, Koenraad A.; van Vliet, Lucas J.

    2016-03-01

    Optical coherence tomography (OCT) is used to produce high-resolution three-dimensional images of the retina, which permit the investigation of retinal irregularities. In dry age-related macular degeneration (AMD), a chronic eye disease that causes central vision loss, disruptions such as drusen and changes in retinal layer thicknesses occur which could be used as biomarkers for disease monitoring and diagnosis. Because such pathology disrupts the retinal topology, existing segmentation methods often fail. Here, we present a solution for the segmentation of retinal layers in dry AMD subjects by extending our previously presented loosely coupled level sets framework, which operates on attenuation coefficients. In eyes affected by AMD, Bruch's membrane becomes visible only below the drusen, and our segmentation framework is adapted to delineate such a partially discernible interface. Furthermore, the initialization stage, which tentatively segments five interfaces, is modified to accommodate the appearance of drusen. This stage is based on Dijkstra's algorithm and combines prior knowledge of the shape of the interface, the gradient, and the attenuation coefficient in the newly proposed cost function. This prior knowledge is incorporated by varying the weights for horizontal, diagonal and vertical edges. Finally, quantitative evaluation of the accuracy shows good agreement between manual and automated segmentation.
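
    A Dijkstra-based initialization with different weights for horizontal, diagonal, and vertical edges can be sketched generically as a shortest-path search over an image grid that proceeds toward the last column. The cost values and weights below are illustrative placeholders, not the authors' cost function.

```python
import heapq

def min_interface_cost(cost, w_h=1.0, w_d=1.5, w_v=2.0, start_row=0):
    """Dijkstra over an image grid: each step enters a pixel horizontally,
    diagonally, or vertically and adds that pixel's cost times a
    direction-dependent weight. Returns the minimal accumulated cost from
    (start_row, 0) to any pixel in the last column."""
    rows, cols = len(cost), len(cost[0])
    moves = [((0, 1), w_h), ((-1, 1), w_d), ((1, 1), w_d),
             ((-1, 0), w_v), ((1, 0), w_v)]
    dist = {(start_row, 0): 0.0}
    heap = [(0.0, start_row, 0)]
    while heap:
        d, r, c = heapq.heappop(heap)
        if d > dist.get((r, c), float("inf")):
            continue                      # stale entry
        if c == cols - 1:
            return d                      # first last-column pop is optimal
        for (dr, dc), w in moves:
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + w * cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, nr, nc))
    return float("inf")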

  19. Three-Dimensional Level Set Modelling of Capillary-Controlled Displacements in Digital Porous Media

    NASA Astrophysics Data System (ADS)

    Helland, J.; Jettestuen, E.; Hatzignatiou, D. G.; Silin, D.

    2011-12-01

    In geological CO2 storage, capillary entry pressures for CO2 invasion into low-permeability formation layers or cap rock are required for reliable prediction of the displacement front in the storage site. High capillary entry pressures can hinder upward migration of CO2, causing it to either move laterally or become trapped. We present a 3D level set model for simulating capillary-controlled displacements in 3D rock images. Capillary pressure and interfacial area versus saturation curves, as well as mean and principal interface curvatures, are computed from the proposed model. The level set model is compared with a 2D semi-analytical model for calculating capillary pressure curves and arc menisci configurations in straight tubes with pore cross-sections obtained from 2D rock images. The critical displacement events and capillary entry pressures simulated with both models are in agreement. The level set simulations show that the computed mean curvature is approximately constant everywhere on the interfaces at steady state, whereas the two principal interface curvatures can vary significantly in pore space constrictions. It is also shown that the semi-analytical model provides a sufficient approximation to the initial fluid configuration required by the level set model. Level set simulations are performed in 3D images of random sphere packs (see Figure) and sandstone rocks, and the computed capillary pressure and interfacial area curves exhibit trends similar to measured data. The impact of grid refinement on the simulated results is explored. It is demonstrated that the model accounts for several well-documented critical pore-level phenomena in 3D porous media, such as co-operative pore filling and Haines jumps. Furthermore, the non-wetting fluid is observed to snap off water by coalescence of opposite interfaces. These simulations also show that the two principal curvatures can vary significantly, which indicates that the shape of the interfaces is far from spherical in many
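
    The interface curvature examined in these simulations is computed directly from the level set function via κ = ∇·(∇φ/|∇φ|). A 2D central-difference sketch of this formula is shown below; the study itself is 3D, and the names here are hypothetical.

```python
def curvature(phi, i, j, dx):
    """Curvature of the level set of phi through cell (i, j), from
    kappa = (pxx*py^2 - 2*px*py*pxy + pyy*px^2) / |grad phi|^3,
    the 2D expansion of div(grad phi / |grad phi|)."""
    px = (phi[i + 1][j] - phi[i - 1][j]) / (2.0 * dx)
    py = (phi[i][j + 1] - phi[i][j - 1]) / (2.0 * dx)
    pxx = (phi[i + 1][j] - 2.0 * phi[i][j] + phi[i - 1][j]) / dx ** 2
    pyy = (phi[i][j + 1] - 2.0 * phi[i][j] + phi[i][j - 1]) / dx ** 2
    pxy = (phi[i + 1][j + 1] - phi[i + 1][j - 1]
           - phi[i - 1][j + 1] + phi[i - 1][j - 1]) / (4.0 * dx ** 2)
    g2 = px * px + py * py
    return (pxx * py * py - 2.0 * px * py * pxy + pyy * px * px) / (g2 ** 1.5 + 1e-12)
```

    On a signed distance function to a circle, the formula recovers 1/r at radius r, up to discretization error.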

  20. Level set algorithms comparison for multi-slice CT left ventricle segmentation

    NASA Astrophysics Data System (ADS)

    Medina, Ruben; La Cruz, Alexandra; Ordoñes, Andrés; Pesántez, Daniel; Morocho, Villie; Vanegas, Pablo

    2015-12-01

    The comparison of several Level Set algorithms is performed with respect to 2D left ventricle segmentation in Multi-Slice CT images. Five algorithms are compared by calculating the Dice coefficient between the resulting segmentation contour and a reference contour traced by a cardiologist. The algorithms are also tested on images contaminated with Gaussian noise for several values of PSNR. Additionally, an algorithm for providing the initialization shape is proposed. This algorithm is based on a combination of mathematical morphology tools with watershed and region growing algorithms. Results on the set of test images are promising and suggest the extension to 3D MSCT database segmentation.
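
    The Dice coefficient used for this comparison is straightforward to compute from binary masks; a minimal sketch follows (representing masks as sets of pixel coordinates is an assumption for illustration).

```python
def dice_coefficient(a, b):
    """Dice similarity 2|A ∩ B| / (|A| + |B|) between two binary masks,
    each given as a set of (row, col) pixel coordinates."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0   # two empty masks agree perfectly by convention
    return 2.0 * len(a & b) / (len(a) + len(b))
```

    Identical masks score 1.0, disjoint masks 0.0, and partial overlap falls in between.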

  1. A patient-centered pharmacy services model of HIV patient care in community pharmacy settings: a theoretical and empirical framework.

    PubMed

    Kibicho, Jennifer; Owczarzak, Jill

    2012-01-01

    Reflecting trends in health care delivery, pharmacy practice has shifted from a drug-specific to a patient-centered model of care, aimed at improving the quality of patient care and reducing health care costs. In this article, we outline a theoretical model of patient-centered pharmacy services (PCPS), based on in-depth, qualitative interviews with a purposive sample of 28 pharmacists providing care to HIV-infected patients in specialty, semispecialty, and nonspecialty pharmacy settings. Data analysis was an interactive process informed by pharmacists' interviews and a review of the general literature on patient-centered care, including Medication Therapy Management (MTM) services. Our main finding was that the current models of pharmacy services, including MTM, do not capture the range of pharmacy services in excess of mandated drug dispensing services. In this article, we propose a theoretical PCPS model that reflects the actual services pharmacists provide. The model includes five elements: (1) addressing patients as whole, contextualized persons; (2) customizing interventions to unique patient circumstances; (3) empowering patients to take responsibility for their own health care; (4) collaborating with clinical and nonclinical providers to address patient needs; and (5) developing sustained relationships with patients. The overarching goal of PCPS is to empower patients to take responsibility for their own health care and self-manage their HIV infection. Our findings provide the foundation for future studies regarding how widespread these practices are in diverse community settings, the validity of the proposed PCPS model, the potential for standardizing pharmacist practices, and the feasibility of a PCPS framework to reimburse pharmacists' services.

  2. A variational level set method for the topology optimization of steady-state Navier-Stokes flow

    NASA Astrophysics Data System (ADS)

    Zhou, Shiwei; Li, Qing

    2008-12-01

    The smoothness of topological interfaces often largely affects fluid optimization and can render density-based approaches, though well established in structural design, inadequate. This paper presents a level-set method for the topology optimization of steady-state Navier-Stokes flow subject to a specific fluid volume constraint. The solid-fluid interface is implicitly characterized by the zero-level contour of a higher-order scalar level set function and can be naturally transformed to other configurations as its host moves. A variational form of the cost function is constructed based upon the adjoint variable and Lagrangian multiplier techniques. To satisfy the volume constraint effectively, the Lagrangian multiplier derived from the first-order approximation of the cost function is amended by a bisection algorithm. The procedure allows an initial design to evolve into an optimal shape and/or topology by solving the Hamilton-Jacobi equation. Two classes of benchmark examples are presented: (1) periodic microstructural material design for maximum permeability; and (2) topology optimization of flow channels for minimizing energy dissipation. A number of 2D and 3D examples demonstrate the feasibility and advantages of the level-set method in solving fluid-solid shape and topology optimization problems.

  3. A Framework for Spatial Assessment of Local Level Vulnerability and Adaptive Capacity to Extreme Heat

    NASA Astrophysics Data System (ADS)

    Wilhelmi, O.; Hayden, M.; Harlan, S.; Ruddell, D.; Komatsu, K.; England, B.; Uejio, C.

    2008-12-01

    Changing climate is predicted to increase the intensity and impacts of heat waves, prompting the need to develop preparedness and adaptation strategies that reduce societal vulnerability. Central to understanding societal vulnerability is adaptive capacity: the potential of a system or population to modify its features or behaviors so as to better cope with existing and anticipated stresses and fluctuations. Adaptive capacity influences adaptation, the actual adjustments made to cope with the impacts of current and future hazardous heat events. Understanding societal risks, vulnerabilities, and adaptive capacity to extreme heat events and climate change requires an interdisciplinary approach that includes information about weather and climate, the natural and built environment, social processes and characteristics, interactions with stakeholders, and an assessment of community vulnerability. This project presents a framework for such an interdisciplinary approach and a case study that explores linkages between quantitative and qualitative data for a more comprehensive understanding of local-level vulnerability and adaptive capacity to extreme heat events in Phoenix, Arizona. In this talk, we present a methodological framework for conducting collaborative research on societal vulnerability and adaptive capacity at the local level that integrates household surveys into a quantitative spatial assessment of societal vulnerability. We highlight a collaborative partnership among researchers, community leaders, and public health officials. Linkages between the assessment of local adaptive capacity and the development of regional climate change adaptation strategies are discussed.

  4. Systems Science and Obesity Policy: A Novel Framework for Analyzing and Rethinking Population-Level Planning

    PubMed Central

    Matteson, Carrie L.; Finegood, Diane T.

    2014-01-01

    Objectives. We demonstrate the use of a systems-based framework to assess solutions to complex health problems such as obesity. Methods. We coded 12 documents published between 2004 and 2013 aimed at influencing obesity planning for complex systems design (9 reports from US and Canadian governmental or health authorities, 1 Cochrane review, and 2 Institute of Medicine reports). We sorted data using the intervention-level framework (ILF), a novel solutions-oriented approach to complex problems. An in-depth comparison of 3 documents provides further insight into complexity and systems design in obesity policy. Results. The majority of strategies focused mainly on changing the determinants of energy imbalance (food intake and physical activity). ILF analysis brings to the surface actions aimed at higher levels of system function and points to a need for more innovative policy design. Conclusions. Although many policymakers acknowledge obesity as a complex problem, many strategies stem from the paradigm of individual choice and are limited in scope. The ILF provides a template to encourage natural systems thinking and more strategic policy design grounded in complexity science. PMID:24832406

  5. A Framework for Lab Work Management in Mass Courses. Application to Low Level Input/Output without Hardware

    ERIC Educational Resources Information Center

    Rodriguez, Santiago; Zamorano, Juan; Rosales, Francisco; Dopico, Antonio Garcia; Pedraza, Jose Luis

    2007-01-01

    This paper describes a complete lab work management framework designed and developed in the authors' department to help teachers to manage the small projects that students are expected to complete as lab assignments during their graduate-level computer engineering studies. The paper focuses on an application example of the framework to a specific…

  6. Comparison of bladder segmentation using deep-learning convolutional neural network with and without level sets

    NASA Astrophysics Data System (ADS)

    Cha, Kenny H.; Hadjiiski, Lubomir M.; Samala, Ravi K.; Chan, Heang-Ping; Cohan, Richard H.; Caoili, Elaine M.

    2016-03-01

    We are developing a CAD system for detection of bladder cancer in CTU. In this study we investigated the application of a deep-learning convolutional neural network (DL-CNN) to the segmentation of the bladder, which is a challenging problem because of the strong boundary between the non-contrast and contrast-filled regions in the bladder. We trained a DL-CNN to estimate the likelihood of a pixel being inside the bladder using neighborhood information. The segmented bladder was obtained by thresholding and hole-filling of the likelihood map. We compared the segmentation performance of the DL-CNN alone and with additional cascaded 3D and 2D level sets to refine the segmentation, using 3D hand-segmented contours as the reference standard. The segmentation accuracy was evaluated by five performance measures: average volume intersection %, average % volume error, average absolute % error, average minimum distance, and average Jaccard index for a data set of 81 training and 92 test cases. For the training set, the DL-CNN with level sets achieved performance measures of 87.2+/-6.1%, 6.0+/-9.1%, 8.7+/-6.1%, 3.0+/-1.2 mm, and 81.9+/-7.6%, respectively, while the DL-CNN alone obtained values of 73.6+/-8.5%, 23.0+/-8.5%, 23.0+/-8.5%, 5.1+/-1.5 mm, and 71.5+/-9.2%, respectively. For the test set, the DL-CNN with level sets achieved performance measures of 81.9+/-12.1%, 10.2+/-16.2%, 14.0+/-13.0%, 3.6+/-2.0 mm, and 76.2+/-11.8%, respectively, while the DL-CNN alone obtained 68.7+/-12.0%, 27.2+/-13.7%, 27.4+/-13.6%, 5.7+/-2.2 mm, and 66.2+/-11.8%, respectively. The DL-CNN alone is effective in segmenting bladders but may not follow the details of the bladder wall. The combination of DL-CNN with level sets provides highly accurate bladder segmentation.
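
    The thresholding and hole-filling step described above can be sketched as follows. This is a hypothetical post-processing routine with an illustrative 0.5 threshold and a largest-connected-component cleanup, not the authors' exact pipeline:

```python
import numpy as np
from scipy import ndimage

def segment_from_likelihood(likelihood, threshold=0.5):
    """Binary segmentation from a [0, 1] likelihood map:
    threshold, fill interior holes, keep the largest component."""
    mask = likelihood > threshold               # thresholding
    mask = ndimage.binary_fill_holes(mask)      # hole-filling
    labels, n = ndimage.label(mask)             # connected components
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return labels == (np.argmax(sizes) + 1)     # largest component only

# Toy example: a likelihood map with a high-likelihood blob containing a hole.
lik = np.zeros((64, 64))
lik[20:44, 20:44] = 0.9
lik[30:34, 30:34] = 0.2                         # interior hole
seg = segment_from_likelihood(lik)
```

The hole is filled and the blob survives as a single solid region.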

  7. A Cartesian Adaptive Level Set Method for Two-Phase Flows

    NASA Technical Reports Server (NTRS)

    Ham, F.; Young, Y.-N.

    2003-01-01

    In the present contribution we develop a level set method based on local anisotropic Cartesian adaptation as described in Ham et al. (2002). Such an approach should allow for the smallest possible Cartesian grid capable of resolving a given flow. The remainder of the paper is organized as follows. In section 2 the level set formulation for free surface calculations is presented and its strengths and weaknesses relative to other free surface methods are reviewed. In section 3 the collocated numerical method is described. In section 4 the method is validated by solving the 2D and 3D drop oscillation problem. In section 5 we present some results from more complex cases, including the 3D drop breakup in an impulsively accelerated free stream and the 3D immiscible Rayleigh-Taylor instability. Conclusions are given in section 6.

  8. Feasibility of level-set analysis of enface OCT retinal images in diabetic retinopathy

    PubMed Central

    Mohammad, Fatimah; Ansari, Rashid; Wanek, Justin; Francis, Andrew; Shahidi, Mahnaz

    2015-01-01

    Pathology segmentation in retinal images of patients with diabetic retinopathy is important to help better understand disease processes. We propose an automated level-set method with Fourier descriptor-based shape priors. A cost function measures the difference between the current and expected output. We applied our method to enface images generated for seven retinal layers and determined correspondence of pathologies between retinal layers. We compared our method to a distance-regularized level set method and show the advantages of using well-defined shape priors. Results obtained allow us to observe pathologies across multiple layers and to obtain metrics that measure the co-localization of pathologies in different layers. PMID:26137390
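
    Fourier descriptors of the kind used for the shape priors can be computed from a closed contour's complex boundary signature. The normalization choices below (dropping the DC coefficient for translation invariance, dividing by the first harmonic for scale invariance, keeping magnitudes for rotation invariance) are one common convention, not necessarily the paper's:

```python
import numpy as np

def fourier_descriptors(contour, n_keep=8):
    """Truncated, normalized Fourier descriptors of an ordered
    (N, 2) closed contour of (x, y) boundary points."""
    z = contour[:, 0] + 1j * contour[:, 1]   # complex representation
    F = np.fft.fft(z)
    F[0] = 0.0                               # remove translation (DC term)
    F = F / np.abs(F[1])                     # normalize scale
    return np.abs(F[1:n_keep + 1])           # magnitudes: rotation invariant

# A circle and a scaled, shifted copy yield (nearly) identical descriptors.
t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
c1 = np.stack([np.cos(t), np.sin(t)], axis=1)
c2 = np.stack([3 * np.cos(t) + 5, 3 * np.sin(t) + 2], axis=1)
d1, d2 = fourier_descriptors(c1), fourier_descriptors(c2)
```

This invariance is what makes such descriptors suitable as compact shape priors.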

  9. Phase field and level set methods for modeling solute precipitation and/or dissolution

    SciTech Connect

    Zhijie Xu; Hai Huang; Paul Meakin

    2012-01-01

    The dynamics of solid-liquid interfaces controlled by solute precipitation and/or dissolution due to chemical reaction at the interface were computed in two dimensions using a phase field model. Sharp-interface asymptotic analysis demonstrated that the phase field solutions should converge to the proper sharp-interface precipitation/dissolution limit. For the purpose of comparison, the sharp-interface model for solute precipitation/dissolution was solved directly using a level set method. In general, the phase field results are in good agreement with the level set results for all reaction rates and geometry configurations investigated. The present study supports the application of both methods to more complicated and realistic reactive systems, including nuclear waste release and mineral precipitation and dissolution.

  10. Phase field and level set methods for modeling solute precipitation and/or dissolution

    SciTech Connect

    Xu, Zhijie; Huang, Hai; Li, Xiaoyi; Meakin, Paul

    2012-01-02

    The dynamics of solid-liquid interfaces controlled by solute precipitation and/or dissolution due to chemical reaction at the interface were computed in two dimensions using a phase field model. Sharp-interface asymptotic analysis demonstrated that the phase field solutions should converge to the proper sharp-interface precipitation/dissolution limit. For the purpose of comparison, the sharp-interface model for solute precipitation/dissolution was solved directly using a level set method. In general, the phase field results are in good agreement with the level set results for all reaction rates and geometry configurations. The present study supports the application of both methods to more complicated and realistic reactive systems.

  11. Atlas-based segmentation of 3D cerebral structures with competitive level sets and fuzzy control.

    PubMed

    Ciofolo, Cybèle; Barillot, Christian

    2009-06-01

    We propose a novel approach for the simultaneous segmentation of multiple structures with competitive level sets driven by fuzzy control. To this end, several contours evolve simultaneously toward previously defined anatomical targets. A fuzzy decision system combines the a priori knowledge provided by an anatomical atlas with the intensity distribution of the image and the relative position of the contours. This combination automatically determines the directional term of the evolution equation of each level set. This leads to a local expansion or contraction of the contours, in order to match the boundaries of their respective targets. Two applications are presented: the segmentation of the brain hemispheres and the cerebellum, and the segmentation of deep internal structures. Experimental results on real magnetic resonance (MR) images are presented, quantitatively assessed and discussed.

  12. Topology optimized design of carpet cloaks based on a level set approach

    NASA Astrophysics Data System (ADS)

    Fujii, Garuda; Nakamura, Masayuki

    2015-09-01

    This paper presents topology optimized designs of carpet cloaks made of dielectrics modeled by a level set boundary expression. The objective functional, evaluating the performance of the carpet cloaks, is defined as the integrated intensity of the difference between the electric field reflected by a flat plane and that controlled by a carpet cloak covering a bump. The dielectric structures of the carpet cloak are designed to minimize the objective functional value and, in some cases, the value reaches 0.34% of that obtained when a bare bump exists. The dielectric structures of the carpet cloaks are expressed by level set functions given on grid points. The level set function is positive in the dielectric, negative in air, and zero on air-dielectric interfaces, so that the air-dielectric interfaces are expressed explicitly.
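
    The sign convention of a level set boundary expression can be illustrated with a toy numpy sketch; the disc-shaped "design" and grid spacing here are purely illustrative:

```python
import numpy as np

# Level set boundary expression sketch: phi > 0 inside the dielectric,
# phi < 0 in air, phi = 0 on the air-dielectric interface.
n, r = 65, 0.5
x = np.linspace(-1.0, 1.0, n)
X, Y = np.meshgrid(x, x)
phi = r - np.sqrt(X**2 + Y**2)   # signed distance to a disc of radius r

dielectric = phi > 0             # material region
air = phi < 0
area = dielectric.mean() * 4.0   # material fraction of the [-1, 1]^2 domain
```

The zero contour of `phi` gives the interface explicitly, which is what allows boundary-aware topology optimization.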

  13. Parallel computation of level set method for 500 Hz visual servo control

    NASA Astrophysics Data System (ADS)

    Fei, Xianfeng; Igarashi, Yasunobu; Hashimoto, Koichi

    2008-11-01

    We propose a 2D microorganism tracking system using a parallel level set method and a column parallel vision system (CPV). The system keeps a single microorganism in the middle of the visual field under a microscope by visually servoing an automated stage. We propose a new energy function for the level set method that constrains the amount of light intensity inside the detected object contour in order to control the number of detected objects. The algorithm is implemented in the CPV system, and the computational time for each frame is approximately 2 ms. A tracking experiment of about 25 s is demonstrated. We also demonstrate that a single paramecium can be kept in track even if other paramecia appear in the visual field and contact the tracked paramecium.

  14. Unsupervised segmentation of the prostate using MR images based on level set with a shape prior.

    PubMed

    Liu, Xin; Langer, D L; Haider, M A; Van der Kwast, T H; Evans, A J; Wernick, M N; Yetik, I S

    2009-01-01

    Prostate cancer is the second leading cause of cancer death in American men. Current prostate MRI can benefit from automated tumor localization to help guide biopsy, radiotherapy, and surgical planning. An important step in automated prostate cancer localization is the segmentation of the prostate. In this paper, we propose a fully automatic method for the segmentation of the prostate. We first apply a deformable ellipse model to find an ellipse that best fits the prostate shape. Then, this ellipse is used to initialize the level set and to constrain the level set evolution with a shape penalty term. Finally, post-processing methods are applied to refine the prostate boundaries. We apply the proposed method to real diffusion-weighted imaging (DWI) MRI data to test its performance. Our results show that accurate segmentation can be obtained with the proposed method compared to human readers.
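
    A moment-based fit is one simple way to obtain an initial ellipse of the kind described above. This generic sketch (central second moments plus a 2x2 eigen-decomposition) is not the paper's deformable ellipse model:

```python
import numpy as np

def fit_ellipse_moments(mask):
    """Fit an ellipse to a binary mask via image moments.
    Returns (cx, cy, a, b, theta): center, semi-axes, orientation."""
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()
    x, y = xs - cx, ys - cy
    # Covariance of the pixel coordinates inside the mask.
    cov = np.array([[(x * x).mean(), (x * y).mean()],
                    [(x * y).mean(), (y * y).mean()]])
    evals, evecs = np.linalg.eigh(cov)           # ascending eigenvalues
    a, b = 2.0 * np.sqrt(evals[::-1])            # semi-axes (major first)
    theta = np.arctan2(evecs[1, 1], evecs[0, 1]) # major-axis angle
    return cx, cy, a, b, theta

# Axis-aligned test ellipse: semi-axes 20 and 10, centered at (40, 30).
Y, X = np.mgrid[0:80, 0:80]
mask = ((X - 40) / 20.0) ** 2 + ((Y - 30) / 10.0) ** 2 <= 1.0
cx, cy, a, b, theta = fit_ellipse_moments(mask)
```

For a filled ellipse the second moment along a principal axis is (semi-axis)^2 / 4, which is why the factor 2 recovers the semi-axes.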

  15. Automatic segmentation of Leishmania parasite in microscopic images using a modified CV level set method

    NASA Astrophysics Data System (ADS)

    Farahi, Maria; Rabbani, Hossein; Talebi, Ardeshir; Sarrafzadeh, Omid; Ensafi, Shahab

    2015-12-01

    Visceral Leishmaniasis is a parasitic disease that affects the liver, spleen, and bone marrow. According to a World Health Organization report, definitive diagnosis is possible only by direct observation of the Leishman body in microscopic images taken from bone marrow samples. We utilize morphological operations and the Chan-Vese (CV) level set method to segment Leishman bodies in digital color microscopic images captured from bone marrow samples. Linear contrast stretching is used for image enhancement, and a morphological method is applied to determine the parasite regions and remove unwanted objects. Modified global and local CV level set methods are proposed for segmentation, and a shape-based stopping factor is used to accelerate the algorithm. Manual segmentation is considered the ground truth to evaluate the proposed method. The method was tested on 28 samples and achieved a mean segmentation error of 10.90% for the global model and 9.76% for the local model.
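
    The data term of the Chan-Vese (CV) model alternates between updating the two region mean intensities and reassigning pixels to the closer mean. A minimal sketch of that data term (omitting the length/curvature regularization and the local variants of the full model) is:

```python
import numpy as np

def two_phase_cv_means(img, n_iter=20):
    """Minimal sketch of the Chan-Vese data term: alternately update
    the inside/outside means c1, c2 and assign each pixel to the
    closer mean. Regularization terms are omitted for brevity."""
    mask = img > img.mean()                     # crude initialization
    for _ in range(n_iter):
        c1 = img[mask].mean()                   # mean inside the curve
        c2 = img[~mask].mean()                  # mean outside the curve
        new_mask = (img - c1) ** 2 < (img - c2) ** 2
        if np.array_equal(new_mask, mask):
            break                               # converged
        mask = new_mask
    return mask

# Synthetic test: a bright square on a darker, noisy background.
rng = np.random.default_rng(0)
img = 0.2 * np.ones((32, 32)) + 0.05 * rng.standard_normal((32, 32))
img[8:24, 8:24] += 0.6
seg = two_phase_cv_means(img)
```

With well-separated region means the iteration converges in a few steps to the piecewise-constant two-phase partition.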

  16. Therapeutic and diagnostic set for irradiation the cell lines in low level laser therapy

    NASA Astrophysics Data System (ADS)

    Gryko, Lukasz; Zajac, Andrzej; Gilewski, Marian; Szymanska, Justyna; Goralczyk, Krzysztof

    2014-05-01

    This paper presents an optoelectronic diagnostic set for standardizing biostimulation procedures performed on cell lines. The basic functional components of the therapeutic set are two digitally controlled illuminators. They are composed of sets of semiconductor emitters, medium-power laser diodes and high-power LEDs, emitting radiation in a wide spectral range from 600 nm to 1000 nm. The emitters are coupled with an applicator by fiber optics and optical systems that provide uniform irradiation of the vessel holding the cell culture samples. An integrated spectrometer and optical power meter allow control of the energy and spectral parameters of the electromagnetic radiation during the Low Level Light Therapy procedure. Dedicated power supplies and a digital control system allow the power of each emitter to be set independently. An active temperature stabilization system was developed to thermally tune the spectral line of the emitted radiation for a closer match with the absorption spectra of biological acceptors. Using the set for controlled irradiation, and by measuring the absorption spectrum of the biological medium, it is possible to carry out an objective assessment of the impact of the exposure parameters on the state of cells subjected to Low Level Light Therapy. This procedure allows comparing the biological response of cell lines after irradiation with radiation of variable spectral and energetic parameters. The research was carried out on vascular endothelial cell lines. Cell proliferation was examined after irradiation with LEDs (645 nm, 680 nm, 740 nm, 780 nm, 830 nm, 870 nm, 890 nm, and 970 nm) and lasers (650 nm and 830 nm).

  17. Stochastic level-set variational implicit-solvent approach to solute-solvent interfacial fluctuations

    NASA Astrophysics Data System (ADS)

    Zhou, Shenggao; Sun, Hui; Cheng, Li-Tien; Dzubiella, Joachim; Li, Bo; McCammon, J. Andrew

    2016-08-01

    Recent years have seen the initial success of a variational implicit-solvent model (VISM), implemented with a robust level-set method, in capturing efficiently different hydration states and providing quantitatively good estimation of solvation free energies of biomolecules. The level-set minimization of the VISM solvation free-energy functional of all possible solute-solvent interfaces or dielectric boundaries predicts an equilibrium biomolecular conformation that is often close to an initial guess. In this work, we develop a theory in the form of Langevin geometrical flow to incorporate solute-solvent interfacial fluctuations into the VISM. Such fluctuations are crucial to biomolecular conformational changes and binding processes. We also develop a stochastic level-set method to numerically implement such a theory. We describe the interfacial fluctuation through the "normal velocity" that is the solute-solvent interfacial force, derive the corresponding stochastic level-set equation in the sense of Stratonovich so that the surface representation is independent of the choice of implicit function, and develop numerical techniques for solving such an equation and processing the numerical data. We apply our computational method to study the dewetting transition in the system of two hydrophobic plates and a hydrophobic cavity of a synthetic host molecule cucurbit[7]uril. Numerical simulations demonstrate that our approach can describe an underlying system jumping out of a local minimum of the free-energy functional and can capture dewetting transitions of hydrophobic systems. In the case of two hydrophobic plates, we find that the wavelength of interfacial fluctuations has a strong influence on the dewetting transition. In addition, we find that the estimated energy barrier of the dewetting transition scales quadratically with the inter-plate distance, agreeing well with existing studies of molecular dynamics simulations. Our work is a first step toward the inclusion of
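
    As a purely schematic illustration of a noise-driven normal velocity, the following is a simple 1D Euler-Maruyama sketch of a level set advanced by dphi = -(v_n + sigma*xi)|dphi/dx| dt. It is not the paper's Stratonovich level-set scheme, whose numerics are considerably more involved:

```python
import numpy as np

# Schematic only: a 1D level set whose interface (the zero crossing of phi)
# drifts at mean speed v_n while being jostled by Gaussian noise.
rng = np.random.default_rng(1)
nx, dx, dt, sigma, v_n = 200, 0.05, 0.01, 0.2, 1.0
x = np.arange(nx) * dx
phi = x - 2.0                          # interface initially at x = 2

for _ in range(100):
    grad = np.abs(np.gradient(phi, dx))
    xi = rng.standard_normal()         # one noise sample per step
    phi = phi - (v_n + sigma * xi) * grad * dt

# After T = 1, the interface sits near x = 2 + v_n * T = 3 on average.
interface = x[np.argmin(np.abs(phi))]
```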

  18. Stochastic level-set variational implicit-solvent approach to solute-solvent interfacial fluctuations.

    PubMed

    Zhou, Shenggao; Sun, Hui; Cheng, Li-Tien; Dzubiella, Joachim; Li, Bo; McCammon, J Andrew

    2016-08-01

    Recent years have seen the initial success of a variational implicit-solvent model (VISM), implemented with a robust level-set method, in capturing efficiently different hydration states and providing quantitatively good estimation of solvation free energies of biomolecules. The level-set minimization of the VISM solvation free-energy functional of all possible solute-solvent interfaces or dielectric boundaries predicts an equilibrium biomolecular conformation that is often close to an initial guess. In this work, we develop a theory in the form of Langevin geometrical flow to incorporate solute-solvent interfacial fluctuations into the VISM. Such fluctuations are crucial to biomolecular conformational changes and binding processes. We also develop a stochastic level-set method to numerically implement such a theory. We describe the interfacial fluctuation through the "normal velocity" that is the solute-solvent interfacial force, derive the corresponding stochastic level-set equation in the sense of Stratonovich so that the surface representation is independent of the choice of implicit function, and develop numerical techniques for solving such an equation and processing the numerical data. We apply our computational method to study the dewetting transition in the system of two hydrophobic plates and a hydrophobic cavity of a synthetic host molecule cucurbit[7]uril. Numerical simulations demonstrate that our approach can describe an underlying system jumping out of a local minimum of the free-energy functional and can capture dewetting transitions of hydrophobic systems. In the case of two hydrophobic plates, we find that the wavelength of interfacial fluctuations has a strong influence on the dewetting transition. In addition, we find that the estimated energy barrier of the dewetting transition scales quadratically with the inter-plate distance, agreeing well with existing studies of molecular dynamics simulations. Our work is a first step toward the inclusion of

  20. Segmentation of the liver from abdominal MR images: a level-set approach

    NASA Astrophysics Data System (ADS)

    Abdalbari, Anwar; Huang, Xishi; Ren, Jing

    2015-03-01

    The use of prior knowledge in the segmentation of abdominal MR images enables a more accurate and comprehensive interpretation of the organ to be segmented. Prior knowledge about an abdominal organ, such as the liver vessels, can be employed to obtain an accurate segmentation of the liver that leads to an accurate diagnosis or treatment plan. In this paper, a new method for segmenting the liver from abdominal MR images using liver vessels as prior knowledge is proposed. The method employs the level set technique to segment the liver from MR abdominal images. The speed image used in the level set method is responsible for propagating the front and stopping region growing at boundaries. The poor contrast of MR images between the liver and the surrounding organs (i.e., stomach, kidneys, and heart) causes the segmented liver to leak into those organs, leading to inaccurate or incorrect segmentation. For that reason, a second speed image is developed as an extra term in the level set to control the front propagation at weak edges, with the help of the original speed image. The basic idea of the proposed approach is to use the second speed image as a boundary surface which is approximately orthogonal to the area of the leak. The aim of the new speed image is to slow down the level set propagation and prevent the leak in regions close to the liver boundary. The new speed image is a surface created by filling holes to reconstruct the liver surface. These holes are formed by the exit and entry of the liver vessels and are considered the main cause of segmentation leaks. The results of the proposed method show superior performance compared with other methods in the literature.
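
    A typical gradient-based speed image of the kind referred to above can be sketched as follows: close to 1 in homogeneous regions, close to 0 at strong edges, so the front advances inside the organ and stalls at its boundary. The functional form is a common choice rather than the paper's exact definition, and the vessel-derived second speed image (which would be multiplied in as an extra stopping term) is omitted:

```python
import numpy as np

def edge_speed(img, alpha=1.0):
    """Speed image F = 1 / (1 + alpha * |grad I|): near 1 in flat
    regions, near 0 at edges, halting level set propagation there."""
    gy, gx = np.gradient(img.astype(float))
    gmag = np.hypot(gx, gy)
    return 1.0 / (1.0 + alpha * gmag)

# Toy image: a bright square organ on a dark background.
img = np.zeros((32, 32))
img[8:24, 8:24] = 100.0
F = edge_speed(img)
```

Inside the square `F` is exactly 1 (zero gradient), while at the square's boundary it collapses toward 0.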

  1. An adaptive level set method for shock-driven fluid-structure interaction

    SciTech Connect

    Deiterding, Ralf

    2007-01-01

    The fluid-structure interaction simulation of shock- and detonation-loaded structures requires numerical methods that can cope with large deformations as well as local topology changes. A robust, level-set-based shock-capturing fluid solver is described that allows coupling to any solid mechanics solver. As computational example, the elastic response of a thin steel panel, modeled with both shell and beam theory, to a shock wave in air is considered.

  2. Development of a hydrogeologic framework using tidally influenced groundwater levels, Hawaii

    NASA Astrophysics Data System (ADS)

    Rotzoll, K.; Oki, D. S.; El-Kadi, A. I.

    2013-12-01

    Aquifer hydraulic properties can be estimated from commonly available water-level data from tidally influenced wells because the tidal signal attenuation depends on the aquifer's regional hydraulic diffusivity. Estimates of hydraulic properties are required for models that are used to manage groundwater availability and quality. A few localized studies of tidal attenuation in Hawaii have been published, but many water-level records have not been analyzed and no regional synthesis of tidal attenuation information in Hawaii exists. Therefore, we estimate aquifer properties from tidal attenuation for Hawaii using groundwater-level records from more than 350 wells. Filtering methods to separate water-level fluctuations caused by ocean tides from other environmental stresses such as barometric pressure and long-period ocean-level variations are explored. For short-term records, several approaches to identify tidal components are examined. The estimated aquifer properties are combined in a regional context with respect to the hydrogeologic framework of each island. The results help to better understand conceptual models of groundwater flow in Hawaii aquifers and facilitate the development of regional numerical groundwater flow and transport models aimed at sustainable water-resource management.
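
    The classical 1D Ferris/Jacob solution for a sinusoidal tidal boundary relates amplitude attenuation to the regional hydraulic diffusivity D = T/S via H_x/H_0 = exp(-x * sqrt(pi / (t0 * D))). The sketch below inverts that textbook relation; it is not necessarily the exact estimator used in the study:

```python
import math

def hydraulic_diffusivity(amp_ratio, distance_m, period_s):
    """Diffusivity D [m^2/s] from the tidal amplitude ratio H_x/H_0
    observed at a well a given distance inland, for tide period t0.
    Inverts H_x/H_0 = exp(-x * sqrt(pi / (t0 * D)))."""
    return math.pi * distance_m ** 2 / (period_s * math.log(amp_ratio) ** 2)

# Example (illustrative numbers): tidal amplitude reduced to 40% at a
# well 500 m inland, for the semidiurnal M2 period of 12.42 h.
D = hydraulic_diffusivity(0.4, 500.0, 12.42 * 3600.0)
```

The time lag of the tidal signal gives an independent estimate of the same diffusivity, which is one way such analyses are cross-checked.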

  3. Hydrological drivers of record-setting water level rise on Earth's largest lake system

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Bruxer, J.; Durnford, D.; Smith, J. P.; Clites, A. H.; Seglenieks, F.; Qian, S. S.; Hunter, T. S.; Fortin, V.

    2016-05-01

    Between January 2013 and December 2014, water levels on Lake Superior and Lake Michigan-Huron, the two largest lakes on Earth by surface area, rose at the highest rate ever recorded for a 2 year period beginning in January and ending in December of the following year. This historic event coincided with below-average air temperatures and extensive winter ice cover across the Great Lakes. It also brought an end to a 15 year period of persistently below-average water levels on Lakes Superior and Michigan-Huron that included several months of record-low water levels. To differentiate hydrological drivers behind the recent water level rise, we developed a Bayesian Markov chain Monte Carlo (MCMC) routine for inferring historical estimates of the major components of each lake's water budget. Our results indicate that, in 2013, the water level rise on Lake Superior was driven by increased spring runoff and over-lake precipitation. In 2014, reduced over-lake evaporation played a more significant role in Lake Superior's water level rise. The water level rise on Lake Michigan-Huron in 2013 was also due to above-average spring runoff and persistent over-lake precipitation, while in 2014, it was due to a rare combination of below-average evaporation, above-average runoff and precipitation, and very high inflow rates from Lake Superior through the St. Marys River. We expect, in future research, to apply our new framework across the other Laurentian Great Lakes, and to Earth's other large freshwater basins as well.
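
    The major water-budget components inferred by the MCMC routine can be summarized by a standard lake water balance (notation here is generic, not the authors'):

```latex
\frac{dL}{dt} \;=\; P \;+\; R \;-\; E \;+\; \frac{Q_{\mathrm{in}} - Q_{\mathrm{out}}}{A},
```

    where \(L\) is the lake level, \(P\) over-lake precipitation, \(R\) basin runoff expressed per unit lake area, \(E\) over-lake evaporation, \(Q_{\mathrm{in}}, Q_{\mathrm{out}}\) connecting-channel inflow and outflow, and \(A\) the lake surface area.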

  4. A Quadrature-Free Conservative Level Set RKDG for Simulating Atomization

    NASA Astrophysics Data System (ADS)

    Jibben, Zechariah; Herrmann, Marcus

    2012-11-01

    We present an arbitrary high-order, quadrature-free, Runge-Kutta discontinuous Galerkin (RKDG) method for the solution of the conservative level set equation (Olsson et al., 2007), used for capturing phase interfaces in atomizing multiphase flows. Special care is taken to maintain high-order accuracy in the reinitialization equation, using appropriate slope limiters when necessary and a shared basis across cell interfaces for the diffusive flux. For efficiency, we implement the method in the context of the dual narrow band overset mesh approach of the Refined Level Set Grid method (Herrmann, 2008). The accuracy, consistency, and convergence of the resulting method is demonstrated using the method of manufactured solutions (MMS) and several standard test cases, including Zalesak's disk and columns and spheres in prescribed deformation fields. Using MMS, we demonstrate k + 1 order spatial convergence for k-th order orthonormal Legendre polynomial basis functions. We furthermore show several orders of magnitude improvement in shape and volume errors over traditional WENO based distance function level set methods, and k - 1 order spatial convergence of interfacial curvature using direct neighbor cells only. Supported by Stanford's 2012 CTR Summer Program and NSF grant CBET-1054272.
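
    For reference, the conservative level set equations of Olsson et al. cited above take the following form, where \(\psi\) is a smeared Heaviside profile of the signed distance \(\phi\), \(\varepsilon\) the interface thickness, and \(\hat{\mathbf{n}}\) the interface normal:

```latex
\psi(\phi) = \tfrac{1}{2}\!\left(\tanh\frac{\phi}{2\varepsilon} + 1\right),
\qquad
\frac{\partial \psi}{\partial t} + \nabla\cdot(\psi\,\mathbf{u}) = 0,
```

```latex
\frac{\partial \psi}{\partial \tau}
+ \nabla\cdot\big(\psi(1-\psi)\,\hat{\mathbf{n}}\big)
= \nabla\cdot\big(\varepsilon\,(\nabla\psi\cdot\hat{\mathbf{n}})\,\hat{\mathbf{n}}\big),
```

    the second (pseudo-time) equation being the reinitialization step, which balances a compressive flux against interface-normal diffusion to maintain the profile while conserving \(\int \psi\).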

  5. Numerical Simulation of Dynamic Contact Angles and Contact Lines in Multiphase Flows using Level Set Method

    NASA Astrophysics Data System (ADS)

    Pendota, Premchand

    Many physical phenomena and industrial applications involve multiphase fluid flows and hence it is of high importance to be able to simulate various aspects of these flows accurately. The Dynamic Contact Angles (DCA) and the contact lines at the wall boundaries are a couple of such important aspects. In the past few decades, many mathematical models were developed for predicting the contact angles of the inter-face with the wall boundary under various flow conditions. These models are used to incorporate the physics of DCA and contact line motion in numerical simulations using various interface capturing/tracking techniques. In the current thesis, a simple approach to incorporate the static and dynamic contact angle boundary conditions using the level set method is developed and implemented in multiphase CFD codes, LIT (Level set Interface Tracking) (Herrmann (2008)) and NGA (flow solver) (Desjardins et al (2008)). Various DCA models and associated boundary conditions are reviewed. In addition, numerical aspects such as the occurrence of a stress singularity at the contact lines and grid convergence of macroscopic interface shape are dealt with in the context of the level set approach.

  6. A level-set adjoint-state method for crosswell transmission-reflection traveltime tomography

    NASA Astrophysics Data System (ADS)

    Li, Wenbin; Leung, Shingyu; Qian, Jianliang

    2014-10-01

    We propose a level-set adjoint-state method for crosswell traveltime tomography using both first-arrival transmission and reflection traveltime data. Since our entire formulation is based on solving eikonal and advection equations on finite-difference meshes, our traveltime tomography strategy is carried out without computing rays explicitly. We incorporate reflection traveltime data into the formulation so that possible reflectors (slowness interfaces) in the targeted subsurface model can be recovered as well as the slowness distribution itself. Since a reflector may assume a variety of irregular geometries, we propose to use a level-set function to implicitly parametrize the shape of a reflector. Therefore, a mismatch functional is established to minimize the traveltime data misfit with respect to both the slowness distribution and the level-set function, and the minimization is achieved by using a gradient descent method with gradients computed by solving adjoint state equations. To assess uncertainty or reliability of reconstructed slowness models, we introduce a labelling function to characterize first-arrival ray coverage of the computational domain, and this labelling function satisfies an advection equation. We apply fast-sweeping type methods to solve eikonal, adjoint-state and advection equations arising in our formulation. Numerical examples demonstrate that the proposed algorithm is robust to noise in the measurements, and can recover complicated structure even with little information on the reflector.
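
    A minimal fast-sweeping solver for the 2D eikonal equation |grad u| = s, the forward problem underlying this tomography, can be sketched as follows (Gauss-Seidel sweeps over the four diagonal orderings with the Godunov upwind update; no factoring of the source singularity):

```python
import numpy as np

def fast_sweep_eikonal(slowness, src, h=1.0, n_sweeps=2):
    """First-arrival traveltimes u for |grad u| = s on a uniform grid."""
    ny, nx = slowness.shape
    u = np.full((ny, nx), np.inf)
    u[src] = 0.0
    for _ in range(n_sweeps):
        for sy, sx in ((1, 1), (1, -1), (-1, 1), (-1, -1)):   # 4 orderings
            for i in range(ny)[::sy]:
                for j in range(nx)[::sx]:
                    if (i, j) == src:
                        continue
                    # Smallest upwind neighbors in each direction.
                    a = min(u[i - 1, j] if i > 0 else np.inf,
                            u[i + 1, j] if i < ny - 1 else np.inf)
                    b = min(u[i, j - 1] if j > 0 else np.inf,
                            u[i, j + 1] if j < nx - 1 else np.inf)
                    if np.isinf(a) and np.isinf(b):
                        continue
                    f = slowness[i, j] * h
                    if abs(a - b) >= f:        # one-sided update
                        cand = min(a, b) + f
                    else:                      # two-sided (quadratic) update
                        cand = 0.5 * (a + b + np.sqrt(2 * f * f - (a - b) ** 2))
                    if cand < u[i, j]:
                        u[i, j] = cand
    return u

# Unit slowness: traveltime approximates Euclidean distance from the source.
u = fast_sweep_eikonal(np.ones((41, 41)), src=(20, 20))
```

The adjoint-state gradient in the paper is then obtained by solving a companion linear advection problem on the same grid, avoiding explicit ray tracing.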

  7. A Real-Time Algorithm for the Approximation of Level-Set-Based Curve Evolution

    PubMed Central

    Shi, Yonggang; Karl, William Clem

    2010-01-01

    In this paper, we present a complete and practical algorithm for the approximation of level-set-based curve evolution suitable for real-time implementation. In particular, we propose a two-cycle algorithm to approximate level-set-based curve evolution without the need of solving partial differential equations (PDEs). Our algorithm is applicable to a broad class of evolution speeds that can be viewed as composed of a data-dependent term and a curve smoothness regularization term. We achieve curve evolution corresponding to such evolution speeds by separating the evolution process into two different cycles: one cycle for the data-dependent term and a second cycle for the smoothness regularization. The smoothing term is derived from a Gaussian filtering process. In both cycles, the evolution is realized through a simple element switching mechanism between two linked lists that implicitly represent the curve using an integer-valued level-set function. By careful construction, all the key evolution steps require only integer operations. A consequence is that we obtain significant computation speedups compared to exact PDE-based approaches while obtaining excellent agreement with these methods for problems of practical engineering interest. In particular, the resulting algorithm is fast enough for use in real-time video processing applications, which we demonstrate through several image segmentation and video tracking experiments. PMID:18390371
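
    The element-switching idea can be illustrated in a stripped-down form: move the implicit curve only by toggling boundary pixels in or out according to the sign of a data speed. The integer-valued level set, the two linked lists, and the Gaussian smoothing cycle of the full two-cycle algorithm are all omitted here:

```python
import numpy as np
from scipy import ndimage

def boundary_switch_segmentation(img, seed, thresh, n_iter=200):
    """Grow/shrink a region from a seed by switching boundary pixels:
    outside pixels touching the region switch in where the data speed
    is positive; inside boundary pixels switch out where it is negative."""
    inside = np.zeros(img.shape, bool)
    inside[seed] = True
    F = np.where(img > thresh, 1, -1)   # data-dependent speed (fixed per pixel)
    for _ in range(n_iter):
        switch_in = ndimage.binary_dilation(inside) & ~inside & (F > 0)
        switch_out = inside & ~ndimage.binary_erosion(inside) & (F < 0)
        if not (switch_in.any() or switch_out.any()):
            break                        # curve has stopped moving
        inside |= switch_in
        inside &= ~switch_out
    return inside

# Toy image: a bright square; the curve grows from a seed to its boundary.
img = np.zeros((32, 32))
img[10:22, 10:22] = 1.0
seg = boundary_switch_segmentation(img, seed=(16, 16), thresh=0.5)
```

As in the full algorithm, no PDE is solved: each step only toggles set membership of boundary elements, which is what makes an integer-only real-time implementation possible.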

  8. Dynamic multi-source X-ray tomography using a spacetime level set method

    NASA Astrophysics Data System (ADS)

    Niemi, Esa; Lassas, Matti; Kallonen, Aki; Harhanen, Lauri; Hämäläinen, Keijo; Siltanen, Samuli

    2015-06-01

    A novel variant of the level set method is introduced for dynamic X-ray tomography. The target is allowed to change in time while being imaged by one or several source-detector pairs at a relatively high frame-rate. The algorithmic approach is motivated by the results in [22], showing that the modified level set method can tolerate highly incomplete projection data in stationary tomography. Furthermore, defining the level set function in spacetime enforces temporal continuity in the dynamic tomography context considered here. The tomographic reconstruction is found as a minimizer of a nonlinear functional. The functional contains a regularization term penalizing the L2 norms of up to n derivatives of the reconstruction. The case n = 1 is shown to be equivalent to a convex Tikhonov problem that has a unique minimizer. For n ≥ 2 the existence of a minimizer is proved under certain assumptions on the signal-to-noise ratio and the size of the regularization parameter. Numerical examples with both simulated and measured dynamic X-ray data are included, and the proposed method is found to yield reconstructions superior to standard methods such as FBP or non-negativity constrained Tikhonov regularization and favorably comparable to those of total variation regularization. Furthermore, the methodology can be adapted to a wide range of measurement arrangements with one or more X-ray sources.
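    The abstract notes that the n = 1 case reduces to a convex Tikhonov problem with a unique minimizer. As a generic illustration only (the paper's projection operator and spacetime discretization are not reproduced here), a first-order Tikhonov solve via the normal equations looks like:

```python
import numpy as np

def tikhonov_first_order(A, b, alpha):
    """Minimize ||A x - b||^2 + alpha * ||D x||^2, where D is the forward
    first-difference operator. The normal equations are
    (A^T A + alpha D^T D) x = A^T b, which is a convex problem with a
    unique solution whenever the combined operator has full rank."""
    n = A.shape[1]
    D = (np.eye(n, k=1) - np.eye(n))[:-1]   # rows: x[i+1] - x[i]
    return np.linalg.solve(A.T @ A + alpha * D.T @ D, A.T @ b)
```

    Small alpha reproduces the data; large alpha drives the solution toward a constant (the mean, when A is the identity).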

  9. Vascular Tree Segmentation in Medical Images Using Hessian-Based Multiscale Filtering and Level Set Method

    PubMed Central

    Jin, Jiaoying; Yang, Linjun; Zhang, Xuming

    2013-01-01

    Vascular segmentation plays an important role in medical image analysis. A novel technique for the automatic extraction of vascular trees from 2D medical images is presented, which combines Hessian-based multiscale filtering and a modified level set method. In the proposed algorithm, the morphological top-hat transformation is first adopted to attenuate the background. Then Hessian-based multiscale filtering is used to enhance vascular structures by combining the Hessian matrix with Gaussian convolution to tune the filtering response to specific scales. Because Gaussian convolution tends to blur vessel boundaries, which makes scale selection inaccurate, an improved level set method is finally proposed to extract vascular structures by introducing an external constraint term related to the standard deviation of the Gaussian function into the traditional level set formulation. Our approach was tested on synthetic images with vascular-like structures and on 2D slices extracted along the coronal plane from real 3D abdominal magnetic resonance angiography (MRA) images. The segmentation rates for synthetic images are above 95%. The results for MRA images demonstrate that the proposed method can extract most of the vascular structures successfully and accurately. Therefore, the proposed method is effective for vascular tree extraction in medical images. PMID:24348738
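    A single-scale, Frangi-style 2D vesselness computed from Hessian eigenvalues conveys the enhancement step (a hedged sketch: the parameters `beta` and `c` and the finite-difference Hessian are illustrative choices, not the paper's):

```python
import numpy as np

def vesselness2d(img, beta=0.5, c=None):
    """Frangi-style 2D vesselness from Hessian eigenvalues
    (single scale, bright structures on a dark background)."""
    gy, gx = np.gradient(img)
    gyy, _ = np.gradient(gy)
    gxy, gxx = np.gradient(gx)
    # eigenvalues of the symmetric Hessian [[gxx, gxy], [gxy, gyy]]
    half_diff = np.sqrt(((gxx - gyy) * 0.5) ** 2 + gxy ** 2)
    mean = (gxx + gyy) * 0.5
    l1, l2 = mean + half_diff, mean - half_diff
    swap = np.abs(l1) > np.abs(l2)          # order by magnitude
    lam1 = np.where(swap, l2, l1)           # small-magnitude eigenvalue
    lam2 = np.where(swap, l1, l2)           # large-magnitude eigenvalue
    rb = np.abs(lam1) / (np.abs(lam2) + 1e-12)   # blob vs line measure
    s = np.sqrt(lam1 ** 2 + lam2 ** 2)           # second-order structureness
    if c is None:
        c = 0.5 * s.max() + 1e-12
    v = np.exp(-rb ** 2 / (2 * beta ** 2)) * (1 - np.exp(-s ** 2 / (2 * c ** 2)))
    return np.where(lam2 < 0, v, 0.0)       # keep bright ridges only
```

    On a synthetic bright line the response peaks on the ridge and vanishes in flat background.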

  10. Implementation of E.U. Water Framework Directive: source assessment of metallic substances at catchment levels.

    PubMed

    Chon, Ho-Sik; Ohandja, Dieudonne-Guy; Voulvoulis, Nikolaos

    2010-01-01

    The E.U. Water Framework Directive (WFD) aims to prevent deterioration of water quality and to phase out or reduce the concentrations of priority substances at catchment levels. It requires changes in water management from a local scale to a river basin scale, and establishes Environmental Quality Standards (EQS) as a guideline for the chemical status of receiving waters. Under the Directive, the standards and the scope of investigation for water management are more stringent and broader than in the past, and this change also applies to restoring metal levels in water bodies. The aim of this study was to identify anthropogenic emission sources of metallic substances at catchment levels. Potential sources providing substantial amounts of such substances to receiving waters included stormwater, industrial effluents, treated effluents, agricultural drainage, sediments, mining drainage and landfill leachates. Metallic substances have more emission sources than other dangerous substances at catchment levels. Therefore, source assessment for these substances deserves greater consideration in restoring their chemical status in the context of the WFD. To improve the quality of source assessment, research is needed on the role of societal and environmental parameters and on the contribution of each source to the chemical distribution in receiving waters. PMID:20081997

  11. Statistical criteria to set alarm levels for continuous measurements of ground contamination.

    PubMed

    Brandl, A; Jimenez, A D Herrera

    2008-08-01

    In the course of the decommissioning of the ASTRA research reactor at the site of the Austrian Research Centers at Seibersdorf, the operator and licensee, Nuclear Engineering Seibersdorf, conducted an extensive site survey and characterization to demonstrate compliance with regulatory site release criteria. This survey included radiological characterization of approximately 400,000 m² of open land on the Austrian Research Centers premises. Part of this survey was conducted using a mobile large-area gas proportional counter, continuously recording measurements while it was moved at a speed of 0.5 m s⁻¹. In order to set reasonable investigation levels, two alarm levels based on statistical considerations were developed. This paper describes the derivation of these alarm levels and the operational experience gained by detector deployment in the field. PMID:18617795
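    The abstract does not state the specific statistical criteria used; a generic k-sigma alarm level for Poisson-distributed counts conveys the idea (illustrative, not the paper's derivation):

```python
import math

def alarm_level(mu_b, k=3.0):
    """Decision threshold in counts: background mean plus k standard
    deviations, using the Poisson approximation sigma = sqrt(mu_b)."""
    return mu_b + k * math.sqrt(mu_b)

def alarmed(count, mu_b, k=3.0):
    """True when an observed count exceeds the alarm level."""
    return count > alarm_level(mu_b, k)
```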

  13. Improved inhalation technology for setting safe exposure levels for workplace chemicals

    NASA Technical Reports Server (NTRS)

    Stuart, Bruce O.

    1993-01-01

    Threshold Limit Values recommended as allowable air concentrations of a chemical in the workplace are often based upon a no-observable-effect-level (NOEL) determined by experimental inhalation studies using rodents. A 'safe level' for human exposure must then be estimated by the use of generalized safety factors in attempts to extrapolate from experimental rodents to man. The recent development of chemical-specific physiologically-based toxicokinetics makes use of measured physiological, biochemical, and metabolic parameters to construct a validated model that is able to 'scale-up' rodent response data to predict the behavior of the chemical in man. This procedure is made possible by recent advances in personal computer software and the emergence of appropriate biological data, and provides an analytical tool for much more reliable risk evaluation and airborne chemical exposure level setting for humans.
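    As a toy stand-in for the physiologically based toxicokinetic models described (real PBPK models couple many organ compartments with measured parameters), a one-compartment uptake-elimination model shows the basic dynamics:

```python
def one_compartment_conc(intake_rate, k_el, t, dt=1e-3):
    """Euler integration of dC/dt = intake_rate - k_el * C from C(0) = 0.
    The steady-state concentration is intake_rate / k_el.
    All parameter names and values here are illustrative."""
    c = 0.0
    for _ in range(int(round(t / dt))):
        c += dt * (intake_rate - k_el * c)
    return c
```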

  14. Study of burn scar extraction automatically based on level set method using remote sensing data.

    PubMed

    Liu, Yang; Dai, Qin; Liu, Jianbo; Liu, ShiBin; Yang, Jin

    2014-01-01

    Burn scar extraction using remote sensing data is an efficient way to precisely evaluate burn areas and measure vegetation recovery. Traditional burn scar extraction methodologies perform poorly on images with blurred and irregular edges. To address these issues, this paper proposes an automatic method to extract burn scars based on the Level Set Method (LSM). The method exploits the different features available in remote sensing images while considering the practical need to extract burn scars rapidly and automatically. The approach integrates Change Vector Analysis (CVA), the Normalized Difference Vegetation Index (NDVI) and the Normalized Burn Ratio (NBR) to obtain a difference image, and modifies the conventional level set Chan-Vese (C-V) model with a new initial curve derived from a binary image obtained by applying the K-means method to the fitting errors of two near-infrared band images. Landsat 5 TM and Landsat 8 OLI data sets are used to validate the proposed method. Comparisons with the conventional C-V model, the OTSU algorithm and the Fuzzy C-means (FCM) algorithm show that the proposed approach can extract the outline of a fire burn scar effectively and exactly. The method has higher extraction accuracy and lower algorithmic complexity than the conventional C-V model. PMID:24503563
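    The spectral indices named above have standard definitions; a minimal sketch (the band names and the pairing of indices into a CVA feature vector are illustrative, not taken from the paper):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red + 1e-12)

def nbr(nir, swir):
    """Normalized Burn Ratio."""
    return (nir - swir) / (nir + swir + 1e-12)

def cva_magnitude(pre, post):
    """Change-vector magnitude between pre- and post-fire feature
    vectors, e.g. (NDVI, NBR) per pixel."""
    return sum((a - b) ** 2 for a, b in zip(pre, post)) ** 0.5
```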

  16. Street Level Hydrology: An Urban Application of the WRF-Hydro Framework in Denver, Colorado

    NASA Astrophysics Data System (ADS)

    Read, L.; Hogue, T. S.; Salas, F. R.; Gochis, D.

    2015-12-01

    Urban flood modeling at the watershed scale carries unique challenges in routing complexity, data resolution, social and political issues, and land surface - infrastructure interactions. The ability to accurately trace and predict the flow of water through the urban landscape enables better emergency response management, floodplain mapping, and data for future urban infrastructure planning and development. These services are of growing importance as the urban population is expected to continue increasing by 1.84% per year for the next 25 years, increasing the vulnerability of urban regions to damages and loss of life from floods. Although a range of watershed-scale models have been applied in specific urban areas to examine these issues, there is a trend towards national-scale hydrologic modeling enabled by supercomputing resources to understand larger system-wide hydrologic impacts and feedbacks. As such, it is important to address how urban landscapes can be represented in large-scale modeling processes. The current project investigates how coupling terrain and infrastructure routing can improve prediction of flows and flooding events over the urban landscape. We utilize the WRF-Hydro modeling framework and a high-resolution terrain routing grid with the goal of compiling the standard data requirements for fine-scale urban modeling and dynamic flood forecasting in the urban setting. The city of Denver is selected as a case study, as it has experienced several large flooding events in the last five years and has an urban annual population growth rate of 1.5%, one of the highest in the U.S. Our work highlights the hydro-informatic challenges associated with linking channel networks and drainage infrastructure in an urban area using the WRF-Hydro modeling framework and high-resolution urban models for short-term flood prediction.

  17. Large-Eddy Simulation of Premixed and Partially Premixed Turbulent Combustion Using a Level Set Method

    NASA Astrophysics Data System (ADS)

    Duchamp de Lageneste, Laurent; Pitsch, Heinz

    2001-11-01

    Level-set methods (G-equation) have been recently used in the context of RANS to model turbulent premixed (Hermann 2000) or partially premixed (Chen 1999) combustion. By directly taking into account unsteady effects, LES can be expected to improve predictions over RANS. Since the reaction zone thickness of premixed flames in technical devices is usually much smaller than the LES grid spacing, chemical reactions completely occur on the sub-grid scales and hence have to be modeled entirely. In the level-set methodology, the flame front is represented by an arbitrary iso-surface G0 of a scalar field G whose evolution is described by the so-called G-equation. This equation is only valid at G=G_0, and hence decoupled from other G levels. Heat release is then modeled using a flamelet approach in which temperature is determined as a function of G and the mixture-fraction Z. In the present study, the proposed approach has been formulated for LES and validated using data from a turbulent Bunsen burner experiment (Chen, Peters 1996). Simulation of an experimental Lean Premixed Prevapourised (LPP) dump combustor (Besson, Bruel 1999, 2000) under different premixed or partially premixed conditions will also be presented.
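    The G-equation transports the flame front as an iso-surface of a scalar field; a minimal first-order Godunov upwind step for G_t + sL |∇G| = 0 conveys the mechanics (illustrative only: the LES filtering, flamelet coupling, and fluid solver are not shown, and periodic boundaries are assumed for simplicity):

```python
import numpy as np

def advance_g(G, sL, dx, dt):
    """One upwind step of G_t + sL * |grad G| = 0 for a front
    propagating with constant burning speed sL > 0 toward G > 0."""
    # one-sided differences (periodic via np.roll)
    gxm = (G - np.roll(G, 1, axis=1)) / dx
    gxp = (np.roll(G, -1, axis=1) - G) / dx
    gym = (G - np.roll(G, 1, axis=0)) / dx
    gyp = (np.roll(G, -1, axis=0) - G) / dx
    # Godunov upwind gradient magnitude for positive speed
    grad = np.sqrt(np.maximum(gxm, 0) ** 2 + np.minimum(gxp, 0) ** 2 +
                   np.maximum(gym, 0) ** 2 + np.minimum(gyp, 0) ** 2)
    return G - dt * sL * grad
```

    For a circular front in a signed-distance field, the zero level set expands at exactly sL, which makes the scheme easy to sanity-check.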

  18. GPU-Based Visualization of 3D Fluid Interfaces using Level Set Methods

    NASA Astrophysics Data System (ADS)

    Kadlec, B. J.

    2009-12-01

    We model a simple 3D fluid-interface problem using the level set method and visualize the interface as a dynamic surface. Level set methods allow implicit handling of complex topologies deformed by evolutions where sharp changes and cusps are present without destroying the representation. We present a highly optimized visualization and computation algorithm that is implemented in CUDA to run on the NVIDIA GeForce GTX 295. CUDA is a general-purpose parallel computing architecture that allows the NVIDIA GPU to be treated like a data-parallel supercomputer in order to solve many computational problems in a fraction of the time required on a CPU. CUDA is compared to the new OpenCL™ (Open Computing Language), which is designed to run on heterogeneous computing environments but does not take advantage of low-level features in NVIDIA hardware that provide significant speedups. Therefore, our technique is implemented using CUDA and results are compared to a single-CPU implementation to show the benefits of using the GPU and CUDA for visualizing fluid-interface problems. We solve a 1024³ problem and experience significant speedup using the NVIDIA GeForce GTX 295. Implementation details for mapping the problem to the GPU architecture are described as well as discussion on porting the technique to heterogeneous devices (AMD, Intel, IBM) using OpenCL. The results present a new interactive system for computing and visualizing the evolution of fluid-interface problems on the GPU.

  19. Building a Community Framework for Adaptation to Sea Level Rise and Inundation

    NASA Astrophysics Data System (ADS)

    Culver, M. E.; Schubel, J.; Davidson, M. A.; Haines, J.

    2010-12-01

    Sea level rise and inundation pose a substantial risk to many coastal communities, and the risk is projected to increase because of continued development, changes in the frequency and intensity of inundation events, and acceleration in the rate of sea-level rise. Calls for action at all levels acknowledge that a viable response must engage federal, state and local expertise, perspectives, and resources in a coordinated and collaborative effort. Representatives from a variety of these agencies and organizations have developed a shared framework to help coastal communities structure and facilitate community-wide adaptation processes and to help agencies determine where investments should be made to enable states and local governments to assess impacts and initiate adaptation strategies over the next decade. For sea level rise planning and implementation, the requirements for high-quality data and information are vast and the availability is limited. Participants stressed the importance of data interoperability to ensure that users are able to apply data from a variety of sources and to improve availability and confidence in the data. Participants prioritized the following six categories of data needed to support future sea level rise planning and implementation:
    - Data to understand land forms and where and how water will flow
    - Monitoring data and environmental drivers
    - Consistent sea level rise scenarios and projections across agencies to support local planning
    - Data to characterize vulnerabilities and impacts of sea level rise
    - Community characteristics
    - Legal frameworks and administrative structure.
    To develop a meaningful and effective sea level rise adaptation plan, state and local planners must understand how the availability, scale, and uncertainty of these types of data will impact new guidelines or adaptation measures. The tools necessary to carry out the adaptation planning process need to be understood in terms of data requirements.

  20. Simulating collisions of droplets with walls and films using a level set method

    NASA Astrophysics Data System (ADS)

    Kwon, Tae-Jun

    A coupled level set and Marker-and-Cell (MAC) method is developed for computing axisymmetric, incompressible, and immiscible two-phase flows. Instead of using marker particles to track the free surface, the level set function is employed to "capture" the complex interfacial structure. An iterative process is devised in order to maintain the level set function as the signed distance from the interface. As a baseline case, the two-phase fluid code is applied to zero-gravity capillary oscillations of a liquid droplet in order to validate its capability to handle surface tension effects. Shapes initially deformed according to the second spherical harmonic are allowed to oscillate, and the simulation results are compared with linearized analytic solutions as well as other numerical calculations. Viscous damping towards an equilibrium sphere and variations in the capillary oscillation period are investigated. Axisymmetric steady rising bubbles in an unbounded quiescent viscous liquid are simulated to validate the code's ability in buoyancy-driven deformations and stable breakup processes. The simulation results are consistent with other numerical results for high Weber number cases. It should be pointed out that no additional treatment is necessary to handle bubble breakups. The impact of a liquid droplet onto a solid surface and/or a shallow liquid layer is simulated and compared with corresponding experimental data. Case studies on the radial extension versus variable Weber numbers show good agreement with experimental observation. Drop collisions onto a shallow liquid layer are chosen for splash simulations. The effects of layer thickness are studied by analyzing the radial extension and elevation of the splash rim. Drop impingement on a deep pool is also examined in order to simulate a crater and a central jet. It is found that there exists a criterion to distinguish between splashing and deposition events in terms of a single impact parameter. A simple dimensional analysis

  1. Non-Rigid Object Contour Tracking via a Novel Supervised Level Set Model.

    PubMed

    Sun, Xin; Yao, Hongxun; Zhang, Shengping; Li, Dong

    2015-11-01

    We present a novel approach to non-rigid object contour tracking based on a supervised level set model (SLSM). In contrast to most existing trackers that use a bounding box to specify the tracked target, the proposed method extracts the accurate contours of the target as tracking output, which achieves a better description of non-rigid objects while reducing background pollution of the target model. Moreover, conventional level set models only emphasize regional intensity consistency and consider no priors. In contrast, the curve evolution of the proposed SLSM is object-oriented and supervised by specific knowledge of the targets we want to track. Therefore, the SLSM can ensure a more accurate convergence to the exact targets in tracking applications. In particular, we first construct the appearance model for the target in an online boosting manner owing to its strong discriminative power between the object and the background. Then, the learnt target model is incorporated to model the probabilities of the level set contour in a Bayesian manner, leading the curve to converge to the candidate region with the maximum likelihood of being the target. Finally, the accurate target region qualifies the samples fed to the boosting procedure as well as the target model prepared for the next time step. We first describe the proposed mechanism of the two-phase SLSM for single-target tracking, then give its generalized multi-phase version for dealing with multi-target tracking cases. A positive decrease rate is used to adjust the learning pace over time, enabling tracking to continue under partial and total occlusion. Experimental results on a number of challenging sequences validate the effectiveness of the proposed method. PMID:26099142
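    The Bayesian use of a learned appearance model can be caricatured as a per-pixel region force (a simplified stand-in for the SLSM supervision term; the online boosting machinery and curve evolution are omitted):

```python
import numpy as np

def region_force(prob, eps=1e-6):
    """Per-pixel force log(p_obj / p_bg) from a classifier's foreground
    probability: positive where the learned appearance model favors the
    object, negative elsewhere. A level set driven by this force grows
    into high-probability regions and retreats from low ones."""
    p = np.clip(prob, eps, 1.0 - eps)
    return np.log(p / (1.0 - p))
```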

  3. A three-dimensional coupled Nitsche and level set method for electrohydrodynamic potential flows in moving domains

    NASA Astrophysics Data System (ADS)

    Johansson, A.; Garzon, M.; Sethian, J. A.

    2016-03-01

    In this paper we present a new algorithm for computing three-dimensional electrohydrodynamic flow in moving domains which can undergo topological changes. We consider a non-viscous, irrotational, perfectly conducting fluid and introduce a way to model the electrically charged flow with an embedded potential approach. To numerically solve the resulting system, we combine a level set method to track both the free boundary and the surface velocity potential with a Nitsche finite element method for solving the Laplace equations. This results in an algorithmic framework that does not require body-conforming meshes, works in three dimensions, and seamlessly tracks topological change. Assembling this coupled system requires care: while convergence and stability properties of Nitsche's methods have been well studied for static problems, they have rarely been considered for moving domains or for obtaining the gradients of the solution on the embedded boundary. We therefore investigate the performance of the symmetric and non-symmetric Nitsche formulations, as well as two different stabilization techniques. The global algorithm and in particular the coupling between the Nitsche solver and the level set method are also analyzed in detail. Finally we present numerical results for several time-dependent problems, each one designed to achieve a specific objective: (a) the oscillation of a perturbed sphere, which is used for convergence studies and the examination of the Nitsche methods; (b) the break-up of a two-lobe droplet with axial symmetry, which tests the capability of the algorithm to pass through flow singularities such as topological changes while preserving an axi-symmetric flow, and compares results to previous axi-symmetric calculations; (c) the electrohydrodynamical deformation of a thin film and subsequent jet ejection, which accounts for the presence of electrical forces in a non-axi-symmetric geometry.

  4. Level set methods to compute minimal surfaces in a medium with exclusions (voids).

    SciTech Connect

    Walsh, Timothy Francis; Chopp, David; Torres, Monica

    2003-06-01

    In [T1], periodic minimal surfaces in a medium with exclusions (voids) are constructed, and in this paper we present two algorithms for computing these minimal surfaces. The two algorithms use evolution of level sets by mean curvature. The first algorithm solves the governing nonlinear PDE directly and enforces numerically an orthogonality condition that the surfaces satisfy when they meet the boundaries of the exclusions. The second algorithm involves h-adaptive finite element approximations of a linear convection-diffusion equation, which has been shown to linearize the governing nonlinear PDE for weighted mean curvature flow.
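    Level-set evolution by mean curvature, as used by both algorithms above, can be sketched with central differences (an illustrative toy: the paper's exclusions, orthogonality conditions, and adaptivity are not modeled; the curvature clamp near singular points is an ad hoc stabilization):

```python
import numpy as np

def curvature_step(G, dx, dt):
    """One explicit step of level-set mean curvature flow G_t = kappa * |grad G|,
    with curvature clamped to the grid scale for stability near the kink of a
    signed-distance field."""
    gy, gx = np.gradient(G, dx)
    gyy, _ = np.gradient(gy, dx)
    gxy, gxx = np.gradient(gx, dx)
    mag2 = gx ** 2 + gy ** 2 + 1e-12
    kappa = (gxx * gy ** 2 - 2 * gx * gy * gxy + gyy * gx ** 2) / mag2 ** 1.5
    kappa = np.clip(kappa, -1.0 / dx, 1.0 / dx)
    return G + dt * kappa * np.sqrt(mag2)
```

    A circle of radius r0 shrinking under mean curvature has radius sqrt(r0² − 2t), which gives a direct check of the scheme.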

  5. Topology-optimized multiple-disk resonators obtained using level set expression incorporating surface effects.

    PubMed

    Fujii, Garuda; Ueta, Tsuyoshi; Mizuno, Mamoru; Nakamura, Masayuki

    2015-05-01

    Topology-optimized designs of multiple-disk resonators are presented using a level-set expression that incorporates surface effects. Effects from total internal reflection at the surfaces of the dielectric disks are precisely simulated by modeling clearly defined dielectric boundaries during topology optimization. The electric field intensity in optimal resonators increases to more than four and a half times the initial intensity in a resonant state, whereas in some cases the Q factor increases by three and a half times that of the initial state. Wavelength-scale link structures between neighboring disks improve the performance of the multiple-disk resonators. PMID:25969226

  6. Springback assessment based on level set interpolation and shape manifolds in deep drawing

    NASA Astrophysics Data System (ADS)

    Le Quilliec, Guenhael; Raghavan, Balaji; Breitkopf, Piotr; Rassineux, Alain; Villon, Pierre; Roelandt, Jean-Marc

    2013-12-01

    In this paper, we introduce an original shape representation approach for automatic springback characterization. It is based on the generation of parameterized Level Set functions. The central idea is the concept of the shape manifold representing the design domain in the reduced-order shape-space. Performing Proper Orthogonal Decomposition on the shapes followed by using the Diffuse Approximation allows us to efficiently reduce the problem dimensionality and to interpolate uniquely between admissible input shapes, while also determining the smallest number of parameters needed to characterize the final formed shape. We apply this methodology to the problem of springback assessment for the deep drawing operation of metal sheets.
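    The POD step above can be sketched with an SVD of snapshot vectors (hypothetical helper names; the Diffuse Approximation interpolation used by the authors is omitted):

```python
import numpy as np

def pod_basis(snapshots, energy=0.999):
    """POD of level-set snapshots: each row of `snapshots` is a flattened
    level-set field. Returns the mean field and the leading modes needed
    to capture the given fraction of the variance."""
    mean = snapshots.mean(axis=0)
    X = snapshots - mean
    _, s, Vt = np.linalg.svd(X, full_matrices=False)
    cum = np.cumsum(s ** 2) / np.sum(s ** 2)
    k = int(np.searchsorted(cum, energy)) + 1
    return mean, Vt[:k]

def project(field, mean, modes):
    """Reduced-order coordinates of a field in the shape space."""
    return modes @ (field - mean)

def reconstruct(coords, mean, modes):
    """Map reduced coordinates back to a full field."""
    return mean + modes.T @ coords
```

    For snapshots lying in a low-dimensional subspace, projecting and reconstructing recovers them exactly, which is the sense in which the shape manifold reduces the problem dimensionality.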

  7. Concurrent grammar inference machines for 2-D pattern recognition: a comparison with the level set approach

    NASA Astrophysics Data System (ADS)

    Lam, K. P.; Fletcher, P.

    2009-02-01

    Parallel processing promises scalable and effective computing power which can handle the complex data structures of knowledge representation languages efficiently. Past and present sequential architectures, despite the rapid advances in computing technology, have yet to provide such processing power and to offer a holistic solution to the problem. This paper presents a fresh attempt in formulating alternative techniques for grammar learning, based upon the parallel and distributed model of connectionism, to facilitate the more cognitively demanding task of pattern understanding. The proposed method has been compared with the contemporary approach of shape modelling based on level sets, and demonstrated its potential as a prototype for constructing robust networks on high performance parallel platforms.

  8. Crack Level Estimation Approach for Planetary Gear Sets Based on Simulation Signal and GRA

    NASA Astrophysics Data System (ADS)

    Cheng, Zhe; Hu, Niaoqing; Zuo, Mingjian; Fan, Bin

    2012-05-01

    The planetary gearbox is a critical mechanism in helicopter transmission systems. Tooth failures in planetary gear sets pose great risk to helicopter operations. A crack level estimation methodology is devised in this paper by integrating a physical model for simulated signal generation with a grey relational analysis (GRA) algorithm for damage level estimation. The proposed method was calibrated first with fault-seeded test data and then validated with data from other tests on a helicopter transmission test rig. The estimation results coincide with the actual test records, showing the effectiveness and accuracy of the method and providing a novel way of combining model-based and signal-analysis methods for more accurate health monitoring and condition prediction.
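    GRA compares a measured feature sequence against simulated reference sequences for each damage level; the standard grey relational grade can be sketched as follows (the paper's feature choice is not specified in the abstract, so the inputs here are generic sequences):

```python
import numpy as np

def grey_relational_grade(reference, candidate, rho=0.5):
    """Grey relational grade between a reference feature sequence and a
    candidate sequence: 1.0 means identical, smaller means less similar.
    rho is the distinguishing coefficient (0.5 is customary)."""
    r = np.asarray(reference, float)
    c = np.asarray(candidate, float)
    delta = np.abs(r - c)
    dmin, dmax = delta.min(), delta.max()
    # per-point grey relational coefficients (eps keeps 0/0 equal to 1)
    xi = (dmin + rho * dmax + 1e-12) / (delta + rho * dmax + 1e-12)
    return float(xi.mean())
```

    The estimated crack level would then be the damage level whose simulated signature maximizes the grade.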

  9. Vessel Segmentation and Blood Flow Simulation Using Level-Sets and Embedded Boundary Methods

    SciTech Connect

    Deschamps, T; Schwartz, P; Trebotich, D; Colella, P; Saloner, D; Malladi, R

    2004-12-09

    In this article we address the problem of blood flow simulation in realistic vascular objects. The anatomical surfaces are extracted by means of level-set methods that accurately model the complex and varying surfaces of pathological objects such as aneurysms and stenoses. The surfaces obtained are defined at the sub-pixel level where they intersect the Cartesian grid of the image domain. It is therefore straightforward to construct embedded boundary representations of these objects on the same grid, for which recent work has enabled discretization of the Navier-Stokes equations for incompressible fluids. While most classical techniques require construction of a structured mesh that approximates the surface in order to extrapolate a 3D finite-element gridding of the whole volume, our method directly simulates the blood flow inside the extracted surface without losing any complicated details and without building additional grids.

  10. Application of the WHO alert level framework to cyanobacterial monitoring of Lake Champlain, Vermont.

    PubMed

    Watzin, Mary C; Miller, Emily Brines; Shambaugh, Angela D; Kreider, Meghan A

    2006-06-01

    The increasing incidence of toxic cyanobacteria blooms worldwide has created a need for practical and efficient monitoring in order to protect public health. We developed a monitoring and alert framework based on World Health Organization (WHO) recommendations and applied it on Lake Champlain during the summers of 2002-2004. The protocol began with collection of net samples of phytoplankton in order to maximize the chance of finding potential toxin-producing cyanobacteria. Samples were collected lake-wide in partnership with ongoing monitoring efforts, but because open water sample sites did not capture conditions along the shoreline, we added near-shore and shoreline stations in problem areas. Samples were examined qualitatively until potential toxin-producing taxa were found. Then quantitative analyses began, using a rapid screening method to estimate cell density based on colony size. A final cell density of 4000 cells/mL triggered toxin analyses. Primary analysis was for microcystins using ELISA methods. Cell densities, locations of colonies, and toxin concentrations were reported weekly to public health officials. We found that screening for potential toxin-producing cyanobacteria and then measuring toxin concentrations when cell densities reached critical levels worked well to identify problem locations. Although the WHO recommends using chlorophyll a concentration, it was not a good indicator of problem densities of potential toxin-producing cyanobacteria. Our cell density screening method missed no developing blooms but produced less precise density estimates at high cell counts. Overall, our framework appears to provide an efficient and effective method for monitoring cyanotoxin risks.
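
    The tiered decision logic of the alert framework can be sketched as a small function. Only the 4000 cells/mL toxin-analysis trigger comes from the abstract; the tier labels and function name are illustrative.

    ```python
    def monitoring_action(toxigenic_taxa_present, cell_density=None,
                          toxin_trigger=4000.0):
        """Suggest the next monitoring step under a WHO-style alert
        framework.  Tier names are illustrative; only the 4000 cells/mL
        trigger for toxin analysis is taken from the study."""
        if not toxigenic_taxa_present:
            return "qualitative screening"
        if cell_density is None or cell_density < toxin_trigger:
            return "quantitative cell counts"
        return "toxin analysis (microcystin ELISA)"
    ```

    Qualitative screening continues until potential toxin producers appear; quantitative counts then run until the density threshold triggers toxin measurement.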

  11. Time-optimal path planning in dynamic flows using level set equations: theory and schemes

    NASA Astrophysics Data System (ADS)

    Lolla, Tapovan; Lermusiaux, Pierre F. J.; Ueckermann, Mattheus P.; Haley, Patrick J.

    2014-10-01

    We develop an accurate partial differential equation-based methodology that predicts the time-optimal paths of autonomous vehicles navigating in any continuous, strong, and dynamic ocean currents, obviating the need for heuristics. The goal is to predict a sequence of steering directions so that vehicles can best utilize or avoid currents to minimize their travel time. Inspired by the level set method, we derive and demonstrate that a modified level set equation governs the time-optimal path in any continuous flow. We show that our algorithm is computationally efficient and apply it to a number of experiments. First, we validate our approach through a simple benchmark application in a Rankine vortex flow for which an analytical solution is available. Next, we apply our methodology to more complex, simulated flow fields such as unsteady double-gyre flows driven by wind stress and flows behind a circular island. These examples show that time-optimal paths for multiple vehicles can be planned even in the presence of complex flows in domains with obstacles. Finally, we present and support through illustrations several remarks that describe specific features of our methodology.
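
    A minimal one-dimensional sketch of the governing idea, assuming the modified level set equation takes the form phi_t + F*|phi_x| + u*phi_x = 0 for vehicle speed F in a current u (first-order upwind discretization; grid, names and parameters are ours):

    ```python
    import numpy as np

    def evolve_front(phi, dx, dt, steps, F, u):
        """First-order upwind update of the 1-D modified level set equation
        phi_t + F*|phi_x| + u*phi_x = 0  (vehicle speed F >= 0, current u)."""
        phi = phi.copy()
        for _ in range(steps):
            dm = np.empty_like(phi)
            dp = np.empty_like(phi)
            dm[1:] = (phi[1:] - phi[:-1]) / dx   # backward difference
            dm[0] = dm[1]
            dp[:-1] = (phi[1:] - phi[:-1]) / dx  # forward difference
            dp[-1] = dp[-2]
            # Godunov approximation of |phi_x| for outward motion (F >= 0)
            grad = np.sqrt(np.maximum(np.maximum(dm, 0.0) ** 2,
                                      np.minimum(dp, 0.0) ** 2))
            adv = u * (dm if u > 0 else dp)      # upwind advection by the current
            phi -= dt * (F * grad + adv)
        return phi
    ```

    For a front starting at x = 0 with a favorable current (u > 0), the zero level set reaches x = (F + u)t, i.e. the reachability front advances faster than the vehicle speed alone.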

  13. A level set method for determining critical curvatures for drainage and imbibition.

    PubMed

    Prodanović, Masa; Bryant, Steven L

    2006-12-15

    An accurate description of the mechanics of pore level displacement of immiscible fluids could significantly improve the predictions from pore network models of capillary pressure-saturation curves, interfacial areas and relative permeability in real porous media. If we assume quasi-static displacement, at constant pressure and surface tension, pore scale interfaces are modeled as constant mean curvature surfaces, which are not easy to calculate. Moreover, the extremely irregular geometry of natural porous media makes it difficult to evaluate surface curvature values and corresponding geometric configurations of two fluids. Finally, accounting for the topological changes of the interface, such as splitting or merging, is nontrivial. We apply the level set method for tracking and propagating interfaces in order to robustly handle topological changes and to obtain geometrically correct interfaces. We describe a simple but robust model for determining critical curvatures for throat drainage and pore imbibition. The model is set up for quasi-static displacements but it nevertheless captures both reversible and irreversible behavior (Haines jump, pore body imbibition). The pore scale grain boundary conditions are extracted from model porous media and from imaged geometries in real rocks. The method gives quantitative agreement with measurements and with other theories and computational approaches. PMID:17027812
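
    Level set approaches of this kind need interface curvature, since the critical capillary pressure follows from the Young-Laplace relation P_c = sigma * kappa. A sketch of the standard curvature evaluation kappa = div(grad(phi)/|grad(phi)|) on a 2-D grid (discretization choices are ours, not the paper's):

    ```python
    import numpy as np

    def curvature(phi, dx):
        """Interface curvature of a 2-D level set field,
        kappa = div(grad(phi)/|grad(phi)|), by central differences."""
        px, py = np.gradient(phi, dx)            # axis 0 = x, axis 1 = y
        norm = np.sqrt(px ** 2 + py ** 2) + 1e-12  # avoid division by zero
        nxx = np.gradient(px / norm, dx, axis=0)
        nyy = np.gradient(py / norm, dx, axis=1)
        return nxx + nyy
    ```

    For a circular interface of radius r in 2-D, the computed curvature near the zero level set approaches 1/r, as expected.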

  14. Incorporating level set methods in Geographical Information Systems (GIS) for land-surface process modeling

    NASA Astrophysics Data System (ADS)

    Pullar, D.

    2005-08-01

    Land-surface processes include a broad class of models that operate at a landscape scale. Current modelling approaches tend to be specialised towards one type of process, yet it is the interaction of processes that is increasingly seen as important to obtaining a more integrated approach to land management. This paper presents a technique and a tool that may be applied generically to landscape processes. The technique tracks moving interfaces across landscapes for processes such as water flow, biochemical diffusion, and plant dispersal. Its theoretical development applies a Lagrangian approach to motion over an Eulerian grid space by tracking quantities across a landscape as an evolving front. An algorithm for this technique, called the level set method, is implemented in a geographical information system (GIS). It fits with a field data model in GIS and is implemented as operators in map algebra. The paper describes an implementation of the level set methods in a map algebra programming language, called MapScript, and gives example program scripts for applications in ecology and hydrology.

  15. On the Relationship between Variational Level Set-Based and SOM-Based Active Contours.

    PubMed

    Abdelsamea, Mohammed M; Gnecco, Giorgio; Gaber, Mohamed Medhat; Elyan, Eyad

    2015-01-01

    Most Active Contour Models (ACMs) treat the image segmentation problem as a functional optimization problem, as they work on dividing an image into several regions by optimizing a suitable functional. Among ACMs, variational level set methods have been used to build an active contour with the aim of modeling arbitrarily complex shapes. Moreover, they can also handle topological changes of the contours. Self-Organizing Maps (SOMs) have attracted the attention of many computer vision scientists, particularly in modeling an active contour based on the idea of utilizing the prototypes (weights) of a SOM to control the evolution of the contour. SOM-based models have been proposed in general with the aim of exploiting the specific ability of SOMs to learn the edge-map information via their topology preservation property, and of overcoming some drawbacks of other ACMs, such as becoming trapped in local minima of the image energy functional to be minimized in such models. In this survey, we illustrate the main concepts of variational level set-based ACMs and SOM-based ACMs, as well as their relationship, and comprehensively review the development of their state-of-the-art models from a machine learning perspective, with a focus on their strengths and weaknesses. PMID:25960736

  16. Comparisons and Limitations of Gradient Augmented Level Set and Algebraic Volume of Fluid Methods

    NASA Astrophysics Data System (ADS)

    Anumolu, Lakshman; Ryddner, Douglas; Trujillo, Mario

    2014-11-01

    Recent numerical methods for implicit interface transport are generally presented as enjoying higher orders of spatial-temporal convergence when compared to classical methods or less sophisticated approaches. However, when applied to test cases designed to simulate practical industrial conditions, a significant reduction in convergence is observed in the higher-order methods, whereas the less sophisticated approaches retain their nominal convergence but exhibit growth in the error norms. This provides an opportunity to understand the underlying issues which cause this decrease in accuracy in both types of methods. As an example we consider the Gradient Augmented Level Set method (GALS) and a variant of the Volume of Fluid (VoF) method in our study. Results show that while both methods do suffer from a loss of accuracy, it is the higher-order method that suffers more. The implication is a significant reduction in the performance advantage of the GALS method over the VoF scheme. Reasons for this lie in the behavior of the higher-order derivatives, particularly in situations where the level set field is highly distorted. For the VoF approach, serious spurious deformations of the interface are observed, albeit with a deceptive zero loss of mass.
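
    The convergence reductions discussed above are typically quantified by the observed order of accuracy computed from error norms on two grid resolutions; a one-line helper (ours):

    ```python
    import math

    def observed_order(e_coarse, e_fine, h_coarse, h_fine):
        """Observed order of convergence from error norms on two grids:
        p = log(e_c / e_f) / log(h_c / h_f)."""
        return math.log(e_coarse / e_fine) / math.log(h_coarse / h_fine)
    ```

    Halving the grid spacing while the error drops by a factor of four, for example, indicates second-order behavior; a drop by only a factor of two on the same refinement indicates the order has degraded to one.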

  17. Density and level set-XFEM schemes for topology optimization of 3-D structures

    NASA Astrophysics Data System (ADS)

    Villanueva, Carlos H.; Maute, Kurt

    2014-07-01

    As the capabilities of additive manufacturing techniques increase, topology optimization provides a promising approach to design geometrically sophisticated structures. Traditional topology optimization methods aim at finding conceptual designs, but they often do not sufficiently resolve the geometry and the structural response such that the optimized designs can be directly used for manufacturing. To overcome these limitations, this paper studies the viability of the extended finite element method (XFEM) in combination with the level-set method (LSM) for topology optimization of three-dimensional structures. The LSM describes the geometry by defining the nodal level set values via explicit functions of the optimization variables. The structural response is predicted by a generalized version of the XFEM. The LSM-XFEM approach is compared against results from a traditional Solid Isotropic Material with Penalization method for two-phase "solid-void" and "solid-solid" problems. The numerical results demonstrate that the LSM-XFEM approach crisply describes the geometry and predicts the structural response with acceptable accuracy even on coarse meshes.

  18. On the Relationship between Variational Level Set-Based and SOM-Based Active Contours

    PubMed Central

    Abdelsamea, Mohammed M.; Gnecco, Giorgio; Gaber, Mohamed Medhat; Elyan, Eyad

    2015-01-01

    Most Active Contour Models (ACMs) treat the image segmentation problem as a functional optimization problem, as they work on dividing an image into several regions by optimizing a suitable functional. Among ACMs, variational level set methods have been used to build an active contour with the aim of modeling arbitrarily complex shapes. Moreover, they can also handle topological changes of the contours. Self-Organizing Maps (SOMs) have attracted the attention of many computer vision scientists, particularly in modeling an active contour based on the idea of utilizing the prototypes (weights) of a SOM to control the evolution of the contour. SOM-based models have been proposed in general with the aim of exploiting the specific ability of SOMs to learn the edge-map information via their topology preservation property, and of overcoming some drawbacks of other ACMs, such as becoming trapped in local minima of the image energy functional to be minimized in such models. In this survey, we illustrate the main concepts of variational level set-based ACMs and SOM-based ACMs, as well as their relationship, and comprehensively review the development of their state-of-the-art models from a machine learning perspective, with a focus on their strengths and weaknesses. PMID:25960736

  19. A mass conserving level set method for detailed numerical simulation of liquid atomization

    SciTech Connect

    Luo, Kun; Shao, Changxiao; Yang, Yue; Fan, Jianren

    2015-10-01

    An improved mass conserving level set method for detailed numerical simulations of liquid atomization is developed to address the issue of mass loss in the existing level set method. This method introduces a mass remedy procedure based on the local curvature at the interface, and in principle, can ensure the absolute mass conservation of the liquid phase in the computational domain. Three benchmark cases, including Zalesak's disk, a drop deforming in a vortex field, and the binary drop head-on collision, are simulated to validate the present method, and the excellent agreement with exact solutions or experimental results is achieved. It is shown that the present method is able to capture the complex interface with second-order accuracy and negligible additional computational cost. The present method is then applied to study more complex flows, such as a drop impacting on a liquid film and the swirling liquid sheet atomization, which again, demonstrates the advantages of mass conservation and the capability to represent the interface accurately.
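
    The mass remedy in the paper acts locally through the interface curvature; a simplified, purely global variant conveys the idea: shift the level set field by a constant, found by bisection, until the enclosed liquid area matches its target (all names and discretization choices are ours).

    ```python
    import numpy as np

    def conserve_mass(phi, dx, target_area, iters=100):
        """Restore the liquid area enclosed by the zero level set of a 2-D
        field by adding a constant shift c (bisection on the monotone map
        c -> area{phi + c < 0}).  A simplified global variant of the
        paper's curvature-based local remedy."""
        def area(c):
            return np.count_nonzero(phi + c < 0.0) * dx * dx
        lo, hi = -np.abs(phi).max(), np.abs(phi).max()
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            if area(mid) > target_area:
                lo = mid      # enclosed region too large -> raise phi more
            else:
                hi = mid      # enclosed region too small -> raise phi less
        return phi + 0.5 * (lo + hi)
    ```

    Applied to a circle whose radius has drifted inward (mass loss), the correction shifts the zero level set back out until the area matches the target, up to one cell shell of quantization.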

  20. Soybean fruit development and set at the node level under combined photoperiod and radiation conditions

    PubMed Central

    Nico, Magalí; Mantese, Anita I.; Miralles, Daniel J.; Kantolic, Adriana G.

    2016-01-01

    In soybean, long days during post-flowering increase seed number. This positive photoperiodic effect on seed number has been previously associated with increments in the amount of radiation accumulated during the crop cycle because long days extend the duration of the crop cycle. However, evidence of intra-nodal processes independent of the availability of assimilates suggests that photoperiodic effects at the node level might also contribute to pod set. This work aims to identify the main mechanisms responsible for the increase in pod number per node in response to long days; including the dynamics of flowering, pod development, growth and set at the node level. Long days increased pods per node on the main stems, by increasing pods on lateral racemes (usually dominated positions) at some main stem nodes. Long days lengthened the flowering period and thereby increased the number of opened flowers on lateral racemes. The flowering period was prolonged under long days because effective seed filling was delayed on primary racemes (dominant positions). Long days also delayed the development of flowers into pods with filling seeds, delaying the initiation of pod elongation without modifying pod elongation rate. The embryo development matched the external pod length irrespective of the pod’s chronological age. These results suggest that long days during post-flowering enhance pod number per node through a relief of the competition between pods of different hierarchy within the node. The photoperiodic effect on the development of dominant pods, delaying their elongation and therefore postponing their active growth, extends flowering and allows pod set at positions that are usually dominated. PMID:26512057

  1. Modelling Molecular Mechanisms: A Framework of Scientific Reasoning to Construct Molecular-Level Explanations for Cellular Behaviour

    ERIC Educational Resources Information Center

    van Mil, Marc H. W.; Boerwinkel, Dirk Jan; Waarlo, Arend Jan

    2013-01-01

    Although molecular-level details are part of the upper-secondary biology curriculum in most countries, many studies report that students fail to connect molecular knowledge to phenomena at the level of cells, organs and organisms. Recent studies suggest that students lack a framework to reason about complex systems to make this connection. In this…

  2. Critical Factors to Consider in Evaluating Standard-Setting Studies to Map Language Test Scores to Frameworks of Language Proficiency

    ERIC Educational Resources Information Center

    Tannenbaum, Richard J.; Cho, Yeonsuk

    2014-01-01

    In this article, we consolidate and present in one place what is known about quality indicators for setting standards so that stakeholders may be able to recognize the signs of standard-setting quality. We use the context of setting standards to associate English language test scores with language proficiency descriptions such as those presented…

  3. Whole abdominal wall segmentation using augmented active shape models (AASM) with multi-atlas label fusion and level set

    NASA Astrophysics Data System (ADS)

    Xu, Zhoubing; Baucom, Rebeccah B.; Abramson, Richard G.; Poulose, Benjamin K.; Landman, Bennett A.

    2016-03-01

    The abdominal wall is an important structure differentiating subcutaneous and visceral compartments and intimately involved with maintaining abdominal structure. Segmentation of the whole abdominal wall on routinely acquired computed tomography (CT) scans remains challenging due to variations and complexities of the wall and surrounding tissues. In this study, we propose a slice-wise augmented active shape model (AASM) approach to robustly segment both the outer and inner surfaces of the abdominal wall. Multi-atlas label fusion (MALF) and level set (LS) techniques are integrated into the traditional ASM framework. The AASM approach globally optimizes the landmark updates in the presence of complicated underlying local anatomical contexts. The proposed approach was validated on 184 axial slices of 20 CT scans. The Hausdorff distance against the manual segmentation was significantly reduced using the proposed approach compared to that using ASM, MALF, and LS individually. Our segmentation of the whole abdominal wall enables subcutaneous and visceral fat measurement, with high correlation to the measurement derived from manual segmentation. This study presents the first generic algorithm that combines ASM, MALF, and LS, and demonstrates practical application for automatically capturing visceral and subcutaneous fat volumes.
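
    The Hausdorff distance used for validation can be computed directly from two sampled contours; a brute-force sketch (adequate for a few hundred points per slice; names are ours):

    ```python
    import numpy as np

    def hausdorff(A, B):
        """Symmetric Hausdorff distance between two point sets
        (n x 2 and m x 2 arrays), e.g. a segmented contour versus a
        manually traced one."""
        D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
        return max(D.min(axis=1).max(), D.min(axis=0).max())
    ```

    The metric reports the worst-case distance from any point of one contour to the nearest point of the other, which is why it is sensitive to even a single large local segmentation error.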

  4. On the computation of viscous terms for incompressible two-phase flows with Level Set/Ghost Fluid Method

    NASA Astrophysics Data System (ADS)

    Lalanne, Benjamin; Villegas, Lucia Rueda; Tanguy, Sébastien; Risso, Frédéric

    2015-11-01

    In this paper, we present a detailed analysis of the computation of the viscous terms for the simulation of incompressible two-phase flows in the framework of the Level Set/Ghost Fluid Method when viscosity is discontinuous across the interface. Two pioneering papers on the topic, Kang et al. [10] and Sussman et al. [26], proposed two different approaches to deal with the viscous terms. However, a definitive assessment of their respective efficiency is currently not available. In this paper, we demonstrate from theoretical arguments and confirm from numerical simulations that these two approaches are equivalent from a continuous point of view, and we compare their accuracies in relevant test-cases. We also propose a new intermediate method which uses the properties of the two previous methods. This new method enables a simple implementation for an implicit temporal discretization of the viscous terms. In addition, the efficiency of the Delta Function method [24] is also assessed and compared to the three previous ones, which allows us to propose a general overview of the accuracy of all available methods. The selected test-cases involve configurations wherein viscosity plays a major role and for which either theoretical results or experimental data are available as reference solutions: simulations of spherical rising bubbles, shape-oscillating bubbles and deformed rising bubbles at low Reynolds numbers.
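
    One common Ghost Fluid-style treatment of a viscosity jump evaluates an effective face viscosity by harmonic averaging, weighted by the interface position theta within the cell. This sketch is a generic closure for illustration, not the exact discretization of either Kang et al. or Sussman et al.:

    ```python
    def interface_viscosity(mu_liquid, mu_gas, theta):
        """Harmonic average of a viscosity that jumps across the interface,
        where theta in [0, 1] is the fraction of the cell face lying on
        the liquid side."""
        return mu_liquid * mu_gas / (theta * mu_gas + (1.0 - theta) * mu_liquid)
    ```

    The harmonic mean recovers each pure-phase viscosity in the limits theta = 1 and theta = 0, and keeps the viscous flux continuous across the interface for a one-dimensional diffusion stencil.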

  5. Whole Abdominal Wall Segmentation using Augmented Active Shape Models (AASM) with Multi-Atlas Label Fusion and Level Set

    PubMed Central

    Xu, Zhoubing; Baucom, Rebeccah B.; Abramson, Richard G.; Poulose, Benjamin K.; Landman, Bennett A.

    2016-01-01

    The abdominal wall is an important structure differentiating subcutaneous and visceral compartments and intimately involved with maintaining abdominal structure. Segmentation of the whole abdominal wall on routinely acquired computed tomography (CT) scans remains challenging due to variations and complexities of the wall and surrounding tissues. In this study, we propose a slice-wise augmented active shape model (AASM) approach to robustly segment both the outer and inner surfaces of the abdominal wall. Multi-atlas label fusion (MALF) and level set (LS) techniques are integrated into the traditional ASM framework. The AASM approach globally optimizes the landmark updates in the presence of complicated underlying local anatomical contexts. The proposed approach was validated on 184 axial slices of 20 CT scans. The Hausdorff distance against the manual segmentation was significantly reduced using the proposed approach compared to that using ASM, MALF, and LS individually. Our segmentation of the whole abdominal wall enables subcutaneous and visceral fat measurement, with high correlation to the measurement derived from manual segmentation. This study presents the first generic algorithm that combines ASM, MALF, and LS, and demonstrates practical application for automatically capturing visceral and subcutaneous fat volumes. PMID:27127333

  6. A new analytical framework for assessing the effect of sea-level rise and dredging on tidal damping in estuaries

    NASA Astrophysics Data System (ADS)

    Cai, Huayang; Savenije, Hubert H. G.; Toffolon, Marco

    2012-09-01

    This paper explores different analytical solutions of the tidal hydraulic equations in convergent estuaries. Linear and quasi-nonlinear models are compared for given geometry, friction, and tidal amplitude at the seaward boundary, proposing a common theoretical framework and showing that the main difference between the examined models lies in the treatment of the friction term. A general solution procedure is proposed for the set of governing analytical equations expressed in dimensionless form, and a new analytical expression for the tidal damping is derived as a weighted average of two solutions, characterized by the usual linearized formulation and the quasi-nonlinear Lagrangean treatment of the friction term. The different analytical solutions are tested against fully nonlinear numerical results for a wide range of parameters, and compared with observations in the Scheldt estuary. Overall, the new method compares best with the numerical solution and field data. The new accurate relationship for the tidal damping is then exploited for a classification of estuaries based on the distance of the tidally averaged depth from the ideal depth (relative to vanishing amplification) and the critical depth (condition for maximum amplification). Finally, the new model is used to investigate the effect of depth variations on the tidal dynamics in 23 real estuaries, highlighting the usefulness of the analytical method to assess the influence of human interventions (e.g. by dredging) and global sea-level rise on the estuarine environment.

  7. High-Order Discontinuous Galerkin Level Set Method for Interface Tracking and Re-Distancing on Unstructured Meshes

    NASA Astrophysics Data System (ADS)

    Greene, Patrick; Nourgaliev, Robert; Schofield, Sam

    2015-11-01

    A new sharp high-order interface tracking method for multi-material flow problems on unstructured meshes is presented. The method combines the marker-tracking algorithm with a discontinuous Galerkin (DG) level set method to implicitly track interfaces. DG projection is used to provide a mapping from the Lagrangian marker field to the Eulerian level set field. For the level set re-distancing, we developed a novel marching method that takes advantage of the unique features of the DG representation of the level set. The method efficiently marches outward from the zero level set with values in the new cells being computed solely from cell neighbors. Results are presented for a number of different interface geometries including ones with sharp corners and multiple hierarchical level sets. The method can robustly handle the level set discontinuities without explicit utilization of solution limiters. Results show that the expected high order (3rd and higher) of convergence for the DG representation of the level set is obtained for smooth solutions on unstructured meshes. High-order re-distancing on irregular meshes is a must for applications where the interfacial curvature is important for underlying physics, such as surface tension, wetting and detonation shock dynamics. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. Information management release number LLNL-ABS-675636.

  8. A GPU-accelerated adaptive discontinuous Galerkin method for level set equation

    NASA Astrophysics Data System (ADS)

    Karakus, A.; Warburton, T.; Aksel, M. H.; Sert, C.

    2016-01-01

    This paper presents a GPU-accelerated nodal discontinuous Galerkin method for the solution of the two- and three-dimensional level set (LS) equation on unstructured adaptive meshes. Using adaptive mesh refinement, computations are localised mostly near the interface location to reduce the computational cost. The small global time step size resulting from the local adaptivity is avoided by local time-stepping based on a multi-rate Adams-Bashforth scheme. Platform independence of the solver is achieved with an extensible multi-threading programming API that allows runtime selection of different computing devices (GPU and CPU) and different threading interfaces (CUDA, OpenCL and OpenMP). Overall, a highly scalable, accurate and mass conservative numerical scheme that preserves the simplicity of LS formulation is obtained. Efficiency, performance and local high-order accuracy of the method are demonstrated through distinct numerical test cases.
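
    The multi-rate Adams-Bashforth idea builds on the standard AB3 step applied per element group with a local time step; the single-rate kernel looks like this (bootstrap scheme and names are ours, as a sketch rather than the solver's implementation):

    ```python
    def ab3_integrate(f, y0, t0, dt, steps):
        """Third-order Adams-Bashforth integration of y' = f(t, y),
        bootstrapped with two midpoint (RK2) steps.  A multi-rate variant
        would apply this per element group with a locally chosen dt."""
        t, y = t0, y0
        hist = []
        for _ in range(2):                     # bootstrap: two RK2 steps
            k = f(t, y)
            hist.append(k)
            y = y + dt * f(t + 0.5 * dt, y + 0.5 * dt * k)
            t += dt
        for _ in range(steps - 2):             # AB3 multistep updates
            k = f(t, y)
            y = y + dt * (23.0 * k - 16.0 * hist[-1] + 5.0 * hist[-2]) / 12.0
            hist.append(k)
            t += dt
        return y
    ```

    Because the multistep update reuses stored right-hand-side evaluations, it costs one evaluation of f per step, which is what makes local time-stepping attractive on adaptively refined meshes.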

  9. Wave breaking over sloping beaches using a coupled boundary integral-level set method

    SciTech Connect

    Garzon, M.; Adalsteinsson, D.; Gray, L.; Sethian, J.A.

    2003-12-08

    We present a numerical method for tracking breaking waves over sloping beaches. We use a fully non-linear potential model for incompressible, irrotational and inviscid flow, and consider the effects of beach topography on breaking waves. The algorithm uses a Boundary Element Method (BEM) to compute the velocity at the interface, coupled to a Narrow Band Level Set Method to track the evolving air/water interface, and an associated extension equation to update the velocity potential both on and off the interface. The formulation of the algorithm is applicable to two and three dimensional breaking waves; in this paper, we concentrate on two-dimensional results showing wave breaking and rollup, and perform numerical convergence studies and comparison with previous techniques.

  10. Virtual Contrast for Coronary Vessels Based on Level Set Generated Subvoxel Accurate Centerlines

    PubMed Central

    Van Uitert, Robert; Wolf, Ivo; Tzatha, Efstathia; Gharib, Ahmed M; Summers, Ronald; Meinzer, Hans-Peter; Pettigrew, Roderic

    2006-01-01

    We present a tool for tracking coronary vessels in MRI scans of the human heart to aid in the screening of heart diseases. The vessels are identified through a single click inside each vessel present in a standard orthogonal view. The vessel identification results from a series of computational steps including eigenvalue analysis of the Hessian of the MRI image followed by a level set-based extraction of the vessel centerline. All identified vessels are highlighted using a virtual contrast agent and displayed simultaneously in a spherical curved reformation view. In cases of over-segmentation, the vessel traces can be shortened by a click on each vessel end point. Intermediate analysis results of the vessel computation steps can be displayed as well. We successfully validated the tool on 40 MRI scans demonstrating accuracy and significant time savings over manual vessel tracing. PMID:23165062

  11. Defining obesity: second-level agenda setting attributes in black newspapers and general audience newspapers.

    PubMed

    Lee, Hyunmin; Len-Ríos, María E

    2014-01-01

    This content analysis study examines how obesity is depicted in general-audience and Black newspaper stories (N=391) through the lens of second-level agenda setting theory. The results reveal that both Black newspapers and general-audience newspapers generally ascribe individual causes for obesity. While both types of newspapers largely neglected to mention solutions for the problem, Black newspapers were more likely than general-audience newspapers to suggest both individual and societal solutions for treating obesity. For Black newspapers, these solutions more often included community interventions. In addition, Black newspapers more often used a negative tone in stories and more frequently mentioned ethnic and racial minorities as at-risk groups.

  12. A level set based flamelet model for the prediction of combustion in spark ignition engines

    NASA Astrophysics Data System (ADS)

    Ewald, J.; Peters, N.

    2005-08-01

    A Flamelet Model based on the Level Set approach for turbulent premixed combustion is presented. The original model is enhanced in order to consistently model the evolution of the premixed flame from laminar into a fully developed turbulent flame. This is accomplished by establishing a linear relationship between the thickness of the turbulent flame brush and the turbulent burning velocity. Starting from there, a model for the initial flame propagation of a spherical spark kernel immediately after ignition and for the flame propagation in 3D space is derived. In contrast to other models, the same physical modeling assumptions are employed for the phase immediately after spark ignition and for the later phases of flame propagation. The model is applied to a test case in a homogeneous charge Spark Ignition (SI) engine.

  13. Topology-optimized carpet cloaks based on a level-set boundary expression

    NASA Astrophysics Data System (ADS)

    Fujii, Garuda; Ueta, Tsuyoshi

    2016-10-01

    The concept of topology-optimized carpet cloaks is presented using level-set boundary expressions. Specifically, these carpet cloaks are designed with the idea of minimizing the value of an objective functional, which is here defined as the integrated intensity of the difference between the electric field reflected by a flat plane and that controlled by the carpet cloak. Made of dielectric material, our cloaks are designed to imitate reflections from a flat plane, and with some cloaks, the value of the objective functional falls below 0.12 % of that for a bare bump on a flat plane. These optimal carpet cloaks spontaneously satisfy the time-reversal symmetry of the scattered field during the optimization process. The profiles associated with optimal configurations are controlled by adjusting a regularization parameter. With this approach, a variety of configurations with different structural characteristics can be obtained.

  14. Initialisation of 3D level set for hippocampus segmentation from volumetric brain MR images

    NASA Astrophysics Data System (ADS)

    Hajiesmaeili, Maryam; Dehmeshki, Jamshid; Bagheri Nakhjavanlo, Bashir; Ellis, Tim

    2014-04-01

Shrinkage of the hippocampus is a primary biomarker for Alzheimer's disease and can be measured through accurate segmentation of brain MR images. This paper describes the problem of initialising a 3D level set algorithm for hippocampus segmentation, which must cope with some challenging characteristics, such as small size, wide range of intensities, narrow width, and shape variation. In addition, MR images require bias correction to account for the intensity inhomogeneity associated with the scanner technology. Due to these inhomogeneities, using a single initialisation seed region inside the hippocampus is prone to failure. Alternative initialisation strategies are explored, such as using multiple initialisations in different sections (the head, body and tail) of the hippocampus. The Dice metric is used to validate our segmentation results against ground truth for a dataset of 25 MR images. Experimental results indicate a significant improvement in segmentation performance using the multiple-initialisation technique, yielding more accurate segmentation results for the hippocampus.
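The Dice metric used for validation is the standard overlap score 2|A ∩ B| / (|A| + |B|) between a segmentation and its ground truth; a minimal sketch:

```python
import numpy as np

def dice_coefficient(seg, truth):
    """Dice similarity between two binary masks: 2|A ∩ B| / (|A| + |B|)."""
    seg = np.asarray(seg, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    denom = seg.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(seg, truth).sum() / denom

# Toy 3x3 masks overlapping in two pixels:
a = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 0]])
b = np.array([[1, 0, 0], [0, 1, 1], [0, 0, 0]])
print(dice_coefficient(a, b))  # 2*2 / (3+3) = 0.666...
```

Each candidate initialisation's final segmentation would be scored this way against the manually outlined ground truth.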

  15. Numerical Schemes for the Hamilton-Jacobi and Level Set Equations on Triangulated Domains

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.; Sethian, James A.

    1997-01-01

Borrowing from techniques developed for conservation law equations, numerical schemes which discretize the Hamilton-Jacobi (H-J), level set, and Eikonal equations on triangulated domains are presented. The first scheme is a provably monotone discretization for certain forms of the H-J equations. Unfortunately, the basic scheme lacks proper Lipschitz continuity of the numerical Hamiltonian. By employing a virtual edge flipping technique, Lipschitz continuity of the numerical flux is restored on acute triangulations. Next, schemes are introduced and developed based on the weaker concept of positive coefficient approximations for homogeneous Hamiltonians. These schemes possess a discrete maximum principle on arbitrary triangulations and naturally exhibit proper Lipschitz continuity of the numerical Hamiltonian. Finally, a class of Petrov-Galerkin approximations is considered. These schemes are stabilized via a least-squares bilinear form. The Petrov-Galerkin schemes do not possess a discrete maximum principle but generalize to high-order accuracy.
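The triangulated-domain schemes above build on upwind ideas that are easiest to see on a regular grid. For intuition only, here is the textbook Godunov update for the Eikonal equation |∇T| = 1 on a Cartesian grid, iterated in fast-sweeping orderings; this is a standard sketch, not the paper's triangulation schemes.

```python
import numpy as np

def eikonal_sweep(phi, h=1.0, sweeps=4):
    """Fast-sweeping Gauss-Seidel iterations of the Godunov upwind update
    for the Eikonal equation |grad T| = 1 with fixed source values."""
    ny, nx = phi.shape
    orderings = [
        (range(ny), range(nx)),
        (range(ny), range(nx - 1, -1, -1)),
        (range(ny - 1, -1, -1), range(nx)),
        (range(ny - 1, -1, -1), range(nx - 1, -1, -1)),
    ]
    for _ in range(sweeps):
        for rows, cols in orderings:
            for i in rows:
                for j in cols:
                    a = min(phi[i - 1, j] if i > 0 else np.inf,
                            phi[i + 1, j] if i < ny - 1 else np.inf)
                    b = min(phi[i, j - 1] if j > 0 else np.inf,
                            phi[i, j + 1] if j < nx - 1 else np.inf)
                    if np.isinf(min(a, b)):
                        continue  # no finite upwind neighbour yet
                    if abs(a - b) >= h:
                        t = min(a, b) + h
                    else:
                        t = 0.5 * (a + b + np.sqrt(2 * h * h - (a - b) ** 2))
                    phi[i, j] = min(phi[i, j], t)  # monotone decrease only
    return phi

phi = np.full((5, 5), np.inf)
phi[2, 2] = 0.0          # source: distance zero
eikonal_sweep(phi)
print(phi[2, 4])         # axis-aligned distance from the source: 2.0
```

The `min`/`max` selection of upwind neighbours is the grid analogue of the monotone, positive-coefficient constructions the paper develops on triangulations.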

  16. Large deformation solid-fluid interaction via a level set approach.

    SciTech Connect

    Schunk, Peter Randall; Noble, David R.; Baer, Thomas A.; Rao, Rekha Ranjana; Notz, Patrick K.; Wilkes, Edward Dean

    2003-12-01

    Solidification and blood flow seemingly have little in common, but each involves a fluid in contact with a deformable solid. In these systems, the solid-fluid interface moves as the solid advects and deforms, often traversing the entire domain of interest. Currently, these problems cannot be simulated without innumerable expensive remeshing steps, mesh manipulations or decoupling the solid and fluid motion. Despite the wealth of progress recently made in mechanics modeling, this glaring inadequacy persists. We propose a new technique that tracks the interface implicitly and circumvents the need for remeshing and remapping the solution onto the new mesh. The solid-fluid boundary is tracked with a level set algorithm that changes the equation type dynamically depending on the phases present. This novel approach to coupled mechanics problems promises to give accurate stresses, displacements and velocities in both phases, simultaneously.

  17. Curvelet initialized level set cell segmentation for touching cells in low contrast images.

    PubMed

    Kaur, Sarabpreet; Sahambi, J S

    2016-04-01

Cell segmentation is an important element of automatic cell analysis. This paper proposes a method to extract the cell nuclei and the boundaries of touching cells in low-contrast images. First, the contrast of low-contrast cell images is improved by a combination of a multiscale top-hat filter and the h-maxima transform. Then, a curvelet-initialized level set method is proposed to detect the cell nuclei and the boundaries. The image enhancement results have been verified using PSNR (peak signal-to-noise ratio), and the segmentation results have been verified using accuracy, sensitivity and precision metrics. The results show improved values of the performance metrics with the proposed method. PMID:26922612
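The enhancement step pairs a multiscale top-hat filter with the h-maxima transform; the h-maxima part is omitted in this sketch. A minimal illustration of multiscale top-hat contrast enhancement with SciPy, where the function name, element sizes, and toy image are assumptions rather than the paper's settings:

```python
import numpy as np
from scipy import ndimage

def multiscale_tophat_enhance(img, sizes=(3, 5, 9)):
    """Boost bright small-scale structure (white top-hat) and suppress dark
    small-scale structure (black top-hat), summed over several element sizes."""
    img = np.asarray(img, dtype=float)
    bright = sum(ndimage.white_tophat(img, size=s) for s in sizes)
    dark = sum(ndimage.black_tophat(img, size=s) for s in sizes)
    return img + bright - dark

# Toy image: a dim 4x4 blob on a noisy background
rng = np.random.default_rng(0)
img = rng.normal(100.0, 2.0, (32, 32))
img[12:16, 12:16] += 10.0
out = multiscale_tophat_enhance(img)
```

Structures smaller than the structuring element survive the top-hat and are amplified, which is why dim nuclei stand out after enhancement.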

  18. Prostate ultrasound image segmentation using level set-based region flow with shape guidance

    NASA Astrophysics Data System (ADS)

    Gong, Lixin; Ng, Lydia; Pathak, Sayan D.; Tutar, Ismail; Cho, Paul S.; Haynor, David R.; Kim, Yongmin

    2005-04-01

    Prostate segmentation in ultrasound images is a clinically important and technically challenging task. Despite several research attempts, few effective methods are available. One problem is the limited algorithmic robustness to common artifacts in clinical data sets. To improve the robustness, we have developed a hybrid level set method, which incorporates shape constraints into a region-based curve evolution process. The online segmentation method alternates between two steps, namely, shape model estimation (ME) and curve evolution (CE). The prior shape information is encoded in an implicit parametric model derived offline from manually outlined training data. Utilizing this prior shape information, the ME step tries to compute the maximum a posteriori estimate of the model parameters. The estimated shape is then used to guide the CE step, which in turn provides a new model initialization for the ME step. The process stops automatically when the curve locks onto the specific prostate shape. The ME and the CE steps complement each other to capture both global and local shape details. With shape guidance, this algorithm is less sensitive to initial contour placement and more robust even in the presence of large boundary gaps and strong clutter. Promising results are demonstrated on both synthetic and real prostate ultrasound images.

  19. Setting ozone critical levels for protecting horticultural Mediterranean crops: case study of tomato.

    PubMed

    González-Fernández, I; Calvo, E; Gerosa, G; Bermejo, V; Marzuoli, R; Calatayud, V; Alonso, R

    2014-02-01

Seven experiments carried out in Italy and Spain have been used to parameterise a stomatal conductance model and to establish exposure- and dose-response relationships for the yield and quality of tomato, with the main goal of setting O3 critical levels (CLe). CLe, with confidence intervals in brackets, were set at an accumulated hourly O3 exposure over 40 nl l(-1) of AOT40 = 8.4 (1.2, 15.6) ppm h and a phytotoxic ozone dose above a threshold of 6 nmol m(-2) s(-1) of POD6 = 2.7 (0.8, 4.6) mmol m(-2) for yield, and AOT40 = 18.7 (8.5, 28.8) ppm h and POD6 = 4.1 (2.0, 6.2) mmol m(-2) for quality, both indices performing equally well. CLe confidence intervals provide information on the quality of the dataset and should be included in future calculations of O3 CLe to improve current methodologies. These CLe, derived for sensitive tomato cultivars, should not be applied for quantifying O3-induced losses, at the risk of making important overestimations of the economic losses associated with O3 pollution.
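The AOT40 index used above is a standard accumulated-exposure metric: the sum, over daylight hours, of the hourly O3 concentration in excess of a 40 ppb (nl/l) threshold. A minimal sketch (the function name is illustrative):

```python
def aot40(hourly_ppb):
    """AOT40: accumulated ozone exposure over a 40 ppb (nl/l) threshold,
    summed over daylight-hour mean concentrations; returned in ppm·h."""
    return sum(max(c - 40.0, 0.0) for c in hourly_ppb) / 1000.0

# Four daylight hours at 30, 45, 60 and 80 ppb:
print(aot40([30, 45, 60, 80]))  # (0 + 5 + 20 + 40) / 1000 = 0.065 ppm·h
```

Hours at or below the threshold contribute nothing, which is what distinguishes AOT40 from a simple mean concentration.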

  20. Planning for Productive College-Level Work: Using the Course Assignment Framework.

    ERIC Educational Resources Information Center

    Mohr, Kathleen A. J.

    2002-01-01

    Asserts that well-designed course assignments are a critical component of effective teaching and learning processes. Discusses the Course Assignment Framework, which delineates 10 assignment categories, their rationales, and their advantages for faculty and students. States that the framework promotes combining tasks so that instructors can…

  1. Evaluating HIV Prevention: A Framework for National, State, and Local Levels.

    ERIC Educational Resources Information Center

    Rugg, Deborah; Buehler, Jim; Renaud, Michelle; Gilliam, Aisha; Heitgerd, Janet; Westover, Bonita; Wright-De Aguero, Linda; Bartholow, Kelly; Swanson, Sue

    1999-01-01

    Describes the 1995-1997 evaluation framework and activities of an evaluation of HIV-prevention efforts. This framework is offered as a platform for future efforts to determine the most effective ways to prevent HIV transmission, and it may be generalizable to other disease-prevention and health-promotion programs. (SLD)

  2. Teachers' Lives in Context: A Framework for Understanding Barriers to High Quality Teaching within Resource Deprived Settings

    ERIC Educational Resources Information Center

    Schwartz, Kate; Cappella, Elise; Aber, J. Lawrence

    2016-01-01

    Within low-income communities in low- and high-resource countries, there is a profound need for more effective schools that are better able to foster child and youth development and support student learning. This paper presents a theoretical framework for understanding the role of teacher ecology in influencing teacher effectiveness and, through…

  3. Texture analysis improves level set segmentation of the anterior abdominal wall

    SciTech Connect

    Xu, Zhoubing; Allen, Wade M.; Baucom, Rebeccah B.; Poulose, Benjamin K.; Landman, Bennett A.

    2013-12-15

Purpose: The treatment of ventral hernias (VH) has been a challenging problem for medical care. Repair of these hernias is fraught with failure; recurrence rates ranging from 24% to 43% have been reported, even with the use of biocompatible mesh. Currently, computed tomography (CT) is used to guide intervention through expert, but qualitative, clinical judgments; notably, quantitative metrics based on image processing are not used. The authors propose that image segmentation methods to capture the three-dimensional structure of the abdominal wall and its abnormalities will provide a foundation on which to measure geometric properties of hernias and surrounding tissues and, therefore, to optimize intervention. Methods: In this study with 20 clinically acquired CT scans on postoperative patients, the authors demonstrated a novel approach to geometric classification of the abdominal wall. The authors’ approach uses a texture analysis based on Gabor filters to extract feature vectors and follows a fuzzy c-means clustering method to estimate voxelwise probability memberships for eight clusters. The memberships estimated from the texture analysis help to identify anatomical structures with inhomogeneous intensities. The membership was used to guide the level set evolution, as well as to derive an initial contour close to the abdominal wall. Results: Segmentation results on abdominal walls were both quantitatively and qualitatively validated with surface errors based on manually labeled ground truth. Using texture, mean surface errors for the outer surface of the abdominal wall were less than 2 mm, with 91% of the outer surface less than 5 mm away from the manual tracings; errors were significantly greater (2–5 mm) for methods that did not use the texture. Conclusions: The authors’ approach establishes a baseline for characterizing the abdominal wall for improving VH care. Inherent texture patterns in CT scans are helpful to the tissue classification, and texture

  5. An abdominal aortic aneurysm segmentation method: Level set with region and statistical information

    SciTech Connect

    Zhuge Feng; Rubin, Geoffrey D.; Sun Shaohua; Napel, Sandy

    2006-05-15

We present a system for segmenting the human aortic aneurysm in CT angiograms (CTA), which, in turn, allows measurements of volume and morphological aspects useful for treatment planning. The system estimates a rough 'initial surface', and then refines it using a level set segmentation scheme augmented with two external analyzers: the global region analyzer, which incorporates a priori knowledge of the intensity, volume, and shape of the aorta and other structures, and the local feature analyzer, which uses voxel location, intensity, and texture features to train and drive a support vector machine classifier. Each analyzer outputs a value that corresponds to the likelihood that a given voxel is part of the aneurysm, which is used during level set iteration to control the evolution of the surface. We tested our system using a database of 20 CTA scans of patients with aortic aneurysms. The mean and worst-case values of volume overlap, volume error, mean distance error, and maximum distance error relative to human tracing were 95.3% ± 1.4% (s.d.), worst case 92.9%; 3.5% ± 2.5% (s.d.), worst case 7.0%; 0.6 ± 0.2 mm (s.d.), worst case 1.0 mm; and 5.2 ± 2.3 mm (s.d.), worst case 9.6 mm, respectively. When implemented on a 2.8 GHz Pentium IV personal computer, the mean time required for segmentation was 7.4 ± 3.6 min (s.d.). We also performed experiments that suggest that our method is insensitive to parameter changes within 10% of their experimentally determined values. This preliminary study proves feasibility for an accurate, precise, and robust system for segmentation of the abdominal aneurysm from CTA data, and may be of benefit to patients with aortic aneurysms.

  6. Level-set segmentation of pulmonary nodules in megavolt electronic portal images using a CT prior

    SciTech Connect

    Schildkraut, J. S.; Prosser, N.; Savakis, A.; Gomez, J.; Nazareth, D.; Singh, A. K.; Malhotra, H. K.

    2010-11-15

Purpose: Pulmonary nodules present unique problems during radiation treatment due to nodule position uncertainty that is caused by respiration. The radiation field has to be enlarged to account for nodule motion during treatment. The purpose of this work is to provide a method of locating a pulmonary nodule in a megavolt portal image that can be used to reduce the internal target volume (ITV) during radiation therapy. A reduction in the ITV would result in a decrease in radiation toxicity to healthy tissue. Methods: Eight patients with non-small cell lung cancer were used in this study. CT scans that include the pulmonary nodule were captured with a GE Healthcare LightSpeed RT 16 scanner. Megavolt portal images were acquired with a Varian Trilogy unit equipped with an AS1000 electronic portal imaging device. The nodule localization method uses grayscale morphological filtering and level-set segmentation with a prior. The treatment-time portion of the algorithm is implemented on a graphical processing unit. Results: The method was retrospectively tested on eight cases that include a total of 151 megavolt portal image frames. The method reduced the nodule position uncertainty by an average of 40% for seven out of the eight cases. The treatment phase portion of the method has a subsecond execution time that makes it suitable for near-real-time nodule localization. Conclusions: A method was developed to localize a pulmonary nodule in a megavolt portal image. The method uses the characteristics of the nodule in a prior CT scan to enhance the nodule in the portal image and to identify the nodule region by level-set segmentation. In a retrospective study, the method reduced the nodule position uncertainty by an average of 40% for seven out of the eight cases studied.

  7. Information Seen as Part of the Development of Living Intelligence: the Five-Leveled Cybersemiotic Framework for FIS

    NASA Astrophysics Data System (ADS)

    Brier, Soren

    2003-06-01

It is argued that a true transdisciplinary information science, going from physical information to phenomenological understanding, needs a metaphysical framework. Three different kinds of causality are implied: efficient, formal and final. And at least five different levels of existence are needed: 1. The quantum vacuum fields with entangled causation. 2. The physical level with its energy- and force-based efficient causation. 3. The informational-chemical level with its formal causation based on pattern fitting. 4. The biological-semiotic level with its non-conscious final causation. 5. The social-linguistic level of self-consciousness with its conscious, goal-oriented final causation. To integrate these consistently in an evolutionary theory as emergent levels, neither mechanical determinism nor complexity theory is sufficient, because neither can be a foundation for a theory of lived meaning. C. S. Peirce's triadic semiotic philosophy combined with a cybernetic and systemic view, like N. Luhmann's, could create the framework I call Cybersemiotics.

  8. Grouped False-Discovery Rate for Removing the Gene-Set-Level Bias of RNA-seq.

    PubMed

    Yang, Tae Young; Jeong, Seongmun

    2013-01-01

    In recent years, RNA-seq has become a very competitive alternative to microarrays. In RNA-seq experiments, the expected read count for a gene is proportional to its expression level multiplied by its transcript length. Even when two genes are expressed at the same level, differences in length will yield differing numbers of total reads. The characteristics of these RNA-seq experiments create a gene-level bias such that the proportion of significantly differentially expressed genes increases with the transcript length, whereas such bias is not present in microarray data. Gene-set analysis seeks to identify the gene sets that are enriched in the list of the identified significant genes. In the gene-set analysis of RNA-seq, the gene-level bias subsequently yields the gene-set-level bias that a gene set with genes of long length will be more likely to show up as enriched than will a gene set with genes of shorter length. Because gene expression is not related to its transcript length, any gene set containing long genes is not of biologically greater interest than gene sets with shorter genes. Accordingly the gene-set-level bias should be removed to accurately calculate the statistical significance of each gene-set enrichment in the RNA-seq. We present a new gene set analysis method of RNA-seq, called FDRseq, which can accurately calculate the statistical significance of a gene-set enrichment score by the grouped false-discovery rate. Numerical examples indicated that FDRseq is appropriate for controlling the transcript length bias in the gene-set analysis of RNA-seq data. To implement FDRseq, we developed the R program, which can be downloaded at no cost from http://home.mju.ac.kr/home/index.action?siteId=tyang.

  9. Landscape response to base-level fall in extensional settings: Amargosa River, Basin and Range, USA

    NASA Astrophysics Data System (ADS)

    Smith, J.; Brocklehurst, S. H.; Gawthorpe, R. L.; Finch, E.

    2012-12-01

    Studies examining transient landscapes within rift basins generally focus on settings where changes in boundary conditions are driven by active tectonics. However, the effect of drainage network re-organisation on landscape development and sediment routing has received significantly less attention. Within active rift settings it is common for drainage networks to become fragmented as uplift rates overcome the erosive potential of streams, while subsidence generates under-filled basins. On a regional-scale this results in poorly integrated drainage systems consisting of numerous internally drained basins. Integration can occur through the filling of sub-basins, lake over-spill, or drainage capture. This may dramatically affect base-level, catchment size, sediment flux and fluvial geomorphology, providing a natural experiment in fluvial response to changing boundary conditions, as well as representing a fundamental control on the ultimate preservation of sediments. We combine field and remote mapping with the available dating to investigate an example of late Pleistocene drainage integration in the southern Basin and Range, where drainage integration has resulted in a base-level fall and rejuvenation of the upstream landscape triggering further drainage rearrangement. The Amargosa River was previously part of an internally-drained basin, feeding the former Lake Tecopa. Drainage capture at 150-200 ka caused the Amargosa River to flow into Death Valley, carving the Amargosa Canyon through the Sperry Hills. The canyon itself has experienced aggradation as well as incision, with both terraces and fans representing levels above the current river. Upstream of the Amargosa Canyon, incision is reflected by minor knickpoints, and gullying along tributaries. For what is now westwards-flowing Willow Wash, the net incision of Amargosa Canyon has resulted in spectacular headward erosion, dissecting fan surfaces which previously graded northwest to Lake Tecopa. The Willow Wash

  10. Generalized cost-effectiveness analysis for national-level priority-setting in the health sector

    PubMed Central

    Hutubessy, Raymond; Chisholm, Dan; Edejer, Tessa Tan-Torres

    2003-01-01

    Cost-effectiveness analysis (CEA) is potentially an important aid to public health decision-making but, with some notable exceptions, its use and impact at the level of individual countries is limited. A number of potential reasons may account for this, among them technical shortcomings associated with the generation of current economic evidence, political expediency, social preferences and systemic barriers to implementation. As a form of sectoral CEA, Generalized CEA sets out to overcome a number of these barriers to the appropriate use of cost-effectiveness information at the regional and country level. Its application via WHO-CHOICE provides a new economic evidence base, as well as underlying methodological developments, concerning the cost-effectiveness of a range of health interventions for leading causes of, and risk factors for, disease. The estimated sub-regional costs and effects of different interventions provided by WHO-CHOICE can readily be tailored to the specific context of individual countries, for example by adjustment to the quantity and unit prices of intervention inputs (costs) or the coverage, efficacy and adherence rates of interventions (effectiveness). The potential usefulness of this information for health policy and planning is in assessing if current intervention strategies represent an efficient use of scarce resources, and which of the potential additional interventions that are not yet implemented, or not implemented fully, should be given priority on the grounds of cost-effectiveness. Health policy-makers and programme managers can use results from WHO-CHOICE as a valuable input into the planning and prioritization of services at national level, as well as a starting point for additional analyses of the trade-off between the efficiency of interventions in producing health and their impact on other key outcomes such as reducing inequalities and improving the health of the poor. PMID:14687420
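A common building block of such sectoral CEA is ranking interventions by a cost-effectiveness ratio (cost per unit of health gain, e.g. per DALY averted). A minimal sketch with invented numbers, not WHO-CHOICE data:

```python
def acer(cost, effect):
    """Average cost-effectiveness ratio: cost per unit of health gain
    (e.g. dollars per DALY averted); lower is better."""
    return cost / effect

# Hypothetical interventions: (total cost in USD, DALYs averted)
interventions = {
    "A": (120_000, 800),   # ACER 150
    "B": (90_000, 450),    # ACER 200
    "C": (200_000, 1600),  # ACER 125
}
ranked = sorted(interventions, key=lambda k: acer(*interventions[k]))
print(ranked)  # most cost-effective first: ['C', 'A', 'B']
```

Tailoring the analysis to a country, as described above, amounts to replacing the cost and effectiveness inputs with local quantities, unit prices, and coverage rates before re-ranking.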

  11. An ecofeminist conceptual framework to explore gendered environmental health inequities in urban settings and to inform healthy public policy.

    PubMed

    Chircop, Andrea

    2008-06-01

    This theoretical exploration is an attempt to conceptualize the link between gender and urban environmental health. The proposed ecofeminist framework enables an understanding of the link between the urban physical and social environments and health inequities mediated by gender and socioeconomic status. This framework is proposed as a theoretical magnifying glass to reveal the underlying logic that connects environmental exploitation on the one hand, and gendered health inequities on the other. Ecofeminism has the potential to reveal an inherent, normative conceptual analysis and argumentative justification of western society that permits the oppression of women and the exploitation of the environment. This insight will contribute to a better understanding of the mechanisms underlying gendered environmental health inequities and inform healthy public policy that is supportive of urban environmental health, particularly for low-income mothers.

  13. Standard Setting in Relation to the Common European Framework of Reference for Languages: The Case of the State Examination of Dutch as a Second Language

    ERIC Educational Resources Information Center

    Bechger, Timo M.; Kuijper, Henk; Maris, Gunter

    2009-01-01

    This article reports on two related studies carried out to link the State examination of Dutch as a second language to the Common European Framework of Reference for languages (CEFR). In the first study, key persons from institutions for higher education were asked to determine the minimally required language level of beginning students. In the…

  14. 5-SPICE: the application of an original framework for community health worker program design, quality improvement and research agenda setting

    PubMed Central

    Palazuelos, Daniel; DaEun Im, Dana; Peckarsky, Matthew; Schwarz, Dan; Farmer, Didi Bertrand; Dhillon, Ranu; Johnson, Ari; Orihuela, Claudia; Hackett, Jill; Bazile, Junior; Berman, Leslie; Ballard, Madeleine; Panjabi, Raj; Ternier, Ralph; Slavin, Sam; Lee, Scott; Selinsky, Steve; Mitnick, Carole Diane

    2013-01-01

Introduction Despite decades of experience with community health workers (CHWs) in a wide variety of global health projects, there is no established conceptual framework that structures how implementers and researchers can understand, study and improve their respective programs based on lessons learned by other CHW programs. Objective To apply an original, non-linear framework and case study method, 5-SPICE, to multiple sister projects of a large, international non-governmental organization (NGO), and other CHW projects. Design Engaging a large group of implementers, researchers and the best available literature, the 5-SPICE framework was refined and then applied to a selection of CHW programs. Insights gleaned from the case study method were summarized in a tabular format named the ‘5×5-SPICE chart’. This format graphically lists the ways in which essential CHW program elements interact, both positively and negatively, in the implementation field. Results The 5×5-SPICE charts reveal a variety of insights that come from a more complex understanding of how essential CHW program elements interact and influence each other in their unique context. Some have been well described in the literature previously, while others are exclusive to this article. An analysis of how best to compensate CHWs is also offered as an example of the type of insights that this method may yield. Conclusions The 5-SPICE framework is a novel instrument that can be used to guide discussions about CHW projects. Insights from this process can help guide quality improvement efforts, or be used as hypotheses that will form the basis of a program's research agenda. Recent experience with research protocols embedded into successfully implemented projects demonstrates how such hypotheses can be rigorously tested. PMID:23561023

  15. Home advantage in high-level volleyball varies according to set number.

    PubMed

    Marcelino, Rui; Mesquita, Isabel; Palao Andrés, José Manuel; Sampaio, Jaime

    2009-01-01

The aim of the present study was to identify the probability of winning each Volleyball set according to game location (home, away). Archival data were obtained from 275 sets in the 2005 Men's Senior World League, and 65,949 actions were analysed. Set result (win, loss), game location (home, away), set number (first, second, third, fourth and fifth) and performance indicators (serve, reception, set, attack, dig and block) were the variables considered in this study. As a first step, performance indicators were used in a logistic model of set result, by binary logistic regression analysis. After finding the adjusted logistic model, the log-odds of winning the set were analysed according to game location and set number. The results showed that winning a set is significantly related to performance indicators (Chi-square(18) = 660.97, p < 0.01). Analyses of log-odds of winning a set demonstrate that home teams always have a higher probability of winning the game than away teams, regardless of the set number. Home teams have more advantage at the beginning of the game (first set) and in the two last sets of the game (fourth and fifth sets), probably due to facilities familiarity and crowd effects. Different game actions explain these advantages and showed that to win the first set it is more important to take risk, through a better performance in the attack and block, while to win the final set it is important to manage the risk through a better performance on the reception. These results suggest intra-game variation in home advantage and can be most useful to better prepare and direct the competition.
Key points: Home teams always have a higher probability of winning the game than away teams. Home teams have higher performance in reception, set and attack across the total of the sets. The advantage of home teams is more pronounced at the beginning of the game (first set) and in the last two sets of the game (fourth and fifth sets), suggesting intra-game variation in home advantage. Analysis by sets
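The paper's central quantity, the log-odds of winning a set from a binary logistic model, maps to a win probability via the logistic function. A minimal sketch (the coefficient values below are hypothetical illustrations, not the study's estimates):

```python
import math

def win_probability(log_odds):
    """Convert a log-odds value from a binary logistic model into a probability."""
    return 1.0 / (1.0 + math.exp(-log_odds))

# Hypothetical log-odds of winning a set, for illustration only:
home_set1 = 0.8   # home team, first set
away_set1 = -0.2  # away team, first set
print(round(win_probability(home_set1), 2))  # 0.69
print(round(win_probability(away_set1), 2))  # 0.45
```

Comparing such probabilities across set numbers, as the study does, is what reveals the intra-game variation in home advantage.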

  17. Guiding Principles for Facilitating Higher Levels of Web-Based Distance Teaching and Learning in Post-Secondary Settings

    ERIC Educational Resources Information Center

    Kanuka, Heather

    2002-01-01

    The purpose of this study was to develop guiding principles to encourage higher levels of teaching and learning in Web-based distance education. The research framework used Zetterberg's (1962) model for change. Data from semi-structured interviews with university instructors who had experience in teaching Web-based distance education courses, a…

  18. A Patient-Centred Redesign Framework to Support System-Level Process Changes for Multimorbidities and Chronic Conditions.

    PubMed

    Sampalli, Tara; Edwards, Lynn; Christian, Erin; Kohler, Graeme; Bedford, Lisa; Demmons, Jillian; Verma, Jennifer; Gibson, Rick; Carson, Shannon Ryan

    2015-01-01

    Recent trends show an increase in the prevalence and costs associated with managing individuals with multimorbidities. Enabling better care for these individuals requires system-level changes such as the shift from a focus on a single disease or single service to multimorbidities and integrated systems of care. In this paper, a novel patient-centred redesign framework that was developed to support system-level process changes in four service areas has been discussed. The novelty of this framework is that it is embedded in patient perspectives and in the chronic care model as the theoretical foundation. The aims of this paper are to present an application of the framework in the context of four chronic disease prevention and management services, and to discuss early results from the pilot initiative along with an overview of the spread opportunities for this initiative. PMID:26718252

  20. Setting priorities for the adoption of health technologies on a national level -- the Israeli experience.

    PubMed

    Shani, S; Siebzehner, M I; Luxenburg, O; Shemer, J

    2000-12-01

    its recommended list with minor changes within a limited timeframe. In conclusion, we propose a practical and pragmatic model for the inclusion of new health technologies at a national level, based on health technology assessment and explicit priority setting. PMID:11154787

  1. Low-level 14C methane oxidation rate measurements modified for remote field settings

    NASA Astrophysics Data System (ADS)

    Pack, M. A.; Pohlman, J.; Ruppel, C. D.; Xu, X.

    2012-12-01

    Aerobic methane oxidation limits atmospheric methane emissions from degraded subsea permafrost and dissociated methane hydrates in high-latitude oceans. Methane oxidation rate measurements are a crucial tool for investigating the efficacy of this process, but are logistically challenging when working on small research vessels in remote settings. We modified a low-level 14C-CH4 oxidation rate measurement for use in the Beaufort Sea above hydrate-bearing sediments during August 2012. Application of the more common 3H-CH4 rate measurement, which uses ~10⁶ times more radioactivity, was not practical because the R/V Ukpik cannot accommodate a radiation van. The low-level 14C measurement does not require a radiation van, but careful isolation of the 14C label is essential to avoid contaminating natural-abundance 14C measurements. We used 14C-CH4 with a total activity of 1.1 μCi, which is far below the 100 μCi permitting level. In addition, we modified field procedures to simplify and shorten sample processing. The original low-level 14C-CH4 method requires six steps in the field: (1) collect water samples in glass serum bottles, (2) inject 14C-CH4 into the bottles, (3) incubate for 24 hours, (4) filter to separate methanotrophic bacterial cells from the aqueous sample, (5) kill the filtrate with sodium hydroxide (NaOH), and (6) purge with nitrogen to remove unused 14C-CH4. Onshore, the 14C-CH4 respired to carbon dioxide or incorporated into cell material by methanotrophic bacteria during incubation is quantified by accelerator mass spectrometry (AMS). We conducted an experiment to test the possibility of storing samples and performing the purging and filtering steps (4 and 6) back onshore. We subjected a series of water samples to steps 1-3 and 5, preserving them with mercuric chloride (HgCl2) instead of NaOH because HgCl2 is less likely to break down cell material during storage. The 14C content of the carbon dioxide in samples preserved with HgCl2 and stored for up to 2 weeks was stable

  2. A General Framework for Power Analysis to Detect the Moderator Effects in Two- and Three-Level Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Dong, Nianbo; Spybrook, Jessaca; Kelcey, Ben

    2016-01-01

    The purpose of this study is to propose a general framework for power analyses to detect the moderator effects in two- and three-level cluster randomized trials (CRTs). The study specifically aims to: (1) develop the statistical formulations for calculating statistical power, minimum detectable effect size (MDES) and its confidence interval to…
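The quantities the abstract names, statistical power and the minimum detectable effect size (MDES), can be approximated with a normal approximation and the familiar cluster design effect. This is a simplified sketch using textbook formulas for a cluster-level moderator in a two-level CRT, not necessarily the paper's exact formulations:

```python
import math
from statistics import NormalDist

_nd = NormalDist()

def crt_power(delta, rho, n_clusters, cluster_size, alpha=0.05):
    """Approximate power to detect a standardized effect `delta` in a two-level
    CRT, deflating the total sample size by the design effect 1 + (m - 1) * rho."""
    deff = 1.0 + (cluster_size - 1.0) * rho
    n_eff = n_clusters * cluster_size / deff         # effective sample size
    se = 2.0 / math.sqrt(n_eff)                      # SE of a standardized two-group difference
    z_crit = _nd.inv_cdf(1.0 - alpha / 2.0)
    return _nd.cdf(abs(delta) / se - z_crit)

def crt_mdes(rho, n_clusters, cluster_size, alpha=0.05, power=0.8):
    """Minimum detectable effect size at the given power, under the same approximation."""
    deff = 1.0 + (cluster_size - 1.0) * rho
    se = 2.0 / math.sqrt(n_clusters * cluster_size / deff)
    return (_nd.inv_cdf(1.0 - alpha / 2.0) + _nd.inv_cdf(power)) * se
```

For example, `crt_mdes(rho=0.05, n_clusters=40, cluster_size=20)` returns the smallest standardized moderator effect detectable with 80% power under these assumptions.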

  3. A Generic System-Level Framework for Self-Serve Health Monitoring System through Internet of Things (IoT).

    PubMed

    Ahmed, Mobyen Uddin; Björkman, Mats; Lindén, Maria

    2015-01-01

    Sensor data travel from sensors to a remote server, are analyzed remotely in a distributed manner, and the user's health status is presented in real time. This paper presents a generic system-level framework for a self-served health monitoring system through the Internet of Things (IoT) to facilitate efficient sensor data management.

  4. The Existence of Alternative Framework in Students' Scientific Imagination on the Concept of Matter at Submicroscopic Level: Macro Imagination

    ERIC Educational Resources Information Center

    Abdullah, Nurdiana; Surif, Johari

    2015-01-01

    This study was conducted with the purpose of identifying the alternative frameworks contained in students' imagination of the concept of matter at the submicroscopic level. Through a purposive sampling design, a total of 15 students were interviewed to obtain the data. Data from document analysis were used to corroborate the interviews.…

  5. Expected frontiers: Incorporating weather uncertainty into a policy analysis using an integrated bi-level multi-objective optimization framework

    EPA Science Inventory

    Weather is the main driver in both plant use of nutrients and fate and transport of nutrients in the environment. In previous work, we evaluated a green tax for control of agricultural nutrients in a bi-level optimization framework that linked deterministic models. In this study,...

  6. How Multi-Levels of Individual and Team Learning Interact in a Public Healthcare Organisation: A Conceptual Framework

    ERIC Educational Resources Information Center

    Doyle, Louise; Kelliher, Felicity; Harrington, Denis

    2016-01-01

    The aim of this paper is to review the relevant literature on organisational learning and offer a preliminary conceptual framework as a basis to explore how the multi-levels of individual learning and team learning interact in a public healthcare organisation. The organisational learning literature highlights a need for further understanding of…

  7. Detection of colonic polyp candidates with level set-based thickness mapping over the colon wall

    NASA Astrophysics Data System (ADS)

    Han, Hao; Li, Lihong; Duan, Chaijie; Zhao, Yang; Wang, Huafeng; Liang, Zhengrong

    2015-03-01

    Further improvement of computer-aided detection (CADe) of colonic polyps is vital to advance computed tomographic colonography (CTC) toward a screening modality. The detection of flat polyps is especially challenging because limited image features can be extracted from them, and traditional geometric-feature-based CADe methods usually fail to detect such polyps. In this paper, we present a novel pipeline to automatically detect initial polyp candidates (IPCs), especially flat polyps, from CTC images. First, the colon wall mucosa is extracted via a partial volume segmentation approach as a volumetric layer, where the inner border of the colon wall is obtained by shrinking the volumetric layer using level set based adaptive convolution. Then the outer border of the colon wall (the colon wall serosa) is segmented via a combined implementation of geodesic active contour and the Mumford-Shah functional in a coarse-to-fine manner. Finally, the wall thickness is estimated along a unique path between the segmented inner and outer borders, with consideration of the volumetric layers, and is mapped onto a patient-specific three-dimensional (3D) colon wall model. The IPC detection results can usually be better visualized in a 2D image flattened from the 3D model, where abnormalities are detected by Z-score transformation of the thickness values. The proposed IPC detection approach was validated on 11 patients with 22 CTC scans, each with at least one flat polyp annotation. The presented pipeline was effective in detecting some flat polyps that were missed by our CADe system while keeping false detections at a relatively low level. This preliminary study indicates that the pipeline can be incorporated into an existing CADe system to enhance polyp detection power, especially for flat polyps.
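The final detection step, flagging abnormal wall thickness via a Z-score transformation, reduces to a few lines. A toy sketch over a handful of thickness values (the 2.0 cutoff is illustrative; the paper does not report its threshold):

```python
from statistics import fmean, pstdev

def zscore_flags(thickness, z_thresh=2.0):
    """Flag thickness values whose Z-score exceeds z_thresh, mirroring the
    Z-score-based abnormality detection over the flattened thickness map."""
    mu = fmean(thickness)
    sigma = pstdev(thickness)
    return [(t - mu) / sigma > z_thresh for t in thickness]

# Toy thickness map (mm): one conspicuously thick location.
flags = zscore_flags([2.1, 2.0, 2.2, 1.9, 2.0, 6.5])
print(flags)  # only the last value is flagged
```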

  8. Cerebral Arteries Extraction using Level Set Segmentation and Adaptive Tracing for CT Angiography

    SciTech Connect

    Zhang Yong; Zhou Xiaobo; Srinivasan, Ranga; Wong, Stephen T. C.; Young, Geoff

    2007-11-02

    We propose an approach for extracting cerebral arteries from partial Computed Tomography Angiography (CTA). The challenge of extracting cerebral arteries from CTA comes from the fact that arteries are usually surrounded by bones and veins in the lower portion of a CTA volume, and there is strong intensity-value overlap between vessels and surrounding objects. Moreover, it is inappropriate to assume that the 2D cross sections of arteries are circles or ellipses, especially for abnormal vessels, and the course of an artery can change suddenly in 3D space. In this paper, a method based on level set segmentation is proposed to target this challenging problem. For the lower portion of a CTA volume, we use the geodesic active contour method to detect artery cross sections in 2D. The medial axis of the artery is obtained by adaptively tracking along its course, by finding the minimal cross section from cutting the artery at different angles in 3D spherical space. The method is highly automated, requiring minimal user input: only the starting point and the initial direction of the arteries of interest.

  9. DSA Image Blood Vessel Skeleton Extraction Based on Anti-concentration Diffusion and Level Set Method

    NASA Astrophysics Data System (ADS)

    Xu, Jing; Wu, Jian; Feng, Daming; Cui, Zhiming

    Serious vascular diseases such as carotid stenosis, aneurysm and vascular malformation may lead to stroke, which is the third leading cause of death and the number one cause of disability. In the clinical practice of diagnosing and treating cerebral vascular diseases, effective detection and description of the vascular structure in two-dimensional angiography sequence images, that is, blood vessel skeleton extraction, has long been a difficult problem. This paper discusses two-dimensional blood vessel skeleton extraction based on the level set method. We first preprocess the DSA image, using an anti-concentration diffusion model for effective enhancement and an improved Otsu local threshold segmentation technique based on regional division for binarization; vascular skeleton extraction is then carried out using the group marching method (GMM) with fast sweeping. Experiments show that our approach not only improves the time complexity but also produces good extraction results.
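The binarization step builds on Otsu thresholding. The paper applies an improved local variant based on regional division; the classic global version it extends picks, from a grayscale histogram, the bin that maximizes between-class variance. A minimal sketch of that global baseline:

```python
def otsu_threshold(hist):
    """Otsu's global threshold from a histogram (list of bin counts):
    return the bin index t maximizing between-class variance, with
    pixels <= t treated as one class and the rest as the other."""
    total = sum(hist)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = cum = 0.0
    for t, h in enumerate(hist[:-1]):
        w0 += h                      # class-0 weight
        cum += t * h                 # class-0 intensity sum
        if w0 == 0 or w0 == total:
            continue
        m0 = cum / w0                            # class-0 mean
        m1 = (total_sum - cum) / (total - w0)    # class-1 mean
        var_between = w0 * (total - w0) * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy histogram over 8 gray levels: the split lands between the peaks.
print(otsu_threshold([0, 10, 5, 0, 0, 5, 10, 0]))
```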

  10. Cerebral Arteries Extraction using Level Set Segmentation and Adaptive Tracing for CT Angiography

    NASA Astrophysics Data System (ADS)

    Zhang, Yong; Young, Geoff; Zhou, Xiaobo; Srinivasan, Ranga; Wong, Stephen T. C.

    2007-11-01

    We propose an approach for extracting cerebral arteries from partial Computed Tomography Angiography (CTA). The challenge of extracting cerebral arteries from CTA comes from the fact that arteries are usually surrounded by bones and veins in the lower portion of a CTA volume, and there is strong intensity-value overlap between vessels and surrounding objects. Moreover, it is inappropriate to assume that the 2D cross sections of arteries are circles or ellipses, especially for abnormal vessels, and the course of an artery can change suddenly in 3D space. In this paper, a method based on level set segmentation is proposed to target this challenging problem. For the lower portion of a CTA volume, we use the geodesic active contour method to detect artery cross sections in 2D. The medial axis of the artery is obtained by adaptively tracking along its course, by finding the minimal cross section from cutting the artery at different angles in 3D spherical space. The method is highly automated, requiring minimal user input: only the starting point and the initial direction of the arteries of interest.

  11. Semi-Automated Detection of Surface Degradation on Bridges Based on a Level Set Method

    NASA Astrophysics Data System (ADS)

    Masiero, A.; Guarnieri, A.; Pirotti, F.; Vettore, A.

    2015-08-01

    Due to climate factors, natural phenomena and human usage, buildings and infrastructures are subject to progressive degradation. The deterioration of these structures has to be monitored in order to avoid hazards to human beings and to the surrounding natural environment. Monitoring such infrastructures is therefore of primary importance. Unfortunately, this monitoring is nowadays mostly done by expert and skilled personnel, who follow the overall data acquisition, analysis and reporting process, making the whole monitoring procedure quite expensive for public (and private) agencies. This paper proposes a partially user-assisted procedure that reduces the monitoring cost and makes the obtained results less subjective. The developed method relies on images acquired with standard cameras, even by inexperienced personnel. Deterioration on the infrastructure surface is detected by image segmentation based on a level set method. The results of the semi-automated analysis procedure are remapped onto a 3D model of the infrastructure obtained by means of a terrestrial laser scanning acquisition. The proposed method has been successfully tested on a portion of a road bridge in Perarolo di Cadore (BL), Italy.

  12. Automated Robust Image Segmentation: Level Set Method Using Nonnegative Matrix Factorization with Application to Brain MRI.

    PubMed

    Dera, Dimah; Bouaynaya, Nidhal; Fathallah-Shaykh, Hassan M

    2016-07-01

    We address the problem of fully automated region discovery and robust image segmentation by devising a new deformable model based on the level set method (LSM) and the probabilistic nonnegative matrix factorization (NMF). We describe the use of NMF to calculate the number of distinct regions in the image and to derive the local distribution of the regions, which is incorporated into the energy functional of the LSM. The results demonstrate that our NMF-LSM method is superior to other approaches when applied to synthetic binary and gray-scale images and to clinical magnetic resonance images (MRI) of the human brain with and without a malignant brain tumor, glioblastoma multiforme. In particular, the NMF-LSM method is fully automated, highly accurate, less sensitive to the initial selection of the contour(s) or initial conditions, more robust to noise and model parameters, and able to detect distinct regions as small as desired. These advantages stem from the fact that the proposed method relies on histogram information instead of intensity values and does not introduce nuisance model parameters. These properties provide a general approach for automated robust region discovery and segmentation in heterogeneous images. Compared with the retrospective radiological diagnoses of two patients with non-enhancing grade 2 and 3 oligodendroglioma, the NMF-LSM detects earlier progression times and appears suitable for monitoring tumor response. The NMF-LSM method fills an important need of automated segmentation of clinical MRI. PMID:27417984
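The factorization at the heart of NMF-LSM can be illustrated with the classic multiplicative-update rules of Lee and Seung. This is a generic pure-Python sketch on a tiny matrix, not the paper's probabilistic, histogram-based formulation:

```python
import random

def matmul(A, B):
    """Multiply two matrices given as lists of lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def nmf(V, k, iters=300, eps=1e-9, seed=0):
    """Factor a nonnegative matrix V as V ≈ W @ H (W: n x k, H: k x m)
    using multiplicative updates, which keep all entries nonnegative."""
    rng = random.Random(seed)
    n, m = len(V), len(V[0])
    W = [[rng.random() + 0.1 for _ in range(k)] for _ in range(n)]
    H = [[rng.random() + 0.1 for _ in range(m)] for _ in range(k)]
    for _ in range(iters):
        WT = transpose(W)
        num, den = matmul(WT, V), matmul(matmul(WT, W), H)
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)]
             for i in range(k)]
        HT = transpose(H)
        num, den = matmul(V, HT), matmul(W, matmul(H, HT))
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)]
             for i in range(n)]
    return W, H
```

On a small nonnegative matrix of rank k, the reconstruction `matmul(W, H)` typically converges close to `V`; in NMF-LSM the columns of the factors instead summarize region-wise histogram distributions.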

  13. A New Ghost Cell/Level Set Method for Moving Boundary Problems: Application to Tumor Growth

    PubMed Central

    Macklin, Paul

    2011-01-01

    In this paper, we present a ghost cell/level set method for the evolution of interfaces whose normal velocity depends upon the solutions of linear and nonlinear quasi-steady reaction-diffusion equations with curvature-dependent boundary conditions. Our technique includes a ghost cell method that accurately discretizes normal derivative jump boundary conditions without smearing jumps in the tangential derivative; a new iterative method for solving linear and nonlinear quasi-steady reaction-diffusion equations; an adaptive discretization to compute the curvature and normal vectors; and a new discrete approximation to the Heaviside function. We present numerical examples that demonstrate better than 1.5-order convergence for problems where traditional ghost cell methods either fail to converge or attain at best sub-linear accuracy. We apply our techniques to a model of tumor growth in complex, heterogeneous tissues that consists of a nonlinear nutrient equation and a pressure equation with geometry-dependent jump boundary conditions. We simulate the growth of glioblastoma (an aggressive brain tumor) into a large, 1 cm square of brain tissue that includes heterogeneous nutrient delivery and varied biomechanical characteristics (white matter, gray matter, cerebrospinal fluid, and bone), and we observe growth morphologies that are highly dependent upon the variations of the tissue characteristics—an effect observed in real tumor growth. PMID:21331304
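The paper contributes its own discrete approximation to the Heaviside function; for orientation, here is one standard smoothed Heaviside/delta pair used throughout the level set literature (the common arctangent regularization, not the paper's new approximation):

```python
import math

def heaviside_smooth(phi, eps=1.0):
    """A standard regularized Heaviside used in level set methods:
    H_eps(phi) = 0.5 * (1 + (2/pi) * atan(phi / eps))."""
    return 0.5 * (1.0 + (2.0 / math.pi) * math.atan(phi / eps))

def delta_smooth(phi, eps=1.0):
    """Its exact derivative: a smoothed Dirac delta,
    delta_eps(phi) = eps / (pi * (eps^2 + phi^2)), concentrated near phi = 0."""
    return eps / (math.pi * (eps * eps + phi * phi))
```

Such pairs let curvature and jump terms supported on the zero level set be integrated over the whole grid; the paper's discrete version is designed to preserve the convergence order of its ghost cell discretization.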

  14. Intracerebral Transplants and Memory Dysfunction: Circuitry Repair or Functional Level Setting?

    PubMed Central

    Will, Bruno; Kelche, Christian; Cassel, Jean-Christophe

    2000-01-01

    Intracerebral grafting techniques of fetal neural cells have been used essentially with two main types of lesion paradigms, namely damage to long projection systems, in which the source and the target are clearly separate, and damage to neurons that are involved in local circuits within a small (sub)region of the brain. With the first lesion paradigm, grafts placed homotopically (in the source) are not appropriate because their fibers grow poorly through the host parenchyma and fail to reach their normal target. To be successful, the grafts must be placed ectopically in the target region of the damaged projection systems, where generally they work as level-setting systems. Conversely, with the second paradigm, the grafts are supposed to compensate for a local loss of neurons and must be placed homotopically to induce functional effects that are based on the reconstruction of a point-to-point circuitry. By inserting a biological or artificial bridging-substrate between the source and the target of long projection systems, it might be possible to combine the positive effects of both homotopic and ectopic grafting by achieving both target reinnervation and normal control of the grafted neurons within the source area. These issues are illustrated and discussed in this review. PMID:10709217

  15. Determinants of pregnancy and induced and spontaneous abortion in a jointly determined framework: evidence from a country-wide, district-level household survey in India.

    PubMed

    Ahmed, Salma; Ray, Ranjan

    2014-07-01

    This study provides evidence on the principal determinants of pregnancy and abortion in India using a large country-wide district-level data set (DLHS 2007). The paper provides an economic framework for the analysis of pregnancy and abortion. The study distinguishes between induced and spontaneous abortion and compares the effects of their determinants. The results show that there are wide differences between induced and spontaneous abortions in terms of the sign and magnitude of the estimated effects of several of their determinants, most notably wealth, the woman's age and her desire for children. The study makes a methodological contribution by proposing a trivariate probit estimation framework that recognizes the joint dependence of pregnancy and induced and spontaneous abortion, and provides evidence in support of this joint dependence. The study reports an inverted U-shaped effect of a woman's age on her pregnancy and both forms of abortion. The turning point in each case is quite robust to the estimation framework. A significant effect of contextual variables, at the village level, constructed from the individual responses, on a woman's pregnancy is found. The effects are weaker in the case of induced abortion, and insignificant in the case of spontaneous abortion. The results are shown to be fairly robust. This paper extends the literature on the relation between son preference and fertility by examining the link between mother's son preference and desire for more children with abortion rates.
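The joint-dependence argument behind the trivariate probit can be illustrated by simulation: pregnancy and abortion outcomes driven by latent normal errors that share a common factor, so single-equation probits that ignore the correlation are biased. All parameter values below are hypothetical illustrations, not estimates from the DLHS data:

```python
import random

def simulate_joint_outcomes(n, a_preg=0.2, a_abort=-0.5, rho=0.6, seed=1):
    """Simulate (pregnant, induced_abortion) pairs whose latent probit errors
    are correlated through a shared factor with loading rho; abortion is
    observed only conditional on pregnancy."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        f = rng.gauss(0, 1)                              # common latent factor
        e1 = rho * f + (1 - rho ** 2) ** 0.5 * rng.gauss(0, 1)
        e2 = rho * f + (1 - rho ** 2) ** 0.5 * rng.gauss(0, 1)
        pregnant = int(a_preg + e1 > 0)
        abortion = int(pregnant and a_abort + e2 > 0)
        out.append((pregnant, abortion))
    return out

sample = simulate_joint_outcomes(10_000)
```

Fitting independent probits to such data ignores the shared factor, which is precisely the dependence the trivariate estimation framework is designed to capture.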

  16. An analytical framework for estimating fertilization bias and the fertilization set from multiple sperm-storage organs.

    PubMed

    Manier, Mollie K; Lüpold, Stefan; Pitnick, Scott; Starmer, William T

    2013-10-01

    How sperm from competing males are used to fertilize eggs is poorly understood yet has important implications for postcopulatory sexual selection. Sperm may be used in direct proportion to their numerical representation within the fertilization set or with a bias toward one male over another. Previous theoretical treatments have assumed a single sperm-storage organ, but many taxa possess multiple organs or store sperm within multiple regions of the reproductive tract. In Drosophila, females store sperm in two distinct storage organ types: the seminal receptacle (SR) and the paired spermathecae. Here, we expand previous "raffle" models to describe "fertilization bias" independently for sperm within the SR and the spermathecae and estimate the fertilization set based on the relative contribution of sperm from the different sperm-storage organ types. We apply this model to three closely related species to reveal rapid divergence in the fertilization set and the potential for female sperm choice.
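The raffle logic generalizes naturally: each male's fertilization share is his (possibly bias-weighted) share of sperm within each organ, pooled across storage organs by each organ's relative contribution to the fertilization set. A sketch under those assumptions (the sperm counts, weights and bias values are hypothetical):

```python
def organ_share(s1, s2, bias=1.0):
    """Male 1's expected paternity share within one storage organ: a 'loaded
    raffle' in which each of his s1 sperm counts `bias` times a rival sperm
    (bias = 1 recovers the fair raffle)."""
    return bias * s1 / (bias * s1 + s2)

def fertilization_share(organs, weights, bias=1.0):
    """Pool per-organ shares by each organ's relative contribution to the
    fertilization set (weights should sum to 1)."""
    return sum(w * organ_share(s1, s2, bias)
               for w, (s1, s2) in zip(weights, organs))

# SR holds 60 vs 40 sperm, spermathecae 20 vs 20; SR supplies 80% of fertilizations.
share = fertilization_share([(60, 40), (20, 20)], weights=[0.8, 0.2])
print(round(share, 2))  # 0.58
```

Estimating the weights (the organs' contributions) and the bias from paternity data is, in essence, what the paper's analytical framework formalizes.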

  17. Analysis of adequacy levels for human resources improvement within primary health care framework in Africa.

    PubMed

    Parent, Florence; Fromageot, Audrey; Coppieters, Yves; Lejeune, Colette; Lemenu, Dominique; Garant, Michèle; Piette, Danielle; Levêque, Alain; De Ketele, Jean-Marie

    2005-12-02

    Human resources in the health care systems of sub-Saharan Africa generally show a lack of adequacy between the skills expected of professionals and the health care needs expressed by the populations. It is, however, possible to analyse these various mismatches related to human resource management, and their determinants, in order to enhance the effectiveness of the health care system. Drawing on two projects focused on nurse professionals within the health care system in Central Africa, we present an analytic grid of adequacy levels covering the following aspects:
- adequacy between skills-based profiles for health system professionals, quality of care and service delivery (health care system/medical standards), and the needs and expectations of the populations;
- adequacy between the allocation of health system professionals, quality of care and services delivered (health care system/medical standards), and the needs and expectations of the populations;
- adequacy between human resource management within the health care system and medical standards;
- adequacy between human resource management within education/teaching/training and the needs of the health care and education sectors;
- adequacy between basic and on-going education and the realities of the tasks expected of, and implemented by, different categories of professionals within the health care system;
- adequacy between the intentions of initial and on-going training and the teaching programs in health sciences for trainers (teachers/supervisors/health care system professionals/directors of schools...).
This tool is necessary for decision-makers as well as for health care system professionals who share common objectives for change at each level of intervention within the health system. Establishing this adequacy implies interdisciplinary and participative approaches for the actors concerned, in order to provide an overall vision of a system broader than the health district, small island with self-rationality, and in which they

  18. Using Economic Evidence to Set Healthcare Priorities in Low-Income and Lower-Middle-Income Countries: A Systematic Review of Methodological Frameworks.

    PubMed

    Wiseman, Virginia; Mitton, Craig; Doyle-Waters, Mary M; Drake, Tom; Conteh, Lesong; Newall, Anthony T; Onwujekwe, Obinna; Jan, Stephen

    2016-02-01

    Policy makers in low-income and lower-middle-income countries (LMICs) are increasingly looking to develop 'evidence-based' frameworks for identifying priority health interventions. This paper synthesises and appraises the literature on methodological frameworks--which incorporate economic evaluation evidence--for the purpose of setting healthcare priorities in LMICs. A systematic search of Embase, MEDLINE, Econlit and PubMed identified 3968 articles with a further 21 articles identified through manual searching. A total of 36 papers were eligible for inclusion. These covered a wide range of health interventions with only two studies including health systems strengthening interventions related to financing, governance and human resources. A little under half of the studies (39%) included multiple criteria for priority setting, most commonly equity, feasibility and disease severity. Most studies (91%) specified a measure of 'efficiency' defined as cost per disability-adjusted life year averted. Ranking of health interventions using multi-criteria decision analysis and generalised cost-effectiveness were the most common frameworks for identifying priority health interventions. Approximately a third of studies discussed the affordability of priority interventions. Only one study identified priority areas for the release or redeployment of resources. The paper concludes by highlighting the need for local capacity to conduct evaluations (including economic analysis) and empowerment of local decision-makers to act on this evidence.
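The 'efficiency' criterion that most reviewed frameworks specify, cost per DALY averted, implies a simple ranking rule. A sketch with invented figures (the intervention names and numbers are purely illustrative):

```python
def rank_by_cost_effectiveness(interventions):
    """Rank interventions by cost per DALY averted (lower ratio = higher
    priority). `interventions` maps a name to (total_cost, dalys_averted)."""
    ratios = {name: cost / dalys for name, (cost, dalys) in interventions.items()}
    return sorted(ratios, key=ratios.get)

priorities = rank_by_cost_effectiveness({
    "bednets": (100_000, 2_000),       # $50 per DALY averted
    "vaccination": (300_000, 10_000),  # $30 per DALY averted
    "screening": (500_000, 2_500),     # $200 per DALY averted
})
print(priorities)  # ['vaccination', 'bednets', 'screening']
```

As the review notes, real frameworks rarely stop at this single ratio: multi-criteria approaches fold in equity, feasibility and disease severity, and affordability constrains what the ranking can fund.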

  19. Using Economic Evidence to Set Healthcare Priorities in Low‐Income and Lower‐Middle‐Income Countries: A Systematic Review of Methodological Frameworks

    PubMed Central

    Wiseman, Virginia; Mitton, Craig; Doyle‐Waters, Mary M.; Drake, Tom; Conteh, Lesong; Newall, Anthony T.; Onwujekwe, Obinna; Jan, Stephen

    2016-01-01

    Abstract Policy makers in low‐income and lower‐middle‐income countries (LMICs) are increasingly looking to develop ‘evidence‐based’ frameworks for identifying priority health interventions. This paper synthesises and appraises the literature on methodological frameworks – which incorporate economic evaluation evidence – for the purpose of setting healthcare priorities in LMICs. A systematic search of Embase, MEDLINE, Econlit and PubMed identified 3968 articles with a further 21 articles identified through manual searching. A total of 36 papers were eligible for inclusion. These covered a wide range of health interventions with only two studies including health systems strengthening interventions related to financing, governance and human resources. A little under half of the studies (39%) included multiple criteria for priority setting, most commonly equity, feasibility and disease severity. Most studies (91%) specified a measure of ‘efficiency’ defined as cost per disability‐adjusted life year averted. Ranking of health interventions using multi‐criteria decision analysis and generalised cost‐effectiveness were the most common frameworks for identifying priority health interventions. Approximately a third of studies discussed the affordability of priority interventions. Only one study identified priority areas for the release or redeployment of resources. The paper concludes by highlighting the need for local capacity to conduct evaluations (including economic analysis) and empowerment of local decision‐makers to act on this evidence. PMID:26804361

  20. Modeling Primary Breakup: A Three-Dimensional Eulerian Level Set/Vortex Sheet Method for Two-Phase Interface Dynamics

    NASA Technical Reports Server (NTRS)

    Herrmann, M.

    2003-01-01

    This paper is divided into four parts. First, the level set/vortex sheet method for three-dimensional two-phase interface dynamics is presented. Second, the LSS model for the primary breakup of turbulent liquid jets and sheets is outlined and all terms requiring subgrid modeling are identified. Then, preliminary three-dimensional results of the level set/vortex sheet method are presented and discussed. Finally, conclusions are drawn and an outlook to future work is given.

  1. Toward accurate tooth segmentation from computed tomography images using a hybrid level set model

    SciTech Connect

    Gan, Yangzhou; Zhao, Qunfei; Xia, Zeyang E-mail: jing.xiong@siat.ac.cn; Hu, Ying; Xiong, Jing E-mail: jing.xiong@siat.ac.cn; Zhang, Jianwei

    2015-01-15

    Purpose: A three-dimensional (3D) model of the teeth provides important information for orthodontic diagnosis and treatment planning. Tooth segmentation is an essential step in generating the 3D digital model from computed tomography (CT) images. The aim of this study is to develop an accurate and efficient tooth segmentation method from CT images. Methods: The 3D dental CT volumetric images are segmented slice by slice in a two-dimensional (2D) transverse plane. The 2D segmentation is composed of a manual initialization step and an automatic slice by slice segmentation step. In the manual initialization step, the user manually picks a starting slice and selects a seed point for each tooth in this slice. In the automatic slice segmentation step, the developed hybrid level set model is applied to segment tooth contours from each slice. A tooth contour propagation strategy is employed to initialize the level set function automatically. Cone beam CT (CBCT) images of two subjects were used to tune the parameters. Images of 16 additional subjects were used to validate the performance of the method. Volume overlap metrics and surface distance metrics were adopted to assess the segmentation accuracy quantitatively. The volume overlap metrics were volume difference (VD, mm³) and Dice similarity coefficient (DSC, %). The surface distance metrics were average symmetric surface distance (ASSD, mm), RMS (root mean square) symmetric surface distance (RMSSSD, mm), and maximum symmetric surface distance (MSSD, mm). Computation time was recorded to assess the efficiency. The performance of the proposed method has been compared with two state-of-the-art methods. Results: For the tested CBCT images, the VD, DSC, ASSD, RMSSSD, and MSSD for the incisor were 38.16 ± 12.94 mm³, 88.82 ± 2.14%, 0.29 ± 0.03 mm, 0.32 ± 0.08 mm, and 1.25 ± 0.58 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the canine were 49.12 ± 9.33 mm³, 91.57 ± 0.82%, 0.27 ± 0.02 mm, 0
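    The volume overlap metrics above (VD and DSC) can be computed directly from a pair of binary masks. The following is a minimal numpy sketch with toy masks and a hypothetical voxel size, not the authors' implementation:

```python
import numpy as np

def dice_coefficient(seg, ref):
    """Dice similarity coefficient (DSC, %) between two binary masks."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    intersection = np.logical_and(seg, ref).sum()
    return 200.0 * intersection / (seg.sum() + ref.sum())

def volume_difference(seg, ref, voxel_volume_mm3=1.0):
    """Absolute volume difference (VD, mm^3) between two binary masks."""
    return abs(int(seg.sum()) - int(ref.sum())) * voxel_volume_mm3

# Toy example: two partially overlapping 3D masks on a 0.25 mm isotropic grid
a = np.zeros((10, 10, 10), dtype=bool)
b = np.zeros((10, 10, 10), dtype=bool)
a[2:8, 2:8, 2:8] = True   # 216 voxels
b[3:9, 3:9, 3:8] = True   # 180 voxels
print(round(dice_coefficient(a, b), 2))    # → 63.13
print(volume_difference(a, b, 0.25 ** 3))  # → 0.5625
```

The surface distance metrics (ASSD, RMSSSD, MSSD) additionally require surface extraction and distance transforms, which are omitted here.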

  2. Birth choices in Timor-Leste: a framework for understanding the use of maternal health services in low resource settings.

    PubMed

    Wild, Kayli; Barclay, Lesley; Kelly, Paul; Martins, Nelson

    2010-12-01

    The high rate of maternal mortality in Timor-Leste is a persistent problem which has been exacerbated by the long history of military occupation and ongoing political crises since independence in 1999. It is similar to other developing countries where there have been slow declines in maternal mortality despite 20 years of Safe Motherhood interventions. The national Ministry of Health, United Nations (UN) agencies and non-government organisations (NGOs) have attempted to reduce maternal mortality by enacting policies and interventions to increase the number of births in health centres and hospitals. Despite considerable effort in promoting facility-based delivery, most Timorese women birth at home and the lack of midwives means few women have access to a skilled birth attendant. This paper investigates factors influencing access to and use of maternal health services in rural areas of Timor-Leste. It draws on 21 interviews and 11 group discussions with Timorese women and their families collected over two periods of fieldwork, one month in September 2006 and five months from July to December 2007. Theoretical concepts from anthropology and health social science are used to explore individual, social, political and health system issues which affect the way in which maternal health services are utilised. In drawing together a range of theories this paper aims to extend explanations around access to maternal health services in developing countries. An empirically informed framework is proposed which illustrates the complex factors that influence women's birth choices. This framework can be used by policy-makers, practitioners, donors and researchers to think critically about policy decisions and where investments can have the most impact for improving maternal health in Timor-Leste and elsewhere. PMID:20971540

  4. Initial condition for efficient mapping of level set algorithms on many-core architectures

    NASA Astrophysics Data System (ADS)

    Tornai, Gábor János; Cserey, György

    2014-12-01

    In this paper, we investigated the effect of adding more small curves to the initial condition which determines the required number of iterations of a fast level set (LS) evolution. As a result, we discovered two new theorems and developed a proof on the worst case of the required number of iterations. Furthermore, we found that these kinds of initial conditions fit well to many-core architectures. To show this, we have included two case studies which are presented on different platforms. One runs on a graphical processing unit (GPU) and the other is executed on a cellular nonlinear network universal machine (CNN-UM). With the new initial conditions, the steady-state solutions of the LS are reached in less than eight iterations depending on the granularity of the initial condition. These dense iterations can be calculated very quickly on many-core platforms according to the two case studies. In the case of the proposed dense initial condition on GPU, there is a significant speedup compared to the sparse initial condition in all cases since our dense initial condition together with the algorithm utilizes the properties of the underlying architecture. Therefore, greater performance gain can be achieved (up to 18 times speedup compared to the sparse initial condition on GPU). Additionally, we have validated our concept against numerically approximated LS evolution of standard flows (mean curvature, Chan-Vese, geodesic active regions). The dice indexes between the fast LS evolutions and the evolutions of the numerically approximated partial differential equations are in the range of 0.99±0.003.

  5. CT liver volumetry using geodesic active contour segmentation with a level-set algorithm

    NASA Astrophysics Data System (ADS)

    Suzuki, Kenji; Epstein, Mark L.; Kohlbrenner, Ryan; Obajuluwa, Ademola; Xu, Jianwu; Hori, Masatoshi; Baron, Richard

    2010-03-01

    Automatic liver segmentation on CT images is challenging because the liver often abuts other organs of a similar density. Our purpose was to develop an accurate automated liver segmentation scheme for measuring liver volumes. We developed an automated volumetry scheme for the liver in CT based on a five-step schema. First, an anisotropic smoothing filter was applied to portal-venous phase CT images to remove noise while preserving the liver structure, followed by an edge enhancer to enhance the liver boundary. By using the boundary-enhanced image as a speed function, a fast-marching algorithm generated an initial surface that roughly estimated the liver shape. A geodesic-active-contour segmentation algorithm coupled with level-set contour evolution refined the initial surface so as to more precisely fit the liver boundary. The liver volume was calculated based on the refined liver surface. Hepatic CT scans of eighteen prospective liver donors were obtained under a liver transplant protocol with a multi-detector CT system. Automated liver volumes obtained were compared with those manually traced by a radiologist, used as the "gold standard." The mean liver volume obtained with our scheme was 1,520 cc, whereas the mean manual volume was 1,486 cc, with a mean absolute difference of 104 cc (7.0%). CT liver volumetrics based on the automated scheme agreed excellently with "gold-standard" manual volumetrics (intra-class correlation coefficient was 0.95) with no statistically significant difference (p(F<=f)=0.32), and required substantially less completion time. Our automated scheme provides an efficient and accurate way of measuring liver volumes.
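    The last step of such a pipeline, converting the refined segmentation into a volume, reduces to counting voxels and scaling by the scanner voxel size. A minimal sketch; the spacing values are hypothetical, not taken from the study:

```python
import numpy as np

def mask_volume_cc(mask, spacing_mm=(0.7, 0.7, 2.5)):
    """Volume of a binary segmentation in cubic centimetres, given the
    in-plane and slice spacing of the CT volume in millimetres."""
    voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    return mask.sum() * voxel_mm3 / 1000.0  # mm^3 → cc

# Toy example: a 40 x 40 x 20-voxel block
mask = np.zeros((64, 64, 32), dtype=bool)
mask[10:50, 10:50, 5:25] = True
print(round(mask_volume_cc(mask), 2))  # → 39.2
```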

  6. Volume analysis of treatment response of head and neck lesions using 3D level set segmentation

    NASA Astrophysics Data System (ADS)

    Hadjiiski, Lubomir; Street, Ethan; Sahiner, Berkman; Gujar, Sachin; Ibrahim, Mohannad; Chan, Heang-Ping; Mukherji, Suresh K.

    2008-03-01

    A computerized system for segmenting lesions in head and neck CT scans was developed to assist radiologists in estimating the response of malignant lesions to treatment. The system performs 3D segmentations based on a level set model and uses as input an approximate bounding box for the lesion of interest. In this preliminary study, CT scans from a pre-treatment exam and a post one-cycle chemotherapy exam of 13 patients containing head and neck neoplasms were used. A radiologist marked 35 temporal pairs of lesions; 13 pairs were primary site cancers and 22 pairs were metastatic lymph nodes. For all lesions, a radiologist outlined a contour on the best slice on both the pre- and post-treatment scans. For the 13 primary lesion pairs, full 3D contours were also extracted by a radiologist. The average pre- and post-treatment areas on the best slices for all lesions were 4.5 and 2.1 cm², respectively. For the 13 primary site pairs, the average pre- and post-treatment primary lesion volumes were 15.4 and 6.7 cm³, respectively. The correlation between the automatic and manual estimates for the pre-to-post-treatment change in area for all 35 pairs was r=0.97, while the correlation for the percent change in area was r=0.80. The correlation for the change in volume for the 13 primary site pairs was r=0.89, while the correlation for the percent change in volume was r=0.79. The average signed percent error between the automatic and manual areas for all 70 lesions was 11.0 ± 20.6%. The average signed percent error between the automatic and manual volumes for all 26 primary lesions was 37.8 ± 42.1%. The preliminary results indicate that the automated segmentation system can reliably estimate tumor size change in response to treatment relative to the radiologist's hand segmentation.
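    The pre-to-post treatment changes reported above are signed percent changes of the measured size; a trivial sketch using the mean values quoted in the abstract:

```python
def percent_change(pre, post):
    """Signed percent change from a pre-treatment to a post-treatment measurement."""
    return 100.0 * (post - pre) / pre

# Mean best-slice areas (cm²) and primary-lesion volumes (cm³) quoted above
print(round(percent_change(4.5, 2.1), 1))   # → -53.3
print(round(percent_change(15.4, 6.7), 1))  # → -56.5
```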

  7. Expert Consensus on the Rehabilitation Framework Guiding a Model of Care for People Living With HIV in a South African Setting.

    PubMed

    Chetty, Verusia; Hanass-Hancock, Jill; Myezwa, Hellen

    2016-01-01

    Disabilities and treatments related to HIV are a focus for rehabilitation professionals in HIV-endemic countries, yet these countries lack guidance to integrate rehabilitation into a model of care for people living with HIV. We asked HIV and rehabilitation experts in South Africa to engage in a modified Delphi survey based on findings from (a) an enquiry into stakeholder perspectives of a context-specific rehabilitation framework at a semi-rural setting and (b) an analysis of international models of care-guiding rehabilitation. Consensus was determined by an a priori threshold of 70% of agreement and interquartile range (≤ 1 on criterion) to be included as essential or useful in the model of care framework. Experts agreed that improving access to care, optimal communication between stakeholders, education and training for health care workers, and home-based rehabilitation were essential for the model. Furthermore, task shifting and evidence-based practice were seen as fundamental for optimal care.

  8. Fostering Multirepresentational Levels of Chemical Concepts: A Framework to Develop Educational Software

    ERIC Educational Resources Information Center

    Marson, Guilherme A.; Torres, Bayardo B.

    2011-01-01

    This work presents a convenient framework for developing interactive chemical education software to facilitate the integration of macroscopic, microscopic, and symbolic dimensions of chemical concepts--specifically, via the development of software for gel permeation chromatography. The instructional role of the software was evaluated in a study…

  9. A public health framework to translate risk factors related to political violence and war into multi-level preventive interventions.

    PubMed

    De Jong, Joop T V M

    2010-01-01

    Political violence, armed conflicts and human rights violations are produced by a variety of political, economic and socio-cultural factors. Conflicts can be analyzed with an interdisciplinary approach to obtain a global understanding of the relative contribution of risk and protective factors. A public health framework was designed to address these risk factors and protective factors. The framework resulted in a matrix that combined primary, secondary and tertiary interventions with their implementation on the levels of the society-at-large, the community, and the family and individual. Subsequently, the risk and protective factors were translated into multi-sectoral, multi-modal and multi-level preventive interventions involving the economy, governance, diplomacy, the military, human rights, agriculture, health, and education. Then the interventions were slotted in their appropriate place in the matrix. The interventions can be applied in an integrative form by international agencies, governments and non-governmental organizations, and molded to meet the requirements of the historic, political-economic and socio-cultural context. The framework maps the complementary fit among the different actors while engaging themselves in preventive, rehabilitative and reconstructive interventions. The framework shows how the economic, diplomatic, political, criminal justice, human rights, military, health and rural development sectors can collaborate to promote peace or prevent the aggravation or continuation of violence. A deeper understanding of the association between risk and protective factors and the developmental pathways of generic, country-specific and culture-specific factors leading to political violence is needed. PMID:19883967

  10. A conceptual framework for advanced practice nursing in a pediatric tertiary care setting: the SickKids' experience.

    PubMed

    LeGrow, Karen; Hubley, Pam; McAllister, Mary

    2010-05-01

    Advanced practice nurses (APNs) at The Hospital for Sick Children (SickKids) are pediatric healthcare providers who integrate principles and theories of advanced nursing with specialty knowledge to provide autonomous, independent, accountable, ethical and developmentally appropriate care in complex, often ambiguous and rapidly changing healthcare environments. Caring for children and adolescents requires culturally sensitive and family-centred approaches to care that incorporate a unique body of knowledge. Family-centred care is an approach to planning, delivery and evaluation of healthcare that is governed by the establishment of mutually beneficial partnerships among APNs, health professionals and children/families. The cornerstone of APN practice at SickKids is the recognition of "family" as the recipients of care. By valuing and developing relationships with families, APNs promote excellence in healthcare across the care continuum to optimize the child's and family's physical, emotional, social, psychological and spiritual well-being. This paper outlines the evolution of advanced practice nursing at SickKids, beginning with the introduction of APN roles in the 1970s and culminating in the current critical mass of APNs who have been integrated throughout the hospital's infrastructure. We describe the process used to create a common vision and a framework to guide pediatric advanced nursing practice.

  11. Interprofessional team building in the palliative home care setting: Use of a conceptual framework to inform a pilot evaluation.

    PubMed

    Shaw, James; Kearney, Colleen; Glenns, Brenda; McKay, Sandra

    2016-01-01

    Home-based palliative care is increasingly dependent on interprofessional teams to deliver collaborative care that more adequately meets the needs of clients and families. The purpose of this pilot evaluation was to qualitatively explore the views of an interprofessional group of home care providers (occupational therapists, nurses, personal support work supervisors, community care coordinators, and a team coordinator) regarding a pilot project encouraging teamwork in interprofessional palliative home care services. We used qualitative methods, informed by an interprofessional conceptual framework, to analyse participants' accounts and provide recommendations regarding strategies for interprofessional team building in palliative home health care. Findings suggest that encouraging practitioners to share past experiences and foster common goals for palliative care are important elements of team building in interprofessional palliative care. Also, establishing a team leader who emphasises sharing power among team members and addressing the need for mutual emotional support may help to maximise interprofessional teamwork in palliative home care. These findings may be used to develop and test more comprehensive efforts to promote stronger interprofessional teamwork in palliative home health care delivery.

  13. Stress distribution in fixed-partial prosthesis and peri-implant bone tissue with different framework materials and vertical misfit levels: a three-dimensional finite element analysis.

    PubMed

    Bacchi, Ataís; Consani, Rafael L X; Mesquita, Marcelo F; dos Santos, Mateus B F

    2013-09-01

    The purpose of this study was to evaluate the influence of superstructure material and vertical misfits on the stresses created in an implant-supported partial prosthesis. A three-dimensional (3-D) finite element model was prepared based on common clinical data. The posterior part of a severely resorbed jaw with two osseointegrated implants at the second premolar and second molar regions was modeled using specific modeling software (SolidWorks 2010). Finite element models were created by importing the solid model into mechanical simulation software (ANSYS Workbench 11). The models were divided into groups according to the prosthesis framework material (type IV gold alloy, silver-palladium alloy, commercially pure titanium, cobalt-chromium alloy, or zirconia) and vertical misfit level (10 µm, 50 µm, and 100 µm) created at one implant-prosthesis interface. The gap of the vertical misfit was set to be closed and the stress values were measured in the framework, porcelain veneer, retention screw, and bone tissue. Stiffer materials led to higher stress concentration in the framework and increased stress values in the retention screw, while in the same circumstances, the porcelain veneer showed lower stress values, and there was no significant difference in stress in the peri-implant bone tissue. A considerable increase in stress concentration was observed in all the structures evaluated within the misfit amplification. The framework material influenced the stress concentration in the prosthetic structures and retention screw, but not that in bone tissue. All the structures were significantly influenced by the increase in the misfit levels.

  14. Strengthening fairness, transparency and accountability in health care priority setting at district level in Tanzania

    PubMed Central

    Maluka, Stephen Oswald

    2011-01-01

    Health care systems are faced with the challenge of resource scarcity and have insufficient resources to respond to all health problems and target groups simultaneously. Hence, priority setting is an inevitable aspect of every health system. However, priority setting is complex and difficult because the process is frequently influenced by political, institutional and managerial factors that are not considered by conventional priority-setting tools. In a five-year EU-supported project, which started in 2006, ways of strengthening fairness and accountability in priority setting in district health management were studied. This review is based on a PhD thesis that aimed to analyse health care organisation and management systems, and explore the potential and challenges of implementing Accountability for Reasonableness (A4R) approach to priority setting in Tanzania. A qualitative case study in Mbarali district formed the basis of exploring the sociopolitical and institutional contexts within which health care decision making takes place. The study also explores how the A4R intervention was shaped, enabled and constrained by the contexts. Key informant interviews were conducted. Relevant documents were also gathered and group priority-setting processes in the district were observed. The study revealed that, despite the obvious national rhetoric on decentralisation, actual practice in the district involved little community participation. The assumption that devolution to local government promotes transparency, accountability and community participation, is far from reality. The study also found that while the A4R approach was perceived to be helpful in strengthening transparency, accountability and stakeholder engagement, integrating the innovation into the district health system was challenging. This study underscores the idea that greater involvement and accountability among local actors may increase the legitimacy and fairness of priority-setting decisions. 
A broader

  16. SET overexpression in HEK293 cells regulates mitochondrial uncoupling proteins levels within a mitochondrial fission/reduced autophagic flux scenario

    SciTech Connect

    Almeida, Luciana O.; Goto, Renata N.; Neto, Marinaldo P.C.; Sousa, Lucas O.; Curti, Carlos; Leopoldino, Andréia M.

    2015-03-06

    We hypothesized that SET, a protein accumulated in some cancer types and Alzheimer disease, is involved in cell death through mitochondrial mechanisms. We addressed the mRNA and protein levels of the mitochondrial uncoupling proteins UCP1, UCP2 and UCP3 (S and L isoforms) by quantitative real-time PCR and immunofluorescence as well as other mitochondrial involvements, in HEK293 cells overexpressing the SET protein (HEK293/SET), either in the presence or absence of oxidative stress induced by the pro-oxidant t-butyl hydroperoxide (t-BHP). SET overexpression in HEK293 cells decreased UCP1 and increased UCP2 and UCP3 (S/L) mRNA and protein levels, whilst also preventing lipid peroxidation and decreasing the content of cellular ATP. SET overexpression also (i) decreased the area of mitochondria and increased the number of organelles and lysosomes, (ii) increased mitochondrial fission, as demonstrated by increased FIS1 mRNA and FIS-1 protein levels, an apparent accumulation of DRP-1 protein, and an increase in the VDAC protein level, and (iii) reduced autophagic flux, as demonstrated by a decrease in LC3B lipidation (LC3B-II) in the presence of chloroquine. Therefore, SET overexpression in HEK293 cells promotes mitochondrial fission and reduces autophagic flux in apparent association with up-regulation of UCP2 and UCP3; this implies a potential involvement in cellular processes that are deregulated such as in Alzheimer's disease and cancer. - Highlights: • SET, UCPs and autophagy prevention are correlated. • SET action has mitochondrial involvement. • UCP2/3 may reduce ROS and prevent autophagy. • SET protects cell from ROS via UCP2/3.

  18. Optimal Design in Three-Level Block Randomized Designs with Two Levels of Nesting: An ANOVA Framework with Random Effects

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2013-01-01

    Large-scale experiments that involve nested structures may assign treatment conditions either to subgroups such as classrooms or to individuals such as students within subgroups. Key aspects of the design of such experiments include knowledge of the variance structure in higher levels and the sample sizes necessary to reach sufficient power to…

  19. Comparing Panelists' Understanding of Standard Setting across Multiple Levels of an Alternate Science Assessment

    ERIC Educational Resources Information Center

    Hansen, Mary A.; Lyon, Steven R.; Heh, Peter; Zigmond, Naomi

    2013-01-01

    Large-scale assessment programs, including alternate assessments based on alternate achievement standards (AA-AAS), must provide evidence of technical quality and validity. This study provides information about the technical quality of one AA-AAS by evaluating the standard setting for the science component. The assessment was designed to have…

  20. Dynamic-thresholding level set: a novel computer-aided volumetry method for liver tumors in hepatic CT images

    NASA Astrophysics Data System (ADS)

    Cai, Wenli; Yoshida, Hiroyuki; Harris, Gordon J.

    2007-03-01

    Measurement of the volume of focal liver tumors, called liver tumor volumetry, is indispensable for assessing the growth of tumors and for monitoring the response of tumors to oncology treatments. Traditional edge models, such as the maximum-gradient and zero-crossing methods, often fail to detect the accurate boundary of a fuzzy object such as a liver tumor. As a result, computerized volumetry based on these edge models tends to differ from the manual segmentation results of physicians. In this study, we developed a novel computerized volumetry method for fuzzy objects, called dynamic-thresholding level set (DT level set). An optimal threshold value computed from a histogram tends to shift, relative to the theoretical threshold value obtained from a normal distribution model, toward the smaller region in the histogram. We thus designed a mobile shell structure, called a propagating shell, which is a thick region encompassing the level set front. The optimal threshold calculated from the histogram of the shell drives the level set front toward the boundary of a liver tumor. When the volume ratio between the object and the background in the shell approaches one, the optimal threshold value best fits the theoretical threshold value and the shell stops propagating. Application of the DT level set to 26 hepatic CT cases with 63 biopsy-confirmed hepatocellular carcinomas (HCCs) and metastases showed that the computer-measured volumes were highly correlated with those measured manually by physicians. Our preliminary results showed that the DT level set was effective and accurate in estimating the volumes of liver tumors detected in hepatic CT images.
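
    The shell-driven thresholding idea can be sketched in two dimensions as follows. This is a minimal illustration, not the paper's implementation: Otsu's method stands in for the histogram-based optimal threshold, and the names `otsu_threshold` and `shell_threshold` are hypothetical.

```python
import numpy as np

def otsu_threshold(values, nbins=64):
    """Optimal threshold of a 1-D sample via Otsu's between-class variance."""
    hist, edges = np.histogram(values, bins=nbins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    w = hist.astype(float) / hist.sum()
    best_t, best_var = centers[0], -1.0
    for k in range(1, nbins):
        w0, w1 = w[:k].sum(), w[k:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (w[:k] * centers[:k]).sum() / w0
        mu1 = (w[k:] * centers[k:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[k]
    return best_t

def shell_threshold(image, phi, shell_width=3.0):
    """Threshold computed only from pixels in a thick shell around the zero
    level set of phi; the returned object/background volume ratio in the
    shell is the paper's stopping signal (stop when it approaches one)."""
    shell = np.abs(phi) <= shell_width        # propagating shell around the front
    t = otsu_threshold(image[shell])
    inside = (phi < 0) & shell                # object side of the front
    ratio = inside.sum() / max(shell.sum() - inside.sum(), 1)
    return t, ratio
```

    In the full method this threshold would feed the speed term that advects the front; here it only demonstrates how restricting the histogram to the shell keeps the estimate local to the evolving boundary.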

  1. Contextual Effects in an Educational Setting: An Example of Level Three Research.

    ERIC Educational Resources Information Center

    Sears, Constance; Husak, William S.

    A systematic three-level ("Level 3") approach to research in the motor behavior area was used to investigate the influence of varying degrees of contextual interference in the acquisition of volleyball serving skills. One hundred and twenty-eight middle school subjects learned three volleyball serves during a 3-week long unit in a physical…

  2. Setting Us Free? Building Meaningful Models of Progression for a "Post-Levels" World

    ERIC Educational Resources Information Center

    Ford, Alex

    2014-01-01

    Alex Ford was thrilled by the prospect of freedom offered to history departments in England by the abolition of level descriptions within the National Curriculum. After analysing the range of competing purposes that the level descriptions were previously forced to serve, Ford argues that the three distinct tasks of measuring current attainment,…

  3. A framework for leveling informatics content across four years of a Bachelor of Science in Nursing (BSN) curriculum.

    PubMed

    Frisch, Noreen; Borycki, Elizabeth

    2013-01-01

    While there are several published statements of the nursing informatics competencies needed by the Bachelor of Science in Nursing (BSN) graduate, faculty at schools of nursing have little guidance on how to incorporate the teaching of such competencies into curricula that are already overloaded with required content. The authors present a framework for addressing nursing informatics content within teaching plans that already exist in virtually all BSN programs. The framework is based on an organization of curriculum content that moves the learner from elementary to complex nursing concepts and ideas as a means of leveling the content. Further, the framework is organized around four broad content areas included in all curricula: professional responsibility, care delivery, community and population-based nursing, and leadership/management. Examples of informatics content to be addressed at each level and in each content area are provided. Lastly, a practice-appraisal tool, the UVIC Informatics Practice Appraisal - BSN, is presented as a means of tracking student learning and outcomes across the four years of a BSN program.

  4. Establishing optimal project-level strategies for pavement maintenance and rehabilitation - A framework and case study

    NASA Astrophysics Data System (ADS)

    Irfan, Muhammad; Bilal Khurshid, Muhammad; Bai, Qiang; Labi, Samuel; Morin, Thomas L.

    2012-05-01

    This article presents a framework and an illustrative example for identifying the optimal pavement maintenance and rehabilitation (M&R) strategy using a mixed-integer nonlinear programming model. The objective function is to maximize cost-effectiveness, expressed as the ratio of effectiveness to cost. The constraints of the optimization problem relate to performance, budget, and choice. Two different formulations of effectiveness are derived using treatment-specific performance models for each constituent treatment of the strategy, and cost is expressed in terms of agency and user costs over the life cycle. The proposed methodology is demonstrated using a case study. Probability distributions are established for the optimization input variables, and Monte Carlo simulations are carried out to yield optimal solutions. Using the results of these simulations, M&R strategy contours are developed as a novel tool that can help pavement managers quickly identify the optimal M&R strategy for a given pavement section.
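
    A toy version of the Monte Carlo strategy-selection loop might look like the sketch below. The strategy names, effectiveness and cost figures are illustrative stand-ins, not values from the article, and the article's mixed-integer formulation is reduced here to enumeration over a small candidate set.

```python
import random

# Hypothetical candidate M&R strategies with illustrative (mean, sd) parameters
# for life-cycle effectiveness and cost; none of these numbers come from the study.
STRATEGIES = [
    ("thin overlay every 8 yr",   dict(eff_mean=12.0, eff_sd=2.0, cost_mean=40.0,  cost_sd=5.0)),
    ("mill-and-fill every 12 yr", dict(eff_mean=18.0, eff_sd=3.0, cost_mean=70.0,  cost_sd=8.0)),
    ("reconstruction at 25 yr",   dict(eff_mean=30.0, eff_sd=4.0, cost_mean=160.0, cost_sd=15.0)),
]

def optimal_strategy(budget, n_sims=2000, seed=1):
    """Pick the strategy maximizing the mean effectiveness/cost ratio over
    Monte Carlo draws of the inputs, subject to a budget constraint."""
    rng = random.Random(seed)
    best_name, best_ratio = None, float("-inf")
    for name, p in STRATEGIES:
        ratios = []
        for _ in range(n_sims):
            eff = max(rng.gauss(p["eff_mean"], p["eff_sd"]), 0.0)
            cost = max(rng.gauss(p["cost_mean"], p["cost_sd"]), 1e-6)
            if cost <= budget:                    # budget constraint
                ratios.append(eff / cost)
        if ratios and sum(ratios) / len(ratios) > best_ratio:
            best_ratio = sum(ratios) / len(ratios)
            best_name = name
    return best_name, best_ratio
```

    Sweeping `budget` (and other inputs) over a grid of values is one simple way to produce the kind of strategy contours the article describes.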

  5. CMS software architecture. Software framework, services and persistency in high level trigger, reconstruction and analysis

    NASA Astrophysics Data System (ADS)

    Innocente, V.; Silvestris, L.; Stickland, D.; CMS Software Group

    2001-10-01

    This paper describes the design of a resilient and flexible software architecture that has been developed to satisfy the data processing requirements of a large HEP experiment, CMS, currently being constructed at the LHC machine at CERN. We describe various components of a software framework that allows integration of physics modules and which can be easily adapted for use in different processing environments, both real-time (online trigger) and offline (event reconstruction and analysis). Features such as the mechanisms for scheduling algorithms, configuring the application, and managing the dependencies among modules are described in detail. In particular, a major effort has been placed on providing a service for managing persistent data, and the experience of using a commercial ODBMS (Objectivity/DB) is therefore described in detail.

  6. Robust Framework to Combine Diverse Classifiers Assigning Distributed Confidence to Individual Classifiers at Class Level

    PubMed Central

    Arshad, Sannia; Rho, Seungmin

    2014-01-01

    We have presented a classification framework that combines multiple heterogeneous classifiers in the presence of class label noise. An extension of m-Mediods based modeling is presented that generates models of the various classes whilst identifying and filtering noisy training data. The noise-free data are further used to learn models for other classifiers such as GMM and SVM. A weight learning method is then introduced to learn weights on each class for the different classifiers to construct an ensemble. For this purpose, we applied a genetic algorithm to search for an optimal weight vector on which the classifier ensemble is expected to give the best accuracy. The proposed approach is evaluated on a variety of real-life datasets. It is also compared with existing standard ensemble techniques such as Adaboost, Bagging, and Random Subspace Methods. Experimental results show the superiority of the proposed ensemble method over its competitors, especially in the presence of class label noise and imbalanced classes. PMID:25295302
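
    The class-level weighted-voting idea can be sketched as follows. For brevity the genetic algorithm is replaced here by a plain random search over weight matrices, and the names `weighted_vote` and `search_weights` are illustrative, not from the paper.

```python
import numpy as np

def weighted_vote(probas, weights):
    """Combine per-classifier class probabilities with per-(classifier, class)
    weights. probas: (n_clf, n_samples, n_classes); weights: (n_clf, n_classes)."""
    scores = np.einsum("csk,ck->sk", probas, weights)  # sum over classifiers
    return scores.argmax(axis=1)

def search_weights(probas, y, n_iter=200, seed=0):
    """Stand-in for the paper's genetic algorithm: random search for the
    class-level weight matrix that maximizes training accuracy."""
    rng = np.random.default_rng(seed)
    n_clf, _, n_classes = probas.shape
    best_w = np.ones((n_clf, n_classes))
    best_acc = (weighted_vote(probas, best_w) == y).mean()
    for _ in range(n_iter):
        w = rng.random((n_clf, n_classes))
        acc = (weighted_vote(probas, w) == y).mean()
        if acc > best_acc:
            best_acc, best_w = acc, w
    return best_w, best_acc
```

    Learning a separate weight per class, rather than one weight per classifier, is what lets the ensemble exploit a classifier that is reliable for only some of the classes.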

  7. A Competency-Based Framework for Graduate-Level Health Educators.

    ERIC Educational Resources Information Center

    American Alliance for Health, Physical Education, Recreation and Dance, Reston, VA. American Association for Health Education.

    This document builds on the 1997 Standards for the Preparation of Graduate-Level Health Educators by including expanded content descriptors and objectives for the graduate level competencies. It begins by discussing evolution of health education competencies; chronology of the graduate standard development process; benefits of graduate level…

  8. Homelessness Outcome Reporting Normative Framework: Systems-Level Evaluation of Progress in Ending Homelessness

    ERIC Educational Resources Information Center

    Austen, Tyrone; Pauly, Bernie

    2012-01-01

    Homelessness is a serious and growing issue. Evaluations of systemic-level changes are needed to determine progress in reducing or ending homelessness. The report card methodology is one means of systems-level assessment. Rather than solely establishing an enumeration, homelessness report cards can capture pertinent information about structural…

  9. Embracing a Common Focus: A Framework for Middle Level Teacher Preparation

    ERIC Educational Resources Information Center

    Faulkner, Shawn A.; Howell, Penny B.; Cook, Chris M.

    2013-01-01

    As more and more states make a commitment to specialized middle level teacher preparation, teacher education programs across the country must make the necessary adjustments to ensure middle level teachers are prepared to be successful. Unfortunately, individual state and institutional requirements often make this challenging and can result in…

  10. Effects of facility developments and encounter levels on perceptions of settings, crowding, and norms in a korean park.

    PubMed

    Kim, Sang-Oh; Shelby, Bo; Needham, Mark D

    2014-02-01

    This article examines potential effects of two physical developments (presence or absence of an aerial tramway, a road vs. a trail) and one social variable (increasing encounters with other people) on individuals' perceptions of settings (i.e., perceived settings), crowding, and acceptance of encounters (i.e., norms) in Mudeungsan Provincial Park in South Korea, where there have been proposals for a new aerial tramway. Data were obtained from 241 students at Chonnam National University, almost all of whom had previously visited this park (e.g., 66 % visited at least one of the two study locations in this park, 55 % visited this park in the past 12 months). Simulated photographs showed encounter levels (1 or 15 hikers), the presence or absence of a tramway, and a road versus a trail. Respondents encountering low numbers of other people felt less crowded, considered these use levels to be more acceptable, and perceived the area as more pristine and less developed. Locations containing an aerial tramway were perceived as more developed and less natural, and higher encounter levels were considered to be more acceptable at these locations. Whether settings contained a road or a trail did not influence perceived settings, crowding, or norms. Implications of these findings for future research and management of parks and related outdoor settings are discussed.

  12. Wave energy level and geographic setting correlate with Florida beach water quality.

    PubMed

    Feng, Zhixuan; Reniers, Ad; Haus, Brian K; Solo-Gabriele, Helena M; Kelly, Elizabeth A

    2016-03-15

    Many recreational beaches suffer from elevated levels of microorganisms, resulting in beach advisories and closures due to lack of compliance with Environmental Protection Agency guidelines. We conducted the first statewide beach water quality assessment by analyzing decadal records of fecal indicator bacteria (enterococci and fecal coliform) levels at 262 Florida beaches. The objectives were to depict synoptic patterns of beach water quality exceedance along the entire Florida shoreline and to evaluate their relationships with wave condition and geographic location. Percent exceedances based on enterococci and fecal coliform were negatively correlated with both long-term mean wave energy and beach slope. Also, Gulf of Mexico beaches exceeded the thresholds significantly more than Atlantic Ocean ones, perhaps partially due to the lower wave energy. A possible linkage between wave energy level and water quality is beach sand, a pervasive nonpoint source that tends to harbor more bacteria in the low-wave-energy environment.

  14. Sparse field level set method for non-convex Hamiltonians in 3D plasma etching profile simulations

    NASA Astrophysics Data System (ADS)

    Radjenović, Branislav; Lee, Jae Koo; Radmilović-Radjenović, Marija

    2006-01-01

    The level set method [S. Osher, J. Sethian, J. Comput. Phys. 79 (1988) 12] is a highly robust and accurate computational technique for tracking moving interfaces in various application domains. It originates from the idea of viewing the moving front as a particular level set of a higher dimensional function, so that topological merging and breaking, sharp gradients, and cusps can form naturally, and the effects of curvature can be easily incorporated. The resulting equations describing the evolution of the interface surface are of Hamilton-Jacobi type, and they are solved using techniques developed for hyperbolic equations. In this paper we describe an extension of the sparse field method for solving level set equations in the case of non-convex Hamiltonians, which are common in simulations of profile surface evolution during plasma etching and deposition processes. The sparse field method itself, developed by Whitaker [R. Whitaker, Internat. J. Comput. Vision 29 (3) (1998) 203] and broadly used in the image processing community, is an alternative to the usual combination of narrow band and fast marching procedures for the computationally efficient solution of level set equations. The developed procedure is applied to simulations of 3D feature profile surface evolution during a plasma etching process that includes the effects of ion-enhanced chemical etching and physical sputtering, which are the primary causes of the Hamiltonian's non-convexity.
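
    For the convex case, the basic Hamilton-Jacobi update can be sketched in one dimension with a first-order Godunov upwind scheme, as below. This is a minimal illustration only: the sparse field method additionally restricts the update to a few layers of grid points around the zero level set, and the non-convex case described in the paper requires a different numerical flux.

```python
import numpy as np

def evolve_level_set(phi, speed, dx=1.0, dt=0.4, steps=1):
    """First-order upwind update of dphi/dt + F |grad phi| = 0 in 1-D,
    for constant F > 0 (convex Hamiltonian). Periodic boundaries via
    np.roll; fine away from the domain edges."""
    for _ in range(steps):
        dminus = (phi - np.roll(phi, 1)) / dx    # backward difference
        dplus = (np.roll(phi, -1) - phi) / dx    # forward difference
        # Godunov upwind gradient magnitude for F > 0
        grad = np.sqrt(np.maximum(np.maximum(dminus, 0.0) ** 2,
                                  np.minimum(dplus, 0.0) ** 2))
        phi = phi - dt * speed * grad
    return phi
```

    With `phi` initialized as a signed distance function, each step moves the zero level set outward by roughly `speed * dt`, provided the CFL condition `speed * dt / dx <= 1` holds.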

  15. The Daily Events and Emotions of Master's-Level Family Therapy Trainees in Off-Campus Practicum Settings

    ERIC Educational Resources Information Center

    Edwards, Todd M.; Patterson, Jo Ellen

    2012-01-01

    The Day Reconstruction Method (DRM) was used to assess the daily events and emotions of one program's master's-level family therapy trainees in off-campus practicum settings. This study examines the DRM reports of 35 family therapy trainees in the second year of their master's program in marriage and family therapy. Four themes emerged from the…

  16. An on-line learning tracking of non-rigid target combining multiple-instance boosting and level set

    NASA Astrophysics Data System (ADS)

    Chen, Mingming; Cai, Jingju

    2013-10-01

    Visual tracking algorithms based on online boosting generally use a rectangular bounding box to represent the position of the target, although the actual shape of the target is usually irregular. This causes the classifier to learn features from the non-target parts of the rectangular region, which reduces classifier performance and leads to drift. To avoid the limitations of the bounding box, we propose a novel tracking-by-detection algorithm incorporating level set segmentation, which ensures that the classifier learns only the features of the true target area within the tracking box. Because the shape of the target changes only slightly between two adjacent frames, and the current level set algorithm avoids re-initialization of the signed distance function, only a few iterations are needed to converge to the target contour in the next frame. We also improve the level set energy function so that the zero level set is less likely to converge to a false contour. In addition, we use gradient boosting to improve the original multiple-instance learning (MIL) algorithm, as in the WMILtracker, which greatly speeds up the tracker. Our algorithm outperforms the original MILtracker in both speed and precision. Compared with the WMILtracker, our algorithm runs at almost the same speed but avoids the drift caused by background learning, so its precision is better.
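
    At the core of MIL boosting is a bag-level probability: a bag of candidate patches is positive if at least one of its instances is positive. The standard noisy-OR formulation can be sketched as follows (a generic illustration of MIL trackers in this family, not this paper's exact formulation):

```python
def bag_probability(instance_probs):
    """Noisy-OR bag probability used in MIL boosting: the bag is positive
    iff at least one instance is positive, so P(bag) = 1 - prod(1 - p_i)."""
    p_all_negative = 1.0
    for q in instance_probs:
        p_all_negative *= (1.0 - q)
    return 1.0 - p_all_negative
```

    Training then maximizes the log-likelihood of these bag probabilities rather than per-instance labels, which tolerates the ambiguity of not knowing exactly which patch in the positive bag contains the target.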

  17. Simulation of Heterogeneous Atom Probe Tip Shapes Evolution during Field Evaporation Using a Level Set Method and Different Evaporation Models

    SciTech Connect

    Xu, Zhijie; Li, Dongsheng; Xu, Wei; Devaraj, Arun; Colby, Robert J.; Thevuthasan, Suntharampillai; Geiser, B. P.; Larson, David J.

    2015-04-01

    In atom probe tomography (APT), accurate reconstruction of the spatial positions of field-evaporated ions from measured detector patterns depends upon a correct understanding of the dynamic tip shape evolution and the evaporation laws of the component atoms. Artifacts in APT reconstructions of heterogeneous materials can be attributed to the assumption of homogeneous evaporation of all the elements in the material, in addition to the assumption of a steady-state hemispherical dynamic tip shape evolution. A specimen shape evolution model based on the level set method is developed in this study to simulate the evaporation of synthetic layered-structure APT tips. The simulated shape evolution from the level set model qualitatively agrees with the finite element method and with literature data obtained using the finite difference method. The asymmetric evolving shape predicted by the level set model demonstrates the complex evaporation behavior of a heterogeneous tip, and the interface curvature can potentially lead to artifacts in the APT reconstruction of such materials. Compared with other APT simulation methods, the new method provides a smoother interface representation with the aid of intrinsic sub-grid accuracy. Two evaporation models (linear and exponential evaporation laws) are implemented in the level set simulations, and the effect of the evaporation law on the tip shape evolution is also presented.
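
    The two evaporation laws set the normal speed of the level set front as a function of the local field relative to the species' evaporation field. A minimal sketch of the two functional forms, with illustrative constants and hypothetical names (`linear_rate`, `exponential_rate`); the study's actual parameterization may differ:

```python
import math

def linear_rate(field, field_evap, k=1.0):
    """Linear evaporation law: rate proportional to the ratio of the local
    field to the species' evaporation field (illustrative form)."""
    return max(k * field / field_evap, 0.0)

def exponential_rate(field, field_evap, nu0=1.0, q=10.0):
    """Exponential (Arrhenius-like) law: rate falls off exponentially as the
    local field drops below the evaporation field; q plays the role of a
    barrier-over-kT constant (values here are illustrative)."""
    return nu0 * math.exp(-q * (1.0 - field / field_evap))
```

    In a layered tip, species with different `field_evap` values evaporate at different rates under the same local field, which is what drives the asymmetric, non-hemispherical shapes the simulation predicts.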

  18. The Integrated Behavioural Model for Water, Sanitation, and Hygiene: a systematic review of behavioural models and a framework for designing and evaluating behaviour change interventions in infrastructure-restricted settings

    PubMed Central

    2013-01-01

    Background Promotion and provision of low-cost technologies that enable improved water, sanitation, and hygiene (WASH) practices are seen as viable solutions for reducing high rates of morbidity and mortality due to enteric illnesses in low-income countries. A number of theoretical models, explanatory frameworks, and decision-making models have emerged which attempt to guide behaviour change interventions related to WASH. The design and evaluation of such interventions would benefit from a synthesis of this body of theory informing WASH behaviour change and maintenance. Methods We completed a systematic review of existing models and frameworks through a search of related articles available in PubMed and in the grey literature. Information on the organization of behavioural determinants was extracted from the references that fulfilled the selection criteria and synthesized. Results from this synthesis were combined with other relevant literature, and with feedback from concurrent formative and pilot research conducted in the context of two cluster-randomized trials on the efficacy of WASH behaviour change interventions, to inform the development of a framework to guide the development and evaluation of WASH interventions: the Integrated Behavioural Model for Water, Sanitation, and Hygiene (IBM-WASH). Results We identified 15 WASH-specific theoretical models, behaviour change frameworks, or programmatic models, of which 9 addressed our review questions. Existing models under-represented the potential role of technology in influencing behavioural outcomes, focused on individual-level behavioural determinants, and largely ignored the role of the physical and natural environment. IBM-WASH attempts to correct this by acknowledging three dimensions (Contextual Factors, Psychosocial Factors, and Technology Factors) that operate at five levels (structural, community, household, individual, and habitual). Conclusions A number of WASH-specific models and frameworks

  19. High performance in healthcare priority setting and resource allocation: A literature- and case study-based framework in the Canadian context.

    PubMed

    Smith, Neale; Mitton, Craig; Hall, William; Bryan, Stirling; Donaldson, Cam; Peacock, Stuart; Gibson, Jennifer L; Urquhart, Bonnie

    2016-08-01

    Priority setting and resource allocation, or PSRA, are key functions of executive teams in healthcare organizations. Yet decision-makers often base their choices on historical patterns of resource distribution or political pressures. Our aim was to provide leaders with guidance on how to improve PSRA practice, by creating organizational contexts which enable high performance. We carried out in-depth case studies of six Canadian healthcare organizations to obtain from healthcare leaders their understanding of the concept of high performance in PSRA and the factors which contribute to its achievement. Individual and group interviews were carried out (n = 62) with senior managers, middle managers and Board members. Site observations and document review were used to assist researchers in interpreting the interview data. Qualitative data were analyzed iteratively with the literature on empirical examples of PSRA practice, in order to develop a framework of high performance in PSRA. The framework consists of four domains - structures, processes, attitudes and behaviours, and outcomes - within which are 19 specific elements. The emergent themes derive from case studies in different kinds of health organizations (urban/rural, small/large) across Canada. The elements can serve as a checklist for 'high performance' in PSRA. This framework provides a means by which decision-makers in healthcare might assess their practice and identify key areas for improvement. The findings are likely generalizable, certainly within Canada but also across countries. This work constitutes, to our knowledge, the first attempt to present a full package of elements comprising high performance in health care PSRA.

  1. A framework for the recognition of high-level surgical tasks from video images for cataract surgeries.

    PubMed

    Lalys, F; Riffaud, L; Bouget, D; Jannin, P

    2012-04-01

    The need for better integration of the new generation of computer-assisted surgical systems has recently been emphasized. One necessity for achieving this objective is to retrieve data from the operating room (OR) with different sensors, and then to derive models from these data. Recently, the use of videos from cameras in the OR has demonstrated its efficiency. In this paper, we propose a framework to assist in the development of systems for the automatic recognition of high-level surgical tasks using microscope video analysis. We validated its use on cataract procedures. The idea is to combine state-of-the-art computer vision techniques with time series analysis. The first step of the framework consisted in the definition of several visual cues for extracting semantic information, thereby characterizing each frame of the video. Five image-based classifiers were therefore implemented. A pupil segmentation step was also applied for dedicated visual cue detection. Time series classification algorithms were then applied to model the time-varying data. Dynamic time warping and hidden Markov models were tested. This combination draws on the advantages of both families of methods for a better understanding of the problem. The framework was finally validated through various studies. Six binary visual cues were chosen, along with 12 phases to detect, obtaining accuracies of 94%. PMID:22203700
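
    Of the two time series techniques tested, dynamic time warping is the simpler to sketch: it aligns two cue signals of different lengths by minimizing cumulative cost over a warping path. A textbook implementation for 1-D sequences (illustrative, not the paper's code):

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D sequences,
    computed by dynamic programming over the (len(a)+1) x (len(b)+1) grid."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible predecessor paths
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

    Because DTW tolerates local stretching and compression of the time axis, two executions of the same surgical phase performed at different speeds can still be matched.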

  2. The health literacy skills framework.

    PubMed

    Squiers, Linda; Peinado, Susana; Berkman, Nancy; Boudewyns, Vanessa; McCormack, Lauren

    2012-01-01

    Although there are a variety of models and frameworks that describe factors associated with health literacy skills, few illustrate in a single framework or model the full pathway from the development and moderators of health literacy skills, through their application, to the outcomes that result. This article introduces the Health Literacy Skills conceptual framework, which does encompass this full continuum. To develop the framework, the authors reviewed and built upon existing health literacy frameworks. The Health Literacy Skills framework hypothesizes the relations between health literacy and health-related outcomes and depicts how health literacy functions at the level of the individual. The framework also reflects how factors external to the individual (e.g., family, setting, community, culture, and media) influence the constructs and relations represented in the framework. The framework is organized into 4 primary components: (a) factors that influence the development and use of health literacy skills; (b) health-related stimuli; (c) health literacy skills needed to comprehend the stimulus and perform the task; and (d) mediators between health literacy and health outcomes. Previous theoretical frameworks lend support to the proposed causal pathways it illustrates. The authors hope this conceptual framework can serve as a springboard for further discussion and advancement in operationalizing this complex construct. The Health Literacy Skills framework could also be used to guide the development of interventions to improve health literacy. Future research should be conducted to fully test the relations in the framework.

  3. Connected Functional Working Spaces: A Framework for the Teaching and Learning of Functions at Upper Secondary Level

    ERIC Educational Resources Information Center

    Minh, Tran Kiem; Lagrange, Jean-Baptiste

    2016-01-01

    This paper aims at contributing to remedy the narrow treatment of functions at upper secondary level. Assuming that students make sense of functions by working on functional situations in distinctive settings, we propose to consider functional working spaces inspired by geometrical working spaces. We analyse a classroom situation based on a…

  4. Levels of Cognitive Complexity: A Framework for the Measurement of Thinking.

    ERIC Educational Resources Information Center

    McDaniel, Ernest

    Some theoretical background is presented for the proposition that thinking processes can be measured by determining the levels of cognitive complexity apparent in written interpretations of complex situations. The rationale for scoring interpretations is presented, and some illustrative data are discussed. The approach to measurement of thinking…

  5. Optimal Sampling of Units in Three-Level Cluster Randomized Designs: An Ancova Framework

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2011-01-01

    Field experiments with nested structures assign entire groups such as schools to treatment and control conditions. Key aspects of such cluster randomized experiments include knowledge of the intraclass correlation structure and the sample sizes necessary to achieve adequate power to detect the treatment effect. The units at each level of the…

  6. Optical coupling devices to a broadband low level laser therapy set

    NASA Astrophysics Data System (ADS)

    Gryko, L.; Zajac, A.

    2011-10-01

    Precise knowledge of the spatial distribution of optical radiation in a biological medium is required in all medical laser procedures, but for the low-energy interactions influencing the course of photochemical processes (biostimulation treatments) it has not yet been precisely controlled. The variety of procedures and trial results motivates the search for unequivocal laser radiation parameters that, both in vitro and in vivo, will result in accelerated cell proliferation and the expected therapeutic efficacy. There is a need to conduct objective diagnostic tests of tissues during treatment, using a laser measuring system that analyzes the state of the tissue (its optical properties) during therapeutic exposure. It is necessary to build an illuminator providing a homogeneous distribution of spectral power density and spatial power density on the surface of the tested tissue. The illumination set is composed of over a dozen LEDs emitting in the therapeutic window of biological tissue (600-1000 nm range). This paper presents the optical couplers that enable this purpose: a conical coupler and a multimode (MM) planar fiber.

  7. Framework for DOE mixed low-level waste disposal: Site fact sheets

    SciTech Connect

    Gruebel, M.M.; Waters, R.D.; Hospelhorn, M.B.; Chu, M.S.Y.

    1994-11-01

    The Department of Energy (DOE) is required to prepare and submit Site Treatment Plans (STPs) pursuant to the Federal Facility Compliance Act (FFCAct). Although the FFCAct does not require that disposal be addressed in the STPs, the DOE and the States recognize that treatment of mixed low-level waste will result in residues that will require disposal in either low-level waste or mixed low-level waste disposal facilities. As a result, the DOE is working with the States to define and develop a process for evaluating disposal-site suitability in concert with the FFCAct and development of the STPs. Forty-nine potential disposal sites were screened; preliminary screening criteria reduced the number of sites for consideration to twenty-six. The DOE then prepared fact sheets for the remaining sites. These fact sheets provided additional site-specific information for understanding the strengths and weaknesses of the twenty-six sites as potential disposal sites. The information also provided the basis for discussion among affected States and the DOE in recommending sites for more detailed evaluation.

  8. Synthesis of magnetic framework composites for the discrimination of Escherichia coli at the strain level.

    PubMed

    Wei, Ji-Ping; Qiao, Bin; Song, Wen-Jun; Chen, Tao; Li, Fei; Li, Bo-Zhi; Wang, Jin; Han, Ye; Huang, Yan-Feng; Zhou, Zhi-Jiang

    2015-04-01

    Rapid and efficient characterization and identification of pathogens at the strain level is of key importance for epidemiologic investigations, yet it remains a challenge. In this work, Fe3O4-COOH@MIL-101 composites were fabricated solvothermally by an in situ crystallization approach. The composites combine the excellent properties of both chromium (III) terephthalate (MIL-101) and carboxylic-functionalized magnetite (Fe3O4-COOH) particles, and possess efficient peptide/protein enrichment properties and magnetic responsiveness. Fe3O4-COOH@MIL-101 composites were used as magnetic solid-phase extraction materials to increase the discriminatory power of MALDI-TOF MS profiles. BSA tryptic peptides at a concentration as low as 0.25 fmol μL(-1) could be detected by MALDI-TOF MS. In addition, Fe3O4-COOH@MIL-101 composites were successfully applied to the selective enrichment of protein biomarkers from bacterial cell lysates and the discrimination of Escherichia coli at the strain level. This work opens the possibility of wide application of magnetic MOFs to discriminate pathogens below the species level. PMID:25813232

  9. VIALS: An Eulerian tool based on total variation and the level set method for studying dynamical systems

    NASA Astrophysics Data System (ADS)

    You, Guoqiao; Leung, Shingyu

    2014-06-01

    We propose a new Eulerian tool to study complicated dynamical systems, based on the average growth in the surface area of a family of level surfaces represented implicitly by a level set function. Since this quantity measures the temporal variation of the averaged surface area of all level surfaces, we name it the Variation of the Integral over Area of Level Surfaces (VIALS). Numerically, all of these infinitely many level surfaces are advected according to the given dynamics by solving one single linear advection equation. To develop a computationally efficient approach, we apply the coarea formula and rewrite the surface area integral as a simple integral involving the total variation (TV) of the level set function. The proposed method can easily be combined with a recent Eulerian algorithm for efficient computation of flow maps to speed up our approach. We also prove that the proposed VIALS is closely related to the computation of the finite time Lyapunov exponent (FTLE) in Lagrangian coherent structure (LCS) extraction. This connects our Eulerian approach to widely used Lagrangian techniques for understanding complicated dynamical systems.
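    As a rough illustration of the idea, the coarea formula lets the area integral over all level surfaces be computed as the total variation of the level set function, while the level sets themselves are advected by one linear advection equation. The sketch below is a minimal 2D version with a first-order upwind scheme and an illustrative cellular velocity field; the grid, flow, and parameters are assumptions for demonstration, not the paper's numerics or test cases:

    ```python
    import numpy as np

    def total_variation(phi, dx):
        """Coarea formula: integrating the length of all level sets of phi
        over level values gives the total variation of phi."""
        gy, gx = np.gradient(phi, dx)
        return np.sum(np.sqrt(gx**2 + gy**2)) * dx * dx

    def advect(phi, u, v, dx, dt):
        """One first-order upwind step for the linear advection of phi."""
        dpx_m = (phi - np.roll(phi, 1, axis=1)) / dx   # backward difference in x
        dpx_p = (np.roll(phi, -1, axis=1) - phi) / dx  # forward difference in x
        dpy_m = (phi - np.roll(phi, 1, axis=0)) / dx
        dpy_p = (np.roll(phi, -1, axis=0) - phi) / dx
        dpx = np.where(u > 0, dpx_m, dpx_p)            # upwind selection
        dpy = np.where(v > 0, dpy_m, dpy_p)
        return phi - dt * (u * dpx + v * dpy)

    # Illustrative cellular velocity field on a periodic grid
    n = 128
    dx, dt = 1.0 / n, 0.2 / n
    x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
    u = -np.sin(np.pi * x) * np.cos(np.pi * y)
    v = np.cos(np.pi * x) * np.sin(np.pi * y)

    phi = np.sin(2 * np.pi * x)        # initial level set function
    tv0 = total_variation(phi, dx)
    for _ in range(200):
        phi = advect(phi, u, v, dx, dt)
    growth = total_variation(phi, dx) / tv0   # VIALS-like area-growth ratio
    ```

    A VIALS-type diagnostic would then track this growth over time and, per the abstract, relate regions of strong growth to FTLE ridges computed from flow maps.
    
    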

  10. Towards tributyltin quantification in natural water at the Environmental Quality Standard level required by the Water Framework Directive.

    PubMed

    Alasonati, Enrica; Fettig, Ina; Richter, Janine; Philipp, Rosemarie; Milačič, Radmila; Sčančar, Janez; Zuliani, Tea; Tunç, Murat; Bilsel, Mine; Gören, Ahmet Ceyhan; Fisicaro, Paola

    2016-11-01

    The European Union (EU) has included tributyltin (TBT) and its compounds in the list of priority water pollutants. Quality standards demanded by the EU Water Framework Directive (WFD) require determination of TBT at such a low concentration level that chemical analysis is still difficult, and further research is needed to improve the sensitivity, accuracy, and precision of existing methodologies. Within the frame of the joint research project "Traceable measurements for monitoring critical pollutants under the European Water Framework Directive" in the European Metrology Research Programme (EMRP), four metrological and designated institutes have developed a primary method to quantify TBT in natural water using liquid-liquid extraction (LLE) and species-specific isotope dilution mass spectrometry (SSIDMS). The procedure has been validated at the Environmental Quality Standard (EQS) level (0.2 ng L(-1) as cation) and at the WFD-required limit of quantification (LOQ) (0.06 ng L(-1) as cation). The LOQ of the methodology was 0.06 ng L(-1), and the average measurement uncertainty at the LOQ was 36%, which agreed with WFD requirements. The analytical difficulties of the method, namely the presence of TBT in blanks and the sources of measurement uncertainty, as well as the interlaboratory comparison results, are discussed in detail. PMID:27591644

  11. Instant visual detection of picogram levels of trinitrotoluene by using luminescent metal-organic framework gel-coated filter paper.

    PubMed

    Lee, Ji Ha; Kang, Sunwoo; Lee, Jin Yong; Jaworski, Justyn; Jung, Jong Hwa

    2013-12-01

    There is an ongoing need for explosive detection strategies to uncover threats to human security including illegal transport and terrorist activities. The widespread military use of the explosive trinitrotoluene (TNT) for landmines poses another particular threat to human health in the form of contamination of the surrounding environment and groundwater. The detection of explosives, particularly at low picogram levels, by using a molecular sensor is seen as an important challenge. Herein, we report on the use of a fluorescent metal-organic framework hydrogel that exhibits a higher detection capability for TNT in the gel state compared with that in the solution state. A portable sensor prepared from filter paper coated by the hydrogel was able to detect TNT at the picogram level with a detection limit of 1.82 ppt (parts per trillion). Our results present a simple and new means to provide selective detection of TNT on a surface or in aqueous solution, as afforded by the unique molecular packing through the metal-organic framework structure in the gel formation and the associated photophysical properties. Furthermore, the rheological properties of the MOF-based gel were similar to those of a typical hydrogel.

  12. Using the World Health Organization's 4S-Framework to Strengthen National Strategies, Policies and Services to Address Mental Health Problems in Adolescents in Resource-Constrained Settings

    PubMed Central

    2011-01-01

    Background Most adolescents live in resource-constrained countries and their mental health has been less well recognised than other aspects of their health. The World Health Organization's 4-S Framework provides a structure for national initiatives to improve adolescent health through: gathering and using strategic information; developing evidence-informed policies; scaling up provision and use of health services; and strengthening linkages with other government sectors. The aim of this paper is to discuss how the findings of a recent systematic review of mental health problems in adolescents in resource-constrained settings might be applied using the 4-S Framework. Method Analysis of the implications of the findings of a systematic search of the English-language literature for national strategies, policies, services and cross-sectoral linkages to improve the mental health of adolescents in resource-constrained settings. Results Data are available for only 33/112 [29%] resource-constrained countries, but in all countries where data are available, non-psychotic mental health problems in adolescents are identifiable, prevalent and associated with reduced quality of life, impaired participation and compromised development. In the absence of evidence about effective interventions in these settings, expert opinion is that a broad public policy response is required, one which addresses direct strategies for prevention, early intervention and treatment; health service and health workforce requirements; social inclusion of marginalised groups of adolescents; and specific education. Specific endorsed strategies include public education, parent education, training for teachers and primary healthcare workers, psycho-educational curricula, identification through periodic screening of the most vulnerable and referral for care, and the availability of counsellors or other identified trained staff members in schools from whom adolescents can seek assistance for personal, peer and family

  13. Exploring a morphodynamic modeling framework for reef island evolution under sea-level rise

    NASA Astrophysics Data System (ADS)

    Lorenzo Trueba, J.; Ashton, A. D.; Donnelly, J. P.

    2013-12-01

    Global sea-level rise rates have increased over the last century, with dramatic rate increases expected over the coming century and beyond. Not only are rates projected to approach those of the previous deglaciation, the actual increase in elevation by the end of the century (potentially 1 m or more) will be significant in terms of the elevations of low-lying coastal landforms. Coral reef islands, often called 'cays' or 'motus', which generally comprise the subaerial portion of atolls, are particularly sensitive to sea-level rise. These landforms are typically low-lying (on the order of meters high), and are formed of wave-transported detrital sediment perched atop coralline rock. As opposed to barrier islands, which can be supplied by offshore sediment from the shoreface, breakdown of corals and the shallow offshore lithology can serve as a source of sediment to reef islands, which can help build these islands as sea level rises. Here, we present a morphodynamic model to explore the combined effects of sea-level rise, sediment supply, and overwash processes on the evolution of reef islands. Model results demonstrate how reef islands are particularly sensitive to the offshore generation of sediment. When this onshore sediment supply is low, islands migrate lagoonward via storm overwash. Islands migrate over the proximal lagoonward regions, which tend to include a shallow (~2 m) platform, until they reach the edge of a typically very deep lagoon (up to 60 m or more). At the lagoon edge, reef islands stop their migration and eventually drown as overwash sediment flux is lost to the lagoon. In contrast, a high supply of offshore sediment can bulwark reef islands before they reach the lagoon edge. One possibility is that the island attains a 'static equilibrium' in which the overwash flux fills the top-barrier accommodation created by sea-level rise and the island surface area is maintained. When the sediment supply is very high, however, the island can undergo rapid
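    The two regimes described above (lagoonward migration under low offshore sediment supply versus a static equilibrium under high supply) can be caricatured with a per-step mass balance. The linear overwash rule and every number below are illustrative assumptions for intuition only, not the authors' morphodynamic model:

    ```python
    def step(island_x, island_z, sea_level, q_supply, slr_rate,
             width=200.0, dt=1.0):
        """One time step of a toy reef-island mass balance.
        Sea level rises; offshore-derived sediment (q_supply, m^3/m/yr)
        fills the accommodation created atop the island; any shortfall is
        made up by overwash, which moves the island lagoonward (+x) while
        its surface falls behind sea level."""
        sea_level += slr_rate * dt
        accommodation = slr_rate * dt * width          # m^3 per m of shoreline
        shortfall = max(0.0, accommodation - q_supply * dt)
        island_z += (accommodation - shortfall) / width  # growth from supply
        island_x += shortfall / 2.0                      # arbitrary migration rule
        return island_x, island_z, sea_level

    # High supply: the island keeps pace in place ("static equilibrium")
    x, z, s = 0.0, 0.0, 0.0
    for _ in range(100):
        x, z, s = step(x, z, s, q_supply=3.0, slr_rate=0.01)
    keeps_up = (x == 0.0)          # no overwash migration; z tracks sea level

    # Low supply: the island migrates lagoonward and falls behind sea level
    x2, z2, s2 = 0.0, 0.0, 0.0
    for _ in range(100):
        x2, z2, s2 = step(x2, z2, s2, q_supply=0.5, slr_rate=0.01)
    migrates = (x2 > 1.0)
    ```

    In the low-supply run the island both migrates and accretes more slowly than sea level rises, echoing the drowning behavior the abstract describes at the lagoon edge.
    
    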

  14. [A case of depression whose symptoms cured by setting her psychological base on the transcendent level].

    PubMed

    Ogasawara, Masayuki; Tagami, Shinji; Inoue, Yoichi; Takeda, Masatoshi

    2009-01-01

    she prayed to God for healing when she read a part in the Bible about a woman suffering from a hemorrhage for twelve years who touched the hem of Jesus' garment and was healed immediately (Matthew 9:20-22 and Luke 8:43-48), the patient suddenly experienced "the salvation of God" and realized what trust really meant. Through the experience, her clinical problems became totally cured, and the therapy concluded with her discharge from hospital. Several months later, she sent the therapist a letter including the following message: "I am grateful to the Lord for salvation from anxiety and irritation, but to the therapist for helping me realize it." This clinical course can be understood based on the patient's clinical problems (e.g., despair, anxiety, and depression), arising from the breakdown of her efforts to maintain stability by founding her psychological base on her feelings of omnipotence, avoiding facing her internal negative psychological factors (e.g., rage), and these were automatically resolved when her psychological base was switched to the transcendent level through "the Great being" experience and "the salvation of God." Such a sudden, marked improvement resembles what Miller and C'de Baca reported as "quantum change," of which the characteristics are vividness, surprise, benevolence, and permanence. The therapist paid attention to maintain a constant psychological distance from the patient, not persisting in modifying her cognition, with the transcendent level being the basis for the entire therapy. This stance of the therapist itself was considered to prompt her transcendence and bring about her eventual cure. This clinical course seemed to be highly suggestive of a psychotherapeutic mechanism, indicating the close relationship between the transcendent level and basic trust.

  15. Hydrogeologic setting east of a low-level radioactive-waste disposal site near Sheffield, Illinois

    USGS Publications Warehouse

    Foster, J.B.; Garklavs, George; Mackey, G.W.

    1984-01-01

    Core samples from 45 test wells and 4 borings were used to describe the glacial geology of the area east of the low-level radioactive-waste disposal site near Sheffield, Bureau County, Illinois. Previous work has shown that shallow ground water beneath the disposal site flows east through a pebbly-sand unit of the Toulon Member of the Glasford Formation. The pebbly sand was found in core samples from wells in an area extending northeast from the waste-disposal site to a strip-mine lake and east along the south side of the lake. Other stratigraphic units identified in the study area are correlated with units found on the disposal site. The pebbly-sand unit of the Toulon Member grades from a pebbly sand on site into a coarse gravel with sand and pebbles towards the lake. The Hulick Till Member, a key bed, underlies the Toulon Member throughout most of the study area. A narrow channel-like depression in the Hulick Till is filled with coarse gravelly sand of the Toulon Member. The filled depression extends eastward from near the northeast corner of the waste-disposal site to the strip-mine lake. (USGS)

  16. Examining Screening-Level Multimedia Models Through a Comparison Framework for Landfill Management.

    PubMed

    Asif, Zunaira; Chen, Zhi

    2016-01-01

    Two models for evaluating the transport and fate of benzene were studied and compared in this paper. A fugacity model and an analytical environmental multimedia model (AEMM) were used to reconcile the fate and mass transfer of benzene observed at a landfill site. The comparison of the two models was based on average concentrations and the partition behavior of benzene among three phases, i.e., air, soil, and groundwater. In the fugacity study, about 99.6% of the total benzene flux was distributed into air from the landfill source. According to the AEMM, diffusive gas flux was likewise the predominant mechanism for benzene released from the landfill, and advection of gas and liquid was the second dominant transport mechanism at steady-state conditions. Overall, the fugacity modeling study (Levels I and II) confirms the fate and transport mechanisms of benzene released from the landfill when compared with the AEMM. However, the concentrations, advection fluxes, and diffusion fluxes of benzene predicted by the fugacity model differed from the AEMM results due to variation in input parameters. Compared with experimental observations, the fugacity model showed a larger error than the AEMM, as the fugacity model treats the site as a single-unit box. This study confirms that the fugacity model is a screening-level tool to be used in conjunction with the more detailed AEMM, which can serve at the strategic decision-making stage. PMID:26342953
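    For orientation, a Level I fugacity calculation distributes a fixed chemical amount among compartments in proportion to V·Z and reads concentrations off a single common fugacity f = M / Σ(V·Z). The sketch below uses approximate handbook-style properties for benzene and hypothetical compartment volumes, not the paper's parameters; with air so voluminous, nearly all of the benzene partitions into it, qualitatively consistent with the air-dominated distribution reported above:

    ```python
    # Level I fugacity sketch: equilibrium partitioning of a fixed amount of
    # benzene among air, water, and soil. Z values use approximate benzene
    # properties; volumes are hypothetical, not the paper's inputs.
    R, T = 8.314, 298.0          # gas constant (J/mol/K), temperature (K)
    H = 557.0                    # Henry's law constant for benzene (Pa m^3/mol, approx.)
    Kow = 10**2.13               # octanol-water partition coefficient (approx.)

    Z = {
        "air":   1.0 / (R * T),
        "water": 1.0 / H,
        # soil Z via organic-carbon sorption: Koc ~ 0.41*Kow, foc = 2%,
        # bulk density 2400 kg/m^3 (all assumed values)
        "soil":  (1.0 / H) * (0.41 * Kow) * 0.02 * 2400.0 / 1000.0,
    }
    V = {"air": 6e9, "water": 7e6, "soil": 4.5e4}   # m^3, hypothetical region

    M = 1000.0                                      # mol of benzene released
    f = M / sum(V[c] * Z[c] for c in Z)             # common fugacity (Pa)
    conc = {c: Z[c] * f for c in Z}                 # mol/m^3 per compartment
    fraction = {c: V[c] * Z[c] * f / M for c in Z}  # mass fraction per compartment
    ```

    A Level II calculation would add reaction and advection losses at steady state; the partitioning step above is the common core of both.
    
    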

  19. Setting action levels for drinking water: are we protecting our health or our economy (or our backs!)?

    PubMed

    Reimann, Clemens; Banks, David

    2004-10-01

    Clean and healthy drinking water is important for life. Drinking water can be drawn from streams, lakes and rivers, directly collected (and stored) from rain, acquired by desalination of ocean water and melting of ice or it can be extracted from groundwater resources. Groundwater may reach the earth's surface in the form of springs or can be extracted via dug or drilled wells; it also contributes significantly to river baseflow. Different water quality issues have to be faced when utilising these different water resources. Some of these are at present largely neglected in water quality regulations. This paper focuses on the inorganic chemical quality of natural groundwater. Possible health effects, the problems of setting meaningful action levels or maximum admissible concentrations (MAC-values) for drinking water, and potential shortcomings in current legislation are discussed. An approach to setting action levels based on transparency, toxicological risk assessment, completeness, and identifiable responsibility is suggested. PMID:15336887

  1. WriteSmoothing: Improving Lifetime of Non-volatile Caches Using Intra-set Wear-leveling

    SciTech Connect

    Mittal, Sparsh; Vetter, Jeffrey S; Li, Dong

    2014-01-01

    Driven by the trends of increasing core counts and the bandwidth-wall problem, the size of last-level caches (LLCs) has greatly increased. Since SRAM consumes high leakage power, researchers have explored the use of non-volatile memories (NVMs) for designing caches, as they provide high density and consume low leakage power. However, since NVMs have low write endurance and existing cache management policies are unaware of write variation, effective wear-leveling techniques are required to achieve reasonable cache lifetimes with NVMs. We present WriteSmoothing, a technique for mitigating intra-set write variation in NVM caches. WriteSmoothing logically divides the cache sets into multiple modules. For each module, WriteSmoothing collectively records the number of writes to each way across the sets of the module. It then periodically makes the most frequently written ways in a module unavailable, shifting write pressure to the other ways in the sets of that module. Extensive simulation results show that, on average, for single- and dual-core system configurations, WriteSmoothing improves cache lifetime by 2.17X and 2.75X, respectively. Also, its implementation overhead is small and it works well for a wide range of algorithm and system parameters.
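    The mechanism the abstract describes (per-module write counters plus periodic blocking of the hottest way) can be sketched as follows. The random replacement stand-in, counter layout, and all parameters are simplifications for illustration, not the paper's implementation:

    ```python
    # Toy intra-set wear-leveling in the spirit of WriteSmoothing: sets are
    # grouped into modules, writes are counted per (module, way), and each
    # interval the most-written way of a module is made unavailable so
    # write pressure shifts to the remaining ways.
    import random

    NUM_SETS, NUM_WAYS, SETS_PER_MODULE, INTERVAL = 64, 8, 16, 1000

    num_modules = NUM_SETS // SETS_PER_MODULE
    writes = [[0] * NUM_WAYS for _ in range(num_modules)]  # per-module counters
    unavailable = [None] * num_modules                     # one blocked way each

    def record_write(set_idx, way):
        writes[set_idx // SETS_PER_MODULE][way] += 1

    def pick_way(set_idx):
        """Choose a victim way, skipping the module's blocked way."""
        m = set_idx // SETS_PER_MODULE
        ways = [w for w in range(NUM_WAYS) if w != unavailable[m]]
        return random.choice(ways)   # stand-in for the real replacement policy

    def rotate_unavailable():
        """Block each module's most frequently written way for the next interval."""
        for m in range(num_modules):
            unavailable[m] = max(range(NUM_WAYS), key=lambda w: writes[m][w])

    random.seed(0)
    for t in range(10 * INTERVAL):
        s = random.randrange(NUM_SETS)
        record_write(s, pick_way(s))
        if (t + 1) % INTERVAL == 0:
            rotate_unavailable()

    # spread of write counts within each module after smoothing
    spread = max(max(ws) - min(ws) for ws in writes)
    ```

    A real design would also bound the counter storage and coordinate with the coherence and replacement machinery, which this sketch omits.
    
    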

  2. Distinct genetic signatures for variability in total and free serum thyroxine levels in four sets of recombinant inbred mice.

    PubMed

    McLachlan, Sandra M; Lu, Lu; Aliesky, Holly A; Williams, Robert W; Rapoport, Basil

    2011-03-01

    C3H/He and BALB/c mice have elevated serum thyroxine levels associated with low deiodinase type-1 activity whereas C57BL/6 (B6) mice have low thyroxine levels and elevated deiodinase type-1 activity. High-resolution genetic maps are available for four sets of recombinant inbred (RI) mice derived from B6 parents bred to C3H/He, BALB/c, DBA/2, or A strains. Total and free T4 (T-T4 and F-T4) levels in females from these RI sets (BXH, CXB, BXD, and AXBXA) were analyzed to test two hypotheses: first, serum T4 variability is linked to the deiodinase type-1 gene; second, because of their shared B6 parent, the RI sets will share linkages responsible for T-T4 or F-T4 variability. A number of chromosomes (Chr) and loci were linked to T-T4 (Chr 1, 4, 13, 11) or F-T4 (Chr 1, 6, 13, 18, 19). Linkage between T-T4 and Chr 4 was limited to CXB and BXH strains, but the locus was distinct from the deiodinase type-1 gene. Surprisingly, many linkages were unique providing "genetic signatures" for T-T4 or F-T4 in each set of RI mice. Indeed, the strongest linkage between T-T4 (or F-T4) and a Chr 2 locus (logarithm of the odds scores >4.4) was only observed in AXBXA strains. Some loci corresponded to genes/Chr associated in humans with variable TSH or T-T4 levels. Unlike inbred mice, human populations are extremely diverse. Consequently, our data suggest that the contributions of unique chromosomes/loci controlling T-T4 and F-T4 in distinct human subgroups are likely to be "buried" in genetic analyses of heterogeneous human populations.

  3. Individual and setting level predictors of the implementation of a skin cancer prevention program: a multilevel analysis

    PubMed Central

    2010-01-01

    Background To achieve widespread cancer control, a better understanding is needed of the factors that contribute to successful implementation of effective skin cancer prevention interventions. This study assessed the relative contributions of individual- and setting-level characteristics to implementation of a widely disseminated skin cancer prevention program. Methods A multilevel analysis was conducted using data from the Pool Cool Diffusion Trial from 2004 and replicated with data from 2005. Implementation of Pool Cool by lifeguards was measured using a composite score (implementation variable, range 0 to 10) that assessed whether the lifeguard performed different components of the intervention. Predictors included lifeguard background characteristics, lifeguard sun protection-related attitudes and behaviors, pool characteristics, and enhanced (i.e., more technical assistance, tailored materials, and incentives are provided) versus basic treatment group. Results The mean value of the implementation variable was 4 in both years (2004 and 2005; SD = 2 in 2004 and SD = 3 in 2005) indicating a moderate implementation for most lifeguards. Several individual-level (lifeguard characteristics) and setting-level (pool characteristics and treatment group) factors were found to be significantly associated with implementation of Pool Cool by lifeguards. All three lifeguard-level domains (lifeguard background characteristics, lifeguard sun protection-related attitudes and behaviors) and six pool-level predictors (number of weekly pool visitors, intervention intensity, geographic latitude, pool location, sun safety and/or skin cancer prevention programs, and sun safety programs and policies) were included in the final model. The most important predictors of implementation were the number of weekly pool visitors (inverse association) and enhanced treatment group (positive association). That is, pools with fewer weekly visitors and pools in the enhanced treatment group had

  4. Development of a Software Framework for System-Level Carbon Sequestration Risk Assessment

    SciTech Connect

    Miller, R.

    2013-02-28

    The overall purpose of this project was to identify, evaluate, select, develop, and test a suite of enhancements to the GoldSim software program, in order to make it a better tool for use in support of Carbon Capture and Sequestration (CCS) projects. The GoldSim software is a foundational tool used by scientists at NETL and at other laboratories and research institutions to evaluate system-level risks of proposed CCS projects. The primary product of the project was a series of successively improved versions of the GoldSim software, supported by an extensive User’s Guide. All of the enhancements were tested by scientists at Los Alamos National Laboratory, and several of the enhancements have already been incorporated into the CO{sub 2}-PENS sequestration model.

  5. Selective removal of cesium and strontium using porous frameworks from high level nuclear waste.

    PubMed

    Aguila, Briana; Banerjee, Debasis; Nie, Zimin; Shin, Yongsoon; Ma, Shengqian; Thallapally, Praveen K

    2016-05-01

    Efficient and cost-effective removal of radioactive (137)Cs and (90)Sr found in spent fuel is an important step for safe, long-term storage of nuclear waste. Solid-state materials such as resins and titanosilicate zeolites have been assessed for the removal of Cs and Sr from aqueous solutions, but there is room for improvement in terms of capacity and selectivity. Herein, we report the Cs(+) and Sr(2+) exchange potential of an ultra stable MOF, namely, MIL-101-SO3H, as a function of different contact times, concentrations, pH levels, and in the presence of competing ions. Our preliminary results suggest that MOFs with suitable ion exchange groups can be promising alternate materials for cesium and strontium removal.

  7. Pulmonary Nodule Detection Model Based on SVM and CT Image Feature-Level Fusion with Rough Sets

    PubMed Central

    Lu, Huiling; Zhang, Junjie; Shi, Hongbin

    2016-01-01

    In order to improve the detection accuracy of pulmonary nodules in CT images, and to address two problems of existing pulmonary nodule detection models, namely unreasonable feature structure and loose feature representation, a pulmonary nodule detection algorithm is proposed based on SVM and CT image feature-level fusion with rough sets. Firstly, CT images of pulmonary nodules are analyzed, and 42-dimensional feature components are extracted, including six new 3-dimensional features proposed in this paper together with existing 2-dimensional and 3-dimensional features. Secondly, these features are reduced five times with rough sets based on feature-level fusion. Thirdly, a grid optimization model is used to optimize the kernel function of the support vector machine (SVM), which is used as a classifier to identify pulmonary nodules. Finally, lung CT images of 70 patients with pulmonary nodules are collected as the original samples and used to verify the effectiveness and stability of the proposed model in four groups of comparative experiments. The experimental results show that the effectiveness and stability of the proposed model based on rough set feature-level fusion are improved to some degree. PMID:27722173
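
    The rough-set reduction step described above can be pictured as a greedy reduct search: drop any attribute whose removal leaves the dependency degree (the fraction of samples whose equivalence class is label-pure) unchanged. A minimal pure-Python sketch over a toy decision table (function names and data are illustrative, not from the paper):

```python
from itertools import groupby

def partition(rows, attrs):
    """Group row indices into equivalence classes by their values on attrs."""
    key = lambda i: tuple(rows[i][a] for a in attrs)
    idx = sorted(range(len(rows)), key=key)
    return [set(g) for _, g in groupby(idx, key=key)]

def dependency(rows, labels, attrs):
    """Rough-set dependency degree: fraction of rows in label-pure classes."""
    pos = 0
    for block in partition(rows, attrs):
        if len({labels[i] for i in block}) == 1:
            pos += len(block)
    return pos / len(rows)

def reduct(rows, labels):
    """Greedily drop attributes that do not lower the dependency degree."""
    attrs = list(range(len(rows[0])))
    full = dependency(rows, labels, attrs)
    for a in list(attrs):
        trial = [x for x in attrs if x != a]
        if trial and dependency(rows, labels, trial) >= full:
            attrs = trial
    return attrs
```

    On a toy table the greedy pass keeps only the attributes needed to keep the positive region full; the reduced feature set would then feed the SVM classifier.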

  8. Levels of 8-OxodG Predict Hepatobiliary Pathology in Opisthorchis viverrini Endemic Settings in Thailand.

    PubMed

    Saichua, Prasert; Yakovleva, Anna; Kamamia, Christine; Jariwala, Amar R; Sithithaworn, Jiraporn; Sripa, Banchob; Brindley, Paul J; Laha, Thewarach; Mairiang, Eimorn; Pairojkul, Chawalit; Khuntikeo, Narong; Mulvenna, Jason; Sithithaworn, Paiboon; Bethony, Jeffrey M

    2015-01-01

    Opisthorchis viverrini is distinct among helminth infections as it drives a chronic inflammatory response in the intrahepatic bile duct that progresses from advanced periductal fibrosis (APF) to cholangiocarcinoma (CCA). Extensive research shows that oxidative stress (OS) plays a critical role in the transition from chronic O. viverrini infection to CCA. OS also results in the excision of a modified DNA lesion (8-oxodG) into urine, the levels of which can be detected by immunoassay. Herein, we measured concentrations of urine 8-oxodG by immunoassay from the following four groups in the Khon Kaen Cancer Cohort study: (1) O. viverrini negative individuals, (2) O. viverrini positive individuals with no APF as determined by abdominal ultrasound, (3) O. viverrini positive individuals with APF as determined by abdominal ultrasound, and (4) O. viverrini induced cases of CCA. A logistic regression model was used to evaluate the utility of creatinine-adjusted urinary 8-oxodG among these groups, along with demographic, behavioral, and immunological risk factors. Receiver operating characteristic (ROC) curve analysis was used to evaluate the predictive accuracy of urinary 8-oxodG for APF and CCA. Elevated concentrations of 8-oxodG in urine positively associated with APF and CCA in a strongly dose-dependent manner. Urinary 8-oxodG concentrations also accurately predicted whether an individual presented with APF or CCA compared to O. viverrini infected individuals without these pathologies. In conclusion, urinary 8-oxodG is a robust 'candidate' biomarker of the progression of APF and CCA from chronic opisthorchiasis, which is indicative of the critical role that OS plays in both of these advanced hepatobiliary pathologies. The findings also confirm our previous observations that severe liver pathology occurs early and asymptomatically in residents of O. viverrini endemic regions, where individuals are infected for years (often decades) with this food-borne pathogen. These
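
    The ROC analysis used above has a compact computational core: for a single continuous marker, the AUC equals the probability that a randomly chosen case scores higher than a randomly chosen control (the Mann-Whitney rank-sum identity). A small sketch with invented marker values, not study data:

```python
def auc(case_scores, control_scores):
    """Area under the ROC curve via the Mann-Whitney identity:
    the fraction of case/control pairs ranked correctly (ties count half)."""
    wins = sum((c > n) + 0.5 * (c == n)
               for c in case_scores for n in control_scores)
    return wins / (len(case_scores) * len(control_scores))

# hypothetical creatinine-adjusted urinary 8-oxodG values in two groups
cases = [8.1, 9.4, 7.7, 10.2, 6.9]
controls = [4.2, 5.1, 6.9, 3.8, 5.5]
```

    An AUC near 1 indicates the marker separates the groups almost perfectly; 0.5 is chance-level discrimination.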

  9. Levels of 8-OxodG Predict Hepatobiliary Pathology in Opisthorchis viverrini Endemic Settings in Thailand

    PubMed Central

    Jariwala, Amar R.; Sithithaworn, Jiraporn; Sripa, Banchob; Brindley, Paul J.; Laha, Thewarach; Mairiang, Eimorn; Pairojkul, Chawalit; Khuntikeo, Narong; Mulvenna, Jason; Sithithaworn, Paiboon; Bethony, Jeffrey M.

    2015-01-01

    Opisthorchis viverrini is distinct among helminth infections as it drives a chronic inflammatory response in the intrahepatic bile duct that progresses from advanced periductal fibrosis (APF) to cholangiocarcinoma (CCA). Extensive research shows that oxidative stress (OS) plays a critical role in the transition from chronic O. viverrini infection to CCA. OS also results in the excision of a modified DNA lesion (8-oxodG) into urine, the levels of which can be detected by immunoassay. Herein, we measured concentrations of urine 8-oxodG by immunoassay from the following four groups in the Khon Kaen Cancer Cohort study: (1) O. viverrini negative individuals, (2) O. viverrini positive individuals with no APF as determined by abdominal ultrasound, (3) O. viverrini positive individuals with APF as determined by abdominal ultrasound, and (4) O. viverrini induced cases of CCA. A logistic regression model was used to evaluate the utility of creatinine-adjusted urinary 8-oxodG among these groups, along with demographic, behavioral, and immunological risk factors. Receiver operating characteristic (ROC) curve analysis was used to evaluate the predictive accuracy of urinary 8-oxodG for APF and CCA. Elevated concentrations of 8-oxodG in urine positively associated with APF and CCA in a strongly dose-dependent manner. Urinary 8-oxodG concentrations also accurately predicted whether an individual presented with APF or CCA compared to O. viverrini infected individuals without these pathologies. In conclusion, urinary 8-oxodG is a robust ‘candidate’ biomarker of the progression of APF and CCA from chronic opisthorchiasis, which is indicative of the critical role that OS plays in both of these advanced hepatobiliary pathologies. The findings also confirm our previous observations that severe liver pathology occurs early and asymptomatically in residents of O. viverrini endemic regions, where individuals are infected for years (often decades) with this food-borne pathogen. These

  10. Setting the most robust effluent level under severe uncertainty: application of information-gap decision theory to chemical management.

    PubMed

    Yokomizo, Hiroyuki; Naito, Wataru; Tanaka, Yoshinari; Kamo, Masashi

    2013-11-01

    Decisions in ecological risk management for chemical substances must be made based on incomplete information due to uncertainties. To protect ecosystems from the adverse effects of chemicals, a precautionary approach is often taken. The precautionary approach, which is based on conservative assumptions about the risks of chemical substances, can be applied by selecting management models and data. This approach can provide an adequate margin of safety for ecosystems by reducing exposure to harmful substances, either by reducing the use of target chemicals or by putting in place strict water quality criteria. However, the reduction of chemical use or effluent concentrations typically entails a financial burden, so the cost-effectiveness of the precautionary approach may be low. Hence, we need a formal methodology for chemical risk management that can sufficiently protect ecosystems in a cost-effective way even when information for chemical management is insufficient. Information-gap decision theory provides such a methodology: it determines which action is the most robust to uncertainty by guaranteeing an acceptable outcome under the largest degree of uncertainty, without requiring information about the extent of parameter uncertainty at the outset. In this paper, we illustrate the application of information-gap decision theory to derive a framework for setting effluent limits of pollutants for point sources under uncertainty. Our application incorporates a cost for reduction in pollutant emission and a cost to wildlife species affected by the pollutant. Our framework enables us to settle upon actions to deal with severe uncertainty in the ecological risk management of chemicals.
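
    The robustness function at the heart of info-gap theory can be sketched numerically: for each candidate effluent limit, compute the largest uncertainty horizon alpha under which the worst-case total cost still meets the requirement, then prefer the limit with the largest horizon. The linear cost model and all numbers below are invented for illustration; they are not the paper's model:

```python
def robustness(q, u0, c_max, k_abate, k_damage):
    """Info-gap robustness of effluent limit q: the largest fractional
    error alpha in the damage coefficient under which total cost stays
    within c_max.  Toy model: abatement cost = k_abate * (u0 - q),
    worst-case damage = k_damage * (1 + alpha) * q."""
    abate = k_abate * (u0 - q)
    if abate > c_max:
        return -1.0  # requirement violated even with zero uncertainty
    return (c_max - abate) / (k_damage * q) - 1.0

# choose the limit whose acceptable outcome is most immune to uncertainty
limits = [2.0, 4.0, 6.0, 8.0]
best = max(limits, key=lambda q: robustness(q, u0=10.0, c_max=12.0,
                                            k_abate=1.0, k_damage=1.0))
```

    The decision rule is "maximize robustness", not "maximize best-case outcome", which is what distinguishes info-gap analysis from ordinary optimization.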

  11. Providing a navigable route for acute medicine nurses to advance their practice: a framework of ascending levels of practice.

    PubMed

    Lees-Deutsch, Liz; Christian, Jan; Setchfield, Ian

    2016-01-01

    This article conveys concerns raised by delegates at the International SAM Conference (Manchester, 2015) regarding how to advance nursing practice in acute medicine. It endeavors to capture the essence of 'how to advance practice' and 'how to integrate advanced practice' within the workforce structures of an acute medicine unit (AMU). It addresses the production of tacit knowledge and its recognition and integration into developing the nursing workforce. The current context of NHS efficiencies and recruitment issues emphasizes the value of retaining tacit knowledge. Uniquely, this article offers an early conceptual framework through which levels of advancement and potential transition points for advancing nursing practice in acute medicine are articulated. Determining how to advance requires identification of prior accomplishments such as tacit knowledge, experiential learning, CPD, specialist courses and management experience. This requires nurses to make judicious decisions about advancing their practice, and to distinguish between 'amassing experience' and 'career progression'. The article aims to stimulate thinking around the practicalities of advancement, the value of tacit knowledge and its potential realization through the framework trajectory. PMID:27441313

  12. TDP-43 aggregation mirrors TDP-43 knockdown, affecting the expression levels of a common set of proteins

    PubMed Central

    Prpar Mihevc, S.; Baralle, Marco; Buratti, Emanuele; Rogelj, Boris

    2016-01-01

    TDP-43 protein plays an important role in regulating transcriptional repression, RNA metabolism, and splicing. It typically shuttles between the nucleus and the cytoplasm to perform its functions, while abnormal cytoplasmic aggregation of TDP-43 has been associated with the neurodegenerative diseases amyotrophic lateral sclerosis (ALS) and frontotemporal lobar degeneration (FTLD). For the purpose of this study we selected a set of proteins that were misregulated following silencing of TDP-43 and analysed their expression in a TDP-43-aggregation model cell line, HEK293 Flp-in Flag-TDP-43-12x-Q/N F4L. Following TDP-43 sequestration in insoluble aggregates, we observed higher nuclear levels of EIF4A3 and POLDIP3β, whereas nuclear levels of DNMT3A, HNRNPA3, PABPC1 and POLDIP3α, and cytoplasmic levels of RANBP1, dropped. In addition, immunofluorescence signal intensity quantifications showed increased nuclear expression of HNRNPL and YARS, and downregulation of cytoplasmic DPCD. Furthermore, cytoplasmic levels of the predominantly nuclear protein ALYREF increased. In conclusion, by identifying a common set of proteins that are differentially expressed in a similar manner in these two different conditions, we show that TDP-43 aggregation has an effect comparable to TDP-43 knockdown. PMID:27665936

  13. Inverting Glacial Isostatic Adjustment with Paleo Sea Level Records using Bayesian Framework and Burgers Rheology

    NASA Astrophysics Data System (ADS)

    Caron, L.; Metivier, L.; Greff-Lefftz, M.; Fleitout, L.; Rouby, H.

    2015-12-01

    Glacial Isostatic Adjustment models most often assume a mantle with a viscoelastic Maxwell rheology and a given ice history model. Here we use a Bayesian Markov chain Monte Carlo formalism to invert the global GIA signal simultaneously for the mechanical properties of the mantle and for the volumes of the various ice-sheets, using as starting ice models two distinct previously published ice histories. Burgers as well as Maxwell rheologies are considered. The fitted data consist of 5720 paleo sea level records from the last 35 kyr, with a world-wide distribution. Our ambition is to present not only the best fitting model, but also the range of possible solutions (within the explored space of parameters) with their respective probability of explaining the data, and thus reveal the trade-off effects and range of uncertainty affecting the parameters. Our a posteriori probability maps exhibit in all cases two distinct peaks: both are characterized by an upper mantle viscosity around 5×10^20 Pa s, but one of the peaks features a lower mantle viscosity around 3×10^21 Pa s while the other indicates a lower mantle viscosity of more than 1×10^22 Pa s. The global maximum depends upon the starting ice history and the chosen rheology: the first peak (P1) has the highest probability only in the case with a Maxwell rheology and an ice history based on ICE-5G, while the second peak (P2) is favored when using the ANU-based ice history or a Burgers rheology, and is our preferred solution as it is also consistent with long-term geodynamics and gravity gradient anomalies over Laurentide. P2 is associated with larger volumes for the Laurentian and Fennoscandian ice-sheets and, as a consequence of total ice volume balance, smaller volumes for the Antarctic ice-sheet. This last point interferes with the estimate of present-day ice melting in Antarctica from GRACE data. Finally, we find that P2 with a Burgers rheology favors the existence of a tectosphere, i.e. a viscous sublithospheric layer.
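
    The Bayesian Markov-chain Monte Carlo machinery behind such an inversion can be illustrated in one dimension with a random-walk Metropolis sampler. The real inversion is multivariate (mantle viscosities plus ice volumes) and fits 5720 sea-level records; the linear toy model and data below are invented for illustration:

```python
import math, random

def metropolis(log_post, theta0, n_steps, prop_sd, seed=0):
    """Random-walk Metropolis sampler: propose a Gaussian step, accept
    with probability min(1, posterior ratio), record the chain."""
    rng = random.Random(seed)
    theta, lp = theta0, log_post(theta0)
    chain = []
    for _ in range(n_steps):
        cand = theta + rng.gauss(0.0, prop_sd)
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:
            theta, lp = cand, lp_cand
        chain.append(theta)
    return chain

# toy "sea-level" data generated from y = 2*x with unit noise
random.seed(1)
xs = [i * 0.5 for i in range(40)]
ys = [2.0 * x + random.gauss(0.0, 1.0) for x in xs]

def log_post(theta):
    # flat prior, Gaussian likelihood
    return -0.5 * sum((y - theta * x) ** 2 for x, y in zip(xs, ys))

chain = metropolis(log_post, theta0=0.0, n_steps=5000, prop_sd=0.1)
estimate = sum(chain[1000:]) / len(chain[1000:])  # discard burn-in
```

    The histogram of the retained chain plays the role of the a posteriori probability maps described above; with several parameters, marginal histograms reveal the trade-offs between them.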

  14. Fish welfare assurance system: initial steps to set up an effective tool to safeguard and monitor farmed fish welfare at a company level.

    PubMed

    van de Vis, J W; Poelman, M; Lambooij, E; Bégout, M-L; Pilarczyk, M

    2012-02-01

    The objective was to take a first step in the development of a process-oriented quality assurance (QA) system for monitoring and safeguarding of fish welfare at a company level. A process-oriented approach is focused on preventing hazards and involves establishment of critical steps in a process that requires careful control. The seven principles of the Hazard Analysis Critical Control Points (HACCP) concept were used as a framework to establish the QA system. HACCP is an internationally agreed approach for management of food safety, which was adapted for the purpose of safeguarding and monitoring the welfare of farmed fish. As the main focus of this QA system is farmed fish welfare assurance at a company level, it was named Fish Welfare Assurance System (FWAS). In this paper we present the initial steps of setting up FWAS for on-growing of sea bass (Dicentrarchus labrax), carp (Cyprinus carpio) and European eel (Anguilla anguilla). Four major hazards were selected, which were fish-species dependent. Critical Control Points (CCPs) that need to be controlled to minimize or avoid the four hazards are presented. For FWAS, monitoring of CCPs at a farm level is essential. For monitoring purposes, Operational Welfare Indicators (OWIs) are needed to establish whether critical biotic, abiotic, managerial and environmental factors are controlled. For the OWIs we present critical limits/target values. A critical limit is the maximum or minimum value to which a factor must be controlled at a critical control point to prevent, eliminate or reduce a hazard to an acceptable level. For managerial factors target levels are more appropriate than critical limits. Regarding the international trade of farmed fish products, we propose that FWAS needs to be standardized in aquaculture chains. For this standardization a consensus on the concept of fish welfare, methods to assess welfare objectively and knowledge on the needs of farmed fish are required.
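
    In operational terms, monitoring a CCP against critical limits reduces to a range check per operational welfare indicator. The indicator names and limit values below are hypothetical, invented for illustration (real limits are species-dependent, as the article stresses):

```python
# hypothetical critical limits per OWI: (minimum, maximum); None = unbounded
CRITICAL_LIMITS = {
    "oxygen_mg_per_l": (5.0, None),    # dissolved oxygen must stay above the minimum
    "ammonia_mg_per_l": (None, 0.02),  # un-ionized ammonia must stay below the maximum
}

def check_ccp(readings):
    """Return the operational welfare indicators outside their critical limits."""
    alarms = []
    for name, value in readings.items():
        lo, hi = CRITICAL_LIMITS[name]
        if (lo is not None and value < lo) or (hi is not None and value > hi):
            alarms.append(name)
    return alarms
```

    A farm-level monitoring loop would run such a check per sampling interval and trigger the corrective action attached to each CCP when an alarm is raised.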

  15. Intervening at the Setting Level to Prevent Behavioral Incidents in Residential Child Care: Efficacy of the CARE Program Model.

    PubMed

    Izzo, Charles V; Smith, Elliott G; Holden, Martha J; Norton, Catherine I; Nunno, Michael A; Sellers, Deborah E

    2016-07-01

    The current study examined the impact of a setting-level intervention on the prevention of aggressive or dangerous behavioral incidents involving youth living in group care environments. Eleven group care agencies implemented Children and Residential Experiences (CARE), a principle-based program that helps agencies use a set of evidence-informed principles to guide programming and enrich the relational dynamics throughout the agency. All agencies served mostly youth referred from child welfare. The 3-year implementation of CARE involved intensive agency-wide training and on-site consultation to agency leaders and managers around supporting and facilitating day-to-day application of the principles in both childcare and staff management arenas. Agencies provided data over 48 months on the monthly frequency of behavioral incidents most related to program objectives. Using multiple baseline interrupted time series analysis to assess program effects, we tested whether trends during the program implementation period declined significantly compared to the 12 months before implementation. Results showed significant program effects on incidents involving youth aggression toward adult staff, property destruction, and running away. Effects on aggression toward peers and self-harm were also found but were less consistent. Staff ratings of positive organizational social context (OSC) predicted fewer incidents, but there was no clear relationship between OSC and observed program effects. Findings support the potential efficacy of the CARE model and illustrate that intervening "upstream" at the setting level may help to prevent coercive caregiving patterns and increase opportunities for healthy social interactions.

  16. Investigation of indoor air volatile organic compounds concentration levels in dental settings and some related methodological issues.

    PubMed

    Santarsiero, Anna; Fuselli, Sergio; Piermattei, Alessandro; Morlino, Roberta; De Blasio, Giorgia; De Felice, Marco; Ortolani, Emanuela

    2009-01-01

    The assessment of indoor air volatile organic compound (VOC) concentration levels in dental settings is highly relevant to health because of the potentially substantial occupational exposure to many diverse contaminants. Comparing the indoor VOC profile with the corresponding outdoor concentrations, and uncovering possible correlations between specific dental activities and variations in VOC concentrations, are of utmost importance for reliably characterizing the risk to the health of dentists and dental staff. In this study we review the most relevant environmental studies addressing VOC contamination levels in dental settings. We analyze the methodological problems this kind of study must face, and we report preliminary results of an indoor air investigation carried out at a dental hospital in Italy, the "Ospedale odontoiatrico George Eastman" of Rome, in which general guidelines for the environmental analysis of dental settings are sketched. The aim of this work is to identify, through the analysis of a case study, the kinds of problems a typical enclosed (non-industrial) indoor air investigation has to cope with.

  17. Acute and Chronic Toxicity of Nitrate to Early Life Stages of Zebrafish--Setting Nitrate Safety Levels for Zebrafish Rearing.

    PubMed

    Learmonth, Cândida; Carvalho, António Paulo

    2015-08-01

    Recirculating aquaculture systems (RAS) have been widely used for zebrafish rearing, allowing holding of many thousands of fish at high densities. Water quality in RAS largely depends on biofilters that ultimately convert the extremely toxic ammonia excreted by fish into the much less toxic nitrate. However, when water renewal is minimal in RAS, nitrate can accumulate to high enough levels to negatively impact fish welfare and performance. Therefore, the setting of safety levels of nitrate for zebrafish should be a priority to avoid unwanted effects in both the intensive production of this species and research outputs. The present study aimed to define nitrate safety levels for zebrafish based on acute and chronic toxicity bioassays in early life stages of this species. Acute bioassays revealed ontogenetic changes in response to high nitrate levels. Based on NOEC (no observed effect concentration) values, safety levels should be set at 1450, 1855, and 1075 mg/L NO3(-)-N to prevent acute lethal effects in embryos, newly-hatched larvae, and swim-up larvae, respectively. In the chronic bioassay, larvae were exposed to nitrate concentrations of 50, 100, 200, and 400 mg/L NO3(-)-N during the entire larval period (23 days). No negative effects were observed either on larval performance or condition at concentrations up to 200 mg/L NO3(-)-N. However, at 400 mg/L NO3(-)-N, survival drastically decreased and fish showed reduced growth and evidence of morphological abnormalities. Accordingly, a safety level of 200 mg/L NO3(-)-N is recommended during the larval rearing of zebrafish to prevent negative impacts on juvenile production. PMID:25996778

  19. Efficient model chemistries for peptides. I. General framework and a study of the heterolevel approximation in RHF and MP2 with Pople split-valence basis sets.

    PubMed

    Echenique, Pablo; Alonso, José Luis

    2008-07-15

    We present an exhaustive study of more than 250 ab initio potential energy surfaces (PESs) of the model dipeptide HCO-L-Ala-NH(2). The model chemistries (MCs) investigated are constructed as homo- and heterolevels involving possibly different RHF and MP2 calculations for the geometry and the energy. The basis sets used belong to a sample of 39 representatives from Pople's split-valence families, ranging from the small 3-21G to the large 6-311++G(2df,2pd). The reference PES to which the rest are compared is the MP2/6-311++G(2df,2pd) homolevel, which, as far as we are aware, is the most accurate PES in the literature. All data sets have been analyzed according to a general framework, which can be extended to other complex problems and which captures the nearness concept in the space of MCs. The great number of MCs evaluated has allowed us to significantly explore this space and show that the correlation between accuracy and computational cost of the methods is imperfect, thus justifying a systematic search for the combination of features in an MC that is optimal for dealing with peptides. Regarding the particular MCs studied, the most important conclusion is that the potentially very cost-saving heterolevel approximation is a very efficient one for describing the whole PES of HCO-L-Ala-NH(2). Finally, we show that, although RHF may be used to calculate the geometry if an MP2 single-point energy calculation follows, pure RHF//RHF homolevels are not recommendable for this problem.

  20. Cervical cancer screening in low-resource settings: A cost-effectiveness framework for valuing tradeoffs between test performance and program coverage.

    PubMed

    Campos, Nicole G; Castle, Philip E; Wright, Thomas C; Kim, Jane J

    2015-11-01

    As cervical cancer screening programs are implemented in low-resource settings, protocols are needed to maximize health benefits under operational constraints. Our objective was to develop a framework for examining health and economic tradeoffs between screening test sensitivity, population coverage and follow-up of screen-positive women, to help decision makers identify where program investments yield the greatest value. As an illustrative example, we used an individual-based Monte Carlo simulation model of the natural history of human papillomavirus (HPV) and cervical cancer calibrated to epidemiologic data from Uganda. We assumed once in a lifetime screening at age 35 with two-visit HPV DNA testing or one-visit visual inspection with acetic acid (VIA). We assessed the health and economic tradeoffs that arise between (i) test sensitivity and screening coverage; (ii) test sensitivity and loss to follow-up (LTFU) of screen-positive women; and (iii) test sensitivity, screening coverage and LTFU simultaneously. The decline in health benefits associated with sacrificing HPV DNA test sensitivity by 20% (e.g., shifting from provider- to self-collection of specimens) could be offset by gains in coverage if coverage increased by at least 20%. When LTFU was 10%, two-visit HPV DNA testing with 80-90% sensitivity was more effective and more cost-effective than one-visit VIA with 40% sensitivity and yielded greater health benefits than VIA even as VIA sensitivity increased to 60% and HPV test sensitivity declined to 70%. As LTFU increased, two-visit HPV DNA testing became more costly and less effective than one-visit VIA. Setting-specific data on achievable test sensitivity, coverage, follow-up rates and programmatic costs are needed to guide decision making for cervical cancer screening.
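
    The tradeoff between test sensitivity, coverage and follow-up described above can be made concrete with a simple multiplicative model of expected true positives who complete follow-up. This is only an illustration of the accounting, not the paper's microsimulation; all numbers are invented:

```python
def expected_detections(population, coverage, sensitivity, followup):
    """Expected number of true-positive women screened, detected, and
    retained through follow-up, under a simple multiplicative model."""
    return population * coverage * sensitivity * followup

# provider-collected HPV DNA testing: higher sensitivity, lower reach
base = expected_detections(10_000, coverage=0.50, sensitivity=0.90, followup=0.90)

# self-collection: 20% lower relative sensitivity, 20 percentage points more coverage
self_collection = expected_detections(10_000, coverage=0.70, sensitivity=0.72, followup=0.90)
```

    Under this toy accounting, the sensitivity sacrificed by self-collection is more than offset when coverage rises from 50% to 70% of the population, mirroring the qualitative tradeoff the framework is built to quantify.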

  1. Assessment of serum amyloid A levels in the rehabilitation setting in the Florida manatee (Trichechus manatus latirostris).

    PubMed

    Cray, Carolyn; Dickey, Meranda; Brewer, Leah Brinson; Arheart, Kristopher L

    2013-12-01

    The acute phase protein serum amyloid A (SAA) has been previously shown to have value as a biomarker of inflammation and infection in many species, including manatees (Trichechus manatus latirostris). In the current study, results from an automated assay for SAA were used in a rehabilitation setting. Reference intervals were established from clinically normal manatees using the robust method: 0-46 mg/L. More than 30-fold higher mean SAA levels were observed in manatees suffering from cold stress and boat-related trauma. Poor correlations were observed between SAA and total white blood count, percentage of neutrophils, albumin, and albumin/globulin ratio. A moderate correlation was observed between SAA and the presence of nucleated red blood cells. The sensitivity of SAA testing was 93% and the specificity was 98%, representing the highest combined values of all the analytes. The results indicate that the automated method for SAA quantitation can provide important clinical data for manatees in a rehabilitation setting.

  2. Preliminary analysis of acceleration of sea level rise through the twentieth century using extended tide gauge data sets (August 2014)

    NASA Astrophysics Data System (ADS)

    Hogarth, Peter

    2014-11-01

    This work explores the potential for extending tide gauge time series from the Permanent Service for Mean Sea Level (PSMSL) using historical documents, PSMSL ancillary data, and by developing additional composite time series using near-neighbor tide gauges. The aim was to increase the number, completeness, and geographical extent of records covering most or all of the twentieth century. The number of at least 75% complete century-scale time series has been approximately doubled over the original PSMSL data set. In total, over 4800 station-years have been added, with 294 of these added to 10 long Southern Hemisphere records. Individual century-scale acceleration values derived from this new extended data set tend to converge on a value of 0.01 ± 0.008 mm/yr^2. This result agrees closely with recent work and is statistically significant at the 1 sigma level. Possible causes of acceleration and errors are briefly discussed. Results confirm the importance of current data archeology projects involving digitization of the remaining archives of hard copy tide gauge data for sea level and climate studies.
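
    The century-scale acceleration quoted above is conventionally estimated by fitting a quadratic to a sea-level time series and doubling the second-order coefficient. A sketch on synthetic data (the trend, acceleration and noise values are invented for illustration, not derived from PSMSL records):

```python
import numpy as np

# synthetic century-long annual record: 1.5 mm/yr trend plus
# 0.01 mm/yr^2 acceleration plus 5 mm measurement noise
rng = np.random.default_rng(0)
t = np.arange(1900, 2000)
h = (1.5 * (t - 1950)
     + 0.5 * 0.01 * (t - 1950) ** 2
     + rng.normal(0.0, 5.0, t.size))

# quadratic fit; the acceleration is twice the t^2 coefficient
c2, c1, c0 = np.polyfit(t - 1950, h, 2)
acceleration = 2.0 * c2   # mm/yr^2
```

    Centering the time axis (here at 1950) decorrelates the trend and acceleration terms and keeps the fit numerically well conditioned.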

  3. Parietal blood oxygenation level-dependent response evoked by covert visual search reflects set-size effect in monkeys.

    PubMed

    Atabaki, A; Marciniak, K; Dicke, P W; Karnath, H-O; Thier, P

    2014-03-01

    Distinguishing a target from distractors during visual search is crucial for goal-directed behaviour. The more distractors that are presented with the target, the larger is the subject's error rate. This observation defines the set-size effect in visual search. Neurons in areas related to attention and eye movements, like the lateral intraparietal area (LIP) and frontal eye field (FEF), diminish their firing rates when the number of distractors increases, in line with the behavioural set-size effect. Furthermore, human imaging studies that have tried to delineate cortical areas modulating their blood oxygenation level-dependent (BOLD) response with set size have yielded contradictory results. In order to test whether BOLD imaging of the rhesus monkey cortex yields results consistent with the electrophysiological findings and, moreover, to clarify whether additional cortical regions beyond the two hitherto implicated are involved in this process, we studied monkeys performing a covert visual search task. When varying the number of distractors in the search task, we observed a monotonic increase in error rates when search time was kept constant, as expected if monkeys resorted to a serial search strategy. Visual search consistently evoked robust BOLD activity in the monkey FEF and in a region in the lateral and middle intraparietal sulcus, probably involving area LIP. Whereas the BOLD response in the FEF did not depend on set size, the LIP signal increased in parallel with set size. These results demonstrate the virtue of BOLD imaging in monkeys when trying to delineate cortical areas underlying a cognitive process like visual search. However, they also demonstrate the caution needed when inferring neural activity from BOLD activity.

  4. Low levels of HIV test coverage in clinical settings in the UK: a systematic review of adherence to 2008 guidelines

    PubMed Central

    Elmahdi, Rahma; Gerver, Sarah M; Gomez Guillen, Gabriela; Fidler, Sarah; Cooke, Graham; Ward, Helen

    2014-01-01

    Objectives To quantify the extent to which guideline recommendations for routine testing for HIV are adhered to outside of genitourinary medicine (GUM), sexual health (SH) and antenatal clinics. Methods A systematic review of published data on testing levels following publication of 2008 guidelines was undertaken. Medline, Embase and conference abstracts were searched according to a predefined protocol. We included studies reporting the number of HIV tests administered in those eligible for guideline recommended testing. We excluded reports of testing in settings with established testing surveillance (GUM/SH and antenatal clinics). A random effects meta-analysis was carried out to summarise the level of HIV testing across the studies identified. Results Thirty studies were identified, most of which were retrospective studies or audits of testing practice. Results were heterogeneous. The overall pooled estimate of HIV test coverage was 27.2% (95% CI 22.4% to 32%). Test coverage was marginally higher in patients tested in settings where routine testing is recommended (29.5%) than in those with clinical indicator diseases (22.4%). Provider test offer was found to be lower (40.4%) than patient acceptance of testing (71.5%). Conclusions Adherence to 2008 national guidelines for HIV testing in the UK is poor outside of GUM/SH and antenatal clinics. Low levels of provider test offer appear to be a major contributor to this. Failure to adhere to testing guidelines is likely to be contributing to late diagnosis, with implications for poorer clinical outcomes and continued onward transmission of HIV. Improved surveillance of HIV testing outside of specialist settings may be useful in increasing adherence to testing guidelines. PMID:24412996
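
    The pooled coverage estimate above comes from a random effects meta-analysis. As a rough, generic illustration of how such a pooled proportion is obtained, the sketch below implements DerSimonian-Laird pooling; the study proportions and sample sizes are invented for illustration, not the review's data.

```python
import math

def dersimonian_laird(estimates, variances):
    """Pool study-level estimates with the DerSimonian-Laird
    random-effects model: estimate the between-study variance
    tau^2, then combine with inverse-variance weights."""
    k = len(estimates)
    w = [1.0 / v for v in variances]              # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)            # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, estimates)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se

# hypothetical per-study HIV test coverage (proportion tested) and sizes
props = [0.22, 0.35, 0.18, 0.31, 0.29]
sizes = [120, 300, 85, 210, 150]
variances = [p * (1 - p) / n for p, n in zip(props, sizes)]
pooled, se = dersimonian_laird(props, variances)
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
```

    The random-effects weights 1/(v + tau^2) down-weight no study to zero, which is why heterogeneous audits can still be combined into a single coverage figure with a confidence interval.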

  5. The l1-l2 regularization framework unmasks the hypoxia signature hidden in the transcriptome of a set of heterogeneous neuroblastoma cell lines

    PubMed Central

    Fardin, Paolo; Barla, Annalisa; Mosci, Sofia; Rosasco, Lorenzo; Verri, Alessandro; Varesio, Luigi

    2009-01-01

    Background Gene expression signatures are clusters of genes discriminating different statuses of the cells and their definition is critical for understanding the molecular bases of diseases. The identification of a gene signature is complicated by the high dimensional nature of the data and by the genetic heterogeneity of the responding cells. The l1-l2 regularization is an embedded feature selection technique that fulfills all the desirable properties of a variable selection algorithm and has the potential to generate a specific signature even in biologically complex settings. We studied the application of this algorithm to detect the signature characterizing the transcriptional response of neuroblastoma tumor cell lines to hypoxia, a condition of low oxygen tension that occurs in the tumor microenvironment. Results We determined the gene expression profile of 9 neuroblastoma cell lines cultured under normoxic and hypoxic conditions. We studied a heterogeneous set of neuroblastoma cell lines to mimic the in vivo situation and to test the robustness and validity of the l1-l2 regularization with double optimization. Analysis by hierarchical, spectral, and k-means clustering or a supervised approach based on t-test analysis divided the cell lines on the basis of genetic differences. However, the dominance of this strong transcriptional response completely masked the detection of the more subtle response to hypoxia. Different results were obtained when we applied the l1-l2 regularization framework. The algorithm distinguished the normoxic and hypoxic statuses, defining signatures comprising 3 to 38 probesets, with a leave-one-out error of 17%. A consensus hypoxia signature was established by setting the frequency score at 50% and the correlation parameter ε equal to 100. This signature is composed of 11 probesets representing 8 well characterized genes known to be modulated by hypoxia. Conclusion We demonstrate that l1-l2 regularization outperforms more conventional
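
    The l1-l2 penalty referred to above is, in its simplest form, the elastic-net penalty: an l1 term that drives coefficients to exactly zero (feature selection) plus an l2 term that stabilises the solution. A minimal one-variable proximal-gradient (ISTA) sketch, on made-up data; this illustrates the penalty, not the authors' double-optimization procedure:

```python
def soft_threshold(z, t):
    # proximal operator of t*|x|
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def elastic_net_1d(x, y, lam1, lam2, step=0.01, iters=5000):
    """Minimise 0.5*sum((b*x_i - y_i)^2) + lam1*|b| + 0.5*lam2*b^2
    for a single coefficient b by proximal gradient descent."""
    b = 0.0
    for _ in range(iters):
        grad = sum(xi * (b * xi - yi) for xi, yi in zip(x, y))
        z = b - step * grad                       # gradient step on the loss
        b = soft_threshold(z, step * lam1) / (1.0 + step * lam2)
    return b

x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 3.9, 6.2, 8.1]                          # roughly y = 2x
b_ols = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
b_en = elastic_net_1d(x, y, lam1=1.0, lam2=1.0)
# the combined penalty shrinks the coefficient: b_en < b_ols
```

    In the multivariate case the same proximal update is applied coordinate-wise, and coefficients whose gradient step falls inside the soft-threshold band are set exactly to zero, which is what produces a sparse signature.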

  6. A 3D Level Sets Method for Segmenting the Mouse Spleen and Follicles in Volumetric microCT Images

    SciTech Connect

    Price, Jeffery R; Aykac, Deniz; Wall, Jonathan

    2006-01-01

    We present a semi-automatic, 3D approach for segmenting the mouse spleen, and its interior follicles, in volumetric microCT imagery. Based upon previous 2D level set work, we develop a fully 3D implementation and provide the corresponding finite difference formulas. We incorporate statistical and proximity weighting schemes to improve segmentation performance. We also note an issue with the original algorithm and propose a solution that proves beneficial in our experiments. Experimental results are provided for artificial and real data.

  7. 3-dimensional throat region segmentation from MRI data based on Fourier interpolation and 3-dimensional level set methods.

    PubMed

    Campbell, Sean; Doshi, Trushali; Soraghan, John; Petropoulakis, Lykourgos; Di Caterina, Gaetano; Grose, Derek; MacKenzie, Kenneth

    2015-01-01

    A new algorithm for 3D throat region segmentation from magnetic resonance imaging (MRI) is presented. The proposed algorithm initially pre-processes the MRI data to increase the contrast between the throat region and its surrounding tissues and to reduce artifacts. Isotropic 3D volume is reconstructed using the Fourier interpolation. Furthermore, a cube encompassing the throat region is evolved using level set method to form a smooth 3D boundary of the throat region. The results of the proposed algorithm on real and synthetic MRI data are used to validate the robustness and accuracy of the algorithm.

  8. ComPASS: A Common Framework for Streamlining Multi-Level Planning Systems from Scientist to Observatory/Spacecraft

    NASA Astrophysics Data System (ADS)

    Brooks, T.

    Current efforts to integrate multiple planning and scheduling subsystems into an end-to-end, scientist-observatory/spacecraft, planning system are handcrafted and very resource intensive. In addition, future spacecraft/observatory configurations such as constellations and multi-observatory "campaigns" introduce challenges that require a new approach to planning and scheduling system integration. The Advanced Architectures and Automation Branch of NASA's Goddard Space Flight Center has embarked on a multi-year project entitled ComPASS (Common Planning And Scheduling System) with the goal of producing a common framework for end-to-end collaborative planning and scheduling. ComPASS is intended to streamline automation of the planning process and enable general application of automated collaborative planning technologies. During NASA FY99, the ComPASS project has developed the infrastructure for plan distribution/collaboration, a common plan representation, and interfaces to existing components. The first set of components to be integrated includes an extended version of Goddard's Scientist's Expert Assistant (SEA) science client, ASPEN - a planner/scheduler from the Jet Propulsion Laboratory (JPL), and Satellite Toolkit (STK) from Analytical Graphics, Inc.

  9. Towards people-centred health systems: a multi-level framework for analysing primary health care governance in low- and middle-income countries.

    PubMed

    Abimbola, Seye; Negin, Joel; Jan, Stephen; Martiniuk, Alexandra

    2014-09-01

    Although there is evidence that non-government health system actors can individually or collectively develop practical strategies to address primary health care (PHC) challenges in the community, existing frameworks for analysing health system governance largely focus on the role of governments, and do not sufficiently account for the broad range of contributions to PHC governance. This is important because governments in low- and middle-income countries (LMICs) tend to be weak. We present a multi-level governance framework for use as a thinking guide in analysing PHC governance in LMICs. This framework has previously been used to analyse the governance of common-pool resources such as community fisheries and irrigation systems. We apply the framework to PHC because, like common-pool resources, PHC facilities in LMICs tend to be commonly owned by the community, such that individual and collective action is often required to avoid the 'tragedy of the commons': destruction and degradation of the resource resulting from lack of concern for its continuous supply. In the multi-level framework, PHC governance is conceptualized at three levels, depending on who influences the supply and demand of PHC services in a community and how: operational governance (individuals and providers within the local health market), collective governance (community coalitions) and constitutional governance (governments at different levels and other distant but influential actors). Using the example of PHC governance in Nigeria, we illustrate how the multi-level governance framework offers a people-centred lens on the governance of PHC in LMICs, with a focus on relations among health system actors within and between levels of governance. We demonstrate the potential impact of health system actors functioning at different levels of governance on PHC delivery, and how governance failure at one level can be assuaged by governance at another level. PMID:25274638

  10. Towards people-centred health systems: a multi-level framework for analysing primary health care governance in low- and middle-income countries

    PubMed Central

    Abimbola, Seye; Negin, Joel; Jan, Stephen; Martiniuk, Alexandra

    2014-01-01

    Although there is evidence that non-government health system actors can individually or collectively develop practical strategies to address primary health care (PHC) challenges in the community, existing frameworks for analysing health system governance largely focus on the role of governments, and do not sufficiently account for the broad range of contributions to PHC governance. This is important because governments in low- and middle-income countries (LMICs) tend to be weak. We present a multi-level governance framework for use as a thinking guide in analysing PHC governance in LMICs. This framework has previously been used to analyse the governance of common-pool resources such as community fisheries and irrigation systems. We apply the framework to PHC because, like common-pool resources, PHC facilities in LMICs tend to be commonly owned by the community, such that individual and collective action is often required to avoid the ‘tragedy of the commons’: destruction and degradation of the resource resulting from lack of concern for its continuous supply. In the multi-level framework, PHC governance is conceptualized at three levels, depending on who influences the supply and demand of PHC services in a community and how: operational governance (individuals and providers within the local health market), collective governance (community coalitions) and constitutional governance (governments at different levels and other distant but influential actors). Using the example of PHC governance in Nigeria, we illustrate how the multi-level governance framework offers a people-centred lens on the governance of PHC in LMICs, with a focus on relations among health system actors within and between levels of governance. We demonstrate the potential impact of health system actors functioning at different levels of governance on PHC delivery, and how governance failure at one level can be assuaged by governance at another level. PMID:25274638

  12. Automatic shape-based level set segmentation for needle tracking in 3-D TRUS-guided prostate brachytherapy.

    PubMed

    Yan, Ping; Cheeseborough, John C; Chao, K S Clifford

    2012-09-01

    Prostate brachytherapy is an effective treatment for early prostate cancer. Its success depends critically on correct needle implant positions. We have devised an automatic shape-based level set segmentation tool for needle tracking in 3-D transrectal ultrasound (TRUS) images, which uses shape information and the level set technique to localize the needle position and estimate the endpoint of the needle in real time. The 3-D TRUS images used in the evaluation of our tool were obtained using a 2-D TRUS transducer from Ultrasonix (Richmond, BC, Canada) and a computer-controlled stepper motor system from Thorlabs (Newton, NJ, USA). The accuracy and feedback mechanism were validated using prostate phantoms, with results compared against the 3-D positions of the needles derived from experts' readings. The experts' segmentation of needles from 3-D computed tomography images served as the ground truth in this study. The difference between automatic and expert segmentations is within 0.1 mm for 17 of 19 implanted needles. The mean errors of the automatic segmentations relative to the ground truth are within 0.25 mm. Our automated method allows real-time TRUS-based needle placement within one pixel of manual expert segmentation.

  13. LV wall segmentation using the variational level set method (LSM) with additional shape constraint for oedema quantification

    NASA Astrophysics Data System (ADS)

    Kadir, K.; Gao, H.; Payne, A.; Soraghan, J.; Berry, C.

    2012-10-01

    In this paper an automatic algorithm for left ventricle (LV) wall segmentation and oedema quantification from T2-weighted cardiac magnetic resonance (CMR) images is presented. The extent of myocardial oedema delineates the ischaemic area-at-risk (AAR) after myocardial infarction (MI). Since the AAR can be used to estimate the amount of salvageable myocardium post-MI, oedema imaging has potential clinical utility in the management of acute MI patients. This paper presents a new scheme based on the variational level set method (LSM) with an additional shape constraint for the segmentation of T2-weighted CMR images. In our approach, shape information of the myocardial wall is utilized to introduce a shape feature of the myocardial wall into the variational level set formulation. The performance of the method is tested using real CMR images (12 patients) and the results of the automatic system are compared to manual segmentation. The mean perpendicular distances between the automatic and manual LV wall boundaries are in the range of 1-2 mm. Bland-Altman analysis on LV wall area indicates there is no consistent bias as a function of LV wall area, with a mean bias of -121 mm2 between individual investigator one (IV1) and LSM, and -122 mm2 between individual investigator two (IV2) and LSM. Furthermore, the oedema quantification demonstrates good correlation when compared to an expert, with an average error of 9.3% for 69 slices of short-axis CMR images from 12 patients.
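
    The Bland-Altman figures quoted above (a mean bias between each investigator and the LSM) are computed from paired area measurements: the bias is the mean of the pairwise differences, and the 95% limits of agreement are bias ± 1.96 standard deviations of those differences. A minimal sketch with invented LV wall areas (the numbers are not the study's):

```python
def bland_altman(a, b):
    """Mean bias and 95% limits of agreement between two raters."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical LV wall areas in mm^2: automatic (LSM) vs one investigator
auto_area = [1510.0, 1620.0, 1480.0, 1555.0, 1700.0]
manual_area = [1632.0, 1741.0, 1598.0, 1680.0, 1822.0]
bias, (low, high) = bland_altman(auto_area, manual_area)
```

    A bias that stays roughly constant across the range of areas, as reported in the paper, is what "no consistent bias as a function of LV wall area" means in Bland-Altman terms.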

  14. Level-set reconstruction algorithm for ultrafast limited-angle X-ray computed tomography of two-phase flows

    PubMed Central

    Bieberle, M.; Hampel, U.

    2015-01-01

    Tomographic image reconstruction is based on recovering an object distribution from its projections, which have been acquired from all angular views around the object. If the angular range is limited to less than 180° of parallel projections, typical reconstruction artefacts arise when using standard algorithms. To compensate for this, specialized algorithms using a priori information about the object need to be applied. The application behind this work is ultrafast limited-angle X-ray computed tomography of two-phase flows. Here, only a binary distribution of the two phases needs to be reconstructed, which reduces the complexity of the inverse problem. To solve it, a new reconstruction algorithm (LSR) based on the level-set method is proposed. It includes one force function term accounting for matching the projection data and one incorporating a curvature-dependent smoothing of the phase boundary. The algorithm has been validated using simulated as well as measured projections of known structures, and its performance has been compared to the algebraic reconstruction technique and a binary derivative of it. The validation as well as the application of the level-set reconstruction on a dynamic two-phase flow demonstrated its applicability and its advantages over other reconstruction algorithms. PMID:25939623
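
    The paper benchmarks its level-set reconstruction against the algebraic reconstruction technique (ART) and a binary derivative of it. The toy sketch below shows the idea behind that baseline on a deliberately tiny problem: two orthogonal views (row and column ray sums), additive Kaczmarz-style corrections, and a final threshold to enforce a binary phase distribution. This illustrates the baseline, not the proposed LSR algorithm:

```python
def project(img):
    # horizontal and vertical ray sums (two orthogonal views)
    rows = [sum(r) for r in img]
    cols = [sum(r[j] for r in img) for j in range(len(img[0]))]
    return rows, cols

def art_binary(rows, cols, iters=50):
    """Additive ART on a two-view (row/column sum) problem,
    followed by thresholding to enforce a binary solution."""
    n, m = len(rows), len(cols)
    x = [[0.5] * m for _ in range(n)]
    for _ in range(iters):
        for i in range(n):                     # correct each row ray
            err = (rows[i] - sum(x[i])) / m
            for j in range(m):
                x[i][j] += err
        for j in range(m):                     # correct each column ray
            err = (cols[j] - sum(x[i][j] for i in range(n))) / n
            for i in range(n):
                x[i][j] += err
    return [[1 if v >= 0.5 else 0 for v in row] for row in x]

truth = [[1, 1, 0],
         [1, 0, 0],
         [0, 0, 0]]
rows, cols = project(truth)
recon = art_binary(rows, cols)
```

    With only two views the continuous problem is badly underdetermined; exploiting the binary prior, as both this toy threshold and the paper's level-set formulation do, is what makes limited-angle reconstruction tractable.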

  15. Mask pattern recovery by level set method based inverse inspection technology (IIT) and its application on defect auto disposition

    NASA Astrophysics Data System (ADS)

    Park, Jin-Hyung; Chung, Paul D. H.; Jeon, Chan-Uk; Cho, Han Ku; Pang, Linyong; Peng, Danping; Tolani, Vikram; Cecil, Tom; Kim, David; Baik, KiHo

    2009-10-01

    At the most advanced technology nodes, such as 32nm and 22nm, aggressive OPC and Sub-Resolution Assist Features (SRAFs) are required. However, their use results in significantly increased mask complexity, making mask defect disposition more challenging than ever. This paper describes how mask patterns can first be recovered from the inspection images by applying patented algorithms using Level Set Methods. The mask pattern recovery step is then followed by aerial/wafer image simulation, the results of which can be plugged into an automated mask defect disposition system based on aerial/wafer image. The disposition criteria are primarily based on wafer-plane CD variance. The system also connects to a post-OPC lithography verification tool that can provide gauges and CD specs, thereby enabling them to be used in mask defect disposition as well. Results on both programmed defects and production defects collected at Samsung mask shop are presented to show the accuracy and consistency of using the Level Set Methods and aerial/wafer image based automated mask disposition.

  16. Intervening at the Setting Level to Prevent Behavioral Incidents in Residential Child Care: Efficacy of the CARE Program Model.

    PubMed

    Izzo, Charles V; Smith, Elliott G; Holden, Martha J; Norton, Catherine I; Nunno, Michael A; Sellers, Deborah E

    2016-07-01

    The current study examined the impact of a setting-level intervention on the prevention of aggressive or dangerous behavioral incidents involving youth living in group care environments. Eleven group care agencies implemented Children and Residential Experiences (CARE), a principle-based program that helps agencies use a set of evidence-informed principles to guide programming and enrich the relational dynamics throughout the agency. All agencies served mostly youth referred from child welfare. The 3-year implementation of CARE involved intensive agency-wide training and on-site consultation to agency leaders and managers around supporting and facilitating day-to-day application of the principles in both childcare and staff management arenas. Agencies provided data over 48 months on the monthly frequency of behavioral incidents most related to program objectives. Using multiple baseline interrupted time series analysis to assess program effects, we tested whether trends during the program implementation period declined significantly compared to the 12 months before implementation. Results showed significant program effects on incidents involving youth aggression toward adult staff, property destruction, and running away. Effects on aggression toward peers and self-harm were also found but were less consistent. Staff ratings of positive organizational social context (OSC) predicted fewer incidents, but there was no clear relationship between OSC and observed program effects. Findings support the potential efficacy of the CARE model and illustrate that intervening "upstream" at the setting level may help to prevent coercive caregiving patterns and increase opportunities for healthy social interactions. PMID:27138932

  17. Practical Recommendations for Robot-Assisted Treadmill Therapy (Lokomat) in Children with Cerebral Palsy: Indications, Goal Setting, and Clinical Implementation within the WHO-ICF Framework.

    PubMed

    Aurich-Schuler, Tabea; Warken, Birgit; Graser, Judith V; Ulrich, Thilo; Borggraefe, Ingo; Heinen, Florian; Meyer-Heim, Andreas; van Hedel, Hubertus J A; Schroeder, A Sebastian

    2015-08-01

    Active participation and the highest level of independence during daily living are primary goals in neurorehabilitation. Therefore, standing and walking are key factors in many rehabilitation programs. Despite inconclusive evidence concerning the best application and efficacy of robotic tools in the field of pediatric neurorehabilitation, robotic technologies have been implemented to complement conventional therapies in recent years. A group of experienced therapists and physicians joined in an "expert panel." They compared their clinical application protocols, discussed recurring open questions, and developed experience-based recommendations for robot-assisted treadmill therapy (exemplified by the Lokomat, Hocoma, Volketswil, Switzerland) with a focus on children with cerebral palsy. Specific indications and therapeutic goals were defined considering the severity of motor impairments and the International Classification of Functioning, Disability and Health framework (ICF). After five meetings, consensus was found and recommendations for the implementation of robot-assisted treadmill therapy, including postsurgery rehabilitation, were proposed. This article aims to provide a comprehensive overview of therapeutic applications in a fast-developing field of medicine, where scientific evidence is still scarce. These recommendations can help physicians and therapists to plan the child's individual protocol for robot-assisted treadmill therapy.

  18. Patient- and population-level health consequences of discontinuing antiretroviral therapy in settings with inadequate HIV treatment availability

    PubMed Central

    2012-01-01

    Background In resource-limited settings, HIV budgets are flattening or decreasing. A policy of discontinuing antiretroviral therapy (ART) after HIV treatment failure was modeled to highlight trade-offs among competing policy goals of optimizing individual and population health outcomes. Methods In settings with two available ART regimens, we assessed two strategies: (1) continue ART after second-line failure (Status Quo) and (2) discontinue ART after second-line failure (Alternative). A computer model simulated outcomes for a single cohort of newly detected, HIV-infected individuals. Projections were fed into a population-level model allowing multiple cohorts to compete for ART with constraints on treatment capacity. In the Alternative strategy, discontinuation of second-line ART occurred upon detection of antiretroviral failure, specified by WHO guidelines. Those discontinuing failed ART experienced an increased risk of AIDS-related mortality compared to those continuing ART. Results At the population level, the Alternative strategy increased the mean number initiating ART annually by 1,100 individuals (+18.7%) to 6,980 compared to the Status Quo. More individuals initiating ART under the Alternative strategy increased total life-years by 15,000 (+2.8%) to 555,000, compared to the Status Quo. Although more individuals received treatment under the Alternative strategy, life expectancy for those treated decreased by 0.7 years (−8.0%) to 8.1 years compared to the Status Quo. In a cohort of treated patients only, 600 more individuals (+27.1%) died by 5 years under the Alternative strategy compared to the Status Quo. Results were sensitive to the timing of detection of ART failure, number of ART regimens, and treatment capacity. Although we believe the results robust in the short-term, this analysis reflects settings where HIV case detection occurs late in the disease course and treatment capacity and the incidence of newly detected patients are stable. Conclusions

  19. Numerical simulation of non-viscous liquid pinch off using a coupled level set-boundary integral method

    SciTech Connect

    Garzon, Maria; Sethian, James A.; Gray, Leonard J

    2009-01-01

    Simulations of the pinch off of an inviscid fluid column are carried out based upon a potential flow model with capillary forces. The interface location and the time evolution of the free surface boundary condition are both approximated by means of level set techniques on a fixed domain. The interface velocity is obtained via a Galerkin boundary integral solution of the 3D axisymmetric Laplace equation. A short time analytical solution of the Rayleigh-Taylor instability in a liquid column is available, and this result is compared with our numerical experiments to validate the algorithm. The method is capable of handling pinch-off and post-pinch-off events, and simulations showing the time evolution of the fluid tube are presented.

  20. PCA and level set based non-rigid image registration for MRI and Paxinos-Watson atlas of rat brain

    NASA Astrophysics Data System (ADS)

    Cai, Chao; Liu, Ailing; Ding, Mingyue; Zhou, Chengping

    2007-12-01

    Image registration provides the ability to geometrically align one dataset with another. It is a basic task in a great variety of biomedical imaging applications. This paper introduces a novel three-dimensional registration method for Magnetic Resonance Image (MRI) data and the Paxinos-Watson Atlas of the rat brain. To accommodate a large-range, non-linear deformation between the MRI and the atlas at higher registration accuracy, and building on a segmentation of the rat brain, we used principal component analysis (PCA) to perform the linear registration automatically, followed by a level set based non-linear registration to correct the remaining small distortions. We implemented this registration method in a rat brain 3D reconstruction and analysis system. Experiments have demonstrated that this method can be successfully applied to registering low-resolution, noise-affected MRI with the Paxinos-Watson Atlas of the rat brain.
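
    The PCA step supplies the linear pre-registration: the principal axes of the two segmented datasets give a rotation that coarsely aligns them before any non-linear refinement. A 2D sketch of that idea (the point cloud and the 30-degree misalignment below are invented for illustration):

```python
import math

def principal_angle(points):
    """Orientation of the dominant principal axis of a 2D point
    cloud, from the eigenvector of its 2x2 covariance matrix."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # the doubled principal angle satisfies tan(2t) = 2*sxy/(sxx-syy)
    return 0.5 * math.atan2(2 * sxy, sxx - syy)

def rotate(points, angle):
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

# an elongated "shape": points along the x-axis with small spread in y
template = [(i * 0.1, 0.01 * ((i % 3) - 1)) for i in range(-20, 21)]
target = rotate(template, math.radians(30.0))

# linear pre-registration: rotate target by the principal-angle difference
dtheta = principal_angle(template) - principal_angle(target)
aligned = rotate(target, dtheta)
```

    In 3D the same construction uses the eigenvectors of the 3x3 covariance matrix; centroid differences handle translation, and the residual non-linear deformation is then left for the level set stage.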

  1. Best Practices for Ethical Sharing of Individual-Level Health Research Data From Low- and Middle-Income Settings.

    PubMed

    Bull, Susan; Cheah, Phaik Yeong; Denny, Spencer; Jao, Irene; Marsh, Vicki; Merson, Laura; Shah More, Neena; Nhan, Le Nguyen Thanh; Osrin, David; Tangseefa, Decha; Wassenaar, Douglas; Parker, Michael

    2015-07-01

    Sharing individual-level data from clinical and public health research is increasingly being seen as a core requirement for effective and efficient biomedical research. This article discusses the results of a systematic review and multisite qualitative study of key stakeholders' perspectives on best practices in ethical data sharing in low- and middle-income settings. Our research suggests that for data sharing to be effective and sustainable, multiple social and ethical requirements need to be met. An effective model of data sharing will be one in which considered judgments will need to be made about how best to achieve scientific progress, minimize risks of harm, promote fairness and reciprocity, and build and sustain trust.

  2. Motion of a semi-infinite bubble in a liquid filled channel using the level set method

    NASA Astrophysics Data System (ADS)

    Tolga Akcabay, Deniz; Halpern, David; Grotberg, James B.

    2008-11-01

    The study of plug propagation in lung airways is of interest in the treatment of medical conditions such as asthma and in drug delivery. The problem of a semi-infinite bubble steadily displacing a liquid in a 2D channel (the planar Bretherton problem) is computed using a fractional-step method on a Cartesian grid to solve the Navier-Stokes equations and a level-set formulation to resolve the air-liquid interface. We matched the geometry of the front and rear menisci of this semi-infinite bubble, the stresses on the channel walls, and the maximum pressure drop against the available literature, as functions of the Capillary number, the ratio of viscous to surface tension effects. Furthermore, we present preliminary results for flows within tapered walls to address area expansion near airway bifurcations.
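
    The Capillary number that parameterises these results is simply Ca = mu*U/sigma. A small sketch with illustrative air-water values (not the study's parameters); at low Ca the deposited wall film is thin, following Bretherton's classical h/R ~ 1.34*Ca^(2/3) scaling:

```python
def capillary_number(viscosity, velocity, surface_tension):
    """Ca = mu * U / sigma: viscous forces over surface tension."""
    return viscosity * velocity / surface_tension

# illustrative air-water values (assumed, not from the abstract)
mu = 1.0e-3      # Pa*s, water viscosity
sigma = 0.07     # N/m, air-water surface tension
u_tip = 0.01     # m/s, bubble tip speed
ca = capillary_number(mu, u_tip, sigma)

# Bretherton's low-Ca estimate of the relative film thickness
film_fraction = 1.34 * ca ** (2.0 / 3.0)
```

    Sweeping Ca in a computation like the one described above traces out how meniscus shape, wall stress, and pressure drop shift as viscous effects grow relative to surface tension.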

  3. The daily events and emotions of master's-level family therapy trainees in off-campus practicum settings.

    PubMed

    Edwards, Todd M; Patterson, Jo Ellen

    2012-10-01

    The Day Reconstruction Method (DRM) was used to assess the daily events and emotions of one program's master's-level family therapy trainees in off-campus practicum settings. This study examines the DRM reports of 35 family therapy trainees in the second year of their master's program in marriage and family therapy. Four themes emerged from the results: (i) Personal contact with peers-in-training engenders the most positive emotions during practicum; (ii) Trainees experience more positive emotions during therapy with families and couples in comparison with therapy with individuals; (iii) Positive affect increases over the course of a student's practicum year; and (iv) Trainees experience less positive affect in individual supervision in comparison with most other training activities. Flow theory offers guidance for supervisors helping trainees face developmental challenges of clinical training.

  4. Best Practices for Ethical Sharing of Individual-Level Health Research Data From Low- and Middle-Income Settings

    PubMed Central

    Cheah, Phaik Yeong; Denny, Spencer; Jao, Irene; Marsh, Vicki; Merson, Laura; Shah More, Neena; Nhan, Le Nguyen Thanh; Osrin, David; Tangseefa, Decha; Wassenaar, Douglas; Parker, Michael

    2015-01-01

    Sharing individual-level data from clinical and public health research is increasingly being seen as a core requirement for effective and efficient biomedical research. This article discusses the results of a systematic review and multisite qualitative study of key stakeholders’ perspectives on best practices in ethical data sharing in low- and middle-income settings. Our research suggests that for data sharing to be effective and sustainable, multiple social and ethical requirements need to be met. An effective model of data sharing will be one in which considered judgments will need to be made about how best to achieve scientific progress, minimize risks of harm, promote fairness and reciprocity, and build and sustain trust. PMID:26297751

  5. Neuronal nuclei localization in 3D using level set and watershed segmentation from laser scanning microscopy images

    NASA Astrophysics Data System (ADS)

    Zhu, Yingxuan; Olson, Eric; Subramanian, Arun; Feiglin, David; Varshney, Pramod K.; Krol, Andrzej

    2008-03-01

    Abnormalities of the number and location of cells are hallmarks of both developmental and degenerative neurological diseases. However, standard stereological methods are impractical for assigning each cell's nucleus position within a large volume of brain tissue. We propose an automated approach for segmentation and localization of the brain cell nuclei in laser scanning microscopy (LSM) embryonic mouse brain images. The nuclei in these images are first segmented by using the level set (LS) and watershed methods in each optical plane. The segmentation results are further refined using information from adjacent optical planes and prior knowledge of nuclear shape. Segmentation is then followed by an algorithm for 3D localization of the centroid of the nucleus (CN). Each volume of tissue is thus represented by a collection of centroids, leading to an approximately 10,000-fold reduction in the data set size as compared to the original image series. Our method has been tested on LSM images obtained from an embryonic mouse brain, and compared to the segmentation and CN localization performed by an expert. The average Euclidean distance between locations of CNs obtained using our method and those obtained by an expert is 1.58 ± 1.24 µm, a value well within the ~5 µm average radius of each nucleus. We conclude that our approach accurately segments and localizes CNs within cell-dense embryonic tissue.
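
The centroid-localization step above lends itself to a compact sketch. This is a hedged illustration (not the authors' pipeline), assuming the segmented nuclei are available as a binary mask whose connected components are individual nuclei:

```python
import numpy as np
from scipy import ndimage

def nucleus_centroids(binary_mask):
    """Label connected components (one per nucleus) and return their
    centroids as an (n, 3) array of (z, y, x) coordinates."""
    labels, n = ndimage.label(binary_mask)
    return np.array(ndimage.center_of_mass(binary_mask, labels, range(1, n + 1)))

# Toy volume with two cube-shaped "nuclei"
vol = np.zeros((10, 10, 10), dtype=bool)
vol[1:3, 1:3, 1:3] = True
vol[6:9, 6:9, 6:9] = True
centroids = nucleus_centroids(vol)
```

Replacing each segmented nucleus by three coordinates is what yields the large data-set reduction reported above.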

  6. Assessment of serum amyloid A levels in the rehabilitation setting in the Florida manatee (Trichechus manatus latirostris).

    PubMed

    Cray, Carolyn; Dickey, Meranda; Brewer, Leah Brinson; Arheart, Kristopher L

    2013-12-01

    The acute phase protein serum amyloid A (SAA) has been previously shown to have value as a biomarker of inflammation and infection in many species, including manatees (Trichechus manatus latirostris). In the current study, results from an automated assay for SAA were used in a rehabilitation setting. Reference intervals were established from clinically normal manatees using the robust method: 0-46 mg/L. More than 30-fold higher mean SAA levels were observed in manatees suffering from cold stress and boat-related trauma. Poor correlations were observed between SAA and total white blood cell count, percentage of neutrophils, albumin, and albumin/globulin ratio. A moderate correlation was observed between SAA and the presence of nucleated red blood cells. The sensitivity of SAA testing was 93% and the specificity was 98%, representing the highest combined values of all the analytes. The results indicate that the automated method for SAA quantitation can provide important clinical data for manatees in a rehabilitation setting. PMID:24450049
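
The reported sensitivity and specificity follow the standard 2×2 definitions; the counts below are hypothetical, chosen only to reproduce the quoted 93%/98% figures (the study's actual counts are not given in the abstract):

```python
# Hypothetical confusion counts (not from the study) for illustration.
tp, fn = 93, 7   # diseased animals above / below the SAA cutoff
tn, fp = 98, 2   # clinically normal animals below / above the cutoff

sensitivity = tp / (tp + fn)  # fraction of true cases detected
specificity = tn / (tn + fp)  # fraction of normals correctly excluded
```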

  7. Calculation of contact angles at triple phase boundary in solid oxide fuel cell anode using the level set method

    SciTech Connect

    Sun, Xiaojun; Hasegawa, Yosuke; Kohno, Haruhiko; Jiao, Zhenjun; Hayakawa, Koji; Okita, Kohei; Shikazono, Naoki

    2014-10-15

    A level set method is applied to characterize the three-dimensional structures of nickel, yttria stabilized zirconia and pore phases in a solid oxide fuel cell anode reconstructed by focused ion beam-scanning electron microscopy. A numerical algorithm is developed to evaluate the contact angles at the triple phase boundary based on interfacial normal vectors, which can be calculated from the signed distance functions defined for each of the three phases. Furthermore, the surface tension force is estimated from the contact angles by assuming interfacial force balance at the triple phase boundary. The average contact angle values of nickel, yttria stabilized zirconia and pore are found to be 143°–156°, 83°–138° and 82°–123°, respectively. The mean contact angles remained nearly unchanged after 100-hour operation. However, the contact angles just after reduction differ for cells with different sintering temperatures. In addition, the standard deviations of the contact angles are very large, especially for the yttria stabilized zirconia and pore phases. The surface tension forces calculated from the mean contact angles were close to experimental values found in the literature. Slight increases in the surface tensions of nickel/pore and nickel/yttria stabilized zirconia were observed after operation. The present data are expected to be used not only for understanding the degradation mechanism, but also for quantitative prediction of the microstructural temporal evolution of the solid oxide fuel cell anode. - Highlights: • A level set method is applied to characterize the 3D structures of an SOFC anode. • A numerical algorithm is developed to evaluate the contact angles at the TPB. • Surface tension force is estimated from the contact angles. • The average contact angle values are found to be 143°–156°, 83°–138° and 82°–123°. • The data are expected to aid understanding of degradation and prediction of the evolution of the SOFC anode.
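
The normal-vector construction described above can be sketched as follows. This is a minimal, assumption-laden illustration (not the authors' code): each phase is represented by a signed distance function, the interface normal is the normalized gradient, and the angle between two phases follows from the dot product of their normals.

```python
import numpy as np

def unit_normal(phi):
    """Interface normal field of a signed distance function phi."""
    g = np.stack(np.gradient(phi))
    return g / np.linalg.norm(g, axis=0)

def contact_angle_deg(phi_a, phi_b, idx):
    """Angle (degrees) between the normals of two phases at grid point idx."""
    na = unit_normal(phi_a)[:, idx[0], idx[1]]
    nb = unit_normal(phi_b)[:, idx[0], idx[1]]
    return np.degrees(np.arccos(np.clip(na @ nb, -1.0, 1.0)))

# Toy example: two planar interfaces meeting at right angles
y, x = np.mgrid[0:20, 0:20].astype(float)
phi_a = x - 10.0   # signed distance to a vertical plane, normal along +x
phi_b = y - 10.0   # signed distance to a horizontal plane, normal along +y
theta = contact_angle_deg(phi_a, phi_b, (10, 10))
```

On real FIB-SEM reconstructions the same dot-product formula would be evaluated at voxels adjacent to the triple phase boundary.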

  8. LV wall segmentation using the variational level set method (LSM) with additional shape constraint for oedema quantification.

    PubMed

    Kadir, K; Gao, H; Payne, A; Soraghan, J; Berry, C

    2012-10-01

    In this paper an automatic algorithm for left ventricle (LV) wall segmentation and oedema quantification from T2-weighted cardiac magnetic resonance (CMR) images is presented. The extent of myocardial oedema delineates the ischaemic area-at-risk (AAR) after myocardial infarction (MI). Since the AAR can be used to estimate the amount of salvageable myocardium post-MI, oedema imaging has potential clinical utility in the management of acute MI patients. This paper presents a new scheme based on the variational level set method (LSM) with an additional shape constraint for the segmentation of T2-weighted CMR images. In our approach, shape information of the myocardial wall is utilized to introduce a shape feature of the myocardial wall into the variational level set formulation. The performance of the method is tested using real CMR images (12 patients) and the results of the automatic system are compared to manual segmentation. The mean perpendicular distances between the automatic and manual LV wall boundaries are in the range of 1-2 mm. Bland-Altman analysis on LV wall area indicates there is no consistent bias as a function of LV wall area, with a mean bias of -121 mm² between individual investigator one (IV1) and the LSM, and -122 mm² between individual investigator two (IV2) and the LSM. Furthermore, the oedema quantification demonstrates good correlation when compared to an expert, with an average error of 9.3% for 69 short-axis CMR slices from 12 patients.

  9. A level set method for image segmentation in the presence of intensity inhomogeneities with application to MRI.

    PubMed

    Li, Chunming; Huang, Rui; Ding, Zhaohua; Gatenby, J Chris; Metaxas, Dimitris N; Gore, John C

    2011-07-01

    Intensity inhomogeneity often occurs in real-world images, which presents a considerable challenge in image segmentation. The most widely used image segmentation algorithms are region-based and typically rely on the homogeneity of the image intensities in the regions of interest, which often fail to provide accurate segmentation results due to the intensity inhomogeneity. This paper proposes a novel region-based method for image segmentation, which is able to deal with intensity inhomogeneities in the segmentation. First, based on the model of images with intensity inhomogeneities, we derive a local intensity clustering property of the image intensities, and define a local clustering criterion function for the image intensities in a neighborhood of each point. This local clustering criterion function is then integrated with respect to the neighborhood center to give a global criterion of image segmentation. In a level set formulation, this criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, by minimizing this energy, our method is able to simultaneously segment the image and estimate the bias field, and the estimated bias field can be used for intensity inhomogeneity correction (or bias correction). Our method has been validated on synthetic images and real images of various modalities, with desirable performance in the presence of intensity inhomogeneities. Experiments show that our method is more robust to initialization, faster and more accurate than the well-known piecewise smooth model. As an application, our method has been used for segmentation and bias correction of magnetic resonance (MR) images with promising results. PMID:21518662
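
A toy illustration of the multiplicative image model underlying this approach (observed intensity = bias field × true intensity, plus noise). The joint energy minimization itself is omitted, and the numbers are arbitrary:

```python
import numpy as np

y, x = np.mgrid[0:32, 0:32].astype(float)
true = np.where(x < 16, 50.0, 150.0)   # piecewise-constant two-region image
bias = 1.0 + 0.5 * y / 31.0            # slowly varying multiplicative bias field
observed = bias * true                 # intensity-inhomogeneous image

# Once a bias field estimate is available, correction is division:
corrected = observed / bias
```

In the paper the bias field is estimated jointly with the segmentation by minimizing the clustering energy; here it is known by construction, so correction recovers the true image exactly.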

  10. A hybrid semi-automatic method for liver segmentation based on level-set methods using multiple seed points.

    PubMed

    Yang, Xiaopeng; Yu, Hee Chul; Choi, Younggeun; Lee, Wonsup; Wang, Baojian; Yang, Jaedo; Hwang, Hongpil; Kim, Ji Hyun; Song, Jisoo; Cho, Baik Hwan; You, Heecheon

    2014-01-01

    The present study developed a hybrid semi-automatic method to extract the liver from abdominal computerized tomography (CT) images. The proposed hybrid method consists of a customized fast-marching level-set method for detection of an optimal initial liver region from multiple seed points selected by the user, and a threshold-based level-set method for extraction of the actual liver region based on the initial liver region. The performance of the hybrid method was compared with that of the 2D region growing method implemented in OsiriX using abdominal CT datasets of 15 patients. The hybrid method showed a significantly higher accuracy in liver extraction (similarity index, SI = 97.6 ± 0.5%; false positive error, FPE = 2.2 ± 0.7%; false negative error, FNE = 2.5 ± 0.8%; average symmetric surface distance, ASD = 1.4 ± 0.5 mm) than the 2D region growing method (SI = 94.0 ± 1.9%; FPE = 5.3 ± 1.1%; FNE = 6.5 ± 3.7%; ASD = 6.7 ± 3.8 mm). The total liver extraction time per CT dataset of the hybrid method (77 ± 10 s) is significantly less than that of the 2D region growing method (575 ± 136 s). The interaction time per CT dataset between the user and the computer for the hybrid method (28 ± 4 s) is significantly shorter than that of the 2D region growing method (484 ± 126 s). The proposed hybrid method is therefore preferred for liver segmentation in preoperative virtual liver surgery planning.
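
The overlap metrics quoted above can be computed from binary masks as sketched below, using common definitions (the paper's exact formulas may differ slightly):

```python
import numpy as np

def overlap_metrics(auto, ref):
    """Common overlap measures, in percent:
       SI  = 2|A∩R| / (|A|+|R|)  (Dice similarity index)
       FPE = |A\\R| / |R|         (false positive error)
       FNE = |R\\A| / |R|         (false negative error)"""
    auto, ref = auto.astype(bool), ref.astype(bool)
    inter = np.logical_and(auto, ref).sum()
    si = 200.0 * inter / (auto.sum() + ref.sum())
    fpe = 100.0 * np.logical_and(auto, ~ref).sum() / ref.sum()
    fne = 100.0 * np.logical_and(~auto, ref).sum() / ref.sum()
    return si, fpe, fne

# Toy masks: a reference square and an automatic result shifted by one column
ref = np.zeros((10, 10), dtype=bool); ref[2:8, 2:8] = True
auto = np.zeros((10, 10), dtype=bool); auto[2:8, 3:9] = True
si, fpe, fne = overlap_metrics(auto, ref)
```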

  11. Linking English-Language Test Scores onto the Common European Framework of Reference: An Application of Standard-Setting Methodology. TOEFL iBT Research Report TOEFL iBt-06. ETS RR-08-34

    ERIC Educational Resources Information Center

    Tannenbaum, Richard J.; Wylie, E. Caroline

    2008-01-01

    The Common European Framework of Reference (CEFR) describes language proficiency in reading, writing, speaking, and listening on a 6-level scale. In this study, English-language experts from across Europe linked CEFR levels to scores on three tests: the TOEFL® iBT test, the TOEIC® assessment, and the TOEIC "Bridge"™ test.…

  12. Dynamics of shigellosis epidemics: estimating individual-level transmission and reporting rates from national epidemiologic data sets.

    PubMed

    Joh, Richard I; Hoekstra, Robert M; Barzilay, Ezra J; Bowen, Anna; Mintz, Eric D; Weiss, Howard; Weitz, Joshua S

    2013-10-15

    Shigellosis, a diarrheal disease, is endemic worldwide and is responsible for approximately 15,000 laboratory-confirmed cases in the United States every year. However, patients with shigellosis often do not seek medical care. To estimate the burden of shigellosis, we extended time-series susceptible-infected-recovered models to infer epidemiologic parameters from underreported case data. We applied the time-series susceptible-infected-recovered-based inference schemes to analyze the largest surveillance data set of Shigella sonnei in the United States from 1967 to 2007 with county-level resolution. The dynamics of shigellosis transmission show strong annual and multiyear cycles, as well as seasonality. By using the schemes, we inferred individual-level parameters of shigellosis infection, including seasonal transmissibilities and basic reproductive number (R0). In addition, this study provides quantitative estimates of the reporting rate, suggesting that the shigellosis burden in the United States may be more than 10 times the number of laboratory-confirmed cases. Although the estimated reporting rate is generally under 20%, and R0 is generally under 1.5, there is a strong negative correlation between estimates of the reporting rate and R0. Such negative correlations are likely to pose identifiability problems in underreported diseases. We discuss complementary approaches that might further disentangle the true reporting rate and R0. PMID:24008913
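
The role of the reporting rate can be illustrated with a bare-bones discrete-time SIR sketch (not the authors' TSIR inference code): only a fraction `rho` of new infections appears in the surveillance counts, so the true burden scales as the observed count divided by `rho`.

```python
import numpy as np

def sir_observed(beta, gamma, rho, s0, i0, n, steps):
    """Discrete-time SIR; returns the reported (under-counted) incidence."""
    s, i = float(s0), float(i0)
    observed = []
    for _ in range(steps):
        new_inf = beta * s * i / n    # new infections this step
        recoveries = gamma * i
        s -= new_inf
        i += new_inf - recoveries
        observed.append(rho * new_inf)  # only a fraction rho is reported
    return np.array(observed)

cases_low = sir_observed(beta=0.5, gamma=0.25, rho=0.1,
                         s0=9990, i0=10, n=10000, steps=50)
cases_full = sir_observed(beta=0.5, gamma=0.25, rho=1.0,
                          s0=9990, i0=10, n=10000, steps=50)
```

With a 10% reporting rate the observed series is one tenth of the true incidence, mirroring the "more than 10 times" burden estimate above.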

  13. Determinants of symptom profile and severity of conduct disorder in a tertiary level pediatric care set up: A pilot study

    PubMed Central

    Jayaprakash, R.; Rajamohanan, K.; Anil, P.

    2014-01-01

    Background: Conduct disorders (CDs) are one of the most common causes for referral to child and adolescent mental health centers. CD varies in its environmental factors, symptom profile, severity, co-morbidity, and functional impairment. Aims: The aim was to analyze the determinants of symptom profile and severity among childhood- and adolescent-onset CD. Settings and Design: Clinic-based study of 60 consecutive children between 6 and 18 years of age satisfying International Classification of Diseases-10 Diagnostic Criteria for Research guidelines for CD, attending the behavioral pediatrics unit outpatient clinic. Materials and Methods: Family psychopathology, symptom severity, and functional level were assessed using the parent interview schedule, the revised behavioral problem checklist, and the Children's Global Assessment Scale. Statistical Analysis: The correlation and predictive power of the variables were analyzed using SPSS version 16.0. Results: There was significant male dominance (88.3%), with a boy:girl ratio of 7.5:1. The most common comorbidity was hyperkinetic disorder (45%). The childhood-onset group was predominant (70%). The prevalence of comorbidity was higher in the early-onset group (66.7%) than in the late-onset group (33.3%). Family psychopathology, symptom severity, and functional impairment were significantly higher in the childhood-onset group. Conclusion: The determinants of symptom profile and severity are age of onset (childhood-onset CD), the nature and extent of family psychopathology, the prevalence and type of comorbidity, and the nature of the symptom profile itself. Family psychopathology is positively correlated with symptom severity and negatively correlated with the functional level of children with CD. Symptom severity was negatively correlated with the functional level of the child with CD. PMID:25568472

  14. The application of language-game theory to the analysis of science learning: Developing an interpretive classroom-level learning framework

    NASA Astrophysics Data System (ADS)

    Ahmadibasir, Mohammad

    In this study an interpretive learning framework that aims to measure learning on the classroom level is introduced. In order to develop and evaluate the value of the framework, a theoretical/empirical study is designed. The researcher attempted to illustrate how the proposed framework provides insights into the problem of classroom-level learning. The framework is developed by construction of connections between the current literature on science learning and Wittgenstein's language-game theory. In this framework learning is defined as change of classroom language-game or discourse. In the proposed framework, learning is measured by analysis of classroom discourse. The empirical explanatory power of the framework is evaluated by applying the framework in the analysis of learning in a fifth-grade science classroom. The researcher attempted to analyze how students' colloquial discourse changed to a discourse that bears more resemblance to science discourse. The results of the empirical part of the investigation are presented in three parts: first, the gap between what students did and what they were supposed to do was reported. The gap showed that students during the classroom inquiry wanted to do simple comparisons by direct observation, while they were supposed to do tool-assisted observation and procedural manipulation for a complete comparison. Second, it was illustrated that the first attempt to connect the colloquial to science discourse was done by what was immediately intelligible to students and then the teacher negotiated with students in order to help them to connect the old to the new language-game more purposefully. The researcher suggested that these two events in the science classroom are critical in discourse change. Third, it was illustrated that through the academic year, the way that students did the act of comparison improved and by the end of the year more accurate causal inferences were observable in classroom communication. At the end of the…

  15. Are Providers More Likely to Contribute to Healthcare Disparities Under High Levels of Cognitive Load? How Features of the Healthcare Setting May Lead to Biases in Medical Decision Making

    PubMed Central

    Burgess, Diana J.

    2014-01-01

    Systematic reviews of healthcare disparities suggest that clinicians’ diagnostic and therapeutic decision making varies by clinically irrelevant characteristics, such as patient race, and that this variation may contribute to healthcare disparities. However, there is little understanding of the particular features of the healthcare setting under which clinicians are most likely to be inappropriately influenced by these characteristics. This study delineates several hypotheses to stimulate future research in this area. It is posited that healthcare settings in which providers experience high levels of cognitive load will increase the likelihood of racial disparities via 2 pathways. First, providers who experience higher levels of cognitive load are hypothesized to make poorer medical decisions and provide poorer care for all patients, due to lower levels of controlled processing (H1). Second, under greater levels of cognitive load, it is hypothesized that healthcare providers’ medical decisions and interpersonal behaviors will be more likely to be influenced by racial stereotypes, leading to poorer processes and outcomes of care for racial minority patients (H2). It is further hypothesized that certain characteristics of healthcare settings will result in higher levels of cognitive load experienced by providers (H3). Finally, it is hypothesized that minority patients will be disproportionately likely to be treated in healthcare settings in which providers experience greater levels of cognitive load (H4a), which will result in racial disparities due to lower levels of controlled processing by providers (H4b) and the influence of racial stereotypes (H4c). The study concludes with implications for research and practice that flow from this framework. PMID:19726783

  16. A Research Study Using the Delphi Method to Define Essential Competencies for a High School Game Art and Design Course Framework at the National Level

    ERIC Educational Resources Information Center

    Mack, Nayo Corenus-Geneva

    2011-01-01

    This research study reports the findings of a Delphi study conducted to determine the essential competencies and objectives for a high school Game Art and Design course framework at the national level. The Delphi panel consisted of gaming, industry and educational experts from all over the world who were members of the International Game…

  17. Exploring facilitators and barriers to individual and organizational level capacity building: outcomes of participation in a community priority setting workshop.

    PubMed

    Flaman, Laura M; Nykiforuk, Candace I J; Plotnikoff, Ronald C; Raine, Kim

    2010-06-01

    This article explores facilitators and barriers to individual and organizational capacity to address priority strategies for community-level chronic disease prevention. Interviews were conducted with a group of participants who previously participated in a community priority-setting workshop held in two Alberta communities. The goal of the workshop was to bring together key community stakeholders to collaboratively identify action strategies for preventing chronic diseases in their communities. While capacity building was not the specific aim of the workshop, it could be considered an unintended byproduct of bringing together community representatives around a specific issue. One purpose of this study was to examine the participants' capacity to take action on the priority strategies identified at the workshop. Eleven one-on-one semi-structured interviews were conducted with workshop participants to examine facilitators and barriers to individual and organizational level capacity building. Findings suggest that there were several barriers identified by participants that limited their capacity to take action on the workshop strategies, specifically: (i) organizations' lack of priorities or competing priorities; (ii) priorities secondary to the organizational mandate; (iii) disconnect between organizational and community priorities; (iv) disconnect between community organization priorities; (v) disconnect between organizations and government/funder priorities; (vi) limited resources (i.e. time, money and personnel); and, (vii) bigger community issues. The primary facilitator of individual capacity to take action on priority strategies was supportive organizations. Recognition of these elements will allow practitioners, organizations, governments/funders, and communities to focus on seeking ways to improve capacity for chronic disease prevention. PMID:20587629

  18. SparCLeS: dynamic l₁ sparse classifiers with level sets for robust beard/moustache detection and segmentation.

    PubMed

    Le, T Hoang Ngan; Luu, Khoa; Savvides, Marios

    2013-08-01

    Robust facial hair detection and segmentation is a highly valued soft biometric attribute for carrying out forensic facial analysis. In this paper, we propose a novel and fully automatic system, called SparCLeS, for beard/moustache detection and segmentation in challenging facial images. SparCLeS uses the multiscale self-quotient (MSQ) algorithm to preprocess facial images and deal with illumination variation. Histogram of oriented gradients (HOG) features are extracted from the preprocessed images and a dynamic sparse classifier is built using these features to classify a facial region as either containing skin or facial hair. A level set based approach, which makes use of the advantages of both global and local information, is then used to segment the regions of a face containing facial hair. Experimental results demonstrate the effectiveness of our proposed system in detecting and segmenting facial hair regions in images drawn from three databases, i.e., the NIST Multiple Biometric Grand Challenge (MBGC) still face database, the NIST Color Facial Recognition Technology FERET database, and the Labeled Faces in the Wild (LFW) database.

  19. Correction to ``Extracting Man-Made Objects From High Spatial Resolution Remote Sensing Images via Fast Level Set Evolutions''

    NASA Astrophysics Data System (ADS)

    Li, Zhongbin; Shi, Wenzhong; Wang, Qunming; Miao, Zelang

    2015-10-01

    Object extraction from remote sensing images has long been an intensive research topic in the field of surveying and mapping. Most existing methods are devoted to handling just one type of object, and little attention has been paid to improving the computational efficiency. In recent years, level set evolution (LSE) has been shown to be very promising for object extraction in the image processing and computer vision communities because it can handle topological changes automatically while achieving high accuracy. However, the application of state-of-the-art LSEs is compromised by laborious parameter tuning and expensive computation. In this paper, we propose two fast LSEs for man-made object extraction from high spatial resolution remote sensing images. The traditional mean curvature-based regularization term is replaced by a Gaussian kernel, and this substitution is mathematically sound. A larger time step can therefore be used in the numerical scheme to expedite the proposed LSEs. In contrast to existing methods, the proposed LSEs are significantly faster. Most importantly, they involve far fewer parameters while achieving better performance. The advantages of the proposed LSEs over other state-of-the-art approaches have been verified by a range of experiments.
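
The Gaussian-kernel regularization idea can be sketched as below (an assumed minimal form, not the authors' exact scheme): after each advection step the level-set function is smoothed with a Gaussian filter instead of evolving a mean-curvature term, which tolerates a larger time step.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def evolve(phi, speed, dt, sigma, iters):
    """Advect a level-set function by a scalar speed field, then regularize
    it with a Gaussian kernel instead of a curvature term."""
    for _ in range(iters):
        gy, gx = np.gradient(phi)
        grad_mag = np.sqrt(gx**2 + gy**2)
        phi = phi + dt * speed * grad_mag   # advection step
        phi = gaussian_filter(phi, sigma)   # Gaussian-kernel regularization
    return phi

# Toy run: a circular zero level set expanding under a uniform outward speed
y, x = np.mgrid[0:64, 0:64].astype(float)
phi0 = np.sqrt((x - 32)**2 + (y - 32)**2) - 10.0
phi1 = evolve(phi0, speed=-1.0, dt=1.0, sigma=1.0, iters=5)
```

The interior region (phi < 0) grows while the Gaussian filtering keeps the function smooth, illustrating why the scheme remains stable at a larger dt.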

  20. Novel level-set based segmentation method of the lung at HRCT images of diffuse interstitial lung disease (DILD)

    NASA Astrophysics Data System (ADS)

    Lee, Jeongjin; Seo, Joon Beom; Kim, Namkug; Park, Sang Ok; Lee, Ho; Shin, Yeong Gil; Kim, Soo-Hong

    2009-02-01

    In this paper, we propose an algorithm for reliable segmentation of the lung at HRCT of DILD. Our method consists of four main steps. First, the airway and colon are segmented and excluded by thresholding (-974 HU) and connected component analysis. Second, the initial lung is identified by thresholding (-474 HU). Third, shape propagation outward from the lung is performed on the initial lung; the actual lung boundaries lie inside the propagated boundaries. Finally, a subsequent shape-modeling level set evolving inward from the propagated boundary identifies the lung boundary when the curvature term is highly weighted. To assess the accuracy of the proposed algorithm, the segmentation results of 54 patients are compared with those of manual segmentation done by an expert radiologist. The value of 1 minus the volumetric overlap is less than 5%. The accurate results of our method would be useful in determining the lung parenchyma at HRCT, which is the essential step for the automatic classification and quantification of diffuse interstitial lung disease.
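
The first two steps can be sketched directly from the thresholds given above (a hedged illustration; the shape-propagation and inward level-set refinement steps are omitted):

```python
import numpy as np
from scipy import ndimage

def initial_lung_mask(hu, air_thresh=-974, lung_thresh=-474):
    """Threshold a HU image, exclude airway-like voxels, and keep the
    largest connected component as the initial lung candidate."""
    airway = hu < air_thresh           # step 1: airway/colon candidates
    body = hu < lung_thresh
    candidate = body & ~airway         # step 2: initial lung voxels
    labels, n = ndimage.label(candidate)
    if n == 0:
        return candidate
    sizes = ndimage.sum(candidate, labels, range(1, n + 1))
    return labels == (1 + int(np.argmax(sizes)))  # largest component

# Toy "CT": background 0 HU, a lung region at -600 HU, an airway at -1000 HU
hu = np.zeros((32, 32))
hu[8:24, 8:24] = -600.0
hu[14:18, 14:18] = -1000.0
mask = initial_lung_mask(hu)
```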

  1. Automatic optimal filament segmentation with sub-pixel accuracy using generalized linear models and B-spline level-sets.

    PubMed

    Xiao, Xun; Geyer, Veikko F; Bowne-Anderson, Hugo; Howard, Jonathon; Sbalzarini, Ivo F

    2016-08-01

    Biological filaments, such as actin filaments, microtubules, and cilia, are often imaged using different light-microscopy techniques. Reconstructing the filament curve from the acquired images constitutes the filament segmentation problem. Since filaments have lower dimensionality than the image itself, there is an inherent trade-off between tracing the filament with sub-pixel accuracy and avoiding noise artifacts. Here, we present a globally optimal filament segmentation method based on B-spline vector level-sets and a generalized linear model for the pixel intensity statistics. We show that the resulting optimization problem is convex and can hence be solved with global optimality. We introduce a simple and efficient algorithm to compute such optimal filament segmentations, and provide an open-source implementation as an ImageJ/Fiji plugin. We further derive an information-theoretic lower bound on the filament segmentation error, quantifying how well an algorithm could possibly do given the information in the image. We show that our algorithm asymptotically reaches this bound in the spline coefficients. We validate our method in comprehensive benchmarks, compare with other methods, and show applications from fluorescence, phase-contrast, and dark-field microscopy. PMID:27104582

  2. On the use of the resting potential and level set methods for identifying ischemic heart disease: An inverse problem

    NASA Astrophysics Data System (ADS)

    Nielsen, Bjørn Fredrik; Lysaker, Marius; Tveito, Aslak

    2007-01-01

    The electrical activity in the heart is modeled by a complex, nonlinear, fully coupled system of differential equations. Several scientists have studied how this model, referred to as the bidomain model, can be modified to incorporate the effect of heart infarctions on simulated ECG (electrocardiogram) recordings. We are concerned with the associated inverse problem: how can we use ECG recordings and mathematical models to identify the position, size and shape of heart infarctions? Due to the extreme CPU effort needed to solve the bidomain equations, this model, in its full complexity, is not well suited for this kind of problem. In this paper we show how biological knowledge about the resting potential in the heart and level set techniques can be combined to derive a suitable stationary model, expressed in terms of an elliptic PDE, for such applications. This approach leads to a nonlinear ill-posed minimization problem, which we propose to regularize and solve with a simple iterative scheme. Finally, our theoretical findings are illuminated through a series of computer simulations for an experimental setup involving a realistic heart-in-torso geometry. More specifically, experiments with synthetic ECG recordings, produced by solving the bidomain model, indicate that our method manages to identify the physical characteristics of the ischemic region(s) in the heart. Furthermore, the ill-posed nature of this inverse problem is explored, i.e. several quantitative issues of our scheme are examined.

  3. Mechanical behavior of pathological and normal red blood cells in microvascular flow based on modified level-set method

    NASA Astrophysics Data System (ADS)

    Zhang, XiWen; Ma, FangChao; Hao, PengFei; Yao, ZhaoHui

    2016-01-01

    Research on the motion and deformation of red blood cells (RBCs) is important for revealing the mechanism of blood diseases. A numerical method has been developed with a level set formulation for an elastic membrane immersed in an incompressible fluid. The numerical model conserves mass and energy without the leakage problems of the classical Immersed Boundary Method (IBM), while the computational grid can be much smaller than in comparable studies. The motion and deformation of a red blood cell (both pathological and normal) in microvascular flow are simulated. It is found that the Reynolds number and the membrane's stiffness play an important role in the deformation and oscillation of the elastic membrane. The normal biconcave shape of the RBC is more conducive to large deformation than the pathological shapes. With reduced viscosity of the interior fluid, both the velocity of the blood and the deformability of the cell decreased; the same was observed with increased viscosity of the plasma. Tank treading of the RBC membrane is observed at sufficiently low viscosity contrast in shear flow. The fixed inclination angle of the tank-treading cell depends on the shear ratio and viscosity contrast, which compares well with experimental observations.

  4. Using computerised patient-level costing data for setting DRG weights: the Victorian (Australia) cost weight studies.

    PubMed

    Jackson, T

    2001-05-01

    Casemix-funding systems for hospital inpatient care require a set of resource weights which will not inadvertently distort patterns of patient care. Few health systems have very good sources of cost information, and specific studies to derive empirical cost relativities are themselves costly. This paper reports a five-year program of research into the use of data from hospital management information systems (clinical costing systems) to estimate resource relativities for inpatient hospital care used in Victoria's DRG-based payment system. The paper briefly describes international approaches to cost weight estimation. It describes the architecture of clinical costing systems, and contrasts process and job costing approaches to cost estimation. Techniques of data validation and reliability testing developed in the conduct of four of the first five Victorian Cost Weight Studies (1993-1998) are described. Improvements in sampling, data validity, and reliability over the course of the research program are documented, and the advantages of patient-level data are highlighted. The usefulness of these byproduct data for estimation of relative resource weights and other policy applications may be an important factor in hospital and health system decisions to invest in clinical costing technology.

  5. Automatic optimal filament segmentation with sub-pixel accuracy using generalized linear models and B-spline level-sets.

    PubMed

    Xiao, Xun; Geyer, Veikko F; Bowne-Anderson, Hugo; Howard, Jonathon; Sbalzarini, Ivo F

    2016-08-01

    Biological filaments, such as actin filaments, microtubules, and cilia, are often imaged using different light-microscopy techniques. Reconstructing the filament curve from the acquired images constitutes the filament segmentation problem. Since filaments have lower dimensionality than the image itself, there is an inherent trade-off between tracing the filament with sub-pixel accuracy and avoiding noise artifacts. Here, we present a globally optimal filament segmentation method based on B-spline vector level-sets and a generalized linear model for the pixel intensity statistics. We show that the resulting optimization problem is convex and can hence be solved with global optimality. We introduce a simple and efficient algorithm to compute such optimal filament segmentations, and provide an open-source implementation as an ImageJ/Fiji plugin. We further derive an information-theoretic lower bound on the filament segmentation error, quantifying how well an algorithm could possibly do given the information in the image. We show that our algorithm asymptotically reaches this bound in the spline coefficients. We validate our method in comprehensive benchmarks, compare with other methods, and show applications from fluorescence, phase-contrast, and dark-field microscopy. PMID:27104582
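    The sub-pixel aspect can be made concrete: a filament centerline represented as a uniform cubic B-spline is evaluated at continuous parameters, so recovered positions are not tied to the pixel grid. This minimal evaluator uses the standard uniform cubic B-spline basis matrix and is only a sketch, not the authors' implementation.

```python
# Uniform cubic B-spline basis matrix; rows act on the monomials [u^3, u^2, u, 1].
M = [[-1, 3, -3, 1], [3, -6, 3, 0], [-3, 0, 3, 0], [1, 4, 1, 0]]

def bspline_point(ctrl, t):
    """Evaluate an open uniform cubic B-spline curve of 2D control points at parameter t."""
    seg = min(max(int(t), 0), len(ctrl) - 4)   # which span of 4 control points
    u = t - seg                                 # local parameter in [0, 1)
    powers = [u**3, u**2, u, 1.0]
    w = [sum(powers[i] * M[i][j] for i in range(4)) / 6.0 for j in range(4)]
    pts = ctrl[seg:seg + 4]
    return tuple(sum(w[k] * p[d] for k, p in enumerate(pts)) for d in range(2))
```

    Because the weights sum to one and the spline has linear precision, collinear equally spaced control points reproduce a straight filament exactly at any fractional parameter.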

  6. Developmental screening tools: feasibility of use at primary healthcare level in low- and middle-income settings.

    PubMed

    Fischer, Vinicius Jobim; Morris, Jodi; Martines, José

    2014-06-01

    An estimated 150 million children have a disability. Early identification of developmental disabilities is a high priority for the World Health Organization, to allow action to reduce impairments through its Mental Health Gap Action Programme. This study assessed the feasibility of using developmental screening and monitoring tools for children aged 0-3 year(s) by non-specialist primary healthcare providers in low-resource settings. A systematic review of the literature was conducted to identify the tools, assess their psychometric properties, and examine the feasibility of their use in low- and middle-income countries (LMICs). Key indicators for examining feasibility in LMICs were derived from a consultation with 23 international experts. We identified 426 studies, from which 14 tools used in LMICs were extracted for further examination. Three tools reported adequate psychometric properties and met most of the feasibility criteria, and these three appear promising for identifying and monitoring young children with disabilities at primary healthcare level in LMICs. Further research and development are needed to optimize these tools. PMID:25076668

  7. Developmental Screening Tools: Feasibility of Use at Primary Healthcare Level in Low- and Middle-income Settings

    PubMed Central

    Morris, Jodi; Martines, José

    2014-01-01

    An estimated 150 million children have a disability. Early identification of developmental disabilities is a high priority for the World Health Organization, to allow action to reduce impairments through its Mental Health Gap Action Programme. This study assessed the feasibility of using developmental screening and monitoring tools for children aged 0-3 year(s) by non-specialist primary healthcare providers in low-resource settings. A systematic review of the literature was conducted to identify the tools, assess their psychometric properties, and examine the feasibility of their use in low- and middle-income countries (LMICs). Key indicators for examining feasibility in LMICs were derived from a consultation with 23 international experts. We identified 426 studies, from which 14 tools used in LMICs were extracted for further examination. Three tools reported adequate psychometric properties and met most of the feasibility criteria, and these three appear promising for identifying and monitoring young children with disabilities at primary healthcare level in LMICs. Further research and development are needed to optimize these tools. PMID:25076668

  8. Sedimentary framework of the southern Maine inner continental shelf: Influence of glaciation and sea-level change

    USGS Publications Warehouse

    Kelley, J.T.; Belknap, D.F.; Shipp, R.C.

    1989-01-01

    Although the tidally influenced shoreline of Maine is longer than that of virtually any other state, almost no research on its geology has been published. In order to go some way towards remedying this, 1500 km of high-resolution seismic reflection data and 800 km of sidescan sonar imagery have been collected. On the basis of these data and observations made during ten submersible dives, more than 800 bottom samples were collected and evaluated for texture and composition. The understanding of the sedimentary framework of the southern Maine shelf and the processes that maintain it is summarized, and future research directions to evaluate the strategic mineral potential are indicated. In the past 14,000 years, the Maine shelf has experienced a deglaciation and two marine transgressions separated by a regression. The deglaciation was accompanied by the first transgression and deposited till interbedded with up to 40 m of glaciomarine sediment (the Presumpscot Formation) across the shelf. The first transgression culminated about 12,500 yrs B.P., and its landward limit is marked by large glaciomarine deltas 50-100 km landward of the present-day coast. Sea level fell until about 9500 yrs B.P., when shorelines were cut at about 65 m depth and some large "lowstand deltas" were deposited. Sea level has risen since then, and in the general absence of modern river sediment input, marine processes have reworked the older sediment. Five shelf environments have been defined in terms of their surficial sediment and stratigraphy. Nearshore ramps are sandy regions extending to about 30 m deep offshore of sandy beaches. These may be reworked lowstand deltas, and possess the thickest bodies of sand in the region. Nearshore basins are mud-filled troughs seaward of coastal areas lacking significant river input. Slumping glaciomarine deposits provide most of the Holocene mud that floors these basins. Rocky zones are extensive areas of exposed rock most common in the 30-50 m depth

  9. Gender Mainstreaming in Education at the Level of Field Operations: The Case of CARE USA's Indicator Framework

    ERIC Educational Resources Information Center

    Miske, Shirley; Meagher, Margaret; DeJaeghere, Joan

    2010-01-01

    Following the adoption of gender mainstreaming at the Beijing Conference for Women in 1995 as a major strategy to promote gender equality and the recognition of gender analysis as central to this process, Gender and Development (GAD) frameworks have provided tools for gender analysis in various sectors. Gender mainstreaming in basic education has…

  10. The resilience activation framework: a conceptual model of how access to social resources promotes adaptation and rapid recovery in post-disaster settings.

    PubMed

    Abramson, David M; Grattan, Lynn M; Mayer, Brian; Colten, Craig E; Arosemena, Farah A; Bedimo-Rung, Ariane; Lichtveld, Maureen

    2015-01-01

    A number of governmental agencies have called for enhancing citizens' resilience as a means of preparing populations in advance of disasters, and as a counterbalance to social and individual vulnerabilities. This increasing scholarly, policy, and programmatic interest in promoting individual and communal resilience presents a challenge to the research and practice communities: to develop a translational framework that can accommodate multidisciplinary scientific perspectives into a single, applied model. The Resilience Activation Framework provides a basis for testing how access to social resources, such as formal and informal social support and help, promotes positive adaptation or reduced psychopathology among individuals and communities exposed to the acute collective stressors associated with disasters, whether human-made, natural, or technological in origin. Articulating the mechanisms by which access to social resources activates and sustains resilience capacities for optimal mental health outcomes post-disaster can lead to the development of effective preventive and early intervention programs.

  11. The Resilience Activation Framework: A conceptual model of how access to social resources promotes adaptation and rapid recovery in post-disaster settings

    PubMed Central

    Abramson, David M.; Grattan, Lynn M.; Mayer, Brian; Colten, Craig E.; Arosemena, Farah A.; Rung, Ariane; Lichtveld, Maureen

    2014-01-01

    A number of governmental agencies have called for enhancing citizens' resilience as a means of preparing populations in advance of disasters, and as a counterbalance to social and individual vulnerabilities. This increasing scholarly, policy, and programmatic interest in promoting individual and communal resilience presents a challenge to the research and practice communities: to develop a translational framework that can accommodate multidisciplinary scientific perspectives into a single, applied model. The Resilience Activation Framework provides a basis for testing how access to social resources, such as formal and informal social support and help, promotes positive adaptation or reduced psychopathology among individuals and communities exposed to the acute collective stressors associated with disasters, whether manmade, natural, or technological in origin. Articulating the mechanisms by which access to social resources activates and sustains resilience capacities for optimal mental health outcomes post-disaster can lead to the development of effective preventive and early intervention programs. PMID:24870399

  12. A Simple Model Framework to Explore the Deeply Uncertain, Local Sea Level Response to Climate Change. A Case Study on New Orleans, Louisiana

    NASA Astrophysics Data System (ADS)

    Bakker, Alexander; Louchard, Domitille; Keller, Klaus

    2016-04-01

    Sea-level rise threatens many coastal areas around the world. The integrated assessment of potential adaptation and mitigation strategies requires a sound understanding of the upper tails and the major drivers of the uncertainties. Global warming causes sea level to rise, primarily due to thermal expansion of the oceans and mass loss of the major ice sheets, smaller ice caps, and glaciers. These components show distinctly different responses to temperature changes with respect to response time, threshold behavior, and local fingerprints. Projections of these different components are deeply uncertain. Projected uncertainty ranges strongly depend on (necessary) pragmatic choices and assumptions, e.g. on the applied climate scenarios, on which processes to include and how to parameterize them, and on the error structure of the observations. Competing assumptions are very hard to weigh objectively. Hence, uncertainties of the sea-level response are hard to capture in a single distribution function. The deep uncertainty can be better understood by making the key assumptions explicit. Here we demonstrate this approach using a relatively simple model framework. We present a mechanistically motivated but simple model framework that is intended to efficiently explore the deeply uncertain sea-level response to anthropogenic climate change. The model consists of 'building blocks' that represent the major components of the sea-level response and its uncertainties, including threshold behavior. The framework's simplicity enables the simulation of large ensembles, allowing for an efficient exploration of parameter uncertainty and for the simulation of multiple combined adaptation and mitigation strategies. The model framework can skilfully reproduce earlier major sea-level assessments, and due to its modular setup it can also be easily utilized to explore high-end scenarios and the effect of competing assumptions and parameterizations.
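    A 'building block' of the kind described can be sketched as a toy semi-empirical component: the rate of rise tracks the temperature anomaly, with an optional irreversible threshold term mimicking ice-sheet instability. All parameter names, forms, and values below are assumptions for illustration, not the paper's model.

```python
def sealevel_component(temps, alpha, t0, threshold=None, extra_rate=0.0, dt=1.0):
    """Integrate dS/dt = alpha*(T - t0); add extra_rate once T first exceeds threshold."""
    s, tripped, out = 0.0, False, []
    for t in temps:
        if threshold is not None and t >= threshold:
            tripped = True  # threshold behavior: irreversible once crossed
        s += (alpha * (t - t0) + (extra_rate if tripped else 0.0)) * dt
        out.append(s)
    return out
```

    Because each component is a cheap function of a temperature trajectory, a large ensemble over (alpha, t0, threshold) can be run to map out the upper tail of the combined response.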

  13. From papers to practices: district level priority setting processes and criteria for family planning, maternal, newborn and child health interventions in Tanzania

    PubMed Central

    2011-01-01

    Background Successful priority setting is increasingly known to be an important aspect in achieving better family planning, maternal, newborn and child health (FMNCH) outcomes in developing countries. However, far too little attention has been paid to capturing and analysing the priority setting processes and criteria for FMNCH at district level. This paper seeks to capture and analyse the priority setting processes and criteria for FMNCH at district level in Tanzania. Specifically, we assess the FMNCH actors' engagement and understanding, the criteria used in decision making and the way criteria are identified, and the information or evidence and tools used to prioritize FMNCH interventions at district level in Tanzania. Methods We conducted an exploratory study combining qualitative and quantitative methods to capture and analyse the priority setting for FMNCH at district level and identify the criteria for priority setting. We purposively sampled the participants to be included in the study. We collected the data using the nominal group technique (NGT), in-depth interviews (IDIs) with key informants, and documentary review. We analysed the collected data using content analysis for the qualitative data and correlation analysis for the quantitative data. Results We found a number of shortfalls in the districts' priority setting processes and criteria which may lead to inefficient and unfair priority setting decisions in FMNCH. In addition, participants identified the priority setting criteria and established the perceived relative importance of the identified criteria. However, we noted that differences exist in judging the relative importance attached to the criteria by different stakeholders in the districts. Conclusions In Tanzania, FMNCH contents in both general development policies and sector policies are well articulated. 
However, the current priority setting processes for FMNCH at district level are wanting in several respects, rendering the priority setting process for

  14. Self-Compassion: A Mentorship Framework for Counselor Educator Mothers

    ERIC Educational Resources Information Center

    Solomon, Coralis; Barden, Sejal Mehta

    2016-01-01

    Counselor educators experience high levels of stress. Mothers in academia face an additional set of emotional stressors. The authors offer a self-compassion framework for mentors to increase emotional resilience of mothers in counselor education.

  15. The role of the basis set and the level of quantum mechanical theory in the prediction of the structure and reactivity of cisplatin.

    PubMed

    Paschoal, Diego; Marcial, Bruna L; Lopes, Juliana Fedoce; De Almeida, Wagner B; Dos Santos, Hélio F

    2012-11-01

    In this article, we conducted an extensive ab initio study of the importance of the level of theory and the basis set for theoretical predictions of the structure and reactivity of cisplatin [cis-diamminedichloroplatinum(II) (cDDP)]. Initially, the role of the basis set for the Pt atom was assessed using 24 different basis sets, including three all-electron basis sets (ABS). In addition, a modified all-electron double zeta polarized basis set (mDZP) was proposed by adding a set of diffuse d functions onto the existing DZP basis set. The energy barrier and the rate constant for the first chloride/water ligand-exchange process, namely the aquation reaction, were taken as benchmarks for which reliable experimental data are available. At the B3LYP/mDZP/6-31+G(d) level (the first basis set is for Pt and the last is for all of the light atoms), the energy barrier was 22.8 kcal mol(-1), in agreement with the average experimental value of 22.9 ± 0.4 kcal mol(-1). For the other accessible ABS (DZP and ADZP), the corresponding values were 15.4 and 24.5 kcal mol(-1), respectively. The ADZP and mDZP results are notably similar, underscoring the importance of diffuse d functions for the prediction of the kinetic properties of cDDP. We also analyze the ligand basis set and the level of theory effects by considering 36 basis sets at distinct levels of theory, namely Hartree-Fock, MP2, and several DFT functionals. From a survey of the data, we recommend the mPW1PW91/mDZP/6-31+G(d) or B3PW91/mDZP/6-31+G(d) levels to describe the structure and reactivity of cDDP and its small derivatives. Conversely, for large molecules containing a cisplatin motif (for example, the cDDP-DNA complex), the lower levels B3LYP/LANL2DZ/6-31+G(d) and B3LYP/SBKJC-VDZ/6-31+G(d) are suggested. At these levels of theory, the predicted energy barrier was 26.0 and 25.9 kcal mol(-1), respectively, only 13% higher than the actual value.
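    The reported barriers map onto rate constants through the standard Eyring equation, k = (kB·T/h)·exp(-ΔG‡/RT). The generic helper below (not from the article) shows how sensitive the rate is to the roughly 1.6 kcal mol(-1) spread between the mDZP and ADZP barriers.

```python
import math

# Physical constants: Boltzmann (J/K), Planck (J*s), gas constant in kcal/(mol*K).
KB, H, R = 1.380649e-23, 6.62607015e-34, 1.987204e-3

def eyring_rate(dg_kcal_per_mol, temp_k=298.15):
    """First-order rate constant (1/s) from a free-energy barrier in kcal/mol."""
    return (KB * temp_k / H) * math.exp(-dg_kcal_per_mol / (R * temp_k))
```

    At room temperature a 1.6 kcal mol(-1) difference in the barrier changes the predicted rate by roughly a factor of fifteen, which is why the basis-set choice matters so much for kinetics.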

  16. Design of a pseudo-log image transform hardware accelerator in a high-level synthesis-based memory management framework

    NASA Astrophysics Data System (ADS)

    Butt, Shahzad Ahmad; Mancini, Stéphane; Rousseau, Frédéric; Lavagno, Luciano

    2014-09-01

    The pseudo-log image transform belongs to a class of image processing kernels that generate memory references which are nonlinear functions of loop indices. Due to the nonlinearity of the memory references, the usual design methodologies do not allow efficient hardware implementation for nonlinear kernels. For optimized hardware implementation, these kernels require the creation of a customized memory hierarchy and efficient data/memory management strategy. We present the design and real-time hardware implementation of a pseudo-log image transform IP (hardware image processing engine) using a memory management framework. The framework generates a controller which efficiently manages input data movement in the form of tiles between off-chip main memory, on-chip memory, and the core processing unit. The framework can jointly optimize the memory hierarchy and the tile computation schedule to reduce on-chip memory requirements, to maximize throughput, and to increase data reuse for reducing off-chip memory bandwidth requirements. The algorithmic C++ description of the pseudo-log kernel is profiled in the framework to generate an enhanced description with a customized memory hierarchy. The enhanced description of the kernel is then used for high-level synthesis (HLS) to perform architectural design space exploration in order to find an optimal implementation under given performance constraints. The optimized register transfer level implementation of the IP generated after HLS is used for performance estimation. The performance estimation is done in a simulation framework to characterize the IP with different external off-chip memory latencies and a variety of data transfer policies. Experimental results show that the designed IP can be used for real-time implementation and that the generated memory hierarchy is capable of feeding the IP with a sufficiently high bandwidth even in the presence of long external memory latencies.
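    The payoff of a tile-based memory hierarchy for such nonlinear kernels can be illustrated with a toy log-polar-like mapping (an assumption here, not the paper's exact transform): counting raw pixel accesses versus distinct input tiles touched gives a rough measure of the reuse an on-chip tile cache can exploit.

```python
import math

def tiles_touched(width, height, tile=16):
    """Count raw input accesses vs. distinct input tiles for a log-polar-like mapping."""
    accesses, unique = 0, set()
    cx, cy = width / 2, height / 2
    for v in range(height):
        for u in range(width):
            r = math.exp(u / width * math.log(max(cx, cy))) - 1  # pseudo-log radius
            a = 2 * math.pi * v / height                          # angle from row index
            x = min(max(int(cx + r * math.cos(a)), 0), width - 1)
            y = min(max(int(cy + r * math.sin(a)), 0), height - 1)
            accesses += 1
            unique.add((x // tile, y // tile))
    return accesses, len(unique)
```

    The large gap between total accesses and unique tiles is what a generated tile-fetch controller turns into reduced off-chip bandwidth.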

  17. Robust Systems Test Framework

    2003-01-01

    The Robust Systems Test Framework (RSTF) provides a means of specifying and running test programs on various computation platforms. RSTF provides a level of specification above standard scripting languages. During a set of runs, standard timing information is collected. The RSTF specification can also gather job-specific information, and can include ways to classify test outcomes. All results and scripts can be stored into and retrieved from an SQL database for later data analysis. RSTF also provides operations for managing the script and result files, and for compiling applications and gathering compilation information such as optimization flags.

  18. Robust Systems Test Framework

    SciTech Connect

    Ballance, Robert A.

    2003-01-01

    The Robust Systems Test Framework (RSTF) provides a means of specifying and running test programs on various computation platforms. RSTF provides a level of specification above standard scripting languages. During a set of runs, standard timing information is collected. The RSTF specification can also gather job-specific information, and can include ways to classify test outcomes. All results and scripts can be stored into and retrieved from an SQL database for later data analysis. RSTF also provides operations for managing the script and result files, and for compiling applications and gathering compilation information such as optimization flags.

  19. The relative noise levels of parallel axis gear sets with various contact ratios and gear tooth forms

    NASA Technical Reports Server (NTRS)

    Drago, Raymond J.; Lenski, Joseph W., Jr.; Spencer, Robert H.; Valco, Mark; Oswald, Fred B.

    1993-01-01

    The real noise reduction benefits which may be obtained through the use of one gear tooth form as compared to another are an important design consideration for any geared system, especially for helicopters, in which both weight and reliability are very important factors. This paper describes the design and testing of nine sets of gears which are as identical as possible except for their basic tooth geometry. Noise measurements were made at various combinations of load and speed for each gear set so that direct comparisons could be made. The resulting data were analyzed so that valid conclusions could be drawn and interpreted for design use.

  20. End of FY10 report - used fuel disposition technical bases and lessons learned : legal and regulatory framework for high-level waste disposition in the United States.

    SciTech Connect

    Weiner, Ruth F.; Blink, James A.; Rechard, Robert Paul; Perry, Frank; Jenkins-Smith, Hank C.; Carter, Joe; Nutt, Mark; Cotton, Tom

    2010-09-01

    This report examines the current policy, legal, and regulatory framework pertaining to used nuclear fuel and high level waste management in the United States. The goal is to identify potential changes that, if made, could add flexibility and possibly improve the chances of successfully implementing technical aspects of a nuclear waste policy. Experience suggests that the regulatory framework should be established prior to initiating future repository development. Concerning specifics of the regulatory framework, reasonable expectation as the standard of proof was successfully implemented and could be retained in the future; yet the current classification system for radioactive waste, including hazardous constituents, warrants reexamination. Whether or not multiple sites are considered simultaneously in the future, inclusion of mechanisms such as deliberate use of performance assessment to manage site characterization would be wise. Because of experience gained here and abroad, diversity of geologic media is not particularly necessary as a criterion in site selection guidelines for multiple sites. Stepwise development of the repository program that includes flexibility also warrants serious consideration. Furthermore, integration of the waste management system across storage, transportation, and disposition should be examined and would be facilitated by integration of the legal and regulatory framework. Finally, in order to enhance the acceptability of future repository development, the national policy should be cognizant of those policy and technical attributes that enhance initial acceptance, and those that maintain and broaden credibility.

  1. Infodemiology and infoveillance: framework for an emerging set of public health informatics methods to analyze search, communication and publication behavior on the Internet.

    PubMed

    Eysenbach, Gunther

    2009-03-27

    Infodemiology can be defined as the science of distribution and determinants of information in an electronic medium, specifically the Internet, or in a population, with the ultimate aim to inform public health and public policy. Infodemiology data can be collected and analyzed in near real time. Examples of infodemiology applications include the analysis of queries from Internet search engines to predict disease outbreaks (e.g. influenza), monitoring people's status updates on microblogs such as Twitter for syndromic surveillance, detecting and quantifying disparities in health information availability, identifying and monitoring public health relevant publications on the Internet (e.g. anti-vaccination sites, but also news articles or expert-curated outbreak reports), automated tools to measure information diffusion and knowledge translation, and tracking the effectiveness of health marketing campaigns. Moreover, analyzing how people search and navigate the Internet for health-related information, as well as how they communicate and share this information, can provide valuable insights into the health-related behavior of populations. Seven years after the infodemiology concept was first introduced, this paper revisits the emerging fields of infodemiology and infoveillance and proposes an expanded framework, introducing some basic metrics such as information prevalence, concept occurrence ratios, and information incidence. The framework distinguishes supply-based applications (analyzing what is being published on the Internet, e.g. on Web sites, newsgroups, blogs, microblogs and social media) from demand-based methods (search and navigation behavior), and further distinguishes passive from active infoveillance methods. Infodemiology metrics follow population health relevant events or predict them. Thus, these metrics and methods are potentially useful for public health practice and research, and should be further developed and standardized.
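    Two of the metrics named above can be sketched under assumed definitions (the paper defines them formally): information prevalence as the share of documents mentioning a concept, and a concept occurrence ratio as total mentions of one concept relative to another.

```python
def prevalence(docs, concept):
    """Fraction of documents in the corpus that mention the concept at least once."""
    hits = sum(1 for d in docs if concept in d.lower())
    return hits / len(docs)

def occurrence_ratio(docs, concept_a, concept_b):
    """Total mentions of concept_a divided by total mentions of concept_b."""
    a = sum(d.lower().count(concept_a) for d in docs)
    b = sum(d.lower().count(concept_b) for d in docs)
    return a / b if b else float('inf')
```

    In practice the "documents" would be search queries, tweets, or crawled pages, and the concepts would be normalized terms rather than raw substrings.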

  2. Infodemiology and Infoveillance: Framework for an Emerging Set of Public Health Informatics Methods to Analyze Search, Communication and Publication Behavior on the Internet

    PubMed Central

    2009-01-01

    Infodemiology can be defined as the science of distribution and determinants of information in an electronic medium, specifically the Internet, or in a population, with the ultimate aim to inform public health and public policy. Infodemiology data can be collected and analyzed in near real time. Examples of infodemiology applications include: the analysis of queries from Internet search engines to predict disease outbreaks (e.g. influenza); monitoring people's status updates on microblogs such as Twitter for syndromic surveillance; detecting and quantifying disparities in health information availability; identifying and monitoring public health relevant publications on the Internet (e.g. anti-vaccination sites, but also news articles or expert-curated outbreak reports); automated tools to measure information diffusion and knowledge translation; and tracking the effectiveness of health marketing campaigns. Moreover, analyzing how people search and navigate the Internet for health-related information, as well as how they communicate and share this information, can provide valuable insights into the health-related behavior of populations. Seven years after the infodemiology concept was first introduced, this paper revisits the emerging fields of infodemiology and infoveillance and proposes an expanded framework, introducing some basic metrics such as information prevalence, concept occurrence ratios, and information incidence. The framework distinguishes supply-based applications (analyzing what is being published on the Internet, e.g. on Web sites, newsgroups, blogs, microblogs and social media) from demand-based methods (search and navigation behavior), and further distinguishes passive from active infoveillance methods. Infodemiology metrics follow population health relevant events or predict them. Thus, these metrics and methods are potentially useful for public health practice and research, and should be further developed and standardized. PMID:19329408

  3. Pedometer-Based Physical Activity Level and Body Composition among Minority Children in a Physical Activity Setting

    ERIC Educational Resources Information Center

    Agbuga, Bulent

    2011-01-01

    Most studies focusing on the relationship between physical activity and obesity have been conducted in middle class Caucasian adults and children and few such studies are available concerning minority children in physical activity settings (Johnson, Kulinna, Tudor-Locke, Darst, & Pangrazi, 2007; Rowlands et al., 1999; Tudor-Locke, Lee, Morgan,…

  4. Critical Skill Sets of Entry-Level IT Professionals: An Empirical Examination of Perceptions from Field Personnel

    ERIC Educational Resources Information Center

    McMurtrey, Mark E.; Downey, James P.; Zeltmann, Steven M.; Friedman, William H.

    2008-01-01

    Understanding the skill sets required of IT personnel is a critical endeavor for both business organizations and academic or training institutions. Companies spend crucial resources training personnel, particularly new IT employees, and educational institutions must know what skills are essential in order to plan an effective curriculum. Rapid…

  5. Algorithms for projecting a point onto a level surface of a continuous function on a compact set

    NASA Astrophysics Data System (ADS)

    Arutyunova, N. K.; Dulliev, A. M.; Zabotin, V. I.

    2014-09-01

    Given an equation f( x) = 0, the problem of finding its solution nearest to a given point is considered. In contrast to the authors' previous works dealing with this problem, exact algorithms are proposed assuming that the function f is continuous on a compact set. The convergence of the algorithms is proved, and their performance is illustrated with test examples.
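    The problem statement can be illustrated with a deliberately naive brute-force search (not the authors' exact algorithms): scan a grid over the compact box, keep points where |f| falls within a tolerance, and return the one nearest to the given point. All parameter names here are illustrative.

```python
import itertools
import math

def nearest_on_level_surface(f, x0, lo, hi, n=201, tol=1e-2):
    """Grid-scan the box [lo, hi]^d for the near-zero point of f closest to x0."""
    best, best_d = None, float('inf')
    axes = [[lo + (hi - lo) * i / (n - 1) for i in range(n)] for _ in x0]
    for p in itertools.product(*axes):
        if abs(f(p)) <= tol:          # candidate lies close to the level surface f = 0
            d = math.dist(p, x0)
            if d < best_d:
                best, best_d = p, d
    return best, best_d
```

    For the unit circle f(x, y) = x^2 + y^2 - 1 and x0 = (2, 0), the search recovers a point near (1, 0) at distance about 1, matching the exact projection; the algorithms in the paper achieve this with convergence guarantees rather than exhaustive scanning.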

  6. An application-dependent framework for the recognition of high-level surgical tasks in the OR.

    PubMed

    Lalys, Florent; Riffaud, Laurent; Bouget, David; Jannin, Pierre

    2011-01-01

    Surgical process analysis and modeling is a recent and important topic aiming at introducing a new generation of computer-assisted surgical systems. Among all of the techniques already in use for extracting data from the Operating Room, the use of video images allows the surgeons' assistance to be automated without altering the surgical routine. We propose in this paper an application-dependent framework that automatically extracts the phases of a surgery using only microscope videos as input data and that can be adapted to different surgical specialties. First, four distinct types of classifiers based on image processing were implemented to extract visual cues from video frames. Each of these classifiers was related to one kind of visual cue: visual cues recognizable through color were detected with a color histogram approach; for shape-oriented visual cues we trained a Haar classifier; for texture-oriented visual cues we used a bag-of-words approach with SIFT descriptors; and for all other visual cues we used a classical image classification approach including feature extraction, selection, and supervised classification. The extraction of this semantic vector for each video frame then made it possible to classify the time series using either Hidden Markov Model or Dynamic Time Warping algorithms. The framework was validated on cataract surgeries, obtaining accuracies of 95%. PMID:22003634
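    The time-series matching step can be illustrated with the textbook Dynamic Time Warping distance between two one-dimensional cue sequences; this is the standard dynamic-programming algorithm, not the paper's code, and real inputs would be the per-frame semantic vectors.

```python
def dtw(a, b):
    """Dynamic Time Warping distance between two 1-D sequences (absolute-difference cost)."""
    n, m = len(a), len(b)
    inf = float('inf')
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of: insertion, deletion, or match.
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]
```

    Because DTW aligns sequences that unfold at different speeds, two recordings of the same surgical phase with different durations can still score as similar.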

  7. Molecular-level characterization of the breathing behavior of the jungle-gym-type DMOF-1 metal-organic framework.

    PubMed

    Grosch, Jason S; Paesani, Francesco

    2012-03-01

    Fundamental insights into the molecular mechanisms that determine the breathing behavior of the jungle-gym-type DMOF-1 metal-organic framework upon adsorption of benzene and isopropyl alcohol are gained from computer simulations. In all cases, good agreement is obtained between the calculated and experimental structural parameters. In the case of benzene adsorption, DMOF-1 is predicted to exist in a narrow-pore configuration at high loadings and/or low temperature. A structural transition into a large-pore configuration is then observed as the temperature increases and/or the loading decreases, which is directly related to the spatial distribution and molecular interactions of the benzene molecules within the pores. The isopropyl alcohol adsorption simulations indicate that DMOF-1 undergoes two distinct structural transitions (from large pore to narrow pore and then back to large pore) as the number of adsorbed molecules increases, which is explained in terms of the formation of hydrogen bonds between the isopropyl alcohol molecules and the framework.

  8. Increases of SET level and translocation are correlated with tau hyperphosphorylation at ser202/thr205 in CA1 of Ts65Dn mice.

    PubMed

    Dorard, Emilie; Gorisse-Hussonnois, Lucie; Guihenneuc-Jouyaux, Chantal; Albac, Christelle; Potier, Marie-Claude; Allinquant, Bernadette

    2016-10-01

    SET is a multifunctional protein that, when present in the cytoplasm, acts as a powerful inhibitor of phosphatase 2A. We previously observed that in CA1 of Down syndrome (DS) patients, the level of SET is increased and SET is translocated to the cytoplasm, where it is associated with hyperphosphorylation of tau at ser202/thr205. The presence of SET in the cytoplasm in DS brains may play a role in the progression of the disease. Here, we show that in CA1 of 3-month-old Ts65Dn mice modeling DS, the SET level is increased and SET is translocated to the cytoplasm, in association with tau hyperphosphorylation at ser202/thr205 and with caspase-cleaved amyloid precursor protein, as observed in Alzheimer disease brains. Tau hyperphosphorylation at ser356 and activation of other phosphatase 2A targets, such as the mammalian target of rapamycin and adenosine monophosphate-activated protein kinase, were also observed, suggesting deleterious mechanisms. We propose Ts65Dn mice as a model for therapeutic approaches focused on SET overexpression and its cytoplasmic translocation to slow down disease progression. PMID:27460148

  10. Setting Standards for the 1998 NAEP in Civics and Writing: Using Focus Groups To Finalize the Achievement Levels Descriptions.

    ERIC Educational Resources Information Center

    Hanick, Patricia L.; Loomis, Susan Cooper

    The description of achievement levels is important to the process of reporting student performance on the National Assessment of Educational Progress (NAEP). A process was designed to develop achievement level descriptions (ALDs) for writing and civics. To begin, focus groups were conducted in each of the four NAEP regions for both writing and…

  11. Critical Review: Building on the HIV Cascade: A Complementary "HIV States and Transitions" Framework for Describing HIV Diagnosis, Care, and Treatment at the Population Level.

    PubMed

    Powers, Kimberly A; Miller, William C

    2015-07-01

    The HIV cascade--often referred to as "the HIV continuum"--provides a valuable framework for population-level representations of engagement with the HIV healthcare system. The importance and appeal of this framework are evidenced by a large body of scientific literature, as well as by the adoption of cascade-related indicators by medical and public health organizations worldwide. Despite its centrality in the fields of HIV treatment and prevention, however, the traditional cascade provides limited description of the processes affecting the numbers it represents. Representations that describe these processes and capture the dynamic nature of HIV-infected persons' pathways through the healthcare system are essential for monitoring and predicting intervention effects and epidemic trends. We propose here a complementary schema--termed the "HIV States and Transitions" framework--designed to maintain key strengths of the traditional cascade while addressing key limitations and more fully describing the dynamic aspects of HIV testing, care, and treatment at the population level. PMID:25835604
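
    The paper describes its framework only conceptually in this abstract. As an illustration of what a population-level states-and-transitions representation looks like computationally, the sketch below advances a population distribution through hypothetical care states with made-up monthly transition probabilities; the state names and all numbers are assumptions for demonstration, not values from the paper.

```python
# Hypothetical care states and monthly transition probabilities
# (illustrative only; each row sums to 1).
states = ["undiagnosed", "diagnosed", "in_care", "on_art", "suppressed"]
transition = [
    [0.95, 0.05, 0.00, 0.00, 0.00],
    [0.00, 0.60, 0.40, 0.00, 0.00],
    [0.00, 0.05, 0.65, 0.30, 0.00],
    [0.00, 0.00, 0.10, 0.50, 0.40],
    [0.00, 0.00, 0.00, 0.15, 0.85],
]

def step(pop):
    """Advance a population distribution one time step (pop @ transition)."""
    return [sum(pop[i] * transition[i][j] for i in range(len(pop)))
            for j in range(len(pop))]

pop = [1.0, 0.0, 0.0, 0.0, 0.0]  # everyone starts undiagnosed
for _ in range(120):             # ten years of monthly steps
    pop = step(pop)
```

    Unlike a static cascade snapshot, iterating the transition matrix shows how the population distribution across states evolves over time, including movement backward (e.g. out of care), which is the dynamic aspect the proposed framework emphasizes.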

  12. The Simple View of Reading as a Framework for National Literacy Initiatives: A Hierarchical Model of Pupil-Level and Classroom-Level Factors

    ERIC Educational Resources Information Center

    Savage, Robert; Burgos, Giovani; Wood, Eileen; Piquette, Noella

    2015-01-01

    The Simple View of Reading (SVR) describes Reading Comprehension as the product of distinct child-level variance in decoding (D) and linguistic comprehension (LC) component abilities. When used as a model for educational policy, distinct classroom-level influences of each of the components of the SVR model have been assumed, but have not yet been…

  13. Integrating Frequency-Based Mathematics Instruction with a Multi-Level Assessment System to Enhance Response to Intervention Frameworks

    ERIC Educational Resources Information Center

    Moors, Alison; Weisenburgh-Snyder, Amy; Robbins, Joanne

    2010-01-01

    With the inception of the No Child Left Behind Act, the American government set new standards mandating that states demonstrate adequate yearly progress for all students. To be eligible for the more recent Race to the Top funds, states must show, in part, a commitment to "building data systems that measure student growth and success, and inform…

  14. Characteristics of Students Receiving Special Education Services in a Central Minnesota School District According to Setting, Classification, and Level of Service.

    ERIC Educational Resources Information Center

    Ittenbach, Richard F.; And Others

    The records of 1,231 preschool, elementary, and secondary students receiving special education services in a central Minnesota school district were evaluated to provide information on differences according to setting, classification, and level of service. Data were analyzed within the context of four broad domains: demographics (age, race, gender,…

  15. Mary's First Schoolday in Paris: A Set of Six Culture Assimilators Written in English for Use in French Classes at All Levels.

    ERIC Educational Resources Information Center

    Lapeyre, Andrea

    A set of six programmed culture assimilators, written in English for use in French classes at all levels, is presented. The assimilators do not deal with specific classroom activities but rather with Mary's activities outside class. The assimilators are entitled: Having Breakfast, Taking the Bus, In the Latin Quarter, Lunch in a Cafe, In the…

  16. Entry-Level Athletic Trainers' Self-Confidence in Clinical Skill Preparedness for Treating Athletic and Emergent Settings Populations

    ERIC Educational Resources Information Center

    Morin, Gary E.; Misasi, Sharon; Davis, Charles; Hannah, Corey; Rothbard, Matthew

    2014-01-01

    Context: Clinical education is an important component of athletic training education. Concern exists regarding whether clinical experience adequately prepares students to perform professional skills after graduation, particularly with patients in emerging settings. Objective: To determine the confidence levels of athletic training graduates in…

  17. A Ghost Fluid/Level Set Method for boiling flows and liquid evaporation: Application to the Leidenfrost effect

    NASA Astrophysics Data System (ADS)

    Rueda Villegas, Lucia; Alis, Romain; Lepilliez, Mathieu; Tanguy, Sébastien

    2016-07-01

    The development of numerical methods for the direct numerical simulation of two-phase flows with phase change, in the framework of interface-capturing or interface-tracking methods, is the main topic of this study. We propose a novel numerical method that can handle both evaporation and boiling at the interface between a liquid and a gas. Indeed, in some specific situations involving very heterogeneous thermodynamic conditions at the interface, the distinction between boiling and evaporation is not always possible. For instance, this can occur for a Leidenfrost droplet: a water drop levitating above a hot plate whose temperature is much higher than the boiling temperature. In this case, boiling occurs in the film of saturated vapor entrapped between the bottom of the drop and the plate, whereas the top of the droplet evaporates in contact with ambient air. The situation can also be ambiguous for a superheated droplet, or at the contact line between a liquid and a hot wall whose temperature is higher than the saturation temperature of the liquid. In these situations, the interface temperature can locally reach the saturation temperature (boiling point), for instance near a contact line, while remaining cooler elsewhere. Thus, boiling and evaporation can occur simultaneously in different regions of the same liquid interface, or successively at different times in the history of an evaporating droplet. Standard numerical methods cannot perform computations in these transient regimes; we therefore propose a novel numerical method to achieve this challenging task. Finally, we present several accuracy validations against theoretical solutions and experimental results to strengthen the relevance of this new method.
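
    The Ghost Fluid treatment of jump conditions and phase change in the paper is well beyond a short sketch, but the basic building block shared by all level-set methods is the transport of a signed-distance function whose zero crossing marks the interface. The minimal 1-D example below (illustrative parameters; first-order upwinding, constant positive speed) advects a level-set function and reads off the interface position:

```python
import numpy as np

# 1-D level-set transport, phi_t + u * phi_x = 0, with first-order
# upwinding for u > 0. The zero crossing of phi is the interface.
nx, L = 200, 1.0
dx = L / nx
x = np.arange(nx) * dx
u = 0.5                        # constant advection speed (assumed > 0)
dt = 0.5 * dx / u              # CFL-limited time step
phi = x - 0.2                  # signed distance; interface starts at x = 0.2

steps = int(round(0.4 / dt))   # advance to t = 0.4
for _ in range(steps):
    left = np.empty_like(phi)
    left[1:] = phi[:-1]
    left[0] = 2.0 * phi[0] - phi[1]    # linearly extrapolated ghost value
    phi -= dt * u * (phi - left) / dx  # backward (upwind) difference

# At speed 0.5 for t = 0.4 the interface should sit near x = 0.4.
interface = x[np.argmin(np.abs(phi))]
```

    A full solver adds reinitialization to keep phi a signed distance, a velocity field coupled to the flow, and (in Ghost Fluid methods) ghost values that encode the jump conditions across the interface.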

  18. "Notice the Similarities between the Two Sets …": Imperative Usage in a Corpus of Upper-Level Student Papers

    ERIC Educational Resources Information Center

    Neiderhiser, Justine A.; Kelley, Patrick; Kennedy, Kohlee M.; Swales, John M.; Vergaro, Carla

    2016-01-01

    The sparse literature on the use of imperatives in research papers suggests that they are relatively common in a small number of disciplines, but rare, if used at all, in others. The present study addresses the use of imperatives in a corpus of upper-level A-graded student papers from 16 disciplines. A total of 822 papers collected within the past…

  19. Evidence-Centered Assessment Design as a Foundation for Achievement-Level Descriptor Development and for Standard Setting

    ERIC Educational Resources Information Center

    Plake, Barbara S.; Huff, Kristen; Reshetar, Rosemary

    2010-01-01

    In many large-scale assessment programs, achievement level descriptors (ALDs) provide a critical role in communicating what scores on the assessment mean and in interpreting what examinees know and are able to do based on their test performance. Based on their test performance, examinees are often classified into performance categories. The…

  20. Impact of land use and physicochemical settings on aqueous methylmercury levels in the Mobile-Alabama River System.