A new region-edge based level set model with applications to image segmentation
NASA Astrophysics Data System (ADS)
Zhi, Xuhao; Shen, Hong-Bin
2018-04-01
The level set model has advantages in handling complex shapes and topological changes, and is widely used in image processing tasks. Image segmentation oriented level set models can be grouped into region-based models and edge-based models, both of which have merits and drawbacks. Region-based level set models rely on fitting the color intensity of separated regions but are not sensitive to edge information. Edge-based level set models evolve by fitting local gradient information but are easily affected by noise. We propose a region-edge based level set model, which incorporates saliency information into the energy function and fuses color intensity with local gradient information. The evolution of the proposed model is implemented by a hierarchical two-stage protocol, and the experimental results show flexible initialization, robust evolution and precise segmentation.
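As a rough illustration of how such a fusion can be discretized, the sketch below combines a Chan-Vese style region force with an edge-indicator term, balanced by a saliency map. The saliency map `w_sal`, the specific force terms, and all parameter values are illustrative assumptions, not the authors' exact energy functional.

```python
# Hedged sketch: one explicit evolution step of a region-edge hybrid
# level set. All terms and weights are illustrative assumptions.
import numpy as np
from scipy import ndimage

def edge_indicator(img, sigma=1.5):
    """g = 1 / (1 + |grad(G_sigma * I)|^2); small near strong edges."""
    smoothed = ndimage.gaussian_filter(img.astype(float), sigma)
    gy, gx = np.gradient(smoothed)
    return 1.0 / (1.0 + gx**2 + gy**2)

def evolve_step(phi, img, w_sal, dt=0.1, lam=1.0, mu=1.0):
    """Fuse a Chan-Vese region force with an edge-weighted motion term."""
    inside, outside = phi > 0, phi <= 0
    c1 = img[inside].mean() if inside.any() else 0.0
    c2 = img[outside].mean() if outside.any() else 0.0
    region = (img - c1)**2 - (img - c2)**2          # region fitting force
    gy, gx = np.gradient(phi)
    grad_mag = np.sqrt(gx**2 + gy**2) + 1e-8
    edge = edge_indicator(img) * grad_mag           # edge-driven motion
    # Saliency w_sal in [0, 1] shifts the balance toward the region term
    # in salient areas and toward the edge term elsewhere.
    force = w_sal * lam * region + (1.0 - w_sal) * mu * edge
    return phi - dt * force
```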
Some New Sets of Sequences of Fuzzy Numbers with Respect to the Partial Metric
Ozluk, Muharrem
2015-01-01
In this paper, we essentially deal with Köthe-Toeplitz duals of fuzzy level sets defined using a partial metric. Since the utilization of Zadeh's extension principle is quite difficult in practice, we prefer the idea of level sets in order to construct some classical notions. We present the sets of bounded, convergent, and null series and the set of sequences of bounded variation of fuzzy level sets, based on the partial metric. We examine the relationships between these sets and their classical forms and give some properties, including definitions, propositions, and various kinds of partial metric spaces of fuzzy level sets. Furthermore, we study some of their properties, such as completeness and duality. Finally, we obtain the Köthe-Toeplitz duals of fuzzy level sets with respect to the partial metric based on a partial ordering. PMID:25695102
A level-set procedure for the design of electromagnetic metamaterials.
Zhou, Shiwei; Li, Wei; Sun, Guangyong; Li, Qing
2010-03-29
Achieving negative permittivity and negative permeability signifies a key topic of research in the design of metamaterials. This paper introduces a level-set based topology optimization method, in which the interface between the vacuum and metal phases is implicitly expressed by the zero-level contour of a higher dimensional level-set function. Following a sensitivity analysis, the optimization maximizes the objective based on the normal direction of the level-set function and induced current flow, thereby generating the desirable patterns of current flow on metal surface. As a benchmark example, the U-shaped structure and its variations are obtained from the level-set topology optimization. Numerical examples demonstrate that both negative permittivity and negative permeability can be attained.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, Patrick T.; Schofield, Samuel P.; Nourgaliev, Robert
2016-06-21
A new mesh smoothing method designed to cluster mesh cells near a dynamically evolving interface is presented. The method is based on weighted condition number mesh relaxation with the weight function being computed from a level set representation of the interface. The weight function is expressed as a Taylor series based discontinuous Galerkin projection, which makes the computation of the derivatives of the weight function needed during the condition number optimization process a trivial matter. For cases when a level set is not available, a fast method for generating a low-order level set from discrete cell-centered fields, such as a volume fraction or index function, is provided. Results show that the low-order level set works equally well for the weight function as the actual level set. Meshes generated for a number of interface geometries are presented, including cases with multiple level sets. Dynamic cases for moving interfaces are presented to demonstrate the method's potential usefulness to arbitrary Lagrangian Eulerian (ALE) methods.
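For the case where only a discrete cell-centered field is available, a minimal sketch of a low-order level set built from a volume fraction, plus an illustrative interface-clustering weight, might look as follows (the 0.5 iso-level, the Euclidean distance transform, and the weight shape are assumptions; the paper's discontinuous Galerkin projection is not reproduced here):

```python
# Hedged sketch: a low-order signed distance field from a cell-centered
# volume fraction, and an illustrative interface-clustering weight.
import numpy as np
from scipy import ndimage

def low_order_level_set(vof, dx=1.0):
    """Signed distance to the 0.5 contour of a volume-fraction field."""
    inside = vof >= 0.5
    d_out = ndimage.distance_transform_edt(~inside) * dx
    d_in = ndimage.distance_transform_edt(inside) * dx
    return d_out - d_in          # negative inside, positive outside

def clustering_weight(phi, w0=4.0, decay=0.5):
    """Illustrative weight that peaks at the interface (phi = 0), for
    use in weighted condition number relaxation."""
    return 1.0 + w0 * np.exp(-decay * np.abs(phi))
```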
A level set approach for shock-induced α-γ phase transition of RDX
NASA Astrophysics Data System (ADS)
Josyula, Kartik; Rahul; De, Suvranu
2018-02-01
We present a thermodynamically consistent level set approach based on a regularization energy functional which can be directly incorporated into a Galerkin finite element framework to model interface motion. The regularization energy leads to a diffusive form of flux that is embedded within the level set evolution equation and maintains the signed distance property of the level set function. The scheme is shown to compare well with the velocity extension method in capturing the interface position. The proposed level set approach is employed to study the α-γ phase transformation in RDX single crystal shocked along the (100) plane. Example problems in one and three dimensions are presented. We observe smooth evolution of the phase interface along the shock direction in both models. There is no diffusion of the interface during the zero level set evolution in the three-dimensional model. The level set approach is shown to capture the characteristics of the shock-induced α-γ phase transformation, such as stress relaxation behind the phase interface and the finite time required for the phase transformation to complete. The regularization energy based level set approach is efficient, robust, and easy to implement.
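The diffusive flux that keeps the level set close to a signed distance function can be illustrated with the update below; the coefficient form is borrowed from distance-regularized level set evolution and is an assumption, not the authors' exact regularization functional.

```python
# Hedged sketch: a diffusive regularization step that nudges phi toward
# |grad phi| = 1; the coefficient form is an assumption.
import numpy as np

def distance_regularization_step(phi, dt=0.1, mu=0.2):
    gy, gx = np.gradient(phi)
    mag = np.sqrt(gx**2 + gy**2) + 1e-8
    # coef > 0 (diffusion) where |grad phi| > 1, coef < 0 (sharpening)
    # where |grad phi| < 1, so the slope relaxes toward unity.
    coef = 1.0 - 1.0 / mag
    div = np.gradient(coef * gx, axis=1) + np.gradient(coef * gy, axis=0)
    return phi + dt * mu * div
```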
Multi person detection and tracking based on hierarchical level-set method
NASA Astrophysics Data System (ADS)
Khraief, Chadia; Benzarti, Faouzi; Amiri, Hamid
2018-04-01
In this paper, we propose an efficient unsupervised method for multi-person tracking based on a hierarchical level-set approach. The proposed method uses both edge and region information in order to effectively detect objects. The persons are tracked on each frame of the sequence by minimizing an energy functional that combines color, texture and shape information. These features are enrolled in a covariance matrix as the region descriptor. The present method is fully automated, without the need to manually specify the initial contour of the level set. It is based on combined person detection and background subtraction methods. The edge-based term is employed to maintain a stable evolution, guide the segmentation towards apparent boundaries and inhibit region fusion. The computational cost of the level set is reduced by using the narrow band technique. Experiments performed on challenging video sequences show the effectiveness of the proposed method.
Left ventricle segmentation via two-layer level sets with circular shape constraint.
Yang, Cong; Wu, Weiguo; Su, Yuanqi; Zhang, Shaoxiang
2017-05-01
This paper proposes a circular shape constraint and a novel two-layer level set method for the segmentation of the left ventricle (LV) from short-axis magnetic resonance images without training any shape models. Since the shape of the LV throughout the apex-base axis is close to a ring shape, we propose a circle fitting term in the level set framework to detect the endocardium. The circle fitting term imposes a penalty on the deviation of the evolving contour from its fitting circle, and thereby handles difficult issues in LV segmentation quite well, especially the presence of the outflow tract in basal slices and the intensity overlap between TPM and the myocardium. To extract the whole myocardium, the circle fitting term is incorporated into the two-layer level set method. The endocardium and epicardium are respectively represented by two specified level contours of the level set function, which are evolved by an edge-based and a region-based active contour model. The proposed method has been quantitatively validated on the public data set from the MICCAI 2009 challenge on LV segmentation. Experimental results and comparisons with state-of-the-art methods demonstrate the accuracy and robustness of our method.
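The circle fitting idea can be sketched as below: fit a circle to points near the zero level contour and penalize their deviation from it. The algebraic least-squares fit and the quadratic penalty are illustrative assumptions rather than the paper's exact term.

```python
# Hedged sketch: a circle fitting penalty on the current zero level set.
import numpy as np

def fit_circle(xs, ys):
    """Algebraic least-squares fit of x^2 + y^2 + a*x + b*y + c = 0."""
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    rhs = -(xs**2 + ys**2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = -a / 2.0, -b / 2.0
    return cx, cy, np.sqrt(cx**2 + cy**2 - c)

def circle_penalty(phi, band=0.5):
    """Quadratic penalty on contour points' distance to the fit circle."""
    ys, xs = np.nonzero(np.abs(phi) < band)   # narrow band near phi = 0
    cx, cy, r = fit_circle(xs.astype(float), ys.astype(float))
    d = np.sqrt((xs - cx)**2 + (ys - cy)**2) - r
    return np.sum(d**2)
```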
Greene, Patrick T.; Schofield, Samuel P.; Nourgaliev, Robert
2017-01-27
A new mesh smoothing method designed to cluster cells near a dynamically evolving interface is presented. The method is based on weighted condition number mesh relaxation with the weight function computed from a level set representation of the interface. The weight function is expressed as a Taylor series based discontinuous Galerkin projection, which makes the computation of the derivatives of the weight function needed during the condition number optimization process a trivial matter. For cases when a level set is not available, a fast method for generating a low-order level set from discrete cell-centered fields, such as a volume fraction or index function, is provided. Results show that the low-order level set works equally well as the actual level set for mesh smoothing. Meshes generated for a number of interface geometries are presented, including cases with multiple level sets. Lastly, dynamic cases with moving interfaces show the new method is capable of maintaining a desired resolution near the interface with an acceptable number of relaxation iterations per time step, which demonstrates the method's potential to be used as a mesh relaxer for arbitrary Lagrangian Eulerian (ALE) methods.
Automatic multi-organ segmentation using learning-based segmentation and level set optimization.
Kohlberger, Timo; Sofka, Michal; Zhang, Jingdan; Birkbeck, Neil; Wetzl, Jens; Kaftan, Jens; Declerck, Jérôme; Zhou, S Kevin
2011-01-01
We present a novel generic segmentation system for the fully automatic multi-organ segmentation from CT medical images. We combine the advantages of learning-based approaches on a point cloud-based shape representation, such as speed, robustness, and point correspondences, with those of PDE-optimization-based level set approaches, such as high accuracy and the straightforward prevention of segment overlaps. In a benchmark on 10-100 annotated datasets for the liver, the lungs, and the kidneys we show that the proposed system yields segmentation accuracies of 1.17-2.89 mm average surface error. The level set segmentation (which is initialized by the learning-based segmentations) contributes a 20%-40% increase in accuracy.
Sampson Perrin, Alysa J; Guzzetta, Russell C; Miller, Kellee M; Foster, Nicole C; Lee, Anna; Lee, Joyce M; Block, Jennifer M; Beck, Roy W
2015-05-01
To evaluate the impact of infusion set use duration on glycemic control, we conducted an Internet-based study using the T1D Exchange's online patient community, Glu (myGlu.org). For 14 days, 243 electronically consented adults with type 1 diabetes (T1D) entered online that day's fasting blood glucose (FBG) level, the prior day's total daily insulin (TDI) dose, and whether the infusion set was changed. Mean duration of infusion set use was 3.0 days. Mean FBG level was higher with each successive day of infusion set use, increasing from 126 mg/dL on Day 1 to 133 mg/dL on Day 3 to 147 mg/dL on Day 5 (P<0.001). TDI dose did not vary with increased duration of infusion set use. Internet-based data collection was used to rapidly conduct the study at low cost. The results indicate that FBG levels increase with each additional day of insulin pump infusion set use.
Setting analyst: A practical harvest planning technique
Olivier R.M. Halleux; W. Dale Greene
2001-01-01
Setting Analyst is an ArcView extension that facilitates practical harvest planning for ground-based systems. By modeling the travel patterns of ground-based machines, it compares different harvesting settings based on projected average skidding distance, logging costs, and site disturbance levels. Setting Analyst uses information commonly available to consulting...
Rastgarpour, Maryam; Shanbehzadeh, Jamshid
2014-01-01
Researchers have recently applied an integrative approach to automate medical image segmentation, benefiting from available methods while eliminating their disadvantages. Intensity inhomogeneity is a challenging and open problem in this area that has received less attention from this approach, and it has considerable effects on segmentation accuracy. This paper proposes a new kernel-based fuzzy level set algorithm using an integrative approach to deal with this problem. It can evolve directly from the initial level set obtained by Gaussian Kernel-Based Fuzzy C-Means (GKFCM). The controlling parameters of the level set evolution are also estimated from the results of GKFCM. Moreover, the proposed algorithm is enhanced with locally regularized evolution based on an image model that describes the composition of real-world images, in which intensity inhomogeneity is assumed to be a component of an image. Such improvements make level set manipulation easier and lead to more robust segmentation under intensity inhomogeneity. The proposed algorithm has valuable benefits, including automation, invariance to intensity inhomogeneity, and high accuracy. Performance evaluation of the proposed algorithm was carried out on medical images from different modalities. The results confirm its effectiveness for medical image segmentation.
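The spirit of seeding a level set from fuzzy clustering can be sketched as below; a plain intensity-based FCM is used as a stand-in for the kernel-based GKFCM, and the 0.5 membership iso-level for the initial contour is an assumption.

```python
# Hedged sketch: plain fuzzy c-means on intensities, then a level set
# initialized from the membership of the brightest cluster.
import numpy as np

def fcm(img, c=2, m=2.0, iters=50):
    """Plain FCM; returns membership maps of shape (c, H, W) and centers."""
    x = img.ravel().astype(float)
    centers = np.linspace(x.min(), x.max(), c)
    for _ in range(iters):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-9   # (c, N)
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=0, keepdims=True)                  # memberships
        um = u ** m
        centers = (um * x).sum(axis=1) / um.sum(axis=1)
    return u.reshape(c, *img.shape), centers

def initial_level_set(img):
    u, centers = fcm(img)
    obj = u[np.argmax(centers)]        # membership of brightest cluster
    return obj - 0.5                   # zero level at membership 0.5
```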
Novel nonlinear knowledge-based mean force potentials based on machine learning.
Dong, Qiwen; Zhou, Shuigeng
2011-01-01
The prediction of 3D structures of proteins from amino acid sequences is one of the most challenging problems in molecular biology. An essential task for solving this problem with coarse-grained models is to deduce effective interaction potentials. The development and evaluation of new energy functions is critical to accurately modeling the properties of biological macromolecules. Knowledge-based mean force potentials are derived from statistical analysis of proteins of known structures. Current knowledge-based potentials almost all take the form of a weighted linear sum of interaction pairs. In this study, a class of novel nonlinear knowledge-based mean force potentials is presented. The potential parameters are obtained by nonlinear classifiers, instead of relative frequencies of interaction pairs against a reference state or linear classifiers. The support vector machine is used to derive the potential parameters on data sets that contain both native structures and decoy structures. Five knowledge-based mean force Boltzmann-based or linear potentials are introduced and their corresponding nonlinear potentials are implemented. They are the DIH potential (single-body residue-level Boltzmann-based potential), the DFIRE-SCM potential (two-body residue-level Boltzmann-based potential), the FS potential (two-body atom-level Boltzmann-based potential), the HR potential (two-body residue-level linear potential), and the T32S3 potential (two-body atom-level linear potential). Experiments are performed on well-established decoy sets, including the LKF data set, the CASP7 data set, and the Decoys 'R' Us data set. The evaluation metrics include the energy Z score and the ability of each potential to discriminate native structures from a set of decoy structures. Experimental results show that all nonlinear potentials significantly outperform the corresponding Boltzmann-based or linear potentials, and that the proposed discriminative framework is effective in developing knowledge-based mean force potentials. The nonlinear potentials can be widely used for ab initio protein structure prediction, model quality assessment, protein docking, and other challenging problems in computational biology.
Paschall, Mallie J; Saltz, Robert F
2007-11-01
We examined how alcohol risk is distributed based on college students' drinking before, during and after they go to certain settings. Students attending 14 California public universities (N=10,152) completed a web-based or mailed survey in the fall 2003 semester, which included questions about how many drinks they consumed before, during and after the last time they went to six settings/events: fraternity or sorority party, residence hall party, campus event (e.g. football game), off-campus party, bar/restaurant and outdoor setting (referent). Multi-level analyses were conducted in hierarchical linear modeling (HLM) to examine relationships between type of setting and level of alcohol use before, during and after going to the setting, and possible age and gender differences in these relationships. Drinking episodes (N=24,207) were level 1 units, students were level 2 units and colleges were level 3 units. The highest drinking levels were observed during all settings/events except campus events, with the highest number of drinks being consumed at off-campus parties, followed by residence hall and fraternity/sorority parties. The number of drinks consumed before a fraternity/sorority party was higher than before other settings/events. Age group and gender differences in relationships between type of setting/event and 'before', 'during' and 'after' drinking levels were also observed. For example, going to a bar/restaurant (relative to an outdoor setting) was positively associated with 'during' drinks among students of legal drinking age, while no relationship was observed for underage students. Findings of this study indicate differences in the extent to which college settings are associated with student drinking levels before, during and after related events, and may have implications for intervention strategies targeting different types of settings.
Lung segmentation from HRCT using united geometric active contours
NASA Astrophysics Data System (ADS)
Liu, Junwei; Li, Chuanfu; Xiong, Jin; Feng, Huanqing
2007-12-01
Accurate lung segmentation from high resolution CT images is a challenging task due to fine tracheal structures, missing boundary segments and complex lung anatomy. One popular method is based on gray-level thresholding; however, its results are usually rough. A united geometric active contour model based on level sets is proposed for lung segmentation in this paper. In particular, this method combines local boundary information and a region statistics-based model synchronously: 1) the boundary term ensures the integrity of lung tissue; 2) the region term makes the level set function evolve with global characteristics and independently of initial settings. A penalizing energy term is introduced into the model, which forces the level set function to evolve without re-initialization. The method is found to be much more efficient in lung segmentation than other methods based only on boundary or region information. Results are shown by 3D lung surface reconstruction, which indicates that the method will play an important role in the design of computer-aided diagnostic (CAD) systems.
Level-set-based reconstruction algorithm for EIT lung images: first clinical results.
Rahmati, Peyman; Soleimani, Manuchehr; Pulletz, Sven; Frerichs, Inéz; Adler, Andy
2012-05-01
We show the first clinical results using the level-set-based reconstruction algorithm for electrical impedance tomography (EIT) data. The level-set-based reconstruction method (LSRM) allows the reconstruction of non-smooth interfaces between image regions, which are typically smoothed by traditional voxel-based reconstruction methods (VBRMs). We develop a time difference formulation of the LSRM for 2D images. The proposed reconstruction method is applied to reconstruct clinical EIT data of a slow flow inflation pressure-volume manoeuvre in lung-healthy and adult lung-injury patients. Images from the LSRM and the VBRM are compared. The results show comparable reconstructed images, but with an improved ability to reconstruct sharp conductivity changes in the distribution of lung ventilation using the LSRM.
An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs
Mosaliganti, Kishore R.; Gelas, Arnaud; Megason, Sean G.
2013-01-01
In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations which restrict future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK) v4 architecture, we implemented a level-set software design that is flexible to different numerical (continuous, discrete, and sparse) and grid representations (point, mesh, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile-time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g., gradient and Hessians) across multiple terms is eliminated. The framework further enables the evolution of multiple level-sets for multi-object segmentation and processing of large datasets. For doing so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level-set in real-time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a developing zebrafish embryo. PMID:24501592
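The container-based design can be illustrated in a few lines of Python; the class names and term choices below are illustrative and do not reproduce ITK's actual C++ API.

```python
# Hedged sketch: a level-set PDE assembled as a sum of pluggable terms.
import numpy as np

class CurvatureTerm:
    def __init__(self, weight=1.0):
        self.weight = weight

    def value(self, phi, img):
        gy, gx = np.gradient(phi)
        mag = np.sqrt(gx**2 + gy**2) + 1e-8
        curv = (np.gradient(gx / mag, axis=1) +
                np.gradient(gy / mag, axis=0))
        return self.weight * curv

class PropagationTerm:
    def __init__(self, weight=1.0):
        self.weight = weight

    def value(self, phi, img):
        gy, gx = np.gradient(phi)
        return self.weight * img * np.sqrt(gx**2 + gy**2)

class LevelSetEquation:
    """Container of terms; the PDE's right-hand side is their sum, and
    terms can be added or removed at any point in the evolution."""
    def __init__(self):
        self.terms = []

    def add_term(self, term):
        self.terms.append(term)

    def step(self, phi, img, dt=0.1):
        return phi + dt * sum(t.value(phi, img) for t in self.terms)
```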
Hippocampus segmentation using locally weighted prior based level set
NASA Astrophysics Data System (ADS)
Achuthan, Anusha; Rajeswari, Mandava
2015-12-01
Segmentation of the hippocampus in the brain is one of the major challenges in medical image segmentation due to its imaging characteristics, with intensity almost identical to that of adjacent gray matter structures such as the amygdala. This intensity similarity causes the hippocampus to have weak or fuzzy boundaries. Given this main challenge, a segmentation method that relies on image information alone may not produce accurate segmentation results. Therefore, prior information such as shape and spatial information needs to be assimilated into existing segmentation methods to produce the expected segmentation. Previous studies have widely integrated prior information into segmentation methods. However, the prior information has been integrated in a global manner, and this does not reflect the real scenario during clinical delineation. Therefore, in this paper, local integration of prior information into a level set model is presented. This work utilizes a mean shape model, integrated into the level set model as prior information, to provide automatic initialization for the level set evolution. The local integration of edge-based information and prior information is implemented through an edge weighting map that decides, at the voxel level, which information should be observed during the level set evolution. The edge weighting map shows which corresponding voxels have sufficient edge information. Experiments show that the proposed local integration of prior information into a conventional edge-based level set model, known as the geodesic active contour, yields an improvement of 9% in average Dice coefficient.
Cha, Kenny H.; Hadjiiski, Lubomir; Samala, Ravi K.; Chan, Heang-Ping; Caoili, Elaine M.; Cohan, Richard H.
2016-01-01
Purpose: The authors are developing a computerized system for bladder segmentation in CT urography (CTU) as a critical component for computer-aided detection of bladder cancer. Methods: A deep-learning convolutional neural network (DL-CNN) was trained to distinguish between the inside and the outside of the bladder using 160 000 regions of interest (ROI) from CTU images. The trained DL-CNN was used to estimate the likelihood of an ROI being inside the bladder for ROIs centered at each voxel in a CTU case, resulting in a likelihood map. Thresholding and hole-filling were applied to the map to generate the initial contour for the bladder, which was then refined by 3D and 2D level sets. The segmentation performance was evaluated using 173 cases: 81 cases in the training set (42 lesions, 21 wall thickenings, and 18 normal bladders) and 92 cases in the test set (43 lesions, 36 wall thickenings, and 13 normal bladders). The computerized segmentation accuracy using the DL likelihood map was compared to that using a likelihood map generated by Haar features and a random forest classifier, and that using our previous conjoint level set analysis and segmentation system (CLASS) without using a likelihood map. All methods were evaluated relative to the 3D hand-segmented reference contours. Results: With DL-CNN-based likelihood map and level sets, the average volume intersection ratio, average percent volume error, average absolute volume error, average minimum distance, and the Jaccard index for the test set were 81.9% ± 12.1%, 10.2% ± 16.2%, 14.0% ± 13.0%, 3.6 ± 2.0 mm, and 76.2% ± 11.8%, respectively. With the Haar-feature-based likelihood map and level sets, the corresponding values were 74.3% ± 12.7%, 13.0% ± 22.3%, 20.5% ± 15.7%, 5.7 ± 2.6 mm, and 66.7% ± 12.6%, respectively. With our previous CLASS with local contour refinement (LCR) method, the corresponding values were 78.0% ± 14.7%, 16.5% ± 16.8%, 18.2% ± 15.0%, 3.8 ± 2.3 mm, and 73.9% ± 13.5%, respectively. Conclusions: The authors demonstrated that the DL-CNN can overcome the strong boundary between two regions that have large difference in gray levels and provides a seamless mask to guide level set segmentation, which has been a problem for many gradient-based segmentation methods. Compared to our previous CLASS with LCR method, which required two user inputs to initialize the segmentation, DL-CNN with level sets achieved better segmentation performance while using a single user input. Compared to the Haar-feature-based likelihood map, the DL-CNN-based likelihood map could guide the level sets to achieve better segmentation. The results demonstrate the feasibility of our new approach of using DL-CNN in combination with level sets for segmentation of the bladder. PMID:27036584
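A minimal sketch of the likelihood-map-to-initial-contour stage described above might look as follows, assuming a pixelwise likelihood in [0, 1]; the 0.5 threshold and the largest-component heuristic are assumptions.

```python
# Hedged sketch: turning a likelihood map into a signed-distance
# initialization via thresholding, hole-filling, and component pruning.
import numpy as np
from scipy import ndimage

def initial_contour_from_likelihood(likelihood, thresh=0.5):
    mask = likelihood > thresh
    mask = ndimage.binary_fill_holes(mask)
    # Keep the largest connected component as the organ candidate.
    labels, n = ndimage.label(mask)
    if n > 1:
        sizes = ndimage.sum(mask, labels, range(1, n + 1))
        mask = labels == (1 + int(np.argmax(sizes)))
    # Signed distance initialization for the level set refinement stage.
    d_out = ndimage.distance_transform_edt(~mask)
    d_in = ndimage.distance_transform_edt(mask)
    return d_out - d_in
```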
Tandon, Pooja S; Zhou, Chuan; Christakis, Dimitri A
2012-01-01
Given that more than 34% of U.S. children are cared for in home-based child care settings and outdoor play is associated with physical activity and other health benefits, we sought to characterize the outdoor play frequency of preschoolers cared for at home-based child care settings and factors associated with outdoor play. Cross-sectional study of 1900 preschoolers (representing approximately 862,800 children) cared for in home-based child care settings (including relative and nonrelative care) using the nationally representative Early Childhood Longitudinal Study, Birth Cohort. Only 50% of home-based child care providers reported taking the child outside to walk or play at least once/day. More than one-third of all children did not go outside to play daily with either their parent(s) or home-based child care provider. There were increased odds of going outside daily for children cared for by nonrelatives in the child's home compared with care from a relative. Children with ≥3 regular playmates had greater odds of being taken outdoors by either the parents or child care provider. We did not find statistically significant associations between other child level (age, sex, screen-time), family level (highest education in household, mother's race, employment, exercise frequency), and child care level (hours in care, provider's educational attainment, perception of neighborhood safety) factors and frequency of outdoor play. At a national level, the frequency of outdoor play for preschoolers cared for in home-based child care settings is suboptimal. Further study and efforts to increase outdoor playtime for children in home-based child care settings are needed.
NASA Astrophysics Data System (ADS)
Noguchi, Yuki; Yamamoto, Takashi; Yamada, Takayuki; Izui, Kazuhiro; Nishiwaki, Shinji
2017-09-01
This paper proposes a level set-based topology optimization method for the simultaneous design of acoustic and structural material distributions. In this study, we develop a two-phase material model, a mixture of an elastic material and an acoustic medium, to represent an elastic structure and an acoustic cavity by controlling a volume fraction parameter. In the proposed model, boundary conditions at the two-phase material boundaries are satisfied naturally, avoiding the need to express these boundaries explicitly. We formulate a topology optimization problem to minimize the sound pressure level using this two-phase material model and a level set-based method that obtains topologies free from grayscales. The topological derivative of the objective functional is approximately derived using a variational approach and the adjoint variable method and is utilized to update the level set function via a time evolutionary reaction-diffusion equation. Several numerical examples present optimal acoustic and structural topologies that minimize the sound pressure generated from a vibrating elastic structure.
A variational approach to multi-phase motion of gas, liquid and solid based on the level set method
NASA Astrophysics Data System (ADS)
Yokoi, Kensuke
2009-07-01
We propose a simple and robust numerical algorithm to deal with multi-phase motion of gas, liquid and solid based on the level set method [S. Osher, J.A. Sethian, Fronts propagating with curvature-dependent speed: Algorithms based on Hamilton-Jacobi formulations, J. Comput. Phys. 79 (1988) 12; M. Sussman, P. Smereka, S. Osher, A level set approach for computing solutions to incompressible two-phase flow, J. Comput. Phys. 114 (1994) 146; J.A. Sethian, Level Set Methods and Fast Marching Methods, Cambridge University Press, 1999; S. Osher, R. Fedkiw, Level Set Methods and Dynamic Implicit Surfaces, Applied Mathematical Sciences, vol. 153, Springer, 2003]. In an Eulerian framework, to simulate interaction between a moving solid object and an interfacial flow, we need to define at least two functions (level set functions) to distinguish three materials. In such simulations, the two functions in general overlap and/or disagree due to numerical errors such as numerical diffusion. In this paper, we resolve the problem using the idea of the active contour model [M. Kass, A. Witkin, D. Terzopoulos, Snakes: active contour models, International Journal of Computer Vision 1 (1988) 321; V. Caselles, R. Kimmel, G. Sapiro, Geodesic active contours, International Journal of Computer Vision 22 (1997) 61; G. Sapiro, Geometric Partial Differential Equations and Image Analysis, Cambridge University Press, 2001; R. Kimmel, Numerical Geometry of Images: Theory, Algorithms, and Applications, Springer-Verlag, 2003] introduced in the field of image processing.
Skull defect reconstruction based on a new hybrid level set.
Zhang, Ziqun; Zhang, Ran; Song, Zhijian
2014-01-01
Skull defect reconstruction is an important aspect of surgical repair. Historically, a skull defect prosthesis was created by the mirroring technique, surface fitting, or formed templates. These methods are not based on the anatomy of the individual patient's skull, and therefore the prosthesis cannot precisely correct the defect. This study presents a new hybrid level set model that takes into account both global region information, for optimization, and local edge information, for accuracy, while avoiding re-initialization during the evolution of the level set function. Based on the new method, a skull defect was reconstructed, and the skull prosthesis was produced by rapid prototyping technology. This resulted in a skull defect prosthesis that matched the defect well, with excellent individual adaptation.
Koorts, Harriet; Gillison, Fiona
2015-11-06
Communities are a pivotal setting in which to promote increases in child and adolescent physical activity behaviours. Interventions implemented in these settings require effective evaluation to facilitate translation of findings to wider settings. The aims of this paper are to i) present findings from a RE-AIM evaluation of a community-based physical activity program, and ii) review the methodological challenges faced when applying RE-AIM in practice. A single mixed-methods case study was conducted based on a concurrent triangulation design. Five sources of data were collected via interviews, questionnaires, archival records, documentation and field notes. Evidence was triangulated within RE-AIM to assess individual and organisational-level program outcomes. Inconsistent availability of data and a lack of robust reporting challenged assessment of all five dimensions. Reach, Implementation and setting-level Adoption were less successful, Effectiveness and Maintenance at an individual and organisational level were moderately successful. Only community-level Adoption was highly successful, reflecting the key program goal to provide community-wide participation in sport and physical activity. This research highlighted important methodological constraints associated with the use of RE-AIM in practice settings. Future evaluators wishing to use RE-AIM may benefit from a mixed-method triangulation approach to offset challenges with data availability and reliability.
Liquid level sensor based on an excessively tilted fibre grating
NASA Astrophysics Data System (ADS)
Mou, Chengbo; Zhou, Kaiming; Yan, Zhijun; Fu, Hongyan; Zhang, Lin
2013-09-01
We propose and demonstrate an optical liquid level sensor based on surrounding medium refractive index (SRI) sensing using an excessively tilted fibre grating (ETFG). When the ETFG is submerged in water, two sets of cladding modes are coupled, corresponding to air- and water-surrounded grating structures, respectively. The coupling strengths of the two sets of cladding modes evolve with the submerged length of the grating, providing a mechanism to measure the liquid level. Compared with a long-period fibre grating based liquid level sensor, the ETFG sensor has a much higher SRI responsivity for liquids with refractive index around 1.33 and a lower thermal cross-sensitivity.
Setting Emissions Standards Based on Technology Performance
In setting national emissions standards, EPA sets emissions performance levels rather than mandating use of a particular technology. The law mandates that EPA use numerical performance standards whenever feasible in setting national emissions standards.
A Hybrid Method for Pancreas Extraction from CT Image Based on Level Set Methods
Tan, Hanqing; Fujita, Hiroshi
2013-01-01
This paper proposes a novel semiautomatic method to extract the pancreas from abdominal CT images. Traditional level set and region growing methods, which require the initial contour to be located near the final boundary of the object, suffer from leakage into the tissues neighboring the pancreas region. The proposed method consists of a customized fast-marching level set method, which generates an optimal initial pancreas region to address the sensitivity of the level set method to the initial contour location, and a modified distance regularized level set method, which extracts the pancreas accurately. The novelty of our method lies in the proper selection and combination of level set methods; furthermore, an energy-decrement algorithm and an energy-tune algorithm are proposed to reduce the negative impact of the bonding force caused by connected tissue whose intensity is similar to that of the pancreas. As a result, our method overcomes the shortcoming of oversegmentation at weak boundaries and can accurately extract the pancreas from CT images. The proposed method is compared to five other state-of-the-art medical image segmentation methods on a CT image dataset containing abdominal images from 10 patients. The evaluation results demonstrate that our method outperforms the other methods by achieving higher accuracy and fewer false segmentations in pancreas extraction. PMID:24066016
A unified tensor level set for image segmentation.
Wang, Bin; Gao, Xinbo; Tao, Dacheng; Li, Xuelong
2010-06-01
This paper presents a new region-based unified tensor level set model for image segmentation. This model introduces a third-order tensor to comprehensively depict features of pixels, e.g., gray value and local geometrical features such as orientation and gradient; then, by defining a weighted distance, we generalize the representative region-based level set method from scalar to tensor. The proposed model has four main advantages compared with the traditional representative method. First, involving the Gaussian filter bank, the model is robust against noise, particularly salt-and-pepper noise. Second, considering local geometrical features, e.g., orientation and gradient, the model pays more attention to boundaries and makes the evolving curve stop more easily at the boundary location. Third, due to the unified tensor representation of the pixels, the model segments images more accurately and naturally. Fourth, based on a weighted distance definition, the model possesses the capacity to cope with data varying from scalar to vector, and then to high-order tensor. We apply the proposed method to synthetic, medical, and natural images, and the results suggest that the proposed method is superior to the available representative region-based level set method.
Patient Experience-based Value Sets: Are They Stable?
Pickard, A Simon; Hung, Yu-Ting; Lin, Fang-Ju; Lee, Todd A
2017-11-01
Although societal preference weights are desirable to inform resource-allocation decision-making, value sets based on patient-experienced health states can be useful for clinical decision-making, but context may matter. Our objective was to estimate EQ-5D value sets using visual analog scale (VAS) ratings for patients undergoing knee replacement surgery and to compare the estimates before and after surgery. We used the Patient Reported Outcome Measures data collected by the UK National Health Service on patients undergoing knee replacement from 2009 to 2012. Generalized least squares regression models were used to derive value sets based on the three-level EQ-5D (EQ-5D-3L) using a development sample before and after surgery, and model performance was examined using a validation sample. A total of 90,450 preoperative and postoperative valuations were included. For preoperative valuations, the largest decrement in VAS values was associated with the dimension of anxiety/depression, followed by self-care, mobility, usual activities, and pain/discomfort. However, pain/discomfort had a greater impact on the VAS value decrement in postoperative valuations. Compared with preoperative health problems, postsurgical health problems were associated with larger value decrements, with significant differences in several levels and dimensions, including level 2 of mobility, level 2/3 of usual activities, level 3 of pain/discomfort, and level 3 of anxiety/depression. Similar results were observed across subgroups stratified by age and sex. Findings suggest patient experience-based value sets are not stable (i.e., context such as timing matters). However, the knowledge that lower values are assigned to health states postsurgery compared with presurgery may be useful for the patient-doctor decision-making process.
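The value set derivation can be sketched as a regression of VAS ratings on dummy variables for levels 2 and 3 of each dimension; ordinary least squares is used below as a stand-in for the paper's generalized least squares, and the column names are hypothetical.

```python
# Hedged sketch: an EQ-5D-3L value set from VAS ratings via level
# dummies; OLS stands in for GLS, and the data layout is assumed.
import pandas as pd
import statsmodels.api as sm

dims = ["mobility", "selfcare", "activities", "pain", "anxiety"]

def fit_value_set(df):
    """df: one valuation per row, columns dims (levels 1-3) plus 'vas'."""
    X = pd.DataFrame({f"{d}_L{l}": (df[d] == l).astype(int)
                      for d in dims for l in (2, 3)})
    model = sm.OLS(df["vas"], sm.add_constant(X)).fit()
    return model.params   # value decrements per level and dimension
```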
NASA Astrophysics Data System (ADS)
Yanites, Brian J.; Becker, Jens K.; Madritsch, Herfried; Schnellmann, Michael; Ehlers, Todd A.
2017-11-01
Landscape evolution is a product of the forces that drive geomorphic processes (e.g., tectonics and climate) and the resistance to those processes. The underlying lithology and structural setting in many landscapes set the resistance to erosion. This study uses a modified version of the Channel-Hillslope Integrated Landscape Development (CHILD) landscape evolution model to determine the effect of a spatially and temporally changing erodibility in a terrain with a complex base level history. Specifically, our focus is to quantify how the effects of variable lithology influence transient base level signals. We set up a series of numerical landscape evolution models with increasing levels of complexity based on the lithologic variability and base level history of the Jura Mountains of northern Switzerland. The models are consistent with lithology (and therewith erodibility) playing an important role in the transient evolution of the landscape. The results show that the erosion rate history at a location depends on the rock uplift and base level history, the range of erodibilities of the different lithologies, and the history of the surface geology downstream from the analyzed location. Near the model boundary, the history of erosion is dominated by the base level history. The transient wave of incision, however, is quite variable in the different model runs and depends on the geometric structure of lithology used. It is thus important to constrain the spatiotemporal erodibility patterns downstream of any given point of interest to understand the evolution of a landscape subject to variable base level in a quantitative framework.
Orr, Serena L; Aubé, Michel; Becker, Werner J; Davenport, W Jeptha; Dilli, Esma; Dodick, David; Giammarco, Rose; Gladstone, Jonathan; Leroux, Elizabeth; Pim, Heather; Dickinson, Garth; Christie, Suzanne N
2015-03-01
There is a considerable amount of practice variation in managing migraines in emergency settings, and evidence-based therapies are often not used first line. A peer-reviewed search of databases (MEDLINE, Embase, CENTRAL) was carried out to identify randomized and quasi-randomized controlled trials of interventions for acute pain relief in adults presenting with migraine to emergency settings. Where possible, data were pooled into meta-analyses. Two independent reviewers screened 831 titles and abstracts for eligibility. Three independent reviewers subsequently evaluated 120 full text articles for inclusion, of which 44 were included. Individual studies were then assigned a US Preventive Services Task Force quality rating. The GRADE scheme was used to assign a level of evidence and recommendation strength for each intervention. We strongly recommend the use of prochlorperazine based on a high level of evidence, lysine acetylsalicylic acid, metoclopramide and sumatriptan, based on a moderate level of evidence, and ketorolac, based on a low level of evidence. We weakly recommend the use of chlorpromazine based on a moderate level of evidence, and ergotamine, dihydroergotamine, lidocaine intranasal and meperidine, based on a low level of evidence. We found evidence to recommend strongly against the use of dexamethasone, based on a moderate level of evidence, and granisetron, haloperidol and trimethobenzamide based on a low level of evidence. Based on moderate-quality evidence, we recommend weakly against the use of acetaminophen and magnesium sulfate. Based on low-quality evidence, we recommend weakly against the use of diclofenac, droperidol, lidocaine intravenous, lysine clonixinate, morphine, propofol, sodium valproate and tramadol.
A Fast Segmentation Algorithm for C-V Model Based on Exponential Image Sequence Generation
NASA Astrophysics Data System (ADS)
Hu, J.; Lu, L.; Xu, J.; Zhang, J.
2017-09-01
For island coastline segmentation, a fast segmentation algorithm for the C-V model based on exponential image sequence generation is proposed in this paper. An exponential multi-scale C-V model with level set inheritance and boundary inheritance is developed. The main research contributions are as follows: 1) the problems of "holes" and "gaps" are solved when extracting the coastline, through small-scale shrinkage, low-pass filtering and area sorting of regions; 2) the initial values of the SDF (Signed Distance Function) and the level set are given by Otsu segmentation based on the difference in SAR reflection between land and sea, which places them finely close to the coastline; 3) the computational complexity of the continuous transition between different scales is successfully reduced by SDF and level set inheritance. Experimental results show that the method accelerates the formation of the initial level set, shortens the time needed to extract the coastline and, at the same time, removes non-coastline bodies and improves the identification precision of the main coastline, automating the process of coastline segmentation.
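Step 2 above, seeding the SDF from an Otsu land/sea split, can be sketched as follows; `threshold_otsu` from scikit-image is a stand-in for the paper's implementation, and the sign convention is an assumption.

```python
# Hedged sketch: Otsu threshold on a SAR image seeds a signed distance
# function whose zero level sits near the coastline.
import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu

def otsu_initial_sdf(sar_img):
    land = sar_img > threshold_otsu(sar_img)
    d_out = ndimage.distance_transform_edt(~land)
    d_in = ndimage.distance_transform_edt(land)
    return d_in - d_out   # positive on land, negative at sea
```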
NASA Astrophysics Data System (ADS)
Anugrah, Wirdah; Suryono; Suseno, Jatmiko Endro
2018-02-01
Management of water resources based on a Geographic Information System can provide substantial benefits to water availability settings. Monitoring the potential water level is needed in the development sector, agriculture, energy and others. In this research, a water resource information system is developed using a real-time Geographic Information System concept for web-based monitoring of the potential water level of an area by applying the rule-based system method. The GIS consists of hardware, software, and a database. Based on the web-based GIS architecture, this study uses a set of computers connected to the network, running on the Apache web server with the PHP programming language and a MySQL database. An ultrasound wireless sensor system is used as the water level data input; it also includes time and geographic location information. The GIS maps the five sensor locations and is processed through a rule-based system to determine the potential water level of the area. The water level monitoring results can be displayed on thematic maps by overlaying more than one layer, as tables generated from the database, and as graphs based on the timing of events and the water level values.
NASA Astrophysics Data System (ADS)
Fang, Jingyu; Xu, Haisong; Wang, Zhehong; Wu, Xiaomin
2016-05-01
With colorimetric characterization, digital cameras can be used as image-based tristimulus colorimeters for color communication. In order to overcome the restriction to fixed capture settings adopted in conventional colorimetric characterization procedures, a novel method is proposed that takes capture settings into account. The method for calculating the colorimetric values of the measured image contains five main steps, including conversion from RGB values to equivalent values under the training settings, through factors based on an imaging system model, so as to build a bridge between different settings, and scaling factors involved in the preparation steps for the transformation mapping, to avoid errors resulting from the nonlinearity of polynomial mapping across different ranges of illumination levels. The experimental results indicate that the prediction error of the proposed method, measured by the CIELAB color difference formula, is less than 2 CIELAB units under different illumination levels and different correlated color temperatures. This prediction accuracy for different capture settings remains at the same level as that of the conventional method for a particular lighting condition.
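The conversion-and-mapping chain can be sketched as below, assuming a linear imaging model in exposure time and gain and a second-order polynomial mapping to XYZ; the factor formula, polynomial basis, and parameter names are illustrative assumptions.

```python
# Hedged sketch: rescale RGB to the training exposure, then map the
# equivalent RGB to XYZ with a trained polynomial.
import numpy as np

def equivalent_rgb(rgb, exposure_time, gain, t_train, g_train):
    """Scale linear RGB as if captured with the training settings."""
    return rgb * (t_train * g_train) / (exposure_time * gain)

def poly_expand(rgb):
    """Second-order polynomial basis of a linear RGB triplet."""
    r, g, b = rgb
    return np.array([1, r, g, b, r*g, g*b, r*b, r**2, g**2, b**2])

def rgb_to_xyz(rgb, M):
    """M: (3, 10) polynomial coefficients fit on training data."""
    return M @ poly_expand(rgb)
```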
A Simulation of Readiness-Based Sparing Policies
2017-06-01
variant of a greedy heuristic algorithm to set stock levels and estimate overall WS availability. Our discrete event simulation is then used to test the ... available in the optimization tools. Subject terms: readiness-based sparing, discrete event simulation, optimization, multi-indenture...
Hybrid approach for detection of dental caries based on the methods FCM and level sets
NASA Astrophysics Data System (ADS)
Chaabene, Marwa; Ben Ali, Ramzi; Ejbali, Ridha; Zaied, Mourad
2017-03-01
This paper presents a new technique for the detection of dental caries, a bacterial disease that destroys the tooth structure. In our approach, we have developed a new segmentation method that combines the advantages of the fuzzy c-means algorithm and the level set method. The results obtained by the FCM algorithm are used by the level set algorithm to reduce the influence of noise on each of these algorithms, to facilitate level set manipulation and to lead to more robust segmentation. The sensitivity and specificity results confirm the effectiveness of the proposed method for caries detection.
Men, Hong; Fu, Songlin; Yang, Jialin; Cheng, Meiqi; Shi, Yan; Liu, Jingjing
2018-01-18
Paraffin odor intensity is an important quality indicator when a paraffin inspection is performed. Currently, paraffin odor level assessment is mainly dependent on an artificial sensory evaluation. In this paper, we developed a paraffin odor analysis system to classify and grade four kinds of paraffin samples. The original feature set was optimized using Principal Component Analysis (PCA) and Partial Least Squares (PLS). Support Vector Machine (SVM), Random Forest (RF), and Extreme Learning Machine (ELM) were applied to three different feature data sets for classification and level assessment of paraffin. For classification, the model based on SVM, with an accuracy rate of 100%, was superior to that based on RF, with an accuracy rate of 98.33-100%, and ELM, with an accuracy rate of 98.01-100%. For level assessment, the R² related to the training set was above 0.97 and the R² related to the test set was above 0.87. Through comprehensive comparison, the generalization of the model based on ELM was superior to those based on SVM and RF. The scoring errors for the three models were 0.0016-0.3494, lower than the error of 0.5-1.0 measured by industry standard experts, meaning these methods have a higher prediction accuracy for scoring paraffin level.
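The classification half of such a pipeline can be sketched with scikit-learn as below; PCA plus an RBF-kernel SVM stands in for the paper's feature optimization and classifier variants, and the component count and SVM parameters are assumptions.

```python
# Hedged sketch: PCA-reduced sensor features fed to an SVM classifier.
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def paraffin_classifier(n_components=5):
    return make_pipeline(StandardScaler(),
                         PCA(n_components=n_components),
                         SVC(kernel="rbf", C=10.0))

# Usage: clf = paraffin_classifier(); clf.fit(X_train, y_train)
# then clf.score(X_test, y_test) reports the classification accuracy.
# The RF and ELM variants, and the PLS option, follow the same pattern.
```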
NASA Astrophysics Data System (ADS)
Qin, Xulei; Cong, Zhibin; Halig, Luma V.; Fei, Baowei
2013-03-01
An automatic framework is proposed to segment the right ventricle on ultrasound images. This method can automatically segment both epicardial and endocardial boundaries from a continuous echocardiography series by combining a sparse matrix transform (SMT), a training model, and a localized region-based level set. First, the sparse matrix transform extracts the main motion regions of the myocardium as eigenimages by analyzing the statistical information of these images. Second, a training model of the right ventricle is registered to the extracted eigenimages in order to automatically detect the main location of the right ventricle and the corresponding transform relationship between the training model and the SMT-extracted results in the series. Third, the training model is then adjusted as an adapted initialization for the segmentation of each image in the series. Finally, based on the adapted initializations, a localized region-based level set algorithm is applied to segment both epicardial and endocardial boundaries of the right ventricle from the whole series. Experimental results from real subject data validated the performance of the proposed framework in segmenting the right ventricle from echocardiography. The mean Dice scores for the epicardial and endocardial boundaries are 89.1% ± 2.3% and 83.6% ± 7.3%, respectively. The automatic segmentation method based on the sparse matrix transform and level set can provide a useful tool for quantitative cardiac imaging.
Li, Changyang; Wang, Xiuying; Eberl, Stefan; Fulham, Michael; Yin, Yong; Dagan Feng, David
2015-01-01
Automated and general medical image segmentation can be challenging because the foreground and the background may have complicated and overlapping density distributions in medical imaging. Conventional region-based level set algorithms often assume piecewise constant or piecewise smooth for segments, which are implausible for general medical image segmentation. Furthermore, low contrast and noise make identification of the boundaries between foreground and background difficult for edge-based level set algorithms. Thus, to address these problems, we suggest a supervised variational level set segmentation model to harness the statistical region energy functional with a weighted probability approximation. Our approach models the region density distributions by using the mixture-of-mixtures Gaussian model to better approximate real intensity distributions and distinguish statistical intensity differences between foreground and background. The region-based statistical model in our algorithm can intuitively provide better performance on noisy images. We constructed a weighted probability map on graphs to incorporate spatial indications from user input with a contextual constraint based on the minimization of contextual graphs energy functional. We measured the performance of our approach on ten noisy synthetic images and 58 medical datasets with heterogeneous intensities and ill-defined boundaries and compared our technique to the Chan-Vese region-based level set model, the geodesic active contour model with distance regularization, and the random walker model. Our method consistently achieved the highest Dice similarity coefficient when compared to the other methods.
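The mixture-of-mixtures region term can be sketched as below: separate Gaussian mixtures for foreground and background seed pixels, with the per-pixel log-likelihood ratio acting as the region force. scikit-learn's GaussianMixture is a stand-in for the paper's estimator, and the component count is an assumption.

```python
# Hedged sketch: a region force from foreground/background Gaussian
# mixtures fit on user-provided seed pixels.
import numpy as np
from sklearn.mixture import GaussianMixture

def region_force(img, fg_pixels, bg_pixels, k=3):
    fg = GaussianMixture(n_components=k).fit(fg_pixels.reshape(-1, 1))
    bg = GaussianMixture(n_components=k).fit(bg_pixels.reshape(-1, 1))
    x = img.reshape(-1, 1).astype(float)
    # Positive where the foreground mixture explains the pixel better.
    return (fg.score_samples(x) - bg.score_samples(x)).reshape(img.shape)
```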
Ground-based detections of sodium in HD 209458b's atmosphere in two data sets
NASA Astrophysics Data System (ADS)
Albrecht, S.; Snellen, I.; de Mooij, E.; Le Poole, R.
2009-02-01
We present two separate ground-based detections of sodium in the transmission spectrum of HD 209458b. First we reanalyzed an archival data set from the HDS spectrograph on Subaru, which shows sodium at a >5σ level. Secondly, our preliminary results of a UVES/VLT data set indicate sodium absorption at a similar level, although the data cover the eclipse only partially. Both results are fully consistent with the HST results of Charbonneau et al. (2002). The Na D absorption feature seems to be resolved in the narrowest passband.
This dataset (STATSGO_Set1 and STATSGO_Set2) represents the soil characteristics within individual, local NHDPlusV2 catchments and upstream, contributing watersheds based on the STATSGO dataset (see Data Sources for links to NHDPlusV2 data and STATSGO data). Attributes were calculated for every local NHDPlusV2 catchment and accumulated to provide watershed-level metrics. This data set is derived from the STATSGO landscape rasters for the conterminous USA. Individual rasters (Landscape Layers) of organic material (om), permeability (perm), water table depth (wtdep), depth to bedrock (rckdep), percent clay (clay), and percent sand (sand) were used to calculate soil characteristics for each NHDPlusV2 catchment. The soil characteristics were summarized to produce local catchment-level and watershed-level metrics as a continuous data type (see Data Structure and Attribute Information for a description). The STATSGO data are distributed in two sets, STATSGO_Set1 and STATSGO_Set2, based on common NoData locations in each set of soil GIS layers (see the accompanying ReadMe for a map of NoData locations).
Development of a coupled level set and immersed boundary method for predicting dam break flows
NASA Astrophysics Data System (ADS)
Yu, C. H.; Sheu, Tony W. H.
2017-12-01
Dam-break flow over an immersed stationary object is investigated using a coupled level set (LS)/immersed boundary (IB) method developed in Cartesian grids. This approach adopts an improved interface-preserving level set method, which includes three solution steps, and a differential-based interpolation immersed boundary method to treat fluid-fluid and solid-fluid interfaces, respectively. In the first step of this level set method, the level set function ϕ is advected by a pure advection equation. The intermediate step is performed to obtain a new level set value through a new smoothed Heaviside function. In the final solution step, a mass correction term is added to the re-initialization equation to ensure that the new level set function is a distance function and to conserve the mass bounded by the interface. For accurately calculating the level set value, the four-point upwinding combined compact difference (UCCD) scheme with a three-point boundary combined compact difference scheme is applied to approximate the first-order derivative term in the level set equation. For the immersed boundary method, application of an artificial momentum forcing term at points in cells consisting of both fluid and solid allows imposition of the velocity condition to account for the presence of the solid object. The incompressible Navier-Stokes solutions are calculated using the projection method. Numerical results show that the coupled LS/IB method can not only predict the interface accurately but also preserve mass excellently for the dam-break flow.
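As a rough illustration of the first and third steps (advection of ϕ, then re-initialization toward a signed distance function), here is a minimal 1D sketch. It uses first-order upwind differences instead of the paper's UCCD scheme and omits the smoothed Heaviside step and the mass-correction term.

```python
import numpy as np

def advect(phi, u, dx, dt):
    # First-order upwind advection: phi_t + u * phi_x = 0 (periodic boundaries).
    dphi_minus = (phi - np.roll(phi, 1)) / dx
    dphi_plus = (np.roll(phi, -1) - phi) / dx
    return phi - dt * np.where(u > 0, u * dphi_minus, u * dphi_plus)

def reinitialize(phi, dx, dt, iters=20):
    # Relax phi toward |grad(phi)| = 1 while keeping its zero contour fixed.
    sign = phi / np.sqrt(phi**2 + dx**2)
    for _ in range(iters):
        grad = (np.roll(phi, -1) - np.roll(phi, 1)) / (2 * dx)
        phi = phi - dt * sign * (np.abs(grad) - 1.0)
    return phi
```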
Men, Hong; Fu, Songlin; Yang, Jialin; Cheng, Meiqi; Shi, Yan
2018-01-01
Paraffin odor intensity is an important quality indicator in paraffin inspection. Currently, paraffin odor level assessment depends mainly on artificial sensory evaluation. In this paper, we developed a paraffin odor analysis system to classify and grade four kinds of paraffin samples. The original feature set was optimized using Principal Component Analysis (PCA) and Partial Least Squares (PLS). Support Vector Machine (SVM), Random Forest (RF), and Extreme Learning Machine (ELM) were applied to three different feature data sets for classification and level assessment of paraffin. For classification, the model based on SVM, with an accuracy rate of 100%, was superior to those based on RF, with an accuracy rate of 98.33–100%, and ELM, with an accuracy rate of 98.01–100%. For level assessment, the R² for the training set was above 0.97 and the R² for the test set was above 0.87. Through comprehensive comparison, the generalization of the model based on ELM was superior to those based on SVM and RF. The scoring errors for the three models were 0.0016–0.3494, lower than the 0.5–1.0 error of industry-standard expert assessment, meaning these methods predict paraffin odor level more accurately. PMID:29346328
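As a rough illustration of one model combination the paper compares (PCA feature reduction followed by an SVM classifier), the following sketch uses synthetic stand-ins for the e-nose sensor features and paraffin classes:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 16))    # 120 samples x 16 sensor features (synthetic)
y = rng.integers(0, 4, size=120)  # 4 paraffin classes (synthetic)

# Scale, reduce with PCA, then classify with an RBF-kernel SVM.
model = make_pipeline(StandardScaler(), PCA(n_components=8), SVC(kernel="rbf"))
scores = cross_val_score(model, X, y, cv=5)
print("mean CV accuracy:", scores.mean())
```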
Kashyap, Kanchan L; Bajpai, Manish K; Khanna, Pritee; Giakos, George
2018-01-01
Automatic segmentation of abnormal regions is a crucial task in computer-aided detection systems using mammograms. In this work, an automatic abnormality detection algorithm using mammographic images is proposed. In the preprocessing step, a partial differential equation-based variational level set method is used for breast region extraction. The evolution of the level set method is performed with a mesh-free radial basis function (RBF) approach, which removes the limitations of mesh-based approaches; for comparison, the variational level set function is also evolved with a mesh-based finite difference method. Unsharp masking and median filtering are used for mammogram enhancement. Suspicious abnormal regions are segmented by applying fuzzy c-means clustering. Texture features are extracted from the segmented suspicious regions by computing the local binary pattern and the dominant rotated local binary pattern (DRLBP). Finally, suspicious regions are classified as normal or abnormal by means of a support vector machine with linear, multilayer perceptron, radial basis, and polynomial kernel functions. The algorithm is validated on 322 sample mammograms from the Mammographic Image Analysis Society (MIAS) dataset and 500 mammograms from the Digital Database for Screening Mammography (DDSM) dataset. Proficiency of the algorithm is quantified using sensitivity, specificity, and accuracy. The highest sensitivity, specificity, and accuracy of 93.96%, 95.01%, and 94.48%, respectively, are obtained on the MIAS dataset using the DRLBP feature with the RBF kernel function, while the highest sensitivity, specificity, and accuracy of 92.31%, 98.45%, and 96.21%, respectively, are achieved on the DDSM dataset with the same feature and kernel. Copyright © 2017 John Wiley & Sons, Ltd.
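The texture-feature step can be sketched with the standard rotation-invariant uniform LBP available in scikit-image; the paper's DRLBP variant is not implemented there, so plain LBP stands in for it, and the region of interest below is a synthetic placeholder:

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(roi, points=8, radius=1):
    # Uniform LBP codes range over points + 2 values (uniform + non-uniform bin).
    lbp = local_binary_pattern(roi, points, radius, method="uniform")
    n_bins = points + 2
    hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
    return hist

roi = (np.random.rand(64, 64) * 255).astype(np.uint8)  # placeholder region
features = lbp_histogram(roi)  # would feed the SVM classifier described above
```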
LandScan 2016 High-Resolution Global Population Data Set
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bright, Edward A; Rose, Amy N; Urban, Marie L
The LandScan data set is a worldwide population database compiled on a 30" x 30" latitude/longitude grid. Census counts (at sub-national level) were apportioned to each grid cell based on likelihood coefficients, which are based on land cover, slope, road proximity, high-resolution imagery, and other data sets. The LandScan data set was developed as part of Oak Ridge National Laboratory (ORNL) Global Population Project for estimating ambient populations at risk.
Kim, Youngwoo; Hong, Byung Woo; Kim, Seung Ja; Kim, Jong Hyo
2014-07-01
A major challenge when distinguishing glandular tissues on mammograms, especially for area-based estimations, lies in determining a boundary within the hazy transition zone from adipose to glandular tissues. This stems from the nature of mammography, which is a projection of superimposed tissues consisting of different structures. In this paper, the authors present a novel segmentation scheme which incorporates the learned prior knowledge of experts into a level set framework for fully automated mammographic density estimation. The authors modeled the learned knowledge as a population-based tissue probability map (PTPM) designed to capture the classification behavior of experts' visual systems. The PTPM was constructed using an image database of a selected population consisting of 297 cases. Three mammogram experts extracted dense- and fatty-tissue regions on digital mammograms from an independent subset, and these annotations were used to create a tissue probability map for each ROI based on its local statistics. This tissue class probability was taken as a prior in the Bayesian formulation and incorporated into the level set framework as an additional term that controls the evolution along an energy surface designed to reflect the experts' knowledge as well as the regional statistics inside and outside the evolving contour. A subset of 100 digital mammograms, which was not used in constructing the PTPM, was used to validate the performance. The energy was minimized when the initial contour reached the boundary of the dense and fatty tissues, as defined by experts. The correlation coefficient between mammographic density measurements made by experts and measurements by the proposed method was 0.93, while that with the conventional level set was 0.47. The proposed method showed a marked improvement over the conventional level set method in terms of accuracy and reliability. This result suggests that the proposed method successfully incorporated the learned knowledge of the experts' visual systems and has potential to be used as an automated and quantitative tool for estimation of mammographic breast density levels.
A Comprehensive Multi-Level Model for Campus-Based Leadership Education
ERIC Educational Resources Information Center
Rosch, David; Spencer, Gayle L.; Hoag, Beth L.
2017-01-01
Within this application brief, we propose a comprehensive model for mapping the shape and optimizing the effectiveness of leadership education in campus-wide university settings. The four-level model is highlighted by inclusion of a philosophy statement detailing the values and purpose of leadership education on campus, a set of skills and…
Level-Set Methodology on Adaptive Octree Grids
NASA Astrophysics Data System (ADS)
Gibou, Frederic; Guittet, Arthur; Mirzadeh, Mohammad; Theillard, Maxime
2017-11-01
Numerical simulations of interfacial problems in fluids require a methodology capable of tracking surfaces that can undergo changes in topology and of imposing jump boundary conditions in a sharp manner. In this talk, we will discuss recent advances in the level-set framework, in particular one that is based on adaptive grids.
Automatic Segmenting Structures in MRI's Based on Texture Analysis and Fuzzy Logic
NASA Astrophysics Data System (ADS)
Kaur, Mandeep; Rattan, Munish; Singh, Pushpinder
2017-12-01
The purpose of this paper is to present a variational method for geometric contours that keeps the level set function close to a signed distance function, thereby removing the need for the expensive re-initialization procedure. The level set method is then applied to magnetic resonance images (MRI) to track irregularities in them, as medical imaging plays a substantial part in the treatment, therapy, and diagnosis of various organs, tumors, and abnormalities, and offers the patient speedier and more decisive disease control with fewer side effects. The geometric shape, tumor size, and abnormal tissue growth can be quantified by segmenting the image. Automatic segmentation in medical imaging remains a great challenge for researchers. Based on texture analysis, different images are processed by optimizing the level set segmentation. Traditionally, optimization was manual for every image, with each parameter selected one after another. By applying fuzzy logic, the segmentation of an image is driven by its texture features, making the process automatic and more effective. There is no initialization of parameters, and the method works like an intelligent system: it segments different MRI images without tuning the level set parameters and gives optimized results for all MRIs.
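The re-initialization-free idea rests on penalizing deviation of |∇φ| from 1. Below is a generic 2D sketch of one gradient-flow step of the penalty ½(|∇φ| − 1)², in the spirit of distance-regularized level set evolution; the step size and smoothing constant are illustrative, not the paper's settings.

```python
import numpy as np

def distance_regularization_step(phi, dt=0.1, eps=1e-8):
    # Gradient flow of the penalty 0.5 * (|grad(phi)| - 1)^2:
    # phi_t = div((1 - 1/|grad(phi)|) * grad(phi)),
    # which nudges phi toward a signed distance function.
    gy, gx = np.gradient(phi)              # derivatives along rows, columns
    mag = np.sqrt(gx**2 + gy**2) + eps
    vx = (1.0 - 1.0 / mag) * gx
    vy = (1.0 - 1.0 / mag) * gy
    div = np.gradient(vx, axis=1) + np.gradient(vy, axis=0)
    return phi + dt * div
```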
Change detection of polarimetric SAR images based on the KummerU Distribution
NASA Astrophysics Data System (ADS)
Chen, Quan; Zou, Pengfei; Li, Zhen; Zhang, Ping
2014-11-01
In PolSAR image segmentation, change detection, and classification, the classical Wishart distribution has long been used, but it is especially suited to low-resolution SAR images because, with traditional sensors, only a small number of scatterers are present in each resolution cell. With the improvement of SAR systems in recent years, the classical statistical models should therefore be reconsidered for the high-resolution and polarimetric information contained in images acquired by these advanced systems. In this study, a SAR image segmentation algorithm based on the level-set method with distance regularized level-set evolution (DRLSE) is applied to Envisat/ASAR single-polarization data and Radarsat-2 polarimetric images, respectively. The KummerU heterogeneous clutter model is used for the latter to overcome the homogeneity hypothesis at high-resolution cells. An enhanced distance regularized level-set evolution (DRLSE-E) is also applied to the latter to ensure accurate computation and stable level-set evolution. Finally, change detection based on four polarimetric Radarsat-2 time-series images is carried out in the Genhe area of the Inner Mongolia Autonomous Region, northeastern China, where a heavy flood disaster occurred during the summer of 2013; results show that the recommended segmentation method can detect the change of the watershed effectively.
Du, Yuncheng; Budman, Hector M; Duever, Thomas A
2016-06-01
Accurate automated quantitative analysis of living cells based on fluorescence microscopy images can be very useful for fast evaluation of experimental outcomes and cell culture protocols. In this work, an algorithm is developed for fast differentiation of normal and apoptotic viable Chinese hamster ovary (CHO) cells. For effective segmentation of cell images, a stochastic segmentation algorithm is developed by combining a generalized polynomial chaos expansion with a level set function-based segmentation algorithm. This approach provides a probabilistic description of the segmented cellular regions along the boundary, from which it is possible to calculate morphological changes related to apoptosis, i.e., the curvature and length of a cell's boundary. These features are then used as inputs to a support vector machine (SVM) classifier that is trained to distinguish between normal and apoptotic viable states of CHO cell images. The use of morphological features obtained from the stochastic level set segmentation of cell images in combination with the trained SVM classifier is more efficient in terms of differentiation accuracy as compared with the original deterministic level set method.
Active-Reserve Force Cost Model
2015-01-01
structure to be maintained for a given level of expenditure. We have developed this methodology and set of associated computer-based tools to... rotational, and deployed units or systems • Attain acceptable steady-state operational or presence levels, as measured by the number of units a... at the community level. By community, we mean the set of units of a given type: mission, platform, or capability. We do this because AC-RC force-mix
Variability in ADHD care in community-based pediatrics.
Epstein, Jeffery N; Kelleher, Kelly J; Baum, Rebecca; Brinkman, William B; Peugh, James; Gardner, William; Lichtenstein, Phil; Langberg, Joshua
2014-12-01
Although many efforts have been made to improve the quality of care delivered to children with attention-deficit/hyperactivity disorder (ADHD) in community-based pediatric settings, little is known about typical ADHD care in these settings other than rates garnered through pediatrician self-report. Rates of evidence-based ADHD care and sources of variability (practice-level, pediatrician-level, patient-level) were determined by chart reviews of a random sample of 1594 patient charts across 188 pediatricians at 50 different practices. In addition, the associations of Medicaid-status and practice setting (ie, urban, suburban, and rural) with the quality of ADHD care were examined. Parent- and teacher-rating scales were used during ADHD assessment with approximately half of patients. The use of Diagnostic and Statistical Manual of Mental Disorders criteria was documented in 70.4% of patients. The vast majority (93.4%) of patients with ADHD were receiving medication and only 13.0% were receiving psychosocial treatment. Parent- and teacher-ratings were rarely collected to monitor treatment response or side effects. Further, fewer than half (47.4%) of children prescribed medication had contact with their pediatrician within the first month of prescribing. Most variability in pediatrician-delivered ADHD care was accounted for at the patient level; however, pediatricians and practices also accounted for significant variability on specific ADHD care behaviors. There is great need to improve the quality of ADHD care received by children in community-based pediatric settings. Improvements will likely require systematic interventions at the practice and policy levels to promote change. Copyright © 2014 by the American Academy of Pediatrics.
NASA Technical Reports Server (NTRS)
Skillen, Michael D.; Crossley, William A.
2008-01-01
This report presents an approach for sizing a morphing aircraft based upon multi-level design optimization. For this effort, a morphing wing is one whose planform can make significant shape changes in flight: increasing wing area by 50% or more from the lowest possible area, changing sweep by 30° or more, and/or increasing aspect ratio by as much as 200% from the lowest possible value. The top-level optimization problem seeks to minimize the gross weight of the aircraft by determining a set of "baseline" variables (common aircraft sizing variables) along with a set of "morphing limit" variables (describing the maximum shape change for a particular morphing strategy). The sub-level optimization problems represent each segment in the morphing aircraft's design mission; each sub-level optimizer minimizes the fuel consumed during its mission segment by changing the wing planform within the bounds set by the baseline and morphing limit variables from the top-level problem.
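The two-level structure can be illustrated with a toy nested optimization: the top level picks a baseline wing area and a morphing limit, and each mission segment then minimizes its own fuel burn within those bounds. Every model below (fuel, weight, segment factors) is a made-up placeholder, not the report's sizing model.

```python
from scipy.optimize import minimize, minimize_scalar

SEGMENTS = [("climb", 1.2), ("cruise", 1.0), ("loiter", 0.7)]  # ideal area factors

def segment_fuel(area, ideal_factor, base_area):
    # Toy fuel model: penalize deviation from the segment's ideal wing area.
    return 100.0 + 50.0 * (area / base_area - ideal_factor) ** 2

def gross_weight(x):
    base_area, morph_limit = x
    fuel = 0.0
    for _, factor in SEGMENTS:
        # Sub-level problem: best wing area for this segment within morph bounds.
        res = minimize_scalar(segment_fuel,
                              bounds=(base_area, base_area * (1 + morph_limit)),
                              args=(factor, base_area), method="bounded")
        fuel += res.fun
    return 1000.0 + 5.0 * base_area + fuel  # toy structure weight + total fuel

best = minimize(gross_weight, x0=[20.0, 0.5],
                bounds=[(10.0, 40.0), (0.05, 1.0)])
print(best.x, best.fun)
```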
NASA Astrophysics Data System (ADS)
Li, Huajiao; Fang, Wei; An, Haizhong; Gao, Xiangyun; Yan, Lili
2016-05-01
Economic networks in the real world are not homogeneous; therefore, it is important to study economic networks with heterogeneous nodes and edges to simulate a real network more precisely. In this paper, we present an empirical study of the one-mode derivative holding-based network constructed by the two-mode affiliation network of two sets of actors using the data of worldwide listed energy companies and their shareholders. First, we identify the primitive relationship in the two-mode affiliation network of the two sets of actors. Then, we present the method used to construct the derivative network based on the shareholding relationship between two sets of actors and the affiliation relationship between actors and events. After constructing the derivative network, we analyze different topological features on the node level, edge level and entire network level and explain the meanings of the different values of the topological features combining the empirical data. This study is helpful for expanding the usage of complex networks to heterogeneous economic networks. For empirical research on the worldwide listed energy stock market, this study is useful for discovering the inner relationships between the nations and regions from a new perspective.
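The derivative one-mode network construction they describe corresponds to projecting a bipartite (two-mode) affiliation graph onto one node set, which networkx provides directly. The company and fund names below are hypothetical.

```python
import networkx as nx
from networkx.algorithms import bipartite

B = nx.Graph()
companies = ["CompanyA", "CompanyB", "CompanyC"]
shareholders = ["Fund1", "Fund2"]
B.add_nodes_from(companies, bipartite=0)
B.add_nodes_from(shareholders, bipartite=1)
B.add_edges_from([("Fund1", "CompanyA"), ("Fund1", "CompanyB"),
                  ("Fund2", "CompanyB"), ("Fund2", "CompanyC")])

# One-mode projection: companies are linked when they share a shareholder;
# edge weights count the shared shareholders.
G = bipartite.weighted_projected_graph(B, companies)
print(G.edges(data=True))
```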
Quilling, Eike; Dadaczynski, Kevin; Müller, Merle
2016-11-01
Childhood and adolescent overweight can still be seen as a global public health problem. Based on our socioeconomic understanding, overweight is the result of a complex interplay of a diverse array of factors acting on different levels. Hence, in addition to individual-level determinants, overweight prevention should also address environment-related factors as part of a holistic and integrated setting approach. This paper aims to discuss the setting approach with regard to overweight prevention in childhood and adolescence. In addition to a summary of environmental factors and their empirical influence on the determinants of overweight, theoretical approaches and planning models of settings-based overweight prevention are discussed. While settings can be characterized as specific social-spatial subsystems (e.g., kindergartens, schools), living environments relate to complex subject-oriented environments that may include various subsystems. Direct social contexts, educational contexts and community contexts, as the relevant systems for young people, contain different evidence-based influences that need to be taken into account in settings-based overweight prevention. To support theory-driven intervention, numerous planning models exist, which are presented here. Given the strengthening of environments for health within the prevention law, the underlying settings approach also needs further development with regard to overweight prevention. This includes improving the theoretical foundation by aligning intervention practice with planning models, which also has a positive influence on the ability to measure success.
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1985-01-01
A set of thermoviscoplastic nonlinear constitutive relationships (TVP-NCR) is presented. The set was developed for application to high temperature metal matrix composites (HT-MMC) and is applicable to thermal and mechanical properties. Formulation of the TVP-NCR is based at the micromechanics level. The TVP-NCR are of simple form and readily integrated into nonlinear composite structural analysis. It is shown that the set of TVP-NCR is computationally effective. The set directly predicts complex materials behavior at all levels of the composite simulation, from the constituent materials, through the several levels of composite mechanics, and up to the global response of complex HT-MMC structural components.
Top-Down CO Emissions Based On IASI Observations and Hemispheric Constraints on OH Levels
NASA Astrophysics Data System (ADS)
Müller, J.-F.; Stavrakou, T.; Bauwens, M.; George, M.; Hurtmans, D.; Coheur, P.-F.; Clerbaux, C.; Sweeney, C.
2018-02-01
Assessments of carbon monoxide emissions through inverse modeling are dependent on the modeled abundance of the hydroxyl radical (OH), which controls both the primary sink of CO and its photochemical source through hydrocarbon oxidation. However, most chemistry transport models (CTMs) fall short of reproducing constraints on hemispherically averaged OH levels derived from methylchloroform (MCF) observations. Here we construct five different OH fields compatible with MCF-based analyses, and we prescribe those fields in a global CTM to infer CO fluxes based on Infrared Atmospheric Sounding Interferometer (IASI) CO columns. Each OH field leads to a different set of optimized emissions. Comparisons with independent data (surface, ground-based remotely sensed, aircraft) indicate that the inversion adopting the lowest average OH level in the Northern Hemisphere (7.8 × 10⁵ molec cm⁻³, ~18% lower than the best estimate based on MCF measurements) provides the best overall agreement with all tested observation data sets.
Salutogenic factors for mental health promotion in work settings and organizations.
Graeser, Silke
2011-12-01
Accompanied by an increasing awareness among companies and organizations of mental health conditions in work settings, the salutogenic perspective provides a promising approach to identify supportive factors and resources of organizations to promote mental health. Based on the sense of coherence (SOC), usually treated as an individual personality-trait concept, an organization-based SOC scale was developed to identify potential salutogenic factors of a university as an organization and workplace. Based on the results of two samples of employees (n = 362, n = 204), factors associated with the organization-based SOC were evaluated. Statistical analysis yielded significant correlations between mental health and the setting-based SOC, as well as the three SOC factors identified by factor analysis: comprehensibility, manageability and meaningfulness. Significant results of bivariate and multivariate analyses emphasize the significance of aspects such as participation and comprehensibility at the organizational level, social cohesion and social climate at the social level, and recognition at the individual level for an organization-based SOC. Potential approaches for the further development of interventions for workplace health promotion based on salutogenic factors and resources at the individual, social and organizational levels are elaborated, and the transcultural dimensions of these factors discussed.
Cui, Wenchao; Wang, Yi; Lei, Tao; Fan, Yangyu; Feng, Yan
2013-01-01
This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. Then this local objective function is integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In the level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show the desirable performance of our method.
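The "local statistics" ingredient can be sketched generically: per-pixel weighted means of the image inside and outside the contour, computed with Gaussian smoothing over the neighborhood. This is a common construction for local region-based level set models, not the paper's exact energy; the smoothing scale is illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def local_region_means(image, phi, sigma=4.0, eps=1e-8):
    # Local (Gaussian-weighted) mean intensity inside and outside the contour,
    # computed at every pixel; these drive the local fitting energy.
    inside = (phi > 0).astype(float)
    num_in = gaussian_filter(image * inside, sigma)
    den_in = gaussian_filter(inside, sigma) + eps
    num_out = gaussian_filter(image * (1 - inside), sigma)
    den_out = gaussian_filter(1 - inside, sigma) + eps
    return num_in / den_in, num_out / den_out
```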
Sangster, Janice; Furber, Susan; Phongsavan, Philayrath; Redfern, Julie; Mark, Andrew; Bauman, Adrian
2017-04-01
This study aimed to determine the replicability of a pedometer-based telephone coaching intervention by comparing the outcomes of a study conducted in rural and urban settings to a study that previously found the same intervention effective in a semi-rural setting. Replication studies are conducted to assess whether an efficacious intervention is effective in multiple different settings. This study compared the outcomes of a pedometer-based coaching intervention implemented in urban and rural settings (replication study) with the same intervention implemented in a semi-rural setting (reference study) on physical activity levels. Improvements in total weekly physical activity time in the replication study were significant from baseline to six weeks (p<0.001 urban, p=0.006 rural) and remained significant at six months (p=0.029 urban, p=0.005 rural). These increases were comparable to those achieved in the original efficacy trial conducted in a semi-rural setting. The pedometer-based telephone coaching intervention increases physical activity levels of people with cardiac disease referred to a CR program in diverse settings. This replication study indicates the suitability of this minimal contact, low-cost intervention for further scaling-up to address unmet need in community-dwelling cardiac patients. Copyright © 2016 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). All rights reserved.
Multi-level basis selection of wavelet packet decomposition tree for heart sound classification.
Safara, Fatemeh; Doraisamy, Shyamala; Azman, Azreen; Jantan, Azrul; Abdullah Ramaiah, Asri Ranga
2013-10-01
Wavelet packet transform decomposes a signal into a set of orthonormal bases (nodes) and provides opportunities to select an appropriate set of these bases for feature extraction. In this paper, multi-level basis selection (MLBS) is proposed to preserve the most informative bases of a wavelet packet decomposition tree through removing less informative bases by applying three exclusion criteria: frequency range, noise frequency, and energy threshold. MLBS achieved an accuracy of 97.56% for classifying normal heart sound, aortic stenosis, mitral regurgitation, and aortic regurgitation. MLBS is a promising basis selection to be suggested for signals with a small range of frequencies. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
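The energy-threshold criterion is straightforward to sketch with PyWavelets: decompose a signal into a wavelet packet tree and keep only the nodes whose energy exceeds a threshold. The signal, wavelet choice, and threshold below are illustrative, and the frequency-range and noise-frequency criteria of MLBS are omitted.

```python
import numpy as np
import pywt

signal = np.random.randn(1024)  # placeholder heart sound segment
wp = pywt.WaveletPacket(signal, wavelet="db4", maxlevel=4)

# Leaf nodes of the decomposition tree, ordered by frequency band.
nodes = wp.get_level(4, order="freq")
energies = np.array([np.sum(n.data ** 2) for n in nodes])
threshold = 0.01 * energies.sum()  # illustrative energy threshold
kept = [n.path for n, e in zip(nodes, energies) if e > threshold]
print("retained bases:", kept)
```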
76 FR 24960 - Sentencing Guidelines for United States Courts
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-03
... 2010, Public Law 111-220 (the ``Act''). The Act reduced the statutory penalties for cocaine base... other drugs, i.e., the base offense levels for crack cocaine are set in the Drug Quantity Table so that..., offenses involving 28 grams or more of crack cocaine are assigned a base offense level of 26, offenses...
A highly efficient 3D level-set grain growth algorithm tailored for ccNUMA architecture
NASA Astrophysics Data System (ADS)
Mießen, C.; Velinov, N.; Gottstein, G.; Barrales-Mora, L. A.
2017-12-01
A highly efficient simulation model for 2D and 3D grain growth was developed based on the level-set method. The model introduces modern computational concepts to achieve excellent performance on parallel computer architectures. Strong scalability was measured on cache-coherent non-uniform memory access (ccNUMA) architectures. To achieve this, the proposed approach considers the application of local level-set functions at the grain level. Ideal and non-ideal grain growth was simulated in 3D with the objective to study the evolution of statistical representative volume elements in polycrystals. In addition, microstructure evolution in an anisotropic magnetic material affected by an external magnetic field was simulated.
Novel gene sets improve set-level classification of prokaryotic gene expression data.
Holec, Matěj; Kuželka, Ondřej; Železný, Filip
2015-10-28
Set-level classification of gene expression data has received significant attention recently. In this setting, high-dimensional vectors of features corresponding to genes are converted into lower-dimensional vectors of features corresponding to biologically interpretable gene sets. The dimensionality reduction brings the promise of a decreased risk of overfitting, potentially resulting in improved accuracy of the learned classifiers. However, recent empirical research has not confirmed this expectation. Here we hypothesize that the reported unfavorable classification results in the set-level framework were due to the adoption of unsuitable gene sets defined typically on the basis of the Gene Ontology and the KEGG database of metabolic networks. We explore an alternative approach to defining gene sets, based on regulatory interactions, which we expect to collect genes with more correlated expression. We hypothesize that such more correlated gene sets will enable learning of more accurate classifiers. We define two families of gene sets using information on regulatory interactions, and evaluate them on phenotype-classification tasks using public prokaryotic gene expression data sets. From each of the two gene-set families, we first select the best-performing subtype. The two selected subtypes are then evaluated on independent (testing) data sets against state-of-the-art gene sets and against the conventional gene-level approach. The novel gene sets are indeed more correlated than the conventional ones, and lead to significantly more accurate classifiers. Novel gene sets defined on the basis of regulatory interactions improve set-level classification of gene expression data. The experimental scripts and other material needed to reproduce the experiments are available at http://ida.felk.cvut.cz/novelgenesets.tar.gz.
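The set-level conversion itself is simple to sketch: replace the gene-level feature vector with one feature per gene set, here the mean expression of the set's member genes. The gene sets below are hypothetical index lists standing in for regulon-derived sets.

```python
import numpy as np

X = np.random.rand(30, 500)           # 30 samples x 500 genes (synthetic)
gene_sets = {"regulonA": [0, 3, 17],  # e.g., genes sharing a regulator
             "regulonB": [5, 8, 120, 130]}

# One aggregated feature per gene set: mean expression of its member genes.
X_set = np.column_stack(
    [X[:, idx].mean(axis=1) for idx in gene_sets.values()])
# X_set is 30 x 2 and feeds any standard classifier.
```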
On the Relationship between Variational Level Set-Based and SOM-Based Active Contours
Abdelsamea, Mohammed M.; Gnecco, Giorgio; Gaber, Mohamed Medhat; Elyan, Eyad
2015-01-01
Most Active Contour Models (ACMs) deal with the image segmentation problem as a functional optimization problem, as they work on dividing an image into several regions by optimizing a suitable functional. Among ACMs, variational level set methods have been used to build an active contour with the aim of modeling arbitrarily complex shapes. Moreover, they can also handle topological changes of the contours. Self-Organizing Maps (SOMs) have attracted the attention of many computer vision scientists, particularly in modeling an active contour based on the idea of utilizing the prototypes (weights) of a SOM to control the evolution of the contour. SOM-based models have been proposed in general with the aim of exploiting the specific ability of SOMs to learn the edge-map information via their topology preservation property and overcoming some drawbacks of other ACMs, such as trapping into local minima of the image energy functional to be minimized in such models. In this survey, we illustrate the main concepts of variational level set-based ACMs, SOM-based ACMs, and their relationship and review in a comprehensive fashion the development of their state-of-the-art models from a machine learning perspective, with a focus on their strengths and weaknesses. PMID:25960736
Fligor, Brian J; Cox, L Clarke
2004-12-01
To measure the sound levels generated by the headphones of commercially available portable compact disc players and provide hearing healthcare providers with safety guidelines based on a theoretical noise dose model. Using a Knowles Electronics Manikin for Acoustical Research and a personal computer, output levels across volume control settings were recorded from headphones driven by a standard signal (white noise) and compared with output levels from music samples of eight different genres. Many commercially available models from different manufacturers were investigated. Several different styles of headphones (insert, supra-aural, vertical, and circumaural) were used to determine if style of headphone influenced output level. Free-field equivalent sound pressure levels measured at maximum volume control setting ranged from 91 dBA to 121 dBA. Output levels varied across manufacturers and style of headphone, although generally the smaller the headphone, the higher the sound level for a given volume control setting. Specifically, in one manufacturer, insert earphones increased output level 7-9 dB, relative to the output from stock headphones included in the purchase of the CD player. In a few headphone-CD player combinations, peak sound pressure levels exceeded 130 dB SPL. Based on measured sound pressure levels across systems and the noise dose model recommended by the National Institute for Occupational Safety and Health (NIOSH) for protecting the occupational worker, a maximum permissible noise dose would typically be reached within 1 hr of listening with the volume control set to 70% of maximum gain using supra-aural headphones. Using headphones that resulted in boosting the output level (e.g., insert earphones used in this study) would significantly decrease the maximum safe volume control setting; this effect was unpredictable from one manufacturer to another. In the interest of providing a straightforward recommendation that should protect the hearing of the majority of consumers, reasonable guidelines would include a recommendation to limit headphone use to 1 hr or less per day if using supra-aural style headphones at a gain control setting of 60% of maximum.
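The 1-hour guideline follows from the NIOSH-style dose calculation: an 85 dBA, 8-hour reference exposure with a 3-dB exchange rate. A sketch (the listening level below is illustrative):

```python
def permissible_hours(level_dba):
    # NIOSH REL: 8 hours at 85 dBA; permissible time halves per +3 dB.
    return 8.0 / (2.0 ** ((level_dba - 85.0) / 3.0))

def dose_percent(exposures):
    """exposures: list of (level_dBA, hours) pairs; 100% = full daily dose."""
    return 100.0 * sum(h / permissible_hours(l) for l, h in exposures)

# e.g., 1 hour at 94 dBA (roughly a 70% volume setting on some players):
print(dose_percent([(94.0, 1.0)]))  # ~100% of the daily permissible dose
```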
Computer-aided detection of initial polyp candidates with level set-based adaptive convolution
NASA Astrophysics Data System (ADS)
Zhu, Hongbin; Duan, Chaijie; Liang, Zhengrong
2009-02-01
In order to eliminate or weaken the interference between different topological structures on the colon wall, adaptive and normalized convolution methods were used to compute the first and second order spatial derivatives of computed tomographic colonography images, which is the beginning of various geometric analyses. However, the performance of such methods greatly depends on the single-layer representation of the colon wall, which is called the starting layer (SL) in the following text. In this paper, we introduce a level set-based adaptive convolution (LSAC) method to compute the spatial derivatives, in which the level set method is employed to determine a more reasonable SL. The LSAC was applied to a computer-aided detection (CAD) scheme to detect the initial polyp candidates, and experiments showed that it benefits the CAD scheme in both the detection sensitivity and specificity as compared to our previous work.
An extension of the directed search domain algorithm to bilevel optimization
NASA Astrophysics Data System (ADS)
Wang, Kaiqiang; Utyuzhnikov, Sergey V.
2017-08-01
A method is developed for generating a well-distributed Pareto set for the upper level in bilevel multiobjective optimization. The approach is based on the Directed Search Domain (DSD) algorithm, which is a classical approach for generation of a quasi-evenly distributed Pareto set in multiobjective optimization. The approach contains a double-layer optimizer designed in a specific way under the framework of the DSD method. The double-layer optimizer is based on bilevel single-objective optimization and aims to find a unique optimal Pareto solution rather than generate the whole Pareto frontier on the lower level in order to improve the optimization efficiency. The proposed bilevel DSD approach is verified on several test cases, and a relevant comparison against another classical approach is made. It is shown that the approach can generate a quasi-evenly distributed Pareto set for the upper level with relatively low time consumption.
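The double-layer optimizer's key property, returning a single lower-level optimum per upper-level candidate rather than generating a whole lower-level frontier, can be illustrated with a toy nested optimization; both objectives below are invented for illustration.

```python
from scipy.optimize import minimize, minimize_scalar

def lower_level_response(x):
    # Lower level: min_y (y - x)^2 + 0.1 * y^2 -> a unique optimal y for each x.
    res = minimize_scalar(lambda y: (y - x) ** 2 + 0.1 * y ** 2)
    return res.x

def upper_objective(xv):
    # Upper level evaluates its objective at (x, y*(x)).
    x = xv[0]
    y = lower_level_response(x)
    return (x - 1.0) ** 2 + (y - 2.0) ** 2

res = minimize(upper_objective, x0=[0.0], method="Nelder-Mead")
print(res.x, res.fun)
```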
Two-way coupled SPH and particle level set fluid simulation.
Losasso, Frank; Talton, Jerry; Kwatra, Nipun; Fedkiw, Ronald
2008-01-01
Grid-based methods have difficulty resolving features on or below the scale of the underlying grid. Although adaptive methods (e.g. RLE, octrees) can alleviate this to some degree, separate techniques are still required for simulating small-scale phenomena such as spray and foam, especially since these more diffuse materials typically behave quite differently than their denser counterparts. In this paper, we propose a two-way coupled simulation framework that uses the particle level set method to efficiently model dense liquid volumes and a smoothed particle hydrodynamics (SPH) method to simulate diffuse regions such as sprays. Our novel SPH method allows us to simulate both dense and diffuse water volumes, fully incorporates the particles that are automatically generated by the particle level set method in under-resolved regions, and allows for two way mixing between dense SPH volumes and grid-based liquid representations.
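A minimal SPH building block, for readers unfamiliar with the method: the standard cubic spline smoothing kernel (with its 2D normalization) and the density summation over particles. This is generic SPH, not the paper's specific two-way coupling scheme.

```python
import numpy as np

def cubic_spline_w(r, h):
    # Standard cubic spline kernel, 2D normalization 10 / (7 * pi * h^2).
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h ** 2)
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                 np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0))
    return sigma * w

def density(positions, masses, h):
    # SPH density: rho_i = sum_j m_j * W(|x_i - x_j|, h).
    diff = positions[:, None, :] - positions[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    return (masses[None, :] * cubic_spline_w(r, h)).sum(axis=1)

pos = np.random.rand(100, 2)                    # particle positions
rho = density(pos, np.full(100, 1.0), h=0.1)    # per-particle densities
```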
Occupational Home Economics Education Series. Catering Services. Competency Based Teaching Module.
ERIC Educational Resources Information Center
Lowe, Phyllis; And Others
This module, one of ten competency based modules developed for vocational home economics teachers, is based on a job cluster in the catering industry. It is designed for use with a variety of levels of learners (secondary, postsecondary, adult) in both school and non-school educational settings. Focusing on two levels of employment, food caterer…
Stoutenberg, Mark; Falcon, Ashley; Arheart, Kris; Stasi, Selina; Portacio, Francia; Stepanenko, Bryan; Lan, Mary L; Castruccio-Prince, Catarina; Nackenson, Joshua
2017-06-01
Lifestyle modification programs improve several health-related behaviors, including physical activity (PA) and nutrition. However, few of these programs have been expanded to impact a large number of individuals in one setting at one time. Therefore, the purpose of this study was to determine whether a PA- and nutrition-based lifestyle modification program could be effectively conducted using a large group format in a community-based setting. One hundred twenty-one participants enrolled in a 16-week, community-based lifestyle modification program and separated in small teams of 13 to 17 individuals. Height, weight, fruit and vegetable (FAV) consumption, physical fitness, and several psychosocial measures were assessed before and after the program. Significant improvements in 6-minute walk distance (+68.3 m; p < .001), chair stands (+6.7 repetitions; p < .001), FAV servings (+1.8 servings/day; p < .001), body weight (-3.2 lbs; p < .001), as well as PA social support and eating habits self-efficacy were observed. Our lifestyle modification program was also successful in shifting participants to higher levels of stages of change for nutrition and PA, increasing overall levels of self-efficacy for healthy eating, and improving levels of social support for becoming more active. A lifestyle modification program can be successfully implemented in a community setting using a large group format to improve PA and FAV attitudes and behaviors.
ERIC Educational Resources Information Center
Gogoulou, Agoritsa; Gouli, Evangelia; Grigoriadou, Maria; Samarakou, Maria; Chinou, Dionisia
2007-01-01
In this paper, we present a web-based educational setting, referred to as SCALE (Supporting Collaboration and Adaptation in a Learning Environment), which aims to serve learning and assessment. SCALE enables learners to (i) work on individual and collaborative activities proposed by the environment with respect to learners' knowledge level, (ii)…
The evolution of PBMA: towards a macro-level priority setting framework for health regions.
Mitton, Craig R; Donaldson, Cam; Waldner, Howard; Eagle, Chris
2003-11-01
To date, relatively little work on priority setting has been carried out at a macro-level across major portfolios within integrated health care organizations. This paper describes a macro marginal analysis (MMA) process for setting priorities and allocating resources in health authorities, based on work carried out in a major urban health region in Alberta, Canada. MMA centers around an expert working group of managers and clinicians who are charged with identifying areas for resource re-allocation on an ongoing basis. Trade-offs between services are based on locally defined criteria and are informed by multiple inputs such as evidence from the literature and local expert opinion. The approach is put forth as a significant improvement on historical resource allocation patterns.
Solution to the mean king's problem with mutually unbiased bases for arbitrary levels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kimura, Gen; Tanaka, Hajime; Ozawa, Masanao
2006-05-15
The mean king's problem with mutually unbiased bases is reconsidered for arbitrary d-level systems. Hayashi et al. [Phys. Rev. A 71, 052331 (2005)] related the problem to the existence of a maximal set of d-1 mutually orthogonal Latin squares, in their restricted setting that allows only measurements of projection-valued measures. However, we then cannot find a solution to the problem when, e.g., d=6 or d=10. In contrast to their result, we show that the king's problem always has a solution for arbitrary levels if we also allow positive operator-valued measures. In constructing the solution, we use orthogonal arrays in combinatorialmore » design theory.« less
Segmentation of heterogeneous blob objects through voting and level set formulation
Chang, Hang; Yang, Qing; Parvin, Bahram
2009-01-01
Blob-like structures occur often in nature, where they aid in cueing and the pre-attentive process. These structures often overlap, form perceptual boundaries, and are heterogeneous in shape, size, and intensity. In this paper, voting, Voronoi tessellation, and level set methods are combined to delineate blob-like structures. Voting and subsequent Voronoi tessellation provide the initial condition and the boundary constraints for each blob, while curve evolution through level set formulation provides refined segmentation of each blob within the Voronoi region. The paper concludes with the application of the proposed method to a dataset produced from cell based fluorescence assays and stellar data. PMID:19774202
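The Voronoi step is easy to sketch: once voting yields one seed per blob, every pixel is assigned to its nearest seed, and the resulting region bounds that blob's level set evolution. The seed coordinates below are placeholders.

```python
import numpy as np
from scipy.spatial import cKDTree

h, w = 128, 128
seeds = np.array([[32, 40], [64, 90], [100, 30]])  # (row, col) blob centers

# Assign each pixel to its nearest seed -> discrete Voronoi tessellation.
rows, cols = np.mgrid[0:h, 0:w]
pixels = np.column_stack([rows.ravel(), cols.ravel()])
_, labels = cKDTree(seeds).query(pixels)
voronoi_labels = labels.reshape(h, w)  # one bounding region per detected blob
```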
Priority setting at the micro-, meso- and macro-levels in Canada, Norway and Uganda.
Kapiriri, Lydia; Norheim, Ole Frithjof; Martin, Douglas K
2007-06-01
The objectives of this study were (1) to describe the process of healthcare priority setting in Ontario-Canada, Norway and Uganda at the three levels of decision-making; (2) to evaluate the description using the framework for fair priority setting, accountability for reasonableness; so as to identify lessons of good practices. We carried out case studies involving key informant interviews, with 184 health practitioners and health planners from the macro-level, meso-level and micro-level from Canada-Ontario, Norway and Uganda (selected by virtue of their varying experiences in priority setting). Interviews were audio-recorded, transcribed and analyzed using a modified thematic approach. The descriptions were evaluated against the four conditions of "accountability for reasonableness", relevance, publicity, revisions and enforcement. Areas of adherence to these conditions were identified as lessons of good practices; areas of non-adherence were identified as opportunities for improvement. (i) at the macro-level, in all three countries, cabinet makes most of the macro-level resource allocation decisions and they are influenced by politics, public pressure, and advocacy. Decisions within the ministries of health are based on objective formulae and evidence. International priorities influenced decisions in Uganda. Some priority-setting reasons are publicized through circulars, printed documents and the Internet in Canada and Norway. At the meso-level, hospital priority-setting decisions were made by the hospital managers and were based on national priorities, guidelines, and evidence. Hospital departments that handle emergencies, such as surgery, were prioritized. Some of the reasons are available on the hospital intranet or presented at meetings. Micro-level practitioners considered medical and social worth criteria. These reasons are not publicized. Many practitioners lacked knowledge of the macro- and meso-level priority-setting processes. (ii) Evaluation-relevance: medical evidence and economic criteria were thought to be relevant, but lobbying was thought to be irrelevant. Publicity: all cases lacked clear and effective mechanisms for publicity. REVISIONS: formal mechanisms, following the planning hierarchy, were considered less effective, informal political mechanisms were considered more effective. Canada and Norway had patients' relations officers to deal with patients' dissensions; however, revisions were more difficult in Uganda. Enforcement: leadership for ensuring decision-making fairness was not apparent. The different levels of priority setting in the three countries fulfilled varying conditions of accountability for reasonableness, none satisfied all the four conditions. To improve, decision makers at the three levels in all three cases should engage frontline practitioners, develop more effectively publicized reasons, and develop formal mechanisms for challenging and revising decisions.
Petruzielo, F R; Toulouse, Julien; Umrigar, C J
2011-02-14
A simple yet general method for constructing basis sets for molecular electronic structure calculations is presented. These basis sets consist of atomic natural orbitals from a multiconfigurational self-consistent field calculation supplemented with primitive functions, chosen such that the asymptotics are appropriate for the potential of the system. Primitives are optimized for the homonuclear diatomic molecule to produce a balanced basis set. Two general features that facilitate this basis construction are demonstrated. First, weak coupling exists between the optimal exponents of primitives with different angular momenta. Second, the optimal primitive exponents for a chosen system depend weakly on the particular level of theory employed for optimization. The explicit case considered here is a basis set appropriate for the Burkatzki-Filippi-Dolg pseudopotentials. Since these pseudopotentials are finite at nuclei and have a Coulomb tail, the recently proposed Gauss-Slater functions are the appropriate primitives. Double- and triple-zeta bases are developed for elements hydrogen through argon. These new bases offer significant gains over the corresponding Burkatzki-Filippi-Dolg bases at various levels of theory. Using a Gaussian expansion of the basis functions, these bases can be employed in any electronic structure method. Quantum Monte Carlo provides an added benefit: expansions are unnecessary since the integrals are evaluated numerically.
Boosting standard order sets utilization through clinical decision support.
Li, Haomin; Zhang, Yinsheng; Cheng, Haixia; Lu, Xudong; Duan, Huilong
2013-01-01
Well-designed standard order sets have the potential to integrate and coordinate care by communicating best practices across multiple disciplines, levels of care, and services. However, several challenges have limited the benefits expected from standard order sets. To boost standard order set utilization, a problem-oriented knowledge delivery solution was proposed in this study to facilitate access to standard order sets and evaluation of their treatment effect. In this solution, standard order sets were created along with diagnostic rule sets that can trigger a CDS-based reminder to help clinicians quickly discover hidden clinical problems and the corresponding standard order sets during ordering. Those rule sets also provide indicators for targeted evaluation of standard order sets during treatment. A prototype system was developed based on this solution and will be presented at Medinfo 2013.
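A hypothetical sketch of the problem-oriented idea: a diagnostic rule set evaluated against patient data triggers a reminder pointing at the matching order set. The rule names, thresholds, and fields below are invented for illustration, not taken from the paper's prototype.

```python
# Each entry pairs a diagnostic rule set with the order set it should suggest.
RULE_SETS = {
    "community-acquired pneumonia": {
        "rules": [lambda p: p["temp_c"] >= 38.0,
                  lambda p: any("infiltrate" in f for f in p["cxr_findings"])],
        "order_set": "CAP admission order set v2",
    },
}

def triggered_order_sets(patient):
    # Fire a reminder for every problem whose rules all evaluate true.
    for problem, spec in RULE_SETS.items():
        if all(rule(patient) for rule in spec["rules"]):
            yield problem, spec["order_set"]

patient = {"temp_c": 38.6, "cxr_findings": ["right lower lobe infiltrate"]}
for problem, order_set in triggered_order_sets(patient):
    print(f"Reminder: consider '{order_set}' for suspected {problem}")
```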
Optic disc segmentation: level set methods and blood vessels inpainting
NASA Astrophysics Data System (ADS)
Almazroa, A.; Sun, Weiwei; Alodhayb, Sami; Raahemifar, Kaamran; Lakshminarayanan, Vasudevan
2017-03-01
Segmenting the optic disc (OD) is an important and essential step in creating a frame of reference for diagnosing optic nerve head (ONH) pathology such as glaucoma. Therefore, a reliable OD segmentation technique is necessary for automatic screening of ONH abnormalities. The main contribution of this paper is in presenting a novel OD segmentation algorithm based on applying a level set method on a localized OD image. To prevent the blood vessels from interfering with the level set process, an inpainting technique is applied. The algorithm is evaluated using a new retinal fundus image dataset called RIGA (Retinal Images for Glaucoma Analysis). In the case of low quality images, a double level set is applied in which the first level set is considered to be a localization for the OD. Five hundred and fifty images are used to test the algorithm accuracy as well as its agreement with manual markings by six ophthalmologists. The accuracy of the algorithm in marking the optic disc area and centroid is 83.9%, and the best agreement is observed between the results of the algorithm and manual markings in 379 images.
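The vessel-removal step can be sketched with OpenCV's inpainting, assuming a vessel mask obtained from a separate vessel segmentation; here both the fundus ROI and the mask are synthetic stand-ins so the snippet runs on its own.

```python
import cv2
import numpy as np

# Synthetic stand-ins: a grayscale disc ROI and a vessel mask (255 on vessels).
fundus = np.full((128, 128), 120, np.uint8)
cv2.circle(fundus, (64, 64), 30, 200, -1)          # bright optic disc
vessel_mask = np.zeros((128, 128), np.uint8)
cv2.line(vessel_mask, (64, 0), (64, 127), 255, 3)  # a crossing vessel
fundus[vessel_mask > 0] = 60                       # darken vessel pixels

# Fill in the vessel pixels so they do not attract the level set contour.
inpainted = cv2.inpaint(fundus, vessel_mask, inpaintRadius=5,
                        flags=cv2.INPAINT_TELEA)
# 'inpainted' would then be passed to the level set segmentation of the disc.
```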
Emerging Role of Immunotherapy in Advanced Urothelial Carcinoma.
Koshkin, Vadim S; Grivas, Petros
2018-04-11
Advanced urothelial carcinoma (aUC) has long been treated preferably with cisplatin-based chemotherapy, but many patients are cisplatin-ineligible whereas for those who progress on a platinum-based regimen treatment options are limited. We review key recent data regarding immune checkpoint inhibitors that are changing this treatment landscape. Since May 2016, five different agents targeting the PD-1/PD-L1 pathway (atezolizumab, pembrolizumab, nivolumab, avelumab, durvalumab) have received FDA approval for the treatment of aUC in the platinum-refractory setting, while pembrolizumab and atezolizumab are FDA-approved for cisplatin-ineligible patients in the first-line setting. Clinical outcomes and safety profiles of these agents appear relatively comparable across separate trials; however, only pembrolizumab is supported by level I evidence from a large randomized phase III trial showing overall survival benefit over conventional cytotoxic salvage chemotherapy in the platinum-refractory setting. Pembrolizumab has the highest level of evidence in platinum-refractory aUC, whereas pembrolizumab and atezolizumab have comparable level of evidence in the frontline setting in cisplatin-ineligible patients. Ongoing research is evaluating novel agents, various rational combinations, and sequences, as well as predictive and prognostic biomarkers.
Markel, D; Naqa, I El; Freeman, C; Vallières, M
2012-06-01
To present a novel joint segmentation/registration framework for multimodality image-guided and adaptive radiotherapy. A major challenge to this framework is the sensitivity of many segmentation or registration algorithms to noise. Presented is a level set active contour based on the Jensen-Renyi (JR) divergence to achieve improved noise robustness in a multi-modality imaging space. It was found that the JR divergence, when used for segmentation, has improved robustness to noise compared to mutual information or other entropy-based metrics; the MI metric failed at around two-thirds of the noise power at which the JR divergence failed. The JR divergence metric is useful for the task of joint segmentation/registration of multimodality images and shows improved results compared to entropy-based metrics. The algorithm can be easily modified to incorporate non-intensity based images, which would allow applications in multi-modality and texture analysis. © 2012 American Association of Physicists in Medicine.
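For reference, the Jensen-Renyi divergence of a set of distributions is the Renyi entropy of their weighted mixture minus the weighted sum of their Renyi entropies. A sketch for intensity histograms (the value of alpha and the equal weights are illustrative choices, not the paper's settings):

```python
import numpy as np

def renyi_entropy(p, alpha):
    # Renyi entropy H_alpha(p) = log(sum(p^alpha)) / (1 - alpha), alpha != 1.
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def jensen_renyi(distributions, alpha=0.5):
    # JR = H_alpha(mixture) - sum_i w_i * H_alpha(p_i), with equal weights here.
    weights = np.full(len(distributions), 1.0 / len(distributions))
    mixture = sum(w * p for w, p in zip(weights, distributions))
    return renyi_entropy(mixture, alpha) - sum(
        w * renyi_entropy(p, alpha) for w, p in zip(weights, distributions))

p1 = np.histogram(np.random.normal(0, 1, 5000), bins=64, density=True)[0]
p2 = np.histogram(np.random.normal(1, 1, 5000), bins=64, density=True)[0]
p1, p2 = p1 / p1.sum(), p2 / p2.sum()
print(jensen_renyi([p1, p2]))
```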
Level set method for image segmentation based on moment competition
NASA Astrophysics Data System (ADS)
Min, Hai; Wang, Xiao-Feng; Huang, De-Shuang; Jin, Jing; Wang, Hong-Zhi; Li, Hai
2015-05-01
We propose a level set method for image segmentation which introduces the moment competition and weakly supervised information into the energy functional construction. Different from the region-based level set methods which use force competition, the moment competition is adopted to drive the contour evolution. Here, a so-called three-point labeling scheme is proposed to manually label three independent points (weakly supervised information) on the image. Then the intensity differences between the three points and the unlabeled pixels are used to construct the force arms for each image pixel. The corresponding force is generated from the global statistical information of a region-based method and weighted by the force arm. As a result, the moment can be constructed and incorporated into the energy functional to drive the evolving contour to approach the object boundary. In our method, the force arm can take full advantage of the three-point labeling scheme to constrain the moment competition. Additionally, the global statistical information and weakly supervised information are successfully integrated, which makes the proposed method more robust than traditional methods for initial contour placement and parameter setting. Experimental results with performance analysis also show the superiority of the proposed method on segmenting different types of complicated images, such as noisy images, three-phase images, images with intensity inhomogeneity, and texture images.
Blana, Dimitra; Hincapie, Juan G; Chadwick, Edward K; Kirsch, Robert F
2013-01-01
Neuroprosthetic systems based on functional electrical stimulation aim to restore motor function to individuals with paralysis following spinal cord injury. Identifying the optimal electrode set for the neuroprosthesis is complicated because it depends on the characteristics of the individual (such as injury level), the force capacities of the muscles, the movements the system aims to restore, and the hardware limitations (number and type of electrodes available). An electrode-selection method has been developed that uses a customized musculoskeletal model. Candidate electrode sets are created based on desired functional outcomes and the hardware limitations of the proposed system. Inverse-dynamic simulations are performed to determine the proportion of target movements that can be accomplished with each set; the set that allows the most movements to be performed is chosen as the optimal set. The technique is demonstrated here for a system recently developed by our research group to restore whole-arm movement to individuals with high-level tetraplegia. The optimal set included selective nerve-cuff electrodes for the radial and musculocutaneous nerves; single-channel cuffs for the axillary, suprascapular, upper subscapular, and long-thoracic nerves; and muscle-based electrodes for the remaining channels. The importance of functional goals, hardware limitations, muscle and nerve anatomy, and surgical feasibility are highlighted.
Grout formulation for disposal of low-level and hazardous waste streams containing fluoride
McDaniel, E.W.; Sams, T.L.; Tallent, O.K.
1987-06-02
A composition and related process for disposal of hazardous waste streams containing fluoride in cement-based materials is disclosed. The presence of fluoride in waste materials acts as a set retarder and, as a result, prevents cement-based grouts from setting. This problem is overcome by the present invention, wherein calcium hydroxide is incorporated into the dry-solid portion of the grout mix. The calcium hydroxide renders the fluoride insoluble, allowing the grout to set up and immobilize all hazardous constituents of concern.
Barometric fluctuations in wells tapping deep unconfined aquifers
Weeks, Edwin P.
1979-01-01
Water levels in wells screened only below the water table in unconfined aquifers fluctuate in response to atmospheric pressure changes. These fluctuations occur because the materials composing the unsaturated zone resist air movement and have capacity to store air with a change in pressure. Consequently, the translation of any pressure change at land surface is slowed as it moves through the unsaturated zone to the water table, but it reaches the water surface in the well instantaneously. Thus a pressure imbalance is created that results in a water level fluctuation. Barometric effects on water levels in unconfined aquifers can be computed by solution of the differential equation governing the flow of gas in the unsaturated zone subject to the appropriate boundary conditions. Solutions to this equation for two sets of boundary conditions were applied to compute water level response in a well tapping the Ogallala Formation near Lubbock, Texas from simultaneous microbarograph records. One set of computations, based on the step function unit response solution and convolution, resulted in a very good match between computed and measured water levels. A second set of computations, based on analysis of the amplitude ratios of simultaneous cyclic microbarograph and water level fluctuations, gave inconsistent results in terms of the unsaturated zone pneumatic properties but provided useful insights on the nature of unconfined-aquifer water level fluctuations.
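The step-response-plus-convolution computation described above can be illustrated with a short sketch; the exponential response shape, the 6-hour time constant, and the synthetic barometric record below are assumptions for illustration, not the Lubbock data.

```python
import numpy as np

dt = 0.5                                   # hours
t = np.arange(0.0, 48.0, dt)

# Hypothetical unit-step response: a barometric step depresses the well
# water level immediately, and the deficit decays as the pressure change
# works its way through the unsaturated zone to the water table.
r = np.exp(-t / 6.0)

# Synthetic barometric record (hPa): a diurnal cycle plus a frontal rise.
barometric = 2.0 * np.sin(2.0 * np.pi * t / 24.0) + 3.0 * (t > 24.0)

# Superpose the responses to successive pressure increments: a discrete
# convolution of the pressure increments with the unit-step response.
dB = np.diff(barometric, prepend=barometric[0])
water_level = -np.convolve(dB, r)[: len(t)]   # in cm, assuming ~1 cm/hPa
```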
Ramdane, Said; Daoudi-Gueddah, Doria
2011-08-01
We retrospectively examined the concurrent relationships between fasting plasma total cholesterol, triglyceride, and glucose levels and Alzheimer's disease (AD) in a clinical setting-based study. Total cholesterol level was higher in patients with AD compared to elderly controls; triglyceride and glucose levels did not significantly differ between the two groups. The respective plotted trajectories of change in cholesterol level across age were fairly parallel. No significant difference in total cholesterol levels was recorded between patients with AD classified by Clinical Dementia Rating (CDR) score subgroups. These results suggest that patients with AD have relatively mild total hypercholesterolemia, normal triglyceridemia, and normal fasting plasma glucose levels. The mild total hypercholesterolemia appears to be permanent across age and across dementia severity staging, and fairly parallels the trajectory of age-related change in total cholesterol of healthy controls. We speculate that this pattern of biochemical parameters may be present long before (a decade at least) the symptomatic onset of the disease.
Robustness of observation-based decadal sea level variability in the Indo-Pacific Ocean
NASA Astrophysics Data System (ADS)
Nidheesh, A. G.; Lengaigne, M.; Vialard, J.; Izumo, T.; Unnikrishnan, A. S.; Meyssignac, B.; Hamlington, B.; de Boyer Montegut, C.
2017-07-01
We examine the consistency of Indo-Pacific decadal sea level variability in 10 gridded, observation-based sea level products for the 1960-2010 period. Decadal sea level variations are robust in the Pacific, with more than 50% of variance explained by decadal modulation of two flavors of El Niño-Southern Oscillation (classical ENSO and Modoki). Amplitude of decadal sea level variability is weaker in the Indian Ocean than in the Pacific. All data sets indicate a transmission of decadal sea level signals from the western Pacific to the northwest Australian coast through the Indonesian throughflow. The southern tropical Indian Ocean sea level variability is associated with decadal modulations of ENSO in reconstructions but not in reanalyses or in situ data set. The Pacific-independent Indian Ocean decadal sea level variability is not robust but tends to be maximum in the southwestern tropical Indian Ocean. The inconsistency of Indian Ocean decadal variability across the sea level products calls for caution in making definitive conclusions on decadal sea level variability in this basin.
Combining multiple tools outperforms individual methods in gene set enrichment analyses.
Alhamdoosh, Monther; Ng, Milica; Wilson, Nicholas J; Sheridan, Julie M; Huynh, Huy; Wilson, Michael J; Ritchie, Matthew E
2017-02-01
Gene set enrichment (GSE) analysis allows researchers to efficiently extract biological insight from long lists of differentially expressed genes by interrogating them at a systems level. In recent years, there has been a proliferation of GSE analysis methods and hence it has become increasingly difficult for researchers to select an optimal GSE tool based on their particular dataset. Moreover, the majority of GSE analysis methods do not allow researchers to simultaneously compare gene set level results between multiple experimental conditions. The ensemble of gene set enrichment analyses (EGSEA) is a method developed for RNA-sequencing data that combines results from twelve algorithms and calculates collective gene set scores to improve the biological relevance of the highest ranked gene sets. EGSEA's gene set database contains around 25 000 gene sets from sixteen collections. It has multiple visualization capabilities that allow researchers to view gene sets at various levels of granularity. EGSEA has been tested on simulated data and on a number of human and mouse datasets and, based on biologists' feedback, consistently outperforms the individual tools that have been combined. Our evaluation demonstrates the superiority of the ensemble approach for GSE analysis, and its utility to effectively and efficiently extrapolate biological functions and potential involvement in disease processes from lists of differentially regulated genes. EGSEA is available as an R package at http://www.bioconductor.org/packages/EGSEA/ . The gene sets collections are available in the R package EGSEAdata from http://www.bioconductor.org/packages/EGSEAdata/ . monther.alhamdoosh@csl.com.au mritchie@wehi.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
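EGSEA itself is an R/Bioconductor package; the toy sketch below (in Python, with hypothetical method names and p-values) only illustrates the ensemble idea of turning per-method results into a collective ranking by averaging ranks, one simple combination rule among those such a tool might use.

```python
import numpy as np

# Hypothetical p-values for 5 gene sets from 3 different GSE methods.
pvals = {
    "method_A": {"set1": 0.001, "set2": 0.20, "set3": 0.04, "set4": 0.50, "set5": 0.03},
    "method_B": {"set1": 0.010, "set2": 0.15, "set3": 0.02, "set4": 0.60, "set5": 0.08},
    "method_C": {"set1": 0.005, "set2": 0.30, "set3": 0.06, "set4": 0.40, "set5": 0.01},
}

gene_sets = sorted(next(iter(pvals.values())))
# Rank gene sets within each method (1 = most significant), then average
# the ranks across methods to obtain a collective ensemble score.
per_method = np.array([[pvals[m][s] for s in gene_sets] for m in pvals])
ranks = per_method.argsort(axis=1).argsort(axis=1) + 1
ensemble = ranks.mean(axis=0)
for s, r in sorted(zip(gene_sets, ensemble), key=lambda x: x[1]):
    print(s, r)
```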
Malacarne, D; Pesenti, R; Paolucci, M; Parodi, S
1993-01-01
For a database of 826 chemicals tested for carcinogenicity, we fragmented the structural formula of the chemicals into all possible contiguous-atom fragments with size between two and eight (nonhydrogen) atoms. The fragmentation was obtained using a new software program based on graph theory. We used 80% of the chemicals as a training set and 20% as a test set. The two sets were obtained by random sorting. From the training sets, an average (8 computer runs with independently sorted chemicals) of 315 different fragments were significantly (p < 0.125) associated with carcinogenicity or lack thereof. Even using this relatively low level of statistical significance, 23% of the molecules of the test sets lacked significant fragments. For 77% of the molecules of the test sets, we used the presence of significant fragments to predict carcinogenicity. The average level of accuracy of the predictions in the test sets was 67.5%. Chemicals containing only positive fragments were predicted with an accuracy of 78.7%. The level of accuracy was around 60% for chemicals characterized by contradictory fragments or only negative fragments. In a parallel manner, we performed eight paired runs in which carcinogenicity was attributed randomly to the molecules of the training sets. The fragments generated by these pseudo-training sets were devoid of any predictivity in the corresponding test sets. Using an independent software program, we confirmed (for the complex biological endpoint of carcinogenicity) the validity of a structure-activity relationship approach of the type proposed by Klopman and Rosenkranz with their CASE program. PMID:8275991
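The original software is not reproduced here, but the core statistical step, testing whether a fragment's presence is associated with carcinogenicity in the training set, can be sketched with a 2x2 contingency table; the counts below are hypothetical.

```python
from scipy.stats import fisher_exact

# Hypothetical counts for one fragment across a training set of chemicals:
#                      carcinogen   non-carcinogen
# fragment present          40              15
# fragment absent          290             315
table = [[40, 15], [290, 315]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
# The study used a permissive threshold (p < 0.125) to flag fragments as
# significantly associated with carcinogenicity or its absence.
if p_value < 0.125:
    print(f"fragment flagged (OR={odds_ratio:.2f}, p={p_value:.4f})")
```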
NASA Astrophysics Data System (ADS)
Fu, Lin; Hu, Xiangyu Y.; Adams, Nikolaus A.
2017-12-01
We propose efficient single-step formulations for reinitialization and extending algorithms, which are critical components of level-set based interface-tracking methods. The level-set field is reinitialized with a single-step (non-iterative) "forward tracing" algorithm. A minimum set of cells is defined that describes the interface, and reinitialization employs only data from these cells. Fluid states are extrapolated or extended across the interface by a single-step "backward tracing" algorithm. Both algorithms, which are motivated by analogy to ray-tracing, avoid the multiple block-boundary data exchanges that are inevitable for iterative reinitialization and extending approaches within a parallel-computing environment. The single-step algorithms are combined with a multi-resolution conservative sharp-interface method and validated by a wide range of benchmark test cases. We demonstrate that the proposed reinitialization method achieves second-order accuracy in conserving the volume of each phase. The interface location is invariant to reapplication of the single-step reinitialization. Generally, we observe smaller absolute errors than for standard iterative reinitialization on the same grid. The computational efficiency is higher than for the standard and typical high-order iterative reinitialization methods. We observe a 2- to 6-times efficiency improvement over the standard method for serial execution. The proposed single-step extending algorithm, which is commonly employed for assigning data to ghost cells with ghost-fluid or conservative interface interaction methods, shows about a 10-times efficiency improvement over the standard method while maintaining the same accuracy. Despite their simplicity, the proposed algorithms offer an efficient and robust alternative to iterative reinitialization and extending methods for level-set based multi-phase simulations.
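As a rough illustration of what non-iterative reinitialization means (not the authors' forward-tracing algorithm), the sketch below rebuilds a signed-distance field in a single pass from the minimal set of cells that straddle the interface.

```python
import numpy as np

def reinitialize(phi, dx=1.0):
    """Non-iterative reinitialization: rebuild a signed-distance field from
    the cells adjacent to the zero level set. The brute-force nearest-point
    search here is a stand-in for the paper's forward-tracing traversal."""
    sign = np.sign(phi)
    # Interface cells: the sign flips against some axis neighbor.
    iface = np.zeros_like(phi, dtype=bool)
    for axis in (0, 1):
        s = np.swapaxes(sign, 0, axis)
        flips = s[:-1] * s[1:] <= 0
        f = np.zeros_like(s, dtype=bool)
        f[:-1] |= flips
        f[1:] |= flips
        iface |= np.swapaxes(f, 0, axis)
    # Approximate interface points by the interface-cell centers, then take
    # the distance from every grid point to the nearest one.
    pts = np.argwhere(iface) * dx
    grid = np.indices(phi.shape).reshape(2, -1).T * dx
    d = np.min(np.linalg.norm(grid[:, None, :] - pts[None, :, :], axis=2), axis=1)
    return sign * d.reshape(phi.shape)

# A level-set field with the right zero contour but badly scaled values.
y, x = np.mgrid[0:64, 0:64]
phi0 = ((x - 32) ** 2 + (y - 32) ** 2 - 20 ** 2) / 10.0   # not a distance field
phi = reinitialize(phi0)
```

The paper's algorithm replaces the brute-force nearest-point search with a ray-tracing-like traversal restricted to the interface cell set, which is where the reported efficiency gains come from.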
Kukafka, Rita; Allegrante, John P; Khan, Sharib; Bigger, J Thomas; Johnson, Stephen B
2013-09-01
Informatics solutions are employed to support clinical research trial tasks in community-based practice settings. Using the IT Implementation Framework (ITIF), an integrative framework intended to guide the synthesis of theoretical perspectives for planning multi-level interventions to enhance IT use, we sought to understand the barriers and facilitators to clinical research in community-based practice settings, preliminary to implementing new informatics solutions for improving clinical research infrastructure. The studies were conducted in practices within the Columbia University Clinical Trials Network. A mixed-method approach, including surveys, interviews, time-motion studies, and observations, was used. The data collected, which incorporate predisposing, enabling, and reinforcing factors in IT use, were analyzed according to each phase of the ITIF. Themes identified in the first phase of the ITIF were 1) processes and tools to support clinical trial research and 2) clinical research being peripheral to patient care processes. Not all of the problems under these themes were found to be amenable to IT solutions. Using the multi-level orientation of the ITIF, we set forth strategies beyond IT solutions that can have an impact on reengineering clinical research tasks in practice-based settings. The most challenging task is developing strategies that target enabling and reinforcing factors, which concern organizational factors and the motivation of the practice at large to use IT solutions to integrate clinical research tasks with patient care processes. The ITIF should be used to consider both IT and non-IT solutions concurrently for the reengineering of clinical research in community-based practice settings. © 2013.
Nutrient intake values (NIVs): a recommended terminology and framework for the derivation of values.
King, Janet C; Vorster, Hester H; Tome, Daniel G
2007-03-01
Although most countries and regions around the world set recommended nutrient intake values for their populations, there is no standardized terminology or framework for establishing these standards. Different terms used for various components of a set of dietary standards are described in this paper and a common set of terminology is proposed. The recommended terminology suggests that the set of values be called nutrient intake values (NIVs) and that the set be composed of three different values. The average nutrient requirement (ANR) reflects the median requirement for a nutrient in a specific population. The individual nutrient level (INLx) is the recommended level of nutrient intake for all healthy people in the population, which is set at a certain level x above the mean requirement. For example, a value set at 2 standard deviations above the mean requirement would cover the needs of 98% of the population and would be INL98. The third component of the NIVs is an upper nutrient level (UNL), which is the highest level of daily nutrient intake that is likely to pose no risk of adverse health effects for almost all individuals in a specified life-stage group. The proposed framework for deriving a set of NIVs is based on a statistical approach for determining the midpoint of a distribution of requirements for a set of nutrients in a population (the ANR), the standard deviation of the requirements, and an individual nutrient level that assures health at some point above the mean, e.g., 2 standard deviations. Ideally, a second set of distributions of risk of excessive intakes is used as the basis for a UNL.
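A worked example of the framework's arithmetic may help; the numbers below are hypothetical.

```python
# Requirements in a population are summarized by their median (the ANR)
# and their standard deviation; values here are made up for illustration.
anr = 7.0          # average nutrient requirement, mg/day
sd = 1.0           # standard deviation of requirements, mg/day

# INL98: an individual nutrient level covering ~98% of the population,
# i.e. set 2 standard deviations above the mean requirement.
inl98 = anr + 2 * sd
print(f"INL98 = {inl98} mg/day")   # -> 9.0 mg/day
```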
Hogan, Tiffany P.
2010-01-01
In this study, we examined the influence of word-level phonological and lexical characteristics on early phoneme awareness. Typically-developing children, ages 61–78 months, completed a phoneme-based, odd-one-out task that included consonant-vowel-consonant word sets (e.g., “chair-chain-ship”) that varied orthogonally by a phonological characteristic, sound-contrast similarity (similar vs. dissimilar), and a lexical characteristic, neighborhood density (dense vs. sparse). In a subsample of the participants – those with the highest vocabularies – results were in line with a predicted interactive effect of phonological and lexical characteristics on phoneme awareness performance: word sets contrasting similar sounds were less likely to yield correct responses in words from sparse neighborhoods than words from dense neighborhoods. Word sets contrasting dissimilar sounds were most likely to yield correct responses regardless of the words’ neighborhood density. Based on these findings, theories of early phoneme awareness development should consider both word-level (e.g., phonological and lexical characteristics) and child-level (e.g., vocabulary knowledge) influences on phoneme awareness performance. Attention to these word-level item influences is predicted to result in more sensitive and specific measures of reading risk. PMID:20574064
Population-level interventions to reduce alcohol-related harm: an overview of systematic reviews.
Martineau, Fred; Tyner, Elizabeth; Lorenc, Theo; Petticrew, Mark; Lock, Karen
2013-10-01
To analyse available review-level evidence on the effectiveness of population-level interventions in non-clinical settings to reduce alcohol consumption or related health or social harm. Health, social policy and specialist review databases between 2002 and 2012 were searched for systematic reviews of the effectiveness of population-level alcohol interventions on consumption or alcohol-related health or social outcomes. Data were extracted on review research aim, inclusion criteria, outcome indicators, results, conclusions and limitations. Reviews were quality-assessed using AMSTAR criteria. A narrative synthesis was conducted overall and by policy area. Fifty-two reviews were included from ten policy areas. There is good evidence for policies and interventions to limit alcohol sale availability, to reduce drink-driving, to increase alcohol price or taxation. There is mixed evidence for family- and community-level interventions, school-based interventions, and interventions in the alcohol server setting and the mass media. There is weak evidence for workplace interventions and for interventions targeting illicit alcohol sales. There is evidence of the ineffectiveness of interventions in higher education settings. There is a pattern of support from the evidence base for regulatory or statutory enforcement interventions over local non-regulatory approaches targeting specific population groups. © 2013.
Hayenga, Heather N; Thorne, Bryan C; Peirce, Shayn M; Humphrey, Jay D
2011-11-01
There is a need to develop multiscale models of vascular adaptations to understand tissue-level manifestations of cellular level mechanisms. Continuum-based biomechanical models are well suited for relating blood pressures and flows to stress-mediated changes in geometry and properties, but less so for describing underlying mechanobiological processes. Discrete stochastic agent-based models are well suited for representing biological processes at a cellular level, but not for describing tissue-level mechanical changes. We present here a conceptually new approach to facilitate the coupling of continuum and agent-based models. Because of ubiquitous limitations in both the tissue- and cell-level data from which one derives constitutive relations for continuum models and rule-sets for agent-based models, we suggest that model verification should enforce congruency across scales. That is, multiscale model parameters initially determined from data sets representing different scales should be refined, when possible, to ensure that common outputs are consistent. Potential advantages of this approach are illustrated by comparing simulated aortic responses to a sustained increase in blood pressure predicted by continuum and agent-based models both before and after instituting a genetic algorithm to refine 16 objectively bounded model parameters. We show that congruency-based parameter refinement not only yielded increased consistency across scales, it also yielded predictions that are closer to in vivo observations.
Improved inhalation technology for setting safe exposure levels for workplace chemicals
NASA Technical Reports Server (NTRS)
Stuart, Bruce O.
1993-01-01
Threshold Limit Values recommended as allowable air concentrations of a chemical in the workplace are often based upon a no-observable-effect-level (NOEL) determined by experimental inhalation studies using rodents. A 'safe level' for human exposure must then be estimated by the use of generalized safety factors in attempts to extrapolate from experimental rodents to man. The recent development of chemical-specific physiologically-based toxicokinetics makes use of measured physiological, biochemical, and metabolic parameters to construct a validated model that is able to 'scale-up' rodent response data to predict the behavior of the chemical in man. This procedure is made possible by recent advances in personal computer software and the emergence of appropriate biological data, and provides an analytical tool for much more reliable risk evaluation and airborne chemical exposure level setting for humans.
Quality in Early Years Settings and Children's School Achievement. CEP Discussion Paper No. 1468
ERIC Educational Resources Information Center
Blanden, Jo; Hansen, Kirstine; McNally, Sandra
2017-01-01
Childcare quality is often thought to be important for influencing children's subsequent attainment at school. The English Government regulates the quality of early education by setting minimum levels of qualifications for workers and grading settings based on a national Inspectorate (OfSTED). This paper uses administrative data on over two…
Adaptive Set-Based Methods for Association Testing
Su, Yu-Chen; Gauderman, W. James; Kiros, Berhane; Lewinger, Juan Pablo
2017-01-01
With a typical sample size of a few thousand subjects, a single genomewide association study (GWAS) using traditional one-SNP-at-a-time methods can only detect genetic variants conferring a sizable effect on disease risk. Set-based methods, which analyze sets of SNPs jointly, can detect variants with smaller effects acting within a gene, a pathway, or other biologically relevant sets. While self-contained set-based methods (those that test sets of variants without regard to variants not in the set) are generally more powerful than competitive set-based approaches (those that rely on comparison of variants in the set of interest with variants not in the set), there is no consensus as to which self-contained methods are best. In particular, several self-contained set tests have been proposed to directly or indirectly ‘adapt’ to the a priori unknown proportion and distribution of effects of the truly associated SNPs in the set, which is a major determinant of their power. A popular adaptive set-based test is the adaptive rank truncated product (ARTP), which seeks the set of SNPs that yields the best-combined evidence of association. We compared the standard ARTP, several ARTP variations we introduced, and other adaptive methods in a comprehensive simulation study to evaluate their performance. We used permutations to assess significance for all the methods and thus provide a level playing field for comparison. We found the standard ARTP test to have the highest power across our simulations followed closely by the global model of random effects (GMRE) and a LASSO based test. PMID:26707371
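To make the ARTP idea concrete, the sketch below computes a rank truncated product over several candidate truncation points and adapts over them using the same permutations that assess significance; drawing null p-values as uniforms is a simplification standing in for the phenotype permutations used in practice.

```python
import numpy as np

rng = np.random.default_rng(0)

def artp_pvalue(pvals, ks=(1, 5, 10), n_perm=2000):
    """Adaptive rank truncated product (ARTP) over candidate truncation
    points `ks`, with significance assessed by permutation."""
    pvals = np.sort(pvals)

    def rtp(p_sorted, k):
        return np.sum(np.log(p_sorted[:k]))   # log of the truncated product

    obs = np.array([rtp(pvals, k) for k in ks])
    null = np.empty((n_perm, len(ks)))
    for b in range(n_perm):
        p_null = np.sort(rng.uniform(size=len(pvals)))
        null[b] = [rtp(p_null, k) for k in ks]

    # Per-k empirical p-values (more negative statistic = more significant),
    # then adapt by taking the minimum over k, correcting for the selection
    # of k with the same permutations.
    per_k = (null <= obs).mean(axis=0)
    null_per_k = np.array([(null <= null[b]).mean(axis=0) for b in range(n_perm)])
    return (null_per_k.min(axis=1) <= per_k.min()).mean()

snp_pvals = rng.uniform(size=50)   # hypothetical per-SNP p-values for one set
print(artp_pvalue(snp_pvals))
```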
NASA Astrophysics Data System (ADS)
Qin, Xulei; Cong, Zhibin; Fei, Baowei
2013-11-01
An automatic segmentation framework is proposed to segment the right ventricle (RV) in echocardiographic images. The method can automatically segment both epicardial and endocardial boundaries from a continuous echocardiography series by combining sparse matrix transform, a training model, and a localized region-based level set. First, the sparse matrix transform extracts main motion regions of the myocardium as eigen-images by analyzing the statistical information of the images. Second, an RV training model is registered to the eigen-images in order to locate the position of the RV. Third, the training model is adjusted and then serves as an optimized initialization for the segmentation of each image. Finally, based on the initializations, a localized, region-based level set algorithm is applied to segment both epicardial and endocardial boundaries in each echocardiograph. Three evaluation methods were used to validate the performance of the segmentation framework. The Dice coefficient measures the overall agreement between the manual and automatic segmentation. The absolute distance and the Hausdorff distance between the boundaries from manual and automatic segmentation were used to measure the accuracy of the segmentation. Ultrasound images of human subjects were used for validation. For the epicardial and endocardial boundaries, the Dice coefficients were 90.8 ± 1.7% and 87.3 ± 1.9%, the absolute distances were 2.0 ± 0.42 mm and 1.79 ± 0.45 mm, and the Hausdorff distances were 6.86 ± 1.71 mm and 7.02 ± 1.17 mm, respectively. The automatic segmentation method based on a sparse matrix transform and level set can provide a useful tool for quantitative cardiac imaging.
Automatic classification of tissue malignancy for breast carcinoma diagnosis.
Fondón, Irene; Sarmiento, Auxiliadora; García, Ana Isabel; Silvestre, María; Eloy, Catarina; Polónia, António; Aguiar, Paulo
2018-05-01
Breast cancer is the second leading cause of cancer death among women. Its early diagnosis is extremely important to prevent avoidable deaths. However, malignancy assessment of tissue biopsies is complex and dependent on observer subjectivity. Moreover, hematoxylin and eosin (H&E)-stained histological images exhibit a highly variable appearance, even within the same malignancy level. In this paper, we propose a computer-aided diagnosis (CAD) tool for automated malignancy assessment of breast tissue samples based on the processing of histological images. We provide four malignancy levels as the output of the system: normal, benign, in situ and invasive. The method is based on the calculation of three sets of features related to nuclei, colour regions and textures considering local characteristics and global image properties. By taking advantage of well-established image processing techniques, we build a feature vector for each image that serves as an input to an SVM (Support Vector Machine) classifier with a quadratic kernel. The method has been rigorously evaluated, first with a 5-fold cross-validation within an initial set of 120 images, second with an external set of 30 different images and third with images with artefacts included. Accuracy levels range from 75.8% when the 5-fold cross-validation was performed to 75% with the external set of new images and 61.11% when the extremely difficult images were added to the classification experiment. The experimental results indicate that the proposed method is capable of distinguishing between four malignancy levels with high accuracy. Our results are close to those obtained with recent deep learning-based methods. Moreover, it performs better than other state-of-the-art methods based on feature extraction, and it can help improve the CAD of breast cancer. Copyright © 2018 Elsevier Ltd. All rights reserved.
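The classification step maps naturally onto standard tooling; a minimal sketch follows, with random stand-in features and labels in place of the paper's nuclei, colour-region, and texture descriptors.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical stand-in for the per-image feature vectors and the 4-class
# labels: 0=normal, 1=benign, 2=in situ, 3=invasive.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 30))
y = rng.integers(0, 4, size=120)

# SVM with a quadratic (degree-2 polynomial) kernel, evaluated by 5-fold CV.
clf = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=2))
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```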
Radio frequency tank eigenmode sensor for propellant quantity gauging
NASA Technical Reports Server (NTRS)
Zimmerli, Gregory A. (Inventor)
2013-01-01
A method for measuring the quantity of fluid in a tank may include the steps of selecting a match between a measured set of electromagnetic eigenfrequencies and a simulated plurality of sets of electromagnetic eigenfrequencies using a matching algorithm, wherein the match is one simulated set of electromagnetic eigenfrequencies from the simulated plurality of sets of electromagnetic eigenfrequencies, and determining the fill level of the tank based upon the match.
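The matching step can be pictured as a nearest-neighbor search over simulated eigenfrequency sets; the least-squares criterion and the frequencies below are illustrative assumptions, not the patent's specific matching algorithm.

```python
import numpy as np

def match_fill_level(measured, simulated):
    """Pick the fill level whose simulated eigenfrequency set is closest
    (in a least-squares sense) to the measured set."""
    return min(simulated,
               key=lambda level: np.sum((np.sort(simulated[level])
                                         - np.sort(measured)) ** 2))

# Hypothetical sets of the first three tank eigenfrequencies (MHz) simulated
# at different fill fractions, and one measured set.
simulated = {0.25: [410.0, 522.0, 610.0],
             0.50: [395.0, 505.0, 590.0],
             0.75: [372.0, 481.0, 566.0]}
measured = [394.2, 505.8, 589.1]
print(match_fill_level(measured, simulated))   # -> 0.5
```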
Neary, W J; Hillier, V F; Flute, T; Stephens, S D G; Ramsden, R T; Evans, D G R
2010-08-01
To investigate the relationship between those issues concerning quality of life in patients with neurofibromatosis type 2 (NF2) as identified by the closed set NF2 questionnaire and the eight norm-based measures and the physical component summary (PCS) and mental component summary (MCS) scores of the Short Form-36 (SF-36) Questionnaire. Postal questionnaire study. Questionnaires sent to subjects' home addresses. Eighty-seven adult subjects under the care of the Manchester Multidisciplinary NF2 Clinic were invited to participate. Sixty-two (71%) completed sets of closed set NF2 questionnaires and SF-36 questionnaires were returned. Subjects with NF2 scored less than the norm of 50 on both the physical component summary and mental component summary scores and the eight individual norm-based measures of the Short Form-36 questionnaire. Correlations (using Kendall's tau) were examined between patients' perceptions of their severity of difficulty with the following activities and the eight norm-based measures and the physical component summary and mental component summary scores of the Short Form-36 questionnaire: Communicating with spouse/significant other (N = 61). The correlation coefficients were significant at the 0.01 level for the mental component summary score, together with three of the norm-based scores [vitality (VT), social functioning and role emotional]. Social communication (N = 62). All 10 correlations were significant at the 0.01 or 0.001 level. Balance (N = 59). All 10 correlations were highly significant at the P < 0.001 level. Hearing difficulties (N = 61). All correlations were significant at either the 0.01 level or less apart from the mental component summary score and three of the norm-based scores (role physical, VT and mental health). Mood change (N = 61). All correlations were significant at the 0.01 level or less, apart from one norm-based score (role physical). The Short Form-36 questionnaire has allowed us to relate patients' perceptions of their difficulties, as identified by the closed set NF2 questionnaire, to the physical and mental domains measured by this validated and widely used scale, and has provided further insight into areas of functioning affected by NF2.
NASA Astrophysics Data System (ADS)
Li, Xiaobing; Qiu, Tianshuang; Lebonvallet, Stephane; Ruan, Su
2010-02-01
This paper presents a brain tumor segmentation method which automatically segments tumors from human brain MRI image volumes. The presented model is based on the symmetry of the human brain and the level set method. First, the midsagittal plane of an MRI volume is located and slices of the volume potentially containing tumor are identified according to their symmetry; an initial boundary of the tumor is then determined, by watershed and morphological algorithms, in the slice in which the tumor appears largest. Second, the level set method is applied to the initial boundary, driving the curve to evolve and stop at the appropriate tumor boundary. Last, the tumor boundary is projected slice by slice onto adjacent slices as initial boundaries through the volume to segment the whole tumor. The experimental results are compared with hand tracing by an expert and show relatively good agreement between the two.
A novel approach to segmentation and measurement of medical image using level set methods.
Chen, Yao-Tien
2017-06-01
The study proposes a novel approach for segmentation and visualization plus value-added surface area and volume measurements for brain medical image analysis. The proposed method contains edge detection and Bayesian based level set segmentation, surface and volume rendering, and surface area and volume measurements for 3D objects of interest (i.e., brain tumor, brain tissue, or whole brain). Two extensions based on edge detection and Bayesian level set are first used to segment 3D objects. Ray casting and a modified marching cubes algorithm are then adopted to facilitate volume and surface visualization of medical-image dataset. To provide physicians with more useful information for diagnosis, the surface area and volume of an examined 3D object are calculated by the techniques of linear algebra and surface integration. Experiment results are finally reported in terms of 3D object extraction, surface and volume rendering, and surface area and volume measurements for medical image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.
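The surface and volume measurements described can be approximated with off-the-shelf tools; the sketch below uses scikit-image's marching cubes on a synthetic mask rather than the paper's Bayesian level set output.

```python
import numpy as np
from skimage import measure

# Hypothetical binary mask of a 3D object of interest (a sphere here in
# place of a segmented tumor), with isotropic 1 mm voxels assumed.
z, y, x = np.mgrid[-32:32, -32:32, -32:32]
mask = (x**2 + y**2 + z**2) < 20**2

# Extract a triangulated surface, then measure surface area and volume.
verts, faces, normals, values = measure.marching_cubes(mask.astype(float), level=0.5)
surface_area = measure.mesh_surface_area(verts, faces)   # mm^2
volume = mask.sum() * 1.0                                # mm^3, by voxel counting
print(surface_area, volume)
```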
Achuthan, Anusha; Rajeswari, Mandava; Ramachandram, Dhanesh; Aziz, Mohd Ezane; Shuaib, Ibrahim Lutfi
2010-07-01
This paper introduces an approach to perform segmentation of regions in computed tomography (CT) images that exhibit intra-region intensity variations and at the same time have similar intensity distributions with surrounding/adjacent regions. In this work, we adapt a feature computed from wavelet transform called wavelet energy to represent the region information. The wavelet energy is embedded into a level set model to formulate the segmentation model called wavelet energy-guided level set-based active contour (WELSAC). The WELSAC model is evaluated using several synthetic and CT images focusing on tumour cases, which contain regions demonstrating the characteristics of intra-region intensity variations and having high similarity in intensity distributions with the adjacent regions. The obtained results show that the proposed WELSAC model is able to segment regions of interest in close correspondence with the manual delineation provided by the medical experts and to provide a solution for tumour detection. Copyright 2010 Elsevier Ltd. All rights reserved.
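A minimal sketch of the wavelet energy feature follows; the wavelet family, decomposition level, and normalization are assumptions rather than the WELSAC model's exact choices.

```python
import numpy as np
import pywt

def wavelet_energy(region, wavelet="db2", level=2):
    """Energy of each wavelet subband of an image region, one common way
    to realize a 'wavelet energy' texture descriptor."""
    coeffs = pywt.wavedec2(region, wavelet=wavelet, level=level)
    energies = [np.sum(coeffs[0] ** 2)]                    # approximation band
    for cH, cV, cD in coeffs[1:]:                          # detail bands
        energies.extend([np.sum(cH ** 2), np.sum(cV ** 2), np.sum(cD ** 2)])
    return np.asarray(energies) / region.size              # normalized

region = np.random.default_rng(0).normal(size=(64, 64))
print(wavelet_energy(region))
```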
Castillo-Cara, Manuel; Lovón-Melgarejo, Jesús; Bravo-Rocca, Gusseppe; Orozco-Barbosa, Luis; García-Varea, Ismael
2017-01-01
Nowadays, there is a great interest in developing accurate wireless indoor localization mechanisms enabling the implementation of many consumer-oriented services. Among the many proposals, wireless indoor localization mechanisms based on the Received Signal Strength Indication (RSSI) are being widely explored. Most studies have focused on the evaluation of the capabilities of different mobile device brands and wireless network technologies. Furthermore, different parameters and algorithms have been proposed as a means of improving the accuracy of wireless-based localization mechanisms. In this paper, we focus on the tuning of the RSSI fingerprint to be used in the implementation of a Bluetooth Low Energy 4.0 (BLE4.0) Bluetooth localization mechanism. Following a holistic approach, we start by assessing the capabilities of two Bluetooth sensor/receiver devices. We then evaluate the relevance of the RSSI fingerprint reported by each BLE4.0 beacon operating at various transmission power levels using feature selection techniques. Based on our findings, we use two classification algorithms in order to improve the setting of the transmission power levels of each of the BLE4.0 beacons. Our main findings show that our proposal can greatly improve the localization accuracy by setting a custom transmission power level for each BLE4.0 beacon. PMID:28590413
Sequence stratigraphic distribution of coaly rocks: Fundamental controls and paralic examples
Bohacs, K.; Suter, J.
1997-01-01
Significant volumes of terrigenous organic matter can be preserved to form coals only when and where the overall increase in accommodation approximately equals the production rate of peat. Accommodation is a function of subsidence and base level. For mires, base level is very specifically the groundwater table. In paralic settings, the groundwater table is strongly controlled by sea level and the precipitation/evaporation ratio. Peat accumulates over a range of rates, but always with a definite maximum rate set by original organic productivity and space available below depositional base level (groundwater table). Below a threshold accommodation rate (nonzero), no continuous peats accumulate, due to falling or low groundwater table, sedimentary bypass, and extensive erosion by fluvial channels. This is typical of upper highstand, lowstand fan, and basal lowstand-wedge systems tracts. Higher accommodation rates provide relatively stable conditions with rising groundwater tables. Mires initiate and thrive, quickly filling local accommodation vertically and expanding laterally, favoring accumulation of laterally continuous coals in paralic zones within both middle lowstand and middle highstand systems tracts. If the accommodation increase balances or slightly exceeds organic productivity, mires accumulate peat vertically, yielding thicker, more isolated coals, most likely during late lowstand-early transgressive and late transgressive-early highstand periods. At very large accommodation increases, mires are stressed and eventually inundated by clastics or standing water (as in middle transgressive systems tracts). These relations should be valid for mires in all settings, including alluvial, lake plain, and paralic. The tie to sea level in paralic zones depends on local subsidence, sediment supply, and groundwater regimes. These concepts are also useful for investigating the distribution of seal and reservoir facies in nonmarine settings.
Using rewards and penalties to obtain desired subject performance
NASA Technical Reports Server (NTRS)
Cook, M.; Jex, H. R.; Stein, A. C.; Allen, R. W.
1981-01-01
The use of operant conditioning procedures, specifically negative reinforcement, to achieve stable learning behavior is described. The critical tracking test (CTT), a method of detecting human operator impairment, was tested. A pass level is set for each subject based on that subject's asymptotic skill level while sober. It is critical that training be complete before the individualized pass level is set so that impairment can be detected. The results provide a more general basis for the application of reward/penalty structures in manual control research.
ERIC Educational Resources Information Center
Boursicot, Katharine
2006-01-01
In this era of audit and accountability, there is an imperative to demonstrate and document that appropriate standards have been set in professional education. In medicine, stakeholders want assurance that graduates have attained the required level of competence to be awarded a provisional licence to practise. To investigate the results of a…
Segmentation Using Multispectral Adaptive Contours
2004-02-29
Boundary-based active contour models are reviewed in this report, including the geometric active contours proposed by Caselles et al. [2] and by Malladi and Sethian [13]. References cited include: 13. R. Malladi, J. Sethian, "Image Processing via Level Set Curvature Flow," Proceedings of the National Academy of Sciences, vol. 92, p. 7046, 1995; 14. R. Malladi, J. Sethian, C. Vemuri, "Shape Modeling with Front Propagation: A Level Set Approach," IEEE Transactions on Pattern Analysis and Machine Intelligence.
AISLE: an automatic volumetric segmentation method for the study of lung allometry.
Ren, Hongliang; Kazanzides, Peter
2011-01-01
We developed a fully automatic segmentation method for volumetric CT (computer tomography) datasets to support construction of a statistical atlas for the study of allometric laws of the lung. The proposed segmentation method, AISLE (Automated ITK-Snap based on Level-set), is based on the level-set implementation from an existing semi-automatic segmentation program, ITK-Snap. AISLE can segment the lung field without human interaction and provide intermediate graphical results as desired. The preliminary experimental results show that the proposed method can achieve accurate segmentation, in terms of volumetric overlap metric, by comparing with the ground-truth segmentation performed by a radiologist.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, F; Yang, Y; Young, L
Purpose: Radiomic texture features derived from oncologic PET have recently been brought under intense investigation within the context of patient stratification and treatment outcome prediction in a variety of cancer types; however, their validity has not yet been examined. This work aims to validate radiomic PET texture metrics through the use of realistic simulations in a ground truth setting. Methods: Simulation of FDG-PET was conducted by applying the Zubal phantom as an attenuation map to the SimSET software package, which employs Monte Carlo techniques to model the physical process of emission imaging. A total of 15 irregularly-shaped lesions featuring heterogeneous activity distribution were simulated. For each simulated lesion, 28 texture features in relation to intensity histograms (GLIH), grey-level co-occurrence matrices (GLCOM), neighborhood difference matrices (GLNDM), and zone size matrices (GLZSM) were evaluated and compared with their respective values extracted from the ground truth activity map. Results: In reference to the values from the ground truth images, texture parameters appearing on the simulated data varied over a range of 0.73–3026.2% for GLIH-based, 0.02–100.1% for GLCOM-based, 1.11–173.8% for GLNDM-based, and 0.35–66.3% for GLZSM-based features. For the majority of the examined texture metrics (16/28), their values on the simulated data differed significantly from those from the ground truth images (P-values ranging from <0.0001 to 0.04). Features not exhibiting significant differences comprised GLIH-based standard deviation; GLCOM-based energy and entropy; GLNDM-based coarseness and contrast; and GLZSM-based low gray-level zone emphasis, high gray-level zone emphasis, short zone low gray-level emphasis, long zone low gray-level emphasis, long zone high gray-level emphasis, and zone size nonuniformity. Conclusion: The extent to which PET imaging disturbs texture appearance is feature-dependent and can be substantial. It is thus advised that use of PET texture parameters for predictive and prognostic measurements in the oncologic setting awaits further systematic and critical evaluation.
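For orientation, grey-level co-occurrence features of the GLCOM class can be computed with scikit-image as below; the quantization to 32 levels and the synthetic patch are assumptions for illustration.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Hypothetical quantized PET lesion patch (e.g., uptake binned to 32 levels).
rng = np.random.default_rng(0)
lesion = rng.integers(0, 32, size=(40, 40), dtype=np.uint8)

# Grey-level co-occurrence matrix over 4 directions at distance 1, then a
# few of the GLCOM-class features examined in the study.
glcm = graycomatrix(lesion, distances=[1],
                    angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],
                    levels=32, symmetric=True, normed=True)
for prop in ("contrast", "homogeneity", "energy", "correlation"):
    print(prop, graycoprops(glcm, prop).mean())
```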
Targeted exploration and analysis of large cross-platform human transcriptomic compendia
Zhu, Qian; Wong, Aaron K; Krishnan, Arjun; Aure, Miriam R; Tadych, Alicja; Zhang, Ran; Corney, David C; Greene, Casey S; Bongo, Lars A; Kristensen, Vessela N; Charikar, Moses; Li, Kai; Troyanskaya, Olga G.
2016-01-01
We present SEEK (http://seek.princeton.edu), a query-based search engine across very large transcriptomic data collections, including thousands of human data sets from almost 50 microarray and next-generation sequencing platforms. SEEK uses a novel query-level cross-validation-based algorithm to automatically prioritize data sets relevant to the query and a robust search approach to identify query-coregulated genes, pathways, and processes. SEEK provides cross-platform handling, multi-gene query search, iterative metadata-based search refinement, and extensive visualization-based analysis options. PMID:25581801
Eriks-Hoogland, Inge E; Brinkhof, Martin W G; Al-Khodairy, Abdul; Baumberger, Michael; Brechbühl, Jörg; Curt, Armin; Mäder, Mark; Stucki, Gerold; Post, Marcel W M
2011-11-01
The aims of this study were to provide a selection of biomedical domains based on the comprehensive International Classification of Functioning, Disability, and Health (ICF) core sets for spinal cord injury (SCI) and to present an overview of the corresponding measurement instruments. Based on the Biomedical Domain Set, the SCI literature, the International Spinal Cord Society international data sets, and the Spinal Cord Injury Rehabilitation Evidence project publications were used to derive category specifications for use in SCI research. Expert opinion was used to derive a priority selection. The same sources were used to determine candidate measurement instruments for the specification of body functions and body structures using an example, and guiding principles were applied to select the most appropriate biomedical measurement instrument(s) for use in an SCI research project. Literature searches were performed for 41 second-level ICF body functions categories and for four second-level ICF body structures categories. For some of these categories, only a few candidate measurement instruments were found with limited variation in the type of measurement instruments. An ICF-based measurement set for biomedical aspects of functioning with SCI was established. For some categories of the ICF core sets for SCI, there is a need to develop measurement instruments.
Forcina, Alessandra; Rancoita, Paola M V; Marcatti, Magda; Greco, Raffaella; Lupo-Stanghellini, Maria Teresa; Carrabba, Matteo; Marasco, Vincenzo; Di Serio, Clelia; Bernardi, Massimo; Peccatori, Jacopo; Corti, Consuelo; Bondanza, Attilio; Ciceri, Fabio
2017-12-01
Infection-related mortality (IRM) is a substantial component of nonrelapse mortality (NRM) after allogeneic hematopoietic stem cell transplantation (allo-HSCT). No scores have been developed to predict IRM before transplantation. Pretransplantation clinical and biochemical data were collected from a study cohort of 607 adult patients undergoing allo-HSCT between January 2009 and February 2017. In a training set of 273 patients, multivariate analysis revealed that age >60 years (P = .003), cytomegalovirus host/donor serostatus different from negative/negative (P < .001), pretransplantation IgA level <1.11 g/L (P = .004), and pretransplantation IgM level <.305 g/L (P = .028) were independent predictors of increased IRM. Based on these results, we developed and subsequently validated a 3-tiered weighted prognostic index for IRM in a retrospective set of patients (n = 219) and a prospective set of patients (n = 115). Patients were assigned to 3 different IRM risk classes based on this index score. The score significantly predicted IRM in the training set, retrospective validation set, and prospective validation set (P < .001, .044, and .011, respectively). In the training set, 100-day IRM was 5% for the low-risk group, 11% for the intermediate-risk group, and 16% for the high-risk group. In the retrospective validation set, the respective 100-day IRM values were 7%, 17%, and 28%, and in the prospective set, they were 0%, 5%, and 7%. The score also predicted overall survival (P < .001 in the training set, P < .041 in the retrospective validation set, and P < .023 in the prospective validation set). Because pretransplantation levels of IgA/IgM can be modulated by the supplementation of enriched immunoglobulins, these results suggest the possibility of prophylactic interventional studies to improve transplantation outcomes. Copyright © 2017 The American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Castillo, Miguel; Bishop, Paul; Jansen, John D.
2013-01-01
A sudden drop in river base level can trigger a knickpoint that propagates throughout the fluvial network, causing a transient state in the landscape. Knickpoint retreat has been confirmed in large fluvial settings (drainage areas > 100 km2), and field data suggest that the same applies to small bedrock river catchments (drainage areas < 100 km2). Nevertheless, knickpoint recession on resistant lithologies with structure that potentially affects the retreat rate needs to be confirmed with field-based data. Moreover, it remains unclear whether small bedrock rivers can absorb base-level fall via knickpoint retreat. Here we evaluate the response of small bedrock rivers to base-level fall on the isle of Jura in western Scotland (UK), where rivers incise into dipping quartzite. The mapping of raised beach deposits and strath terraces, and the analysis of stream long profiles, were used to identify knickpoints that had been triggered by base-level fall. Our results indicate that the distance of knickpoint retreat scales with drainage area as a power law, irrespective of structural setting. On the other hand, local channel slope and basin size influence the vertical distribution of knickpoints. Furthermore, at small drainage areas (~ 4 km2) rivers are unable to absorb the full amount of base-level fall, and channel reach morphology downstream of the knickpoint tends towards convexity. The results obtained here confirm that knickpoint retreat is mostly controlled by stream discharge, as has been observed for other transient landscapes. Local controls, reflecting basin size and channel slope, have an effect on the vertical distribution of knickpoints; such controls are also related to the ability of rivers to absorb base-level fall.
Measuring and Specifying Combinatorial Coverage of Test Input Configurations
Kuhn, D. Richard; Kacker, Raghu N.; Lei, Yu
2015-01-01
A key issue in testing is how many tests are needed for a required level of coverage or fault detection. Estimates are often based on error rates in initial testing, or on code coverage. For example, tests may be run until a desired level of statement or branch coverage is achieved. Combinatorial methods present an opportunity for a different approach to estimating required test set size, using characteristics of the test set. This paper describes methods for estimating the coverage of, and ability to detect, t-way interaction faults of a test set based on a covering array. We also develop a connection between (static) combinatorial coverage and (dynamic) code coverage, such that if a specific condition is satisfied, 100% branch coverage is assured. Using these results, we propose practical recommendations for using combinatorial coverage in specifying test requirements. PMID:28133442
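A minimal measurement of t-way combinatorial coverage can be written directly from the definition; binary parameters are assumed here for brevity.

```python
from itertools import combinations

def t_way_coverage(tests, t=2, n_values=2):
    """Fraction of all t-way parameter-value combinations covered by a
    test set, i.e. the (static) combinatorial coverage of the test set."""
    k = len(tests[0])
    covered = total = 0
    for cols in combinations(range(k), t):
        # Distinct value combinations actually exercised on these t columns.
        seen = {tuple(test[c] for c in cols) for test in tests}
        total += n_values ** t
        covered += len(seen)
    return covered / total

# Five tests on four binary parameters forming a 2-way covering array.
tests = [(0, 0, 0, 0), (0, 1, 1, 1), (1, 0, 1, 1), (1, 1, 0, 1), (1, 1, 1, 0)]
print(t_way_coverage(tests, t=2))   # -> 1.0, i.e. 100% pairwise coverage
```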
Simulated impact of RTS,S/AS01 vaccination programs in the context of changing malaria transmission.
Brooks, Alan; Briët, Olivier J T; Hardy, Diggory; Steketee, Richard; Smith, Thomas A
2012-01-01
The RTS,S/AS01 pre-erythrocytic malaria vaccine is in phase III clinical trials. It is critical to anticipate where and how it should be implemented if trials are successful. Such planning may be complicated by changing levels of malaria transmission. Computer simulations were used to examine RTS,S/AS01 impact, using a vaccine profile based on phase II trial results, and assuming that protection decays only slowly. Settings were simulated in which baseline transmission (in the absence of vaccine) was fixed or varied between 2 and 20 infectious mosquito bites per person per annum (ibpa) over ten years. Four delivery strategies were studied: routine infant immunization (EPI), EPI plus infant catch-up, EPI plus school-based campaigns, and EPI plus mass campaigns. Impacts in changing transmission settings were similar to those in fixed settings. Assuming a persistent effect of vaccination, at 2 ibpa, the vaccine averted approximately 5-7 deaths per 1000 doses of vaccine when delivered via mass campaigns, but the benefit was less at higher transmission levels. EPI, catch-up and school-based strategies averted 2-3 deaths per 1000 doses in settings with 2 ibpa. In settings where transmission was decreasing or increasing, EPI, catch-up and school-based strategies averted approximately 3-4 deaths per 1000 doses. Where transmission is changing, it appears to be sufficient to consider simulations of pre-erythrocytic vaccine impact at a range of initial transmission levels. At 2 ibpa, mass campaigns averted the most deaths and reduced transmission, but this requires further study. If delivered via EPI, RTS,S/AS01 could avert approximately 6-11 deaths per 1000 vaccinees in all examined settings, similar to estimates for pneumococcal conjugate vaccine in African infants. These results support RTS,S/AS01 implementation via EPI, for example alongside vector control interventions, providing that the phase III trials provide support for our assumptions about efficacy.
Data discretization for novel resource discovery in large medical data sets.
Benoît, G.; Andrews, J. E.
2000-01-01
This paper is motivated by the problems of dealing with large data sets in information retrieval. The authors suggest an information retrieval framework based on mathematical principles to organize and permit end-user manipulation of a retrieval set. By adjusting through the interface the weights and types of relationships between query and set members, it is possible to expose unanticipated, novel relationships between the query/document pair. The retrieval set as a whole is parsed into discrete concept-oriented subsets (based on within-set similarity measures) and displayed on screen as interactive "graphic nodes" in an information space, distributed at first based on the vector model (similarity measure of set to query). The result is a visualized map wherein it is possible to identify main concept regions and multiple sub-regions as dimensions of the same data. Users may examine the membership within sub-regions. Based on this framework, a data visualization user interface was designed to encourage users to work with the data on multiple levels to find novel relationships between the query and retrieval set members. Space constraints prohibit addressing all aspects of this project. PMID:11079845
Using Registered Dental Hygienists to Promote a School-Based Approach to Dental Public Health
Wellever, Anthony; Kelly, Patricia
2017-01-01
We examine a strategy for improving oral health in the United States by focusing on low-income children in school-based settings. Vulnerable children often experience cultural, social, economic, structural, and geographic barriers when trying to access dental services in traditional dental office settings. These disparities have been discussed for more than a decade in multiple US Department of Health and Human Services publications. One solution is to revise dental practice acts to allow registered dental hygienists increased scope of services, expanded public health delivery opportunities, and decreased dentist supervision. We provide examples of how federally qualified health centers have implemented successful school-based dental models within the parameters of two state policies that allow registered dental hygienists varying levels of dentist supervision. Changes to dental practice acts at the state level allowing registered dental hygienists to practice with limited supervision in community settings, such as schools, may provide vulnerable populations greater access to screening and preventive services. We derive our recommendations from expert opinion. PMID:28661808
Affect-Aware Adaptive Tutoring Based on Human-Automation Etiquette Strategies.
Yang, Euijung; Dorneich, Michael C
2018-06-01
We investigated adapting the interaction style of intelligent tutoring system (ITS) feedback based on human-automation etiquette strategies. Most ITSs adapt the content difficulty level, adapt the feedback timing, or provide extra content when they detect cognitive or affective decrements. Our previous work demonstrated that changing the interaction style via different feedback etiquette strategies has differential effects on students' motivation, confidence, satisfaction, and performance. The best etiquette strategy was also determined by user frustration. Based on these findings, a rule set was developed that systemically selected the proper etiquette strategy to address one of four learning factors (motivation, confidence, satisfaction, and performance) under two different levels of user frustration. We explored whether etiquette strategy selection based on this rule set (systematic) or random changes in etiquette strategy for a given level of frustration affected the four learning factors. Participants solved mathematics problems under different frustration conditions with feedback that adapted dynamic changes in etiquette strategies either systematically or randomly. The results demonstrated that feedback with etiquette strategies chosen systematically via the rule set could selectively target and improve motivation, confidence, satisfaction, and performance more than changing etiquette strategies randomly. The systematic adaptation was effective no matter the level of frustration for the participant. If computer tutors can vary the interaction style to effectively mitigate negative emotions, then ITS designers would have one more mechanism in which to design affect-aware adaptations that provide the proper responses in situations where human emotions affect the ability to learn.
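A toy version of such a rule set is easy to express as a lookup table; the factor-to-strategy assignments below are illustrative stand-ins, not the study's validated rules (the strategy names follow the politeness-strategy vocabulary common in human-automation etiquette work).

```python
# Map (targeted learning factor, user frustration level) to an etiquette
# strategy for phrasing tutor feedback. Assignments are hypothetical.
RULES = {
    ("motivation",   "low"):  "bald",        # direct, unsoftened feedback
    ("motivation",   "high"): "positive",    # praise-oriented politeness
    ("confidence",   "low"):  "positive",
    ("confidence",   "high"): "negative",    # face-saving, indirect wording
    ("satisfaction", "low"):  "positive",
    ("satisfaction", "high"): "off-record",  # hints rather than directives
    ("performance",  "low"):  "bald",
    ("performance",  "high"): "negative",
}

def select_strategy(factor, frustration):
    return RULES[(factor, frustration)]

print(select_strategy("confidence", "high"))   # -> "negative"
```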
Systematic review of skills transfer after surgical simulation-based training.
Dawe, S R; Pena, G N; Windsor, J A; Broeders, J A J L; Cregan, P C; Hewett, P J; Maddern, G J
2014-08-01
Simulation-based training assumes that skills are directly transferable to the patient-based setting, but few studies have correlated simulated performance with surgical performance. A systematic search strategy was undertaken to find studies published since the last systematic review, published in 2007. Inclusion of articles was determined using a predetermined protocol, independent assessment by two reviewers and a final consensus decision. Studies that reported on the use of surgical simulation-based training and assessed the transferability of the acquired skills to a patient-based setting were included. Twenty-seven randomized clinical trials and seven non-randomized comparative studies were included. Fourteen studies investigated laparoscopic procedures, 13 endoscopic procedures and seven other procedures. These studies provided strong evidence that participants who reached proficiency in simulation-based training performed better in the patient-based setting than their counterparts who did not have simulation-based training. Simulation-based training was equally as effective as patient-based training for colonoscopy, laparoscopic camera navigation and endoscopic sinus surgery in the patient-based setting. These studies strengthen the evidence that simulation-based training, as part of a structured programme and incorporating predetermined proficiency levels, results in skills transfer to the operative setting. © 2014 BJS Society Ltd. Published by John Wiley & Sons Ltd.
The Predictive Value of Ultrasound Learning Curves Across Simulated and Clinical Settings.
Madsen, Mette E; Nørgaard, Lone N; Tabor, Ann; Konge, Lars; Ringsted, Charlotte; Tolsgaard, Martin G
2017-01-01
The aim of the study was to explore whether learning curves on a virtual-reality (VR) sonographic simulator can be used to predict subsequent learning curves on a physical mannequin and learning curves during clinical training. Twenty midwives completed a simulation-based training program in transvaginal sonography. The training was conducted on a VR simulator as well as on a physical mannequin. A subgroup of 6 participants underwent subsequent clinical training. During each of the 3 steps, the participants' performance was assessed using instruments with established validity evidence, and they advanced to the next level only after attaining predefined levels of performance. The number of repetitions and the time needed to achieve the predefined performance levels were recorded along with the performance scores in each setting. Finally, the outcomes were correlated across settings. A good correlation was found between the time needed to achieve predefined performance levels on the VR simulator and on the physical mannequin (Pearson correlation coefficient .78; P < .001). Performance scores on the VR simulator correlated well with the clinical performance scores (Pearson correlation coefficient .81; P = .049). No significant correlations were found between the numbers of attempts needed to reach proficiency across the 3 different settings. A post hoc analysis found that the 50% of trainees fastest at reaching proficiency during simulation-based training received higher clinical performance scores than trainees among the 50% slowest (P = .025). Performances during simulation-based sonography training may predict performance in related tasks and subsequent clinical learning curves. © 2016 by the American Institute of Ultrasound in Medicine.
NASA Astrophysics Data System (ADS)
Yuan, H. Z.; Chen, Z.; Shu, C.; Wang, Y.; Niu, X. D.; Shu, S.
2017-09-01
In this paper, a free energy-based surface tension force (FESF) model is presented for accurately resolving the surface tension force in numerical simulation of multiphase flows by the level set method. By using the analytical form of the order parameter along the normal direction to the interface in the phase-field method and the free energy principle, the FESF model offers an explicit and analytical formulation for the surface tension force. The only variable in this formulation is the normal distance to the interface, which can be substituted by the distance function solved by the level set method. On one hand, as compared to the conventional continuum surface force (CSF) model in the level set method, the FESF model introduces no regularized delta function, and therefore suffers less from numerical diffusion and performs better in mass conservation. On the other hand, as compared to the phase-field surface tension force (PFSF) model, the evaluation of the surface tension force in the FESF model is based on an analytical approach rather than numerical approximations of spatial derivatives. Therefore, better numerical stability and higher accuracy can be expected. Various numerical examples are tested to validate the robustness of the proposed FESF model. It turns out that the FESF model performs better than the CSF and PFSF models in terms of accuracy, stability, convergence speed and mass conservation. It is also shown in numerical tests that the FESF model can effectively simulate problems with high density/viscosity ratios, high Reynolds numbers and severe topological interfacial changes.
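To make the contrast with CSF concrete, the following is a minimal 2D sketch of the idea, assuming the standard phase-field profile phi(d) = tanh(d/(sqrt(2) eps)) along the interface normal; under that assumption the delta-like kernel below is analytic in the signed distance and integrates to one across the interface. Function names and parameter values are illustrative, not from the paper.

```python
import numpy as np

def fesf_surface_tension(psi, sigma=0.07, eps=1.5, h=1.0):
    """Sketch of a free-energy-based surface tension force (FESF).

    psi   : 2D signed-distance level-set field
    sigma : surface tension coefficient
    eps   : interface half-thickness of the phase-field profile
    h     : grid spacing

    Assumes the tanh order-parameter profile phi(d) = tanh(d/(sqrt(2)*eps));
    the resulting analytic kernel replaces the regularized delta of CSF.
    """
    # analytic delta-like kernel derived from the free-energy profile;
    # it integrates to 1 across the interface
    u = psi / (np.sqrt(2.0) * eps)
    delta = 3.0 / (4.0 * np.sqrt(2.0) * eps) / np.cosh(u) ** 4

    # interface normal and curvature kappa = div(grad psi / |grad psi|)
    gy, gx = np.gradient(psi, h)
    norm = np.sqrt(gx ** 2 + gy ** 2) + 1e-12
    nx, ny = gx / norm, gy / norm
    kappa = np.gradient(nx, h, axis=1) + np.gradient(ny, h, axis=0)

    # surface tension force density, concentrated near psi = 0
    fx = sigma * kappa * delta * nx
    fy = sigma * kappa * delta * ny
    return fx, fy
```

Because the kernel is an explicit function of the signed distance, no numerical smearing of a delta function is needed, which is the property the abstract credits for the improved mass conservation.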
PRN 96-8: Toxicologically Significant Levels of Pesticide Active Ingredients
This notice sets out EPA's interpretation of the term toxicologically significant as it applies to contaminants in pesticide products that are also pesticide active ingredients. It provides risk-based concentration levels of such contaminants.
Ecological Soil Screening Level
The Eco-SSL derivation process is used to derive a set of risk-based ecological soil screening levels (Eco-SSLs) for many of the soil contaminants that are frequently of ecological concern for plants and animals at hazardous waste sites.
Adaptive Set-Based Methods for Association Testing.
Su, Yu-Chen; Gauderman, William James; Berhane, Kiros; Lewinger, Juan Pablo
2016-02-01
With a typical sample size of a few thousand subjects, a single genome-wide association study (GWAS) using traditional one single nucleotide polymorphism (SNP)-at-a-time methods can only detect genetic variants conferring a sizable effect on disease risk. Set-based methods, which analyze sets of SNPs jointly, can detect variants with smaller effects acting within a gene, a pathway, or other biologically relevant sets. Although self-contained set-based methods (those that test sets of variants without regard to variants not in the set) are generally more powerful than competitive set-based approaches (those that rely on comparison of variants in the set of interest with variants not in the set), there is no consensus as to which self-contained methods are best. In particular, several self-contained set tests have been proposed to directly or indirectly "adapt" to the a priori unknown proportion and distribution of effects of the truly associated SNPs in the set, which is a major determinant of their power. A popular adaptive set-based test is the adaptive rank truncated product (ARTP), which seeks the set of SNPs that yields the best-combined evidence of association. We compared the standard ARTP, several ARTP variations we introduced, and other adaptive methods in a comprehensive simulation study to evaluate their performance. We used permutations to assess significance for all the methods and thus provide a level playing field for comparison. We found the standard ARTP test to have the highest power across our simulations followed closely by the global model of random effects (GMRE) and a least absolute shrinkage and selection operator (LASSO)-based test. © 2015 WILEY PERIODICALS, INC.
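As a point of reference, here is a minimal sketch of the ARTP construction, assuming per-SNP p-values have already been computed for the observed data and for B permuted data sets (phenotype labels shuffled); function names and truncation points are illustrative. The essential step is that the minimum p-value over truncation points is itself re-calibrated against the permutation distribution, which is what keeps the adaptive test valid.

```python
import numpy as np

def artp_pvalue(snp_pvals, null_pvals, truncation_points=(1, 5, 10)):
    """Sketch of the adaptive rank truncated product (ARTP) test.

    snp_pvals  : (m,) per-SNP p-values for the observed data
    null_pvals : (B, m) per-SNP p-values from B label permutations
    Truncation points must not exceed the number of SNPs m.
    """
    def rtp_stats(p, ks):
        # negative log product of the k smallest p-values, per truncation k
        s = np.sort(p, axis=-1)
        logs = np.cumsum(np.log(s), axis=-1)
        return np.stack([-logs[..., k - 1] for k in ks], axis=-1)

    B = null_pvals.shape[0]
    obs = rtp_stats(snp_pvals, truncation_points)           # (K,)
    null = rtp_stats(null_pvals, truncation_points)         # (B, K)

    # per-truncation-point p-values from the permutation distribution
    p_obs = (1 + np.sum(null >= obs, axis=0)) / (B + 1)     # (K,)
    p_null = (1 + np.sum(null[:, None, :] >= null[None, :, :], axis=0)) / (B + 1)

    # adapt: take the best p-value over truncation points, then correct
    # for that selection using the same permutations
    t_obs = p_obs.min()
    t_null = p_null.min(axis=-1)                            # (B,)
    return (1 + np.sum(t_null <= t_obs)) / (B + 1)
```

Note that the same permutations serve both layers of the procedure, so the adaptivity costs no additional permutation runs.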
NASA Astrophysics Data System (ADS)
Howard, A. D.; Matsubara, Y.; Lloyd, H.
2006-12-01
The DELIM landform evolution model has been adapted to investigate erosional and depositional landforms in two settings with fluctuating base levels. The first is erosion and wave planation of terraced landscapes in Coastal Plain sediments along the estuarine Potomac River. The last 3.5 million years of erosion is simulated with base level fluctuations based upon the long-term oceanic δ18O record, eustatic sea level changes during the last 120 ka, estimates of the history of tectonic uplift in the region, and maximum depths of incision of the Potomac River during sea-level lowstands. Inhibition of runoff erosion by vegetation has been a crucial factor allowing persistence of uplands in the soft coastal plain bedrock. The role of vegetation is simulated as a contributing-area-dependent critical shear stress. Development of wave-cut terraces is simulated by episodic planation of the landscape during base-level highstands. Although low base level excursions are infrequent and of short duration, the total amount of erosion is largely controlled by the depth and frequency of lowstands. The model has also been adapted to account for flow routing and accompanying erosion and sedimentation in landscapes with multiple enclosed depressions. The hydrological portion of the model has been calibrated and tested in the Great Basin and Mojave regions of the southwestern U.S. In such a setting, runoff, largely from mountains, may flow through several lacustrine basins, each with evaporative losses. An iterative approach determines the size and depth of lakes, including overflow (or not), that balances runoff and evaporation. The model utilizes information on temperatures, rainfall, runoff, and evaporation within the region to parameterize evaporation and runoff as functions of latitude, mean annual temperature, precipitation, and elevation. The model is successful in predicting the location of modern perennial lakes in the region as well as that of lakes during the last glacial maximum based upon published estimates of changes in mean annual temperature and precipitation within the region. The hydrological model has been coupled with the DELIM landform evolution model to investigate expected patterns of basin sedimentation in cratered landscapes on Mars and the role that fluctuating lake levels have on the form and preservation of deltaic and shoreline sedimentary platforms. As would be expected, base levels that fluctuate widely complicate the pattern of depositional landforms, but recognizable coastal benches develop even with high-amplitude variations.
Chouinard, Véronique; Contandriopoulos, Damien; Perroux, Mélanie; Larouche, Catherine
2017-06-26
While greater reliance on nurse practitioners in primary healthcare settings can improve service efficiency and accessibility, their integration is not straightforward, challenging existing role definitions of both registered nurses and physicians. Developing adequate support practices is therefore essential in primary healthcare nurse practitioners' integration. This study's main objective is to examine the different structures and mechanisms put in place to support the development of primary healthcare nurse practitioners' practice in different healthcare settings, and to develop a practical model for identifying and planning adequate support practices. This study is part of a larger multicentre study on primary healthcare nurse practitioners in the province of Quebec, Canada. It focuses on three healthcare settings into which one or more primary healthcare nurse practitioners have been integrated. Case studies were selected to cover maximum variation in terms of location, organizational setting, and stage of primary healthcare nurse practitioner integration. Findings are based on the analysis of available documentation in each primary healthcare setting and on semi-structured interviews with key actors in each clinical team. Data were analyzed following thematic and cross-sectional analysis approaches. This article identifies three types of support practices: clinical, team, and systemic. This three-level analysis demonstrates that, on the ground, primary healthcare nurse practitioner integration is essentially a team-based, multilevel endeavour. Despite the existence of a provincial implementation plan, the three settings adopted very different implementation structures and practices, and different actors were involved at each of the three levels. The results also indicated that nursing departments played a decisive role at all three levels. Based on these findings, we suggest that support practices should be adapted to each organization's environment and experience and be modified as needed throughout the integration process. We also stress the importance of combining this approach with a strong coordination mechanism involving managers who have an in-depth understanding of nursing professional roles and scopes of practice. Making primary healthcare nurse practitioner integration frameworks more flexible and clarifying and strengthening the role of senior nursing managers could be the key to successful integration.
2012-01-01
Background: While progress has been made to develop automatic segmentation techniques for mitochondria, there remains a need for more accurate and robust techniques to delineate mitochondria in serial block-face scanning electron microscopic data. Previously developed texture-based methods are limited for solving this problem because texture alone is often not sufficient to identify mitochondria. This paper presents a new three-step method, the Cytoseg process, for automated segmentation of mitochondria contained in 3D electron microscopic volumes generated through serial block-face scanning electron microscopic imaging. The method consists of three steps. The first is a random forest patch classification step operating directly on 2D image patches. The second step consists of contour-pair classification. At the final step, we introduce a method to automatically seed a level set operation with output from previous steps. Results: We report accuracy of the Cytoseg process on three types of tissue and compare it to a previous method based on Radon-Like Features. At step 1, we show that the patch classifier identifies mitochondria texture but creates many false positive pixels. At step 2, our contour processing step produces contours and then filters them with a second classification step, helping to improve overall accuracy. We show that our final level set operation, which is automatically seeded with output from previous steps, helps to smooth the results. Overall, our results show that use of contour-pair classification and level set operations improves segmentation accuracy beyond patch classification alone. We show that the Cytoseg process performs well compared to another modern technique based on Radon-Like Features. Conclusions: We demonstrated that texture-based methods for mitochondria segmentation can be enhanced with multiple steps that form an image processing pipeline. While we used a random-forest-based patch classifier to recognize texture, it would be possible to replace this with other texture identifiers, and we plan to explore this in future work. PMID:22321695
ERIC Educational Resources Information Center
King, Marianne; Walsh, Joan
"Healthy Choices for Kids" is a nutrition education program based on the 1990 U.S. Dietary Guidelines. This kit, the first of a series, provides elementary school teachers with tools to teach students about good nutrition. This set has five levels (Grades 1-5), bound separately. Each level has its own unit complete with teacher…
Couto, Thomaz Bittencourt; Kerrey, Benjamin T; Taylor, Regina G; FitzGerald, Michael; Geis, Gary L
2015-04-01
Pediatric emergencies require effective teamwork. These skills are developed and demonstrated in actual emergencies and in simulated environments, including simulation centers (in center) and the real care environment (in situ). Our aims were to compare teamwork performance across these settings and to identify perceived educational strengths and weaknesses between simulated settings. We hypothesized that teamwork performance in actual emergencies and in situ simulations would be higher than for in-center simulations. A retrospective, video-based assessment of teamwork was performed in an academic, pediatric level 1 trauma center, using the Team Emergency Assessment Measure (TEAM) tool (range, 0-44) among emergency department providers (physicians, nurses, respiratory therapists, paramedics, patient care assistants, and pharmacists). A survey-based, cross-sectional assessment was conducted to determine provider perceptions regarding simulation training. One hundred thirty-two videos, 44 from each setting, were reviewed. Mean total TEAM scores were similar and high in all settings (31.2 actual, 31.1 in situ, and 32.3 in-center, P = 0.39). Of 236 providers, 154 (65%) responded to the survey. For teamwork training, in situ simulation was considered more realistic (59% vs. 10%) and more effective (45% vs. 15%) than in-center simulation. In a video-based study in an academic pediatric institution, ratings of teamwork were relatively high among actual resuscitations and 2 simulation settings, substantiating the influence of simulation-based training on instilling a culture of communication and teamwork. On the basis of survey results, providers favored the in situ setting for teamwork training and suggested an expansion of our existing in situ program.
Evaluating healthcare priority setting at the meso level: A thematic review of empirical literature
Waithaka, Dennis; Tsofa, Benjamin; Barasa, Edwine
2018-01-01
Background: Decentralization of health systems has made sub-national/regional healthcare systems the backbone of healthcare delivery. These regions are tasked with the difficult responsibility of determining healthcare priorities and resource allocation amidst scarce resources. We aimed to review empirical literature that evaluated priority setting practice at the meso (sub-national) level of health systems. Methods: We systematically searched PubMed, ScienceDirect and Google Scholar databases and supplemented these with manual searching for relevant studies, based on the reference lists of selected papers. We only included empirical studies that described and evaluated, or that only evaluated, priority setting practice at the meso level. A total of 16 papers were identified from low- and middle-income countries (LMICs) and high-income countries (HICs). We analyzed data from the selected papers by thematic review. Results: Few studies used systematic priority setting processes, and all but one were from HICs. Both formal and informal criteria are used in priority-setting; however, informal criteria appear to be more pervasive in LMICs compared to HICs. The priority setting process at the meso level is a top-down approach with minimal involvement of the community. Accountability for reasonableness was the most common evaluative framework, as it was used in 12 of the 16 studies. Efficiency, reallocation of resources and options for service delivery redesign were the most common outcome measures used to evaluate priority setting. Limitations: Our study was limited by the fact that there are very few empirical studies that have evaluated priority setting at the meso level, and there is a likelihood that we did not capture all the studies. Conclusions: Improving priority setting practices at the meso level is crucial to strengthening health systems. This can be achieved through incorporating and adapting systematic priority setting processes and frameworks to the context where used, and making considerations of both process and outcome measures during priority setting and resource allocation. PMID:29511741
Experiences from occupational exposure limits set on aerosols containing allergenic proteins.
Nielsen, Gunnar D; Larsen, Søren T; Hansen, Jitka S; Poulsen, Lars K
2012-10-01
Occupational exposure limits (OELs), together with measured airborne exposures, are used in risk-assessment-based management of occupational exposures to prevent occupational diseases. In most countries, OELs have been set for only a few protein-containing aerosols causing IgE-mediated allergies. They comprise aerosols of flour dust, grain dust, wood dust, natural rubber latex, and the subtilisins, which are proteolytic enzymes. These aerosols show dose-dependent effects, and levels have been established where nearly all workers may be exposed without adverse health effects, which is a prerequisite for setting OELs. Our aim is to analyse prerequisites for setting OELs for the allergenic protein-containing aerosols. Unlike OELs based on a single critical toxicological effect, two thresholds are used in setting these OELs: one for the sensitization phase and one for elicitation of IgE-mediated symptoms in sensitized individuals. For example, this was the case for flour dust, where OELs were based on dust levels due to the linearity between flour dust and its allergen levels. The critical effects for the flour and grain dust OELs were different, which indicates that conclusion by analogy (read-across) must be scientifically well founded. Except for the subtilisins, no OELs have been set for other industrial enzymes, many of which are high-volume chemicals. For several of these, OELs have been proposed in the scientific literature during the last two decades. It is apparent that the scientific methodology is available for setting OELs for proteins and protein-containing aerosols where the critical effect is IgE sensitization and IgE-mediated airway diseases.
Application of short-data methods on extreme surge levels
NASA Astrophysics Data System (ADS)
Feng, X.
2014-12-01
Tropical cyclone-induced storm surges are among the most destructive natural hazards that impact the United States. Unfortunately for academic research, the available time series for extreme surge analysis are very short. The limited data introduce uncertainty and affect the accuracy of statistical analyses of extreme surge levels. This study deals with techniques applicable to data sets shorter than 20 years, including simulation modelling and methods based on the parameters of the parent distribution. The verified water levels from water gauges spread along the Southwest and Southeast Florida Coast, as well as the Florida Keys, are used in this study. Methods to calculate extreme storm surges are described and reviewed, including 'classical' methods based on the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD), and approaches designed specifically to deal with short data sets. Incorporating the influence of global warming, the statistical analysis reveals enhanced extreme surge magnitudes and frequencies during warm years, while reduced levels of extreme surge activity are observed in the same study domain during cold years. Furthermore, a non-stationary GEV distribution is applied to predict extreme surge levels under warming sea surface temperatures. The non-stationary GEV distribution indicates that with 1°C of sea surface temperature warming relative to the baseline climate, the 100-year return surge level in Southwest and Southeast Florida will increase by up to 40 centimeters. The considered statistical approaches for extreme surge estimation based on short data sets will be valuable to coastal stakeholders, including urban planners, emergency managers, and the hurricane and storm surge forecasting and warning community.
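The following is a minimal sketch of a non-stationary GEV fit of the kind described, assuming the location parameter varies linearly with a sea surface temperature covariate; the study's actual covariate structure and estimation details may differ.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

def fit_nonstationary_gev(annual_maxima, sst_anomaly):
    """Sketch: non-stationary GEV with location mu(t) = mu0 + mu1 * SST(t).

    annual_maxima : surge annual maxima (same length as sst_anomaly).
    The covariate choice and parameterization are illustrative.
    """
    def nll(theta):
        mu0, mu1, log_scale, shape = theta
        mu = mu0 + mu1 * sst_anomaly
        scale = np.exp(log_scale)  # log-parameterized to keep scale positive
        # scipy's genextreme uses c = -xi relative to the usual GEV shape xi
        return -np.sum(genextreme.logpdf(annual_maxima, c=-shape,
                                         loc=mu, scale=scale))

    x0 = np.array([np.mean(annual_maxima), 0.0,
                   np.log(np.std(annual_maxima)), 0.1])
    return minimize(nll, x0, method="Nelder-Mead").x

def return_level(theta, sst, T=100.0):
    """T-year return level under the fitted model at a given SST anomaly."""
    mu0, mu1, log_scale, shape = theta
    return genextreme.ppf(1.0 - 1.0 / T, c=-shape,
                          loc=mu0 + mu1 * sst, scale=np.exp(log_scale))
```

Sweeping the `sst` argument of `return_level` then traces how the 100-year level shifts as the covariate warms, which is the kind of statement the abstract reports.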
Teaching Affective Qualities in Physical Education
ERIC Educational Resources Information Center
Heidorn, Brent; Welch, Mindy M.
2010-01-01
Physical educators at all levels have observed learners in a school-based physical education setting as well as physical activity or sport settings outside of organized school curricula demonstrating behaviors deemed inappropriate or inconsistent with professional standards. Because sport is such a public, social, and international phenomenon,…
Hirsch-Moverman, Yael; Burkot, Camilla; Saito, Suzue; Frederix, Koen; Pitt, Blanche; Melaku, Zenebe; Gadisa, Tsigereda; Howard, Andrea A
2017-01-01
Accurate measurement of adherence is necessary to ensure that therapeutic outcomes can be attributed to the recommended treatment. Phone-based unannounced pill counts were shown to be feasible and reliable measures of adherence in developed settings, and have been further used as part of medication adherence interventions. However, it is not clear whether this method can be implemented successfully in resource-limited settings, where cellular network and mobile phone coverage may be low. Our objective is to describe operational issues surrounding the use of phone-based unannounced pill counts in Lesotho and Ethiopia. Phone-based monthly unannounced pill counts, using an adaptation of a standardized protocol from previous US-based studies, were utilized to measure anti-TB and antiretroviral medication adherence in two implementation science studies in resource-limited settings, START (Lesotho) and ENRICH (Ethiopia). In START, 19.6% of calls were completed, with 71.9% of participants reached at least once; the majority of failed call attempts were due to phones not being available (54.8%) or because participants were away from the pills (32.7%). In ENRICH, 33.5% of calls were completed, with 86.7% of participants reached at least once; the main reasons for failed call attempts were phones being switched off (31.5%), participants not answering (27.3%), participants' discomfort speaking on the phone (15.4%), and network problems (13.2%). Structural, facility-level, participant-level, and data collection challenges were encountered in these settings. Phone-based unannounced pill counts were found to be challenging, and response rates suboptimal. While some of these challenges were specific to local contexts, most of them are generalizable to resource-limited settings. In a research study context, a possible solution to ease operational challenges may be to focus phone-based unannounced pill count efforts on a randomly selected sample of participants who are provided with study phones, and to rigorously ensure that call attempts are made for these participants.
Level-Set Simulation of Viscous Free Surface Flow Around a Commercial Hull Form
2005-04-15
The viscous free-surface flow around a 3600 TEU KRISO Container Ship is computed using the finite-volume-based multi-block RANS code WAVIS, developed at KRISO. The free surface is captured with the level-set method, and the realizable k-ε model is employed for turbulence closure. The computations are done for a 3600 TEU container ship of the Korea Research Institute of Ships & Ocean Engineering, KORDI (hereafter, KRISO), selected as…
The Development of the Speaker Independent ARM Continuous Speech Recognition System
1992-01-01
…spoken airborne reconnaissance reports using a speech recognition system based on phoneme-level hidden Markov models (HMMs). Previous versions of the ARM… will involve automatic selection from multiple model sets, corresponding to different speaker types, and that the most rudimentary partition of a… The vocabulary size for the ARM task is 497 words. These words are related to the phoneme-level symbols corresponding to the models in the model set.
Latha, Manohar; Kavitha, Ganesan
2018-02-03
Schizophrenia (SZ) is a psychiatric disorder that especially affects individuals during their adolescence. There is a need to study the subanatomical regions of the SZ brain on magnetic resonance images (MRI) based on morphometry. In this work, an attempt was made to analyze alterations in structure and texture patterns in images of the SZ brain using the level-set method and Laws texture features. T1-weighted MRI of the brain from the Center of Biomedical Research Excellence (COBRE) database were considered for analysis. Segmentation was carried out using the level-set method. Geometrical and Laws texture features were extracted from the segmented brain stem, corpus callosum, cerebellum, and ventricle regions to analyze pattern changes in SZ. The level-set method segmented multiple brain regions, with higher similarity and correlation values compared with an optimized method. The geometric features obtained from regions of the corpus callosum and ventricle showed significant variation (p < 0.00001) between normal and SZ brains. Laws texture features identified a heterogeneous appearance in the brain stem, corpus callosum, and ventricular regions, and features from the brain stem were correlated with the Positive and Negative Syndrome Scale (PANSS) score (p < 0.005). A framework of geometric and Laws texture features obtained from brain subregions can be used as a supplement for the diagnosis of psychiatric disorders.
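For readers unfamiliar with Laws features, a minimal sketch of the texture-energy computation is given below, assuming the classic 5-element Laws vectors; the paper's exact masks, window size, and feature statistics may differ.

```python
import numpy as np
from scipy.ndimage import convolve, uniform_filter

def laws_texture_features(region, window=15):
    """Sketch of Laws texture energy features for a segmented brain region.

    region : 2D float array (e.g., a masked MRI slice);
    window : size of the local energy-averaging neighbourhood.
    """
    v = {
        "L5": np.array([1., 4., 6., 4., 1.]),    # level
        "E5": np.array([-1., -2., 0., 2., 1.]),  # edge
        "S5": np.array([-1., 0., 2., 0., -1.]),  # spot
        "R5": np.array([1., -4., 6., -4., 1.]),  # ripple
    }
    # subtract the local mean so features respond to texture, not illumination
    region = region - uniform_filter(region, size=window)

    feats = {}
    for a, va in v.items():
        for b, vb in v.items():
            kernel = np.outer(va, vb)            # 2D Laws mask, e.g. E5L5
            filtered = convolve(region, kernel, mode="reflect")
            # texture "energy": local average of the absolute filter response
            feats[a + b] = uniform_filter(np.abs(filtered), size=window).mean()
    return feats
```

Each of the 16 mask combinations responds to a different micro-pattern (edges, spots, ripples), which is how such features can flag the heterogeneous appearance the abstract reports.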
Glässel, Andrea; Rauch, Alexandra; Selb, Melissa; Emmenegger, Karl; Lückenkemper, Miriam; Escorpizo, Reuben
2012-01-01
Vocational rehabilitation (VR) plays a key role in bringing persons with acquired disabilities back to work, while encouraging employment participation. The purpose of this case study is to illustrate the systematic application of International Classification of Functioning, Disability, and Health (ICF)-based documentation tools by using ICF Core Sets in VR shown with a case example of a client with traumatic spinal cord injury (SCI). The client was a 26-year-old male with paraplegia (7th thoracic level), working in the past as a mover. This case study describes the integration of the ICF Core Sets for VR into an interdisciplinary rehabilitation program by using ICF-based documentation tools. Improvements in the client's impairments, activity limitations, and participation restrictions were observed following rehabilitation. Goals in different areas of functioning were achieved. The use of the ICF Core Sets in VR allows a comprehensive assessment of the client's level of functioning and intervention planning. Specifically, the Brief ICF Core Set in VR can provide domains for intervention relevant to each member of an interdisciplinary team and hence, can facilitate the VR management process in a SCI center in Switzerland.
Graph-state formalism for mutually unbiased bases
NASA Astrophysics Data System (ADS)
Spengler, Christoph; Kraus, Barbara
2013-11-01
A pair of orthonormal bases is called mutually unbiased if all mutual overlaps between any element of one basis and an arbitrary element of the other basis coincide. In case the dimension, d, of the considered Hilbert space is a power of a prime number, complete sets of d+1 mutually unbiased bases (MUBs) exist. Here we present a method based on the graph-state formalism to construct such sets of MUBs. We show that for n p-level systems, with p being prime, one particular graph suffices to easily construct a set of p^n+1 MUBs. In fact, we show that a single n-dimensional vector, which is associated with this graph, can be used to generate a complete set of MUBs and demonstrate that this vector can be easily determined. Finally, we discuss some advantages of our formalism regarding the analysis of entanglement structures in MUBs, as well as experimental realizations.
Knowledge-based low-level image analysis for computer vision systems
NASA Technical Reports Server (NTRS)
Dhawan, Atam P.; Baxi, Himanshu; Ranganath, M. V.
1988-01-01
Two algorithms for entry-level image analysis and preliminary segmentation are proposed which are flexible enough to incorporate local properties of the image. The first algorithm involves pyramid-based multiresolution processing and a strategy to define and use interlevel and intralevel link strengths. The second algorithm, which is designed for selected window processing, extracts regions adaptively using local histograms. The preliminary segmentation and a set of features are employed as the input to an efficient rule-based low-level analysis system, resulting in suboptimal meaningful segmentation.
Heo, Moonseong; Litwin, Alain H; Blackstock, Oni; Kim, Namhee; Arnsten, Julia H
2017-02-01
We derived sample size formulae for detecting main effects in group-based randomized clinical trials with different levels of data hierarchy between experimental and control arms. Such designs are necessary when experimental interventions need to be administered to groups of subjects whereas control conditions need to be administered to individual subjects. This type of trial, often referred to as a partially nested or partially clustered design, has been implemented for management of chronic diseases such as diabetes and is beginning to emerge more commonly in wider clinical settings. Depending on the research setting, the level of hierarchy of data structure for the experimental arm can be three or two, whereas that for the control arm is two or one. Such different levels of data hierarchy assume correlation structures of outcomes that are different between arms, regardless of whether research settings require two or three level data structure for the experimental arm. Therefore, the different correlations should be taken into account for statistical modeling and for sample size determinations. To this end, we considered mixed-effects linear models with different correlation structures between experimental and control arms to theoretically derive and empirically validate the sample size formulae with simulation studies.
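The paper's formulae derive from mixed-effects models with arm-specific correlation structures; as a rough orientation only, the sketch below applies the familiar design-effect inflation to the clustered arm, under a two-level experimental arm and fully independent controls. It approximates, rather than reproduces, the derivations described here.

```python
import math
from scipy.stats import norm

def n_per_arm_partially_nested(delta, sd, m, icc, alpha=0.05, power=0.8):
    """Rough sample-size sketch for a partially nested two-arm trial.

    delta : difference in means to detect
    sd    : outcome standard deviation
    m     : group size in the clustered (experimental) arm
    icc   : intraclass correlation among outcomes within a group
    Uses the two-sample normal approximation with a design effect applied
    to the clustered arm only; this is an approximation, not the paper's
    exact mixed-effects formula.
    """
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    deff_exp = 1 + (m - 1) * icc          # variance inflation, clustered arm
    deff_ctl = 1.0                        # control arm: independent subjects
    n = (z ** 2) * sd ** 2 * (deff_exp + deff_ctl) / delta ** 2
    return math.ceil(n)

# e.g. detect a 0.4-SD effect with groups of 8 and ICC 0.05
print(n_per_arm_partially_nested(delta=0.4, sd=1.0, m=8, icc=0.05))
```

The asymmetry between `deff_exp` and `deff_ctl` is the signature of a partially nested design: ignoring it (by using either design effect for both arms) would over- or under-power the trial.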
NASA Astrophysics Data System (ADS)
Ghafouri, H. R.; Mosharaf-Dehkordi, M.; Afzalan, B.
2017-07-01
A simulation-optimization model is proposed for identifying the characteristics of local immiscible NAPL contaminant sources inside aquifers. This model employs the UTCHEM 9.0 software as its simulator for solving the governing equations associated with multi-phase flow in porous media. As the optimization model, a novel two-level saturation-based Imperialist Competitive Algorithm (ICA) is proposed to estimate the parameters of contaminant sources. The first level consists of three parallel independent ICAs and acts as a pre-conditioner for the second level, which is a single modified ICA. The ICA in the second level is modified by dividing each country into a number of provinces (smaller parts). Similar to countries in the classical ICA, these provinces are optimized by the assimilation, competition, and revolution steps of the ICA. To increase the diversity of populations, a new approach named the knock-the-base method is proposed. The performance and accuracy of the simulation-optimization model are assessed by solving a set of two- and three-dimensional problems considering the effects of different parameters such as grid size, rock heterogeneity, and designated monitoring networks. The obtained numerical results indicate that this simulation-optimization model provides accurate results in fewer iterations than the model employing the classical one-level ICA. In summary: a model is proposed to identify the characteristics of immiscible NAPL contaminant sources; the contaminant is immiscible in water and multi-phase flow is simulated; the model is a multi-level saturation-based optimization algorithm built on the ICA; each answer string in the second level is divided into a set of provinces; and each ICA is modified by incorporating the new knock-the-base method.
ERIC Educational Resources Information Center
Erguvan, Deniz
2014-01-01
This study sets out to explore the faculty members' perceptions of a specific web-based instruction tool (Achieve3000) in a private higher education institute in Kuwait. The online tool provides highly differentiated instruction, which is initiated with a level set at the beginning of the term. The program is used in two consecutive courses as…
NASA Astrophysics Data System (ADS)
Muñoz-Esparza, Domingo; Kosović, Branko; Jiménez, Pedro A.; Coen, Janice L.
2018-04-01
The level-set method is typically used to track and propagate the fire perimeter in wildland fire models. Herein, a high-order level-set method using a fifth-order WENO scheme for the discretization of spatial derivatives and third-order explicit Runge-Kutta temporal integration is implemented within the Weather Research and Forecasting model wildland fire physics package, WRF-Fire. The algorithm includes solution of an additional partial differential equation for level-set reinitialization. The accuracy of the fire-front shape and rate of spread in uncoupled simulations is systematically analyzed. It is demonstrated that the common implementation used by level-set-based wildfire models yields rate-of-spread errors in the range 10-35% for typical grid sizes (Δ = 12.5-100 m) and considerably underestimates fire area. Moreover, the amplitude of fire-front gradients in the presence of explicitly resolved turbulence features is systematically underestimated. In contrast, the new WRF-Fire algorithm results in rate-of-spread errors that are lower than 1% and that become nearly grid independent. Also, the underestimation of fire area at the sharp transition between the fire front and the lateral flanks is found to be reduced by a factor of ≈7. A hybrid-order level-set method with locally reduced artificial viscosity is proposed, which substantially alleviates the computational cost associated with high-order discretizations while preserving accuracy. Simulations of the Last Chance wildfire demonstrate additional benefits of high-order accurate level-set algorithms when dealing with complex fuel heterogeneities, enabling propagation across narrow fuel gaps and more accurate fire backing over the lee side of no-fuel clusters.
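A one-dimensional sketch of the scheme class named here (fifth-order WENO in space, third-order SSP Runge-Kutta in time) is given below for the advection equation phi_t + a phi_x = 0 with a > 0 and periodic boundaries; the WRF-Fire implementation is multidimensional and couples to fire spread rates, which this sketch omits.

```python
import numpy as np

def weno5_upwind(phi, h):
    """Fifth-order WENO approximation of d(phi)/dx for positive advection
    speed (left-biased stencil), periodic boundaries; the standard
    Osher-Fedkiw construction."""
    d = (phi - np.roll(phi, 1)) / h            # one-sided differences D^-
    a = np.roll(d, 2); b = np.roll(d, 1); c = d
    e = np.roll(d, -1); f = np.roll(d, -2)
    # candidate third-order approximations on the three substencils
    p1 = a / 3 - 7 * b / 6 + 11 * c / 6
    p2 = -b / 6 + 5 * c / 6 + e / 3
    p3 = c / 3 + 5 * e / 6 - f / 6
    # smoothness indicators down-weight stencils crossing sharp gradients
    s1 = 13 / 12 * (a - 2 * b + c) ** 2 + 0.25 * (a - 4 * b + 3 * c) ** 2
    s2 = 13 / 12 * (b - 2 * c + e) ** 2 + 0.25 * (b - e) ** 2
    s3 = 13 / 12 * (c - 2 * e + f) ** 2 + 0.25 * (3 * c - 4 * e + f) ** 2
    eps = 1e-6
    w1 = 0.1 / (s1 + eps) ** 2
    w2 = 0.6 / (s2 + eps) ** 2
    w3 = 0.3 / (s3 + eps) ** 2
    return (w1 * p1 + w2 * p2 + w3 * p3) / (w1 + w2 + w3)

def advance_level_set(phi, speed, h, dt):
    """One SSP-RK3 step of phi_t + speed * phi_x = 0 (speed > 0 assumed)."""
    L = lambda q: -speed * weno5_upwind(q, h)
    q1 = phi + dt * L(phi)
    q2 = 0.75 * phi + 0.25 * (q1 + dt * L(q1))
    return phi / 3 + 2 / 3 * (q2 + dt * L(q2))
```

The nonlinear weights are what let the scheme stay fifth-order in smooth regions while avoiding oscillations at the sharp fire-front gradients the abstract discusses.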
NASA Astrophysics Data System (ADS)
Cai, Wenli; Yoshida, Hiroyuki; Harris, Gordon J.
2007-03-01
Measurement of the volume of focal liver tumors, called liver tumor volumetry, is indispensable for assessing the growth of tumors and for monitoring the response of tumors to oncology treatments. Traditional edge models, such as the maximum-gradient and zero-crossing methods, often fail to detect the accurate boundary of a fuzzy object such as a liver tumor. As a result, computerized volumetry based on these edge models tends to differ from manual segmentation results performed by physicians. In this study, we developed a novel computerized volumetry method for fuzzy objects, called the dynamic-thresholding level set (DT level set). An optimal threshold value computed from a histogram tends to shift, relative to the theoretical threshold value obtained from a normal distribution model, toward a smaller region in the histogram. We thus designed a mobile shell structure, called a propagating shell, which is a thick region encompassing the level set front. The optimal threshold calculated from the histogram of the shell drives the level set front toward the boundary of a liver tumor. When the volume ratio between the object and the background in the shell approaches one, the optimal threshold value best fits the theoretical threshold value and the shell stops propagating. Application of the DT level set to 26 hepatic CT cases with 63 biopsy-confirmed hepatocellular carcinomas (HCCs) and metastases showed that the computer-measured volumes were highly correlated with those of tumors measured manually by physicians. Our preliminary results showed that the DT level set was effective and accurate in estimating the volumes of liver tumors detected in hepatic CT images.
NASA Technical Reports Server (NTRS)
Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Acker, James G. (Editor); Campbell, Janet W.; Blaisdell, John M.; Darzi, Michael
1995-01-01
The level-3 data products from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) are statistical data sets derived from level-2 data. Each data set will be based on a fixed global grid of equal-area bins that are approximately 9 km × 9 km. Statistics available for each bin include the sum and sum of squares of the natural logarithm of derived level-2 geophysical variables, where the sums are accumulated over a binning period. Operationally, products with binning periods of 1 day, 8 days, 1 month, and 1 year will be produced and archived. From these accumulated values and for each bin, estimates of the mean, standard deviation, median, and mode may be derived for each geophysical variable. This report contains two major parts: the first (Section 2) is intended as a users' guide for level-3 SeaWiFS data products. It contains an overview of level-0 to level-3 data processing, a discussion of important statistical considerations when using level-3 data, and details of how to use the level-3 data. The second part (Section 3) presents a comparative statistical study of several binning algorithms based on CZCS and moored fluorometer data. The operational binning algorithms were selected based on the results of this study.
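The accumulation scheme is simple to reproduce; the sketch below, with illustrative function names, accumulates the per-bin sums of ln(x) and ln²(x) and then derives geometric means and standard deviations, glossing over the operational grid layout and quality flags.

```python
import numpy as np

def accumulate_bins(bin_ids, values, n_bins):
    """Accumulate per-bin sums of ln(x) and ln(x)^2, in the spirit of the
    level-3 products (sketch; operational details simplified).

    bin_ids : integer bin index per level-2 sample
    values  : positive geophysical values per sample
    """
    logs = np.log(values)
    count = np.bincount(bin_ids, minlength=n_bins)
    s1 = np.bincount(bin_ids, weights=logs, minlength=n_bins)
    s2 = np.bincount(bin_ids, weights=logs ** 2, minlength=n_bins)
    return count, s1, s2

def bin_statistics(count, s1, s2):
    """Derive per-bin estimates from the accumulated sums: the mean and
    standard deviation of ln(x), reported back in linear units."""
    with np.errstate(divide="ignore", invalid="ignore"):
        mean_log = s1 / count
        var_log = s2 / count - mean_log ** 2
        geo_mean = np.exp(mean_log)                       # geometric mean
        geo_sd = np.exp(np.sqrt(np.maximum(var_log, 0.0)))
    return geo_mean, geo_sd
```

Storing only counts and the two log sums is what makes the binned products compact while still supporting the mean and standard deviation estimates the report describes.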
The quality and availability of hardwood logging residue based on developed quality levels
Floyd G. Timson
1980-01-01
Hardwood logging residue was examined for salvageable quality material. Four quality levels (QL 1 to QL 4), based on four sets of specifications, were developed. The specifications used surface indicators, sweep, center decay, and piece size to determine quality. Twenty-six percent of the total logging residue (residue ≥ 4 inches in diameter outside bark at...
The Effects of Multimedia Task-Based Language Teaching on EFL Learners' Oral L2 Production
ERIC Educational Resources Information Center
BavaHarji, Madhubala; Gheitanchian, Mehrnaz; Letchumanan, Krishnaveni
2014-01-01
This study examined the effects of tasks, with varying levels of complexity, i.e. simple, + complex and ++ complex tasks on EFL learners' oral production in a multimedia task-based language teaching environment. 57 EFL adult learners carried out a total of 12 tasks, in sets of four tasks within three different themes and different levels of…
An Analysis of Turkey's PISA 2015 Results Using Two-Level Hierarchical Linear Modelling
ERIC Educational Resources Information Center
Atas, Dogu; Karadag, Özge
2017-01-01
In the field of education, most of the data collected are multi-level structured. Cities, city-based schools, school-based classes, and finally students in the classrooms constitute a hierarchical structure. Hierarchical linear models give more accurate results compared to standard models when the data set has a structure going as far as individuals,…
ERIC Educational Resources Information Center
Hoff, Paula
2014-01-01
Students are entering a suburban middle school with significant achievement gaps. The skill deficits that students bring to the school setting must be addressed based on data that reflect their greatest area of need. At the middle school level, it is critical to address the learning gaps and prepare students for success at the high school level.…
Activity-based prospective memory in schizophrenia.
Kumar, Devvarta; Nizamie, S Haque; Jahan, Masroor
2008-05-01
The study reports activity-based prospective memory as well as its clinical and neuropsychological correlates in schizophrenia. A total of 42 persons diagnosed with schizophrenia and 42 healthy controls were administered prospective memory, set-shifting, and verbal working memory tasks. The schizophrenia group was additionally administered various psychopathology rating scales. Group differences, with poorer performances of the schizophrenia group, were observed on the measures of prospective memory, working memory, and set shifting. The performance on prospective memory tasks correlated with the performance levels on verbal working memory and set-shifting tasks but not with the clinical measures. This study demonstrated impaired activity-based prospective memory in schizophrenia. The impairment can be due to deficits in various neuropsychological domains.
Process Evaluation Results from an Environmentally Focused Worksite Weight Management Study
ERIC Educational Resources Information Center
DeJoy, David M.; Wilson, Mark G.; Padilla, Heather M.; Goetzel, Ron Z.; Parker, Kristin B.; Della, Lindsay J.; Roemer, Enid C.
2012-01-01
There is currently much interest in exploring environmental approaches to combat weight gain and obesity. This study presents process evaluation results from a workplace-based study that tested two levels of environmentally focused weight management interventions in a manufacturing setting. The moderate treatment featured a set of relatively…
Designing a Standard Model for Development and Execution of an Analysis Project Plan
2012-06-01
…mitigations set forth are agreeable to all parties involved. [Outline fragments: 1.1 Gather Information; 1.2 Develop…; 1.3 Document Risks, Issues, and Constraints] …parent requirement into lower-level, objective, performance-based sibling actions. Collective accomplishment of the set of derived "sibling" actions…
Fleig, Timo; Knecht, Stefan; Hättig, Christof
2007-06-28
We study the ground-state structures and singlet- and triplet-excited states of the nucleic acid bases by applying the coupled cluster model CC2 in combination with a resolution-of-the-identity approximation for electron interaction integrals. Both basis set effects and the influence of dynamic electron correlation on the molecular structures are elucidated; the latter by comparing CC2 with Hartree-Fock and Møller-Plesset perturbation theory to second order. Furthermore, we investigate basis set and electron correlation effects on the vertical excitation energies and compare our highest-level results with experiment and other theoretical approaches. It is shown that small basis sets are insufficient for obtaining accurate results for excited states of these molecules and that the CC2 approach to dynamic electron correlation is a reliable and efficient tool for electronic structure calculations on medium-sized molecules.
Setting Healthcare Priorities at the Macro and Meso Levels: A Framework for Evaluation
Barasa, Edwine W.; Molyneux, Sassy; English, Mike; Cleary, Susan
2015-01-01
Background: Priority setting in healthcare is a key determinant of health system performance. However, there is no widely accepted priority setting evaluation framework. We reviewed literature with the aim of developing and proposing a framework for the evaluation of macro and meso level healthcare priority setting practices. Methods: We systematically searched Econlit, PubMed, CINAHL, and EBSCOhost databases and supplemented this with searches in Google Scholar, relevant websites and reference lists of relevant papers. A total of 31 papers on evaluation of priority setting were identified. These were supplemented by broader theoretical literature related to evaluation of priority setting. A conceptual review of selected papers was undertaken. Results: Based on a synthesis of the selected literature, we propose an evaluative framework that requires that priority setting practices at the macro and meso levels of the health system meet the following conditions: (1) Priority setting decisions should incorporate both efficiency and equity considerations as well as the following outcomes; (a) Stakeholder satisfaction, (b) Stakeholder understanding, (c) Shifted priorities (reallocation of resources), and (d) Implementation of decisions. (2) Priority setting processes should also meet the procedural conditions of (a) Stakeholder engagement, (b) Stakeholder empowerment, (c) Transparency, (d) Use of evidence, (e) Revisions, (f) Enforcement, and (g) Being grounded on community values. Conclusion: Available frameworks for the evaluation of priority setting are mostly grounded on procedural requirements, while few have included outcome requirements. There is, however, increasing recognition of the need to incorporate both consequential and procedural considerations in priority setting practices. In this review, we adapt an integrative approach to develop and propose a framework for the evaluation of priority setting practices at the macro and meso levels that draws from these complementary schools of thought. PMID:26673332
A level-set method for two-phase flows with moving contact line and insoluble surfactant
NASA Astrophysics Data System (ADS)
Xu, Jian-Jun; Ren, Weiqing
2014-04-01
A level-set method for two-phase flows with moving contact line and insoluble surfactant is presented. The mathematical model consists of the Navier-Stokes equation for the flow field, a convection-diffusion equation for the surfactant concentration, together with the Navier boundary condition and a condition for the dynamic contact angle derived by Ren et al. (2010) [37]. The numerical method is based on the level-set continuum surface force method for two-phase flows with surfactant developed by Xu et al. (2012) [54], with careful treatment of the boundary conditions. The numerical method consists of three components: a flow solver for the velocity field, a solver for the surfactant concentration, and a solver for the level-set function. In the flow solver, the surface force is dealt with using the continuum surface force model. The unbalanced Young stress at the moving contact line is incorporated into the Navier boundary condition. A convergence study of the numerical method and a parametric study are presented. The influence of surfactant on the dynamics of the moving contact line is illustrated using examples. The capability of the level-set method to handle complex geometries is demonstrated by simulating a pendant drop detaching from a wall under gravity.
Yao, Jincao; Yu, Huimin; Hu, Roland
2017-01-01
This paper introduces a new implicit-kernel-sparse-shape-representation-based object segmentation framework. Given an input object whose shape is similar to some of the elements in the training set, the proposed model can automatically find a cluster of implicit kernel sparse neighbors to approximately represent the input shape and guide the segmentation. A distance-constrained probabilistic definition together with a dualization energy term is developed to connect high-level shape representation and low-level image information. We theoretically prove that our model not only derives from two projected convex sets but is also equivalent to a sparse-reconstruction-error-based representation in the Hilbert space. Finally, a "wake-sleep"-based segmentation framework is applied to drive the evolutionary curve to recover the original shape of the object. We test our model on two public datasets. Numerical experiments on both synthetic images and real applications show the superior capabilities of the proposed framework.
Fast and efficient indexing approach for object recognition
NASA Astrophysics Data System (ADS)
Hefnawy, Alaa; Mashali, Samia A.; Rashwan, Mohsen; Fikri, Magdi
1999-08-01
This paper introduces a fast and efficient indexing approach for both 2D and 3D model-based object recognition in the presence of rotation, translation, and scale variations of objects. The indexing entries are computed after preprocessing the data by Haar wavelet decomposition. The scheme builds on a unified image feature detection approach using Zernike moments. A set of low-level features, e.g. high-precision edges and gray-level corners, is estimated by a set of orthogonal Zernike moments calculated locally around every image point. High-dimensional, highly descriptive indexing entries are then calculated based on the correlation of these local features and employed for fast access to the model database to generate hypotheses. A list of the most likely candidate models is then presented by evaluating the hypotheses. Experimental results are included to demonstrate the effectiveness of the proposed indexing approach.
Al-Shaikhli, Saif Dawood Salman; Yang, Michael Ying; Rosenhahn, Bodo
2016-12-01
This paper presents a novel method for Alzheimer's disease classification via automatic 3D caudate nucleus segmentation. The proposed method consists of segmentation and classification steps. In the segmentation step, we propose a novel level set cost function. The proposed cost function is constrained by a sparse representation of local image features using a dictionary learning method. We present coupled dictionaries: a feature dictionary of a grayscale brain image and a label dictionary of a caudate nucleus label image. Using online dictionary learning, the coupled dictionaries are learned from the training data. The learned coupled dictionaries are embedded into a level set function. In the classification step, a region-based feature dictionary is built. The region-based feature dictionary is learned from shape features of the caudate nucleus in the training data. The classification is based on the measure of similarity between the sparse representation of region-based shape features of the segmented caudate in the test image and the region-based feature dictionary. The experimental results demonstrate the superiority of our method over state-of-the-art methods by achieving high segmentation (91.5%) and classification (92.5%) accuracy. In this paper, we find that studying caudate nucleus atrophy offers an advantage over studying whole-brain atrophy for detecting Alzheimer's disease. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
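A fragment of the feature-dictionary half of this pipeline can be sketched with standard tools; the snippet below uses scikit-learn's online dictionary learning and sparse coding on synthetic stand-in features. The coupled label dictionary and the embedding into the level set function are omitted, and all data and parameter choices are illustrative.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning, SparseCoder

# Illustrative stand-ins for the paper's local image features:
# rows are training feature vectors extracted around the caudate nucleus.
rng = np.random.default_rng(0)
train_feats = rng.random((500, 64))

# learn a feature dictionary online (the paper couples this with a
# label dictionary; only the feature half is sketched here)
dico = MiniBatchDictionaryLearning(n_components=32, alpha=1.0,
                                   batch_size=20, random_state=0)
D = dico.fit(train_feats).components_

# sparse representation of a test feature vector over the dictionary
coder = SparseCoder(dictionary=D, transform_algorithm="lasso_lars",
                    transform_alpha=0.5)
x = rng.random((1, 64))
code = coder.transform(x)

# reconstruction error: the kind of similarity measure that can drive
# both a level-set constraint term and the final classification step
residual = np.linalg.norm(x - code @ D)
print(residual, np.count_nonzero(code))
```

A small residual means the test features are well explained by the learned dictionary, which is the signal the segmentation constraint and the classifier both exploit.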
Automated extraction of decision rules for leptin dynamics--a rough sets approach.
Brtka, Vladimir; Stokić, Edith; Srdić, Biljana
2008-08-01
A significant area in the field of medical informatics is concerned with the learning of medical models from low-level data. The goals of inducing models from data are twofold: analysis of the structure of the models so as to gain new insight into the unknown phenomena, and development of classifiers or outcome predictors for unseen cases. In this paper, we employ an approach based on the indiscernibility relation and rough sets theory to study certain questions concerning the design of a model based on if-then rules, from low-level data including 36 parameters, one of them leptin. To generate a model that is easy to read, interpret, and inspect, we have used the ROSETTA software system. The main goal of this work is to gain new insight into the behaviour of leptin levels as they interact with other risk factors in obesity.
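A toy sketch of the underlying rough-sets machinery follows: objects that are indiscernible on the chosen attributes are grouped, and each group yields an if-then rule (or falls into the boundary region when its members disagree on the decision). The data and attribute names are invented for illustration; ROSETTA's actual reduct computation and rule filtering are far richer.

```python
from collections import defaultdict

def indiscernibility_classes(objects, attributes):
    """Group objects that are indiscernible on the chosen attributes."""
    classes = defaultdict(list)
    for name, row in objects.items():
        key = tuple(row[a] for a in attributes)
        classes[key].append(name)
    return classes

def simple_rules(objects, attributes, decision):
    """Emit one if-then rule per indiscernibility class; classes whose
    members disagree on the decision form the boundary region."""
    for key, members in indiscernibility_classes(objects, attributes).items():
        outcomes = {objects[m][decision] for m in members}
        cond = " AND ".join(f"{a}={v}" for a, v in zip(attributes, key))
        verdict = outcomes.pop() if len(outcomes) == 1 else "uncertain (boundary)"
        print(f"IF {cond} THEN {decision}={verdict}")

# toy data: discretized leptin level plus one other risk factor
data = {
    "p1": {"leptin": "high", "bmi": "high", "risk": "yes"},
    "p2": {"leptin": "high", "bmi": "high", "risk": "yes"},
    "p3": {"leptin": "low",  "bmi": "high", "risk": "no"},
    "p4": {"leptin": "low",  "bmi": "low",  "risk": "no"},
}
simple_rules(data, ["leptin", "bmi"], "risk")
```

The appeal of the approach for clinical data is exactly what the abstract states: the resulting model is a readable list of conditions rather than an opaque classifier.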
An integrated set of UNIX based system tools at control room level
NASA Astrophysics Data System (ADS)
Potepan, F.; Scafuri, C.; Bortolotto, C.; Surace, G.
1994-12-01
The design effort of providing a simple point-and-click approach to equipment access has led to the definition and realization of a modular set of software tools to be used at the ELETTRA control room level. Point-to-point equipment access requires neither programming nor specific knowledge of the control system architecture. The development and integration of the communication, graphic, editing, and global database modules are described in depth, followed by a report of their use in the first commissioning period.
A new level set model for cell image segmentation
NASA Astrophysics Data System (ADS)
Ma, Jing-Feng; Hou, Kai; Bao, Shang-Lian; Chen, Chun
2011-02-01
In this paper we first identify three phases of cell images: background, cytoplasm and nucleolus, according to the general physical characteristics of cell images, and then develop a variational model, based on these characteristics, to segment the nucleolus and cytoplasm from their relatively complicated backgrounds. In the meantime, information obtained by preprocessing the cell images with the OTSU algorithm is used to initialize the level set function in the model, which speeds up the segmentation and yields satisfactory results in cell image processing.
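A minimal sketch of such an OTSU-based initialization, assuming scikit-image and SciPy; the signed-distance construction is one common choice and not necessarily the authors' exact scheme:

```python
# Hedged sketch: initialize a level set function from an Otsu threshold.
import numpy as np
from scipy.ndimage import distance_transform_edt
from skimage.filters import threshold_otsu

def init_level_set(image):
    mask = image > threshold_otsu(image)            # coarse foreground
    # Signed distance: positive inside the mask, negative outside.
    return distance_transform_edt(mask) - distance_transform_edt(~mask)
```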
Rudolf, Klaus-Dieter; Kus, Sandra; Chung, Kevin C; Johnston, Marie; LeBlanc, Monique; Cieza, Alarcos
2012-01-01
A formal decision-making and consensus process was applied to develop the first version of the International Classification of Functioning, Disability and Health (ICF) Core Sets for Hand Conditions. To convene an international panel to develop the ICF Core Sets for Hand Conditions (HC), preparatory studies were conducted, which included an expert survey, a systematic literature review, a qualitative study and an empirical data collection process involving persons with hand conditions. A consensus conference, attended by 23 healthcare professionals treating hand conditions and representing 22 countries, was convened in Switzerland in May 2009. The preparatory studies identified a set of 743 ICF categories at the second, third or fourth hierarchical level. Altogether, 117 chapter-, second-, or third-level categories were included in the comprehensive ICF Core Set for HC. The brief ICF Core Set for HC included a total of 23 chapter- and second-level categories. A formal consensus process integrating evidence and expert opinion based on the ICF led to the formal adoption of the ICF Core Sets for Hand Conditions. The next phase of this ICF project is to conduct a formal validation process to establish its applicability in clinical settings.
Tang, Jian; Jiang, Xiaoliang
2017-01-01
Image segmentation has always been a considerable challenge in image analysis and understanding due to intensity inhomogeneity, commonly known as bias field. In this paper, we present a novel region-based approach, built on local entropy, for segmenting images and estimating the bias field simultaneously. First, a local Gaussian distribution fitting (LGDF) energy function is defined as a weighted energy integral, where the weight is the local entropy derived from the grey-level distribution of the local image. The local means of this objective function carry a multiplicative factor that estimates the bias field in the transformed domain, and the bias field prior is fully exploited, so our model can estimate the bias field more accurately. Finally, by minimizing this energy function with a level set regularization term, image segmentation and bias field estimation are achieved jointly. Experiments on images of various modalities demonstrated the superior performance of the proposed method when compared with other state-of-the-art approaches.
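The local entropy weight can be illustrated directly: for each pixel, take the Shannon entropy of the grey-level histogram in a surrounding window. The (unoptimized) NumPy sketch below uses illustrative window and bin sizes:

```python
# Hedged sketch: local entropy of the grey-level distribution (slow version).
import numpy as np

def local_entropy(image, win=9, bins=32):
    pad = win // 2
    padded = np.pad(image, pad, mode='reflect')
    out = np.empty(image.shape, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            hist, _ = np.histogram(padded[i:i + win, j:j + win], bins=bins)
            p = hist[hist > 0] / hist.sum()         # window probabilities
            out[i, j] = -np.sum(p * np.log(p))      # Shannon entropy
    return out
```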
Aller, Marta-Beatriz; Vargas, Ingrid; Coderch, Jordi; Calero, Sebastià; Cots, Francesc; Abizanda, Mercè; Farré, Joan; Llopart, Josep Ramon; Colomés, Lluís; Vázquez, María Luisa
2015-08-13
Coordination across levels of care is becoming increasingly important due to rapid advances in technology, high specialisation and changes in the organization of healthcare services; to date, however, the development of indicators to evaluate coordination has been limited. The aim of this study is to develop and test a set of indicators to comprehensively evaluate clinical coordination across levels of care. A systematic review of the literature was conducted to identify indicators of clinical coordination across levels of care. These indicators were analysed to identify attributes of coordination and classified accordingly. They were then discussed within an expert team and adapted or newly developed, and their relevance, scientific soundness and feasibility were examined. The indicators were tested in three healthcare areas of the Catalan health system. Fifty-two indicators were identified addressing 11 attributes of clinical coordination across levels of care. The final set consisted of 21 output indicators. Clinical information transfer is evaluated based on information flow (4) and the adequacy of shared information (3). Clinical management coordination indicators evaluate care coherence through diagnostic testing (2) and medication (1), provision of care at the most appropriate level (2), completion of the diagnostic process (1), follow-up after hospital discharge (4) and accessibility across levels of care (4). The application of the indicators showed differences in the degree of clinical coordination depending on the attribute and area. A set of rigorous and scientifically sound measures of clinical coordination across levels of care was developed based on a literature review and discussion with experts. This set of indicators comprehensively addresses the different attributes of clinical coordination in the main transitions across levels of care. It could be employed to identify areas in which health services can be improved, as well as to measure the effect of efforts to improve clinical coordination in healthcare organizations.
A coastal hazards data base for the U.S. West Coast
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gornitz, V.M.; Beaty, T.W.; Daniels, R.C.
1997-12-01
This document describes the contents of a digital data base that may be used to identify coastlines along the US West Coast that are at risk to sea-level rise. This data base integrates point, line, and polygon data for the US West Coast into 0.25° latitude by 0.25° longitude grid cells and into 1:2,000,000 digitized line segments that can be used by raster or vector geographic information systems (GIS) as well as by non-GIS data bases. Each coastal grid cell and line segment contains data variables from the following seven data sets: elevation, geology, geomorphology, sea-level trends, shoreline displacement (erosion/accretion), tidal ranges, and wave heights. One variable from each data set was classified according to its susceptibility to sea-level rise and/or erosion to form 7 relative risk variables. These risk variables range in value from 1 to 5 and may be used to calculate a Coastal Vulnerability Index (CVI). Algorithms used to calculate several CVIs are listed within this text.
Optic disc segmentation for glaucoma screening system using fundus images.
Almazroa, Ahmed; Sun, Weiwei; Alodhayb, Sami; Raahemifar, Kaamran; Lakshminarayanan, Vasudevan
2017-01-01
Segmenting the optic disc (OD) is an important and essential step in creating a frame of reference for diagnosing optic nerve head pathologies such as glaucoma. Therefore, a reliable OD segmentation technique is necessary for automatic screening of optic nerve head abnormalities. The main contribution of this paper is a novel OD segmentation algorithm based on applying a level set method to a localized OD image. To prevent the blood vessels from interfering with the level set process, an inpainting technique was applied. A further important contribution is the incorporation of the variation in opinion among ophthalmologists in detecting the disc boundaries and diagnosing glaucoma; most previous studies were trained and tested based on only one opinion, which can be assumed to be biased toward that ophthalmologist. In addition, the accuracy was calculated based on the number of images that coincided with the ophthalmologists' agreed-upon images, and not only on the overlapping images as in previous studies. The ultimate goal of this project is to develop an automated image processing system for glaucoma screening. The disc algorithm is evaluated using a new retinal fundus image dataset called RIGA (retinal images for glaucoma analysis). In the case of low-quality images, a double level set was applied, in which the first level set was considered to be localization for the OD. Five hundred and fifty images were used to test the algorithm accuracy as well as the agreement among the manual markings of six ophthalmologists. The accuracy of the algorithm in marking the optic disc area and centroid was 83.9%, and the best agreement between the results of the algorithm and the manual markings was observed in 379 images.
Study of Burn Scar Extraction Automatically Based on Level Set Method using Remote Sensing Data
Liu, Yang; Dai, Qin; Liu, JianBo; Liu, ShiBin; Yang, Jin
2014-01-01
Burn scar extraction using remote sensing data is an efficient way to precisely evaluate burn area and measure vegetation recovery. Traditional burn scar extraction methodologies perform poorly on burn scar images with blurred and irregular edges. To address these issues, this paper proposes an automatic method to extract burn scars based on the Level Set Method (LSM). This method exploits the advantages of different features in remote sensing images and considers the practical need to extract burn scars rapidly and automatically. The approach integrates Change Vector Analysis (CVA), the Normalized Difference Vegetation Index (NDVI) and the Normalized Burn Ratio (NBR) to obtain a difference image, and modifies the conventional level set Chan-Vese (C-V) model with a new initial curve obtained from a binary image produced by applying the K-means method to the fitting errors of two near-infrared band images. Landsat 5 TM and Landsat 8 OLI data sets are used to validate the proposed method. Comparisons with the conventional C-V model, the OTSU algorithm and the Fuzzy C-means (FCM) algorithm show that the proposed approach can extract the outline curve of a fire burn scar effectively and exactly. The method has higher extraction accuracy and lower algorithmic complexity than the conventional C-V model. PMID:24503563
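For reference, the two spectral indices fused into the difference image are simple band ratios; a NumPy sketch, assuming co-registered reflectance bands (the paper's CVA component and K-means initialization are not reproduced):

```python
# Hedged sketch: NDVI/NBR drop between pre- and post-fire acquisitions.
import numpy as np

def ndvi(nir, red):
    return (nir - red) / (nir + red + 1e-9)

def nbr(nir, swir):
    return (nir - swir) / (nir + swir + 1e-9)

def index_difference(pre, post):
    """pre/post: dicts with 'nir', 'red', 'swir' arrays; large positive
    values flag candidate burn pixels for the level set stage."""
    d_ndvi = ndvi(pre['nir'], pre['red']) - ndvi(post['nir'], post['red'])
    d_nbr = nbr(pre['nir'], pre['swir']) - nbr(post['nir'], post['swir'])
    return d_ndvi + d_nbr
```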
An Active Contour Model Based on Adaptive Threshold for Extraction of Cerebral Vascular Structures.
Wang, Jiaxin; Zhao, Shifeng; Liu, Zifeng; Tian, Yun; Duan, Fuqing; Pan, Yutong
2016-01-01
Cerebral vessel segmentation is essential and helpful for clinical diagnosis and related research. However, automatic segmentation of brain vessels remains challenging because of the variable vessel shape and the high complexity of vessel geometry. This study proposes a new active contour model (ACM), implemented via the level-set method, for segmenting vessels from TOF-MRA data. The energy function of the new model, combining both region intensity and boundary information, is composed of two region terms, one boundary term and one penalty term. The global threshold representing the lower gray boundary of the target object by maximum intensity projection (MIP) is defined in the first region term, and it is used to guide the segmentation of the thick vessels. In the second term, a dynamic intensity threshold is employed to extract the tiny vessels. The boundary term is used to drive the contours to evolve towards boundaries with high gradients. The penalty term is used to avoid reinitialization of the level-set function. Experimental results on 10 clinical brain data sets demonstrate that our method not only achieves a better Dice Similarity Coefficient than the global-threshold-based method and the localized hybrid level-set method but is also able to extract whole cerebral vessel trees, including the thin vessels.
Ray Casting of Large Multi-Resolution Volume Datasets
NASA Astrophysics Data System (ADS)
Lux, C.; Fröhlich, B.
2009-04-01
High quality volume visualization through ray casting on graphics processing units (GPU) has become an important approach for many application domains. We present a GPU-based, multi-resolution ray casting technique for the interactive visualization of massive volume data sets commonly found in the oil and gas industry. Large volume data sets are represented as a multi-resolution hierarchy based on an octree data structure. The original volume data is decomposed into small bricks of a fixed size acting as the leaf nodes of the octree. These nodes are the highest resolution of the volume. Coarser resolutions are represented through inner nodes of the hierarchy, which are generated by downsampling eight neighboring nodes on a finer level. Due to the limited memory resources of current desktop workstations and graphics hardware, only a limited working set of bricks can be locally maintained for a frame to be displayed. This working set is chosen to represent the whole volume at different local resolution levels depending on the current viewer position, transfer function and distinct areas of interest. During runtime the working set of bricks is maintained in CPU and GPU memory and is adaptively updated by asynchronously fetching data from external sources like hard drives or a network. The CPU memory hereby acts as a secondary-level cache for these sources, from which the GPU representation is updated. Our volume ray casting algorithm is based on a 3D texture atlas in GPU memory. This texture atlas contains the complete working set of bricks of the current multi-resolution representation of the volume, which enables the ray casting algorithm to access the whole working set through a single 3D texture. For traversing rays through the volume, information about the locations and resolution levels of visited bricks is required for correct compositing computations. We encode this information into a small 3D index texture which represents the current octree subdivision on its finest level and spatially organizes the bricked data. This approach allows us to render a bricked multi-resolution volume data set in a single rendering pass with no loss of compositing precision. In contrast, most state-of-the-art volume rendering systems handle the bricked data as individual 3D textures, which are rendered one at a time while the results are composited into a lower precision frame buffer. Furthermore, our method enables us to integrate advanced volume rendering techniques like empty-space skipping, adaptive sampling and preintegrated transfer functions in a very straightforward manner with virtually no extra cost. Our interactive volume ray tracing implementation allows high quality visualizations of massive volume data sets of tens of gigabytes in size on standard desktop workstations.
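The view-dependent working-set selection can be sketched as a depth-first refinement over the octree: descend while a node's projected size exceeds an error threshold and the brick budget allows. Thresholds and names below are illustrative, not the authors' implementation:

```python
# Hedged sketch: choose a multi-resolution working set of octree bricks.
import math
from dataclasses import dataclass, field

@dataclass
class Node:
    center: tuple           # brick center in world space
    size: float             # brick edge length
    children: list = field(default_factory=list)

def select_working_set(node, viewer, budget, out):
    dist = max(math.dist(node.center, viewer), 1e-6)
    too_coarse = node.size / dist > 0.5     # crude screen-space error proxy
    if node.children and too_coarse and len(out) + 8 <= budget:
        for child in node.children:
            select_working_set(child, viewer, budget, out)
    else:
        out.append(node)                    # render this resolution level
```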
Kourdioukova, Elena V; Verstraete, Koenraad L; Valcke, Martin
2011-06-01
The aim of this research was to explore (1) clinical years students' perceptions of radiology case-based learning within a computer supported collaborative learning (CSCL) setting, (2) an analysis of the collaborative learning process, and (3) the learning impact of collaborative work on the radiology cases. The first part of this study focuses on a more detailed analysis of a survey study about CSCL-based case-based learning, set up in the context of a broader radiology curriculum innovation. The second part centers on a qualitative and quantitative analysis of 52 online collaborative learning discussions from 5th year and nearly graduating medical students. The collaborative work was based on 26 cases in musculoskeletal radiology. The analysis of perceptions about collaborative learning on radiology cases reflects a rather neutral attitude that does not differ significantly between students of different grade levels. Less advanced students are more positive about CSCL than final-year students. Outcome evaluation shows a significantly higher level of accuracy in identification of radiology key structures and in radiology diagnosis, as well as in linking the radiological signs with available clinical information, in nearly graduated students. No significant differences between grade levels were found in accuracy of using medical terminology. Students appreciate computer supported collaborative learning settings when tackling radiology case-based learning. Scripted computer supported collaborative learning groups proved to be useful for both 5th and 7th year students in developing components of their radiology diagnostic approaches.
Converging evidence for control of color-word Stroop interference at the item level.
Bugg, Julie M; Hutchison, Keith A
2013-04-01
Prior studies have shown that cognitive control is implemented at the list and context levels in the color-word Stroop task. At first blush, the finding that Stroop interference is reduced for mostly incongruent items as compared with mostly congruent items (i.e., the item-specific proportion congruence [ISPC] effect) appears to provide evidence for yet a third level of control, which modulates word reading at the item level. However, evidence to date favors the view that ISPC effects reflect the rapid prediction of high-contingency responses and not item-specific control. In Experiment 1, we first show that an ISPC effect is obtained when the relevant dimension (i.e., color) signals proportion congruency, a problematic pattern for theories based on differential response contingencies. In Experiment 2, we replicate and extend this pattern by showing that item-specific control settings transfer to new stimuli, ruling out alternative frequency-based accounts. In Experiment 3, we revert to the traditional design in which the irrelevant dimension (i.e., word) signals proportion congruency. Evidence for item-specific control, including transfer of the ISPC effect to new stimuli, is apparent when 4-item sets are employed but not when 2-item sets are employed. We attribute this pattern to the absence of high-contingency responses on incongruent trials in the 4-item set. These novel findings provide converging evidence for reactive control of color-word Stroop interference at the item level, reveal theoretically important factors that modulate reliance on item-specific control versus contingency learning, and suggest an update to the item-specific control account (Bugg, Jacoby, & Chanani, 2011).
A level set method for cupping artifact correction in cone-beam CT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Shipeng; Li, Haibo; Ge, Qi
2015-08-15
Purpose: To reduce cupping artifacts and improve the contrast-to-noise ratio in cone-beam computed tomography (CBCT). Methods: A level set method is proposed to reduce cupping artifacts in the reconstructed image of CBCT. The authors derive a local intensity clustering property of the CBCT image and define a local clustering criterion function of the image intensities in a neighborhood of each point. This criterion function defines an energy in terms of the level set functions, which represent a segmentation result and the cupping artifacts. The cupping artifacts are estimated as a result of minimizing this energy. Results: The cupping artifacts in CBCT are reduced by an average of 90%. The results indicate that the level set-based algorithm is practical and effective for reducing the cupping artifacts and preserving the quality of the reconstructed image. Conclusions: The proposed method focuses on the reconstructed image without requiring any additional physical equipment, is easily implemented, and provides cupping correction through a single-scan acquisition. The experimental results demonstrate that the proposed method successfully reduces the cupping artifacts.
Joint level-set and spatio-temporal motion detection for cell segmentation.
Boukari, Fatima; Makrogiannis, Sokratis
2016-08-10
Cell segmentation is a critical step for quantification and monitoring of cell cycle progression, cell migration, and growth control to investigate cellular immune response, embryonic development, tumorigenesis, and drug effects on live cells in time-lapse microscopy images. In this study, we propose a joint spatio-temporal diffusion and region-based level-set optimization approach for moving cell segmentation. Moving regions are initially detected in each set of three consecutive sequence images by numerically solving a system of coupled spatio-temporal partial differential equations. In order to standardize intensities of each frame, we apply a histogram transformation approach to match the pixel intensities of each processed frame with an intensity distribution model learned from all frames of the sequence during the training stage. After the spatio-temporal diffusion stage is completed, we compute the edge map by nonparametric density estimation using Parzen kernels. This process is followed by watershed-based segmentation and moving cell detection. We use this result as an initial level-set function to evolve the cell boundaries, refine the delineation, and optimize the final segmentation result. We applied this method to several datasets of fluorescence microscopy images with varying levels of difficulty with respect to cell density, resolution, contrast, and signal-to-noise ratio. We compared the results with those produced by Chan and Vese segmentation, a temporally linked level-set technique, and nonlinear diffusion-based segmentation. We validated all segmentation techniques against reference masks provided by the international Cell Tracking Challenge consortium. The proposed approach delineated cells with an average Dice similarity coefficient of 89% over a variety of simulated and real fluorescent image sequences. It yielded average improvements of 11% in segmentation accuracy compared to both strictly spatial and temporally linked Chan-Vese techniques, and 4% compared to the nonlinear spatio-temporal diffusion method. Despite the wide variation in cell shape, density, mitotic events, and image quality among the datasets, our proposed method produced promising segmentation results. These results indicate the efficiency and robustness of this method especially for mitotic events and low SNR imaging, enabling the application of subsequent quantification tasks.
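The validation metric is the Dice similarity coefficient, DSC = 2|A∩B|/(|A|+|B|) for binary masks A and B; a one-function NumPy sketch:

```python
# Hedged sketch: Dice similarity coefficient between two binary masks.
import numpy as np

def dice(mask_a, mask_b):
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0
```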
Madapana, Naveen; Gonzalez, Glebys; Rodgers, Richard; Zhang, Lingsong; Wachs, Juan P
2018-01-01
Gestural interfaces allow accessing and manipulating Electronic Medical Records (EMR) in hospitals while keeping a completely sterile environment. Particularly, in the Operating Room (OR), these interfaces enable surgeons to browse Picture Archiving and Communication Systems (PACS) without the need to delegate functions to the surgical staff. Existing gesture-based medical interfaces rely on a suboptimal and arbitrary small set of gestures that are mapped to a few commands available in PACS software. The objective of this work is to discuss a method to determine the most suitable set of gestures based on surgeons' acceptability. To achieve this goal, the paper introduces two key innovations: (a) a novel methodology to incorporate gestures' semantic properties into the agreement analysis, and (b) a new agreement metric to determine the most suitable gesture set for a PACS. Three neurosurgical diagnostic tasks were conducted by nine neurosurgeons. The set of commands and gesture lexicons were determined using a Wizard of Oz paradigm. The gestures were decomposed into a set of 55 semantic properties based on the motion trajectory, orientation and pose of the surgeons' hands, and their ground truth values were manually annotated. Finally, a new agreement metric was developed, using the known Jaccard similarity to measure consensus between users over a gesture set. A set of 34 PACS commands was found to be sufficient for PACS manipulation. In addition, a level of agreement of 0.29 was found among the surgeons over the gestures. Two statistical tests, a paired t-test and a Mann-Whitney-Wilcoxon test, were conducted between the proposed metric and the traditional agreement metric; the agreement values computed using the former metric were significantly higher (p < 0.001) for both tests. This study reveals that the level of agreement among surgeons over the best gestures for PACS operation is higher than the previously reported value (0.29 vs 0.13). This observation rests on the fact that the agreement focuses on the main features of the gestures rather than on the gestures themselves. The level of agreement is not very high, yet it indicates a majority preference and is better than using gestures based on authoritarian or arbitrary approaches. The methods described in this paper provide a guiding framework for the design of future gesture-based PACS systems for the OR.
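The agreement idea can be approximated as the mean pairwise Jaccard similarity of the semantic-property sets that different surgeons assign to the same command; the sketch below conveys that idea with illustrative names and is not the paper's exact metric:

```python
# Hedged sketch: Jaccard-based agreement over gesture property sets.
from itertools import combinations

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 1.0

def command_agreement(property_sets):
    """property_sets: one set of semantic properties per surgeon."""
    pairs = list(combinations(property_sets, 2))
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# Three surgeons proposing gestures for a hypothetical 'zoom in' command:
print(command_agreement([{'pinch', 'two_hands'},
                         {'pinch', 'one_hand'},
                         {'pinch', 'two_hands', 'palm_out'}]))
```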
NASA Astrophysics Data System (ADS)
Ji, Yuanbo; van der Geest, Rob J.; Nazarian, Saman; Lelieveldt, Boudewijn P. F.; Tao, Qian
2018-03-01
Anatomical objects in medical images very often have dual contours or surfaces that are highly correlated. Manually segmenting both of them by following local image details is tedious and subjective. In this study, we propose a two-layer region-based level set method with a soft distance constraint, which not only regularizes the level set evolution at two levels, but also imposes prior information on wall thickness in an effective manner. By updating the level set function and the distance constraint functions alternatingly, the method simultaneously optimizes both contours while regularizing their distance. The method was applied to segment the inner and outer wall of both the left atrium (LA) and the left ventricle (LV) from MR images, using a rough initialization from inside the blood pool. Compared to manual annotation from experienced observers, the proposed method achieved an average perpendicular distance (APD) of less than 1 mm for the LA segmentation, and less than 1.5 mm for the LV segmentation, at both inner and outer contours. The method can be used as a practical tool for fast and accurate dual wall annotation given proper initialization.
NASA Astrophysics Data System (ADS)
Seyedhosseini, Seyed Mohammad; Fahimi, Kaveh; Makui, Ahmad
2017-12-01
This paper presents the competitive supply chain network design problem in which n decentralized supply chains simultaneously enter a market with no existing rival chain, shape their networks, and set wholesale and retail prices in competitive mode. Customer demand is elastic and price dependent, the customer utility function is based on the Hotelling model, and the chains produce identical or highly substitutable products. We construct a solution algorithm based on bi-level programming and possibility theory. In the proposed bi-level model, the inner part sets the prices based on simultaneous extra-chain and Stackelberg intra-chain competition, and the outer part shapes the networks in cooperative competition. Finally, we use a real-world study to discuss the effect of the different structures of the competitors on the equilibrium solution. Moreover, sensitivity analyses are conducted and managerial insights are offered.
Wraparound: As a Tertiary Level Intervention for Students with Emotional/Behavioral Needs
ERIC Educational Resources Information Center
Eber, Lucille; Breen, Kimberli; Rose, Jennifer; Unizycki, Renee M.; London, Tasha H.
2008-01-01
If a student has multiple behavior problems that escalate over time and across different settings, school-based problem-solving teams can become quickly overwhelmed, especially when educators identify "setting events" for problem behaviors that have occurred outside of school and are beyond the control of school personnel. Instead of resorting to…
ERIC Educational Resources Information Center
Hansen, Mary A.; Lyon, Steven R.; Heh, Peter; Zigmond, Naomi
2013-01-01
Large-scale assessment programs, including alternate assessments based on alternate achievement standards (AA-AAS), must provide evidence of technical quality and validity. This study provides information about the technical quality of one AA-AAS by evaluating the standard setting for the science component. The assessment was designed to have…
Lim, Eunjung; Mbowe, Omar; Lee, Angela S. W.; Davis, James
2016-01-01
Background: Assessment of the health effects of low-level exposure to hydrogen sulfide (H2S) on humans through experiments, industrial, and community studies has shown inconsistent results. Objective: To critically appraise available studies investigating the effect of H2S on the central nervous system (CNS) and on respiratory function. Methods: A search was conducted in 16 databases for articles published between January 1980 and July 2014. Two researchers independently evaluated potentially relevant papers based on a set of inclusion/exclusion criteria. Results: Twenty-seven articles met the inclusion criteria: 6 experimental, 12 industry-based studies, and 10 community-based studies (one article included both experimental and industry-based studies). The results of the systematic review varied by study setting and quality. Several community-based studies reported associations between day-to-day variations in H2S levels and health outcomes among patients with chronic respiratory conditions. However, evidence from the largest and better-designed community-based studies did not support that chronic, ambient H2S exposure has health effects on the CNS or respiratory function. Results from industry-based studies varied, reflecting the diversity of settings and the broad range of H2S exposures. Most studies did not have individual measurements of H2S exposure. Discussion: The results across studies were inconsistent, justifying the need for further research. PMID:27128692
Gowin, Ewelina; Januszkiewicz-Lewandowska, Danuta; Słowiński, Roman; Błaszczyński, Jerzy; Michalak, Michał; Wysocki, Jacek
2017-01-01
Differential diagnosis of bacterial and viral meningitis remains an important clinical problem. A number of methods to assist in the diagnosis of meningitis have been developed, but none of them has been found to have high specificity with 100% sensitivity. We conducted a retrospective analysis of the medical records of 148 children hospitalized in St. Joseph Children's Hospital in Poznań. In this study, we applied for the first time the original methodology of the dominance-based rough set approach (DRSA) to diagnostic patterns of meningitis data and represented them by decision rules useful in discriminating between bacterial and viral meningitis. The induction algorithm is called VC-DomLEM; it has been implemented in the software package jMAF (http://www.cs.put.poznan.pl/jblaszczynski/Site/jRS.html), based on the java Rough Set (jRS) library. In the studied group, there were 148 patients (78 boys and 70 girls), and the mean age was 85 months. We analyzed 14 attributes, of which only 4 were used to generate the 6 rules, with C-reactive protein (CRP) being the most valuable. Factors associated with bacterial meningitis were: CRP level ≥86 mg/L, number of leukocytes in cerebrospinal fluid (CSF) ≥4481 μL⁻¹, symptom duration no longer than 2 days, or age less than 1 month. Factors associated with viral meningitis were a CRP level not higher than 19 mg/L, or a CRP level not higher than 84 mg/L in a patient older than 11 months with no more than 1100 μL⁻¹ leukocytes in CSF. We established the minimum set of attributes significant for the classification of patients with meningitis. This is a new set of rules which, although intuitively anticipated by some clinicians, had not been formally demonstrated until now. PMID:28796045
Energy-optimal path planning by stochastic dynamically orthogonal level-set optimization
NASA Astrophysics Data System (ADS)
Subramani, Deepak N.; Lermusiaux, Pierre F. J.
2016-04-01
A stochastic optimization methodology is formulated for computing energy-optimal paths from among time-optimal paths of autonomous vehicles navigating in a dynamic flow field. Based on partial differential equations, the methodology rigorously leverages the level-set equation that governs time-optimal reachability fronts for a given relative vehicle-speed function. To set up the energy optimization, the relative vehicle speed and headings are considered to be stochastic, and new stochastic Dynamically Orthogonal (DO) level-set equations are derived. Their solution provides the distribution of time-optimal reachability fronts and the corresponding distribution of time-optimal paths. An optimization is then performed on the vehicle's energy-time joint distribution to select the energy-optimal paths for each arrival time, among all stochastic time-optimal paths for that arrival time. Numerical schemes to solve the reduced stochastic DO level-set equations are obtained, and accuracy and efficiency considerations are discussed. These reduced equations are first shown to be efficient at solving the governing stochastic level sets, in part by comparisons with direct Monte Carlo simulations. To validate the methodology and illustrate its accuracy, comparisons with semi-analytical energy-optimal path solutions are then completed. In particular, we consider the energy-optimal crossing of a canonical steady front and set up its semi-analytical solution using an energy-time nested nonlinear double-optimization scheme. We then showcase the inner workings and nuances of the energy-optimal path planning, considering different mission scenarios. Finally, we study and discuss results of energy-optimal missions in a wind-driven barotropic quasi-geostrophic double-gyre ocean circulation.
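Stripped of the stochastic DO machinery, the deterministic backbone is the level-set Hamilton-Jacobi equation φt + F|∇φ| = 0 that propagates the time-optimal reachability front. A first-order Godunov upwind sketch for a non-negative speed F on a uniform grid (no flow-advection term, periodic boundaries for brevity, purely illustrative):

```python
# Hedged sketch: evolve a reachability front with phi_t + F|grad phi| = 0.
import numpy as np

def evolve_front(phi, speed, dx, dt, steps):
    for _ in range(steps):
        dxm = (phi - np.roll(phi, 1, axis=0)) / dx   # backward differences
        dxp = (np.roll(phi, -1, axis=0) - phi) / dx  # forward differences
        dym = (phi - np.roll(phi, 1, axis=1)) / dx
        dyp = (np.roll(phi, -1, axis=1) - phi) / dx
        grad = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                       np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
        phi = phi - dt * speed * grad                # Godunov upwind, F >= 0
    return phi
```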
Complicating Counterspaces: Intersectionality and the Michigan Womyn's Music Festival.
McConnell, Elizabeth A; Todd, Nathan R; Odahl-Ruan, Charlynn; Shattell, Mona
2016-06-01
The counterspaces framework articulated by Case and Hunter (2012) follows from community psychology's long-standing interest in the potential for settings to promote well-being and liberatory responses to oppression. This framework proposes that certain settings (i.e., "counterspaces") facilitate a specific set of processes that promote the well-being of marginalized groups. We argue that an intersectional analysis is crucial to understand whether and how counterspaces achieve these goals. We draw from literature on safe spaces and present a case study of the Michigan Womyn's Music Festival (Michfest) to illustrate the value of an intersectional analysis and explore how these processes operate. Based on 20 in-person interviews, 23 responses to an online survey, and ethnographic field notes, we show how Michfest was characterized by a particular intersection of identities at the setting level, and how intersectional diversity complicated experiences at the individual level. Moreover, intersectional identities provided opportunities for dialogue and change at the setting level, including the creation of counterspaces within counterspaces. Overall, we demonstrate the need to attend to intersectionality in counterspaces, and more broadly in how we conceptualize settings in community psychology.
Knowledge-Based Motion Control of an Intelligent Mobile Autonomous System
NASA Astrophysics Data System (ADS)
Isik, Can
An Intelligent Mobile Autonomous System (IMAS), which is equipped with vision and low level sensors to cope with unknown obstacles, is modeled as a hierarchy of path planning and motion control. This dissertation concentrates on the lower level of this hierarchy (Pilot) with a knowledge-based controller. The basis of a theory of knowledge-based controllers is established, using the example of the Pilot level motion control of IMAS. In this context, the knowledge-based controller with a linguistic world concept is shown to be adequate for the minimum time control of an autonomous mobile robot motion. The Pilot level motion control of IMAS is approached in the framework of production systems. The three major components of the knowledge-based control that are included here are the hierarchies of the database, the rule base and the rule evaluator. The database, which is the representation of the state of the world, is organized as a semantic network, using a concept of minimal admissible vocabulary. The hierarchy of rule base is derived from the analytical formulation of minimum-time control of IMAS motion. The procedure introduced for rule derivation, which is called analytical model verbalization, utilizes the concept of causalities to describe the system behavior. A realistic analytical system model is developed and the minimum-time motion control in an obstacle strewn environment is decomposed to a hierarchy of motion planning and control. The conditions for the validity of the hierarchical problem decomposition are established, and the consistency of operation is maintained by detecting the long term conflicting decisions of the levels of the hierarchy. The imprecision in the world description is modeled using the theory of fuzzy sets. The method developed for the choice of the rule that prescribes the minimum-time motion control among the redundant set of applicable rules is explained and the usage of fuzzy set operators is justified. Also included in the dissertation are the description of the computer simulation of Pilot within the hierarchy of IMAS control and the simulated experiments that demonstrate the theoretical work.
Phillips, Jane; Morgan, Susan; Cawthorne, Karen; Barnett, Bryanne
2008-08-01
Parent-child interaction therapy (PCIT) is a short-term, evidence-based parent training intervention used widely in the treatment of behaviourally disordered preschool-aged children. Outcome studies have shown PCIT to be associated with lasting improvements in child and sibling behaviours and in the interactional styles, stress levels, confidence, and psychological functioning of parents. To date, however, all outcome studies have been conducted in university research clinic settings, and therefore understanding about the effectiveness of PCIT applied in a real-world setting has been limited. The present study evaluated the effectiveness of PCIT delivered to families in an Australian community-based early childhood clinic. Participants included 43 families with children aged 19-52 months who were referred for treatment of disruptive child behaviours and who completed PCIT treatment at the Karitane Toddler Clinic, in Sydney, Australia. Parents provided pre- and post-treatment ratings of child behaviours, parental stress, parental psychopathology and parental attitudes to therapy. At the end of the programme, clinically and statistically significant improvements were seen in child behaviours and parental well-being, and parents reported high levels of satisfaction with treatment. Implications for the implementation of PCIT programmes in community-based settings are discussed and areas of further research are identified.
Closed-field capacitive liquid level sensor
Kronberg, James W.
1998-01-01
A liquid level sensor based on a closed field circuit comprises a ring oscillator using a symmetrical array of plate units that creates a displacement current. The displacement current varies as a function of the proximity of a liquid to the plate units. The ring oscillator circuit produces an output signal with a frequency inversely proportional to the presence of a liquid. A continuous liquid level sensing device and a two point sensing device are both proposed sensing arrangements. A second set of plates may be located inside of the probe housing relative to the sensing plate units. The second set of plates prevent any interference between the sensing plate units.
Image-guided regularization level set evolution for MR image segmentation and bias field correction.
Wang, Lingfeng; Pan, Chunhong
2014-01-01
Magnetic resonance (MR) image segmentation is a crucial step in surgical and treatment planning. In this paper, we propose a level-set-based segmentation method for MR images affected by intensity inhomogeneity. To tackle the initialization sensitivity problem, we propose a new image-guided regularization to restrict the level set function. Maximum a posteriori inference is adopted to unify segmentation and bias field correction within a single framework. Under this framework, both the contour prior and the bias field prior are fully used. As a result, the image intensity inhomogeneity can be well handled. Extensive experiments are provided to evaluate the proposed method, showing significant improvements in both segmentation and bias field correction accuracy as compared with other state-of-the-art approaches.
Atlas-based segmentation of 3D cerebral structures with competitive level sets and fuzzy control.
Ciofolo, Cybèle; Barillot, Christian
2009-06-01
We propose a novel approach for the simultaneous segmentation of multiple structures with competitive level sets driven by fuzzy control. To this end, several contours evolve simultaneously toward previously defined anatomical targets. A fuzzy decision system combines the a priori knowledge provided by an anatomical atlas with the intensity distribution of the image and the relative position of the contours. This combination automatically determines the directional term of the evolution equation of each level set. This leads to a local expansion or contraction of the contours, in order to match the boundaries of their respective targets. Two applications are presented: the segmentation of the brain hemispheres and the cerebellum, and the segmentation of deep internal structures. Experimental results on real magnetic resonance (MR) images are presented, quantitatively assessed and discussed.
A Cartesian Adaptive Level Set Method for Two-Phase Flows
NASA Technical Reports Server (NTRS)
Ham, F.; Young, Y.-N.
2003-01-01
In the present contribution we develop a level set method based on local anisotropic Cartesian adaptation as described in Ham et al. (2002). Such an approach should allow for the smallest possible Cartesian grid capable of resolving a given flow. The remainder of the paper is organized as follows. In section 2 the level set formulation for free surface calculations is presented and its strengths and weaknesses relative to other free surface methods are reviewed. In section 3 the collocated numerical method is described. In section 4 the method is validated by solving the 2D and 3D drop oscillation problem. In section 5 we present some results from more complex cases, including 3D drop breakup in an impulsively accelerated free stream and the 3D immiscible Rayleigh-Taylor instability. Conclusions are given in section 6.
Periodical capacity setting methods for make-to-order multi-machine production systems
Altendorfer, Klaus; Hübl, Alexander; Jodlbauer, Herbert
2014-01-01
The paper presents different periodical capacity setting methods for make-to-order, multi-machine production systems with stochastic customer required lead times and stochastic processing times, to improve service level and tardiness. These methods are developed as decision support when capacity flexibility exists, such as a certain range of possible working hours per week. The methods differ in the amount of information used, whereby all are based on the cumulated capacity demand at each machine. In a simulation study the methods' impact on service level and tardiness is compared to a constant provided capacity for a single and a multi-machine setting. It is shown that the tested capacity setting methods can lead to an increase in service level and a decrease in average tardiness in comparison to a constant provided capacity. The methods using information on the processing time and customer required lead time distributions perform best. The results found in this paper can help practitioners to make efficient use of their flexible capacity. PMID:27226649
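One simple member of this family of methods, given as a hedged stand-in rather than the paper's exact rules: set next period's capacity to a service-level quantile of the simulated cumulated capacity demand at a machine, clipped to the flexible working-hour range (all parameter values illustrative):

```python
# Hedged sketch: quantile-based periodical capacity setting.
import numpy as np

def periodic_capacity(cum_demand_samples, service_target=0.95,
                      cap_min=32.0, cap_max=60.0):
    """cum_demand_samples: sampled cumulated capacity demand (hours/week)."""
    need = np.quantile(cum_demand_samples, service_target)
    return float(np.clip(need, cap_min, cap_max))
```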
Habibi, Narjeskhatoon; Norouzi, Alireza; Mohd Hashim, Siti Z; Shamsir, Mohd Shahir; Samian, Razip
2015-11-01
Recombinant protein overexpression, an important biotechnological process, is governed by complex and mostly unknown biological rules; an intelligent algorithm is therefore needed to avoid resource-intensive lab-based trial-and-error experiments when determining the expression level of a recombinant protein. The purpose of this study is to propose a predictive model to estimate the level of recombinant protein overexpression, for the first time in the literature, using a machine learning approach based on the sequence, expression vector, and expression host. The expression host was confined to Escherichia coli, the most popular bacterial host for overexpressing recombinant proteins. To provide a handle on the problem, the overexpression level was categorized as low, medium or high. A set of features likely to affect the overexpression level was generated based on known facts (e.g. gene length) and knowledge gathered from the related literature. A representative subset of these features was then determined using feature selection techniques. Finally, a predictive model was developed using a random forest classifier, which was able to adequately classify the multi-class, imbalanced, small dataset constructed. The results showed that the predictive model provided a promising accuracy of 80% on average in estimating the overexpression level of a recombinant protein.
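A minimal scikit-learn sketch of the final modeling step, training a random forest on the selected features to predict the low/medium/high class; the file names are placeholders, and class weighting is one common way to handle the imbalance noted above:

```python
# Hedged sketch: random forest for three-class overexpression prediction.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X = np.load('features.npy')      # hypothetical selected feature matrix
y = np.load('levels.npy')        # hypothetical labels: low/medium/high
clf = RandomForestClassifier(n_estimators=500, class_weight='balanced',
                             random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())
```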
Bos, Peter Martinus Jozef; Zeilmaker, Marco Jacob; van Eijkeren, Jan Cornelis Henri
2006-06-01
Acute exposure guideline levels (AEGLs) are derived to protect the human population from adverse health effects in case of single exposure due to an accidental release of chemicals into the atmosphere. AEGLs are set at three different levels of increasing toxicity for exposure durations ranging from 10 min to 8 h. In the AEGL setting for methylene chloride, specific additional topics had to be addressed. This included a change of relevant toxicity endpoint within the 10-min to 8-h exposure time range from central nervous system depression caused by the parent compound to formation of carboxyhemoglobin (COHb) via biotransformation to carbon monoxide. Additionally, the biotransformation of methylene chloride includes both a saturable step as well as genetic polymorphism of the glutathione transferase involved. Physiologically based pharmacokinetic modeling was considered to be the appropriate tool to address all these topics in an adequate way. Two available PBPK models were combined and extended with additional algorithms for the estimation of the maximum COHb levels. The model was validated and verified with data obtained from volunteer studies. It was concluded that all the mentioned topics could be adequately accounted for by the PBPK model. The AEGL values as calculated with the model were substantiated by experimental data with volunteers and are concluded to be practically applicable.
Comparing supervised learning techniques on the task of physical activity recognition.
Dalton, A; OLaighin, G
2013-01-01
The objective of this study was to compare the performance of base-level and meta-level classifiers on the task of physical activity recognition. Five wireless kinematic sensors were attached to each subject (n = 25) while they completed a range of basic physical activities in a controlled laboratory setting. Subjects were then asked to carry out similar self-annotated physical activities in a random order and in an unsupervised environment. A combination of time-domain and frequency-domain features was extracted from the sensor data, including the first four central moments, zero-crossing rate, average magnitude, sensor cross-correlation, sensor auto-correlation, spectral entropy and dominant frequency components. A reduced feature set was generated using a wrapper subset evaluation technique with a linear forward search, and this feature set was employed for classifier comparison. The meta-level classifier AdaBoostM1 with C4.5 Graft as its base-level classifier achieved an overall accuracy of 95%. Equal-sized datasets of subject-independent data and subject-dependent data were used to train this classifier, and high recognition rates could be achieved without the need for user-specific training. Furthermore, it was found that an accuracy of 88% could be achieved using data from the ankle and wrist sensors only.
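The winning meta-level configuration can be approximated in scikit-learn, with a CART decision tree standing in for the Weka-specific C4.5 Graft base learner; a hedged sketch with placeholder data files:

```python
# Hedged sketch: AdaBoost meta-level classifier over a tree base learner.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X = np.load('features.npy')          # hypothetical reduced feature set
y = np.load('activities.npy')        # hypothetical activity labels
base = DecisionTreeClassifier()      # stand-in for C4.5 Graft
meta = AdaBoostClassifier(estimator=base,  # base_estimator= in older sklearn
                          n_estimators=50, random_state=0)
print(cross_val_score(meta, X, y, cv=5).mean())
```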
Analysis of pressure distortion testing
NASA Technical Reports Server (NTRS)
Koch, K. E.; Rees, R. L.
1976-01-01
The development of a distortion methodology, method D, was documented, and its application to steady state and unsteady data was demonstrated. Three methodologies based upon DIDENT, a NASA-LeRC distortion methodology based upon the parallel compressor model, were investigated by applying them to a set of steady state data. The best formulation was then applied to an independent data set. The good correlation achieved with this data set showed that method E, one of the above methodologies, is a viable concept. Unsteady data were analyzed by using the method E methodology. This analysis pointed out that the method E sensitivities are functions of pressure defect level as well as corrected speed and pattern.
Geodynamics branch data base for main magnetic field analysis
NASA Technical Reports Server (NTRS)
Langel, Robert A.; Baldwin, R. T.
1991-01-01
The data sets used in geomagnetic field modeling at GSFC are described. Data are measured and obtained from a variety of information sources. For clarity, data sets from different sources are categorized and processed separately. The data base is composed of magnetic observatory data, surface data, high quality aeromagnetic data, high quality total intensity marine data, satellite data, and repeat data. These individual data categories are described in detail in a series of notebooks in the Geodynamics Branch, GSFC. This catalog reviews the original data sets, the processing history, and the final data sets available for each individual category of the data base, and is to be used as a reference manual for the notebooks. Each data type used in geomagnetic field modeling has varying levels of complexity, requiring specialized processing routines for satellite and observatory data and two general routines for processing aeromagnetic, marine, land survey, and repeat data.
Raccanello, Daniela; Brondino, Margherita; De Bernardi, Bianca
2013-12-01
The present work investigates students' representations of achievement emotions, focusing on context-specific situations in terms of settings and subject domains, as a function of grade level. We involved 527 fourth-, seventh-, and eleventh-graders, who evaluated ten discrete emotions through questionnaires, with reference to verbal language and mathematics and to different settings (class, homework, tests). Confirmatory multitrait-multimethod analyses indicated higher salience of subject domains rather than settings for all the emotions; however, the complexity of reality was best explained when settings were also accounted for. Analyses of variance revealed higher intensity of positive emotions for younger students, and the opposite pattern for older students; significant differences for most of the emotions based on the evaluative nature of settings, moderated by class level; and more intense positive emotions for mathematics and more intense negative emotions for Italian. Results are discussed considering their theoretical and applied relevance, corroborating previous literature on domain-specificity.
2010-09-01
SET-171 Mid-IR Fiber Laser Workshop, 28-29 September 2010 (Fraunhofer IOF slide material): fiber joining and packaging techniques including laser splicing/welding, contact bonding, and wafer-level bonding (fusion, anodic, eutectic, glass-frit); a laser-based splicing and tapering device with process control was developed, demonstrated on a multimode fiber (ø720 µm) with a spliced end cap (ø1500 µm).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Zhijie; Li, Dongsheng; Xu, Wei
2015-04-01
In atom probe tomography (APT), accurate reconstruction of the spatial positions of field evaporated ions from measured detector patterns depends upon a correct understanding of the dynamic tip shape evolution and evaporation laws of component atoms. Artifacts in APT reconstructions of heterogeneous materials can be attributed to the assumption of homogeneous evaporation of all the elements in the material in addition to the assumption of a steady state hemispherical dynamic tip shape evolution. A level set method based specimen shape evolution model is developed in this study to simulate the evaporation of synthetic layered-structured APT tips. The simulation results of the shape evolution by the level set model qualitatively agree with the finite element method and the literature data using the finite difference method. The asymmetric evolving shape predicted by the level set model demonstrates the complex evaporation behavior of heterogeneous tips, and the interface curvature can potentially lead to artifacts in the APT reconstruction of such materials. Compared with other APT simulation methods, the new method provides smoother interface representation with the aid of the intrinsic sub-grid accuracy. Two evaporation models (linear and exponential evaporation laws) are implemented in the level set simulations and the effect of evaporation laws on the tip shape evolution is also presented.
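The tip-shape evolution described above is a standard level set front-propagation problem. The sketch below is an illustration only, not the authors' code: the grid, the periodic boundaries via np.roll, and the spatially varying normal speed F are all assumptions.

```python
import numpy as np

def evolve(phi, F, dx, dt, steps):
    """First-order Godunov upwind evolution of phi_t + F*|grad phi| = 0,
    valid for F >= 0 (outward-moving evaporation front).
    np.roll implies periodic boundaries, a simplification for the sketch."""
    for _ in range(steps):
        # One-sided differences in x and y
        dxm = (phi - np.roll(phi, 1, axis=0)) / dx
        dxp = (np.roll(phi, -1, axis=0) - phi) / dx
        dym = (phi - np.roll(phi, 1, axis=1)) / dx
        dyp = (np.roll(phi, -1, axis=1) - phi) / dx
        # Godunov upwind gradient magnitude for F >= 0
        grad = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                       np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
        phi = phi - dt * F * grad
    return phi
```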
Natural concepts in a juvenile gorilla (Gorilla gorilla gorilla) at three levels of abstraction.
Vonk, Jennifer; MacDonald, Suzanne E
2002-01-01
The extent to which nonhumans are able to form conceptual versus perceptual discriminations remains a matter of debate. Among the great apes, only chimpanzees have been tested for conceptual understanding, defined as the ability to form discriminations not based solely on simple perceptual features of stimuli, and to transfer this learning to novel stimuli. In the present investigation, a young captive female gorilla was trained at three levels of abstraction (concrete, intermediate, and abstract) involving sets of photographs representing natural categories (e.g., orangutans vs. humans, primates vs. nonprimate animals, animals vs. foods). Within each level of abstraction, when the gorilla had learned to discriminate positive from negative exemplars in one set of photographs, a novel set was introduced. Transfer was defined in terms of high accuracy during the first two sessions with the new stimuli. The gorilla acquired discriminations at all three levels of abstraction but showed unambiguous transfer only with the concrete and abstract stimulus sets. Detailed analyses of response patterns revealed little evidence of control by simple stimulus features. Acquisition and transfer involving abstract stimulus sets suggest a conceptual basis for gorilla categorization. The gorilla's relatively poor performance with intermediate-level discriminations parallels findings with pigeons, and suggests a need to reconsider the role of perceptual information in discriminations thought to indicate conceptual behavior in nonhumans. PMID:12507006
Discrete Ordinate Quadrature Selection for Reactor-based Eigenvalue Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarrell, Joshua J; Evans, Thomas M; Davidson, Gregory G
2013-01-01
In this paper we analyze the effect of various quadrature sets on the eigenvalues of several reactor-based problems, including a two-dimensional (2D) fuel pin, a 2D lattice of fuel pins, and a three-dimensional (3D) reactor core problem. While many quadrature sets have been applied to neutral particle discrete ordinate transport calculations, the Level Symmetric (LS) and the Gauss-Chebyshev product (GC) sets are the most widely used in production-level reactor simulations. Other quadrature sets, such as Quadruple Range (QR) sets, have been shown to be more accurate in shielding applications. In this paper, we compare the LS, GC, QR, and the recently developed linear-discontinuous finite element (LDFE) sets, as well as give a brief overview of other proposed quadrature sets. We show that, for a given number of angles, the QR sets are more accurate than the LS and GC in all types of reactor problems analyzed (2D and 3D). We also show that the LDFE sets are more accurate than the LS and GC sets for these problems. We conclude that, for problems where tens to hundreds of quadrature points (directions) per octant are appropriate, QR sets should regularly be used because they have similar integration properties as the LS and GC sets, have no noticeable impact on the speed of convergence of the solution when compared with other quadrature sets, and yield more accurate results. We note that, for very high-order scattering problems, the QR sets exactly integrate fewer angular flux moments over the unit sphere than the GC sets. The effects of those inexact integrations have yet to be analyzed. We also note that the LDFE sets only exactly integrate the zeroth and first angular flux moments. Pin power comparisons and analyses are not included in this paper and are left for future work.
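For readers unfamiliar with the GC product sets compared above, the sketch below shows one standard construction (Gauss-Legendre points in the polar cosine, equally weighted and equally spaced azimuthal angles per level); it is a generic illustration, not the code used in the paper.

```python
import numpy as np

def gauss_chebyshev_product(n_polar, n_azi):
    """Gauss-Chebyshev product quadrature on the unit sphere.
    Returns direction cosines and weights; weights sum to 4*pi."""
    mu, w_mu = np.polynomial.legendre.leggauss(n_polar)  # nodes on [-1, 1]
    omega = (np.arange(n_azi) + 0.5) * (2 * np.pi / n_azi)
    dirs, wts = [], []
    for m, wm in zip(mu, w_mu):
        s = np.sqrt(1.0 - m * m)
        for o in omega:
            dirs.append((s * np.cos(o), s * np.sin(o), m))
            wts.append(wm * 2 * np.pi / n_azi)
    return np.array(dirs), np.array(wts)

dirs, wts = gauss_chebyshev_product(4, 8)
assert abs(wts.sum() - 4 * np.pi) < 1e-12  # sanity check: full-sphere weight
```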
Gademan, Maaike G J; Hofstede, Stefanie N; Vliet Vlieland, Thea P M; Nelissen, Rob G H H; Marang-van de Mheen, Perla J
2016-11-09
This systematic review gives an overview of guidelines and original publications as well as the evidence on which the currently proposed indication criteria are based. Until now such a state-of-the-science overview was lacking. Websites of orthopaedic and arthritis organizations (English/Dutch language) were independently searched by two authors for THA/TKA guidelines for OA. Furthermore, a systematic search strategy in several databases through August 2014 was performed. Quality of the guidelines was assessed with the AGREE II instrument, which consists of 6 domains (maximum summed score of 6 indicating high quality). Also, the level of evidence of all included studies was assessed. We found 6 guidelines and 18 papers, out of 3065 references. The quality of the guidelines summed across 6 domains ranged from 0.46 to 4.78. In total, 12 THA, 10 TKA and 2 THA/TKA indication sets were found. Four studies stated that no evidence-based indication criteria are available. Indication criteria concerning THA/TKA consisted of the following domains: pain (in respectively 11 and 10 sets), function (12 and 7 sets), radiological changes (10 and 9 sets), failed conservative therapy (8 and 4 sets) and other indications (6 and 7 sets). Specific cut-off values or ranges were often not stated and the level of evidence was low. The indication criteria for THA/TKA are based on limited evidence. Empirical research is needed, especially regarding domain specific cut-off values or ranges at which the best postoperative outcomes are achieved for patients, taking into account the limited lifespan of a prosthesis.
Wang, Haili; Tso, Victor; Wong, Clarence; Sadowski, Dan; Fedorak, Richard N
2014-03-20
Adenomatous polyps are precursors of colorectal cancer; their detection and removal is the goal of colon cancer screening programs. However, fecal-based methods identify patients with adenomatous polyps with low levels of sensitivity. The aim of this study was to develop a highly accurate, prototypic, proof-of-concept, spot urine-based diagnostic test using metabolomic technology to distinguish persons with adenomatous polyps from those without polyps. Prospective urine and stool samples were collected from 876 participants undergoing colonoscopy examination in a colon cancer screening program, from April 2008 to October 2009 at the University of Alberta. The colonoscopy reference standard identified 633 participants with no colonic polyps and 243 with colonic adenomatous polyps. One-dimensional nuclear magnetic resonance spectra of urine metabolites were analyzed to define a diagnostic metabolomic profile for colonic adenomas. A urine metabolomic diagnostic test for colonic adenomatous polyps was established using 67% of the samples (un-blinded training set) and validated using the other 33% of the samples (blinded testing set). The urine metabolomic diagnostic test's specificity and sensitivity were compared with those of fecal-based tests. Using a two-component, orthogonal, partial least-squares model of the metabolomic profile, the un-blinded training set identified patients with colonic adenomatous polyps with 88.9% sensitivity and 50.2% specificity. Validation using the blinded testing set confirmed sensitivity and specificity values of 82.7% and 51.2%, respectively. Sensitivities of fecal-based tests to identify colonic adenomas ranged from 2.5 to 11.9%. We describe a proof-of-concept spot urine-based metabolomic diagnostic test that identifies patients with colonic adenomatous polyps with a greater level of sensitivity (83%) than fecal-based tests.
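As a sketch of the train/validate protocol described above: synthetic data and a plain two-component PLS model stand in here for the study's NMR features and its orthogonal PLS variant, so nothing below reproduces the actual diagnostic model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(876, 50))               # placeholder NMR-derived features
y = (rng.random(876) < 0.28).astype(float)   # 1 = adenomatous polyp (toy labels)

# ~67% un-blinded training set, ~33% held-out testing set, as in the study
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.33, random_state=0)
pls = PLSRegression(n_components=2).fit(X_tr, y_tr)
pred = (pls.predict(X_te).ravel() > 0.5).astype(float)

sens = ((pred == 1) & (y_te == 1)).sum() / (y_te == 1).sum()
spec = ((pred == 0) & (y_te == 0)).sum() / (y_te == 0).sum()
print(f"sensitivity={sens:.2f} specificity={spec:.2f}")
```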
ERIC Educational Resources Information Center
Murphy, Gregory J.
2012-01-01
This quantitative study explores the 2010 recommendation of the Educational Funding Advisory Board to consider the Evidence-Based Adequacy model of school funding in Illinois. This school funding model identifies and costs research based practices necessary in a prototypical school and sets funding levels based upon those practices. This study…
Hasson, Henna; Arnetz, Judith E
2008-02-01
The aims of this study were to: (1) compare older people care nursing staff's perceptions of their competence, work strain and work satisfaction in nursing homes and home-based care; and (2) to examine determinants of work satisfaction in both care settings. The shift in older people care from hospitals to community-based facilities and home care has had implications for nursing practice. Lack of competence development, high levels of work strain and low levels of work satisfaction among nursing staff in both care settings have been associated with high turnover. Few studies have compared staff perceptions of their competence and work in nursing homes as opposed to home-based care. A cross-sectional questionnaire survey. Nursing staff perceptions of their competence, work strain, stress and satisfaction were measured by questionnaire in 2003 in two older people care organizations in Sweden. Comparisons of all outcome variables were made between care settings both within and between the two organizations. Multiple regression analysis was used to determine predictors of work satisfaction in home care and nursing homes respectively. In general, staff in home-based care reported significantly less sufficient knowledge compared with staff in nursing homes. However, home care staff experienced significantly less physical and emotional strain compared with staff in nursing homes. Ratings of work-related exhaustion, mental energy and overall work satisfaction did not differ significantly between care settings. In both care settings, work-related exhaustion was the strongest (inverse) predictor of work satisfaction. Future interventions should focus on counteracting work-related exhaustion and improving competence development to improve work satisfaction among older people care nursing staff in both care settings. Relevance to clinical practice. Work-related exhaustion and lack of competence development may have significant negative implications for work satisfaction among older people care nursing staff in both home care and nursing homes.
Battery control system for hybrid vehicle and method for controlling a hybrid vehicle battery
Bockelmann, Thomas R [Battle Creek, MI; Beaty, Kevin D [Kalamazoo, MI; Zou, Zhanijang [Battle Creek, MI; Kang, Xiaosong [Battle Creek, MI
2009-07-21
A battery control system for controlling a state of charge of a hybrid vehicle battery includes a detecting arrangement for determining a vehicle operating state or an intended vehicle operating state and a controller for setting a target state of charge level of the battery based on the vehicle operating state or the intended vehicle operating state. The controller is operable to set a target state of charge level at a first level during a mobile vehicle operating state and at a second level during a stationary vehicle operating state or in anticipation of the vehicle operating in the stationary vehicle operating state. The invention further includes a method for controlling a state of charge of a hybrid vehicle battery.
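A minimal sketch of the claimed control logic follows; the numeric target levels are hypothetical, since the patent abstract does not specify values.

```python
# Hypothetical state-of-charge targets (assumptions, not from the patent)
MOBILE_TARGET_SOC = 0.60       # first level: vehicle in a mobile operating state
STATIONARY_TARGET_SOC = 0.80   # second level: stationary, or anticipated stationary

def target_soc(operating_state, anticipated_state=None):
    """Return the battery's target state of charge.
    An anticipated stationary state overrides the current mobile state,
    matching the 'in anticipation of' clause above."""
    state = anticipated_state or operating_state
    return STATIONARY_TARGET_SOC if state == "stationary" else MOBILE_TARGET_SOC
```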
Krüger, Manuela; Stockinger, Herbert; Krüger, Claudia; Schüssler, Arthur
2009-01-01
At present, molecular ecological studies of arbuscular mycorrhizal fungi (AMF) are only possible above species level when targeting entire communities. To improve molecular species characterization and to allow species level community analyses in the field, a set of newly designed AMF specific PCR primers was successfully tested. Nuclear rDNA fragments from diverse phylogenetic AMF lineages were sequenced and analysed to design four primer mixtures, each targeting one binding site in the small subunit (SSU) or large subunit (LSU) rDNA. To allow species resolution, they span a fragment covering the partial SSU, whole internal transcribed spacer (ITS) rDNA region and partial LSU. The new primers are suitable for specifically amplifying AMF rDNA from material that may be contaminated by other organisms (e.g., samples from pot cultures or the field), characterizing the diversity of AMF species from field samples, and amplifying a SSU-ITS-LSU fragment that allows phylogenetic analyses with species level resolution. The PCR primers can be used to monitor entire AMF field communities, based on a single rDNA marker region. Their application will improve the base for deep sequencing approaches; moreover, they can be efficiently used as DNA barcoding primers.
A geometric level set model for ultrasounds analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarti, A.; Malladi, R.
We propose a partial differential equation (PDE) for filtering and segmentation of echocardiographic images based on a geometric-driven scheme. The method allows edge-preserving image smoothing and a semi-automatic segmentation of the heart chambers that regularizes the shapes and improves edge fidelity, especially in the presence of distinct gaps in the edge map as is common in ultrasound imagery. A numerical scheme for solving the proposed PDE is borrowed from level set methods. Results on human in vivo acquired 2D, 2D+time, 3D, and 3D+time echocardiographic images are shown.
ERIC Educational Resources Information Center
Agbuga, Bulent
2011-01-01
Most studies focusing on the relationship between physical activity and obesity have been conducted in middle class Caucasian adults and children and few such studies are available concerning minority children in physical activity settings (Johnson, Kulinna, Tudor-Locke, Darst, & Pangrazi, 2007; Rowlands et al., 1999; Tudor-Locke, Lee, Morgan,…
Zhang, Zhi-Feng; Gao, Zhan; Liu, Yuan-Yuan; Jiang, Feng-Chun; Yang, Yan-Li; Ren, Yu-Fen; Yang, Hong-Jun; Yang, Kun; Zhang, Xiao-Dong
2012-01-01
Train wheel sets must be periodically inspected for possible or actual premature failures, and it is important to record the wear history over the full service life of each wheel set. This means that an online measuring system could be of great benefit to overall process control. An online non-contact method for measuring a wheel set's geometric parameters based on the opto-electronic measuring technique is presented in this paper. A charge coupled device (CCD) camera with a selected optical lens and a frame grabber was used to capture the image of the light profile of the wheel set illuminated by a linear laser. The analogue signals of the image were transformed into corresponding digital grey level values. The 'mapping function method' is used to transform image pixel coordinates into space coordinates. The images of wheel sets were captured as the train passed through the measuring system. The rim inside thickness and flange thickness were measured and analyzed. The spatial resolution of the whole image capturing system is about 0.33 mm. Theoretical and experimental results show that the online measurement system based on computer vision can meet wheel set measurement requirements.
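The 'mapping function method' amounts to calibrating a map from pixel coordinates to space coordinates. The sketch below assumes a low-order 2D polynomial form fitted by least squares to known calibration targets; the paper's exact mapping function is not given here.

```python
import numpy as np

def _poly_cols(uv, deg):
    """Monomial columns u^i * v^j with i + j <= deg."""
    u, v = uv[:, 0], uv[:, 1]
    return np.column_stack([u**i * v**j
                            for i in range(deg + 1)
                            for j in range(deg + 1 - i)])

def fit_mapping(uv, xy, deg=2):
    """Least-squares fit of x(u,v), y(u,v); needs >= 6 targets for deg=2."""
    coef, *_ = np.linalg.lstsq(_poly_cols(uv, deg), xy, rcond=None)
    return coef

def apply_mapping(coef, uv, deg=2):
    """Map measured pixel coordinates to space coordinates."""
    return _poly_cols(uv, deg) @ coef
```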
Data Set for Pathology Reporting of Cutaneous Invasive Melanoma
Judge, Meagan J.; Evans, Alan; Frishberg, David P.; Prieto, Victor G.; Thompson, John F.; Trotter, Martin J.; Walsh, Maureen Y.; Walsh, Noreen M.G.; Ellis, David W.
2013-01-01
An accurate and complete pathology report is critical for the optimal management of cutaneous melanoma patients. Protocols for the pathologic reporting of melanoma have been independently developed by the Royal College of Pathologists of Australasia (RCPA), Royal College of Pathologists (United Kingdom) (RCPath), and College of American Pathologists (CAP). In this study, data sets, checklists, and structured reporting protocols for pathologic examination and reporting of cutaneous melanoma were analyzed by an international panel of melanoma pathologists and clinicians with the aim of developing a common, internationally agreed upon, evidence-based data set. The International Collaboration on Cancer Reporting cutaneous melanoma expert review panel analyzed the existing RCPA, RCPath, and CAP data sets to develop a protocol containing “required” (mandatory/core) and “recommended” (nonmandatory/noncore) elements. Required elements were defined as those that had agreed evidentiary support at National Health and Medical Research Council level III-2 level of evidence or above and that were unanimously agreed upon by the review panel to be essential for the clinical management, staging, or assessment of the prognosis of melanoma or fundamental for pathologic diagnosis. Recommended elements were those considered to be clinically important and recommended for good practice but with lesser degrees of supportive evidence. Sixteen core/required data elements for cutaneous melanoma pathology reports were defined (with an additional 4 core/required elements for specimens received with lymph nodes). Eighteen additional data elements with a lesser level of evidentiary support were included in the recommended data set. Consensus response values (permitted responses) were formulated for each data item. Development and agreement of this evidence-based protocol at an international level was accomplished in a timely and efficient manner, and the processes described herein may facilitate the development of protocols for other tumor types. Widespread utilization of an internationally agreed upon, structured pathology data set for melanoma will lead not only to improved patient management but is a prerequisite for research and for international benchmarking in health care. PMID:24061524
A Coastal Hazards Data Base for the U.S. West Coast (1997) (NDP-043C)
Gornitz, Vivien M. [Columbia Univ., New York, NY (United States); Beaty, Tammy W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Daniels, Richard C. [The University of Tennessee, Knoxville, TN (United States)
1997-01-01
This data base integrates point, line, and polygon data for the U.S. West Coast into 0.25 degree latitude by 0.25 degree longitude grid cells and into 1:2,000,000 digitized line segments that can be used by raster or vector geographic information systems (GIS) as well as by non-GIS data bases. Each coastal grid cell and line segment contains data variables from the following seven data sets: elevation, geology, geomorphology, sea-level trends, shoreline displacement (erosion/accretion), tidal ranges, and wave heights. One variable from each data set was classified according to its susceptibility to sea-level rise and/or erosion to form 7 relative risk variables. These risk variables range in value from 1 to 5 and may be used to calculate a Coastal Vulnerability Index (CVI). Algorithms used to calculate several CVIs are listed within this text.
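One widely used CVI formulation is the square root of the mean of the product of the classified risk variables; whether NDP-043C uses exactly this form is an assumption of the sketch below.

```python
import math

def coastal_vulnerability_index(risk_vars):
    """risk_vars: the 7 relative risk classes, each an integer 1..5.
    Returns sqrt(product / n), one common CVI definition (assumed here)."""
    return math.sqrt(math.prod(risk_vars) / len(risk_vars))

# Example grid cell with moderate risk in most variables
print(coastal_vulnerability_index([3, 4, 2, 5, 3, 1, 4]))
```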
ERIC Educational Resources Information Center
Larkin, Wallace; Hawkins, Renee O.; Collins, Tai
2016-01-01
Functional behavior assessments and function-based interventions are effective methods for addressing the challenging behaviors of children; however, traditional functional analysis has limitations that impact usability in applied settings. Trial-based functional analysis addresses concerns relating to the length of time, level of expertise…
12 CFR 652.65 - Risk-based capital stress test.
Code of Federal Regulations, 2014 CFR
2014-01-01
... defaulted loans in the data set (20.9 percent). (3) You will calculate losses by multiplying the loss rate...) Data requirements. You will use the following data to implement the risk-based capital stress test. (1) You will use Corporation loan-level data to implement the credit risk component of the risk-based...
12 CFR 652.65 - Risk-based capital stress test.
Code of Federal Regulations, 2013 CFR
2013-01-01
... defaulted loans in the data set (20.9 percent). (3) You will calculate losses by multiplying the loss rate...) Data requirements. You will use the following data to implement the risk-based capital stress test. (1) You will use Corporation loan-level data to implement the credit risk component of the risk-based...
12 CFR 652.65 - Risk-based capital stress test.
Code of Federal Regulations, 2012 CFR
2012-01-01
... defaulted loans in the data set (20.9 percent). (3) You will calculate losses by multiplying the loss rate...) Data requirements. You will use the following data to implement the risk-based capital stress test. (1) You will use Corporation loan-level data to implement the credit risk component of the risk-based...
Liam, Chong-Kin; Pang, Yong-Kek; Chua, Keong-Tiong
2014-06-01
To evaluate Malaysian patients' satisfaction levels and asthma control with Symbicort SMART® in the primary care setting. This is a cross-sectional, multicentre study involving adult patients with persistent asthma who were prescribed only Symbicort SMART in the preceding one month prior to recruitment. Patients' satisfaction with Symbicort SMART and asthma control were evaluated using the self-administered Satisfaction with Asthma Treatment Questionnaire (SATQ) and the Asthma Control Test (ACT). Asthma was controlled (ACT score >20) in 189 (83%) of 228 patients. The mean overall SATQ score for patients with controlled asthma was 5.65 indicating a high satisfaction level, which was positively correlated with high ACT scores. There were differences in asthma control based on ethnicity, number of unscheduled visits and treatment compliance. Symbicort SMART resulted in a high satisfaction level and asthma control among Malaysian patients treated in the primary care setting and it is an effective and appealing treatment for asthmatic patients.
3D Segmentation with an application of the level set method using MRI volumes for image guided surgery.
Bosnjak, A; Montilla, G; Villegas, R; Jara, I
2007-01-01
This paper proposes an innovation in image guided surgery through a comparative study of three different methods of segmentation. The proposed segmentation is faster than manual segmentation of images, with the advantage that it allows the same patient to be used as the anatomical reference, which is more precise than a generic atlas. The new methodology for 3D information extraction is based on a processing chain structured into the following modules: 1) 3D filtering: the purpose is to preserve the contours of the structures and to smooth the homogeneous areas; several filters were tested and finally an anisotropic diffusion filter was used. 2) 3D segmentation: this module compares three different methods: a region growing algorithm, hand-assisted cubic splines, and the level set method. It then proposes a level set approach based on front propagation that allows reconstruction of the internal walls of the anatomical structures of the brain. 3) 3D visualization: the new contribution of this work consists of the visualization of the segmented model and its use in pre-surgery planning.
Robust space-time extraction of ventricular surface evolution using multiphase level sets
NASA Astrophysics Data System (ADS)
Drapaca, Corina S.; Cardenas, Valerie; Studholme, Colin
2004-05-01
This paper focuses on the problem of accurately extracting the CSF-tissue boundary, particularly around the ventricular surface, from serial structural MRI of the brain acquired in imaging studies of aging and dementia. This is a challenging problem because of the common occurrence of peri-ventricular lesions which locally alter the appearance of white matter. We examine a level set approach which evolves a four dimensional description of the ventricular surface over time. This has the advantage of allowing constraints on the contour in the temporal dimension, improving the consistency of the extracted object over time. We follow the approach proposed by Chan and Vese which is based on the Mumford and Shah model and implemented using the Osher and Sethian level set method. We have extended this to the 4 dimensional case to propagate a 4D contour toward the tissue boundaries through the evolution of a 5D implicit function. For convergence we use region-based information provided by the image rather than the gradient of the image. This is adapted to allow intensity contrast changes between time frames in the MRI sequence. Results on time sequences of 3D brain MR images are presented and discussed.
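For intuition, here is a minimal 2D sketch of the region-based Chan-Vese update underlying this approach; the paper itself evolves a 4D contour via a 5D implicit function and includes a curvature regularization term omitted here for brevity.

```python
import numpy as np

def chan_vese_step(phi, img, dt=0.1, eps=1.0):
    """One gradient step of the Chan-Vese region term: fit the mean
    intensities c1 (inside, phi > 0) and c2 (outside), then move the
    front toward the better-fitting region."""
    inside = phi > 0
    c1 = img[inside].mean() if inside.any() else 0.0
    c2 = img[~inside].mean() if (~inside).any() else 0.0
    # Smoothed Dirac delta concentrates the update near the zero level set
    delta = eps / (np.pi * (eps**2 + phi**2))
    force = -(img - c1)**2 + (img - c2)**2
    return phi + dt * delta * force
```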
GSOD Based Daily Global Mean Surface Temperature and Mean Sea Level Air Pressure (1982-2011)
Xuan Shi, Dali Wang
2014-05-05
This data product contains the gridded data sets at 1/4-degree resolution in ASCII format; both mean temperature and mean sea level air pressure data are available. It also contains the underlying GSOD data (1982-2011) from the NOAA site, including station number, location, temperature, and pressures (sea level and station level). The data package also contains information on the data processing methods.
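A sketch of how daily station records can be binned onto the 1/4-degree grid described above; the column layout and the simple cell-averaging rule are assumptions, not the product's documented processing method.

```python
import numpy as np

def grid_daily_mean(lat, lon, temp, res=0.25):
    """Average station values into res-degree cells; NaN where no station."""
    nlat, nlon = int(180 / res), int(360 / res)
    i = np.clip(((lat + 90.0) / res).astype(int), 0, nlat - 1)
    j = np.clip(((lon + 180.0) / res).astype(int), 0, nlon - 1)
    total = np.zeros((nlat, nlon))
    count = np.zeros((nlat, nlon))
    np.add.at(total, (i, j), temp)   # accumulate per-cell sums
    np.add.at(count, (i, j), 1.0)    # and per-cell station counts
    return np.where(count > 0, total / np.maximum(count, 1.0), np.nan)
```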
Paths to Upper Level Positions in Public Relations
ERIC Educational Resources Information Center
Bishop, Patrick J.
2010-01-01
Preparation for a career in the field of public relations (PR) is based on a set of unique core competencies typically found in liberal arts. Though PR professionals rarely gain business degrees, they acquire knowledge, skills, perspectives, and strategies well-suited to executive-level positions in business. Additionally, managerial positions in…
Comparison of data used for setting occupational exposure limits.
Schenk, Linda
2010-01-01
It has previously been shown that occupational exposure limits (OELs) for the same substance can vary significantly between different standard-setters. The work presented in this paper identifies the steps in the process towards establishing an OEL and how variations in those processes could account for these differences. This study selects for further scrutiny substances for which the levels of OELs vary by a factor of 100, focussing on 45 documents concerning 14 substances from eight standard-setters. Several of the OELs studied were more than 20 years old and based on outdated knowledge. Furthermore, different standard-setters sometimes based their OELs on different sets of data, and data availability alone could not explain all differences in the selection of data sets used by standard-setters. While the interpretation of key studies did not differ significantly between standard-setters' documentations, the evaluations of the key studies' quality did. Also, differences concerning the critical effect coincided with differences in the levels of OELs for half of the substances.
Mid-Holocene hydrologic model of the Shingobee watershed, Minnesota
Filby, S.K.; Locke, Sharon M.; Person, M.A.; Winter, T.C.; Rosenberry, D.O.; Nieber, J.L.; Gutowski, W.J.; Ito, E.
2002-01-01
A hydrologic model of the Shingobee Watershed in north-central Minnesota was developed to reconstruct mid-Holocene paleo-lake levels for Williams Lake, a surface-water body located in the southern portion of the watershed. Hydrologic parameters for the model were first estimated in a calibration exercise using a 9-yr historical record (1990-1998) of climatic and hydrologic stresses. The model reproduced observed temporal and spatial trends in surface/groundwater levels across the watershed. Mid-Holocene aquifer and lake levels were then reconstructed using two paleoclimatic data sets: CCM1 atmospheric general circulation model output and pollen-transfer functions using sediment core data from Williams Lake. Calculated paleo-lake levels based on pollen-derived paleoclimatic reconstructions indicated a 3.5-m drop in simulated lake levels and were in good agreement with the position of mid-Holocene beach sands observed in a Williams Lake sediment core transect. However, calculated paleo-lake levels based on CCM1 climate forcing produced only a 0.05-m drop in lake levels. We found that decreases in winter precipitation rather than temperature increases had the largest effect on simulated mid-Holocene lake levels. The study illustrates how watershed models can be used to critically evaluate paleoclimatic reconstructions by integrating geologic, climatic, limnologic, and hydrogeologic data sets. © 2002 University of Washington.
A High-Level Language for Modeling Algorithms and Their Properties
NASA Astrophysics Data System (ADS)
Akhtar, Sabina; Merz, Stephan; Quinson, Martin
Designers of concurrent and distributed algorithms usually express them using pseudo-code. In contrast, most verification techniques are based on more mathematically-oriented formalisms such as state transition systems. This conceptual gap contributes to hindering the use of formal verification techniques. Leslie Lamport introduced PlusCal, a high-level algorithmic language that has the "look and feel" of pseudo-code, but is equipped with a precise semantics and includes a high-level expression language based on set theory. PlusCal models can be compiled to TLA+ and verified using the model checker TLC.
A comparative study of two hazard handling training methods for novice drivers.
Wang, Y B; Zhang, W; Salvendy, G
2010-10-01
The effectiveness of two hazard perception training methods, simulation-based error training (SET) and video-based guided error training (VGET), for novice drivers' hazard handling performance was tested, compared, and analyzed. Thirty-two novice drivers participated in the hazard perception training. Half of the participants were trained using SET by making errors and/or experiencing accidents while driving with a desktop simulator. The other half were trained using VGET by watching prerecorded video clips of errors and accidents that were made by other people. The two groups had exposure to equal numbers of errors for each training scenario. All the participants were tested and evaluated for hazard handling on a full cockpit driving simulator one week after training. Hazard handling performance and hazard response were measured in this transfer test. Both hazard handling performance scores and hazard response distances were significantly better for the SET group than the VGET group. Furthermore, the SET group had more metacognitive activities and intrinsic motivation. SET also seemed more effective in changing participants' confidence, but the result did not reach the significance level. SET exhibited a higher training effectiveness of hazard response and handling than VGET in the simulated transfer test. The superiority of SET might benefit from the higher levels of metacognition and intrinsic motivation during training, which was observed in the experiment. Future research should be conducted to assess whether the advantages of error training are still effective under real road conditions.
Kibicho, Jennifer; Pinkerton, Steven D.; Owczarzak, Jill; Mkandawire–Valhmu, Lucy; Kako, Peninnah M.
2016-01-01
Objectives: To describe community pharmacists' perceptions of their current role in direct patient care services, an expanded role for pharmacists in providing patient care services, and changes needed to optimally use pharmacists' expertise to provide high-quality direct patient care services to people living with human immunodeficiency virus (HIV) infections. Design: Cross-sectional study. Setting: Four Midwestern cities in the United States, August through October 2009. Participants: 28 community-based pharmacists practicing in 17 pharmacies. Interventions: Interviews. Main Outcome Measures: Opinions of participants about the roles of specialty and nonspecialty pharmacists in caring for patients living with HIV infections. Results: Pharmacists noted that although challenges in our health care system characterized by inaccessible health professionals presented opportunities for a greater pharmacist role, there were missed opportunities for a greater level of patient care services in many community-based nonspecialty settings. Many pharmacists in semispecialty and nonspecialty pharmacies expressed a desire for an expanded role in patient care congruent with their pharmacy education and training. Conclusion: Structural-level policy changes needed to transform community-based pharmacy settings to patient-centered medical homes include recognizing pharmacists as important players in the multidisciplinary health care team, extending the health information exchange highway to include pharmacist-generated electronic therapeutic records, and realigning financial incentives. Comprehensive policy initiatives are needed to optimize the use of highly trained pharmacists in enhancing the quality of health care to an ever-growing number of Americans with chronic conditions who access care in community-based pharmacy settings. PMID:25575148
Steel shear strength of anchors with stand-off base plates : [technical summary].
DOT National Transportation Integrated Search
2013-09-01
Sign and signal structures are often connected to concrete foundations through an annular base plate set on anchor bolts. The plate is leveled with nuts beneath it and secured with nuts above it (a double-nut connection). In many in...
Shared Bases of Influence within a College
ERIC Educational Resources Information Center
Simplicio, Joseph S. C.
2009-01-01
This article discusses several strategies for building varied bases of influence within a college setting. These strategies include, providing opportunities for success, advocating and implementing good ideas, interacting with individuals on both the personal and non-personal levels, establishing support among marginal individuals, learning to…
Modelling wildland fire propagation by tracking random fronts
NASA Astrophysics Data System (ADS)
Pagnini, G.; Mentrelli, A.
2013-11-01
Wildland fire propagation is studied in the literature by two alternative approaches, namely the reaction-diffusion equation and the level-set method. These two approaches are considered alternatives to each other because the solution of the reaction-diffusion equation is generally a continuous smooth function with exponential decay and infinite support, while the level-set method, which is a front tracking technique, generates a sharp function with finite support. However, these two approaches can indeed be considered complementary and reconciled. Turbulent hot-air transport and fire spotting are phenomena with a random character that are extremely important in wildland fire propagation; as a consequence, the fire front acquires a random character, too. Hence a tracking method for random fronts is needed. In particular, the level-set contour is here randomized according to the probability density function of the interface particle displacement. When the level-set method is developed to track a front interface with random motion, the resulting averaged process turns out to be governed by an evolution equation of the reaction-diffusion type. In this reconciled approach, the rate of spread of the fire keeps the same key and characterizing role proper to the level-set approach. The resulting model is suitable for simulating effects due to turbulent convection, such as flank fire and backing fire, the faster fire spread caused by hot-air pre-heating and ember landing, and also the fire overcoming a firebreak zone, a case not resolved by models based on the level-set method alone. Moreover, from the proposed formulation a correction to the rate-of-spread formula follows, due to the mean jump-length of firebrands in the downwind direction for the leeward sector of the fireline contour.
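The randomization idea can be sketched as smoothing the sharp level-set indicator with the PDF of the interface particle displacement; the sketch below assumes a Gaussian displacement PDF purely for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def effective_fire_indicator(phi, sigma_cells):
    """Take the sharp burnt-area indicator from a level set phi and smooth
    it with an assumed Gaussian displacement PDF (sigma in grid cells),
    yielding an effective field with the smooth decay typical of
    reaction-diffusion solutions."""
    indicator = (phi <= 0).astype(float)  # burnt region behind the front
    return gaussian_filter(indicator, sigma=sigma_cells)
```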
Bruggmann, Philip; Litwin, Alain H
2013-08-01
One of the major obstacles to hepatitis C virus (HCV) care in people who inject drugs (PWID) is the lack of treatment settings that are suitably adapted for the needs of this vulnerable population. Nevertheless, HCV treatment has been delivered successfully to PWID through various multidisciplinary models such as community-based clinics, substance abuse treatment clinics, and specialized hospital-based clinics. Models may be integrated in primary care--all under one roof in either addiction care units or general practitioner-based models--or can occur in secondary or tertiary care settings. Additional innovative models include directly observed therapy and peer-based models. A high level of acceptance of the individual life circumstances of PWID rather than rigid exclusion criteria will determine the level of success of any model of HCV management. The impact of highly potent and well-tolerated interferon-free HCV treatment regimens will remain negligible as long as access to therapy cannot be expanded to the most affected risk groups.
Knowledge-based approach to video content classification
NASA Astrophysics Data System (ADS)
Chen, Yu; Wong, Edward K.
2001-01-01
A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic contents, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand-sides of rules contain high level and low level features, while the right-hand-sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather, reporting, commercial, basketball and football. We use MYCIN's inexact reasoning method for combining evidences, and to handle the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, and it demonstrated the validity of the proposed approach.
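For reference, MYCIN's rule for combining two positive certainty factors supporting the same hypothesis is cf1 + cf2 * (1 - cf1); the general rule also covers negative evidence, omitted in this minimal sketch.

```python
def combine_cf(cf1, cf2):
    """Combine two certainty factors in [0, 1] for the same hypothesis
    (positive-evidence case of MYCIN's inexact reasoning rule)."""
    return cf1 + cf2 * (1.0 - cf1)

# Two rules supporting the class "news" with CF 0.6 and 0.5 yield 0.8
assert abs(combine_cf(0.6, 0.5) - 0.8) < 1e-12
```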
A proposed classification scheme for Ada-based software products
NASA Technical Reports Server (NTRS)
Cernosek, Gary J.
1986-01-01
As the requirements for producing software in the Ada language become a reality for projects such as the Space Station, a great amount of Ada-based program code will begin to emerge. Recognizing the potential for varying levels of quality in Ada programs, what is needed is a classification scheme that describes the quality of a software product whose source code exists in Ada form. A 5-level classification scheme is proposed that attempts to decompose the potentially broad spectrum of quality that Ada programs may possess. The number of classes and their corresponding names are not as important as the fact that there needs to be some set of criteria from which to evaluate programs existing in Ada. Exact criteria for each class are not presented, nor are any detailed suggestions on how to effectively implement this quality assessment. The idea of Ada-based software classification is introduced, and a set of requirements on which to base further research and development is suggested.
NASA Astrophysics Data System (ADS)
Xu, Jing; Wu, Jian; Feng, Daming; Cui, Zhiming
Serious vascular diseases such as carotid stenosis, aneurysm and vascular malformation may lead to brain stroke, which is the third leading cause of death and the number one cause of disability. In the clinical practice of diagnosis and treatment of cerebral vascular diseases, effective detection and description of the vascular structure in two-dimensional angiography image sequences, that is, blood vessel skeleton extraction, has long been a difficult problem. This paper mainly discusses two-dimensional blood vessel skeleton extraction based on the level set method. The DSA image is first preprocessed: an anti-concentration diffusion model is applied for effective enhancement, and an improved Otsu local threshold segmentation technique based on regional division is used for binarization. Vascular skeleton extraction based on the group marching method (GMM) with fast sweeping is then implemented. Experiments show that our approach not only improves the time complexity, but also yields good extraction results.
Genetic network inference as a series of discrimination tasks.
Kimura, Shuhei; Nakayama, Satoshi; Hatakeyama, Mariko
2009-04-01
Genetic network inference methods based on sets of differential equations generally require a great deal of time, as the equations must be solved many times. To reduce the computational cost, researchers have proposed other methods for inferring genetic networks by solving sets of differential equations only a few times, or even without solving them at all. When we try to obtain reasonable network models using these methods, however, we must estimate the time derivatives of the gene expression levels with great precision. In this study, we propose a new method to overcome the drawbacks of inference methods based on sets of differential equations. Our method infers genetic networks by obtaining classifiers capable of predicting the signs of the derivatives of the gene expression levels. For this purpose, we defined a genetic network inference problem as a series of discrimination tasks, then solved the defined series of discrimination tasks with a linear programming machine. Our experimental results demonstrated that the proposed method is capable of correctly inferring genetic networks, and doing so more than 500 times faster than the other inference methods based on sets of differential equations. Next, we applied our method to actual expression data of the bacterial SOS DNA repair system. Finally, we demonstrated that our approach relates to the inference method based on the S-system model. Though our method provides no estimation of the kinetic parameters, it should be useful for researchers interested only in the network structure of a target system. Supplementary data are available at Bioinformatics online.
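A toy sketch of the discrimination formulation follows, substituting a linear SVM for the authors' linear programming machine and using synthetic data: for one target gene, learn to predict the sign of its time derivative from all expression levels; large-magnitude weights then point at candidate regulators.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))        # expression levels of 10 genes (toy data)
# Toy ground truth: gene 2 activates the target, gene 7 represses it
dx_sign = np.sign(X[:, 2] - X[:, 7])  # sign of the target's time derivative

clf = LinearSVC(C=1.0).fit(X, dx_sign)
weights = clf.coef_.ravel()
print("largest |weights| point at genes:", np.argsort(-np.abs(weights))[:2])
```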
Henry, Heather; Naujokas, Marisa F; Attanayake, Chammi; Basta, Nicholas T; Cheng, Zhongqi; Hettiarachchi, Ganga M; Maddaloni, Mark; Schadt, Christopher; Scheckel, Kirk G
2015-08-04
Recently the Centers for Disease Control and Prevention lowered the blood Pb reference value to 5 μg/dL. The lower reference value combined with increased repurposing of postindustrial lands are heightening concerns and driving interest in reducing soil Pb exposures. As a result, regulatory decision makers may lower residential soil screening levels (SSLs), used in setting Pb cleanup levels, to levels that may be difficult to achieve, especially in urban areas. This paper discusses challenges in remediation and bioavailability assessments of Pb in urban soils in the context of lower SSLs and identifies research needs to better address those challenges. Although in situ remediation with phosphate amendments is a viable option, the scope of the problem and conditions in urban settings may necessitate that SSLs be based on bioavailable rather than total Pb concentrations. However, variability in soil composition can influence bioavailability testing and soil amendment effectiveness. More data are urgently needed to better understand this variability and increase confidence in using these approaches in risk-based decision making, particularly in urban areas.
Level set formulation of two-dimensional Lagrangian vortex detection methods
NASA Astrophysics Data System (ADS)
Hadjighasem, Alireza; Haller, George
2016-10-01
We propose here the use of the variational level set methodology to capture Lagrangian vortex boundaries in 2D unsteady velocity fields. This method reformulates earlier approaches that seek material vortex boundaries as extremum solutions of variational problems. We demonstrate the performance of this technique for two different variational formulations built upon different notions of coherence. The first formulation uses an energy functional that penalizes the deviation of a closed material line from piecewise uniform stretching [Haller and Beron-Vera, J. Fluid Mech. 731, R4 (2013)]. The second energy function is derived for a graph-based approach to vortex boundary detection [Hadjighasem et al., Phys. Rev. E 93, 063107 (2016)]. Our level-set formulation captures an a priori unknown number of vortices simultaneously at relatively low computational cost. We illustrate the approach by identifying vortices from different coherence principles in several examples.
NASA Astrophysics Data System (ADS)
Farahi, Maria; Rabbani, Hossein; Talebi, Ardeshir; Sarrafzadeh, Omid; Ensafi, Shahab
2015-12-01
Visceral leishmaniasis is a parasitic disease that affects the liver, spleen and bone marrow. According to a World Health Organization report, definitive diagnosis is possible only by direct observation of the Leishman body in microscopic images taken from bone marrow samples. We utilize morphological operations and the Chan-Vese (CV) level set method to segment Leishman bodies in digital color microscopic images captured from bone marrow samples. The linear contrast stretching method is used for image enhancement, and a morphological method is applied to determine the parasite regions and remove unwanted objects. Modified global and local CV level set methods are proposed for segmentation, and a shape-based stopping factor is used to speed up the algorithm. Manual segmentation is considered as ground truth to evaluate the proposed method. The method was tested on 28 samples and achieved a mean segmentation error of 10.90% for the global model and 9.76% for the local model.
Stratton, T D; Dunkin, J W; Juhl, N; Geller, J M
1995-05-01
Researchers have demonstrated repeatedly the importance of the relationship linking job satisfaction to employee retention. In rural areas of the country, where a persistent maldistribution of nurses continues to hamper health care delivery, the potential benefits of bolstering retention via enhancements in job satisfaction are of utmost utility to administrators and providers alike. Data were gathered from a multistate survey of registered nurses (RNs) practicing in rural hospitals, skilled nursing facilities, and community/public health settings (N = 1,647; response rate = 40.3%). The investigators found that the use of tuition reimbursement corresponded significantly with increased levels of job satisfaction among nurses in all three practice environments, as did day care services for nurses in acute care settings. Also, among hospital-based RNs, level of nursing education was found to be a significant factor in the relationship between tuition reimbursement and job satisfaction, with the highest level occurring among diploma-prepared nurses.
ERIC Educational Resources Information Center
Rieckmann, Traci R.; Kovas, Anne E.; Cassidy, Elaine F.; McCarty, Dennis
2011-01-01
State public health authorities are critical to the successful implementation of science based addiction treatment practices by community-based providers. The literature to date, however, lacks examples of state level policy strategies that promote evidence-based practices (EBPs). This mixed-methods study documents changes in two critical…
NASA Astrophysics Data System (ADS)
Ge, Zhouyang; Loiseau, Jean-Christophe; Tammisola, Outi; Brandt, Luca
2018-01-01
Aiming for the simulation of colloidal droplets in microfluidic devices, we present here a numerical method for two-fluid systems subject to surface tension and depletion forces among the suspended droplets. The algorithm is based on an efficient solver for the incompressible two-phase Navier-Stokes equations, and uses a mass-conserving level set method to capture the fluid interface. The four novel ingredients proposed here are, firstly, an interface-correction level set (ICLS) method; global mass conservation is achieved by performing an additional advection near the interface, with a correction velocity obtained by locally solving an algebraic equation, which is easy to implement in both 2D and 3D. Secondly, we report a second-order accurate geometric estimation of the curvature at the interface and, thirdly, the combination of the ghost fluid method with the fast pressure-correction approach enabling an accurate and fast computation even for large density contrasts. Finally, we derive a hydrodynamic model for the interaction forces induced by depletion of surfactant micelles and combine it with a multiple level set approach to study short-range interactions among droplets in the presence of attracting forces.
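For intuition, a simplified global variant of such mass correction is sketched below: shift phi by a constant until the enclosed area returns to its target. The ICLS method itself is local, solving an algebraic equation near the interface for a correction velocity, which this sketch does not reproduce.

```python
import numpy as np

def correct_mass(phi, target_area, dx, iters=40):
    """Bisection on a constant shift c so that area(c) = |{phi + c < 0}| * dx^2
    matches target_area. area(c) is monotonically decreasing in c.
    The bracket of a few cell widths around the interface is an assumption."""
    lo, hi = -5 * dx, 5 * dx
    area = lambda c: float(((phi + c) < 0).sum()) * dx * dx
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if area(mid) > target_area:
            lo = mid   # area too big: shift phi up (larger c shrinks the region)
        else:
            hi = mid
    return phi + 0.5 * (lo + hi)
```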
Core addiction medicine competencies for doctors: An international consultation on training.
Ayu, Astri Parawita; El-Guebaly, Nady; Schellekens, Arnt; De Jong, Cor; Welle-Strand, Gabrielle; Small, William; Wood, Evan; Cullen, Walter; Klimas, Jan
2017-01-01
Despite the high prevalence of substance use disorders, associated comorbidities, and the evidence base upon which to base clinical practice, most health systems have not invested in standardized training of health care providers in addiction medicine. As a result, people with substance use disorders often receive inadequate care, at the cost of quality of life and enormous direct health care costs and indirect societal costs. Therefore, this study was undertaken to assess the views of international scholars, representing different countries, on the core set of addiction medicine competencies that need to be covered in medical education. A total of 13 members of the International Society of Addiction Medicine (ISAM), from 12 different countries (37% response rate), were interviewed over Skype, by e-mail survey, or in person at the annual conference. Content analysis was used to analyze interview transcripts, using constant comparison methodology. We identified recommendations related to the core set of addiction medicine competencies at 3 educational levels: (i) undergraduate, (ii) postgraduate, and (iii) continuing medical education (CME). The participants described broad ideas, such as knowledge/skills/attitudes towards addiction to be obtained at the undergraduate level, or knowledge of addiction treatment to be acquired at the graduate level, as well as specific recommendations, including the need to tailor curricula to national settings and different specialties. Although it is unclear whether a global curriculum is needed, a consensus on a core set of principles for progression of knowledge, attitudes, and skills in addiction medicine to be developed at each educational level amongst medical graduates would likely have substantial value.
Liang, Shuting; Kegler, Michelle C; Cotter, Megan; Phillips, Emily; Beasley, Derrick; Hermstad, April; Morton, Rentonia; Martinez, Jeremy; Riehman, Kara
2016-08-02
Implementing evidence-based practices (EBPs) to increase cancer screenings in safety net primary care systems has great potential for reducing cancer disparities. Yet there is a gap in understanding the factors and mechanisms that influence EBP implementation within these high-priority systems. Guided by the Consolidated Framework for Implementation Research (CFIR), our study aims to fill this gap with a multiple case study of health care safety net systems that were funded by an American Cancer Society (ACS) grants program to increase breast and colorectal cancer screening rates. The initiative funded 68 safety net systems to increase cancer screening through implementation of evidence-based provider and client-oriented strategies. Data are from a mixed-methods evaluation with nine purposively selected safety net systems. Fifty-two interviews were conducted with project leaders, implementers, and ACS staff. Funded safety net systems were categorized into high-, medium-, and low-performing cases based on the level of EBP implementation. Within- and cross-case analyses were performed to identify CFIR constructs that influenced level of EBP implementation. Of 39 CFIR constructs examined, six distinguished levels of implementation. Two constructs were from the intervention characteristics domain: adaptability and trialability. Three were from the inner setting domain: leadership engagement, tension for change, and access to information and knowledge. Engaging formally appointed internal implementation leaders, from the process domain, also distinguished level of implementation. No constructs from the outer setting or individual characteristics domain differentiated systems by level of implementation. Our study identified a number of influential CFIR constructs and illustrated how they impacted EBP implementation across a variety of safety net systems. Findings may inform future dissemination efforts of EBPs for increasing cancer screening in similar settings. Moreover, our analytic approach is similar to previous case studies using CFIR and hence could facilitate comparisons across studies.
The Study and Design of Adaptive Learning System Based on Fuzzy Set Theory
NASA Astrophysics Data System (ADS)
Jia, Bing; Zhong, Shaochun; Zheng, Tianyang; Liu, Zhiyong
Adaptive learning is an effective way to improve learning outcomes: the selection and presentation of learning content should be adapted to each learner's learning context, level, and ability. An Adaptive Learning System (ALS) can provide effective support for adaptive learning. This paper proposes a new ALS based on fuzzy set theory. It estimates the learner's knowledge level from a test aligned with the learner's target, and then takes the learner's cognitive ability and preferences into consideration to achieve self-organized, adaptive delivery of knowledge. The paper focuses on the design and implementation of the domain model and the user model in the ALS. Experiments confirmed that the system's adaptive content can effectively help learners memorize the material and improve their comprehension.
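As an illustration of how fuzzy set theory can grade a test result into knowledge levels, here is a small sketch; the three levels, their triangular membership functions, and the 0-100 score scale are assumptions for illustration, not the paper's actual domain model.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def knowledge_level(score):
    """Fuzzy memberships of a 0-100 test score in three illustrative levels."""
    return {
        "novice":       tri(score, -1.0, 0.0, 50.0),
        "intermediate": tri(score, 25.0, 50.0, 75.0),
        "advanced":     tri(score, 50.0, 100.0, 101.0),
    }

# a score of 60 belongs partially to both intermediate and advanced
print(knowledge_level(60.0))
```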
Rule-based navigation control design for autonomous flight
NASA Astrophysics Data System (ADS)
Contreras, Hugo; Bassi, Danilo
2008-04-01
This article describes a navigation control system design based on a set of rules for following a desired trajectory. The full control of the aircraft considered here comprises a low-level stability control loop, based on a classic PID controller, and a higher-level navigation loop whose main job is to exercise lateral (course) control and altitude control while trying to follow the desired trajectory. The rules and PID gains were adjusted systematically according to the results of flight simulation. In spite of its simplicity, the rule-based navigation control proved to be robust, even under large perturbations such as crosswinds.
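A toy sketch of the two-layer idea, with assumed thresholds and gains: a rule table for the outer navigation loop chooses a bank-angle setpoint, and an inner PID loop tracks it.

```python
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, err):
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def navigation_rules(cross_track_err_m, course_err_deg):
    """Toy rule table for the outer lateral loop: pick a bank-angle setpoint."""
    if abs(cross_track_err_m) > 200.0:      # far off track: turn hard toward it
        return -25.0 if cross_track_err_m > 0 else 25.0
    if abs(course_err_deg) > 10.0:          # moderate heading error: gentle turn
        return -10.0 if course_err_deg > 0 else 10.0
    return 0.0                              # on track: wings level

# inner loop: PID tracks the bank-angle command produced by the rules
bank_pid = PID(kp=0.8, ki=0.05, kd=0.2, dt=0.02)
cmd = navigation_rules(cross_track_err_m=350.0, course_err_deg=4.0)
aileron = bank_pid.step(cmd - 12.0)  # 12.0: hypothetical current bank angle
```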
Integrated Medical Model (IMM) Optimization Version 4.0 Functional Improvements
NASA Technical Reports Server (NTRS)
Arellano, John; Young, M.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Goodenow, D. A.; Myers, J. G.
2016-01-01
The IMM's ability to assess mission outcome risk levels relative to available resources provides a unique capability to guide the selection of optimal operational medical kit and vehicle resources. Post-processing optimization allows the IMM to optimize essential resources to improve a specific model outcome, such as maximizing the Crew Health Index (CHI) or minimizing the probability of evacuation (EVAC) or of loss of crew life (LOCL). Mass and/or volume constrain the optimized resource set. The IMM's probabilistic simulation uses input data on one hundred medical conditions to simulate medical events that may occur in spaceflight, the resources required to treat those events, and the resulting impact to the mission based on specific crew and mission characteristics. Because IMM version 4.0 provides for partial treatment of medical events, IMM Optimization 4.0 scores resources at the individual resource unit increment level, as opposed to the full condition-specific treatment set level used in version 3.0. This allows the inclusion of as many resources as possible in the event that an entire set of resources called out for treatment cannot satisfy the constraints. IMM Optimization version 4.0 adds capabilities that increase efficiency by creating multiple resource sets based on differing constraints and priorities: CHI, EVAC, or LOCL. It also provides sets of resources that improve mission-related IMM v4.0 outputs with improved performance compared to the prior optimization. The new optimization represents much improved fidelity that will increase the utility of the IMM 4.0 for decision support.
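A rough sketch of unit-increment scoring under mass/volume budgets, written as a greedy knapsack-style fill; the field names (`name`, `benefit`, `mass`, `volume`, `count`) are hypothetical, and the real IMM optimizer is considerably more elaborate.

```python
def optimize_kit(resources, mass_budget, volume_budget):
    """Greedy fill: add single resource units in order of benefit per unit mass+volume.

    `resources` is a list of dicts with hypothetical fields: 'name', 'benefit'
    (e.g. marginal CHI gain per unit), 'mass', 'volume', and 'count' (units
    available). Scoring at the unit level mirrors the version 4.0 move away
    from all-or-nothing treatment sets.
    """
    units = []
    for r in resources:
        units.extend([r] * r["count"])
    units.sort(key=lambda r: r["benefit"] / (r["mass"] + r["volume"]), reverse=True)

    kit, mass, vol = [], 0.0, 0.0
    for u in units:
        if mass + u["mass"] <= mass_budget and vol + u["volume"] <= volume_budget:
            kit.append(u["name"])
            mass += u["mass"]
            vol += u["volume"]
    return kit

kit = optimize_kit(
    [{"name": "suture kit", "benefit": 3.0, "mass": 0.2, "volume": 0.4, "count": 2},
     {"name": "IV fluids",  "benefit": 5.0, "mass": 1.1, "volume": 1.0, "count": 3}],
    mass_budget=2.0, volume_budget=2.0)
```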
NASA Astrophysics Data System (ADS)
Varandas, António J. C.
2018-04-01
Because the one-electron basis set limit is difficult to reach in correlated post-Hartree-Fock ab initio calculations, the low-cost route of using methods that extrapolate to the estimated basis set limit attracts immediate interest. The situation is somewhat more satisfactory at the Hartree-Fock level because numerical calculation of the energy is often affordable at nearly converged basis set levels. Still, extrapolation schemes for the Hartree-Fock energy are addressed here, although the focus is on the more slowly convergent and computationally demanding correlation energy. Because they are frequently based on the gold-standard coupled-cluster theory with single, double, and perturbative triple excitations [CCSD(T)], correlated calculations are often affordable only with the smallest basis sets, and hence single-level extrapolations from one raw energy could attain maximum usefulness. This possibility is examined. Whenever possible, this review uses raw data from second-order Møller-Plesset perturbation theory, as well as CCSD, CCSD(T), and multireference configuration interaction methods. Inescapably, the emphasis is on work done by the author's research group. Certain issues in need of further research or review are pinpointed.
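For readers unfamiliar with such schemes, a common two-point inverse-cube extrapolation of the correlation energy (one scheme among several discussed in this literature, not necessarily the author's preferred protocol) can be written as follows:

```python
def cbs_two_point(e_x, x, e_y, y):
    """Two-point extrapolation of the correlation energy assuming
    E(X) = E_CBS + A / X**3, where x and y are the cardinal numbers of the
    basis sets (e.g. 3 for cc-pVTZ, 4 for cc-pVQZ)."""
    return (x**3 * e_x - y**3 * e_y) / (x**3 - y**3)

# illustrative correlation energies in hartree, not values from the review:
e_cbs = cbs_two_point(-0.27318, 3, -0.28754, 4)
print(f"estimated CBS correlation energy: {e_cbs:.5f} Eh")
```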
Developing an objective evaluation method to estimate diabetes risk in community-based settings.
Kenya, Sonjia; He, Qing; Fullilove, Robert; Kotler, Donald P
2011-05-01
Exercise interventions often aim to affect abdominal obesity and glucose tolerance, two significant risk factors for type 2 diabetes. Because of limited financial and clinical resources in community and university-based environments, intervention effects are often measured with interviews or questionnaires and correlated with weight loss or body fat indicated by body bioimpedance analysis (BIA). However, self-reported assessments are subject to high levels of bias and low levels of reliability. Because obesity and body fat are correlated with diabetes at different levels in various ethnic groups, data reflecting changes in weight or fat do not necessarily indicate changes in diabetes risk. To determine how exercise interventions affect diabetes risk in community and university-based settings, improved evaluation methods are warranted. We compared a noninvasive, objective measurement technique--regional BIA--with whole-body BIA for its ability to assess abdominal obesity and predict glucose tolerance in 39 women. To determine regional BIA's utility in predicting glucose, we tested the association between the regional BIA method and blood glucose levels. Regional BIA estimates of abdominal fat area were significantly correlated (r = 0.554, P < 0.003) with fasting glucose. When waist circumference and family history of diabetes were added to abdominal fat in multiple regression models, the association with glucose increased further (r = 0.701, P < 0.001). Regional BIA estimates of abdominal fat may predict fasting glucose better than whole-body BIA, as well as provide an objective assessment of changes in diabetes risk achieved through physical activity interventions in community settings.
An Empirical Comparison of Five Linear Equating Methods for the NEAT Design
ERIC Educational Resources Information Center
Suh, Youngsuk; Mroch, Andrew A.; Kane, Michael T.; Ripkey, Douglas R.
2009-01-01
In this study, a database containing the responses of 40,000 candidates to 90 multiple-choice questions was used to mimic data sets for 50-item tests under the "nonequivalent groups with anchor test" (NEAT) design. Using these smaller data sets, we evaluated the performance of five linear equating methods for the NEAT design with five levels of…
NASA Technical Reports Server (NTRS)
Ostroff, Aaron J.
1998-01-01
This paper describes a redesigned longitudinal controller that flew on the High-Alpha Research Vehicle (HARV) during calendar years (CY) 1995 and 1996. Linear models are developed for both the modified controller and a baseline controller that was flown in CY 1994. The modified controller was developed with three gain sets for flight evaluation, and several linear analysis results are shown comparing the gain sets. A Neal-Smith flying qualities analysis shows that performance for the low- and medium-gain sets is near the level 1 boundary, depending upon the bandwidth assumed, whereas the high-gain set indicates a sensitivity problem. A newly developed high-alpha Bode envelope criterion indicates that the control system gains may be slightly high, even for the low-gain set. A large motion-base simulator in the United Kingdom was used to evaluate the various controllers. Desired performance, which appeared to be satisfactory for flight, was generally met with both the low- and medium-gain sets. Both the high-gain set and the baseline controller were very sensitive, and it was easy to generate pilot-induced oscillation (PIO) in some of the target-tracking maneuvers. Flight target-tracking results varied from level 1 to level 3 and from no sensitivity to PIO. These results were related to pilot technique and whether actuator rate saturation was encountered.
Learning and Recognition of Clothing Genres From Full-Body Images.
Hidayati, Shintami C; You, Chuang-Wen; Cheng, Wen-Huang; Hua, Kai-Lung
2018-05-01
According to the theory of clothing design, the genres of clothes can be recognized based on a set of visually differentiable style elements, which exhibit salient features of visual appearance and reflect high-level fashion styles, thereby better describing clothing genres. Instead of using less-discriminative low-level features or ambiguous keywords to identify clothing genres, we propose a novel approach for automatically classifying clothing genres based on these visually differentiable style elements. A set of style elements that are crucial for recognizing specific visual styles of clothing genres was identified based on clothing design theory. In addition, the corresponding salient visual features of each style element were identified and formulated with variables that can be computationally derived with various computer vision algorithms. To evaluate the performance of our algorithm, a dataset containing 3250 full-body shots crawled from popular online stores was built. Recognition results show that our proposed algorithms achieved promising overall precision, recall, and F-scores of 88.76%, 88.53%, and 88.64% for recognizing upperwear genres, and 88.21%, 88.17%, and 88.19% for recognizing lowerwear genres, respectively. The effectiveness of each style element and its visual features in recognizing clothing genres was demonstrated through a set of experiments involving different sets of style elements or features. In summary, our experimental results demonstrate the effectiveness of the proposed method in clothing genre recognition.
Occupational Home Economics Education Series. Consumer Services. Competency Based Teaching Module.
ERIC Educational Resources Information Center
Lowe, Phyllis; And Others
This module, one of ten competency-based modules developed for vocational home economics teachers, is based on a job cluster in consumer services. It is designed for a variety of levels (secondary, post-secondary, adult) in both school and non-school settings. Focusing on the specific job title of consumer advisor, eight competencies are listed…
ERIC Educational Resources Information Center
Koka, Andre
2017-01-01
This study examined the effectiveness of a brief theory-based intervention on muscular strength among adolescents in a physical education setting. The intervention adopted a process-based mental simulation technique. The self-reported frequency of practising for and actual levels of abdominal muscular strength/endurance as one component of…
ERIC Educational Resources Information Center
O'Neill, Sue; Stephenson, Jennifer
2009-01-01
This article examines literature published since 1997 on functional behaviour assessment (FBA) and behaviour intervention plans (BIPs), involving school-based personnel, for children identified as having or being at risk of emotional/behavioural disorder (E/BD) in school settings. Of interest was the level of involvement of school-based personnel…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-17
... regulations require NMFS to set these annual catch levels for the Pacific sardine fishery based on the annual... HG, the primary management target for the fishery, for the current fishing season. The HG is based... Fisheries Science Center and the resulting Pacific sardine biomass estimate of 659,539 mt. Based on the...
ERIC Educational Resources Information Center
Evans, Rhiannon; Murphy, Simon; Scourfield, Jonathan; Turley, Ruth
2017-01-01
Objective: The utilisation of evidence-based health interventions remains a challenge in educational settings. Although driving forward the scientific evidence base may contribute to the diffusion of such approaches, abstract notions of population-level impact may not be seen as priorities in local settings. This paper considers the alternative forms of…
ERIC Educational Resources Information Center
Neiworth, Julie J.; Parsons, Richard R.; Hassett, Janice M.
2004-01-01
A preference to novelty paradigm used to study human infants (Quinn, 2002) examined attention to novel animal pictures at subordinate, basic and superordinate levels in tamarins. First, pairs of pictures were presented in phases, starting with a monkey species (subordinate level) and ending with mammal and dinosaur sets (superordinate levels).…
Baur, Kilian; Wolf, Peter; Riener, Robert; Duarte, Jaime E
2017-07-01
Multiplayer environments are thought to increase the training intensity in robot-aided rehabilitation therapy after stroke. We developed a haptic-based environment to investigate the dynamics of two-player training performing time-constrained reaching movements using the ARMin rehabilitation robot. We implemented a challenge level adaptation algorithm that controlled a virtual damping coefficient to reach a desired success rate. We tested the algorithm's effectiveness in regulating the success rate during game play in a simulation with computer-controlled players, in a feasibility study with six unimpaired players, and in a single session with one stroke patient. The algorithm demonstrated its capacity to adjust the damping coefficient to reach three levels of success rate (low [50%], moderate [70%], and high [90%]) during single-player and multiplayer training. For the patient - tested in single-player mode at the moderate success rate only - the algorithm also showed promising behavior. Results of the feasibility study showed that, to increase the player's willingness to play at a more challenging task condition, the effect of the challenge level adaptation - regardless of whether the game was played in single-player or multiplayer mode - might be more important than the provision of a multiplayer setting alone. Furthermore, the multiplayer setting tends to be a motivating and encouraging therapy component. Based on these results we will optimize and expand the multiplayer training platform and further investigate multiplayer settings in stroke therapy.
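A minimal sketch of success-rate-targeting adaptation: after each block of trials, the damping coefficient is nudged in proportion to the gap between the observed and target success rates. The update rule, gain, and bounds are illustrative, and the assumption that higher damping makes the reaching task harder is ours, not a detail taken from the study.

```python
def adapt_damping(damping, successes, attempts, target_rate,
                  gain=0.1, d_min=0.0, d_max=5.0):
    """Nudge a virtual damping coefficient toward a desired success rate.

    If the player succeeds more often than targeted, increase difficulty
    (assumed here: raise damping), and vice versa. Values are illustrative.
    """
    rate = successes / max(attempts, 1)
    damping += gain * (rate - target_rate)
    return min(max(damping, d_min), d_max)

d = 1.0
d = adapt_damping(d, successes=8, attempts=10, target_rate=0.7)  # harder next block
```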
Income Levels and Response to Contingency Management for Smoking Cessation.
López-Núñez, Carla; Secades-Villa, Roberto; Peña-Suárez, Elsa; Fernández-Artamendi, Sergio; Weidberg, Sara
2017-06-07
Contingency management (CM) has demonstrated its efficacy in treating many drug addictions, including nicotine addiction. However, one of the most commonly perceived limitations with regard to its dissemination into community settings is whether this protocol could be equally effective for treating patients across different income levels. This study aimed to examine whether individuals' income levels affect treatment success in a cognitive behavioral treatment (CBT) that included a voucher-based CM protocol for smoking cessation. A total of 92 treatment-seeking smokers in a community setting were randomly assigned to a CBT group (N = 49) or to a CBT plus CM group (N = 43). The CM procedure included a voucher program through which smoking abstinence was reinforced on a schedule of escalating magnitude of reinforcement with a reset contingency. We analyzed the impact of self-reported monthly income, alone and in combination with treatment condition, on short-term (treatment retention) and long-term (self-reported number of days of continuous smoking abstinence at 6-month follow-up) outcomes. Income had no effect on treatment retention or continuous abstinence outcomes at 6-month follow-up in either treatment condition. Treatment modality emerged as the only significant predictor of treatment success. Our findings suggest that treatment-seeking smokers from the general population respond equally well to CM regardless of their income levels. The results of this randomized controlled trial support the generalizability of this evidence-based program to community settings.
Ontology-Based Model Of Firm Competitiveness
NASA Astrophysics Data System (ADS)
Deliyska, Boryana; Stoenchev, Nikolay
2010-10-01
Competitiveness is an important characteristic of every business organization (firm, company, corporation, etc.). It is of great significance for the organization's existence and defines the evaluation criteria of business success at the microeconomic level. Each criterion comprises a set of indicators with specific weight coefficients. In this work, an ontology-based model of firm competitiveness is presented as a set of several mutually connected ontologies. It should be useful for knowledge structuring, standardization, and sharing among experts and software engineers who develop applications in the domain. The assessment of the competitiveness of various business organizations could then be generated more effectively.
Robinson, Thomas N.; Matheson, Donna; Desai, Manisha; Wilson, Darrell M.; Weintraub, Dana L.; Haskell, William L.; McClain, Arianna; McClure, Samuel; Banda, Jorge; Sanders, Lee M.; Haydel, K. Farish; Killen, Joel D.
2013-01-01
Objective: To test the effects of a three-year, community-based, multi-component, multi-level, multi-setting (MMM) approach for treating overweight and obese children. Design: Two-arm, parallel group, randomized controlled trial with measures at baseline, 12, 24, and 36 months after randomization. Participants: Seven through eleven year old, overweight and obese children (BMI ≥ 85th percentile) and their parents/caregivers recruited from community locations in low-income, primarily Latino neighborhoods in Northern California. Interventions: Families are randomized to the MMM intervention versus a community health education active-placebo comparison intervention. Interventions last for three years for each participant. The MMM intervention includes a community-based after school team sports program designed specifically for overweight and obese children, a home-based family intervention to reduce screen time, alter the home food/eating environment, and promote self-regulatory skills for eating and activity behavior change, and a primary care behavioral counseling intervention linked to the community and home interventions. The active-placebo comparison intervention includes semi-annual health education home visits, monthly health education newsletters for children and for parents/guardians, and a series of community-based health education events for families. Main Outcome Measure: Body mass index trajectory over the three-year study. Secondary outcome measures include waist circumference, triceps skinfold thickness, accelerometer-measured physical activity, 24-hour dietary recalls, screen time and other sedentary behaviors, blood pressure, fasting lipids, glucose, insulin, hemoglobin A1c, C-reactive protein, alanine aminotransferase, and psychosocial measures. Conclusions: The Stanford GOALS trial is testing the efficacy of a novel community-based multi-component, multi-level, multi-setting treatment for childhood overweight and obesity in low-income, Latino families. PMID:24028942
ERIC Educational Resources Information Center
Briesch, Amy M.; Chafouleas, Sandra M.; Chaffee, Ruth
2017-01-01
Despite recommendations to extend prevention and early intervention related to behavioral health into school settings, limited research has been directed toward understanding how these recommendations have been translated by states into education policies and initiatives. This macro-level information is important toward understanding the…
Setting the Standards; The Importance of the Language Audit.
ERIC Educational Resources Information Center
Greensmith, Catherine
It is proposed that business language training be based on results of a "language audit," designed to identify the skill levels required to carry out specific tasks, propose training solutions, and monitor progress. A British organization has established a framework for assessing five levels of language skill in this context. These standards…
The Impact of Written Comments on Student Achievement.
ERIC Educational Resources Information Center
Freeman, Donald J.; Niemeyer, Roger C.
This study sought to determine if an instructor of a competency-based course who sets performance standards at a comparatively low level might facilitate higher levels of student achievement through the use of written comments on unit posttests. Eighty-eight students in a graduate education course were randomly assigned to one of four experimental…
NASA Astrophysics Data System (ADS)
Nemoto, Mitsutaka; Nomura, Yukihiro; Hanaoka, Shohei; Masutani, Yoshitaka; Yoshikawa, Takeharu; Hayashi, Naoto; Yoshioka, Naoki; Ohtomo, Kuni
Anatomical point landmarks, as the most primitive anatomical knowledge, are useful for medical image understanding. In this study, we propose a detection method for anatomical point landmarks based on appearance models, which include gray-level statistical variations at the point landmarks and their surrounding area. The models are built from the results of a Principal Component Analysis (PCA) of sample data sets. In addition, we employ a generative learning method that transforms the ROIs of the sample data. We evaluated our method with 24 data sets of body trunk CT images and obtained an average sensitivity of 95.8 ± 7.3% across 28 landmarks.
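A small sketch of the appearance-model idea, assuming landmark ROIs are vectorized intensity patches and using PCA reconstruction error as an (inverse) landmark-likeness score; patch sizes, names, and the random stand-in data are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA

# rois: (n_samples, n_voxels) matrix of intensity patches sampled around one
# landmark in training CTs (random stand-in data here).
rng = np.random.default_rng(0)
rois = rng.normal(size=(200, 9 * 9 * 9))

model = PCA(n_components=20).fit(rois)

def appearance_score(patch):
    """Lower reconstruction error = candidate looks more like the landmark."""
    recon = model.inverse_transform(model.transform(patch[None, :]))
    return float(np.linalg.norm(patch - recon[0]))

print(appearance_score(rng.normal(size=9 * 9 * 9)))
```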
Hoffman, R.A.; Kothari, S.; Phan, J.H.; Wang, M.D.
2016-01-01
Computational analysis of histopathological whole slide images (WSIs) has emerged as a potential means for improving cancer diagnosis and prognosis. However, an open issue relating to the automated processing of WSIs is the identification of biological regions such as tumor, stroma, and necrotic tissue on the slide. We develop a method for classifying WSI portions (512x512-pixel tiles) into biological regions by (1) extracting a set of 461 image features from each WSI tile, (2) optimizing tile-level prediction models using nested cross-validation on a small (600-tile) manually annotated tile-level training set, and (3) validating the models against a much larger (1.7×10⁶-tile) data set for which ground truth was available at the whole-slide level. We calculated the predicted prevalence of each tissue region and compared this prevalence to the ground truth prevalence for each image in an independent validation set. Results show significant correlation between the predicted (using the automated system) and reported biological region prevalences, with p < 0.001 for eight of nine cases considered. PMID:27532012
Bhaumik, Soumyadeep; Rana, Sangeeta; Karimkhani, Chante; Welch, Vivian; Armstrong, Rebecca; Pottie, Kevin; Dellavalle, Robert; Dhakal, Purushottam; Oliver, Sandy; Francis, Damian K; Nasser, Mona; Crowe, Sally; Aksut, Baran; Amico, Roberto D
2015-01-01
A transparent and evidence-based priority-setting process promotes the optimal use of resources to improve health outcomes. Decision-makers and funders have begun to increasingly engage representatives of patients and healthcare consumers to ensure that research becomes more relevant. However, disadvantaged groups and their needs may not be integrated into the priority-setting process since they do not have a "political voice" or are unable to organise into interest groups. Equitable priority-setting methods need to balance patient needs, values, experiences with population-level issues and issues related to the health system.
Sauvé, Jean-François; Beaudry, Charles; Bégin, Denis; Dion, Chantal; Gérin, Michel; Lavoué, Jérôme
2013-05-01
Many construction activities can put workers at risk of breathing silica-containing dusts, and there is an important body of literature documenting exposure levels using a task-based strategy. In this study, statistical modeling was used to analyze a data set containing 1466 task-based, personal respirable crystalline silica (RCS) measurements gathered from 46 sources to estimate exposure levels during construction tasks and the effects of determinants of exposure. Monte-Carlo simulation was used to recreate individual exposures from summary parameters, and the statistical modeling involved multimodel inference with Tobit models containing combinations of the following exposure variables: sampling year, sampling duration, construction sector, project type, workspace, ventilation, and controls. Exposure levels by task were predicted based on the median reported duration by activity, the year 1998, absence of source control methods, and an equal distribution of the other determinants of exposure. The model containing all the variables explained 60% of the variability and was identified as the best approximating model. Of the 27 tasks contained in the data set, abrasive blasting, masonry chipping, scabbling concrete, tuck pointing, and tunnel boring had estimated geometric means above 0.1 mg m⁻³ based on the exposure scenario developed. Water-fed tools and local exhaust ventilation were associated with reductions of 71% and 69% in exposure levels compared with no controls, respectively. The predictive model developed can be used to estimate RCS concentrations for many construction activities in a wide range of circumstances.
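For illustration, a Tobit-style censored likelihood can be maximized directly. The sketch below assumes log-transformed exposures left-censored at a detection limit `lod`, with made-up data and a single control-measure covariate; it is not the paper's multimodel-inference pipeline.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit_nll(params, X, y, censored, lod):
    """Negative log-likelihood for left-censored regression.
    params = [beta..., log_sigma]; y is log-exposure; censored marks y < lod."""
    beta, sigma = params[:-1], np.exp(params[-1])
    mu = X @ beta
    ll = norm.logpdf(y[~censored], mu[~censored], sigma).sum()   # observed values
    ll += norm.logcdf((lod - mu[censored]) / sigma).sum()        # censored values
    return -ll

# illustrative data: intercept + indicator for a control measure (e.g. wet tools)
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(300), rng.integers(0, 2, 300)])
y = X @ np.array([-2.3, -1.2]) + rng.normal(0, 0.9, 300)  # log mg/m^3
lod = -3.0
censored = y < lod
y = np.where(censored, lod, y)

res = minimize(tobit_nll, x0=np.zeros(3), args=(X, y, censored, lod))
beta_hat, sigma_hat = res.x[:2], np.exp(res.x[2])
```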
Deriving a Set of Privacy Specific Heuristics for the Assessment of PHRs (Personal Health Records).
Furano, Riccardo F; Kushniruk, Andre; Barnett, Jeff
2017-01-01
With personal health record (PHR) platforms becoming more widely available, this research focused on the development of privacy heuristics for assessing the privacy of PHRs. Existing sets of heuristics are typically not application specific and do not address patient-centric privacy as a main concern prior to PHR procurement. A set of privacy-specific heuristics was developed based on a scoping review of the literature. An internet-based, commercially available, vendor-specific PHR application was evaluated using the derived set of privacy-specific heuristics. The proposed set of derived privacy-specific heuristics is explored in detail in relation to ISO 29100. The assessment of the PHR application indicated numerous violations, which were noted within the study. It is argued that the new derived privacy heuristics should be used in addition to Nielsen's well-established set of heuristics. Privacy-specific heuristics could be used to assess PHR portal system-level privacy mechanisms during the procurement process and may prove to be a beneficial form of assessment to prevent the selection of a PHR platform with a poor privacy-specific interface design.
Representing Micro-Macro Linkages by Actor-Based Dynamic Network Models
Snijders, Tom A.B.; Steglich, Christian E.G.
2014-01-01
Stochastic actor-based models for network dynamics have the primary aim of statistical inference about processes of network change, but may be regarded as a kind of agent-based models. Similar to many other agent-based models, they are based on local rules for actor behavior. Different from many other agent-based models, by including elements of generalized linear statistical models they aim to be realistic detailed representations of network dynamics in empirical data sets. Statistical parallels to micro-macro considerations can be found in the estimation of parameters determining local actor behavior from empirical data, and the assessment of goodness of fit from the correspondence with network-level descriptives. This article studies several network-level consequences of dynamic actor-based models applied to represent cross-sectional network data. Two examples illustrate how network-level characteristics can be obtained as emergent features implied by micro-specifications of actor-based models. PMID:25960578
Chen, Hongda; Zucknick, Manuela; Werner, Simone; Knebel, Phillip; Brenner, Hermann
2015-07-15
Novel noninvasive blood-based screening tests are strongly desirable for early detection of colorectal cancer. We aimed to conduct a head-to-head comparison of the diagnostic performance of 92 plasma-based tumor-associated protein biomarkers for early detection of colorectal cancer in a true screening setting. Among all 35 available carriers of colorectal cancer and a representative sample of 54 men and women free of colorectal neoplasms recruited in a cohort of screening colonoscopy participants in 2005-2012 (N = 5,516), the plasma levels of 92 protein biomarkers were measured. ROC analyses were conducted to evaluate the diagnostic performance. A multimarker algorithm was developed through the Lasso logistic regression model and validated in an independent validation set. The .632+ bootstrap method was used to adjust for the potential overestimation of diagnostic performance. Seventeen protein markers were identified that showed statistically significant differences in plasma levels between colorectal cancer cases and controls. The adjusted area under the ROC curve (AUC) of these 17 individual markers ranged from 0.55 to 0.70. An eight-marker classifier was constructed that increased the adjusted AUC to 0.77 [95% confidence interval (CI), 0.59-0.91]. When validating this algorithm in an independent validation set, the AUC was 0.76 (95% CI, 0.65-0.85), and sensitivities at cutoff levels yielding 80% and 90% specificities were 65% (95% CI, 41-80%) and 44% (95% CI, 24-72%), respectively. The identified profile of protein biomarkers could contribute to the development of a powerful multimarker blood-based test for early detection of colorectal cancer. ©2015 American Association for Cancer Research.
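A sketch of the panel-building step, assuming scikit-learn and stand-in data: an L1-penalized (Lasso) logistic regression shrinks most marker coefficients to zero, yielding a sparse panel, and AUC is computed on held-out samples (without the .632+ bootstrap correction used in the paper).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# stand-in data: 89 subjects x 92 marker levels, binary case/control labels
rng = np.random.default_rng(0)
X = rng.normal(size=(89, 92))
y = rng.integers(0, 2, 89)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

# L1 penalty drives most coefficients to zero -> a sparse marker panel
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.3).fit(X_tr, y_tr)
panel = np.flatnonzero(clf.coef_[0])                 # indices of retained markers
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(len(panel), "markers retained, held-out AUC =", round(auc, 2))
```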
Formal verification of a set of memory management units
NASA Technical Reports Server (NTRS)
Schubert, E. Thomas; Levitt, K.; Cohen, Gerald C.
1992-01-01
This document describes the verification of a set of memory management units (MMU). The verification effort demonstrates the use of hierarchical decomposition and abstract theories. The MMUs can be organized into a complexity hierarchy. Each new level in the hierarchy adds a few significant features or modifications to the lower level MMU. The units described include: (1) a page check translation look-aside module (TLM); (2) a page check TLM with supervisor line; (3) a base bounds MMU; (4) a virtual address translation MMU; and (5) a virtual address translation MMU with memory resident segment table.
Kray, Jutta
2006-08-11
Adult age differences in task switching and advance preparation were examined by comparing cue-based and memory-based switching conditions. Task switching was assessed by determining two types of costs that occur at the general (mixing costs) and specific (switching costs) level of switching. Advance preparation was investigated by varying the time interval until the next task (short, middle, very long). Results indicated that the implementation of task sets was different for cue-based switching with random task sequences and memory-based switching with predictable task sequences. Switching costs were strongly reduced under cue-based switching conditions, indicating that task-set cues facilitate the retrieval of the next task. Age differences were found for mixing costs and for switching costs only under cue-based conditions in which older adults showed smaller switching costs than younger adults. It is suggested that older adults adopt a less extreme bias between two tasks than younger adults in situations associated with uncertainty. For cue-based switching with random task sequences, older adults are less engaged in a complete reconfiguration of task sets because of the probability of a further task change. Furthermore, the reduction of switching costs was more pronounced for cue- than memory-based switching for short preparation intervals, whereas the reduction of switch costs was more pronounced for memory- than cue-based switching for longer preparation intervals at least for older adults. Together these findings suggest that the implementation of task sets is functionally different for the two types of task-switching conditions.
Li, Baoyue; Lingsma, Hester F; Steyerberg, Ewout W; Lesaffre, Emmanuel
2011-05-23
Logistic random effects models are a popular tool to analyze multilevel (also called hierarchical) data with a binary or ordinal outcome. Here, we aim to compare different statistical software implementations of these models. We used individual patient data from 8509 patients in 231 centers with moderate and severe Traumatic Brain Injury (TBI) enrolled in eight Randomized Controlled Trials (RCTs) and three observational studies. We fitted logistic random effects regression models with the 5-point Glasgow Outcome Scale (GOS) as outcome, both dichotomized and ordinal, with center and/or trial as random effects, and with age, motor score, pupil reactivity, or trial as covariates. We then compared the implementations of frequentist and Bayesian methods to estimate the fixed and random effects. Frequentist approaches included R (lme4), Stata (GLLAMM), SAS (GLIMMIX and NLMIXED), MLwiN ([R]IGLS) and MIXOR; Bayesian approaches included WinBUGS, MLwiN (MCMC), the R package MCMCglmm, and the SAS experimental procedure MCMC. Three data sets (the full data set and two sub-datasets) were analysed using essentially two logistic random effects models, with either one random effect for the center or two random effects for center and trial. For the ordinal outcome in the full data set, a proportional odds model with a random center effect was also fitted. The packages gave similar parameter estimates for both the fixed and random effects and for the binary (and ordinal) models for the main study, and when based on a relatively large number of level-1 (patient-level) units compared to the number of level-2 (hospital-level) units. However, when based on a relatively sparse data set, i.e. when the numbers of level-1 and level-2 units were about the same, the frequentist and Bayesian approaches showed somewhat different results. The software implementations differ considerably in flexibility, computation time, and usability. There are also differences in the availability of additional tools for model evaluation, such as diagnostic plots. The experimental SAS (version 9.2) procedure MCMC appeared to be inefficient. On relatively large data sets, the different software implementations of logistic random effects regression models produced similar results. Thus, for a large data set there seems to be no explicit preference (assuming there is no preference from a philosophical point of view) for either a frequentist or Bayesian approach (if based on vague priors). The choice of a particular implementation may largely depend on the desired flexibility and the usability of the package. For small data sets the random effects variances are difficult to estimate. In the frequentist approaches the MLE of this variance was often estimated as zero, with a standard error that is either zero or could not be determined, while for Bayesian methods the estimates could depend on the chosen "non-informative" prior of the variance parameter. The starting value for the variance parameter may also be critical for the convergence of the Markov chain.
Computer-Based Practice in Editing.
ERIC Educational Resources Information Center
Cronnell, Bruce
One goal of computer-based instruction in writing is to help students to edit their compositions, particularly those compositions written on a word processor. This can be accomplished by a complete editing program that would contain the full set of mechanics rules--capitalization, punctuation, spelling, usage--appropriate for the grade level of…
Evaluation of a School-Based Teen Obesity Prevention Minimal Intervention
ERIC Educational Resources Information Center
Abood, Doris A.; Black, David R.; Coster, Daniel C.
2008-01-01
Objective: A school-based nutrition education minimal intervention (MI) was evaluated. Design: The design was experimental, with random assignment at the school level. Setting: Seven schools were randomly assigned as experimental, and 7 as delayed-treatment. Participants: The experimental group included 551 teens, and the delayed treatment group…
Diet and Colorectal Cancer Risk: Evaluation of a Nutrition Education Leaflet
ERIC Educational Resources Information Center
Dyer, K. J.; Fearon, K. C. H.; Buckner, K.; Richardson, R. A.
2005-01-01
Objective: To evaluate the effect of a needs-based, nutrition education leaflet on nutritional knowledge. Design: Comparison of nutritional knowledge levels before and after exposure to a nutrition education leaflet. Setting: A regional colorectal out-patient clinic in Edinburgh. Method: A nutrition education leaflet, based on an earlier…
Setting Evidence-Based Language Goals
ERIC Educational Resources Information Center
Goertler, Senta; Kraemer, Angelika; Schenker, Theresa
2016-01-01
The purpose of this project was to identify target language benchmarks for the German program at Michigan State University (MSU) based on national and international guidelines and previous research, to assess language skills across course levels and class sections in the entire German program, and to adjust the language benchmarks as needed based…
Text categorization of biomedical data sets using graph kernels and a controlled vocabulary.
Bleik, Said; Mishra, Meenakshi; Huan, Jun; Song, Min
2013-01-01
Recently, graph representations of text have been showing improved performance over conventional bag-of-words representations in text categorization applications. In this paper, we present a graph-based representation for biomedical articles and use graph kernels to classify those articles into high-level categories. In our representation, common biomedical concepts and semantic relationships are identified with the help of an existing ontology and are used to build a rich graph structure that provides a consistent feature set and preserves additional semantic information that could improve a classifier's performance. We attempt to classify the graphs using both a set-based graph kernel that is capable of dealing with the disconnected nature of the graphs and a simple linear kernel. Finally, we report the results comparing the classification performance of the kernel classifiers to common text-based classifiers.
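As a toy illustration of a set-based kernel used with a precomputed-kernel SVM (a stand-in for exposition, not the paper's exact kernel), the concept-node sets of two article graphs could be compared by normalized overlap:

```python
import numpy as np
from sklearn.svm import SVC

def set_kernel(g1, g2):
    """Normalized overlap of two articles' concept-node sets (cosine-style)."""
    a, b = set(g1), set(g2)
    if not a or not b:
        return 0.0
    return len(a & b) / (len(a) * len(b)) ** 0.5

# hypothetical concept sets extracted from three articles via an ontology
graphs = [{"aspirin", "platelet", "inhibits"},
          {"aspirin", "fever", "treats"},
          {"p53", "apoptosis", "regulates"}]
labels = [0, 0, 1]  # e.g. pharmacology vs. molecular biology

K = np.array([[set_kernel(a, b) for b in graphs] for a in graphs])
clf = SVC(kernel="precomputed").fit(K, labels)
```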
Wu, Jia-ting; Wang, Jian-qiang; Wang, Jing; Zhang, Hong-yu; Chen, Xiao-hong
2014-01-01
Based on linguistic term sets and hesitant fuzzy sets, the concept of hesitant fuzzy linguistic sets was introduced. The focus of this paper is on multicriteria decision-making (MCDM) problems in which the criteria are in different priority levels and the criteria values take the form of hesitant fuzzy linguistic numbers (HFLNs). A new approach to solving these problems is proposed, based on a generalized prioritized aggregation operator for HFLNs. First, new operations and a comparison method for HFLNs are provided and some linguistic scale functions are applied. Subsequently, two prioritized aggregation operators and a generalized prioritized aggregation operator for HFLNs are developed and applied to MCDM problems. Finally, an illustrative example demonstrates the effectiveness and feasibility of the proposed method, and the results are compared with an existing approach.
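The priority-throttled weighting behind prioritized aggregation can be shown on plain scalar scores (a deliberate simplification of the HFLN case): each criterion's weight is attenuated by how well all higher-priority criteria were satisfied.

```python
def prioritized_average(scores):
    """Yager-style prioritized average for criteria given in priority order.

    `scores` are satisfaction degrees in [0, 1], highest priority first.
    T_1 = 1 and T_i = product of all higher-priority scores, so a poorly met
    high-priority criterion suppresses the influence of everything below it.
    """
    t = [1.0]
    for s in scores[:-1]:
        t.append(t[-1] * s)
    total = sum(t)
    return sum(ti * si for ti, si in zip(t, scores)) / total

# safety (0.9) dominates cost (0.4) and comfort (0.8); result ~ 0.685
print(prioritized_average([0.9, 0.4, 0.8]))
```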
Simulation-based planning for theater air warfare
NASA Astrophysics Data System (ADS)
Popken, Douglas A.; Cox, Louis A., Jr.
2004-08-01
Planning for Theatre Air Warfare can be represented as a hierarchy of decisions. At the top level, surviving airframes must be assigned to roles (e.g., Air Defense, Counter Air, Close Air Support, and AAF Suppression) in each time period in response to changing enemy air defense capabilities, remaining targets, and roles of opposing aircraft. At the middle level, aircraft are allocated to specific targets to support their assigned roles. At the lowest level, routing and engagement decisions are made for individual missions. The decisions at each level form a set of time-sequenced Courses of Action taken by opposing forces. This paper introduces a set of simulation-based optimization heuristics operating within this planning hierarchy to optimize allocations of aircraft. The algorithms estimate distributions for stochastic outcomes of the pairs of Red/Blue decisions. Rather than using traditional stochastic dynamic programming to determine optimal strategies, we use an innovative combination of heuristics, simulation-optimization, and mathematical programming. Blue decisions are guided by a stochastic hill-climbing search algorithm while Red decisions are found by optimizing over a continuous representation of the decision space. Stochastic outcomes are then provided by fast, Lanchester-type attrition simulations. This paper summarizes preliminary results from top and middle level models.
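A miniature version of the bottom two ingredients, with invented effectiveness coefficients and scoring weights: a deterministic Lanchester square-law attrition simulation scores candidate Blue role allocations inside a stochastic hill climb (a rough sketch of the approach, not the authors' model).

```python
import random

def attrition(blue, red, b_eff, r_eff, steps=50, dt=0.1):
    """Lanchester square-law attrition between two forces (Euler stepping)."""
    for _ in range(steps):
        blue, red = (max(blue - r_eff * red * dt, 0.0),
                     max(red - b_eff * blue * dt, 0.0))
    return blue, red

def hill_climb(total_blue=100.0, iters=500, seed=0):
    """Stochastic hill climb over the fraction of Blue flying counter-air
    (the remainder flies close air support); all numbers are illustrative."""
    random.seed(seed)
    frac, best = 0.5, float("-inf")
    for _ in range(iters):
        cand = min(max(frac + random.gauss(0, 0.05), 0.0), 1.0)
        surv, red_left = attrition(total_blue * cand, 80.0,
                                   b_eff=0.02, r_eff=0.015)
        # reward surviving counter-air, penalize surviving Red, credit CAS mass
        score = surv - 2.0 * red_left + 0.5 * total_blue * (1 - cand)
        if score > best:
            frac, best = cand, score
    return frac, best

print(hill_climb())
```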
ERIC Educational Resources Information Center
Lobelo, Felipe; Garcia de Quevedo, Isabel; Holub, Christina K.; Nagle, Brian J.; Arredondo, Elva M.; Barquera, Simon; Elder, John P.
2013-01-01
Background: Rapidly rising childhood obesity rates constitute a public health priority in Latin America which makes it imperative to develop evidence-based strategies. Schools are a promising setting but to date it is unclear how many school-based obesity interventions have been documented in Latin America and what level of evidence can be…
NASA Astrophysics Data System (ADS)
McLaughlin, Joyce; Renzi, Daniel
2006-04-01
Transient elastography and supersonic imaging are promising new techniques for characterizing the elasticity of soft tissues. Using this method, an 'ultrafast imaging' system (up to 10 000 frames s⁻¹) follows in real time the propagation of a low-frequency shear wave. The displacement of the propagating shear wave is measured as a function of time and space. Here we develop a fast level set based algorithm for finding the shear wave speed from the interior positions of the propagating front. We compare the performance of the level curve methods developed here and our previously developed distance methods (McLaughlin J and Renzi D 2006 Shear wave speed recovery in transient elastography and supersonic imaging using propagating fronts Inverse Problems 22 681-706). We give reconstruction examples from synthetic data and from data obtained from a phantom experiment performed by Mathias Fink's group (the Laboratoire Ondes et Acoustique, ESPCI, Université Paris VII).
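The underlying arrival-time relation (the Eikonal identity, not the paper's full level-set machinery) is that a front with arrival-time field T moves at speed 1/|∇T|; a minimal finite-difference sketch:

```python
import numpy as np

def wave_speed_from_arrivals(T, dx):
    """Eikonal relation: local front speed = 1 / |grad T|.
    T is a 2-D array of arrival times on a grid with spacing dx."""
    gy, gx = np.gradient(T, dx)
    grad_mag = np.sqrt(gx**2 + gy**2)
    return 1.0 / np.maximum(grad_mag, 1e-12)   # guard against flat regions

# toy check: a plane front moving at 3 m/s gives T = x / 3
x = np.linspace(0.0, 0.1, 101)
T = np.tile(x / 3.0, (101, 1))
speed = wave_speed_from_arrivals(T, dx=x[1] - x[0])   # ~3.0 everywhere
```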
Inference of combinatorial Boolean rules of synergistic gene sets from cancer microarray datasets.
Park, Inho; Lee, Kwang H; Lee, Doheon
2010-06-15
Gene set analysis has become an important tool for the functional interpretation of high-throughput gene expression datasets. Moreover, pattern analyses based on inferred gene set activities of individual samples have shown the ability to identify more robust disease signatures than individual gene-based pattern analyses. Although a number of approaches have been proposed for gene set-based pattern analysis, the combinatorial influence of deregulated gene sets on disease phenotype classification has not been studied sufficiently. We propose a new approach for inferring combinatorial Boolean rules of gene sets for a better understanding of cancer transcriptome and cancer classification. To reduce the search space of the possible Boolean rules, we identify small groups of gene sets that synergistically contribute to the classification of samples into their corresponding phenotypic groups (such as normal and cancer). We then measure the significance of the candidate Boolean rules derived from each group of gene sets; the level of significance is based on the class entropy of the samples selected in accordance with the rules. By applying the present approach to publicly available prostate cancer datasets, we identified 72 significant Boolean rules. Finally, we discuss several identified Boolean rules, such as the rule of glutathione metabolism (down) and prostaglandin synthesis regulation (down), which are consistent with known prostate cancer biology. Scripts written in Python and R are available at http://biosoft.kaist.ac.kr/~ihpark/. The refined gene sets and the full list of the identified Boolean rules are provided in the Supplementary Material. Supplementary data are available at Bioinformatics online.
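The rule-scoring idea reduces to computing the class entropy of the samples a candidate Boolean rule selects (lower entropy = phenotypically purer group). The sketch below assumes per-sample gene set activity scores where negative values mean down-regulated; names and the AND rule are illustrative.

```python
import numpy as np

def class_entropy(labels):
    """Shannon entropy of phenotype labels (e.g. 0 = normal, 1 = cancer)."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def rule_entropy(act_a, act_b, labels):
    """Entropy of samples selected by the Boolean rule
    'gene set A down AND gene set B down' (activity < 0 counts as down)."""
    selected = (act_a < 0) & (act_b < 0)
    if not selected.any():
        return None
    return class_entropy(labels[selected])

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, 100)
print(rule_entropy(rng.normal(size=100), rng.normal(size=100), labels))
```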
Evaluation of a cardiopulmonary resuscitation curriculum in a low resource environment.
Chang, Mary P; Lyon, Camila B; Janiszewski, David; Aksamit, Deborah; Kateh, Francis; Sampson, John
2015-11-07
To evaluate whether a 2-day International Liaison Committee on Resuscitation (ILCOR) Universal Algorithm-based curriculum taught in a tertiary care hospital in Liberia increases local health care provider knowledge and skill comfort level. A combined basic and advanced cardiopulmonary resuscitation (CPR) curriculum was developed for low-resource settings that included lectures and low-fidelity manikin-based simulations. In March 2014, the curriculum was taught to healthcare providers in a tertiary care hospital in Liberia. In a quality assurance review, participants were evaluated for knowledge and comfort levels with resuscitation before and after the workshop. They were also videotaped during simulation sessions and evaluated on standardized performance metrics. Fifty-two hospital staff completed both pre-and post-curriculum surveys. The median score was 45% pre-curriculum and 82% post-curriculum (p<0.00001). The median provider comfort level score was 4 of 5 pre-curriculum and 5 of 5 post-curriculum (p<0.00001). During simulations, 93.2% of participants performed the pulse check within 10 seconds, and 97.7% performed defibrillation within 180 seconds. Clinician knowledge of and comfort level with CPR increased significantly after participating in our curriculum. A CPR curriculum based on lectures and low-fidelity manikin simulations may be an effective way to teach resuscitation in this low-resource setting.
Boet, Sylvain; Bould, M Dylan; Fung, Lillia; Qosa, Haytham; Perrier, Laure; Tavares, Walter; Reeves, Scott; Tricco, Andrea C
2014-06-01
Simulation-based learning is increasingly used by healthcare professionals as a safe method to learn and practice non-technical skills, such as communication and leadership, required for effective crisis resource management (CRM). This systematic review was conducted to gain a better understanding of the impact of simulation-based CRM teaching on transfer of learning to the workplace and subsequent changes in patient outcomes. Studies on CRM, crisis management, crew resource management, teamwork, and simulation published up to September 2012 were searched in MEDLINE(®), EMBASE™, CINAHL, Cochrane Central Register of Controlled Trials, and ERIC. All studies that used simulation-based CRM teaching with outcomes measured at Kirkpatrick Level 3 (transfer of learning to the workplace) or 4 (patient outcome) were included. Studies measuring only learners' reactions or simple learning (Kirkpatrick Level 1 or 2, respectively) were excluded. Two authors independently reviewed all identified titles and abstracts for eligibility. Nine articles were identified as meeting the inclusion criteria. Four studies measured transfer of simulation-based CRM learning into the clinical setting (Kirkpatrick Level 3). In three of these studies, simulation-enhanced CRM training was found significantly more effective than no intervention or didactic teaching. Five studies measured patient outcomes (Kirkpatrick Level 4). Only one of these studies found that simulation-based CRM training made a clearly significant impact on patient mortality. Based on a small number of studies, this systematic review found that CRM skills learned at the simulation centre are transferred to clinical settings, and the acquired CRM skills may translate to improved patient outcomes, including a decrease in mortality.
Computer-aided detection of bladder wall thickening in CT urography (CTU)
NASA Astrophysics Data System (ADS)
Cha, Kenny H.; Hadjiiski, Lubomir M.; Chan, Heang-Ping; Caoili, Elaine M.; Cohan, Richard H.; Weizer, Alon Z.; Gordon, Marshall N.; Samala, Ravi K.
2018-02-01
We are developing a computer-aided detection system for bladder cancer in CT urography (CTU). Bladder wall thickening is a manifestation of bladder cancer, and its detection is more challenging than the detection of bladder masses. We first segmented the inner and outer bladder walls using our method that combines a deep-learning convolutional neural network with level sets. The non-contrast-enhanced region was separated from the contrast-enhanced region with a maximum-intensity-projection-based method. The non-contrast region was smoothed, and a gray-level threshold was applied to the contrast and non-contrast regions separately to extract the bladder wall and potential lesions. The bladder wall was transformed into a straightened thickness profile, which was analyzed to identify wall-thickening candidates. Volume-based features of the wall-thickening candidates were analyzed with linear discriminant analysis (LDA) to differentiate bladder wall thickenings from false positives. A data set of 112 patients, 87 with wall thickening and 25 with normal bladders, was collected retrospectively with IRB approval and split into independent training and test sets. Of the 57 training cases, 44 had bladder wall thickening and 13 were normal. Of the 55 test cases, 43 had wall thickening and 12 were normal. The LDA classifier was trained with the training set and evaluated with the test set. FROC analysis showed that the system achieved sensitivities of 93.2% and 88.4% for the training and test sets, respectively, at 0.5 FPs/case.
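The final classification step can be sketched with scikit-learn's LDA on made-up candidate features; the three features named here (volume, mean thickness, contrast) and all sizes are illustrative, not the system's actual 461-feature pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# stand-in volume-based features per thickening candidate
# (e.g. volume, mean thickness, contrast); label 1 = true thickening, 0 = FP
rng = np.random.default_rng(0)
X_train = rng.normal(size=(120, 3))
y_train = rng.integers(0, 2, 120)

lda = LinearDiscriminantAnalysis().fit(X_train, y_train)

# decision scores for new candidates; sweeping a threshold over these
# scores traces out the FROC operating points
scores = lda.decision_function(rng.normal(size=(10, 3)))
```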
Changes in Mobility of Children with Cerebral Palsy over Time and across Environmental Settings
ERIC Educational Resources Information Center
Tieman, Beth L.; Palisano, Robert J.; Gracely, Edward J.; Rosenbaum, Peter L.; Chiarello, Lisa A.; O'Neil, Margaret E.
2004-01-01
This study examined changes in mobility methods of children with cerebral palsy (CP) over time and across environmental settings. Sixty-two children with CP, ages 6-14 years and classified as levels II-IV on the Gross Motor Function Classification System, were randomly selected from a larger data base and followed for three to four years. On each…
Esposito, Fabrizio; Singer, Neomi; Podlipsky, Ilana; Fried, Itzhak; Hendler, Talma; Goebel, Rainer
2013-02-01
Linking regional metabolic changes with fluctuations in the local electromagnetic fields directly on the surface of the human cerebral cortex is of tremendous importance for a better understanding of detailed brain processes. Functional magnetic resonance imaging (fMRI) and intra-cranial electro-encephalography (iEEG) measure two technically unrelated but spatially and temporally complementary sets of functional descriptions of human brain activity. In order to allow fine-grained spatio-temporal human brain mapping at the population-level, an effective comparative framework for the cortex-based inter-subject analysis of iEEG and fMRI data sets is needed. We combined fMRI and iEEG recordings of the same patients with epilepsy during alternated intervals of passive movie viewing and music listening to explore the degree of local spatial correspondence and temporal coupling between blood oxygen level dependent (BOLD) fMRI changes and iEEG spectral power modulations across the cortical surface after cortex-based inter-subject alignment. To this purpose, we applied a simple model of the iEEG activity spread around each electrode location and the cortex-based inter-subject alignment procedure to transform discrete iEEG measurements into cortically distributed group patterns by establishing a fine anatomic correspondence of many iEEG cortical sites across multiple subjects. Our results demonstrate the feasibility of a multi-modal inter-subject cortex-based distributed analysis for combining iEEG and fMRI data sets acquired from multiple subjects with the same experimental paradigm but with different iEEG electrode coverage. The proposed iEEG-fMRI framework allows for improved group statistics in a common anatomical space and preserves the dynamic link between the temporal features of the two modalities. Copyright © 2012 Elsevier Inc. All rights reserved.
A Round Robin evaluation of AMSR-E soil moisture retrievals
NASA Astrophysics Data System (ADS)
Mittelbach, Heidi; Hirschi, Martin; Nicolai-Shaw, Nadine; Gruber, Alexander; Dorigo, Wouter; de Jeu, Richard; Parinussa, Robert; Jones, Lucas A.; Wagner, Wolfgang; Seneviratne, Sonia I.
2014-05-01
Large-scale and long-term soil moisture observations based on remote sensing are promising data sets for investigating and understanding various processes of the climate system, including the water and biochemical cycles. Currently, the ESA Climate Change Initiative for soil moisture is developing and evaluating a consistent global long-term soil moisture data set based on merging passive and active remotely sensed soil moisture. Within this project, an inter-comparison of algorithms for AMSR-E and ASCAT Level 2 products was conducted separately to assess the performance of different retrieval algorithms. Here we present the inter-comparison of AMSR-E Level 2 soil moisture products. These include the public data sets from the University of Montana (UMT), the Japan Aerospace Exploration Agency (JAXA), VU University Amsterdam (VUA; two algorithms), and the National Aeronautics and Space Administration (NASA). All participating algorithms are applied to the same AMSR-E Level 1 data set. Ascending and descending paths of scaled surface soil moisture are considered and evaluated separately at daily and monthly resolution over the 2007-2011 time period. Absolute values of soil moisture as well as their long-term anomalies (i.e., removing the mean seasonal cycle) and short-term anomalies (i.e., removing a five-week moving average) are evaluated. The evaluation is based on conventional measures like correlation and unbiased root-mean-square differences as well as on the application of the triple collocation method. As reference data, surface soil moisture from 75 quality-controlled soil moisture sites of the International Soil Moisture Network (ISMN) is used, covering a wide range of vegetation density and climate conditions. For the application of the triple collocation method, surface soil moisture estimates from the Global Land Data Assimilation System are used as a third independent data set. We find that the participating algorithms generally display better performance for the descending than for the ascending paths. A first classification of the sites by geographical location shows that the algorithms have a very similar average performance. Further classifications of the sites by land cover type and climate region will be conducted, which might reveal more diverse performance of the algorithms.
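For reference, a minimal sketch of the covariance-based triple collocation estimator mentioned above, under its standard assumptions (three collocated, unbiased estimates of the same signal with mutually independent errors); the series here are synthetic stand-ins for the satellite, model, and in situ data.

```python
# Covariance-based triple collocation: estimate the error variance of each of
# three collocated soil moisture (anomaly) series.
import numpy as np

def triple_collocation_error_var(x, y, z):
    C = np.cov(np.vstack([x, y, z]))
    ex = C[0, 0] - C[0, 1] * C[0, 2] / C[1, 2]
    ey = C[1, 1] - C[0, 1] * C[1, 2] / C[0, 2]
    ez = C[2, 2] - C[0, 2] * C[1, 2] / C[0, 1]
    return ex, ey, ez

# Example: satellite retrieval, model (e.g., GLDAS), and in situ (e.g., ISMN) anomalies.
rng = np.random.default_rng(1)
truth = rng.normal(size=2000)
sat, mod, insitu = (truth + rng.normal(scale=s, size=2000) for s in (0.5, 0.3, 0.2))
print(triple_collocation_error_var(sat, mod, insitu))  # roughly (0.25, 0.09, 0.04)
```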
Efficient hyperspectral image segmentation using geometric active contour formulation
NASA Astrophysics Data System (ADS)
Albalooshi, Fatema A.; Sidike, Paheding; Asari, Vijayan K.
2014-10-01
In this paper, we present a new formulation of geometric active contours that embeds local hyperspectral image information for accurate object region and boundary extraction. We exploit a self-organizing map (SOM) unsupervised neural network to train our model. The segmentation process is achieved by the construction of a level set cost functional in which the dynamic variable is the best matching unit (BMU) coming from the SOM map. In addition, we use Gaussian filtering to regularize the deviation of the level set functional from a signed distance function, which eliminates the computationally expensive re-initialization step. By using the collective computational ability and energy convergence capability of the active contour model (ACM) energy functional, our method optimizes the geometric ACM energy functional with lower computational time and a smoother level set function. The proposed algorithm starts with feature extraction from raw hyperspectral images. In this step, the principal component analysis (PCA) transformation is employed, which reduces dimensionality and selects the best sets of significant spectral bands. Then the modified geometric level set functional based ACM is applied to the optimal number of spectral bands determined by the PCA. By introducing local significant spectral band information, our proposed method is capable of forcing the level set functional to be close to a signed distance function, and therefore considerably reduces the need for the expensive re-initialization procedure. To verify the effectiveness of the proposed technique, we use real-life hyperspectral images and test our algorithm on varying textural regions. This framework can be easily adapted to different applications for object segmentation in aerial hyperspectral imagery.
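A minimal sketch of the PCA-based band reduction step, assuming a hyperspectral cube of shape (rows, cols, bands); choosing the component count by an explained-variance target is one common way to pick an "optimal" number of components, used here as an illustrative stand-in for the authors' selection rule.

```python
# PCA band reduction for a hyperspectral cube (synthetic data, illustrative target).
import numpy as np
from sklearn.decomposition import PCA

def reduce_bands(cube, var_target=0.99):
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    pca = PCA(n_components=var_target)   # keep enough components for 99% variance
    Xr = pca.fit_transform(X)
    return Xr.reshape(rows, cols, -1), pca.explained_variance_ratio_

cube = np.random.rand(64, 64, 120)       # synthetic stand-in for a hyperspectral image
reduced, evr = reduce_bands(cube)
print(reduced.shape, evr[:3])            # reduced cube, leading variance ratios
```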
Propellant Readiness Level: A Methodological Approach to Propellant Characterization
NASA Technical Reports Server (NTRS)
Bossard, John A.; Rhys, Noah O.
2010-01-01
A methodological approach to defining propellant characterization is presented. The method is based on the well-established Technology Readiness Level nomenclature. This approach establishes the Propellant Readiness Level as a metric for ascertaining the readiness of a propellant or a propellant combination by evaluating the following set of propellant characteristics: thermodynamic data, toxicity, applications, combustion data, heat transfer data, material compatibility, analytical prediction modeling, injector/chamber geometry, pressurization, ignition, combustion stability, system storability, qualification testing, and flight capability. The methodology is meant to be applicable to all propellants or propellant combinations; liquid, solid, and gaseous propellants as well as monopropellants and propellant combinations are equally served. The functionality of the proposed approach is tested through the evaluation and comparison of an example set of hydrocarbon fuels.
Communicating science in social settings.
Scheufele, Dietram A
2013-08-20
This essay examines the societal dynamics surrounding modern science. It first discusses a number of challenges facing any effort to communicate science in social environments: lay publics with varying levels of preparedness for fully understanding new scientific breakthroughs; the deterioration of traditional media infrastructures; and an increasingly complex set of emerging technologies that are surrounded by a host of ethical, legal, and social considerations. Based on this overview, I discuss four areas in which empirical social science helps clarify intuitive but sometimes faulty assumptions about the social-level mechanisms of science communication and outline an agenda for bench and social scientists--driven by current social-scientific research in the field of science communication--to guide more effective communication efforts at the societal level in the future.
ERIC Educational Resources Information Center
Lowe, Phyllis; And Others
This module, one of ten competency based modules developed for vocational home economics teachers, is based on a job cluster in the housing management field. It is designed for a variety of levels of learners (secondary, postsecondary, adult) in both school and non-school settings. Focusing on the specific job title of housing management aide,…
ERIC Educational Resources Information Center
Fallon, Lindsay M.; Collier-Meek, Melissa A.; Maggin, Daniel M.; Sanetti, Lisa M. H.; Johnson, Austin H.
2015-01-01
Optimal levels of treatment fidelity, a critical moderator of intervention effectiveness, are often difficult to sustain in applied settings. It is unknown whether performance feedback, a widely researched method for increasing educators' treatment fidelity, is an evidence-based practice. The purpose of this review was to evaluate the current…
ERIC Educational Resources Information Center
Majeika, Caitlyn E.; Walder, Jessica P.; Hubbard, Jessica P.; Steeb, Kelly M.; Ferris, Geoffrey J.; Oakes, Wendy P.; Lane, Kathleen Lynne
2011-01-01
A comprehensive, integrated, three-tiered (CI3T) model of prevention is a framework for proactively meeting students' academic, behavioral, and social needs. At the tertiary (Tier 3) level of prevention, functional assessment-based interventions (FABIs) may be used to identify, develop, and implement supports based on the function, or purpose, of…
The Dynamics of Scaling: A Memory-Based Anchor Model of Category Rating and Absolute Identification
ERIC Educational Resources Information Center
Petrov, Alexander A.; Anderson, John R.
2005-01-01
A memory-based scaling model--ANCHOR--is proposed and tested. The perceived magnitude of the target stimulus is compared with a set of anchors in memory. Anchor selection is probabilistic and sensitive to similarity, base-level strength, and recency. The winning anchor provides a reference point near the target and thereby converts the global…
Using Performance Measures to Allocate Consumable Funding
2007-06-01
The Air Force is now using the Customer Oriented Leveling Technique (COLT) to determine stock levels for consumable items at its bases. COLT sets AF retail stock levels for DLA-managed consumable parts so as to minimize expected customer wait time (ECWT).
A methodology for modeling barrier island storm-impact scenarios
Mickey, Rangley C.; Long, Joseph W.; Plant, Nathaniel G.; Thompson, David M.; Dalyander, P. Soupy
2017-02-16
A methodology for developing a representative set of storm scenarios based on historical wave buoy and tide gauge data for a region at the Chandeleur Islands, Louisiana, was developed by the U.S. Geological Survey. The total water level was calculated for a 10-year period and analyzed against existing topographic data to identify when storm-induced wave action would affect island morphology. These events were categorized on the basis of the threshold of total water level and duration to create a set of storm scenarios that were simulated, using a high-fidelity, process-based, morphologic evolution model, on an idealized digital elevation model of the Chandeleur Islands. The simulated morphological changes resulting from these scenarios provide a range of impacts that can help coastal managers determine resiliency of proposed or existing coastal structures and identify vulnerable areas within those structures.
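A minimal sketch of the event-identification step described above: flagging intervals where a total water level (TWL) series exceeds a morphology-affecting threshold and recording each event's peak and duration, from which scenario categories can be formed. The threshold value and series are illustrative, not the study's.

```python
# Extract (peak TWL, duration) per threshold-exceedance event from an hourly series.
import numpy as np

def extract_storm_events(twl, threshold, dt_hours=1.0):
    above = twl > threshold
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i                                    # event begins
        elif not flag and start is not None:
            seg = twl[start:i]
            events.append((seg.max(), (i - start) * dt_hours))
            start = None                                 # event ends
    if start is not None:                                # event runs to series end
        events.append((twl[start:].max(), (len(twl) - start) * dt_hours))
    return events

t = np.arange(0, 24 * 10)                                # 10 days, hourly
twl = 0.5 * np.sin(2 * np.pi * t / 12.42) \
      + np.random.default_rng(2).normal(0, 0.2, t.size)  # tide + noise stand-in
print(extract_storm_events(twl, threshold=0.8))
```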
Du, Q S; Ma, Y; Xie, N Z; Huang, R B
2014-01-01
In the design of peptide inhibitors, the enormous number of possible peptide sequences is a central concern. Motivated by the rapid accumulation of experimental peptide data and databases, a statistical method is proposed for peptide inhibitor design. In the two-level peptide prediction network (2L-QSAR), one level is the physicochemical properties of amino acids and the other is the peptide sequence position. The activity contributions of amino acids are functions of the physicochemical properties and the sequence positions. In the prediction equation, two weight coefficient sets {ak} and {bl} are assigned to the physicochemical properties and to the sequence positions, respectively. After the two coefficient sets are optimized against the experimental data of known peptide inhibitors using the iterative double least squares (IDLS) procedure, the coefficients are used to evaluate the bioactivities of newly designed peptide inhibitors. The two-level prediction network can be applied to peptide inhibitor design aimed at different target proteins, or at different positions of a protein. A notable advantage of the two-level statistical algorithm is that no host protein structural information is needed. It may also provide useful insight into amino acid properties and the roles of sequence positions.
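A sketch of the alternating structure that an iterative double least squares procedure implies, assuming a bilinear form activity ≈ Σ_k Σ_l a_k b_l P[s, l, k]; each pass solves an ordinary least squares problem for one coefficient set with the other fixed. The property tensor and activities are synthetic, and the bilinear form is an assumption, not the paper's exact equation.

```python
# Alternating least squares for a bilinear two-level model (synthetic data).
import numpy as np

rng = np.random.default_rng(3)
n_pep, n_pos, n_prop = 200, 8, 5
P = rng.normal(size=(n_pep, n_pos, n_prop))     # properties per sequence position
a_true, b_true = rng.normal(size=n_prop), rng.normal(size=n_pos)
y = np.einsum('slk,k,l->s', P, a_true, b_true) + rng.normal(scale=0.05, size=n_pep)

a, b = np.ones(n_prop), np.ones(n_pos)
for _ in range(50):
    Xa = np.einsum('slk,l->sk', P, b)           # design matrix for {a_k}, b fixed
    a, *_ = np.linalg.lstsq(Xa, y, rcond=None)
    Xb = np.einsum('slk,k->sl', P, a)           # design matrix for {b_l}, a fixed
    b, *_ = np.linalg.lstsq(Xb, y, rcond=None)

pred = np.einsum('slk,k,l->s', P, a, b)
print(np.corrcoef(pred, y)[0, 1])               # close to 1 on this synthetic data
```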
A novel earth observation based ecological indicator for cyanobacterial blooms
NASA Astrophysics Data System (ADS)
Anttila, Saku; Fleming-Lehtinen, Vivi; Attila, Jenni; Junttila, Sofia; Alasalmi, Hanna; Hällfors, Heidi; Kervinen, Mikko; Koponen, Sampsa
2018-02-01
Cyanobacteria form spectacular mass occurrences almost annually in the Baltic Sea. These harmful algal blooms are the most visible consequences of marine eutrophication, driven by a surplus of nutrients from anthropogenic sources and internal processes of the ecosystem. We present a novel Cyanobacterial Bloom Indicator (CyaBI) targeted at the ecosystem assessment of eutrophication in marine areas. The method measures the current cyanobacterial bloom situation (the average condition of the most recent 5 years) and compares this to the estimated target level for 'good environmental status' (GES). The current status is derived with an index combining indicative bloom event variables; for this we used seasonal information on the duration, volume, and severity of algal blooms derived from earth observation (EO) data. The target level for GES was set using a remote sensing based data set named Fraction with Cyanobacterial Accumulations (FCA; Kahru & Elmgren, 2014) covering the years 1979-2014. Here a shift-detection algorithm for time series was applied to detect periods in the FCA data where the level of blooms remained low for several consecutive years. The average conditions from these periods were transformed into respective CyaBI target values to represent the target level for GES. The indicator is shown to pass the three critical factors set for marine indicator development, namely that it measures the current status accurately, that the target setting can be scientifically justified, and that it can be connected to the ecosystem management goal. An advantage of the CyaBI method is that it is not restricted to the data used in the development work, but can be complemented, or fully applied, using different types of data sources providing information on cyanobacterial accumulations.
A study on the application of topic models to motif finding algorithms.
Basha Gutierrez, Josep; Nakai, Kenta
2016-12-22
Topic models are statistical algorithms which try to discover the structure of a set of documents according to the abstract topics contained in them. Here we apply this approach to the discovery of the structure of the transcription factor binding sites (TFBS) contained in a set of biological sequences, a fundamental problem in molecular biology research for the understanding of transcriptional regulation. We present two methods that make use of topic models for motif finding. First, we developed an algorithm in which a set of biological sequences is treated as text documents, and the k-mers contained in them as words, to then build a correlated topic model (CTM) and iteratively reduce its perplexity. We also used the perplexity measurement of CTMs to improve our previous algorithm, which was based on a genetic algorithm and several statistical coefficients. The algorithms were tested with 56 data sets from four different species and compared to 14 other methods by the use of several coefficients at both the nucleotide and site levels. The results of our first approach showed performance comparable to the other methods studied, especially at the site level and in sensitivity scores, where it scored better than any of the 14 existing tools. In the case of our previous algorithm, the new approach with the addition of the perplexity measurement clearly outperformed all of the other methods in sensitivity, both at the nucleotide and site levels, and in overall performance at the site level. The statistics obtained show that the performance of a motif finding method based on the use of a CTM is satisfactory enough to conclude that the application of topic models is a valid method for developing motif finding algorithms. Moreover, the addition of topic models to a previously developed method dramatically increased its performance, suggesting that this combined algorithm can be a useful tool to successfully predict motifs in different kinds of sets of DNA sequences.
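A minimal sketch of the "sequences as documents, k-mers as words" setup, assuming gensim as the dependency; gensim's LdaModel stands in for the correlated topic model (CTM is not available in gensim), and log_perplexity serves as the perplexity-style diagnostic the abstract describes. The sequences are random placeholders.

```python
# Treat DNA sequences as documents and their k-mers as words, then fit a topic model.
import random
from gensim import corpora
from gensim.models import LdaModel

def kmers(seq, k=6):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

random.seed(4)
seqs = [''.join(random.choice('ACGT') for _ in range(200)) for _ in range(50)]
docs = [kmers(s) for s in seqs]

dictionary = corpora.Dictionary(docs)
corpus = [dictionary.doc2bow(d) for d in docs]
lda = LdaModel(corpus, id2word=dictionary, num_topics=5, passes=5, random_state=0)
print(lda.log_perplexity(corpus))   # per-word likelihood bound, a perplexity proxy
```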
The clubhouse as an empowering setting.
Mowbray, Carol T; Lewandowski, Lisa; Holter, Mark; Bybee, Deborah
2006-08-01
Attention to psychosocial rehabilitation (PSR) practice has expanded in recent years. However, social work research studies on PSR are not numerous. This study focuses on the operational characteristics of clubhouses, a major PSR program model, and the organizational attributes (including resource levels) that predict the extent to which a clubhouse constitutes an empowering setting. The authors present data from a statewide sample of 30 clubhouses, annually serving nearly 4,000 consumers (adults with serious mental illnesses), based on interviews of clubhouse directors, on-site observations, and government information sources. Results indicate that users were predominantly male, white, and middle-aged; about one-third had a major functional disability. There were wide variations in member characteristics as well as in resource levels. In terms of empowerment, this sample of clubs averaged rather low levels of member involvement in governance and operations but seemed to provide members with opportunities and assistance in making their own decisions. The empowerment variables had different predictors, including client characteristics, urban-related characteristics, staffing, and resource levels. Implications for social work practice in PSR settings are discussed.
Aggarwal, Rohit; Rider, Lisa G; Ruperto, Nicolino; Bayat, Nastaran; Erman, Brian; Feldman, Brian M; Oddis, Chester V; Amato, Anthony A; Chinoy, Hector; Cooper, Robert G; Dastmalchi, Maryam; Fiorentino, David; Isenberg, David; Katz, James D; Mammen, Andrew; de Visser, Marianne; Ytterberg, Steven R; Lundberg, Ingrid E; Chung, Lorinda; Danko, Katalin; García-De la Torre, Ignacio; Song, Yeong Wook; Villa, Luca; Rinaldi, Mariangela; Rockette, Howard; Lachenbruch, Peter A; Miller, Frederick W; Vencovsky, Jiri
2017-05-01
To develop response criteria for adult dermatomyositis (DM) and polymyositis (PM). Expert surveys, logistic regression, and conjoint analysis were used to develop 287 definitions using core set measures. Myositis experts rated greater improvement among multiple pairwise scenarios in conjoint analysis surveys, where different levels of improvement in 2 core set measures were presented. The PAPRIKA (Potentially All Pairwise Rankings of All Possible Alternatives) method determined the relative weights of core set measures and conjoint analysis definitions. The performance characteristics of the definitions were evaluated on patient profiles using expert consensus (gold standard) and were validated using data from a clinical trial. The nominal group technique was used to reach consensus. Consensus was reached for a conjoint analysis-based continuous model using absolute percent change in core set measures (physician, patient, and extramuscular global activity, muscle strength, Health Assessment Questionnaire, and muscle enzyme levels). A total improvement score (range 0-100), determined by summing scores for each core set measure, was based on improvement in and relative weight of each core set measure. Thresholds for minimal, moderate, and major improvement were ≥20, ≥40, and ≥60 points in the total improvement score. The same criteria were chosen for juvenile DM, with different improvement thresholds. Sensitivity and specificity in DM/PM patient cohorts were 85% and 92%, 90% and 96%, and 92% and 98% for minimal, moderate, and major improvement, respectively. Definitions were validated in the clinical trial analysis for differentiating the physician rating of improvement (P < 0.001). The response criteria for adult DM/PM consisted of the conjoint analysis model based on absolute percent change in 6 core set measures, with thresholds for minimal, moderate, and major improvement. © 2017, American College of Rheumatology.
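The stated improvement thresholds translate directly into a small worked example. The weighted scoring of the six core set measures, which yields the 0-100 total improvement score, is not reproduced here; only the final categorization step is shown, and it reflects the adult DM/PM thresholds (the juvenile thresholds differ).

```python
# Map a total improvement score (0-100) to the published response categories.
def response_category(total_improvement_score):
    if total_improvement_score >= 60:
        return "major improvement"
    if total_improvement_score >= 40:
        return "moderate improvement"
    if total_improvement_score >= 20:
        return "minimal improvement"
    return "no response"

for score in (10, 25, 45, 70):
    print(score, "->", response_category(score))
```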
Development of Education in Iraq during 1974/75 and 1975/76.
ERIC Educational Resources Information Center
Ministry of Education, Baghdad (Iraq).
The document describes educational development in Iraq during the period 1974-76. General principles upon which the educational system is based were set down at a Congress of the Arab Ba'ath Socialist Party in 1974. According to these principles, education is compulsory on the primary level, free on all levels, administered by a re-organized…
ERIC Educational Resources Information Center
Cleary, Timothy J.; Callan, Gregory L.; Malatesta, Jaime; Adams, Tanya
2015-01-01
This study examined the convergent and predictive validity of self-regulated learning (SRL) microanalytic measures. Specifically, theoretically based relations among a set of self-reflection processes, self-efficacy, and achievement were examined as was the level of convergence between a microanalytic strategy measure and a SRL self-report…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-29
... measures based on updated scallop biomass projections. The proposed FY 2013 DAS allocations would be set at a precautionary level (i.e., 75 percent of what current biomass levels project would be the DAS... of what current biomass projections indicate could be allocated to each LA scallop vessel for the...
The Adequacy of the B2 Level as University Entrance Requirement
ERIC Educational Resources Information Center
Carlsen, Cecilie Hamnes
2018-01-01
This article reports on a study of foreign students' success in higher education in Norway and focuses specifically on the relation between academic success and students' proficiency level of Norwegian as measured by a CEFR-based university entrance test. This study is distinguished from prior studies because it sets out to investigate not only…
GPU accelerated edge-region based level set evolution constrained by 2D gray-scale histogram.
Balla-Arabé, Souleymane; Gao, Xinbo; Wang, Bin
2013-07-01
Due to its intrinsic nature, which allows it to easily handle complex shapes and topological changes, the level set method (LSM) has been widely used in image segmentation. Nevertheless, the LSM is computationally expensive, which limits its applications in real-time systems. To address this, we propose a new level set algorithm which simultaneously uses edge, region, and 2D histogram information in order to efficiently segment objects of interest in a given scene. The computational complexity of the proposed LSM is greatly reduced by using the highly parallelizable lattice Boltzmann method (LBM) with a body force to solve the level set equation (LSE). The body force is the link with the image data and is defined from the proposed LSE. The proposed LSM is then implemented on NVIDIA graphics processing units to fully take advantage of the LBM's local nature. The new algorithm is effective, robust against noise, independent of the initial contour, fast, and highly parallelizable. The edge and region information enable the detection of objects with and without edges, and the 2D histogram information ensures the effectiveness of the method in a noisy environment. Experimental results on synthetic and real images demonstrate subjectively and objectively the performance of the proposed method.
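A structural sketch of solving a level set equation with a lattice Boltzmann scheme: a D2Q9 BGK update with a zero-velocity equilibrium (which recovers a diffusion-type equation) plus a body force term through which image information would enter. The paper's force definition is not reproduced; F is left as a placeholder, so this shows only the shape of the scheme.

```python
# D2Q9 BGK lattice Boltzmann update for a diffusion-like level set equation.
import numpy as np

W = np.array([4/9] + [1/9]*4 + [1/36]*4)                    # D2Q9 weights (sum to 1)
E = np.array([(0,0),(1,0),(0,1),(-1,0),(0,-1),(1,1),(-1,1),(-1,-1),(1,-1)])

def lbm_step(f, F, tau=1.0):
    phi = f.sum(axis=0)                                     # level set field
    feq = W[:, None, None] * phi                            # zero-velocity equilibrium
    f = f - (f - feq) / tau + W[:, None, None] * F          # BGK collision + body force
    for i, (ex, ey) in enumerate(E):                        # streaming (periodic)
        f[i] = np.roll(np.roll(f[i], ex, axis=1), ey, axis=0)
    return f

phi0 = np.random.default_rng(5).normal(size=(64, 64))       # initial level set field
f = W[:, None, None] * phi0
F = np.zeros((64, 64))                                       # image-coupled force placeholder
for _ in range(100):
    f = lbm_step(f, F)
print(f.sum(axis=0).std())                                   # phi smooths toward uniform
```

Every update is local to a node and its neighbors, which is the property that makes the method map well onto a GPU.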
NASA Astrophysics Data System (ADS)
Tian, J.; Krauß, T.; d'Angelo, P.
2017-05-01
Automatic rooftop extraction is one of the most challenging problems in remote sensing image analysis. Classical 2D image processing techniques are expensive due to the high number of features required to locate buildings. This problem can be avoided when 3D information is available. In this paper, we show how to fuse the spectral and height information of stereo imagery to achieve efficient and robust rooftop extraction. In the first step, the digital terrain model (DTM), and in turn the normalized digital surface model (nDSM), is generated using a new step-edge approach. In the second step, the initial building locations and rooftop boundaries are derived by removing the low-level pixels and the high-level pixels most likely to be trees or shadows. This boundary then serves as the initial level set function, which is further refined to fit the best possible boundaries through distance-regularized level set curve evolution. During the fitting procedure, the edge-based active contour model is adopted and implemented using edge indicators extracted from the panchromatic image. The performance of the proposed approach is tested on WorldView-2 satellite data captured over Munich.
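A minimal sketch of the height-based initialization, assuming the nDSM is the DSM minus the DTM and that an above-ground height threshold (an illustrative 3 m here) yields the initial building candidates to be refined by level set evolution; removal of likely trees and shadows is left as a placeholder.

```python
# nDSM computation and a simple height-threshold building candidate mask.
import numpy as np

def initial_building_mask(dsm, dtm, min_height=3.0):
    ndsm = dsm - dtm                      # normalized digital surface model
    mask = ndsm > min_height              # drop low-level (ground) pixels
    # (tree/shadow removal via spectral cues would further prune this mask)
    return ndsm, mask

dsm = np.random.default_rng(6).uniform(500, 530, size=(100, 100))  # synthetic surface
dtm = np.full((100, 100), 500.0)                                    # flat terrain stand-in
ndsm, mask = initial_building_mask(dsm, dtm)
print(mask.mean())                        # fraction of candidate rooftop pixels
```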
Accuracy of Handheld Blood Glucose Meters at High Altitude
de Vries, Suzanna T.; Fokkert, Marion J.; Dikkeschei, Bert D.; Rienks, Rienk; Bilo, Karin M.; Bilo, Henk J. G.
2010-01-01
Background Due to increasing numbers of people with diabetes taking part in extreme sports (e.g., high-altitude trekking), reliable handheld blood glucose meters (BGMs) are necessary. Accurate blood glucose measurement under extreme conditions is paramount for safe recreation at altitude. Prior studies reported bias in blood glucose measurements using different BGMs at high altitude. We hypothesized that glucose oxidase-based BGMs are more influenced by the lower atmospheric oxygen pressure at altitude than glucose dehydrogenase-based BGMs. Methodology/Principal Findings Glucose measurements at simulated altitude on nine BGMs (six glucose dehydrogenase and three glucose oxidase BGMs) were compared to glucose measurements on a similar BGM at sea level and to a laboratory glucose reference method. Venous blood samples at four different glucose levels were used. Moreover, two glucose oxidase-based and two glucose dehydrogenase-based BGMs were evaluated at different altitudes on Mount Kilimanjaro. Accuracy criteria were set at a bias <15% from reference glucose (when >6.5 mmol/L) and <1 mmol/L from reference glucose (when <6.5 mmol/L). No significant difference was observed between measurements at simulated altitude and sea level for either glucose oxidase-based or glucose dehydrogenase-based BGMs as a group phenomenon. Two GDH-based BGMs did not meet the set performance criteria, and most BGMs overestimated the true glucose concentration at high altitude. Conclusion At simulated high altitude all tested BGMs, including glucose oxidase-based BGMs, showed no influence of low atmospheric oxygen pressure. All BGMs, except for two GDH-based BGMs, performed within the predefined criteria. At true high altitude one GDH-based BGM had the best precision and accuracy. PMID:21103399
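The stated accuracy criteria translate directly into a small check, shown here as a worked example with illustrative readings:

```python
# Encode the study's accuracy criteria: bias <15% of reference when reference
# >6.5 mmol/L, absolute deviation <1 mmol/L when reference <6.5 mmol/L.
def meets_criteria(measured, reference):
    if reference > 6.5:
        return abs(measured - reference) / reference < 0.15
    return abs(measured - reference) < 1.0

print(meets_criteria(8.0, 7.0))   # True: 14.3% bias at 7.0 mmol/L
print(meets_criteria(5.9, 4.8))   # False: 1.1 mmol/L deviation below 6.5 mmol/L
```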
Ngo, Tuan Anh; Lu, Zhi; Carneiro, Gustavo
2017-01-01
We introduce a new methodology that combines deep learning and level set methods for the automated segmentation of the left ventricle of the heart from cardiac cine magnetic resonance (MR) data. This combination is relevant for segmentation problems where the visual object of interest presents large shape and appearance variations but the annotated training set is small, which is the case for various medical image analysis applications, including the one considered in this paper. In particular, level set methods are based on shape and appearance terms that use small training sets, but present limitations for modelling the visual object variations. Deep learning methods can model such variations using relatively small amounts of annotated training data, but they often need to be regularised to produce good generalisation. Therefore, the combination of these methods brings together the advantages of both approaches, producing a methodology that needs small training sets and produces accurate segmentation results. We test our methodology on the MICCAI 2009 left ventricle segmentation challenge database (containing 15 sequences for training, 15 for validation and 15 for testing), where our approach achieves the most accurate results on the semi-automated problem and state-of-the-art results for the fully automated challenge. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
Web-based international studies in limited populations of pediatric leukemia.
Valsecchi, Maria Grazia; Silvestri, Daniela; Covezzoli, Anna; De Lorenzo, Paola
2008-02-01
Recent progress in cancer research leads to the characterization of small subgroups of patients by genetic/biological features. Clinical studies in this setting are frequently promoted by international networks of independent researchers and are limited by practical and methodological constraints, not least the regulations recently issued by national and international institutions (EU Directive 2001/20/EC). We reviewed various methods in the design of international multicenter studies, with focus on randomized clinical trials. This paper reports our experience in planning and conducting international studies in childhood leukemia. We applied a decentralized study conduct based on a two-level structure, comprising a national and an international coordinating level. For the more recent trials this structure was implemented as a web-based system. This approach accommodates major legal requirements (e.g., safety reporting) and ensures Good Clinical Practice principles by implementing risk-oriented monitoring procedures. Setting up international non-commercial trials is increasingly complicated. Still, they are strongly needed for answering relevant questions in limited populations. (c) 2007 Wiley-Liss, Inc.
Identity related to living situation in six individuals with congenital quadriplegia.
Robey, Kenneth L
2008-01-01
This study was a preliminary examination of structural aspects of identity, particularly identity associated with living situation, in individuals who have quadriplegia due to cerebral palsy. A hierarchical classes algorithm (HICLAS) was used to construct idiographic 'identity structure' models for three individuals who are living in an inpatient hospital setting and for three individuals living in community-based group residences. Indices derived from the models indicate that the identity 'myself as one who has a disability' was structurally superordinate (i.e., resided at a high hierarchical level) for all six participants, suggesting a high level of importance of this identity in participants' sense of self. The models also indicate that while identity associated with one's particular living situation was superordinate for persons living in the hospital, it was not for persons living in community residences. While conclusions based on this small sample are necessarily limited, the data suggest that identity associated with living situation might differ in structural centrality, and presumably subjective importance, for persons living in inpatient versus community-based settings.
Gradient augmented level set method for phase change simulations
NASA Astrophysics Data System (ADS)
Anumolu, Lakshman; Trujillo, Mario F.
2018-01-01
A numerical method for the simulation of two-phase flow with phase change based on the Gradient-Augmented-Level-set (GALS) strategy is presented. Sharp capturing of the vaporization process is enabled by: i) identification of the vapor-liquid interface, Γ (t), at the subgrid level, ii) discontinuous treatment of thermal physical properties (except for μ), and iii) enforcement of mass, momentum, and energy jump conditions, where the gradients of the dependent variables are obtained at Γ (t) and are consistent with their analytical expression, i.e. no local averaging is applied. Treatment of the jump in velocity and pressure at Γ (t) is achieved using the Ghost Fluid Method. The solution of the energy equation employs the sub-grid knowledge of Γ (t) to discretize the temperature Laplacian using second-order one-sided differences, i.e. the numerical stencil completely resides within each respective phase. To carefully evaluate the benefits or disadvantages of the GALS approach, the standard level set method is implemented and compared against the GALS predictions. The results show the expected trend that interface identification and transport are predicted noticeably better with GALS over the standard level set. This benefit carries over to the prediction of the Laplacian and temperature gradients in the neighborhood of the interface, which are directly linked to the calculation of the vaporization rate. However, when combining the calculation of interface transport and reinitialization with two-phase momentum and energy, the benefits of GALS are to some extent neutralized, and the causes for this behavior are identified and analyzed. Overall the additional computational costs associated with GALS are almost the same as those using the standard level set technique.
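For the energy equation discretization described above, a minimal sketch of a second-order one-sided second-derivative stencil, (2 f0 - 5 f1 + 4 f2 - f3)/h², whose points all lie on one side of the evaluation location so the stencil can stay within a single phase near the interface; the 1D test field is illustrative, not the paper's solver.

```python
# One-sided, second-order stencil for f''(x): all sample points lie on one side
# of x_i, so near an interface the stencil stays entirely within one phase.
import numpy as np

def d2_one_sided(f, i, h, direction=+1):
    """f''(x_i) ~ (2 f_i - 5 f_{i+s} + 4 f_{i+2s} - f_{i+3s}) / h^2, s = +/-1."""
    s = direction
    return (2*f[i] - 5*f[i + s] + 4*f[i + 2*s] - f[i + 3*s]) / h**2

x = np.linspace(0.0, 1.0, 101)
T = np.sin(np.pi * x)                        # smooth test field
i = 10                                       # pretend the interface lies just left of x[10]
approx = d2_one_sided(T, i, x[1] - x[0])     # uses only points at and right of x[10]
exact = -np.pi**2 * np.sin(np.pi * x[i])
print(approx, exact)                         # agree to O(h^2)
```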
Using Trained Pixel Classifiers to Select Images of Interest
NASA Technical Reports Server (NTRS)
Mazzoni, D.; Wagstaff, K.; Castano, R.
2004-01-01
We present a machine-learning-based approach to ranking images according to learned priorities. Unlike previous methods for image evaluation, which typically assess the value of each image based on the presence of predetermined specific features, this method uses two levels of machine-learning classifiers: one level is used to classify each pixel as belonging to one of a group of rather generic classes, and another level is used to rank the images based on these pixel classifications, given some example rankings from a scientist as a guide. Initial results indicate that the technique works well, producing new rankings that match the scientist's rankings significantly better than would be expected by chance. The method is demonstrated on a set of images collected by a Mars field-test rover.
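A sketch of the two-level structure, with assumptions made explicit: a random forest stands in for the unspecified pixel classifier, per-image class fractions stand in for the image-level features, and ridge regression stands in for the ranker. All features, labels, and example rankings are synthetic.

```python
# Two-level scheme: pixel classifier -> per-image class fractions -> learned ranking.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import Ridge

rng = np.random.default_rng(7)
n_images, n_pix, n_classes = 30, 500, 4

# Level 1: pixel classifier trained on labeled pixels (6 hypothetical features).
Xpix = rng.normal(size=(5000, 6))
ypix = rng.integers(0, n_classes, size=5000)
pix_clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(Xpix, ypix)

# Level 2: image features = fraction of pixels per class; fit to example ranks.
def image_features(img_pixels):
    labels = pix_clf.predict(img_pixels)
    return np.bincount(labels, minlength=n_classes) / len(labels)

Ximg = np.vstack([image_features(rng.normal(size=(n_pix, 6))) for _ in range(n_images)])
scientist_rank = rng.permutation(n_images).astype(float)   # 0 = highest priority
ranker = Ridge().fit(Ximg, scientist_rank)
order = np.argsort(ranker.predict(Ximg))                   # predicted priority order
print(order[:5])                                           # top-priority images
```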
Fuchs, Sven; Röthlisberger, Veronika; Thaler, Thomas; Zischg, Andreas; Keiler, Margreth
2017-03-04
A coevolutionary perspective is adopted to understand the dynamics of exposure to mountain hazards in the European Alps. A spatially explicit, object-based temporal assessment of elements at risk to mountain hazards (river floods, torrential floods, and debris flows) in Austria and Switzerland is presented for the period from 1919 to 2012. The assessment is based on two different data sets: (1) hazard information adhering to legally binding land use planning restrictions and (2) information on building types combined from different national-level spatial data. We discuss these transdisciplinary dynamics and focus on economic, social, and institutional interdependencies and interactions between human and physical systems. Exposure changes in response to multiple drivers, including population growth and land use conflicts. The results show that whereas some regional assets are associated with a strong increase in exposure to hazards, others are characterized by a below-average level of exposure. The spatiotemporal results indicate relatively stable hot spots in the European Alps. These results coincide with the topography of the countries and with the respective range of economic activities and political settings. Furthermore, the differences between management approaches as a result of multiple institutional settings are discussed. A coevolutionary framework widens the explanatory power of multiple drivers to changes in exposure and risk and supports a shift from structural, security-based policies toward an integrated, risk-based natural hazard management system.
A Simplified Approach for the Rapid Generation of Transient Heat-Shield Environments
NASA Technical Reports Server (NTRS)
Wurster, Kathryn E.; Zoby, E. Vincent; Mills, Janelle C.; Kamhawi, Hilmi
2007-01-01
A simplified approach has been developed whereby transient entry heating environments are reliably predicted based upon a limited set of benchmark radiative and convective solutions. Heating, pressure, and shear-stress levels, non-dimensionalized by an appropriate parameter at each benchmark condition, are applied throughout the entry profile. This approach was shown to be valid based on the observation that the fully catalytic, laminar distributions examined were relatively insensitive to altitude as well as velocity throughout the regime of significant heating. In order to establish a best prediction by which to judge the results that can be obtained using a very limited benchmark set, predictions based on a series of benchmark cases along a trajectory are used. Solutions that rely only on the limited benchmark set, ideally in the neighborhood of peak heating, are compared against the resultant transient heating rates and total heat loads from the best prediction. Predictions based on two or fewer benchmark cases at or near the trajectory peak heating condition yielded results within 5-10 percent of the best predictions. Thus, the method provides transient heating environments over the heat-shield face with sufficient resolution and accuracy for thermal protection system design, and also offers a significant capability to perform rapid trade studies, such as the effect of different trajectories, atmospheres, or trim angle of attack on convective and radiative heating rates and loads, pressure, and shear-stress levels.
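A sketch of the scaling idea under a common assumption: the benchmark heating distribution is non-dimensionalized by its stagnation-point value and then rescaled along the trajectory using a stagnation-point proportionality of the form q_s ∝ sqrt(ρ) V³ (the proportionality matters here, not any particular correlation constant). This illustrates the mechanics, not the paper's exact normalization.

```python
# Rescale a benchmark heating distribution along a trajectory (illustrative values).
import numpy as np

def transient_heating(q_dist_benchmark, rho_bench, v_bench, rho_t, v_t):
    """q(x, t) = (q_dist / q_s,bench) * q_s(t), with q_s assumed ~ sqrt(rho) V^3."""
    scale = np.sqrt(rho_t / rho_bench) * (v_t / v_bench) ** 3
    return q_dist_benchmark[None, :] * scale[:, None]

q_bench = np.array([1.00, 0.85, 0.60, 0.35])   # normalized distribution over body points
rho = np.array([1e-5, 1e-4, 3e-4])              # kg/m^3 along the trajectory
v = np.array([7500.0, 6800.0, 5200.0])          # m/s
print(transient_heating(q_bench, 1e-4, 6800.0, rho, v))  # heating at 3 times x 4 points
```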
Occupational Home Economics Education Series. Securing Employment. Competency Based Teaching Module.
ERIC Educational Resources Information Center
Lowe, Phyllis; And Others
This module, one of ten competency based modules developed for vocational teachers, focuses on securing employment in home economics. It is designed for a variety of levels of learners (secondary, postsecondary, adult) in both school and nonschool educational settings. Five competencies to be developed with this module deal with the meaning of…
Effects-based approaches that employ molecular and tissue level tools to detect and characterize biological responses to contaminants can be a useful complement to chemical monitoring approaches. When the source/type of contamination is known, a predetermined, or supervised, set...
PBPK models are useful in estimating exposure levels based on in vitro to in vivo extrapolation (IVIVE) calculations. Linkage of large sets of chemically screened in vitro signature effects to in vivo adverse outcomes using IVIVE is central to the concepts of toxicology in the 21st ...
Problem-Based Learning in Secondary Education: Evaluation by an Experiment
ERIC Educational Resources Information Center
De Witte, Kristof; Rogge, Nicky
2016-01-01
The effectiveness of problem-based learning (PBL) in terms of increasing students' educational attainments has been extensively studied for higher education students and in nonexperimental settings. This paper tests the effectiveness of PBL as an alternative instruction method in secondary education. In a controlled experiment at the class level,…
ERIC Educational Resources Information Center
National Advisory Commission on Work-Based Learning (DOL), Washington, DC.
The National Advisory Commission on Work-Based Learning worked to identify practical steps that the Labor Department could take to help increase the skill levels of the U.S. work force and expand work-based training. The findings gained from a series of roundtables and further studies were synthesized into a set of recommendations in five major…
An Adaptive Evaluation Structure for Computer-Based Instruction.
ERIC Educational Resources Information Center
Welsh, William A.
Adaptive Evaluation Structure (AES) is a set of linked computer programs designed to increase the effectiveness of interactive computer-assisted instruction at the college level. The package has four major features, the first of which is based on a prior cognitive inventory and on the accuracy and pace of student responses. AES adjusts materials…
Caregivers' Cortisol Levels and Perceived Stress in Home-Based and Center-Based Childcare
ERIC Educational Resources Information Center
Groeneveld, Marleen G.; Vermeer, Harriet J.; van IJzendoorn, Marinus H.; Linting, Marielle
2012-01-01
The current study examined professional caregivers' perceived and physiological stress, and associations with the quality of care they provide. Participants were 55 female caregivers from childcare homes and 46 female caregivers from childcare centers in the Netherlands. In both types of settings, equivalent measures and procedures were used. On…
An Evidence-Based Practice Model across the Academic and Clinical Settings
ERIC Educational Resources Information Center
Wolter, Julie A.; Corbin-Lewis, Kim; Self, Trisha; Elsweiler, Anne
2011-01-01
This tutorial is designed to provide academic communication sciences and disorders (CSD) programs, at both the undergraduate and graduate levels, with a comprehensive instructional model on evidence-based practice (EBP). The model was designed to help students view EBP as an ongoing process needed in all clinical decision making. The three facets…
ERIC Educational Resources Information Center
Dieckmann, Peter; Friis, Susanne Molin; Lippert, Anne; Ostergaard, Doris
2012-01-01
Introduction: This study describes (a) process goals, (b) success factors, and (c) barriers for optimizing simulation-based learning environments within the simulation setting model developed by Dieckmann. Methods: Seven simulation educators of different experience levels were interviewed using the Critical Incident Technique. Results: (a) The…
Collaborative knowledge acquisition for the design of context-aware alert systems.
Joffe, Erel; Havakuk, Ofer; Herskovic, Jorge R; Patel, Vimla L; Bernstam, Elmer Victor
2012-01-01
To present a framework for combining implicit knowledge acquisition from multiple experts with machine learning and to evaluate this framework in the context of anemia alerts. Five internal medicine residents reviewed 18 anemia alerts, while 'talking aloud'. They identified features that were reviewed by two or more physicians to determine appropriate alert level, etiology and treatment recommendation. Based on these features, data were extracted from 100 randomly-selected anemia cases for a training set and an additional 82 cases for a test set. Two staff internists assigned an alert level, etiology and treatment recommendation before and after reviewing the entire electronic medical record. The training set of 118 cases (100 plus 18) and the test set of 82 cases were explored using RIDOR and JRip algorithms. The feature set was sufficient to assess 93% of anemia cases (intraclass correlation for alert level before and after review of the records by internists 1 and 2 were 0.92 and 0.95, respectively). High-precision classifiers were constructed to identify low-level alerts (precision p=0.87, recall R=0.4), iron deficiency (p=1.0, R=0.73), and anemia associated with kidney disease (p=0.87, R=0.77). It was possible to identify low-level alerts and several conditions commonly associated with chronic anemia. This approach may reduce the number of clinically unimportant alerts. The study was limited to anemia alerts. Furthermore, clinicians were aware of the study hypotheses potentially biasing their evaluation. Implicit knowledge acquisition, collaborative filtering and machine learning were combined automatically to induce clinically meaningful and precise decision rules.
NASA Astrophysics Data System (ADS)
Liu, Hong; Nodine, Calvin F.
1996-07-01
This paper presents a generalized image contrast enhancement technique, which equalizes the perceived brightness distribution based on the Heinemann contrast discrimination model. It is based on the mathematically proven existence of a unique solution to a nonlinear equation, and is formulated with easily tunable parameters. The model uses a two-step log-log representation of luminance contrast between targets and surround in a luminous background setting. The algorithm consists of two nonlinear gray scale mapping functions that have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of the gray-level distribution of the given image, and can be uniquely determined once the previous three are set. Tests have been carried out to demonstrate the effectiveness of the algorithm for increasing the overall contrast of radiology images. The traditional histogram equalization can be reinterpreted as an image enhancement technique based on the knowledge of human contrast perception. In fact, it is a special case of the proposed algorithm.
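The abstract notes that traditional histogram equalization is a special case of the proposed mapping; for reference, a minimal gray-scale histogram equalization (the classical special case, not the Heinemann-based algorithm itself):

```python
# Classical histogram equalization: remap gray levels through the empirical CDF.
import numpy as np

def histogram_equalize(img, levels=256):
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = hist.cumsum() / img.size                       # empirical CDF of gray levels
    return np.round(cdf[img] * (levels - 1)).astype(np.uint8)

img = np.random.default_rng(8).integers(80, 140, size=(64, 64)).astype(np.uint8)
eq = histogram_equalize(img)
print(img.min(), img.max(), "->", eq.min(), eq.max())    # gray range expanded
```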
Ultrasound-Guided Regional Anesthesia Simulation Training: A Systematic Review.
Chen, Xiao Xu; Trivedi, Vatsal; AlSaflan, AbdulHadi A; Todd, Suzanne Clare; Tricco, Andrea C; McCartney, Colin J L; Boet, Sylvain
Ultrasound-guided regional anesthesia (UGRA) has become the criterion standard of regional anesthesia practice. Ultrasound-guided regional anesthesia teaching programs often use simulation, and guidelines have been published to help guide UGRA education. This systematic review aimed to examine the effectiveness of simulation-based education for the acquisition and maintenance of competence in UGRA. Studies identified in MEDLINE, EMBASE, CINAHL, Cochrane Central Register of Controlled Trials, and ERIC were included if they assessed simulation-based UGRA teaching with outcomes measured at Kirkpatrick level 2 (knowledge and skills), 3 (transfer of learning to the workplace), or 4 (patient outcomes). Two authors independently reviewed all identified references for eligibility, abstracted data, and appraised quality. After screening 176 citations and 45 full-text articles, 12 studies were included. Simulation-enhanced training improved knowledge acquisition (Kirkpatrick level 2) when compared with nonsimulation training. Seven studies measuring skill acquisition (Kirkpatrick level 2) found that simulation-enhanced UGRA training was significantly more effective than alternative teaching methods or no intervention. One study measuring transfer of learning into the clinical setting (Kirkpatrick level 3) found no difference between simulation-enhanced UGRA training and non-simulation-based training. However, this study was discontinued early because of technical challenges. Two studies examined patient outcomes (Kirkpatrick level 4), and one of these found that simulation-based UGRA training improved patient outcomes compared with didactic teaching. Ultrasound-guided regional anesthesia knowledge and skills significantly improved with simulation training. The acquired UGRA skills may be transferred to the clinical setting; however, further studies are required to confirm these changes translate to improved patient outcomes.
Addressing Inter-set Write-Variation for Improving Lifetime of Non-Volatile Caches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh; Vetter, Jeffrey S
We propose a technique that minimizes inter-set write variation in non-volatile memory (NVM) caches to improve their lifetime. Our technique uses a cache coloring scheme to add a software-controlled mapping layer between groups of physical pages (called memory regions) and cache sets. Periodically, the number of writes to the different colors of the cache is computed, and based on this result, the mapping of a few colors is changed to channel the write traffic to the least-utilized cache colors. This change helps to achieve wear-leveling.
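A simplified sketch of the periodic remapping policy: per-color write counts are inspected, and a region mapped to the most-written color is moved onto the least-written one. Real details (page migration, counter reset, swap granularity, how many regions move per epoch) are omitted; the structures are stand-ins for the software-controlled mapping layer.

```python
# Periodic wear-leveling step: shift one region from the hottest to the coldest color.
def rebalance(color_writes, region_to_color):
    hot = max(color_writes, key=color_writes.get)    # most-written cache color
    cold = min(color_writes, key=color_writes.get)   # least-utilized cache color
    for region, color in region_to_color.items():
        if color == hot:                             # channel its traffic to the cold color
            region_to_color[region] = cold
            break
    return region_to_color

writes = {0: 9100, 1: 400, 2: 4700, 3: 350}
mapping = {'regionA': 0, 'regionB': 1, 'regionC': 2, 'regionD': 3}
print(rebalance(writes, mapping))                    # regionA remapped to color 3
```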
Clarke, E; Desselberger, U
2015-01-01
Rotaviruses (RV) are the leading cause of gastroenteritis in infants and children worldwide and are associated with high mortality, predominantly in low-income settings. The virus is classified into G and P serotypes, and further into P genotypes, based on differences in the surface-exposed proteins VP7 and VP4, respectively. Infection results in a variable level of protection from subsequent reinfection and disease. This protection is predominantly homotypic in some settings, whereas broader heterotypic protection is reported in other cohorts. Two antigenically distinct oral RV vaccines are licensed and are being rolled out widely, including in resource-poor settings, with funding provided by the GAVI alliance. The first is a monovalent vaccine derived from a live-attenuated human RV strain, whereas the second is a pentavalent bovine-human reassortant vaccine. Both vaccines are highly efficacious in high-income settings, but greatly reduced levels of protection are reported in low-income countries. Here, the current challenges facing mucosal immunologists and vaccinologists aiming to define immunological correlates and to understand the variable levels of protection conferred by these vaccines in humans are considered. Such understanding is critical to maximizing the public health impact of the current vaccines and also to the development of the next generation of RV vaccines, which are needed.
Lu, Tong; Tai, Chiew-Lan; Yang, Huafei; Cai, Shijie
2009-08-01
We present a novel knowledge-based system to automatically convert real-life engineering drawings to content-oriented high-level descriptions. The proposed method divides the complex interpretation process into two parts: knowledge representation and knowledge-based interpretation. We propose a new hierarchical descriptor-based knowledge representation method to organize the various types of engineering objects and their complex high-level relations. The descriptors are defined using an Extended Backus-Naur Form (EBNF), facilitating modification and maintenance. When interpreting a set of related engineering drawings, the knowledge-based interpretation system first constructs an EBNF-tree from the knowledge representation file, then searches for potential engineering objects guided by a depth-first order of the nodes in the EBNF-tree. Experimental results and comparisons with other interpretation systems demonstrate that our knowledge-based system is accurate and robust for high-level interpretation of complex real-life engineering projects.
Kibicho, Jennifer; Pinkerton, Steven D; Owczarzak, Jill; Mkandawire-Valhmu, Lucy; Kako, Peninnah M
2015-01-01
To describe community pharmacists' perceptions on their current role in direct patient care services, an expanded role for pharmacists in providing patient care services, and changes needed to optimally use pharmacists' expertise to provide high-quality direct patient care services to people living with human immunodeficiency virus (HIV) infections. Cross-sectional study. Four Midwestern cities in the United States in August through October 2009. 28 community-based pharmacists practicing in 17 pharmacies. Interviews. Opinions of participants about roles of specialty and nonspecialty pharmacists in caring for patients living with HIV infections. Pharmacists noted that although challenges in our health care system characterized by inaccessible health professionals presented opportunities for a greater pharmacist role, there were missed opportunities for greater level of patient care services in many community-based nonspecialty settings. Many pharmacists in semispecialty and nonspecialty pharmacies expressed a desire for an expanded role in patient care congruent with their pharmacy education and training. Structural-level policy changes needed to transform community-based pharmacy settings to patient-centered medical homes include recognizing pharmacists as important players in the multidisciplinary health care team, extending the health information exchange highway to include pharmacist-generated electronic therapeutic records, and realigning financial incentives. Comprehensive policy initiatives are needed to optimize the use of highly trained pharmacists in enhancing the quality of health care to an ever-growing number of Americans with chronic conditions who access care in community-based pharmacy settings.
Cerebrospinal fluid neopterin decay characteristics after initiation of antiretroviral therapy.
Yilmaz, Aylin; Yiannoutsos, Constantin T; Fuchs, Dietmar; Price, Richard W; Crozier, Kathryn; Hagberg, Lars; Spudich, Serena; Gisslén, Magnus
2013-05-10
Neopterin, a biomarker of macrophage activation, is elevated in the cerebrospinal fluid (CSF) of most HIV-infected individuals and decreases after initiation of antiretroviral therapy (ART). We studied decay characteristics of neopterin in CSF and blood after commencement of ART in HIV-infected subjects and estimated the set-point levels of CSF neopterin after ART-mediated viral suppression. CSF and blood neopterin were longitudinally measured in 102 neurologically asymptomatic HIV-infected subjects who were treatment-naïve or had been off ART for ≥ 6 months. We used a non-linear model to estimate neopterin decay in response to ART and a stable neopterin set-point attained after prolonged ART. Seven subjects with HIV-associated dementia (HAD) who initiated ART were studied for comparison. Non-HAD patients were followed for a median 84.7 months. Though CSF neopterin concentrations decreased rapidly after ART initiation, it was estimated that set-point levels would be below normal CSF neopterin levels (<5.8 nmol/L) in only 60/102 (59%) of these patients. Pre-ART CSF neopterin was the primary predictor of set-point (P <0.001). HAD subjects had higher baseline median CSF neopterin levels than non-HAD subjects (P <0.0001). Based on the non-HAD model, only 14% of HAD patients were predicted to reach normal levels. After virologically suppressive ART, abnormal CSF neopterin levels persisted in 41% of non-HAD and the majority of HAD patients. ART is not fully effective in ameliorating macrophage activation in CNS as well as blood, especially in subjects with higher pre-ART levels of immune activation.
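A sketch of a non-linear decay-to-plateau fit of the kind the abstract describes, assuming an exponential form n(t) = setpoint + (n0 - setpoint) * exp(-k t) as a stand-in for the paper's unspecified model; the time points and neopterin values are illustrative, not study data.

```python
# Fit a decay-to-plateau model to CSF neopterin measurements after ART initiation.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, n0, setpoint, k):
    return setpoint + (n0 - setpoint) * np.exp(-k * t)

t = np.array([0, 1, 3, 6, 12, 24, 48])                          # months on ART
neopterin = np.array([25.0, 18.0, 12.0, 9.0, 7.5, 6.8, 6.5])    # nmol/L (illustrative)
(n0, setpoint, k), _ = curve_fit(decay, t, neopterin, p0=(25.0, 6.0, 0.5))
print(f"estimated set-point: {setpoint:.1f} nmol/L (normal < 5.8)")
```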
glideinWMS—a generic pilot-based workload management system
NASA Astrophysics Data System (ADS)
Sfiligoi, I.
2008-07-01
Grid resources are distributed among hundreds of independent Grid sites, requiring a higher-level Workload Management System (WMS) to be used efficiently. Pilot jobs have been used for this purpose by many communities, bringing increased reliability, global fair share, and just-in-time resource matching. glideinWMS is a WMS based on the Condor glidein concept, i.e., a regular Condor pool in which the Condor daemons (startds) are started by pilot jobs, and real jobs are vanilla, standard, or MPI universe jobs. The glideinWMS is composed of a set of Glidein Factories, handling the submission of pilot jobs to a set of Grid sites, and a set of VO Frontends, requesting pilot submission based on the status of user jobs. This paper presents a structural overview of glideinWMS as well as a detailed description of the current implementation and the current scalability limits.
Gene Selection and Cancer Classification: A Rough Sets Based Approach
NASA Astrophysics Data System (ADS)
Sun, Lijun; Miao, Duoqian; Zhang, Hongyun
Identification of informative gene subsets responsible for discerning between available samples of gene expression data is an important task in bioinformatics. Reducts, from rough set theory, corresponding to minimal sets of essential genes for discerning samples, are an efficient tool for gene selection. Due to the computational complexity of existing reduct algorithms, feature ranking is usually used to narrow down the gene space as a first step, and top-ranked genes are selected. In this paper, we define a novel criterion for scoring genes based on the expression-level difference between classes and each gene's contribution to classification, and present an algorithm for generating all possible reducts from informative genes. The algorithm takes the whole attribute set into account and finds short reducts with a significant reduction in computational complexity. An exploration of this approach on benchmark gene expression data sets demonstrates that it is successful for selecting highly discriminative genes, and the classification accuracy is impressive.
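A hedged sketch of the two-stage idea (the scoring criterion, discretization, and greedy search below are plausible stand-ins, not the paper's exact algorithm): score genes by between-class expression difference, then greedily grow a reduct, i.e. a small gene set that discerns all sample pairs from different classes.

```python
# Sketch: rank genes, then greedily build a reduct over discretized expressions.
import numpy as np

def score_genes(X, y):
    """Absolute difference of class means, scaled by pooled std (assumed criterion)."""
    a, b = X[y == 0], X[y == 1]
    pooled = np.sqrt((a.var(axis=0) + b.var(axis=0)) / 2) + 1e-9
    return np.abs(a.mean(axis=0) - b.mean(axis=0)) / pooled

def greedy_reduct(X, y, candidates):
    """Add genes until every pair of samples from different classes is discerned."""
    Xd = (X > np.median(X, axis=0)).astype(int)   # simple binary discretization
    undiscerned = [(i, j) for i in range(len(y)) for j in range(len(y))
                   if i < j and y[i] != y[j]]
    reduct = []
    for g in candidates:                          # candidates sorted by score
        still = [(i, j) for i, j in undiscerned if Xd[i, g] == Xd[j, g]]
        if len(still) < len(undiscerned):
            reduct.append(int(g))
            undiscerned = still
        if not undiscerned:
            break
    return reduct

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 200)); y = rng.integers(0, 2, 40)
X[y == 1, :5] += 2.0                              # make 5 genes informative
top = np.argsort(score_genes(X, y))[::-1][:30]
print(greedy_reduct(X, y, top))
```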
Experience-based co-design in an adult psychological therapies service.
Cooper, Kate; Gillmore, Chris; Hogg, Lorna
2016-01-01
Experience-based co-design (EBCD) is a methodology for service improvement and development, which puts service-user voices at the heart of improving health services. The aim of this paper was to implement the EBCD methodology in a mental health setting, and to investigate the challenges which arise during this process. In order to achieve this, a modified version of the EBCD methodology was undertaken, which involved listening to the experiences of the people who work in and use the mental health setting and sharing these experiences with the people who could effect change within the service, through collaborative work between service-users, staff and managers. EBCD was implemented within the mental health setting and was well received by service-users, staff and stakeholders. A number of modifications were necessary in this setting, for example high levels of support available to participants. It was concluded that EBCD is a suitable methodology for service improvement in mental health settings.
Ma, Yue; Tuskan, Gerald A.
2018-01-01
The existence of complete genome sequences makes it important to develop different approaches for classification of large-scale data sets and to make extraction of biological insights easier. Here, we propose an approach for classification of complete proteomes/protein sets based on protein distributions on some basic attributes. We demonstrate the usefulness of this approach by determining protein distributions in terms of two attributes: protein length (L) and protein intrinsic disorder content (ID). The protein distributions based on L and ID are surveyed for representative proteome organisms and protein sets from the three domains of life. The two-dimensional maps (designated as fingerprints here) are then constructed from the protein distribution densities in the LD space defined by ln(L) and ID. The fingerprints for different organisms and protein sets are found to be distinct from each other, and they can therefore be used for comparative studies. As a test case, phylogenetic trees have been constructed based on the protein distribution densities in the fingerprints of proteomes of organisms without performing any protein sequence comparison and alignments. The phylogenetic trees generated are biologically meaningful, demonstrating that the protein distributions in the LD space may serve as unique phylogenetic signals of the organisms at the proteome level. PMID:29686995
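As a hedged sketch of how such a fingerprint might be computed (bin ranges, sample data, and the distance used are assumptions, not the authors' choices), a normalized 2-D density over ln(L) and ID can be built with a histogram:

```python
# Sketch: a proteome "fingerprint" as a normalized 2-D density in (ln L, ID).
# Lengths and disorder values below are synthetic stand-ins for real proteomes.
import numpy as np

def ld_fingerprint(lengths, disorder, bins=32):
    """2-D histogram density in the (ln L, ID) space, normalized to sum to 1."""
    H, _, _ = np.histogram2d(np.log(lengths), disorder,
                             bins=bins, range=[[3.0, 9.0], [0.0, 1.0]])
    return H / H.sum()

rng = np.random.default_rng(1)
proteome_a = ld_fingerprint(rng.lognormal(5.8, 0.5, 5000), rng.beta(2, 5, 5000))
proteome_b = ld_fingerprint(rng.lognormal(6.1, 0.4, 5000), rng.beta(2, 3, 5000))

# A simple distance between fingerprints could feed a tree-building step.
print(np.abs(proteome_a - proteome_b).sum() / 2)   # total variation distance
```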
An Architecture for Autonomous Rovers on Future Planetary Missions
NASA Astrophysics Data System (ADS)
Ocon, J.; Avilés, M.; Graziano, M.
2018-04-01
This paper proposes an architecture for autonomous planetary rovers. This architecture combines a set of characteristics required in this type of system: a high level of abstraction, reactive event-based activity execution, and autonomous navigation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardisty, M.; Gordon, L.; Agarwal, P.
2007-08-15
Quantitative assessment of metastatic disease in bone is often considered immeasurable and, as such, patients with skeletal metastases are often excluded from clinical trials. In order to effectively quantify the impact of metastatic tumor involvement in the spine, accurate segmentation of the vertebra is required. Manual segmentation can be accurate but involves extensive and time-consuming user interaction. Potential solutions to automating segmentation of metastatically involved vertebrae are demons deformable image registration and level set methods. The purpose of this study was to develop a semiautomated method to accurately segment tumor-bearing vertebrae using the aforementioned techniques. By maintaining the morphology of an atlas, the demons-level set composite algorithm was able to accurately differentiate between trans-cortical tumors and surrounding soft tissue of identical intensity. The algorithm successfully segmented both the vertebral body and trabecular centrum of tumor-involved and healthy vertebrae. This work validates our approach as equivalent in accuracy to an experienced user.
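A hedged sketch of the demons-registration half of such a pipeline (not the authors' implementation; file names and parameter values are hypothetical), using SimpleITK's demons filter to warp a vertebral atlas onto a patient CT before a level-set refinement stage:

```python
# Sketch: demons registration of a vertebral atlas to a patient CT.
import SimpleITK as sitk

fixed = sitk.ReadImage("patient_ct.nii", sitk.sitkFloat32)        # hypothetical files
moving = sitk.ReadImage("vertebra_atlas.nii", sitk.sitkFloat32)

demons = sitk.DemonsRegistrationFilter()
demons.SetNumberOfIterations(100)
demons.SetStandardDeviations(1.5)        # Gaussian smoothing of the displacement field
field = demons.Execute(fixed, moving)

transform = sitk.DisplacementFieldTransform(sitk.Cast(field, sitk.sitkVectorFloat64))
warped_atlas = sitk.Resample(moving, fixed, transform, sitk.sitkLinear)
sitk.WriteImage(warped_atlas, "warped_atlas.nii")                 # input to level-set step
```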
Dissipative N-point-vortex Models in the Plane
NASA Astrophysics Data System (ADS)
Shashikanth, Banavara N.
2010-02-01
A method is presented for constructing point vortex models in the plane that dissipate the Hamiltonian function at any prescribed rate and yet conserve the level sets of the invariants of the Hamiltonian model arising from the SE(2) symmetries. The method is purely geometric in that it uses the level sets of the Hamiltonian and the invariants to construct the dissipative field and is based on elementary classical geometry in ℝ³. Extension to higher-dimensional spaces, such as the point vortex phase space, is done using exterior algebra. The method is in fact general enough to apply to any smooth finite-dimensional system with conserved quantities, and, for certain special cases, the dissipative vector field constructed can be associated with an appropriately defined double Nambu-Poisson bracket. The most interesting feature of this method is that it allows for an infinite sequence of such dissipative vector fields to be constructed by repeated application of a symmetric linear operator (matrix) at each point of the intersection of the level sets.
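As a hedged worked instance of this kind of construction (an illustration consistent with the abstract, not the paper's exact formulas): in ℝ³ with Hamiltonian $H$ and a single invariant $C$, the field

$$\dot{\mathbf{x}} = \nabla C \times \bigl(\nabla C \times \nabla H\bigr) = (\nabla C \cdot \nabla H)\,\nabla C - \lVert\nabla C\rVert^{2}\,\nabla H$$

satisfies $\dot{C} = \dot{\mathbf{x}} \cdot \nabla C = 0$, so the level sets of the invariant are preserved, while $\dot{H} = (\nabla C \cdot \nabla H)^{2} - \lVert\nabla C\rVert^{2}\,\lVert\nabla H\rVert^{2} \le 0$ by the Cauchy-Schwarz inequality, with equality only where the level sets of $C$ and $H$ are tangent; rescaling the field prescribes the dissipation rate.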
Reimann, Clemens; Banks, David
2004-10-01
Clean and healthy drinking water is important for life. Drinking water can be drawn from streams, lakes and rivers, directly collected (and stored) from rain, acquired by desalination of ocean water and melting of ice or it can be extracted from groundwater resources. Groundwater may reach the earth's surface in the form of springs or can be extracted via dug or drilled wells; it also contributes significantly to river baseflow. Different water quality issues have to be faced when utilising these different water resources. Some of these are at present largely neglected in water quality regulations. This paper focuses on the inorganic chemical quality of natural groundwater. Possible health effects, the problems of setting meaningful action levels or maximum admissible concentrations (MAC-values) for drinking water, and potential shortcomings in current legislation are discussed. An approach to setting action levels based on transparency, toxicological risk assessment, completeness, and identifiable responsibility is suggested.
Toward an Integrated Online Learning Environment
NASA Astrophysics Data System (ADS)
Teodorescu, Raluca E.; Pawl, Andrew; Rayyan, Saif; Barrantes, Analia; Pritchard, David E.
2010-10-01
We are building in LON-CAPA an integrated learning environment that will enable the development, dissemination and evaluation of PER-based material. This environment features a collection of multi-level research-based homework sets organized by topic and cognitive complexity. These sets are associated with learning modules that contain a very short exposition of the content, supplemented by integrated open-access videos, worked examples, simulations, and tutorials (some from ANDES). To assess students' performance accurately with respect to a system-wide standard, we plan to implement Item Response Theory. Together with other PER assessments and purposeful solicitation of student feedback, this will allow us to measure and improve the efficacy of various research-based materials, while gaining insights into teaching and learning.
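As a brief, hedged illustration (the abstract does not specify which IRT model the authors plan to use; the two-parameter logistic model shown here is a common choice): the probability that student $i$ answers item $j$ correctly is

$$P(X_{ij} = 1 \mid \theta_i) = \frac{1}{1 + e^{-a_j(\theta_i - b_j)}},$$

where $\theta_i$ is the student's ability, $b_j$ the item difficulty, and $a_j$ the item discrimination; calibrating $a_j$ and $b_j$ across the homework collection is what provides a system-wide standard for scoring performance.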
Geospatial analysis based on GIS integrated with LADAR.
Fetterman, Matt R; Freking, Robert; Fernandez-Cull, Christy; Hinkle, Christopher W; Myne, Anu; Relyea, Steven; Winslow, Jim
2013-10-07
In this work, we describe multi-layered analyses of a high-resolution broad-area LADAR data set in support of expeditionary activities. High-level features are extracted from the LADAR data, such as the presence and location of buildings and cars, and then these features are used to populate a GIS (geographic information system) tool. We also apply line-of-sight (LOS) analysis to develop a path-planning module. Finally, visualization is addressed and enhanced with a gesture-based control system that allows the user to navigate through the enhanced data set in a virtual immersive experience. This work has operational applications including military, security, disaster relief, and task-based robotic path planning.
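A hedged sketch of the line-of-sight (LOS) analysis such a path-planning module might perform over a LADAR-derived height grid (the sampling scheme, eye height, and test terrain are assumptions for illustration): sample the sight line between two points and check that terrain never rises above it.

```python
# Sketch: visibility test between two grid cells over a height map.
import numpy as np

def line_of_sight(height, p0, p1, eye_height=1.8, samples=200):
    """True if p1 is visible from p0 over the height grid (row, col indices)."""
    (r0, c0), (r1, c1) = p0, p1
    t = np.linspace(0.0, 1.0, samples)
    rows = np.clip((r0 + t * (r1 - r0)).astype(int), 0, height.shape[0] - 1)
    cols = np.clip((c0 + t * (c1 - c0)).astype(int), 0, height.shape[1] - 1)
    z0 = height[r0, c0] + eye_height
    z1 = height[r1, c1] + eye_height
    sight = z0 + t * (z1 - z0)              # straight line between the eye points
    return bool(np.all(height[rows, cols] <= sight + 1e-6))

terrain = np.zeros((100, 100)); terrain[50, 40:60] = 30.0    # a wall across the path
print(line_of_sight(terrain, (50, 10), (50, 90)))            # False: wall blocks view
```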
Large Terrain Modeling and Visualization for Planets
NASA Technical Reports Server (NTRS)
Myint, Steven; Jain, Abhinandan; Cameron, Jonathan; Lim, Christopher
2011-01-01
Physics-based simulations are actively used in the design, testing, and operations phases of surface and near-surface planetary space missions. One of the challenges in realtime simulations is the ability to handle large multi-resolution terrain data sets within models as well as for visualization. In this paper, we describe special techniques that we have developed for visualization, paging, and data storage for dealing with these large data sets. The visualization technique uses a real-time GPU-based continuous level-of-detail technique that delivers multiple frames per second even for planetary-scale terrain models.
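A hedged sketch of how a continuous level-of-detail scheme typically decides which terrain tiles to render (the error model, constants, and tile structure are generic assumptions, not JPL's implementation): refine a tile whenever its geometric error, projected to screen space, exceeds a pixel threshold.

```python
# Sketch: screen-space-error-driven LOD selection over a tile tree.
import math

def screen_space_error(geom_error, distance, viewport_px=1920, fov_rad=1.0):
    """Project a tile's geometric error (meters) to pixels at a given distance."""
    return geom_error * viewport_px / (2.0 * distance * math.tan(fov_rad / 2.0))

def select_lod(tile, cam, max_px=2.0, out=None):
    """Render a tile if its projected error is small enough, else recurse."""
    out = [] if out is None else out
    d = max(1e-3, math.dist(tile["center"], cam))
    if screen_space_error(tile["error"], d) <= max_px or not tile["children"]:
        out.append(tile["name"])
    else:
        for child in tile["children"]:
            select_lod(child, cam, max_px, out)
    return out

leaf = lambda n, c: {"name": n, "center": c, "error": 1.0, "children": []}
root = {"name": "root", "center": (0.0, 0.0), "error": 64.0,
        "children": [leaf("NW", (-50.0, 50.0)), leaf("SE", (50.0, -50.0))]}
print(select_lod(root, (0.0, 10.0)))    # near camera: refine into children
print(select_lod(root, (0.0, 1e6)))     # far away: the coarse root tile suffices
```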
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendell, Mark J.; Fisk, William J.
Background - The goal of this project, with a focus on commercial buildings in California, was to develop a new framework for evidence-based minimum ventilation rate (MVR) standards that protect occupants in buildings while also considering energy use and cost. This was motivated by research findings suggesting that current prescriptive MVRs in commercial buildings do not provide occupants with fully safe and satisfactory indoor environments. Methods - The project began with a broad review in several areas: the diverse strategies now used for standards or guidelines for MVRs or for environmental contaminant exposures, current knowledge about adverse human effects associated with VRs, and current knowledge about contaminants in commercial buildings, including their presence, their adverse human effects, and their relationships with VRs. Based on a synthesis of the reviewed information, new principles and approaches are proposed for setting evidence-based VR standards for commercial buildings, considering a range of human effects including health, performance, and acceptability of air. Results - A review and evaluation is first presented of current approaches to setting prescriptive building ventilation standards and to setting acceptable limits for human contaminant exposures in outdoor air and occupational settings. Recent research on approaches to setting acceptable levels of environmental exposures in evidence-based MVR standards is also described. From a synthesis and critique of these materials, a set of principles for setting MVRs is presented, along with an example approach based on these principles. The approach combines two sequential strategies. In a first step, an acceptable threshold is set for each adverse outcome that has a demonstrated relationship to VRs, expressed as an increase from a (low) outcome level at a high reference ventilation rate (RVR, the VR needed to attain the best achievable levels of the adverse outcome); the MVRs required to meet each specific outcome threshold are estimated; and the highest of these MVRs, which would then meet all outcome thresholds, is selected as the target MVR. In a second step, implemented only if the target MVR from step 1 is judged impractically high, costs and benefits are estimated and this information is used in a risk management process. Four human outcomes with substantial quantitative evidence of relationships to VRs are identified for initial consideration in setting MVR standards. These are: building-related symptoms (sometimes called sick building syndrome symptoms), poor perceived indoor air quality, and diminished work performance, all with data relating them directly to VRs; and cancer and non-cancer chronic outcomes, related indirectly to VRs through specific VR-influenced indoor contaminants. In an application of step 1 for offices using a set of example outcome thresholds, a target MVR of 9 L/s (19 cfm) per person was needed. Because this target MVR was close to MVRs in current standards, use of a cost/benefit process seemed unnecessary. Selection of more stringent thresholds for one or more human outcomes, however, could raise the target MVR to 14 L/s (30 cfm) per person or higher, triggering the step 2 risk management process. Consideration of outdoor air pollutant effects would add further complexity to the framework.
For balancing the objective and subjective factors involved in setting MVRs in a cost-benefit process, it is suggested that a diverse group of stakeholders make the determination after assembling as much quantitative data as possible.
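A hedged sketch of step 1 of the proposed framework (the dose-response curves and thresholds below are made-up placeholders, not the report's data): for each outcome, find the smallest VR that keeps it within its threshold, then take the highest of these as the target MVR.

```python
# Sketch: per-outcome MVRs and the overall target MVR (max over outcomes).
def mvr_for_outcome(outcome_vs_vr, threshold, vr_grid):
    """Smallest VR on the grid whose predicted outcome stays within threshold."""
    for vr in vr_grid:
        if outcome_vs_vr(vr) <= threshold:
            return vr
    return max(vr_grid)

vr_grid = [x / 2.0 for x in range(4, 61)]          # 2 to 30 L/s per person
outcomes = {                                        # placeholder curves, % excess
    "building-related symptoms": (lambda vr: 30.0 / vr, 4.0),
    "poor perceived air quality": (lambda vr: 50.0 / vr, 8.0),
    "work performance loss":      (lambda vr: 12.0 / vr, 1.5),
}
per_outcome = {name: mvr_for_outcome(f, thr, vr_grid)
               for name, (f, thr) in outcomes.items()}
target_mvr = max(per_outcome.values())             # must satisfy all thresholds
print(per_outcome, "->", target_mvr, "L/s per person")
```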
ERIC Educational Resources Information Center
Stuebing, Susan; And Others
This paper reviews an ongoing study on the physical settings of education with technology at the elementary and high school levels. The study, which is multi-disciplinary in nature, is based in sites in the process of change in teaching strategies, using learning technology as a catalyst for this change to take place. The focus of the study is on…
Degeneffe, Dennis; Reicks, Marla
2008-01-01
Objective: To identify a comprehensive set of distinct “need states” based on the eating occasions experienced by midlife women. Design: Series of 7 focus group interviews. Setting: Meeting room on a university campus. Participants: A convenience sample of 34 multi-ethnic women (mean age = 46 years). Phenomenon of Interest: Descriptions of eating occasions by “need states” - specific patterns of needs for the occasion. Analysis: Interviews were audio-taped, transcribed verbatim, and analyzed for common themes using qualitative data analysis procedures. Findings: Eight need states suggested a hypothetical framework reflecting a wide range in emotional gratification. Need states with a low level of emotional gratification were dominated by sets of functional needs such as coping with stress, balancing intake across occasions, meeting external demands of time and effort, and maintaining a routine. Food was a means for reinforcing family identity, social expression, and celebration in need states with high levels of emotional gratification. Occurrence of need states varied by day and meal/snack occasion, with food type/amount dependent on need state. Conclusions and Implications: Eating occasions are driven by specific sets of needs ranging from physical/functional to more emotional/social needs. Addressing need states may improve weight intervention programs for midlife women. PMID:18984495
Modelling wildland fire propagation by tracking random fronts
NASA Astrophysics Data System (ADS)
Pagnini, G.; Mentrelli, A.
2014-08-01
Wildland fire propagation is studied in the literature by two alternative approaches, namely the reaction-diffusion equation and the level-set method. These two approaches are considered alternatives to each other because the solution of the reaction-diffusion equation is generally a continuous smooth function with exponential decay that is not zero anywhere in an infinite domain, while the level-set method, a front-tracking technique, generates a sharp function that is non-zero only inside a compact domain. However, these two approaches can indeed be considered complementary and reconciled. Turbulent hot-air transport and fire spotting are phenomena with a random nature and they are extremely important in wildland fire propagation. Consequently, the fire front acquires a random character too; hence, a tracking method for random fronts is needed. In particular, the level-set contour is randomised here according to the probability density function of the interface particle displacement. Indeed, when the level-set method is developed for tracking a front interface with a random motion, the resulting averaged process emerges to be governed by an evolution equation of the reaction-diffusion type. In this reconciled approach, the rate of spread of the fire keeps the same key and characterising role that is typical of the level-set approach. The resulting model emerges to be suitable for simulating effects due to turbulent convection, such as fire flank and backing fire, faster fire spread due to hot-air pre-heating and ember landing, and the fire overcoming a fire-break zone, which is a case not resolved by models based on the level-set method. Moreover, the proposed formulation yields a correction to the formula for the rate of spread, due to the mean jump length of firebrands in the downwind direction for the leeward sector of the fireline contour. The presented study constitutes a proof of concept and needs to be subjected to future validation.
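One way to make the randomization step concrete (a sketch consistent with the abstract, not necessarily the authors' exact notation): let $\phi(\bar{\mathbf{x}}, t)$ be the sharp indicator of the burned region produced by the level-set method and $p(\mathbf{x}; t \mid \bar{\mathbf{x}})$ the probability density function of the interface-particle displacement. The effective, randomized front indicator is the average

$$\phi_e(\mathbf{x}, t) = \int_{\mathbb{R}^2} \phi(\bar{\mathbf{x}}, t)\, p(\mathbf{x}; t \mid \bar{\mathbf{x}})\, d\bar{\mathbf{x}},$$

and a point is taken to ignite once $\phi_e$ exceeds a chosen threshold; when $p$ is, for example, a Gaussian generated by turbulent diffusion, the evolution of $\phi_e$ is of reaction-diffusion type, which is the reconciliation the abstract describes.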
Metrics for Performance Evaluation of Patient Exercises during Physical Therapy.
Vakanski, Aleksandar; Ferguson, Jake M; Lee, Stephen
2017-06-01
The article proposes a set of metrics for evaluating patient performance in physical therapy exercises. A taxonomy is employed that classifies the metrics into quantitative and qualitative categories, based on the level of abstraction of the captured motion sequences. Further, the quantitative metrics are classified into model-less and model-based metrics, according to whether the evaluation employs the raw measurements of patient-performed motions or is based on a mathematical model of the motions. The reviewed metrics include root-mean-square distance, Kullback-Leibler divergence, log-likelihood, heuristic consistency, Fugl-Meyer Assessment, and similar. The metrics are evaluated for a set of five human motions captured with a Kinect sensor. The metrics can potentially be integrated into a system that employs machine learning for modelling and assessing the consistency of patient performance in a home-based therapy setting. Automated performance evaluation can overcome the inherent subjectivity of human-performed therapy assessment, increase adherence to prescribed therapy plans, and reduce healthcare costs.
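A hedged sketch of two of the reviewed model-less metrics (the data, bin count, and smoothing constant are assumptions for illustration), computed between a patient's motion sequence and a reference sequence of, say, joint angles from a Kinect-like sensor:

```python
# Sketch: RMS distance and KL divergence between two motion sequences.
import numpy as np

def rms_distance(patient, reference):
    """Root-mean-square distance between two equal-length motion sequences."""
    return float(np.sqrt(np.mean((patient - reference) ** 2)))

def kl_divergence(patient, reference, bins=20):
    """KL divergence between histograms of the two sequences' values."""
    lo = min(patient.min(), reference.min()); hi = max(patient.max(), reference.max())
    p, _ = np.histogram(patient, bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(reference, bins=bins, range=(lo, hi), density=True)
    p = p + 1e-9; q = q + 1e-9                 # avoid log(0)
    p /= p.sum(); q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(2)
ref = np.sin(np.linspace(0, 2 * np.pi, 100))   # idealized exercise repetition
pat = ref + rng.normal(0, 0.15, 100)           # noisier patient attempt
print(rms_distance(pat, ref), kl_divergence(pat, ref))
```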
Stochastic evolutionary dynamics in minimum-effort coordination games
NASA Astrophysics Data System (ADS)
Li, Kun; Cong, Rui; Wang, Long
2016-08-01
The minimum-effort coordination game has recently drawn more attention because human behavior in this social dilemma is often inconsistent with the predictions of classical game theory. Here, we combine evolutionary game theory and coalescence theory to investigate this game in finite populations. Both analytic results and individual-based simulations show that effort costs play a key role in the evolution of contribution levels, in good agreement with experimental observations. Besides well-mixed populations, set-structured populations have also been taken into consideration. Therein we find that a large number of sets and a moderate migration rate greatly promote effort levels, especially for high effort costs.
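For readers unfamiliar with the game, a hedged sketch of the payoff structure commonly used in this literature (the benefit and cost constants are illustrative; the paper's parameterization may differ): each player earns a benefit proportional to the group's minimum effort, minus the cost of her own effort, so high-effort players pay for coordination failures.

```python
# Sketch: minimum-effort coordination game payoff.
def payoff(own_effort, group_efforts, benefit=2.0, cost=1.0):
    """Benefit scales with the group's minimum effort; cost with one's own."""
    return benefit * min(group_efforts) - cost * own_effort

group = [3, 5, 7]                  # effort levels of three players
for e in group:
    print(e, payoff(e, group))     # the highest-effort player earns the least
```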
Malleable architecture generator for FPGA computing
NASA Astrophysics Data System (ADS)
Gokhale, Maya; Kaba, James; Marks, Aaron; Kim, Jang
1996-10-01
The malleable architecture generator (MARGE) is a tool set that translates high-level parallel C to configuration bit streams for field-programmable logic based computing systems. MARGE creates an application-specific instruction set and generates the custom hardware components required to perform exactly those computations specified by the C program. In contrast to traditional fixed-instruction processors, MARGE's dynamic instruction set creation provides for efficient use of hardware resources. MARGE processes intermediate code in which each operation is annotated by the bit lengths of the operands. Each basic block (sequence of straight line code) is mapped into a single custom instruction which contains all the operations and logic inherent in the block. A synthesis phase maps the operations comprising the instructions into register transfer level structural components and control logic which have been optimized to exploit functional parallelism and function unit reuse. As a final stage, commercial technology-specific tools are used to generate configuration bit streams for the desired target hardware. Technology-specific pre-placed, pre-routed macro blocks are utilized to implement as much of the hardware as possible. MARGE currently supports the Xilinx-based Splash-2 reconfigurable accelerator and National Semiconductor's CLAy-based parallel accelerator, MAPA. The MARGE approach has been demonstrated on systolic applications such as DNA sequence comparison.
Work-based learning in health care environments.
Spouse, J
2001-03-01
In reviewing contemporary literature and theories about work-based learning, this paper explores recent trends promoting life-long learning. In the process the paper reviews and discusses some implications of implementing recent policies and fostering learning in health care practice settings. Recent Government policies designed to provide quality health care services and to improve staffing levels in the nursing workforce have emphasized the importance of life-long learning whilst learning-on-the-job and the need to recognize and credit experiential learning. Such calls include negotiation of personal development plans tailored to individual educational need and context-sensitive learning activities. To be implemented effectively, this policy cannot be seen as a cheap option but requires considerable financial resourcing for preparation of staff and the conduct of such activities. Successful work-based learning requires investment in staff at all levels as well as changes to staffing structures in organizations and trusts; changes designed to free people up to work and learn collaboratively. Creating an organizational environment where learning is prized depends upon a climate of trust; a climate where investigation and speculation are fostered and where time is protected for engaging in discussions about practice. Such a change may be radical for many health care organizations and may require a review of current policies and practices ensuring that they include education at all levels. The nature of such education also requires reconceptualizing. In the past, learning in practice settings was seen as formal lecturing or demonstration, and relied upon behaviourist principles of learning. Contemporary thinking suggests effective learning in work-settings is multi-faceted and draws on previously acquired formal knowledge, contextualizes it and moulds it according to situations at hand. Thinking about work-based learning in this way raises questions about how such learning can be supported and facilitated.
Effective Multi-Query Expansions: Collaborative Deep Networks for Robust Landmark Retrieval.
Wang, Yang; Lin, Xuemin; Wu, Lin; Zhang, Wenjie
2017-03-01
Given a query photo issued by a user (q-user), landmark retrieval is to return a set of photos whose landmarks are similar to those of the query, while existing studies on landmark retrieval focus on exploiting the geometries of landmarks for similarity matches between candidate photos and a query photo. We observe that the same landmarks provided by different users over a social media community may convey different geometry information depending on the viewpoints and/or angles, and may, subsequently, yield very different results. In fact, dealing with landmarks with low-quality shapes caused by the photography of q-users is often nontrivial and has seldom been studied. In this paper, we propose a novel framework, namely multi-query expansions, to retrieve semantically robust landmarks in two steps. First, we identify the top-k photos regarding the latent topics of a query landmark to construct a multi-query set, so as to remedy its possibly low-quality shape. For this purpose, we significantly extend the techniques of Latent Dirichlet Allocation. Then, motivated by typical collaborative filtering methods, we propose to learn collaborative deep-network-based, nonlinear, high-level semantic features over the latent factors for landmark photos, using a training set formed by matrix factorization over the collaborative user-photo matrix regarding the multi-query set. The learned deep network is further applied to generate the features for all the other photos, meanwhile yielding a compact multi-query set within this feature space. The final ranking scores are then calculated in the high-level feature space between the multi-query set and all other photos, which are ranked to serve as the final ranking list of landmark retrieval. Extensive experiments are conducted on real-world social media data with both landmark photos and their user information, showing superior performance over existing methods, especially our recently proposed multi-query-based mid-level pattern representation method [1].
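A hedged sketch of the final ranking step only (the aggregation rule and the random feature vectors standing in for deep-network outputs are assumptions, not the paper's method): score every candidate photo against the multi-query set in the learned feature space and average.

```python
# Sketch: rank photos by mean cosine similarity to a multi-query set.
import numpy as np

def rank_photos(multi_query_feats, photo_feats):
    """Rank photos by average cosine similarity to all expanded queries."""
    q = multi_query_feats / np.linalg.norm(multi_query_feats, axis=1, keepdims=True)
    p = photo_feats / np.linalg.norm(photo_feats, axis=1, keepdims=True)
    scores = (p @ q.T).mean(axis=1)          # average over the query set
    return np.argsort(scores)[::-1], scores

rng = np.random.default_rng(3)
queries = rng.normal(size=(5, 128))          # top-k expanded query photos
photos = rng.normal(size=(1000, 128))        # candidate landmark photos
order, scores = rank_photos(queries, photos)
print(order[:10])
```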
Influence of chewing rate on salivary stress hormone levels.
Tasaka, Akinori; Tahara, Yasuaki; Sugiyama, Tetsuya; Sakurai, Kaoru
2008-10-01
The purpose of this study was to clarify the effect of different chewing rates on salivary cortisol levels as a stress indicator. The subject group consisted of 16 healthy males. They were required to rest for 30 min, and then given arithmetic calculations to perform for 30 min as stress loading. Immediately after, the first set of saliva specimens (S1) was collected over a period of 1 min to measure cortisol levels. Next, they were asked to chew a tasteless gum base for 10 min, and the second set of saliva specimens (S2) was collected in the same manner. They were then required to rest for 10 min, after which the third set of saliva specimens (S3) was collected. Chewing rates were set to slow, habitual, and fast in time with a metronome. Salivary cortisol levels were analyzed by radioimmunoassay. Changes in salivary cortisol levels comparing S1 with S2, and S1 with S3 were determined. Changes in salivary cortisol levels between S1 and S2 showed a reduction of 4.7%, 14.6%, and 16.2% with slow, habitual, and fast chewing, respectively. A significant difference was observed between slow and fast chewing. Changes in salivary cortisol levels between S1 and S3 showed a reduction of 14.4%, 22.2%, and 25.8% with slow, habitual, and fast chewing, respectively. A significant difference was observed between slow and fast chewing. This study showed that differences in chewing rate affected salivary cortisol levels as a stress indicator, and suggested that the effect on stress release with fast chewing is greater than that with slow chewing.
Reconstruction of fluorescence molecular tomography with a cosinoidal level set method.
Zhang, Xuanxuan; Cao, Xu; Zhu, Shouping
2017-06-27
The implicit shape-based reconstruction method in fluorescence molecular tomography (FMT) is capable of achieving higher image clarity than image-based reconstruction. However, the implicit shape method suffers from a low convergence speed and performs unstably due to the use of gradient-based optimization methods. Moreover, the implicit shape method requires a priori information about the number of targets. A shape-based reconstruction scheme for FMT with a cosinoidal level set method is proposed in this paper. The Heaviside function in the classical implicit shape method is replaced with a cosine function, and the reconstruction can then be accomplished with the Levenberg-Marquardt method rather than gradient-based methods. As a result, a priori information about the number of targets is no longer required and the choice of step length is avoided. Numerical simulations and phantom experiments were carried out to validate the proposed method. Results of the proposed method show higher contrast-to-noise ratios and Pearson correlations than the implicit shape method and image-based reconstruction. Moreover, the number of iterations required by the proposed method is much smaller than for the implicit shape method. The proposed method performs more stably, provides a faster convergence speed than the implicit shape method, and achieves higher image clarity than image-based reconstruction.
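A loosely hedged sketch of the general idea, on the assumption (which may differ from the paper's exact parameterization) that the sharp Heaviside mapping from level-set field to indicator is replaced by a cosine, so the field is unconstrained, smooth, and differentiable everywhere, as Levenberg-Marquardt requires:

```python
# Sketch: a cosine-based smooth replacement for the Heaviside mapping.
import numpy as np

def cosinoidal_indicator(phi):
    """Map an unconstrained level-set field smoothly into [0, 1]."""
    return 0.5 * (1.0 - np.cos(phi))

def d_indicator_dphi(phi):
    """Analytic derivative, available everywhere, for assembling the LM Jacobian."""
    return 0.5 * np.sin(phi)

phi = np.linspace(-np.pi, np.pi, 9)
print(cosinoidal_indicator(phi))
```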
Judd, Belinda Karyn; Alison, Jennifer Ailsey; Waters, Donna; Gordon, Christopher James
2016-08-01
Simulation-based clinical education often aims to replicate varying aspects of real clinical practice. It is unknown whether learners' stress levels in simulation are comparable with those in clinical practice. The current study compared acute stress markers during simulation-based clinical education with those experienced in situ in a hospital-based environment. Undergraduate physiotherapy students' (n = 33) acute stress responses [visual analog scales of stress and anxiety, continuous heart rate (HR), and saliva cortisol] were assessed during matched patient encounters in simulation-based laboratories using standardized patients and during hospital clinical placements with real patients. Group differences in stress variables were compared using repeated measures analysis of variance for 3 time points (before, during the patient encounter, and after) at 2 settings (simulation and hospital). Visual analog scale stress and anxiety as well as HR increased significantly from baseline levels before the encounter in both settings (all P < 0.05). Stress and anxiety were significantly higher in simulation [mean (SD), 45 (22) and 44 (25) mm; P = 0.003] compared with hospital [mean (SD), 31 (21) and 26 (20) mm; P = 0.002]. The mean (SD) HR during the simulation patient encounter was 90 (16) beats per minute and was not different compared with hospital [mean (SD), 87 (15) beats per minute; P = 0.89]. Changes in salivary cortisol before and after patient encounters were not statistically different between settings [mean (SD) simulation, 1.5 (2.4) nmol/L; hospital, 2.5 (2.9) nmol/L; P = 0.70]. Participants experienced stress on clinical placements, irrespective of the clinical education setting (simulation vs. hospital). This study revealed that psychological stress and anxiety were greater during simulation compared with hospital settings; however, physiological stress responses (HR and cortisol) were comparable. These results indicate that psychological stress may be heightened in simulation, and health professional educators need to consider the impact of this on learners in simulation-based clinical education. New learners in their clinical education program may benefit from a less stressful simulation environment, before a gradual increase in stress demands as they approach clinical practice.
Fuligni, Allison Sidle; Howes, Carollee; Lara-Cinisomo, Sandraluz; Karoly, Lynn
2009-01-01
This paper presents a naturalistic investigation of the patterns of formal education, early childhood education training, and mentoring of a diverse group of urban early childhood educators participating in the Los Angeles: Exploring Children's Early Learning Settings (LA ExCELS) study. A total of 103 preschool teachers and family child care providers serving primarily low-income 3- and 4-year-old children in Los Angeles County provided data on their education, training, and beliefs about teaching. This sample worked in public center-based preschool programs including Head Start classrooms and State preschool classrooms (N=42), private non-profit preschools including community-based organizations and faith-based preschools (N=42), and licensed family child care homes (N=19). This study uses a person-centered approach to explore patterns of teacher preparation, sources of support, supervision, and mentoring across these 3 types of education settings, and how these patterns are associated with early childhood educators' beliefs and practices. Findings suggest a set of linkages between type of early education setting, professional development, and supervision of teaching. Public preschools have the strongest mandates for formal professional development and typically less variation in levels of monitoring, whereas family child care providers on average have less formal education and more variability in their access to and use of other forms of training and mentorship. Four distinct patterns of formal education, child development training, and ongoing mentoring or support were identified among the educators in this study. Associations between professional development experiences and teachers' beliefs and practices suggested the importance of higher levels of formal training for enhancing the quality of teacher-child interactions. Implications of the findings for changing teacher behaviors are discussed with respect to considering the setting context. PMID:20072719
Priority setting and evidence based purchasing.
Frith, L
1999-01-01
The purpose of this paper is to consider the role that values play in priority setting through the use of EBP. It is important to be clear about the role of values at all levels of the decision making process. At one level, society as a whole has to make decisions about the kind of health provision that it wants. As is generally accepted, these priority setting questions cannot be answered by medical science alone but involve important judgements of value. However, as I hope to show, values enter priority setting questions at another level, one not often explicitly recognised in much of the literature: that of the very definition of the effectiveness of treatments. This has important consequences for patient care. If we do not recognise that the effectiveness of a treatment involves subjective elements--a patient's own assessment of the value of the treatment--then this could lead to the belief that we can purchase one treatment that is the most effective for all patients. This might result in a detrimental reduction in the range of options that a patient is given, with some patients not receiving the treatment that is most effective for them.
2011-01-01
Background Logistic random effects models are a popular tool to analyze multilevel (also called hierarchical) data with a binary or ordinal outcome. Here, we aim to compare different statistical software implementations of these models. Methods We used individual patient data from 8509 patients in 231 centers with moderate and severe Traumatic Brain Injury (TBI) enrolled in eight Randomized Controlled Trials (RCTs) and three observational studies. We fitted logistic random effects regression models with the 5-point Glasgow Outcome Scale (GOS) as outcome, both dichotomized and ordinal, with center and/or trial as random effects, and age, motor score, pupil reactivity or trial as covariates. We then compared the implementations of frequentist and Bayesian methods to estimate the fixed and random effects. Frequentist approaches included R (lme4), Stata (GLLAMM), SAS (GLIMMIX and NLMIXED), MLwiN ([R]IGLS) and MIXOR; Bayesian approaches included WinBUGS, MLwiN (MCMC), the R package MCMCglmm and the SAS experimental procedure MCMC. Three data sets (the full data set and two sub-datasets) were analysed using essentially two logistic random effects models, with either one random effect for the center or two random effects for center and trial. For the ordinal outcome in the full data set, a proportional odds model with a random center effect was also fitted. Results The packages gave similar parameter estimates for both the fixed and random effects and for the binary (and ordinal) models for the main study, and when based on a relatively large number of level-1 (patient level) data compared to the number of level-2 (hospital level) data. However, when based on a relatively sparse data set, i.e., when the numbers of level-1 and level-2 data units were about the same, the frequentist and Bayesian approaches showed somewhat different results. The software implementations differ considerably in flexibility, computation time, and usability. There are also differences in the availability of additional tools for model evaluation, such as diagnostic plots. The experimental SAS (version 9.2) procedure MCMC appeared to be inefficient. Conclusions On relatively large data sets, the different software implementations of logistic random effects regression models produced similar results. Thus, for a large data set there seems to be no explicit preference (of course if there is no preference from a philosophical point of view) for either a frequentist or Bayesian approach (if based on vague priors). The choice for a particular implementation may largely depend on the desired flexibility and the usability of the package. For small data sets the random effects variances are difficult to estimate. In the frequentist approaches, the MLE of this variance was often estimated as zero, with a standard error that is either zero or could not be determined, while for Bayesian methods the estimates could depend on the chosen "non-informative" prior of the variance parameter. The starting value for the variance parameter may also be critical for the convergence of the Markov chain. PMID:21605357
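A hedged sketch of the model being compared, to make the two variance levels explicit (the parameter values are arbitrary): a random-intercept logistic regression with logit P(y_ij = 1) = beta' x_ij + u_j, where u_j ~ N(0, sigma^2) is the effect of center j.

```python
# Sketch: simulate from a random-intercept logistic model (center = level 2).
import numpy as np

rng = np.random.default_rng(4)
n_centers, n_per = 231, 37                        # roughly echoing the study's scale
sigma_center = 0.6                                # between-center SD (assumed)
beta = np.array([-0.5, 0.8])                      # intercept, one covariate effect

u = rng.normal(0.0, sigma_center, n_centers)      # level-2 (center) random effects
all_y = []
for j in range(n_centers):
    x = rng.normal(size=n_per)                    # level-1 (patient) covariate
    eta = beta[0] + beta[1] * x + u[j]
    y = (rng.random(n_per) < 1.0 / (1.0 + np.exp(-eta))).astype(int)
    all_y.append(y)

print("simulated outcome rate:", np.mean(np.concatenate(all_y)))
```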
Michael E. Goerndt; Vincente J. Monleon; Hailemariam. Temesgen
2010-01-01
Three sets of linear models were developed to predict several forest attributes, using stand-level and single-tree remote sensing (STRS) light detection and ranging (LiDAR) metrics as predictor variables. The first used only area-level metrics (ALM) associated with first-return height distribution, percentage of cover, and canopy transparency. The second alternative...
ERIC Educational Resources Information Center
Minh, Tran Kiem; Lagrange, Jean-Baptiste
2016-01-01
This paper aims at contributing to remedy the narrow treatment of functions at upper secondary level. Assuming that students make sense of functions by working on functional situations in distinctive settings, we propose to consider functional working spaces inspired by geometrical working spaces. We analyse a classroom situation based on a…
ERIC Educational Resources Information Center
Utah System of Higher Education, 2014
2014-01-01
Utah has set the goal of having 66% of its 25-35 age population with a post-secondary degree or certificate by 2020. To achieve this goal, Utah must increase the number of degrees and certificates awarded annually, to a level 25% above the 2010-11 base year level. This requires a continuing focused effort in creating a highly-educated workforce. A…
Limestone and Silica Powder Replacements for Cement: Early-Age Performance.
Bentz, Dale P; Ferraris, Chiara F; Jones, Scott Z; Lootens, Didier; Zunino, Franco
2017-04-01
Developing functional concrete mixtures with less ordinary portland cement (OPC) has been one of the key objectives of the 21st-century sustainability movement. While the supplies of many alternatives to OPC (such as fly ash or slag) may be limited, those of limestone and silica powders produced by crushing rocks seem virtually endless. The present study examines the chemical and physical influences of these powders on the rheology, hydration, and setting of cement-based materials via experiments and three-dimensional microstructural modeling. It is shown that both limestone and silica particle surfaces are active templates (sites) for the nucleation and growth of cement hydration products, while the limestone itself is also somewhat soluble, leading to the formation of carboaluminate hydration products. Because the filler particles are incorporated as active members of the percolated backbone that constitutes initial setting of a cement-based system, replacements of up to 50 % of the OPC by either of these powders on a volumetric basis have minimal impact on the initial setting time, and even a paste with only 5 % OPC and 95 % limestone powder by volume achieves initial set within 24 h. While their influence on setting is similar, the limestone and silica powders produce pastes with quite different rheological properties, when substituted at the same volume level. When proceeding from setting to later age strength development, one must also consider the dilution of the system due to cement removal, along with the solubility/reactivity of the filler. However, for applications where controlled (prompt) setting is more critical than developing high strengths, such as mortar tile adhesives, grouts, and renderings, significant levels of these powder replacements for cement can serve as sustainable, functional alternatives to the oft-employed 100 % OPC products.
Latendresse, Shawn J.; Rose, Richard J.; Viken, Richard J.; Pulkkinen, Lea; Kaprio, Jaakko; Dick, Danielle M.
2013-01-01
Among adolescents, many parenting practices have been associated with the initiation and development of drinking behaviors. However, recent studies suggest discrepancies in parents’ and adolescents’ perceptions of parenting and their links with adolescent use. In this study, we derive two independent sets of underlying parenting profiles (based on parent and adolescent reported behaviors at age 11–12 years), which were then examined in relation to adolescents’ drinking behaviors at ages 14 and 17½. Results indicated that the two sets of profiles accounted for little shared variance, with those based on adolescents’ reports being stronger predictors of adolescent drinking. Moreover, comparisons of drinking levels across profiles pointed to multiple parenting strategies that may effectively reduce adolescent alcohol experimentation, including simply sustaining a moderate level of awareness of adolescents’ whereabouts and activities, and avoiding excessive conflict and strictness. PMID:19283601
Bergeest, Jan-Philip; Rohr, Karl
2012-10-01
In high-throughput applications, accurate and efficient segmentation of cells in fluorescence microscopy images is of central importance for the quantification of protein expression and the understanding of cell function. We propose an approach for segmenting cell nuclei which is based on active contours using level sets and convex energy functionals. Compared to previous work, our approach determines the global solution. Thus, the approach does not suffer from local minima and the segmentation result does not depend on the initialization. We consider three different well-known energy functionals for active contour-based segmentation and introduce convex formulations of these functionals. We also suggest a numeric approach for efficiently computing the solution. The performance of our approach has been evaluated using fluorescence microscopy images from different experiments comprising different cell types. We have also performed a quantitative comparison with previous segmentation approaches.
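A hedged sketch in the spirit of such convex formulations (a Chan-Esedoglu-Nikolova-style relaxation; the smoothing constant, step size, and projected-gradient scheme are assumptions, not the authors' numerics): minimize a smoothed total-variation term plus a fixed region term over u in [0,1], then threshold; the global optimum does not depend on the initialization.

```python
# Sketch: convex relaxed two-phase segmentation by projected gradient descent.
import numpy as np

def segment_convex(f, c1, c2, lam=0.5, eps=1e-3, step=0.1, iters=300):
    u = np.full_like(f, 0.5)
    region = (f - c1) ** 2 - (f - c2) ** 2       # fixed region term (convex in u)
    for _ in range(iters):
        gx = np.gradient(u, axis=1); gy = np.gradient(u, axis=0)
        mag = np.sqrt(gx ** 2 + gy ** 2 + eps)   # smoothed TV magnitude
        div = np.gradient(gx / mag, axis=1) + np.gradient(gy / mag, axis=0)
        u = np.clip(u - step * (lam * region - div), 0.0, 1.0)  # project to [0,1]
    return u > 0.5                               # threshold gives the segmentation

img = np.zeros((64, 64)); img[20:44, 20:44] = 1.0
img += np.random.default_rng(5).normal(0, 0.2, img.shape)      # noisy "nucleus"
mask = segment_convex(img, c1=1.0, c2=0.0)
print(mask.sum(), "foreground pixels")
```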
Ensemble-based evaluation of extreme water levels for the eastern Baltic Sea
NASA Astrophysics Data System (ADS)
Eelsalu, Maris; Soomere, Tarmo
2016-04-01
The risks and damages associated with coastal flooding, which naturally increase with the magnitude of extreme storm surges, are one of the largest concerns of countries with extensive low-lying nearshore areas. The relevant risks are even more pronounced for semi-enclosed water bodies such as the Baltic Sea, where subtidal (weekly-scale) variations in the water volume of the sea substantially contribute to the water level and lead to a large spread in projections of future extreme water levels. We explore the options for using large ensembles of projections to more reliably evaluate return periods of extreme water levels. Single projections of the ensemble are constructed by fitting several sets of block maxima with various extreme value distributions. The ensemble is based on two simulated data sets produced at the Swedish Meteorological and Hydrological Institute: a hindcast by the Rossby Centre Ocean model sampled with a resolution of 6 h, and a similar hindcast by the circulation model NEMO with a resolution of 1 h. As the annual maxima of water levels in the Baltic Sea are not always uncorrelated, we employ maxima for calendar years and for stormy seasons. As the shape parameter of the Generalised Extreme Value distribution changes its sign and varies substantially in magnitude along the eastern coast of the Baltic Sea, the use of a single distribution for the entire coast is inappropriate. The ensemble involves projections based on the Generalised Extreme Value, Gumbel and Weibull distributions. The parameters of these distributions are evaluated in three different ways: the maximum likelihood method and the method of moments based on both biased and unbiased estimates. The total number of projections in the ensemble is 40. As some of the resulting estimates contain limited additional information, the members of pairs of projections that are highly correlated are assigned weights of 0.6. A comparison of the ensemble-based projection of extreme water levels and their return periods with similar estimates derived from local observations reveals an interesting pattern of match and mismatch. The match is almost perfect in measurement sites where local effects (e.g., wave-induced set-up or local surge in very shallow areas that are not resolved by circulation models) do not contribute to the observed values of water level. There is, however, substantial mismatch between projected and observed extreme values for most of the Estonian coast. The mismatch is largest for sections that are open to high waves and for several bays that are deeply cut into the mainland but open to predominant strong wind directions. Detailed quantification of this mismatch eventually makes it possible to develop substantially improved estimates of extreme water levels in sections where local effects contribute considerably to the total water level.
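An illustrative sketch of a single ensemble member (synthetic data; note that scipy's `genextreme` shape parameter c corresponds to minus the usual GEV shape): fit a GEV distribution to annual maxima by maximum likelihood and read off a return level.

```python
# Sketch: GEV fit to annual maxima and a 100-year return level.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(6)
annual_maxima = genextreme.rvs(c=-0.1, loc=100.0, scale=20.0,
                               size=45, random_state=rng)   # ~45 "years", cm

c, loc, scale = genextreme.fit(annual_maxima)                # maximum likelihood
return_period = 100.0                                        # years
level_100yr = genextreme.ppf(1.0 - 1.0 / return_period, c, loc=loc, scale=scale)
print(f"estimated 100-year water level: {level_100yr:.0f} cm")
```

Repeating this with Gumbel and Weibull fits and with method-of-moments estimates, for both calendar-year and stormy-season maxima, is what builds up an ensemble of the kind described.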
Lai, Yinglei; Zhang, Fanni; Nayak, Tapan K; Modarres, Reza; Lee, Norman H; McCaffrey, Timothy A
2014-01-01
Gene set enrichment analysis (GSEA) is an important approach to the analysis of coordinate expression changes at a pathway level. Although many statistical and computational methods have been proposed for GSEA, the issue of a concordant integrative GSEA of multiple expression data sets has not been well addressed. Among different related data sets collected for the same or similar study purposes, it is important to identify pathways or gene sets with concordant enrichment. We categorize the underlying true states of differential expression into three representative categories: no change, positive change and negative change. Due to data noise, what we observe from experiments may not indicate the underlying truth. Although these categories are not observed in practice, they can be considered in a mixture model framework. Then, we define the mathematical concept of concordant gene set enrichment and calculate its related probability based on a three-component multivariate normal mixture model. The related false discovery rate can be calculated and used to rank different gene sets. We used three published lung cancer microarray gene expression data sets to illustrate our proposed method. One analysis based on the first two data sets was conducted to compare our result with a previous published result based on a GSEA conducted separately for each individual data set. This comparison illustrates the advantage of our proposed concordant integrative gene set enrichment analysis. Then, with a relatively new and larger pathway collection, we used our method to conduct an integrative analysis of the first two data sets and also all three data sets. Both results showed that many gene sets could be identified with low false discovery rates. A consistency between both results was also observed. A further exploration based on the KEGG cancer pathway collection showed that a majority of these pathways could be identified by our proposed method. This study illustrates that we can improve detection power and discovery consistency through a concordant integrative analysis of multiple large-scale two-sample gene expression data sets.
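A hedged, simplified sketch of the mixture idea (the real method models three labeled states; this toy version fits an unconstrained Gaussian mixture to synthetic concordant z-scores and treats both change components as concordant): model per-gene-set enrichment scores from two studies jointly and rank sets by one minus the posterior probability of the "no change" component.

```python
# Sketch: three-component bivariate mixture over (z_study1, z_study2) per gene set.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
null = rng.normal(0, 1, (300, 2))             # no change in either study
pos = rng.normal(2.5, 1, (50, 2))             # concordant positive change
neg = rng.normal(-2.5, 1, (50, 2))            # concordant negative change
z = np.vstack([null, pos, neg])

gmm = GaussianMixture(n_components=3, random_state=0).fit(z)
post = gmm.predict_proba(z)                   # posterior over the 3 states

null_comp = np.argmin(np.linalg.norm(gmm.means_, axis=1))  # component nearest 0
p_concordant = 1.0 - post[:, null_comp]       # crude concordance probability
print(np.argsort(p_concordant)[::-1][:10])    # top-ranked gene sets
```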
Maintenance = reuse-oriented software development
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1989-01-01
Maintenance is viewed as a reuse process. In this context, a set of models that can be used to support the maintenance process is discussed. A high level reuse framework is presented that characterizes the object of reuse, the process for adapting that object for its target application, and the reused object within its target application. Based upon this framework, a qualitative comparison is offered of the three maintenance process models with regard to their strengths and weaknesses and the circumstances in which they are appropriate. To provide a more systematic, quantitative approach for evaluating the appropriateness of the particular maintenance model, a measurement scheme is provided, based upon the reuse framework, in the form of an organized set of questions that need to be answered. To support the reuse perspective, a set of reuse enablers are discussed.
Physico-chemical characteristics of microwave-dried wheat distillers grain with solubles.
Mosqueda, Maria Rosario P; Tabil, Lope G; Meda, Venkatesh
2013-01-01
Laboratory-prepared samples of wheat distillers grain with solubles with varying condensed distillers solubles (CDS) content were dried under varying microwave power and microwave convection settings using a domestic microwave oven to examine their effect on the chemical, structural, color, flow, compression, thermal, and frictional properties of the product, which is dried distillers grain with solubles (DDGS). As CDS level increased, protein and ash content increased, while fat and fiber content decreased in wheat-based DDGS. Fat content was also markedly affected by the microwave oven drying conditions. While CDS level, microwave power or microwave convection setting, and/or their interactions significantly affected a number of physical properties, results indicated that CDS level had a stronger influence compared to the other factors. DDGS samples with high CDS levels were significantly denser, finer but more differentiated in size, less flowable, and less dispersible. These also produced denser and stronger pellets.
Lukasczyk, Jonas; Weber, Gunther; Maciejewski, Ross; ...
2017-06-01
Tracking graphs are a well-established tool in topological analysis to visualize the evolution of components and their properties over time, i.e., when components appear, disappear, merge, and split. However, tracking graphs are limited to a single level threshold and the graphs may vary substantially even under small changes to the threshold. To examine the evolution of features for varying levels, users have to compare multiple tracking graphs without a direct visual link between them. We propose a novel, interactive, nested graph visualization based on the fact that the tracked superlevel set components for different levels are related to each other through their nesting hierarchy. This approach allows us to set multiple tracking graphs in context to each other and enables users to effectively follow the evolution of components for different levels simultaneously. We show the effectiveness of our approach on datasets from finite pointset methods, computational fluid dynamics, and cosmology simulations.
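A hedged sketch of the underlying relation the visualization exploits (synthetic scalar field; the connectivity convention is scipy's default): superlevel-set components at a higher threshold nest inside components at a lower one, so each fine component maps to exactly one enclosing coarse component.

```python
# Sketch: nesting of superlevel-set components across two thresholds.
import numpy as np
from scipy import ndimage

field = np.random.default_rng(8).random((64, 64))

def superlevel_components(f, level):
    labels, n = ndimage.label(f >= level)
    return labels, n

coarse, n_c = superlevel_components(field, 0.5)
fine, n_f = superlevel_components(field, 0.8)

# Each fine component lies entirely inside one coarse component.
nesting = {k: int(np.unique(coarse[fine == k])[0]) for k in range(1, n_f + 1)}
print(n_c, n_f, list(nesting.items())[:5])
```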
Mougiakakou, Stavroula G; Valavanis, Ioannis K; Nikita, Alexandra; Nikita, Konstantina S
2007-09-01
The aim of the present study is to define an optimally performing computer-aided diagnosis (CAD) architecture for the classification of liver tissue from non-enhanced computed tomography (CT) images into normal liver (C1), hepatic cyst (C2), hemangioma (C3), and hepatocellular carcinoma (C4). To this end, various CAD architectures, based on texture features and ensembles of classifiers (ECs), are comparatively assessed. A number of regions of interest (ROIs) corresponding to C1-C4 were defined by experienced radiologists in non-enhanced liver CT images. For each ROI, five distinct sets of texture features were extracted using first order statistics, the spatial gray level dependence matrix, the gray level difference method, Laws' texture energy measures, and fractal dimension measurements. Two different ECs were constructed and compared. The first one consists of five multilayer perceptron neural networks (NNs), each using as input one of the computed texture feature sets or its reduced version after genetic algorithm-based feature selection. The second EC comprised five different primary classifiers, namely one multilayer perceptron NN, one probabilistic NN, and three k-nearest neighbor classifiers, each fed with the combination of the five texture feature sets or their reduced versions. The final decision of each EC was extracted by using appropriate voting schemes, while bootstrap re-sampling was utilized in order to estimate the generalization ability of the CAD architectures based on the available relatively small-sized data set. The best mean classification accuracy (84.96%) is achieved by the second EC using a fused feature set and the weighted voting scheme. The fused feature set was obtained after appropriate feature selection applied to specific subsets of the original feature set. The comparative assessment of the various CAD architectures shows that combining three types of classifiers with a voting scheme, fed with identical feature sets obtained after appropriate feature selection and fusion, may result in an accurate system able to assist differential diagnosis of focal liver lesions from non-enhanced CT images.
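A hedged sketch of the second ensemble's general shape (heterogeneous classifiers over a fused feature set combined by weighted voting; the synthetic features, classifier choices, and weights are stand-ins, not the study's configuration):

```python
# Sketch: heterogeneous voting ensemble over fused texture features.
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(9)
X = rng.normal(size=(400, 30))          # fused texture features (synthetic)
y = rng.integers(0, 4, 400)             # tissue classes C1..C4
X[y == 3] += 1.0                        # make one class partially separable

ensemble = VotingClassifier(
    estimators=[("mlp", MLPClassifier(max_iter=500)),
                ("knn3", KNeighborsClassifier(3)),
                ("knn7", KNeighborsClassifier(7))],
    voting="soft", weights=[2, 1, 1])   # weighted voting over class probabilities
ensemble.fit(X[:300], y[:300])
print("holdout accuracy:", ensemble.score(X[300:], y[300:]))
```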
Briët, Olivier Jt; Penny, Melissa A
2013-11-07
Stagnating funds for malaria control have spurred interest in the question of how to sustain the gains of recent successes with long-lasting insecticidal nets (LLINs) and improved case management (CM). This simulation study examined the malaria transmission and disease dynamics in scenarios with sustained LLINs and CM interventions and tried to determine optimal LLIN distribution rates. The effects of abruptly halting LLIN distribution were also examined. Dynamic simulations of malaria in humans and mosquitoes were run on the OpenMalaria platform, using stochastic individual-based simulation models. LLINs were distributed in a range of transmission settings, with varying CM coverage levels. In the short term, LLINs were beneficial over the entire transmission spectrum, reducing both transmission and disease burden. In the long term, repeated distributions sustainably reduced transmission in all settings. However, because of the resulting reduction in acquired immunity in the population, the malaria disease burden, after initially being reduced, gradually increased and eventually stabilized at a new level. This new level was higher than the pre-intervention level in previously high transmission settings, if there is a maximum disease burden in the relationship between transmission and disease burden at intermediate transmission levels. This result could lead one to conclude that sustained LLIN distribution might not be cost-effective in high transmission settings in the long term. However, improved CM rendered LLINs more cost-effective in higher transmission settings than in those without improved CM, and the majority of the African population lives in areas where CM and LLINs are sustainably combined. The effects of changes in LLIN distribution rate on cost-effectiveness were relatively small compared to the effects of changes in transmission setting and CM. Abruptly halting LLIN distribution led to temporary morbidity peaks, which were particularly large in low to intermediate transmission settings. This study reaffirms the importance of context-specific intervention planning. Intervention planning must include combinations of malaria vector control and CM, and must consider both the pre-intervention transmission level and the intervention history to account for the loss of immunity and the potential for rebounds in disease burden.
The Performance Effects of an Ability-Based Approach to Goal Assignment
ERIC Educational Resources Information Center
Jeffrey, Scott A.; Schulz, Axel; Webb, Alan
2012-01-01
Some organizations have begun to target their goal-setting method more closely to the ability levels of their employees. In this article, we report the results of a laboratory study of 138 undergraduate students, which shows that these "ability-based" goals are more effective at improving performance than a "one goal for all"…
Bureaucratic Activism and Radical School Change in Tamil Nadu, India
ERIC Educational Resources Information Center
Niesz, Tricia; Krishnamurthy, Ramchandar
2013-01-01
In 2007, Activity Based Learning (ABL), a child-centered, activity-based method of pedagogical practice, transformed classrooms in all of the over 37,000 primary-level government schools in Tamil Nadu, India. The large scale, rapid pace, and radical nature of educational change set the ABL initiative apart from most school reform efforts.…
A Storytime Year: A Month-to-Month Kit for Preschool Programming.
ERIC Educational Resources Information Center
Dailey, Susan M.
Noting the need for children's librarians, preschool teachers, and storytellers to keep their programs interesting and fresh, this guide is comprised of 48 theme-based units for preschool-level programs in libraries, early childhood settings, or at home. The guide is presented in two parts. Part 1 contains tips for theme-based program planning and…
Making Inclusion Work for Students with Autism Spectrum Disorders: An Evidence-Based Guide
ERIC Educational Resources Information Center
Smith, Tristram
2011-01-01
An indispensable resource for K-12 educators and autism specialists, this highly practical book shows how to include students with autism spectrum disorders (ASD) in general education settings. Tristram Smith and his associates present a research-based, step-by-step process for assessing students at a range of skill levels, planning and…
Designed Curriculum and Local Culture: Acknowledging the Primacy of Classroom Culture.
ERIC Educational Resources Information Center
Squire, Kurt D.; MaKinster, James G.; Barnett, Michael; Luehmann, April Lynn; Barab, Sasha L.
2003-01-01
Examines four teachers implementing a project-based curriculum (Air Quality module) on a web-based platform (ActiveInk Network) in four very different settings. Discusses each case across two themes by examining how the project-level question was contextualized to meet local needs and the cultural context that surrounded the implementation of the…
Active Learning in a Math for Liberal Arts Classroom
ERIC Educational Resources Information Center
Lenz, Laurie
2015-01-01
Inquiry-based learning is a topic of growing interest in the mathematical community. Much of the focus has been on using these methods in calculus and higher-level classes. This article describes the design and implementation of a set of inquiry-based learning activities in a Math for Liberal Arts course at a small, private, Catholic college.…
Curriculum-Based Measurement of Oral Reading: Quality of Progress Monitoring Outcomes
ERIC Educational Resources Information Center
Christ, Theodore J.; Zopluoglu, Cengiz; Long, Jeffery D.; Monaghen, Barbara D.
2012-01-01
Curriculum-based measurement of oral reading (CBM-R) is frequently used to set student goals and monitor student progress. This study examined the quality of growth estimates derived from CBM-R progress monitoring data. The authors used a linear mixed effects regression (LMER) model to simulate progress monitoring data for multiple levels of…
Project Link-Four: Pre-Vocational Education for Adults through Community Linkages.
ERIC Educational Resources Information Center
Stedman, Deborah S.
The Texas adult performance level (APL) project LINK-FOUR implemented a curriculum based on functional competencies at four sites (Austin, Texarkana, Texas City, and Abilene) and formed linkages with local organizations involved in adult vocational education. The concept on which the project was based was that a set of prevocational skills, plus a…
Comparing student performance on paper- and computer-based math curriculum-based measures.
Hensley, Kiersten; Rankin, Angelica; Hosp, John
2017-01-01
As the number of computerized curriculum-based measurement (CBM) tools increases, it is necessary to examine whether or not student performance can generalize across a variety of test administration modes (i.e., paper or computer). The purpose of this study is to compare math fact fluency on paper versus computer for 197 upper elementary students. Students completed identical sets of probes on paper and on the computer, which were then scored for digits correct, problems correct, and accuracy. Results showed a significant difference in performance between the two sets of probes, with higher fluency rates on the paper probes. Because decisions about levels of student support and interventions often rely on measures such as these, more research in this area is needed to examine the potential differences in student performance between paper-based and computer-based CBMs.
Seasonality and phenology alter functional leaf traits.
McKown, Athena D; Guy, Robert D; Azam, M Shofiul; Drewes, Eric C; Quamme, Linda K
2013-07-01
In plant ecophysiology, functional leaf traits are generally not assessed in relation to the phenological phase of the canopy. Leaf traits measured in deciduous perennial species are known to vary between spring and summer seasons, but there is a knowledge gap relating to the late-summer phase marked by growth cessation and bud set occurring well before fall leaf senescence. The effects of phenology on canopy physiology were tested using a common garden of over 2,000 black cottonwood (Populus trichocarpa) individuals originating from a wide geographical range (44-60°N). Annual phenological events and 12 leaf-based functional trait measurements were collected spanning the entire summer season prior to, and following, bud set. Patterns of seasonal trait change emerged by synchronizing trees using their date of bud set. In particular, photosynthetic, mass, and N-based traits increased substantially following bud set. Most traits were significantly different between pre-bud set and post-bud set phase trees, with many traits showing at least 25% alteration in mean value. Post-bud set, both the significance and direction of trait-trait relationships could be modified, with many relating directly to changes in leaf mass. In Populus, these dynamics in leaf traits throughout the summer season reflected a shift in whole plant physiology, but occurred long before the onset of leaf senescence. The marked shifts in measured trait values following bud set underscore the necessity of including phenology in trait-based ecological studies or large-scale phenotyping efforts, both at the local level and at larger geographical scales.
Stewart, Jennifer M
2014-01-01
Objective: To assess the barriers and facilitators to using African American churches as sites for implementation of evidence-based HIV interventions among young African American women. Design: Mixed methods cross-sectional design. Setting: African American churches in Philadelphia, PA. Participants: 142 African American pastors, church leaders, and young adult women ages 18 to 25. Methods: Mixed methods convergent parallel design. Results: The majority of young adult women reported engaging in high-risk HIV-related behaviors. Although church leaders reported willingness to implement HIV risk-reduction interventions, they were unsure of how to initiate this process. Key facilitators to the implementation of evidence-based interventions included the perception of the leadership and church members that HIV interventions were needed and that the church was a promising venue for them. A primary barrier to implementation in this setting is the perception that discussions of sexuality should be private. Conclusion: Implementation of evidence-based HIV interventions for young adult African American women in church settings is feasible and needed. Building a level of comfort in discussing matters of sexuality and adapting existing evidence-based interventions to meet the needs of young women in church settings is a viable approach for successful implementation.
NASA Astrophysics Data System (ADS)
Dente, Elad; Lensky, Nadav G.; Morin, Efrat; Grodek, Tamir; Sheffer, Nathan A.; Enzel, Yehouda
2017-12-01
The geomorphic response of channels to base-level fall is an important factor in landscape evolution. To better understand the complex interactions between the factors controlling channel evolution in an emerging continental shelf setting, we use an extensive data set (high-resolution digital elevation models, aerial photographs, and Landsat imagery) of a newly incising, perennial segment of Nahal (Wadi) HaArava, Israel. This channel responds to the rapid and progressive lowering of its base level, the Dead Sea (>30 m in 35 years; 0.5-1.3 m yr⁻¹). Progressively evolving longitudinal profiles, channel width, sinuosity, and knickpoint retreat during the last few decades were documented or reconstructed. The results indicate that even under fast base-level fall, rapid delta progradation on top of the shelf and shelf edge can moderate channel mouth slopes and, therefore, largely inhibit channel incision and knickpoint propagation. This channel elongation stage ends when the delta reaches an extended accommodation within the receiving basin and fails to keep the channel mouth slopes as low as the channel bed slopes. Then, processes of incision, narrowing, and meandering begin to shape the channel and expand upstream. When the down-cutting channel encounters a more resistant stratum within the channel substrate, these processes are restricted to a downstream reach by formation of a retreating vertical knickpoint. When the knickpoint and the channel incise to a level below this stratum, a spatially continuous, diffusion-like evolution characterizes the channel's response and source-to-sink transport can be implemented. These results emphasize the mouth slope and channel substrate resistance as the governing factors over long-term channel evolution, whereas flash floods have only local and short-lived impacts in a confined, continuously incising channel. The documented channel response applies to eustatic base-level fall under steepening basin bathymetry, rapid delta progradation, and lithologic variations in the channel substrate.
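The "diffusion-like evolution" of the longitudinal profile can be illustrated with the classic linear-diffusion model of channel profile evolution. This is only a toy sketch: the diffusivity, grid, and single instantaneous 30 m base-level drop are invented, and the study's actual profile reconstruction is far richer.

```python
import numpy as np

# Toy linear-diffusion model of a channel longitudinal profile responding
# to a base-level drop at its mouth: dz/dt = kappa * d2z/dx2.
kappa = 50.0                       # m^2/yr, arbitrary diffusivity
dx, dt, years = 100.0, 0.1, 35.0   # 100 m cells, 0.1 yr steps
z = np.linspace(30.0, 0.0, 200)    # initial profile, mouth at the right end
z[-1] -= 30.0                      # instantaneous 30 m base-level fall

for _ in range(int(years / dt)):
    # update interior nodes; headwater and base level stay fixed
    z[1:-1] += kappa * dt / dx**2 * (z[2:] - 2 * z[1:-1] + z[:-2])

print(f"elevation 1 km upstream of the mouth: {z[-11]:.2f} m")
```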
Interaction potentials and transport properties of Ba, Ba+, and Ba2+ in rare gases from He to Xe
NASA Astrophysics Data System (ADS)
Buchachenko, Alexei A.; Viehland, Larry A.
2018-04-01
A highly accurate, consistent set of ab initio interaction potentials is obtained for the title systems at the coupled cluster with singles, doubles, and non-iterative triples level of theory with extrapolation to the complete basis set limit. These potentials are shown to be more reliable than the previous potentials based on their long-range behavior, equilibrium properties, collision cross sections, and transport properties.
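Extrapolation to the complete basis set (CBS) limit is commonly done with a two-point inverse-cubic formula; the sketch below assumes the Helgaker-type form E(X) = E_CBS + A/X^3 and uses placeholder energies, since the abstract does not state the exact scheme used.

```python
# Two-point CBS extrapolation of correlation energies from basis sets with
# cardinal numbers x < y (e.g. quadruple- and quintuple-zeta, 4 and 5),
# assuming E(X) = E_CBS + A / X**3. Energies below are hypothetical.
def cbs_extrapolate(e_x, x, e_y, y):
    return (y**3 * e_y - x**3 * e_x) / (y**3 - x**3)

e_qz, e_5z = -0.412345, -0.415678   # placeholder correlation energies (hartree)
print(f"E_CBS = {cbs_extrapolate(e_qz, 4, e_5z, 5):.6f} hartree")
```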
Newsome, Seth D.; Yeakel, Justin D.; Wheatley, Patrick V.; Tinker, M. Tim
2012-01-01
Ecologists are increasingly using stable isotope analysis to inform questions about variation in resource and habitat use from the individual to community level. In this study we investigate data sets from 2 California sea otter (Enhydra lutris nereis) populations to illustrate the advantages and potential pitfalls of applying various statistical and quantitative approaches to isotopic data. We have subdivided these tools, or metrics, into 3 categories: IsoSpace metrics, stable isotope mixing models, and DietSpace metrics. IsoSpace metrics are used to quantify the spatial attributes of isotopic data that are typically presented in bivariate (e.g., δ13C versus δ15N) 2-dimensional space. We review IsoSpace metrics currently in use and present a technique by which uncertainty can be included to calculate the convex hull area of consumers or prey, or both. We then apply a Bayesian-based mixing model to quantify the proportion of potential dietary sources to the diet of each sea otter population and compare this to observational foraging data. Finally, we assess individual dietary specialization by comparing a previously published technique, variance components analysis, to 2 novel DietSpace metrics that are based on mixing model output. As the use of stable isotope analysis in ecology continues to grow, the field will need a set of quantitative tools for assessing isotopic variance at the individual to community level. Along with recent advances in Bayesian-based mixing models, we hope that the IsoSpace and DietSpace metrics described here will provide another set of interpretive tools for ecologists.
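One of the IsoSpace metrics mentioned, the convex hull area of a population's bivariate isotope data, is straightforward to compute; the delta values below are invented for illustration.

```python
import numpy as np
from scipy.spatial import ConvexHull

# Convex hull area in d13C-d15N space as a simple IsoSpace metric of
# isotopic niche breadth. Data points are hypothetical.
d13c = np.array([-16.2, -14.8, -15.5, -13.9, -17.1, -15.0])
d15n = np.array([12.1, 13.4, 11.8, 12.9, 13.0, 14.2])

hull = ConvexHull(np.column_stack([d13c, d15n]))
print(f"hull area: {hull.volume:.2f} per-mil^2")   # for 2-D input, .volume is the area
```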
NASA Astrophysics Data System (ADS)
Stewart, Iris T.; Loague, Keith
2003-12-01
Groundwater vulnerability assessments of nonpoint source agrochemical contamination at regional scales are either qualitative in nature or require prohibitively costly computational efforts. By contrast, the type transfer function (TTF) modeling approach for vadose zone pesticide leaching presented here estimates solute concentrations at a depth of interest, only uses available soil survey, climatic, and irrigation information, and requires minimal computational cost for application. TTFs are soil texture based travel time probability density functions that describe a characteristic leaching behavior for soil profiles with similar soil hydraulic properties. Seven sets of TTFs, representing different levels of upscaling, were developed for six loam soil textural classes with the aid of simulated breakthrough curves from synthetic data sets. For each TTF set, TTFs were determined from a group or subgroup of breakthrough curves for each soil texture by identifying the effective parameters of the function that described the average leaching behavior of the group. The grouping of the breakthrough curves was based on the TTF index, a measure of the magnitude of the peak concentration, the peak arrival time, and the concentration spread. Comparison to process-based simulations show that the TTFs perform well with respect to mass balance, concentration magnitude, and the timing of concentration peaks. Sets of TTFs based on individual soil textures perform better for all the evaluation criteria than sets that span all textures. As prediction accuracy and computational cost increase with the number of TTFs in a set, the selection of a TTF set is determined by a given application.
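The mechanics of applying a travel-time pdf as a transfer function can be sketched as a convolution of the surface input with the density; the lognormal form and its parameters below are assumptions for illustration, not the functional form fitted in the study.

```python
import numpy as np

# A travel-time pdf used as a transfer function: a surface pesticide pulse
# convolved with the density gives the flux at the depth of interest.
def lognormal_pdf(t, mu, sigma):
    return np.exp(-(np.log(t) - mu) ** 2 / (2 * sigma**2)) \
        / (t * sigma * np.sqrt(2 * np.pi))

t = np.arange(1, 1001, dtype=float)         # days
ttf = lognormal_pdf(t, mu=5.0, sigma=0.6)   # characteristic leaching behavior
inp = np.zeros_like(t); inp[:10] = 1.0      # 10-day pulse applied at the surface
flux = np.convolve(inp, ttf)[:t.size]       # breakthrough at the reference depth
print(f"peak arrival: day {t[np.argmax(flux)]:.0f}")
```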
Wallace, Lauren; Kapiriri, Lydia
2017-01-01
Background: To date, research on priority-setting for new vaccines has not adequately explored the influence of the global, national and sub-national levels of decision-making or contextual issues such as political pressure and stakeholder influence and power. Using Kapiriri and Martin’s conceptual framework, this paper evaluates priority setting for new vaccines in Uganda at national and sub-national levels, and considers how global priorities can influence country priorities. This study focuses on 2 specific vaccines, the human papilloma virus (HPV) vaccine and the pneumococcal conjugate vaccine (PCV). Methods: This was a qualitative study that involved reviewing relevant Ugandan policy documents and media reports, as well as 54 key informant interviews at the global level and national and sub-national levels in Uganda. Kapiriri and Martin’s conceptual framework was used to evaluate the prioritization process. Results: Priority setting for PCV and HPV was conducted by the Ministry of Health (MoH), which is considered to be a legitimate institution. While respondents described the priority setting process for PCV as transparent, participatory, and guided by explicit relevant criteria and evidence, the prioritization of HPV was thought to have been less transparent and less participatory. Respondents reported that neither process was based on an explicit priority setting framework, nor did either involve adequate representation from the districts (program implementers) or publicity. The priority setting process for both PCV and HPV was negatively affected by the larger political and economic context, which contributed to weak institutional capacity as well as power imbalances between development assistance partners and the MoH. Conclusion: Priority setting in Uganda would be improved by strengthening institutional capacity and leadership and ensuring transparent and participatory processes in which key stakeholders such as program implementers (the districts) and beneficiaries (the public) are involved. Kapiriri and Martin’s framework has the potential to guide priority setting evaluation efforts; however, evaluation should be built into the priority setting process a priori so that information on priority setting is gathered throughout the implementation cycle. PMID:29172378
75 FR 2938 - National Ambient Air Quality Standards for Ozone
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-19
...Based on its reconsideration of the primary and secondary national ambient air quality standards (NAAQS) for ozone (O3) set in March 2008, EPA proposes to set different primary and secondary standards than those set in 2008 to provide requisite protection of public health and welfare, respectively. With regard to the primary standard for O3, EPA proposes that the level of the 8-hour primary standard, which was set at 0.075 ppm in the 2008 final rule, should instead be set at a lower level within the range of 0.060 to 0.070 parts per million (ppm), to provide increased protection for children and other "at risk" populations against an array of O3-related adverse health effects that range from decreased lung function and increased respiratory symptoms to serious indicators of respiratory morbidity including emergency department visits and hospital admissions for respiratory causes, and possibly cardiovascular-related morbidity as well as total non-accidental and cardiopulmonary mortality. With regard to the secondary standard for O3, EPA proposes that the secondary O3 standard, which was set identical to the revised primary standard in the 2008 final rule, should instead be a new cumulative, seasonal standard expressed as an annual index of the sum of weighted hourly concentrations, cumulated over 12 hours per day (8 am to 8 pm) during the consecutive 3-month period within the O3 season with the maximum index value, set at a level within the range of 7 to 15 ppm-hours, to provide increased protection against O3-related adverse impacts on vegetation and forested ecosystems.
Collaborative knowledge acquisition for the design of context-aware alert systems
Joffe, Erel; Havakuk, Ofer; Herskovic, Jorge R; Patel, Vimla L
2012-01-01
Objective: To present a framework for combining implicit knowledge acquisition from multiple experts with machine learning and to evaluate this framework in the context of anemia alerts. Materials and Methods: Five internal medicine residents reviewed 18 anemia alerts, while ‘talking aloud’. They identified features that were reviewed by two or more physicians to determine the appropriate alert level, etiology and treatment recommendation. Based on these features, data were extracted from 100 randomly selected anemia cases for a training set and an additional 82 cases for a test set. Two staff internists assigned an alert level, etiology and treatment recommendation before and after reviewing the entire electronic medical record. The training set of 118 cases (100 plus 18) and the test set of 82 cases were explored using the RIDOR and JRip algorithms. Results: The feature set was sufficient to assess 93% of anemia cases (intraclass correlations for alert level before and after review of the records by internists 1 and 2 were 0.92 and 0.95, respectively). High-precision classifiers were constructed to identify low-level alerts (precision p=0.87, recall R=0.4), iron deficiency (p=1.0, R=0.73), and anemia associated with kidney disease (p=0.87, R=0.77). Discussion: It was possible to identify low-level alerts and several conditions commonly associated with chronic anemia. This approach may reduce the number of clinically unimportant alerts. The study was limited to anemia alerts. Furthermore, clinicians were aware of the study hypotheses, potentially biasing their evaluation. Conclusion: Implicit knowledge acquisition, collaborative filtering and machine learning were combined automatically to induce clinically meaningful and precise decision rules. PMID:22744961
Level-set techniques for facies identification in reservoir modeling
NASA Astrophysics Data System (ADS)
Iglesias, Marco A.; McLaughlin, Dennis
2011-03-01
In this paper we investigate the application of level-set techniques for facies identification in reservoir models. The identification of facies is a geometrical inverse ill-posed problem that we formulate in terms of shape optimization. The goal is to find a region (a geologic facies) that minimizes the misfit between predicted and measured data from an oil-water reservoir. In order to address the shape optimization problem, we present a novel application of the level-set iterative framework developed by Burger (2002 Interfaces Free Bound. 5 301-29; 2004 Inverse Problems 20 259-82) for inverse obstacle problems. The optimization is constrained by the reservoir model, a nonlinear large-scale system of PDEs that describes the reservoir dynamics. We reformulate this reservoir model in a weak (integral) form whose shape derivative can be formally computed from standard results of shape calculus. At each iteration of the scheme, the current estimate of the shape derivative is utilized to define a velocity in the level-set equation. The proper selection of this velocity ensures that the new shape decreases the cost functional. We present results of facies identification where the velocity is computed with the gradient-based (GB) approach of Burger (2002) and the Levenberg-Marquardt (LM) technique of Burger (2004). While an adjoint formulation allows the straightforward application of the GB approach, the LM technique requires the computation of the large-scale Karush-Kuhn-Tucker system that arises at each iteration of the scheme. We efficiently solve this system by means of the representer method. We present some synthetic experiments to show and compare the capabilities and limitations of the proposed implementations of level-set techniques for the identification of geologic facies.
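The core of any such scheme is the level-set update itself: the facies boundary is the zero contour of a function phi, advected by phi_t + V |grad phi| = 0, where V comes from the shape derivative of the misfit. Below is a minimal sketch with an invented velocity field; a real implementation would use proper upwind differencing and the misfit-derived velocity.

```python
import numpy as np

# One explicit time step of the level-set equation phi_t + V*|grad phi| = 0.
# Central differences are used for brevity; production codes use upwinding.
def level_set_step(phi, V, dt=0.1):
    gx = np.gradient(phi, axis=0)
    gy = np.gradient(phi, axis=1)
    return phi - dt * V * np.sqrt(gx**2 + gy**2)

# Signed distance to a circular initial facies on a 64x64 grid.
phi = np.fromfunction(lambda i, j: np.sqrt((i - 32)**2 + (j - 32)**2) - 10.0,
                      (64, 64))
V = np.random.default_rng(0).normal(size=(64, 64))  # stand-in for the
                                                    # shape-derivative velocity
phi = level_set_step(phi, V)
print((phi < 0).sum(), "cells inside the facies after one step")
```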
Jacinto, Rogério Castilho; Linhares-Farina, Giane; Sposito, Otávio da Silva; Zanchi, César Henrique; Cenci, Maximiliano Sérgio
2015-01-01
The addition of chlorhexidine (CHX) to a resinous experimental Mineral Trioxide Aggregate (E-MTA) based root-end filling material is an alternative for boosting its antimicrobial activity. However, the influence of chlorhexidine on the properties of this material is unclear. The aim of this study was to evaluate the influence of 2% chlorhexidine on the pH, calcium ion release and setting time of a Bisphenol A Ethoxylate Dimethacrylate/Mineral Trioxide Aggregate (Bis-EMA/MTA) based dual-cure experimental root-end filling material (E-MTA), in comparison with E-MTA without the addition of CHX and with conventional white MTA (W-MTA). The materials were placed in polyethylene tubes and immersed in deionized water to determine pH (digital pH meter) and calcium ion release (atomic absorption spectrometry technique). The setting time of each material was analyzed using Gilmore needles. The data were statistically analyzed at a significance level of 5%. E-MTA + CHX showed an alkaline pH in the 3 h period of evaluation; its alkalinity decreased but persisted for 15 days. The pH of E-MTA + CHX was higher than that of the other two materials after 7 days, and lower after 30 days (p < 0.05). All of the materials were found to release calcium ions throughout the 30 days of the study. The addition of CHX increased the calcium ion release of E-MTA to levels statistically similar to W-MTA. E-MTA showed shorter initial and final setting times than W-MTA (p < 0.05). The addition of CHX to E-MTA increased its pH and calcium ion release; however, the addition of 2% CHX prevented the material from setting.
NASA Astrophysics Data System (ADS)
Lerot, C.; Danckaert, T.; van Gent, J.; Coldewey-Egbers, M.; Loyola, D. G.; Errera, Q.; Spurr, R. J. D.; Garane, K.; Koukouli, M.; Balis, D.; Verhoelst, T.; Granville, J.; Lambert, J. C.; Van Roozendael, M.
2017-12-01
Total ozone is one of the Essential Climate Variables (ECV) operationally produced within the European Copernicus Climate Change Service (C3S), which aims at providing the geophysical information needed to monitor and study our climate system. The C3S total ozone processing chain relies on algorithmic developments realized over the last six years as part of the ESA's Ozone Climate Change Initiative (Ozone_cci) project. The C3S Climate Data Store currently contains a total ozone record based on observations from the nadir UV-Vis hyperspectral spectrometers GOME/ERS-2, SCIAMACHY/Envisat, GOME-2/Metop-A, GOME-2/Metop-B and OMI/Aura, spanning more than 23 years. Individual level-2 datasets were generated with the retrieval algorithm GODFIT (GOME-type Direct FITting). The retrievals are based on a non-linear least squares adjustment of reflectances simulated with radiative transfer tools from the LIDORT suite, to the measured spectra in the Huggins bands (325-335 nm). The inter-sensor consistency and the time stability of those data sets are significantly enhanced with the application of a soft-calibration procedure to the level-1 reflectances, in which GOME and OMI are used together as a long-term reference. Level-2 data sets are then combined to produce the level-3 GOME-type Total Ozone (GTO-ECV) record consisting of homogenized 1°×1° monthly mean grids. The merging procedure corrects for subsisting inter-satellite biases and temporal drifts. Some developments for minimizing sampling errors have also been recently investigated and will be discussed. Total ozone level-2 and level-3 data sets are regularly verified and validated by independent measurements both from space (independent algorithms and/or instruments) and from the ground (Brewer/Dobson/SAOZ), and their excellent quality and stability, as well as their consistency with other long-term total ozone data sets, will be illustrated here. In the future, in addition to being continuously extended in time, the C3S total ozone record will also incorporate new sensors such as OMPS aboard Suomi NPP or TROPOMI/S5p.
NASA Astrophysics Data System (ADS)
Deng, Lujuan; Xie, Songhe; Cui, Jiantao; Liu, Tao
2006-11-01
Enhancing grower income and saving energy are the essential goals of optimal control of the intelligent greenhouse environment. The greenhouse environment control system is characterized by uncertainty, imprecision, nonlinearity, strong coupling, large inertia and multiple time scales, so optimal control of the greenhouse environment is difficult, and model-based optimal control methods are more difficult still. We therefore studied the optimal control problem of the plant environment in an intelligent greenhouse and constructed a hierarchical greenhouse environment control system. In the first level, data are measured and the actuators are controlled. In the second level, the optimal set-points of the controlled climate variables in the greenhouse are calculated and chosen. In the third level, market analysis and planning are completed. This paper addresses the problem of choosing the optimal set-points. First, a model of plant canopy photosynthesis responses and a greenhouse climate model are constructed. Then, drawing on the experience of planting experts, the optimization goals are set by the principle of maximal photosynthesis rate in the daytime, and by the principle of energy saving at night, subject to conditions for good plant growth. The optimal environment control set-points are then computed by a genetic algorithm (GA). Comparison of the optimized results with data recorded in a real system shows that the method is reasonable and can achieve energy saving and a maximal photosynthesis rate in an intelligent greenhouse.
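As a rough illustration of the second-level computation, the sketch below uses a simple genetic algorithm to search for temperature and CO2 set-points that maximize a photosynthesis surrogate. The response surface, parameter ranges, and GA settings are all invented; the paper's canopy photosynthesis and greenhouse climate models are not reproduced.

```python
import random

# Toy daytime objective: photosynthesis rate as a function of air
# temperature (C) and CO2 concentration (ppm). Invented for illustration.
def photosynthesis(temp, co2):
    return -((temp - 26.0) / 6.0) ** 2 + 0.002 * co2

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def ga(pop_size=30, gens=40):
    pop = [(random.uniform(10, 35), random.uniform(350, 1000))
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda p: photosynthesis(*p), reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)     # crossover by averaging,
            children.append((                    # then Gaussian mutation
                clamp((a[0] + b[0]) / 2 + random.gauss(0, 0.5), 10, 35),
                clamp((a[1] + b[1]) / 2 + random.gauss(0, 10), 350, 1000)))
        pop = parents + children
    return max(pop, key=lambda p: photosynthesis(*p))

temp, co2 = ga()
print(f"optimal set-points: {temp:.1f} C, {co2:.0f} ppm")   # ~26 C, ~1000 ppm
```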
NASA Astrophysics Data System (ADS)
Arestova, M. L.; Bykovskii, A. Yu
1995-10-01
An architecture is proposed for a specialised optoelectronic multivalued logic processor based on the Allen-Givone algebra. The processor is intended for multiparametric processing of data arriving from a large number of sensors or for tackling spectral analysis tasks. The processor architecture makes it possible to obtain an approximate general estimate of the state of an object being diagnosed on a p-level scale. Optoelectronic systems are proposed for MAXIMUM, MINIMUM, and LITERAL logic gates, based on optical-frequency encoding of logic levels. Corresponding logic gates form a complete set of logic functions in the Allen-Givone algebra.
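For readers unfamiliar with the Allen-Givone operations, a minimal software sketch follows. The choice p = 4 and the example function are arbitrary; only the MAXIMUM/MINIMUM/LITERAL definitions mirror the algebra named in the abstract.

```python
# The three Allen-Givone operations for p logic levels (here p = 4, so the
# levels are 0..3): MAXIMUM, MINIMUM, and the LITERAL window function,
# which outputs p-1 when its argument lies in [a, b] and 0 otherwise.
P = 4

def maximum(x, y): return max(x, y)
def minimum(x, y): return min(x, y)
def literal(x, a, b): return P - 1 if a <= x <= b else 0

# Functions are built as MAXIMUMs of MINIMUMs of literals (a sum-of-products
# normal form); this example evaluates one such two-variable function.
x1, x2 = 2, 1
f = maximum(minimum(literal(x1, 1, 2), literal(x2, 0, 1)),
            minimum(literal(x1, 3, 3), literal(x2, 2, 3)))
print(f)  # -> 3
```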
Cerebrospinal fluid neopterin decay characteristics after initiation of antiretroviral therapy
2013-01-01
Background: Neopterin, a biomarker of macrophage activation, is elevated in the cerebrospinal fluid (CSF) of most HIV-infected individuals and decreases after initiation of antiretroviral therapy (ART). We studied decay characteristics of neopterin in CSF and blood after commencement of ART in HIV-infected subjects and estimated the set-point levels of CSF neopterin after ART-mediated viral suppression. Methods: CSF and blood neopterin were longitudinally measured in 102 neurologically asymptomatic HIV-infected subjects who were treatment-naïve or had been off ART for ≥ 6 months. We used a non-linear model to estimate neopterin decay in response to ART and a stable neopterin set-point attained after prolonged ART. Seven subjects with HIV-associated dementia (HAD) who initiated ART were studied for comparison. Results: Non-HAD patients were followed for a median of 84.7 months. Though CSF neopterin concentrations decreased rapidly after ART initiation, it was estimated that set-point levels would be below normal CSF neopterin levels (<5.8 nmol/L) in only 60/102 (59%) of these patients. Pre-ART CSF neopterin was the primary predictor of the set-point (P <0.001). HAD subjects had higher baseline median CSF neopterin levels than non-HAD subjects (P <0.0001). Based on the non-HAD model, only 14% of HAD patients were predicted to reach normal levels. Conclusions: After virologically suppressive ART, abnormal CSF neopterin levels persisted in 41% of non-HAD and the majority of HAD patients. ART is not fully effective in ameliorating macrophage activation in the CNS as well as in blood, especially in subjects with higher pre-ART levels of immune activation. PMID:23664008
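The abstract does not spell out the non-linear model, so the sketch below assumes a simple exponential decay toward a set-point, level(t) = setpoint + (baseline - setpoint) * exp(-k*t), and fits it to invented data points.

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed decay-to-set-point model for CSF neopterin after ART initiation.
# Functional form and data are illustrative, not the paper's.
def decay(t, setpoint, baseline, k):
    return setpoint + (baseline - setpoint) * np.exp(-k * t)

t_months = np.array([0, 2, 6, 12, 24, 48, 84], dtype=float)
csf_neopterin = np.array([21.0, 14.5, 9.8, 7.9, 7.0, 6.6, 6.5])  # nmol/L, invented

(setpoint, baseline, k), _ = curve_fit(decay, t_months, csf_neopterin,
                                       p0=[6.0, 20.0, 0.3])
print(f"estimated set-point: {setpoint:.1f} nmol/L "
      f"({'above' if setpoint >= 5.8 else 'below'} the 5.8 nmol/L cutoff)")
```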
Mobile Phone Assessment in Egocentric Networks: A Pilot Study on Gay Men and Their Peers
Comulada, W. Scott
2015-01-01
Mobile phone-based data collection encompasses the richness of social network research. Both individual-level and network-level measures can be recorded. For example, health-related behaviors can be reported via mobile assessment. Social interactions can be assessed by phone-log data. Yet the potential of mobile phone data collection has largely been untapped. This is especially true of egocentric studies in public health settings where mobile phones can enhance both data collection and intervention delivery, e.g. mobile users can video chat with counselors. This is due in part to privacy issues and other barriers that are more difficult to address outside of academic settings where most mobile research to date has taken place. In this article, we aim to inform a broader discussion on mobile research. In particular, benefits and challenges to mobile phone-based data collection are highlighted through our mobile phone-based pilot study that was conducted on egocentric networks of 12 gay men (n = 44 total participants). HIV-transmission and general health behaviors were reported through a mobile phone-based daily assessment that was administered through study participants’ own mobile phones. Phone log information was collected from gay men with Android phones. Benefits and challenges to mobile implementation are discussed, along with the application of multi-level models to the type of longitudinal egocentric data that we collected. PMID:25844003
Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain
NASA Technical Reports Server (NTRS)
Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem
2016-01-01
The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers and enables scalable computation and storage needs, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.
Kasaie, Parastu; Mathema, Barun; Kelton, W David; Azman, Andrew S; Pennington, Jeff; Dowdy, David W
2015-01-01
In any setting, a proportion of incident active tuberculosis (TB) reflects recent transmission ("recent transmission proportion"), whereas the remainder represents reactivation. Appropriately estimating the recent transmission proportion has important implications for local TB control, but existing approaches have known biases, especially where data are incomplete. We constructed a stochastic individual-based model of a TB epidemic and designed a set of simulations (derivation set) to develop two regression-based tools for estimating the recent transmission proportion from five inputs: underlying TB incidence, sampling coverage, study duration, clustered proportion of observed cases, and proportion of observed clusters in the sample. We tested these tools on a set of unrelated simulations (validation set), and compared their performance against that of the traditional 'n-1' approach. In the validation set, the regression tools reduced the absolute estimation bias (difference between estimated and true recent transmission proportion) in the 'n-1' technique by a median [interquartile range] of 60% [9%, 82%] and 69% [30%, 87%]. The bias in the 'n-1' model was highly sensitive to underlying levels of study coverage and duration, and substantially underestimated the recent transmission proportion in settings of incomplete data coverage. By contrast, the regression models' performance was more consistent across different epidemiological settings and study characteristics. We provide one of these regression models as a user-friendly, web-based tool. Novel tools can improve our ability to estimate the recent TB transmission proportion from data that are observable (or estimable) by public health practitioners with limited available molecular data.
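For reference, the traditional 'n-1' estimate that the regression tools are benchmarked against can be computed directly from genotype cluster sizes; the sample below is hypothetical.

```python
# The 'n-1' estimate of the recent transmission proportion: each genotype
# cluster of size n is assumed to contain n - 1 recently transmitted cases.
def n_minus_1(cluster_sizes):
    total = sum(cluster_sizes)
    clustered = sum(n - 1 for n in cluster_sizes if n > 1)
    return clustered / total

# Hypothetical sample: 3 clusters (sizes 5, 3, 2) plus 40 unique isolates.
sizes = [5, 3, 2] + [1] * 40
print(f"recent transmission proportion (n-1): {n_minus_1(sizes):.2%}")  # 14.00%
```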
NASA Astrophysics Data System (ADS)
Fukuda, Hiroki; Suwa, Hideaki; Nakano, Atsushi; Sakamoto, Mari; Imazu, Miki; Hasegawa, Takuya; Takahama, Hiroyuki; Amaki, Makoto; Kanzaki, Hideaki; Anzai, Toshihisa; Mochizuki, Naoki; Ishii, Akira; Asanuma, Hiroshi; Asakura, Masanori; Washio, Takashi; Kitakaze, Masafumi
2016-11-01
Brain natriuretic peptide (BNP) is the most effective predictor of outcomes in chronic heart failure (CHF). This study sought to determine the qualitative relationship between the BNP levels at discharge and on the day of cardiovascular events in CHF patients. We devised a mathematical probabilistic model between the BNP levels at discharge (y) and the day (t) of cardiovascular events after discharge for 113 CHF patients (Protocol I). We then prospectively evaluated this model on another set of 60 CHF patients who were readmitted (Protocol II). P(t|y) was the probability of cardiovascular events occurring later than t, so the probability density on t was given as p(t|y) = -dP(t|y)/dt, and p(t|y) = αy^β P(t|y), i.e. a constant hazard αy^β (with α and β constant); the solution was p(t|y) = αy^β exp(-αy^β t). We fitted this equation to the data set of Protocol I using the maximum likelihood principle, and we obtained the model p(t|y) = 0.000485 y^0.24788 exp(-0.000485 y^0.24788 t). The cardiovascular event-free rate was computed as P(t) = (1/60) Σ_{i=1,...,60} exp(-0.000485 y_i^0.24788 t), based on this model and the BNP levels y_i in the data set of Protocol II. We confirmed no difference between this model-based result and the actual event-free rate. In conclusion, the BNP levels showed a non-linear relationship with the day of occurrence of cardiovascular events in CHF patients.
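With the published constants, the fitted event-free curve is easy to reproduce; the sketch below does so for a handful of invented discharge BNP values standing in for the Protocol II cohort.

```python
import numpy as np

# Event-free rate P(t) = (1/N) * sum_i exp(-a * y_i**b * t) with the
# published a = 0.000485 and b = 0.24788. BNP values are hypothetical.
a, b = 0.000485, 0.24788
bnp = np.array([85.0, 220.0, 640.0, 1300.0, 410.0])   # pg/mL at discharge
t = np.arange(0, 731)                                 # days since discharge

event_free = np.exp(-a * bnp[:, None] ** b * t).mean(axis=0)
print(f"model event-free rate at 1 year: {event_free[365]:.2f}")
```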
Computer-aided detection of bladder masses in CT urography (CTU)
NASA Astrophysics Data System (ADS)
Cha, Kenny H.; Hadjiiski, Lubomir M.; Chan, Heang-Ping; Caoili, Elaine M.; Cohan, Richard H.; Weizer, Alon; Samala, Ravi K.
2017-03-01
We are developing a computer-aided detection system for bladder cancer in CT urography (CTU). We have previously developed methods for detection of bladder masses within the contrast-enhanced and the non-contrast-enhanced regions of the bladder individually. In this study, we investigated methods for detection of bladder masses within the entire bladder. The bladder was segmented using our method that combined a deep-learning convolutional neural network with level sets. The non-contrast-enhanced region was separated from the contrast-enhanced region with a maximum-intensity-projection-based method. The non-contrast region was smoothed and a gray level threshold was applied to the contrast and non-contrast regions separately to extract the bladder wall and potential masses. The bladder wall was transformed into a straightened thickness profile, which was analyzed to identify lesion candidates in a prescreening step. The candidates were mapped back to the 3D CT volume and segmented using our auto-initialized cascaded level set (AI-CALS) segmentation method. Twenty-seven morphological features were extracted for each candidate. A data set of 57 patients with 71 biopsy-proven bladder lesions was used, which was split into independent training and test sets: 42 training cases with 52 lesions, and 15 test cases with 19 lesions. Using the training set, feature selection was performed and a linear discriminant analysis (LDA) classifier was designed to merge the selected features for classification of bladder lesions and false positives. The trained classifier was evaluated with the test set. FROC analysis showed that the system achieved a sensitivity of 86.5% at 3.3 FPs/case for the training set, and 84.2% at 3.7 FPs/case for the test set.
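The final stage, an LDA classifier separating lesions from false positives, can be sketched with scikit-learn; the 27-dimensional feature vectors below are random placeholders for the morphological features named in the abstract.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# LDA merging candidate features into a single discriminant score; a
# threshold on the score trades sensitivity against FPs/case, tracing out
# the FROC curve. Training data here are synthetic placeholders.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 27))        # 200 candidates x 27 morphological features
y = np.array([0] * 140 + [1] * 60)    # 1 = true lesion, 0 = false positive
X[y == 1] += 0.5                      # give lesions a detectable shift

lda = LinearDiscriminantAnalysis().fit(X, y)
scores = lda.decision_function(rng.normal(size=(5, 27)) + 0.5)
print(np.round(scores, 2))            # candidates above a chosen cutoff are flagged
```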
Claim More™: Empowering African American Women to Make Healthy Choices.
Tkatch, Rifky; Musich, Shirley; Draklellis, Jennifer; Hetzel, Marla; Banks, Jo; Dugan, Jessica; Thompson, Kaylene; Hawkins, Kevin
2018-03-01
Diabetes is a serious issue for African American women. The purpose of this project was to develop and test the feasibility of a culturally appropriate and faith-based healthy eating program for African American women at risk for developing diabetes. A total of 30 women from two churches completed a 12-week, faith-based program using a community-based approach with lay health educators in the church setting. Participants set healthy eating goals, attended weekly education classes, and received daily text messaging reminders related to their goals. Outcomes included high levels of social support, frequent engagement with the program, and improved healthy eating. This program demonstrated the ability to target African American women at risk for diabetes and engage them in a health-related program.
Approach to numerical safety guidelines based on a core melt criterion. [PWR; BWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Azarm, M.A.; Hall, R.E.
1982-01-01
A plausible approach is proposed for translating a single-level criterion into a set of numerical guidelines. The criterion for core melt probability is used to set numerical guidelines for various core melt sequences, systems and component unavailabilities. These guidelines can be used as a means for making decisions regarding the necessity of replacing a component or improving part of a safety system. This approach is applied to estimate a set of numerical guidelines for the various core melt sequences that are analyzed in the Reactor Safety Study for the Peach Bottom Nuclear Power Plant.
The use of an integrated variable fuzzy sets in water resources management
NASA Astrophysics Data System (ADS)
Qiu, Qingtai; Liu, Jia; Li, Chuanzhe; Yu, Xinzhe; Wang, Yang
2018-06-01
Based on evaluation of the present situation of water resources and the development of water conservancy projects and the social economy, optimal allocation of regional water resources is an increasing need in water resources management, and it is also the most effective way to promote a harmonious relationship between humans and water. In view of the limitations of traditional evaluations, which always choose a single-index model for the optimal allocation of regional water resources, an integrated variable fuzzy sets model (IVFS), built on the theory of variable fuzzy sets (VFS) and system dynamics (SD), is proposed in this paper to address dynamically complex problems in regional water resources management. The model is applied to evaluate the level of optimal allocation of the regional water resources of Zoucheng, China. Results show that the levels of the water resources allocation schemes range from 2.5 to 3.5, generally trending toward the lower levels. To achieve optimal regional management of water resources, the model conveys a practical means of assessing water resources management and prominently improves the authenticity of such assessments by using the eigenvector of level H.
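The grading step of such a VFS evaluation can be illustrated compactly: once the normalized membership degrees of a scheme to each grade are known, a grade characteristic value locates the scheme on the 1-5 scale. The membership vector below is invented; the paper's membership computation and SD coupling are not reproduced.

```python
import numpy as np

# Grade characteristic value H = sum_h h * u_h, where u_h is the normalized
# membership of an allocation scheme to grade h (1 = best, 5 = worst).
u = np.array([0.05, 0.30, 0.40, 0.20, 0.05])   # memberships to grades 1..5
u = u / u.sum()                                # normalize
H = np.dot(np.arange(1, 6), u)
print(f"grade characteristic value H = {H:.2f}")   # 2.90, in the reported 2.5-3.5 band
```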
NASA Astrophysics Data System (ADS)
Costanzi, Stefano; Tikhonova, Irina G.; Harden, T. Kendall; Jacobson, Kenneth A.
2009-11-01
Accurate in silico models for the quantitative prediction of the activity of G protein-coupled receptor (GPCR) ligands would greatly facilitate the process of drug discovery and development. Several methodologies have been developed based on the properties of the ligands, the direct study of the receptor-ligand interactions, or a combination of both approaches. Ligand-based three-dimensional quantitative structure-activity relationships (3D-QSAR) techniques, not requiring knowledge of the receptor structure, have been historically the first to be applied to the prediction of the activity of GPCR ligands. They are generally endowed with robustness and good ranking ability; however they are highly dependent on training sets. Structure-based techniques generally do not provide the level of accuracy necessary to yield meaningful rankings when applied to GPCR homology models. However, they are essentially independent from training sets and have a sufficient level of accuracy to allow an effective discrimination between binders and nonbinders, thus qualifying as viable lead discovery tools. The combination of ligand and structure-based methodologies in the form of receptor-based 3D-QSAR and ligand and structure-based consensus models results in robust and accurate quantitative predictions. The contribution of the structure-based component to these combined approaches is expected to become more substantial and effective in the future, as more sophisticated scoring functions are developed and more detailed structural information on GPCRs is gathered.
The Effect of Geocenter Motion on Jason-2 and Jason-1 Orbits and the Mean Sea Level
NASA Technical Reports Server (NTRS)
Melachroinos, Stavros A.; Beckley, Brian D.; Lemoine, Frank G.; Zelensky, Nikita P.; Rowlands, David D.; Luthcke, Scott B.
2012-01-01
We have investigated the impact of geocenter motion on Jason-2 orbits. This was accomplished by computing a series of Jason-1 and Jason-2 GPS-based and SLR/DORIS-based orbits using ITRF2008 and the IGS repro1 framework based on the most recent GSFC standards. From these orbits, we extract the Jason-2 orbit frame translational parameters per cycle by means of a Helmert transformation between a set of reference orbits and a set of test orbits. The fitted annual and seasonal terms of these time series are compared to two different geocenter motion models. Subsequently, we included the geocenter motion corrections in the POD process as a degree-1 loading displacement correction to the tracking network. The analysis suggested that GSFC's Jason-2 std0905 GPS-based orbits are closely tied to the center of mass (CM) of the Earth, whereas the SLR/DORIS std0905 orbits are tied to the center of figure (CF) of the ITRF2005 (Melachroinos et al., 2012). In this study we extend the investigation to the centering of the GPS constellation and the way these orbits are tied into the Jason-1 and Jason-2 POD process. With a new set of standards, we quantify the GPS- and SLR/DORIS-based orbit centering during the Jason-1 and Jason-2 inter-calibration period and how this impacts the orbit radial error over the globe, which is assimilated into the mean sea level (MSL) error, from the omission of the full term of the geocenter motion correction.
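A translation-only version of the per-cycle Helmert comparison is easy to sketch: difference the two orbit sets cycle by cycle, then fit an annual term to the resulting translation series. The synthetic numbers below are illustrative; a full Helmert transformation would also solve for rotations and scale.

```python
import numpy as np
from scipy.optimize import curve_fit

# Per-cycle translation between reference and test orbits, followed by an
# annual fit, as a toy stand-in for the geocenter-motion signal extraction.
rng = np.random.default_rng(2)
t_years = np.arange(100) * (9.9156 / 365.25)        # one epoch per ~10-day cycle
tz_true = 4.0 * np.sin(2 * np.pi * t_years + 0.7)   # mm, simulated annual Z shift

ref = rng.normal(size=(100, 500))                   # Z coords of 500 points per cycle
test = ref + tz_true[:, None] + rng.normal(scale=1.0, size=(100, 500))
tz_est = (test - ref).mean(axis=1)                  # translation-only estimate

annual = lambda t, A, phi, c: A * np.sin(2 * np.pi * t + phi) + c
(A, phi, c), _ = curve_fit(annual, t_years, tz_est, p0=[3.0, 0.0, 0.0])
print(f"annual amplitude: {A:.2f} mm")              # recovers ~4 mm
```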
NASA Astrophysics Data System (ADS)
El Houda Thabet, Rihab; Combastel, Christophe; Raïssi, Tarek; Zolghadri, Ali
2015-09-01
The paper develops a set membership detection methodology which is applied to the detection of abnormal positions of aircraft control surfaces. Robust and early detection of such abnormal positions is an important issue for early system reconfiguration and overall optimisation of aircraft design. In order to improve fault sensitivity while ensuring a high level of robustness, the method combines a data-driven characterisation of noise and a model-driven approach based on interval prediction. The efficiency of the proposed methodology is illustrated through simulation results obtained based on data recorded in several flight scenarios of a highly representative aircraft benchmark.
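A minimal sketch of the detection logic follows, under an assumed first-order actuator model and an assumed noise bound: the measurement is checked against the interval the model and bounds can explain, and an alarm is raised only when it falls outside, which is what preserves robustness to noise.

```python
# Set-membership fault detection on a toy control-surface model
# x+ = a*x + b*u with bounded measurement noise. All numbers are invented.
a, b = 0.95, 0.05
noise_bound = 0.02           # data-driven bound on |measurement noise|

def predict_interval(x_lo, x_hi, u):
    lo = a * x_lo + b * u - noise_bound
    hi = a * x_hi + b * u + noise_bound
    return lo, hi

x_lo = x_hi = 0.0
for k, (u, y) in enumerate([(1.0, 0.06), (1.0, 0.11), (1.0, 0.35)]):
    x_lo, x_hi = predict_interval(x_lo, x_hi, u)
    if not (x_lo <= y <= x_hi):
        print(f"step {k}: abnormal position ({y} outside [{x_lo:.3f}, {x_hi:.3f}])")
    else:                    # consistent: shrink the state interval with the data
        x_lo, x_hi = max(x_lo, y - noise_bound), min(x_hi, y + noise_bound)
```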
ERIC Educational Resources Information Center
Anchondo, Jose Jorge; And Others
Public education consists of various levels of laws, policies, regulations, rules, guidelines, and practices based on the U.S. Constitution. At each level, there is a set of "do's and don'ts" guiding the actions of people involved in public education. This handbook, written to help people understand their rights relating to public…
ERIC Educational Resources Information Center
Saunders, William M.; Goldenberg, Claude N.; Gallimore, Ronald
2009-01-01
The authors conducted a quasi-experimental investigation of effects on achievement by grade-level teams focused on improving learning. For 2 years (Phase 1), principals-only training was provided. During the final 3 years (Phase 2), school-based training was provided for principals and teacher leaders on stabilizing team settings and using…
ERIC Educational Resources Information Center
Avellaneda, Claudia N.; Dávalos, Eleonora
2017-01-01
This study seeks to explain macrolevel drivers of adolescent fertility rate using a panel data set derived from 17 Latin American countries over a period of 16 years (1997-2012). While many studies of adolescent fertility have focused on individual-level explanations, this study explores whether adolescent fertility rate is correlated to…
Predicting Patterns of Grammatical Complexity across Language Exam Task Types and Proficiency Levels
ERIC Educational Resources Information Center
Biber, Douglas; Gray, Bethany; Staples, Shelley
2016-01-01
In the present article, we explore the extent to which previous research on register variation can be used to predict spoken/written task-type variation as well as differences across score levels in the context of a major standardized language exam (TOEFL iBT). Specifically, we carry out two sets of linguistic analyses based on a large corpus of…
ERIC Educational Resources Information Center
Pegrum, Mark; Oakley, Grace; Faulkner, Robert
2013-01-01
This paper reports on the adoption of mobile handheld technologies in ten Western Australian independent schools, based on interviews with staff conducted in 2011. iPads were the most popular device, followed by iPod Touches and iPhones. Class sets were common at lower levels, with 1:1 models becoming increasingly common at higher levels. Mobile…
ERIC Educational Resources Information Center
York, James; deHaan, Jonathan William
2018-01-01
This article provides information on an action research project in a low-level EFL setting in Japan. The project aims were to (1) foster spoken communication skills and (2) help students engage with their own learning. The project investigated the applicability of board games as a mediating tool for authentic communication as part of a wider TBLT…
Guo, Hao-Bo; Ma, Yue; Tuskan, Gerald A.; ...
2018-01-01
The existence of complete genome sequences makes it important to develop different approaches for classification of large-scale data sets and to make extraction of biological insights easier. Here, we propose an approach for classification of complete proteomes/protein sets based on protein distributions on some basic attributes. We demonstrate the usefulness of this approach by determining protein distributions in terms of two attributes: protein lengths (L) and protein intrinsic disorder contents (ID). The protein distributions based on L and ID are surveyed for representative proteome organisms and protein sets from the three domains of life. The two-dimensional maps (designated as fingerprints here) from the protein distribution densities in the LD space defined by ln(L) and ID are then constructed. The fingerprints for different organisms and protein sets are found to be distinct from each other, and they can therefore be used for comparative studies. As a test case, phylogenetic trees have been constructed based on the protein distribution densities in the fingerprints of proteomes of organisms without performing any protein sequence comparison or alignments. The phylogenetic trees generated are biologically meaningful, demonstrating that the protein distributions in the LD space may serve as unique phylogenetic signals of the organisms at the proteome level.
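Constructing such a fingerprint is essentially a 2-D density estimate over ln(L) and ID; the sketch below uses a histogram and randomly generated stand-in values for a proteome.

```python
import numpy as np

# Proteome "fingerprint": 2-D density of proteins in the LD space spanned
# by ln(length) and disorder content. Values are random stand-ins.
rng = np.random.default_rng(3)
lengths = rng.lognormal(mean=5.8, sigma=0.45, size=5000)   # residues per protein
disorder = rng.beta(1.5, 4.0, size=5000)                   # fraction disordered, 0..1

fingerprint, _, _ = np.histogram2d(np.log(lengths), disorder,
                                   bins=32, range=[[3, 9], [0, 1]], density=True)
# Distances between fingerprints (e.g. Euclidean) can feed tree construction
# without any sequence comparison or alignment.
print(fingerprint.shape)   # (32, 32)
```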
Structural methodologies for auditing SNOMED.
Wang, Yue; Halper, Michael; Min, Hua; Perl, Yehoshua; Chen, Yan; Spackman, Kent A
2007-10-01
SNOMED is one of the leading health care terminologies being used worldwide. As such, quality assurance is an important part of its maintenance cycle. Methodologies for auditing SNOMED based on structural aspects of its organization are presented. In particular, automated techniques for partitioning SNOMED into smaller groups of concepts based primarily on relationships patterns are defined. Two abstraction networks, the area taxonomy and p-area taxonomy, are derived from the partitions. The high-level views afforded by these abstraction networks form the basis for systematic auditing. The networks tend to highlight errors that manifest themselves as irregularities at the abstract level. They also support group-based auditing, where sets of purportedly similar concepts are focused on for review. The auditing methodologies are demonstrated on one of SNOMED's top-level hierarchies. Errors discovered during the auditing process are reported.
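The partition underlying an area taxonomy can be expressed very compactly: concepts are grouped by the set of relationship types they exhibit, so irregular relationship patterns surface as small or unexpected groups for focused review. The toy concept table below is invented and vastly simplified relative to SNOMED.

```python
from collections import defaultdict

# Group concepts into "areas" keyed by their set of relationship types.
concepts = {
    "Appendicitis": {"finding-site", "associated-morphology"},
    "Pneumonia":    {"finding-site", "associated-morphology", "causative-agent"},
    "Fracture":     {"finding-site", "associated-morphology"},
    "Poisoning":    {"causative-agent"},
}

areas = defaultdict(list)
for concept, rel_types in concepts.items():
    areas[frozenset(rel_types)].append(concept)   # one area per relationship-type set

for rels, members in areas.items():
    print(sorted(rels), "->", members)            # small areas invite auditing
```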
Challenges in Identifying Refugees in National Health Data Sets.
Semere, Wagahta; Yun, Katherine; Ahalt, Cyrus; Williams, Brie; Wang, Emily A
2016-07-01
To evaluate publicly available data sets to determine their utility for studying refugee health. We searched for keywords describing refugees in data sets within the Society of General Internal Medicine Dataset Compendium and the Inter-University Consortium for Political and Social Research database. We included in our analysis US-based data sets with publicly available documentation and a self-defined, health-related focus that allowed for an examination of patient-level factors. Of the 68 data sets that met the study criteria, 37 (54%) registered keyword matches related to refugees, but only 2 uniquely identified refugees. Few health data sets identify refugee status among participants, presenting barriers to understanding refugees' health and health care needs. Information about refugee status in national health surveys should include expanded demographic questions and focus on mental health and chronic disease.
Biomedical image segmentation using geometric deformable models and metaheuristics.
Mesejo, Pablo; Valsecchi, Andrea; Marrakchi-Kacem, Linda; Cagnoni, Stefano; Damas, Sergio
2015-07-01
This paper describes a hybrid level set approach for medical image segmentation. This new geometric deformable model combines region- and edge-based information with the prior shape knowledge introduced using deformable registration. Our proposal consists of two phases: training and test. The former implies the learning of the level set parameters by means of a Genetic Algorithm, while the latter is the proper segmentation, where another metaheuristic, in this case Scatter Search, derives the shape prior. In an experimental comparison, this approach has shown a better performance than a number of state-of-the-art methods when segmenting anatomical structures from different biomedical image modalities. Copyright © 2013 Elsevier Ltd. All rights reserved.
Analyzing the Language of Therapist Empathy in Motivational Interview based Psychotherapy
Xiao, Bo; Can, Dogan; Georgiou, Panayiotis G.; Atkins, David; Narayanan, Shrikanth S.
2016-01-01
Empathy is an important aspect of social communication, especially in medical and psychotherapy applications. Measures of empathy can offer insights into the quality of therapy. We use an N-gram language model based maximum likelihood strategy to classify empathic versus non-empathic utterances and report the precision and recall of classification for various parameters. High recall was obtained with unigram features, while bigram features achieved the highest F1-score. Based on the utterance level models, a group of lexical features are extracted at the therapy session level. The effectiveness of these features in modeling session level annotator perceptions of empathy is evaluated through correlation with expert-coded session level empathy scores. Our combined feature set achieved a correlation of 0.558 between predicted and expert-coded empathy scores. Results also suggest that the longer term empathy perception process may be more related to isolated empathic salient events. PMID:27602411
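As a rough illustration of the kind of n-gram maximum-likelihood classification described above, the following Python sketch trains per-class n-gram counts with add-one smoothing and labels an utterance by the class with the higher likelihood; the class names, tokenization, and toy utterances are illustrative assumptions, not the paper's implementation:

import math
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

class NgramMLClassifier:
    """Per-class n-gram language models compared by likelihood."""
    def __init__(self, n=2):
        self.n = n
        self.counts = {}   # class label -> Counter of n-grams
        self.totals = {}   # class label -> total n-gram count
        self.vocab = set()

    def fit(self, utterances, labels):
        for text, y in zip(utterances, labels):
            grams = ngrams(text.lower().split(), self.n)
            self.counts.setdefault(y, Counter()).update(grams)
            self.vocab.update(grams)
        self.totals = {y: sum(c.values()) for y, c in self.counts.items()}

    def log_likelihood(self, grams, y):
        v = len(self.vocab)
        c, t = self.counts[y], self.totals[y]
        # Add-one smoothing avoids zero probabilities for unseen n-grams.
        return sum(math.log((c[g] + 1.0) / (t + v)) for g in grams)

    def predict(self, text):
        grams = ngrams(text.lower().split(), self.n)
        return max(self.counts, key=lambda y: self.log_likelihood(grams, y))

clf = NgramMLClassifier(n=1)  # unigram model
clf.fit(["that sounds really hard for you", "next question please"],
        ["empathic", "non_empathic"])
print(clf.predict("that sounds difficult"))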
Rotolo, Federico; Paoletti, Xavier; Burzykowski, Tomasz; Buyse, Marc; Michiels, Stefan
2017-01-01
Surrogate endpoints are often used in clinical trials instead of well-established hard endpoints for practical convenience. The meta-analytic approach relies on two measures of surrogacy: one at the individual level and one at the trial level. In the survival data setting, a two-step model based on copulas is commonly used. We present a new approach which employs a bivariate survival model with an individual random effect shared between the two endpoints and correlated treatment-by-trial interactions. We fit this model using auxiliary mixed Poisson models. We study via simulations the operating characteristics of this mixed Poisson approach as compared to the two-step copula approach. We illustrate the application of the methods on two individual patient data meta-analyses in gastric cancer, in the advanced setting (4069 patients from 20 randomized trials) and in the adjuvant setting (3288 patients from 14 randomized trials).
Assessing the Readability of Medical Documents: A Ranking Approach.
Zheng, Jiaping; Yu, Hong
2018-03-23
The use of electronic health record (EHR) systems with patient engagement capabilities, including viewing, downloading, and transmitting health information, has recently grown tremendously. However, using these resources to engage patients in managing their own health remains challenging due to the complex and technical nature of the EHR narratives. Our objective was to develop a machine learning-based system to assess readability levels of complex documents such as EHR notes. We collected difficulty ratings of EHR notes and Wikipedia articles using crowdsourcing from 90 readers. We built a supervised model to assess readability based on relative orders of text difficulty using both surface text features and word embeddings. We evaluated system performance using the Kendall coefficient of concordance against human ratings. Our system achieved significantly higher concordance (.734) with human annotators than did a baseline using the Flesch-Kincaid Grade Level, a widely adopted readability formula (.531). The improvement was also consistent across different disease topics. This method's concordance with an individual human user's ratings was also higher than the concordance between different human annotators (.658). We explored methods to automatically assess the readability levels of clinical narratives. Our ranking-based system using simple textual features and easy-to-learn word embeddings outperformed a widely used readability formula. Our ranking-based method can predict relative difficulties of medical documents. It is not constrained to a predefined set of readability levels, a common design in many machine learning-based systems. Furthermore, the feature set does not rely on complex processing of the documents. One potential application of our readability ranking is personalization, allowing material to be matched to each patient's background knowledge. ©Jiaping Zheng, Hong Yu. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 23.03.2018.
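For reference, the Kendall coefficient of concordance used as the evaluation measure above can be computed as in this short sketch (ties are ignored for simplicity; the rater-by-item matrix is illustrative):

import numpy as np

def kendalls_w(ratings):
    """Kendall's W for an (m raters x n items) matrix of scores."""
    # Convert each rater's scores to ranks 1..n (no tie correction).
    ranks = np.argsort(np.argsort(ratings, axis=1), axis=1) + 1.0
    m, n = ranks.shape
    col_sums = ranks.sum(axis=0)
    s = ((col_sums - col_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

print(kendalls_w(np.array([[1, 2, 3, 4],
                           [2, 1, 3, 4],
                           [1, 2, 4, 3]])))  # ~0.82: strong agreement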
Using community-level metrics to monitor the effects of marine protected areas on biodiversity.
Soykan, Candan U; Lewison, Rebecca L
2015-06-01
Marine protected areas (MPAs) are used to protect species, communities, and their associated habitats, among other goals. Measuring MPA efficacy can be challenging, however, particularly when considering responses at the community level. We gathered 36 abundance and 14 biomass data sets on fish assemblages and used meta-analysis to evaluate the ability of 22 distinct community diversity metrics to detect differences in community structure between MPAs and nearby control sites. We also considered the effects of 6 covariates (MPA size, MPA age, the interaction of MPA size and age, latitude, total species richness, and level of protection) on each metric. Some common metrics, such as species richness and Shannon diversity, did not differ consistently between MPA and control sites, whereas other metrics, such as total abundance and biomass, were consistently different across studies. Metric responses derived from the biomass data sets were more consistent than those based on the abundance data sets, suggesting that community-level biomass differs more predictably than abundance between MPA and control sites. Covariate analyses indicated that level of protection, latitude, MPA size, and the interaction between MPA size and age affect metric performance. These results highlight a handful of metrics, several of which are little known, that could be used to meet the increasing demand for community-level indicators of MPA effectiveness. © 2015 Society for Conservation Biology.
Linguistic Knowledge and Reasoning for Error Diagnosis and Feedback Generation.
ERIC Educational Resources Information Center
Delmonte, Rodolfo
2003-01-01
Presents four sets of natural language processing-based exercises for which error correction and feedback are produced by means of a rich database in which linguistic information is encoded either at the lexical or the grammatical level. (Author/VWL)
Optimizing Artillery Fires at the Brigade Level
2017-06-09
order to determine changes needed to meet the demands placed upon units based on their mission set. The structured progression of increased readiness in...Brigade Combat Team (BCT). Through the construct of Doctrine, Organizational, Training, Material, Leadership and Education, Personnel, and Facilities...
ERIC Educational Resources Information Center
Ang, Lynn; Brooker, Elizabeth; Stephen, Christine
2017-01-01
This paper offers a discussion of the literature of an under-developed area of early years research--the exploration of childminding or home-based childcare and the contribution which this form of provision makes for children and families. Despite growing interest in childminding at the policy level and some international research on understanding…
ERIC Educational Resources Information Center
Hirumi, Atsusi; Appelman, Bob; Rieber, Lloyd; Van Eck, Richard
2010-01-01
In this three part series, four professors who teach graduate level courses on the design of instructional video games discuss their perspectives on preparing instructional designers to optimize game-based learning. Part I set the context for the series and one of four panelists discussed what he believes instructional designers should know about…
Fostering Self-Efficacy through Time Management in an Online Learning Environment
ERIC Educational Resources Information Center
Terry, Krista P.; Doolittle, Peter E.
2008-01-01
In this study, we investigated the use of a web-based tool designed to influence levels of student self-efficacy by engaging participants in a time management strategy. On a daily basis for 16 days, a total of 64 undergraduate and graduate students engaged in the web-based time management tool in which students set goals regarding how they planned…
An In-Law Comes To Stay: Examination of Interdisciplinary Conflict in a School-Based Health Center.
ERIC Educational Resources Information Center
Fast, Jonathan D.
2003-01-01
Social workers often work in settings where other professions exert a higher level of control. The organizational literature on causes of conflict and conflict resolution is briefly reviewed. A case study of a newly opened school-based health center provides an opportunity to analyze conflicts between the school and health center personnel and…
Public Speaking Anxiety: Comparing Face-to-Face and Web-Based Speeches
ERIC Educational Resources Information Center
Campbell, Scott; Larson, James
2013-01-01
This study examines whether or not students have a different level of anxiety when giving a speech to a group of people in a traditional face-to-face classroom setting versus giving a speech into a camera, with the audience visible on a projected screen, using distance or web-based technology. The study included approximately 70 students.…
Phonological and Executive Working Memory in L2 Task-Based Speech Planning and Performance
ERIC Educational Resources Information Center
Wen, Zhisheng
2016-01-01
The present study sets out to explore the distinctive roles played by two working memory (WM) components in various aspects of L2 task-based speech planning and performance. A group of 40 post-intermediate proficiency level Chinese EFL learners took part in the empirical study. Following the tenets and basic principles of the…
Need-Based Aid and College Persistence: The Effects of the Ohio College Opportunity Grant
ERIC Educational Resources Information Center
Bettinger, Eric
2015-01-01
This article exploits a natural experiment to estimate the effects of need-based aid policies on first-year college persistence rates. In fall 2006, Ohio abruptly adopted a new state financial aid policy that was significantly more generous than the previous plan. Using student-level data and very narrowly defined sets of students, I estimate a…
ERIC Educational Resources Information Center
Zimbardi, Kirsten; Bugarcic, Andrea; Colthorpe, Kay; Good, Jonathan P.; Lluka, Lesley J.
2013-01-01
Science graduates require critical thinking skills to deal with the complex problems they will face in their 21st century workplaces. Inquiry-based curricula can provide students with the opportunities to develop such critical thinking skills; however, evidence suggests that an inappropriate level of autonomy provided to underprepared students…
NASA Astrophysics Data System (ADS)
Frantzeskou, Georgia; Stamatatos, Efstathios; Gritzalis, Stefanos
Source code authorship analysis is the particular field that attempts to identify the author of a computer program by treating each program as a linguistically analyzable entity. This is usually based on other undisputed program samples from the same author. There are several cases where the application of such a method could be of a major benefit, such as tracing the source of code left in the system after a cyber attack, authorship disputes, proof of authorship in court, etc. In this paper, we present our approach, which is based on byte-level n-gram profiles and is an extension of a method that has been successfully applied to natural language text authorship attribution. We propose a simplified profile and a new similarity measure which is less complicated than the algorithm followed in text authorship attribution and seems more suitable for source code identification, since it is better able to deal with very small training sets. Experiments were performed on two different data sets, one with programs written in C++ and the second with programs written in Java. Unlike the traditional language-dependent metrics used by previous studies, our approach can be applied to any programming language with no additional cost. The presented accuracy rates are much better than the best reported results for the same data sets.
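A minimal sketch of the byte-level n-gram idea follows; the n-gram length and profile size are illustrative choices, and the set-intersection similarity is one simple stand-in for the paper's measure, not its exact algorithm:

from collections import Counter

def byte_ngram_profile(source: bytes, n=4, top=1500):
    """The `top` most frequent byte n-grams of a program (a simplified profile)."""
    counts = Counter(source[i:i + n] for i in range(len(source) - n + 1))
    return {g for g, _ in counts.most_common(top)}

def attribute(disputed: bytes, author_samples: dict, n=4, top=1500):
    """Assign the disputed program to the author whose profile shares
    the most n-grams with the disputed program's profile."""
    dp = byte_ngram_profile(disputed, n, top)
    return max(author_samples,
               key=lambda a: len(dp & byte_ngram_profile(author_samples[a], n, top)))

samples = {"alice": b"for i in range(10): print(i)",
           "bob": b"while True:\n    pass"}
print(attribute(b"for j in range(5): print(j)", samples, n=3, top=100))

Because the profiles operate on raw bytes rather than tokens of a particular grammar, the same code runs unchanged on C++, Java, or any other language, which is the language-independence property claimed above.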
Douma, Johanna G; Volkers, Karin M; Engels, Gwenda; Sonneveld, Marieke H; Goossens, Richard H M; Scherder, Erik J A
2017-04-28
Despite the detrimental effects of physical inactivity, older adults, and especially aged residents of residential care settings, may spend much time in inactive behavior. This may be partly due to their poorer physical condition; however, there may also be other, setting-related factors that influence the amount of inactivity. The aim of this review was to identify setting-related factors (including the social and physical environment) that may contribute to the amount of older adults' physical inactivity in a wide range of residential care settings (e.g., nursing homes, assisted care facilities). Five databases were systematically searched for eligible studies, using the key words 'inactivity', 'care facilities', and 'older adults', including their synonyms and MeSH terms. Additional studies were selected from references used in articles included from the search. Based on specific eligibility criteria, a total of 12 studies were included. Quality of the included studies was assessed using the Mixed Methods Appraisal Tool (MMAT). Based on studies using different methodologies (e.g., interviews and observations), and of different quality (assessed quality range: 25-100%), we report several aspects related to the physical environment and caregivers. Factors of the physical environment that may be related to physical inactivity included, among others, the environment's compatibility with the abilities of a resident, the presence of equipment, the accessibility, security, comfort, and aesthetics of the environment/corridors, and possibly the presence of some specific areas. Caregiver-related factors included staffing levels, the available time, and the amount and type of care being provided. Inactivity levels in residential care settings may be reduced by improving several features of the physical environment and with the help of caregivers. Intervention studies could be performed in order to gain more insight into causal effects of improving setting-related factors on physical inactivity of aged residents.
NASA Astrophysics Data System (ADS)
Amanda, A. R.; Widita, R.
2016-03-01
The aim of this research is to compare several lung image segmentation methods based on performance evaluation parameters (Mean Square Error (MSE) and Peak Signal to Noise Ratio (PSNR)). In this study, the methods compared were connected threshold, neighborhood connected, and threshold level set segmentation on images of the lungs. These three methods require one important parameter, i.e. the threshold. The threshold interval was obtained from the histogram of the original image. The software used to segment the images was InsightToolkit-4.7.0 (ITK). This research used 5 lung images for analysis. The results were then compared using the performance evaluation parameters, computed in MATLAB. A segmentation method is said to have good quality if it has the smallest MSE value and the highest PSNR. The results show that connected threshold gives the best quality for four of the sample images, while threshold level set segmentation is best for the remaining one. Therefore, it can be concluded that the connected threshold method is better than the other two methods for these cases.
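Both evaluation measures are straightforward to compute; a small sketch (assuming 8-bit images stored as NumPy arrays):

import numpy as np

def mse(reference, segmented):
    """Mean square error between two images of equal shape."""
    reference = reference.astype(np.float64)
    segmented = segmented.astype(np.float64)
    return np.mean((reference - segmented) ** 2)

def psnr(reference, segmented, max_val=255.0):
    """Peak signal to noise ratio in dB; higher means closer images."""
    err = mse(reference, segmented)
    # A perfect match has zero error and, formally, infinite PSNR.
    return float("inf") if err == 0 else 10.0 * np.log10(max_val ** 2 / err)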
Segmentation of mouse dynamic PET images using a multiphase level set method
NASA Astrophysics Data System (ADS)
Cheng-Liao, Jinxiu; Qi, Jinyi
2010-11-01
Image segmentation plays an important role in medical diagnosis. Here we propose an image segmentation method for four-dimensional mouse dynamic PET images. We consider that voxels inside each organ have similar time activity curves. The use of tracer dynamic information allows us to separate regions that have similar integrated activities in a static image but different temporal responses. We develop a multiphase level set method that utilizes both the spatial and temporal information in a dynamic PET data set. Different weighting factors are assigned to each image frame based on the noise level and the activity difference among organs of interest. We used a weighted absolute difference function in the data matching term to increase the robustness of the estimate and to avoid over-partition of regions with high contrast. We validated the proposed method using computer simulated dynamic PET data, as well as real mouse data from a microPET scanner, and compared the results with those of a dynamic clustering method. The results show that the proposed method produces smoother segments with fewer misclassified voxels.
An improved level set method for brain MR images segmentation and bias correction.
Chen, Yunjie; Zhang, Jianwei; Macione, Jim
2009-10-01
Intensity inhomogeneities cause considerable difficulty in the quantitative analysis of magnetic resonance (MR) images. Thus, bias field estimation is a necessary step before quantitative analysis of MR data can be undertaken. This paper presents a variational level set approach to bias correction and segmentation for images with intensity inhomogeneities. Our method is based on an observation that intensities in a relatively small local region are separable, despite the inseparability of the intensities in the whole image caused by the overall intensity inhomogeneity. We first define a localized K-means-type clustering objective function for image intensities in a neighborhood around each point. The cluster centers in this objective function have a multiplicative factor that estimates the bias within the neighborhood. The objective function is then integrated over the entire domain to define the data term in the level set framework. Our method is able to capture bias of quite general profiles. Moreover, it is robust to initialization, and thereby allows fully automated applications. The proposed method has been used for images of various modalities with promising results.
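In symbols, the localized objective described above takes roughly the following form (the notation is assumed for illustration, not copied from the paper): with I(x) the observed intensity, b(y) the bias estimate at the center y of a neighborhood, c_i the cluster centers, K a window function restricting the fit to that neighborhood, and Omega_i the regions,

E_y(b, c, \{\Omega_i\}) = \sum_{i=1}^{N} \int_{\Omega_i} K(y - x)\, \bigl| I(x) - b(y)\, c_i \bigr|^2 \, dx,
\qquad
E = \int_{\Omega} E_y(b, c, \{\Omega_i\})\, dy,

so that minimizing E jointly recovers the bias field b, the cluster centers c_i, and the segmentation.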
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, L.H., E-mail: Luhui.Han@tum.de; Hu, X.Y., E-mail: Xiangyu.Hu@tum.de; Adams, N.A., E-mail: Nikolaus.Adams@tum.de
In this paper we present a scale separation approach for multi-scale modeling of free-surface and two-phase flows with complex interface evolution. By performing a stimulus-response operation on the level-set function representing the interface, separation of resolvable and non-resolvable interface scales is achieved efficiently. Uniform positive and negative shifts of the level-set function are used to determine non-resolvable interface structures. Non-resolved interface structures are separated from the resolved ones and can be treated by a mixing model or a Lagrangian-particle model in order to preserve mass. Resolved interface structures are treated by the conservative sharp-interface model. Since the proposed scale separation approach does not rely on topological information, unlike in previous work, it can be implemented in a straightforward fashion into a given level set based interface model. A number of two- and three-dimensional numerical tests demonstrate that the proposed method is able to cope with complex interface variations accurately and significantly increases robustness against underresolved interface structures.
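The uniform-shift test can be sketched in a few lines; here a positive-phase structure that vanishes under a downward shift of the level-set field is flagged as non-resolvable. This is a simplification of the stimulus-response operation described above, with eps standing in for a grid-spacing-based threshold:

import numpy as np
from scipy import ndimage

def separate_scales(phi, eps):
    """Split positive-phase level-set structures into resolved/non-resolved."""
    labels, count = ndimage.label(phi > 0.0)
    resolved = np.zeros(phi.shape, dtype=bool)
    unresolved = np.zeros(phi.shape, dtype=bool)
    for k in range(1, count + 1):
        region = labels == k
        # Structures that survive the downward shift phi - eps are resolvable;
        # thin structures with no interior point above eps are not.
        if (phi[region] > eps).any():
            resolved |= region
        else:
            unresolved |= region
    return resolved, unresolved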
Demoralization in Patients With Substance Use and Co-Occurring Psychiatric Disorders.
De Weert, Gerdien H; Markus, Wiebren; Kissane, David W; De Jong, Cornelis A J
2017-01-01
In recent years, treatment of substance use disorder has rekindled emphasis on recovery, which, being a gradual process, starts with remoralization. In this study, we examine the level of demoralization throughout the treatment process for patients with comorbid substance dependence and psychiatric disorders. A total of 217 patients with co-occurring disorders and 179 community-based individuals participated in this study. Demoralization was measured twice over the first month of inpatient treatment. In contrast with the community sample, we found high levels of demoralization in the clinical cohort, with 86% of patients having demoralization scores above threshold. During the first month there was a statistically significant reduction in demoralization scores. However, clinically relevant change appeared limited, with only 3% of patients moving from dysfunctional to functional status in this naturalistic setting without targeted intervention aimed at remoralization. Although the level of demoralization improved significantly during the first month of treatment, patients remained strongly demoralized, and clinically relevant improvement was limited. It could be worthwhile to set up targeted interventions aimed at remoralization. Furthermore, we advocate for the assessment of demoralization in the clinical setting to monitor patients' treatment outcomes.
Curriculum-Based Handwriting Programs: A Systematic Review With Effect Sizes
Engel, Courtney; Lillie, Kristin; Zurawski, Sarah; Travers, Brittany G.
2018-01-01
Challenges with handwriting can have a negative impact on academic performance, and these challenges are commonly addressed by occupational therapy practitioners in school settings. This systematic review examined the efficacy of curriculum-based interventions to address children’s handwriting difficulties in the classroom (preschool to second grade). We reviewed and computed effect sizes for 13 studies (11 Level II, 2 Level III) identified through a comprehensive database search. The evidence shows that curriculum-based handwriting interventions resulted in small- to medium-sized improvements in legibility, a commonly reported challenge in this age group. The evidence for whether these interventions improved speed is mixed, and the evidence for whether they improved fluency is insufficient. No clear support was found for one handwriting program over another. These results suggest that curriculum-based interventions can lead to improvements in handwriting legibility, but Level I research is needed to validate the efficacy of these curricula. PMID:29689170
Dishman, Rod K; Vandenberg, Robert J; Motl, Robert W; Wilson, Mark G; DeJoy, David M
2010-08-01
The effectiveness of an intervention depends on its dose and on moderators of dose, which usually are not studied. The purpose of the study was to determine whether goal setting and theory-based moderators of goal setting had dose relations with increases in goal-related physical activity during a successful workplace intervention. A group-randomized 12-week intervention that included personal goal setting was implemented in fall 2005, with a multiracial/ethnic sample of employees at 16 geographically diverse worksites. Here, we examined dose-related variables in the cohort of participants (N = 664) from the 8 worksites randomized to the intervention. Participants in the intervention exceeded 9000 daily pedometer steps and 300 weekly minutes of moderate-to-vigorous physical activity (MVPA) during the last 6 weeks of the study, which approximated or exceeded current public health guidelines. Linear growth modeling indicated that participants who set higher goals and sustained higher levels of self-efficacy, commitment and intention about attaining their goals had greater increases in pedometer steps and MVPA. The relation between change in participants' satisfaction with current physical activity and increases in physical activity was mediated by increases in self-set goals. The results show a dose relation of increased physical activity with changes in goal setting, satisfaction, self-efficacy, commitment and intention, consistent with goal-setting theory.
Peacock, Stuart J; Mitton, Craig; Ruta, Danny; Donaldson, Cam; Bate, Angela; Hedden, Lindsay
2010-10-01
Economists' approaches to priority setting focus on the principles of opportunity cost, marginal analysis and choice under scarcity. These approaches are based on the premise that it is possible to design a rational priority setting system that will produce legitimate changes in resource allocation. However, beyond issuing guidance at the national level, economic approaches to priority setting have had only a moderate impact in practice. In particular, local health service organizations - such as health authorities, health maintenance organizations, hospitals and healthcare trusts - have had difficulty implementing evidence from economic appraisals. Yet, in the context of making decisions between competing claims on scarce health service resources, economic tools and thinking have much to offer. The purpose of this article is to describe and discuss ten evidence-based guidelines for the successful design and implementation of a program budgeting and marginal analysis (PBMA) priority setting exercise. PBMA is a framework that explicitly recognizes the need to balance pragmatic and ethical considerations with economic rationality when making resource allocation decisions. While the ten guidelines are drawn from the PBMA framework, they may be generalized across a range of economic approaches to priority setting.
Wang, W; Huang, S; Hou, W; Liu, Y; Fan, Q; He, A; Wen, Y; Hao, J; Guo, X; Zhang, F
2017-10-01
Several genome-wide association studies (GWAS) of bone mineral density (BMD) have successfully identified multiple susceptibility genes, yet isolated susceptibility genes are often difficult to interpret biologically. The aim of this study was to unravel the genetic background of BMD at the pathway level by integrating BMD GWAS data with genome-wide expression quantitative trait loci (eQTLs) and methylation quantitative trait loci (meQTLs) data. We employed the GWAS datasets of BMD from the Genetic Factors for Osteoporosis Consortium (GEFOS), comprising 32,735 individuals for femoral neck BMD, 28,498 for lumbar spine BMD, and 8,143 for forearm BMD. Genome-wide eQTLs (containing 923,021 eQTLs) and meQTLs (containing 683,152 unique methylation sites with local meQTLs) data sets were collected from recently published studies. Gene scores were first calculated by summary data-based Mendelian randomisation (SMR) software and meQTL-aligned GWAS results. Gene set enrichment analysis (GSEA) was then applied to identify BMD-associated gene sets with a predefined significance level of 0.05. We identified multiple gene sets associated with BMD in one or more regions, including relevant known biological gene sets such as the Reactome Circadian Clock (GSEA p-value = 1.0 × 10^-4 for lumbar spine and 2.7 × 10^-2 for femoral neck BMD in eQTLs-based GSEA) and insulin-like growth factor receptor binding (GSEA p-value = 5.0 × 10^-4 for femoral neck and 2.6 × 10^-2 for lumbar spine BMD in meQTLs-based GSEA). Our results provide novel clues for subsequent functional analysis of bone metabolism and illustrate the benefit of integrating eQTLs and meQTLs data into pathway association analysis for genetic studies of complex human diseases. Cite this article: W. Wang, S. Huang, W. Hou, Y. Liu, Q. Fan, A. He, Y. Wen, J. Hao, X. Guo, F. Zhang. Integrative analysis of GWAS, eQTLs and meQTLs data suggests that multiple gene sets are associated with bone mineral density. Bone Joint Res 2017;6:572-576. © 2017 Wang et al.
Aggarwal, Rohit; Rider, Lisa G; Ruperto, Nicolino; Bayat, Nastaran; Erman, Brian; Feldman, Brian M; Oddis, Chester V; Amato, Anthony A; Chinoy, Hector; Cooper, Robert G; Dastmalchi, Maryam; Fiorentino, David; Isenberg, David; Katz, James D; Mammen, Andrew; de Visser, Marianne; Ytterberg, Steven R; Lundberg, Ingrid E; Chung, Lorinda; Danko, Katalin; García-De la Torre, Ignacio; Song, Yeong Wook; Villa, Luca; Rinaldi, Mariangela; Rockette, Howard; Lachenbruch, Peter A; Miller, Frederick W; Vencovsky, Jiri
2017-05-01
To develop response criteria for adult dermatomyositis (DM) and polymyositis (PM). Expert surveys, logistic regression, and conjoint analysis were used to develop 287 definitions using core set measures. Myositis experts rated greater improvement among multiple pairwise scenarios in conjoint analysis surveys, where different levels of improvement in 2 core set measures were presented. The PAPRIKA (Potentially All Pairwise Rankings of All Possible Alternatives) method determined the relative weights of core set measures and conjoint analysis definitions. The performance characteristics of the definitions were evaluated on patient profiles using expert consensus (gold standard) and were validated using data from a clinical trial. The nominal group technique was used to reach consensus. Consensus was reached for a conjoint analysis-based continuous model using absolute per cent change in core set measures (physician, patient, and extramuscular global activity, muscle strength, Health Assessment Questionnaire, and muscle enzyme levels). A total improvement score (range 0-100), determined by summing scores for each core set measure, was based on improvement in and relative weight of each core set measure. Thresholds for minimal, moderate, and major improvement were ≥20, ≥40, and ≥60 points in the total improvement score. The same criteria were chosen for juvenile DM, with different improvement thresholds. Sensitivity and specificity in DM/PM patient cohorts were 85% and 92%, 90% and 96%, and 92% and 98% for minimal, moderate, and major improvement, respectively. Definitions were validated in the clinical trial analysis for differentiating the physician rating of improvement (p<0.001). The response criteria for adult DM/PM consisted of the conjoint analysis model based on absolute per cent change in 6 core set measures, with thresholds for minimal, moderate, and major improvement. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Optimal inverse functions created via population-based optimization.
Jennings, Alan L; Ordóñez, Raúl
2014-06-01
Finding optimal inputs for a multiple-input, single-output system is taxing for a system operator. Population-based optimization is used to create sets of functions that produce a locally optimal input based on a desired output. An operator or higher level planner could use one of the functions in real time. For the optimization, each agent in the population uses the cost and output gradients to take steps lowering the cost while maintaining their current output. When an agent reaches an optimal input for its current output, additional agents are generated in the output gradient directions. The new agents then settle to the local optima for the new output values. The set of associated optimal points forms an inverse function, via spline interpolation, from a desired output to an optimal input. In this manner, multiple locally optimal functions can be created. These functions are naturally clustered in input and output spaces, allowing for a continuous inverse function. The operator selects the best cluster over the anticipated range of desired outputs and adjusts the set point (desired output) while maintaining optimality. This reduces the demand from controlling multiple inputs to controlling a single set point, with no loss in performance. Results are demonstrated on a sample set of functions and on a robot control problem.
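The final interpolation step can be illustrated with a toy two-input, one-output system. Here a constrained optimizer stands in for the paper's population of gradient-following agents; the system g, cost J, and set-point range are invented for the example:

import numpy as np
from scipy.interpolate import CubicSpline
from scipy.optimize import minimize

g = lambda u: u[0] + u[1]                    # system output
J = lambda u: (u[0] - 1.0)**2 + 2.0*u[1]**2  # cost to minimize

def optimal_input(y):
    """Lowest-cost input that produces output y."""
    cons = {"type": "eq", "fun": lambda u: g(u) - y}
    return minimize(J, x0=np.zeros(2), constraints=cons).x

set_points = np.linspace(-1.0, 3.0, 9)
optimal_inputs = np.array([optimal_input(y) for y in set_points])

# Spline through the optimal points: desired output -> optimal input,
# so the operator adjusts one set point instead of two raw inputs.
inverse = CubicSpline(set_points, optimal_inputs)
print(inverse(1.5))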
Beyond the individual victim: multilevel consequences of abusive supervision in teams.
Farh, Crystal I C; Chen, Zhijun
2014-11-01
We conceptualize a multilevel framework that examines the manifestation of abusive supervision in team settings and its implications for the team and individual members. Drawing on Hackman's (1992) typology of ambient and discretionary team stimuli, our model features team-level abusive supervision (the average level of abuse reported by team members) and individual-level abusive supervision as simultaneous and interacting forces. We further draw on team-relevant theories of social influence to delineate two proximal outcomes of abuse (members' organization-based self-esteem (OBSE) at the individual level and relationship conflict at the team level) that channel the independent and interactive effects of individual- and team-level abuse onto team members' voice, team-role performance, and turnover intentions. Results from a field study and a scenario study provided support for these multilevel pathways. We conclude that abusive supervision in team settings holds toxic consequences for the team and individual, and offer practical implications as well as suggestions for future research on abusive supervision as a multilevel phenomenon. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Ruddell, B. L.; Merwade, V.
2010-12-01
Hydrology and geoscience education at the undergraduate and graduate levels may benefit greatly from a structured approach to pedagogy that utilizes modeling, authentic data, and simulation exercises to engage students in practice-like activities. Extensive evidence in the educational literature suggests that students retain more of their instruction, and attain higher levels of mastery over content, when interactive and practice-like activities are used to contextualize traditional lecture-based and theory-based instruction. However, it is also important that these activities carefully link the use of data and modeling to abstract theory, to promote transfer of knowledge to other contexts. While this type of data-based activity has been practiced in the hydrology classroom for decades, the hydrology community still lacks a set of standards and a mechanism for community-based development, publication, and review of this type of curriculum material. A community-based initiative is underway to develop a set of curriculum materials to teach hydrology in the engineering and geoscience university classroom using outcomes-based, pedagogically rigorous modules that use authentic data and modeling experiences to complement traditional lecture-based instruction. A preliminary design for a community cyberinfrastructure for shared module development and publication, and for module topics and outcomes, metadata, and module interoperability standards, will be presented, along with the results of a series of community surveys and workshops informing this design.
Continued Evaluation of Gear Condition Indicator Performance on Rotorcraft Fleet
NASA Technical Reports Server (NTRS)
Delgado, Irebert R.; Dempsey, Paula J.; Antolick, Lance J.; Wade, Daniel R.
2013-01-01
This paper details analyses of condition indicator performance for the helicopter nose gearbox within the U.S. Army's Condition-Based Maintenance Program. Ten nose gearbox data sets underwent two specific analyses. A mean condition indicator level analysis was performed where condition indicator performance was based on a 'batting average' measured before and after part replacement. Two specific condition indicators, Diagnostic Algorithm 1 and Sideband Index, were found to perform well for the data sets studied. A condition indicator versus gear wear analysis was also performed, where gear wear photographs and descriptions from Army tear-down analyses were categorized based on ANSI/AGMA 1010-E95 standards. Seven nose gearbox data sets were analyzed and correlated with condition indicators Diagnostic Algorithm 1 and Sideband Index. Both were found to be most responsive to gear wear cases of micropitting and spalling. Input pinion nose gearbox condition indicators were found to be more responsive to part replacement during overhaul than their corresponding output gear nose gearbox condition indicators.
LAMMR world data base documentation support and demonstrations
NASA Technical Reports Server (NTRS)
Chin, R.; Beaudet, P.
1980-01-01
The primary purpose of the World Surface Map is to provide the LAMMR subsystem with world surface type classifications that are used to set up LAMMR LEVEL II process control. This data base will be accessed solely by the LAMMR subsystem. The SCATT and ALT subsystems will access the data base indirectly through the T_b (Brightness Temperature) Data Bank, where the surface types were updated from a priori to current classification, and where the surface types were organized on an orbital subtrack basis. The single most important factor in the design of the World Surface Maps is the ease of access to the information, while the complexity of generating these maps is of lesser importance because their generation is a one-time, off-line process. The World Surface Map provides storage of information with a resolution of 7 km necessary to set flags concerning the earth's features, with a different set of maps for each month of the year.
Predictors of gender achievement in physical science at the secondary level
NASA Astrophysics Data System (ADS)
Kozlenko, Brittany Hunter
This study used the 2009 National Assessment of Educational Progress (NAEP) science restricted data-set for twelfth graders. The NAEP data used in this research study are derived from a sample group of 11,100 twelfth grade students that represented a national population of over 3,000,000 twelfth grade students enrolled in science in the United States in 2009. The researcher chose the NAEP data set because it provided a national sample using uniform questions. This study investigated how the factors of socioeconomic status (SES), parental education level, mode of instruction, and affective disposition affect twelfth grade students' physical science achievement levels in school, for the sample population and for gender subgroups. The factors mode of instruction and affective disposition were built through factor analysis based on available questions from the student surveys. All four factors were found to be significant predictors of physical science achievement for the sample population. NAEP exams are administered to a national sample that represents the population of American students enrolled in public and private schools. This was a non-experimental study that adds to the literature on factors that impact physical science achievement for both genders. A gender gap is essentially nonexistent at the fourth grade level but appears at the eighth grade level in science, based on information from NAEP (NCES, 1997). The results of the study can be used to make recommendations for policy changes to diminish this gender gap in the future. Educators need to be using research to make instructional decisions; research-based instruction helps all students.
Item Difficulty in the Evaluation of Computer-Based Instruction: An Example from Neuroanatomy
Chariker, Julia H.; Naaz, Farah; Pani, John R.
2012-01-01
This article reports large item effects in a study of computer-based learning of neuroanatomy. Outcome measures of the efficiency of learning, transfer of learning, and generalization of knowledge diverged by a wide margin across test items, with certain sets of items emerging as particularly difficult to master. In addition, the outcomes of comparisons between instructional methods changed with the difficulty of the items to be learned. More challenging items better differentiated between instructional methods. This set of results is important for two reasons. First, it suggests that instruction may be more efficient if sets of consistently difficult items are the targets of instructional methods particularly suited to them. Second, there is wide variation in the published literature regarding the outcomes of empirical evaluations of computer-based instruction. As a consequence, many questions arise as to the factors that may affect such evaluations. The present paper demonstrates that the level of challenge in the material that is presented to learners is an important factor to consider in the evaluation of a computer-based instructional system. PMID:22231801
Robitaille, Yvonne; Laforest, Sophie; Fournier, Michel; Gauvin, Lise; Parisien, Manon; Corriveau, Hélène; Trickey, Francine; Damestoy, Nicole
2005-01-01
Objectives. We investigated the effectiveness of a group-based exercise intervention to improve balancing ability among older adults delivered in natural settings by staff in local community organizations. Methods. The main component of the intervention consisted of biweekly group-based exercise sessions conducted over 12 weeks by a professional, coupled with home-based exercises. In a quasiexperimental design, 10 community organizations working with older adults offered the intervention to groups of 5 to 15 persons concerned about falls, while 7 organizations recruited similar groups to participate in the control arm of the study. Participants (98 experimental and 102 control) underwent balance assessments by a physiotherapist at registration and 3 months later. Results. Eighty-nine percent of participants attended the 3-month measurement session (n=177). A linear regression analysis showed that after adjusting for baseline levels of balance and demographic and health characteristics, the intervention significantly improved static balance and mobility. Conclusion. Structured, group-based exercise programs offered by community organizations in natural settings can successfully increase balancing ability among community-dwelling older adults concerned about falls. PMID:16195514
Two Micron Laser Technology Advancements at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Singh, Upendra N.
2010-01-01
An Independent Laser Review Panel set up to examine NASA's space-based lidar missions and the technology readiness of lasers appropriate for space-based lidars indicated a critical need for an integrated research and development strategy to move laser transmitter technology from low technical readiness levels to the higher levels required for space missions. Based on the review, a multiyear Laser Risk Reduction Program (LRRP) was initiated by NASA in 2002 to develop technologies that ensure the successful development of the broad range of lidar missions envisioned by NASA. This presentation will provide an overview of the development of pulsed 2-micron solid-state laser technologies at NASA Langley Research Center for enabling space-based measurement of wind and carbon dioxide.
Updating the immunology curriculum in clinical laboratory science.
Stevens, C D
2000-01-01
To determine essential content areas of immunology/serology courses at the clinical laboratory technician (CLT) and clinical laboratory scientist (CLS) levels. A questionnaire was designed which listed all major topics in immunology and serology. Participants were asked to place a check beside each topic covered. For an additional list of serological and immunological laboratory testing, participants were asked to indicate if each test was performed in either the didactic or clinical setting, or not performed at all. A national survey of 593 NAACLS approved CLT and CLS programs was conducted by mail under the auspices of ASCLS. Responses were obtained from 158 programs. Respondents from all across the United States included 60 CLT programs, 48 hospital-based CLS programs, 45 university-based CLS programs, and 5 university-based combined CLT and CLS programs. The survey was designed to enumerate major topics included in immunology and serology courses by a majority of participants at two distinct educational levels, CLT and CLS. Laboratory testing routinely performed in student laboratories as well as in the clinical setting was also determined for these two levels of practitioners. Certain key topics were common to most immunology and serology courses. There were some notable differences in the depth of courses at the CLT and CLS levels. Laboratory testing associated with these courses also differed at the two levels. Testing requiring more detailed interpretation, such as antinuclear antibody patterns (ANAs), was mainly performed by CLS students only. There are certain key topics as well as specific laboratory tests that should be included in immunology/serology courses at each of the two different educational levels to best prepare students for the workplace. Educators can use this information as a guide to plan a curriculum for such courses.
HIV drug resistance surveillance for prioritizing treatment in resource-limited settings
Walensky, Rochelle P.; Weinstein, Milton C.; Yazdanpanah, Yazdan; Losina, Elena; Mercincavage, Lauren M.; Touré, Siaka; Divi, Nomita; Anglaret, Xavier; Goldie, Sue J.; Freedberg, Kenneth A.
2008-01-01
Background Sentinel testing programs for HIV drug resistance in resource-limited settings can inform policy on antiretroviral therapy (ART) and drug sequencing. Objective To examine the value of resistance surveillance in influencing recommendations toward effective and cost-effective sequencing of ART regimens. Methods A state-transition model of HIV infection was adapted to simulate clinical care in Côte d’Ivoire and evaluate the incremental cost-effectiveness of (1) no ART; (2) ART beginning with a non-nucleoside reverse transcriptase inhibitor (NNRTI)-based regimen followed by a boosted protease inhibitor (PI)-based regimen; and (3) ART beginning with a boosted PI-based regimen followed by an NNRTI-based regimen. Results At a 5% prevalence of NNRTI resistance, a strategy that started with a PI-based regimen had a smaller health benefit and higher cost-effectiveness ratio than a strategy that started with an NNRTI-based regimen (cost-effectiveness ratio $910/year of life saved). Results consistently favored initiation with an NNRTI-based regimen, regardless of the population prevalence of NNRTI resistance (up to 76%) and the efficacy of an NNRTI-based regimen in the setting of resistance. The most influential parameters on the cost-effectiveness of sequencing strategies were boosted PI-based regimen costs and the efficacy of this regimen when used as second-line therapy. Conclusions Drug costs and treatment efficacies, but not NNRTI resistance levels, were most influential in determining optimal HIV drug sequencing in Côte d’Ivoire. Results of surveillance for NNRTI resistance should not be used as a major guide to treatment policy in resource-limited settings. PMID:17457091
Effect of Liquid Penetrant Sensitivity on Probability of Detection
NASA Technical Reports Server (NTRS)
Parker, Bradford H.
2011-01-01
The objective of the task is to investigate the effect of liquid penetrant sensitivity level on probability of detection (POD) of cracks in various metals. NASA-STD-5009 currently requires the use of only sensitivity level 4 liquid penetrants for NASA Standard Level inspections. This requirement is based on the fact that the data used to establish the reliably detectable flaw sizes for penetrant inspection were from studies performed in the 1970s using penetrants deemed to be equivalent only to modern day sensitivity level 4 penetrants. However, many NDE contractors supporting NASA Centers routinely use sensitivity level 3 penetrants. Because of the new NASA-STD-5009 requirement, these contractors will have to either shift to sensitivity level 4 penetrants or perform formal POD demonstration tests to qualify their existing process. We propose a study to compare the POD generated for two penetrant manufacturers, Sherwin and Magnaflux, and for the two most common penetrant inspection methods: water washable and post-emulsifiable, hydrophilic. NDE vendors local to GSFC will be employed. A total of six inspectors will inspect a set of crack panels with a broad range of fatigue crack sizes. Each inspector will perform eight inspections of the panel set using the combination of methods and sensitivity levels described above. At least one inspector will also perform multiple inspections using a fixed technique to investigate repeatability. The hit/miss data sets will be evaluated using both the NASA generated DOEPOD software and the MIL-STD-1823 software.
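For context, hit/miss POD analyses of this kind commonly fit a log-odds model, POD(a) = logistic(b0 + b1*ln a), by maximum likelihood; a minimal sketch follows (the crack sizes and hit/miss outcomes are invented for illustration, not data from the study):

import numpy as np
from scipy.optimize import minimize

def fit_pod(crack_sizes, hits):
    """Maximum-likelihood fit of POD(a) = 1/(1 + exp(-(b0 + b1*ln a)))."""
    x = np.log(crack_sizes)
    def nll(beta):
        p = 1.0 / (1.0 + np.exp(-(beta[0] + beta[1] * x)))
        p = np.clip(p, 1e-9, 1.0 - 1e-9)  # guard the log terms
        return -np.sum(hits * np.log(p) + (1 - hits) * np.log(1 - p))
    return minimize(nll, np.array([0.0, 1.0])).x

b0, b1 = fit_pod(np.array([0.5, 1.0, 2.0, 4.0, 8.0]),
                 np.array([0, 1, 0, 1, 1]))
print(b0, b1)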
Zhang, Fan; Zhang, Xinhong
2011-01-01
Most classification, quality evaluation, or grading of flue-cured tobacco leaves is performed manually, relying on the judgmental experience of experts, and is inevitably limited by personal, physical and environmental factors. The classification and the quality evaluation are therefore subjective and experientially based. In this paper, an automatic classification method for tobacco leaves based on digital image processing and fuzzy sets theory is presented. A grading system based on image processing techniques was developed for automatically inspecting and grading flue-cured tobacco leaves. This system uses machine vision for the extraction and analysis of color, size, shape and surface texture. Fuzzy comprehensive evaluation provides a high level of confidence in decision making based on fuzzy logic. A neural network is used to estimate and forecast the membership functions of the features of tobacco leaves in the fuzzy sets. The experimental results of the two-level fuzzy comprehensive evaluation (FCE) show that the accuracy rate of classification is about 94% for the trained tobacco leaves, and about 72% for the non-trained tobacco leaves. We believe that fuzzy comprehensive evaluation is a viable way for the automatic classification and quality evaluation of tobacco leaves. PMID:22163744
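A stripped-down two-level fuzzy comprehensive evaluation might look as follows; the feature groups, weights, and membership values are invented for illustration (in the paper those memberships come from a trained neural network):

import numpy as np

def fce(weights, R):
    """One FCE stage: weighted max-min composition B = W o R, normalized.
    R maps features (rows) to grade memberships (columns)."""
    W, R = np.asarray(weights), np.asarray(R)
    B = np.array([np.max(np.minimum(W, R[:, j])) for j in range(R.shape[1])])
    return B / B.sum()

# First level: two color features -> memberships in three grades.
b_color = fce([0.6, 0.4], [[0.7, 0.2, 0.1],
                           [0.4, 0.4, 0.2]])
# First level: two texture features -> memberships in three grades.
b_texture = fce([0.5, 0.5], [[0.5, 0.3, 0.2],
                             [0.3, 0.5, 0.2]])

# Second level: combine the first-level results into a final grade.
grade = fce([0.55, 0.45], np.vstack([b_color, b_texture]))
print(grade, grade.argmax())  # memberships and index of the winning grade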
Data preprocessing method for liquid chromatography-mass spectrometry based metabolomics.
Wei, Xiaoli; Shi, Xue; Kim, Seongho; Zhang, Li; Patrick, Jeffrey S; Binkley, Joe; McClain, Craig; Zhang, Xiang
2012-09-18
A set of data preprocessing algorithms for peak detection and peak list alignment are reported for analysis of liquid chromatography-mass spectrometry (LC-MS)-based metabolomics data. For spectrum deconvolution, peak picking is achieved at the selected ion chromatogram (XIC) level. To estimate and remove the noise in XICs, each XIC is first segmented into several peak groups based on the continuity of scan number, and the noise level is estimated by all the XIC signals, except the regions potentially with presence of metabolite ion peaks. After removing noise, the peaks of molecular ions are detected using both the first and the second derivatives, followed by an efficient exponentially modified Gaussian-based peak deconvolution method for peak fitting. A two-stage alignment algorithm is also developed, where the retention times of all peaks are first transferred into the z-score domain and the peaks are aligned based on the measure of their mixture scores after retention time correction using a partial linear regression. Analysis of a set of spike-in LC-MS data from three groups of samples containing 16 metabolite standards mixed with metabolite extract from mouse livers demonstrates that the developed data preprocessing method performs better than two of the existing popular data analysis packages, MZmine2.6 and XCMS2, for peak picking, peak list alignment, and quantification.
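Two of the described steps are easy to sketch: transferring retention times into the z-score domain for the first alignment stage, and the exponentially modified Gaussian (EMG) peak shape used for deconvolution. The EMG is parametrized here via SciPy's exponnorm, and the amplitude/width parameter names are assumptions, not the paper's notation:

import numpy as np
from scipy.stats import exponnorm

def rt_to_z_scores(retention_times):
    """First alignment stage: map retention times into the z-score domain."""
    rt = np.asarray(retention_times, dtype=float)
    return (rt - rt.mean()) / rt.std()

def emg_peak(x, amplitude, mu, sigma, tau):
    """EMG shape: a Gaussian (mu, sigma) convolved with an exponential tail.
    exponnorm's shape K = tau/sigma gives an exponential time constant tau."""
    return amplitude * exponnorm.pdf(x, K=tau / sigma, loc=mu, scale=sigma)

In a full pipeline, emg_peak would be fed to a nonlinear least-squares fitter (e.g. scipy.optimize.curve_fit) for each detected peak group.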
SU-E-J-168: Automated Pancreas Segmentation Based On Dynamic MRI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gou, S; Rapacchi, S; Hu, P
2014-06-01
Purpose: MRI guided radiotherapy is particularly attractive for abdominal targets with low CT contrast. To fully utilize this modality for pancreas tracking, automated segmentation tools are needed. A hybrid gradient, region growth and shape constraint (hGReS) method to segment 2D upper abdominal dynamic MRI is developed for this purpose. Methods: 2D coronal dynamic MR images of 2 healthy volunteers were acquired with a frame rate of 5 frames/second. The regions of interest (ROIs) included the liver, pancreas and stomach. The first frame was used as the source, where the centers of the ROIs were annotated. These center locations were propagated to the next dynamic MRI frame. 4-neighborhood region transfer growth was performed from these initial seeds for rough segmentation. To improve the results, gradient, edge and shape constraints were applied to the ROIs before final refinement using morphological operations. Results from hGReS and 3 other automated segmentation methods using edge detection, region growth and level set were compared to manual contouring. Results: For the first volunteer, hGReS segmented the pancreas with an accuracy of 0.77 as measured by the Dice index. This accuracy was slightly superior to the level set method (0.72), and both were significantly more accurate than the edge detection (0.53) and region growth methods (0.42). For the second healthy volunteer, hGReS reliably segmented the pancreatic region, achieving Dice indices of 0.82, 0.92 and 0.93 for the pancreas, stomach and liver, respectively, compared to manual segmentation. Motion trajectories derived from the hGReS, level set and manual segmentation methods showed high correlation to respiratory motion calculated using a lung blood vessel as the reference, while the other two methods showed substantial motion tracking errors. hGReS was 10 times faster than level set. Conclusion: We have shown the feasibility of automated segmentation of the pancreas anatomy based on dynamic MRI.
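The Dice index used for validation above is simple to compute from two binary masks; a short sketch:

import numpy as np

def dice_index(mask_a, mask_b):
    """Dice similarity: 2|A ∩ B| / (|A| + |B|) for two binary masks."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    denom = a.sum() + b.sum()
    # Two empty masks are defined here as a perfect match.
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0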
A cluster analysis on road traffic accidents using genetic algorithms
NASA Astrophysics Data System (ADS)
Saharan, Sabariah; Baragona, Roberto
2017-04-01
The analysis of road traffic accidents is increasingly important because of the cost of accidents and public road safety. The availability of large data sets makes the study of factors that affect the frequency and severity of accidents viable. However, the data are often highly unbalanced and overlapped. We deal with the data set of road traffic accidents recorded in Christchurch, New Zealand, from 2000-2009, with a total of 26440 accidents. The data are binary and cover 50 road traffic accident factors with four levels of severity. We used a genetic algorithm for the analysis because we are in the presence of a large unbalanced data set, and standard clustering such as the k-means algorithm may not be suitable for the task. The genetic clustering for unknown K (GCUK) algorithm has been used to identify the factors associated with accidents of different levels of severity. The results provided an interesting insight into the relationship between factors and accident severity level and suggest that the two main factors that contribute to fatal accidents are "speed greater than 60 km/h" and "did not see other people until it was too late". A comparison with the k-means algorithm and independent component analysis is performed to validate the results.
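In the spirit of GCUK, a minimal sketch of genetic clustering with an unknown number of centres (not the study's implementation; the simulated data, silhouette fitness and GA parameters are assumptions):

```python
import numpy as np
from sklearn.metrics import silhouette_score
from scipy.spatial.distance import cdist

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 0.3, (100, 2)) for m in ((0, 0), (3, 3), (0, 3))])

def fitness(centres):
    """Partition quality of the clustering induced by a candidate centre set."""
    labels = cdist(X, centres).argmin(axis=1)
    if len(set(labels)) < 2:
        return -1.0
    return silhouette_score(X, labels)

def mutate(centres):
    c = centres + rng.normal(0, 0.2, centres.shape)   # perturb centres
    if rng.random() < 0.2 and len(c) > 2:             # drop a centre
        c = np.delete(c, rng.integers(len(c)), axis=0)
    elif rng.random() < 0.2:                          # add a random data point
        c = np.vstack([c, X[rng.integers(len(X))]])
    return c

# Variable-length chromosomes: centre sets of 2-7 points sampled from the data
pop = [X[rng.choice(len(X), rng.integers(2, 8), replace=False)] for _ in range(20)]
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(p) for p in pop[:10]]    # elitism + mutation
best = max(pop, key=fitness)
print("estimated K:", len(best))  # empty centres not pruned in this sketch
```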
Davis, P R; Rickards, A C; Ollerton, J E
2007-12-01
To determine the optimal composition of the pre-hospital medical response team (MERT) and the value of pre-hospital critical care interventions in a military setting, and specifically to determine both the benefit of including a doctor in the pre-hospital response team and the relevance of the time and distance to definitive care. A comprehensive review of the literature incorporating a range of electronic search engines and hand searches of key journals. There was no level 1 evidence on which to base conclusions. The 15 most relevant articles were analysed in detail. There was one randomized controlled trial (level 2 evidence) that supports the inclusion of a doctor on the MERT. Several cohort studies were identified that analysed the benefits of specific critical care interventions in the pre-hospital setting. A doctor with critical care skills deployed on the MERT is associated with improved survival in victims of major trauma. Specific critical care interventions, including emergency endotracheal intubation and ventilation, and intercostal drainage, are associated with improved survival and functional recovery in certain patients. These benefits appear to be more easily demonstrated for the rural and remote setting than for the urban setting.
From big data to rich data: The key features of athlete wheelchair mobility performance.
van der Slikke, R M A; Berger, M A M; Bregman, D J J; Veeger, H E J
2016-10-03
Quantitative assessment of an athlete's individual wheelchair mobility performance is one prerequisite needed to evaluate game performance, improve wheelchair settings and optimize training routines. Inertial Measurement Unit (IMU) based methods can be used to perform such quantitative assessment, providing a large number of kinematic data. The goal of this research was to reduce that large amount of data to a set of key features that best describe wheelchair mobility performance in match play, and to present them in a meaningful way for both scientists and athletes. To test the discriminative power, wheelchair mobility characteristics of athletes with different performance levels were compared. The wheelchair kinematics of 29 (inter-)national level athletes were measured during a match using three inertial sensors mounted on the wheelchair. Principal component analysis was used to reduce 22 kinematic outcomes to a set of six outcomes covering linear and rotational movement; speed and acceleration; and average and best performance. In addition, it was explored whether groups of athletes with known performance differences based on their impairment classification also differed with respect to these key outcomes, using univariate general linear models. For all six key outcomes, classification was shown to be a significant factor (p<0.05). We composed a set of six key kinematic outcomes that accurately describe wheelchair mobility performance in match play. The key kinematic outcomes were displayed in an easy-to-interpret way, usable for athletes, coaches and scientists. This standardized representation enables comparison of different wheelchair sports regarding wheelchair mobility, but also evaluation at the level of the individual athlete. By these means, the tool could enhance further development of wheelchair sports in general.
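A minimal sketch of the reduction idea on simulated data (the study selected representative outcomes rather than raw components, so this is only an approximation):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
kinematics = rng.normal(size=(29, 22))   # 29 athletes x 22 kinematic outcomes
pca = PCA(n_components=6).fit(kinematics)
print("variance explained by 6 components: %.0f%%"
      % (100 * pca.explained_variance_ratio_.sum()))
scores = pca.transform(kinematics)       # per-athlete six-outcome summary
```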
Clevenger, Carolyn K; Chu, Thasha A; Yang, Zhou; Hepburn, Kenneth W
2012-09-01
The segment of older adults who present to the emergency department (ED) with cognitive impairment ranges from 21% to 40%. Difficulties inherent in the chaotic ED setting combined with dementia may result in a number of unwanted clinical outcomes, but strategies to minimize these outcomes are lacking. A review of the literature was conducted to examine the practices undertaken in the care of persons with dementia (PWD) specific to the ED setting. PubMed and Cumulative Index to Nursing and Allied Health Literature were searched for published articles specific to the care of PWD provided in the ED. All English-language articles were reviewed; editorials and reflective journals were excluded. Seven articles ultimately met inclusion criteria; all provided Level 7 evidence: narrative review or opinions from authorities. The articles recommended clinical practices that can be categorized into five themes: assessment of cognitive impairment, dementia communication strategies, avoidance of adverse events, alterations to the physical environment, and education of ED staff. Many recommendations are extrapolated from residential care settings. Review results indicate that there is minimal guidance for the care of PWD specific to the ED setting. There are no empirical studies of the care (assessment, interventions) of PWD in the ED. The existing (Level 7) recommendations lack a research base to support their effectiveness or adoption as evidence-based practice. There is a significant opportunity for research to identify and test ways to meet the needs of PWD in the ED to ensure a safe visit, accurate diagnosis, and prudent transfer to the most appropriate level of care.
Round table discussion " Development of qualification framework in meteorology (TEMPUS QUALIMET)"
NASA Astrophysics Data System (ADS)
Bashmakova, I.; Belotserkovsky, A.; Karlin, L.; Petrosyan, A.; Serditova, N.; Zilitinkevich, S.
2010-09-01
The international consortium has started implementing a project aimed at the development of a unified framework of qualifications in meteorology (QualiMet), setting up a system of recognition and award of qualifications up to Doctoral level based on standards of knowledge, skill and competence acquired by learners. The QualiMet project has the following specific objectives: 1. To develop standards of knowledge, skills and competence for all qualifications up to Doctoral level needed in all possible occupations a meteorology learner can undertake, by July 2011. 2. To develop reciprocally recognized rubrics, criteria, methods and tools for assessing compliance with the developed standards (quality assurance), by July 2012. 3. To set up a network of Centers of Excellence as the primary designer of sample education programs and learning experiences, in both brick-and-mortar and distance settings of delivery, leading to achievement of the developed standards, by December 2012. 4. To set up a system of mutual international recognition and award of qualifications in meteorology based on the developed procedures and the establishment of a self-regulatory public organization, by December 2012. The main beneficiaries of the project are: 1. Meteorology learners from the consortium countries. They will be able to make informed decisions about available qualification choices and progression options, and will be given the opportunity to participate in a system of international continuous education. 2. Meteorology employers from the consortium countries. They will be able to specify the level of knowledge, skill and competence required for occupational roles, evaluate the qualifications presented, and connect training and development with business needs. 3. Students and academic staff of all the consortium members, who will gain increased mobility and exchange via flows of culturally and institutionally diversified lecturers and qualified specialists
Kozhina, T N; Evstiukhina, T A; Peshekhonov, V T; Chernenkov, A Yu; Korolev, V G
2016-03-01
In the yeast Saccharomyces cerevisiae, the DOT1 gene product provides methylation of lysine 79 (K79) of histone H3 and the SET2 gene product provides methylation of lysine 36 (K36) of the same histone. We determined that the dot1 and set2 mutants suppress UV-induced mutagenesis to an equally high degree. The dot1 mutant demonstrated statistically higher sensitivity to low doses of MMC than the wild type strain. The analysis of the interaction between the dot1 and rad52 mutations revealed a considerable level of spontaneous cell death in the double dot1 rad52 mutant. We observed strong suppression of gamma-induced mutagenesis in the set2 mutant. We determined that the dot1 and set2 mutations decrease the spontaneous mutagenesis rate in both single and double mutants. The epistatic interaction between the dot1 and set2 mutations, and the almost similar sensitivity of the corresponding mutants to different types of DNA damage, allow one to conclude that both genes are involved in the control of the same DNA repair pathways: the homologous-recombination-based and the postreplicative DNA repair pathways.
ECOREGION: ECOREGIONS OF CONTERMINOUS UNITED STATES
The Ecoregion data set covers aquatic ecoregions of the conterminous U.S. It is provided by the USGS and is intended for national-level studies of water resources. Aquatic ecoregions are based on perceived patterns of a combination of causal and integrative factors including lan...
Locally Based Kernel PLS Regression De-noising with Application to Event-Related Potentials
NASA Technical Reports Server (NTRS)
Rosipal, Roman; Trejo, Leonard J.; Wheeler, Kevin; Tino, Peter
2002-01-01
Our approach exploits the close relation between signal de-noising and regression problems that deal with the estimation of functions reflecting the dependency between a set of inputs and dependent outputs corrupted by some level of noise.
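As a hedged illustration of de-noising-as-regression, the sketch below substitutes kernel ridge regression for the paper's kernel PLS; the signal and hyperparameters are assumptions:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

t = np.linspace(0, 1, 200)[:, None]          # time as the regression input
rng = np.random.default_rng(4)
clean = np.sin(2 * np.pi * t).ravel()
noisy = clean + rng.normal(0, 0.3, 200)      # observed, noise-corrupted output
# Regress the noisy samples on time; the fitted function is the de-noised signal
denoised = KernelRidge(kernel="rbf", alpha=0.1, gamma=50).fit(t, noisy).predict(t)
print("residual RMS: %.3f" % np.sqrt(np.mean((denoised - clean) ** 2)))
```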
ERIC Educational Resources Information Center
Jennings, Gretchen; Craig, Michelle L.
1997-01-01
Describes an exhibition-based activity set that teaches important psychological processes such as attention (Interference), communication (Pattern Talk), and cooperation versus competition (Do Nice Guys Finish Last?). Activities follow the scientific method, and teachers can observe varying levels of skill and cognitive development in students of…
A Risk Stratification Model for Lung Cancer Based on Gene Coexpression Network and Deep Learning
2018-01-01
Risk stratification models for lung cancer with gene expression profiles are of great interest. Instead of previous models based on individual prognostic genes, we aimed to develop a novel system-level risk stratification model for lung adenocarcinoma based on gene coexpression networks. Using multiple microarray data sets, gene coexpression network analysis was performed to identify survival-related networks. A deep learning based risk stratification model was constructed with representative genes of these networks. The model was validated in two test sets. Survival analysis was performed using the output of the model to evaluate whether it could predict patients' survival independent of clinicopathological variables. Five networks were significantly associated with patients' survival. Considering prognostic significance and representativeness, genes of the two survival-related networks were selected as input to the model. The output of the model was significantly associated with patients' survival in the training set and both test sets (p < 0.00001, p < 0.0001 and p = 0.02 for the training set and test sets 1 and 2, respectively). In multivariate analyses, the model was associated with patients' prognosis independent of other clinicopathological features. Our study presents a new perspective on incorporating gene coexpression networks into the gene expression signature, and on the clinical application of deep learning in genomic data science for prognosis prediction. PMID:29581968
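A minimal sketch of the modelling idea (not the authors' architecture; data, surrogate labels and layer sizes are simulated assumptions):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
X_train = rng.normal(size=(200, 40))   # 200 patients x 40 network-representative genes
y_train = (X_train[:, :5].mean(axis=1) > 0).astype(int)  # surrogate survival label
model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000,
                      random_state=0).fit(X_train, y_train)
# The continuous network output serves as a risk score for survival analysis
risk = model.predict_proba(rng.normal(size=(50, 40)))[:, 1]
print("median risk in test set: %.2f" % np.median(risk))
```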
NASA Astrophysics Data System (ADS)
Zhang, C.; Pan, X.; Zhang, S. Q.; Li, H. P.; Atkinson, P. M.
2017-09-01
Recent advances in remote sensing have witnessed a great number of very high resolution (VHR) images acquired at sub-metre spatial resolution. These VHR remotely sensed data pose enormous challenges for processing, analysing and classifying them effectively, due to their high spatial complexity and heterogeneity. Although many computer-aided classification methods based on machine learning approaches have been developed over the past decades, most of them are geared toward pixel-level spectral differentiation, e.g. the Multi-Layer Perceptron (MLP), and are unable to exploit the abundant spatial detail within VHR images. This paper introduces a rough set model as a general framework to objectively characterize the uncertainty in CNN classification results, and further partition them into correct and incorrect regions on the map. The correctly classified regions of the CNN were trusted and maintained, whereas the misclassified areas were reclassified using a decision tree with both CNN and MLP. The effectiveness of the proposed rough set decision tree based MLP-CNN was tested using an urban area in Bournemouth, United Kingdom. The MLP-CNN, which captures the complementarity between CNN and MLP through the rough set based decision tree, achieved the best classification performance both visually and numerically. This research therefore paves the way to fully automatic and effective VHR image classification.
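A minimal sketch of the rough-set partition described above (thresholds and probabilities are illustrative assumptions, not the paper's calibration):

```python
import numpy as np

rng = np.random.default_rng(5)
cnn_prob = rng.uniform(size=(100, 100))  # CNN class-membership probabilities
lower, upper = 0.3, 0.7                  # rough-set approximation thresholds

positive = cnn_prob >= upper             # lower approximation: certainly in the class
negative = cnn_prob <= lower             # certainly outside the class
boundary = ~positive & ~negative         # uncertain: defer to the MLP/decision tree
print("pixels deferred to reclassification: %d" % boundary.sum())
```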
A Feature-based Approach to Big Data Analysis of Medical Images
Toews, Matthew; Wachinger, Christian; Estepar, Raul San Jose; Wells, William M.
2015-01-01
This paper proposes an inference method well-suited to large sets of medical images. The method is based upon a framework where distinctive 3D scale-invariant features are indexed efficiently to identify approximate nearest-neighbor (NN) feature matches in O(log N) computational complexity in the number of images N. It thus scales well to large data sets, in contrast to methods based on pair-wise image registration or feature matching requiring O(N) complexity. Our theoretical contribution is a density estimator based on a generative model that generalizes kernel density estimation and K-nearest neighbor (KNN) methods. The estimator can be used for on-the-fly queries, without requiring explicit parametric models or an off-line training phase. The method is validated on a large multi-site data set of 95,000,000 features extracted from 19,000 lung CT scans. Subject-level classification identifies all images of the same subjects across the entire data set despite deformation due to breathing state, including unintentional duplicate scans. State-of-the-art performance is achieved in predicting chronic pulmonary obstructive disorder (COPD) severity across the 5-category GOLD clinical rating, with an accuracy of 89% if both exact and one-off predictions are considered correct. PMID:26221685
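A minimal sketch of logarithmic-time feature matching (a KD-tree stand-in, not the paper's indexing code; descriptor dimensionality and data are assumptions):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(11)
descriptors = rng.normal(size=(100000, 16))  # database of feature descriptors
tree = cKDTree(descriptors)                  # build once, query many times
query = rng.normal(size=(1, 16))
# eps > 0 permits approximate neighbours, keeping queries near O(log N)
dist, idx = tree.query(query, k=5, eps=0.1)
print("nearest-neighbour indices:", idx[0])
```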
Scaling Up Graph-Based Semisupervised Learning via Prototype Vector Machines
Zhang, Kai; Lan, Liang; Kwok, James T.; Vucetic, Slobodan; Parvin, Bahram
2014-01-01
When the amount of labeled data is limited, semi-supervised learning can improve the learner's performance by also using the often easily available unlabeled data. In particular, a popular approach requires the learned function to be smooth on the underlying data manifold. By approximating this manifold as a weighted graph, such graph-based techniques can often achieve state-of-the-art performance. However, their high time and space complexities make them less attractive on large data sets. In this paper, we propose to scale up graph-based semi-supervised learning using a set of sparse prototypes derived from the data. These prototypes serve as a small set of data representatives, which can be used to approximate the graph-based regularizer and to control model complexity. Consequently, both training and testing become much more efficient. Moreover, when the Gaussian kernel is used to define the graph affinity, a simple and principled method to select the prototypes can be obtained. Experiments on a number of real-world data sets demonstrate encouraging performance and scaling properties of the proposed approach. It also compares favorably with models learned via ℓ1-regularization at the same level of model sparsity. These results demonstrate the efficacy of the proposed approach in producing highly parsimonious and accurate models for semi-supervised learning. PMID:25720002
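A minimal sketch of the prototype idea (not the authors' algorithm; the two-blob data, kernel width and prototype count are assumptions): every point connects only to a few k-means prototypes, so labels propagate through a compact graph instead of the full O(N^2) affinity matrix.

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.spatial.distance import cdist

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-2, 1, (500, 2)), rng.normal(2, 1, (500, 2))])
y = np.full(len(X), -1); y[0], y[500] = 0, 1  # only two labeled points

protos = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X).cluster_centers_
W = np.exp(-cdist(X, protos) ** 2 / 2.0)      # Gaussian point-prototype affinity
W /= W.sum(axis=1, keepdims=True)

labeled = y >= 0
# Prototype soft labels: affinity-weighted vote of the labeled points
proto_votes = W[labeled].T @ np.eye(2)[y[labeled]]
proto_labels = proto_votes / proto_votes.sum(axis=1, keepdims=True)
pred = (W @ proto_labels).argmax(axis=1)      # back-project to all points
print("accuracy vs. true halves: %.2f"
      % (pred == (np.arange(len(X)) >= 500)).mean())
```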
Pari, Sangavi; Wang, Inger A; Liu, Haizhou; Wong, Bryan M
2017-03-22
Advanced oxidation processes that utilize highly oxidative radicals are widely used in water reuse treatment. In recent years, the application of the sulfate radical (SO4˙-) as a promising oxidant for water treatment has gained increasing attention. To understand the efficiency of SO4˙- in the degradation of organic contaminants in wastewater effluent, it is important to be able to predict the reaction kinetics of various SO4˙--driven oxidation reactions. In this study, we utilize density functional theory (DFT) and high-level wavefunction-based methods (including computationally intensive coupled cluster methods) to explore the activation energies of SO4˙--driven oxidation reactions on a series of benzene-derived contaminants. These high-level calculations encompass a wide set of reactions, including 110 forward/reverse reactions and 5 different computational methods in total. Based on the high-level coupled-cluster quantum calculations, we find that the popular M06-2X DFT functional is significantly more accurate for OH- additions than for SO4˙- reactions. Most importantly, we highlight some of the limitations and deficiencies of other computational methods, and we recommend the use of high-level quantum calculations to spot-check environmental chemistry reactions that may lie outside the training set of the M06-2X functional, particularly for water oxidation reactions that involve SO4˙- and other inorganic species.
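As a worked illustration of how such barrier heights connect to kinetics (illustrative numbers, not results from the paper), the Eyring equation converts an activation free energy into a rate constant:

```python
import numpy as np

kB, h, R, T = 1.380649e-23, 6.62607015e-34, 8.314462618, 298.15
dG = 60e3                                  # assumed barrier, J/mol (~60 kJ/mol)
k = (kB * T / h) * np.exp(-dG / (R * T))   # Eyring transition-state rate, 1/s
print("k = %.2e s^-1" % k)                 # ~2e2 s^-1 for a 60 kJ/mol barrier
```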
Market-Based and System-Wide Fuel Cycle Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Paul Philip Hood; Scopatz, Anthony; Gidden, Matthew
This work introduces automated optimization into fuel cycle simulations in the Cyclus platform. This includes system-level optimizations, seeking a deployment plan that optimizes the performance over the entire transition, and market-level optimization, seeking an optimal set of material trades at each time step. These concepts were introduced in a way that preserves the flexibility of the Cyclus fuel cycle framework, one of its most important design principles.
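A minimal sketch of market-level trade optimization cast as a transportation linear program (an assumed simplification, not the Cyclus solver; all quantities are illustrative):

```python
import numpy as np
from scipy.optimize import linprog

cost = np.array([[1.0, 2.0], [2.0, 0.5]]).ravel()  # supplier x consumer trade costs
supply = [10.0, 8.0]                               # supplier capacities
demand = [6.0, 9.0]                                # consumer requests
A_ub = [[1, 1, 0, 0], [0, 0, 1, 1]]                # row sums <= supply
A_eq = [[1, 0, 1, 0], [0, 1, 0, 1]]                # column sums == demand
res = linprog(cost, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand)
print("trades:\n", res.x.reshape(2, 2))            # optimal material flows per step
```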