Sample records for penalized objective functions

  1. Penalized likelihood and multi-objective spatial scans for the detection and inference of irregular clusters

    PubMed Central

    2010-01-01

    Background Irregularly shaped spatial clusters are difficult to delineate. A cluster found by an algorithm often spreads through large portions of the map, impacting its geographical meaning. Penalized likelihood methods for Kulldorff's spatial scan statistics have been used to control the excessive freedom of the shape of clusters. Penalty functions based on cluster geometry and non-connectivity have been proposed recently. Another approach involves the use of a multi-objective algorithm to maximize two objectives: the spatial scan statistics and the geometric penalty function. Results & Discussion We present a novel scan statistic algorithm employing a function based on the graph topology to penalize the presence of under-populated disconnection nodes in candidate clusters, the disconnection nodes cohesion function. A disconnection node is defined as a region within a cluster, such that its removal disconnects the cluster. By applying this function, the most geographically meaningful clusters are sifted through the immense set of possible irregularly shaped candidate cluster solutions. To evaluate the statistical significance of solutions for multi-objective scans, a statistical approach based on the concept of attainment function is used. In this paper we compared different penalized likelihoods employing the geometric and non-connectivity regularity functions and the novel disconnection nodes cohesion function. We also built multi-objective scans using those three functions and compared them with the previous penalized likelihood scans. An application is presented using comprehensive state-wide data for Chagas' disease in puerperal women in Minas Gerais state, Brazil. Conclusions We show that, compared to the other single-objective algorithms, multi-objective scans present better performance, regarding power, sensitivity and positive predictive value. 
The multi-objective non-connectivity scan is faster and better suited for the detection of moderately irregularly shaped clusters. The multi-objective cohesion scan is most effective for the detection of highly irregularly shaped clusters. PMID:21034451
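
The penalized-likelihood idea in this record can be sketched in a few lines: Kulldorff's log-likelihood ratio for a candidate cluster, multiplied by a geometric compactness penalty in (0, 1]. This is a minimal illustration of the general scheme only; the paper's cohesion and non-connectivity penalties are not reproduced here.

```python
import numpy as np

def kulldorff_llr(c, e, C):
    """Log-likelihood ratio of Kulldorff's spatial scan statistic for a
    candidate cluster with c observed cases, e expected cases (under the
    null), and C total cases on the map; zero unless the cluster is hot."""
    if c <= e:
        return 0.0
    return c * np.log(c / e) + (C - c) * np.log((C - c) / (C - e))

def compactness(area, perimeter):
    """Geometric penalty in (0, 1]: the cluster's area divided by the area
    of the circle with the same perimeter (1 for a circle, small for thin,
    tree-shaped clusters)."""
    return 4.0 * np.pi * area / perimeter ** 2

def penalized_llr(c, e, C, area, perimeter):
    # Multiplicative geometric penalization of the scan statistic.
    return kulldorff_llr(c, e, C) * compactness(area, perimeter)
```

In the multi-objective variants discussed above, the two factors are instead kept as separate objectives and maximized jointly.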

  2. This is SPIRAL-TAP: Sparse Poisson Intensity Reconstruction ALgorithms--theory and practice.

    PubMed

    Harmany, Zachary T; Marcia, Roummel F; Willett, Rebecca M

    2012-03-01

    Observations in many applications consist of counts of discrete events, such as photons hitting a detector, which cannot be effectively modeled using an additive bounded or Gaussian noise model, and instead require a Poisson noise model. As a result, accurate reconstruction of a spatially or temporally distributed phenomenon (f*) from Poisson data (y) cannot be effectively accomplished by minimizing a conventional penalized least-squares objective function. The problem addressed in this paper is the estimation of f* from y in an inverse problem setting, where the number of unknowns may potentially be larger than the number of observations and f* admits sparse approximation. The optimization formulation considered in this paper uses a penalized negative Poisson log-likelihood objective function with nonnegativity constraints (since Poisson intensities are naturally nonnegative). In particular, the proposed approach incorporates key ideas of using separable quadratic approximations to the objective function at each iteration and penalization terms related to l1 norms of coefficient vectors, total variation seminorms, and partition-based multiscale estimation methods.
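
The optimization formulation described above, a penalized negative Poisson log-likelihood with an l1 penalty and nonnegativity constraints, can be sketched with a proximal-gradient loop. This is an illustrative toy, not the authors' SPIRAL-TAP code; the step size, penalty weight, and iteration count are arbitrary choices.

```python
import numpy as np

def spiral_l1_sketch(A, y, tau=0.1, step=0.05, iters=2000):
    """Proximal-gradient sketch of the SPIRAL idea: at each iteration, take
    a gradient step on the negative Poisson log-likelihood
    sum(Af - y*log(Af)), apply the l1 prox (soft-thresholding), and project
    onto the nonnegative orthant (Poisson intensities are nonnegative)."""
    f = np.ones(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (1.0 - y / (A @ f + 1e-12))   # gradient of the Poisson NLL
        f = f - step * grad
        f = np.sign(f) * np.maximum(np.abs(f) - step * tau, 0.0)  # l1 prox
        f = np.maximum(f, 0.0)                     # nonnegativity constraint
    return f
```

The quadratic-approximation and multiscale-penalty machinery of the actual paper replaces the plain gradient step used here.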

  3. 40 CFR 33.410 - Can a recipient be penalized for failing to meet its fair share objectives?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    40 CFR 33.410, Protection of Environment (revised as of 2014-07-01): Can a recipient be penalized for failing to meet its fair share objectives? ... A recipient cannot be penalized, or treated by EPA...

  4. 40 CFR 33.410 - Can a recipient be penalized for failing to meet its fair share objectives?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    40 CFR 33.410, Protection of Environment (revised as of 2013-07-01): Can a recipient be penalized for failing to meet its fair share objectives? ... A recipient cannot be penalized, or treated by EPA...

  5. 40 CFR 33.410 - Can a recipient be penalized for failing to meet its fair share objectives?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    40 CFR 33.410, Protection of Environment (revised as of 2010-07-01): Can a recipient be penalized for failing to meet its fair share objectives? ... A recipient cannot be penalized, or treated by EPA...

  6. 40 CFR 33.410 - Can a recipient be penalized for failing to meet its fair share objectives?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    40 CFR 33.410, Protection of Environment (revised as of 2011-07-01): Can a recipient be penalized for failing to meet its fair share objectives? ... A recipient cannot be penalized, or treated by EPA...

  7. 40 CFR 33.410 - Can a recipient be penalized for failing to meet its fair share objectives?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    40 CFR 33.410, Protection of Environment (revised as of 2012-07-01): Can a recipient be penalized for failing to meet its fair share objectives? ... A recipient cannot be penalized, or treated by EPA...

  8. Penalized Nonlinear Least Squares Estimation of Time-Varying Parameters in Ordinary Differential Equations

    PubMed Central

    Cao, Jiguo; Huang, Jianhua Z.; Wu, Hulin

    2012-01-01

    Ordinary differential equations (ODEs) are widely used in biomedical research and other scientific areas to model complex dynamic systems. It is an important statistical problem to estimate parameters in ODEs from noisy observations. In this article we propose a method for estimating the time-varying coefficients in an ODE. Our method is a variation of nonlinear least squares in which penalized splines are used to model the functional parameters and the ODE solutions are also approximated using splines. We resort to the implicit function theorem to deal with the nonlinear least squares objective function, which is only defined implicitly. The proposed penalized nonlinear least squares method is applied to estimate an HIV dynamic model from a real dataset. Monte Carlo simulations show that the new method can provide much more accurate estimates of functional parameters than the existing two-step local polynomial method, which relies on estimation of the derivatives of the state function. Supplemental materials for the article are available online. PMID:23155351
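
The penalized-spline building block this method relies on can be illustrated on a toy smoothing problem. The Gaussian-bump basis and second-difference penalty below are assumptions made for illustration, not the authors' exact spline construction.

```python
import numpy as np

def penalized_spline_fit(t, y, n_basis=12, lam=1e-3):
    """Penalized basis-expansion fit: represent a functional parameter as a
    linear combination of Gaussian bump functions and penalize second
    differences of the coefficients (a P-spline-style roughness penalty),
    giving the ridge-type solve (B'B + lam D'D) c = B'y."""
    centers = np.linspace(t.min(), t.max(), n_basis)
    width = (t.max() - t.min()) / n_basis
    B = np.exp(-0.5 * ((t[:, None] - centers[None, :]) / width) ** 2)
    D = np.diff(np.eye(n_basis), n=2, axis=0)      # second-difference operator
    coef = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
    return B @ coef
```

In the paper this kind of penalized expansion represents the time-varying ODE coefficient, nested inside the nonlinear least squares loop.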

  9. Efficient Compressed Sensing Based MRI Reconstruction using Nonconvex Total Variation Penalties

    NASA Astrophysics Data System (ADS)

    Lazzaro, D.; Loli Piccolomini, E.; Zama, F.

    2016-10-01

    This work addresses the problem of magnetic resonance image reconstruction from highly sub-sampled measurements in the Fourier domain. It is modeled as a constrained minimization problem, where the objective function is a non-convex function of the gradient of the unknown image and the constraints are given by the data fidelity term. We propose a fast iterative algorithm, Fast Non Convex Reweighted (FNCR), in which the constrained problem is solved by a reweighting scheme, as a strategy to overcome the non-convexity of the objective function, with an adaptive adjustment of the penalization parameter. We prove that the algorithm converges to a local minimum because the constrained problem satisfies the Kurdyka-Lojasiewicz property. Moreover, the adaptation of the non-convex l0 approximation and penalization parameters by means of a continuation technique allows us to obtain good-quality solutions, avoiding getting stuck in unwanted local minima. Numerical experiments performed on sub-sampled MRI data show the efficiency of the algorithm and the accuracy of the solution.
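
The reweighting strategy used to handle a non-convex sparsity penalty can be illustrated on a deliberately trivial inner problem, plain denoising rather than the paper's Fourier-domain MRI model:

```python
import numpy as np

def reweighted_l1_denoise(y, lam=0.5, eps=1e-2, outer=5):
    """Sketch of a reweighting scheme for a non-convex sparsity penalty:
    each outer pass solves a *convex* weighted-l1 problem whose weights
    1/(|x| + eps) come from the previous iterate. The inner problem here,
        0.5 * ||x - y||^2 + lam * sum_i w_i * |x_i|,
    is solved exactly by per-entry soft-thresholding; the real FNCR inner
    problem involves the sub-sampled Fourier operator and TV, not this."""
    x = np.asarray(y, dtype=float).copy()
    for _ in range(outer):
        w = 1.0 / (np.abs(x) + eps)                      # reweighting step
        x = np.sign(y) * np.maximum(np.abs(y) - lam * w, 0.0)
    return x
```

Large entries are barely shrunk while small, noise-like entries are driven to exactly zero, which is the qualitative effect the non-convex penalty is chosen for.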

  10. Penalized weighted least-squares approach for low-dose x-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Li, Tianfang; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    The noise of a low-dose computed tomography (CT) sinogram follows approximately a Gaussian distribution with a nonlinear dependence between the sample mean and variance. The noise is statistically uncorrelated among detector bins at any view angle. However, the correlation coefficient matrix of the data signal indicates a strong signal correlation among neighboring views. Based on these observations, the Karhunen-Loeve (KL) transform can be used to de-correlate the signal among the neighboring views. In each KL component, a penalized weighted least-squares (PWLS) objective function can be constructed and an optimal sinogram can be estimated by minimizing the objective function, followed by filtered backprojection (FBP) for CT image reconstruction. In this work, we compared the KL-PWLS method with an iterative image reconstruction algorithm, which uses the Gauss-Seidel iterative calculation to minimize the PWLS objective function in the image domain. We also compared KL-PWLS with an iterative sinogram smoothing algorithm, which uses the iterated conditional mode calculation to minimize the PWLS objective function in sinogram space, followed by FBP for image reconstruction. Phantom experiments show a comparable performance of these three PWLS methods in suppressing noise-induced artifacts and preserving resolution in reconstructed images. Computer simulation concurs with the phantom experiments in terms of the noise-resolution tradeoff and detectability in a low-contrast environment. The KL-PWLS noise reduction may have a computational advantage for low-dose CT imaging, especially for dynamic high-resolution studies.
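
The PWLS estimation step can be sketched in one dimension: a variance-weighted data-fidelity term plus a quadratic neighborhood penalty, minimized by Gauss-Seidel coordinate updates. The sinogram geometry and the KL transform are omitted; this is a stand-in for the structure of the objective, not the authors' implementation.

```python
import numpy as np

def pwls_smooth(y, var, beta=1.0, iters=50):
    """Gauss-Seidel minimization of the 1-D PWLS objective
        J(x) = sum_i (x_i - y_i)^2 / var_i + beta * sum_i (x_i - x_{i-1})^2.
    Each coordinate update below is the exact minimizer of J along x_i."""
    x = y.astype(float).copy()
    w = 1.0 / np.asarray(var, dtype=float)
    n = len(x)
    for _ in range(iters):
        for i in range(n):
            nb = ([x[i - 1]] if i > 0 else []) + ([x[i + 1]] if i < n - 1 else [])
            x[i] = (w[i] * y[i] + beta * sum(nb)) / (w[i] + beta * len(nb))
    return x
```

Noisier samples (larger var) get smaller weights and are pulled harder toward their neighbors, which is the point of the weighted fidelity term.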

  11. Comparing implementations of penalized weighted least-squares sinogram restoration.

    PubMed

    Forthmann, Peter; Koehler, Thomas; Defrise, Michel; La Riviere, Patrick

    2010-11-01

    A CT scanner measures the energy that is deposited in each channel of a detector array by x rays that have been partially absorbed on their way through the object. The measurement process is complex and quantitative measurements are always and inevitably associated with errors, so CT data must be preprocessed prior to reconstruction. In recent years, the authors have formulated CT sinogram preprocessing as a statistical restoration problem in which the goal is to obtain the best estimate of the line integrals needed for reconstruction from the set of noisy, degraded measurements. The authors have explored both penalized Poisson likelihood (PL) and penalized weighted least-squares (PWLS) objective functions. At low doses, the authors found that the PL approach outperforms PWLS in terms of resolution-noise tradeoffs, but at standard doses they perform similarly. The PWLS objective function, being quadratic, is more amenable to computational acceleration than the PL objective. In this work, the authors develop and compare two different methods for implementing PWLS sinogram restoration with the hope of improving computational performance relative to PL in the standard-dose regime. Sinogram restoration is still significant in the standard-dose regime since it can still outperform standard approaches and it allows for correction of effects that are not usually modeled in standard CT preprocessing. The authors have explored and compared two implementation strategies for PWLS sinogram restoration: (1) A direct matrix-inversion strategy based on the closed-form solution to the PWLS optimization problem and (2) an iterative approach based on the conjugate-gradient algorithm. Obtaining optimal performance from each strategy required modifying the naive off-the-shelf implementations of the algorithms to exploit the particular symmetry and sparseness of the sinogram-restoration problem. 
For the closed-form approach, the authors subdivided the large matrix inversion into smaller coupled problems and exploited sparseness to minimize matrix operations. For the conjugate-gradient approach, the authors exploited sparseness and preconditioned the problem to speed up convergence. All methods produced qualitatively and quantitatively similar images as measured by resolution-variance tradeoffs and difference images. Despite the acceleration strategies, the direct matrix-inversion approach was found to be uncompetitive with iterative approaches, with a computational burden higher by an order of magnitude or more. The iterative conjugate-gradient approach, however, does appear promising, with computation times half that of the authors' previous penalized-likelihood implementation. Iterative conjugate-gradient based PWLS sinogram restoration with careful matrix optimizations has computational advantages over direct matrix PWLS inversion and over penalized-likelihood sinogram restoration and can be considered a good alternative in standard-dose regimes.
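
Because the PWLS objective is quadratic, its minimizer solves a symmetric positive-definite linear system, which is exactly where the conjugate-gradient strategy applies. A textbook matrix-free CG, without the authors' sinogram-specific preconditioning, looks like this:

```python
import numpy as np

def conjugate_gradient(apply_A, b, iters=100, tol=1e-12):
    """Plain conjugate gradient for A x = b with A symmetric positive
    definite, supplied as a matrix-free operator -- the form that lets a
    sparse PWLS system be solved without ever building A explicitly."""
    x = np.zeros_like(b)
    r = b - apply_A(x)
    p = r.copy()
    rs = r @ r
    for _ in range(iters):
        Ap = apply_A(p)
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if rs_new < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```

The preconditioning and sparsity exploitation described in the record accelerate exactly this loop.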

  12. Noise reduction for low-dose helical CT by 3D penalized weighted least-squares sinogram smoothing

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Li, Tianfang; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Helical computed tomography (HCT) has several advantages over conventional step-and-shoot CT for imaging a relatively large object, especially for dynamic studies. However, HCT may increase X-ray exposure to the patient significantly. This work aims to reduce the radiation by lowering the X-ray tube current (mA) and filtering the low-mA (or dose) sinogram noise. Based on the noise properties of the HCT sinogram, a three-dimensional (3D) penalized weighted least-squares (PWLS) objective function was constructed and an optimal sinogram was estimated by minimizing the objective function. To account for the differences in signal correlation along different directions of the HCT sinogram, an anisotropic Markov random field (MRF) Gibbs function was designed as the penalty. The minimization of the objective function was performed by an iterative Gauss-Seidel updating strategy. The effectiveness of the 3D-PWLS sinogram smoothing for low-dose HCT was demonstrated by a 3D Shepp-Logan head phantom study. Comparison studies with our previously developed KL-domain PWLS sinogram smoothing algorithm indicate that the KL+2D-PWLS algorithm performs better on the in-plane noise-resolution trade-off, while the 3D-PWLS performs better on the z-axis noise-resolution trade-off. Receiver operating characteristic (ROC) studies using a channelized Hotelling observer (CHO) show that the 3D-PWLS and KL+2D-PWLS algorithms have similar detectability in a low-contrast environment.

  13. $L^1$ penalization of volumetric dose objectives in optimal control of PDEs

    DOE PAGES

    Barnard, Richard C.; Clason, Christian

    2017-02-11

    This work is concerned with a class of PDE-constrained optimization problems that are motivated by an application in radiotherapy treatment planning. Here the primary design objective is to minimize the volume where a functional of the state violates a prescribed level, but prescribing these levels in the form of pointwise state constraints leads to infeasible problems. We therefore propose an alternative approach based on L1 penalization of the violation that is also applicable when state constraints are infeasible. We establish well-posedness of the corresponding optimal control problem, derive first-order optimality conditions, discuss convergence of minimizers as the penalty parameter tends to infinity, and present a semismooth Newton method for their efficient numerical solution. Finally, the performance of this method for a model problem is illustrated and contrasted with an alternative approach based on (regularized) state constraints.
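
The penalization idea can be transcribed into a toy finite-dimensional objective. The names and the quadratic design cost below are illustrative assumptions, not the paper's PDE-constrained formulation.

```python
import numpy as np

def l1_penalized_objective(x, level, alpha, design_cost):
    """Toy transcription of the L1-penalty idea: instead of the (possibly
    infeasible) pointwise constraint x_i <= level, add
    alpha * ||max(x - level, 0)||_1, which penalizes the total amount of
    violation and remains finite even when no feasible point exists."""
    violation = np.maximum(np.asarray(x) - level, 0.0)
    return design_cost(x) + alpha * np.sum(violation)
```

For feasible points the penalty vanishes, and as alpha grows, minimizers are pushed toward the smallest achievable violation, mirroring the convergence result as the penalty parameter tends to infinity.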

  14. Comparing implementations of penalized weighted least-squares sinogram restoration

    PubMed Central

    Forthmann, Peter; Koehler, Thomas; Defrise, Michel; La Riviere, Patrick

    2010-01-01

    Purpose: A CT scanner measures the energy that is deposited in each channel of a detector array by x rays that have been partially absorbed on their way through the object. The measurement process is complex and quantitative measurements are always and inevitably associated with errors, so CT data must be preprocessed prior to reconstruction. In recent years, the authors have formulated CT sinogram preprocessing as a statistical restoration problem in which the goal is to obtain the best estimate of the line integrals needed for reconstruction from the set of noisy, degraded measurements. The authors have explored both penalized Poisson likelihood (PL) and penalized weighted least-squares (PWLS) objective functions. At low doses, the authors found that the PL approach outperforms PWLS in terms of resolution-noise tradeoffs, but at standard doses they perform similarly. The PWLS objective function, being quadratic, is more amenable to computational acceleration than the PL objective. In this work, the authors develop and compare two different methods for implementing PWLS sinogram restoration with the hope of improving computational performance relative to PL in the standard-dose regime. Sinogram restoration is still significant in the standard-dose regime since it can still outperform standard approaches and it allows for correction of effects that are not usually modeled in standard CT preprocessing. Methods: The authors have explored and compared two implementation strategies for PWLS sinogram restoration: (1) A direct matrix-inversion strategy based on the closed-form solution to the PWLS optimization problem and (2) an iterative approach based on the conjugate-gradient algorithm. Obtaining optimal performance from each strategy required modifying the naive off-the-shelf implementations of the algorithms to exploit the particular symmetry and sparseness of the sinogram-restoration problem. 
For the closed-form approach, the authors subdivided the large matrix inversion into smaller coupled problems and exploited sparseness to minimize matrix operations. For the conjugate-gradient approach, the authors exploited sparseness and preconditioned the problem to speed up convergence. Results: All methods produced qualitatively and quantitatively similar images as measured by resolution-variance tradeoffs and difference images. Despite the acceleration strategies, the direct matrix-inversion approach was found to be uncompetitive with iterative approaches, with a computational burden higher by an order of magnitude or more. The iterative conjugate-gradient approach, however, does appear promising, with computation times half that of the authors’ previous penalized-likelihood implementation. Conclusions: Iterative conjugate-gradient based PWLS sinogram restoration with careful matrix optimizations has computational advantages over direct matrix PWLS inversion and over penalized-likelihood sinogram restoration and can be considered a good alternative in standard-dose regimes. PMID:21158306

  15. Computer method for identification of boiler transfer functions

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1971-01-01

    An iterative computer method is described for identifying boiler transfer functions using frequency response data. An objective penalized performance measure and a nonlinear minimization technique are used to cause the locus of points generated by a transfer function to resemble the locus of points obtained from frequency response measurements. Different transfer functions can be tried until a satisfactory empirical transfer function of the system is found. To illustrate the method, some examples and some results from a study of a set of data consisting of measurements of the inlet impedance of a single-tube forced-flow boiler with inserts are given.
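
The fitting idea, choosing transfer-function parameters so the model locus resembles the measured frequency-response locus under a penalized performance measure, can be sketched with a hypothetical first-order model and a grid search standing in for the report's nonlinear minimization:

```python
import numpy as np

def fit_first_order(freqs, measured, taus, gains, penalty=0.0):
    """Grid-search sketch: choose (tau, K) so the model locus
    G(jw) = K / (1 + j*w*tau) minimizes the squared distance to the measured
    locus, plus an optional penalty term (here a simple quadratic penalty on
    the gain K; both the model form and the penalty are assumptions, not the
    report's)."""
    best = None
    for tau in taus:
        for K in gains:
            model = K / (1.0 + 1j * freqs * tau)
            loss = np.sum(np.abs(model - measured) ** 2) + penalty * K ** 2
            if best is None or loss < best[0]:
                best = (loss, tau, K)
    return best[1], best[2]
```

Trying several candidate model forms and keeping the best fit corresponds to the report's trial of different transfer functions.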

  16. AucPR: an AUC-based approach using penalized regression for disease prediction with high-dimensional omics data.

    PubMed

    Yu, Wenbao; Park, Taesung

    2014-01-01

    It is common to seek an optimal combination of markers for disease classification and prediction when multiple markers are available. Many approaches based on the area under the receiver operating characteristic curve (AUC) have been proposed. Existing AUC-based works in a high-dimensional context depend mainly on a non-parametric, smooth approximation of the AUC; no work uses a parametric AUC-based approach for high-dimensional data. We propose an AUC-based approach using penalized regression (AucPR), a parametric method for obtaining a linear combination that maximizes the AUC. To obtain the AUC maximizer in a high-dimensional context, we transform a classical parametric AUC maximizer, used in a low-dimensional context, into a regression framework and thus apply the penalized regression approach directly. Two kinds of penalization, lasso and elastic net, are considered. The parametric approach avoids some of the difficulties of conventional non-parametric AUC-based approaches, such as the lack of an appropriate concave objective function and the need for a prudent choice of the smoothing parameter. We apply the proposed AucPR to gene selection and classification using four real microarray datasets and synthetic data. Through numerical studies, AucPR is shown to perform better than penalized logistic regression and the non-parametric AUC-based method, in the sense of AUC and sensitivity for a given specificity, particularly when there are many correlated genes. We propose a powerful, parametric, and easily implementable linear classifier, AucPR, for gene selection and disease prediction with high-dimensional data. AucPR is recommended for its good prediction performance. Besides gene expression microarray data, AucPR can be applied to other types of high-dimensional omics data, such as miRNA and protein data.
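
The quantity AucPR maximizes is the AUC of a linear marker combination. A minimal Mann-Whitney implementation of that target is below; the penalized maximization itself is not shown.

```python
import numpy as np

def empirical_auc(scores_pos, scores_neg):
    """Mann-Whitney estimate of the AUC: the fraction of (diseased, healthy)
    pairs that the score ranks correctly, counting ties as 1/2."""
    diff = scores_pos[:, None] - scores_neg[None, :]
    return float(np.mean((diff > 0) + 0.5 * (diff == 0)))

def auc_of_combination(X_pos, X_neg, w):
    # AUC of the linear marker combination x -> w @ x.
    return empirical_auc(X_pos @ w, X_neg @ w)
```

Because this empirical AUC is a step function of w, the parametric approach in the record replaces it with a smooth regression-based surrogate before applying lasso or elastic-net penalties.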

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, J.V.

    The published work on exact penalization is indeed vast. Recently this work has indicated an intimate relationship between exact penalization, Lagrange multipliers, and problem stability or calmness. In the present work we chronicle this development within a simple idealized problem framework, wherein we unify, extend, and refine much of the known theory. In particular, most of the foundations for constrained optimization are developed with the aid of exact penalization techniques. Our approach is highly geometric and is based upon the elementary subdifferential theory for distance functions. It is assumed that the reader is familiar with the theory of convex sets and functions. 54 refs.

  18. Wavefield reconstruction inversion with a multiplicative cost function

    NASA Astrophysics Data System (ADS)

    da Silva, Nuno V.; Yao, Gang

    2018-01-01

    We present a method for the automatic estimation of the trade-off parameter in the context of wavefield reconstruction inversion (WRI). WRI formulates the inverse problem as an optimisation problem, minimising the data misfit while penalising with a wave-equation constraining term. The trade-off between the two terms is set by a scaling factor that balances the contributions of the data-misfit term and the constraining term to the value of the objective function. If this parameter is too large, the wave-equation term dominates, effectively imposing a hard constraint on the inversion. If it is too small, the solution is poorly constrained, as the inversion essentially fits the data misfit without taking into account the physics that explains the data. This paper introduces a new approach that recasts the WRI formulation into a multiplicative cost function. We demonstrate that the proposed method outperforms the additive cost function when the latter's trade-off parameter is appropriately scaled, when it is adapted throughout the iterations, and when the data are contaminated with Gaussian random noise. This work thus contributes a framework for a more automated application of WRI.

  19. Identification of boiler inlet transfer functions and estimation of system parameters

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1972-01-01

    An iterative computer method is described for identifying boiler transfer functions using frequency response data. An objective penalized performance measure and a nonlinear minimization technique are used to cause the locus of points generated by a transfer function to resemble the locus of points obtained from frequency response measurements. Different transfer functions can be tried until a satisfactory empirical transfer function of the system is found. To illustrate the method, some examples and some results from a study of a set of data consisting of measurements of the inlet impedance of a single tube forced flow boiler with inserts are given.

  20. Fast function-on-scalar regression with penalized basis expansions.

    PubMed

    Reiss, Philip T; Huang, Lei; Mennes, Maarten

    2010-01-01

    Regression models for functional responses and scalar predictors are often fitted by means of basis functions, with quadratic roughness penalties applied to avoid overfitting. The fitting approach described by Ramsay and Silverman in the 1990s amounts to a penalized ordinary least squares (P-OLS) estimator of the coefficient functions. We recast this estimator as a generalized ridge regression estimator, and present a penalized generalized least squares (P-GLS) alternative. We describe algorithms by which both estimators can be implemented, with automatic selection of optimal smoothing parameters, in a more computationally efficient manner than has heretofore been available. We discuss pointwise confidence intervals for the coefficient functions, simultaneous inference by permutation tests, and model selection, including a novel notion of pointwise model selection. P-OLS and P-GLS are compared in a simulation study. Our methods are illustrated with an analysis of age effects in a functional magnetic resonance imaging data set, as well as a reanalysis of a now-classic Canadian weather data set. An R package implementing the methods is publicly available.
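
The estimators compared above share one algebraic core, a generalized ridge solve. A minimal sketch follows, in which the basis matrix B, weight matrix W, and penalty matrix P are supplied by the caller:

```python
import numpy as np

def p_gls_fit(B, y, W, lam, P):
    """Generalized-ridge form of the estimators discussed above: the
    penalized criterion (y - B c)' W (y - B c) + lam * c' P c is minimized
    by c = (B' W B + lam P)^{-1} B' W y. With W the identity this is the
    P-OLS estimator; a nontrivial error-covariance weight W gives P-GLS."""
    return np.linalg.solve(B.T @ W @ B + lam * P, B.T @ W @ y)
```

The smoothing-parameter selection and inference machinery of the paper sit on top of repeated solves of this form.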

  21. Neutron Tomography of a Fuel Cell: Statistical Learning Implementation of a Penalized Likelihood Method

    NASA Astrophysics Data System (ADS)

    Coakley, Kevin J.; Vecchia, Dominic F.; Hussey, Daniel S.; Jacobson, David L.

    2013-10-01

    At the NIST Neutron Imaging Facility, we collect neutron projection data for both the dry and wet states of a Proton-Exchange-Membrane (PEM) fuel cell. Transmitted thermal neutrons captured in a scintillator doped with lithium-6 produce scintillation light that is detected by an amorphous silicon detector. Based on joint analysis of the dry and wet state projection data, we reconstruct a residual neutron attenuation image with a Penalized Likelihood method with an edge-preserving Huber penalty function that has two parameters that control how well jumps in the reconstruction are preserved and how well noisy fluctuations are smoothed out. The choice of these parameters greatly influences the resulting reconstruction. We present a data-driven method that objectively selects these parameters, and study its performance for both simulated and experimental data. Before reconstruction, we transform the projection data so that the variance-to-mean ratio is approximately one. For both simulated and measured projection data, the Penalized Likelihood method reconstruction is visually sharper than a reconstruction yielded by a standard Filtered Back Projection method. In an idealized simulation experiment, we demonstrate that the cross validation procedure selects regularization parameters that yield a reconstruction that is nearly optimal according to a root-mean-square prediction error criterion.
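
The edge-preserving Huber penalty at the center of this reconstruction has a simple closed form. The threshold delta below is one of the two tuning parameters; the second, the overall regularization weight, multiplies the summed penalty and is not shown.

```python
import numpy as np

def huber_penalty(t, delta):
    """Edge-preserving Huber penalty: quadratic for |t| <= delta (so small,
    noisy fluctuations are smoothed out) and linear beyond delta (so genuine
    jumps in the reconstruction are not over-penalized)."""
    a = np.abs(t)
    return np.where(a <= delta, 0.5 * t ** 2, delta * (a - 0.5 * delta))
```

The data-driven selection procedure in the record chooses delta and the regularization weight by cross validation rather than by hand.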

  22. Penalized weighted least-squares approach for multienergy computed tomography image reconstruction via structure tensor total variation regularization.

    PubMed

    Zeng, Dong; Gao, Yuanyuan; Huang, Jing; Bian, Zhaoying; Zhang, Hua; Lu, Lijun; Ma, Jianhua

    2016-10-01

    Multienergy computed tomography (MECT) allows identifying and differentiating different materials through simultaneous capture of multiple sets of energy-selective data belonging to specific energy windows. However, because each energy window collects far fewer photon counts than the whole energy window, the MECT images reconstructed by the analytical approach often suffer from a poor signal-to-noise ratio and strong streak artifacts. To address this challenge, this work presents a penalized weighted least-squares (PWLS) scheme incorporating the new concept of structure tensor total variation (STV) regularization, which is henceforth referred to as 'PWLS-STV' for simplicity. Specifically, the STV regularization is derived by penalizing higher-order derivatives of the desired MECT images. It thus provides more robust measures of image variation, eliminating the patchy artifacts often observed with total variation (TV) regularization. Subsequently, an alternating optimization algorithm was adopted to minimize the objective function. Extensive experiments with a digital XCAT phantom and a meat specimen clearly demonstrate that the present PWLS-STV algorithm achieves greater gains than the existing TV-based algorithms and the conventional filtered backprojection (FBP) algorithm in terms of both quantitative and visual quality evaluations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  23. dPIRPLE: a joint estimation framework for deformable registration and penalized-likelihood CT image reconstruction using prior images

    NASA Astrophysics Data System (ADS)

    Dang, H.; Wang, A. S.; Sussman, Marc S.; Siewerdsen, J. H.; Stayman, J. W.

    2014-09-01

    Sequential imaging studies are conducted in many clinical scenarios. Prior images from previous studies contain a great deal of patient-specific anatomical information and can be used in conjunction with subsequent imaging acquisitions to maintain image quality while enabling radiation dose reduction (e.g., through sparse angular sampling, reduction in fluence, etc.). However, patient motion between images in such sequences results in misregistration between the prior image and current anatomy. Existing prior-image-based approaches often include only a simple rigid registration step that can be insufficient for capturing complex anatomical motion, introducing detrimental effects in subsequent image reconstruction. In this work, we propose a joint framework that estimates the 3D deformation between an unregistered prior image and the current anatomy (based on a subsequent data acquisition) and reconstructs the current anatomical image using a model-based reconstruction approach that includes regularization based on the deformed prior image. This framework is referred to as deformable prior image registration, penalized-likelihood estimation (dPIRPLE). Central to this framework is the inclusion of a 3D B-spline-based free-form-deformation model into the joint registration-reconstruction objective function. The proposed framework is solved using a maximization strategy whereby alternating updates to the registration parameters and image estimates are applied, allowing for improvements in both the registration and reconstruction throughout the optimization process. Cadaver experiments were conducted on a cone-beam CT testbench emulating a lung nodule surveillance scenario. 
Superior reconstruction accuracy and image quality were demonstrated using the dPIRPLE algorithm as compared to more traditional reconstruction methods including filtered backprojection, penalized-likelihood estimation (PLE), prior image penalized-likelihood estimation (PIPLE) without registration, and prior image penalized-likelihood estimation with rigid registration of a prior image (PIRPLE) over a wide range of sampling sparsity and exposure levels.

  4. Integrative Analysis of “-Omics” Data Using Penalty Functions

    PubMed Central

    Zhao, Qing; Shi, Xingjie; Huang, Jian; Liu, Jin; Li, Yang; Ma, Shuangge

    2014-01-01

    In the analysis of omics data, integrative analysis provides an effective way of pooling information across multiple datasets or multiple correlated responses, and can be more effective than single-dataset (response) analysis. Multiple families of integrative analysis methods have been proposed in the literature. The current review focuses on the penalization methods. Special attention is paid to sparse meta-analysis methods that pool summary statistics across datasets, and integrative analysis methods that pool raw data across datasets. We discuss their formulation and rationale. Beyond “standard” penalized selection, we also review contrasted penalization and Laplacian penalization which accommodate finer data structures. The computational aspects, including computational algorithms and tuning parameter selection, are examined. This review concludes with possible limitations and extensions. PMID:25691921

  5. A TVSCAD approach for image deblurring with impulsive noise

    NASA Astrophysics Data System (ADS)

    Gu, Guoyong; Jiang, Suhong; Yang, Junfeng

    2017-12-01

    We consider the image deblurring problem in the presence of impulsive noise. It is known that total variation (TV) regularization with L1-norm penalized data fitting (TVL1 for short) works reasonably well only when the level of impulsive noise is relatively low. For high-level impulsive noise, TVL1 works poorly. The reason is that all data, both corrupted and noise-free, are equally penalized in data fitting, leading to insurmountable difficulty in balancing regularization and data fitting. In this paper, we propose to combine TV regularization with the nonconvex smoothly clipped absolute deviation (SCAD) penalty for data fitting (TVSCAD for short). Our motivation is simply that data fitting should be enforced only when an observation is unlikely to be severely corrupted, while for observations more likely to be severely corrupted, less or even no penalization should be enforced. A difference-of-convex-functions algorithm is adopted to solve the nonconvex TVSCAD model, resulting in a sequence of TVL1-equivalent problems, each of which can be solved efficiently by the alternating direction method of multipliers. Theoretically, we establish global convergence to a critical point of the nonconvex objective function. R-linear and at-least-sublinear convergence rate results are derived for the cases of anisotropic and isotropic TV, respectively. Numerically, experimental results show that the TVSCAD approach improves significantly on TVL1, especially for cases with high-level impulsive noise, and is comparable with the recently proposed iteratively corrected TVL1 method (Bai et al 2016 Inverse Problems 32 085004).
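
    The key to the TVSCAD motivation is the shape of the SCAD function: linear near zero, then tapering, then constant, so a severely corrupted observation incurs only a bounded penalty and stops influencing the fit. A minimal sketch of the Fan-Li SCAD penalty in its standard form (with the conventional a = 3.7):

```python
import numpy as np

def scad(t, lam=1.0, a=3.7):
    """Fan-Li SCAD penalty, elementwise: linear up to lam, quadratic taper
    up to a*lam, constant beyond (so gross outliers add a bounded cost)."""
    t = np.abs(np.asarray(t, dtype=float))
    return np.where(
        t <= lam,
        lam * t,
        np.where(t <= a * lam,
                 (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1)),
                 lam**2 * (a + 1) / 2),
    )

# small residuals are penalized like the L1 norm ...
print(float(scad(0.5)))                          # 0.5, same as lam * |t|
# ... while large residuals saturate: no extra cost past a*lam
print(float(scad(10.0)) == float(scad(100.0)))   # True
```

    Contrast with the L1 penalty of TVL1, which grows without bound and therefore keeps pulling the fit toward severely corrupted pixels.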

  6. Low-mAs X-ray CT image reconstruction by adaptive-weighted TV-constrained penalized re-weighted least-squares

    PubMed Central

    Liu, Yan; Ma, Jianhua; Zhang, Hao; Wang, Jing; Liang, Zhengrong

    2014-01-01

    Background The negative effects of X-ray exposure, such as the induction of genetic defects and cancer, have attracted increasing attention. Objective This paper aims to investigate a penalized re-weighted least-squares (PRWLS) strategy for low-mAs X-ray computed tomography image reconstruction by incorporating an adaptive weighted total variation (AwTV) penalty term and a noise variance model of the projection data. Methods An AwTV penalty is introduced into the objective function by considering both the piecewise-constant property and the local nearby intensity similarity of the desired image. Furthermore, the weight of the data fidelity term in the objective function is determined by our recent study on modeling the variance of projection data in the presence of electronic background noise. Results The presented AwTV-PRWLS algorithm achieves the highest full-width-at-half-maximum (FWHM) measurement for data conditions of (1) full-view 10 mA acquisition and (2) sparse-view 80 mA acquisition. In a comparison between the AwTV/TV-PRWLS strategies and the previously reported AwTV/TV-projection onto convex sets (AwTV/TV-POCS) approaches, the former gains in terms of FWHM under data condition (1), but not under data condition (2). Conclusions In the case of full-view 10 mA projection data, the presented AwTV-PRWLS shows potential improvement. However, in the case of sparse-view 80 mA projection data, the AwTV/TV-POCS approaches show an advantage over the PRWLS strategies. PMID:25080113

  7. Drag and drop simulation: from pictures to full three-dimensional simulations

    NASA Astrophysics Data System (ADS)

    Bergmann, Michel; Iollo, Angelo

    2014-11-01

    We present a suite of methods to achieve ``drag and drop'' simulation, i.e., to fully automate the process of performing three-dimensional flow simulations around bodies defined by actual images of moving objects. The overall approach requires skeleton graph generation to obtain a level-set function from the pictures, optimal transportation to obtain the body velocity on its surface, and flow simulation using a Cartesian method based on penalization. We illustrate this paradigm by simulating the swimming of a mackerel.

  8. Variable Selection for Support Vector Machines in Moderately High Dimensions

    PubMed Central

    Zhang, Xiang; Wu, Yichao; Wang, Lan; Li, Runze

    2015-01-01

    Summary The support vector machine (SVM) is a powerful binary classification tool with high accuracy and great flexibility. It has achieved great success, but its performance can be seriously impaired if many redundant covariates are included. Some efforts have been devoted to studying variable selection for SVMs, but asymptotic properties, such as variable selection consistency, are largely unknown when the number of predictors diverges to infinity. In this work, we establish a unified theory for a general class of nonconvex penalized SVMs. We first prove that in ultra-high dimensions, there exists one local minimizer to the objective function of nonconvex penalized SVMs possessing the desired oracle property. We further address the problem of nonunique local minimizers by showing that the local linear approximation algorithm is guaranteed to converge to the oracle estimator even in the ultra-high dimensional setting if an appropriate initial estimator is available. This condition on initial estimator is verified to be automatically valid as long as the dimensions are moderately high. Numerical examples provide supportive evidence. PMID:26778916

  9. Functional Generalized Structured Component Analysis.

    PubMed

    Suk, Hye Won; Hwang, Heungsun

    2016-12-01

    An extension of Generalized Structured Component Analysis (GSCA), called Functional GSCA, is proposed to analyze functional data that are considered to arise from an underlying smooth curve varying over time or other continua. GSCA has been geared for the analysis of multivariate data. Accordingly, it cannot deal with functional data that often involve different measurement occasions across participants and a large number of measurement occasions that exceed the number of participants. Functional GSCA addresses these issues by integrating GSCA with spline basis function expansions that project infinite-dimensional curves onto a finite-dimensional space. For parameter estimation, Functional GSCA minimizes a penalized least squares criterion by using an alternating penalized least squares estimation algorithm. The usefulness of Functional GSCA is illustrated with gait data.

  10. Metabolic flux estimation using particle swarm optimization with penalty function.

    PubMed

    Long, Hai-Xia; Xu, Wen-Bo; Sun, Jun

    2009-01-01

    Metabolic flux estimation through 13C tracer experiments is crucial for quantifying intracellular metabolic fluxes. In fact, it corresponds to a constrained optimization problem that minimizes a weighted distance between measured and simulated results. In this paper, we propose particle swarm optimization (PSO) with a penalty function to solve the 13C-based metabolic flux estimation problem. The constrained problem is transformed into an unconstrained one by penalizing violations of the stoichiometric constraints and building a single objective function, which in turn is minimized using the PSO algorithm for flux quantification. The proposed algorithm is applied to estimate the central metabolic fluxes of Corynebacterium glutamicum. Simulation results show that the proposed algorithm has superior performance and faster convergence compared with other existing algorithms.
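
    The penalty-function transformation can be sketched on a toy flux problem. In the code below every number (the stoichiometric matrix, the measurements, the PSO constants) is invented for the demo; a quadratic penalty folds the balance constraint S v = 0 into a single objective, which a bare-bones particle swarm then minimizes:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy flux-fitting problem (all numbers invented): match measured fluxes
# subject to a stoichiometric balance S @ v = 0, folded into one objective
# through a quadratic penalty on the constraint violation.
S = np.array([[1.0, -1.0, -1.0]])          # flux v1 splits into v2 + v3
v_meas = np.array([2.0, 1.0, 0.6])         # measurements violate the balance

def penalized_obj(v, mu=100.0):
    fit = np.sum((v - v_meas) ** 2)         # weighted distance (unit weights)
    return fit + mu * np.sum((S @ v) ** 2)  # + penalty on S v != 0

# bare-bones particle swarm: inertia + personal-best + global-best updates
n_particles, dim = 30, 3
x = rng.uniform(0.0, 3.0, (n_particles, dim))
vel = np.zeros_like(x)
pbest = x.copy()
pbest_val = np.array([penalized_obj(p) for p in x])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(200):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + vel
    val = np.array([penalized_obj(p) for p in x])
    better = val < pbest_val
    pbest[better], pbest_val[better] = x[better], val[better]
    gbest = pbest[np.argmin(pbest_val)].copy()
```

    A larger penalty weight mu drives the swarm's best solution closer to exact stoichiometric balance, at the cost of a harder landscape to search.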

  11. Historical HIV incidence modelling in regional subgroups: use of flexible discrete models with penalized splines based on prior curves.

    PubMed

    Greenland, S

    1996-03-15

    This paper presents an approach to back-projection (back-calculation) of human immunodeficiency virus (HIV) person-year infection rates in regional subgroups based on combining a log-linear model for subgroup differences with a penalized spline model for trends. The penalized spline approach allows flexible trend estimation but requires far fewer parameters than fully non-parametric smoothers, thus saving parameters that can be used in estimating subgroup effects. Use of a reasonable prior curve to construct the penalty function minimizes the degree of smoothing needed beyond model specification. The approach is illustrated in an application to acquired immunodeficiency syndrome (AIDS) surveillance data from Los Angeles County.

  12. Majorization Minimization by Coordinate Descent for Concave Penalized Generalized Linear Models

    PubMed Central

    Jiang, Dingfeng; Huang, Jian

    2013-01-01

    Recent studies have demonstrated theoretical attractiveness of a class of concave penalties in variable selection, including the smoothly clipped absolute deviation and minimax concave penalties. The computation of the concave penalized solutions in high-dimensional models, however, is a difficult task. We propose a majorization minimization by coordinate descent (MMCD) algorithm for computing the concave penalized solutions in generalized linear models. In contrast to the existing algorithms that use local quadratic or local linear approximation to the penalty function, the MMCD seeks to majorize the negative log-likelihood by a quadratic loss, but does not use any approximation to the penalty. This strategy makes it possible to avoid the computation of a scaling factor in each update of the solutions, which improves the efficiency of coordinate descent. Under certain regularity conditions, we establish theoretical convergence property of the MMCD. We implement this algorithm for a penalized logistic regression model using the SCAD and MCP penalties. Simulation studies and a data example demonstrate that the MMCD works sufficiently fast for the penalized logistic regression in high-dimensional settings where the number of covariates is much larger than the sample size. PMID:25309048
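
    As a rough sketch of the MM idea (not the authors' implementation), the code below fits an MCP-penalized logistic regression by coordinate descent, majorizing the negative log-likelihood with the standard quadratic bound p(1-p) <= 1/4 so that each coordinate's scaling factor v_j is fixed up front; the MCP parameter gamma is chosen so that v_j > 1/gamma, which keeps each univariate update convex. All problem sizes are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p); beta_true[:2] = [1.5, -2.0]
y = (rng.random(n) < 1 / (1 + np.exp(-(X @ beta_true)))).astype(float)

lam, gamma = 0.05, 8.0            # gamma chosen so that v_j > 1/gamma below
v = (X**2).sum(axis=0) / (4 * n)  # fixed majorizer scale from p(1-p) <= 1/4

beta = np.zeros(p)
eta = X @ beta
for _ in range(100):                                  # outer MM sweeps
    for j in range(p):                                # coordinate descent
        mu = 1 / (1 + np.exp(-eta))
        u = v[j] * beta[j] + X[:, j] @ (y - mu) / n   # majorized linear term
        if abs(u) <= lam:                             # MCP threshold: exact
            b = 0.0                                   # zero for small u ...
        elif abs(u) <= gamma * lam * v[j]:            # ... shrunken in between
            b = np.sign(u) * (abs(u) - lam) / (v[j] - 1 / gamma)
        else:                                         # ... unpenalized beyond
            b = u / v[j]
        eta += X[:, j] * (b - beta[j])
        beta[j] = b
```

    Because v_j never changes, no per-iteration scaling factor is recomputed, which is the efficiency gain the abstract describes relative to local quadratic approximations of the penalty.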

  13. Sparse Logistic Regression for Diagnosis of Liver Fibrosis in Rat by Using SCAD-Penalized Likelihood

    PubMed Central

    Yan, Fang-Rong; Lin, Jin-Guan; Liu, Yu

    2011-01-01

    The objective of the present study is to determine the quantitative relationship between the progression of liver fibrosis and the levels of certain serum markers using a mathematical model. We provide a sparse logistic regression using a smoothly clipped absolute deviation (SCAD) penalty function to diagnose liver fibrosis in rats. Not only does it give a sparse solution with high accuracy, it also provides users with precise classification probabilities along with the class information. In both the simulation case and the experimental case, the proposed method is comparable to stepwise linear discriminant analysis (SLDA) and sparse logistic regression with the least absolute shrinkage and selection operator (LASSO) penalty, as assessed by receiver operating characteristic (ROC) analysis with Bayesian bootstrap estimation of the area under the curve (AUC) as diagnostic sensitivity for the selected variables. Results show that the new approach provides a good correlation between serum marker levels and liver fibrosis induced by thioacetamide (TAA) in rats. Meanwhile, this approach might also be used in predicting the development of liver cirrhosis. PMID:21716672

  14. Variable selection in semiparametric cure models based on penalized likelihood, with application to breast cancer clinical trials.

    PubMed

    Liu, Xiang; Peng, Yingwei; Tu, Dongsheng; Liang, Hua

    2012-10-30

    Survival data with a sizable cure fraction are commonly encountered in cancer research. The semiparametric proportional hazards cure model has been recently used to analyze such data. As seen in the analysis of data from a breast cancer study, a variable selection approach is needed to identify important factors in predicting the cure status and risk of breast cancer recurrence. However, no specific variable selection method for the cure model is available. In this paper, we present a variable selection approach with penalized likelihood for the cure model. The estimation can be implemented easily by combining the computational methods for penalized logistic regression and the penalized Cox proportional hazards models with the expectation-maximization algorithm. We illustrate the proposed approach on data from a breast cancer study. We conducted Monte Carlo simulations to evaluate the performance of the proposed method. We used and compared different penalty functions in the simulation studies. Copyright © 2012 John Wiley & Sons, Ltd.

  15. Joint penalized-likelihood reconstruction of time-activity curves and regions-of-interest from projection data in brain PET

    NASA Astrophysics Data System (ADS)

    Krestyannikov, E.; Tohka, J.; Ruotsalainen, U.

    2008-06-01

    This paper presents a novel statistical approach for joint estimation of regions-of-interest (ROIs) and the corresponding time-activity curves (TACs) from dynamic positron emission tomography (PET) brain projection data. It is based on optimizing the joint objective function that consists of a data log-likelihood term and two penalty terms reflecting the available a priori information about the human brain anatomy. The developed local optimization strategy iteratively updates both the ROI and TAC parameters and is guaranteed to monotonically increase the objective function. The quantitative evaluation of the algorithm is performed with numerically and Monte Carlo-simulated dynamic PET brain data of the 11C-Raclopride and 18F-FDG tracers. The results demonstrate that the method outperforms the existing sequential ROI quantification approaches in terms of accuracy, and can noticeably reduce the errors in TACs arising due to the finite spatial resolution and ROI delineation.

  16. Across-Platform Imputation of DNA Methylation Levels Incorporating Nonlocal Information Using Penalized Functional Regression.

    PubMed

    Zhang, Guosheng; Huang, Kuan-Chieh; Xu, Zheng; Tzeng, Jung-Ying; Conneely, Karen N; Guan, Weihua; Kang, Jian; Li, Yun

    2016-05-01

    DNA methylation is a key epigenetic mark involved in both normal development and disease progression. Recent advances in high-throughput technologies have enabled genome-wide profiling of DNA methylation. However, DNA methylation profiling often employs different designs and platforms with varying resolution, which hinders joint analysis of methylation data from multiple platforms. In this study, we propose a penalized functional regression model to impute missing methylation data. By incorporating functional predictors, our model utilizes information from nonlocal probes to improve imputation quality. Here, we compared the performance of our functional model to linear regression and the best single probe surrogate in real data and via simulations. Specifically, we applied different imputation approaches to an acute myeloid leukemia dataset consisting of 194 samples and our method showed higher imputation accuracy, manifested, for example, by a 94% relative increase in information content and up to 86% more CpG sites passing post-imputation filtering. Our simulated association study further demonstrated that our method substantially improves the statistical power to identify trait-associated methylation loci. These findings indicate that the penalized functional regression model is a convenient and valuable imputation tool for methylation data, and it can boost statistical power in downstream epigenome-wide association study (EWAS). © 2016 WILEY PERIODICALS, INC.

  17. Penalized spline estimation for functional coefficient regression models.

    PubMed

    Cao, Yanrong; Lin, Haiqun; Wu, Tracy Z; Yu, Yan

    2010-04-01

    The functional coefficient regression models assume that the regression coefficients vary with some "threshold" variable, providing appreciable flexibility in capturing the underlying dynamics in data and avoiding the so-called "curse of dimensionality" in multivariate nonparametric estimation. We first investigate the estimation, inference, and forecasting for the functional coefficient regression models with dependent observations via penalized splines. The P-spline approach, as a direct ridge regression shrinkage type global smoothing method, is computationally efficient and stable. With established fixed-knot asymptotics, inference is readily available. Exact inference can be obtained for fixed smoothing parameter λ, which is most appealing for finite samples. Our penalized spline approach gives an explicit model expression, which also enables multi-step-ahead forecasting via simulations. Furthermore, we examine different methods of choosing the important smoothing parameter λ: modified multi-fold cross-validation (MCV), generalized cross-validation (GCV), and an extension of empirical bias bandwidth selection (EBBS) to P-splines. In addition, we implement smoothing parameter selection using a mixed-model framework through restricted maximum likelihood (REML) for P-spline functional coefficient regression models with independent observations. The P-spline approach also easily allows different smoothness for different functional coefficients, enabled by assigning a different penalty λ to each. We demonstrate the proposed approach by both simulation examples and a real data application.
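
    A minimal penalized-spline fit in the ridge-shrinkage spirit described above can be sketched as follows. A truncated-power basis stands in for the B-spline basis, the ridge penalty acts on the knot coefficients only, and λ is simply fixed here (in practice it would be chosen by GCV, MCV, EBBS, or REML as the abstract discusses); all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(n)

# truncated-power-basis spline: global cubic plus one shifted cubic per knot
knots = np.linspace(0.05, 0.95, 20)
B = np.column_stack([np.ones(n), x, x**2, x**3] +
                    [np.clip(x - k, 0.0, None) ** 3 for k in knots])

# ridge-type shrinkage on the knot coefficients only (the "P" in P-spline)
lam = 1e-4
P = np.eye(B.shape[1]); P[:4, :4] = 0.0     # leave the global cubic unpenalized
coef = np.linalg.solve(B.T @ B + lam * P, B.T @ y)
fit = B @ coef
```

    Assigning a separate λ per coefficient function, as the paper allows, amounts to placing different penalty blocks on the corresponding sub-blocks of P.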

  18. Penalized nonparametric scalar-on-function regression via principal coordinates

    PubMed Central

    Reiss, Philip T.; Miller, David L.; Wu, Pei-Shien; Hua, Wen-Yu

    2016-01-01

    A number of classical approaches to nonparametric regression have recently been extended to the case of functional predictors. This paper introduces a new method of this type, which extends intermediate-rank penalized smoothing to scalar-on-function regression. In the proposed method, which we call principal coordinate ridge regression, one regresses the response on leading principal coordinates defined by a relevant distance among the functional predictors, while applying a ridge penalty. Our publicly available implementation, based on generalized additive modeling software, allows for fast optimal tuning parameter selection and for extensions to multiple functional predictors, exponential family-valued responses, and mixed-effects models. In an application to signature verification data, principal coordinate ridge regression, with dynamic time warping distance used to define the principal coordinates, is shown to outperform a functional generalized linear model. PMID:29217963
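
    The principal coordinate ridge regression recipe (distances among functional predictors, classical MDS to get principal coordinates, then ridge regression on the leading coordinates) can be sketched as follows. Synthetic curves and a plain L2 distance stand in for the paper's signature data and dynamic time warping distance:

```python
import numpy as np

rng = np.random.default_rng(4)
n, T = 100, 50
t = np.linspace(0, 1, T)
dt = t[1] - t[0]

# synthetic functional predictors: random mixtures of two smooth shapes
a = rng.standard_normal((n, 1))
b = rng.standard_normal((n, 1))
curves = a * np.sin(np.pi * t) + b * np.cos(np.pi * t) \
    + 0.1 * rng.standard_normal((n, T))
y = (curves * np.sin(np.pi * t)).sum(axis=1) * dt \
    + 0.1 * rng.standard_normal(n)   # scalar response via a functional effect

# squared L2 distances among the predictor curves
D2 = ((curves[:, None, :] - curves[None, :, :]) ** 2).sum(axis=2) * dt

# classical MDS: double-centre -0.5 * J D2 J, take leading eigenvectors
J = np.eye(n) - np.ones((n, n)) / n
G = -0.5 * J @ D2 @ J
w, V = np.linalg.eigh(G)
top = np.argsort(w)[::-1][:10]
Z = V[:, top] * np.sqrt(np.clip(w[top], 0, None))   # principal coordinates

# ridge regression of the (centred) response on the coordinates
lam = 1.0
coef = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ (y - y.mean()))
pred = Z @ coef + y.mean()
```

    Swapping in any other distance among the curves only changes D2; the MDS and ridge steps are unchanged, which is what makes the method flexible.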

  19. Cox Regression Models with Functional Covariates for Survival Data.

    PubMed

    Gellar, Jonathan E; Colantuoni, Elizabeth; Needham, Dale M; Crainiceanu, Ciprian M

    2015-06-01

    We extend the Cox proportional hazards model to cases when the exposure is a densely sampled functional process, measured at baseline. The fundamental idea is to combine penalized signal regression with methods developed for mixed effects proportional hazards models. The model is fit by maximizing the penalized partial likelihood, with smoothing parameters estimated by a likelihood-based criterion such as AIC or EPIC. The model may be extended to allow for multiple functional predictors, time varying coefficients, and missing or unequally-spaced data. Methods were inspired by and applied to a study of the association between time to death after hospital discharge and daily measures of disease severity collected in the intensive care unit, among survivors of acute respiratory distress syndrome.

  20. Inverse determination of the penalty parameter in penalized weighted least-squares algorithm for noise reduction of low-dose CBCT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jing; Guan, Huaiqun; Solberg, Timothy

    2011-07-15

    Purpose: A statistical projection restoration algorithm based on the penalized weighted least-squares (PWLS) criterion can substantially improve the image quality of low-dose CBCT images. The performance of PWLS is largely dependent on the choice of the penalty parameter. Previously, the penalty parameter was chosen empirically by trial and error. In this work, the authors developed an inverse technique to calculate the penalty parameter in PWLS for noise suppression of low-dose CBCT in image-guided radiotherapy (IGRT). Methods: In IGRT, a daily CBCT is acquired for the same patient during a treatment course. In this work, the authors acquired the CBCT with a high-mAs protocol for the first session and a lower-mAs protocol for the subsequent sessions. The high-mAs projections served as the goal (ideal) toward which the low-mAs projections were to be smoothed by minimizing the PWLS objective function. The penalty parameter was determined through an inverse calculation of the derivative of the objective function incorporating both the high- and low-mAs projections. The parameter thus obtained can be used in PWLS to smooth the noise in low-dose projections. CBCT projections for a CatPhan 600 and an anthropomorphic head phantom, as well as for a brain patient, were used to evaluate the performance of the proposed technique. Results: The penalty parameter in PWLS was obtained for each CBCT projection using the proposed strategy. The noise in the low-dose CBCT images reconstructed from the smoothed projections was greatly suppressed. Image quality in PWLS-processed low-dose CBCT was comparable to that of the corresponding high-dose CBCT. Conclusions: A technique was proposed to estimate the penalty parameter for the PWLS algorithm. It provides an objective and efficient way to obtain the penalty parameter for image restoration algorithms that require predefined smoothing parameters.
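
    The idea of calibrating the penalty parameter against a high-dose acquisition can be sketched in one dimension. A grid search stands in for the authors' derivative-based inverse calculation, and a simple quadratic roughness penalty stands in for the full PWLS model; all signals are synthetic:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 128
truth = np.sin(np.linspace(0, 4 * np.pi, n))
y_high = truth + 0.02 * rng.standard_normal(n)   # high-mAs: nearly ideal
y_low = truth + 0.30 * rng.standard_normal(n)    # low-mAs: noisy

# quadratic PWLS-style smoother: argmin_x ||x - y||^2 + beta * ||D x||^2
D = np.diff(np.eye(n), axis=0)                   # first-difference operator
H = D.T @ D
def smooth(y, beta):
    return np.linalg.solve(np.eye(n) + beta * H, y)

# choose beta so the smoothed low-mAs data best match the high-mAs goal
betas = np.logspace(-2, 3, 30)
errs = [np.sum((smooth(y_low, bb) - y_high) ** 2) for bb in betas]
beta_star = betas[int(np.argmin(errs))]
```

    Once beta_star is calibrated against the first (high-mAs) session, it can be reused to smooth all subsequent low-mAs sessions, mirroring the IGRT workflow described above.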

  1. Estimation and Selection via Absolute Penalized Convex Minimization And Its Multistage Adaptive Applications

    PubMed Central

    Huang, Jian; Zhang, Cun-Hui

    2013-01-01

    The ℓ1-penalized method, or the Lasso, has emerged as an important tool for the analysis of large data sets. Many important results have been obtained for the Lasso in linear regression which have led to a deeper understanding of high-dimensional statistical problems. In this article, we consider a class of weighted ℓ1-penalized estimators for convex loss functions of a general form, including the generalized linear models. We study the estimation, prediction, selection and sparsity properties of the weighted ℓ1-penalized estimator in sparse, high-dimensional settings where the number of predictors p can be much larger than the sample size n. Adaptive Lasso is considered as a special case. A multistage method is developed to approximate concave regularized estimation by applying an adaptive Lasso recursively. We provide prediction and estimation oracle inequalities for single- and multi-stage estimators, a general selection consistency theorem, and an upper bound for the dimension of the Lasso estimator. Important models including the linear regression, logistic regression and log-linear models are used throughout to illustrate the applications of the general results. PMID:24348100
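
    The multistage idea of approximating concave regularization by recursively re-weighted ℓ1 penalties can be sketched as follows. Each stage solves a weighted Lasso (here by plain ISTA, a stand-in solver) and sets the next stage's weights from the SCAD derivative, so large coefficients become essentially unpenalized; all problem sizes and constants are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta_true = np.zeros(p); beta_true[:3] = [3.0, -2.0, 1.5]
y = X @ beta_true + 0.5 * rng.standard_normal(n)

def weighted_lasso_ista(X, y, lam, w, steps=500):
    """ISTA for (1/2n)||y - Xb||^2 + lam * sum_j w_j |b_j|."""
    n = len(y)
    L = np.linalg.norm(X, 2) ** 2 / n      # Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(steps):
        z = b - X.T @ (X @ b - y) / (n * L)
        b = np.sign(z) * np.maximum(np.abs(z) - lam * w / L, 0)
    return b

def scad_deriv(t, lam, a=3.7):
    """SCAD derivative: lam near zero, tapering to 0 beyond a*lam."""
    t = np.abs(t)
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0) / (a - 1))

# multistage: each stage re-weights the l1 penalty by the SCAD derivative
lam = 0.2
w = np.ones(p)
for _ in range(3):
    b = weighted_lasso_ista(X, y, lam, w)
    w = scad_deriv(b, lam) / lam
```

    After the first stage, the strong coefficients receive near-zero weights and are refit almost without shrinkage, while the null coefficients keep the full penalty, mimicking the concave (SCAD-type) estimator.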

  2. [Sponsoring of medical conferences, workshops and symposia by pharmaceutical companies. Physicians must be wary of this!].

    PubMed

    Warntjen, M

    2009-12-01

    The longstanding conventional forms of cooperation between medical organizations and physicians on the one hand and the pharmaceutical industry and manufacturers of medical products on the other hand nowadays hold the risk of coming into conflict with the public prosecutor. Typical circumstances which are taken up by the investigating authorities are financial supports of medical conferences, workshops and symposia. To understand the problem under criminal law it is important to become acquainted with the protective aim of the statutory offences of the acceptance of benefits according to § 331 of the Penal Code (Strafgesetzbuch, StGB) and of corruption according to § 332 of the Penal Code. The "trust of the general public in the objectivity of governmental decisions" must be protected and the "evil appearance of the corruptibility of official acts" must be counteracted. A basic differentiation is made between physicians with and without office-bearing functions. By paying attention to the recommendations and basic principles of cooperation between the medical profession and the healthcare industry presented in this article (transparency principle, equivalence principle, documentation principle and separation principle) the emergence of any suspicious factors can be effectively avoided.

  3. Sparse High Dimensional Models in Economics

    PubMed Central

    Fan, Jianqing; Lv, Jinchi; Qi, Lei

    2010-01-01

    This paper reviews the literature on sparse high dimensional models and discusses some applications in economics and finance. Recent developments of theory, methods, and implementations in penalized least squares and penalized likelihood methods are highlighted. These variable selection methods are proved to be effective in high dimensional sparse modeling. The limits of dimensionality that regularization methods can handle, the role of penalty functions, and their statistical properties are detailed. Some recent advances in ultra-high dimensional sparse modeling are also briefly discussed. PMID:22022635

  4. Motion visualization and estimation for flapping wing systems

    NASA Astrophysics Data System (ADS)

    Hsu, Tzu-Sheng Shane; Fitzgerald, Timothy; Nguyen, Vincent Phuc; Patel, Trisha; Balachandran, Balakumar

    2017-04-01

    Studies of fluid-structure interactions associated with flexible structures such as flapping wings require the capture and quantification of large motions of bodies that may be opaque. As a case study, motion capture of a free flying Manduca sexta, also known as hawkmoth, is considered by using three synchronized high-speed cameras. A solid finite element (FE) representation is used as a reference body and successive snapshots in time of the displacement fields are reconstructed via an optimization procedure. One of the original aspects of this work is the formulation of an objective function and the use of shadow matching and strain-energy regularization. With this objective function, the authors penalize the projection differences between silhouettes of the captured images and the FE representation of the deformed body. The process and procedures undertaken to go from high-speed videography to motion estimation are discussed, and snapshots of representative results are presented. Finally, the captured free-flight motion is also characterized and quantified.

  5. Subjective and Objective Evaluation of Pitch Extractors for LPC and Harmonic Deviations Vocoders.

    DTIC Science & Technology

    1984-07-01

    The significance of the voicing errors at unvoiced-voiced transitions (see Section 7.2) is to penalize the early and the late start of… Report No. 5726, Bolt Beranek and Newman Inc. (Table residue: correlations of objective measures with PAL scores for LPC/clear, LPC/noise, and HDV/noise conditions, approximately -0.990 to -0.995.)

  6. Polarimetric image reconstruction algorithms

    NASA Astrophysics Data System (ADS)

    Valenzuela, John R.

    In the field of imaging polarimetry Stokes parameters are sought and must be inferred from noisy and blurred intensity measurements. Using a penalized-likelihood estimation framework we investigate reconstruction quality when estimating intensity images and then transforming to Stokes parameters (traditional estimator), and when estimating Stokes parameters directly (Stokes estimator). We define our cost function for reconstruction by a weighted least squares data fit term and a regularization penalty. It is shown that under quadratic regularization, the traditional and Stokes estimators can be made equal by appropriate choice of regularization parameters. It is empirically shown that, when using edge preserving regularization, estimating the Stokes parameters directly leads to lower RMS error in reconstruction. Also, the addition of a cross channel regularization term further lowers the RMS error for both methods especially in the case of low SNR. The technique of phase diversity has been used in traditional incoherent imaging systems to jointly estimate an object and optical system aberrations. We extend the technique of phase diversity to polarimetric imaging systems. Specifically, we describe penalized-likelihood methods for jointly estimating Stokes images and optical system aberrations from measurements that contain phase diversity. Jointly estimating Stokes images and optical system aberrations involves a large parameter space. A closed-form expression for the estimate of the Stokes images in terms of the aberration parameters is derived and used in a formulation that reduces the dimensionality of the search space to the number of aberration parameters only. We compare the performance of the joint estimator under both quadratic and edge-preserving regularization. The joint estimator with edge-preserving regularization yields higher fidelity polarization estimates than with quadratic regularization. 
Under quadratic regularization, using the reduced-parameter search strategy, accurate aberration estimates can be obtained without recourse to regularization "tuning". Phase-diverse wavefront sensing is emerging as a viable candidate wavefront sensor for adaptive-optics systems. In a quadratically penalized weighted least squares estimation framework a closed form expression for the object being imaged in terms of the aberrations in the system is available. This expression offers a dramatic reduction of the dimensionality of the estimation problem and thus is of great interest for practical applications. We have derived an expression for an approximate joint covariance matrix for object and aberrations in the phase diversity context. Our expression for the approximate joint covariance is compared with the "known-object" Cramer-Rao lower bound that is typically used for system parameter optimization. Estimates of the optimal amount of defocus in a phase-diverse wavefront sensor derived from the joint-covariance matrix, the known-object Cramer-Rao bound, and Monte Carlo simulations are compared for an extended scene and a point object. It is found that our variance approximation, that incorporates the uncertainty of the object, leads to an improvement in predicting the optimal amount of defocus to use in a phase-diverse wavefront sensor.

  7. Improving Cluster Analysis with Automatic Variable Selection Based on Trees

    DTIC Science & Technology

    2014-12-01

    regression trees; Daisy (DISsimilAritY); PAM (partitioning around medoids); PMA (penalized multivariate analysis); SPC (sparse principal components); UPGMA (unweighted pair-group average method). The UPGMA method measures dissimilarities between all objects in two clusters and takes the average value

  8. Iterative image reconstruction for multienergy computed tomography via structure tensor total variation regularization

    NASA Astrophysics Data System (ADS)

    Zeng, Dong; Bian, Zhaoying; Gong, Changfei; Huang, Jing; He, Ji; Zhang, Hua; Lu, Lijun; Feng, Qianjin; Liang, Zhengrong; Ma, Jianhua

    2016-03-01

    Multienergy computed tomography (MECT) has the potential to simultaneously offer multiple sets of energy-selective data belonging to specific energy windows. However, because sufficient photon counts are not available in the specific energy windows compared with the whole energy window, the MECT images reconstructed by the analytical approach often suffer from poor signal-to-noise ratio (SNR) and strong streak artifacts. To eliminate this drawback, in this work we present a penalized weighted least-squares (PWLS) scheme incorporating the new concept of structure tensor total variation (STV) regularization to improve MECT image quality from low-milliampere-seconds (low-mAs) data acquisitions. Henceforth the present scheme is referred to as `PWLS-STV' for simplicity. Specifically, the STV regularization is derived by penalizing the eigenvalues of the structure tensor of every point in the MECT images. It can thus provide more robust measures of image variation, which can eliminate the patchy artifacts often observed with total variation regularization. Subsequently, an alternating optimization algorithm was adopted to minimize the objective function. Experiments with a digital XCAT phantom clearly demonstrate that the present PWLS-STV algorithm achieves greater gains than the existing TV-based algorithms and the conventional filtered backprojection (FBP) algorithm in terms of noise-induced artifact suppression, resolution preservation, and material decomposition assessment.
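The PWLS objective structure (weighted data fit plus regularizer) can be illustrated in one dimension. The sketch below substitutes an ordinary smoothed TV penalty for the paper's structure-tensor TV and plain gradient descent for its alternating optimization; the signal, weights, and parameter values are illustrative assumptions:

```python
import numpy as np

def pwls_tv_denoise(y, w, beta=1.0, eps=1e-2, lr=0.02, iters=5000):
    """Minimize sum_i w_i (x_i - y_i)^2 + beta * sum_i sqrt(d_i^2 + eps),
    with d_i = x_{i+1} - x_i (a smoothed 1-D TV), by gradient descent."""
    x = y.copy()
    for _ in range(iters):
        d = np.diff(x)
        td = d / np.sqrt(d ** 2 + eps)         # d/dd of smoothed |d|
        g_tv = np.zeros_like(x)
        g_tv[:-1] -= td                        # chain rule: dd_i/dx_i = -1
        g_tv[1:] += td                         #             dd_i/dx_{i+1} = +1
        x -= lr * (2.0 * w * (x - y) + beta * g_tv)
    return x

rng = np.random.default_rng(0)
truth = np.r_[np.zeros(50), np.ones(50)]       # piecewise-constant signal
y = truth + 0.2 * rng.normal(size=100)
x = pwls_tv_denoise(y, np.ones(100))           # iid noise -> uniform weights
print(np.abs(x - truth).mean())                # well below the noise level
```

In a real PWLS reconstruction the weights come from the (energy-window-dependent) measurement variances rather than being uniform.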

  9. Civil commitment and the criminal insanity plea in Israeli law.

    PubMed

    Toib, Josef A

    2008-01-01

    In Israeli jurisprudence there is a substantial difference in the treatment of mentally ill patients between the civil and penal law systems that goes well beyond the differences required by their separate objectives. Mentally ill people dangerous to others due to their illness belong in the hospital, not in the community or in jail. The data gathered especially for this paper make it hard to escape the conclusion that contemporary practice in Israel does not accord with this objective. On the civil front, inaccuracy in predicting who is dangerous may lead to involuntary commitment of people who are not dangerous. On the criminal side, too few people are sent to the hospital in Israel and correspondingly too many to jail. Comparison with US data and practice shows that on the civil side prediction has been improved by using actuarial methods, while on the penal side more up-to-date definitions of mental illness have been adopted. Whatever the appropriate solution for Israel, surely the first requirement is recognition of the problem.

  10. Development and application of a volume penalization immersed boundary method for the computation of blood flow and shear stresses in cerebral vessels and aneurysms.

    PubMed

    Mikhal, Julia; Geurts, Bernard J

    2013-12-01

    A volume-penalizing immersed boundary method is presented for the simulation of laminar incompressible flow inside geometrically complex blood vessels in the human brain. We concentrate on cerebral aneurysms and compute flow in curved brain vessels with and without spherical aneurysm cavities attached. We approximate blood as an incompressible Newtonian fluid and simulate the flow with the use of a skew-symmetric finite-volume discretization and explicit time-stepping. A key element of the immersed boundary method is the so-called masking function. This is a binary function with which we identify, at any location in the domain, whether it is 'solid' or 'fluid', allowing us to represent objects immersed in a Cartesian grid. We compare three definitions of the masking function for geometries that are not aligned with the grid. In each case a 'staircase' representation is used in which a grid cell is either 'solid' or 'fluid'. Reliable findings are obtained with our immersed boundary method, even at fairly coarse meshes with about 16 grid cells across a velocity profile. The validation of the immersed boundary method is provided on the basis of classical Poiseuille flow in a cylindrical pipe. We obtain first-order convergence for the velocity and the shear stress, reflecting the fact that in our approach the solid-fluid interface is localized with an accuracy on the order of a grid cell. Simulations for curved vessels and aneurysms are done for different flow regimes, characterized by different values of the Reynolds number (Re). The validation is performed for laminar flow at Re = 250, while the flow in more complex geometries is studied at Re = 100 and Re = 250, as suggested by physiological conditions pertaining to the flow of blood in the circle of Willis.
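A minimal sketch of the masking-function idea, assuming a circular vessel cross-section and a cell-center rule (one of several possible 'staircase' definitions for geometries not aligned with the grid):

```python
import numpy as np

def masking_function(nx, ny, center, radius, h=1.0):
    """'Staircase' binary mask on a Cartesian grid (1 = fluid, 0 = solid):
    a cell counts as fluid when its center lies inside the circular
    vessel cross-section."""
    xc = (np.arange(nx) + 0.5) * h                 # cell-center coordinates
    yc = (np.arange(ny) + 0.5) * h
    X, Y = np.meshgrid(xc, yc, indexing="ij")
    inside = (X - center[0]) ** 2 + (Y - center[1]) ** 2 <= radius ** 2
    return inside.astype(int)

mask = masking_function(32, 32, center=(16.0, 16.0), radius=8.0)
print(mask.sum())   # close to the exact cross-section area pi*8^2 ~ 201 cells
```

The O(h) gap between the counted 'fluid' area and the exact circle area mirrors the first-order interface localization reported in the record above.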

  11. 28 CFR 0.95 - General functions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Judicial Administration DEPARTMENT OF JUSTICE ORGANIZATION OF THE DEPARTMENT OF JUSTICE Bureau of Prisons § 0.95 General functions. The Director of the Bureau of Prisons shall direct all activities of the Bureau of Prisons including: (a) Management and regulation of all Federal penal and correctional...

  12. 28 CFR 0.95 - General functions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Judicial Administration DEPARTMENT OF JUSTICE ORGANIZATION OF THE DEPARTMENT OF JUSTICE Bureau of Prisons § 0.95 General functions. The Director of the Bureau of Prisons shall direct all activities of the Bureau of Prisons including: (a) Management and regulation of all Federal penal and correctional...

  13. 28 CFR 0.95 - General functions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Judicial Administration DEPARTMENT OF JUSTICE ORGANIZATION OF THE DEPARTMENT OF JUSTICE Bureau of Prisons § 0.95 General functions. The Director of the Bureau of Prisons shall direct all activities of the Bureau of Prisons including: (a) Management and regulation of all Federal penal and correctional...

  14. Elastic SCAD as a novel penalization method for SVM classification tasks in high-dimensional data.

    PubMed

    Becker, Natalia; Toedt, Grischa; Lichter, Peter; Benner, Axel

    2011-05-09

    Classification and variable selection play an important role in knowledge discovery in high-dimensional data. Although Support Vector Machine (SVM) algorithms are among the most powerful classification and prediction methods with a wide range of scientific applications, the SVM does not include automatic feature selection, and therefore a number of feature selection procedures have been developed. Regularisation approaches extend the SVM to a feature selection method in a flexible way using penalty functions like LASSO, SCAD and Elastic Net. We propose a novel penalty function for SVM classification tasks, Elastic SCAD, a combination of SCAD and ridge penalties which overcomes the limitations of each penalty alone. Since SVM models are extremely sensitive to the choice of tuning parameters, we adopted an interval search algorithm, which, in comparison to a fixed grid search, finds a global optimal solution more rapidly and more precisely. Feature selection methods with combined penalties (Elastic Net and Elastic SCAD SVMs) are more robust to a change of the model complexity than methods using single penalties. Our simulation study showed that Elastic SCAD SVM outperformed LASSO (L1) and SCAD SVMs. Moreover, Elastic SCAD SVM provided sparser classifiers, in terms of the median number of features selected, than Elastic Net SVM, and often predicted better than Elastic Net in terms of misclassification error. Finally, we applied the penalization methods described above to four publicly available breast cancer data sets. Elastic SCAD SVM was the only method providing robust classifiers in both sparse and non-sparse situations. The proposed Elastic SCAD SVM algorithm provides the advantages of the SCAD penalty and at the same time avoids sparsity limitations for non-sparse data. 
We were the first to demonstrate that integrating the interval search algorithm with penalized SVM classification techniques provides fast solutions for the optimization of tuning parameters. The penalized SVM classification algorithms, as well as fixed grid and interval search for finding appropriate tuning parameters, were implemented in our freely available R package 'penalizedSVM'. We conclude that the Elastic SCAD SVM is a flexible and robust tool for classification and feature selection tasks in high-dimensional data such as microarray data sets.

  15. Elastic SCAD as a novel penalization method for SVM classification tasks in high-dimensional data

    PubMed Central

    2011-01-01

    Background Classification and variable selection play an important role in knowledge discovery in high-dimensional data. Although Support Vector Machine (SVM) algorithms are among the most powerful classification and prediction methods with a wide range of scientific applications, the SVM does not include automatic feature selection and therefore a number of feature selection procedures have been developed. Regularisation approaches extend SVM to a feature selection method in a flexible way using penalty functions like LASSO, SCAD and Elastic Net. We propose a novel penalty function for SVM classification tasks, Elastic SCAD, a combination of SCAD and ridge penalties which overcomes the limitations of each penalty alone. Since SVM models are extremely sensitive to the choice of tuning parameters, we adopted an interval search algorithm, which in comparison to a fixed grid search finds rapidly and more precisely a global optimal solution. Results Feature selection methods with combined penalties (Elastic Net and Elastic SCAD SVMs) are more robust to a change of the model complexity than methods using single penalties. Our simulation study showed that Elastic SCAD SVM outperformed LASSO (L1) and SCAD SVMs. Moreover, Elastic SCAD SVM provided sparser classifiers in terms of median number of features selected than Elastic Net SVM and often better predicted than Elastic Net in terms of misclassification error. Finally, we applied the penalization methods described above on four publicly available breast cancer data sets. Elastic SCAD SVM was the only method providing robust classifiers in sparse and non-sparse situations. Conclusions The proposed Elastic SCAD SVM algorithm provides the advantages of the SCAD penalty and at the same time avoids sparsity limitations for non-sparse data. 
We were the first to demonstrate that integrating the interval search algorithm with penalized SVM classification techniques provides fast solutions for the optimization of tuning parameters. The penalized SVM classification algorithms, as well as fixed grid and interval search for finding appropriate tuning parameters, were implemented in our freely available R package 'penalizedSVM'. We conclude that the Elastic SCAD SVM is a flexible and robust tool for classification and feature selection tasks in high-dimensional data such as microarray data sets. PMID:21554689
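SCAD-penalized SVMs are not part of standard Python libraries, so the sketch below substitutes the related Elastic Net penalty (the L1+L2 combination compared against Elastic SCAD above) on a hinge-loss classifier, trained by proximal subgradient descent; the data and all parameter choices are illustrative assumptions:

```python
import numpy as np

def elastic_net_svm(X, y, lam1=0.01, lam2=0.01, lr=0.01, epochs=500):
    """Hinge-loss linear classifier with an Elastic Net (L1 + L2) penalty.
    The L1 proximal step (soft-thresholding) yields exact zeros, i.e.
    automatic feature selection; the L2 part stabilizes correlated features."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(epochs):
        viol = y * (X @ w) < 1                     # margin violations
        g = -(X[viol] * y[viol][:, None]).sum(axis=0) / n + lam2 * w
        w -= lr * g                                # hinge + ridge step
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam1, 0.0)  # L1 prox
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
beta = np.zeros(50)
beta[:3] = 2.0                                     # 3 informative features
y = np.sign(X @ beta + 0.1 * rng.normal(size=200))
w = elastic_net_svm(X, y)
print(sorted(np.argsort(-np.abs(w))[:3].tolist()))  # informative features dominate
```

The SCAD part of Elastic SCAD would replace the constant L1 threshold with a threshold that tapers off for large coefficients, reducing the bias on strong features.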

  16. Estimating the variance for heterogeneity in arm-based network meta-analysis.

    PubMed

    Piepho, Hans-Peter; Madden, Laurence V; Roger, James; Payne, Roger; Williams, Emlyn R

    2018-04-19

    Network meta-analysis can be implemented by using arm-based or contrast-based models. Here we focus on arm-based models and fit them using generalized linear mixed model procedures. Full maximum likelihood (ML) estimation leads to biased trial-by-treatment interaction variance estimates for heterogeneity. Thus, our objective is to investigate alternative approaches to variance estimation that reduce bias compared with full ML. Specifically, we use penalized quasi-likelihood/pseudo-likelihood and hierarchical (h) likelihood approaches. In addition, we consider a novel model modification that yields estimators akin to the residual maximum likelihood estimator for linear mixed models. The proposed methods are compared by simulation, and 2 real datasets are used for illustration. Simulations show that penalized quasi-likelihood/pseudo-likelihood and h-likelihood reduce bias and yield satisfactory coverage rates. Sum-to-zero restriction and baseline contrasts for random trial-by-treatment interaction effects, as well as a residual ML-like adjustment, also reduce bias compared with an unconstrained model when ML is used, but coverage rates are not quite as good. Penalized quasi-likelihood/pseudo-likelihood and h-likelihood are therefore recommended. Copyright © 2018 John Wiley & Sons, Ltd.

  17. Penalization of aperture complexity in inversely planned volumetric modulated arc therapy

    PubMed Central

    Younge, Kelly C.; Matuszak, Martha M.; Moran, Jean M.; McShan, Daniel L.; Fraass, Benedick A.; Roberts, Donald A.

    2012-01-01

    Purpose: Apertures obtained during volumetric modulated arc therapy (VMAT) planning can be small and irregular, resulting in dosimetric inaccuracies during delivery. Our purpose is to develop and integrate an aperture-regularization objective function into the optimization process for VMAT, and to quantify the impact of using this objective function on dose delivery accuracy and optimized dose distributions. Methods: An aperture-based metric ("edge penalty") was developed that penalizes complex aperture shapes based on the ratio of MLC side edge length to aperture area. To assess the utility of the metric, VMAT plans were created for example paraspinal, brain, and liver SBRT cases with and without incorporating the edge penalty in the cost function. To investigate the dose calculation accuracy, Gafchromic EBT2 film was used to measure the 15 highest-weighted apertures individually and as a composite from each of two paraspinal plans: one with and one without the edge penalty applied. Films were analyzed using a triple-channel nonuniformity correction, and measurements were compared directly to calculations. Results: Apertures generated with the edge penalty were larger, more regularly shaped, and required up to 30% fewer monitor units than those created without the edge penalty. Dose-volume histogram analysis showed that the changes in doses to targets, organs at risk, and normal tissues were negligible. Edge-penalty apertures that were measured with film for the paraspinal plan showed a notable decrease in the number of pixels disagreeing with calculation by more than 10%. For a 5% dose passing criterion, the percentages of pixels passing in the composite dose distributions for the non-edge-penalty and edge-penalty plans were 52% and 96%, respectively. Employing gamma with 3% dose/1 mm distance criteria resulted in a 79.5% (without penalty)/95.4% (with penalty) pass rate for the two plans. Gradient compensation of 3%/1 mm resulted in 83.3%/96.2% pass rates. 
Conclusions: The use of the edge penalty during optimization has the potential to markedly improve dose delivery accuracy for VMAT plans while still maintaining high quality optimized dose distributions. The penalty regularizes aperture shape and improves delivery efficiency. PMID:23127107
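The edge-penalty metric (MLC side-edge length divided by aperture area) can be sketched for a binary aperture map; the exact definition, units, and normalization used in the paper may differ:

```python
import numpy as np

def edge_penalty(aperture, width=1.0):
    """Ratio of exposed MLC side-edge length to open aperture area for a
    binary aperture map (rows = leaf pairs, 1 = open). Side edges are the
    open/closed transitions between neighboring leaf rows."""
    area = aperture.sum() * width ** 2
    if area == 0:
        return 0.0
    padded = np.pad(aperture, ((1, 1), (0, 0)))    # closed leaves outside
    side_edges = np.abs(np.diff(padded, axis=0)).sum() * width
    return side_edges / area

square = np.zeros((10, 10), dtype=int); square[2:8, 2:8] = 1   # regular aperture
comb = np.zeros((10, 10), dtype=int);   comb[::2, 2:8] = 1     # irregular comb
print(edge_penalty(square), edge_penalty(comb))   # the comb scores far higher
```

Adding a term proportional to this score to the optimization cost function discourages the small, jagged apertures that cause the delivery inaccuracies described above.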

  18. Robust Variable Selection with Exponential Squared Loss.

    PubMed

    Wang, Xueqin; Jiang, Yunlu; Huang, Mian; Zhang, Heping

    2013-04-01

    Robust variable selection procedures through penalized regression have been gaining increased attention in the literature. They can be used to perform variable selection and are expected to yield robust estimates. However, to the best of our knowledge, the robustness of those penalized regression procedures has not been well characterized. In this paper, we propose a class of penalized robust regression estimators based on exponential squared loss. The motivation for this new procedure is that it enables us to characterize its robustness in a way that has not been done for the existing procedures, while its performance is near optimal and superior to some recently developed methods. Specifically, under defined regularity conditions, our estimators are n-consistent and possess the oracle property. Importantly, we show that our estimators can achieve the highest asymptotic breakdown point of 1/2 and that their influence functions are bounded with respect to the outliers in either the response or the covariate domain. We performed simulation studies to compare our proposed method with some recent methods, using the oracle method as the benchmark. We consider common sources of influential points. Our simulation studies reveal that our proposed method performs similarly to the oracle method in terms of the model error and the positive selection rate even in the presence of influential points. In contrast, other existing procedures have a much lower non-causal selection rate. Furthermore, we re-analyze the Boston Housing Price Dataset and the Plasma Beta-Carotene Level Dataset that are commonly used examples for regression diagnostics of influential points. Our analysis unravels the discrepancies of using our robust method versus the other penalized regression method, underscoring the importance of developing and applying robust penalized regression methods.

  19. Robust Variable Selection with Exponential Squared Loss

    PubMed Central

    Wang, Xueqin; Jiang, Yunlu; Huang, Mian; Zhang, Heping

    2013-01-01

    Robust variable selection procedures through penalized regression have been gaining increased attention in the literature. They can be used to perform variable selection and are expected to yield robust estimates. However, to the best of our knowledge, the robustness of those penalized regression procedures has not been well characterized. In this paper, we propose a class of penalized robust regression estimators based on exponential squared loss. The motivation for this new procedure is that it enables us to characterize its robustness that has not been done for the existing procedures, while its performance is near optimal and superior to some recently developed methods. Specifically, under defined regularity conditions, our estimators are n-consistent and possess the oracle property. Importantly, we show that our estimators can achieve the highest asymptotic breakdown point of 1/2 and that their influence functions are bounded with respect to the outliers in either the response or the covariate domain. We performed simulation studies to compare our proposed method with some recent methods, using the oracle method as the benchmark. We consider common sources of influential points. Our simulation studies reveal that our proposed method performs similarly to the oracle method in terms of the model error and the positive selection rate even in the presence of influential points. In contrast, other existing procedures have a much lower non-causal selection rate. Furthermore, we re-analyze the Boston Housing Price Dataset and the Plasma Beta-Carotene Level Dataset that are commonly used examples for regression diagnostics of influential points. Our analysis unravels the discrepancies of using our robust method versus the other penalized regression method, underscoring the importance of developing and applying robust penalized regression methods. PMID:23913996
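A toy unpenalized M-estimator with the exponential squared loss L(r) = 1 - exp(-r^2/gamma) shows the bounded-influence behavior described above; gamma, the optimizer, and the data are illustrative assumptions (the paper additionally penalizes the coefficients for variable selection):

```python
import numpy as np

def exp_squared_fit(X, y, gamma=50.0, lr=0.5, epochs=3000):
    """M-estimation with the exponential squared loss 1 - exp(-r^2/gamma).
    A residual r influences the fit in proportion to exp(-r^2/gamma) * r,
    which vanishes for gross outliers, so they barely affect the estimate."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        r = y - X @ w
        # weighted-residual gradient (constant 2/gamma folded into lr)
        g = -X.T @ (np.exp(-r ** 2 / gamma) * r) / len(y)
        w -= lr * g
    return w

rng = np.random.default_rng(1)
X = np.c_[np.ones(100), rng.normal(size=100)]       # intercept + one covariate
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=100)
y[:5] += 50.0                                       # five gross outliers
w = exp_squared_fit(X, y)
print(np.round(w, 2))                               # close to [1, 2] despite outliers
```

Ordinary least squares on the same data would be pulled strongly toward the five corrupted responses; here their weights are on the order of exp(-50).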

  20. B-spline parameterization of the dielectric function and information criteria: the craft of non-overfitting

    NASA Astrophysics Data System (ADS)

    Likhachev, Dmitriy V.

    2017-06-01

    Johs and Hale developed the Kramers-Kronig-consistent B-spline formulation for dielectric function modeling in spectroscopic ellipsometry data analysis. In this article we use the popular Akaike, corrected Akaike and Bayesian Information Criteria (AIC, AICc and BIC, respectively) to determine an optimal number of knots for the B-spline model. These criteria allow a compromise to be found between under- and overfitting of experimental data, since they penalize an increasing number of knots and select the representation which achieves the best fit with a minimal number of knots. The proposed approach provides objective and practical guidance, as opposed to empirically driven or "gut feeling" decisions, for selecting the right number of knots for B-spline models in spectroscopic ellipsometry. The AIC, AICc and BIC selection criteria work remarkably well, as we demonstrate in several real-data applications. This approach formalizes the selection of the optimal knot number and may be useful from a practical perspective in spectroscopic ellipsometry data analysis.
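The knot-selection recipe can be sketched with SciPy's least-squares B-spline fit and the common AIC form n·log(RSS/n) + 2k; the information-criterion variant, the synthetic data, and the equally spaced knot placement are illustrative assumptions:

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

def aic_for_knots(x, y, n_interior):
    """Cubic B-spline least-squares fit with equally spaced interior knots;
    returns AIC = n*log(RSS/n) + 2k, with k the number of spline coefficients."""
    t = np.linspace(x[0], x[-1], n_interior + 2)[1:-1]   # interior knots only
    spl = LSQUnivariateSpline(x, y, t, k=3)
    n = len(x)
    return n * np.log(spl.get_residual() / n) + 2 * len(spl.get_coeffs())

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.05 * rng.normal(size=200)
aics = {m: aic_for_knots(x, y, m) for m in range(1, 15)}
best = min(aics, key=aics.get)
print(best)   # a modest knot count: enough to fit, penalized before overfitting
```

Swapping in AICc or BIC only changes the penalty term; BIC's log(n)-weighted penalty typically selects slightly fewer knots.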

  1. Bootstrap Enhanced Penalized Regression for Variable Selection with Neuroimaging Data.

    PubMed

    Abram, Samantha V; Helwig, Nathaniel E; Moodie, Craig A; DeYoung, Colin G; MacDonald, Angus W; Waller, Niels G

    2016-01-01

    Recent advances in fMRI research highlight the use of multivariate methods for examining whole-brain connectivity. Complementary data-driven methods are needed for determining the subset of predictors related to individual differences. Although commonly used for this purpose, ordinary least squares (OLS) regression may not be ideal due to multi-collinearity and over-fitting issues. Penalized regression is a promising and underutilized alternative to OLS regression. In this paper, we propose a nonparametric bootstrap quantile (QNT) approach for variable selection with neuroimaging data. We use real and simulated data, as well as annotated R code, to demonstrate the benefits of our proposed method. Our results illustrate the practical potential of our proposed bootstrap QNT approach. Our real data example demonstrates how our method can be used to relate individual differences in neural network connectivity with an externalizing personality measure. Also, our simulation results reveal that the QNT method is effective under a variety of data conditions. Penalized regression yields more stable estimates and sparser models than OLS regression in situations with large numbers of highly correlated neural predictors. Our results demonstrate that penalized regression is a promising method for examining associations between neural predictors and clinically relevant traits or behaviors. These findings have important implications for the growing field of functional connectivity research, where multivariate methods produce numerous, highly correlated brain networks.

  2. Bootstrap Enhanced Penalized Regression for Variable Selection with Neuroimaging Data

    PubMed Central

    Abram, Samantha V.; Helwig, Nathaniel E.; Moodie, Craig A.; DeYoung, Colin G.; MacDonald, Angus W.; Waller, Niels G.

    2016-01-01

    Recent advances in fMRI research highlight the use of multivariate methods for examining whole-brain connectivity. Complementary data-driven methods are needed for determining the subset of predictors related to individual differences. Although commonly used for this purpose, ordinary least squares (OLS) regression may not be ideal due to multi-collinearity and over-fitting issues. Penalized regression is a promising and underutilized alternative to OLS regression. In this paper, we propose a nonparametric bootstrap quantile (QNT) approach for variable selection with neuroimaging data. We use real and simulated data, as well as annotated R code, to demonstrate the benefits of our proposed method. Our results illustrate the practical potential of our proposed bootstrap QNT approach. Our real data example demonstrates how our method can be used to relate individual differences in neural network connectivity with an externalizing personality measure. Also, our simulation results reveal that the QNT method is effective under a variety of data conditions. Penalized regression yields more stable estimates and sparser models than OLS regression in situations with large numbers of highly correlated neural predictors. Our results demonstrate that penalized regression is a promising method for examining associations between neural predictors and clinically relevant traits or behaviors. These findings have important implications for the growing field of functional connectivity research, where multivariate methods produce numerous, highly correlated brain networks. PMID:27516732

  3. A characteristic based volume penalization method for general evolution problems applied to compressible viscous flows

    NASA Astrophysics Data System (ADS)

    Brown-Dymkoski, Eric; Kasimov, Nurlybek; Vasilyev, Oleg V.

    2014-04-01

    In order to introduce solid obstacles into flows, several different methods are used, including volume penalization methods which prescribe appropriate boundary conditions by applying local forcing to the constitutive equations. One well known method is Brinkman penalization, which models solid obstacles as porous media. While it has been adapted for compressible, incompressible, viscous and inviscid flows, it is limited in the types of boundary conditions that it imposes, as are most volume penalization methods. Typically, approaches are limited to Dirichlet boundary conditions. In this paper, Brinkman penalization is extended for generalized Neumann and Robin boundary conditions by introducing hyperbolic penalization terms with characteristics pointing inward on solid obstacles. This Characteristic-Based Volume Penalization (CBVP) method is a comprehensive approach to conditions on immersed boundaries, providing for homogeneous and inhomogeneous Dirichlet, Neumann, and Robin boundary conditions on hyperbolic and parabolic equations. This CBVP method can be used to impose boundary conditions for both integrated and non-integrated variables in a systematic manner that parallels the prescription of exact boundary conditions. Furthermore, the method does not depend upon a physical model, as with porous media approach for Brinkman penalization, and is therefore flexible for various physical regimes and general evolutionary equations. Here, the method is applied to scalar diffusion and to direct numerical simulation of compressible, viscous flows. With the Navier-Stokes equations, both homogeneous and inhomogeneous Neumann boundary conditions are demonstrated through external flow around an adiabatic and heated cylinder. Theoretical and numerical examination shows that the error from penalized Neumann and Robin boundary conditions can be rigorously controlled through an a priori penalization parameter η. 
The error on a transient boundary is found to converge as O(η), which is more favorable than the error convergence of the already established Dirichlet boundary condition.
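The Dirichlet case of Brinkman penalization can be sketched on the 1-D heat equation: a forcing term -(chi/eta)·u drives the solution toward zero inside the obstacle, with the boundary-condition error controlled by the penalization parameter eta. The geometry and parameters below are illustrative assumptions:

```python
import numpy as np

def penalized_diffusion(eta=1e-5, nx=200, steps=10000):
    """Explicit solver for u_t = u_xx - (chi/eta) * u on [0, 1]:
    Brinkman-style volume penalization enforcing u ~ 0 (a homogeneous
    Dirichlet condition) inside the 'solid' strip 0.6 < x < 0.8."""
    x = np.linspace(0.0, 1.0, nx)
    dx = x[1] - x[0]
    dt = 0.2 * dx ** 2                            # explicit stability limit
    chi = ((x > 0.6) & (x < 0.8)).astype(float)   # obstacle mask
    u = np.sin(np.pi * x)                         # initial condition
    for _ in range(steps):
        lap = np.zeros_like(u)
        lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx ** 2
        u = u + dt * (lap - chi / eta * u)
        u[0] = u[-1] = 0.0                        # outer Dirichlet walls
    return x, u, chi

x, u, chi = penalized_diffusion()
print(np.abs(u[chi == 1]).max())   # tiny inside the obstacle
```

The CBVP extension described above replaces this simple absorbing term with hyperbolic terms whose characteristics point into the obstacle, which is what makes Neumann and Robin conditions expressible.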

  4. Machine Learning EEG to Predict Cognitive Functioning and Processing Speed Over a 2-Year Period in Multiple Sclerosis Patients and Controls.

    PubMed

    Kiiski, Hanni; Jollans, Lee; Donnchadha, Seán Ó; Nolan, Hugh; Lonergan, Róisín; Kelly, Siobhán; O'Brien, Marie Claire; Kinsella, Katie; Bramham, Jessica; Burke, Teresa; Hutchinson, Michael; Tubridy, Niall; Reilly, Richard B; Whelan, Robert

    2018-05-01

    Event-related potentials (ERPs) show promise to be objective indicators of cognitive functioning. The aim of the study was to examine if ERPs recorded during an oddball task would predict cognitive functioning and information processing speed in Multiple Sclerosis (MS) patients and controls at the individual level. Seventy-eight participants (35 MS patients, 43 healthy age-matched controls) completed visual and auditory 2- and 3-stimulus oddball tasks with 128-channel EEG, and a neuropsychological battery, at baseline (month 0) and at Months 13 and 26. ERPs from 0 to 700 ms and across the whole scalp were transformed into 1728 individual spatio-temporal datapoints per participant. A machine learning method that included penalized linear regression used the entire spatio-temporal ERP to predict composite scores of both cognitive functioning and processing speed at baseline (month 0), and months 13 and 26. The results showed ERPs during the visual oddball tasks could predict cognitive functioning and information processing speed at baseline and a year later in a sample of MS patients and healthy controls. In contrast, ERPs during auditory tasks were not predictive of cognitive performance. These objective neurophysiological indicators of cognitive functioning and processing speed, and machine learning methods that can interrogate high-dimensional data, show promise in outcome prediction.

  5. Error Covariance Penalized Regression: A novel multivariate model combining penalized regression with multivariate error structure.

    PubMed

    Allegrini, Franco; Braga, Jez W B; Moreira, Alessandro C O; Olivieri, Alejandro C

    2018-06-29

    A new multivariate regression model, named Error Covariance Penalized Regression (ECPR) is presented. Following a penalized regression strategy, the proposed model incorporates information about the measurement error structure of the system, using the error covariance matrix (ECM) as a penalization term. Results are reported from both simulations and experimental data based on replicate mid and near infrared (MIR and NIR) spectral measurements. The results for ECPR are better under non-iid conditions when compared with traditional first-order multivariate methods such as ridge regression (RR), principal component regression (PCR) and partial least-squares regression (PLS). Copyright © 2018 Elsevier B.V. All rights reserved.
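A sketch of the idea of using an error covariance matrix (ECM) as a penalization term. The placement of the ECM here, as the weight of a generalized-ridge quadratic penalty, is an illustrative assumption and may differ from the published ECPR model:

```python
import numpy as np

def ecm_penalized_fit(X, y, Sigma, lam):
    """Closed-form minimizer of ||y - X b||^2 + lam * b' Sigma b,
    a generalized ridge whose penalty is weighted by the measurement-error
    covariance matrix Sigma; Sigma = I recovers ordinary ridge regression."""
    return np.linalg.solve(X.T @ X + lam * Sigma, X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
b_true = np.array([1.0, -1.0, 0.5, 0.0, 0.0])
y = X @ b_true + 0.1 * rng.normal(size=50)
b = ecm_penalized_fit(X, y, Sigma=np.eye(5), lam=1.0)
print(np.round(b, 2))
```

Under non-iid noise, a Sigma estimated from replicate spectra penalizes coefficient directions aligned with noisy channels more heavily than the isotropic ridge penalty would.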

  6. MO-DE-207A-10: One-Step CT Reconstruction for Metal Artifact Reduction by a Modification of Penalized Weighted Least-Squares (PWLS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, H; Chen, J

    Purpose: Metal objects create severe artifacts in kilo-voltage (kV) CT image reconstructions due to the high attenuation coefficients of high atomic number objects. Most of the techniques devised to reduce this artifact utilize a two-step approach, which do not reliably yield the qualified reconstructed images. Thus, for accuracy and simplicity, this work presents a one-step reconstruction method based on a modified penalized weighted least-squares (PWLS) technique. Methods: Existing techniques for metal artifact reduction mostly adopt a two-step approach, which conduct additional reconstruction with the modified projection data from the initial reconstruction. This procedure does not consistently perform well due tomore » the uncertainties in manipulating the metal-contaminated projection data by thresholding and linear interpolation. This study proposes a one-step reconstruction process using a new PWLS operation with total-variation (TV) minimization, while not manipulating the projection. The PWLS for CT reconstruction has been investigated using a pre-defined weight, based on the variance of the projection datum at each detector bin. It works well when reconstructing CT images from metal-free projection data, which does not appropriately penalize metal-contaminated projection data. The proposed work defines the weight at each projection element under the assumption of a Poisson random variable. This small modification using element-wise penalization has a large impact in reducing metal artifacts. For evaluation, the proposed technique was assessed with two noisy, metal-contaminated digital phantoms, against the existing PWLS with TV minimization and the two-step approach. Result: The proposed PWLS with TV minimization greatly improved the metal artifact reduction, relative to the other techniques, by watching the results. 
Numerically, the new approach lowered the normalized root-mean-square error by about 30% and 60% for the two cases, respectively, compared to the two-step method. Conclusion: The new PWLS formulation shows promise for improving metal artifact reduction in CT imaging, as well as simplifying the reconstruction procedure.
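
    The element-wise Poisson weighting at the heart of this method can be sketched in a few lines. Under a Poisson counting model with incident intensity I0, the variance of a post-log projection value p is roughly 1/(I0*exp(-p)), so the inverse-variance PWLS weight is simply the expected count. A minimal sketch; the function name and intensity value are illustrative, not from the paper:

```python
import numpy as np

def pwls_weights(post_log_proj, incident_intensity=1e5):
    """Element-wise PWLS weights under a Poisson counting model.

    For a post-log projection value p, the detected count is roughly
    N = I0 * exp(-p); the variance of p is then ~ 1/N, so the
    inverse-variance weight is simply N.  Rays through metal (large p,
    tiny N) are thus down-weighted instead of dominating the fit.
    """
    p = np.asarray(post_log_proj, dtype=float)
    return incident_intensity * np.exp(-p)

p = np.array([0.5, 2.0, 8.0])   # 8.0 ~ a ray passing through metal
w = pwls_weights(p)
print(w / w.max())              # relative weights; the metal ray is tiny
```

    The point of the modification is visible directly: the ray through metal receives a weight several orders of magnitude below the clean rays, so it barely influences the weighted least-squares fit.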

  7. Polychromatic sparse image reconstruction and mass attenuation spectrum estimation via B-spline basis function expansion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gu, Renliang, E-mail: Venliang@iastate.edu; Dogandžić, Aleksandar, E-mail: ald@iastate.edu

    2015-03-31

    We develop a sparse image reconstruction method for polychromatic computed tomography (CT) measurements under the blind scenario where the material of the inspected object and the incident energy spectrum are unknown. To obtain a parsimonious measurement model parameterization, we first rewrite the measurement equation using our mass-attenuation parameterization, which has the Laplace integral form. The unknown mass-attenuation spectrum is expanded into basis functions using a B-spline basis of order one. We develop a block coordinate-descent algorithm for constrained minimization of a penalized negative log-likelihood function, where constraints and penalty terms ensure nonnegativity of the spline coefficients and sparsity of the density map image in the wavelet domain. This algorithm alternates between a Nesterov proximal-gradient step for estimating the density map image and an active-set step for estimating the incident spectrum parameters. Numerical simulations demonstrate the performance of the proposed scheme.
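
    The basis expansion step can be illustrated with piecewise-linear ("hat") B-splines; a minimal sketch, assuming "order one" means linear hat functions (conventions differ) and using made-up knots:

```python
import numpy as np

def hat_basis(x, knots):
    """Piecewise-linear ('hat') B-spline basis evaluated at points x.

    B[i, j] = B_j(x[i]); each B_j is 1 at knot j, 0 at every other knot,
    and linear in between, so inside the knot range the columns sum to 1
    and nonnegative coefficients yield a nonnegative expansion.
    """
    x = np.asarray(x, dtype=float)
    return np.column_stack(
        [np.interp(x, knots, np.eye(len(knots))[j]) for j in range(len(knots))]
    )

knots = np.array([0.0, 1.0, 2.0, 3.0])
x = np.linspace(0.0, 3.0, 7)
B = hat_basis(x, knots)
print(B.sum(axis=1))   # partition of unity: all ones
```

    The nonnegativity property noted in the docstring is what makes the constrained fit of the mass-attenuation spectrum natural: constraining the spline coefficients to be nonnegative guarantees a nonnegative spectrum estimate.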

  8. Consumer Education. An Introductory Unit for Inmates in Penal Institutions.

    ERIC Educational Resources Information Center

    Schmoele, Henry H.; And Others

    This introductory consumer education curriculum outline contains materials designed to help soon-to-be-released prisoners to develop an awareness of consumer concerns and to better manage their family lives. Each of the four units provided includes lesson objectives, suggested contents, suggested teaching methods, handouts, and tests. The unit on…

  9. Race Making in a Penal Institution.

    PubMed

    Walker, Michael L

    2016-01-01

    This article provides a ground-level investigation into the lives of penal inmates, linking the literature on race making and penal management to provide an understanding of racial formation processes in a modern penal institution. Drawing on 135 days of ethnographic data collected as an inmate in a Southern California county jail system, the author argues that inmates are subjected to two mutually constitutive racial projects--one institutional and the other microinteractional. Operating in symbiosis within a narrative of risk management, these racial projects increase (rather than decrease) incidents of intraracial violence and the potential for interracial violence. These findings have implications for understanding the process of racialization and evaluating the effectiveness of penal management strategies.

  10. Assessment of Blood Glucose Control in the Pediatric Intensive Care Unit: Extension of the Glycemic Penalty Index toward Children and Infants

    PubMed Central

    Van Herpe, Tom; Gielen, Marijke; Vanhonsebrouck, Koen; Wouters, Pieter J; Van den Berghe, Greet; De Moor, Bart; Mesotten, Dieter

    2011-01-01

    Background: The glycemic penalty index (GPI) is a measure to assess blood glucose (BG) control in critically ill adult patients but needs to be adapted for children and infants. Method: The squared differences between a clinical expertise penalty function and the corresponding polynomial function are minimized for optimization purposes. The average of all penalties (individually assigned to all BG readings) represents the patient-specific GPI. Results: Penalization in the hypoglycemic range is more severe than in the hyperglycemic range as the developing brains of infants and children may be more vulnerable to hypoglycemia. Similarly, hypoglycemia is also more heavily penalized in infants than in children. Conclusions: Extending the adult GPI toward the age-specific GPI is an important methodological step. Long-term clinical studies are needed to determine the clinically acceptable GPI cut-off level. PMID:21527105
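
    The construction described in Methods — fitting a polynomial to expert-assigned penalties by least squares, then averaging it over a patient's BG readings — can be sketched as follows. The grid values and penalty scores below are hypothetical stand-ins, not the clinically derived ones:

```python
import numpy as np

# Hypothetical expert scores (BG in mg/dL): heavier penalties in the
# hypoglycemic range than in the hyperglycemic range, mirroring the
# age-specific asymmetry described in the abstract.
bg_grid = np.array([40, 60, 80, 100, 140, 180, 250, 350], dtype=float)
expert_penalty = np.array([100, 40, 5, 0, 5, 15, 40, 70], dtype=float)

# Least-squares polynomial approximation of the expert penalty curve
# (Polynomial.fit rescales the domain internally for conditioning).
penalty = np.polynomial.Polynomial.fit(bg_grid, expert_penalty, deg=4)

# Patient-specific GPI: the mean penalty over that patient's readings.
readings = np.array([55, 90, 120, 160], dtype=float)
gpi = float(np.mean(np.clip(penalty(readings), 0, None)))
print(gpi)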

  11. Use of atropine penalization to treat amblyopia in UK orthoptic practice.

    PubMed

    Piano, Marianne; O'Connor, Anna R; Newsham, David

    2014-01-01

    To compare clinical practice patterns regarding atropine penalization use by UK orthoptists to the current evidence base and identify any existing barriers to the use of atropine penalization (AP) as first-line treatment. An online survey was designed to assess current practice patterns of UK orthoptists using atropine penalization. They were asked to identify issues limiting their use of atropine penalization and give opinions on its effectiveness compared to occlusion. Descriptive statistics and content analysis were applied to the results. Responses were obtained from 151 orthoptists throughout the United Kingdom. The main perceived barriers to use of atropine penalization were inability to prescribe atropine and supply difficulties. However, respondents also did not consider atropine penalization as effective as occlusion in treating amblyopia, contrary to recent research findings. Patient selection criteria and treatment administration largely follow current evidence. More orthoptists use atropine penalization as first-line treatment than previously reported. Practitioners tend to closely follow the current evidence base when using atropine penalization, but reluctance in offering it as first-line treatment or providing a choice for parents between occlusion and atropine still remains. This may result from concerns regarding atropine's general efficacy, side effects, and risk of reverse amblyopia. Alternatively, as demonstrated in other areas of medicine, it may reflect the inherent delay of research findings translating to clinical practice changes. Copyright 2014, SLACK Incorporated.

  12. Estimation of Noise Properties for TV-regularized Image Reconstruction in Computed Tomography

    PubMed Central

    Sánchez, Adrian A.

    2016-01-01

    A method for predicting the image covariance resulting from total-variation-penalized iterative image reconstruction (TV-penalized IIR) is presented and demonstrated in a variety of contexts. The method is validated against the sample covariance from statistical noise realizations for a small image using a variety of comparison metrics. Potential applications for the covariance approximation include investigation of image properties such as object- and signal-dependence of noise, and noise stationarity. These applications are demonstrated, along with the construction of image pixel variance maps for two-dimensional 128 × 128 pixel images. Methods for extending the proposed covariance approximation to larger images and improving computational efficiency are discussed. Future work will apply the developed methodology to the construction of task-based image quality metrics such as the Hotelling observer detectability for TV-based IIR. PMID:26308968

  13. Estimation of noise properties for TV-regularized image reconstruction in computed tomography.

    PubMed

    Sánchez, Adrian A

    2015-09-21

    A method for predicting the image covariance resulting from total-variation-penalized iterative image reconstruction (TV-penalized IIR) is presented and demonstrated in a variety of contexts. The method is validated against the sample covariance from statistical noise realizations for a small image using a variety of comparison metrics. Potential applications for the covariance approximation include investigation of image properties such as object- and signal-dependence of noise, and noise stationarity. These applications are demonstrated, along with the construction of image pixel variance maps for two-dimensional 128 × 128 pixel images. Methods for extending the proposed covariance approximation to larger images and improving computational efficiency are discussed. Future work will apply the developed methodology to the construction of task-based image quality metrics such as the Hotelling observer detectability for TV-based IIR.

  14. Estimation of noise properties for TV-regularized image reconstruction in computed tomography

    NASA Astrophysics Data System (ADS)

    Sánchez, Adrian A.

    2015-09-01

    A method for predicting the image covariance resulting from total-variation-penalized iterative image reconstruction (TV-penalized IIR) is presented and demonstrated in a variety of contexts. The method is validated against the sample covariance from statistical noise realizations for a small image using a variety of comparison metrics. Potential applications for the covariance approximation include investigation of image properties such as object- and signal-dependence of noise, and noise stationarity. These applications are demonstrated, along with the construction of image pixel variance maps for two-dimensional 128× 128 pixel images. Methods for extending the proposed covariance approximation to larger images and improving computational efficiency are discussed. Future work will apply the developed methodology to the construction of task-based image quality metrics such as the Hotelling observer detectability for TV-based IIR.

  15. A Selective Overview of Variable Selection in High Dimensional Feature Space

    PubMed Central

    Fan, Jianqing

    2010-01-01

    High dimensional statistical problems arise from diverse fields of scientific research and technological development. Variable selection plays a pivotal role in contemporary statistical learning and scientific discoveries. The traditional idea of best subset selection methods, which can be regarded as a specific form of penalized likelihood, is computationally too expensive for many modern statistical applications. Other forms of penalized likelihood methods have been successfully developed over the last decade to cope with high dimensionality. They have been widely applied for simultaneously selecting important variables and estimating their effects in high dimensional statistical inference. In this article, we present a brief account of the recent developments of theory, methods, and implementations for high dimensional variable selection. Questions of what limits of dimensionality such methods can handle, what the role of penalty functions is, and what their statistical properties are rapidly drive advances in the field. The properties of non-concave penalized likelihood and its roles in high dimensional statistical modeling are emphasized. We also review some recent advances in ultra-high dimensional variable selection, with emphasis on independence screening and two-scale methods. PMID:21572976
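
    As a concrete instance of penalized likelihood, the lasso penalty with an orthogonal design reduces to coordinate-wise soft thresholding, which simultaneously selects variables (exact zeros) and shrinks the estimated effects; a minimal sketch:

```python
import numpy as np

def soft_threshold(z, lam):
    """Closed-form minimizer of 0.5*(z - b)**2 + lam*|b|: coefficients
    below the threshold are set exactly to zero (variable selection),
    the rest are shrunk toward zero (estimation)."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

z = np.array([-3.0, -0.5, 0.2, 2.5])
print(soft_threshold(z, 1.0))   # values -2.0, 0.0, 0.0, 1.5
```

    Non-concave penalties such as SCAD, emphasized in the article, modify this operator so that large coefficients are left nearly unshrunk while small ones are still zeroed out.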

  16. Optimal Multiple Surface Segmentation With Shape and Context Priors

    PubMed Central

    Bai, Junjie; Garvin, Mona K.; Sonka, Milan; Buatti, John M.; Wu, Xiaodong

    2014-01-01

    Segmentation of multiple surfaces in medical images is a challenging problem, further complicated by the frequent presence of weak boundary evidence, large object deformations, and mutual influence between adjacent objects. This paper reports a novel approach to multi-object segmentation that incorporates both shape and context prior knowledge in a 3-D graph-theoretic framework to help overcome the stated challenges. We employ an arc-based graph representation to incorporate a wide spectrum of prior information through pair-wise energy terms. In particular, a shape-prior term is used to penalize local shape changes and a context-prior term is used to penalize local surface-distance changes from a model of the expected shape and surface distances, respectively. The globally optimal solution for multiple surfaces is obtained by computing a maximum flow in a low-order polynomial time. The proposed method was validated on intraretinal layer segmentation of optical coherence tomography images and demonstrated statistically significant improvement of segmentation accuracy compared to our earlier graph-search method that was not utilizing shape and context priors. The mean unsigned surface positioning errors obtained by the conventional graph-search approach (6.30 ± 1.58 μm) was improved to 5.14 ± 0.99 μm when employing our new method with shape and context priors. PMID:23193309

  17. Terrestrial cross-calibrated assimilation of various data sources

    NASA Astrophysics Data System (ADS)

    Groß, André; Müller, Richard; Schömer, Elmar; Trentmann, Jörg

    2014-05-01

    We introduce a novel software tool, ANACLIM, for the efficient assimilation of multiple two-dimensional data sets using a variational approach. We consider a single objective function in two spatial coordinates involving higher-order derivatives. This function measures the deviation of the input data from the target data set. By using the Euler-Lagrange formalism the minimization of this objective function can be transformed into a sparse system of linear equations, which can be efficiently solved by a conjugate gradient solver on a desktop workstation. The objective function allows for a series of physically-motivated constraints. The user can control the relative global weights, as well as the individual weight of each constraint on a per-grid-point level. The different constraints are realized as separate terms of the objective function: one similarity term for each input data set and two additional smoothness terms, penalizing high gradient and curvature values. ANACLIM is designed to combine similarity and smoothness operators easily and to choose different solvers. We performed a series of benchmarks to calibrate and verify our solution. We use, for example, terrestrial stations of BSRN and GEBA for the solar incoming flux and AERONET stations for aerosol optical depth. First results show that, with our approach, the combination of these data sources yields a significant benefit over the individual input datasets. ANACLIM also includes a region growing algorithm for the assimilation of ground based data. The region growing algorithm computes the maximum area around a station that represents the station data. The regions are grown under several constraints, such as the homogeneity of the area. The resulting dataset is then used within the assimilation process. Verification is performed by cross-validation. The method and validation results will be presented and discussed.
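
    A one-dimensional analogue of this variational blend can be sketched directly: two noisy input fields are combined under gradient and curvature smoothness penalties, and the resulting normal equations form a sparse symmetric positive-definite system solved by conjugate gradients. All weights below are made up for illustration:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

# Two noisy 1-D input fields sampling the same underlying signal.
n = 200
t = np.linspace(0.0, 2.0 * np.pi, n)
rng = np.random.default_rng(0)
d1 = np.sin(t) + 0.3 * rng.standard_normal(n)
d2 = np.sin(t) + 0.3 * rng.standard_normal(n)

# Minimize  w1*||x-d1||^2 + w2*||x-d2||^2 + a*||D1 x||^2 + b*||D2 x||^2:
# the Euler-Lagrange conditions give a sparse SPD linear system.
D1 = sp.diags([-1.0, 1.0], [0, 1], shape=(n - 1, n))          # gradient
D2 = sp.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))  # curvature
w1, w2, a, b = 1.0, 1.0, 5.0, 20.0

A = (w1 + w2) * sp.eye(n) + a * (D1.T @ D1) + b * (D2.T @ D2)
rhs = w1 * d1 + w2 * d2
x, info = cg(A, rhs)        # conjugate-gradient solve; info == 0 on success

blend_err = np.abs(x - np.sin(t)).mean()
input_err = np.abs(d1 - np.sin(t)).mean()
print(info, blend_err < input_err)
```

    Per-grid-point weighting, as ANACLIM supports, would simply replace the scalar weights with diagonal matrices in the same normal equations.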

  18. Object matching using a locally affine invariant and linear programming techniques.

    PubMed

    Li, Hongsheng; Huang, Xiaolei; He, Lei

    2013-02-01

    In this paper, we introduce a new matching method based on a novel locally affine-invariant geometric constraint and linear programming techniques. To model and solve the matching problem in a linear programming formulation, all geometric constraints should be able to be exactly or approximately reformulated into a linear form. This is a major difficulty for this kind of matching algorithm. We propose a novel locally affine-invariant constraint which can be exactly linearized and requires far fewer auxiliary variables than other linear-programming-based methods. The key idea behind it is that each point in the template point set can be exactly represented by an affine combination of its neighboring points, whose weights can be computed easily by least squares. Errors of reconstructing each matched point using such weights are used to penalize the disagreement of geometric relationships between the template points and the matched points. The resulting overall objective function can be solved efficiently by linear programming techniques. Our experimental results on both rigid and nonrigid object matching show the effectiveness of the proposed algorithm.
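
    The key idea — representing each template point as an affine combination of its neighbors, with weights from least squares — can be sketched as follows. For simplicity the sum-to-one constraint is folded in as an extra least-squares row rather than an exact constrained solve:

```python
import numpy as np

def affine_weights(point, neighbors):
    """Weights expressing `point` as an affine combination of `neighbors`:
    min_w ||N w - x||^2 with sum(w) = 1, the sum-to-one constraint being
    appended as an extra least-squares row for simplicity."""
    N = np.asarray(neighbors, dtype=float).T            # shape (dim, k)
    A = np.vstack([N, np.ones(N.shape[1])])
    b = np.append(np.asarray(point, dtype=float), 1.0)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

neighbors = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]])
x = np.array([0.5, 1.0])
w = affine_weights(x, neighbors)
print(w)                                   # weights 0.25, 0.25, 0.5

# Affine invariance: the same weights reconstruct the transformed point
# after an arbitrary affine map of the whole configuration.
M = np.array([[2.0, 1.0], [0.5, 3.0]])
t = np.array([4.0, -1.0])
print(np.allclose((neighbors @ M.T + t).T @ w, M @ x + t))   # True
```

    The invariance check is the property the matching objective exploits: reconstruction error under these weights is unchanged by local affine deformations, so only genuine geometric disagreement is penalized.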

  19. Do Standardized Tests Penalize Deep-Thinking, Creative, or Conscientious Students?: Some Personality Correlates of Graduate Record Examinations Test Scores

    ERIC Educational Resources Information Center

    Powers, Donald E.; Kaufman, James C.

    2004-01-01

    The objective of the study reported here was to explore the relationship of Graduate Record Examinations (GRE) General Test scores to selected personality traits--conscientiousness, rationality, ingenuity, quickness, creativity, and depth. A sample of 342 GRE test takers completed short personality inventory scales for each trait. Analyses…

  20. Decree amending and adding various provisions to the Penal Code for the Federal District with respect to local jurisdiction and to the Penal Code for the whole Republic with respect to federal jurisdiction, 29 December 1988. [Selected provisions].

    PubMed

    1989-01-01

    Mexico's decree amending and adding various provisions to the penal code for the federal district, with respect to local jurisdiction, and to the penal code for the whole republic, with respect to federal jurisdiction, December 29, 1988, among other things, amends the penal code to strengthen provisions relating to sex crimes. Among the provisions are the following: anyone procuring or facilitating the corruption of a minor (under 18) or a person lacking capacity, by means of sexual acts, or who induces such a person to engage in begging, drunkenness, drug addiction, or some other vice, to form part of an unlawful association, or to commit crimes of any kind, will be imprisoned for 3-8 years and subjected to a fine. The sentence shall be enhanced if the minor or incapacitated person forms a habit due to repeated acts of corruption. Whoever encourages or facilitates a person's engaging in prostitution or obtains or delivers a person for the purpose of prostitution will be imprisoned for 2-9 years and fined. Pandering will be punished with imprisonment for 2-9 years and a fine. Whoever, without consent and without the purpose of achieving intercourse, performs a sexual act on another person with lascivious intent, or forces that person to perform such an act, will be sentenced to "15 days to 1 year's or to 10-40 days' community service work." If physical or moral violence is used, the penalty will be 2-7 years imprisonment. Performing the above act on a person under age 12 or on someone unable to resist increases the punishment. Whoever uses physical or moral violence to have intercourse with a person of whatever sex shall be imprisoned 8-14 years; using an object other than a penis either vaginally or anally on either a male or a female will result in imprisonment of 1-5 years. If the victim is under age 12, unable to resist, or if 2 or more persons commit the crime, an enhanced punishment will result.

  1. 7 CFR 1484.73 - Are Cooperators penalized for failing to make required contributions?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 10 2013-01-01 2013-01-01 false Are Cooperators penalized for failing to make... Cooperators penalized for failing to make required contributions? A Cooperator's contribution requirement is specified in the Cooperator program allocation letter. If a Cooperator fails to contribute the amount...

  2. 7 CFR 1484.73 - Are Cooperators penalized for failing to make required contributions?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 10 2014-01-01 2014-01-01 false Are Cooperators penalized for failing to make... Cooperators penalized for failing to make required contributions? A Cooperator's contribution requirement is specified in the Cooperator program allocation letter. If a Cooperator fails to contribute the amount...

  3. 7 CFR 1484.73 - Are Cooperators penalized for failing to make required contributions?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 10 2012-01-01 2012-01-01 false Are Cooperators penalized for failing to make... Cooperators penalized for failing to make required contributions? A Cooperator's contribution requirement is specified in the Cooperator program allocation letter. If a Cooperator fails to contribute the amount...

  4. 7 CFR 1484.73 - Are Cooperators penalized for failing to make required contributions?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 10 2011-01-01 2011-01-01 false Are Cooperators penalized for failing to make... § 1484.73 Are Cooperators penalized for failing to make required contributions? A Cooperator's contribution requirement is specified in the Cooperator program allocation letter. If a Cooperator fails to...

  5. 7 CFR 1484.73 - Are Cooperators penalized for failing to make required contributions?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Are Cooperators penalized for failing to make... § 1484.73 Are Cooperators penalized for failing to make required contributions? A Cooperator's contribution requirement is specified in the Cooperator program allocation letter. If a Cooperator fails to...

  6. [Direct genetic manipulation and criminal code in Venezuela: absolute criminal law void?].

    PubMed

    Cermeño Zambrano, Fernando G De J

    2002-01-01

    The judicial regulation of genetic biotechnology applied to the human genome is currently of great relevance in Venezuela owing to the drafting of an innovative bioethics law in the country's parliament. This article highlights the relevant provisions of Venezuela's 1999 Constitution, as they establish the framework within which this matter will be legally regulated. The article approaches genetic biotechnology applied to the human genome from the standpoint of Venezuelan penal law, highlighting those genetic manipulations that have criminal relevance. The subject has gained further importance as a consequence of the reformulation of the Venezuelan Penal Code under discussion in the country's National Assembly. Therefore, a concise study of the country's penal code is made in this article to better understand which legally protected interests the Venezuelan penal legislation safeguards. This last step enables us to identify the penal tools Venezuela can rely on to confront direct genetic manipulations. We equally indicate the existing punitive loophole, which should be closed by the penal legislator. In conclusion, this essay concerns criminal policy with respect to direct genetic manipulations of the human genome that have not been criminalized in Venezuelan law, thus revealing a genetic biotechnology paradise.

  7. Low-dose dynamic myocardial perfusion CT image reconstruction using pre-contrast normal-dose CT scan induced structure tensor total variation regularization

    NASA Astrophysics Data System (ADS)

    Gong, Changfei; Han, Ce; Gan, Guanghui; Deng, Zhenxiang; Zhou, Yongqiang; Yi, Jinling; Zheng, Xiaomin; Xie, Congying; Jin, Xiance

    2017-04-01

    Dynamic myocardial perfusion CT (DMP-CT) imaging provides quantitative functional information for diagnosis and risk stratification of coronary artery disease by calculating myocardial perfusion hemodynamic parameter (MPHP) maps. However, the level of radiation delivered by the dynamic sequential scan protocol can be potentially high. The purpose of this work is to develop a pre-contrast normal-dose scan induced structure tensor total variation regularization based on the penalized weighted least-squares (PWLS) criteria to improve the image quality of DMP-CT with a low-mAs CT acquisition. For simplicity, the present approach is termed 'PWLS-ndiSTV'. Specifically, the ndiSTV regularization takes into account the spatial-temporal structure information of DMP-CT data and further exploits the higher order derivatives of the objective images to enhance denoising performance. Subsequently, an effective optimization algorithm based on the split-Bregman approach was adopted to minimize the associated objective function. Evaluations with a modified dynamic XCAT phantom and preclinical porcine datasets have demonstrated that the proposed PWLS-ndiSTV approach can achieve promising gains over other existing approaches in terms of noise-induced artifact mitigation, edge detail preservation, and accurate MPHP map calculation.

  8. Functional linear models for zero-inflated count data with application to modeling hospitalizations in patients on dialysis.

    PubMed

    Sentürk, Damla; Dalrymple, Lorien S; Nguyen, Danh V

    2014-11-30

    We propose functional linear models for zero-inflated count data with a focus on the functional hurdle and functional zero-inflated Poisson (ZIP) models. While the hurdle model assumes the counts come from a mixture of a degenerate distribution at zero and a zero-truncated Poisson distribution, the ZIP model considers a mixture of a degenerate distribution at zero and a standard Poisson distribution. We extend the generalized functional linear model framework with a functional predictor and multiple cross-sectional predictors to model counts generated by a mixture distribution. We propose an estimation procedure for functional hurdle and ZIP models, called penalized reconstruction, geared towards error-prone and sparsely observed longitudinal functional predictors. The approach relies on dimension reduction and pooling of information across subjects involving basis expansions and penalized maximum likelihood techniques. The developed functional hurdle model is applied to modeling hospitalizations within the first 2 years from initiation of dialysis, with a high percentage of zeros, in the Comprehensive Dialysis Study participants. Hospitalization counts are modeled as a function of sparse longitudinal measurements of serum albumin concentrations, patient demographics, and comorbidities. Simulation studies are used to study finite sample properties of the proposed method and include comparisons with an adaptation of standard principal components regression. Copyright © 2014 John Wiley & Sons, Ltd.
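
    The ZIP mixture the model builds on has a simple closed form: a point mass at zero with weight pi0 mixed with a standard Poisson. A minimal sketch with illustrative parameter values:

```python
import math

def zip_pmf(k, pi0, lam):
    """Zero-inflated Poisson pmf: a point mass at zero with weight pi0
    mixed with a standard Poisson(lam) distribution."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    return pi0 * (k == 0) + (1.0 - pi0) * poisson

# The mixture puts extra mass at zero relative to a plain Poisson:
print(zip_pmf(0, 0.4, 2.0) > math.exp(-2.0))   # True
print(abs(sum(zip_pmf(k, 0.4, 2.0) for k in range(50)) - 1.0) < 1e-9)  # True
```

    The hurdle variant differs only in replacing the Poisson component with a zero-truncated Poisson, so that all zeros come from the degenerate component.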

  9. 27 CFR 19.957 - Instructions to compute bond penal sum.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Instructions to compute bond penal sum. 19.957 Section 19.957 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX... Fuel Use Bonds § 19.957 Instructions to compute bond penal sum. (a) Medium plants. To find the required...

  10. SPECT reconstruction using DCT-induced tight framelet regularization

    NASA Astrophysics Data System (ADS)

    Zhang, Jiahan; Li, Si; Xu, Yuesheng; Schmidtlein, C. R.; Lipson, Edward D.; Feiglin, David H.; Krol, Andrzej

    2015-03-01

    Wavelet transforms have been successfully applied in many fields of image processing. Yet, to our knowledge, they have never been directly incorporated into the objective function in Emission Computed Tomography (ECT) image reconstruction. Our aim has been to investigate whether the ℓ1-norm of non-decimated discrete cosine transform (DCT) coefficients of the estimated radiotracer distribution could be effectively used as the regularization term for penalized-likelihood (PL) reconstruction, where a regularizer is used to enforce image smoothness in the reconstruction. In this study, the ℓ1-norm of the 2D DCT wavelet decomposition was used as the regularization term. The Preconditioned Alternating Projection Algorithm (PAPA), which we proposed in earlier work to solve PL reconstruction with non-differentiable regularizers, was used to solve this optimization problem. The DCT wavelet decompositions were performed on the transaxial reconstructed images. We reconstructed Monte Carlo simulated SPECT data obtained for a numerical phantom with Gaussian blobs as hot lesions and with a warm random lumpy background. Reconstructed images using the proposed method exhibited better noise suppression and improved lesion conspicuity, compared with images reconstructed using the expectation maximization (EM) algorithm with Gaussian post filter (GPF). Also, the mean square error (MSE) was smaller, compared with EM-GPF. A critical and challenging aspect of this method was the selection of optimal parameters. In summary, our numerical experiments demonstrated that the ℓ1-norm of the non-decimated DCT wavelet-frame coefficients shows promise as a regularizer for SPECT image reconstruction using the PAPA method.
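
    The regularization term itself, the ℓ1 norm of DCT coefficients, is easy to sketch. The example below uses scipy's dctn and synthetic images to show why the penalty discriminates smooth content from noise; image size and noise level are arbitrary:

```python
import numpy as np
from scipy.fft import dctn

def dct_l1(image):
    """l1 norm of the 2-D DCT coefficients -- the regularization term
    added to the penalized-likelihood objective in this sketch."""
    return float(np.abs(dctn(image, norm="ortho")).sum())

rng = np.random.default_rng(1)
smooth = np.outer(np.hanning(64), np.hanning(64))        # smooth blob
noisy = smooth + 0.1 * rng.standard_normal((64, 64))

# A smooth image concentrates its energy in few DCT coefficients, so its
# l1 penalty is far smaller than that of its noisy counterpart.
print(dct_l1(smooth) < dct_l1(noisy))   # True
```

    Minimizing the PL objective with this penalty therefore steers the reconstruction toward images whose DCT representation is sparse, which is what produces the reported noise suppression.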

  11. [Treatment of amblyopia].

    PubMed

    von Noorden, G K

    1990-01-01

    Animal experiments have explored the structural and functional alterations of the afferent visual pathways in amblyopia and have emphasized the extraordinary sensitivity of the immature visual system to abnormal visual stimulation. The practical consequences of these experiments are obvious: early diagnosis of amblyopia and energetic occlusion therapy as early in life as possible. At the same time, measures must be taken to prevent visual deprivation amblyopia in the occluded eye. After successful treatment, alternating penalization with two pairs of spectacles is recommended. Pleoptics involves an enormous commitment in terms of time, personnel and costs. In view of the fact that the superiority of this treatment over occlusion therapy has yet to be proven, the current value of pleoptics appears dubious. Moreover, overtreated patients may end up with intractable diplopia. Diverging opinions exist with regard to the use of penalization as a primary treatment of amblyopia. We employ it only in special cases as an alternative to occlusion therapy. Visual deprivation in infancy caused by opacities of the ocular media, especially when they occur unilaterally, must be eliminated, and deprivation amblyopia must be treated without delay to regain useful vision. Brief periods of bilateral occlusion are recommended to avoid the highly amblyopiogenic imbalance between binocular afferent visual input. Future developments will hopefully include new objective methods to diagnose amblyopia in preverbal children and infants. The application of positron emission tomography is perhaps the first step in the direction of searching for new approaches to this problem.(ABSTRACT TRUNCATED AT 250 WORDS)

  12. A cost-function approach to rival penalized competitive learning (RPCL).

    PubMed

    Ma, Jinwen; Wang, Taijun

    2006-08-01

    Rival penalized competitive learning (RPCL) has been shown to be a useful tool for clustering a set of sample data in which the number of clusters is unknown. However, the RPCL algorithm was proposed heuristically and still lacks a mathematical theory describing its convergence behavior. In order to solve the convergence problem, we investigate it via a cost-function approach. By theoretical analysis, we prove that a general form of RPCL, called distance-sensitive RPCL (DSRPCL), is associated with the minimization of a cost function on the weight vectors of a competitive learning network. As a DSRPCL process decreases the cost to a local minimum, a number of weight vectors eventually fall into a hypersphere surrounding the sample data, while the other weight vectors diverge to infinity. Moreover, it is shown by theoretical analysis and simulation experiments that if the cost reaches the global minimum, the correct number of weight vectors is automatically selected and located around the centers of the actual clusters, respectively. Finally, we apply the DSRPCL algorithms to unsupervised color image segmentation and classification of the wine data.
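
    The RPCL update rule the paper analyzes can be sketched heuristically: for each sample, the winning unit moves toward it while the second-nearest (rival) unit is repelled with a much smaller de-learning rate. All rates, seeds, and cluster positions below are illustrative:

```python
import numpy as np

def rpcl_step(centers, x, lr=0.05, delearn=0.0005):
    """One RPCL update: the winner moves toward the sample, the rival
    (second-nearest unit) is repelled by a much smaller de-learning rate."""
    d = np.linalg.norm(centers - x, axis=1)
    winner, rival = np.argsort(d)[:2]
    centers[winner] += lr * (x - centers[winner])
    centers[rival] -= delearn * (x - centers[rival])

rng = np.random.default_rng(2)
# Two true clusters but three seed units: the surplus unit keeps losing,
# is repeatedly penalized as the rival, and is driven away from the data,
# while the two winners settle on the actual cluster centers.
data = np.vstack([rng.normal(0.0, 0.1, (200, 2)),
                  rng.normal(3.0, 0.1, (200, 2))])
centers = np.array([[0.5, 0.5], [2.5, 2.5], [1.0, 1.0]])
for _ in range(30):
    for x in rng.permutation(data):
        rpcl_step(centers, x)
print(np.round(centers, 2))
```

    This is exactly the behavior the DSRPCL analysis formalizes: some weight vectors settle inside a hypersphere around the data while the surplus ones are pushed away.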

  13. MPC Design for Rapid Pump-Attenuation and Expedited Hyperglycemia Response to Treat T1DM with an Artificial Pancreas

    PubMed Central

    Gondhalekar, Ravi; Dassau, Eyal; Doyle, Francis J.

    2016-01-01

    The design of a Model Predictive Control (MPC) strategy for the closed-loop operation of an Artificial Pancreas (AP) for treating Type 1 Diabetes Mellitus (T1DM) is considered in this paper. The contribution of this paper is to propose two changes to the usual structure of the MPC problems typically considered for control of an AP. The first proposed change is to replace the symmetric, quadratic input cost function with an asymmetric, quadratic function, allowing negative control inputs to be penalized less than positive ones. This facilitates rapid pump-suspensions in response to predicted hypoglycemia, while simultaneously permitting the design of a conservative response to hyperglycemia. The second proposed change is to penalize the velocity of the predicted glucose level, where this velocity penalty is based on a cost function that is again asymmetric, but additionally state-dependent. This facilitates the accelerated response to acute, persistent hyperglycemic events, e.g., as induced by unannounced meals. The novel functionality is demonstrated by numerical examples, and the efficacy of the proposed MPC strategy verified using the University of Padova/Virginia metabolic simulator. PMID:28479660
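
    The first proposed change, an asymmetric quadratic input cost, can be sketched directly; the weights below are illustrative, not the paper's tuning:

```python
import numpy as np

def asymmetric_input_cost(u, r_neg=0.1, r_pos=1.0):
    """Asymmetric quadratic input penalty: negative inputs (delivery
    below basal, i.e., pump attenuation) cost less than positive ones,
    so predicted hypoglycemia triggers rapid pump suspension while the
    response to hyperglycemia stays conservative."""
    u = np.asarray(u, dtype=float)
    return float(np.sum(np.where(u < 0.0, r_neg, r_pos) * u ** 2))

print(asymmetric_input_cost([-2.0]))   # 0.4 -- cheap pump suspension
print(asymmetric_input_cost([2.0]))    # 4.0 -- costly extra insulin
```

    Inside the MPC optimization, this asymmetry lets the controller suspend insulin aggressively at low predicted glucose without making it equally eager to dose aggressively at high glucose.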

  14. 49 CFR 26.47 - Can recipients be penalized for failing to meet overall goals?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false Can recipients be penalized for failing to meet... Goals, Good Faith Efforts, and Counting § 26.47 Can recipients be penalized for failing to meet overall... rule, because your DBE participation falls short of your overall goal, unless you have failed to...

  15. 43 CFR 4170.2-1 - Penal provisions under the Taylor Grazing Act.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Penal provisions under the Taylor Grazing Act. 4170.2-1 Section 4170.2-1 Public Lands: Interior Regulations Relating to Public Lands (Continued...-EXCLUSIVE OF ALASKA Penalties § 4170.2-1 Penal provisions under the Taylor Grazing Act. Under section 2 of...

  16. 38 CFR 14.560 - Procedure where violation of penal statutes is involved including those offenses coming within...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Procedure where violation of penal statutes is involved including those offenses coming within the purview of the Assimilative... where violation of penal statutes is involved including those offenses coming within the purview of the...

  17. 38 CFR 14.560 - Procedure where violation of penal statutes is involved including those offenses coming within...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2011-07-01 2011-07-01 false Procedure where violation of penal statutes is involved including those offenses coming within the purview of the Assimilative... where violation of penal statutes is involved including those offenses coming within the purview of the...

  18. 38 CFR 14.560 - Procedure where violation of penal statutes is involved including those offenses coming within...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2013-07-01 2013-07-01 false Procedure where violation of penal statutes is involved including those offenses coming within the purview of the Assimilative... where violation of penal statutes is involved including those offenses coming within the purview of the...

  19. 38 CFR 14.560 - Procedure where violation of penal statutes is involved including those offenses coming within...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2014-07-01 2014-07-01 false Procedure where violation of penal statutes is involved including those offenses coming within the purview of the Assimilative... where violation of penal statutes is involved including those offenses coming within the purview of the...

  20. 38 CFR 14.560 - Procedure where violation of penal statutes is involved including those offenses coming within...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2012-07-01 2012-07-01 false Procedure where violation of penal statutes is involved including those offenses coming within the purview of the Assimilative... where violation of penal statutes is involved including those offenses coming within the purview of the...

  1. Conventional occlusion versus pharmacologic penalization for amblyopia.

    PubMed

    Li, Tianjing; Shotton, Kate

    2009-10-07

    Amblyopia is defined as defective visual acuity in one or both eyes without demonstrable abnormality of the visual pathway, and is not immediately resolved by wearing glasses. To assess the effectiveness and safety of conventional occlusion versus atropine penalization for amblyopia, we searched CENTRAL, MEDLINE, EMBASE, LILACS, the WHO International Clinical Trials Registry Platform, reference lists, the Science Citation Index and ongoing trials up to June 2009. We included randomized/quasi-randomized controlled trials comparing conventional occlusion to atropine penalization for amblyopia. Two authors independently screened abstracts and full text articles, abstracted data, and assessed the risk of bias. Three trials with a total of 525 amblyopic eyes were included. Of the three trials, one was assessed as having a low risk of bias and one as having a high risk of bias. Evidence from the three trials suggests atropine penalization is as effective as conventional occlusion. One trial found similar improvement in vision at six and 24 months. At six months, visual acuity in the amblyopic eye improved from baseline by 3.16 lines in the occlusion group and 2.84 lines in the atropine group (mean difference 0.034 logMAR; 95% confidence interval (CI) 0.005 to 0.064 logMAR). At 24 months, additional improvement was seen in both groups, but there continued to be no meaningful difference (mean difference 0.01 logMAR; 95% CI -0.02 to 0.04 logMAR). The second trial reported atropine to be more effective than occlusion. At six months, visual acuity improved 1.8 lines in the patching group and 3.4 lines in the atropine penalization group, in favor of atropine (mean difference -0.16 logMAR; 95% CI -0.23 to -0.09 logMAR). Different occlusion modalities were used in these two trials.
The third trial had inherent methodological flaws, and limited inference could be drawn. No difference in ocular alignment, stereo acuity or sound eye visual acuity between occlusion and atropine penalization was found. Although both treatments were well tolerated, compliance was better with atropine penalization, which also costs less than conventional occlusion. Both conventional occlusion and atropine penalization produce visual acuity improvement in the amblyopic eye, and atropine penalization appears to be as effective as conventional occlusion, although the magnitude of improvement differed among the three trials. Atropine penalization can be used as first line treatment for amblyopia.

  2. On the advancement of therapeutic penality: therapeutic authority, personality science and the therapeutic community.

    PubMed

    McBride, Ruari-Santiago

    2017-09-01

    In this article I examine the advancement of therapeutic penality in the UK, a penal philosophy that reimagines prison policy, practices and environments utilising psychological knowledge. Adopting a historical approach, I show how modern therapeutic penality is linked to the emergence of personality science in the nineteenth century and the development of the democratic therapeutic community (DTC) model in the twentieth century. I outline how at the turn of the twenty-first century a catalytic event generated a moral panic that led the British government to mobilise psychological knowledge and technologies in an attempt to manage dangerous people with severe personality disorder. Tracing subsequent developments, I argue psychological ways of talking, thinking and acting have obtained unparalleled salience in domains of penality and, in turn, radically transformed the conditions of imprisonment. © 2017 Foundation for the Sociology of Health & Illness.

  3. Non-convex Statistical Optimization for Sparse Tensor Graphical Model

    PubMed Central

    Sun, Wei; Wang, Zhaoran; Liu, Han; Cheng, Guang

    2016-01-01

    We consider the estimation of sparse graphical models that characterize the dependency structure of high-dimensional tensor-valued data. To facilitate the estimation of the precision matrix corresponding to each way of the tensor, we assume the data follow a tensor normal distribution whose covariance has a Kronecker product structure. The penalized maximum likelihood estimation of this model involves minimizing a non-convex objective function. In spite of the non-convexity of this estimation problem, we prove that an alternating minimization algorithm, which iteratively estimates each sparse precision matrix while fixing the others, attains an estimator with the optimal statistical rate of convergence as well as consistent graph recovery. Notably, such an estimator achieves estimation consistency with only one tensor sample, which was not achieved in previous work. Our theoretical results are backed by thorough numerical studies. PMID:28316459
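    The Kronecker product structure assumed for the tensor-normal covariance can be illustrated with a minimal sketch (plain-Python matrices as lists of rows, for clarity only):

```python
def kron(A, B):
    # Kronecker product: each entry a of A is replaced by the block a * B,
    # so the overall covariance factorizes mode-by-mode as
    # Sigma = Sigma_1 (x) Sigma_2 (x) ... for tensor-valued data.
    return [[a * b for a in row_a for b in row_b]
            for row_a in A for row_b in B]
```

    This factorization is what lets the alternating scheme estimate one (small) per-mode precision matrix at a time while holding the others fixed.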

  4. Estimation of Covariance Matrix on Bi-Response Longitudinal Data Analysis with Penalized Spline Regression

    NASA Astrophysics Data System (ADS)

    Islamiyati, A.; Fatmawati; Chamidah, N.

    2018-03-01

    In bi-response longitudinal data, correlation arises among measurements both within subjects and between the two responses. This induces autocorrelated errors, which can be handled using a covariance matrix. In this article, we estimate the covariance matrix based on the penalized spline regression model. The penalized spline involves knot points and smoothing parameters simultaneously in controlling the smoothness of the curve. Based on our simulation study, the estimated regression model of the weighted penalized spline with the covariance matrix gives a smaller error value than the model without the covariance matrix.
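    The smoothing mechanism can be illustrated with the standard second-difference roughness penalty used in penalized (P-)splines, sketched below (illustrative only; the bi-response covariance weighting from the article is omitted):

```python
def roughness_penalty(coefs, lam):
    # Penalized splines control smoothness by penalizing squared
    # second differences of adjacent spline coefficients; larger
    # lam forces the fitted curve toward a straight line.
    return lam * sum((coefs[i + 1] - 2 * coefs[i] + coefs[i - 1]) ** 2
                     for i in range(1, len(coefs) - 1))
```

    A perfectly linear coefficient sequence incurs zero penalty, while wiggly sequences are penalized in proportion to their curvature.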

  5. Differentiating among penal states.

    PubMed

    Lacey, Nicola

    2010-12-01

    This review article assesses Loïc Wacquant's contribution to debates on penality, focusing on his most recent book, Punishing the Poor: The Neoliberal Government of Social Insecurity (Wacquant 2009), while setting its argument in the context of his earlier Prisons of Poverty (1999). In particular, it draws on both historical and comparative methods to question whether Wacquant's conception of 'the penal state' is adequately differentiated for the purposes of building the explanatory account he proposes; about whether 'neo-liberalism' has, materially, the global influence which he ascribes to it; and about whether, therefore, the process of penal Americanization which he asserts in his recent writings is credible.

  6. A Study of Penalty Function Methods for Constraint Handling with Genetic Algorithm

    NASA Technical Reports Server (NTRS)

    Ortiz, Francisco

    2004-01-01

    COMETBOARDS (Comparative Evaluation Testbed of Optimization and Analysis Routines for Design of Structures) is a design optimization test bed that can evaluate the performance of several different optimization algorithms. A few of these optimization algorithms are the sequence of unconstrained minimization techniques (SUMT), sequential linear programming (SLP) and sequential quadratic programming (SQP). A genetic algorithm (GA) is a search technique based on the principles of natural selection, or "survival of the fittest". Instead of using gradient information, the GA uses the objective function directly in the search. The GA searches the solution space by maintaining a population of potential solutions. Then, using evolutionary operations such as recombination, mutation and selection, the GA creates successive generations of solutions that evolve and take on the positive characteristics of their parents, thus gradually approaching optimal or near-optimal solutions. By using the objective function directly in the search, genetic algorithms can be effectively applied to non-convex, highly nonlinear, complex problems. The genetic algorithm is not guaranteed to find the global optimum, but it is less likely to get trapped at a local optimum than traditional gradient-based search methods when the objective function is not smooth or well behaved. The purpose of this research is to assist in the integration of a genetic algorithm into COMETBOARDS. COMETBOARDS casts the design of structures as a constrained nonlinear optimization problem. One method of solving a constrained optimization problem with a GA is to convert it into an unconstrained optimization problem by developing a penalty function that penalizes infeasible solutions. Several penalty functions have been suggested in the literature, each with its own strengths and weaknesses.
A statistical analysis of some of the suggested penalty functions is performed in this study. In addition, a response surface approach to robust design is used to develop a new penalty function approach, which is then compared with the existing penalty functions.
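    One classic scheme of this kind is the static penalty, sketched below (a sketch only; the coefficient r is an illustrative choice, and the study compares several alternative formulations):

```python
def penalized_fitness(objective, constraints, r=100.0):
    # Static penalty scheme: add r times the sum of squared violations
    # of constraints written in the form g(x) <= 0, so infeasible
    # candidates score worse but remain comparable during selection.
    def fitness(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return objective(x) + r * violation
    return fitness
```

    For example, minimizing x^2 subject to x >= 1 (i.e., g(x) = 1 - x <= 0) leaves feasible points unpenalized while pushing infeasible ones away.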

  7. Analyzing Test-Taking Behavior: Decision Theory Meets Psychometric Theory.

    PubMed

    Budescu, David V; Bo, Yuanchao

    2015-12-01

    We investigate the implications of penalizing incorrect answers to multiple-choice tests, from the perspective of both test-takers and test-makers. To do so, we use a model that combines a well-known item response theory model with prospect theory (Kahneman and Tversky, Prospect theory: An analysis of decision under risk, Econometrica 47:263-91, 1979). Our results reveal that when test-takers are fully informed of the scoring rule, the use of any penalty has detrimental effects for both test-takers (they are always penalized in excess, particularly those who are risk averse and loss averse) and test-makers (the bias of the estimated scores, as well as the variance and skewness of their distribution, increase as a function of the severity of the penalty).
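    The scoring rule under discussion is classic formula scoring, sketched below (the prospect-theory weighting that drives the paper's results is omitted; this shows only why the penalty is calibrated to 1/(k-1)):

```python
def expected_formula_score(p_correct, k):
    # Formula scoring on a k-option item: +1 for a correct answer,
    # -1/(k - 1) for an incorrect one, so blind guessing (p = 1/k)
    # has expected score exactly zero.
    return p_correct - (1.0 - p_correct) / (k - 1)
```

    Risk-averse test-takers who skip items with p above 1/k forgo positive expected score, which is the excess penalization the authors analyze.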

  8. Differential gene expression detection and sample classification using penalized linear regression models.

    PubMed

    Wu, Baolin

    2006-02-15

    Differential gene expression detection and sample classification using microarray data have received much research interest recently. Owing to the large number of genes p and small number of samples n (p >> n), microarray data analysis poses big challenges for statistical analysis. An obvious problem arising from 'large p, small n' is over-fitting. Just by chance, we are likely to find some non-differentially expressed genes that can classify the samples very well. The idea of shrinkage is to regularize the model parameters to reduce the effects of noise and produce reliable inferences. Shrinkage has been successfully applied in microarray data analysis. The SAM statistics proposed by Tusher et al. and the 'nearest shrunken centroid' proposed by Tibshirani et al. are ad hoc shrinkage methods. Both methods are simple, intuitive and have proved useful in empirical studies. Recently Wu proposed the penalized t/F-statistics with shrinkage by formally using L1-penalized linear regression models for two-class microarray data, showing good performance. In this paper we systematically discuss the use of penalized regression models for analyzing microarray data. We generalize the two-class penalized t/F-statistics proposed by Wu to multi-class microarray data. We formally derive the ad hoc shrunken centroid used by Tibshirani et al. using L1-penalized regression models. Finally, we show that the penalized linear regression models provide a rigorous and unified statistical framework for sample classification and differential gene expression detection.
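    The shrinkage induced by an L1 penalty is the soft-thresholding operator, which is the coordinate-wise building block behind the shrunken centroid (a minimal sketch):

```python
def soft_threshold(z, lam):
    # Solution of min_b (z - b)^2 / 2 + lam * |b|: shrink z toward
    # zero by lam, and set it exactly to zero when |z| <= lam --
    # this is how L1 penalization zeroes out noise genes.
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0
```

    Applied to per-gene centroid deviations, small (noise-level) deviations are set exactly to zero, leaving a sparse set of genes that drive classification.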

  9. School Crime Handbook. Summary of California Penal and Civil Laws Pertaining to Crimes Committed against Persons or Property on School Grounds.

    ERIC Educational Resources Information Center

    California State Office of the Attorney General, Sacramento.

    This handbook was prepared to ensure that, as required by section 626.1 of the California Penal Code in 1984, "students, parents, and all school officials and employees have access to a concise, easily understandable summary of California penal and civil law pertaining to crimes committed against persons or property on school grounds."…

  10. Association between Stereotactic Radiotherapy and Death from Brain Metastases of Epithelial Ovarian Cancer: a Gliwice Data Re-Analysis with Penalization

    PubMed

    Tukiendorf, Andrzej; Mansournia, Mohammad Ali; Wydmański, Jerzy; Wolny-Rokicka, Edyta

    2017-04-01

    Background: Clinical datasets for epithelial ovarian cancer brain metastatic patients are usually small in size. When adequate case numbers are lacking, resulting estimates of regression coefficients may demonstrate bias. One of the direct approaches to reduce such sparse-data bias is based on penalized estimation. Methods: A re-analysis of formerly reported hazard ratios in diagnosed patients was performed using penalized Cox regression in a popular SAS package, with additional software code provided for the statistical computational procedure. Results: It was found that the penalized approach can readily diminish sparse-data artefacts and radically reduce the magnitude of estimated regression coefficients. Conclusions: It was confirmed that classical statistical approaches may exaggerate regression estimates or distort study interpretations and conclusions. The results support the thesis that penalization via weakly informative priors and data augmentation are the safest approaches to shrink sparse-data artefacts frequently occurring in epidemiological research.
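    Penalization of this kind amounts to subtracting a ridge term from the log-likelihood, which is equivalent to placing a weakly informative normal prior on the coefficients. A generic sketch (the study itself uses a SAS implementation of penalized Cox regression; this only illustrates the shrinkage principle):

```python
def ridge_penalized_loglik(loglik, beta, lam):
    # Subtracting lam/2 * sum(beta^2) from the log-likelihood is
    # equivalent to a mean-zero normal prior on the coefficients,
    # shrinking sparse-data estimates (and hence hazard ratios)
    # toward the null; lam = 0 recovers the unpenalized fit.
    return loglik - lam / 2.0 * sum(b * b for b in beta)
```

    Maximizing this penalized objective pulls extreme coefficients that are supported by few events back toward zero, which is exactly the sparse-data-bias correction described above.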

  11. Relationship between Training Programs being Offered in State and Federal Penal Institutions and the Unfilled Job Openings in the Major Occupations in the United States.

    ERIC Educational Resources Information Center

    Torrence, John Thomas

    Excluding military installations, training programs in state and federal penal institutions were surveyed, through a mailed checklist, to test the hypotheses that (1) training programs in penal institutions were not related to the unfilled job openings by major occupations in the United States, and (2) that training programs reported would have a…

  12. [Penal treatment and rehabilitation of the convict in the new Penal Code of San Marino. Juridical and criminological aspects].

    PubMed

    Sclafani, F; Starace, A

    1978-01-01

    The Republic of San Marino adopted a new Penal Code which came into force on 1st January 1975; it replaced the former one of 15th Sept. 1865. After stating the typical aspects of the Penal Procedure System therein enforceable, the Authors examine the rules concerning criminal responsibility and the danger of committing new crimes, pointing out and criticizing the relevant contradictions. In explaining the measures regarding punishment and educational rehabilitation provided for by San Marino's legal system, the Authors then consider them from a juridical and criminological viewpoint. While some reforms are to be welcomed (for example: biopsychical inquiry into the charged person, probation, week-end imprisonment, fines according to the income of the condemned, etc.), the Authors stress that some legal provisions may appear useless and unrealistic when one considers the environmental conditions of the little Republic. The Authors conclude that Penal Procedure Law is not in accordance with Penal Law and, consequently, they hope that a new reform will be grounded on the needs arising from the crimes perpetrated in loco. It will, however, be necessary to plan co-ordination between the two Codes within a framework of de-criminalization of many acts which are now punishable as crimes.

  13. A powerful and flexible approach to the analysis of RNA sequence count data.

    PubMed

    Zhou, Yi-Hui; Xia, Kai; Wright, Fred A

    2011-10-01

    A number of penalization and shrinkage approaches have been proposed for the analysis of microarray gene expression data. Similar techniques are now routinely applied to RNA sequence transcriptional count data, although the value of such shrinkage has not been conclusively established. If penalization is desired, the explicit modeling of mean-variance relationships provides a flexible testing regimen that 'borrows' information across genes, while easily incorporating design effects and additional covariates. We describe BBSeq, which incorporates two approaches: (i) a simple beta-binomial generalized linear model, which has not been extensively tested for RNA-Seq data and (ii) an extension of an expression mean-variance modeling approach to RNA-Seq data, involving modeling of the overdispersion as a function of the mean. Our approaches are flexible, allowing for general handling of discrete experimental factors and continuous covariates. We report comparisons with other alternate methods to handle RNA-Seq data. Although penalized methods have advantages for very small sample sizes, the beta-binomial generalized linear model, combined with simple outlier detection and testing approaches, appears to have favorable characteristics in power and flexibility. An R package containing examples and sample datasets is available at http://www.bios.unc.edu/research/genomic_software/BBSeq. Contact: yzhou@bios.unc.edu; fwright@bios.unc.edu. Supplementary data are available at Bioinformatics online.
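    The kind of mean-variance relationship being modeled can be illustrated with the standard beta-binomial variance (a sketch; rho is the overdispersion/intraclass-correlation parameter):

```python
def beta_binomial_variance(n, p, rho):
    # Binomial variance n*p*(1-p), inflated by the overdispersion
    # factor (1 + (n - 1) * rho); rho = 0 recovers the plain
    # binomial, while rho > 0 captures extra-binomial count noise.
    return n * p * (1.0 - p) * (1.0 + (n - 1) * rho)
```

    Modeling how this overdispersion varies with the mean is what lets the approach borrow information across genes without a fixed penalty.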

  14. Patch-based image reconstruction for PET using prior-image derived dictionaries

    NASA Astrophysics Data System (ADS)

    Tahaei, Marzieh S.; Reader, Andrew J.

    2016-09-01

    In PET image reconstruction, regularization is often needed to reduce the noise in the resulting images. Patch-based image processing techniques have recently been successfully used for regularization in medical image reconstruction through a penalized likelihood framework. Re-parameterization within reconstruction is another powerful regularization technique in which the object in the scanner is re-parameterized using coefficients for spatially-extensive basis vectors. In this work, a method for extracting patch-based basis vectors from the subject’s MR image is proposed. The coefficients for these basis vectors are then estimated using the conventional MLEM algorithm. Furthermore, using the alternating direction method of multipliers, an algorithm for optimizing the Poisson log-likelihood while imposing sparsity on the parameters is also proposed. This novel method is then utilized to find sparse coefficients for the patch-based basis vectors extracted from the MR image. The results indicate the superiority of the proposed methods to patch-based regularization using the penalized likelihood framework.
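    How patches from a prior image could be collected into basis vectors can be sketched as follows (plain-Python 2D lists for clarity; the MR-specific dictionary processing from the paper is omitted):

```python
def extract_patches(image, size):
    # Flatten every overlapping size-by-size patch of a 2D image into
    # a vector; the resulting vectors can serve as columns of a
    # patch-based dictionary for re-parameterizing the reconstruction.
    rows, cols = len(image), len(image[0])
    return [[image[i + di][j + dj] for di in range(size) for dj in range(size)]
            for i in range(rows - size + 1)
            for j in range(cols - size + 1)]
```

    The reconstructed object is then represented by coefficients over these patch vectors rather than by individual voxels, which is where the regularization comes from.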

  15. 45 CFR 261.15 - Can a family be penalized if a parent refuses to work because he or she cannot find child care?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Provisions Addressing Individual Responsibility? § 261.15 Can a family be penalized if a parent refuses to... parent caring for a child under age six who has a demonstrated inability to obtain needed child care, as... 45 Public Welfare 2 2011-10-01 2011-10-01 false Can a family be penalized if a parent refuses to...

  16. 45 CFR 261.15 - Can a family be penalized if a parent refuses to work because he or she cannot find child care?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Provisions Addressing Individual Responsibility? § 261.15 Can a family be penalized if a parent refuses to... parent caring for a child under age six who has a demonstrated inability to obtain needed child care, as... 45 Public Welfare 2 2013-10-01 2012-10-01 true Can a family be penalized if a parent refuses to...

  17. 45 CFR 261.15 - Can a family be penalized if a parent refuses to work because he or she cannot find child care?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Provisions Addressing Individual Responsibility? § 261.15 Can a family be penalized if a parent refuses to... parent caring for a child under age six who has a demonstrated inability to obtain needed child care, as... 45 Public Welfare 2 2012-10-01 2012-10-01 false Can a family be penalized if a parent refuses to...

  18. 45 CFR 261.15 - Can a family be penalized if a parent refuses to work because he or she cannot find child care?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Provisions Addressing Individual Responsibility? § 261.15 Can a family be penalized if a parent refuses to... parent caring for a child under age six who has a demonstrated inability to obtain needed child care, as... 45 Public Welfare 2 2010-10-01 2010-10-01 false Can a family be penalized if a parent refuses to...

  19. 45 CFR 261.15 - Can a family be penalized if a parent refuses to work because he or she cannot find child care?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Provisions Addressing Individual Responsibility? § 261.15 Can a family be penalized if a parent refuses to... parent caring for a child under age six who has a demonstrated inability to obtain needed child care, as... 45 Public Welfare 2 2014-10-01 2012-10-01 true Can a family be penalized if a parent refuses to...

  20. Hospital Characteristics Associated With Penalties in the Centers for Medicare & Medicaid Services Hospital-Acquired Condition Reduction Program.

    PubMed

    Rajaram, Ravi; Chung, Jeanette W; Kinnier, Christine V; Barnard, Cynthia; Mohanty, Sanjay; Pavey, Emily S; McHugh, Megan C; Bilimoria, Karl Y

    2015-07-28

    In fiscal year (FY) 2015, the Centers for Medicare & Medicaid Services (CMS) instituted the Hospital-Acquired Condition (HAC) Reduction Program, which reduces payments to the lowest-performing hospitals. However, it is uncertain whether this program accurately measures quality and fairly penalizes hospitals. To examine the characteristics of hospitals penalized by the HAC Reduction Program and to evaluate the association of a summary score of hospital characteristics related to quality with penalization in the HAC program. Data for hospitals participating in the FY2015 HAC Reduction Program were obtained from CMS' Hospital Compare and merged with the 2014 American Hospital Association Annual Survey and FY2015 Medicare Impact File. Logistic regression models were developed to examine the association between hospital characteristics and HAC program penalization. An 8-point hospital quality summary score was created using hospital characteristics related to volume, accreditations, and offering of advanced care services. The relationship between the hospital quality summary score and HAC program penalization was examined. Publicly reported process-of-care and outcome measures were examined from 4 clinical areas (surgery, acute myocardial infarction, heart failure, pneumonia), and their association with the hospital quality summary score was evaluated. Penalization in the HAC Reduction Program. Hospital characteristics associated with penalization. Of the 3284 hospitals participating in the HAC program, 721 (22.0%) were penalized. 
Hospitals were more likely to be penalized if they were accredited by the Joint Commission (24.0% accredited, 14.4% not accredited; odds ratio [OR], 1.33; 95% CI, 1.04-1.70); they were major teaching hospitals (42.3%; OR, 1.58; 95% CI, 1.09-2.29) or very major teaching hospitals (62.2%; OR, 2.61; 95% CI, 1.55-4.39; vs nonteaching hospitals, 17.0%); they cared for more complex patient populations based on case mix index (quartile 4 vs quartile 1: 32.8% vs 12.1%; OR, 1.98; 95% CI, 1.44-2.71); or they were safety-net hospitals vs non-safety-net hospitals (28.3% vs 19.9%; OR, 1.36; 95% CI, 1.11-1.68). Hospitals with higher hospital quality summary scores had significantly better performance on 9 of 10 publicly reported process and outcomes measures compared with hospitals that had lower quality scores (all P ≤ .01 for trend). However, hospitals with the highest quality score of 8 were penalized significantly more frequently than hospitals with the lowest quality score of 0 (67.3% [37/55] vs 12.6% [53/422]; P < .001 for trend). Among hospitals participating in the HAC Reduction Program, hospitals that were penalized more frequently had more quality accreditations, offered advanced services, were major teaching institutions, and had better performance on other process and outcome measures. These paradoxical findings suggest that the approach for assessing hospital penalties in the HAC Reduction Program merits reconsideration to ensure it is achieving the intended goals.

  1. Performance and robustness of penalized and unpenalized methods for genetic prediction of complex human disease.

    PubMed

    Abraham, Gad; Kowalczyk, Adam; Zobel, Justin; Inouye, Michael

    2013-02-01

    A central goal of medical genetics is to accurately predict complex disease from genotypes. Here, we present a comprehensive analysis of simulated and real data using lasso and elastic-net penalized support-vector machine models, a mixed-effects linear model, a polygenic score, and unpenalized logistic regression. In simulation, the sparse penalized models achieved lower false-positive rates and higher precision than the other methods for detecting causal SNPs. The common practice of prefiltering SNP lists for subsequent penalized modeling was examined and shown to substantially reduce the ability to recover the causal SNPs. Using genome-wide SNP profiles across eight complex diseases within cross-validation, lasso and elastic-net models achieved substantially better predictive ability in celiac disease, type 1 diabetes, and Crohn's disease, and had equivalent predictive ability in the rest, with the results in celiac disease strongly replicating between independent datasets. We investigated the effect of linkage disequilibrium on the predictive models, showing that the penalized methods leverage this information to their advantage, compared with methods that assume SNP independence. Our findings show that sparse penalized approaches are robust across different disease architectures, producing phenotype predictions and variance explained as good as or better than those of the other methods. This has fundamental ramifications for the selection and future development of methods to genetically predict human disease. © 2012 WILEY PERIODICALS, INC.
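    The two penalties compared above combine in the elastic net, sketched below (alpha = 1 gives the lasso and alpha = 0 the ridge limit; the values used are illustrative):

```python
def elastic_net_penalty(beta, lam, alpha):
    # lam scales the overall penalty; alpha mixes the sparsity-inducing
    # L1 term with the squared L2 term, which spreads weight across
    # correlated SNPs (linkage disequilibrium) instead of picking one.
    l1 = sum(abs(b) for b in beta)
    l2 = sum(b * b for b in beta)
    return lam * (alpha * l1 + (1.0 - alpha) / 2.0 * l2)
```

    The L2 component is what lets elastic-net models exploit linkage disequilibrium, while the L1 component keeps the selected SNP set sparse.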

  2. Joint Optimization of Fluence Field Modulation and Regularization in Task-Driven Computed Tomography.

    PubMed

    Gang, G J; Siewerdsen, J H; Stayman, J W

    2017-02-11

    This work presents a task-driven joint optimization of fluence field modulation (FFM) and regularization in quadratic penalized-likelihood (PL) reconstruction. Conventional FFM strategies proposed for filtered-backprojection (FBP) are evaluated in the context of PL reconstruction for comparison. We present a task-driven framework that leverages prior knowledge of the patient anatomy and imaging task to identify FFM and regularization. We adopted a maxi-min objective that ensures a minimum level of detectability index (d') across sample locations in the image volume. The FFM designs were parameterized by 2D Gaussian basis functions to reduce dimensionality of the optimization, and basis function coefficients were estimated using the covariance matrix adaptation evolutionary strategy (CMA-ES) algorithm. The FFM was jointly optimized with both space-invariant and spatially-varying regularization strength (β) - the former via an exhaustive search through discrete values and the latter using an alternating optimization where β was exhaustively optimized locally and interpolated to form a spatially-varying map. The optimal FFM inverts as β increases, demonstrating the importance of a joint optimization. For the task and object investigated, the optimal FFM assigns more fluence through less attenuating views, counter to conventional FFM schemes proposed for FBP. The maxi-min objective homogenizes detectability throughout the image and achieves a higher minimum detectability than conventional FFM strategies. The task-driven FFM designs found in this work are counter to conventional patterns for FBP and yield better performance in terms of the maxi-min objective, suggesting opportunities for improved image quality and/or dose reduction when model-based reconstructions are applied in conjunction with FFM.
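    The maxi-min objective can be sketched as a selection rule over candidate designs (the candidates and detectability map below are hypothetical; the actual work optimizes Gaussian-basis FFM coefficients with CMA-ES):

```python
def maximin_select(candidates, detectability_map):
    # Score each candidate design by its worst-case detectability d'
    # across sample locations, then keep the design whose worst case
    # is best -- homogenizing d' rather than maximizing its average.
    return max(candidates, key=lambda c: min(detectability_map(c)))
```

    A design with a modest but uniform d' thus beats one with a high peak and a poor worst-case location, which is why the optimized FFM homogenizes detectability.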

  3. A unified framework for penalized statistical muon tomography reconstruction with edge preservation priors of lp norm type

    NASA Astrophysics Data System (ADS)

    Yu, Baihui; Zhao, Ziran; Wang, Xuewu; Wu, Dufan; Zeng, Zhi; Zeng, Ming; Wang, Yi; Cheng, Jianping

    2016-01-01

    The Tsinghua University MUon Tomography facilitY (TUMUTY) has been built and is used to reconstruct special objects with complex structure. Since fine images are required, the conventional Maximum Likelihood Scattering and Displacement (MLSD) algorithm is employed. However, due to the statistical characteristics of muon tomography and the incompleteness of the data, the reconstruction is often unstable and accompanied by severe noise. In this paper, we propose a Maximum a Posteriori (MAP) algorithm for regularized muon tomography, in which an edge-preserving prior on the scattering-density image is added to the objective function. The prior takes the lp norm (p>0) of the image gradient magnitude, where p=1 and p=2 correspond to the well-known total-variation (TV) and Gaussian priors, respectively. The optimization transfer principle is utilized to minimize the objective function in a unified framework: at each iteration the problem reduces to solving a cubic equation through paraboloidal surrogates. To validate the method, the French Test Object (FTO) is imaged both in numerical simulation and on TUMUTY. The proposed algorithm is used for reconstruction with several norms studied in detail, including l2, l1, l0.5, and an l2-0.5 mixture norm. Compared with the MLSD method, MAP achieves better image quality in both structure preservation and noise reduction. Furthermore, compared with our previous work, in which only a one-dimensional image was acquired, we obtain relatively clear three-dimensional images of the FTO, in which the inner air hole and the tungsten shell are visible.
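
The lp-norm edge-preserving prior on the gradient magnitude can be written down directly. This is an illustrative fragment only: the paper embeds this penalty in a MAP objective minimized by optimization transfer, which is not shown here.

```python
import numpy as np

def lp_gradient_penalty(img, p):
    """R_p(x) = sum |grad x|^p over the image; p=1 is total variation,
    p=2 the Gaussian (quadratic) prior."""
    gx = np.diff(img, axis=0)          # vertical finite differences
    gy = np.diff(img, axis=1)          # horizontal finite differences
    mag = np.sqrt(gx[:, :-1] ** 2 + gy[:-1, :] ** 2)  # gradient magnitude
    return np.sum(mag ** p)

img = np.zeros((8, 8))
img[:, 4:] = 2.0                       # a single vertical edge of height 2
tv = lp_gradient_penalty(img, 1.0)     # TV penalty
quad = lp_gradient_penalty(img, 2.0)   # Gaussian prior
```

The quadratic prior penalizes the tall edge four times as heavily per pixel as TV does, which is why small p preserves edges better.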

  4. Analyzing Association Mapping in Pedigree-Based GWAS Using a Penalized Multitrait Mixed Model

    PubMed Central

    Liu, Jin; Yang, Can; Shi, Xingjie; Li, Cong; Huang, Jian; Zhao, Hongyu; Ma, Shuangge

    2017-01-01

    Genome-wide association studies (GWAS) have led to the identification of many genetic variants associated with complex diseases in the past 10 years. Penalization methods, with significant numerical and statistical advantages, have been extensively adopted in analyzing GWAS. This study has been partly motivated by the analysis of Genetic Analysis Workshop (GAW) 18 data, which have two notable characteristics. First, the subjects are from a small number of pedigrees and hence related. Second, for each subject, multiple correlated traits have been measured. Most of the existing penalization methods assume independence between subjects and traits and can be suboptimal. There are a few methods in the literature based on mixed modeling that can accommodate correlations. However, they cannot fully accommodate the two types of correlations while conducting effective marker selection. In this study, we develop a penalized multitrait mixed modeling approach. It accommodates the two different types of correlations and includes several existing methods as special cases. Effective penalization is adopted for marker selection. Simulation demonstrates its satisfactory performance. The GAW 18 data are analyzed using the proposed method. PMID:27247027

  5. Dose-shaping using targeted sparse optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sayre, George A.; Ruan, Dan

    2013-07-15

    Purpose: Dose volume histograms (DVHs) are common tools in radiation therapy treatment planning to characterize plan quality. As statistical metrics, DVHs provide a compact summary of the underlying plan at the cost of losing spatial information: the same or similar dose-volume histograms can arise from substantially different spatial dose maps. This is exactly the reason why physicians and physicists scrutinize dose maps even after they satisfy all DVH endpoints numerically. However, up to this point, little has been done to control spatial phenomena, such as the spatial distribution of hot spots, which has significant clinical implications. To this end, the authors propose a novel objective function that enables a more direct tradeoff between target coverage, organ-sparing, and planning target volume (PTV) homogeneity, and present our findings from four prostate cases, a pancreas case, and a head-and-neck case to illustrate the advantages and general applicability of our method. Methods: In designing the energy minimization objective (E{sub tot}{sup sparse}), the authors utilized the following robust cost functions: (1) an asymmetric linear well function to allow differential penalties for underdose, relaxation of prescription dose, and overdose in the PTV; (2) a two-piece linear function to heavily penalize high dose and mildly penalize low and intermediate dose in organs-at-risk (OARs); and (3) a total variation energy, i.e., the L{sub 1} norm applied to the first-order approximation of the dose gradient in the PTV. By minimizing a weighted sum of these robust costs, general conformity to dose prescription and dose-gradient prescription is achieved while encouraging prescription violations to follow a Laplace distribution. In contrast, conventional quadratic objectives are associated with a Gaussian distribution of violations, which is less forgiving to large violations of prescription than the Laplace distribution.
As a result, the proposed objective E{sub tot}{sup sparse} improves the tradeoff between planning goals by 'sacrificing' voxels that have already been violated to improve PTV coverage, PTV homogeneity, and/or OAR-sparing. In doing so, overall plan quality is increased since these large violations only arise if a net reduction in E{sub tot}{sup sparse} occurs as a result. For example, large violations to dose prescription in the PTV in E{sub tot}{sup sparse}-optimized plans will naturally localize to voxels in and around PTV-OAR overlaps where OAR-sparing may be increased without compromising target coverage. The authors compared the results of our method and the corresponding clinical plans using analyses of DVH plots, dose maps, and two quantitative metrics that quantify PTV homogeneity and overdose. These metrics do not penalize underdose since E{sub tot}{sup sparse}-optimized plans were planned such that their target coverage was similar to or better than that of the clinical plans. Finally, plan deliverability was assessed with the 2D modulation index. Results: The proposed method was implemented using IBM's CPLEX optimization package (ILOG CPLEX, Sunnyvale, CA) and required 1-4 min to solve with a 12-core Intel i7 processor. In the testing procedure, the authors generated several points on the Pareto surface of four 7-field 6 MV prostate cases, corresponding to different levels of PTV homogeneity and OAR-sparing. The generated results were compared against each other and the clinical plan by analyzing their DVH plots and dose maps. After developing intuition by planning the four prostate cases, which had relatively few tradeoffs, the authors applied our method to a 7-field 6 MV pancreas case and a 9-field 6 MV head-and-neck case to test the potential impact of our method on more challenging cases.
The authors found that our formulation: (1) provided excellent flexibility for balancing OAR-sparing with PTV homogeneity; and (2) permitted the dose planner more control over the evolution of the PTV's spatial dose distribution than conventional objective functions. In particular, E{sub tot}{sup sparse}-optimized plans for the pancreas case and head-and-neck case exhibited substantially improved sparing of the spinal cord and parotid glands, respectively, while maintaining or improving sparing for other OARs and markedly improving PTV homogeneity. Plan deliverability for E{sub tot}{sup sparse}-optimized plans was shown to be better than their associated clinical plans, according to the two-dimensional modulation index. Conclusions: These results suggest that our formulation may be used to improve dose-shaping and OAR-sparing for complicated disease sites, such as the pancreas or head and neck. Furthermore, our objective function and constraints are linear and constitute a linear program, which converges to the global minimum quickly, and can be easily implemented in treatment planning software. Thus, the authors expect fast translation of our method to the clinic where it may have a positive impact on plan quality for challenging disease sites.
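
The two piecewise-linear cost shapes named in the abstract are simple to write down. The weights and breakpoints below are hypothetical, chosen only to illustrate the shapes; they are not the paper's planning parameters.

```python
import numpy as np

def asymmetric_well(dose, lo, hi, w_under, w_over):
    """Asymmetric linear well for the PTV: zero cost inside [lo, hi],
    linear penalties for underdose and overdose with separate weights."""
    return w_under * np.maximum(lo - dose, 0.0) + w_over * np.maximum(dose - hi, 0.0)

def two_piece_linear(dose, knee, w_low, w_high):
    """Two-piece linear OAR penalty: mild slope below the knee dose,
    heavy slope above it."""
    return w_low * np.minimum(dose, knee) + w_high * np.maximum(dose - knee, 0.0)

d = np.array([55.0, 60.0, 70.0])                 # hypothetical voxel doses (Gy)
ptv_pen = asymmetric_well(d, 58.0, 62.0, 2.0, 1.0)
oar_pen = two_piece_linear(d, 45.0, 0.1, 5.0)
```

Because both costs are piecewise linear, a weighted sum of them over voxels can be posed as a linear program, as the conclusion notes.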

  6. Blind beam-hardening correction from Poisson measurements

    NASA Astrophysics Data System (ADS)

    Gu, Renliang; Dogandžić, Aleksandar

    2016-02-01

    We develop a sparse image reconstruction method for Poisson-distributed polychromatic X-ray computed tomography (CT) measurements under the blind scenario where the material of the inspected object and the incident energy spectrum are unknown. We employ our mass-attenuation spectrum parameterization of the noiseless measurements and express the mass-attenuation spectrum as a linear combination of B-spline basis functions of order one. A block coordinate-descent algorithm is developed for constrained minimization of a penalized Poisson negative log-likelihood (NLL) cost function, where constraints and penalty terms ensure nonnegativity of the spline coefficients and nonnegativity and sparsity of the density-map image; the image sparsity is imposed using a convex total-variation (TV) norm penalty term. This algorithm alternates between a Nesterov proximal-gradient (NPG) step for estimating the density-map image and a limited-memory Broyden-Fletcher-Goldfarb-Shanno with box constraints (L-BFGS-B) step for estimating the incident-spectrum parameters. To accelerate convergence of the density-map NPG steps, we apply function restart and a step-size selection scheme that accounts for the varying local Lipschitz constants of the Poisson NLL. Real X-ray CT reconstruction examples demonstrate the performance of the proposed scheme.
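
The inner density-map step can be illustrated by a bare projected-gradient iteration on a Poisson NLL with a nonnegativity constraint. This is a minimal sketch on a toy linear model: the TV penalty, the B-spline spectrum estimation, and the NPG/L-BFGS-B alternation of the actual method are all omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(30, 5))        # toy system matrix
x_true = np.array([1.0, 2.0, 0.5, 3.0, 1.5])
y = rng.poisson(A @ x_true).astype(float)      # Poisson measurements

def poisson_nll(x):
    """Poisson negative log-likelihood (up to a constant in y)."""
    mu = A @ x
    return np.sum(mu - y * np.log(mu))

x = np.ones(5)
nll_start = poisson_nll(x)
step = 0.01
for _ in range(200):
    mu = A @ x
    grad = A.T @ (1.0 - y / mu)                # gradient of the NLL
    x = np.maximum(x - step * grad, 1e-8)      # gradient step + projection
nll_end = poisson_nll(x)
```

Nesterov acceleration, restart, and local-Lipschitz step selection would all modify this basic iteration without changing the objective being minimized.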

  7. A model for nocturnal frost formation on a wing section: Aircraft takeoff performance penalties

    NASA Technical Reports Server (NTRS)

    Dietenberger, M. A.

    1983-01-01

    Nocturnal frost formation on a wing section was investigated to explain the hazard associated with frost during takeoff. A model of nocturnal frost formation on a wing section was developed that predicts when frost will form, as well as its thickness and density as a function of time. The aerodynamic penalties related to the frost-formation properties were analyzed to determine how much takeoff performance would be degraded by a specific frost layer. Using equations representing a steady climbing flight during takeoff, it is determined that a reduction in the maximum gross weight, or a partial frost clearance together with a reduction in the takeoff angle of attack, is needed to neutralize the drag and lift penalties due to frost. Atmospheric conditions that produce the most hazardous frost buildup are identified.

  8. Aggressor-Victim Dissent in Perceived Legitimacy of Aggression in Soccer: The Moderating Role of Situational Background

    ERIC Educational Resources Information Center

    Rascle, Olivier; Traclet, Alan; Souchon, Nicolas; Coulomb-Cabagno, Genevieve; Petrucci, Carrie

    2010-01-01

    The purpose of this study was to investigate the aggressor-victim difference in perceived legitimacy of aggression in soccer as a function of score information (tied, favorable, unfavorable), sporting penalization (no risk, yellow card, red card), and type of aggression (instrumental, hostile). French male soccer players (N = 133) read written…

  9. A powerful and flexible approach to the analysis of RNA sequence count data

    PubMed Central

    Zhou, Yi-Hui; Xia, Kai; Wright, Fred A.

    2011-01-01

    Motivation: A number of penalization and shrinkage approaches have been proposed for the analysis of microarray gene expression data. Similar techniques are now routinely applied to RNA sequence transcriptional count data, although the value of such shrinkage has not been conclusively established. If penalization is desired, the explicit modeling of mean–variance relationships provides a flexible testing regimen that ‘borrows’ information across genes, while easily incorporating design effects and additional covariates. Results: We describe BBSeq, which incorporates two approaches: (i) a simple beta-binomial generalized linear model, which has not been extensively tested for RNA-Seq data and (ii) an extension of an expression mean–variance modeling approach to RNA-Seq data, involving modeling of the overdispersion as a function of the mean. Our approaches are flexible, allowing for general handling of discrete experimental factors and continuous covariates. We report comparisons with other alternate methods to handle RNA-Seq data. Although penalized methods have advantages for very small sample sizes, the beta-binomial generalized linear model, combined with simple outlier detection and testing approaches, appears to have favorable characteristics in power and flexibility. Availability: An R package containing examples and sample datasets is available at http://www.bios.unc.edu/research/genomic_software/BBSeq Contact: yzhou@bios.unc.edu; fwright@bios.unc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21810900
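
The beta-binomial model underlying the first BBSeq approach has a closed-form likelihood that can be written with only the standard library. This is an illustrative fragment, not BBSeq itself; the mean-variance (overdispersion) modeling of the second approach is not shown.

```python
from math import lgamma, exp

def log_beta(a, b):
    """log of the Beta function via log-gamma."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def betabinom_logpmf(k, n, a, b):
    """log P(K = k) for a beta-binomial with shape parameters a, b:
    C(n, k) * B(k + a, n - k + b) / B(a, b)."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + log_beta(k + a, n - k + b) - log_beta(a, b))

# With a == b the distribution is symmetric: P(k) == P(n - k).
p2 = exp(betabinom_logpmf(2, 10, 1.5, 1.5))
p8 = exp(betabinom_logpmf(8, 10, 1.5, 1.5))
```

In a GLM setting the shape parameters are reparameterized in terms of a mean proportion and an overdispersion, with covariates entering through the mean.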

  10. Robust Gaussian Graphical Modeling via l1 Penalization

    PubMed Central

    Sun, Hokeun; Li, Hongzhe

    2012-01-01

    Summary Gaussian graphical models have been widely used as an effective method for studying the conditional independence structure among genes and for constructing genetic networks. However, gene expression data typically have heavier tails or more outlying observations than the standard Gaussian distribution. Such outliers in gene expression data can lead to wrong inference about the dependency structure among the genes. We propose an l1-penalized estimation procedure for sparse Gaussian graphical models that is robustified against possible outliers. The likelihood function is weighted according to how much each observation deviates, where the deviation of an observation is measured by its own likelihood. An efficient computational algorithm based on the coordinate gradient descent method is developed to obtain the minimizer of the negative penalized robustified likelihood, in which the nonzero elements of the concentration matrix represent the graphical links among the genes. After the graphical structure is obtained, we re-estimate the positive definite concentration matrix using an iterative proportional fitting algorithm. Through simulations, we demonstrate that the proposed robust method performs much better than the graphical Lasso for Gaussian graphical models in terms of both graph structure selection and estimation when outliers are present. We apply the robust estimation procedure to an analysis of yeast gene expression data and show that the resulting graph has better biological interpretation than that obtained from the graphical Lasso. PMID:23020775
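
The robustification idea, weighting each observation by its own likelihood so that outliers count less, can be sketched directly. The weighting scheme below is a simplified stand-in for the paper's, and the l1-penalized graph estimation step is omitted entirely.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
X[0] = [10.0, -10.0, 10.0]                  # one gross outlier

mu = X.mean(axis=0)
cov = np.cov(X, rowvar=False)
prec = np.linalg.inv(cov)

# Per-observation Gaussian log-likelihood (up to a constant).
d = X - mu
loglik = -0.5 * np.einsum('ij,jk,ik->i', d, prec, d)
w = np.exp(loglik - loglik.max())           # likelihood-based weights
w /= w.sum()

mu_r = w @ X                                # reweighted mean
cov_r = (w[:, None] * (X - mu_r)).T @ (X - mu_r)  # reweighted covariance
```

The outlier receives a near-zero weight, so the reweighted covariance is close to that of the clean data and no longer inflated.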

  11. Comparison of the efficacies of patching and penalization therapies for the treatment of amblyopia patients

    PubMed Central

    Cabi, Cemalettin; Sayman Muslubas, Isil Bahar; Aydin Oral, Ayse Yesim; Dastan, Metin

    2014-01-01

    AIM To compare the efficacies of patching and penalization therapies for the treatment of amblyopia patients. METHODS The records of 64 eyes of 50 patients 7 to 16y of age who had presented to our clinics with a diagnosis of amblyopia were evaluated retrospectively. Forty eyes of 26 patients who had received patching therapy and 24 eyes of 24 patients who had received penalization therapy were included in this study. The latencies and amplitudes of visual evoked potential (VEP) recordings and the best corrected visual acuities (BCVA) of the two groups were compared before and six months after treatment. RESULTS In both the patching and the penalization groups, visual acuities increased significantly following treatment (P<0.05). The latency measurements of the P100 wave obtained with 1.0° and 15 arc min patterns decreased significantly in both groups following the 6-month treatment, while the amplitude measurements increased (P<0.05). CONCLUSION Patching and penalization, the main methods used in the treatment of amblyopia, were also effective over the age of 7y, which has been accepted as the critical age for the treatment of amblyopia. PMID:24967195

  12. Linear models to perform treaty verification tasks for enhanced information security

    DOE PAGES

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; ...

    2016-11-12

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.

  13. Linear models to perform treaty verification tasks for enhanced information security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.

  14. Linear models to perform treaty verification tasks for enhanced information security

    NASA Astrophysics Data System (ADS)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-02-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.
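
The Hotelling observer referenced in these three records has a compact closed form: the optimal linear weights are w = S⁻¹(mean₁ − mean₀), where S is the average class covariance, and the test statistic t = wᵀg is thresholded. A minimal sketch on toy Gaussian data (all numbers hypothetical; no list-mode data or channelizing matrix):

```python
import numpy as np

rng = np.random.default_rng(2)
n, dim = 500, 4
g0 = rng.normal(0.0, 1.0, size=(n, dim))          # class-0 measurements
g1 = rng.normal(0.8, 1.0, size=(n, dim))          # class-1 measurements

# Hotelling weights: average covariance inverse times the mean difference.
S = 0.5 * (np.cov(g0, rowvar=False) + np.cov(g1, rowvar=False))
w = np.linalg.solve(S, g1.mean(axis=0) - g0.mean(axis=0))

t0, t1 = g0 @ w, g1 @ w                           # test statistics per class
# Separability of the two statistic distributions (detectability index).
dprime = (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t1.var() + t0.var()))
```

The channelized variant would first multiply the data by a channelizing matrix and apply the same construction in the reduced space.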

  15. Towards the stabilization of the low density elements in topology optimization with large deformation

    NASA Astrophysics Data System (ADS)

    Lahuerta, Ricardo Doll; Simões, Eduardo T.; Campello, Eduardo M. B.; Pimenta, Paulo M.; Silva, Emilio C. N.

    2013-10-01

    This work addresses the treatment of low-density regions of structures undergoing large deformations during the design process by the topology optimization method (TOM) based on the finite element method. During the design process, the nonlinear elastic behavior of the structure is based on exact kinematics. The material model applied in the TOM is based on the solid isotropic microstructure with penalization (SIMP) approach. No void elements are deleted, and all internal forces at the nodes surrounding the void elements are considered during the nonlinear equilibrium solution. The distribution of design variables is solved through the method of moving asymptotes, in which the sensitivity of the objective function is obtained directly. In addition, a continuation function and a nonlinear projection function are invoked to obtain a checkerboard-free and mesh-independent design. 2D examples under both the plane-strain and plane-stress hypotheses are presented and compared. The instability problem is overcome by adopting a polyconvex constitutive model in conjunction with a suggested relaxation function to stabilize excessively distorted elements. The exact tangent stiffness matrix is used. The optimal topology results are compared with those obtained using the classical Saint Venant-Kirchhoff constitutive law, and strong differences are found.
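
The SIMP interpolation at the heart of this material model is a one-liner: intermediate densities are penalized so the optimizer is driven toward 0/1 designs. The values E0, Emin, and p = 3 below are typical illustrative choices, not taken from the paper.

```python
import numpy as np

def simp_youngs_modulus(rho, E0=1.0, Emin=1e-9, p=3.0):
    """Penalized Young's modulus E(rho) = Emin + rho^p (E0 - Emin).
    Emin > 0 keeps void elements numerically stable, as in the setup
    above where no void elements are deleted."""
    return Emin + rho ** p * (E0 - Emin)

rho = np.array([0.0, 0.5, 1.0])       # void, intermediate, solid
E = simp_youngs_modulus(rho)
```

With p = 3 an intermediate density of 0.5 yields only one-eighth of the solid stiffness, making "gray" material uneconomical for the optimizer.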

  16. Analysis of Genome-Wide Association Studies with Multiple Outcomes Using Penalization

    PubMed Central

    Liu, Jin; Huang, Jian; Ma, Shuangge

    2012-01-01

    Genome-wide association studies have been extensively conducted, searching for markers for biologically meaningful outcomes and phenotypes. Penalization methods have been adopted in the analysis of the joint effects of a large number of SNPs (single nucleotide polymorphisms) and marker identification. This study is partly motivated by the analysis of heterogeneous stock mice dataset, in which multiple correlated phenotypes and a large number of SNPs are available. Existing penalization methods designed to analyze a single response variable cannot accommodate the correlation among multiple response variables. With multiple response variables sharing the same set of markers, joint modeling is first employed to accommodate the correlation. The group Lasso approach is adopted to select markers associated with all the outcome variables. An efficient computational algorithm is developed. Simulation study and analysis of the heterogeneous stock mice dataset show that the proposed method can outperform existing penalization methods. PMID:23272092
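
The group-Lasso penalty used here for cross-trait marker selection is easy to state: each SNP's coefficients over all traits form one group, penalized by the group's l2 norm, so a SNP is kept or dropped for all traits jointly. A minimal sketch with hypothetical coefficients:

```python
import numpy as np

def group_lasso_penalty(B, lam):
    """Group-Lasso penalty for a coefficient matrix B with one row per
    SNP and one column per trait: lam * sum of row-wise l2 norms."""
    return lam * np.sum(np.linalg.norm(B, axis=1))

B = np.array([[0.0, 0.0, 0.0],     # SNP dropped for all traits
              [0.3, 0.4, 0.0]])    # SNP retained, shared across traits
pen = group_lasso_penalty(B, 2.0)
```

Because the norm of an all-zero row is zero while any nonzero entry activates the whole group, minimizing a loss plus this penalty zeroes out entire SNPs rather than individual SNP-trait pairs.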

  17. Penalized Ordinal Regression Methods for Predicting Stage of Cancer in High-Dimensional Covariate Spaces.

    PubMed

    Gentry, Amanda Elswick; Jackson-Cook, Colleen K; Lyon, Debra E; Archer, Kellie J

    2015-01-01

    The pathological description of the stage of a tumor is an important clinical designation and is considered, like many other forms of biomedical data, an ordinal outcome. Currently, statistical methods for predicting an ordinal outcome using clinical, demographic, and high-dimensional correlated features are lacking. In this paper, we propose a method that fits an ordinal response model to predict an ordinal outcome for high-dimensional covariate spaces. Our method penalizes some covariates (high-throughput genomic features) without penalizing others (such as demographic and/or clinical covariates). We demonstrate the application of our method to predict the stage of breast cancer. In our model, breast cancer subtype is a nonpenalized predictor, and CpG site methylation values from the Illumina Human Methylation 450K assay are penalized predictors. The method has been made available in the ordinalgmifs package in the R programming environment.
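
Ordinal outcome models of this kind are commonly built on the cumulative-logit (proportional odds) form P(Y ≤ j) = sigmoid(αⱼ − xᵀβ) with ordered intercepts. The sketch below illustrates that form with hypothetical numbers; the paper's penalization of the genomic part of β (via the gmifs approach) is not shown.

```python
import numpy as np

def ordinal_probs(x, beta, alphas):
    """Category probabilities for one subject under a cumulative-logit
    model with ordered cut points alphas (one fewer than categories)."""
    eta = x @ beta
    cum = 1.0 / (1.0 + np.exp(-(alphas - eta)))   # P(Y <= j)
    cum = np.append(cum, 1.0)                     # last category closes at 1
    return np.diff(np.concatenate(([0.0], cum)))  # per-category probabilities

alphas = np.array([-1.0, 0.5, 2.0])               # ordered cut points
x = np.array([0.2, -0.4])                         # two covariates
beta = np.array([1.0, 0.5])
p = ordinal_probs(x, beta, alphas)                # probabilities for 4 stages
```

A single β shared across cut points is what encodes the ordering of the stages, in contrast to an unordered multinomial model.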

  18. Penalized Weighted Least-Squares Approach to Sinogram Noise Reduction and Image Reconstruction for Low-Dose X-Ray Computed Tomography

    PubMed Central

    Wang, Jing; Li, Tianfang; Lu, Hongbing; Liang, Zhengrong

    2006-01-01

    Reconstructing low-dose X-ray CT (computed tomography) images is a noise problem. This work investigated a penalized weighted least-squares (PWLS) approach to address this problem in two dimensions, where the WLS considers first- and second-order noise moments and the penalty models signal spatial correlations. Three different implementations were studied for the PWLS minimization. One utilizes a MRF (Markov random field) Gibbs functional to consider spatial correlations among nearby detector bins and projection views in sinogram space and minimizes the PWLS cost function by iterative Gauss-Seidel algorithm. Another employs Karhunen-Loève (KL) transform to de-correlate data signals among nearby views and minimizes the PWLS adaptively to each KL component by analytical calculation, where the spatial correlation among nearby bins is modeled by the same Gibbs functional. The third one models the spatial correlations among image pixels in image domain also by a MRF Gibbs functional and minimizes the PWLS by iterative successive over-relaxation algorithm. In these three implementations, a quadratic functional regularization was chosen for the MRF model. Phantom experiments showed a comparable performance of these three PWLS-based methods in terms of suppressing noise-induced streak artifacts and preserving resolution in the reconstructed images. Computer simulations concurred with the phantom experiments in terms of noise-resolution tradeoff and detectability in low contrast environment. The KL-PWLS implementation may have the advantage in terms of computation for high-resolution dynamic low-dose CT imaging. PMID:17024831
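
The PWLS idea, an inverse-variance-weighted data term plus a quadratic smoothness penalty minimized by Gauss-Seidel sweeps, can be shown in one dimension. This is a toy stand-in for the sinogram-domain methods above: the signal, the noise model, and β are all made up, and no MRF neighborhood beyond nearest neighbors is used.

```python
import numpy as np

rng = np.random.default_rng(3)
truth = np.sin(np.linspace(0, np.pi, 64))
var = 0.05 + 0.1 * truth                 # signal-dependent noise variance
y = truth + rng.normal(0.0, np.sqrt(var))
w_inv = 1.0 / var                        # inverse-variance weights
beta = 5.0                               # smoothness strength

# Gauss-Seidel sweeps for the objective
#   sum_i w_inv[i] (y_i - x_i)^2 + beta * sum_i (x_i - x_{i+1})^2
x = y.copy()
for _ in range(100):
    for i in range(1, len(x) - 1):
        x[i] = (w_inv[i] * y[i] + beta * (x[i - 1] + x[i + 1])) / (w_inv[i] + 2 * beta)

mse_before = np.mean((y - truth) ** 2)
mse_after = np.mean((x - truth) ** 2)
```

Each update is the exact minimizer of the objective in one coordinate with the others held fixed, which is why the sweep converges for this quadratic cost.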

  19. A Penalized Robust Method for Identifying Gene-Environment Interactions

    PubMed Central

    Shi, Xingjie; Liu, Jin; Huang, Jian; Zhou, Yong; Xie, Yang; Ma, Shuangge

    2015-01-01

    In high-throughput studies, an important objective is to identify gene-environment interactions associated with disease outcomes and phenotypes. Many commonly adopted methods assume specific parametric or semiparametric models, which may be subject to model mis-specification. In addition, they usually use significance level as the criterion for selecting important interactions. In this study, we adopt the rank-based estimation, which is much less sensitive to model specification than some of the existing methods and includes several commonly encountered data and models as special cases. Penalization is adopted for the identification of gene-environment interactions. It achieves simultaneous estimation and identification and does not rely on significance level. For computation feasibility, a smoothed rank estimation is further proposed. Simulation shows that under certain scenarios, for example with contaminated or heavy-tailed data, the proposed method can significantly outperform the existing alternatives with more accurate identification. We analyze a lung cancer prognosis study with gene expression measurements under the AFT (accelerated failure time) model. The proposed method identifies interactions different from those using the alternatives. Some of the identified genes have important implications. PMID:24616063

  20. Advanced colorectal neoplasia risk stratification by penalized logistic regression.

    PubMed

    Lin, Yunzhi; Yu, Menggang; Wang, Sijian; Chappell, Richard; Imperiale, Thomas F

    2016-08-01

    Colorectal cancer is the second leading cause of death from cancer in the United States. To facilitate the efficiency of colorectal cancer screening, there is a need to stratify risk for colorectal cancer among the 90% of US residents who are considered "average risk." In this article, we investigate such risk stratification rules for advanced colorectal neoplasia (colorectal cancer and advanced, precancerous polyps). We use a recently completed large cohort study of subjects who underwent a first screening colonoscopy. Logistic regression models have been used in the literature to estimate the risk of advanced colorectal neoplasia based on quantifiable risk factors. However, logistic regression may be prone to overfitting and instability in variable selection. Since most of the risk factors in our study have several categories, it was tempting to collapse these categories into fewer risk groups. We propose a penalized logistic regression method that automatically and simultaneously selects variables, groups categories, and estimates their coefficients by penalizing the [Formula: see text]-norm of both the coefficients and their differences. Hence, it encourages sparsity in the categories, i.e. grouping of the categories, and sparsity in the variables, i.e. variable selection. We apply the penalized logistic regression method to our data. The important variables are selected, with close categories simultaneously grouped, by penalized regression models with and without the interactions terms. The models are validated with 10-fold cross-validation. The receiver operating characteristic curves of the penalized regression models dominate the receiver operating characteristic curve of naive logistic regressions, indicating a superior discriminative performance. © The Author(s) 2013.
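
The penalty described above combines an l1 term on the coefficients with an l1 term on differences between coefficients of adjacent categories of the same ordered risk factor, encouraging both variable selection and category grouping. A sketch of the penalty alone, with hypothetical coefficients (the logistic loss and fitting algorithm are omitted):

```python
import numpy as np

def fused_l1_penalty(beta, lam):
    """l1 norm of the coefficients plus l1 norm of differences between
    adjacent (ordered) category coefficients of one risk factor."""
    return lam * (np.sum(np.abs(beta)) + np.sum(np.abs(np.diff(beta))))

beta_age = np.array([0.0, 0.2, 0.2, 0.7])   # two middle categories fused
pen = fused_l1_penalty(beta_age, 1.0)
```

Exact zeros in the differences merge adjacent categories into one risk group, while exact zeros in the coefficients drop the category (or the variable) entirely.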

  1. Segmentation-free statistical image reconstruction for polyenergetic x-ray computed tomography with experimental validation.

    PubMed

    Elbakri, Idris A; Fessler, Jeffrey A

    2003-08-07

    This paper describes a statistical image reconstruction method for x-ray CT that is based on a physical model that accounts for the polyenergetic x-ray source spectrum and the measurement nonlinearities caused by energy-dependent attenuation. Unlike our earlier work, the proposed algorithm does not require pre-segmentation of the object into the various tissue classes (e.g., bone and soft tissue) and allows mixed pixels. The attenuation coefficient of each voxel is modelled as the product of its unknown density and a weighted sum of energy-dependent mass attenuation coefficients. We formulate a penalized-likelihood function for this polyenergetic model and develop an iterative algorithm for estimating the unknown density of each voxel. Applying this method to simulated x-ray CT measurements of objects containing both bone and soft tissue yields images with significantly reduced beam hardening artefacts relative to conventional beam hardening correction methods. We also apply the method to real data acquired from a phantom containing various concentrations of potassium phosphate solution. The algorithm reconstructs an image with accurate density values for the different concentrations, demonstrating its potential for quantitative CT applications.
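
The polyenergetic measurement model can be sketched for a single ray and a single material: the attenuation is the unknown density times an energy-dependent mass attenuation coefficient, and the detector integrates the transmitted spectrum. The spectrum and coefficients below are made up for illustration; the paper's voxel-wise weighted sums over tissue classes are not shown.

```python
import numpy as np

energies = np.array([40.0, 60.0, 80.0])        # keV (hypothetical bins)
spectrum = np.array([0.2, 0.5, 0.3])           # normalized source spectrum
mass_atten = np.array([0.3, 0.2, 0.15])        # cm^2/g per energy (made up)

def detector_mean(density_line_integral):
    """Expected (normalized) measurement for a ray with the given density
    line integral (g/cm^2), averaged over the source spectrum."""
    return np.sum(spectrum * np.exp(-mass_atten * density_line_integral))

I0 = detector_mean(0.0)                        # unattenuated measurement
I = detector_mean(5.0)                         # attenuated measurement
```

Because low-energy photons attenuate faster, -log(I/I0) is a nonlinear (beam-hardened) function of the density line integral, which is exactly the nonlinearity the penalized-likelihood model accounts for.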

  2. Iterative reconstruction for x-ray computed tomography using prior-image induced nonlocal regularization.

    PubMed

    Zhang, Hua; Huang, Jing; Ma, Jianhua; Bian, Zhaoying; Feng, Qianjin; Lu, Hongbing; Liang, Zhengrong; Chen, Wufan

    2014-09-01

    Repeated X-ray computed tomography (CT) scans are often required in several specific applications, such as perfusion imaging, image-guided needle biopsy, image-guided intervention, and radiotherapy, with noticeable benefits. However, the associated cumulative radiation dose increases significantly in comparison with that of a conventional CT scan, which has raised major concerns for patients. In this study, to realize radiation dose reduction by reducing the X-ray tube current and exposure time (mAs) in repeated CT scans, we propose a prior-image induced nonlocal (PINL) regularization for statistical iterative reconstruction via the penalized weighted least-squares (PWLS) criterion, which we refer to as "PWLS-PINL". Specifically, the PINL regularization utilizes the redundant information in the prior image, and the weighted least-squares term incorporates a data-dependent variance estimate, aiming to improve current low-dose image quality. Subsequently, a modified iterative successive overrelaxation algorithm is adopted to optimize the associated objective function. Experimental results on both phantom and patient data show that the present PWLS-PINL method can achieve promising gains over other existing methods in terms of noise reduction, low-contrast object detection, and edge detail preservation.

  3. Iterative Reconstruction for X-Ray Computed Tomography using Prior-Image Induced Nonlocal Regularization

    PubMed Central

    Ma, Jianhua; Bian, Zhaoying; Feng, Qianjin; Lu, Hongbing; Liang, Zhengrong; Chen, Wufan

    2014-01-01

    Repeated x-ray computed tomography (CT) scans are often required in several specific applications, such as perfusion imaging, image-guided needle biopsy, image-guided intervention, and radiotherapy, with noticeable benefits. However, the associated cumulative radiation dose increases significantly in comparison with that of a conventional CT scan, which has raised major concerns for patients. In this study, to realize radiation dose reduction by reducing the x-ray tube current and exposure time (mAs) in repeated CT scans, we propose a prior-image induced nonlocal (PINL) regularization for statistical iterative reconstruction via the penalized weighted least-squares (PWLS) criterion, which we refer to as “PWLS-PINL”. Specifically, the PINL regularization utilizes the redundant information in the prior image, and the weighted least-squares term incorporates a data-dependent variance estimate, aiming to improve current low-dose image quality. Subsequently, a modified iterative successive over-relaxation algorithm is adopted to optimize the associated objective function. Experimental results on both phantom and patient data show that the present PWLS-PINL method can achieve promising gains over other existing methods in terms of noise reduction, low-contrast object detection and edge detail preservation. PMID:24235272
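    The PWLS criterion above has the generic form: minimize (y - Ax)' W (y - Ax) + β R(x) over the image x. For a quadratic roughness regularizer R(x) = ||Dx||², the minimizer is available in closed form from the normal equations (A'WA + βD'D)x = A'Wy. A small NumPy sketch of this structure (a toy quadratic penalty standing in for the paper's PINL regularizer; all names are illustrative):

```python
import numpy as np

def pwls_quadratic(A, y, w, beta, D):
    """Closed-form minimizer of (y - Ax)' W (y - Ax) + beta * ||D x||^2.

    W = diag(w) carries the data-dependent variance estimates (small
    weight = noisy measurement); the quadratic roughness penalty
    ||Dx||^2 stands in for the regularizer R(x).
    """
    AtW = A.T * w                   # equivalent to A.T @ diag(w)
    H = AtW @ A + beta * (D.T @ D)  # normal-equation matrix A'WA + beta D'D
    return np.linalg.solve(H, AtW @ y)

rng = np.random.default_rng(0)
n, p = 40, 10
A = rng.standard_normal((n, p))                # toy system matrix
x_true = np.repeat([1.0, 3.0], 5)              # piecewise-constant "image"
y = A @ x_true + 0.1 * rng.standard_normal(n)  # noisy measurements
w = np.full(n, 1.0)                            # equal weights in this toy case
D = np.diff(np.eye(p), axis=0)                 # first-difference operator
x_hat = pwls_quadratic(A, y, w, beta=0.5, D=D)
```

    Nonquadratic regularizers such as the nonlocal PINL term have no closed form and require iterative solvers, which is where the modified successive over-relaxation algorithm in the abstract comes in.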

  4. Joint Optimization of Fluence Field Modulation and Regularization in Task-Driven Computed Tomography

    PubMed Central

    Gang, G. J.; Siewerdsen, J. H.; Stayman, J. W.

    2017-01-01

    Purpose This work presents a task-driven joint optimization of fluence field modulation (FFM) and regularization in quadratic penalized-likelihood (PL) reconstruction. Conventional FFM strategies proposed for filtered-backprojection (FBP) are evaluated in the context of PL reconstruction for comparison. Methods We present a task-driven framework that leverages prior knowledge of the patient anatomy and imaging task to identify FFM and regularization. We adopted a maxi-min objective that ensures a minimum level of detectability index (d′) across sample locations in the image volume. The FFM designs were parameterized by 2D Gaussian basis functions to reduce dimensionality of the optimization and basis function coefficients were estimated using the covariance matrix adaptation evolutionary strategy (CMA-ES) algorithm. The FFM was jointly optimized with both space-invariant and spatially-varying regularization strength (β) - the former via an exhaustive search through discrete values and the latter using an alternating optimization where β was exhaustively optimized locally and interpolated to form a spatially-varying map. Results The optimal FFM inverts as β increases, demonstrating the importance of a joint optimization. For the task and object investigated, the optimal FFM assigns more fluence through less attenuating views, counter to conventional FFM schemes proposed for FBP. The maxi-min objective homogenizes detectability throughout the image and achieves a higher minimum detectability than conventional FFM strategies. Conclusions The task-driven FFM designs found in this work are counter to conventional patterns for FBP and yield better performance in terms of the maxi-min objective, suggesting opportunities for improved image quality and/or dose reduction when model-based reconstructions are applied in conjunction with FFM. PMID:28626290

  5. Joint optimization of fluence field modulation and regularization in task-driven computed tomography

    NASA Astrophysics Data System (ADS)

    Gang, G. J.; Siewerdsen, J. H.; Stayman, J. W.

    2017-03-01

    Purpose: This work presents a task-driven joint optimization of fluence field modulation (FFM) and regularization in quadratic penalized-likelihood (PL) reconstruction. Conventional FFM strategies proposed for filtered-backprojection (FBP) are evaluated in the context of PL reconstruction for comparison. Methods: We present a task-driven framework that leverages prior knowledge of the patient anatomy and imaging task to identify FFM and regularization. We adopted a maxi-min objective that ensures a minimum level of detectability index (d') across sample locations in the image volume. The FFM designs were parameterized by 2D Gaussian basis functions to reduce dimensionality of the optimization and basis function coefficients were estimated using the covariance matrix adaptation evolutionary strategy (CMA-ES) algorithm. The FFM was jointly optimized with both space-invariant and spatially-varying regularization strength (β) - the former via an exhaustive search through discrete values and the latter using an alternating optimization where β was exhaustively optimized locally and interpolated to form a spatially-varying map. Results: The optimal FFM inverts as β increases, demonstrating the importance of a joint optimization. For the task and object investigated, the optimal FFM assigns more fluence through less attenuating views, counter to conventional FFM schemes proposed for FBP. The maxi-min objective homogenizes detectability throughout the image and achieves a higher minimum detectability than conventional FFM strategies. Conclusions: The task-driven FFM designs found in this work are counter to conventional patterns for FBP and yield better performance in terms of the maxi-min objective, suggesting opportunities for improved image quality and/or dose reduction when model-based reconstructions are applied in conjunction with FFM.

  6. Simple Penalties on Maximum-Likelihood Estimates of Genetic Parameters to Reduce Sampling Variation

    PubMed Central

    Meyer, Karin

    2016-01-01

    Multivariate estimates of genetic parameters are subject to substantial sampling variation, especially for smaller data sets and more than a few traits. A simple modification of standard, maximum-likelihood procedures for multivariate analyses to estimate genetic covariances is described, which can improve estimates by substantially reducing their sampling variances. This is achieved by maximizing the likelihood subject to a penalty. Borrowing from Bayesian principles, we propose a mild, default penalty—derived assuming a Beta distribution of scale-free functions of the covariance components to be estimated—rather than laboriously attempting to determine the stringency of penalization from the data. An extensive simulation study is presented, demonstrating that such penalties can yield very worthwhile reductions in loss, i.e., the difference from population values, for a wide range of scenarios and without distorting estimates of phenotypic covariances. Moreover, mild default penalties tend not to increase loss in difficult cases and, on average, achieve reductions in loss of similar magnitude to computationally demanding schemes to optimize the degree of penalization. Pertinent details required for the adaptation of standard algorithms to locate the maximum of the likelihood function are outlined. PMID:27317681

  7. Will hypertension performance measures used for pay-for-performance programs penalize those who care for medically complex patients?

    PubMed

    Petersen, Laura A; Woodard, Lechauncy D; Henderson, Louise M; Urech, Tracy H; Pietz, Kenneth

    2009-06-16

    There is concern that performance measures, patient ratings of their care, and pay-for-performance programs may penalize healthcare providers of patients with multiple chronic coexisting conditions. We examined the impact of coexisting conditions on the quality of care for hypertension and patient perception of overall quality of their health care. We classified 141 609 veterans with hypertension into 4 condition groups: those with hypertension-concordant (diabetes mellitus, ischemic heart disease, dyslipidemia) and/or -discordant (arthritis, depression, chronic obstructive pulmonary disease) conditions or neither. We measured blood pressure control at the index visit, overall good quality of care for hypertension, including a follow-up interval, and patient ratings of satisfaction with their care. Associations between condition type and number of coexisting conditions on receipt of overall good quality of care were assessed with logistic regression. The relationship between patient assessment and objective measures of quality was assessed. Of the cohort, 49.5% had concordant-only comorbidities, 8.7% had discordant-only comorbidities, 25.9% had both, and 16.0% had none. Odds of receiving overall good quality after adjustment for age were higher for those with concordant comorbidities (odds ratio, 1.78; 95% confidence interval, 1.70 to 1.87), discordant comorbidities (odds ratio, 1.32; 95% confidence interval, 1.23 to 1.41), or both (odds ratio, 2.25; 95% confidence interval, 2.13 to 2.38) compared with neither. Findings did not change after adjustment for illness severity and/or number of primary care and specialty care visits. Patient assessment of quality did not vary by the presence of coexisting conditions and was not related to objective ratings of quality of care. Contrary to expectations, patients with greater complexity had higher odds of receiving high-quality care for hypertension. Subjective ratings of care did not vary with the presence or absence of comorbid conditions. Our findings should be reassuring to those who care for the most medically complex patients and are concerned that they will be penalized by performance measures or patient ratings of their care.

  8. On testing an unspecified function through a linear mixed effects model with multiple variance components

    PubMed Central

    Wang, Yuanjia; Chen, Huaihou

    2012-01-01

    Summary We examine a generalized F-test of a nonparametric function through penalized splines and a linear mixed effects model representation. With a mixed effects model representation of penalized splines, we imbed the test of an unspecified function into a test of some fixed effects and a variance component in a linear mixed effects model with nuisance variance components under the null. The procedure can be used to test a nonparametric function or varying-coefficient with clustered data, compare two spline functions, test the significance of an unspecified function in an additive model with multiple components, and test a row or a column effect in a two-way analysis of variance model. Through a spectral decomposition of the residual sum of squares, we provide a fast algorithm for computing the null distribution of the test, which significantly improves the computational efficiency over bootstrap. The spectral representation reveals a connection between the likelihood ratio test (LRT) in a multiple variance components model and a single component model. We examine our methods through simulations, where we show that the power of the generalized F-test may be higher than the LRT, depending on the hypothesis of interest and the true model under the alternative. We apply these methods to compute the genome-wide critical value and p-value of a genetic association test in a genome-wide association study (GWAS), where the usual bootstrap is computationally intensive (up to 10^8 simulations) and asymptotic approximation may be unreliable and conservative. PMID:23020801

  9. On testing an unspecified function through a linear mixed effects model with multiple variance components.

    PubMed

    Wang, Yuanjia; Chen, Huaihou

    2012-12-01

    We examine a generalized F-test of a nonparametric function through penalized splines and a linear mixed effects model representation. With a mixed effects model representation of penalized splines, we imbed the test of an unspecified function into a test of some fixed effects and a variance component in a linear mixed effects model with nuisance variance components under the null. The procedure can be used to test a nonparametric function or varying-coefficient with clustered data, compare two spline functions, test the significance of an unspecified function in an additive model with multiple components, and test a row or a column effect in a two-way analysis of variance model. Through a spectral decomposition of the residual sum of squares, we provide a fast algorithm for computing the null distribution of the test, which significantly improves the computational efficiency over bootstrap. The spectral representation reveals a connection between the likelihood ratio test (LRT) in a multiple variance components model and a single component model. We examine our methods through simulations, where we show that the power of the generalized F-test may be higher than the LRT, depending on the hypothesis of interest and the true model under the alternative. We apply these methods to compute the genome-wide critical value and p-value of a genetic association test in a genome-wide association study (GWAS), where the usual bootstrap is computationally intensive (up to 10^8 simulations) and asymptotic approximation may be unreliable and conservative. © 2012, The International Biometric Society.

  10. Trends in stratospheric ozone profiles using functional mixed models

    NASA Astrophysics Data System (ADS)

    Park, A.; Guillas, S.; Petropavlovskikh, I.

    2013-11-01

    This paper is devoted to the modeling of altitude-dependent patterns of ozone variations over time. Umkehr ozone profiles (quarter of Umkehr layer) from 1978 to 2011 are investigated at two locations: Boulder (USA) and Arosa (Switzerland). The study consists of two statistical stages. First we approximate ozone profiles employing an appropriate basis. To capture primary modes of ozone variations without losing essential information, a functional principal component analysis is performed. It penalizes roughness of the function and smooths excessive variations in the shape of the ozone profiles. As a result, data-driven basis functions (empirical basis functions) are obtained. The coefficients (principal component scores) corresponding to the empirical basis functions represent the dominant temporal evolution in the shape of the ozone profiles. We use those time series of coefficients in the second statistical step to reveal the important sources of the patterns and variations in the profiles. We estimate the effects of covariates - month, year (trend), quasi-biennial oscillation, the solar cycle, the Arctic oscillation, the El Niño/Southern Oscillation cycle and the Eliassen-Palm flux - on the principal component scores of ozone profiles using additive mixed effects models. The effects are represented as smooth functions estimated by penalized regression splines. We also impose a heteroscedastic error structure that reflects the observed seasonality in the errors. The more complex error structure enables us to provide more accurate estimates of influences and trends, together with enhanced uncertainty quantification. We are also able to capture fine variations in the time evolution of the profiles, such as the semi-annual oscillation. We conclude by showing the trends by altitude over Boulder and Arosa, as well as for total column ozone. There are great variations in the trends across altitudes, which highlights the benefits of modeling ozone profiles.

  11. Task-Driven Optimization of Fluence Field and Regularization for Model-Based Iterative Reconstruction in Computed Tomography.

    PubMed

    Gang, Grace J; Siewerdsen, Jeffrey H; Stayman, J Webster

    2017-12-01

    This paper presents a joint optimization of dynamic fluence field modulation (FFM) and regularization in quadratic penalized-likelihood reconstruction that maximizes a task-based imaging performance metric. We adopted a task-driven imaging framework for prospective designs of the imaging parameters. A maxi-min objective function was adopted to maximize the minimum detectability index (d') throughout the image. The optimization algorithm alternates between FFM (represented by low-dimensional basis functions) and local regularization (including the regularization strength and directional penalty weights). The task-driven approach was compared with three FFM strategies commonly proposed for FBP reconstruction (as well as a task-driven TCM strategy) for a discrimination task in an abdomen phantom. The task-driven FFM assigned more fluence to less attenuating anteroposterior views and yielded approximately constant fluence behind the object. The optimal regularization was almost uniform throughout the image. Furthermore, the task-driven FFM strategy redistributes fluence across detector elements in order to prescribe more fluence to the more attenuating central region of the phantom. Compared with all strategies, the task-driven FFM strategy not only improved the minimum d' by at least 17.8%, but also yielded higher d' over a large area inside the object. The optimal FFM was highly dependent on the amount of regularization, indicating the importance of a joint optimization. Sample reconstructions of simulated data generally support the performance estimates based on the computed d'. The improvements in detectability show the potential of the task-driven imaging framework to improve imaging performance at a fixed dose, or, equivalently, to provide a similar level of performance at reduced dose.

  12. A Penalized Likelihood Framework For High-Dimensional Phylogenetic Comparative Methods And An Application To New-World Monkeys Brain Evolution.

    PubMed

    Clavel, Julien; Aristide, Leandro; Morlon, Hélène

    2018-06-19

    Working with high-dimensional phylogenetic comparative datasets is challenging because likelihood-based multivariate methods suffer from low statistical performance as the number of traits p approaches the number of species n, and because computational complications occur when p exceeds n. Alternative phylogenetic comparative methods have recently been proposed to deal with the large-p, small-n scenario, but their use and performance are limited. Here we develop a penalized likelihood framework to deal with high-dimensional comparative datasets. We propose various penalizations and methods for selecting the intensity of the penalties. We apply this general framework to the estimation of parameters (the evolutionary trait covariance matrix and parameters of the evolutionary model) and model comparison for the high-dimensional multivariate Brownian (BM), Early-burst (EB), Ornstein-Uhlenbeck (OU) and Pagel's lambda models. We show using simulations that our penalized likelihood approach dramatically improves the estimation of evolutionary trait covariance matrices and model parameters when p approaches n, and allows for their accurate estimation when p equals or exceeds n. In addition, we show that penalized likelihood models can be efficiently compared using the Generalized Information Criterion (GIC). We implement these methods, as well as the related estimation of ancestral states and the computation of phylogenetic PCA, in the R packages RPANDA and mvMORPH. Finally, we illustrate the utility of the newly proposed framework by evaluating evolutionary model fit, analyzing integration patterns, and reconstructing evolutionary trajectories for a high-dimensional 3-D dataset of brain shape in the New World monkeys. We find clear support for an Early-burst model, suggesting an early diversification of brain morphology during the ecological radiation of the clade. Penalized likelihood offers an efficient way to deal with high-dimensional multivariate comparative data.
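    When p approaches or exceeds n, the sample covariance matrix is singular, which is the core difficulty a penalized framework must address. A generic ridge-type remedy, linear shrinkage of the sample covariance toward its diagonal, can be sketched as follows (an illustration of the general idea, not the paper's exact penalized estimator; names are hypothetical):

```python
import numpy as np

def shrunk_covariance(X, gamma):
    """Linear shrinkage of the sample covariance toward its diagonal,
    the archetypal ridge-type penalty for large-p / small-n settings.
    gamma = 0 returns the raw (possibly singular) sample covariance;
    gamma = 1 returns a diagonal, always-invertible target.
    """
    S = np.cov(X, rowvar=False)          # p x p sample covariance
    target = np.diag(np.diag(S))         # diagonal shrinkage target
    return (1.0 - gamma) * S + gamma * target

rng = np.random.default_rng(2)
X = rng.standard_normal((8, 20))         # n = 8 "species", p = 20 "traits"
S_pen = shrunk_covariance(X, gamma=0.5)
# The raw sample covariance has rank at most n - 1 < p and is singular;
# the shrunk estimate is positive definite and safe to invert.
```

    In a likelihood setting, the shrinkage intensity gamma plays the role of the penalty strength and would be chosen by cross-validation or an information criterion such as the GIC mentioned above.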

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gang, G; Siewerdsen, J; Stayman, J

    Purpose: There has been increasing interest in integrating fluence field modulation (FFM) devices with diagnostic CT scanners for dose reduction purposes. Conventional FFM strategies, however, are often either based on heuristics or the analysis of filtered-backprojection (FBP) performance. This work investigates a prospective task-driven optimization of FFM for model-based iterative reconstruction (MBIR) in order to improve imaging performance at the same total dose as conventional strategies. Methods: The task-driven optimization framework utilizes an ultra-low-dose 3D scout as a patient-specific anatomical model and a mathematical formulation of the imaging task. The MBIR method investigated is quadratically penalized-likelihood reconstruction. The FFM objective function uses detectability index, d', computed as a function of the predicted spatial resolution and noise in the image. To optimize performance throughout the object, a maxi-min objective was adopted where the minimum d' over multiple locations is maximized. To reduce the dimensionality of the problem, FFM is parameterized as a linear combination of 2D Gaussian basis functions over horizontal detector pixels and projection angles. The coefficients of these bases are found using the covariance matrix adaptation evolution strategy (CMA-ES) algorithm. The task-driven design was compared with three other strategies proposed for FBP reconstruction for a calcification cluster discrimination task in an abdomen phantom. Results: The task-driven optimization yielded FFM that was significantly different from those designed for FBP. Comparing all four strategies, the task-based design achieved the highest minimum d' with an 8–48% improvement, consistent with the maxi-min objective. In addition, d' was improved to a greater extent over a larger area within the entire phantom. Conclusion: Results from this investigation suggest the need to re-evaluate conventional FFM strategies for MBIR. The task-based optimization framework provides a promising approach that maximizes imaging performance under the same total dose constraint.

  14. mPLR-Loc: an adaptive decision multi-label classifier based on penalized logistic regression for protein subcellular localization prediction.

    PubMed

    Wan, Shibiao; Mak, Man-Wai; Kung, Sun-Yuan

    2015-03-15

    Localization to the appropriate cellular compartments is of paramount importance for proteins to exert their biological functions. Prediction of protein subcellular localization by computational methods is required in the post-genomic era. Recent studies have been focusing on predicting not only single-location proteins but also multi-location proteins. However, most of the existing predictors are far from effective for tackling the challenges of multi-label proteins. This article proposes an efficient multi-label predictor, namely mPLR-Loc, based on penalized logistic regression and adaptive decisions for predicting both single- and multi-location proteins. Specifically, for each query protein, mPLR-Loc exploits the information from the Gene Ontology (GO) database by using its accession number (AC) or the ACs of its homologs obtained via BLAST. The frequencies of GO occurrences are used to construct feature vectors, which are then classified by an adaptive decision-based multi-label penalized logistic regression classifier. Experimental results based on two recent stringent benchmark datasets (virus and plant) show that mPLR-Loc remarkably outperforms existing state-of-the-art multi-label predictors. In addition to being able to rapidly and accurately predict subcellular localization of single- and multi-label proteins, mPLR-Loc can also provide probabilistic confidence scores for the prediction decisions. For readers' convenience, the mPLR-Loc server is available online (http://bioinfo.eie.polyu.edu.hk/mPLRLocServer). Copyright © 2014 Elsevier Inc. All rights reserved.

  15. A Solution to Separation and Multicollinearity in Multiple Logistic Regression

    PubMed Central

    Shen, Jianzhao; Gao, Sujuan

    2010-01-01

    In dementia screening tests, item selection for shortening an existing screening test can be achieved using multiple logistic regression. However, maximum likelihood estimates for such logistic regression models often experience serious bias or even non-existence because of separation and multicollinearity problems resulting from a large number of highly correlated items. Firth (1993, Biometrika, 80(1), 27–38) proposed a penalized likelihood estimator for generalized linear models, which was shown to reduce bias and the non-existence problem. Ridge regression has been used in logistic regression to stabilize the estimates in cases of multicollinearity. However, neither approach alone solves both problems. In this paper, we propose a double penalized maximum likelihood estimator combining Firth’s penalized likelihood equation with a ridge parameter. We present a simulation study evaluating the empirical performance of the double penalized likelihood estimator in small to moderate sample sizes. We demonstrate the proposed approach using current screening data from a community-based dementia study. PMID:20376286

  16. A Solution to Separation and Multicollinearity in Multiple Logistic Regression.

    PubMed

    Shen, Jianzhao; Gao, Sujuan

    2008-10-01

    In dementia screening tests, item selection for shortening an existing screening test can be achieved using multiple logistic regression. However, maximum likelihood estimates for such logistic regression models often experience serious bias or even non-existence because of separation and multicollinearity problems resulting from a large number of highly correlated items. Firth (1993, Biometrika, 80(1), 27-38) proposed a penalized likelihood estimator for generalized linear models, which was shown to reduce bias and the non-existence problem. Ridge regression has been used in logistic regression to stabilize the estimates in cases of multicollinearity. However, neither approach alone solves both problems. In this paper, we propose a double penalized maximum likelihood estimator combining Firth's penalized likelihood equation with a ridge parameter. We present a simulation study evaluating the empirical performance of the double penalized likelihood estimator in small to moderate sample sizes. We demonstrate the proposed approach using current screening data from a community-based dementia study.
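    The ridge component of the double-penalized estimator can be illustrated on its own: subtracting (λ/2)·||β||² from the log-likelihood keeps the Newton updates well defined even under complete separation, where unpenalized maximum likelihood diverges. A minimal NumPy sketch (ridge part only; Firth's bias-reduction term is omitted, and all names are illustrative):

```python
import numpy as np

def ridge_logistic(X, y, lam, n_iter=50):
    """Ridge-penalized logistic regression via Newton's method.

    Maximizes the log-likelihood minus (lam/2) * ||beta||^2; the ridge
    term keeps the Hessian invertible and the estimates finite under
    separation or multicollinearity.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        mu = 1.0 / (1.0 + np.exp(-X @ beta))        # fitted probabilities
        grad = X.T @ (y - mu) - lam * beta          # penalized score
        W = mu * (1.0 - mu)                         # IRLS weights
        H = (X * W[:, None]).T @ X + lam * np.eye(p)
        beta = beta + np.linalg.solve(H, grad)
    return beta

# Completely separated data: the unpenalized MLE does not exist
# (the slope diverges), but the penalized estimates stay finite.
X = np.array([[1.0, -2.0], [1.0, -1.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
beta = ridge_logistic(X, y, lam=1.0)
```

    Firth's correction would additionally add half the log-determinant of the Fisher information to the objective; combining both terms gives the double-penalized estimator the abstract proposes.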

  17. Evaluation of Penalized and Nonpenalized Methods for Disease Prediction with Large-Scale Genetic Data.

    PubMed

    Won, Sungho; Choi, Hosik; Park, Suyeon; Lee, Juyoung; Park, Changyi; Kwon, Sunghoon

    2015-01-01

    Owing to recent improvements in genotyping technology, large-scale genetic data can be utilized to identify disease susceptibility loci, and these successes have substantially improved our understanding of complex diseases. In spite of these successes, however, most of the genetic effects for many complex diseases were found to be very small, which has been a major hurdle to building disease prediction models. Recently, many statistical methods based on penalized regression have been proposed to tackle the so-called "large P and small N" problem. Penalized regressions, including the least absolute shrinkage and selection operator (LASSO) and ridge regression, limit the space of parameters, and this constraint enables the estimation of effects for a very large number of SNPs. Various extensions have been suggested, and, in this report, we compare their accuracy by applying them to several complex diseases. Our results show that penalized regressions are usually robust and provide better accuracy than the existing methods, at least for the diseases under consideration.
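    The LASSO mentioned above is typically fit by cyclic coordinate descent with soft-thresholding, which produces exact zeros and hence a sparse SNP model. A minimal NumPy sketch (illustrative implementation and simulated data, not the methods compared in the paper):

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """LASSO via cyclic coordinate descent with soft-thresholding.

    The L1 penalty sets most coefficients exactly to zero, which is what
    makes penalized regression usable when predictors (SNPs) vastly
    outnumber subjects ("large P and small N").
    """
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

rng = np.random.default_rng(3)
n, p = 100, 10
X = rng.standard_normal((n, p))                    # toy "SNP" predictors
y = 2.0 * X[:, 0] + 0.5 * rng.standard_normal(n)   # only the first matters
beta = lasso_cd(X, y, lam=20.0)                    # strong penalty: sparse fit
```

    Ridge regression replaces the soft-threshold with proportional shrinkage and keeps all coefficients nonzero; the choice between the two is part of what the comparison in the abstract evaluates.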

  18. A penalized framework for distributed lag non-linear models.

    PubMed

    Gasparrini, Antonio; Scheipl, Fabian; Armstrong, Ben; Kenward, Michael G

    2017-09-01

    Distributed lag non-linear models (DLNMs) are a modelling tool for describing potentially non-linear and delayed dependencies. Here, we illustrate an extension of the DLNM framework through the use of penalized splines within generalized additive models (GAM). This extension offers built-in model selection procedures and the possibility of accommodating assumptions on the shape of the lag structure through specific penalties. In addition, this framework includes, as special cases, simpler models previously proposed for linear relationships (DLMs). Alternative versions of penalized DLNMs are compared with each other and with the standard unpenalized version in a simulation study. Results show that this penalized extension to the DLNM class provides greater flexibility and improved inferential properties. The framework exploits recent theoretical developments of GAMs and is implemented using efficient routines within freely available software. Real-data applications are illustrated through two reproducible examples in time series and survival analysis. © 2017 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.

  19. Estimating and Identifying Unspecified Correlation Structure for Longitudinal Data

    PubMed Central

    Hu, Jianhua; Wang, Peng; Qu, Annie

    2014-01-01

    Identifying correlation structure is important to achieving estimation efficiency in analyzing longitudinal data, and is also crucial for drawing valid statistical inference for large size clustered data. In this paper, we propose a nonparametric method to estimate the correlation structure, which is applicable for discrete longitudinal data. We utilize eigenvector-based basis matrices to approximate the inverse of the empirical correlation matrix and determine the number of basis matrices via model selection. A penalized objective function based on the difference between the empirical and model approximation of the correlation matrices is adopted to select an informative structure for the correlation matrix. The eigenvector representation of the correlation estimation is capable of reducing the risk of model misspecification, and also provides useful information on the specific within-cluster correlation pattern of the data. We show that the proposed method possesses the oracle property and selects the true correlation structure consistently. The proposed method is illustrated through simulations and two data examples on air pollution and sonar signal studies. PMID:26361433

  20. Symmetrical and overloaded effect of diffusion in information filtering

    NASA Astrophysics Data System (ADS)

    Zhu, Xuzhen; Tian, Hui; Chen, Guilin; Cai, Shimin

    2017-10-01

    In physical dynamics, mass diffusion theory has been applied to design effective information filtering models on bipartite networks. Previous works unilaterally assume that objects' similarities are determined by a single directional mass diffusion from the collected object to the uncollected one, while inadvertently ignoring the adverse influence of diffusion overload. To some extent, this veils the essence of diffusion in physical dynamics and hurts recommendation accuracy and diversity. After careful investigation, we argue that symmetrical diffusion effectively discloses the essence of mass diffusion, and that high diffusion overload should be penalized. Accordingly, in this paper, we propose a symmetrical and overload-penalized diffusion based model (SOPD), which shows excellent performance in extensive experiments on the benchmark datasets Movielens and Netflix.
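    For reference, the baseline unidirectional mass diffusion that such models build on can be written compactly. This is the generic two-step diffusion (resource flows from objects to users and back) on a toy bipartite network; the symmetric and overload-penalized weighting that defines SOPD itself is not reproduced here.

```python
import numpy as np

# toy user-object bipartite adjacency: A[u, o] = 1 if user u collected object o
A = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1],
              [1, 1, 0, 1]], dtype=float)

k_user = A.sum(axis=1)                 # user degrees
k_obj = A.sum(axis=0)                  # object degrees

# two-step mass diffusion: resource flows object -> users -> objects.
# W[a, b] is the fraction of object b's resource that ends up on object a.
W = (A.T @ (A / k_user[:, None])) / k_obj[None, :]

def recommend(u):
    """Rank uncollected objects for user u by the diffused resource they receive."""
    f = W @ A[u]
    f[A[u] > 0] = -np.inf              # never re-recommend collected objects
    return np.argsort(f)[::-1]

print(recommend(0))
```

    Each column of `W` sums to one, i.e. the diffusion conserves mass; the modifications studied in this line of work reshape `W` (diffusion direction, popularity or overload penalties) while keeping this recommend-by-score structure.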

  1. Compound Identification Using Penalized Linear Regression on Metabolomics

    PubMed Central

    Liu, Ruiqi; Wu, Dongfeng; Zhang, Xiang; Kim, Seongho

    2014-01-01

    Compound identification is often achieved by matching the experimental mass spectra to the mass spectra stored in a reference library based on mass spectral similarity. Because the number of compounds in the reference library is much larger than the range of mass-to-charge ratio (m/z) values, the data are high dimensional and suffer from singularity. For this reason, penalized linear regressions such as ridge regression and the lasso are used instead of ordinary least squares regression. Furthermore, two-step approaches using the dot product and Pearson’s correlation along with the penalized linear regression are proposed in this study. PMID:27212894
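    A schematic version of such a two-step pipeline, shortlist candidates by dot-product (cosine) similarity, then fit a ridge regression on the shortlist, can be sketched on simulated data. Dimensions, compound indices, and the regularization constant below are all illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n_mz, n_lib = 50, 200                    # far more library compounds than m/z bins
X = rng.normal(size=(n_mz, n_lib))       # reference library: one column per compound
y = X[:, 3] * 1.0 + X[:, 17] * 0.6 + rng.normal(scale=0.01, size=n_mz)

# step 1: shortlist by dot-product (cosine) similarity with the query spectrum
cos = (X.T @ y) / (np.linalg.norm(X, axis=0) * np.linalg.norm(y))
shortlist = np.argsort(np.abs(cos))[::-1][:10]

# step 2: ridge regression on the shortlist only (X^T X is singular on the full library)
Xs = X[:, shortlist]
lam = 0.1
coef = np.linalg.solve(Xs.T @ Xs + lam * np.eye(len(shortlist)), Xs.T @ y)

hits = sorted(int(i) for i in shortlist[np.abs(coef) > 0.3])
print(hits)
```

    The shortlist step makes the second-stage regression well conditioned; the penalty term keeps the solution stable even when shortlist columns are correlated.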

  2. Dissecting gene-environment interactions: A penalized robust approach accounting for hierarchical structures.

    PubMed

    Wu, Cen; Jiang, Yu; Ren, Jie; Cui, Yuehua; Ma, Shuangge

    2018-02-10

    Identification of gene-environment (G × E) interactions associated with disease phenotypes has posed a great challenge in high-throughput cancer studies. The existing marginal identification methods have suffered from not being able to accommodate the joint effects of a large number of genetic variants, while some of the joint-effect methods have been limited by failing to respect the "main effects, interactions" hierarchy, by ignoring data contamination, and by using inefficient selection techniques under complex structural sparsity. In this article, we develop an effective penalization approach to identify important G × E interactions and main effects, which can account for the hierarchical structures of the 2 types of effects. Possible data contamination is accommodated by adopting the least absolute deviation loss function. The advantage of the proposed approach over the alternatives is convincingly demonstrated in both simulation and a case study on lung cancer prognosis with gene expression measurements and clinical covariates under the accelerated failure time model. Copyright © 2017 John Wiley & Sons, Ltd.

  3. Dense image registration through MRFs and efficient linear programming.

    PubMed

    Glocker, Ben; Komodakis, Nikos; Tziritas, Georgios; Navab, Nassir; Paragios, Nikos

    2008-12-01

    In this paper, we introduce a novel and efficient approach to dense image registration, which does not require a derivative of the employed cost function. In this context, the registration problem is formulated using a discrete Markov random field objective function. First, toward dimensionality reduction of the variables, we assume that the dense deformation field can be expressed using a small number of control points (a registration grid) and an interpolation strategy. Then, the registration cost is expressed using a discrete sum over image costs (using an arbitrary similarity measure) projected on the control points, and a smoothness term that penalizes local deviations of the deformation field according to a neighborhood system on the grid. Toward a discrete approach, the search space is quantized, resulting in a fully discrete model. In order to account for large deformations and produce results at a high resolution level, a multi-scale incremental approach is considered in which the optimal solution is iteratively updated. This is done through successive morphings of the source towards the target image. Efficient linear programming based on primal-dual principles is used to recover the lowest potential of the cost function. Very promising results using synthetic data with known deformations and real data demonstrate the potential of our approach.

  4. Non-Asymptotic Oracle Inequalities for the High-Dimensional Cox Regression via Lasso.

    PubMed

    Kong, Shengchun; Nan, Bin

    2014-01-01

    We consider finite sample properties of the regularized high-dimensional Cox regression via lasso. Existing literature focuses on linear models or generalized linear models with Lipschitz loss functions, where the empirical risk functions are the summations of independent and identically distributed (iid) losses. The summands in the negative log partial likelihood function for censored survival data, however, are neither iid nor Lipschitz. We first approximate the negative log partial likelihood function by a sum of iid non-Lipschitz terms, then derive the non-asymptotic oracle inequalities for the lasso penalized Cox regression using pointwise arguments to tackle the difficulties caused by lacking iid Lipschitz losses.
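    The estimator whose theory is studied here, lasso-penalized Cox regression, can be sketched numerically with proximal gradient steps on the negative log partial likelihood. This is an illustrative implementation on simulated data (Breslow form, no tie handling), not the paper's theoretical construction; sample sizes, the penalty level, and step size are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 200, 10
beta_true = np.zeros(p)
beta_true[:2] = [1.0, -1.0]
X = rng.normal(size=(n, p))
t_event = rng.exponential(1.0 / np.exp(X @ beta_true))   # event times
t_cens = rng.exponential(2.0, size=n)                    # censoring times
time = np.minimum(t_event, t_cens)
delta = t_event <= t_cens                                # event indicator

def grad_neg_log_plik(beta):
    """Gradient of the (1/n-scaled) negative log partial likelihood, no ties."""
    w = np.exp(X @ beta)
    g = np.zeros(p)
    for i in np.where(delta)[0]:
        risk = time >= time[i]                           # risk set at time[i]
        g -= X[i] - (w[risk] @ X[risk]) / w[risk].sum()
    return g / n

def soft(z, a):
    return np.sign(z) * np.maximum(np.abs(z) - a, 0.0)

beta, lam, step = np.zeros(p), 0.02, 0.5
for _ in range(300):                                     # proximal gradient (ISTA)
    beta = soft(beta - step * grad_neg_log_plik(beta), step * lam)

print(np.round(beta, 2))
```

    The summands inside `grad_neg_log_plik` share the risk sets across observations, which is exactly the lack of independence the paper's iid approximation has to work around.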

  5. Non-Asymptotic Oracle Inequalities for the High-Dimensional Cox Regression via Lasso

    PubMed Central

    Kong, Shengchun; Nan, Bin

    2013-01-01

    We consider finite sample properties of the regularized high-dimensional Cox regression via lasso. Existing literature focuses on linear models or generalized linear models with Lipschitz loss functions, where the empirical risk functions are the summations of independent and identically distributed (iid) losses. The summands in the negative log partial likelihood function for censored survival data, however, are neither iid nor Lipschitz. We first approximate the negative log partial likelihood function by a sum of iid non-Lipschitz terms, then derive the non-asymptotic oracle inequalities for the lasso penalized Cox regression using pointwise arguments to tackle the difficulties caused by lacking iid Lipschitz losses. PMID:24516328

  6. Alignment of cryo-EM movies of individual particles by optimization of image translations.

    PubMed

    Rubinstein, John L; Brubaker, Marcus A

    2015-11-01

    Direct detector device (DDD) cameras have revolutionized single particle electron cryomicroscopy (cryo-EM). In addition to an improved camera detective quantum efficiency, acquisition of DDD movies allows for correction of movement of the specimen, due to both instabilities in the microscope specimen stage and electron beam-induced movement. Unlike specimen stage drift, beam-induced movement is not always homogeneous within an image. Local correlation in the trajectories of nearby particles suggests that beam-induced motion is due to deformation of the ice layer. Algorithms have already been described that can correct movement for large regions of frames and for >1 MDa protein particles. Another algorithm allows individual <1 MDa protein particle trajectories to be estimated, but requires rolling averages to be calculated from frames and fits linear trajectories for particles. Here we describe an algorithm that allows for individual <1 MDa particle images to be aligned without frame averaging or linear trajectories. The algorithm maximizes the overall correlation of the shifted frames with the sum of the shifted frames. The optimum in this single objective function is found efficiently by making use of analytically calculated derivatives of the function. To smooth estimates of particle trajectories, rapid changes in particle positions between frames are penalized in the objective function and weighted averaging of nearby trajectories ensures local correlation in trajectories. This individual particle motion correction, in combination with weighting of Fourier components to account for increasing radiation damage in later frames, can be used to improve 3-D maps from single particle cryo-EM. Copyright © 2015 Elsevier Inc. All rights reserved.
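    The trajectory-smoothing idea, penalizing rapid changes in particle position between frames inside the objective, can be sketched with a quadratic difference penalty, for which the smoothed trajectory has a closed form. The data and penalty weight below are illustrative; the actual algorithm optimizes image correlations rather than a direct least-squares fit to noisy shift estimates.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 30
true_traj = np.cumsum(rng.normal(scale=0.05, size=T))   # slowly drifting shift (pixels)
noisy = true_traj + rng.normal(scale=0.5, size=T)       # noisy per-frame estimates

# minimize ||s - noisy||^2 + lam * ||D s||^2, where D takes first differences:
# penalizing frame-to-frame changes gives a closed-form smoothed trajectory.
D = np.diff(np.eye(T), axis=0)
lam = 20.0
s = np.linalg.solve(np.eye(T) + lam * D.T @ D, noisy)

print(float(np.abs(s - true_traj).mean()), float(np.abs(noisy - true_traj).mean()))
```

    The penalty trades per-frame fidelity for temporal smoothness, the same role the rapid-change penalty plays inside the correlation objective described above.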

  7. Generating Impact Maps from Automatically Detected Bomb Craters in Aerial Wartime Images Using Marked Point Processes

    NASA Astrophysics Data System (ADS)

    Kruse, Christian; Rottensteiner, Franz; Hoberg, Thorsten; Ziems, Marcel; Rebke, Julia; Heipke, Christian

    2018-04-01

    The aftermath of wartime attacks is often felt long after the war has ended, as numerous unexploded bombs may still exist in the ground. Typically, such areas are documented in so-called impact maps, which are based on the detection of bomb craters. This paper proposes a method for the automatic detection of bomb craters in aerial wartime images taken during the Second World War. The object model for the bomb craters is represented by ellipses. A probabilistic approach based on marked point processes determines the most likely configuration of objects within the scene. New object configurations are created by randomly adding objects to and removing them from the current configuration, changing object positions, and modifying the ellipse parameters. Each configuration is evaluated using an energy function. High gradient magnitudes along the border of an ellipse are favored and overlapping ellipses are penalized. Reversible Jump Markov Chain Monte Carlo sampling in combination with simulated annealing provides the global energy optimum, which describes the conformance with a predefined model. For generating the impact map, a probability map is created from the automatic detections via kernel density estimation. By setting a threshold, areas around the detections are classified as contaminated or uncontaminated sites. Our results show the general potential of the method for the automatic detection of bomb craters and the automated generation of an impact map from a heterogeneous image stock.

  8. Indirect Correspondence-Based Robust Extrinsic Calibration of LiDAR and Camera

    PubMed Central

    Sim, Sungdae; Sock, Juil; Kwak, Kiho

    2016-01-01

    LiDAR and cameras have been broadly utilized in computer vision and autonomous vehicle applications. However, in order to convert data between the local coordinate systems, we must estimate the rigid body transformation between the sensors. In this paper, we propose a robust extrinsic calibration algorithm that can be implemented easily and has small calibration error. The extrinsic calibration parameters are estimated by minimizing the distance between corresponding features projected onto the image plane. The features are edge and centerline features on a v-shaped calibration target. The proposed algorithm contributes in two ways to improving the calibration accuracy. First, we weight the distance between a point and a line feature according to the correspondence accuracy of the features. Second, we apply a penalizing function to exclude the influence of outliers in the calibration datasets. Additionally, based on our robust calibration approach for a single LiDAR-camera pair, we introduce a joint calibration that estimates the extrinsic parameters of multiple sensors at once by minimizing one objective function with loop-closing constraints. We conduct several experiments to evaluate the performance of our extrinsic calibration algorithm. The experimental results show that our calibration method performs better than the other approaches. PMID:27338416

  9. An Efficient Augmented Lagrangian Method for Statistical X-Ray CT Image Reconstruction.

    PubMed

    Li, Jiaojiao; Niu, Shanzhou; Huang, Jing; Bian, Zhaoying; Feng, Qianjin; Yu, Gaohang; Liang, Zhengrong; Chen, Wufan; Ma, Jianhua

    2015-01-01

    Statistical iterative reconstruction (SIR) for X-ray computed tomography (CT) under the penalized weighted least-squares criterion can yield significant gains over conventional analytical reconstruction from noisy measurements. However, due to the nonlinear expression of the objective function, most existing algorithms for SIR unavoidably suffer from a heavy computation load and a slow convergence rate, especially when an edge-preserving or sparsity-based penalty or regularization is incorporated. In this work, to address these issues, we propose an adaptive nonmonotone alternating direction algorithm in the framework of the augmented Lagrangian multiplier method, termed "ALM-ANAD". The algorithm effectively combines an alternating direction technique with an adaptive nonmonotone line search to minimize the augmented Lagrangian function at each iteration. To evaluate the present ALM-ANAD algorithm, both qualitative and quantitative studies were conducted using digital and physical phantoms. Experimental results show that the present ALM-ANAD algorithm can achieve noticeable gains over the classical nonlinear conjugate gradient algorithm and the state-of-the-art split Bregman algorithm in terms of noise reduction, contrast-to-noise ratio, convergence rate, and the universal quality index metric.
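    The PWLS criterion itself is easy to state and, with a simple quadratic roughness penalty, even admits a closed-form solve. The toy below, on a random stand-in for the projection matrix, is purely for intuition; the paper's algorithm targets large-scale CT with edge-preserving penalties, where a direct solve like this is infeasible and iterative schemes such as ALM are needed.

```python
import numpy as np

rng = np.random.default_rng(5)
n_rays, n_pix = 120, 40
A = rng.random((n_rays, n_pix))            # toy system matrix (stand-in for projection)
x_true = np.zeros(n_pix)
x_true[10:25] = 1.0                        # piecewise-constant object
var = 0.01 + 0.01 * rng.random(n_rays)     # per-ray noise variance
y = A @ x_true + rng.normal(scale=np.sqrt(var))

W = np.diag(1.0 / var)                     # statistical weights = inverse variances
D = np.diff(np.eye(n_pix), axis=0)         # first-difference roughness penalty
beta = 1.0

# PWLS objective: (y - Ax)^T W (y - Ax) + beta * ||D x||^2  -> linear normal equations
x_hat = np.linalg.solve(A.T @ W @ A + beta * D.T @ D, A.T @ W @ y)
print(round(float(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)), 3))
```

    Replacing the quadratic `||D x||^2` term with an edge-preserving or sparsity-based penalty destroys this closed form, which is precisely why algorithms like the one in the abstract are developed.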

  10. Personalized recommendation based on preferential bidirectional mass diffusion

    NASA Astrophysics Data System (ADS)

    Chen, Guilin; Gao, Tianrun; Zhu, Xuzhen; Tian, Hui; Yang, Zhao

    2017-03-01

    Recommendation system provides a promising way to alleviate the dilemma of information overload. In physical dynamics, mass diffusion has been used to design effective recommendation algorithms on bipartite network. However, most of the previous studies focus overwhelmingly on unidirectional mass diffusion from collected objects to uncollected objects, while overlooking the opposite direction, leading to the risk of similarity estimation deviation and performance degradation. In addition, they are biased towards recommending popular objects which will not necessarily promote the accuracy but make the recommendation lack diversity and novelty that indeed contribute to the vitality of the system. To overcome the aforementioned disadvantages, we propose a preferential bidirectional mass diffusion (PBMD) algorithm by penalizing the weight of popular objects in bidirectional diffusion. Experiments are evaluated on three benchmark datasets (Movielens, Netflix and Amazon) by 10-fold cross validation, and results indicate that PBMD remarkably outperforms the mainstream methods in accuracy, diversity and novelty.

  11. NCAA Penalizes Fewer Teams than Expected

    ERIC Educational Resources Information Center

    Sander, Libby

    2008-01-01

    This article reports that the National Collegiate Athletic Association (NCAA) has penalized fewer teams than it expected this year over athletes' poor academic performance. For years, officials with the NCAA have predicted that strikingly high numbers of college sports teams could be at risk of losing scholarships this year because of their…

  12. The Role of the Environmental Health Specialist in the Penal and Correctional System

    ERIC Educational Resources Information Center

    Walker, Bailus, Jr.; Gordon, Theodore J.

    1976-01-01

    Implementing a health and hygiene program in penal systems necessitates coordinating the entire staff. Health specialists could participate in facility planning and management, policy formation, and evaluation of medical care, housekeeping, and food services. They could also serve as liaisons between correctional staff and governmental or…

  13. The Change Grid and the Active Client: Challenging the Assumptions of Change Agentry in the Penal Process.

    ERIC Educational Resources Information Center

    Klofas, John; Duffee, David E.

    1981-01-01

    Reexamines the assumptions of the change grid regarding the channeling of masses of clients into change strategies programs. Penal organizations specifically select and place clients so that programs remain stable, rather than sequence programs to meet the needs of clients. (Author)

  14. 27 CFR 19.246 - Strengthening bonds.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Strengthening bonds. 19... Strengthening bonds. In all cases when the penal sum of any bond becomes insufficient, the principal shall either give a strengthening bond with the same surety to attain a sufficient penal sum, or give a new...

  15. LINKING LUNG AIRWAY STRUCTURE TO PULMONARY FUNCTION VIA COMPOSITE BRIDGE REGRESSION

    PubMed Central

    Chen, Kun; Hoffman, Eric A.; Seetharaman, Indu; Jiao, Feiran; Lin, Ching-Long; Chan, Kung-Sik

    2017-01-01

    The human lung airway is a complex inverted tree-like structure. Detailed airway measurements can be extracted from MDCT-scanned lung images, such as segmental wall thickness, airway diameter, parent-child branch angles, etc. The wealth of lung airway data provides a unique opportunity for advancing our understanding of the fundamental structure-function relationships within the lung. An important problem is to construct and identify important lung airway features in normal subjects and connect these to standardized pulmonary function test results such as FEV1%. Among other things, the problem is complicated by the fact that a particular airway feature may be an important (relevant) predictor only when it pertains to segments of certain generations. Thus, the key is an efficient, consistent method for simultaneously conducting group selection (lung airway feature types) and within-group variable selection (airway generations), i.e., bi-level selection. Here we streamline a comprehensive procedure to process the lung airway data via imputation, normalization, transformation and groupwise principal component analysis, and then adopt a new composite penalized regression approach for conducting bi-level feature selection. As a prototype of composite penalization, the proposed composite bridge regression method is shown to admit an efficient algorithm, enjoy bi-level oracle properties, and outperform several existing methods. We analyze the MDCT lung image data from a cohort of 132 subjects with normal lung function. Our results show that lung function in terms of FEV1% is promoted by having a less dense and more homogeneous lung comprising an airway whose segments enjoy more heterogeneity in wall thicknesses, larger mean diameters, lumen areas and branch angles. These data hold the potential of defining more accurately the “normal” subject population with borderline atypical lung functions that are clearly influenced by many genetic and environmental factors. PMID:28280520

  16. In comparative perspective: The effects of incarceration abroad on penal subjectivity among prisoners in Lithuania

    PubMed Central

    Slade, Gavin; Vaičiūnienė, Rūta

    2017-01-01

    This article looks at how global flows of people and policies affect penal subjectivity among prisoners in Lithuania. Those who had previously been incarcerated abroad perceive their punishment in Lithuania’s reforming penal system in comparative terms. We find that international prison experience may either diminish or increase the sense of the severity of the current punishment. Respondents often felt more comfortable in a familiar culture of punishment in Lithuania that emphasizes autonomy and communality. Moreover, internationalized prisoners perceive prison reform emulating West European models as a threat to this culture and are able to articulate comparative critiques of this reform and contest its effects. PMID:29568238

  17. SU-F-18C-14: Hessian-Based Norm Penalty for Weighted Least-Square CBCT Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, T; Sun, N; Tan, S

    Purpose: To develop a Hessian-based norm penalty for cone-beam CT (CBCT) reconstruction that has a similar ability in suppressing noise as the total variation (TV) penalty while avoiding the staircase effect and better preserving low-contrast objects. Methods: We extended the TV penalty to a Hessian-based norm penalty based on the Frobenius norm of the Hessian matrix of an image for CBCT reconstruction. The objective function was constructed using the penalized weighted least-square (PWLS) principle. An effective algorithm was developed to minimize the objective function using a majorization-minimization (MM) approach. We evaluated and compared the proposed penalty with the TV penalty on a CatPhan 600 phantom and an anthropomorphic head phantom, each acquired at a low-dose protocol (10mA/10ms) and a high-dose protocol (80mA/12ms). For both penalties, the contrast-to-noise ratio (CNR) in four low-contrast regions-of-interest (ROIs) and the full-width-at-half-maximum (FWHM) of two point-like objects in reconstructed images were calculated and compared. Results: In the experiment with the CatPhan 600 phantom, the Hessian-based norm penalty has slightly higher CNRs and approximately equivalent FWHM values compared with the TV penalty. In the experiment with the anthropomorphic head phantom at the low-dose protocol, the TV penalty result has several artificial piece-wise constant areas known as the staircase effect, while with the Hessian-based norm penalty the image appears smoother and more similar to the FDK result using the high-dose protocol. Conclusion: The proposed Hessian-based norm penalty has a similar performance in suppressing noise to the TV penalty, but has a potential advantage in suppressing the staircase effect and preserving low-contrast objects. This work was supported in part by the National Natural Science Foundation of China (NNSFC), under Grant Nos. 60971112 and 61375018, and the Fundamental Research Funds for the Central Universities, under Grant No. 2012QN086.

  18. Robust learning for optimal treatment decision with NP-dimensionality

    PubMed Central

    Shi, Chengchun; Song, Rui; Lu, Wenbin

    2016-01-01

    In order to identify important variables involved in making optimal treatment decisions, Lu, Zhang and Zeng (2013) proposed a penalized least-squares regression framework for a fixed number of predictors, which is robust against misspecification of the conditional mean model. Two problems arise: (i) in a world of explosively big data, effective methods are needed to handle ultra-high dimensional data sets in which, for example, the dimension of the predictors is of non-polynomial (NP) order in the sample size; (ii) both the propensity score and conditional mean models need to be estimated from data under NP dimensionality. In this paper, we propose a robust procedure for estimating the optimal treatment regime under NP dimensionality. In both steps, penalized regressions are employed with a non-concave penalty function, where the conditional mean model of the response given the predictors may be misspecified. The asymptotic properties, such as weak oracle properties, selection consistency and oracle distributions, of the proposed estimators are investigated. In addition, we study the limiting distribution of the estimated value function for the obtained optimal treatment regime. The empirical performance of the proposed estimation method is evaluated by simulations and an application to a depression dataset from the STAR*D study. PMID:28781717

  19. Bias correction in the hierarchical likelihood approach to the analysis of multivariate survival data.

    PubMed

    Jeon, Jihyoun; Hsu, Li; Gorfine, Malka

    2012-07-01

    Frailty models are useful for measuring unobserved heterogeneity in risk of failures across clusters, providing cluster-specific risk prediction. In a frailty model, the latent frailties shared by members within a cluster are assumed to act multiplicatively on the hazard function. In order to obtain parameter and frailty variate estimates, we consider the hierarchical likelihood (H-likelihood) approach (Ha, Lee and Song, 2001. Hierarchical-likelihood approach for frailty models. Biometrika 88, 233-243) in which the latent frailties are treated as "parameters" and estimated jointly with other parameters of interest. We find that the H-likelihood estimators perform well when the censoring rate is low; however, they are substantially biased when the censoring rate is moderate to high. In this paper, we propose a simple and easy-to-implement bias correction method for the H-likelihood estimators under a shared frailty model. We also extend the method to a multivariate frailty model, which incorporates complex dependence structure within clusters. We conduct an extensive simulation study and show that the proposed approach performs very well for censoring rates as high as 80%. We also illustrate the method with a breast cancer data set. Since the H-likelihood is the same as the penalized likelihood function, the proposed bias correction method is also applicable to the penalized likelihood estimators.

  20. Unconditionally energy stable time stepping scheme for Cahn–Morral equation: Application to multi-component spinodal decomposition and optimal space tiling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tavakoli, Rouhollah, E-mail: rtavakoli@sharif.ir

    An unconditionally energy stable time stepping scheme is introduced to solve Cahn–Morral-like equations in the present study. It is constructed based on the combination of David Eyre's time stepping scheme and the Schur complement approach. Although the presented method is general and independent of the choice of the homogeneous free energy density function term, logarithmic and polynomial energy functions are specifically considered in this paper. The method is applied to study spinodal decomposition in multi-component systems and optimal space tiling problems. A penalization strategy is developed, in the case of the latter problem, to avoid trivial solutions. Extensive numerical experiments demonstrate the success and performance of the presented method. According to the numerical results, the method is convergent and energy stable, independent of the choice of time stepsize. Its MATLAB implementation is included in the appendix for the numerical evaluation of the algorithm and reproduction of the presented results. -- Highlights: •Extension of Eyre's convex–concave splitting scheme to multiphase systems. •Efficient solution of spinodal decomposition in multi-component systems. •Efficient solution of least perimeter periodic space partitioning problem. •Developing a penalization strategy to avoid trivial solutions. •Presentation of MATLAB implementation of the introduced algorithm.
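    A linearly stabilized scheme in the spirit of Eyre's convex–concave splitting can be demonstrated on the scalar (two-component) special case: the 1D Cahn–Hilliard equation with periodic boundary conditions and a polynomial free energy, solved spectrally. The stabilization constant, grid, and step size below are illustrative assumptions; this is not the paper's Schur-complement multi-component scheme.

```python
import numpy as np

# 1D Cahn-Hilliard u_t = Lap(u^3 - u - eps^2 * Lap u), periodic domain, spectral in x.
# Linearly stabilized first-order splitting: -eps^2*Lap and a stabilizing s*u are
# treated implicitly, the remaining (concave-shifted) part explicitly.
N, Lx, eps, dt, s = 128, 2 * np.pi, 0.1, 0.05, 2.0
k = 2 * np.pi * np.fft.fftfreq(N, d=Lx / N)
k2 = k ** 2

rng = np.random.default_rng(6)
u = 0.05 * rng.standard_normal(N)          # small perturbation around u = 0
mass0 = u.mean()                           # total mass is conserved by the flow

def energy(u):
    ux = np.fft.ifft(1j * k * np.fft.fft(u)).real
    return float(np.sum(0.25 * (u**2 - 1) ** 2 + 0.5 * eps**2 * ux**2) * (Lx / N))

energies = [energy(u)]
for _ in range(200):
    u_hat = np.fft.fft(u)
    expl = np.fft.fft(u**3 - (1.0 + s) * u)                 # explicit part
    u_hat = (u_hat - dt * k2 * expl) / (1.0 + dt * k2 * (s + eps**2 * k2))
    u = np.fft.ifft(u_hat).real
    energies.append(energy(u))

print(energies[0], energies[-1])
```

    The uniform state is spinodally unstable, so the field separates into phases near ±1 while the discrete free energy decays, the behavior the unconditional energy stability of such splittings is designed to guarantee.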

  1. Setting the Standard. International Forum on Education in Penal Systems Conference Proceedings (Adelaide, Australia, April 5-7, 1998).

    ERIC Educational Resources Information Center

    Semmens, Bob, Ed.; Cook, Sandy, Ed.

    This document contains 19 papers presented at an international forum on education in penal systems. The following papers are included: "Burning" (Craig W.J. Minogue); "The Acquisition of Cognitive Skills as a Means of Recidivism Reduction: A Former Prisoner's Perspective" (Trevor Darryl Doherty); "CEA (Correctional…

  2. 27 CFR 24.153 - Strengthening bonds.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2012-04-01 2012-04-01 false Strengthening bonds. 24... Strengthening bonds. In any instance where the penal sum of the bond on file becomes insufficient, the principal shall either give a strengthening bond with the same surety to attain a sufficient penal sum or give a...

  3. 27 CFR 24.153 - Strengthening bonds.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2011-04-01 2011-04-01 false Strengthening bonds. 24... Strengthening bonds. In any instance where the penal sum of the bond on file becomes insufficient, the principal shall either give a strengthening bond with the same surety to attain a sufficient penal sum or give a...

  4. 27 CFR 24.153 - Strengthening bonds.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Strengthening bonds. 24... Strengthening bonds. In any instance where the penal sum of the bond on file becomes insufficient, the principal shall either give a strengthening bond with the same surety to attain a sufficient penal sum or give a...

  5. 27 CFR 24.153 - Strengthening bonds.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2013-04-01 2013-04-01 false Strengthening bonds. 24... Strengthening bonds. In any instance where the penal sum of the bond on file becomes insufficient, the principal shall either give a strengthening bond with the same surety to attain a sufficient penal sum or give a...

  6. 27 CFR 24.153 - Strengthening bonds.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2014-04-01 2014-04-01 false Strengthening bonds. 24... Strengthening bonds. In any instance where the penal sum of the bond on file becomes insufficient, the principal shall either give a strengthening bond with the same surety to attain a sufficient penal sum or give a...

  7. Crime and Punishment: Are Copyright Violators Ever Penalized?

    ERIC Educational Resources Information Center

    Russell, Carrie

    2004-01-01

    Is there a Web site that keeps track of copyright infringers and fines? Some colleagues don't believe that copyright violators are ever penalized. This question was asked by a reader in a question-and-answer column of "School Library Journal". Carrie Russell is the American Library Association's copyright specialist, and she will answer selected…

  8. STRONG ORACLE OPTIMALITY OF FOLDED CONCAVE PENALIZED ESTIMATION.

    PubMed

    Fan, Jianqing; Xue, Lingzhou; Zou, Hui

    2014-06-01

    Folded concave penalization methods have been shown to enjoy the strong oracle property for high-dimensional sparse estimation. However, a folded concave penalization problem usually has multiple local solutions and the oracle property is established only for one of the unknown local solutions. A challenging fundamental issue still remains that it is not clear whether the local optimum computed by a given optimization algorithm possesses those nice theoretical properties. To close this important theoretical gap in over a decade, we provide a unified theory to show explicitly how to obtain the oracle solution via the local linear approximation algorithm. For a folded concave penalized estimation problem, we show that as long as the problem is localizable and the oracle estimator is well behaved, we can obtain the oracle estimator by using the one-step local linear approximation. In addition, once the oracle estimator is obtained, the local linear approximation algorithm converges, namely it produces the same estimator in the next iteration. The general theory is demonstrated by using four classical sparse estimation problems, i.e., sparse linear regression, sparse logistic regression, sparse precision matrix estimation and sparse quantile regression.
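    The one-step LLA recipe can be sketched directly: start from a lasso initializer, reweight the L1 penalty by the SCAD derivative evaluated at the initial estimates, and solve the resulting weighted lasso. The ISTA solver, simulated data, and tuning constants below are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(7)
n, p = 200, 50
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]
X = rng.normal(size=(n, p))
y = X @ beta_true + rng.normal(size=n)

lam, a = 0.2, 3.7

def scad_deriv(t):
    """SCAD penalty derivative p'_lam(t), with the customary a = 3.7."""
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1.0))

def weighted_lasso(w, iters=500):
    """ISTA for (1/2n)||y - Xb||^2 + sum_j w_j |b_j|."""
    b = np.zeros(p)
    step = 1.0 / np.linalg.eigvalsh(X.T @ X / n).max()
    for _ in range(iters):
        z = b - step * X.T @ (X @ b - y) / n
        b = np.sign(z) * np.maximum(np.abs(z) - step * w, 0.0)
    return b

b_init = weighted_lasso(np.full(p, lam))             # plain lasso initializer
b_lla = weighted_lasso(scad_deriv(np.abs(b_init)))   # one LLA step
print(np.round(b_lla[:5], 2))
```

    Coefficients the initializer already estimates as large receive zero weight (no shrinkage) after the LLA step, which is how the folded concave penalty removes the lasso's bias on strong signals.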

  9. STRONG ORACLE OPTIMALITY OF FOLDED CONCAVE PENALIZED ESTIMATION

    PubMed Central

    Fan, Jianqing; Xue, Lingzhou; Zou, Hui

    2014-01-01

    Folded concave penalization methods have been shown to enjoy the strong oracle property for high-dimensional sparse estimation. However, a folded concave penalization problem usually has multiple local solutions and the oracle property is established only for one of the unknown local solutions. A challenging fundamental issue still remains that it is not clear whether the local optimum computed by a given optimization algorithm possesses those nice theoretical properties. To close this important theoretical gap in over a decade, we provide a unified theory to show explicitly how to obtain the oracle solution via the local linear approximation algorithm. For a folded concave penalized estimation problem, we show that as long as the problem is localizable and the oracle estimator is well behaved, we can obtain the oracle estimator by using the one-step local linear approximation. In addition, once the oracle estimator is obtained, the local linear approximation algorithm converges, namely it produces the same estimator in the next iteration. The general theory is demonstrated by using four classical sparse estimation problems, i.e., sparse linear regression, sparse logistic regression, sparse precision matrix estimation and sparse quantile regression. PMID:25598560

  10. Functional linear models to test for differences in prairie wetland hydraulic gradients

    USGS Publications Warehouse

    Greenwood, Mark C.; Sojda, Richard S.; Preston, Todd M.; Swayne, David A.; Yang, Wanhong; Voinov, A.A.; Rizzoli, A.; Filatova, T.

    2010-01-01

    Functional data analysis provides a framework for analyzing multiple time series measured frequently in time, treating each series as a continuous function of time. Functional linear models are used to test for effects on hydraulic gradient functional responses collected from three types of land use in Northeastern Montana at fourteen locations. Penalized regression-splines are used to estimate the underlying continuous functions based on the discretely recorded (over time) gradient measurements. Permutation methods are used to assess the statistical significance of effects. A method for accommodating missing observations in each time series is described. Hydraulic gradients may be an initial and fundamental ecosystem process that responds to climate change. We suggest other potential uses of these methods for detecting evidence of climate change.
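The penalized regression splines used above can be illustrated with a tiny ridge-penalized truncated-power basis. This is a generic sketch (the basis, knot placement, and penalty are our choices), not the authors' hydraulic-gradient code.

```python
import numpy as np

def pspline_fit(t, y, knots, lam):
    """Penalized regression spline: cubic truncated-power basis with a
    ridge penalty on the knot coefficients only (a common simple choice)."""
    # design matrix: [1, t, t^2, t^3, (t - k)_+^3 for each knot]
    B = np.column_stack([t**d for d in range(4)] +
                        [np.clip(t - k, 0, None)**3 for k in knots])
    D = np.diag([0.0] * 4 + [1.0] * len(knots))   # penalize only knot terms
    coef = np.linalg.solve(B.T @ B + lam * D, B.T @ y)
    return B, coef

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 120)
y = np.sin(2 * np.pi * t) + 0.2 * rng.standard_normal(t.size)
knots = np.linspace(0.1, 0.9, 9)
B, coef = pspline_fit(t, y, knots, lam=1e-4)
fitted = B @ coef
```

Because the basis is evaluated only at observed times, gaps in a series are accommodated naturally: rows for missing time points simply never enter the fit.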

  11. LBP-based penalized weighted least-squares approach to low-dose cone-beam computed tomography reconstruction

    NASA Astrophysics Data System (ADS)

    Ma, Ming; Wang, Huafeng; Liu, Yan; Zhang, Hao; Gu, Xianfeng; Liang, Zhengrong

    2014-03-01

Cone-beam computed tomography (CBCT) has attracted growing interest among researchers in image reconstruction. In practical applications of CBCT, the mAs level of the X-ray tube current is lowered in order to reduce the CBCT dose. Lowering the X-ray tube current, however, degrades image quality. Thus, low-dose CBCT image reconstruction is in effect a noise problem. To acquire clinically acceptable image quality while keeping the X-ray tube current as low as achievable, several penalized weighted least-squares (PWLS)-based image reconstruction algorithms have been developed. One representative strategy in previous work is to model the prior information for solution regularization using an anisotropic penalty term. To enhance edge preservation and noise suppression at a finer scale, a novel algorithm combining the local binary pattern (LBP) with penalized weighted least-squares (PWLS), called the LBP-PWLS-based image reconstruction algorithm, is proposed in this work. After the LBP is used to classify the region around each voxel as spot, flat, or edge, the proposed LBP-PWLS-based algorithm adaptively encourages strong diffusion in local spot/flat regions and less diffusion in edge/corner regions by adjusting the penalty in the cost function. The LBP-PWLS-based reconstruction algorithm was evaluated using sinogram data acquired by a clinical CT scanner from the CatPhan® 600 phantom. Experimental results on the noise-resolution tradeoff measurement and other quantitative measurements demonstrated its feasibility and effectiveness in edge preservation and noise suppression in comparison with a previous PWLS reconstruction algorithm.
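The abstract does not give its exact LBP variant, but the classic 8-neighbour code is a minimal sketch of the idea: threshold each neighbour at the centre value and pack the bits into a region descriptor.

```python
import numpy as np

def lbp_3x3(img):
    """Classic 8-neighbour local binary pattern: threshold each neighbour
    at the centre pixel's value and pack the bits into a code in [0, 255]."""
    c = img[1:-1, 1:-1]                       # interior centres
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (di, dj) in enumerate(offs):
        nb = img[1 + di:img.shape[0] - 1 + di, 1 + dj:img.shape[1] - 1 + dj]
        code |= (nb >= c).astype(np.uint8) << bit
    return code
```

A uniform region yields the all-ones code, while an isolated bright voxel yields code 0, so the codes separate flat, spot, and edge neighbourhoods as the penalty-adjustment step requires.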

  12. Estimation and model selection of semiparametric multivariate survival functions under general censorship.

    PubMed

    Chen, Xiaohong; Fan, Yanqin; Pouzo, Demian; Ying, Zhiliang

    2010-07-01

We study estimation and model selection of semiparametric models of multivariate survival functions for censored data, which are characterized by possibly misspecified parametric copulas and nonparametric marginal survivals. We obtain the consistency and root-n asymptotic normality of a two-step copula estimator to the pseudo-true copula parameter value according to KLIC, and provide a simple consistent estimator of its asymptotic variance, allowing for a first-step nonparametric estimation of the marginal survivals. We establish the asymptotic distribution of the penalized pseudo-likelihood ratio statistic for comparing multiple semiparametric multivariate survival functions subject to copula misspecification and general censorship. An empirical application is provided.
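The two-step estimator has a simple uncensored-data analogue that can be sketched: nonparametric margins via rescaled ranks, then maximization of a parametric copula pseudo-likelihood. The Clayton family, the grid search, and the absence of censoring are our simplifications of the paper's much more general setting.

```python
import numpy as np

def clayton_loglik(u, v, theta):
    """Log pseudo-likelihood of the Clayton copula at pseudo-observations (u, v)."""
    s = u**(-theta) + v**(-theta) - 1.0
    return (np.log1p(theta) - (theta + 1) * (np.log(u) + np.log(v))
            - (2 + 1 / theta) * np.log(s)).sum()

def two_step_clayton(x, y, grid=np.linspace(0.1, 8.0, 80)):
    """Step 1: margins via rescaled ranks. Step 2: grid-maximize the
    copula pseudo-likelihood over the dependence parameter theta."""
    n = len(x)
    u = (np.argsort(np.argsort(x)) + 1.0) / (n + 1.0)
    v = (np.argsort(np.argsort(y)) + 1.0) / (n + 1.0)
    ll = [clayton_loglik(u, v, th) for th in grid]
    return grid[int(np.argmax(ll))]

# simulate Clayton(theta = 2) data by conditional inversion
rng = np.random.default_rng(2)
theta = 2.0
u = rng.uniform(size=2000)
w = rng.uniform(size=2000)
v = ((w**(-theta / (1 + theta)) - 1) * u**(-theta) + 1) ** (-1 / theta)
theta_hat = two_step_clayton(u, v)
```

With censored data the rank-based margins would be replaced by, e.g., Kaplan-Meier survival estimates, which is the step the paper's theory accommodates.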

  13. Estimation and model selection of semiparametric multivariate survival functions under general censorship

    PubMed Central

    Chen, Xiaohong; Fan, Yanqin; Pouzo, Demian; Ying, Zhiliang

    2013-01-01

    We study estimation and model selection of semiparametric models of multivariate survival functions for censored data, which are characterized by possibly misspecified parametric copulas and nonparametric marginal survivals. We obtain the consistency and root-n asymptotic normality of a two-step copula estimator to the pseudo-true copula parameter value according to KLIC, and provide a simple consistent estimator of its asymptotic variance, allowing for a first-step nonparametric estimation of the marginal survivals. We establish the asymptotic distribution of the penalized pseudo-likelihood ratio statistic for comparing multiple semiparametric multivariate survival functions subject to copula misspecification and general censorship. An empirical application is provided. PMID:24790286

  14. 36 CFR 1200.16 - Will I be penalized for misusing the official seals and logos?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... misusing the official seals and logos? 1200.16 Section 1200.16 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION GENERAL RULES OFFICIAL SEALS Penalties for Misuse of NARA Seals and Logos § 1200.16 Will I be penalized for misusing the official seals and logos? (a) Seals. (1) If you...

  15. 36 CFR 1200.16 - Will I be penalized for misusing the official seals and logos?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... misusing the official seals and logos? 1200.16 Section 1200.16 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION GENERAL RULES OFFICIAL SEALS Penalties for Misuse of NARA Seals and Logos § 1200.16 Will I be penalized for misusing the official seals and logos? (a) Seals. (1) If you...

  16. 27 CFR 24.148 - Penal sums of bonds.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Penal sums of bonds. 24.148 Section 24.148 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE BUREAU... Vinegar Plant Bond, TTB F 5510.2 Not less than the tax on all wine on hand, in transit, or unaccounted for...

  17. [Assisted reproduction and artificial insemination and genetic manipulation in the Criminal Code of the Federal District, Mexico].

    PubMed

    Brena Sesma, Ingrid

    2004-01-01

The purpose of this article is to outline and comment on the recent amendments to the Penal Code for the Federal District of México which establish, for the first time, crimes related to artificial procreation and genetic manipulation. It also examines the interaction of the new legal texts with the country's health legislation. As will be shown, the penal and the health regulations conflict in some cases, and some points concerning the lawfulness or unlawfulness of certain conduct remain insufficiently developed. These gaps will complicate the application of the new rules of the Penal Code of the Federal District.

  18. Madness and crime: Zefinha, the longest confined woman in Brazil.

    PubMed

    Diniz, Debora; Brito, Luciana

    2016-01-01

Living in a forensic hospital for the last 38 years, Josefa da Silva is the longest-confined woman surviving the penal and psychiatric regime in Brazil. This paper analyses her dossier, judicial proceedings, interviews, and photographs. The psychiatric report is the key component of the medical and penal doubling of criminal insanity. Twelve psychiatric reports illustrate three time frames of the court files: abnormality, danger, and abandonment. The psychiatric authority over confinement has moved from discipline to security, and from disciplinary security to social assistance. In the arrangement between the penal and psychiatric powers, the judge recognizes the medical authority over the truth of insanity. It is the medical rationale for Zefinha's internment that has altered over the decades.

  19. Sinogram restoration in computed tomography with an edge-preserving penalty

    PubMed Central

    Little, Kevin J.; La Rivière, Patrick J.

    2015-01-01

    Purpose: With the goal of producing a less computationally intensive alternative to fully iterative penalized-likelihood image reconstruction, our group has explored the use of penalized-likelihood sinogram restoration for transmission tomography. Previously, we have exclusively used a quadratic penalty in our restoration objective function. However, a quadratic penalty does not excel at preserving edges while reducing noise. Here, we derive a restoration update equation for nonquadratic penalties. Additionally, we perform a feasibility study to extend our sinogram restoration method to a helical cone-beam geometry and clinical data. Methods: A restoration update equation for nonquadratic penalties is derived using separable parabolic surrogates (SPS). A method for calculating sinogram degradation coefficients for a helical cone-beam geometry is proposed. Using simulated data, sinogram restorations are performed using both a quadratic penalty and the edge-preserving Huber penalty. After sinogram restoration, Fourier-based analytical methods are used to obtain reconstructions, and resolution-noise trade-offs are investigated. For the fan-beam geometry, a comparison is made to image-domain SPS reconstruction using the Huber penalty. The effects of varying object size and contrast are also investigated. For the helical cone-beam geometry, we investigate the effect of helical pitch (axial movement/rotation). Huber-penalty sinogram restoration is performed on 3D clinical data, and the reconstructed images are compared to those generated with no restoration. Results: We find that by applying the edge-preserving Huber penalty to our sinogram restoration methods, the reconstructed image has a better resolution-noise relationship than an image produced using a quadratic penalty in the sinogram restoration. 
However, we find that this relatively straightforward approach to edge preservation in the sinogram domain is affected by the physical size of imaged objects in addition to the contrast across the edge. This presents some disadvantages of this method relative to image-domain edge-preserving methods, although the computational burden of the sinogram-domain approach is much lower. For a helical cone-beam geometry, we found applying sinogram restoration in 3D was reasonable and that pitch did not make a significant difference in the general effect of sinogram restoration. The application of Huber-penalty sinogram restoration to clinical data resulted in a reconstruction with less noise while retaining resolution. Conclusions: Sinogram restoration with the Huber penalty is able to provide better resolution-noise performance than restoration with a quadratic penalty. Additionally, sinogram restoration with the Huber penalty is feasible for helical cone-beam CT and can be applied to clinical data. PMID:25735286
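The role of the Huber penalty, quadratic for small differences and linear across large ones, can be seen in a tiny 1-D penalized denoising sketch. This is our toy stand-in for the idea, not the paper's sinogram-restoration update equations.

```python
import numpy as np

def huber_grad(t, delta):
    """Derivative of the Huber penalty: linear near zero, clipped in the tails."""
    return np.clip(t, -delta, delta)

def denoise_huber(y, beta=1.0, delta=0.2, step=0.15, n_iter=600):
    """Minimise 0.5*||x - y||^2 + beta * sum_i Huber(x[i+1] - x[i])
    by gradient descent; large jumps incur only a bounded gradient."""
    x = y.copy()
    for _ in range(n_iter):
        d = np.diff(x)
        hg = huber_grad(d, delta)
        g = x - y
        g[1:] += beta * hg     # d/dx[i] of Huber(x[i] - x[i-1])
        g[:-1] -= beta * hg    # d/dx[i] of Huber(x[i+1] - x[i])
        x = x - step * g
    return x

rng = np.random.default_rng(3)
truth = np.r_[np.zeros(50), 2.0 * np.ones(50)]    # one sharp edge
y = truth + 0.1 * rng.standard_normal(100)
x = denoise_huber(y)
```

Because the penalty gradient saturates at `beta * delta`, the edge is pulled in by only a bounded amount, whereas a quadratic penalty would blur it in proportion to its height.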

  20. Sinogram restoration in computed tomography with an edge-preserving penalty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Little, Kevin J., E-mail: little@uchicago.edu; La Rivière, Patrick J.

    2015-03-15

Purpose: With the goal of producing a less computationally intensive alternative to fully iterative penalized-likelihood image reconstruction, our group has explored the use of penalized-likelihood sinogram restoration for transmission tomography. Previously, we have exclusively used a quadratic penalty in our restoration objective function. However, a quadratic penalty does not excel at preserving edges while reducing noise. Here, we derive a restoration update equation for nonquadratic penalties. Additionally, we perform a feasibility study to extend our sinogram restoration method to a helical cone-beam geometry and clinical data. Methods: A restoration update equation for nonquadratic penalties is derived using separable parabolic surrogates (SPS). A method for calculating sinogram degradation coefficients for a helical cone-beam geometry is proposed. Using simulated data, sinogram restorations are performed using both a quadratic penalty and the edge-preserving Huber penalty. After sinogram restoration, Fourier-based analytical methods are used to obtain reconstructions, and resolution-noise trade-offs are investigated. For the fan-beam geometry, a comparison is made to image-domain SPS reconstruction using the Huber penalty. The effects of varying object size and contrast are also investigated. For the helical cone-beam geometry, we investigate the effect of helical pitch (axial movement/rotation). Huber-penalty sinogram restoration is performed on 3D clinical data, and the reconstructed images are compared to those generated with no restoration. Results: We find that by applying the edge-preserving Huber penalty to our sinogram restoration methods, the reconstructed image has a better resolution-noise relationship than an image produced using a quadratic penalty in the sinogram restoration.
However, we find that this relatively straightforward approach to edge preservation in the sinogram domain is affected by the physical size of imaged objects in addition to the contrast across the edge. This presents some disadvantages of this method relative to image-domain edge-preserving methods, although the computational burden of the sinogram-domain approach is much lower. For a helical cone-beam geometry, we found applying sinogram restoration in 3D was reasonable and that pitch did not make a significant difference in the general effect of sinogram restoration. The application of Huber-penalty sinogram restoration to clinical data resulted in a reconstruction with less noise while retaining resolution. Conclusions: Sinogram restoration with the Huber penalty is able to provide better resolution-noise performance than restoration with a quadratic penalty. Additionally, sinogram restoration with the Huber penalty is feasible for helical cone-beam CT and can be applied to clinical data.

  1. Some Easy ’t’-Statistics.

    DTIC Science & Technology

    1981-11-01

criterion if C(Fn) is nonconstant. The class of such functionals is a very broad one because different investigative aims may require different... balancing act. We want highly skewed, heavy-tailed confidence procedures to be noticed, yet, if this undesirable behavior occurs only quite...occasionally, we do not wish to penalize an otherwise sound confidence procedure. In light of the first part of the balance, we could be overly risky if we

  2. A Dataset and a Technique for Generalized Nuclear Segmentation for Computational Pathology.

    PubMed

    Kumar, Neeraj; Verma, Ruchika; Sharma, Sanuj; Bhargava, Surabhi; Vahadane, Abhishek; Sethi, Amit

    2017-07-01

    Nuclear segmentation in digital microscopic tissue images can enable extraction of high-quality features for nuclear morphometrics and other analysis in computational pathology. Conventional image processing techniques, such as Otsu thresholding and watershed segmentation, do not work effectively on challenging cases, such as chromatin-sparse and crowded nuclei. In contrast, machine learning-based segmentation can generalize across various nuclear appearances. However, training machine learning algorithms requires data sets of images, in which a vast number of nuclei have been annotated. Publicly accessible and annotated data sets, along with widely agreed upon metrics to compare techniques, have catalyzed tremendous innovation and progress on other image classification problems, particularly in object recognition. Inspired by their success, we introduce a large publicly accessible data set of hematoxylin and eosin (H&E)-stained tissue images with more than 21000 painstakingly annotated nuclear boundaries, whose quality was validated by a medical doctor. Because our data set is taken from multiple hospitals and includes a diversity of nuclear appearances from several patients, disease states, and organs, techniques trained on it are likely to generalize well and work right out-of-the-box on other H&E-stained images. We also propose a new metric to evaluate nuclear segmentation results that penalizes object- and pixel-level errors in a unified manner, unlike previous metrics that penalize only one type of error. We also propose a segmentation technique based on deep learning that lays a special emphasis on identifying the nuclear boundaries, including those between the touching or overlapping nuclei, and works well on a diverse set of test images.
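The idea of penalizing object- and pixel-level errors in one score can be sketched as an aggregated-Jaccard style metric over instance label maps. This is our simplified reading of that idea, not necessarily the paper's exact metric definition.

```python
import numpy as np

def aggregated_jaccard(gt, pred):
    """For each ground-truth nucleus, take its best-overlapping predicted
    nucleus; sum intersections over sum of unions, and add every unmatched
    predicted object's area to the denominator (penalizing false positives)."""
    inter_sum, union_sum, used = 0, 0, set()
    for g in np.unique(gt[gt > 0]):
        gmask = gt == g
        cand = np.unique(pred[gmask]); cand = cand[cand > 0]
        best, bi, bu, best_id = 0.0, 0, gmask.sum(), None
        for p in cand:
            pmask = pred == p
            i = np.logical_and(gmask, pmask).sum()
            u = np.logical_or(gmask, pmask).sum()
            if i / u > best:
                best, bi, bu, best_id = i / u, i, u, p
        inter_sum += bi; union_sum += bu
        if best_id is not None:
            used.add(best_id)
    for p in np.unique(pred[pred > 0]):   # unmatched predictions
        if p not in used:
            union_sum += (pred == p).sum()
    return inter_sum / union_sum
```

A perfect instance segmentation scores 1.0; a missed or spurious nucleus lowers the score through the object-level bookkeeping, while sloppy boundaries lower it through the pixel-level intersections and unions.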

  3. Variable selection for zero-inflated and overdispersed data with application to health care demand in Germany

    PubMed Central

Wang, Zhu; Ma, Shuangge; Wang, Ching-Yun

    2017-01-01

In health services and outcome research, count outcomes are frequently encountered and often have a large proportion of zeros. The zero-inflated negative binomial (ZINB) regression model has important applications for this type of data. With many possible candidate risk factors, this paper proposes new variable selection methods for the ZINB model. We consider the maximum likelihood function plus a penalty, including the least absolute shrinkage and selection operator (LASSO), smoothly clipped absolute deviation (SCAD) and minimax concave penalty (MCP). An EM (expectation-maximization) algorithm is proposed for estimating the model parameters and conducting variable selection simultaneously. This algorithm consists of estimating penalized weighted negative binomial models and penalized logistic models via the coordinate descent algorithm. Furthermore, statistical properties including the standard error formulae are provided. A simulation study shows that the new algorithm not only has more accurate or at least comparable estimation, but is also more robust than traditional stepwise variable selection. The proposed methods are applied to analyze the health care demand in Germany using the open-source R package mpath. PMID:26059498
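The three penalties named above differ mainly in their univariate thresholding rules, which have standard closed forms for the scalar Gaussian-noise problem. The sketch below is generic (and independent of the paper's EM details):

```python
import numpy as np

def soft(z, lam):
    """LASSO: soft thresholding; constant bias lam remains for large |z|."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def mcp_threshold(z, lam, gamma=3.0):
    """MCP (gamma > 1): firm shrinkage, exactly unbiased for |z| > gamma*lam."""
    az = np.abs(z)
    return np.where(az <= gamma * lam,
                    gamma / (gamma - 1.0) * soft(z, lam), z)

def scad_threshold(z, lam, a=3.7):
    """SCAD (a > 2): soft near zero, blended mid-range, identity far out."""
    az = np.abs(z)
    mid = ((a - 1.0) * z - np.sign(z) * a * lam) / (a - 2.0)
    return np.where(az <= 2 * lam, soft(z, lam),
                    np.where(az <= a * lam, mid, z))
```

All three agree near zero (sparsity), but SCAD and MCP return the input unchanged for large arguments, which is the bias reduction that motivates them over the LASSO.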

  4. Reduction of Metal Artifact in Single Photon-Counting Computed Tomography by Spectral-Driven Iterative Reconstruction Technique

    PubMed Central

    Nasirudin, Radin A.; Mei, Kai; Panchev, Petar; Fehringer, Andreas; Pfeiffer, Franz; Rummeny, Ernst J.; Fiebich, Martin; Noël, Peter B.

    2015-01-01

    Purpose The exciting prospect of Spectral CT (SCT) using photon-counting detectors (PCD) will lead to new techniques in computed tomography (CT) that take advantage of the additional spectral information provided. We introduce a method to reduce metal artifact in X-ray tomography by incorporating knowledge obtained from SCT into a statistical iterative reconstruction scheme. We call our method Spectral-driven Iterative Reconstruction (SPIR). Method The proposed algorithm consists of two main components: material decomposition and penalized maximum likelihood iterative reconstruction. In this study, the spectral data acquisitions with an energy-resolving PCD were simulated using a Monte-Carlo simulator based on EGSnrc C++ class library. A jaw phantom with a dental implant made of gold was used as an object in this study. A total of three dental implant shapes were simulated separately to test the influence of prior knowledge on the overall performance of the algorithm. The generated projection data was first decomposed into three basis functions: photoelectric absorption, Compton scattering and attenuation of gold. A pseudo-monochromatic sinogram was calculated and used as input in the reconstruction, while the spatial information of the gold implant was used as a prior. The results from the algorithm were assessed and benchmarked with state-of-the-art reconstruction methods. Results Decomposition results illustrate that gold implant of any shape can be distinguished from other components of the phantom. Additionally, the result from the penalized maximum likelihood iterative reconstruction shows that artifacts are significantly reduced in SPIR reconstructed slices in comparison to other known techniques, while at the same time details around the implant are preserved. Quantitatively, the SPIR algorithm best reflects the true attenuation value in comparison to other algorithms. 
Conclusion It is demonstrated that the combination of the additional information from Spectral CT and statistical reconstruction can significantly improve image quality, especially streaking artifacts caused by the presence of materials with high atomic numbers. PMID:25955019

  5. Statistical CT noise reduction with multiscale decomposition and penalized weighted least squares in the projection domain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang Shaojie; Tang Xiangyang; School of Automation, Xi'an University of Posts and Telecommunications, Xi'an, Shaanxi 710121

    2012-09-15

Purposes: The suppression of noise in x-ray computed tomography (CT) imaging is of clinical relevance for diagnostic image quality and the potential for radiation dose saving. Toward this purpose, statistical noise reduction methods in either the image or projection domain have been proposed, which employ a multiscale decomposition to enhance the performance of noise suppression while maintaining image sharpness. Recognizing the advantages of noise suppression in the projection domain, the authors propose a projection domain multiscale penalized weighted least squares (PWLS) method, in which the angular sampling rate is explicitly taken into consideration to account for the possible variation of interview sampling rate in advanced clinical or preclinical applications. Methods: The projection domain multiscale PWLS method is derived by converting an isotropic diffusion partial differential equation in the image domain into the projection domain, wherein a multiscale decomposition is carried out. With adoption of the Markov random field or soft thresholding objective function, the projection domain multiscale PWLS method deals with noise at each scale. To compensate for the degradation in image sharpness caused by the projection domain multiscale PWLS method, an edge enhancement is carried out following the noise reduction. The performance of the proposed method is experimentally evaluated and verified using the projection data simulated by computer and acquired by a CT scanner. Results: The preliminary results show that the proposed projection domain multiscale PWLS method outperforms the projection domain single-scale PWLS method and the image domain multiscale anisotropic diffusion method in noise reduction. In addition, the proposed method can preserve image sharpness very well while the occurrence of 'salt-and-pepper' noise and mosaic artifacts can be avoided.
Conclusions: Since the interview sampling rate is taken into account in the projection domain multiscale decomposition, the proposed method is anticipated to be useful in advanced clinical and preclinical applications where the interview sampling rate varies.

  6. Reduced rank regression via adaptive nuclear norm penalization

    PubMed Central

    Chen, Kun; Dong, Hongbo; Chan, Kung-Sik

    2014-01-01

    Summary We propose an adaptive nuclear norm penalization approach for low-rank matrix approximation, and use it to develop a new reduced rank estimation method for high-dimensional multivariate regression. The adaptive nuclear norm is defined as the weighted sum of the singular values of the matrix, and it is generally non-convex under the natural restriction that the weight decreases with the singular value. However, we show that the proposed non-convex penalized regression method has a global optimal solution obtained from an adaptively soft-thresholded singular value decomposition. The method is computationally efficient, and the resulting solution path is continuous. The rank consistency of and prediction/estimation performance bounds for the estimator are established for a high-dimensional asymptotic regime. Simulation studies and an application in genetics demonstrate its efficacy. PMID:25045172
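The global solution via an adaptively soft-thresholded SVD is easy to sketch. Weights of the form d^(-gamma) are one natural choice consistent with "the weight decreases with the singular value"; the constants and data below are ours.

```python
import numpy as np

def adaptive_svt(C, lam, gamma=2.0, eps=1e-12):
    """Adaptively soft-threshold singular values with weights w_i = d_i^(-gamma):
    large (signal) singular values are barely penalized, small (noise) ones
    are removed. The global optimum costs one SVD plus thresholding."""
    U, d, Vt = np.linalg.svd(C, full_matrices=False)
    w = (d + eps) ** (-gamma)
    d_new = np.maximum(d - lam * w, 0.0)
    return (U * d_new) @ Vt        # rescale U's columns, then recompose

rng = np.random.default_rng(4)
Q1, _ = np.linalg.qr(rng.standard_normal((20, 2)))
Q2, _ = np.linalg.qr(rng.standard_normal((20, 2)))
M = Q1 @ np.diag([5.0, 3.0]) @ Q2.T            # exact rank-2 signal
C = M + 0.05 * rng.standard_normal((20, 20))   # noisy observation
M_hat = adaptive_svt(C, lam=2.0)
```

With a flat weight this reduces to ordinary nuclear-norm soft thresholding; the adaptive weights recover the low rank while shrinking the retained singular values far less.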

  7. Joint contact forces can be reduced by improving joint moment symmetry in below-knee amputee gait simulations.

    PubMed

    Koelewijn, Anne D; van den Bogert, Antonie J

    2016-09-01

    Despite having a fully functional knee and hip in both legs, asymmetries in joint moments of the knee and hip are often seen in gait of persons with a unilateral transtibial amputation (TTA), possibly resulting in excessive joint loading. We hypothesize that persons with a TTA can walk with more symmetric joint moments at the cost of increased effort or abnormal kinematics. The hypothesis was tested using predictive simulations of gait. Open loop controls of one gait cycle were found by solving an optimization problem that minimizes a combination of walking effort and tracking error in joint angles, ground reaction force and gait cycle duration. A second objective was added to penalize joint moment asymmetry, creating a multi-objective optimization problem. A Pareto front was constructed by changing the weights of the objectives and three solutions were analyzed to study the effect of increasing joint moment symmetry. When the optimization placed more weight on moment symmetry, walking effort increased and kinematics became less normal, confirming the hypothesis. TTA gait improved with a moderate increase in joint moment symmetry. At a small cost of effort and abnormal kinematics, the peak hip extension moment in the intact leg was decreased significantly, and so was the joint contact force in the knee and hip. Additional symmetry required a significant increase in walking effort and the joint contact forces in both hips became significantly higher than in able-bodied gait. Copyright © 2016 Elsevier B.V. All rights reserved.
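The multi-objective construction above, sweeping the weight between an effort-like and a symmetry-like objective and collecting the optima, can be sketched on a toy scalar problem. The two quadratics are purely illustrative stand-ins for the gait objectives.

```python
import numpy as np

def pareto_front(weights, solve):
    """Sweep scalarisation weights to trace a Pareto front for two objectives."""
    return np.array([solve(w) for w in weights])

def solve(w, xs=np.linspace(0.0, 1.0, 201)):
    """Toy problem: x is a 'gait adjustment'; effort grows with x,
    asymmetry shrinks with x. Minimise the weighted sum over a grid."""
    f1 = xs**2              # effort / tracking cost
    f2 = (1.0 - xs)**2      # joint-moment asymmetry
    x = xs[np.argmin(w * f1 + (1 - w) * f2)]
    return x**2, (1 - x)**2

front = pareto_front(np.linspace(0.05, 0.95, 10), solve)
```

Reading along the front shows the trade-off reported in the study: buying more symmetry always costs more effort, and the analyst picks an operating point rather than a single optimum.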

  8. Instruction No. 3/1988, 1 June 1988.

    PubMed

    1988-01-01

This Instruction contains provisions relating to the injury and maltreatment of children and women and the enforcement of support payments to women and children. With respect to the maltreatment of children, it provides that the Spanish Office of the Public Prosecutor shall a) curb child abuse; b) take decisive action to protect minors who have been abused or are helpless and to execute the new responsibilities imposed by Law 21/1987 of November 11, which modifies the Civil Code with respect to adoption; and c) collect the necessary data with regard to felonies or misdemeanors that involve child abuse so as to allow for the compilation of annual statistics which may identify the magnitude of this problem. With respect to the maltreatment of women, the prosecutor shall a) curb the abuse of women, investigating those cases where a lack of evidence exists because women are afraid; and b) collect the necessary data with regard to felonies or misdemeanors that involve the abuse of women so as to allow for the compilation of annual statistics which may identify the social reality of this problem. With respect to support payments, the Instruction urges all prosecutors to monitor the execution of support payments using all judicial means authorized, including penal actions when applicable, or enforcement mechanisms which have been mentioned in this memo, that is, conditioning visitation rights on timely payment of the support established for food and education of the children. In Circular No. 32 of 15 April 1988, the National Directorate of Police is called on to provide all necessary police assistance to women who have been the object of illegitimate acts of force so that they can report such acts and to inform women of laws designed to help them and the means of availing themselves of these laws. See Anuario de Derecho Penal y Ciencias Penales, Vol. 41, No. 3, September-December 1988, p. 978. full text

  9. A penal problem: the increasing incidence of implantation of penile foreign bodies.

    PubMed

    Flynn, Ryan M; Mostafa, Hesham I; Khan, Omar A; Haselhuhn, Gregory D; Jain, Samay

    2014-12-01

    Our objective is to describe a novel presentation of subcutaneous penile insertion of foreign bodies. This is a practice performed globally and mostly has been reported outside of the United States. We present three cases of incarcerated males that implanted sculpted dominos into the penile subcutaneous tissue. The patients presented with erosion of the foreign bodies through the skin without evidence of infection. We believe that insertion of foreign bodies into penile subcutaneous tissue by incarcerated American males for sexual enhancement is more widespread than previously reported. Erosion is a novel presentation.

  10. [Adapting the law to offer better protection to female victims of violence].

    PubMed

    Durand, Édouard

    2014-11-01

    As society has become more aware of the seriousness and the extent of domestic violence, the law has been adapted in order to offer female victims better protection. These legislative changes are recent and still meet with some resistance. The act of the 9th of July 2010 modified penal and civil laws to take better account of the specificities of the mechanisms of domestic abuse and create appropriate tools. The law about real equality between women and men, approved by the National Assembly on the 23rd of July 2014, is in line with this same objective.

  11. Fuel Efficient Strategies for Reducing Contrail Formations in United States Air Space

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Chen, Neil Y.; Ng, Hok K.

    2010-01-01

This paper describes a class of strategies for reducing persistent contrail formation in the United States airspace. The primary objective is to minimize potential contrail formation regions by altering the aircraft's cruising altitude in a fuel-efficient way. The results show that the contrail formations can be reduced significantly without extra fuel consumption and without adversely affecting congestion in the airspace. The contrail formations can be further reduced by using extra fuel. For the day tested, the maximal reduction strategy has a 53% contrail reduction rate. The most fuel-efficient strategy has an 8% reduction rate with 2.86% less fuel burnt compared to the maximal reduction strategy. Using a cost function which penalizes extra fuel consumed while maximizing the amount of contrail reduction provides a flexible way to trade off between contrail reduction and fuel consumption. It can achieve a 35% contrail reduction rate with only 0.23% extra fuel consumption. The proposed fuel-efficient contrail reduction strategy provides a solution to reduce aviation-induced environmental impact on a daily basis.

  12. Competitive repetition suppression (CoRe) clustering: a biologically inspired learning model with application to robust clustering.

    PubMed

    Bacciu, Davide; Starita, Antonina

    2008-11-01

Determining a compact neural coding for a set of input stimuli is an issue that encompasses several biological memory mechanisms as well as various artificial neural network models. In particular, establishing the optimal network structure is still an open problem when dealing with unsupervised learning models. In this paper, we introduce a novel learning algorithm, named competitive repetition-suppression (CoRe) learning, inspired by a cortical memory mechanism called repetition suppression (RS). We show how such a mechanism is used, at various levels of the cerebral cortex, to generate compact neural representations of the visual stimuli. From the general CoRe learning model, we derive a clustering algorithm, named CoRe clustering, that can automatically estimate the unknown cluster number from the data without using a priori information concerning the input distribution. We illustrate how CoRe clustering, besides its biological plausibility, possesses strong theoretical properties in terms of robustness to noise and outliers, and we provide an error function describing CoRe learning dynamics. Such a description is used to analyze CoRe's relationships with state-of-the-art clustering models and to highlight CoRe's similarity to rival penalized competitive learning (RPCL), showing how CoRe extends such a model by strengthening the rival penalization estimation by means of loss functions from robust statistics.
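Rival penalized competitive learning, which CoRe is said to extend, has a compact update rule: the winning unit learns toward each sample and the runner-up de-learns away from it, so surplus units are driven off and the cluster number is estimated implicitly. A minimal 1-D sketch (the rates and data are ours):

```python
import numpy as np

def rpcl(X, k, alpha=0.05, beta=0.001, epochs=30, seed=0):
    """RPCL: the winner moves toward the sample; the rival (runner-up)
    is pushed away with a much smaller rate."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(epochs):
        for x in X:
            d = np.linalg.norm(C - x, axis=1)
            win, rival = np.argsort(d)[:2]
            C[win] += alpha * (x - C[win])      # learning
            C[rival] -= beta * (x - C[rival])   # de-learning (penalization)
    return C

rng = np.random.default_rng(5)
X = np.r_[rng.normal(-3.0, 0.3, (100, 1)), rng.normal(3.0, 0.3, (100, 1))]
centers = rpcl(X, k=3)   # one unit more than the true cluster number
```

Even with an extra unit, one center settles on each true cluster; CoRe's contribution, per the abstract, is to make the penalization step robust via loss functions from robust statistics.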

  13. Bayesian Recurrent Neural Network for Language Modeling.

    PubMed

    Chien, Jen-Tzung; Ku, Yuan-Chu

    2016-02-01

    A language model (LM) is calculated as the probability of a word sequence and provides the solution to word prediction for a variety of information systems. A recurrent neural network (RNN) is powerful for learning the large-span dynamics of a word sequence in the continuous space. However, the training of the RNN-LM is an ill-posed problem because of too many parameters from a large dictionary size and a high-dimensional hidden layer. This paper presents a Bayesian approach to regularize the RNN-LM and applies it to continuous speech recognition. We aim to penalize an overly complex RNN-LM by compensating for the uncertainty of the estimated model parameters, which is represented by a Gaussian prior. The objective function in a Bayesian classification network is formed as the regularized cross-entropy error function. The regularized model is constructed not only by calculating the regularized parameters according to the maximum a posteriori criterion but also by estimating the Gaussian hyperparameter by maximizing the marginal likelihood. A rapid approximation to a Hessian matrix is developed to implement the Bayesian RNN-LM (BRNN-LM) by selecting a small set of salient outer-products. The proposed BRNN-LM achieves a sparser model than the RNN-LM. Experiments on different corpora show the robustness of system performance by applying the rapid BRNN-LM under different conditions.
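
    The regularized cross-entropy objective described here amounts to adding a Gaussian-prior (L2) term to the error function. A minimal sketch, with an illustrative weight vector and penalty strength (not the paper's implementation):

```python
import numpy as np

# Sketch of a regularized cross-entropy objective: cross-entropy error
# plus a Gaussian-prior (L2) penalty on the weights. The weight vector,
# probabilities, and lam are illustrative assumptions.
def regularized_cross_entropy(probs, targets, weights, lam=0.1):
    ce = -np.sum(targets * np.log(probs))     # cross-entropy error
    prior = 0.5 * lam * np.sum(weights ** 2)  # from the Gaussian prior (MAP)
    return ce + prior

probs = np.array([0.7, 0.2, 0.1])    # predicted next-word distribution
targets = np.array([1.0, 0.0, 0.0])  # one-hot encoding of the true word
w = np.array([0.5, -0.5])            # toy model parameters
loss = regularized_cross_entropy(probs, targets, w, lam=0.1)
```

    Maximizing the posterior over weights under a Gaussian prior is equivalent to minimizing exactly this penalized error.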

  14. Suicide-related crimes in contemporary European criminal laws.

    PubMed

    Mäkinen, I H

    1997-01-01

    This article describes suicide-related penal legislation in contemporary Europe, and analyzes and relates the results to cultural attitudes towards suicide and to national suicide rates. Data were obtained from 42 legal entities. Of these, 34 have penal regulations which--according to definition--chiefly and directly deal with suicide. There are three main types of act: aiding suicide, abetting suicide, and driving to suicide. The laws vary considerably with regard to which acts are sanctioned, how severely they are punished, and whether any special circumstances such as the motive, the result, or the object can make the crime more serious. Various ideologies have inspired legislation: religions, the euthanasia movement, and suicide prevention have all left their mark. There are some cases in which neighboring legal systems have clearly influenced laws on the topic. However, the process seems mostly to have been a national affair, resulting in surprisingly large discrepancies between European legal systems. The laws seem to reflect public opinion: countries that punish these crimes more severely have significantly less permissive cultural attitudes towards suicide. Likewise, suicide rates were significantly higher in countries with a narrow scope of criminalization and milder punishments for suicide-related crimes. The cultural and normative elements of society are connected with its suicide mortality.

  15. Structured functional additive regression in reproducing kernel Hilbert spaces.

    PubMed

    Zhu, Hongxiao; Yao, Fang; Zhang, Hao Helen

    2014-06-01

    Functional additive models (FAMs) provide a flexible yet simple framework for regressions involving functional predictors. The use of a data-driven basis in an additive rather than linear structure naturally extends the classical functional linear model. However, the critical issue of selecting nonlinear additive components has been less studied. In this work, we propose a new regularization framework for the structure estimation in the context of Reproducing Kernel Hilbert Spaces. The proposed approach takes advantage of the functional principal components, which greatly facilitates the implementation and the theoretical analysis. The selection and estimation are achieved by penalized least squares using a penalty which encourages the sparse structure of the additive components. Theoretical properties such as the rate of convergence are investigated. The empirical performance is demonstrated through simulation studies and a real data application.

  16. Penalized regression procedures for variable selection in the potential outcomes framework

    PubMed Central

    Ghosh, Debashis; Zhu, Yeying; Coffman, Donna L.

    2015-01-01

    A recent topic of much interest in causal inference is model selection. In this article, we describe a framework in which to consider penalized regression approaches to variable selection for causal effects. The framework leads to a simple ‘impute, then select’ class of procedures that is agnostic to the type of imputation algorithm as well as penalized regression used. It also clarifies how model selection involves a multivariate regression model for causal inference problems, and that these methods can be applied for identifying subgroups in which treatment effects are homogeneous. Analogies and links with the literature on machine learning methods, missing data and imputation are drawn. A difference LASSO algorithm is defined, along with its multiple imputation analogues. The procedures are illustrated using a well-known right heart catheterization dataset. PMID:25628185
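
    The 'impute, then select' idea can be sketched end to end: impute effect differences, then run any penalized regression on them. Below is a toy with synthetic data and a minimal ISTA-style LASSO solver (the data, the penalty level, and the solver are illustrative assumptions, not the authors' difference LASSO):

```python
import numpy as np

# Toy 'impute, then select' run: d plays the role of imputed differences
# of potential outcomes, and a minimal ISTA-style LASSO selects which
# covariates drive the effect. Data, lam, and the solver are
# illustrative assumptions, not the authors' difference LASSO.
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
d = 1.5 * X[:, 0] + rng.normal(scale=0.1, size=n)  # effect depends on X1 only

def lasso_ista(X, y, lam=0.1, steps=500):
    """Minimize 0.5/n * ||y - X b||^2 + lam * ||b||_1 by proximal gradient."""
    n, p = X.shape
    b = np.zeros(p)
    lr = n / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the gradient
    for _ in range(steps):
        grad = X.T @ (X @ b - y) / n
        z = b - lr * grad
        b = np.sign(z) * np.maximum(np.abs(z) - lr * lam, 0.0)  # soft-threshold
    return b

coef = lasso_ista(X, d)  # only the first coefficient should survive
```

    Any imputation algorithm and any penalized regression can be slotted into the two stages, which is the agnosticism the framework above emphasizes.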

  17. Task-Driven Tube Current Modulation and Regularization Design in Computed Tomography with Penalized-Likelihood Reconstruction.

    PubMed

    Gang, G J; Siewerdsen, J H; Stayman, J W

    2016-02-01

    This work applies task-driven optimization to design CT tube current modulation and directional regularization in penalized-likelihood (PL) reconstruction. The relative performance of modulation schemes commonly adopted for filtered-backprojection (FBP) reconstruction was also evaluated for PL in comparison. We adopt a task-driven imaging framework that utilizes a patient-specific anatomical model and information of the imaging task to optimize imaging performance in terms of detectability index ( d' ). This framework leverages a theoretical model based on the implicit function theorem and Fourier approximations to predict local spatial resolution and noise characteristics of PL reconstruction as a function of the imaging parameters to be optimized. Tube current modulation was parameterized as a linear combination of Gaussian basis functions, and regularization was based on the design of (directional) pairwise penalty weights for the 8 in-plane neighboring voxels. Detectability was optimized using a covariance matrix adaptation evolutionary strategy algorithm. Task-driven designs were compared to conventional tube current modulation strategies for a Gaussian detection task in an abdomen phantom. The task-driven design yielded the best performance, improving d' by ~20% over an unmodulated acquisition. Contrary to FBP, PL reconstruction using automatic exposure control and modulation based on minimum variance (in FBP) performed worse than the unmodulated case, decreasing d' by 16% and 9%, respectively. This work shows that conventional tube current modulation schemes suitable for FBP can be suboptimal for PL reconstruction. Thus, the proposed task-driven optimization provides additional opportunities for improved imaging performance and dose reduction beyond that achievable with conventional acquisition and reconstruction.

  18. Quantile Regression for Analyzing Heterogeneity in Ultra-high Dimension

    PubMed Central

    Wang, Lan; Wu, Yichao

    2012-01-01

    Ultra-high dimensional data often display heterogeneity due to either heteroscedastic variance or other forms of non-location-scale covariate effects. To accommodate heterogeneity, we advocate a more general interpretation of sparsity which assumes that only a small number of covariates influence the conditional distribution of the response variable given all candidate covariates; however, the sets of relevant covariates may differ when we consider different segments of the conditional distribution. In this framework, we investigate the methodology and theory of nonconvex penalized quantile regression in ultra-high dimension. The proposed approach has two distinctive features: (1) it enables us to explore the entire conditional distribution of the response variable given the ultra-high dimensional covariates and provides a more realistic picture of the sparsity pattern; (2) it requires substantially weaker conditions compared with alternative methods in the literature; thus, it greatly alleviates the difficulty of model checking in the ultra-high dimension. In theoretic development, it is challenging to deal with both the nonsmooth loss function and the nonconvex penalty function in ultra-high dimensional parameter space. We introduce a novel sufficient optimality condition which relies on a convex differencing representation of the penalized loss function and the subdifferential calculus. Exploring this optimality condition enables us to establish the oracle property for sparse quantile regression in the ultra-high dimension under relaxed conditions. The proposed method greatly enhances existing tools for ultra-high dimensional data analysis. Monte Carlo simulations demonstrate the usefulness of the proposed procedure. The real data example we analyzed demonstrates that the new approach reveals substantially more information compared with alternative methods. PMID:23082036
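
    The two key ingredients named above, the nonsmooth quantile check loss and a nonconvex penalty such as SCAD, can be written down directly. A sketch using the standard textbook forms (the tuning constant a = 3.7 is the conventional default, not taken from the paper):

```python
# Standard forms of the quantile check loss and the nonconvex SCAD
# penalty (tuning constant a = 3.7 is the conventional default; this
# is illustrative, not the paper's code).
def check_loss(r, tau):
    """Quantile check loss rho_tau(r) = r * (tau - 1{r < 0})."""
    return r * (tau - (r < 0))

def scad_penalty(b, lam, a=3.7):
    """SCAD penalty evaluated at a scalar coefficient b."""
    b = abs(b)
    if b <= lam:                       # LASSO-like near zero
        return lam * b
    if b <= a * lam:                   # quadratic transition
        return (2 * a * lam * b - b ** 2 - lam ** 2) / (2 * (a - 1))
    return lam ** 2 * (a + 1) / 2      # constant: large effects unpenalized
```

    The flat tail of SCAD for large |b| is what removes the bias that a plain L1 penalty would impose on strong coefficients, at the cost of nonconvexity, which is why the convex-differencing analysis above is needed.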

  19. A displacement-based finite element formulation for incompressible and nearly-incompressible cardiac mechanics

    PubMed Central

    Hadjicharalambous, Myrianthi; Lee, Jack; Smith, Nicolas P.; Nordsletten, David A.

    2014-01-01

    The Lagrange Multiplier (LM) and penalty methods are commonly used to enforce incompressibility and compressibility in models of cardiac mechanics. In this paper we show how both formulations may be equivalently thought of as a weakly penalized system derived from the statically condensed Perturbed Lagrangian formulation, which may be directly discretized maintaining the simplicity of penalty formulations with the convergence characteristics of LM techniques. A modified Shamanskii–Newton–Raphson scheme is introduced to enhance the nonlinear convergence of the weakly penalized system and, exploiting its equivalence, modifications are developed for the penalty form. Focusing on accuracy, we proceed to study the convergence behavior of these approaches using different interpolation schemes for both a simple test problem and more complex models of cardiac mechanics. Our results illustrate the well-known influence of locking phenomena on the penalty approach (particularly for lower order schemes) and its effect on accuracy for whole-cycle mechanics. Additionally, we verify that direct discretization of the weakly penalized form produces similar convergence behavior to mixed formulations while avoiding the use of an additional variable. Combining a simple structure which allows the solution of computationally challenging problems with good convergence characteristics, the weakly penalized form provides an accurate and efficient alternative to incompressibility and compressibility in cardiac mechanics. PMID:25187672
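
    The penalty-versus-Lagrange-multiplier distinction can be illustrated on a generic constrained problem (a toy example, unrelated to the cardiac mechanics code): minimizing x² + y² subject to x + y = 1, the penalty formulation satisfies the constraint only in the limit of a stiff penalty parameter, which is the source of the conditioning and locking issues discussed above.

```python
def penalty_min(kappa):
    """Closed-form minimizer of x^2 + y^2 + (kappa/2) * (x + y - 1)^2.

    The exact Lagrange-multiplier solution of min x^2 + y^2
    subject to x + y = 1 is (0.5, 0.5).
    """
    x = kappa / (2 + 2 * kappa)  # from the stationarity condition, x = y
    return x, x

x_soft, _ = penalty_min(10.0)  # constraint violated: x + y is about 0.91
x_stiff, _ = penalty_min(1e6)  # approaches the exact solution 0.5
```

    The weakly penalized formulation in the paper is designed to keep the simple structure of the first approach while recovering the convergence behavior of the second.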

  20. A displacement-based finite element formulation for incompressible and nearly-incompressible cardiac mechanics.

    PubMed

    Hadjicharalambous, Myrianthi; Lee, Jack; Smith, Nicolas P; Nordsletten, David A

    2014-06-01

    The Lagrange Multiplier (LM) and penalty methods are commonly used to enforce incompressibility and compressibility in models of cardiac mechanics. In this paper we show how both formulations may be equivalently thought of as a weakly penalized system derived from the statically condensed Perturbed Lagrangian formulation, which may be directly discretized maintaining the simplicity of penalty formulations with the convergence characteristics of LM techniques. A modified Shamanskii-Newton-Raphson scheme is introduced to enhance the nonlinear convergence of the weakly penalized system and, exploiting its equivalence, modifications are developed for the penalty form. Focusing on accuracy, we proceed to study the convergence behavior of these approaches using different interpolation schemes for both a simple test problem and more complex models of cardiac mechanics. Our results illustrate the well-known influence of locking phenomena on the penalty approach (particularly for lower order schemes) and its effect on accuracy for whole-cycle mechanics. Additionally, we verify that direct discretization of the weakly penalized form produces similar convergence behavior to mixed formulations while avoiding the use of an additional variable. Combining a simple structure which allows the solution of computationally challenging problems with good convergence characteristics, the weakly penalized form provides an accurate and efficient alternative to incompressibility and compressibility in cardiac mechanics.

  1. Color TV: total variation methods for restoration of vector-valued images.

    PubMed

    Blomgren, P; Chan, T F

    1998-01-01

    We propose a new definition of the total variation (TV) norm for vector-valued functions that can be applied to restore color and other vector-valued images. The new TV norm has the desirable properties of 1) not penalizing discontinuities (edges) in the image, 2) being rotationally invariant in the image space, and 3) reducing to the usual TV norm in the scalar case. Some numerical experiments on denoising simple color images in red-green-blue (RGB) color space are presented.
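
    A discrete version of such a coupled vector TV norm can be computed directly; the sketch below uses one common formulation (the root of the sum of squared per-channel TV norms) with forward differences, and may differ in detail from the norm proposed in the paper:

```python
import numpy as np

# Discrete sketch of a coupled vector-valued TV norm: the root of the
# sum of squared per-channel TV norms (one common formulation; the
# paper's exact definition may differ in detail).
def tv_scalar(u):
    """Isotropic discrete TV of a 2-D array via forward differences."""
    dx = np.diff(u, axis=0, append=u[-1:, :])
    dy = np.diff(u, axis=1, append=u[:, -1:])
    return np.sum(np.sqrt(dx ** 2 + dy ** 2))

def tv_vector(img):
    """Coupled TV for an H x W x C image."""
    return np.sqrt(sum(tv_scalar(img[:, :, c]) ** 2
                       for c in range(img.shape[2])))

img = np.zeros((4, 4, 3))
img[:, 2:, :] = 1.0  # a sharp vertical edge in all three RGB channels
# Each channel contributes TV = 4; the coupled norm is 4 * sqrt(3).
```

    The edge contributes a finite amount proportional to its length, not its sharpness, which is the "not penalizing edges" property listed above; the coupled form also reduces to the scalar TV when C = 1.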

  2. Determination of Trajectories for a Gliding Parachute System

    DTIC Science & Technology

    1975-04-01

    ...The terminal error serves as a penalty function which penalizes undesirable terminal states by the extent to which they deviate from the desired...the open-loop control becomes, in effect, feedback control with the sequence of initial conditions serving as the current state. The major advantage

  3. Association Between Hospital Penalty Status Under the Hospital Readmission Reduction Program and Readmission Rates for Target and Non-Target Conditions

    PubMed Central

    Desai, Nihar R.; Ross, Joseph S.; Kwon, Ji Young; Herrin, Jeph; Dharmarajan, Kumar; Bernheim, Susannah M.; Krumholz, Harlan M.; Horwitz, Leora I.

    2017-01-01

    Importance Readmission rates declined after announcement of the Hospital Readmission Reduction Program (HRRP), which penalizes hospitals for excess readmissions for acute myocardial infarction (AMI), heart failure (HF), and pneumonia. Objective To compare trends in readmission rates for target and non-target conditions, stratified by hospital penalty status. Design, Setting, Participants Retrospective cohort study of 48,137,102 hospitalizations of 20,351,161 Medicare fee-for-service beneficiaries over 64 years discharged between January 1, 2008 and June 30, 2015 from 3,497 hospitals. Difference interrupted time series models were used to compare trends in readmission rates by condition and penalty status. Exposure Hospital penalty status or target condition under the HRRP. Outcome 30-day risk adjusted, all-cause unplanned readmission rates for target and non-target conditions. Results In January 2008, the mean readmission rates for AMI, HF, pneumonia and non-target conditions were 21.9%, 27.5%, 20.1%, and 18.4% respectively at hospitals later subject to financial penalties (n=2,189) and 18.7%, 24.2%, 17.4%, and 15.7% at hospitals not subject to penalties (n=1,283). Between January 2008 and March 2010, prior to HRRP announcement, readmission rates were stable across hospitals (except AMI at non-penalty hospitals). Following announcement of HRRP (March 2010), readmission rates for both target and non-target conditions declined significantly faster for patients at hospitals later subject to financial penalties compared with those at non-penalized hospitals (AMI, additional decrease of −1.24 (95% CI, −1.84, −0.65) percentage points per year relative to non-penalty discharges; HF, −1.25 (−1.64, −0.65); pneumonia, −1.37 (−0.95, −1.80); non-target, −0.27 (−0.38, −0.17); p<0.001 for all). 
For penalty hospitals, readmission rates for target conditions declined significantly faster compared with non-target conditions (AMI: additional decline of −0.49 (−0.81, −0.16) percentage points per year relative to non-target conditions, p=0.004; HF: −0.90 (−1.18, −0.62), p<0.001; pneumonia: −0.57 (−0.92,−0.23), p<0.001). By contrast, among non-penalty hospitals, readmissions for target conditions declined similarly or more slowly compared with non-target conditions (AMI: additional increase of 0.48 (0.01, 0.95) percentage points per year, p=0.05; HF: 0.08 (−0.30, 0.46), p=0.67; pneumonia: 0.53 (0.13, 0.93), p=0.01). After HRRP implementation in October 2012, the rate of change for readmission rates plateaued (p<0.05 for all except pneumonia at non-penalty hospitals) with the greatest relative change observed among hospitals subject to financial penalty. Conclusions Patients at hospitals subject to penalties had greater reductions in readmission rates compared with those at non-penalized hospitals. Changes were greater for target conditions at penalized hospitals, but not at non-penalized hospitals. PMID:28027367

  4. An image morphing technique based on optimal mass preserving mapping.

    PubMed

    Zhu, Lei; Yang, Yan; Haker, Steven; Tannenbaum, Allen

    2007-06-01

    Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L2 mass moving energy functional is modified by adding an intensity penalizing term, in order to reduce the undesired double exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. This proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods.

  5. An Image Morphing Technique Based on Optimal Mass Preserving Mapping

    PubMed Central

    Zhu, Lei; Yang, Yan; Haker, Steven; Tannenbaum, Allen

    2013-01-01

    Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L2 mass moving energy functional is modified by adding an intensity penalizing term, in order to reduce the undesired double exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. This proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods. PMID:17547128

  6. A minimal cost function method for optimizing the age-Depth relation of deep-sea sediment cores

    NASA Astrophysics Data System (ADS)

    Brüggemann, Wolfgang

    1992-08-01

    The question of an optimal age-depth relation for deep-sea sediment cores has been raised frequently. The data from such cores (e.g., δ18O values) are used to test the astronomical theory of ice ages as established by Milankovitch in 1938. In this work, we use a minimal cost function approach to find simultaneously an optimal age-depth relation and a linear model that optimally links solar insolation or other model input with global ice volume. Thus a general tool for the calibration of deep-sea cores to arbitrary tuning targets is presented. In this inverse modeling type approach, an objective function is minimized that penalizes: (1) the deviation of the data from the theoretical linear model (whose transfer function can be computed analytically for a given age-depth relation) and (2) the violation of a set of plausible assumptions about the model, the data and the obtained correction of a first guess age-depth function. These assumptions have been suggested before but are now quantified and incorporated explicitly into the objective function as penalty terms. We formulate an optimization problem that is solved numerically by conjugate gradient type methods. Using this direct approach, we obtain high coherences in the Milankovitch frequency bands (over 90%). Not only the data time series but also the derived correction to a first guess linear age-depth function (and therefore the sedimentation rate) itself contains significant energy in a broad frequency band around 100 kyr. The use of a sedimentation rate which varies continuously on ice age time scales results in a shift of energy from 100 kyr in the original data spectrum to 41, 23, and 19 kyr in the spectrum of the corrected data. However, a large proportion of the data variance remains unexplained, particularly in the 100 kyr frequency band, where there is no significant input by orbital forcing.
The presented method is applied to a real sediment core and to the SPECMAP stack, and results are compared with those obtained in earlier investigations.

  7. 45 CFR 286.150 - Can a family, with a child under age 6, be penalized because a parent refuses to work because (s...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... penalized because a parent refuses to work because (s)he cannot find child care? 286.150 Section 286.150... a parent refuses to work because (s)he cannot find child care? (a) If the individual is a single custodial parent caring for a child under age six, the Tribe may not reduce or terminate assistance based on...

  8. 45 CFR 286.150 - Can a family, with a child under age 6, be penalized because a parent refuses to work because (s...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... penalized because a parent refuses to work because (s)he cannot find child care? 286.150 Section 286.150... a parent refuses to work because (s)he cannot find child care? (a) If the individual is a single custodial parent caring for a child under age six, the Tribe may not reduce or terminate assistance based on...

  9. 45 CFR 286.150 - Can a family, with a child under age 6, be penalized because a parent refuses to work because (s...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... penalized because a parent refuses to work because (s)he cannot find child care? 286.150 Section 286.150... a parent refuses to work because (s)he cannot find child care? (a) If the individual is a single custodial parent caring for a child under age six, the Tribe may not reduce or terminate assistance based on...

  10. 45 CFR 286.150 - Can a family, with a child under age 6, be penalized because a parent refuses to work because (s...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... penalized because a parent refuses to work because (s)he cannot find child care? 286.150 Section 286.150... a parent refuses to work because (s)he cannot find child care? (a) If the individual is a single custodial parent caring for a child under age six, the Tribe may not reduce or terminate assistance based on...

  11. 45 CFR 286.150 - Can a family, with a child under age 6, be penalized because a parent refuses to work because (s...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... penalized because a parent refuses to work because (s)he cannot find child care? 286.150 Section 286.150... a parent refuses to work because (s)he cannot find child care? (a) If the individual is a single custodial parent caring for a child under age six, the Tribe may not reduce or terminate assistance based on...

  12. Law No. 91, Amendment to the Penal Code, 5 September 1987.

    PubMed

    1989-01-01

    This Law replaces Article 398 of the Iraq Penal Code with the following language: "If a sound contract of marriage has been made between a perpetrator of one of the crimes mentioned in this chapter and the victim, it shall be a legal extenuating excuse for the purpose of implementing the provisions of Articles (130 and 131) of the Penal Code. If the marriage contract has been terminated by a divorce issued by the husband without a legitimate reason, or by a divorce passed by the court for such reasons related [to] a mistake or a misconduct of the husband, three years before the expiry of the sentence of the action, then, the punishment shall be reconsidered with a view to intensifying it due to a request from the public prosecution, the victim herself, or any interested person." Among the crimes mentioned in the chapter referred to in Article 398 is rape.

  13. Postcolonial penality: Liberty and repression in the shadow of independence, India c. 1947

    PubMed Central

    Brown, Mark

    2016-01-01

    This article reports primary archival data on the colonial penal history of British India and its reconfiguration into the postcolonial Indian state. It introduces criminologists to frameworks through which postcolonial scholars have sought to make sense of the continuities and discontinuities of rule across the colonial/postcolonial divide. The article examines the postcolonial life of one example of colonial penal power, known as the criminal tribes policy, under which more than three million Indian subjects of British rule were restricted in their movements, subject to a host of administrative rules and sometimes severe punishments, sequestered in settlements and limited in access to legal redress. It illustrates how at the birth of the postcolonial Indian state, encompassing visions of a liberal, unfettered and free life guaranteed in a new Constitution and charter of Fundamental Rights, freedom for some was to prove as elusive for citizens as it had been for subjects. PMID:28503082

  14. Postcolonial penality: Liberty and repression in the shadow of independence, India c. 1947.

    PubMed

    Brown, Mark

    2017-05-01

    This article reports primary archival data on the colonial penal history of British India and its reconfiguration into the postcolonial Indian state. It introduces criminologists to frameworks through which postcolonial scholars have sought to make sense of the continuities and discontinuities of rule across the colonial/postcolonial divide. The article examines the postcolonial life of one example of colonial penal power, known as the criminal tribes policy, under which more than three million Indian subjects of British rule were restricted in their movements, subject to a host of administrative rules and sometimes severe punishments, sequestered in settlements and limited in access to legal redress. It illustrates how at the birth of the postcolonial Indian state, encompassing visions of a liberal, unfettered and free life guaranteed in a new Constitution and charter of Fundamental Rights, freedom for some was to prove as elusive for citizens as it had been for subjects.

  15. A flexible model for the mean and variance functions, with application to medical cost data.

    PubMed

    Chen, Jinsong; Liu, Lei; Zhang, Daowen; Shih, Ya-Chen T

    2013-10-30

    Medical cost data are often skewed to the right and heteroscedastic, having a nonlinear relation with covariates. To tackle these issues, we consider an extension to generalized linear models by assuming nonlinear associations of covariates in the mean function and allowing the variance to be an unknown but smooth function of the mean. We make no further assumption on the distributional form. The unknown functions are described by penalized splines, and the estimation is carried out using nonparametric quasi-likelihood. Simulation studies show the flexibility and advantages of our approach. We apply the model to the annual medical costs of heart failure patients in the clinical data repository at the University of Virginia Hospital System. Copyright © 2013 John Wiley & Sons, Ltd.
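
    Penalized splines of the kind used here can be sketched with a truncated-line basis and a ridge penalty on the knot coefficients. A minimal P-spline-style smoother on synthetic data (illustrative only, not the authors' quasi-likelihood fit):

```python
import numpy as np

# Minimal penalized-spline smoother: a truncated-line basis with a
# ridge penalty on the knot coefficients, fit to synthetic data.
# Illustrative of the penalized-spline ingredient, not the authors' fit.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 100)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.size)

knots = np.linspace(0.1, 0.9, 9)
B = np.column_stack([np.ones_like(x), x] +
                    [np.maximum(x - k, 0.0) for k in knots])  # design matrix

lam = 0.1                      # smoothing parameter
pen = np.zeros(B.shape[1])
pen[2:] = 1.0                  # penalize only the knot coefficients
coef = np.linalg.solve(B.T @ B + lam * np.diag(pen), B.T @ y)
fit = B @ coef                 # smoothed estimate of the mean function
```

    Increasing `lam` shrinks the knot coefficients toward zero and the fit toward a straight line; the same device can smooth the variance-function estimate as well.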

  16. Interaction Models for Functional Regression.

    PubMed

    Usset, Joseph; Staicu, Ana-Maria; Maity, Arnab

    2016-02-01

    A functional regression model with a scalar response and multiple functional predictors is proposed that accommodates two-way interactions in addition to their main effects. The proposed estimation procedure models the main effects using penalized regression splines, and the interaction effect by a tensor product basis. Extensions to generalized linear models and data observed on sparse grids or with measurement error are presented. A hypothesis testing procedure for the functional interaction effect is described. The proposed method can be easily implemented through existing software. Numerical studies show that fitting an additive model in the presence of interaction leads to both poor estimation performance and lost prediction power, while fitting an interaction model where there is in fact no interaction leads to negligible losses. The methodology is illustrated on the AneuRisk65 study data.

  17. 2D/3D fetal cardiac dataset segmentation using a deformable model.

    PubMed

    Dindoyal, Irving; Lambrou, Tryphon; Deng, Jing; Todd-Pokropek, Andrew

    2011-07-01

    The aim is to segment the fetal heart in order to facilitate 3D assessment of cardiac function and structure. Ultrasound acquisition typically results in drop-out artifacts of the chamber walls. The authors outline a level set deformable model to automatically delineate the small fetal cardiac chambers. The level set is penalized from growing into an adjacent cardiac compartment using a novel collision detection term. The region based model allows simultaneous segmentation of all four cardiac chambers from a user defined seed point placed in each chamber. The segmented boundaries are automatically penalized from intersecting at walls with signal dropout. Root mean square errors of the perpendicular distances between the algorithm's delineation and manual tracings are within 2 mm, which is less than 10% of the length of a typical fetal heart. The ejection fractions were determined from the 3D datasets. We validate the algorithm using a physical phantom and obtain volumes that are comparable to those from physically determined means. The algorithm segments volumes with an error of within 13% as determined using a physical phantom. Our original work in fetal cardiac segmentation compares automatic and manual tracings to a physical phantom and also measures inter-observer variation.

  18. Critical evaluation of methods to incorporate entropy loss upon binding in high-throughput docking.

    PubMed

    Salaniwal, Sumeet; Manas, Eric S; Alvarez, Juan C; Unwalla, Rayomand J

    2007-02-01

    Proper accounting of the positional/orientational/conformational entropy loss associated with protein-ligand binding is important to obtain reliable predictions of binding affinity. Herein, we critically examine two simplified statistical mechanics-based approaches, namely a constant penalty per rotor method, and a more rigorous method, referred to here as the partition function-based scoring (PFS) method, to account for such entropy losses in high-throughput docking calculations. Our results on the estrogen receptor beta and dihydrofolate reductase proteins demonstrate that, while the constant penalty method over-penalizes molecules for their conformational flexibility, the PFS method behaves in a more "DeltaG-like" manner by penalizing different rotors differently depending on their residual entropy in the bound state. Furthermore, in contrast to no entropic penalty or the constant penalty approximation, the PFS method does not exhibit any bias towards either rigid or flexible molecules in the hit list. Preliminary enrichment studies using a lead-like random molecular database suggest that an accurate representation of the "true" energy landscape of the protein-ligand complex is critical for reliable predictions of relative binding affinities by the PFS method. Copyright 2006 Wiley-Liss, Inc.
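
    The contrast between the two approaches can be made concrete with a toy rotor model (illustrative numbers only): a constant penalty charges every rotatable bond equally, while a partition-function-based entropy charges a rotor less when one rotamer state dominates in the bound pose.

```python
import math

# Toy rotor-entropy comparison (illustrative energies and constants):
# constant penalty per rotor versus a Boltzmann residual entropy
# computed from a partition function over rotamer states.
RT = 0.593  # kcal/mol at ~298 K

def constant_penalty(n_rotors, per_rotor=0.5):
    """Fixed entropic cost (kcal/mol) charged for every rotatable bond."""
    return per_rotor * n_rotors

def rotor_TS(state_energies):
    """T*S (kcal/mol) of one rotor from Boltzmann-weighted populations."""
    weights = [math.exp(-e / RT) for e in state_energies]
    z = sum(weights)                        # partition function
    pops = [w / z for w in weights]
    s_over_r = -sum(p * math.log(p) for p in pops)
    return RT * s_over_r

ts_free = rotor_TS([0.0, 0.0, 0.0])    # unbound: three equal rotamers
ts_bound = rotor_TS([0.0, 2.0, 2.0])   # bound: one rotamer dominates
entropy_loss = ts_free - ts_bound      # penalty charged for this rotor
```

    A rotor that remains disordered in the bound state retains residual entropy and so loses less on binding, which is the "DeltaG-like" differentiation between rotors that the PFS method provides and the constant-penalty approximation cannot.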

  19. Penalized gaussian process regression and classification for high-dimensional nonlinear data.

    PubMed

    Yi, G; Shi, J Q; Choi, T

    2011-12-01

    The model based on a Gaussian process (GP) prior and a kernel covariance function can be used to fit nonlinear data with multidimensional covariates. It has been used as a flexible nonparametric approach for curve fitting, classification, clustering, and other statistical problems, and has been widely applied to complex nonlinear systems in many different areas, particularly in machine learning. However, the model becomes challenging for large-scale and high-dimensional data, for example the meat data discussed in this article, which have 100 highly correlated covariates. For such data it suffers from large variance of parameter estimation and high predictive errors, and, numerically, from unstable computation. In this article, a penalized likelihood framework is applied to the GP-based model. Different penalties are investigated, and their suitability for the characteristics of GP models is discussed. The asymptotic properties are also discussed, with the relevant proofs. Several applications to real biomechanical and bioinformatics data sets are reported. © 2011, The International Biometric Society. No claim to original US government works.

  20. Variable selection for zero-inflated and overdispersed data with application to health care demand in Germany.

    PubMed

    Wang, Zhu; Ma, Shuangge; Wang, Ching-Yun

    2015-09-01

    In health services and outcome research, count outcomes are frequently encountered and often have a large proportion of zeros. The zero-inflated negative binomial (ZINB) regression model has important applications for this type of data. With many possible candidate risk factors, this paper proposes new variable selection methods for the ZINB model. We consider the maximum likelihood function plus a penalty, including the least absolute shrinkage and selection operator (LASSO), smoothly clipped absolute deviation (SCAD), and minimax concave penalty (MCP). An EM (expectation-maximization) algorithm is proposed for estimating the model parameters and conducting variable selection simultaneously. This algorithm consists of estimating penalized weighted negative binomial models and penalized logistic models via the coordinate descent algorithm. Furthermore, statistical properties, including standard error formulae, are provided. A simulation study shows that the new algorithm not only produces more accurate, or at least comparable, estimates but is also more robust than traditional stepwise variable selection. The proposed methods are applied to analyze health care demand in Germany using the open-source R package mpath. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
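
    The inner step of such an EM algorithm is a penalized weighted fit solved by coordinate descent. A minimal sketch of that building block follows, for a Gaussian working response and the LASSO penalty only; the actual mpath implementation operates on negative binomial and logistic working responses and also supports SCAD and MCP, so this is illustrative.

    ```python
    import numpy as np

    def soft_threshold(z, gamma):
        # Proximal operator of the L1 penalty
        return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

    def lasso_cd(X, y, lam, n_iter=200):
        """Coordinate descent for (1/2n)||y - X b||^2 + lam * ||b||_1,
        the kind of penalized least-squares step reused inside an EM loop."""
        n, p = X.shape
        beta = np.zeros(p)
        col_sq = (X ** 2).sum(axis=0)
        for _ in range(n_iter):
            for j in range(p):
                # Partial residual excluding coordinate j
                r = y - X @ beta + X[:, j] * beta[j]
                beta[j] = soft_threshold(X[:, j] @ r, n * lam) / col_sq[j]
        return beta
    ```

    Each coordinate update has a closed form (soft-thresholding of a univariate least-squares solution), which is what makes coordinate descent attractive for these penalties.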

  1. Structured functional additive regression in reproducing kernel Hilbert spaces

    PubMed Central

    Zhu, Hongxiao; Yao, Fang; Zhang, Hao Helen

    2013-01-01

    Summary: Functional additive models (FAMs) provide a flexible yet simple framework for regressions involving functional predictors. The use of a data-driven basis in an additive rather than linear structure naturally extends the classical functional linear model. However, the critical issue of selecting the nonlinear additive components has been less studied. In this work, we propose a new regularization framework for structure estimation in the context of reproducing kernel Hilbert spaces. The proposed approach takes advantage of functional principal components, which greatly facilitates implementation and theoretical analysis. Selection and estimation are achieved by penalized least squares using a penalty which encourages a sparse structure of the additive components. Theoretical properties such as the rate of convergence are investigated. The empirical performance is demonstrated through simulation studies and a real data application. PMID:25013362

  2. Integrated analysis of DNA-methylation and gene expression using high-dimensional penalized regression: a cohort study on bone mineral density in postmenopausal women.

    PubMed

    Lien, Tonje G; Borgan, Ørnulf; Reppe, Sjur; Gautvik, Kaare; Glad, Ingrid Kristine

    2018-03-07

    Using high-dimensional penalized regression, we studied genome-wide DNA-methylation in bone biopsies of 80 postmenopausal women in relation to their bone mineral density (BMD). The women showed BMD varying from severely osteoporotic to normal. Global gene expression data from the same individuals were available, and since DNA-methylation often affects gene expression, the overall aim of this paper was to include both of these omics data sets in an integrated analysis. Classical penalized regression uses one penalty, but we incorporated individual penalties for each of the DNA-methylation sites. These individual penalties were guided by the strength of association between DNA-methylations and gene transcript levels. DNA-methylations that were highly associated with one or more transcripts received lower penalties and were therefore favored over DNA-methylations showing less association with expression. Because of the complex pathways and interactions among genes, we investigated both the association between DNA-methylations and their corresponding cis genes, and the association between DNA-methylations and trans-located genes. Two integrating penalized methods were used: first, an adaptive group-regularized ridge regression; secondly, variable selection through a modified version of the weighted lasso. When information from gene expression was integrated, predictive performance was considerably improved, in terms of predictive mean square error, compared to classical penalized regression without data integration. We found a 14.7% improvement in the ridge regression case and a 17% improvement for the lasso case. Our version of the weighted lasso with data integration found a list of 22 interesting methylation sites. Several corresponded to genes that are known to be important in bone formation.
    Using BMD as the response and these 22 methylation sites as covariates, least-squares regression analyses resulted in R² = 0.726, compared to an average R² = 0.438 for 10,000 randomly selected groups of DNA-methylations with group size 22. Two recent types of penalized regression methods were adapted to integrate DNA-methylation and its association with gene expression in the analysis of bone mineral density. In both cases predictions clearly benefit from including the additional information on gene expression.
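
    A common way to implement per-covariate penalties of this kind is a weighted lasso, where each coefficient carries its own penalty weight and strongly expression-associated sites receive small weights. The proximal-gradient sketch below is a generic illustration of that idea, not the authors' adaptive group-regularized ridge or their modified weighted lasso; lam and the weight vector w are assumed inputs.

    ```python
    import numpy as np

    def weighted_lasso_ista(X, y, lam, w, n_iter=500):
        """Proximal gradient (ISTA) for
        (1/2n)||y - X b||^2 + lam * sum_j w_j |b_j|.
        Small w_j penalizes coordinate j less, favoring it in selection."""
        n, p = X.shape
        L = np.linalg.norm(X, 2) ** 2 / n      # Lipschitz constant of the gradient
        beta = np.zeros(p)
        for _ in range(n_iter):
            grad = X.T @ (X @ beta - y) / n
            z = beta - grad / L
            thr = lam * w / L                   # per-coefficient soft threshold
            beta = np.sign(z) * np.maximum(np.abs(z) - thr, 0.0)
        return beta
    ```

    With weights 0.2 on favored coordinates and 1.0 elsewhere, the favored informative coordinates survive selection while the others are shrunk toward zero.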

  3. Tracking Multiple Video Targets with an Improved GM-PHD Tracker

    PubMed Central

    Zhou, Xiaolong; Yu, Hui; Liu, Honghai; Li, Youfu

    2015-01-01

    Tracking multiple moving targets from a video plays an important role in many vision-based robotic applications. In this paper, we propose an improved Gaussian mixture probability hypothesis density (GM-PHD) tracker with weight penalization to effectively and accurately track multiple moving targets from a video. First, an entropy-based birth intensity estimation method is incorporated to eliminate the false positives caused by noisy video data. Then, a weight-penalized method with multi-feature fusion is proposed to accurately track the targets in close movement. For targets without occlusion, a weight matrix that contains all updated weights between the predicted target states and the measurements is constructed, and a simple, but effective method based on total weight and predicted target state is proposed to search the ambiguous weights in the weight matrix. The ambiguous weights are then penalized according to the fused target features that include spatial-colour appearance, histogram of oriented gradient and target area and further re-normalized to form a new weight matrix. With this new weight matrix, the tracker can correctly track the targets in close movement without occlusion. For targets with occlusion, a robust game-theoretical method is used. Finally, the experiments conducted on various video scenarios validate the effectiveness of the proposed penalization method and show the superior performance of our tracker over the state of the art. PMID:26633422

  4. Taking side effects into account for HIV medication.

    PubMed

    Costanza, Vicente; Rivadeneira, Pablo S; Biafore, Federico L; D'Attellis, Carlos E

    2010-09-01

    A control-theoretic approach to the problem of designing "low-side-effects" therapies for HIV patients based on highly active drugs is substantiated here. The evolution of side effects during treatment is modeled by an extra differential equation coupled to the dynamics of virions, healthy T-cells, and infected ones. The new equation reflects the dependence of collateral damage on the amount of each dose administered to the patient and on the evolution of the viral load detected by periodic blood analysis. The cost objective accounts for recommended bounds on healthy cells and virions, and also penalizes the appearance of collateral morbidities caused by the medication. The optimization problem is solved by a hybrid dynamic programming scheme that adheres to discrete-time observation and control actions while maintaining the continuous-time setup for predicting states and side effects. The resulting optimal strategies use fewer drugs than those prescribed by previous optimization studies, while maintaining high doses at the beginning and the end of each six-month period. If an inverse discount rate is applied to favor early actions, and under a mild penalization of the final viral load, then the optimal doses are found to be high at the beginning and to decrease afterward, thus causing an apparent stabilization of the main variables. But in this case, the final viral load turns out higher than acceptable.

  5. Development of the IBSAL-SimMOpt Method for the Optimization of Quality in a Corn Stover Supply Chain

    DOE PAGES

    Chavez, Hernan; Castillo-Villar, Krystel; Webb, Erin

    2017-08-01

    Variability in the physical characteristics of feedstock has a relevant effect on the reactor's reliability and operating cost. Most models developed to optimize biomass supply chains have failed to quantify the effect of biomass quality, and of the preprocessing operations required to meet biomass specifications, on overall cost and performance. The Integrated Biomass Supply Analysis and Logistics (IBSAL) model estimates harvesting, collection, transportation, and storage costs while considering the stochastic behavior of the field-to-biorefinery supply chain. This paper proposes an IBSAL-SimMOpt (Simulation-based Multi-Objective Optimization) method for optimizing the biomass quality and the costs associated with the efforts needed to meet conversion technology specifications. The method is developed in two phases. In the first phase, a SimMOpt tool that interacts with the extended IBSAL is developed. In the second phase, the baseline IBSAL model is extended so that the cost of meeting specifications and/or the penalization for failing to meet them are considered. The IBSAL-SimMOpt method is designed to optimize quality characteristics of biomass, costs related to activities intended to improve feedstock quality, and the penalization cost. A case study based on 1916 farms in Ontario, Canada is considered for testing the proposed method. Analysis of the results demonstrates that this method is able to find a high-quality set of non-dominated solutions.

  6. Development of the IBSAL-SimMOpt Method for the Optimization of Quality in a Corn Stover Supply Chain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chavez, Hernan; Castillo-Villar, Krystel; Webb, Erin

    Variability in the physical characteristics of feedstock has a relevant effect on the reactor's reliability and operating cost. Most models developed to optimize biomass supply chains have failed to quantify the effect of biomass quality, and of the preprocessing operations required to meet biomass specifications, on overall cost and performance. The Integrated Biomass Supply Analysis and Logistics (IBSAL) model estimates harvesting, collection, transportation, and storage costs while considering the stochastic behavior of the field-to-biorefinery supply chain. This paper proposes an IBSAL-SimMOpt (Simulation-based Multi-Objective Optimization) method for optimizing the biomass quality and the costs associated with the efforts needed to meet conversion technology specifications. The method is developed in two phases. In the first phase, a SimMOpt tool that interacts with the extended IBSAL is developed. In the second phase, the baseline IBSAL model is extended so that the cost of meeting specifications and/or the penalization for failing to meet them are considered. The IBSAL-SimMOpt method is designed to optimize quality characteristics of biomass, costs related to activities intended to improve feedstock quality, and the penalization cost. A case study based on 1916 farms in Ontario, Canada is considered for testing the proposed method. Analysis of the results demonstrates that this method is able to find a high-quality set of non-dominated solutions.

  7. ["Integrity" in the healthcare system: recognizing and avoiding risks in dealing with the Association of Statutory Health Insurance Physicians and the public prosecutor's office].

    PubMed

    Wohlgemuth, Martin; Heinrich, Julia

    2018-05-24

    This article describes the introduction of the law to combat corruption in the healthcare system. The effects of the new penal provisions on the delivery of medical services are critically scrutinized, and the associated procedures as well as guidance for the course of action are presented. Knowledge of the relevant regulations and types of proceedings is decisive for minimizing penal, social-legislative, and professional conduct risks.

  8. To Amend Certain Federal Statutes to Enhance the Effectiveness of Job Training Programs in Penal Institutions. Hearing before the Subcommittee on Labor Standards of the Committee on Education and Labor, House of Representatives, Ninety-Fourth Congress, Second Session.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. House Committee on Education and Labor.

    A hearing before the subcommittee on labor standards was held to receive testimony on a bill, H.R. 2715, to amend Federal statutes to improve the effectiveness of job training programs in penal institutions. H.R. 2715, sponsored by Congressman Albert H. Quie of Minnesota, would permit the distribution in interstate commerce of goods produced by…

  9. Religious counseling in the penal context: strategies of trust and establishment of trusting relationships in a context of distrust.

    PubMed

    Brandner, Tobias

    2013-06-01

    The paper describes how distrust shapes the network of relationships between the different agents in the penal context, among inmates, between inmates and their family, between inmates and staff, between counselors and staff, and between inmates and counselors, and discusses how counseling strategies need to be adjusted to counter the effects of the institutional and biographical context of distrust. The paper is based on many years of participation and observation in the context of Hong Kong.

  10. A feature refinement approach for statistical interior CT reconstruction

    NASA Astrophysics Data System (ADS)

    Hu, Zhanli; Zhang, Yunwan; Liu, Jianbo; Ma, Jianhua; Zheng, Hairong; Liang, Dong

    2016-07-01

    Interior tomography is clinically desired to reduce the radiation dose rendered to patients. In this work, a new statistical interior tomography approach for computed tomography is proposed. The developed design focuses on taking into account the statistical nature of local projection data and recovering fine structures which are lost in the conventional total-variation (TV) minimization reconstruction. The proposed method falls within the compressed sensing framework of TV minimization, which only assumes that the interior ROI is piecewise constant or polynomial and does not need any additional prior knowledge. To integrate the statistical distribution property of projection data, the objective function is built under the criterion of penalized weighted least squares (PWLS-TV). In the implementation of the proposed method, the interior projection extrapolation based FBP reconstruction is first used as the initial guess to mitigate truncation artifacts and also provide an extended field-of-view. Moreover, an interior feature refinement step, an important processing operation, is performed after each iteration of PWLS-TV to recover the desired structure information which is lost during the TV minimization. Here, a feature descriptor is specifically designed and employed to distinguish structure from noise and noise-like artifacts. A modified steepest descent algorithm is adopted to minimize the associated objective function. The proposed method is applied to both digital phantom and in vivo Micro-CT datasets, and compared to FBP, ART-TV and PWLS-TV. The reconstruction results demonstrate that the proposed method performs better than other conventional methods in suppressing noise, reducing truncation and streak artifacts, and preserving features. The proposed approach demonstrates its potential usefulness for feature preservation of interior tomography under truncated projection measurements.

  11. A feature refinement approach for statistical interior CT reconstruction.

    PubMed

    Hu, Zhanli; Zhang, Yunwan; Liu, Jianbo; Ma, Jianhua; Zheng, Hairong; Liang, Dong

    2016-07-21

    Interior tomography is clinically desired to reduce the radiation dose rendered to patients. In this work, a new statistical interior tomography approach for computed tomography is proposed. The developed design focuses on taking into account the statistical nature of local projection data and recovering fine structures which are lost in the conventional total-variation (TV) minimization reconstruction. The proposed method falls within the compressed sensing framework of TV minimization, which only assumes that the interior ROI is piecewise constant or polynomial and does not need any additional prior knowledge. To integrate the statistical distribution property of projection data, the objective function is built under the criterion of penalized weighted least squares (PWLS-TV). In the implementation of the proposed method, the interior projection extrapolation based FBP reconstruction is first used as the initial guess to mitigate truncation artifacts and also provide an extended field-of-view. Moreover, an interior feature refinement step, an important processing operation, is performed after each iteration of PWLS-TV to recover the desired structure information which is lost during the TV minimization. Here, a feature descriptor is specifically designed and employed to distinguish structure from noise and noise-like artifacts. A modified steepest descent algorithm is adopted to minimize the associated objective function. The proposed method is applied to both digital phantom and in vivo Micro-CT datasets, and compared to FBP, ART-TV and PWLS-TV. The reconstruction results demonstrate that the proposed method performs better than other conventional methods in suppressing noise, reducing truncation and streak artifacts, and preserving features. The proposed approach demonstrates its potential usefulness for feature preservation of interior tomography under truncated projection measurements.
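
    The PWLS-TV idea, a weighted data-fidelity term plus a total-variation penalty, can be illustrated in one dimension. This is a toy smoothed-TV gradient descent on a denoising analogue, not the authors' modified steepest-descent reconstruction with feature refinement; beta, eps, and the step size are illustrative choices.

    ```python
    import numpy as np

    def pwls_tv_1d(y, w, beta=0.5, eps=1e-2, n_iter=500, step=0.05):
        """Gradient descent on a 1D analogue of the PWLS-TV objective:
        sum_i w_i (x_i - y_i)^2 + beta * sum_i sqrt((x_{i+1} - x_i)^2 + eps).
        eps smooths the TV term so plain gradient descent applies."""
        x = y.copy()
        for _ in range(n_iter):
            d = np.diff(x)
            g = d / np.sqrt(d ** 2 + eps)   # gradient of the smoothed TV term
            tv_grad = np.zeros_like(x)
            tv_grad[:-1] -= g               # d/dx_i of term i
            tv_grad[1:] += g                # d/dx_{i+1} of term i
            x = x - step * (2 * w * (x - y) + beta * tv_grad)
        return x
    ```

    On a noisy piecewise-constant signal, the TV penalty suppresses oscillations while preserving the jump, which is the behavior the abstract relies on for piecewise-constant interior ROIs.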

  12. Theoretical stability in coefficient inverse problems for general hyperbolic equations with numerical reconstruction

    NASA Astrophysics Data System (ADS)

    Yu, Jie; Liu, Yikan; Yamamoto, Masahiro

    2018-04-01

    In this article, we investigate the determination of the spatial component in the time-dependent second order coefficient of a hyperbolic equation from both theoretical and numerical aspects. By the Carleman estimates for general hyperbolic operators and an auxiliary Carleman estimate, we establish local Hölder stability with either partial boundary or interior measurements under certain geometrical conditions. For numerical reconstruction, we minimize a Tikhonov functional which penalizes the gradient of the unknown function. Based on the resulting variational equation, we design an iteration method which is updated by solving a Poisson equation at each step. One-dimensional prototype examples illustrate the numerical performance of the proposed iteration.

  13. Supervised Variational Relevance Learning, An Analytic Geometric Feature Selection with Applications to Omic Datasets.

    PubMed

    Boareto, Marcelo; Cesar, Jonatas; Leite, Vitor B P; Caticha, Nestor

    2015-01-01

    We introduce Supervised Variational Relevance Learning (Suvrel), a variational method to determine metric tensors that define distance-based similarity in pattern classification, inspired by relevance learning. The variational method is applied to a cost function that penalizes large intraclass distances and favors small interclass distances. We analytically find the metric tensor that minimizes the cost function. Preprocessing the patterns by applying linear transformations using the metric tensor yields a dataset which can be more efficiently classified. We test our methods on publicly available datasets with some standard classifiers. Among these datasets, two were tested by the MAQC-II project and, even without further preprocessing, our results improve on their performance.

  14. Spatial frequency performance limitations of radiation dose optimization and beam positioning

    NASA Astrophysics Data System (ADS)

    Stewart, James M. P.; Stapleton, Shawn; Chaudary, Naz; Lindsay, Patricia E.; Jaffray, David A.

    2018-06-01

    The flexibility and sophistication of modern radiotherapy treatment planning and delivery methods have advanced techniques to improve the therapeutic ratio. Contemporary dose optimization and calculation algorithms facilitate radiotherapy plans which closely conform the three-dimensional dose distribution to the target, with beam shaping devices and image guided field targeting ensuring the fidelity and accuracy of treatment delivery. Ultimately, dose distribution conformity is limited by the maximum deliverable dose gradient; shallow dose gradients challenge techniques to deliver a tumoricidal radiation dose while minimizing dose to surrounding tissue. In this work, this ‘dose delivery resolution’ observation is rigorously formalized for a general dose delivery model based on the superposition of dose kernel primitives. It is proven that the spatial resolution of a delivered dose is bounded by the spatial frequency content of the underlying dose kernel, which in turn defines a lower bound in the minimization of a dose optimization objective function. In addition, it is shown that this optimization is penalized by a dose deposition strategy which enforces a constant relative phase (or constant spacing) between individual radiation beams. These results are further refined to provide a direct, analytic method to estimate the dose distribution arising from the minimization of such an optimization function. The efficacy of the overall framework is demonstrated on an image guided small animal microirradiator for a set of two-dimensional hypoxia guided dose prescriptions.

  15. Hospitals with higher nurse staffing had lower odds of readmissions penalties than hospitals with lower staffing.

    PubMed

    McHugh, Matthew D; Berez, Julie; Small, Dylan S

    2013-10-01

    The Affordable Care Act's Hospital Readmissions Reduction Program (HRRP) penalizes hospitals based on excess readmission rates among Medicare beneficiaries. The aim of the program is to reduce readmissions while aligning hospitals' financial incentives with payers' and patients' quality goals. Many evidence-based interventions that reduce readmissions, such as discharge preparation, care coordination, and patient education, are grounded in the fundamentals of basic nursing care. Yet inadequate staffing can hinder nurses' efforts to carry out these processes of care. We estimated the effect that nurse staffing had on the likelihood that a hospital was penalized under the HRRP. Hospitals with higher nurse staffing had 25 percent lower odds of being penalized compared to otherwise similar hospitals with lower staffing. Investment in nursing is a potential system-level intervention to reduce readmissions that policy makers and hospital administrators should consider in the new regulatory environment as they examine the quality of care delivered to US hospital patients.

  16. Integrative Analysis of Cancer Diagnosis Studies with Composite Penalization

    PubMed Central

    Liu, Jin; Huang, Jian; Ma, Shuangge

    2013-01-01

    Summary In cancer diagnosis studies, high-throughput gene profiling has been extensively conducted, searching for genes whose expressions may serve as markers. Data generated from such studies have the “large d, small n” feature, with the number of genes profiled much larger than the sample size. Penalization has been extensively adopted for simultaneous estimation and marker selection. Because of small sample sizes, markers identified from the analysis of single datasets can be unsatisfactory. A cost-effective remedy is to conduct integrative analysis of multiple heterogeneous datasets. In this article, we investigate composite penalization methods for estimation and marker selection in integrative analysis. The proposed methods use the minimax concave penalty (MCP) as the outer penalty. Under the homogeneity model, the ridge penalty is adopted as the inner penalty. Under the heterogeneity model, the Lasso penalty and MCP are adopted as the inner penalty. Effective computational algorithms based on coordinate descent are developed. Numerical studies, including simulation and analysis of practical cancer datasets, show satisfactory performance of the proposed methods. PMID:24578589

  17. Orthogonalizing EM: A design-based least squares algorithm.

    PubMed

    Xiong, Shifeng; Dai, Bin; Huling, Jared; Qian, Peter Z G

    We introduce an efficient iterative algorithm, intended for various least squares problems, based on a design of experiments perspective. The algorithm, called orthogonalizing EM (OEM), works for ordinary least squares and can be easily extended to penalized least squares. The main idea of the procedure is to orthogonalize a design matrix by adding new rows and then solve the original problem by embedding the augmented design in a missing data framework. We establish several attractive theoretical properties concerning OEM. For ordinary least squares with a singular regression matrix, an OEM sequence converges to the Moore-Penrose generalized inverse-based least squares estimator. For ordinary and penalized least squares with various penalties, it converges to a point having grouping coherence for fully aliased regression matrices. Convergence and the convergence rate of the algorithm are examined. Finally, we demonstrate that OEM is highly efficient for large-scale least squares and penalized least squares problems, and is considerably faster than competing methods when n is much larger than p. Supplementary materials for this article are available online.
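
    The basic OEM fixed point for ordinary least squares can be sketched compactly: augmenting the design with rows that orthogonalize it leads to the simple iteration below, where d bounds the largest eigenvalue of X'X, and the penalized variant turns each step into a soft-thresholding update. Both functions are sketches under that reading of the abstract, with assumed defaults, not the authors' reference implementation.

    ```python
    import numpy as np

    def oem_ols(X, y, n_iter=1000):
        """OEM-style fixed-point iteration for ordinary least squares.
        d = ||X||_2^2 upper-bounds eigmax(X'X), so the iteration converges
        to a least-squares solution."""
        d = np.linalg.norm(X, 2) ** 2
        beta = np.zeros(X.shape[1])
        for _ in range(n_iter):
            beta = beta + X.T @ (y - X @ beta) / d
        return beta

    def oem_lasso(X, y, lam, n_iter=1000):
        # Penalized variant: with the orthogonalized (augmented) design,
        # each step has a closed-form soft-thresholding solution.
        d = np.linalg.norm(X, 2) ** 2
        beta = np.zeros(X.shape[1])
        for _ in range(n_iter):
            u = beta + X.T @ (y - X @ beta) / d
            beta = np.sign(u) * np.maximum(np.abs(u) - lam / d, 0.0)
        return beta
    ```

    With lam = 0 the penalized variant reduces to the ordinary iteration, which matches the abstract's claim that OEM extends directly from ordinary to penalized least squares.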

  18. Hospitals With Higher Nurse Staffing Had Lower Odds Of Readmissions Penalties Than Hospitals With Lower Staffing

    PubMed Central

    McHugh, Matthew D.; Berez, Julie; Small, Dylan S.

    2015-01-01

    The Affordable Care Act’s Hospital Readmissions Reduction Program (HRRP) penalizes hospitals based on excess readmission rates among Medicare beneficiaries. The aim of the program is to reduce readmissions while aligning hospitals’ financial incentives with payers’ and patients’ quality goals. Many evidence-based interventions that reduce readmissions, such as discharge preparation, care coordination, and patient education, are grounded in the fundamentals of basic nursing care. Yet inadequate staffing can hinder nurses’ efforts to carry out these processes of care. We estimated the effect that nurse staffing had on the likelihood that a hospital was penalized under the HRRP. Hospitals with higher nurse staffing had 25 percent lower odds of being penalized compared to otherwise similar hospitals with lower staffing. Investment in nursing is a potential system-level intervention to reduce readmissions that policy makers and hospital administrators should consider in the new regulatory environment as they examine the quality of care delivered to US hospital patients. PMID:24101063

  19. A Reward-Maximizing Spiking Neuron as a Bounded Rational Decision Maker.

    PubMed

    Leibfried, Felix; Braun, Daniel A

    2015-08-01

    Rate distortion theory describes how to communicate relevant information most efficiently over a channel with limited capacity. One of the many applications of rate distortion theory is bounded rational decision making, where decision makers are modeled as information channels that transform sensory input into motor output under the constraint that their channel capacity is limited. Such a bounded rational decision maker can be thought to optimize an objective function that trades off the decision maker's utility or cumulative reward against the information processing cost measured by the mutual information between sensory input and motor output. In this study, we interpret a spiking neuron as a bounded rational decision maker that aims to maximize its expected reward under the computational constraint that the mutual information between the neuron's input and output is upper bounded. This abstract computational constraint translates into a penalization of the deviation between the neuron's instantaneous and average firing behavior. We derive a synaptic weight update rule for such a rate distortion optimizing neuron and show in simulations that the neuron efficiently extracts reward-relevant information from the input by trading off its synaptic strengths against the collected reward.

  20. Risk management for sulfur dioxide abatement under multiple uncertainties

    NASA Astrophysics Data System (ADS)

    Dai, C.; Sun, W.; Tan, Q.; Liu, Y.; Lu, W. T.; Guo, H. C.

    2016-03-01

    In this study, interval-parameter programming, two-stage stochastic programming (TSP), and conditional value-at-risk (CVaR) were incorporated into a general optimization framework, leading to an interval-parameter CVaR-based two-stage programming (ICTP) method. The ICTP method had several advantages: (i) its objective function simultaneously took expected cost and risk cost into consideration, and also used discrete random variables and discrete intervals to reflect uncertain properties; (ii) it quantitatively evaluated the right tail of distributions of random variables which could better calculate the risk of violated environmental standards; (iii) it was useful for helping decision makers to analyze the trade-offs between cost and risk; and (iv) it was effective to penalize the second-stage costs, as well as to capture the notion of risk in stochastic programming. The developed model was applied to sulfur dioxide abatement in an air quality management system. The results indicated that the ICTP method could be used for generating a series of air quality management schemes under different risk-aversion levels, for identifying desired air quality management strategies for decision makers, and for considering a proper balance between system economy and environmental quality.
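
    The CVaR ingredient of the ICTP framework admits a simple discrete estimator: the expected loss over the worst (1 - alpha) fraction of scenarios, which is how the method "quantitatively evaluates the right tail" of the loss distribution. The sketch below is a generic estimator under that definition, not the paper's full interval-parameter two-stage model.

    ```python
    import numpy as np

    def cvar(losses, alpha=0.95):
        """Discrete conditional value-at-risk: the mean loss over the
        worst (1 - alpha) fraction of scenarios."""
        losses = np.sort(np.asarray(losses, dtype=float))
        k = int(np.ceil(alpha * len(losses)))
        tail = losses[k:] if k < len(losses) else losses[-1:]
        return float(tail.mean())
    ```

    Raising alpha focuses the risk measure further into the tail, which corresponds to the more risk-averse management schemes the abstract describes.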

  1. Penalized discriminant analysis for the detection of wild-grown and cultivated Ganoderma lucidum using Fourier transform infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Zhu, Ying; Tan, Tuck Lee

    2016-04-01

    An effective and simple analytical method using Fourier transform infrared (FTIR) spectroscopy to distinguish wild-grown, high-quality Ganoderma lucidum (G. lucidum) from cultivated specimens is of essential importance for its quality assurance and medicinal value estimation. Commonly used chemical and analytical methods based on the full spectrum are not very effective for detection and interpretation because of the complexity of the herbal medicine. In this study, two penalized discriminant analysis models, penalized linear discriminant analysis (PLDA) and the elastic net (Elnet), using FTIR spectroscopy have been explored for the purposes of discrimination and interpretation. The classification performances of the two penalized models have been compared with two widely used multivariate methods, principal component discriminant analysis (PCDA) and partial least squares discriminant analysis (PLSDA). The Elnet model, which combines L1 and L2 norm penalties, enabled an automatic selection of a small number of informative spectral absorption bands and gave an excellent classification accuracy of 99% for discrimination between spectra of wild-grown and cultivated G. lucidum. Its classification performance was superior to that of the PLDA model in a pure L1 setting, and it outperformed the PCDA and PLSDA models using the full wavelength range. The well-performed selection of informative spectral features leads to a substantial reduction in model complexity and an improvement in classification accuracy, and it is particularly helpful for quantitative interpretation of the major chemical constituents of G. lucidum regarding its anti-cancer effects.
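    The feature-selecting behavior of the combined L1 + L2 (elastic net) penalty can be sketched in a few lines. The example below uses synthetic regression data and a plain proximal-gradient loop, not the authors' Elnet implementation or FTIR spectra; it shows how the L1 term zeroes out uninformative "bands" while the L2 term stabilizes the fit.

```python
import numpy as np

# Synthetic data: only the first 3 of 20 features are informative.
rng = np.random.default_rng(1)
n, p = 100, 20
X = rng.normal(size=(n, p))
true_beta = np.zeros(p)
true_beta[:3] = [2.0, -1.5, 1.0]
y = X @ true_beta + 0.1 * rng.normal(size=n)

lam1, lam2 = 0.1, 0.1                         # L1 and L2 penalty weights
L = np.linalg.norm(X, 2) ** 2 / n + lam2      # Lipschitz constant of the smooth part
step = 1.0 / L
beta = np.zeros(p)
for _ in range(2000):
    grad = X.T @ (X @ beta - y) / n + lam2 * beta   # gradient of LS + ridge terms
    z = beta - step * grad
    beta = np.sign(z) * np.maximum(np.abs(z) - step * lam1, 0.0)  # soft threshold (L1 prox)

selected = np.flatnonzero(np.abs(beta) > 1e-8)      # surviving "informative bands"
```

    The soft-threshold step is what discards most features, analogous to the automatic selection of a small number of absorption bands reported in the abstract.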

  2. The "other" side of labor reform: accounts of incarceration and resistance in the Straits Settlements penal system, 1825-1873.

    PubMed

    Pieris, Anoma

    2011-01-01

    The rhetoric surrounding the transportation of prisoners to the Straits Settlements and the reformative capacity of the penal labor regime assumed a uniform subject, an impoverished criminal, who could be disciplined and accordingly civilized through labor. Stamford Raffles, as lieutenant governor of Benkulen, believed that upon realizing the advantages of the new colony, criminals would willingly become settlers. These two colonial prerogatives of labor and population categorized transportees into laboring classes whose exploitation supposedly brought mutual benefit. The colonized were collectively homogenized as a class of laborers, and evidence to the contrary, of politically challenging and resistant individuals, was suppressed. This paper focuses on two prisoners who were incriminated during the anti-colonial rebellions of the mid-nineteenth century and transported to the Straits Settlements. Nihal Singh, a political prisoner from Lahore, was incarcerated in isolation to prevent his martyrdom and denied the supposed benefits of labor reform. Conversely, Tikiri Banda Dunuwille, a lawyer from Ceylon, was sent to labor in Melaka as a form of humiliation. Tikiri's many schemes to evade labor damned him in the eyes of the authorities. The personal histories of these two individuals expose how colonial penal policy recognized and manipulated individual differences during a time of rising anti-colonial sentiment. The experiences of these prisoners, the responses of their communities, and the voices of their descendants offer us a very different entry point into colonial penal history.

  3. SU-F-18C-13: Low-Dose X-Ray CT Reconstruction Using a Hybrid First-Order Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, L; Lin, W; Jin, M

    2014-06-15

    Purpose: To develop a novel reconstruction method for X-ray CT that can lead to accurate reconstruction at significantly reduced dose levels, combining low X-ray incident intensity and few views of projection data. Methods: The noise in the projection data at low X-ray incident intensity was modeled and accounted for by the weighted least-squares (WLS) criterion. The total variation (TV) penalty was used to mitigate artifacts caused by few views of data. The first-order primal-dual (FOPD) algorithm was used to minimize TV in the image domain, which avoided the difficulty of the non-smooth objective function. The TV-penalized WLS reconstruction was achieved by alternating FOPD TV minimization and projection onto convex sets (POCS) for data fidelity constraints. The proposed FOPD-POCS method was evaluated using the FORBILD jaw phantom and real cadaver head CT data. Results: The quantitative measures, root mean square error (RMSE) and contrast-to-noise ratio (CNR), demonstrate the superior denoising capability of WLS over LS-based TV iterative reconstruction. The improvement in RMSE (WLS vs. LS) is 15%∼21% and that in CNR is 17%∼72% when the incident counts per ray range from 1×10^5 to 1×10^3. In addition, the TV regularization can accurately reconstruct images from about 50 views of the jaw phantom. The FOPD-POCS reconstruction reveals more structural details and suffers fewer artifacts in both the phantom and real head images. The FOPD-POCS method also shows fast convergence at low X-ray incident intensity. Conclusion: The new hybrid FOPD-POCS method, based on TV-penalized WLS, yields excellent image quality when the incident X-ray intensity is low and the projection views are limited. The reconstruction is computationally efficient since the FOPD minimization of TV is applied only in the image domain. The characteristics of FOPD-POCS can be exploited to significantly reduce the radiation dose of X-ray CT without compromising accuracy for diagnosis or treatment planning.
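    A much-reduced 1D analogue of the TV-penalized WLS objective can make the trade-off concrete. The sketch below denoises a piecewise-constant signal by gradient descent on 0.5·Σw(x−y)² + β·TV(x) with a smoothed TV; it is illustrative only and far simpler than the paper's FOPD-POCS solver, which works on projection data.

```python
import numpy as np

# Piecewise-constant ground truth observed with noise; uniform statistical weights.
rng = np.random.default_rng(2)
true = np.repeat([0.0, 1.0, 0.0], 30)
y = true + 0.2 * rng.normal(size=true.size)
w = np.ones_like(y)

beta, eps, step = 0.5, 1e-2, 0.05   # TV weight, TV smoothing, gradient step
x = y.copy()
for _ in range(500):
    d = np.diff(x)
    g = d / np.sqrt(d**2 + eps)      # derivative of smoothed |d|
    tv_grad = np.zeros_like(x)
    tv_grad[:-1] -= g                # each difference d_i couples x_i and x_{i+1}
    tv_grad[1:] += g
    x -= step * (w * (x - y) + beta * tv_grad)

mse_in = np.mean((y - true) ** 2)
mse_out = np.mean((x - true) ** 2)
```

    The TV term removes oscillatory noise while largely preserving the two jumps, which is the same edge-preserving behavior the abstract credits for artifact reduction at few views.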

  4. Bayesian penalized-likelihood reconstruction algorithm suppresses edge artifacts in PET reconstruction based on point-spread-function.

    PubMed

    Yamaguchi, Shotaro; Wagatsuma, Kei; Miwa, Kenta; Ishii, Kenji; Inoue, Kazumasa; Fukushi, Masahiro

    2018-03-01

    The Bayesian penalized-likelihood reconstruction algorithm (BPL), Q.Clear, uses a relative difference penalty as a regularization function to control image noise and the degree of edge preservation in PET images. The present study aimed to determine the suppression of edge artifacts due to point-spread-function (PSF) correction using Q.Clear. A cylindrical phantom contained spheres and a background of 5.3 kBq/mL of [18F]FDG, with sphere-to-background ratios (SBR) of 16, 8, 4 and 2; spheres containing 21.2 kBq/mL of [18F]FDG in a water background served as the non-background condition. All data were acquired using a Discovery PET/CT 710 and were reconstructed using three-dimensional ordered-subset expectation maximization with time-of-flight (TOF) and PSF correction (3D-OSEM), and Q.Clear with TOF (BPL). We investigated β-values of 200-800 using BPL. The PET images were analyzed using visual assessment and profile curves, and edge variability and contrast recovery coefficients were measured. The 38- and 27-mm spheres were surrounded by a higher radioactivity concentration when reconstructed with 3D-OSEM as opposed to BPL, which suppressed edge artifacts. Images of 10-mm spheres had sharper overshoot at high SBR and non-background when reconstructed with BPL. Although the contrast recovery coefficients of 10-mm spheres in BPL decreased as a function of increasing β, a higher penalty parameter decreased the overshoot. BPL is a feasible method for the suppression of the edge artifacts of PSF correction, although this depends on SBR and sphere size. Overshoot associated with BPL caused overestimation in small spheres at high SBR. A higher penalty parameter in BPL can suppress overshoot more effectively. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
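    The relative difference penalty at the heart of this regularization has a simple closed form (in the Nuyts et al. formulation): each neighbor pair contributes (f_j − f_k)² / (f_j + f_k + γ|f_j − f_k|), scaled by β. The sketch below computes it for a 1D image with nearest neighbors; it is a generic illustration, not Q.Clear's implementation.

```python
import numpy as np

def relative_difference_penalty(f, beta=350.0, gamma=2.0):
    """Relative difference penalty for a 1D image with nearest-neighbour pairs.

    gamma controls edge preservation (larger gamma penalizes big jumps less);
    beta scales the overall penalty strength, as in the abstract's beta-values.
    """
    d = f[:-1] - f[1:]                 # neighbour differences
    s = f[:-1] + f[1:]                 # neighbour sums
    return beta * np.sum(d**2 / (s + gamma * np.abs(d) + 1e-12))

flat = np.full(10, 5.0)                        # uniform activity: no penalty
edge = np.array([5.0] * 5 + [20.0] * 5)        # one sharp edge: finite penalty

p_flat = relative_difference_penalty(flat)
p_edge = relative_difference_penalty(edge)
```

    Because the denominator grows with both local activity and the size of the jump, the penalty is relative rather than absolute, which is why it preserves edges better than a plain quadratic penalty.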

  5. Space station onboard propulsion system: Technology study

    NASA Technical Reports Server (NTRS)

    Mcallister, J. G.; Rudland, R. S.; Redd, L. R.; Beekman, D. H.; Cuffin, S. M.; Beer, C. M.; Mccarthy, K. K.

    1987-01-01

    The objective was to prepare for the design of the space station propulsion system. Propulsion system concepts were defined and schematics were developed for the most viable concepts. A dual-mode bipropellant system was found to deliver the largest amount of payload. However, when resupply is considered, an electrolysis system with 10 percent accumulators requires less resupply propellant, though it is penalized by the amount of time required to fill the accumulators and the power requirements of the electrolyzer. A computer simulation was prepared, which was originally intended to simulate the water electrolysis propulsion system but was expanded to model other types of systems such as cold gas, monopropellant, and bipropellant storable systems.

  6. [Penal and non-penal legislative policy in relation to human biotechnology].

    PubMed

    Romeo Casabona, Carlos María

    2007-01-01

    The Spanish legislator has introduced a set of legislative novelties in the field of human biotechnology or is about to do so. This will be done either through the reform of some laws or through the approval of new laws, that is, without previous regulatory references available. The greater part of these novelties turn on research with cells or cell lines of human origin, specifically those from human embryos and through the use of diverse techniques, such as reproductive cloning and non-reproductive ('therapeutic') cloning.

  7. Fast Spatial Resolution Analysis of Quadratic Penalized Least-Squares Image Reconstruction With Separate Real and Imaginary Roughness Penalty: Application to fMRI.

    PubMed

    Olafsson, Valur T; Noll, Douglas C; Fessler, Jeffrey A

    2018-02-01

    Penalized least-squares iterative image reconstruction algorithms used for spatial resolution-limited imaging, such as functional magnetic resonance imaging (fMRI), commonly use a quadratic roughness penalty to regularize the reconstructed images. When used for complex-valued images, the conventional roughness penalty regularizes the real and imaginary parts equally. However, these imaging methods sometimes benefit from separate penalties for each part. The spatial smoothness from the roughness penalty on the reconstructed image is dictated by the regularization parameter(s). One method to set the parameter to a desired smoothness level is to evaluate the full width at half maximum of the reconstruction method's local impulse response. Previous work has shown that when using the conventional quadratic roughness penalty, one can approximate the local impulse response using an FFT-based calculation. However, that acceleration method cannot be applied directly for separate real and imaginary regularization. This paper proposes a fast and stable calculation for this case that also uses FFT-based calculations to approximate the local impulse responses of the real and imaginary parts. This approach is demonstrated with a quadratic image reconstruction of fMRI data that uses separate roughness penalties for the real and imaginary parts.
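    For a circulant (shift-invariant) system, the FFT-based local impulse response this abstract builds on has a simple form: for x̂ = argmin ‖Ax−y‖² + λ‖Dx‖², the reconstructor's frequency response is |A|²/(|A|² + λ|D|²). The 1D sketch below (a generic analogue, not the paper's fMRI code or its separate real/imaginary penalties) shows how increasing λ broadens and lowers the impulse response.

```python
import numpy as np

n = 64
a = np.zeros(n)
a[:3] = [0.5, 0.3, 0.2]        # circulant blur kernel (system A)
d = np.zeros(n)
d[0], d[1] = 1.0, -1.0         # first-difference roughness operator (D)

A2 = np.abs(np.fft.fft(a)) ** 2
D2 = np.abs(np.fft.fft(d)) ** 2

def impulse_response(lam):
    # Frequency response of (A'A + lam*D'D)^{-1} A'A applied to a unit impulse.
    H = A2 / (A2 + lam * D2)
    return np.real(np.fft.ifft(H))

h_small = impulse_response(0.01)   # light regularization: sharp response
h_large = impulse_response(1.0)    # heavy regularization: smoother response
```

    Reading the full width at half maximum off such responses is the standard way to map a regularization parameter to a target smoothness, as described in the abstract.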

  8. Advisory Algorithm for Scheduling Open Sectors, Operating Positions, and Workstations

    NASA Technical Reports Server (NTRS)

    Bloem, Michael; Drew, Michael; Lai, Chok Fung; Bilimoria, Karl D.

    2012-01-01

    Air traffic controller supervisors configure available sector, operating position, and workstation resources to safely and efficiently control air traffic in a region of airspace. In this paper, an algorithm for assisting supervisors with this task is described and demonstrated on two sample problem instances. The algorithm produces configuration schedule advisories that minimize a cost. The cost is a weighted sum of two competing costs: one penalizing mismatches between configurations and predicted air traffic demand and another penalizing the effort associated with changing configurations. The problem considered by the algorithm is a shortest path problem that is solved with a dynamic programming value iteration algorithm. The cost function contains numerous parameters. Default values for most of these are suggested based on descriptions of air traffic control procedures and subject-matter expert feedback. The parameter determining the relative importance of the two competing costs is tuned by comparing historical configurations with corresponding algorithm advisories. Two sample problem instances for which appropriate configuration advisories are obvious were designed to illustrate characteristics of the algorithm. Results demonstrate how the algorithm suggests advisories that appropriately utilize changes in airspace configurations and changes in the number of operating positions allocated to each open sector. The results also demonstrate how the advisories suggest appropriate times for configuration changes.
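    The shortest-path structure described above can be sketched as a tiny Viterbi-style dynamic program. Configurations, demand, and costs below are invented for illustration (not the paper's parameterization): each stage pays a demand-mismatch cost plus a switch cost when the configuration changes.

```python
# Toy configuration scheduling: configuration value = number of open sectors.
configs = [1, 2, 3]
demand = [1, 1, 3, 3, 1]     # predicted demand per time step (hypothetical)
switch_cost = 0.6            # effort penalty for changing configuration

# Stage 0: only the mismatch cost applies.
cost = {c: abs(c - demand[0]) for c in configs}
choice = {c: [c] for c in configs}
for t in range(1, len(demand)):
    new_cost, new_choice = {}, {}
    for c in configs:
        # Best predecessor = min over previous configs of (cost so far + switch cost).
        prev = min(configs, key=lambda p: cost[p] + (switch_cost if p != c else 0.0))
        trans = switch_cost if prev != c else 0.0
        new_cost[c] = cost[prev] + trans + abs(c - demand[t])
        new_choice[c] = choice[prev] + [c]
    cost, choice = new_cost, new_choice

best_final = min(configs, key=lambda c: cost[c])
schedule = choice[best_final]
best_total = cost[best_final]
```

    With this demand profile, switching to three open sectors for the busy middle period and back again beats both staying put and switching late, which mirrors the "appropriate times for configuration changes" behavior the abstract reports.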

  9. A Fast and Scalable Method for A-Optimal Design of Experiments for Infinite-dimensional Bayesian Nonlinear Inverse Problems with Application to Porous Medium Flow

    NASA Astrophysics Data System (ADS)

    Petra, N.; Alexanderian, A.; Stadler, G.; Ghattas, O.

    2015-12-01

    We address the problem of optimal experimental design (OED) for Bayesian nonlinear inverse problems governed by partial differential equations (PDEs). The inverse problem seeks to infer a parameter field (e.g., the log permeability field in a porous medium flow model problem) from synthetic observations at a set of sensor locations and from the governing PDEs. The goal of the OED problem is to find an optimal placement of sensors so as to minimize the uncertainty in the inferred parameter field. We formulate the OED objective function by generalizing the classical A-optimal experimental design criterion using the expected value of the trace of the posterior covariance. This expected value is computed through sample averaging over the set of likely experimental data. Due to the infinite-dimensional character of the parameter field, we seek an optimization method that solves the OED problem at a cost (measured in the number of forward PDE solves) that is independent of both the parameter and the sensor dimension. To facilitate this goal, we construct a Gaussian approximation to the posterior at the maximum a posteriori probability (MAP) point, and use the resulting covariance operator to define the OED objective function. We use randomized trace estimation to compute the trace of this covariance operator. The resulting OED problem includes as constraints the system of PDEs characterizing the MAP point, and the PDEs describing the action of the covariance (of the Gaussian approximation to the posterior) to vectors. We control the sparsity of the sensor configurations using sparsifying penalty functions, and solve the resulting penalized bilevel optimization problem via an interior-point quasi-Newton method, where gradient information is computed via adjoints. We elaborate our OED method for the problem of determining the optimal sensor configuration to best infer the log permeability field in a porous medium flow problem. 
Numerical results show that the number of PDE solves required for the evaluation of the OED objective function and its gradient is essentially independent of both the parameter dimension and the sensor dimension (i.e., the number of candidate sensor locations). The number of quasi-Newton iterations for computing an OED also exhibits the same dimension invariance properties.
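    The randomized trace estimation step can be illustrated compactly: Hutchinson's estimator approximates tr(C) by (1/m)·Σ zᵢᵀCzᵢ with random ±1 probe vectors, needing only matrix-vector products with C. The sketch below uses a small dense matrix for checking; in the paper's setting each product with the covariance would instead involve PDE solves.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 200, 500
Q = rng.normal(size=(n, n))
C = Q @ Q.T / n                            # symmetric PSD stand-in for the posterior covariance

Z = rng.choice([-1.0, 1.0], size=(n, m))   # Rademacher probe vectors
quad_forms = np.einsum("im,im->m", Z, C @ Z)   # z_i^T C z_i for each probe
est = quad_forms.mean()

exact = np.trace(C)
rel_err = abs(est - exact) / exact
```

    Because only products C @ z are required, the estimator's cost is a fixed number of "applications" of C regardless of n, which is what makes the OED objective evaluation dimension-independent.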

  10. Hydrophobic potential of mean force as a solvation function for protein structure prediction.

    PubMed

    Lin, Matthew S; Fawzi, Nicolas Lux; Head-Gordon, Teresa

    2007-06-01

    We have developed a solvation function that combines a Generalized Born model for polarization of protein charge by the high dielectric solvent, with a hydrophobic potential of mean force (HPMF) as a model for hydrophobic interaction, to aid in the discrimination of native structures from other misfolded states in protein structure prediction. We find that our energy function outperforms other reported scoring functions in terms of correct native ranking for 91% of proteins and low Z scores for a variety of decoy sets, including the challenging Rosetta decoys. This work shows that the stabilizing effect of hydrophobic exposure to aqueous solvent that defines the HPMF hydration physics is an apparent improvement over solvent-accessible surface area models that penalize hydrophobic exposure. Decoys generated by thermal sampling around the native-state basin reveal a potentially important role for side-chain entropy in the future development of even more accurate free energy surfaces.

  11. Should Lifestyles Be a Criterion for Healthcare Rationing? Evidence from a Portuguese Survey.

    PubMed

    Borges, Ana Pinto; Pinho, Micaela

    2017-11-18

    We evaluated whether different personal responsibilities should influence the allocation of healthcare resources, and whether attitudes toward the penalization of risk behaviours vary with individuals' sociodemographic characteristics and health-related habits. A cross-sectional study was conducted: we developed an online survey and made it available on various social networks for six months during 2015. The sample covered the population aged 18 yr and older living in Portugal, and we received 296 valid answers. Respondents faced four lifestyle choices, smoking, consumption of alcoholic beverages, unhealthy diet, and illegal drug use, and had to decide whether each one is relevant when establishing healthcare priorities. Logistic regressions were used to explore the relation of respondents' sociodemographic characteristics and health-related behaviours to the likelihood of agreeing that patients engaged in risky behaviour deserve a lower priority. Illegal drug use was the behaviour most penalized (65.5%), followed by heavy drinking (61.5%) and smoking (51.0%); the least penalized was unhealthy dieting (29.7%). Sociodemographic characteristics had different impacts on the penalization of the risk behaviours. Moreover, respondents who supported the idea that unhealthy lifestyles should receive lower priority strongly agreed that smoking (OR=36.05; 95% CI: 8.72, 149.12), unhealthy diets (OR=12.87; 95% CI: 3.21, 51.53), excessive alcohol consumption (OR=20.51; 95% CI: 12.09, 85.46), and illegal drug use (OR=73.21; 95% CI: 9.78, 97.83) must have a lower priority in access to healthcare. The respondents thus accept the notion of rationing healthcare based on lifestyles.

  12. Penalized discriminant analysis for the detection of wild-grown and cultivated Ganoderma lucidum using Fourier transform infrared spectroscopy.

    PubMed

    Zhu, Ying; Tan, Tuck Lee

    2016-04-15

    An effective and simple analytical method using Fourier transform infrared (FTIR) spectroscopy to distinguish wild-grown, high-quality Ganoderma lucidum (G. lucidum) from cultivated specimens is of essential importance for its quality assurance and medicinal value estimation. Commonly used chemical and analytical methods based on the full spectrum are not very effective for detection and interpretation because of the complexity of the herbal medicine. In this study, two penalized discriminant analysis models, penalized linear discriminant analysis (PLDA) and the elastic net (Elnet), using FTIR spectroscopy have been explored for the purposes of discrimination and interpretation. The classification performances of the two penalized models have been compared with two widely used multivariate methods, principal component discriminant analysis (PCDA) and partial least squares discriminant analysis (PLSDA). The Elnet model, which combines L1 and L2 norm penalties, enabled an automatic selection of a small number of informative spectral absorption bands and gave an excellent classification accuracy of 99% for discrimination between spectra of wild-grown and cultivated G. lucidum. Its classification performance was superior to that of the PLDA model in a pure L1 setting, and it outperformed the PCDA and PLSDA models using the full wavelength range. The well-performed selection of informative spectral features leads to a substantial reduction in model complexity and an improvement in classification accuracy, and it is particularly helpful for quantitative interpretation of the major chemical constituents of G. lucidum regarding its anti-cancer effects. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Folded concave penalized learning in identifying multimodal MRI marker for Parkinson’s disease

    PubMed Central

    Liu, Hongcheng; Du, Guangwei; Zhang, Lijun; Lewis, Mechelle M.; Wang, Xue; Yao, Tao; Li, Runze; Huang, Xuemei

    2016-01-01

    Background Brain MRI holds promise to gauge different aspects of Parkinson’s disease (PD)-related pathological changes. Its analysis, however, is hindered by the high-dimensional nature of the data. New method This study introduces folded concave penalized (FCP) sparse logistic regression to identify biomarkers for PD from a large number of potential factors. The proposed statistical procedures target the challenges of high-dimensionality with the limited data samples acquired. The maximization problem associated with the sparse logistic regression model is solved by local linear approximation. The proposed procedures are then applied to the empirical analysis of multimodal MRI data. Results From 45 features, the proposed approach identified 15 MRI markers and the UPSIT, which are known to be clinically relevant to PD. By combining the MRI and clinical markers, we can substantially enhance the specificity and sensitivity of the model, as indicated by the ROC curves. Comparison to existing methods We compare the folded concave penalized learning scheme with both the Lasso penalized scheme and principal component analysis-based feature selection (PCA) in the Parkinson’s biomarker identification problem that takes into account both the clinical features and MRI markers. The folded concave penalty method demonstrates a substantially better clinical potential than both the Lasso and PCA in terms of specificity and sensitivity. Conclusions For the first time, we applied the FCP learning method to MRI biomarker discovery in PD. The proposed approach successfully identified MRI markers that are clinically relevant. Combining these biomarkers with clinical features can substantially enhance performance. PMID:27102045

  14. Evaluating penalized logistic regression models to predict Heat-Related Electric grid stress days

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bramer, L. M.; Rounds, J.; Burleyson, C. D.

    Understanding the conditions associated with stress on the electricity grid is important in the development of contingency plans for maintaining reliability during periods when the grid is stressed. In this paper, heat-related grid stress and its relationship with weather conditions are examined using data from the eastern United States. Penalized logistic regression models were developed and applied to predict stress on the electric grid using weather data. The inclusion of other weather variables, such as precipitation, in addition to temperature improved model performance. Several candidate models and datasets were examined. A penalized logistic regression model fit at the operation-zone level was found to provide predictive value and interpretability. Additionally, the importance of different weather variables observed at different time scales was examined. Maximum temperature and precipitation were identified as important across all zones, while the importance of other weather variables was zone specific. The methods presented in this work are extensible to other regions and can be used to aid in planning and development of the electrical grid.
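    A penalized logistic regression of the kind described can be sketched with a short gradient-descent fit. The "weather" data below are synthetic and the model is a generic L2-penalized logistic regression, not the paper's operation-zone models: stress days are generated so that hotter days raise stress probability and rainier days lower it.

```python
import numpy as np

# Synthetic daily weather and stress labels (hypothetical parameters).
rng = np.random.default_rng(4)
n = 400
tmax = rng.normal(30.0, 5.0, n)          # daily max temperature (deg C)
precip = rng.exponential(2.0, n)         # daily precipitation (mm)
logit = 0.4 * (tmax - 33.0) - 0.3 * precip
stress = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

# Design matrix: intercept + standardized predictors.
X = np.column_stack([np.ones(n),
                     (tmax - tmax.mean()) / tmax.std(),
                     (precip - precip.mean()) / precip.std()])

lam = 1.0
w = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    # Penalized log-likelihood gradient; the intercept is left unpenalized.
    grad = X.T @ (p - stress) / n + lam / n * np.r_[0.0, w[1:]]
    w -= 0.5 * grad
```

    The fitted signs (positive for maximum temperature, negative for precipitation) match the abstract's finding that both variables carry predictive information beyond temperature alone.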

  15. Evaluating penalized logistic regression models to predict Heat-Related Electric grid stress days

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bramer, Lisa M.; Rounds, J.; Burleyson, C. D.

    Understanding the conditions associated with stress on the electricity grid is important in the development of contingency plans for maintaining reliability during periods when the grid is stressed. In this paper, heat-related grid stress and its relationship with weather conditions were examined using data from the eastern United States. Penalized logistic regression models were developed and applied to predict stress on the electric grid using weather data. The inclusion of other weather variables, such as precipitation, in addition to temperature improved model performance. Several candidate models and combinations of predictive variables were examined. A penalized logistic regression model which was fit at the operation-zone level was found to provide predictive value and interpretability. Additionally, the importance of different weather variables observed at various time scales was examined. Maximum temperature and precipitation were identified as important across all zones, while the importance of other weather variables was zone specific. In conclusion, the methods presented in this work are extensible to other regions and can be used to aid in planning and development of the electrical grid.

  16. A premodern legacy: the "easy" criminalization of homosexual acts between women in the Finnish Penal Code of 1889.

    PubMed

    Löfström, J

    1998-01-01

    Homosexual acts between women were criminalized in Finland in the 1889 Penal Code, which also criminalized men's homosexual acts explicitly for the first time in Finnish legislation. The inclusion of women in the Penal Code took place without much ado. In the article it is argued that this uncomplicated juxtaposing of men and women was due to the legacy of a cultural pattern in which man and woman, as categories, were not in an all-pervasive polarity to each other, for example, in sexual subjectivity. A cultural pattern of low gender polarization was typical of preindustrial rural culture, and it can also help us understand certain other features of contemporary Finnish social and political life, for example, Finnish women being the first in the world to obtain general franchise and eligibility for parliament, in 1906. A modern image of "public man" and "private woman" was only making its way into Finnish society; hence, there was not much anxiety about women's entry into politics or, for that matter, about their potential for (homo)sexual subjectivity becoming recognized publicly in criminal law.

  17. [Forensic-psychiatric assessment of pedophilia].

    PubMed

    Nitschke, J; Osterheider, M; Mokros, A

    2011-09-01

    The present paper illustrates the approach of a forensic psychiatric expert witness in the assessment of pedophilia. The first step is to determine whether the defendant suffers from pedophilia or whether the alleged crime might have been committed for other motivations (antisociality, sexual activity as redirection, impulsivity). A sound diagnostic assessment is indispensable for this task. In a second step, the level of severity needs to be gauged in order to clarify whether the entry criteria of §§ 20 and 21 of the German penal code are fulfilled. In a third step, significant impairments of self-control mechanisms need to be elucidated. The present article reviews indicators of such impairments regarding pedophilia. With respect to a mandatory treatment order (§ 63 German penal code) or preventive detention (§ 66 German penal code), the legal prognosis of the defendant needs to be considered. The present paper gives an overview of the current state of risk assessment research and critically discusses the transfer to an individual prognosis. © Georg Thieme Verlag KG Stuttgart · New York.

  18. Orthogonalizing EM: A design-based least squares algorithm

    PubMed Central

    Xiong, Shifeng; Dai, Bin; Huling, Jared; Qian, Peter Z. G.

    2016-01-01

    We introduce an efficient iterative algorithm, intended for various least squares problems, based on a design of experiments perspective. The algorithm, called orthogonalizing EM (OEM), works for ordinary least squares and can be easily extended to penalized least squares. The main idea of the procedure is to orthogonalize a design matrix by adding new rows and then solve the original problem by embedding the augmented design in a missing data framework. We establish several attractive theoretical properties concerning OEM. For the ordinary least squares with a singular regression matrix, an OEM sequence converges to the Moore-Penrose generalized inverse-based least squares estimator. For ordinary and penalized least squares with various penalties, it converges to a point having grouping coherence for fully aliased regression matrices. Convergence and the convergence rate of the algorithm are examined. Finally, we demonstrate that OEM is highly efficient for large-scale least squares and penalized least squares problems, and is considerably faster than competing methods when n is much larger than p. Supplementary materials for this article are available online. PMID:27499558
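    For ordinary least squares, the OEM idea reduces to a particularly simple fixed-point iteration: choosing an orthogonalizing constant γ at least as large as the top eigenvalue of X'X, the EM update on the row-augmented (orthogonalized) design becomes β ← β + X'(y − Xβ)/γ. The sketch below is a minimal illustration of that reduction on toy data, not the paper's full algorithm (which also covers penalized and singular cases).

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 50, 5
X = rng.normal(size=(n, p))
y = X @ np.arange(1.0, p + 1) + 0.01 * rng.normal(size=n)

# Orthogonalizing constant: gamma >= largest eigenvalue of X'X.
gamma = np.linalg.norm(X, 2) ** 2

beta = np.zeros(p)
for _ in range(5000):
    beta = beta + X.T @ (y - X @ beta) / gamma   # OEM/EM update for the augmented design

beta_ls = np.linalg.lstsq(X, y, rcond=None)[0]   # reference least-squares solution
```

    Each iteration costs only two matrix-vector products, which is consistent with the abstract's point that the method scales well when n is much larger than p.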

  19. Evaluating penalized logistic regression models to predict Heat-Related Electric grid stress days

    DOE PAGES

    Bramer, Lisa M.; Rounds, J.; Burleyson, C. D.; ...

    2017-09-22

    Understanding the conditions associated with stress on the electricity grid is important in the development of contingency plans for maintaining reliability during periods when the grid is stressed. In this paper, heat-related grid stress and its relationship with weather conditions were examined using data from the eastern United States. Penalized logistic regression models were developed and applied to predict stress on the electric grid using weather data. The inclusion of other weather variables, such as precipitation, in addition to temperature improved model performance. Several candidate models and combinations of predictive variables were examined. A penalized logistic regression model which was fit at the operation-zone level was found to provide predictive value and interpretability. Additionally, the importance of different weather variables observed at various time scales was examined. Maximum temperature and precipitation were identified as important across all zones, while the importance of other weather variables was zone specific. In conclusion, the methods presented in this work are extensible to other regions and can be used to aid in planning and development of the electrical grid.

  20. A 2-step penalized regression method for family-based next-generation sequencing association studies.

    PubMed

    Ding, Xiuhua; Su, Shaoyong; Nandakumar, Kannabiran; Wang, Xiaoling; Fardo, David W

    2014-01-01

    Large-scale genetic studies are often composed of related participants, and utilizing familial relationships can be cumbersome and computationally challenging. We present an approach to efficiently handle sequencing data from complex pedigrees that incorporates information from rare variants as well as common variants. Our method employs a 2-step procedure that sequentially regresses out correlation from familial relatedness and then uses the resulting phenotypic residuals in a penalized regression framework to test for associations with variants within genetic units. The operating characteristics of this approach are detailed using simulation data based on a large, multigenerational cohort.
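    The 2-step structure can be sketched end to end on synthetic family data. Everything below is illustrative: step 1 uses family-mean phenotypes as a crude stand-in for the kinship-based model that would regress out familial correlation, and step 2 runs an L1-penalized (lasso-style) proximal-gradient fit of the residuals on variant genotypes.

```python
import numpy as np

rng = np.random.default_rng(6)
n_fam, fam_size, n_var = 40, 4, 8
n = n_fam * fam_size
fam = np.repeat(np.arange(n_fam), fam_size)

# Phenotype = shared family effect + effect of variant 0 + individual noise.
fam_effect = np.repeat(rng.normal(0.0, 1.0, n_fam), fam_size)
G = rng.binomial(2, 0.3, size=(n, n_var)).astype(float)   # genotype dosages 0/1/2
y = fam_effect + 0.8 * G[:, 0] + rng.normal(0.0, 0.5, n)

# Step 1: regress out familial relatedness (family means as a crude proxy).
fam_means = np.array([y[fam == f].mean() for f in fam])
resid = y - fam_means

# Step 2: penalized regression of residuals on variants (ISTA / soft thresholding).
Gc = G - G.mean(axis=0)
lam_tot = 20.0
step = 1.0 / np.linalg.norm(Gc, 2) ** 2
beta = np.zeros(n_var)
for _ in range(3000):
    z = beta - step * Gc.T @ (Gc @ beta - resid)
    beta = np.sign(z) * np.maximum(np.abs(z) - step * lam_tot, 0.0)
```

    Decoupling the two steps is what keeps the procedure computationally tractable for complex pedigrees: the relatedness adjustment is done once, and the penalized fit then treats the residuals as (approximately) independent observations.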

  1. Parallel algorithm of real-time infrared image restoration based on total variation theory

    NASA Astrophysics Data System (ADS)

    Zhu, Ran; Li, Miao; Long, Yunli; Zeng, Yaoyuan; An, Wei

    2015-10-01

    Image restoration is a necessary preprocessing step for infrared remote sensing applications. Traditional methods allow us to remove the noise but penalize the gradients corresponding to edges too heavily. Image restoration techniques based on variational approaches can solve this over-smoothing problem thanks to their well-defined mathematical modeling of the restoration procedure. The total variation (TV) of the infrared image is introduced as an L1 regularization term added to the objective energy functional. This converts the restoration process into an optimization problem over a functional comprising a fidelity term to the image data plus a regularization term. Infrared image restoration with the TV-L1 model fully exploits the remote sensing data and preserves information at edges, such as those caused by clouds. The numerical implementation algorithm is presented in detail. Analysis indicates that the structure of this algorithm can easily be parallelized. Therefore, a parallel implementation of the TV-L1 filter based on a multicore architecture with shared memory is proposed for real-time infrared remote sensing systems. The massive computation on image data is performed in parallel by cooperating threads running simultaneously on multiple cores. Several groups of synthetic infrared image data are used to validate the feasibility and effectiveness of the proposed parallel algorithm. A quantitative analysis of restored image quality relative to the input image is presented. Experimental results show that the TV-L1 filter can restore the varying background image reasonably and that its performance meets the requirements of real-time image processing.
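    The shared-memory data parallelism described above can be sketched generically: split the image into independent row blocks and let worker threads apply the same per-block filter. The filter here is a trivial horizontal smoothing pass standing in for the TV-L1 update, purely to show the decomposition; it is not the paper's filter, and a real implementation must handle halo rows when the stencil couples blocks vertically.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def filter_block(block):
    # Per-row horizontal 3-tap average: rows are independent, so row blocks
    # can be processed in parallel with no inter-block communication.
    out = block.copy()
    out[:, 1:-1] = (block[:, :-2] + block[:, 1:-1] + block[:, 2:]) / 3.0
    return out

def parallel_filter(img, n_workers=4):
    blocks = np.array_split(img, n_workers, axis=0)   # independent row blocks
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return np.vstack(list(pool.map(filter_block, blocks)))

rng = np.random.default_rng(7)
img = rng.normal(size=(64, 64))
serial = filter_block(img)
parallel = parallel_filter(img)
```

    Because the block computations are identical and disjoint, the parallel result matches the serial one exactly, which is the correctness property one would verify before measuring speedup on multicore hardware.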

  2. The geology of the Penal/Barrackpore field, onshore Trinidad

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyer, B.L.

    1991-03-01

    The Penal/Barrackpore field was discovered in 1938 and is located in the southern subbasin of onshore Trinidad. It is one of a series of northeast-southwest trending en echelon middle Miocene anticlinal structures that was later accentuated by late Pliocene transpressional folding. The middle Miocene Herrera and Karamat turbiditic sandstones are the primary reservoir rocks in the subsurface anticline of the Penal/Barrackpore field. These turbidites were sourced from the north and deposited within the marls and clays of the Cipero Formation. The Karamat sandstones are followed in vertical stratigraphic succession by the shales and boulder beds of the Lengua Formation, the turbidites and deltaics of the lower and middle Cruse, and the deltaics of the upper Cruse, the Forest, and the Morne L'Enfer Formations. Relative movement of the South American and Caribbean plates climaxed in the middle Miocene compressive tectonic event and produced an imbricate pattern of southward-facing basement-involved thrusts. The Pliocene deltaics were sourced by erosion of Miocene highs to the north and the South American landmass to the south. These deltaics exhibit onlap onto the preexisting Miocene highs. The late Pliocene transpression also coincides with the onset of oil migration along faults, diapirs, and unconformities from the Cretaceous Naparima Hill source. The Lengua Formation and the upper Forest clays are considered effective seals. Hydrocarbon trapping is structurally and stratigraphically controlled, with structure being the dominant trapping mechanism. Ultimate recoverable reserves for the Penal/Barrackpore field are estimated at 127.9 MMBO and 628.8 bcf. The field is presently owned and operated by the Trinidad and Tobago Oil Company Limited (TRINTOC).

  3. Valuing hydrological alteration in multi-objective water resources management

    NASA Astrophysics Data System (ADS)

    Bizzi, Simone; Pianosi, Francesca; Soncini-Sessa, Rodolfo

    2012-11-01

    Summary: The management of water through the impoundment of rivers by dams and reservoirs is necessary to support key human activities such as hydropower production, agriculture and flood risk mitigation. Advances in multi-objective optimization techniques and ever-growing computing power make it possible to design reservoir operating policies that represent Pareto-optimal tradeoffs between multiple interests. On the one hand, such optimization methods can enhance performance on commonly targeted objectives (such as hydropower production or water supply); on the other hand, they risk strongly penalizing any interest not directly (i.e. mathematically) included in the optimization algorithm. The alteration of the downstream hydrological regime is a well-established cause of ecological degradation, and its evaluation and rehabilitation are commonly required by recent legislation (such as the Water Framework Directive in Europe). However, it is rarely embedded in reservoir optimization routines and, even when explicitly considered, the criteria adopted for its evaluation are contested and not widely trusted, undermining the prospects for real implementation of environmentally friendly policies. The main challenges in defining and assessing hydrological alteration are: how to define a reference state (referencing); how to define criteria upon which to build mathematical indicators of alteration (measuring); and finally how to aggregate the indicators into a single evaluation index (valuing) that can serve as an objective function in the optimization problem.
    This paper aims to address these issues by: (i) discussing the benefits and constraints of different approaches to referencing, measuring and valuing hydrological alteration; (ii) testing two alternative indices of hydrological alteration, one based on the established framework of Indicators of Hydrologic Alteration (Richter et al., 1996), and one satisfying the mathematical properties required by widely used optimization methods based on dynamic programming; (iii) demonstrating and discussing these indices through an application to the River Ticino, Italy; (iv) providing a framework to effectively include hydrological alteration within reservoir operation optimization.
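    The "valuing" step, aggregating indicators into one index, can be illustrated with a deliberately simple toy: normalized deviations of monthly mean flows from a reference regime, averaged into a single dimensionless number (0 means the regimes coincide). The real IHA framework uses dozens of hydrologic parameters; the data and aggregation rule here are purely illustrative:

```python
def alteration_index(regulated, reference):
    """Average relative deviation of monthly mean flows from the reference
    regime: a hypothetical single-valued 'valuing' index, not the paper's."""
    devs = [abs(q - r) / r for q, r in zip(regulated, reference)]
    return sum(devs) / len(devs)

reference = [10, 12, 20, 35, 50, 40, 25, 15, 10, 8, 8, 9]     # natural regime
regulated = [18, 18, 18, 25, 30, 30, 22, 18, 15, 14, 14, 15]  # dam-smoothed regime
idx = alteration_index(regulated, reference)
```

A scalar index of this kind is what allows hydrological alteration to enter a reservoir optimization problem as one more objective function.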

  4. Nanoscale multiphase phase field approach for stress- and temperature-induced martensitic phase transformations with interfacial stresses at finite strains

    NASA Astrophysics Data System (ADS)

    Basak, Anup; Levitas, Valery I.

    2018-04-01

    A thermodynamically consistent, novel multiphase phase field approach for stress- and temperature-induced martensitic phase transformations at finite strains and with interfacial stresses has been developed. The model considers a single order parameter to describe the austenite↔martensite transformations and N additional order parameters describing the N variants, constrained to a plane in an N-dimensional order parameter space. In the free energy model, the coexistence of three or more phases at a single material point (multiphase junction) and the deviation of each variant-variant transformation path from a straight line are penalized. Some shortcomings of the existing models are resolved. Three different kinematic models (KMs) for the transformation deformation gradient tensors are assumed: (i) in KM-I the transformation deformation gradient tensor is a linear function of the Bain tensors for the variants; (ii) in KM-II the natural logarithm of the transformation deformation gradient is taken as a linear combination of the natural logarithms of the Bain tensors multiplied by the interpolation functions; (iii) in KM-III it is derived using the twinning equation from the crystallographic theory. The instability criteria for all the phase transformations have been derived for all the kinematic models, and their comparative study is presented. A large strain finite element procedure has been developed and used for studying the evolution of some complex microstructures in nanoscale samples under various loading conditions. The stresses within variant-variant boundaries, the sample size effect, the effect of penalizing the triple junctions, and twinned microstructures have also been studied. The present approach can be extended to grain growth, solidification, para↔ferroelectric transformations, and diffusive phase transformations.
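    The first two kinematic models can be summarized compactly; the notation below (Bain tensors U_i, order parameters η_i, interpolation functions φ) is assumed from the abstract, and KM-III, which follows from the twinning equation of crystallographic theory, is omitted:

```latex
% KM-I: transformation deformation gradient linear in the Bain tensors
\mathbf{F}_t = \mathbf{I} + \sum_{i=1}^{N} \varphi(\eta_i)\,\bigl(\mathbf{U}_i - \mathbf{I}\bigr)

% KM-II: logarithmic mixing of the Bain tensors
\ln \mathbf{F}_t = \sum_{i=1}^{N} \varphi(\eta_i)\, \ln \mathbf{U}_i
```

In both forms the interpolation functions φ(η_i) blend the variant-specific transformation strains as the order parameters evolve between austenite and martensite.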

  5. Penalized maximum likelihood reconstruction for x-ray differential phase-contrast tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brendel, Bernhard, E-mail: bernhard.brendel@philips.com; Teuffenbach, Maximilian von; Noël, Peter B.

    2016-01-15

    Purpose: The purpose of this work is to propose a cost function with regularization to iteratively reconstruct attenuation, phase, and scatter images simultaneously from differential phase contrast (DPC) acquisitions, without the need for phase retrieval, and to examine its properties. Furthermore, this reconstruction method is applied to an acquisition pattern that is suitable for a DPC tomographic system with a continuously rotating gantry (sliding window acquisition), overcoming the severe smearing of noniterative reconstruction. Methods: We derive a penalized maximum likelihood reconstruction algorithm to directly reconstruct the attenuation, phase, and scatter images from the measured detector values of a DPC acquisition. The proposed penalty comprises, for each of the three images, an independent smoothing prior. Image quality of the proposed reconstruction is compared to images generated with FBP and iterative reconstruction after phase retrieval. Furthermore, the influence between the priors is analyzed. Finally, the proposed reconstruction algorithm is applied to experimental sliding window data acquired at a synchrotron, and the results are compared to reconstructions based on phase retrieval. Results: The results show that the proposed algorithm significantly increases image quality in comparison to reconstructions based on phase retrieval. No significant mutual influence between the proposed independent priors could be observed. It was further shown that iterative reconstruction of a sliding window acquisition yields images with substantially reduced smearing artifacts. Conclusions: Although the proposed cost function is inherently nonconvex, it can be used to reconstruct images with fewer aliasing artifacts and fewer streak artifacts than reconstruction methods based on phase retrieval. Furthermore, the proposed method can be used to reconstruct images of sliding window acquisitions with negligible smearing artifacts.
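    The generic shape of such a penalized maximum likelihood objective, with one independent smoothing prior per image, can be written as follows (symbols are assumed for illustration, not the authors' notation):

```latex
\mathcal{C}(\mu, \phi, \sigma)
  \;=\; -\log L\!\left(y \mid \mu, \phi, \sigma\right)
  \;+\; \beta_{\mu}\, R(\mu)
  \;+\; \beta_{\phi}\, R(\phi)
  \;+\; \beta_{\sigma}\, R(\sigma)
```

Here μ, φ, σ denote the attenuation, phase, and scatter images, L the likelihood of the measured detector values, R a smoothing roughness penalty, and the β's the per-image regularization weights; the reported absence of mutual influence between priors corresponds to the three penalty terms acting independently.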

  6. Semiparametric mixed-effects analysis of PK/PD models using differential equations.

    PubMed

    Wang, Yi; Eskridge, Kent M; Zhang, Shunpu

    2008-08-01

    Motivated by the use of semiparametric nonlinear mixed-effects modeling on longitudinal data, we develop a new semiparametric modeling approach to address potential structural model misspecification for population pharmacokinetic/pharmacodynamic (PK/PD) analysis. Specifically, we use a set of ordinary differential equations (ODEs) with form dx/dt = A(t)x + B(t) where B(t) is a nonparametric function that is estimated using penalized splines. The inclusion of a nonparametric function in the ODEs makes identification of structural model misspecification feasible by quantifying the model uncertainty and provides flexibility for accommodating possible structural model deficiencies. The resulting model will be implemented in a nonlinear mixed-effects modeling setup for population analysis. We illustrate the method with an application to cefamandole data and evaluate its performance through simulations.
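    A minimal numerical sketch of the idea, assuming a scalar state and replacing the penalized splines with a discrete second-difference roughness penalty; the simulated system and all parameter values are illustrative, not the authors' mixed-effects implementation:

```python
import numpy as np

# Toy version of dx/dt = a x + B(t): recover the nonparametric forcing B(t)
# from a noisy sampled trajectory by penalized least squares.
rng = np.random.default_rng(2)
a, dt = -1.0, 0.01
t = np.arange(0.0, 5.0, dt)
B_true = np.sin(2 * np.pi * t / 5.0)               # "unknown" forcing to recover

# simulate the trajectory x(t) with forward Euler
x = np.zeros_like(t)
for k in range(len(t) - 1):
    x[k + 1] = x[k] + dt * (a * x[k] + B_true[k])
x_obs = x + rng.normal(scale=0.002, size=len(x))   # noisy observations

# raw pointwise estimate of B from finite differences of the noisy data
raw = np.diff(x_obs) / dt - a * x_obs[:-1]

# penalized least squares:  B_hat = argmin ||B - raw||^2 + lam * ||D B||^2
m = len(raw)
D = np.diff(np.eye(m), n=2, axis=0)                # second-difference operator
lam = 1e6
B_hat = np.linalg.solve(np.eye(m) + lam * D.T @ D, raw)
```

The roughness penalty plays the role of the spline penalty in the abstract: it suppresses the noise amplified by differentiation while leaving the slowly varying forcing intact.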

  7. Iterative raw measurements restoration method with penalized weighted least squares approach for low-dose CT

    NASA Astrophysics Data System (ADS)

    Takahashi, Hisashi; Goto, Taiga; Hirokawa, Koichi; Miyazaki, Osamu

    2014-03-01

    Statistical iterative reconstruction and post-log data restoration algorithms for CT noise reduction have been widely studied, and these techniques have enabled us to reduce irradiation doses while maintaining image quality. In low-dose scanning, electronic noise becomes prominent and results in some non-positive signals in the raw measurements. A non-positive signal must be converted to a positive one so that it can be log-transformed. Since conventional conversion methods do not consider the local variance on the sinogram, they have difficulty controlling the strength of the filtering. Thus, in this work, we propose a method to convert the non-positive signals to positive ones chiefly by controlling the local variance. The method is implemented in two separate steps. First, an iterative restoration algorithm based on penalized weighted least squares is used to mitigate the effect of electronic noise. The algorithm preserves the local mean and reduces the local variance induced by the electronic noise. Second, the raw measurements smoothed by the iterative algorithm are converted to positive signals according to a function which replaces each non-positive signal with its local mean. In phantom studies, we confirm that the proposed method properly preserves the local mean and reduces the variance induced by the electronic noise. Our technique dramatically reduces shading artifacts and can also cooperate successfully with the post-log data filter to reduce streak artifacts.
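    The second step, replacing a non-positive raw measurement with its local mean so the data can be log-transformed, can be sketched in isolation; this simplified stand-in omits the PWLS smoothing step, and the window size and fallback value are assumptions:

```python
import math

def to_positive(raw, half_window=2):
    """Replace each non-positive raw measurement with the local mean of its
    positive neighbors so the data can be log-transformed (a simplified
    stand-in for the conversion function described above)."""
    out = []
    n = len(raw)
    for i, v in enumerate(raw):
        if v > 0:
            out.append(v)
            continue
        lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
        neigh = [raw[j] for j in range(lo, hi) if raw[j] > 0]
        # fall back to 1 count if no positive neighbor exists in the window
        out.append(sum(neigh) / len(neigh) if neigh else 1.0)
    return out

sino = [120.0, 95.0, -2.0, 0.0, 101.0, 88.0]   # two non-positive detector reads
converted = to_positive(sino)
logs = [math.log(v) for v in converted]         # now safe to log-transform
```

Positive measurements pass through untouched, so the local mean is preserved where the data are already usable.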

  8. [The clinical predictors of heteroaggressive behaviour of the women serving sentence in penitentiary].

    PubMed

    Shaklein, K N; Bardenshtein, L M; Demcheva, N K

    Aim: to identify clinical predictors of heteroaggressive behavior. Three hundred and three women serving sentences in a penal colony were examined using clinical, neurologic and statistical methods. The main group consisted of 225 women with heteroaggressive behavior; the control group included 78 women without aggressive behavior. Differences between the main and control groups in the structure of mental disorders and key syndromes were revealed. The authors conclude that states with elements of dysphoria, dysthymia, and decompensation of personality disorders, occurring within various forms of mental pathology, are the most significant predictors of heteroaggressive behavior in women in the penal colony.

  9. HIV counselling in prisons.

    PubMed

    Curran, L; McHugh, M; Nooney, K

    1989-01-01

    HIV presents particular problems in penal establishments: the nature of the population; conditions in prison; media attention and misinformation; the possibility of transmission within and beyond the prison population; and the extra issues that apply to female prisoners. These are discussed in the context of prison policy regarding HIV and the broad strategic approach being adopted to manage the problem of HIV within penal institutions. Counselling has a key role in the overall strategy. Pre- and post-test counselling with prisoners is described, and the particular problems presented by inmates are discussed and illustrated by reference to case histories. Developments in counselling provision for inmates are outlined.

  10. Gender Norms in Portuguese College Students' Judgments in Familial Homicides: Bad Men and Mad Women.

    PubMed

    Saavedra, Luísa; Cameira, Miguel; Rebelo, Ana Sofia; Sebastião, Cátia

    2015-05-08

    The gender of the offender has proved to be an important factor in judicial sentencing. In this study, we analyze the judgments of college students regarding perpetrators of familial homicides to evaluate the presence of these gender norms and biases in the larger society. The sample included 303 college students (54.8% female) enrolled in several social sciences and engineering courses. Participants were asked to read 12 vignettes based on real crimes taken from Portuguese newspapers. Half were related to infanticide, and half were related to intimate partner homicide. The sex of the offender was manipulated orthogonally to the type of crime. The results show that gender had an important impact on sentences, with men more harshly penalized by reason of perversity and women less penalized by reason of mental disorder. In addition, filicide was more heavily penalized than intimate partner homicide. The results also revealed a tendency toward a retributive conception of punishment. We discuss how gender norms in justice seem to be embedded in society, as well as the need for intervention against the punitive tendency of this population. © The Author(s) 2015.

  11. Medico-legal implications of mobbing. A false accusation of psychological harassment at the workplace.

    PubMed

    Jarreta, Begoña Martínez; García-Campayo, Javier; Gascón, Santiago; Bolea, Miguel

    2004-12-02

    Mobbing, or psychological harassment at the workplace, is usually defined as a situation in which a person or a group of people engage in extreme psychological violence against another person. In Spain, the number of reports of mobbing has increased extraordinarily in recent years. Reports are increasing dramatically not only before the Labour Courts but also before the Civil Courts, with claims for damages, and before the Penal Court for offences causing physical or moral injury, since at present mobbing is not typified as an offence in the Spanish Penal Code. The high complexity of this situation has given rise to frequent misuse of the term and to a number of false accusations of mobbing. A recent European Parliament Resolution on harassment at the workplace addressed the devastating consequences of false accusations. In this paper we present a case in which the "false" victim was mentally ill (paranoia) but succeeded in generating an extremely dangerous environment of harassment against the "falsely" accused assailants. Forensic diagnosis of the psychiatric disorder suffered by the "false" victim was essential to clarify the issue before the Penal Court.

  12. [Euthanasia - an attempt to organize issue].

    PubMed

    Kirmes, Tomasz; Wilk, Mateusz; Chowaniec, Czesław

    This article attempts to discuss the problem of euthanasia completely and holistically, especially its ethical and legal aspects, with reference to Polish law. Euthanasia arouses the interest of society because it touches one of the most important aspects of life, namely death. It arouses even stronger emotions among physicians, who are forced to weigh life as the greatest value against the autonomy of the human being; it also involves empathy for suffering. Euthanasia is divided into three forms: active euthanasia, passive euthanasia and assisted suicide. Any form of euthanasia is illegal in Poland under both the Penal Code and the Code of Medical Ethics. The range of possible penal consequences for the perpetrator is very wide, from waiver of punishment to life imprisonment, reflecting the different penal qualifications of euthanasia. Qualification of euthanasia is based on the intent of the perpetrator's act, the request of the patient, strong empathy for the suffering of the patient, and a decision based on up-to-date medical knowledge. It is also worth mentioning the "do-not-resuscitate" (DNR) procedure, which in cases of medical futility is legally accepted in Poland but in other forms may be qualified as passive euthanasia.

  13. Simulation of confined magnetohydrodynamic flows with Dirichlet boundary conditions using a pseudo-spectral method with volume penalization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morales, Jorge A.; Leroy, Matthieu; Bos, Wouter J.T.

    A volume penalization approach to simulate magnetohydrodynamic (MHD) flows in confined domains is presented. The incompressible visco-resistive MHD equations are solved using parallel pseudo-spectral solvers in Cartesian geometries. The volume penalization technique is an immersed boundary method characterized by high flexibility with respect to the geometry of the considered flow. In the present case, it allows the use of boundary conditions other than periodic ones in a Fourier pseudo-spectral approach. The numerical method is validated and its convergence is assessed for two- and three-dimensional hydrodynamic (HD) and MHD flows by comparing the numerical results with results from the literature and with analytical solutions. The test cases considered are two-dimensional Taylor–Couette flow, the z-pinch configuration, three-dimensional Orszag–Tang flow, Ohmic decay in a periodic cylinder, three-dimensional Taylor–Couette flow with and without axial magnetic field, and three-dimensional Hartmann instabilities in a cylinder with an imposed helical magnetic field. Finally, we present an MHD flow simulation in a toroidal geometry with non-symmetric cross section and an imposed helical magnetic field to illustrate the potential of the method.
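    The penalization idea can be illustrated with a toy 1-D diffusion problem rather than the paper's pseudo-spectral MHD solver: a mask χ marks the solid region, and the term -(χ/η)u forces the solution toward the Dirichlet value u = 0 there; grid, parameters, and the explicit finite-difference time stepping are all assumptions:

```python
import numpy as np

# Solve u_t = nu * u_xx - (chi/eta) * u on a periodic grid, where chi = 1
# inside the solid "walls" enforces u = 0 (volume penalization).
n, nu, eta, dt = 128, 1e-2, 1e-3, 1e-4
dx = 1.0 / n
x = np.arange(n) * dx
chi = ((x < 0.1) | (x > 0.9)).astype(float)   # mask: 1 inside the obstacle
u = np.sin(np.pi * x)                          # initial condition

for _ in range(2000):
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2
    u = u + dt * (nu * lap - chi * u / eta)    # diffusion + penalization
```

The small penalization parameter η makes the mask region act like a porous solid of vanishing permeability, which is what lets a periodic (here finite-difference, in the paper Fourier pseudo-spectral) solver impose non-periodic wall conditions.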

  14. Mental health/illness and prisons as place: frontline clinicians' perspectives of mental health work in a penal setting.

    PubMed

    Wright, Nicola; Jordan, Melanie; Kane, Eddie

    2014-09-01

    This article takes mental health and prisons as its two foci. It explores the links between social and structural aspects of the penal setting, the provision of mental healthcare in prisons, and mental health work in this environment. This analysis utilises qualitative interview data from prison-based fieldwork undertaken in Her Majesty's Prison Service, England. Two themes are discussed: (1) the desire and practicalities of doing mental health work and (2) prison staff as mental health work allies. Concepts covered include equivalence, training, ownership, informal communication, mental health knowledge, service gatekeepers, case identification, and unmet need. Implications for practice are (1) the mental health knowledge and understanding of prison wing staff could be appraised and developed to improve mental healthcare and address unmet need. Their role as observers and gatekeepers could be considered. (2) The realities of frontline mental health work for clinicians in the penal environment should be embraced and used to produce and implement improved policy and practice guidance, which is in better accord with the actuality of the context - both socially and structurally. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Threshold-driven optimization for reference-based auto-planning

    NASA Astrophysics Data System (ADS)

    Long, Troy; Chen, Mingli; Jiang, Steve; Lu, Weiguo

    2018-02-01

    We study threshold-driven optimization methodology for automatically generating a treatment plan that is motivated by a reference DVH for IMRT treatment planning. We present a framework for threshold-driven optimization for reference-based auto-planning (TORA). Commonly used voxel-based quadratic penalties have two components for penalizing under- and over-dosing of voxels: a reference dose threshold and associated penalty weight. Conventional manual- and auto-planning using such a function involves iteratively updating the preference weights while keeping the thresholds constant, an unintuitive and often inconsistent method for planning toward some reference DVH. However, driving a dose distribution by threshold values instead of preference weights can achieve similar plans with less computational effort. The proposed methodology spatially assigns reference DVH information to threshold values, and iteratively improves the quality of that assignment. The methodology effectively handles both sub-optimal and infeasible DVHs. TORA was applied to a prostate case and a liver case as a proof-of-concept. Reference DVHs were generated using a conventional voxel-based objective, then altered to be either infeasible or easy-to-achieve. TORA was able to closely recreate reference DVHs in 5-15 iterations of solving a simple convex sub-problem. TORA has the potential to be effective for auto-planning based on reference DVHs. As dose prediction and knowledge-based planning becomes more prevalent in the clinical setting, incorporating such data into the treatment planning model in a clear, efficient way will be crucial for automated planning. A threshold-focused objective tuning should be explored over conventional methods of updating preference weights for DVH-guided treatment planning.
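    The voxel-based quadratic penalty described above, one-sided terms around under- and over-dose thresholds, can be sketched as follows; names and numbers are illustrative, and the full TORA threshold-update loop is not reproduced:

```python
def voxel_penalty(dose, t_under, t_over, w_under=1.0, w_over=1.0):
    """Quadratic one-sided penalties around dose thresholds: the building
    block of the objective described above (names illustrative)."""
    p = 0.0
    for d in dose:
        p += w_under * max(t_under - d, 0.0) ** 2   # under-dosing term
        p += w_over * max(d - t_over, 0.0) ** 2     # over-dosing term
    return p

dose = [58.0, 60.0, 62.0, 65.0]
# conventional tuning keeps thresholds fixed and changes the weights;
# threshold-driven tuning moves t_under / t_over instead:
loose = voxel_penalty(dose, t_under=55.0, t_over=66.0)   # all voxels satisfied
tight = voxel_penalty(dose, t_under=60.0, t_over=61.0)   # penalties activate
```

Shifting the thresholds changes which voxels contribute to the objective at all, which is what makes threshold-driven tuning a more direct handle on the target DVH than reweighting.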

  16. Sub-optimal control of unsteady boundary layer separation and optimal control of Saltzman-Lorenz model

    NASA Astrophysics Data System (ADS)

    Sardesai, Chetan R.

    The primary objective of this research is to explore the application of optimal control theory in nonlinear, unsteady, fluid dynamical settings. Two problems are considered: (1) control of unsteady boundary-layer separation, and (2) control of the Saltzman-Lorenz model. The unsteady boundary-layer equations are nonlinear partial differential equations that govern the eruptive events that arise when an adverse pressure gradient acts on a boundary layer at high Reynolds numbers. The Saltzman-Lorenz model consists of a coupled set of three nonlinear ordinary differential equations that govern the time-dependent coefficients in truncated Fourier expansions of Rayleigh-Bénard convection and exhibit deterministic chaos. Variational methods are used to derive the nonlinear optimal control formulations based on cost functionals that define the control objective through a performance measure and a penalty function that penalizes the cost of control. The resulting formulation consists of the nonlinear state equations, which must be integrated forward in time, and the nonlinear control (adjoint) equations, which are integrated backward in time. Such coupled forward-backward time integrations are computationally demanding; therefore, the full optimal control problem for the Saltzman-Lorenz model is carried out, while the more complex unsteady boundary-layer case is solved using a sub-optimal approach. The latter is a quasi-steady technique in which the unsteady boundary-layer equations are integrated forward in time, and the steady control equation is solved at each time step. Both sub-optimal control of the unsteady boundary-layer equations and optimal control of the Saltzman-Lorenz model are found to be successful in meeting the control objectives for each problem. In the case of boundary-layer separation, the control results indicate that it is necessary to eliminate the recirculation region that is a precursor to the unsteady boundary-layer eruptions.
In the case of the Saltzman-Lorenz model, it is possible to control the system about either of the two unstable equilibrium points representing clockwise and counterclockwise rotation of the convection rolls in a parameter regime for which the uncontrolled solution would exhibit deterministic chaos.
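    The forward-backward structure described above corresponds to a standard optimal control formulation; written generically (symbols assumed, not the dissertation's notation):

```latex
% cost functional: performance measure plus a penalty on the cost of control
J = \Phi\bigl(\mathbf{x}(t_f)\bigr)
    + \int_0^{t_f} \Bigl[ P\bigl(\mathbf{x}(t)\bigr)
    + \ell \,\lVert \mathbf{u}(t) \rVert^2 \Bigr]\, dt,
\qquad \dot{\mathbf{x}} = \mathbf{f}(\mathbf{x}, \mathbf{u})

% adjoint (control) equations, integrated backward in time
\dot{\boldsymbol{\lambda}} = -\frac{\partial H}{\partial \mathbf{x}},
\qquad H = P + \ell \,\lVert \mathbf{u} \rVert^2
            + \boldsymbol{\lambda}^{\mathsf{T}} \mathbf{f},
\qquad \boldsymbol{\lambda}(t_f) = \frac{\partial \Phi}{\partial \mathbf{x}}\bigg|_{t_f}
```

The state equations march forward from the initial condition while the adjoint equations march backward from the terminal condition, which is exactly the coupled forward-backward integration the abstract identifies as computationally demanding.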

  17. Wind farm optimization using evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Ituarte-Villarreal, Carlos M.

    In recent years, the wind power industry has focused its efforts on solving the Wind Farm Layout Optimization (WFLO) problem. Wind resource assessment is a pivotal step in optimizing wind-farm design and siting, and in determining whether a project is economically feasible. In the present work, three different optimization methods are proposed for the solution of the WFLO: (i) a modified Viral System Algorithm applied to optimizing the location of the components in a wind farm so as to maximize the energy output for a stated wind environment of the site. The optimization problem is formulated as the minimization of energy cost per unit produced and applies a penalization for lack of system reliability. The viral system algorithm utilized in this research solves three well-known problems in the wind-energy literature; (ii) a new multiple-objective evolutionary algorithm to obtain optimal placement of wind turbines while considering the power output, cost, and reliability of the system. The algorithm presented is based on evolutionary computation, and the objective functions considered are the maximization of power output, the minimization of wind farm cost and the maximization of system reliability. The final solution to this multiple-objective problem is presented as a set of Pareto solutions; and (iii) a hybrid viral-based optimization algorithm adapted to find the proper component configuration for a wind farm, with the introduction of the universal generating function (UGF) analytical approach to discretize the different operating or mechanical levels of the wind turbines in addition to the various wind speed states. The proposed methodology considers the specific probability functions of the wind resource in order to account for the stochastic behavior of the renewable energy components, aiming to increase their power output and the reliability of these systems.
The developed heuristic considers a variable number of system components and wind turbines with different operating characteristics and sizes, yielding a more heterogeneous model that can deal with changes in the layout and in the power generation requirements over time. Moreover, the approach evaluates the impact of the wake effect of the wind turbines upon one another to describe and evaluate the reduction in the system's power production capacity as a function of the layout distribution of the wind turbines.
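    The penalized fitness used in formulation (i), cost per unit of energy produced plus a penalty for lack of reliability, can be sketched as a single evaluation function; the reliability target, penalty weight, and all numbers below are illustrative assumptions, not the dissertation's parameters:

```python
def penalized_cost(energy_output, farm_cost, reliability,
                   r_min=0.95, penalty=10.0):
    """Energy cost per unit produced, with an additive penalty whenever
    system reliability falls below a target (a simplified stand-in for the
    reliability penalization described above)."""
    cost = farm_cost / energy_output
    if reliability < r_min:
        cost += penalty * (r_min - reliability)   # penalize unreliable layouts
    return cost

ok = penalized_cost(energy_output=1000.0, farm_cost=500.0, reliability=0.97)
bad = penalized_cost(energy_output=1000.0, farm_cost=500.0, reliability=0.90)
```

An evolutionary search minimizing this fitness is steered away from layouts that are cheap per unit of energy but unreliable.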

  18. Optimal orbit transfer suitable for large flexible structures

    NASA Technical Reports Server (NTRS)

    Chatterjee, Alok K.

    1989-01-01

    The problem of continuous low-thrust planar orbit transfer of large flexible structures is formulated as an optimal control problem with terminal state constraints. The dynamics of the spacecraft motion are treated as a point-mass central force field problem; the thrust-acceleration magnitude is treated as an additional state variable; and the rate of change of thrust-acceleration is treated as a control variable. To ensure smooth transfer, essential for flexible structures, an additional quadratic term is appended to the time cost functional. This term penalizes any abrupt change in acceleration. Numerical results are presented for the special case of a planar transfer.
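    The augmented cost functional described above can be written explicitly (notation assumed: a is the thrust-acceleration state, u = da/dt the control, k the smoothness weight):

```latex
J = \int_0^{t_f} dt \;+\; k \int_0^{t_f} u^2 \, dt,
\qquad u \equiv \frac{da}{dt}
```

The first term is the transfer time being minimized; the appended quadratic term penalizes large |u|, i.e. abrupt changes in thrust-acceleration, which is what keeps the transfer smooth enough for a flexible structure.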

  19. [The genetic fingerprints file in France: between security and freedom].

    PubMed

    Manaouil, C; Gignon, M; Werbrouck, A; Jarde, O

    2008-01-01

    In France, the French National File Automated with Genetic fingerprints (FNAEG) is a bank automated by genetic data which is used in penal domain. It facilitates search of the authors of malpractices, or the missing people. Since 1998, it has enabled to resolve numerous criminal cases. An extension of the field of application has been observed. It is a confidential register which is subjected to numerous controls. Nevertheless, private character of the data and its functioning (criminal character of the refusal of taking, periods of answer, and problem of data's conservation) explain the important contesting of associations worried about the respect of personal freedoms.

  20. Automatic treatment plan re-optimization for adaptive radiotherapy guided with the initial plan DVHs.

    PubMed

    Li, Nan; Zarepisheh, Masoud; Uribe-Sanchez, Andres; Moore, Kevin; Tian, Zhen; Zhen, Xin; Graves, Yan Jiang; Gautier, Quentin; Mell, Loren; Zhou, Linghong; Jia, Xun; Jiang, Steve

    2013-12-21

    Adaptive radiation therapy (ART) can reduce normal tissue toxicity and/or improve tumor control through treatment adaptations based on the current patient anatomy. Developing an efficient and effective re-planning algorithm is an important step toward the clinical realization of ART. In the re-planning process, the manual trial-and-error approach to fine-tuning planning parameters is time-consuming and usually considered impractical, especially for online ART. It is desirable to automate this step to yield a plan of acceptable quality with minimal intervention. In ART, prior information from the original plan is available, such as the dose-volume histogram (DVH), which can be employed to facilitate the automatic re-planning process. The goal of this work is to develop an automatic re-planning algorithm to generate a plan with similar, or possibly better, DVH curves compared with the clinically delivered original plan. Specifically, our algorithm iterates the following two loops. The inner loop is the traditional fluence map optimization, in which we optimize a quadratic objective function penalizing the deviation of the dose received by each voxel from its prescribed or threshold dose with a set of fixed voxel weighting factors. In the outer loop, the voxel weighting factors in the objective function are adjusted according to the deviation of the current DVH curves from those in the original plan. The process is repeated until the DVH curves are acceptable or the maximum number of iterations is reached. The whole algorithm is implemented on GPU for high efficiency. The feasibility of our algorithm has been demonstrated with three head-and-neck cancer IMRT cases, each having an initial planning CT scan and another treatment CT scan acquired in the middle of the treatment course. Compared with the DVH curves in the original plan, the DVH curves in the resulting plan using our algorithm with 30 iterations are better for almost all structures.
The re-optimization process takes about 30 s using our in-house optimization engine.
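    The two nested loops can be sketched schematically; the random dose-influence matrix, multiplicative weight-update rule, and closed-form inner solve below are simplified assumptions for illustration, not the paper's GPU implementation or its DVH-based update:

```python
import numpy as np

# inner loop: weighted quadratic fluence optimization (closed form here);
# outer loop: raise the weight of voxels whose dose misses the reference most.
rng = np.random.default_rng(2)
n_vox, n_beam = 40, 10
Dmat = rng.uniform(0.0, 1.0, size=(n_vox, n_beam))  # dose-influence matrix
p = np.full(n_vox, 2.0)                             # reference (prescribed) dose
w = np.ones(n_vox)                                  # voxel weighting factors

for outer in range(5):
    # inner loop: minimize sum_i w_i * ((Dmat @ x)_i - p_i)^2 in closed form
    A = Dmat.T @ (w[:, None] * Dmat)
    x = np.linalg.solve(A, Dmat.T @ (w * p))
    d = Dmat @ x
    # outer loop: increase weights where the dose deviates most from the reference
    w *= 1.0 + np.abs(d - p) / (np.abs(d - p).max() + 1e-12)
```

The multiplicative update mimics the abstract's adjustment of voxel weighting factors between fluence optimizations, steering later inner solves toward the voxels that deviate most.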

  1. Functional mixed effects spectral analysis

    PubMed Central

    KRAFTY, ROBERT T.; HALL, MARTICA; GUO, WENSHENG

    2011-01-01

    SUMMARY In many experiments, time series data can be collected from multiple units and multiple time series segments can be collected from the same unit. This article introduces a mixed effects Cramér spectral representation which can be used to model the effects of design covariates on the second-order power spectrum while accounting for potential correlations among the time series segments collected from the same unit. The transfer function is composed of a deterministic component to account for the population-average effects and a random component to account for the unit-specific deviations. The resulting log-spectrum has a functional mixed effects representation where both the fixed effects and random effects are functions in the frequency domain. It is shown that, when the replicate-specific spectra are smooth, the log-periodograms converge to a functional mixed effects model. A data-driven iterative estimation procedure is offered for the periodic smoothing spline estimation of the fixed effects, penalized estimation of the functional covariance of the random effects, and unit-specific random effects prediction via the best linear unbiased predictor. PMID:26855437
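    The smoothing step at the heart of such procedures can be illustrated on a single log-periodogram, using a discrete second-difference roughness penalty as a stand-in for the periodic smoothing spline; the white-noise data and the penalty weight are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 256
ts = rng.normal(size=n)                       # one white-noise time series segment
per = np.abs(np.fft.rfft(ts)) ** 2 / n        # periodogram
logI = np.log(per[1:-1])                      # log-periodogram, drop DC/Nyquist

# penalized least-squares smoothing: argmin ||g - logI||^2 + lam * ||D g||^2
m = len(logI)
D = np.diff(np.eye(m), n=2, axis=0)           # second-difference operator
lam = 1e4
smooth = np.linalg.solve(np.eye(m) + lam * D.T @ D, logI)
```

For white noise the true log-spectrum is flat, so the smoothed estimate should vary far less across frequencies than the raw log-periodogram, whose values are exponentially distributed around the spectrum.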

  2. [Demands for death (suicide assistance and euthanasia) in palliative medicine].

    PubMed

    Moynier-Vantieghem, Karine; Weber, Catherine; Espolio, Desbaillet Yolanda; Pautex, Sophie; Zulian, Gilbert

    2010-02-03

    During a prospective open survey of hospitalized patients over 12 months, 44 requests for death were registered from 39 patients (25 with cancer, 6 with cardiovascular disorders, 2 with Parkinson's disease, 3 with arthritis, 1 with COPD, 1 with dementia and 1 with severe depression). 14 patients were also depressed. 28 requested euthanasia and 16 assisted suicide. At 1 month, 3 requests persisted, 16 had been abandoned, 16 patients had died and 4 could not be questioned. At 6 months, 7 patients were alive but had abandoned their request, and 2 had committed suicide at home. The majority of death requests concern euthanasia, which is homicide under the penal code. Faced with such a request, realistic short-term objectives must be established. Many patients abandon their plan. This indicates great uncertainty about care and profound ambivalence toward life.

  3. [Conscientious objection in the matter of abortion].

    PubMed

    Serrano Gil, A; García Casado, M L

    1992-03-01

    The issue of conscientious objection in Spain has been used by pro-choice groups against objecting health personnel as one of the supposed obstacles to the implementation of the abortion law, a mischaracterization. At present objection is massive in the public sector; 95% of abortions are carried out in private clinics with highly lucrative returns; abortion tourism has decreased; and false objection has proliferated in the public sector, where the objector performs abortions in the private sector for high fees. A legal framework for conscientious objection is absent in Spain. Neither Article 417 of the Penal Code depenalizing abortion, nor the Ministerial Decree of July 31, 1985, nor the Royal Decree of November 21, 1986 recognizes such a concept. However, the ruling of the Constitutional Court of April 11, 1985 confirmed that such objection can be exercised with independence. Some authors refer to the applicability to health personnel of Law No. 48 of December 16, 1984, which regulates conscientious objection in military service. The future law concerning the fundamental right of ideological and religious liberty embodied in Article 16.1 of the Constitution has to be revised. A draft bill concerning this issue was submitted in the Congress of Representatives on May 3, 1985 that recognizes the right of medical personnel to object to abortion without career repercussions. Another draft bill, introduced on April 17, 1985, would allow the nonparticipation of medical personnel in the interruption of pregnancy; however, they would be prohibited from performing such procedures in private hospitals. Neither of these proposed bills became law. Professional groups either object unequivocally, do not object at all, or object on an ethical level but do not object to therapeutic abortion. The resolution of this issue has to be by consensus and not by imposition.

  4. Examination of influential observations in penalized spline regression

    NASA Astrophysics Data System (ADS)

    Türkan, Semra

    2013-10-01

    In parametric or nonparametric regression models, the results of regression analysis are affected by anomalous observations in the data set. Thus, detection of these observations is one of the major steps in regression analysis. These observations can be detected precisely with well-known influence measures, one of which is Pena's statistic. In this study, Pena's approach is formulated for penalized spline regression in terms of ordinary residuals and leverages. Real and artificial data are used to illustrate the effectiveness of Pena's statistic relative to Cook's distance in detecting influential observations. The results of the study clearly reveal that the proposed measure is superior to Cook's distance for detecting these observations in large data sets.
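
    Both Pena's statistic and Cook's distance are built from ordinary residuals and leverages. As a point of reference, Cook's distance can be computed directly from an OLS fit; the example below is a generic sketch with invented data (Pena's statistic itself, and the penalized-spline setting, are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one covariate
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)
y[0] += 10.0                                            # inject an influential outlier

H = X @ np.linalg.inv(X.T @ X) @ X.T                    # hat matrix
h = np.diag(H)                                          # leverages
e = y - H @ y                                           # ordinary residuals
p = X.shape[1]
s2 = e @ e / (n - p)                                    # residual variance estimate
cooks_d = e**2 / (p * s2) * h / (1.0 - h)**2            # Cook's distance per observation
```

    Observations with unusually large Cook's distance (here, the contaminated first observation) are flagged as influential.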

  5. Nonparaxial rogue waves in optical Kerr media.

    PubMed

    Temgoua, D D Estelle; Kofane, T C

    2015-06-01

    We consider the inhomogeneous nonparaxial nonlinear Schrödinger (NLS) equation with varying dispersion, nonlinearity, and nonparaxiality coefficients, which governs the nonlinear wave propagation in an inhomogeneous optical fiber system. We present the similarity and Darboux transformations and, for a chosen specific set of parameters and free functions, generate the first- and second-order rational solutions of the nonparaxial NLS equation. In particular, the features of rogue waves expressed through polynomial and Jacobian elliptic functions are analyzed, showing the nonparaxial effects. It is shown that nonparaxiality increases the intensity of rogue waves by increasing their length and reducing their width simultaneously; in this way it also increases their speed and penalizes interactions between them. These properties and the characteristic controllability of nonparaxial rogue waves may open another opportunity for experimental realizations and potential applications in optical fibers.

  6. A single-index threshold Cox proportional hazard model for identifying a treatment-sensitive subset based on multiple biomarkers.

    PubMed

    He, Ye; Lin, Huazhen; Tu, Dongsheng

    2018-06-04

    In this paper, we introduce a single-index threshold Cox proportional hazard model to select and combine biomarkers to identify patients who may be sensitive to a specific treatment. A penalized smoothed partial likelihood is proposed to estimate the parameters in the model. A simple, efficient, and unified algorithm is presented to maximize this likelihood function. The estimators based on this likelihood function are shown to be consistent and asymptotically normal. Under mild conditions, the proposed estimators also achieve the oracle property. The proposed approach is evaluated through simulation analyses and application to the analysis of data from two clinical trials, one involving patients with locally advanced or metastatic pancreatic cancer and one involving patients with resectable lung cancer. Copyright © 2018 John Wiley & Sons, Ltd.

  7. Integrative Analysis of High-throughput Cancer Studies with Contrasted Penalization

    PubMed Central

    Shi, Xingjie; Liu, Jin; Huang, Jian; Zhou, Yong; Shia, BenChang; Ma, Shuangge

    2015-01-01

    In cancer studies with high-throughput genetic and genomic measurements, integrative analysis provides a way to effectively pool and analyze heterogeneous raw data from multiple independent studies and outperforms “classic” meta-analysis and single-dataset analysis. When marker selection is of interest, the genetic basis of multiple datasets can be described using the homogeneity model or the heterogeneity model. In this study, we consider marker selection under the heterogeneity model, which includes the homogeneity model as a special case and can be more flexible. Penalization methods have been developed in the literature for marker selection. This study advances from the published ones by introducing the contrast penalties, which can accommodate the within- and across-dataset structures of covariates/regression coefficients and, by doing so, further improve marker selection performance. Specifically, we develop a penalization method that accommodates the across-dataset structures by smoothing over regression coefficients. An effective iterative algorithm, which calls an inner coordinate descent iteration, is developed. Simulation shows that the proposed method outperforms the benchmark with more accurate marker identification. The analysis of breast cancer and lung cancer prognosis studies with gene expression measurements shows that the proposed method identifies genes different from those using the benchmark and has better prediction performance. PMID:24395534
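
    A drastically simplified two-dataset illustration of the idea of smoothing regression coefficients across datasets: a ridge-type contrast penalty couples the two coefficient vectors, and the coupled problem is solved in closed form. The quadratic penalty and the direct linear solve are assumptions of this sketch, not the authors' algorithm (which uses an iterative coordinate-descent scheme).

```python
import numpy as np

rng = np.random.default_rng(7)
n, p = 80, 5
beta1 = np.array([2.0, -1.0, 0.0, 0.0, 1.0])
beta2 = beta1 + 0.2                         # coefficients similar across datasets
X1, X2 = rng.normal(size=(n, p)), rng.normal(size=(n, p))
y1 = X1 @ beta1 + rng.normal(size=n)
y2 = X2 @ beta2 + rng.normal(size=n)

# Joint objective: ||y1 - X1 b1||^2 + ||y2 - X2 b2||^2 + lam * ||b1 - b2||^2.
lam = 50.0
I = np.eye(p)
A = np.block([[X1.T @ X1 + lam * I, -lam * I],
              [-lam * I, X2.T @ X2 + lam * I]])
rhs = np.concatenate([X1.T @ y1, X2.T @ y2])
b1_hat, b2_hat = np.split(np.linalg.solve(A, rhs), 2)
```

    The contrast penalty shrinks the dataset-specific coefficient vectors toward each other; setting lam = 0 recovers two separate least-squares fits.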

  8. Psychic trauma as cause of death.

    PubMed

    Terranova, C; Snenghi, R; Thiene, G; Ferrara, S D

    2011-01-01

    Aim of the study: Psychic trauma is described as the action of 'an emotionally overwhelming factor' capable of causing neurovegetative alterations leading to transitory or persisting bodily changes. The medico-legal concept of psychic trauma and its definition as a cause in penal cases are debated. The authors present three cases of death after psychic trauma, and discuss the definition of cause within the penal ambit of identified 'emotionally overwhelming factors'. The methodological approach to ascertainment and criterion-based assessment in each case involved the following phases: (1) examination of circumstantial evidence, clinical records and documentation; (2) autopsy; (3) ascertainment of cause of death; and (4) ascertainment of psychic trauma, and its coexisting relationship with the cause of death. The results and assessment of each of the three cases are discussed from the viewpoint of the causal connotation of psychic trauma. In the cases presented, psychic trauma caused death, as deduced from assessment of the type of externally caused emotional insult, the subjects' personal characteristics and the circumstances of the event causing death. In cases of death due to psychic trauma, careful methodological ascertainment is essential, with the double aim of defining 'emotionally overwhelming factors' as a significant cause of death from the penal point of view, and of identifying the responsibility of third parties involved in the death event and associated dynamics of homicide.

  9. Detection of Protein Complexes Based on Penalized Matrix Decomposition in a Sparse Protein-Protein Interaction Network.

    PubMed

    Cao, Buwen; Deng, Shuguang; Qin, Hua; Ding, Pingjian; Chen, Shaopeng; Li, Guanghui

    2018-06-15

    High-throughput technology has generated large-scale protein interaction data, which is crucial in our understanding of biological organisms. Many complex identification algorithms have been developed to determine protein complexes. However, these methods are only suitable for dense protein interaction networks, because their capabilities decrease rapidly when applied to sparse protein-protein interaction (PPI) networks. In this study, based on penalized matrix decomposition (PMD), a novel method of penalized matrix decomposition for the identification of protein complexes (i.e., PMDpc) was developed to detect protein complexes in the human protein interaction network. This method mainly consists of three steps. First, the adjacency matrix of the protein interaction network is normalized. Second, the normalized matrix is decomposed into three factor matrices. The PMDpc method can detect protein complexes in sparse PPI networks by imposing appropriate constraints on the factor matrices. Finally, the results of our method are compared with those of other methods in the human PPI network. Experimental results show that our method can not only outperform classical algorithms, such as CFinder, ClusterONE, RRW, HC-PIN, and PCE-FR, but can also achieve an ideal overall performance in terms of a composite score consisting of F-measure, accuracy (ACC), and the maximum matching ratio (MMR).

  10. Investigation of statistical iterative reconstruction for dedicated breast CT

    PubMed Central

    Makeev, Andrey; Glick, Stephen J.

    2013-01-01

    Purpose: Dedicated breast CT has great potential for improving the detection and diagnosis of breast cancer. Statistical iterative reconstruction (SIR) in dedicated breast CT is a promising alternative to traditional filtered backprojection (FBP). One of the difficulties in using SIR is the presence of free parameters in the algorithm that control the appearance of the resulting image. These parameters require tuning in order to achieve high quality reconstructions. In this study, the authors investigated the penalized maximum likelihood (PML) method with two commonly used types of roughness penalty functions: hyperbolic potential and anisotropic total variation (TV) norm. Reconstructed images were compared with images obtained using standard FBP. Optimal parameters for PML with the hyperbolic prior are reported for the task of detecting microcalcifications embedded in breast tissue. Methods: Computer simulations were used to acquire projections in a half-cone beam geometry. The modeled setup describes a realistic breast CT benchtop system, with an x-ray spectrum produced by a point source and an a-Si/CsI:Tl flat-panel detector. A voxelized anthropomorphic breast phantom with 280 μm microcalcification spheres embedded in it was used to model the attenuation properties of an uncompressed breast in the pendant position. The reconstruction of 3D images was performed using the separable paraboloidal surrogates algorithm with ordered subsets. Task performance was assessed with the ideal observer detectability index to determine optimal PML parameters. Results: The authors' findings suggest that there is a preferred range of values of the roughness penalty weight and the edge preservation threshold in the penalized objective function with the hyperbolic potential, which resulted in low-noise images with high-contrast microcalcifications preserved. 
In terms of numerical observer detectability index, the PML method with optimal parameters yielded substantially improved performance (by a factor of greater than 10) compared to FBP. The hyperbolic prior was also observed to be superior to the TV norm. A few of the best-performing parameter pairs for the PML method also demonstrated superior performance for various radiation doses. In fact, using PML with certain parameter values results in better images, acquired using 2 mGy dose, than FBP-reconstructed images acquired using 6 mGy dose. Conclusions: A range of optimal free parameters for the PML algorithm with hyperbolic and TV norm-based potentials is presented for the microcalcification detection task, in dedicated breast CT. The reported values can be used as starting values of the free parameters, when SIR techniques are used for image reconstruction. Significant improvement in image quality can be achieved by using PML with optimal combination of parameters, as compared to FBP. Importantly, these results suggest improved detection of microcalcifications can be obtained by using PML with lower radiation dose to the patient, than using FBP with higher dose. PMID:23927318
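
    The hyperbolic potential referred to above is a standard edge-preserving roughness penalty with an edge-preservation threshold delta: it behaves quadratically for differences much smaller than delta (smoothing noise) and almost linearly, TV-like, for differences much larger than delta (preserving edges). A minimal sketch of the two regimes (the functional form is the usual one from the SIR literature; the parameter values are illustrative):

```python
import numpy as np

def hyperbolic_potential(t, delta):
    """~ t^2/2 for |t| << delta, ~ delta*|t| for |t| >> delta."""
    return delta**2 * (np.sqrt(1.0 + (t / delta)**2) - 1.0)

t = np.linspace(-5.0, 5.0, 201)
smooth_regime = hyperbolic_potential(t, delta=10.0)   # nearly quadratic: smooths noise
edge_regime = hyperbolic_potential(t, delta=0.1)      # nearly linear: preserves edges
```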

  11. SkyFACT: high-dimensional modeling of gamma-ray emission with adaptive templates and penalized likelihoods

    NASA Astrophysics Data System (ADS)

    Storm, Emma; Weniger, Christoph; Calore, Francesca

    2017-08-01

    We present SkyFACT (Sky Factorization with Adaptive Constrained Templates), a new approach for studying, modeling and decomposing diffuse gamma-ray emission. Like most previous analyses, the approach relies on predictions from cosmic-ray propagation codes like GALPROP and DRAGON. However, in contrast to previous approaches, we account for the fact that models are not perfect and allow for a very large number (≳ 10^5) of nuisance parameters to parameterize these imperfections. We combine methods of image reconstruction and adaptive spatio-spectral template regression in one coherent hybrid approach. To this end, we use penalized Poisson likelihood regression, with regularization functions that are motivated by the maximum entropy method. We introduce methods to efficiently handle the high dimensionality of the convex optimization problem as well as the associated semi-sparse covariance matrix, using the L-BFGS-B algorithm and Cholesky factorization. We test the method both on synthetic data as well as on gamma-ray emission from the inner Galaxy, |l| < 90° and |b| < 20°, as observed by the Fermi Large Area Telescope. We finally define a simple reference model that removes most of the residual emission from the inner Galaxy, based on conventional diffuse emission components as well as components for the Fermi bubbles, the Fermi Galactic center excess, and extended sources along the Galactic disk. Variants of this reference model can serve as basis for future studies of diffuse emission in and outside the Galactic disk.
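
    The main ingredients, penalized Poisson likelihood regression with per-bin nuisance parameters minimized by L-BFGS-B, can be illustrated on a toy one-dimensional problem. Everything below (the template, the penalty weight, the log-normalization parameterization) is an invented stand-in for SkyFACT's far larger spatio-spectral model:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n_bins = 50
template = 10.0 + 5.0 * np.sin(np.linspace(0.0, 3.0, n_bins))  # imperfect model template
counts = rng.poisson(1.3 * template)                            # observed counts

def neg_penalized_loglike(theta, lam=5.0):
    a, delta = theta[0], theta[1:]             # overall normalization + per-bin nuisances
    mu = np.exp(a + delta) * template          # predicted counts (always positive)
    nll = np.sum(mu - counts * np.log(mu))     # Poisson negative log-likelihood
    penalty = lam * np.sum(np.diff(delta)**2)  # regularize nuisance-parameter roughness
    return nll + penalty

res = minimize(neg_penalized_loglike, np.zeros(1 + n_bins), method="L-BFGS-B")
mu_fit = np.exp(res.x[0] + res.x[1:]) * template
```

    The roughness penalty keeps the nuisance parameters from simply absorbing the Poisson noise, while still letting them correct smooth imperfections in the template.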

  12. Estimation of spline function in nonparametric path analysis based on penalized weighted least square (PWLS)

    NASA Astrophysics Data System (ADS)

    Fernandes, Adji Achmad Rinaldo; Solimun, Arisoesilaningsih, Endang

    2017-12-01

    The aim of this research is to estimate the spline functions in nonparametric path analysis using the Penalized Weighted Least Square (PWLS) approach. The approach used is a Reproducing Kernel Hilbert Space, the Sobolev space W_2^m. The nonparametric path analysis model is y_{1i} = f_{1.1}(x_{1i}) + ε_{1i}; y_{2i} = f_{1.2}(x_{1i}) + f_{2.2}(y_{1i}) + ε_{2i}, i = 1, 2, …, n. The nonparametric path analysis estimator minimizing the PWLS criterion min_{f_{w.k} ∈ W_2^m[a_{w.k}, b_{w.k}], k=1,2} { (2n)^{-1} (ỹ − f̃)^T Σ^{-1} (ỹ − f̃) + Σ_{k=1}^{2} Σ_{w=1}^{2} λ_{w.k} ∫_{a_{w.k}}^{b_{w.k}} [f_{w.k}^{(m)}(x)]^2 dx } is f̂ = Aỹ, with A = T_1(T_1^T U_1^{-1} Σ^{-1} T_1)^{-1} T_1^T U_1^{-1} Σ^{-1} + V_1 U_1^{-1} Σ^{-1} [I − T_1(T_1^T U_1^{-1} Σ^{-1} T_1)^{-1} T_1^T U_1^{-1} Σ^{-1}] + T_2(T_2^T U_2^{-1} Σ^{-1} T_2)^{-1} T_2^T U_2^{-1} Σ^{-1} + V_2 U_2^{-1} Σ^{-1} [I − T_2(T_2^T U_2^{-1} Σ^{-1} T_2)^{-1} T_2^T U_2^{-1} Σ^{-1}].

  13. [Criminal implications of sponsoring in medicine: legal ramifications and recommendations].

    PubMed

    Mahnken, A H; Theilmann, M; Bolenz, M; Günther, R W

    2005-08-01

    As a consequence of the so-called "Heart-Valve-Affair" in 1994, the German public became aware of the potential criminal significance of industrial sponsoring and third-party financial support in medicine. Since 1997, when the German Anti-Corruption Law came into effect, the penal regulations regarding bribery and benefits for public officers were tightened. Due to the lack of explicit and generally accepted guidelines in combination with regional differences of jurisdiction, there is a lingering uncertainty regarding the criminal aspects of third-party funding and industrial sponsoring. The aim of this review is to summarize the penal and professional implications of third-party funding and sponsoring in medicine including recent aspects of jurisdiction. The currently available recommendations on this issue are introduced.

  14. Penalized Multi-Way Partial Least Squares for Smooth Trajectory Decoding from Electrocorticographic (ECoG) Recording

    PubMed Central

    Eliseyev, Andrey; Aksenova, Tetiana

    2016-01-01

    In the current paper the decoding algorithms for motor-related BCI systems for continuous upper limb trajectory prediction are considered. Two methods for the smooth prediction, namely Sobolev and Polynomial Penalized Multi-Way Partial Least Squares (PLS) regressions, are proposed. The methods are compared to the Multi-Way Partial Least Squares and Kalman Filter approaches. The comparison demonstrated that the proposed methods combined the prediction accuracy of the algorithms of the PLS family and trajectory smoothness of the Kalman Filter. In addition, the prediction delay is significantly lower for the proposed algorithms than for the Kalman Filter approach. The proposed methods could be applied in a wide range of applications beyond neuroscience. PMID:27196417

  15. A Permutation Approach for Selecting the Penalty Parameter in Penalized Model Selection

    PubMed Central

    Sabourin, Jeremy A; Valdar, William; Nobel, Andrew B

    2015-01-01

    Summary We describe a simple, computationally efficient, permutation-based procedure for selecting the penalty parameter in LASSO penalized regression. The procedure, permutation selection, is intended for applications where variable selection is the primary focus, and can be applied in a variety of structural settings, including that of generalized linear models. We briefly discuss connections between permutation selection and existing theory for the LASSO. In addition, we present a simulation study and an analysis of real biomedical data sets in which permutation selection is compared with selection based on the following: cross-validation (CV), the Bayesian information criterion (BIC), Scaled Sparse Linear Regression, and a selection method based on recently developed testing procedures for the LASSO. PMID:26243050
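
    The general idea of permutation-based penalty selection can be sketched as follows: permute the response so that any apparent signal is spurious, then choose the smallest penalty at which the LASSO selects nothing on the permuted data. This is a hedged reconstruction of the idea, not the authors' exact procedure (their grid, permutation count, and selection rule differ):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
n, p = 200, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = 2.0                                   # three true signals
y = X @ beta + rng.normal(size=n)

def n_selected_on_permuted(lam, n_perm=20):
    """Largest number of variables the LASSO selects across permuted responses."""
    counts = []
    for _ in range(n_perm):
        fit = Lasso(alpha=lam).fit(X, rng.permutation(y))
        counts.append(int(np.sum(fit.coef_ != 0.0)))
    return max(counts)

# Smallest penalty on the grid at which nothing is selected under permutation.
lambdas = np.geomspace(0.01, 2.0, 15)
chosen = next(l for l in lambdas if n_selected_on_permuted(l) == 0)
model = Lasso(alpha=chosen).fit(X, y)
```

    Because the permuted fits calibrate the penalty against pure noise, the chosen penalty tends to suppress false selections while retaining strong true signals.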

  16. Simulation Studies as Designed Experiments: The Comparison of Penalized Regression Models in the “Large p, Small n” Setting

    PubMed Central

    Chaibub Neto, Elias; Bare, J. Christopher; Margolin, Adam A.

    2014-01-01

    New algorithms are continuously proposed in computational biology. Performance evaluation of novel methods is important in practice. Nonetheless, the field experiences a lack of rigorous methodology aimed to systematically and objectively evaluate competing approaches. Simulation studies are frequently used to show that a particular method outperforms another. Oftentimes, however, simulation studies are not well designed, and it is hard to characterize the particular conditions under which different methods perform better. In this paper we propose the adoption of well established techniques in the design of computer and physical experiments for developing effective simulation studies. By following best practices in planning of experiments we are better able to understand the strengths and weaknesses of competing algorithms leading to more informed decisions about which method to use for a particular task. We illustrate the application of our proposed simulation framework with a detailed comparison of the ridge-regression, lasso and elastic-net algorithms in a large-scale study investigating the effects on predictive performance of sample size, number of features, true model sparsity, signal-to-noise ratio, and feature correlation, in situations where the number of covariates is usually much larger than sample size. Analysis of data sets containing tens of thousands of features but only a few hundred samples is nowadays routine in computational biology, where “omics” features such as gene expression, copy number variation and sequence data are frequently used in the predictive modeling of complex phenotypes such as anticancer drug response. The penalized regression approaches investigated in this study are popular choices in this setting and our simulations corroborate well established results concerning the conditions under which each one of these methods is expected to perform best while providing several novel insights. PMID:25289666
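
    A miniature version of such a designed simulation study treats each simulation factor (here only sample size and true-model sparsity) as an experimental factor and crosses their levels; the factor levels, penalty values, and error metric below are illustrative assumptions, far smaller than the paper's design:

```python
import itertools
import numpy as np
from sklearn.linear_model import Ridge, Lasso, ElasticNet

rng = np.random.default_rng(8)
p = 200                                          # "large p, small n" regime
results = {}
for n, n_true in itertools.product([50, 100], [5, 50]):   # crossed design factors
    beta = np.zeros(p)
    beta[:n_true] = 1.0                          # n_true controls true-model sparsity
    X = rng.normal(size=(n, p))
    y = X @ beta + rng.normal(size=n)
    X_test = rng.normal(size=(500, p))
    y_test = X_test @ beta
    for name, model in [("ridge", Ridge(alpha=1.0)),
                        ("lasso", Lasso(alpha=0.1)),
                        ("enet", ElasticNet(alpha=0.1, l1_ratio=0.5))]:
        mse = float(np.mean((model.fit(X, y).predict(X_test) - y_test) ** 2))
        results[(name, n, n_true)] = mse         # one response per design cell
```

    Analyzing `results` across the crossed factor levels, rather than at a single arbitrary setting, is what distinguishes a designed experiment from an ad hoc benchmark.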

  17. Nonparametric probability density estimation by optimization theoretic techniques

    NASA Technical Reports Server (NTRS)

    Scott, D. W.

    1976-01-01

    Two nonparametric probability density estimators are considered. The first is the kernel estimator. The problem of choosing the kernel scaling factor based solely on a random sample is addressed. An interactive mode is discussed and an algorithm proposed to choose the scaling factor automatically. The second nonparametric probability estimate uses penalty function techniques with the maximum likelihood criterion. A discrete maximum penalized likelihood estimator is proposed and is shown to be consistent in the mean square error. A numerical implementation technique for the discrete solution is discussed and examples displayed. An extensive simulation study compares the integrated mean square error of the discrete and kernel estimators. The robustness of the discrete estimator is demonstrated graphically.
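
    The discrete maximum penalized likelihood idea can be sketched by placing log-density values on a grid and penalizing their roughness. The specific penalty (squared second differences of the log-density), the grid, and the penalty weight here are assumptions made for illustration:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
data = rng.normal(size=200)
grid = np.linspace(-4.0, 4.0, 40)
h = grid[1] - grid[0]
idx = np.clip(np.searchsorted(grid, data), 0, grid.size - 1)  # bin each observation

def objective(u, lam=10.0):
    # u holds unnormalized log-density values on the grid.
    logf = u - np.log(np.sum(np.exp(u)) * h)     # normalize: density integrates to 1
    roughness = np.sum(np.diff(logf, 2) ** 2)    # squared second differences
    return -np.sum(logf[idx]) + lam * roughness  # penalized negative log-likelihood

res = minimize(objective, np.zeros(grid.size), method="L-BFGS-B")
f = np.exp(res.x)
f /= f.sum() * h                                 # estimated density on the grid
```

    Larger penalty weights trade likelihood for smoothness, the same bias-variance trade-off governed by the kernel scaling factor in the kernel estimator.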

  18. Ride comfort control in large flexible aircraft. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Warren, M. E.

    1971-01-01

    The problem of ameliorating the discomfort of passengers on a large air transport subject to flight disturbances is examined. The longitudinal dynamics of the aircraft, including effects of body flexing, are developed in terms of linear, constant coefficient differential equations in state variables. A cost functional, penalizing the rigid body displacements and flexure accelerations over the surface of the aircraft is formulated as a quadratic form. The resulting control problem, to minimize the cost subject to the state equation constraints, is of a class whose solutions are well known. The feedback gains for the optimal controller are calculated digitally, and the resulting autopilot is simulated on an analog computer and its performance evaluated.
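
    The control problem described is a standard linear-quadratic regulator: quadratic state cost, quadratic control cost, linear constant-coefficient dynamics. A generic sketch with an invented two-state system (not the aircraft model of the thesis):

```python
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])                 # toy linear dynamics: x' = Ax + Bu
B = np.array([[0.0], [1.0]])
Q = np.diag([10.0, 1.0])                     # penalize displacement and rate
R = np.array([[1.0]])                        # penalize control effort

P = solve_continuous_are(A, B, Q, R)         # solve the algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)              # optimal feedback gain, u = -Kx
closed_loop = A - B @ K
```

    The resulting feedback gains are constants, which is why the thesis could compute them digitally once and then run the autopilot on an analog computer.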

  19. Patients' functioning as predictor of nursing workload in acute hospital units providing rehabilitation care: a multi-centre cohort study

    PubMed Central

    2010-01-01

    Background Management decisions regarding quality and quantity of nurse staffing have important consequences for hospital budgets. Furthermore, these management decisions must address the nursing care requirements of the particular patients within an organizational unit. In order to determine optimal nurse staffing needs, the extent of nursing workload must first be known. Nursing workload is largely a function of the composite of the patients' individual health status, particularly with respect to functioning status, individual need for nursing care, and severity of symptoms. The International Classification of Functioning, Disability and Health (ICF) and the derived subsets, the so-called ICF Core Sets, are a standardized approach to describe patients' functioning status. The objectives of this study were to (1) examine the association between patients' functioning, as encoded by categories of the Acute ICF Core Sets, and nursing workload in patients in the acute care situation, (2) compare the variance in nursing workload explained by the ICF Core Set categories and with the Barthel Index, and (3) validate the Acute ICF Core Sets by their ability to predict nursing workload. Methods Patients' functioning at admission was assessed using the respective Acute ICF Core Set and the Barthel Index, whereas nursing workload data was collected using an established instrument. Associations between dependent and independent variables were modelled using linear regression. Variable selection was carried out using penalized regression. Results In patients with neurological and cardiopulmonary conditions, selected ICF categories and the Barthel Index Score explained the same variance in nursing workload (44% in neurological conditions, 35% in cardiopulmonary conditions), whereas ICF was slightly superior to Barthel Index Score for musculoskeletal conditions (20% versus 16%). 
Conclusions A substantial fraction of the variance in nursing workload in patients with rehabilitation needs in the acute hospital could be predicted by selected categories of the Acute ICF Core Sets, or by the Barthel Index score. Incorporating ICF Core Set-based data in nursing management decisions, particularly staffing decisions, may be beneficial. PMID:21034438

  20. Robust estimation for ordinary differential equation models.

    PubMed

    Cao, J; Wang, L; Xu, J

    2011-12-01

    Applied scientists often like to use ordinary differential equations (ODEs) to model complex dynamic processes that arise in biology, engineering, medicine, and many other areas. It is interesting but challenging to estimate ODE parameters from noisy data, especially when the data have some outliers. We propose a robust method to address this problem. The dynamic process is represented with a nonparametric function, which is a linear combination of basis functions. The nonparametric function is estimated by a robust penalized smoothing method. The penalty term is defined with the parametric ODE model, which controls the roughness of the nonparametric function and maintains the fidelity of the nonparametric function to the ODE model. The basis coefficients and ODE parameters are estimated in two nested levels of optimization. The coefficient estimates are treated as an implicit function of ODE parameters, which enables one to derive the analytic gradients for optimization using the implicit function theorem. Simulation studies show that the robust method gives satisfactory estimates for the ODE parameters from noisy data with outliers. The robust method is demonstrated by estimating a predator-prey ODE model from real ecological data. © 2011, The International Biometric Society.
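
    The two-level estimation scheme can be sketched for the simplest ODE, x'(t) = -θ·x(t): the inner level solves a penalized least-squares problem whose penalty measures fidelity to the ODE, and the outer level optimizes θ. This sketch uses a plain (non-robust) squared-error fit and a finite-difference derivative operator in place of a basis expansion, both simplifying assumptions:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(5)
t = np.linspace(0.0, 2.0, 50)
x_obs = np.exp(-1.5 * t) + rng.normal(scale=0.02, size=t.size)  # noisy decay data

dt = t[1] - t[0]
D = (np.eye(t.size, k=1) - np.eye(t.size))[:-1] / dt   # forward-difference operator
S = np.eye(t.size)[:-1]                                 # selects the left grid points

def inner_fit(theta, lam=50.0):
    """Penalized LS: ||x - x_obs||^2 + lam * ||Dx + theta*x||^2 (ODE-fidelity penalty)."""
    P = D + theta * S
    return np.linalg.solve(np.eye(t.size) + lam * P.T @ P, x_obs)

def outer_objective(theta):
    # The state estimate is an implicit function of theta, as in the paper.
    return float(np.sum((inner_fit(theta) - x_obs) ** 2))

res = minimize_scalar(outer_objective, bounds=(0.1, 5.0), method="bounded")
theta_hat = res.x
```

    When θ is near the truth the ODE penalty barely pulls the state estimate away from the data, so the outer data-fit criterion is minimized near the true parameter (here 1.5).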

  1. Structured penalties for functional linear models-partially empirical eigenvectors for regression.

    PubMed

    Randolph, Timothy W; Harezlak, Jaroslaw; Feng, Ziding

    2012-01-01

    One of the challenges with functional data is incorporating geometric structure, or local correlation, into the analysis. This structure is inherent in the output from an increasing number of biomedical technologies, and a functional linear model is often used to estimate the relationship between the predictor functions and scalar responses. Common approaches to the problem of estimating a coefficient function typically involve two stages: regularization and estimation. Regularization is usually done via dimension reduction, projecting onto a predefined span of basis functions or a reduced set of eigenvectors (principal components). In contrast, we present a unified approach that directly incorporates geometric structure into the estimation process by exploiting the joint eigenproperties of the predictors and a linear penalty operator. In this sense, the components in the regression are 'partially empirical' and the framework is provided by the generalized singular value decomposition (GSVD). The form of the penalized estimation is not new, but the GSVD clarifies the process and informs the choice of penalty by making explicit the joint influence of the penalty and predictors on the bias, variance and performance of the estimated coefficient function. Laboratory spectroscopy data and simulations are used to illustrate the concepts.
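
    The penalized estimator whose behavior the GSVD analysis clarifies is the generalized-ridge solution with a linear penalty operator L. A minimal sketch using a discrete second-derivative penalty (a standard choice; the data and λ are illustrative, and the GSVD itself is not computed here):

```python
import numpy as np

rng = np.random.default_rng(6)
n, p = 60, 30
grid = np.linspace(0.0, 1.0, p)
beta_true = np.sin(2.0 * np.pi * grid)              # smooth coefficient function
X = rng.normal(size=(n, p)).cumsum(axis=1)          # smooth-ish predictor curves
y = X @ beta_true / p + rng.normal(scale=0.1, size=n)

L = np.diff(np.eye(p), 2, axis=0)                   # discrete second-derivative operator
lam = 1.0
# Generalized-ridge estimate: argmin ||y - Xb||^2 / p + lam * ||L b||^2.
beta_hat = np.linalg.solve(X.T @ X / p + lam * L.T @ L, X.T @ y / p)
```

    The GSVD of the pair (X, L) diagonalizes the fit and penalty terms simultaneously, which is what makes the joint influence of predictors and penalty on the bias and variance of this estimator explicit.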

  2. Promote quantitative ischemia imaging via myocardial perfusion CT iterative reconstruction with tensor total generalized variation regularization

    NASA Astrophysics Data System (ADS)

    Gu, Chengwei; Zeng, Dong; Lin, Jiahui; Li, Sui; He, Ji; Zhang, Hao; Bian, Zhaoying; Niu, Shanzhou; Zhang, Zhang; Huang, Jing; Chen, Bo; Zhao, Dazhe; Chen, Wufan; Ma, Jianhua

    2018-06-01

    Myocardial perfusion computed tomography (MPCT) imaging is commonly used to detect myocardial ischemia quantitatively. A limitation in MPCT is that an additional radiation dose is required compared to unenhanced CT due to its repeated dynamic data acquisition. Meanwhile, noise and streak artifacts in low-dose cases are the main factors that degrade the accuracy of quantifying myocardial ischemia and hamper the diagnostic utility of the filtered backprojection reconstructed MPCT images. Moreover, it is noted that the MPCT images are composed of a series of 2/3D images, which can be naturally regarded as a 3/4-order tensor, and the MPCT images are globally correlated along time and are sparse across space. To obtain higher fidelity ischemia from low-dose MPCT acquisitions quantitatively, we propose a robust statistical iterative MPCT image reconstruction algorithm by incorporating tensor total generalized variation (TTGV) regularization into a penalized weighted least-squares framework. Specifically, the TTGV regularization fuses the spatial correlation of the myocardial structure and the temporal continuation of the contrast agent intake during the perfusion. Then, an efficient iterative strategy is developed for the objective function optimization. Comprehensive evaluations have been conducted on a digital XCAT phantom and a preclinical porcine dataset regarding the accuracy of the reconstructed MPCT images, the quantitative differentiation of ischemia and the algorithm’s robustness and efficiency.

  3. A Novel Approach to Prenatal Measurement of the Fetal Frontal Lobe Using Three-Dimensional Sonography

    PubMed Central

    Brown, Steffen A.; Hall, Rebecca; Hund, Lauren; Gutierrez, Hilda L.; Hurley, Timothy; Holbrook, Bradley D.; Bakhireva, Ludmila N.

    2017-01-01

    Objective While prenatal 3D ultrasonography results in improved diagnostic accuracy, no data are available on biometric assessment of the fetal frontal lobe. This study was designed to assess feasibility of a standardized approach to biometric measurement of the fetal frontal lobe and to construct frontal lobe growth trajectories throughout gestation. Study Design A sonographic 3D volume set was obtained and measured in 101 patients between 16.1 and 33.7 gestational weeks. Measurements were obtained by two independent raters. To model the relationship between gestational age and each frontal lobe measurement, flexible linear regression models were fit using penalized regression splines. Results The sample contained an ethnically diverse population (7.9% Native Americans, 45.5% Hispanic/Latina). There was high inter-rater reliability (correlation coefficients: 0.95, 1.0, and 0.87 for frontal lobe length, width, and height; p-values < 0.001). Graphs of the growth trajectories and corresponding percentiles were estimated as a function of gestational age. The estimated rates of frontal lobe growth were 0.096 cm/week, 0.247 cm/week, and 0.111 cm/week for length, width, and height. Conclusion To our knowledge, this is the first study to examine fetal frontal lobe growth trajectories through 3D prenatal ultrasound examination. Such normative data will allow for future prenatal evaluation of a particular disease state by 3D ultrasound imaging. PMID:29075046

  4. Aircraft Optimization for Minimum Environmental Impact

    NASA Technical Reports Server (NTRS)

    Antoine, Nicolas; Kroo, Ilan M.

    2001-01-01

    The objective of this research is to investigate the tradeoff between operating cost and environmental acceptability of commercial aircraft. This involves optimizing the aircraft design and mission to minimize operating cost while constraining exterior noise and emissions. Growth in air traffic and airport neighboring communities has resulted in increased pressure to severely penalize airlines that do not meet strict local noise and emissions requirements. As a result, environmental concerns have become potent driving forces in commercial aviation. Traditionally, aircraft have been first designed to meet performance and cost goals, and adjusted to satisfy the environmental requirements at given airports. The focus of the present study is to determine the feasibility of including noise and emissions constraints in the early design of the aircraft and mission. This paper introduces the design tool and results from a case study involving a 250-passenger airliner.

  5. Bayesian Scalar-on-Image Regression with Application to Association Between Intracranial DTI and Cognitive Outcomes

    PubMed Central

    Huang, Lei; Goldsmith, Jeff; Reiss, Philip T.; Reich, Daniel S.; Crainiceanu, Ciprian M.

    2013-01-01

    Diffusion tensor imaging (DTI) measures water diffusion within white matter, allowing for in vivo quantification of brain pathways. These pathways often subserve specific functions, and impairment of those functions is often associated with imaging abnormalities. As a method for predicting clinical disability from DTI images, we propose a hierarchical Bayesian “scalar-on-image” regression procedure. Our procedure introduces a latent binary map that estimates the locations of predictive voxels and penalizes the magnitude of effect sizes in these voxels, thereby resolving the ill-posed nature of the problem. By inducing a spatial prior structure, the procedure yields a sparse association map that also maintains spatial continuity of predictive regions. The method is demonstrated on a simulation study and on a study of association between fractional anisotropy and cognitive disability in a cross-sectional sample of 135 multiple sclerosis patients. PMID:23792220

  6. Level set formulation of two-dimensional Lagrangian vortex detection methods

    NASA Astrophysics Data System (ADS)

    Hadjighasem, Alireza; Haller, George

    2016-10-01

    We propose here the use of the variational level set methodology to capture Lagrangian vortex boundaries in 2D unsteady velocity fields. This method reformulates earlier approaches that seek material vortex boundaries as extremum solutions of variational problems. We demonstrate the performance of this technique for two different variational formulations built upon different notions of coherence. The first formulation uses an energy functional that penalizes the deviation of a closed material line from piecewise uniform stretching [Haller and Beron-Vera, J. Fluid Mech. 731, R4 (2013)]. The second energy function is derived for a graph-based approach to vortex boundary detection [Hadjighasem et al., Phys. Rev. E 93, 063107 (2016)]. Our level-set formulation captures an a priori unknown number of vortices simultaneously at relatively low computational cost. We illustrate the approach by identifying vortices from different coherence principles in several examples.

  7. Regularization destriping of remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Basnayake, Ranil; Bollt, Erik; Tufillaro, Nicholas; Sun, Jie; Gierach, Michelle

    2017-07-01

    We illustrate the utility of variational destriping for ocean color images from both multispectral and hyperspectral sensors. In particular, we examine data from a filter spectrometer, the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (NPP) satellite, and an airborne grating spectrometer, the Jet Propulsion Laboratory's (JPL) hyperspectral Portable Remote Imaging Spectrometer (PRISM) sensor. We solve the destriping problem using a variational regularization method by giving weights spatially to preserve the other features of the image during the destriping process. The target functional penalizes the neighborhood of stripes (strictly, directionally uniform features) while promoting data fidelity, and the functional is minimized by solving the Euler-Lagrange equations with an explicit finite-difference scheme. We show the accuracy of our method from a benchmark data set which represents the sea surface temperature off the coast of Oregon, USA. Technical details, such as how to impose continuity across data gaps using inpainting, are also described.
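
    A toy version of such a variational destriper can be written with an explicit finite-difference scheme, as in the paper, but with a plain quadratic penalty on across-stripe variation in place of the spatially weighted functional; the parameters below are illustrative assumptions.

```python
import numpy as np

def destripe(f, lam=5.0, n_iter=500, dt=0.05):
    """Explicit gradient descent on the toy functional
    E(u) = 0.5*||u - f||^2 + 0.5*lam*||D_v u||^2,
    where D_v differences across the stripe direction (axis 0).
    dt must satisfy dt * (1 + 4*lam) < 2 for stability."""
    u = f.astype(float).copy()
    for _ in range(n_iter):
        lap = np.zeros_like(u)
        lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]   # interior Laplacian term
        lap[0] = u[1] - u[0]                          # Neumann boundaries
        lap[-1] = u[-2] - u[-1]
        u = u + dt * (f - u + lam * lap)              # Euler-Lagrange descent
    return u
```

    Run on an image with alternating-row stripes, the iteration damps the row-to-row oscillation while the fidelity term keeps the mean of the image fixed.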

  8. Research on output feedback control

    NASA Technical Reports Server (NTRS)

    Calise, A. J.; Kramer, F. S.

    1985-01-01

    In designing fixed order compensators, an output feedback formulation has been adopted by suitably augmenting the system description to include the compensator states. However, the minimization of the performance index over the range of possible compensator descriptions was impeded due to the nonuniqueness of the compensator transfer function. A controller canonical form of the compensator was chosen to reduce the number of free parameters to its minimal number in the optimization. In the MIMO case, the controller form requires a prespecified set of ascending controllability indices. This constraint on the compensator structure is rather innocuous in relation to the increase in convergence rate of the optimization. Moreover, the controller form is easily relatable to a unique controller transfer function description. This structure of the compensator does not require penalizing the compensator states for a nonzero or coupled solution, a problem that occurs when following a standard output feedback synthesis formulation.

  9. Flexible and structured survival model for a simultaneous estimation of non-linear and non-proportional effects and complex interactions between continuous variables: Performance of this multidimensional penalized spline approach in net survival trend analysis.

    PubMed

    Remontet, Laurent; Uhry, Zoé; Bossard, Nadine; Iwaz, Jean; Belot, Aurélien; Danieli, Coraline; Charvat, Hadrien; Roche, Laurent

    2018-01-01

    Cancer survival trend analyses are essential to describe accurately the way medical practices impact patients' survival according to the year of diagnosis. To this end, survival models should be able to account simultaneously for non-linear and non-proportional effects and for complex interactions between continuous variables. However, in the statistical literature, there is no consensus yet on how to build such models that should be flexible but still provide smooth estimates of survival. In this article, we tackle this challenge by smoothing the complex hypersurface (time since diagnosis, age at diagnosis, year of diagnosis, and mortality hazard) using a multidimensional penalized spline built from the tensor product of the marginal bases of time, age, and year. Considering this penalized survival model as a Poisson model, we assess the performance of this approach in estimating the net survival with a comprehensive simulation study that reflects simple and complex realistic survival trends. The bias was generally small and the root mean squared error was good and often similar to that of the true model that generated the data. This parametric approach offers many advantages and interesting prospects (such as forecasting) that make it an attractive and efficient tool for survival trend analyses.

  10. Generalized linear mixed models with varying coefficients for longitudinal data.

    PubMed

    Zhang, Daowen

    2004-03-01

    The routinely assumed parametric functional form in the linear predictor of a generalized linear mixed model for longitudinal data may be too restrictive to represent true underlying covariate effects. We relax this assumption by representing these covariate effects by smooth but otherwise arbitrary functions of time, with random effects used to model the correlation induced by among-subject and within-subject variation. Due to the usually intractable integration involved in evaluating the quasi-likelihood function, the double penalized quasi-likelihood (DPQL) approach of Lin and Zhang (1999, Journal of the Royal Statistical Society, Series B61, 381-400) is used to estimate the varying coefficients and the variance components simultaneously by representing a nonparametric function by a linear combination of fixed effects and random effects. A scaled chi-squared test based on the mixed model representation of the proposed model is developed to test whether an underlying varying coefficient is a polynomial of certain degree. We evaluate the performance of the procedures through simulation studies and illustrate their application with Indonesian children infectious disease data.

  11. 29 CFR 1975.5 - States and political subdivisions thereof.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... penal institutions; State, county, and municipal judicial bodies; State University Boards of Trustees; State, county, and municipal public school boards and commissions; and public libraries. (2) Depending...

  12. Fisher's method of scoring in statistical image reconstruction: comparison of Jacobi and Gauss-Seidel iterative schemes.

    PubMed

    Hudson, H M; Ma, J; Green, P

    1994-01-01

    Many algorithms for medical image reconstruction adopt versions of the expectation-maximization (EM) algorithm. In this approach, parameter estimates are obtained which maximize a complete data likelihood or penalized likelihood, in each iteration. Implicitly (and sometimes explicitly) penalized algorithms require smoothing of the current reconstruction in the image domain as part of their iteration scheme. In this paper, we discuss alternatives to EM which adapt Fisher's method of scoring (FS) and other methods for direct maximization of the incomplete data likelihood. Jacobi and Gauss-Seidel methods for non-linear optimization provide efficient algorithms applying FS in tomography. One approach uses smoothed projection data in its iterations. We investigate the convergence of Jacobi and Gauss-Seidel algorithms with clinical tomographic projection data.
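
    The two iteration schemes compared here differ only in whether a sweep reuses freshly updated components. A minimal sketch for a generic diagonally dominant linear system (not the tomographic scoring equations themselves):

```python
import numpy as np

def jacobi(A, b, n_iter=100):
    """Jacobi: update every component from the previous iterate."""
    D = np.diag(A)
    R = A - np.diagflat(D)
    x = np.zeros_like(b)
    for _ in range(n_iter):
        x = (b - R @ x) / D
    return x

def gauss_seidel(A, b, n_iter=100):
    """Gauss-Seidel: reuse freshly updated components within each sweep,
    which typically converges faster than Jacobi."""
    n = len(b)
    x = np.zeros_like(b)
    for _ in range(n_iter):
        for i in range(n):
            s = A[i] @ x - A[i, i] * x[i]
            x[i] = (b[i] - s) / A[i, i]
    return x
```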

  13. Route Flap Damping Made Usable

    NASA Astrophysics Data System (ADS)

    Pelsser, Cristel; Maennel, Olaf; Mohapatra, Pradosh; Bush, Randy; Patel, Keyur

    The Border Gateway Protocol (BGP), the de facto inter-domain routing protocol of the Internet, is known to be noisy. The protocol has two main mechanisms to ameliorate this, MinRouteAdvertisementInterval (MRAI), and Route Flap Damping (RFD). MRAI deals with very short bursts on the order of a few to 30 seconds. RFD deals with longer bursts, minutes to hours. Unfortunately, RFD was found to severely penalize sites for being well-connected because topological richness amplifies the number of update messages exchanged. So most operators have disabled it. Through measurement, this paper explores the avenue of absolutely minimal change to code, and shows that a few RFD algorithmic constants and limits can be trivially modified, with the result being damping a non-trivial amount of long term churn without penalizing well-behaved prefixes' normal convergence process.

  14. Stationary wavelet transform for under-sampled MRI reconstruction.

    PubMed

    Kayvanrad, Mohammad H; McLeod, A Jonathan; Baxter, John S H; McKenzie, Charles A; Peters, Terry M

    2014-12-01

    In addition to coil sensitivity data (parallel imaging), sparsity constraints are often used as an additional lp-penalty for under-sampled MRI reconstruction (compressed sensing). Penalizing the traditional decimated wavelet transform (DWT) coefficients, however, results in visual pseudo-Gibbs artifacts, some of which are attributed to the lack of translation invariance of the wavelet basis. We show that these artifacts can be greatly reduced by penalizing the translation-invariant stationary wavelet transform (SWT) coefficients. This holds with various additional reconstruction constraints, including coil sensitivity profiles and total variation. Additionally, SWT reconstructions result in lower error values and faster convergence compared to DWT. These concepts are illustrated with extensive experiments on in vivo MRI data with particular emphasis on multiple-channel acquisitions. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. Mathematical optimization of high dose-rate brachytherapy—derivation of a linear penalty model from a dose-volume model

    NASA Astrophysics Data System (ADS)

    Morén, B.; Larsson, T.; Carlsson Tedgren, Å.

    2018-03-01

    High dose-rate brachytherapy is a method for cancer treatment where the radiation source is placed within the body, inside or close to a tumour. For dose planning, mathematical optimization techniques are being used in practice and the most common approach is to use a linear model which penalizes deviations from specified dose limits for the tumour and for nearby organs. This linear penalty model is easy to solve, but its weakness lies in the poor correlation of its objective value and the dose-volume objectives that are used clinically to evaluate dose distributions. Furthermore, the model contains parameters that have no clear clinical interpretation. Another approach for dose planning is to solve mixed-integer optimization models with explicit dose-volume constraints which include parameters that directly correspond to dose-volume objectives, and which are therefore tangible. The two mentioned models take the overall goals for dose planning into account in fundamentally different ways. We show that there is, however, a mathematical relationship between them by deriving a linear penalty model from a dose-volume model. This relationship has not been established before and improves the understanding of the linear penalty model. In particular, the parameters of the linear penalty model can be interpreted as dual variables in the dose-volume model.
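
    The linear penalty model's objective, weighted one-sided deviations from dose limits, can be illustrated directly. The weights and limits below are placeholders, and the snippet only evaluates the penalty for a given dose distribution rather than optimizing dwell times.

```python
import numpy as np

def linear_penalty(dose, lower=None, upper=None, w_low=1.0, w_high=1.0):
    """Weighted one-sided deviations: doses below `lower` (target
    underdosage) and above `upper` (organ-at-risk overdosage) are
    penalized linearly; doses inside the limits cost nothing."""
    pen = 0.0
    if lower is not None:
        pen += w_low * np.clip(lower - dose, 0.0, None).sum()
    if upper is not None:
        pen += w_high * np.clip(dose - upper, 0.0, None).sum()
    return pen
```

    The dose-volume counterpart would instead count the fraction of dose points violating a limit, which is why the two objective values correlate poorly even though, as the paper shows, the models are mathematically related.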

  16. Bayesian inference of Calibration curves: application to archaeomagnetism

    NASA Astrophysics Data System (ADS)

    Lanos, P.

    2003-04-01

    The range of errors that occur at different stages of the archaeomagnetic calibration process are modelled using a Bayesian hierarchical model. The archaeomagnetic data obtained from archaeological structures such as hearths, kilns or sets of bricks and tiles, exhibit considerable experimental errors and are typically more or less well dated by archaeological context, history or chronometric methods (14C, TL, dendrochronology, etc.). They can also be associated with stratigraphic observations which provide prior relative chronological information. The modelling we describe in this paper allows all these observations, on materials from a given period, to be linked together, and the use of penalized maximum likelihood for smoothing univariate, spherical or three-dimensional time series data allows representation of the secular variation of the geomagnetic field over time. The smooth curve we obtain (which takes the form of a penalized natural cubic spline) provides an adaptation to the effects of variability in the density of reference points over time. Since our model takes account of all the known errors in the archaeomagnetic calibration process, we are able to obtain a functional highest-posterior-density envelope on the new curve. With this new posterior estimate of the curve available to us, the Bayesian statistical framework then allows us to estimate the calendar dates of undated archaeological features (such as kilns) based on one, two or three geomagnetic parameters (inclination, declination and/or intensity). Date estimates are presented in much the same way as those that arise from radiocarbon dating. In order to illustrate the model and inference methods used, we will present results based on German archaeomagnetic data recently published by a German team.

  17. SkyFACT: high-dimensional modeling of gamma-ray emission with adaptive templates and penalized likelihoods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Storm, Emma; Weniger, Christoph; Calore, Francesca, E-mail: e.m.storm@uva.nl, E-mail: c.weniger@uva.nl, E-mail: francesca.calore@lapth.cnrs.fr

    We present SkyFACT (Sky Factorization with Adaptive Constrained Templates), a new approach for studying, modeling and decomposing diffuse gamma-ray emission. Like most previous analyses, the approach relies on predictions from cosmic-ray propagation codes like GALPROP and DRAGON. However, in contrast to previous approaches, we account for the fact that models are not perfect and allow for a very large number (≳ 10^5) of nuisance parameters to parameterize these imperfections. We combine methods of image reconstruction and adaptive spatio-spectral template regression in one coherent hybrid approach. To this end, we use penalized Poisson likelihood regression, with regularization functions that are motivated by the maximum entropy method. We introduce methods to efficiently handle the high dimensionality of the convex optimization problem as well as the associated semi-sparse covariance matrix, using the L-BFGS-B algorithm and Cholesky factorization. We test the method both on synthetic data as well as on gamma-ray emission from the inner Galaxy, |ℓ| < 90° and |b| < 20°, as observed by the Fermi Large Area Telescope. We finally define a simple reference model that removes most of the residual emission from the inner Galaxy, based on conventional diffuse emission components as well as components for the Fermi bubbles, the Fermi Galactic center excess, and extended sources along the Galactic disk. Variants of this reference model can serve as basis for future studies of diffuse emission in and outside the Galactic disk.

  18. Exposure-lag-response in Longitudinal Studies: Application of Distributed Lag Non-linear Models in an Occupational Cohort.

    PubMed

    Neophytou, Andreas M; Picciotto, Sally; Brown, Daniel M; Gallagher, Lisa E; Checkoway, Harvey; Eisen, Ellen A; Costello, Sadie

    2018-02-13

    Prolonged exposures can have complex relationships with health outcomes, as timing, duration, and intensity of exposure are all potentially relevant. Summary measures such as cumulative exposure or average intensity of exposure may not fully capture these relationships. We applied penalized and unpenalized distributed lag non-linear models (DLNMs) with flexible exposure-response and lag-response functions in order to examine the association between crystalline silica exposure and mortality from lung cancer and non-malignant respiratory disease in a cohort study of 2,342 California diatomaceous earth workers, followed 1942-2011. We also assessed associations using simple measures of cumulative exposure assuming linear exposure-response and constant lag-response. Measures of association from DLNMs were generally higher than from simpler models. Rate ratios from penalized DLNMs corresponding to average daily exposures of 0.4 mg/m3 during lag years 31-50 prior to the age of observed cases were 1.47 (95% confidence interval (CI) 0.92, 2.35) for lung cancer and 1.80 (95% CI: 1.14, 2.85) for non-malignant respiratory disease. Rate ratios from the simpler models for the same exposure scenario were 1.15 (95% CI: 0.89-1.48) and 1.23 (95% CI: 1.03-1.46) respectively. Longitudinal cohort studies of prolonged exposures and chronic health outcomes should explore methods allowing for flexibility and non-linearities in the exposure-lag-response. © The Author(s) 2018. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health.

  19. Data-Driven Hierarchical Structure Kernel for Multiscale Part-Based Object Recognition

    PubMed Central

    Wang, Botao; Xiong, Hongkai; Jiang, Xiaoqian; Zheng, Yuan F.

    2017-01-01

    Detecting generic object categories in images and videos is a fundamental issue in computer vision. However, it faces challenges from inter- and intraclass diversity, as well as distortions caused by viewpoints, poses, deformations, and so on. To handle object variations, this paper constructs a structure kernel and proposes a multiscale part-based model incorporating the discriminative power of kernels. The structure kernel measures the resemblance of part-based objects in three aspects: 1) the global similarity term to measure the resemblance of the global visual appearance of relevant objects; 2) the part similarity term to measure the resemblance of the visual appearance of distinctive parts; and 3) the spatial similarity term to measure the resemblance of the spatial layout of parts. In essence, the deformation of parts in the structure kernel is penalized in a multiscale space with respect to horizontal displacement, vertical displacement, and scale difference. Part similarities are combined with different weights, which are optimized efficiently to maximize the intraclass similarities and minimize the interclass similarities by the normalized stochastic gradient ascent algorithm. In addition, the parameters of the structure kernel are learned during the training process with regard to the distribution of the data in a more discriminative way. With flexible part sizes on scale and displacement, it can be more robust to intraclass variations, poses, and viewpoints. Theoretical analysis and experimental evaluations demonstrate that the proposed multiscale part-based representation model with structure kernel exhibits accurate and robust performance, and outperforms state-of-the-art object classification approaches. PMID:24808345

  20. Designing Flood Management Systems for Joint Economic and Ecological Robustness

    NASA Astrophysics Data System (ADS)

    Spence, C. M.; Grantham, T.; Brown, C. M.; Poff, N. L.

    2015-12-01

    Freshwater ecosystems across the United States are threatened by hydrologic change caused by water management operations and non-stationary climate trends. Nonstationary hydrology also threatens flood management systems' performance. Ecosystem managers and flood risk managers need tools to design systems that achieve flood risk reduction objectives while sustaining ecosystem functions and services in an uncertain hydrologic future. Robust optimization is used in water resources engineering to guide system design under climate change uncertainty. Using principles introduced by Eco-Engineering Decision Scaling (EEDS), we extend robust optimization techniques to design flood management systems that meet both economic and ecological goals simultaneously across a broad range of future climate conditions. We use three alternative robustness indices to identify flood risk management solutions that preserve critical ecosystem functions in a case study from the Iowa River, where recent severe flooding has tested the limits of the existing flood management system. We seek design modifications to the system that both reduce expected cost of flood damage while increasing ecologically beneficial inundation of riparian floodplains across a wide range of plausible climate futures. The first robustness index measures robustness as the fraction of potential climate scenarios in which both engineering and ecological performance goals are met, implicitly weighting each climate scenario equally. The second index builds on the first by using climate projections to weight each climate scenario, prioritizing acceptable performance in climate scenarios most consistent with climate projections. The last index measures robustness as mean performance across all climate scenarios, but penalizes scenarios with worse performance than average, rewarding consistency. 
Results stemming from alternate robustness indices reflect implicit assumptions about attitudes toward risk and reveal the tradeoffs between using structural and non-structural flood management strategies to ensure economic and ecological robustness.
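
    The three indices can be sketched as follows. The exact functional forms (especially the shortfall penalty in the third index) are assumptions for illustration, not the authors' definitions.

```python
import numpy as np

def robustness_indices(perf, goal, weights=None, kappa=1.0):
    """Three scenario-robustness indices (illustrative forms):
    1) fraction of climate scenarios meeting the goal (equal weights),
    2) projection-weighted fraction meeting the goal,
    3) mean performance minus a penalty on below-average scenarios,
       which rewards consistency."""
    perf = np.asarray(perf, float)
    ok = perf >= goal
    frac = ok.mean()
    if weights is None:
        weights = np.ones_like(perf)
    weights = np.asarray(weights, float)
    weights = weights / weights.sum()
    wfrac = float((weights * ok).sum())
    shortfall = np.clip(perf.mean() - perf, 0.0, None)  # below-average only
    mean_pen = perf.mean() - kappa * shortfall.mean()
    return frac, wfrac, mean_pen
```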

  1. Sparse representation and dictionary learning penalized image reconstruction for positron emission tomography.

    PubMed

    Chen, Shuhang; Liu, Huafeng; Shi, Pengcheng; Chen, Yunmei

    2015-01-21

    Accurate and robust reconstruction of the radioactivity concentration is of great importance in positron emission tomography (PET) imaging. Given the Poisson nature of photon-counting measurements, we present a reconstruction framework that integrates a sparsity penalty on a dictionary into a maximum likelihood estimator. Patch-sparsity on a dictionary provides the regularization, and iterative procedures are used to solve the maximum likelihood function formulated on Poisson statistics. Specifically, in our formulation, a dictionary could be trained on CT images, to provide intrinsic anatomical structures for the reconstructed images, or adaptively learned from the noisy measurements of PET. The accuracy of the strategy is demonstrated with very promising results on Monte Carlo simulations and on real data.

  2. [Expansive development of the French regulation of the genetic print files after the recent reforms (Part I)].

    PubMed

    Etxeberria Guridi, José Francisco

    2003-01-01

    French regulations on "genetic prints" and their subsequent incorporation into an automated file within the penal process initially (1998) earned a positive judgement, owing to the guarantees surrounding these techniques, given that their use interferes with the rights and freedoms of the individual. This original regulation has since undergone a legislative evolution that raises serious doubts about the current system of guarantees. Two legal reforms with security as their main axis (2001 and 2003) give greater weight to the "genetic print" file by extending the situations in which it comes into play, running counter to the proportionality that must be observed whenever the rights and freedoms of the individual may be affected.

  3. Statistical validation of normal tissue complication probability models.

    PubMed

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
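
    LASSO's variable-selection behaviour comes from soft-thresholding. Below is a minimal cyclic coordinate-descent sketch for the linear-model case; the paper fits penalized logistic NTCP models, which differ in the likelihood, so this is only an illustration of the penalty's effect.

```python
import numpy as np

def soft_threshold(z, t):
    """Shrink z toward zero by t; the LASSO proximal operator."""
    return np.sign(z) * np.clip(np.abs(z) - t, 0.0, None)

def lasso_cd(X, y, lam, n_iter=200):
    """LASSO by cyclic coordinate descent on
    0.5*||y - X beta||^2 + lam*||beta||_1 (linear-model sketch)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    r = y - X @ beta
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]              # remove j's contribution
            rho = X[:, j] @ r                   # partial residual correlation
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * beta[j]
    return beta
```

    Small coefficients are set exactly to zero, which is the sparsity property that makes the resulting NTCP models interpretable, and also the instability that double cross-validation is meant to expose.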

  4. Demography, vulnerabilities and right to health to Brazilian prison population.

    PubMed

    Soares, Marden Marques; Bueno, Paula Michele Martins Gomes

    2016-06-01

    This study reviews the latest research on the profile of the Brazilian prison population, its demography, and current laws and regulations, with the aim of ensuring the human right to health. The Brazilian prison system is a complex universe in which state and federal criminal jurisdictions keep more than 607,000 people in custody. This population is 75% young and black, 67% poorly educated, and 41% pre-trial detainees, living in overcrowded and architecturally degraded prisons, with population growth of around 575% in 24 years, making this environment a major site of disease production. The prison becomes the object of differentiated intervention by public bodies linked to the executive and the judiciary; it is worth remarking that the data show high levels of inequality and health vulnerability among the prison population, whose needs call for a set of cross-sector public policy actions in penal execution.

  5. [Freedom of conscience. Biojuridical conflicts at multicultural societies].

    PubMed

    Albert Márquez, Marta

    2010-01-01

    The paper analyzes the right of healthcare professionals to conscientious objection in multicultural societies. The ethical relativism characteristic of these societies coexists with an apparently paradoxical reduction in the exercise of freedom of conscience. "Apparently" because, in the end, ethical relativism tends toward the adoption of dogmatic attitudes. Special attention is paid to the situation of Spanish healthcare in relation to euthanasia and abortion. With regard to euthanasia, the "dignified death" draft bill of Andalucía is considered. With regard to abortion, we examine the reform of the Penal Code in the context of a new regulation on women's "reproductive health", which entails the adoption of a system of time limits and the characterization of abortion as a women's right. It is concluded that the freedom of conscience of healthcare professionals would probably be at risk if the proposed legal policies do not change.

  6. [Views of forensic medicine and criminology on the pathodynamics of homicide].

    PubMed

    Kokavec, M; Dobrotka, G

    2000-01-01

    Trauma and violence represent the domains of forensic medical expertise. The objective finding on the victim of the homicide makes it possible to reconstruct the way and mechanism of the injury connected with it and thus to determine the cause of death. The traces of violence contain many indices pointing at specific personal characteristics of the culprit, his motivation to the crime and his state of mind at the time of the homicide. To judge the penal responsibility and the guilt of the culprit it is important for the court to have the analysis of the dynamics and of the causal background of the crime, especially the synthetic evaluation of subjective, situation, or eventually psychological factors of violence with the mortal effect. The interdisciplinary forensic-medical, forensic-psychological and psychopathological approach will make it possible to provide the court with a complex forensic expertise.

  7. Edge-Preserving Image Smoothing Constraint in Multivariate Curve Resolution-Alternating Least Squares (MCR-ALS) of Hyperspectral Data.

    PubMed

    Hugelier, Siewert; Vitale, Raffaele; Ruckebusch, Cyril

    2018-03-01

    This article explores smoothing with edge-preserving properties as a spatial constraint for the resolution of hyperspectral images with multivariate curve resolution-alternating least squares (MCR-ALS). For each constrained component image (distribution map), irrelevant spatial details and noise are smoothed applying an L 1 - or L 0 -norm penalized least squares regression, highlighting in this way big changes in intensity of adjacent pixels. The feasibility of the constraint is demonstrated on three different case studies, in which the objects under investigation are spatially clearly defined, but have significant spectral overlap. This spectral overlap is detrimental for obtaining a good resolution and additional spatial information should be provided. The final results show that the spatial constraint enables better image (map) abstraction, artifact removal, and better interpretation of the results obtained, compared to a classical MCR-ALS analysis of hyperspectral images.
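
    An L1-penalized least-squares smoother of this kind can be approximated by iteratively reweighted L2, which is one common surrogate for the non-smooth penalty; the paper's actual solver may differ, and the signal and parameters below are illustrative.

```python
import numpy as np

def l1_smooth(y, lam=1.0, n_iter=10, eps=1e-4):
    """Approximately minimize ||x - y||^2 + lam * sum_i |x_{i+1} - x_i|
    by iteratively reweighted least squares: each pass solves a
    quadratic problem whose per-difference weights 1/(|d|+eps) mimic
    the L1 penalty, flattening small details while keeping big edges."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)        # first-difference operator
    x = y.copy()
    for _ in range(n_iter):
        w = 1.0 / (np.abs(D @ x) + eps)   # large weight -> flat region
        x = np.linalg.solve(np.eye(n) + lam * D.T @ (w[:, None] * D), y)
    return x
```

    On a noisy step signal this suppresses the small pixel-to-pixel intensity changes while preserving most of the jump, which is the edge-preserving behaviour exploited by the spatial constraint.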

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall-Anese, Emiliano; Zhao, Changhong; Guggilam, Swaroop

    Power networks have to withstand a variety of disturbances that affect system frequency, and the problem is compounded with the increasing integration of intermittent renewable generation. Following a large-signal generation or load disturbance, system frequency is arrested leveraging primary frequency control provided by governor action in synchronous generators. In this work, we propose a framework for distributed energy resources (DERs) deployed in distribution networks to provide (supplemental) primary frequency response. Particularly, we demonstrate how power-frequency droop slopes for individual DERs can be designed so that the distribution feeder presents a guaranteed frequency-regulation characteristic at the feeder head. Furthermore, the droop slopes are engineered such that injections of individual DERs conform to a well-defined fairness objective that does not penalize them for their location on the distribution feeder. Time-domain simulations for an illustrative network composed of a combined transmission network and distribution network with frequency-responsive DERs are provided to validate the approach.
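
As a rough illustration of the droop-design idea (a toy sketch with made-up numbers, not the paper's method): if each DER's power-frequency gain is chosen in proportion to its rating so that the gains sum to a target feeder-head characteristic, the feeder presents the desired aggregate response while each DER's share is set by capacity rather than by its location.

```python
# Toy sketch: choose per-DER droop gains so the feeder presents a target
# aggregate frequency-regulation characteristic, with a simple fairness rule
# (contribution proportional to rated capacity, independent of location).
def design_droop_slopes(ratings_kw, feeder_gain_kw_per_hz):
    total = sum(ratings_kw)
    # per-DER gain proportional to rating; gains sum to the feeder target
    return [feeder_gain_kw_per_hz * r / total for r in ratings_kw]

def feeder_response(gains, delta_f_hz):
    # each DER injects P_i = -k_i * delta_f; an under-frequency event
    # (delta_f < 0) yields positive supplemental injections
    injections = [-k * delta_f_hz for k in gains]
    return injections, sum(injections)

gains = design_droop_slopes([10, 20, 30], feeder_gain_kw_per_hz=120)
inj, head = feeder_response(gains, delta_f_hz=-0.1)  # a 0.1 Hz frequency dip
```

For the 0.1 Hz dip the feeder head sees the designed 120 kW/Hz characteristic (12 kW total), split 2:4:6 among the three DERs in proportion to their ratings.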

  9. Consent Agreement and Consent Order

    EPA Pesticide Factsheets

    Contains the legal consent agreement and consent order for the assessment of a civil penalty pursuant to Section 14(1) of the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA), BioLab Inc., Conyers, GA, September 14, 1998.

  10. A penalized linear and nonlinear combined conjugate gradient method for the reconstruction of fluorescence molecular tomography.

    PubMed

    Shang, Shang; Bai, Jing; Song, Xiaolei; Wang, Hongkai; Lau, Jaclyn

    2007-01-01

    The conjugate gradient method is known to be efficient for large-scale nonlinear optimization problems. In this paper, a penalized linear and nonlinear combined conjugate gradient method for the reconstruction of fluorescence molecular tomography (FMT) is presented. The algorithm combines the linear and nonlinear conjugate gradient methods through a restart strategy, in order to exploit the advantages of both while compensating for their respective disadvantages. A quadratic penalty method is adopted to enforce a nonnegativity constraint and reduce the ill-posedness of the problem. Simulation studies show that the presented algorithm is accurate, stable, and fast, and that it performs better than conventional conjugate gradient-based reconstruction algorithms. It offers an effective approach to reconstructing fluorochrome information for FMT.
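
The quadratic-penalty idea can be sketched on a toy linear inverse problem (the forward model, dimensions, and penalty weight below are illustrative, not an actual FMT model): the term mu*||min(x, 0)||^2 discourages negative fluorochrome values while keeping the objective smooth enough for a conjugate gradient solver.

```python
import numpy as np
from scipy.optimize import minimize

# Quadratic-penalty reconstruction sketch:
#   min_x ||A x - b||^2 + mu * ||min(x, 0)||^2
# where the penalty softly enforces x >= 0.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 20))        # toy forward model
x_true = np.abs(rng.standard_normal(20)) # nonnegative ground truth
b = A @ x_true

mu = 100.0

def f(x):
    neg = np.minimum(x, 0.0)
    return np.sum((A @ x - b) ** 2) + mu * np.sum(neg ** 2)

def grad(x):
    return 2 * A.T @ (A @ x - b) + 2 * mu * np.minimum(x, 0.0)

# nonlinear conjugate gradient on the penalized objective
res = minimize(f, np.zeros(20), jac=grad, method='CG')
x_hat = res.x
```

Because the ground truth is nonnegative and the data are noise-free, the penalized CG solution recovers it closely, with only tiny (penalty-controlled) negative excursions.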

  11. [Who Benefits from Forensic Psychiatric Treatment? Results of a Catamnestic Study in Swabia].

    PubMed

    Dudeck, Manuela; Franke, Irina; Bezzel, Adelheid; Otte, Stefanie; Ormanns, Norbert; Nigel, Stefanie; Segmiller, Felix; Streb, Judith

    2018-04-17

    Evaluation of treatment outcomes of forensic inpatients in the Bavarian district of Swabia (2010 - 2014). 130 inpatients were interviewed about their social reintegration, substance use and delinquency immediately after discharge from forensic psychiatry and again one year later. One year after discharge, 67 % of the patients referred due to substance use disorder according to § 64 of the German Penal Code were employed, 57 % were abstinent and 83 % had not reoffended. Patients who were detained due to severe mental illness according to § 63 of the German Penal Code often received disability pensions (57 %), 14 % were integrated in sheltered workshops and 100 % had not reoffended. Forensic-psychiatric treatment contributes to rehabilitation and reduces risk factors of mentally disordered offenders. © Georg Thieme Verlag KG Stuttgart · New York.

  12. Honor crimes: review and proposed definition.

    PubMed

    Elakkary, Sally; Franke, Barbara; Shokri, Dina; Hartwig, Sven; Tsokos, Michael; Püschel, Klaus

    2014-03-01

    There is every reason to believe that honor-based violence is one of the forms of domestic violence practiced against females all over the world. This type of violence includes a wide range of crimes, the severest of which is honor killing. Many studies have adopted different definitions for so-called honor killing. In this paper some of these definitions are discussed and a working definition is proposed. The scope of the problem worldwide is presented. Honor killing goes beyond ethnicity, class, and religion. It is a very old phenomenon that was practiced in ancient Rome, guided by penal codes. Some of the older as well as newer penal codes from different regions of the world are discussed concerning this matter. The different efforts of international governmental and nongovernmental organizations in combating this problem are also presented.

  13. Middle Miocene sandstone reservoirs of the Penal/Barrackpore field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyer, B.L.

    1991-03-01

    The Penal/Barrackpore field was discovered in 1938 and is located in the southern subbasin of onshore Trinidad. The accumulation is one of a series of northeast-southwest trending en echelon middle Miocene anticlinal structures that was later accentuated by late Pliocene transpressional folding. Relative movement of the South American and Caribbean plates climaxed in the middle Miocene compressive tectonic event and produced an imbricate pattern of southward-facing basement-involved thrusts. Further compressive interaction between the plates in the late Pliocene produced a transpressive tectonic episode forming northwest-southeast oriented transcurrent faults, tear faults, basement thrust faults, listric normal faults, and detached simple folds with infrequent diapiric cores. The middle Miocene Herrera and Karamat turbiditic sandstones are the primary reservoir rock in the subsurface anticline of the Penal/Barrackpore field. These turbidites were sourced from the north and deposited within the marls and clays of the Cipero Formation. Miocene and Pliocene deltaics and turbidites succeed the Cipero Formation vertically, lapping into preexisting Miocene highs. The late Pliocene transpression also coincides with the onset of oil migration along faults, diapirs, and unconformities from the Cretaceous Naparima Hill source. The Lengua Formation and the upper Forest clays are considered effective seals. Hydrocarbon trapping is structurally and stratigraphically controlled, with structure being the dominant trapping mechanism. Ultimate recoverable reserves for the field are estimated at 127.9 MMBo and 628.8 bcf. The field is presently owned and operated by the Trinidad and Tobago Oil Company Limited (TRINTOC).

  14. Improving multisensor estimation of heavy-to-extreme precipitation via conditional bias-penalized optimal estimation

    NASA Astrophysics Data System (ADS)

    Kim, Beomgeun; Seo, Dong-Jun; Noh, Seong Jin; Prat, Olivier P.; Nelson, Brian R.

    2018-01-01

    A new technique for merging radar precipitation estimates and rain gauge data is developed and evaluated to improve multisensor quantitative precipitation estimation (QPE), in particular, of heavy-to-extreme precipitation. Unlike the conventional cokriging methods which are susceptible to conditional bias (CB), the proposed technique, referred to herein as conditional bias-penalized cokriging (CBPCK), explicitly minimizes Type-II CB for improved quantitative estimation of heavy-to-extreme precipitation. CBPCK is a bivariate version of extended conditional bias-penalized kriging (ECBPK) developed for gauge-only analysis. To evaluate CBPCK, cross validation and visual examination are carried out using multi-year hourly radar and gauge data in the North Central Texas region in which CBPCK is compared with the variant of the ordinary cokriging (OCK) algorithm used operationally in the National Weather Service Multisensor Precipitation Estimator. The results show that CBPCK significantly reduces Type-II CB for estimation of heavy-to-extreme precipitation, and that the margin of improvement over OCK is larger in areas of higher fractional coverage (FC) of precipitation. When FC > 0.9 and hourly gauge precipitation is > 60 mm, the reduction in root mean squared error (RMSE) by CBPCK over radar-only (RO) is about 12 mm while the reduction in RMSE by OCK over RO is about 7 mm. CBPCK may be used in real-time analysis or in reanalysis of multisensor precipitation for which accurate estimation of heavy-to-extreme precipitation is of particular importance.

  15. Treating substance abuse is not enough: comorbidities in consecutively admitted female prisoners.

    PubMed

    Mir, Jan; Kastner, Sinja; Priebe, Stefan; Konrad, Norbert; Ströhle, Andreas; Mundt, Adrian P

    2015-07-01

    Several studies have pointed to high rates of substance use disorders among female prisoners. The present study aimed to assess comorbidities of substance use disorders with other mental disorders in female prisoners at admission to a penal justice system. A sample of 150 female prisoners, consecutively admitted to the penal justice system of Berlin, Germany, was interviewed using the Mini-International Neuropsychiatric Interview (MINI). The presence of borderline personality disorder was assessed using the Structured Clinical Interview II for DSM-IV. Prevalence rates and comorbidities were calculated as percentage values and 95% confidence intervals (CIs). Ninety-three prisoners (62%; 95% CI: 54-70) had substance use disorders; n=49 (33%; 95% CI: 24-42) had alcohol abuse/dependence; n=76 (51%; 95% CI: 43-59) had illicit drug abuse/dependence; and n=53 (35%; 95% CI: 28-44) had opiate use disorders. In the group of inmates with substance use disorders, 84 (90%) had at least one other mental disorder; n=63 (68%) had comorbid affective disorders; n=45 (49%) had borderline or antisocial personality disorders; and n=41 (44%) had comorbid anxiety disorders. Female prisoners with addiction have high rates of comorbid mental disorders at admission to the penal justice system, ranging from affective to personality and anxiety disorders. Generic and robust interventions that can address different comorbid mental health problems in a flexible manner may be required to tackle widespread addiction and improve mental health of female prisoners. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Working through mass incarceration: gender and the politics of prison labor from east to west.

    PubMed

    Haney, Lynne A

    2010-01-01

    This article explores the politics and practices of labor in two penal institutions for women: a maximum security facility for women in Hungary and a community‐based facility for women in California. Diverging from other accounts of imprisonment that tend to operate at either the individual or macroeconomic level, this article analyzes the concrete institutional relations of prison and complicates the assumption that they simply reflect the logic of the prison‐industrial complex. Based on years of ethnographic work in two very different penal systems, I describe variation in how prisons institute labor within and across institutions and cultures: the Hungarian facility positioned wage labor as a right and an obligation that formed the basis of women’s social relationships and ties to others, while the U.S. prison excluded wage labor from women’s lives so they could get on with the work of self‐improvement and personal healing. From the comparison, I reveal how prisons can both draw on and subvert broader social meanings assigned to women’s work, making it difficult to view prison labor as wholly exploitative or abusive. I also argue that refusing to allow female inmates to engage in wage labor can be a more profound form of punishment than requiring it of them. By juxtaposing the discourses and practices of work in two very different penal contexts, this article offers a critical reflection on the political economy of prison labor from the ground up.

  17. Adaptive low-rank subspace learning with online optimization for robust visual tracking.

    PubMed

    Liu, Risheng; Wang, Di; Han, Yuzhuo; Fan, Xin; Luo, Zhongxuan

    2017-04-01

    In recent years, sparse and low-rank models have been widely used to formulate appearance subspaces for visual tracking. However, most existing methods only consider the sparsity or low-rankness of the coefficients, which is not sufficient for appearance subspace learning on complex video sequences. Moreover, as both the low-rank and the column-sparse measures are tightly related to all the samples in the sequences, it is challenging to incrementally solve optimization problems with both nuclear norm and column-sparse norm on sequentially obtained video data. To address the above limitations, this paper develops a novel low-rank subspace learning with adaptive penalization (LSAP) framework for subspace-based robust visual tracking. Different from previous work, which often simply decomposes observations as low-rank features and sparse errors, LSAP simultaneously learns the subspace basis, low-rank coefficients and column-sparse errors to formulate the appearance subspace. Within the LSAP framework, we introduce a Hadamard-product-based regularization to incorporate rich generative/discriminative structure constraints that adaptively penalize the coefficients for subspace learning. It is shown that such adaptive penalization can significantly improve the robustness of LSAP on severely corrupted datasets. To utilize LSAP for online visual tracking, we also develop an efficient incremental optimization scheme for nuclear norm and column-sparse norm minimizations. Experiments on 50 challenging video sequences demonstrate that our tracker outperforms other state-of-the-art methods. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Occlusion properties of prosthetic contact lenses for the treatment of amblyopia.

    PubMed

    Collins, Randall S; McChesney, Megan E; McCluer, Craig A; Schatz, Martha P

    2008-12-01

    The efficacy of opaque contact lenses as occlusion therapy for amblyopia has been established in the literature. Prosthetic contact lenses use similar tints to improve cosmesis in scarred or deformed eyes and may be an alternative in occlusion therapy. To test this idea, we determined the degree of vision penalization elicited by prosthetic contact lenses and their effect on peripheral fusion. We tested 19 CIBA Vision DuraSoft 3 Prosthetic soft contact lenses with varying iris prints, underprints, and opaque pupil sizes in 10 volunteers with best-corrected Snellen distance visual acuity of 20/20 or better in each eye. Snellen visual acuity and peripheral fusion using the Worth 4-Dot test at near were measured on each subject wearing each of the 19 lenses. Results were analyzed with 3-factor analysis of variance. Mean visual acuity through the various lenses ranged from 20/79 to 20/620. Eight lenses allowed preservation of peripheral fusion in 50% or more of the subjects tested. Iris print pattern and opaque pupil size were significant factors in determining visual acuity (p < 0.05). Sufficient vision penalization can be achieved to make occlusion with prosthetic contact lenses a viable therapy for amblyopia. The degree of penalization can be varied with different iris print patterns and pupil sizes, and peripheral fusion can be preserved with some lenses. Prosthetic contact lenses can be more cosmetically appealing and more tolerable than other amblyopia treatment modalities. These factors may improve compliance in occlusion therapy.

  19. Automatic treatment plan re-optimization for adaptive radiotherapy guided with the initial plan DVHs

    NASA Astrophysics Data System (ADS)

    Li, Nan; Zarepisheh, Masoud; Uribe-Sanchez, Andres; Moore, Kevin; Tian, Zhen; Zhen, Xin; Jiang Graves, Yan; Gautier, Quentin; Mell, Loren; Zhou, Linghong; Jia, Xun; Jiang, Steve

    2013-12-01

    Adaptive radiation therapy (ART) can reduce normal tissue toxicity and/or improve tumor control through treatment adaptations based on the current patient anatomy. Developing an efficient and effective re-planning algorithm is an important step toward the clinical realization of ART. For the re-planning process, a manual trial-and-error approach to fine-tuning planning parameters is time-consuming and usually considered impractical, especially for online ART. It is desirable to automate this step to yield a plan of acceptable quality with minimal intervention. In ART, prior information from the original plan is available, such as the dose-volume histogram (DVH), which can be employed to facilitate the automatic re-planning process. The goal of this work is to develop an automatic re-planning algorithm to generate a plan with similar, or possibly better, DVH curves compared with the clinically delivered original plan. Specifically, our algorithm iterates the following two loops. The inner loop is the traditional fluence map optimization, in which we optimize a quadratic objective function penalizing the deviation of the dose received by each voxel from its prescribed or threshold dose, with a set of fixed voxel weighting factors. In the outer loop, the voxel weighting factors in the objective function are adjusted according to the deviation of the current DVH curves from those in the original plan. The process is repeated until the DVH curves are acceptable or the maximum number of iterations is reached. The whole algorithm is implemented on GPU for high efficiency. The feasibility of our algorithm has been demonstrated with three head-and-neck cancer IMRT cases, each having an initial planning CT scan and another treatment CT scan acquired in the middle of the treatment course. Compared with the DVH curves in the original plan, the DVH curves in the resulting plan using our algorithm with 30 iterations are better for almost all structures. The re-optimization process takes about 30 s using our in-house optimization engine. This work was originally presented at the 54th AAPM annual meeting in Charlotte, NC, July 29-August 2, 2012.
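
A minimal sketch of the two-loop structure (toy dimensions and a simplified weight-update rule of my own, not the authors' GPU implementation): the inner loop solves a weighted quadratic fluence optimization, and the outer loop raises the weights of the voxels whose dose deviates most from the reference before re-solving.

```python
import numpy as np

# Inner loop: min_x sum_v w_v (D x - p)_v^2, a weighted least-squares problem.
# Outer loop: boost the weights of the worst-deviating voxels and re-solve.
rng = np.random.default_rng(2)
n_vox, n_beam = 60, 25
D = np.abs(rng.standard_normal((n_vox, n_beam)))  # toy dose-influence matrix
p = np.full(n_vox, 50.0)                          # prescribed voxel doses
w = np.ones(n_vox)                                # voxel weighting factors

for outer in range(10):
    # inner loop: weighted least squares via sqrt-weight rescaling
    sw = np.sqrt(w)
    x, *_ = np.linalg.lstsq(sw[:, None] * D, sw * p, rcond=None)
    # outer loop: voxels deviating most from the reference get larger weights
    dev = np.abs(D @ x - p)
    w *= 1.0 + dev / (dev.max() + 1e-12)

rmse = np.sqrt(np.mean((D @ x - p) ** 2))
```

In the actual algorithm the reference is the original plan's DVH curves rather than a flat prescription, and the inner problem includes threshold doses, but the alternation between a fixed-weight quadratic solve and a deviation-driven weight update is the same.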

  20. Investigation of statistical iterative reconstruction for dedicated breast CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makeev, Andrey; Glick, Stephen J.

    2013-08-15

    Purpose: Dedicated breast CT has great potential for improving the detection and diagnosis of breast cancer. Statistical iterative reconstruction (SIR) in dedicated breast CT is a promising alternative to traditional filtered backprojection (FBP). One of the difficulties in using SIR is the presence of free parameters in the algorithm that control the appearance of the resulting image. These parameters require tuning in order to achieve high quality reconstructions. In this study, the authors investigated the penalized maximum likelihood (PML) method with two commonly used types of roughness penalty functions: the hyperbolic potential and the anisotropic total variation (TV) norm. Reconstructed images were compared with images obtained using standard FBP. Optimal parameters for PML with the hyperbolic prior are reported for the task of detecting microcalcifications embedded in breast tissue. Methods: Computer simulations were used to acquire projections in a half-cone beam geometry. The modeled setup describes a realistic breast CT benchtop system, with an x-ray spectrum produced by a point source and an a-Si, CsI:Tl flat-panel detector. A voxelized anthropomorphic breast phantom with 280 μm microcalcification spheres embedded in it was used to model the attenuation properties of an uncompressed breast in a pendant position. The reconstruction of 3D images was performed using the separable paraboloidal surrogates algorithm with ordered subsets. Task performance was assessed with the ideal observer detectability index to determine optimal PML parameters. Results: The authors' findings suggest that there is a preferred range of values of the roughness penalty weight and the edge preservation threshold in the penalized objective function with the hyperbolic potential, which resulted in low noise images with high contrast microcalcifications preserved. In terms of the numerical observer detectability index, the PML method with optimal parameters yielded substantially improved performance (by a factor of greater than 10) compared to FBP. The hyperbolic prior was also observed to be superior to the TV norm. A few of the best-performing parameter pairs for the PML method also demonstrated superior performance across various radiation doses. In fact, using PML with certain parameter values results in better images, acquired using a 2 mGy dose, than FBP-reconstructed images acquired using a 6 mGy dose. Conclusions: A range of optimal free parameters for the PML algorithm with hyperbolic and TV norm-based potentials is presented for the microcalcification detection task in dedicated breast CT. The reported values can be used as starting values for the free parameters when SIR techniques are used for image reconstruction. Significant improvement in image quality can be achieved by using PML with an optimal combination of parameters, as compared to FBP. Importantly, these results suggest improved detection of microcalcifications can be obtained by using PML with a lower radiation dose to the patient than using FBP with a higher dose.
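
The hyperbolic potential used in such penalties is commonly written psi_delta(t) = delta^2 (sqrt(1 + (t/delta)^2) - 1); a small sketch (function and parameter names are mine, standing in for the paper's roughness-penalty weight and edge-preservation threshold) shows the behavior that underlies edge preservation: approximately quadratic for small neighbor differences, approximately linear for large ones.

```python
import numpy as np

# Hyperbolic roughness potential: ~t^2/2 for |t| << delta (smooths noise),
# ~delta*|t| for |t| >> delta (does not over-penalize true edges).
def hyperbolic(t, delta):
    return delta**2 * (np.sqrt(1.0 + (t / delta)**2) - 1.0)

# Total roughness penalty over a 2D image: beta weighs the penalty strength,
# delta is the edge-preservation threshold on neighboring-pixel differences.
def roughness_penalty(img, beta, delta):
    dx = np.diff(img, axis=0)   # vertical neighbor differences
    dy = np.diff(img, axis=1)   # horizontal neighbor differences
    return beta * (hyperbolic(dx, delta).sum() + hyperbolic(dy, delta).sum())
```

Tuning (beta, delta) trades off noise suppression against microcalcification contrast, which is exactly the parameter pair the study sweeps.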

  1. 27 CFR 26.69 - Strengthening bonds.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... OF THE TREASURY LIQUORS LIQUORS AND ARTICLES FROM PUERTO RICO AND THE VIRGIN ISLANDS Taxpayment of Liquors and Articles in Puerto Rico Bonds § 26.69 Strengthening bonds. In all cases where the penal sum of...

  2. Senegal: Background and U.S. Relations

    DTIC Science & Technology

    2010-08-16

    boys, often separated from their families, receive religious instruction.33 The 1999 Penal Law outlawed domestic violence and female genital mutilation; however, implementation has reportedly been uneven, and both are widespread.34 Human rights organizations have criticized

  3. Vaccine mandates, public trust, and vaccine confidence: understanding perceptions is important.

    PubMed

    Widdus, Roy; Larson, Heidi

    2018-05-01

    The experience in Australia with penalizing parents who refuse to have their children vaccinated demonstrates the need to study and understand resistance to vaccination as a global phenomenon with particular local manifestations.

  4. 49 CFR 26.47 - Can recipients be penalized for failing to meet overall goals?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... BY DISADVANTAGED BUSINESS ENTERPRISES IN DEPARTMENT OF TRANSPORTATION FINANCIAL ASSISTANCE PROGRAMS... attention of FTA, FHWA, or FAA, demonstrates that current trends make it unlikely that you will achieve DBE...

  5. Clustered mixed nonhomogeneous Poisson process spline models for the analysis of recurrent event panel data.

    PubMed

    Nielsen, J D; Dean, C B

    2008-09-01

    A flexible semiparametric model for analyzing longitudinal panel count data arising from mixtures is presented. Panel count data refers here to count data on recurrent events collected as the number of events that have occurred within specific follow-up periods. The model assumes that the counts for each subject are generated by mixtures of nonhomogeneous Poisson processes with smooth intensity functions modeled with penalized splines. Time-dependent covariate effects are also incorporated into the process intensity using splines. Discrete mixtures of these nonhomogeneous Poisson process spline models extract functional information from underlying clusters representing hidden subpopulations. The motivating application is an experiment to test the effectiveness of pheromones in disrupting the mating pattern of the cherry bark tortrix moth. Mature moths arise from hidden, but distinct, subpopulations and monitoring the subpopulation responses was of interest. Within-cluster random effects are used to account for correlation structures and heterogeneity common to this type of data. An estimating equation approach to inference requiring only low moment assumptions is developed and the finite sample properties of the proposed estimating functions are investigated empirically by simulation.

  6. Encoding Dissimilarity Data for Statistical Model Building.

    PubMed

    Wahba, Grace

    2010-12-01

    We summarize, review and comment upon three papers which discuss the use of discrete, noisy, incomplete, scattered pairwise dissimilarity data in statistical model building. Convex cone optimization codes are used to embed the objects into a Euclidean space which respects the dissimilarity information while controlling the dimension of the space. A "newbie" algorithm is provided for embedding new objects into this space. This allows the dissimilarity information to be incorporated into a Smoothing Spline ANOVA penalized likelihood model, a Support Vector Machine, or any model that will admit Reproducing Kernel Hilbert Space components, for nonparametric regression, supervised learning, or semi-supervised learning. Future work and open questions are discussed. The papers are: F. Lu, S. Keles, S. Wright and G. Wahba 2005. A framework for kernel regularization with application to protein clustering. Proceedings of the National Academy of Sciences 102, 12332-1233. G. Corrada Bravo, G. Wahba, K. Lee, B. Klein, R. Klein and S. Iyengar 2009. Examining the relative influence of familial, genetic and environmental covariate information in flexible risk models. Proceedings of the National Academy of Sciences 106, 8128-8133. F. Lu, Y. Lin and G. Wahba. Robust manifold unfolding with kernel regularization. TR 1008, Department of Statistics, University of Wisconsin-Madison.

  7. Structure Topology Optimization of Brake Pad in Large-megawatt Wind Turbine Brake Considering Thermal-structural Coupling

    NASA Astrophysics Data System (ADS)

    Zhang, S. F.; Yin, J.; Liu, Y.; Sha, Z. H.; Ma, F. J.

    2016-11-01

    Severe non-uniform wear of the brake pad always occurs in a large-megawatt wind turbine brake during braking, which wears the brake pad out prematurely and even threatens the safe operation of the wind turbine. The root cause of this phenomenon is the non-uniform deformation caused by the thermal-structural coupling effect between the brake pad and disc while braking under conditions of both high speed and heavy load. For this problem, a mathematical model of thermal-structural coupling analysis is built. Based on the Solid Isotropic Material with Penalization (SIMP) topology optimization method, structure topology optimization of the brake pad is developed considering the deformation caused by the thermal-structural coupling effect. The objective function is minimum compliance, and the structure topology optimization model of the brake pad is established after an indirect thermal-structural coupling analysis. Compared with the optimization result without thermal-structural coupling, the conspicuous influence of the thermal effect on brake pad wear and deformation is proven, as well as the rationality of taking the thermal-structural coupling effect as an optimization condition. A reconstructed model is built according to the result, and verification analysis is carried out under the same working condition. This study provides a theoretical foundation for the design of high-speed, heavy-load brake pads. The new structure may provide a design reference for improving the stress condition between brake pad and disc, enhancing the use ratio of the friction material and increasing the working performance of large-megawatt wind turbine brakes.
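
The penalization at the heart of SIMP can be shown in isolation (a generic sketch of the standard interpolation rule, not the paper's thermal-structural model): element stiffness is interpolated as E(rho) = E_min + rho^p (E0 - E_min), so with p > 1 intermediate "gray" densities contribute disproportionately little stiffness, driving the optimizer toward near-0/1 material layouts.

```python
# SIMP material interpolation: density rho in [0, 1] maps to element stiffness.
# E_min is a small positive floor to keep the stiffness matrix non-singular;
# the penalization exponent p = 3 is a common choice.
def simp_stiffness(rho, e0=1.0, e_min=1e-9, p=3.0):
    return e_min + rho**p * (e0 - e_min)

# a half-dense element supplies only ~12.5% of full stiffness at p = 3,
# making intermediate densities structurally inefficient
ratio = simp_stiffness(0.5) / simp_stiffness(1.0)
```

This inefficiency of intermediate densities is what makes the compliance-minimizing optimizer converge to crisp solid/void topologies.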

  8. A Variational Approach to Video Registration with Subspace Constraints.

    PubMed

    Garg, Ravi; Roussos, Anastasios; Agapito, Lourdes

    2013-01-01

    This paper addresses the problem of non-rigid video registration, or the computation of optical flow from a reference frame to each of the subsequent images in a sequence, when the camera views deformable objects. We exploit the high correlation between 2D trajectories of different points on the same non-rigid surface by assuming that the displacement of any point throughout the sequence can be expressed in a compact way as a linear combination of a low-rank motion basis. This subspace constraint effectively acts as a trajectory regularization term leading to temporally consistent optical flow. We formulate it as a robust soft constraint within a variational framework by penalizing flow fields that lie outside the low-rank manifold. The resulting energy functional can be decoupled into the optimization of the brightness constancy and spatial regularization terms, leading to an efficient optimization scheme. Additionally, we propose a novel optimization scheme for the case of vector-valued images, based on the dualization of the data term. This allows us to extend our approach to deal with colour images, which results in significant improvements in the registration results. Finally, we provide a new benchmark dataset, based on motion capture data of a flag waving in the wind, with dense ground truth optical flow for evaluation of multi-frame optical flow algorithms for non-rigid surfaces. Our experiments show that our proposed approach outperforms state-of-the-art optical flow and dense non-rigid registration algorithms.

  9. SU-E-I-01: Iterative CBCT Reconstruction with a Feature-Preserving Penalty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyu, Q; Li, B; Southern Medical University, Guangzhou

    2015-06-15

    Purpose: Low-dose CBCT is desired in various clinical applications. Iterative image reconstruction algorithms have shown advantages in suppressing noise in low-dose CBCT. However, due to the smoothness constraint enforced during the reconstruction process, edges may be blurred and image features may be lost in the reconstructed image. In this work, we propose a new penalty design to preserve image features in images reconstructed by iterative algorithms. Methods: Low-dose CBCT is reconstructed by minimizing the penalized weighted least-squares (PWLS) objective function. Binary Robust Independent Elementary Features (BRIEF) of the image were integrated into the penalty of PWLS. BRIEF is a general-purpose point descriptor that can be used to identify important features of an image. In this work, the BRIEF distance of two neighboring pixels was used to weigh the smoothing parameter in PWLS. For pixels with a large BRIEF distance, a weaker smoothness constraint is enforced, so image features are better preserved. The performance of the PWLS algorithm with the BRIEF penalty was evaluated on a CatPhan 600 phantom. Results: The image quality reconstructed by the proposed PWLS-BRIEF algorithm is superior to that of the conventional PWLS method and the standard FDK method. At a matched noise level, edges in the PWLS-BRIEF reconstructed image are better preserved. Conclusion: This study demonstrated that the proposed PWLS-BRIEF algorithm has great potential for preserving image features in low-dose CBCT.
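
A simplified sketch of the weighting idea (the descriptor sampling pattern and the distance-to-weight mapping below are illustrative assumptions, not the authors' exact design): each pixel gets a binary descriptor from intensity comparisons in its neighborhood, and neighboring pixels with a large descriptor (Hamming) distance receive a weaker smoothing weight in the penalty.

```python
import numpy as np

# BRIEF-style descriptor: binary intensity comparisons on offset pairs
# sampled around pixel (y, x).
def brief_descriptor(img, y, x, pairs):
    return np.array([img[y + dy1, x + dx1] < img[y + dy2, x + dx2]
                     for (dy1, dx1, dy2, dx2) in pairs], dtype=np.uint8)

# Smoothing weight for a neighboring-pixel pair: large feature (Hamming)
# distance -> weaker smoothness constraint across that pair.
def smoothing_weight(img, p, q, pairs, scale=4.0):
    d = np.count_nonzero(brief_descriptor(img, *p, pairs)
                         ^ brief_descriptor(img, *q, pairs))
    return np.exp(-d / scale)

rng = np.random.default_rng(4)
pairs = rng.integers(-3, 4, size=(64, 4))       # random test-pair offsets
img = np.zeros((16, 16)); img[:, 8:] = 1.0      # image with a vertical edge
w_flat = smoothing_weight(img, (4, 4), (5, 4), pairs)  # both in flat region
w_edge = smoothing_weight(img, (4, 7), (4, 8), pairs)  # straddling the edge
```

The pair inside the flat region gets full smoothing weight, while the pair across the edge gets a reduced weight, which is the mechanism that keeps edges from being blurred.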

  10. Flight evaluation of an advanced technology light twin-engine airplane (ATLIT)

    NASA Technical Reports Server (NTRS)

    Holmes, B. J.

    1977-01-01

    Project organization and execution, airplane description and performance predictions, and the results of the flight evaluation of an advanced technology light twin-engine airplane (ATLIT) are presented. The ATLIT is a Piper PA-34-200 Seneca I modified by the installation of new wings incorporating the GA(W)-1 (Whitcomb) airfoil, reduced wing area, roll control spoilers, and full-span Fowler flaps. The conclusions for the ATLIT evaluation are based on complete stall and roll flight test results and partial performance test results. The stalling and rolling characteristics met design expectations. Climb performance was penalized by extensive flow separation in the region of the wing-body juncture. Cruise performance was found to be penalized by a large value of zero-lift drag. Calculations showed that, with proper attention to construction details, the improvements in span efficiency and zero-lift drag would permit the realization of the predicted increases in cruising and maximum rate of climb performance.

  11. GLOBAL SOLUTIONS TO FOLDED CONCAVE PENALIZED NONCONVEX LEARNING

    PubMed Central

    Liu, Hongcheng; Yao, Tao; Li, Runze

    2015-01-01

    This paper is concerned with solving nonconvex learning problems with folded concave penalties. Although their global solutions entail desirable statistical properties, there is a lack of optimization techniques that guarantee global optimality in a general setting. In this paper, we show that a class of nonconvex learning problems is equivalent to general quadratic programs. This equivalence enables us to develop mixed integer linear programming reformulations, which admit finite algorithms that find a provably global optimal solution. We refer to this reformulation-based technique as mixed integer programming-based global optimization (MIPGO). To our knowledge, this is the first global optimization scheme with a theoretical guarantee for folded concave penalized nonconvex learning with the SCAD penalty (Fan and Li, 2001) and the MCP penalty (Zhang, 2010). Numerical results indicate that MIPGO significantly outperforms the state-of-the-art solution scheme, local linear approximation, and other alternative solution techniques in the literature in terms of solution quality. PMID:27141126
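For context, the SCAD penalty cited in this record (Fan and Li, 2001) has a standard closed form, sketched below. This only evaluates the penalty elementwise; it does not reproduce the MIPGO reformulation.

```python
import numpy as np

def scad(t, lam, a=3.7):
    """SCAD penalty of Fan and Li (2001), applied elementwise:
    linear near zero, quadratic transition, then constant (no shrinkage
    of large coefficients). a = 3.7 is the value suggested in the paper."""
    t = np.abs(np.asarray(t, dtype=float))
    quad = (2 * a * lam * t - t ** 2 - lam ** 2) / (2 * (a - 1))
    return np.where(t <= lam, lam * t,
                    np.where(t <= a * lam, quad, lam ** 2 * (a + 1) / 2))
```

The three pieces join continuously at `t = lam` and `t = a * lam`, which is what makes the penalty folded concave.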

  12. Adaptive rival penalized competitive learning and combined linear predictor model for financial forecast and investment.

    PubMed

    Cheung, Y M; Leung, W M; Xu, L

    1997-01-01

    We propose a prediction model combining Rival Penalized Competitive Learning (RPCL) and the Combined Linear Predictor (CLP) method, which involves a set of local linear predictors such that a prediction is made by combining the activated predictors through a gating network (Xu et al., 1994). Furthermore, we present an improved variant named Adaptive RPCL-CLP that includes an adaptive learning mechanism as well as a data pre- and post-processing scheme. We compare them with some existing models by demonstrating their performance on two real-world financial time series--a China stock price and an exchange-rate series of the US Dollar (USD) versus the Deutschmark (DEM). Experiments show that Adaptive RPCL-CLP not only outperforms the other approaches with the smallest prediction error and training costs, but also brings in considerably higher profits in the trading simulation of the foreign exchange market.

  13. Lossless quantum data compression with exponential penalization: an operational interpretation of the quantum Rényi entropy.

    PubMed

    Bellomo, Guido; Bosyk, Gustavo M; Holik, Federico; Zozor, Steeve

    2017-11-07

    Based on the problem of lossless quantum data compression, we present an operational interpretation for the family of quantum Rényi entropies. To do this, we appeal to a very general quantum encoding scheme that satisfies a quantum version of the Kraft-McMillan inequality. In the standard situation, where one intends to minimize the usual average length of the quantum codewords, we recover the known result that the von Neumann entropy of the source bounds the average length of the optimal codes. Otherwise, we show that by invoking an exponential average length, related to an exponential penalization of long codewords, the quantum Rényi entropies arise as the natural quantities relating the optimal encoding schemes with the source description, playing a role analogous to that of the von Neumann entropy.
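The quantity at the center of this record can be illustrated classically: the Rényi entropy of a probability vector (in the quantum case, the eigenvalues of the source density operator). The helper below is a generic sketch, not taken from the paper.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Order-alpha Rényi entropy (in bits) of a probability vector.
    As alpha -> 1 it reduces to the Shannon (von Neumann) entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return float(-np.sum(p * np.log2(p)))
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))
```

For a uniform distribution every order gives the same value; for non-uniform distributions the entropy is non-increasing in the order, which is why the exponential penalization of long codewords singles out different orders.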

  14. Interquantile Shrinkage in Regression Models

    PubMed Central

    Jiang, Liewen; Wang, Huixia Judy; Bondell, Howard D.

    2012-01-01

    Conventional analysis using quantile regression typically focuses on fitting the regression model at different quantiles separately. However, in situations where the quantile coefficients share some common feature, joint modeling of multiple quantiles to accommodate the commonality often leads to more efficient estimation. One example of a common feature is that a predictor may have a constant effect over one region of quantile levels but varying effects in other regions. To automatically perform estimation and detection of the interquantile commonality, we develop two penalization methods. When the quantile slope coefficients indeed do not change across quantile levels, the proposed methods will shrink the slopes toward a constant and thus improve estimation efficiency. We establish the oracle properties of the two proposed penalization methods. Through numerical investigations, we demonstrate that the proposed methods lead to estimates with efficiency competitive with or higher than standard quantile regression estimation in finite samples. Supplemental materials for the article are available online. PMID:24363546
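The flavor of such a joint objective can be sketched as a sum of check losses across quantile levels plus a fused-lasso-type penalty on adjacent slope differences. This is a schematic illustration with assumed names; the paper's actual penalties may differ in form.

```python
import numpy as np

def pinball(u, tau):
    """Quantile check (pinball) loss."""
    return np.where(u >= 0, tau * u, (tau - 1) * u)

def joint_objective(betas, X, y, taus, lam):
    """Joint quantile fit: sum of check losses over quantile levels plus a
    penalty on differences between adjacent-level coefficient vectors,
    shrinking slopes toward interquantile constancy."""
    loss = sum(np.mean(pinball(y - X @ b, t)) for b, t in zip(betas, taus))
    fuse = sum(np.sum(np.abs(b2 - b1)) for b1, b2 in zip(betas[:-1], betas[1:]))
    return loss + lam * fuse
```

When the per-quantile slopes coincide the penalty term vanishes, so minimizing this objective favors a shared slope unless the data demand otherwise.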

  15. ["As we're not willing to hang and behead and not able to deport...". On Emil Kraepelin's influence on Franz von Liszt].

    PubMed

    Schmidt-Recla, A; Steinberg, H

    2008-03-01

    Emil Kraepelin started his scientific career with a pamphlet demanding a complete restructuring of German penal law. It is well known that Kraepelin was a recipient of Cesare Lombroso's theses on degeneration and atavism. Therefore his demand for a correctional law completely replacing penal law is easily understood. Still undiscussed, however, is the question of whether Kraepelin's brochure had a decisive effect on German criminal law, especially on the so-called Marburg Program of Franz von Liszt, still viewed as the first emergence of modern criminal law and policies in Germany. Examination of this shows that, despite major theoretical faults, Kraepelin came to conclusions that correspond remarkably with von Liszt's. Special focus should be directed on the psychologist Wilhelm Wundt, who criticised Kraepelin's juridical attempt in a very kind yet fundamental way, and on the relationship that existed between Kraepelin and von Liszt.

  16. [Law No. 92-684 of 22 July 1992 reforming provisions of the Penal Code relating to the punishment of crimes and misdemeanors committed against persons].

    PubMed

    1992-07-23

    This Law, reformulating entirely Book II of the French Penal Code, newly criminalizes the following acts: a) sexual harassment; b) subjecting a person to work conditions or lodging contrary to human dignity because that person is in a situation of vulnerability or dependence; c) incitement of minors to engage in dangerous or illegal behavior such as excessive drinking, use of narcotics, or begging; and d) using the pictures of minors for pornographic purposes. Sexual harassment is defined as the use of orders, threats, or force to gain sexual favors by a person whose responsibilities place him in a position of authority over another person. In addition, provisions relating to the punishment of procuring have been strengthened in the new Code. Acts noted in b) above were criminalized in order to combat more forcefully the use of clandestine workers.

  17. Penalizing hospitals for chronic obstructive pulmonary disease readmissions.

    PubMed

    Feemster, Laura C; Au, David H

    2014-03-15

    In October 2014, the U.S. Centers for Medicare and Medicaid Services (CMS) will expand its Hospital Readmission Reduction Program (HRRP) to include chronic obstructive pulmonary disease (COPD). Under the new policy, hospitals with high risk-adjusted, 30-day all-cause unplanned readmission rates after an index hospitalization for a COPD exacerbation will be penalized with reduced reimbursement for the treatment of Medicare beneficiaries. In this perspective, we review the history of the HRRP, including the recent addition of COPD to the policy. We critically assess the use of 30-day all-cause COPD readmissions as an accountability measure, discussing potential benefits and then highlighting the substantial drawbacks and potential unintended consequences of the measure that could adversely affect providers, hospitals, and patients with COPD. We conclude by emphasizing the need to place the 30-day COPD readmission measure in the context of a reconceived model for postdischarge quality and review several frameworks that could help guide this process.

  18. Lung segmentation from HRCT using united geometric active contours

    NASA Astrophysics Data System (ADS)

    Liu, Junwei; Li, Chuanfu; Xiong, Jin; Feng, Huanqing

    2007-12-01

    Accurate lung segmentation from high-resolution CT images is a challenging task due to fine tracheal structures, missing boundary segments, and complex lung anatomy. One popular method is based on gray-level thresholding, but its results are usually rough. A united geometric active contours model based on level sets is proposed for lung segmentation in this paper. In particular, this method combines local boundary information with a region-based statistical model: 1) the boundary term ensures the integrity of the lung tissue; 2) the region term makes the level set function evolve according to global characteristics, independent of the initial settings. A penalizing energy term is introduced into the model, which allows the level set function to evolve without re-initialization. The method is found to be much more efficient in lung segmentation than methods based only on boundaries or regions. Results are shown by 3D lung surface reconstruction, which indicates that the method can play an important role in the design of computer-aided diagnostic (CAD) systems.
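Penalizing terms of this kind typically keep the level set function close to a signed distance function so that re-initialization becomes unnecessary; the sketch below evaluates the well-known energy (1/2)(|∇φ| − 1)² of Li et al. (2005), which the record's term resembles. This is an assumption about the exact form used.

```python
import numpy as np

def distance_regularization_energy(phi):
    """Mean of 0.5 * (|grad phi| - 1)^2 over the grid: zero exactly when
    phi is a signed distance function (|grad phi| = 1), so adding this
    term to the level-set energy discourages drift away from that state."""
    gy, gx = np.gradient(phi)
    grad_norm = np.sqrt(gx ** 2 + gy ** 2)
    return 0.5 * np.mean((grad_norm - 1.0) ** 2)
```

A plane with unit slope has zero energy, while steeper or flatter level set functions are penalized.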

  19. A practical globalization of one-shot optimization for optimal design of tokamak divertors

    NASA Astrophysics Data System (ADS)

    Blommaert, Maarten; Dekeyser, Wouter; Baelmans, Martine; Gauger, Nicolas R.; Reiter, Detlev

    2017-01-01

    In past studies, nested optimization methods were successfully applied to the design of the magnetic divertor configuration in nuclear fusion reactors. In this paper, so-called one-shot optimization methods are pursued. Due to convergence issues, a globalization strategy for the one-shot solver is sought. Whereas Griewank introduced a globalization strategy using a doubly augmented Lagrangian function that includes primal and adjoint residuals, its practical usability is limited by the necessity of second-order derivatives and expensive line search iterations. In this paper, a practical alternative is offered that avoids these drawbacks by using a regular augmented Lagrangian merit function that penalizes only state residuals. Additionally, robust rank-two Hessian estimation is achieved by adapting Powell's damped BFGS update rule. The application of the novel one-shot approach to magnetic divertor design is considered in detail. For this purpose, the approach is adapted to be compatible with practical "in parts adjoint" sensitivities. Using the globalization strategy, stable convergence of the one-shot approach is achieved.
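The structure of such a merit function can be written down generically. The sketch below is the textbook augmented Lagrangian L = J + λᵀc + (ρ/2)‖c‖², where only the state-equation residuals c are penalized (in contrast to Griewank's doubly augmented variant, which also penalizes adjoint residuals); it is not the paper's specific implementation.

```python
import numpy as np

def augmented_lagrangian_merit(J, c, lam, rho):
    """Merit value for objective J, state-residual vector c, multiplier
    estimate lam, and penalty parameter rho. At a feasible point (c = 0)
    the merit function equals the objective."""
    c = np.asarray(c, dtype=float)
    return float(J + lam @ c + 0.5 * rho * (c @ c))
```

Because infeasibility enters quadratically, line searches on this merit function reject one-shot steps that let the state residuals grow, which is the globalizing effect the record describes.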

  20. Generalized Scalar-on-Image Regression Models via Total Variation.

    PubMed

    Wang, Xiao; Zhu, Hongtu

    2017-01-01

    The use of imaging markers to predict clinical outcomes can have a great impact on public health. The aim of this paper is to develop a class of generalized scalar-on-image regression models via total variation (GSIRM-TV), in the sense of generalized linear models, for a scalar response and an imaging predictor in the presence of scalar covariates. A key novelty of GSIRM-TV is the assumption that the slope function (or image) of GSIRM-TV belongs to the space of bounded total variation, in order to explicitly account for the piecewise smooth nature of most imaging data. We develop an efficient penalized total variation optimization to estimate the unknown slope function and other parameters. We also establish nonasymptotic error bounds on the excess risk. These bounds are explicitly specified in terms of sample size, image size, and image smoothness. Our simulations demonstrate a superior performance of GSIRM-TV against many existing approaches. We apply GSIRM-TV to the analysis of hippocampus data obtained from the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset.
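The total variation penalty at the heart of this model family is simple to state for a 2-D coefficient image; the sketch below uses the anisotropic discretization (sums of absolute neighbor differences), which is one common choice among several.

```python
import numpy as np

def total_variation(beta_img):
    """Anisotropic total variation of a 2-D coefficient image: the sum of
    absolute differences between vertically and horizontally adjacent
    pixels. Piecewise-constant images have low TV; noisy ones have high TV."""
    dv = np.abs(np.diff(beta_img, axis=0)).sum()
    dh = np.abs(np.diff(beta_img, axis=1)).sum()
    return float(dv + dh)
```

Penalizing this quantity favors piecewise smooth slope images with sharp region boundaries, matching the stated motivation for GSIRM-TV.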

  1. Consistent multiphase-field theory for interface driven multidomain dynamics

    NASA Astrophysics Data System (ADS)

    Tóth, Gyula I.; Pusztai, Tamás; Gránásy, László

    2015-11-01

    We present a multiphase-field theory for describing pattern formation in multidomain and/or multicomponent systems. The construction of the free energy functional and the dynamic equations is based on criteria that ensure mathematical and physical consistency. We first analyze previous multiphase-field theories and identify their advantageous and disadvantageous features. On the basis of this analysis, we introduce a way of constructing the free energy surface and derive a generalized multiphase description for arbitrary number of phases (or domains). The presented approach retains the variational formalism, reduces (or extends) naturally to lower (or higher) number of fields on the level of both the free energy functional and the dynamic equations, enables the use of arbitrary pairwise equilibrium interfacial properties, penalizes multiple junctions increasingly with the number of phases, ensures non-negative entropy production and the convergence of the dynamic solutions to the equilibrium solutions, and avoids the appearance of spurious phases on binary interfaces. The approach is tested for multicomponent phase separation and grain coarsening.

  2. Coherence penalty functional: A simple method for adding decoherence in Ehrenfest dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akimov, Alexey V., E-mail: alexvakimov@gmail.com, E-mail: oleg.prezhdo@rochester.edu; Chemistry Department, Brookhaven National Laboratory, Upton, New York 11973; Long, Run

    2014-05-21

    We present a new semiclassical approach for description of decoherence in electronically non-adiabatic molecular dynamics. The method is formulated on the grounds of the Ehrenfest dynamics and the Meyer-Miller-Thoss-Stock mapping of the time-dependent Schrödinger equation onto a fully classical Hamiltonian representation. We introduce a coherence penalty functional (CPF) that accounts for decoherence effects by randomizing the wavefunction phase and penalizing development of coherences in regions of strong non-adiabatic coupling. The performance of the method is demonstrated with several model and realistic systems. Compared to other semiclassical methods tested, the CPF method eliminates artificial interference and improves agreement with the fully quantum calculations on the models. When applied to study electron transfer dynamics in the nanoscale systems, the method shows an improved accuracy of the predicted time scales. The simplicity and high computational efficiency of the CPF approach make it a perfect practical candidate for applications in realistic systems.

  3. [Extramural research funds and penal law--status of legislation].

    PubMed

    Ulsenheimer, Klaus

    2005-04-01

    After decades of smooth functioning, the cooperation of physicians and hospitals with industry (much desired by the government in the interest of clinical research) has fallen into legal discredit due to increasingly frequent criminal inquiries and proceedings for undue privileges, corruption, and embezzlement. The discredit is so severe that industry funding for clinical research is being diverted abroad to an increasing extent. The legal elements of embezzlement assume the intentional misuse of the entrusted funds against the interest of the entrusting party. Undue privileges occur when an official requests an advantage in exchange for a service (or is promised one or takes one) in his or somebody else's interest. The elements of corruption are given when the receiver of the undue privilege provides an illegal service or takes a discretionary decision under the influence of the gratuity. The tension between the prohibition of undue privileges (as regulated by penal law) and the granting of extramural funds (as regulated by administrative law in academic institutions) can be reduced through a high degree of transparency and the establishment of control mechanisms--public announcement and authorization by officials--as well as through exact documentation and observance of the principles of separation of interests and moderation. Since the anti-corruption law of 1997, it has also been possible to charge physicians employed in private institutions with corruption. In contrast, physicians in private practice are not covered by the above criminal provisions. They can only be charged with a misdemeanor, or called to account by the professional board, on the basis of the law that regulates advertising for medicinal products (Heilmittelwerbegesetz).

  4. Comparing classification methods for diffuse reflectance spectra to improve tissue specific laser surgery.

    PubMed

    Engelhardt, Alexander; Kanawade, Rajesh; Knipfer, Christian; Schmid, Matthias; Stelzle, Florian; Adler, Werner

    2014-07-16

    In the field of oral and maxillofacial surgery, newly developed laser scalpels have multiple advantages over traditional metal scalpels. However, they lack haptic feedback. This is dangerous near, e.g., nerve tissue, which has to be preserved during surgery. One solution to this problem is to train an algorithm that analyzes the reflected light spectra during surgery and can classify these spectra into different tissue types, in order to ultimately send a warning or temporarily switch off the laser when critical tissue is about to be ablated. Various machine learning algorithms are available for this task, but a detailed analysis is needed to identify the most appropriate algorithm. In this study, a small data set is used to simulate many larger data sets according to a multivariate Gaussian distribution. Various machine learning algorithms are then trained and evaluated on these data sets. The algorithms' performance is subsequently evaluated and compared by averaged confusion matrices and ultimately by boxplots of misclassification rates. The results are validated on the smaller, experimental data set. Most classifiers have a median misclassification rate below 0.25 on the simulated data. The most notable performance was observed for Penalized Discriminant Analysis, with a misclassification rate of 0.00 on the simulated data and an average misclassification rate of 0.02 in a 10-fold cross-validation on the original data. The results suggest that Penalized Discriminant Analysis is the most promising approach, most probably because it considers the functional, correlated nature of the reflectance spectra. The results of this study improve the accuracy of real-time tissue discrimination and are an essential step towards improving the safety of oral laser surgery.

  5. Penalized maximum likelihood simultaneous longitudinal PET image reconstruction with difference-image priors.

    PubMed

    Ellis, Sam; Reader, Andrew J

    2018-04-26

    Many clinical contexts require the acquisition of multiple positron emission tomography (PET) scans of a single subject, for example, to observe and quantitate changes in functional behaviour in tumors after treatment in oncology. Typically, the datasets from each of these scans are reconstructed individually, without exploiting the similarities between them. We have recently shown that sharing information between longitudinal PET datasets by penalizing voxel-wise differences during image reconstruction can improve reconstructed images by reducing background noise and increasing the contrast-to-noise ratio of high-activity lesions. Here, we present two additional novel longitudinal difference-image priors and evaluate their performance using two-dimensional (2D) simulation studies and a three-dimensional (3D) real dataset case study. We have previously proposed a simultaneous difference-image-based penalized maximum likelihood (PML) longitudinal image reconstruction method that encourages sparse difference images (DS-PML), and in this work we propose two further novel prior terms. The priors are designed to encourage longitudinal images with corresponding differences which have (a) low entropy (DE-PML), and (b) high sparsity in their spatial gradients (DTV-PML). These two new priors and the originally proposed longitudinal prior were applied to 2D-simulated treatment response [18F]fluorodeoxyglucose (FDG) brain tumor datasets and compared to standard maximum likelihood expectation-maximization (MLEM) reconstructions. These 2D simulation studies explored the effects of penalty strengths, tumor behaviour, and interscan coupling on reconstructed images. Finally, a real two-scan longitudinal data series acquired from a head and neck cancer patient was reconstructed with the proposed methods and the results compared to standard reconstruction methods. 
Using any of the three priors with an appropriate penalty strength produced images with noise levels equivalent to those seen when using standard reconstructions with increased count levels. In tumor regions, each method produces subtly different results in terms of preservation of tumor quantitation and reconstruction root mean-squared error (RMSE). In particular, in the two-scan simulations, the DE-PML method produced tumor means in close agreement with MLEM reconstructions, while the DTV-PML method produced the lowest errors due to noise reduction within the tumor. Across a range of tumor responses and different numbers of scans, similar results were observed, with DTV-PML producing the lowest errors of the three priors and DE-PML producing the lowest bias. Similar improvements were observed in the reconstructions of the real longitudinal datasets, although imperfect alignment of the two PET images resulted in additional changes in the difference image that affected the performance of the proposed methods. Reconstruction of longitudinal datasets by penalizing difference images between pairs of scans from a data series allows for noise reduction in all reconstructed images. An appropriate choice of penalty term and penalty strength allows for this noise reduction to be achieved while maintaining reconstruction performance in regions of change, either in terms of quantitation of mean intensity via DE-PML, or in terms of tumor RMSE via DTV-PML. Overall, improving the image quality of longitudinal datasets via simultaneous reconstruction has the potential to improve upon currently used methods, allow dose reduction, or reduce scan time while maintaining image quality at current levels. © 2018 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
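The three prior flavors named in this record can be sketched as scalar penalties on a difference image d. These are schematic stand-ins (L1 norm, histogram entropy, anisotropic TV) to convey what each prior rewards; in the actual method they are embedded in the PML reconstruction objective, and the histogram-based entropy here is an assumption about the form.

```python
import numpy as np

def sparsity_penalty(d):
    """DS-style term: L1 norm encourages a sparse difference image."""
    return float(np.abs(d).sum())

def entropy_penalty(d, bins=32):
    """DE-style term: low histogram entropy favours differences
    concentrated on a few values (schematic form)."""
    hist, _ = np.histogram(d, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def tv_penalty(d):
    """DTV-style term: sparse spatial gradients of the difference image."""
    return float(np.abs(np.diff(d, axis=0)).sum() + np.abs(np.diff(d, axis=1)).sum())
```

A difference image that is zero except at a genuine region of change scores low on all three terms, which is why such priors reduce noise without erasing real treatment response.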

  6. A comparison of "life threatening injury" concept in the Turkish Penal Code and trauma scoring systems.

    PubMed

    Fedakar, Recep; Aydiner, Ahmet Hüsamettin; Ercan, Ilker

    2007-07-01

    To compare the accuracy and check the suitability of the Glasgow Coma Scale (GCS), the Revised Trauma Score (RTS), the Injury Severity Score (ISS), the New Injury Severity Score (NISS) and the Trauma and Injury Severity Score (TRISS)--scoring systems widely used in international trauma studies--in evaluating the "life threatening injury" concept established by the Turkish Penal Code. The age, sex, type of trauma, type and localization of wounds, GCS, RTS, ISS, NISS and TRISS values, and the life-threatening-injury decisions of 627 trauma patients admitted to the Emergency Department of the Uludag University Medical School Hospital in 2003 were examined. A life-threatening injury was present in 35.2% of the cases examined. GCS, RTS, ISS, NISS and TRISS confirmed the decision of life-threatening injury in 74.8%, 76.9%, 88.7%, 86.6% and 68.6% of cases, respectively. The best cut-off point, 14, was found for the ISS, with 79.6% sensitivity and 93.6% specificity. All of the cases with a sole linear skull fracture officially ruled a life-threatening injury had an ISS of 5, a NISS of 6 and the best scores on the GCS (15), RTS (7.8408) and TRISS (100%). ISS and NISS appeared to be the best trauma scoring systems for the decision of life-threatening injury, compared with GCS, RTS and TRISS. Thus, ISS and NISS are acceptable for use in evaluating the life-threatening injury concept established by the Turkish Penal Code.

  7. A penalized quantitative structure-property relationship study on melting point of energetic carbocyclic nitroaromatic compounds using adaptive bridge penalty.

    PubMed

    Al-Fakih, A M; Algamal, Z Y; Lee, M H; Aziz, M

    2018-05-01

    A penalized quantitative structure-property relationship (QSPR) model with an adaptive bridge penalty for predicting the melting points of 92 energetic carbocyclic nitroaromatic compounds is proposed. To ensure the consistency of the descriptor selection of the proposed penalized adaptive bridge (PBridge), we propose a ridge estimator ([Formula: see text]) as an initial weight in the adaptive bridge penalty. The Bayesian information criterion was applied to ensure the accurate selection of the tuning parameter ([Formula: see text]). The PBridge-based model was internally and externally validated based on [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], the Y-randomization test, [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text] and the applicability domain. The validation results indicate that the model is robust and not due to chance correlation. In both descriptor selection and prediction performance on the training dataset, PBridge outperforms the other methods used. PBridge shows the highest [Formula: see text] of 0.959, [Formula: see text] of 0.953, [Formula: see text] of 0.949 and [Formula: see text] of 0.959, and the lowest [Formula: see text] and [Formula: see text]. For the test dataset, PBridge shows a higher [Formula: see text] of 0.945 and [Formula: see text] of 0.948, and a lower [Formula: see text] and [Formula: see text], indicating its better prediction performance. The results clearly reveal that the proposed PBridge is useful for constructing reliable and robust QSPRs for predicting melting points prior to synthesizing new organic compounds.
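The general shape of an adaptive bridge penalty can be sketched as λ Σⱼ wⱼ|βⱼ|^γ with 0 < γ < 1, where the weights wⱼ come from an initial estimate (here, per the record, a ridge fit) so that descriptors with small initial coefficients are penalized more heavily. The exact weighting used in the paper may differ; this is an illustrative form.

```python
import numpy as np

def adaptive_bridge_penalty(beta, beta_init, lam, gamma=0.5, eps=1e-8):
    """Schematic adaptive bridge penalty: lam * sum_j w_j * |beta_j|^gamma,
    with data-driven weights w_j = 1 / (|beta_init_j| + eps) taken from an
    initial (e.g. ridge) estimate. The concave power 0 < gamma < 1 promotes
    sparse descriptor selection."""
    w = 1.0 / (np.abs(np.asarray(beta_init, dtype=float)) + eps)
    return float(lam * np.sum(w * np.abs(np.asarray(beta, dtype=float)) ** gamma))
```

Larger initial coefficients yield smaller weights, so descriptors that the ridge fit already deems important are shrunk less.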

  8. Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions.

    PubMed

    Liu, Hongcheng; Yao, Tao; Li, Runze; Ye, Yinyu

    2017-11-01

    This paper concerns the folded concave penalized sparse linear regression (FCPSLR), a class of popular sparse recovery methods. Although FCPSLR yields desirable recovery performance when solved globally, computing a global solution is NP-complete. Despite some existing statistical performance analyses on local minimizers or on specific FCPSLR-based learning algorithms, open questions remain as to whether local solutions that are known to admit fully polynomial-time approximation schemes (FPTAS) may already be sufficient to ensure the statistical performance, and whether that statistical performance can be non-contingent on the specific designs of computing procedures. To address these questions, this paper presents the following threefold results: (i) Any local solution (stationary point) is a sparse estimator, under some conditions on the parameters of the folded concave penalties. (ii) Perhaps more importantly, any local solution satisfying a significant subspace second-order necessary condition (S3ONC), which is weaker than the second-order KKT condition, yields a bounded error in approximating the true parameter with high probability. In addition, if the minimal signal strength is sufficient, the S3ONC solution likely recovers the oracle solution. This result also explicates that the goal of improving the statistical performance is consistent with the optimization criteria of minimizing the suboptimality gap in solving the non-convex programming formulation of FCPSLR. (iii) We apply (ii) to the special case of FCPSLR with the minimax concave penalty (MCP) and show that under the restricted eigenvalue condition, any S3ONC solution with a better objective value than the Lasso solution entails the strong oracle property. In addition, such a solution generates a model error (ME) comparable to the optimal but exponential-time sparse estimator given a sufficient sample size, while the worst-case ME is comparable to the Lasso in general. 
Furthermore, computing a local solution that satisfies the S3ONC admits an FPTAS.
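For reference, the minimax concave penalty (MCP) named in this record has the standard closed form sketched below (Zhang, 2010); this only evaluates the penalty and is unrelated to the paper's optimization analysis.

```python
import numpy as np

def mcp(t, lam, gamma=3.0):
    """MCP of Zhang (2010), elementwise: slope lam near zero that decays
    linearly to zero, then a constant penalty gamma * lam^2 / 2 for
    |t| > gamma * lam (so large coefficients are left unshrunk)."""
    t = np.abs(np.asarray(t, dtype=float))
    return np.where(t <= gamma * lam,
                    lam * t - t ** 2 / (2 * gamma),
                    0.5 * gamma * lam ** 2)
```

The two pieces meet continuously at `t = gamma * lam`, which gives the folded concave shape shared with SCAD.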

  9. Dynamic Sensor Tasking for Space Situational Awareness via Reinforcement Learning

    NASA Astrophysics Data System (ADS)

    Linares, R.; Furfaro, R.

    2016-09-01

    This paper studies the Sensor Management (SM) problem for optical Space Object (SO) tracking. The tasking problem is formulated as a Markov Decision Process (MDP) and solved using Reinforcement Learning (RL). The RL problem is solved using the actor-critic policy gradient approach. The actor provides a policy which is random over actions and given by a parametric probability density function (pdf). The critic evaluates the policy by calculating the estimated total reward or the value function for the problem. The parameters of the policy action pdf are optimized using gradients with respect to the reward function. Both the critic and the actor are modeled using deep neural networks (multi-layer neural networks). The policy neural network takes the current state as input and outputs probabilities for each possible action. This policy is random, and can be evaluated by sampling random actions using the probabilities determined by the policy neural network's outputs. The critic approximates the total reward using a neural network. The estimated total reward is used to approximate the gradient of the policy network with respect to the network parameters. This approach is used to find the non-myopic optimal policy for tasking optical sensors to estimate SO orbits. The reward function is based on reducing the uncertainty for the overall catalog to below a user-specified uncertainty threshold; this work uses a 30 km total position error as the threshold. The RL method receives a negative reward as long as any SO has a total position error above the uncertainty threshold, which penalizes policies that take longer to achieve the desired accuracy. A positive reward is provided when all SOs are below the catalog uncertainty threshold. An optimal policy is sought that takes actions to achieve the desired catalog uncertainty in minimum time. 
This work trains the policy in simulation by letting it task a single sensor to "learn" from its performance. The proposed approach for the SM problem is tested in simulation and good performance is found using the actor-critic policy gradient method.
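The reward shaping described in this record can be sketched in a few lines. The ±1 magnitudes are assumptions for illustration; only the sign structure (negative while any object exceeds the threshold, positive once the whole catalog is below it) comes from the record.

```python
import numpy as np

def catalog_reward(position_errors_km, threshold_km=30.0):
    """Per-step reward for the sensor-tasking policy: negative while any
    space object's total position error exceeds the threshold (penalizing
    slow policies), positive once the entire catalog is below it."""
    errors = np.asarray(position_errors_km, dtype=float)
    return 1.0 if np.all(errors <= threshold_km) else -1.0
```

Summed over an episode, this reward is maximized by driving every object's uncertainty below the threshold in as few steps as possible, matching the stated minimum-time objective.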

  10. Participatory Practices in Adult Education.

    ERIC Educational Resources Information Center

    Campbell, Pat, Ed.; Burnaby, Barbara, Ed.

    Participatory education is a collective effort in which the participants are committed to building a just society through individual and socioeconomic transformation and to ending domination through changing power relations. This book describes participatory practices in many environments, including educational and penal institutions,…

  11. Office Skills: Measuring Typewriting Output.

    ERIC Educational Resources Information Center

    Petersen, Lois E.; Kruk, Leonard B.

    1978-01-01

    The advent of word processing centers has provided typewriting teachers with an alternative measurement system that, instead of penalizing errors, grades students according to Usable Lines Produced (ULP). The ULP system is job-oriented and promotes realistic office standards in typewriting productivity. (MF)

  12. Senegal: Background and U.S. Relations

    DTIC Science & Technology

    2012-02-20

press, and assembly; corruption and impunity; rape, domestic violence, sexual harassment of and discrimination against women; female genital mutilation... “Senegal’s Budget Process is Transparent [UNCLASSIFIED],” May 4, 2010. The 1999 Penal Law outlawed domestic violence and female genital mutilation, and

  13. Evaluating the Visually Impaired: Neuropsychological Techniques.

    ERIC Educational Resources Information Center

    Price, J. R.; And Others

    1987-01-01

    Assessment of nonvisual neuropsychological impairments in visually impaired persons can be achieved through modification of existing intelligence, memory, sensory-motor, personality, language, and achievement tests so that they do not require vision or penalize visually impaired persons. The Halstead-Reitan and Luria-Nebraska neuropsychological…

  14. Golden Handcuffs

    ERIC Educational Resources Information Center

    Costrell, Robert M.; Podgursky, Michael

    2010-01-01

    Teacher pensions consume a substantial portion of school budgets. If relatively generous pensions help attract effective teachers, the expense might be justified. But new evidence suggests that current pension systems, by concentrating benefits on teachers who spend their entire careers in a single state and penalizing mobile teachers, may…

  15. 24 CFR 882.401 - Eligible properties.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...) Ineligible properties. (1) Nursing homes, units within the grounds of penal, reformatory, medical, mental and similar public or private institutions, and facilities providing continual psychiatric, medical or nursing... State or unit of general local government is not eligible for assistance under this program. (3) High...

  16. 24 CFR 882.401 - Eligible properties.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...) Ineligible properties. (1) Nursing homes, units within the grounds of penal, reformatory, medical, mental and similar public or private institutions, and facilities providing continual psychiatric, medical or nursing... State or unit of general local government is not eligible for assistance under this program. (3) High...

  17. 24 CFR 882.401 - Eligible properties.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...) Ineligible properties. (1) Nursing homes, units within the grounds of penal, reformatory, medical, mental and similar public or private institutions, and facilities providing continual psychiatric, medical or nursing... State or unit of general local government is not eligible for assistance under this program. (3) High...

  18. 24 CFR 882.401 - Eligible properties.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...) Ineligible properties. (1) Nursing homes, units within the grounds of penal, reformatory, medical, mental and similar public or private institutions, and facilities providing continual psychiatric, medical or nursing... State or unit of general local government is not eligible for assistance under this program. (3) High...

  19. Correlates of sleep quality in midlife and beyond: a machine learning analysis.

    PubMed

    Kaplan, Katherine A; Hardas, Prajesh P; Redline, Susan; Zeitzer, Jamie M

    2017-06-01

In older adults, traditional metrics derived from polysomnography (PSG) are not well correlated with subjective sleep quality. Little is known about whether the association between PSG and subjective sleep quality changes with age, or whether quantitative electroencephalography (qEEG) is associated with sleep quality. Therefore, we examined the relationship between subjective sleep quality and objective sleep characteristics (standard PSG and qEEG) across middle to older adulthood. Using cross-sectional analyses of 3173 community-dwelling men and women aged 39 to 90 participating in the Sleep Heart Health Study, we examined the relationship between a morning rating of the prior night's sleep quality (sleep depth and restfulness) and polysomnographic and qEEG descriptors of that single night of sleep, along with clinical and demographic measures. Multivariable models were constructed using two machine learning methods, namely lasso penalized regressions and random forests. Little variance was explained across models. Greater objective sleep efficiency, reduced wake after sleep onset, and fewer sleep-to-wake stage transitions were each associated with higher sleep quality; qEEG variables contributed little explanatory power. The oldest adults reported the highest sleep quality even as objective sleep deteriorated, such that they would rate their sleep better given the same level of sleep efficiency. Despite this, there were no major differences in the predictors of subjective sleep across the age span. Standard metrics derived from PSG, including qEEG, contribute little to explaining subjective sleep quality in middle-aged to older adults. The objective correlates of subjective sleep quality do not appear to systematically change with age despite a change in the relationship between subjective sleep quality and objective sleep efficiency. Published by Elsevier B.V.
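The lasso penalized regression used as one of the study's two machine learning methods can be illustrated with a minimal coordinate-descent solver. This is a generic sketch on synthetic data, not the study's code or variables:

```python
import numpy as np

def soft_threshold(x, lam):
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent lasso: minimizes 0.5/n * ||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]      # partial residual excluding j
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0])   # sparse ground truth
y = X @ beta_true + 0.1 * rng.standard_normal(n)
beta_hat = lasso_cd(X, y, lam=0.2)                  # irrelevant predictors -> exactly 0
```

The L1 penalty drives the coefficients of uninformative predictors exactly to zero, which is what lets a lasso screen many candidate correlates of sleep quality at once.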

  20. Abortion checks at German-Dutch border.

    PubMed

    Von Baross, J

    1991-05-01

This commentary on West German abortion law, particularly as it bears on illegal abortion in the Netherlands, finds the law restrictive and in violation of the dignity and rights of women. A 1990 study published by the Max Planck Institute found that a main point of prosecution between 1976 and 1986, as reported by Der Spiegel, was border crossings from the Netherlands. An estimated 10,000 women annually have abortions abroad, 6,000 to 7,000 of them in the Netherlands. The procedure was for an official to stop a young woman and query her about drugs; later the woman would admit to an abortion and be forced into a medical examination. The German Penal Code Section 218 permits abortion only for certain reasons certified by a doctor other than the one performing the abortion. Counseling on available social assistance must be completed 3 days prior to the abortion. Many counseling offices are church-related and opposed to abortion. Many doctors refuse to provide the required certification, and access to abortion is limited. The required hospital stay is 3-4 nights, with no day-care facilities. Penal Code Section 5 No. 9 allows prosecution for uncounseled illegal abortion. Abortion law reform is anticipated in the Bundestag by the end of 1992 under the Treaty on the Unification of Germany. The Treaty states that the rights of the unborn child must be protected and that pregnant women must be enabled to relieve their distress in a way compatible with the Constitution, improving on the legal regulations of both West Germany and East Germany (the latter permitted abortion on request within 12 weeks of conception, without counseling). It is hoped that the law will be liberalized and that Penal Code Section 5 No. 9 will be abolished.

  1. Aero-acoustic performance comparison of core engine noise suppressors on NASA quiet engine C

    NASA Technical Reports Server (NTRS)

    Bloomer, H. E.; Schaefer, J. W.

    1977-01-01

The relative aero-acoustic effectiveness of two core engine suppressors was evaluated: a contractor-designed suppressor delivered with the Quiet Engine and a NASA-designed suppressor. The NASA suppressor was tested with and without a splitter, making a total of three configurations reported in addition to the baseline hardwall case. The aerodynamic results are presented in terms of tailpipe pressure loss, corrected net thrust, and corrected specific fuel consumption as functions of engine power setting. The acoustic results are divided into duct and far-field acoustic data. The NASA-designed core suppressor did the better job of suppressing aft-end noise, but the splitter associated with it caused a significant engine performance penalty. The NASA core suppressor without the splitter suppressed most of the core noise without any engine performance penalty.

  2. Prophetic Granger Causality to infer gene regulatory networks.

    PubMed

    Carlin, Daniel E; Paull, Evan O; Graim, Kiley; Wong, Christopher K; Bivol, Adrian; Ryabinin, Peter; Ellrott, Kyle; Sokolov, Artem; Stuart, Joshua M

    2017-01-01

    We introduce a novel method called Prophetic Granger Causality (PGC) for inferring gene regulatory networks (GRNs) from protein-level time series data. The method uses an L1-penalized regression adaptation of Granger Causality to model protein levels as a function of time, stimuli, and other perturbations. When combined with a data-independent network prior, the framework outperformed all other methods submitted to the HPN-DREAM 8 breast cancer network inference challenge. Our investigations reveal that PGC provides complementary information to other approaches, raising the performance of ensemble learners, while on its own achieves moderate performance. Thus, PGC serves as a valuable new tool in the bioinformatics toolkit for analyzing temporal datasets. We investigate the general and cell-specific interactions predicted by our method and find several novel interactions, demonstrating the utility of the approach in charting new tumor wiring.
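The L1-penalized Granger regression idea can be sketched generically: protein levels at time t are regressed on all lagged levels, and predictors with nonzero penalized coefficients become candidate regulatory edges. The following is an illustrative stand-in on synthetic data (a proximal-gradient/ISTA lasso solver, invented series), not the PGC implementation:

```python
import numpy as np

def ista_lasso(X, y, lam, n_iter=500):
    """Proximal-gradient (ISTA) solver for 0.5/n * ||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n       # Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n
        z = b - grad / L
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return b

# Synthetic protein time series: protein 0 drives protein 2 with lag 1.
rng = np.random.default_rng(1)
T_len, n_prot = 300, 3
P = rng.standard_normal((T_len, n_prot))
P[1:, 2] += 0.8 * P[:-1, 0]

# Granger design: predict protein 2 at time t from all proteins at t-1.
X, y = P[:-1, :], P[1:, 2]
b = ista_lasso(X, y, lam=0.1)
# Nonzero entries of b are the inferred lagged (Granger-causal) influences.
```

Here the lagged coefficient for protein 0 survives the penalty while the uninformative lag is shrunk to (near) zero, yielding the edge 0 → 2.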

  3. Prophetic Granger Causality to infer gene regulatory networks

    PubMed Central

    Carlin, Daniel E.; Paull, Evan O.; Graim, Kiley; Wong, Christopher K.; Bivol, Adrian; Ryabinin, Peter; Ellrott, Kyle; Sokolov, Artem

    2017-01-01

    We introduce a novel method called Prophetic Granger Causality (PGC) for inferring gene regulatory networks (GRNs) from protein-level time series data. The method uses an L1-penalized regression adaptation of Granger Causality to model protein levels as a function of time, stimuli, and other perturbations. When combined with a data-independent network prior, the framework outperformed all other methods submitted to the HPN-DREAM 8 breast cancer network inference challenge. Our investigations reveal that PGC provides complementary information to other approaches, raising the performance of ensemble learners, while on its own achieves moderate performance. Thus, PGC serves as a valuable new tool in the bioinformatics toolkit for analyzing temporal datasets. We investigate the general and cell-specific interactions predicted by our method and find several novel interactions, demonstrating the utility of the approach in charting new tumor wiring. PMID:29211761

  4. Failure: A Source of Progress in Maintenance and Design

    NASA Astrophysics Data System (ADS)

    Chaïb, R.; Taleb, M.; Benidir, M.; Verzea, I.; Bellaouar, A.

This approach allows failure to be used as a source of progress in maintenance and design: to detect the most critical components in equipment, to determine the priority order of maintenance actions, and to direct the exploitation procedure towards the most penalizing links in the equipment, even defining the necessary changes and recommendations for future improvement. One can thus assess the pathological behaviour of the material and increase its availability, even extend its lifespan and improve its future design. In this context and in the light of these points, failures are important in managing the maintenance function. Indeed, it has become important to understand the phenomena of failure and degradation of equipment in order to establish an appropriate maintenance policy for the rational use of mechanical components and to move to the practice of proactive maintenance [1] and of maintenance at the design stage [2].

  5. Primary Frequency Response with Aggregated DERs: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guggilam, Swaroop S.; Dhople, Sairaj V.; Zhao, Changhong

    2017-03-03

Power networks have to withstand a variety of disturbances that affect system frequency, and the problem is compounded with the increasing integration of intermittent renewable generation. Following a large-signal generation or load disturbance, system frequency is arrested leveraging primary frequency control provided by governor action in synchronous generators. In this work, we propose a framework for distributed energy resources (DERs) deployed in distribution networks to provide (supplemental) primary frequency response. Particularly, we demonstrate how power-frequency droop slopes for individual DERs can be designed so that the distribution feeder presents a guaranteed frequency-regulation characteristic at the feeder head. Furthermore, the droop slopes are engineered such that injections of individual DERs conform to a well-defined fairness objective that does not penalize them for their location on the distribution feeder. Time-domain simulations for an illustrative network composed of a combined transmission network and distribution network with frequency-responsive DERs are provided to validate the approach.
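In the simplest proportional case, the droop-slope design reduces to arithmetic like the following (all numbers invented for illustration; the preprint's actual design solves an optimization over the feeder):

```python
# Target aggregate power-frequency characteristic at the feeder head,
# split across DERs in proportion to capacity so that location on the
# feeder does not penalize any individual DER.
target_feeder_slope = 100.0                   # kW of response per Hz of deviation
capacities = [5.0, 15.0, 30.0]                # DER capacities in kW (hypothetical)
total_cap = sum(capacities)
slopes = [target_feeder_slope * c / total_cap for c in capacities]

delta_f = -0.1                                # a 0.1 Hz under-frequency event
injections = [-m * delta_f for m in slopes]   # each DER's extra injection, kW
feeder_response = sum(injections)             # aggregate response seen upstream
```

Each DER's injection is proportional to its capacity rather than its electrical distance from the feeder head, a simple instance of the fairness objective described above.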

  6. A rapid detection method of Escherichia coli by surface enhanced Raman scattering

    NASA Astrophysics Data System (ADS)

    Tao, Feifei; Peng, Yankun; Xu, Tianfeng

    2015-05-01

Conventional microbiological detection and enumeration methods are time-consuming and labor-intensive, and provide only retrospective information. The objective of the present work is to study the capability of surface enhanced Raman scattering (SERS) to detect Escherichia coli (E. coli) using the presented silver colloidal substrate. The obtained results showed that the adaptive iteratively reweighted Penalized Least Squares (airPLS) algorithm could effectively remove the fluorescent background from the original Raman spectra, and that Raman characteristic peaks at 558, 682, 726, 1128, 1210 and 1328 cm-1 could be observed stably in the baseline-corrected SERS spectra at all studied bacterial concentrations. The detection limit of SERS was determined to be as low as 0.73 log CFU/ml for E. coli with the prepared silver colloidal substrate. The quantitative prediction results using the intensity values of the characteristic peaks were not good, with correlation coefficients of 0.99 for the calibration set but only 0.64 for the cross-validation set.
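The penalized least-squares baseline correction family that airPLS belongs to can be sketched with its close relative, asymmetric least squares (AsLS): a second-difference roughness penalty plus asymmetric weights that down-weight points above the running baseline. This is an illustrative stand-in on synthetic data; airPLS itself uses an adaptive, iteratively reweighted weighting rule rather than the fixed asymmetry shown here.

```python
import numpy as np

def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least-squares baseline: solve (W + lam*D'D) z = W y,
    where D is the second-difference operator and W down-weights peaks."""
    n = len(y)
    D = np.diff(np.eye(n), 2, axis=0)        # second-difference operator
    w = np.ones(n)
    for _ in range(n_iter):
        W = np.diag(w)
        z = np.linalg.solve(W + lam * D.T @ D, w * y)
        w = np.where(y > z, p, 1 - p)        # points above baseline = peaks
    return z

# Synthetic spectrum: slowly varying fluorescence plus one Raman-like peak.
x = np.linspace(0, 10, 400)
baseline_true = 0.5 + 0.1 * x
peak = 3.0 * np.exp(-0.5 * ((x - 5) / 0.1) ** 2)
y = baseline_true + peak
z = asls_baseline(y)
corrected = y - z                             # peak survives, background removed
```

The smooth penalized fit tracks the fluorescent background while the asymmetric weights keep it from climbing into the peak, so subtraction leaves the characteristic peak intact.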

  7. Nonparametric modeling of longitudinal covariance structure in functional mapping of quantitative trait loci.

    PubMed

    Yap, John Stephen; Fan, Jianqing; Wu, Rongling

    2009-12-01

Estimation of the covariance structure of longitudinal processes is a fundamental prerequisite for the practical deployment of functional mapping designed to study the genetic regulation and network of quantitative variation in dynamic complex traits. We present a nonparametric approach for estimating the covariance structure of a quantitative trait measured repeatedly at a series of time points. Specifically, we adopt Huang et al.'s (2006, Biometrika 93, 85-98) approach of invoking the modified Cholesky decomposition and converting the problem into modeling a sequence of regressions of responses. A regularized covariance estimator is obtained using a normal penalized likelihood with an L2 penalty. This approach, embedded within a mixture likelihood framework, leads to enhanced accuracy, precision, and flexibility of functional mapping while preserving its biological relevance. Simulation studies are performed to reveal the statistical properties and advantages of the proposed method. A real example from a mouse genome project is analyzed to illustrate the utilization of the methodology. The new method will provide a useful tool for genome-wide scanning for the existence and distribution of quantitative trait loci underlying a dynamic trait important to agriculture, biology, and health sciences.
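The modified-Cholesky idea reduces covariance estimation to a sequence of regressions: each measurement is regressed on its predecessors, the negated coefficients fill a unit lower-triangular matrix T, the residual variances fill a diagonal D, and the covariance is recovered as T⁻¹ D T⁻ᵀ. The sketch below uses a plain ridge (L2) penalty on the regression coefficients as a simple stand-in for the paper's penalized likelihood, on invented AR(1)-like longitudinal data:

```python
import numpy as np

def cholesky_cov_estimate(Y, ridge=0.1):
    """Covariance via modified Cholesky: regress y_t on y_1..y_{t-1}
    with an L2 penalty; Sigma = T^{-1} D T^{-T}."""
    n, m = Y.shape                     # n subjects, m time points
    Yc = Y - Y.mean(axis=0)
    T = np.eye(m)
    d = np.zeros(m)
    d[0] = Yc[:, 0].var()
    for t in range(1, m):
        X, y = Yc[:, :t], Yc[:, t]
        phi = np.linalg.solve(X.T @ X + ridge * np.eye(t), X.T @ y)
        T[t, :t] = -phi                # negated regression coefficients
        d[t] = np.var(y - X @ phi)     # residual (innovation) variance
    Tinv = np.linalg.inv(T)
    return Tinv @ np.diag(d) @ Tinv.T

rng = np.random.default_rng(2)
m, n = 6, 500                          # 6 time points, 500 subjects
Y = np.zeros((n, m))
Y[:, 0] = rng.standard_normal(n)
for t in range(1, m):
    Y[:, t] = 0.7 * Y[:, t - 1] + rng.standard_normal(n)
Sigma = cholesky_cov_estimate(Y)
```

A side benefit of the construction is that the estimate is symmetric positive definite by design, since all residual variances are positive.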

  8. 45 CFR 46.303 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Definitions. 46.303 Section 46.303 Public Welfare... Protections Pertaining to Biomedical and Behavioral Research Involving Prisoners as Subjects § 46.303... involuntarily confined or detained in a penal institution. The term is intended to encompass individuals...

  9. 45 CFR 46.303 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Definitions. 46.303 Section 46.303 Public Welfare... Protections Pertaining to Biomedical and Behavioral Research Involving Prisoners as Subjects § 46.303... involuntarily confined or detained in a penal institution. The term is intended to encompass individuals...

  10. 45 CFR 46.303 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Definitions. 46.303 Section 46.303 Public Welfare... Protections Pertaining to Biomedical and Behavioral Research Involving Prisoners as Subjects § 46.303... involuntarily confined or detained in a penal institution. The term is intended to encompass individuals...

  11. 7 CFR 1437.401 - Forage.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Coverage of Forage Intended for Animal Consumption § 1437.401 Forage. (a) Forage eligible for benefits... impact of disaster conditions, as determined by CCC, shall not be penalized. Benefits are not available..., except claims on forage for grazing benefits will be determined according to paragraph (f) of this...

  12. 7 CFR 1437.401 - Forage.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Coverage of Forage Intended for Animal Consumption § 1437.401 Forage. (a) Forage eligible for benefits... impact of disaster conditions, as determined by CCC, shall not be penalized. Benefits are not available..., except claims on forage for grazing benefits will be determined according to paragraph (f) of this...

  13. 7 CFR 1437.401 - Forage.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Coverage of Forage Intended for Animal Consumption § 1437.401 Forage. (a) Forage eligible for benefits... impact of disaster conditions, as determined by CCC, shall not be penalized. Benefits are not available..., except claims on forage for grazing benefits will be determined according to paragraph (f) of this...

  14. 7 CFR 1437.401 - Forage.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Coverage of Forage Intended for Animal Consumption § 1437.401 Forage. (a) Forage eligible for benefits... impact of disaster conditions, as determined by CCC, shall not be penalized. Benefits are not available..., except claims on forage for grazing benefits will be determined according to paragraph (f) of this...

  15. 7 CFR 1437.401 - Forage.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Coverage of Forage Intended for Animal Consumption § 1437.401 Forage. (a) Forage eligible for benefits... impact of disaster conditions, as determined by CCC, shall not be penalized. Benefits are not available..., except claims on forage for grazing benefits will be determined according to paragraph (f) of this...

  16. Striking a Balance

    ERIC Educational Resources Information Center

    Martin, Susan Ferguson; Green, Andre

    2012-01-01

    Learning centers can help teachers assess students' content knowledge without penalizing them for language barriers. With the increasing number of English language learners (ELLs) in classrooms, the emphasis on mastery of content and inclusion of all students in class discussions and activities will provide all students a chance for scientific…

  17. 21 CFR 113.3 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... steam into the closed retort and the time when the retort reaches the required processing temperature..., school, penal, or other organization) processing of food, including pet food. Persons engaged in the... flames to achieve sterilization temperatures. A holding period in a heated section may follow the initial...

  18. [Commitment prerequisites according to paragraph 64 of the German penal code].

    PubMed

    Rasch, W

    1986-05-01

Article 64 of the Penal Code of the Federal Republic of Germany provides for the commitment of alcohol and drug addicts to special institutions if further severe offences are to be expected due to their addiction. The measure of correction cannot be applied if treatment endeavours are given no chance. As the lack of compliance very often only appears in the course of the treatment, an amendment was made which added a new article. According to the new legal regulations, the measure of correction can be annulled after one year of detainment if the addict shows no aptitude for the therapy. The new regulation implies an intensification of the therapist's role conflict, which stems from the fact that coerced therapy has little chance of success. Before establishing the deficient therapeutic capability of a patient, it should be conscientiously verified to what extent a team conflict or an insufficient therapeutic offer determines the decision. It should be considered that possibly one only wants to get rid of a troublesome patient.

  19. Act No. 62, Penal Code, 29 December 1987.

    PubMed

    1988-01-01

    This document contains various provisions of the 1987 Cuban Penal Code. Chapter 6 of Title 8 (crimes against life and bodily integrity) outlaws abortion and sets prison terms for its performance under various circumstances. Chapter 7 sets a penalty of five to 12 years imprisonment for performing a sterilization procedure. Chapter 8 outlines the penalties for abandonment of minors and incompetent or helpless people. Under Title 9 (crimes against individual rights), Chapter 8 renders it illegal to discriminate on the grounds of sex, race, color, or national origin. Chapter 1 of Title 11 deals with crimes against the normal development of sexual relations, setting penalties for rape, pederasty with violence, and lascivious abuse. Chapter 2 covers crimes against the normal development of the family such as incest, sexual relations with a minor, bigamy, illegal marriage, and substitution of one child for another. Chapter 3 places penalties for crimes against the normal development of childhood and youth, such as the corruption of minors, the neglect of minors, and the failure to support minors.

  20. When is coercive methadone therapy justified?

    PubMed

    D'Hotman, Daniel; Pugh, Jonathan; Douglas, Thomas

    2018-06-08

    Heroin use poses a significant health and economic burden to society, and individuals with heroin dependence are responsible for a significant amount of crime. Owing to its efficacy and cost-effectiveness, methadone maintenance therapy (MMT) is offered as an optional alternative to imprisonment for drug offenders in several jurisdictions. Some object to such 'MMT offers' on the basis that they involve coercion and thus invalidate the offender's consent to MMT. While we find these arguments unpersuasive, we do not attempt to build a case against them here. Instead, we explore whether administration of MMT following acceptance of an MMT offer might be permissible even on the assumption that MMT offers are coercive, and in such a way that the resulting MMT is non-consensual. We argue that non-consensual MMT following an MMT offer is typically permissible. We first offer empirical evidence to demonstrate the substantial benefits to the offender and society of implementing non-consensual MMT in the criminal justice system. We then explore and respond to potential objections to such uses of MMT. These appeal respectively to harm, autonomy, bodily and mental interference, and penal theoretic considerations. Finally, we introduce and dismiss a potential response to our argument that takes a revisionist position, rejecting prevailing incarceration practices. © 2018 John Wiley & Sons Ltd.

  1. Searching for discrimination rules in protease proteolytic cleavage activity using genetic programming with a min-max scoring function.

    PubMed

    Yang, Zheng Rong; Thomson, Rebecca; Hodgman, T Charles; Dry, Jonathan; Doyle, Austin K; Narayanan, Ajit; Wu, XiKun

    2003-11-01

    This paper presents an algorithm which is able to extract discriminant rules from oligopeptides for protease proteolytic cleavage activity prediction. The algorithm is developed using genetic programming. Three important components in the algorithm are a min-max scoring function, the reverse Polish notation (RPN) and the use of minimum description length. The min-max scoring function is developed using amino acid similarity matrices for measuring the similarity between an oligopeptide and a rule, which is a complex algebraic equation of amino acids rather than a simple pattern sequence. The Fisher ratio is then calculated on the scoring values using the class label associated with the oligopeptides. The discriminant ability of each rule can therefore be evaluated. The use of RPN makes the evolutionary operations simpler and therefore reduces the computational cost. To prevent overfitting, the concept of minimum description length is used to penalize over-complicated rules. A fitness function is therefore composed of the Fisher ratio and the use of minimum description length for an efficient evolutionary process. In the application to four protease datasets (Trypsin, Factor Xa, Hepatitis C Virus and HIV protease cleavage site prediction), our algorithm is superior to C5, a conventional method for deriving decision trees.
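Two of the components named above lend themselves to a compact sketch: evaluating a rule written in reverse Polish notation, and scoring rules by the Fisher ratio of their outputs across the two classes. This is a toy illustration with hypothetical tokens and values; the paper's min-max scoring against amino-acid similarity matrices is abstracted here into precomputed per-position scores (`x1`, `x2`, `x3`), and the MDL complexity penalty is omitted.

```python
import numpy as np

def eval_rpn(tokens, env):
    """Evaluate a rule in reverse Polish notation; operands are looked up
    in env (here: precomputed per-position similarity scores)."""
    stack = []
    ops = {'+': lambda a, b: a + b, '-': lambda a, b: a - b,
           '*': lambda a, b: a * b, 'max': max, 'min': min}
    for tok in tokens:
        if tok in ops:
            b, a = stack.pop(), stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(env.get(tok, 0.0))
    return stack[0]

def fisher_ratio(scores, labels):
    """Separation of class means over within-class spread for rule scores."""
    s, y = np.asarray(scores, float), np.asarray(labels)
    s0, s1 = s[y == 0], s[y == 1]
    return (s0.mean() - s1.mean()) ** 2 / (s0.var() + s1.var())

# Hypothetical rule "x1 x2 max x3 +" scored on 6 oligopeptides
# (3 cleaved, labeled 1; 3 non-cleaved, labeled 0).
envs = [{'x1': v1, 'x2': v2, 'x3': v3}
        for v1, v2, v3 in [(0.9, 0.2, 0.8), (0.7, 0.8, 0.9), (0.8, 0.1, 0.7),
                           (0.1, 0.2, 0.1), (0.3, 0.1, 0.2), (0.2, 0.2, 0.3)]]
labels = [1, 1, 1, 0, 0, 0]
scores = [eval_rpn(['x1', 'x2', 'max', 'x3', '+'], e) for e in envs]
fr = fisher_ratio(scores, labels)      # high ratio -> discriminant rule
```

RPN keeps rule evaluation stack-based and parenthesis-free, which is why it simplifies the evolutionary operations described in the abstract.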

  2. Integration of Component Knowledge in Penalized-Likelihood Reconstruction with Morphological and Spectral Uncertainties.

    PubMed

    Stayman, J Webster; Tilley, Steven; Siewerdsen, Jeffrey H

    2014-01-01

    Previous investigations [1-3] have demonstrated that integrating specific knowledge of the structure and composition of components like surgical implants, devices, and tools into a model-based reconstruction framework can improve image quality and allow for potential exposure reductions in CT. Using device knowledge in practice is complicated by uncertainties in the exact shape of components and their particular material composition. Such unknowns in the morphology and attenuation properties lead to errors in the forward model that limit the utility of component integration. In this work, a methodology is presented to accommodate both uncertainties in shape as well as unknown energy-dependent attenuation properties of the surgical devices. This work leverages the so-called known-component reconstruction (KCR) framework [1] with a generalized deformable registration operator and modifications to accommodate a spectral transfer function in the component model. Moreover, since this framework decomposes the object into separate background anatomy and "known" component factors, a mixed fidelity forward model can be adopted so that measurements associated with projections through the surgical devices can be modeled with much greater accuracy. A deformable KCR (dKCR) approach using the mixed fidelity model is introduced and applied to a flexible wire component with unknown structure and composition. Image quality advantages of dKCR over traditional reconstruction methods are illustrated in cone-beam CT (CBCT) data acquired on a testbench emulating a 3D-guided needle biopsy procedure - i.e., a deformable component (needle) with strong energy-dependent attenuation characteristics (steel) within a complex soft-tissue background.

  3. Learning semantic histopathological representation for basal cell carcinoma classification

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Ricardo; Rueda, Andrea; Romero, Eduardo

    2013-03-01

Diagnosis of a histopathology glass slide is a complex process that involves accurate recognition of several structures, their function in the tissue, and their relation with other structures. The way in which the pathologist represents the image content and the relations between those objects yields better and more accurate diagnoses. Therefore, an appropriate semantic representation of the image content will be useful in several analysis tasks such as cancer classification, tissue retrieval and histopathological image analysis, among others. Nevertheless, automatically recognizing those structures and extracting their inner semantic meaning remain very challenging tasks. In this paper we introduce a new semantic representation that describes histopathological concepts suitable for classification. The approach herein identifies local concepts using a dictionary learning approach, i.e., the algorithm learns the most representative atoms from a set of randomly sampled patches, and then models the spatial relations among them by counting the co-occurrence between atoms while penalizing the spatial distance. The proposed approach was compared with a bag-of-features representation in a tissue classification task. For this purpose, 240 histological microscopical fields of view, 24 per tissue class, were collected. Those images fed a Support Vector Machine classifier per class, using 120 images as the train set and the remaining ones for testing, maintaining the same proportion of each concept in the train and test sets. The classification results, averaged over 100 random partitions of the training and test sets, show that our approach is on average almost 6% more sensitive than the bag-of-features representation.
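The distance-penalized co-occurrence step can be sketched directly: given the dictionary atom assigned to each patch and the patch centers, pairwise atom co-occurrences are accumulated with a weight that decays with spatial distance. All data below are hypothetical; the exponential decay is one plausible choice of penalty, not necessarily the paper's.

```python
import numpy as np

n_atoms = 4
atom_of_patch = np.array([0, 2, 2, 1, 0])           # atom index per patch
centers = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0],
                    [5.0, 5.0], [0.5, 0.2]])        # patch centers (hypothetical)

C = np.zeros((n_atoms, n_atoms))                     # relation descriptor
for i in range(len(atom_of_patch)):
    for j in range(i + 1, len(atom_of_patch)):
        w = np.exp(-np.linalg.norm(centers[i] - centers[j]))  # distance penalty
        a, b = atom_of_patch[i], atom_of_patch[j]
        C[a, b] += w
        C[b, a] += w                                 # keep the descriptor symmetric
```

Atom pairs that co-occur close together (here atoms 0 and 2) dominate the descriptor, while pairs separated by large distances contribute almost nothing.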

  4. Soft-tissue imaging with C-arm cone-beam CT using statistical reconstruction

    NASA Astrophysics Data System (ADS)

    Wang, Adam S.; Webster Stayman, J.; Otake, Yoshito; Kleinszig, Gerhard; Vogt, Sebastian; Gallia, Gary L.; Khanna, A. Jay; Siewerdsen, Jeffrey H.

    2014-02-01

The potential for statistical image reconstruction methods such as penalized-likelihood (PL) to improve C-arm cone-beam CT (CBCT) soft-tissue visualization for intraoperative imaging over conventional filtered backprojection (FBP) is assessed in this work by making a fair comparison in relation to soft-tissue performance. A prototype mobile C-arm was used to scan anthropomorphic head and abdomen phantoms as well as a cadaveric torso at doses substantially lower than typical values in diagnostic CT, and the effects of dose reduction via tube current reduction and sparse sampling were also compared. Matched spatial resolution between PL and FBP was determined by the edge spread function of low-contrast (~40-80 HU) spheres in the phantoms, which were representative of soft-tissue imaging tasks. PL using the non-quadratic Huber penalty was found to substantially reduce noise relative to FBP, especially at lower spatial resolution where PL provides a contrast-to-noise ratio increase up to 1.4-2.2x over FBP at 50% dose reduction across all objects. Comparison of sampling strategies indicates that soft-tissue imaging benefits from fully sampled acquisitions at dose above ~1.7 mGy and benefits from 50% sparsity at dose below ~1.0 mGy. Therefore, an appropriate sampling strategy along with the improved low-contrast visualization offered by statistical reconstruction demonstrates the potential for extending intraoperative C-arm CBCT to applications in soft-tissue interventions in neurosurgery as well as thoracic and abdominal surgeries by overcoming conventional tradeoffs in noise, spatial resolution, and dose.
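The non-quadratic Huber penalty referenced above is quadratic for small neighbor differences (smoothing noise) and linear for large ones (preserving edges). A minimal sketch of the penalty function itself, with an arbitrarily chosen transition point delta:

```python
import numpy as np

def huber(x, delta=1.0):
    """Huber penalty: 0.5*x^2 for |x| <= delta, delta*(|x| - 0.5*delta) beyond.
    The two branches meet smoothly at |x| = delta."""
    ax = np.abs(x)
    return np.where(ax <= delta, 0.5 * x ** 2, delta * (ax - 0.5 * delta))

small = huber(0.5)    # quadratic regime: 0.5 * 0.25 = 0.125
large = huber(10.0)   # linear regime: 1.0 * (10.0 - 0.5) = 9.5
```

Because the penalty grows only linearly for large differences, sharp soft-tissue boundaries are penalized far less than a quadratic penalty would, which is what preserves edges in the PL reconstruction.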

  5. Prostate cancer incidence in relation to time windows of exposure to metalworking fluids in the auto industry.

    PubMed

    Agalliu, Ilir; Kriebel, David; Quinn, Margaret M; Wegman, David H; Eisen, Ellen A

    2005-09-01

Exposure to metalworking fluids has been previously associated with prostate cancer mortality in a cohort of autoworkers. Our objective was to further explore this finding in a study of prostate cancer incidence in the same cohort, with reduced misclassification of outcome. We conducted a nested case-control study in the General Motors cohort of autoworkers. Incident cases of prostate cancer (n = 872) were identified via the Michigan Cancer Registry from 1985 through 2000. Controls were selected using incidence-density sampling with a 5:1 ratio. Using cumulative exposure (mg/m-years) as the dose metric, we first examined varying lengths of lags (0-25 years). Then, we evaluated consecutive windows of exposure: 25 or more years before risk age, and fewer than 25 years. We used penalized splines to model the relative risk as a smooth function of exposure, and adjusted for race and calendar year of diagnosis in a Cox model. Risk of prostate cancer increased with exposure to soluble and straight fluids 25 years or more before risk age but not with exposure in the last 25 years. The relationship with soluble fluids was piecewise linear, with a small increase in risk at lower exposures followed by a steeper rise. By contrast, the relationship with straight fluids was linear, with a relative risk of 1.12 per 10 mg/m-years of exposure (95% confidence interval = 1.04-1.20). Exposure to oil-based fluids, soluble and straight, is modestly associated with prostate cancer risk among autoworkers, with a latency period of at least 25 years.
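Penalized splines let the exposure-response curve bend where the data demand it, such as the piecewise-linear shape reported for soluble fluids. The sketch below is an illustrative stand-in on synthetic data: a truncated-line basis with a ridge penalty on the knot coefficients, fitted by least squares rather than the study's Cox partial likelihood.

```python
import numpy as np

def pspline_fit(x, y, n_knots=10, lam=1.0):
    """Penalized-spline smooth: line + truncated-line basis at quantile knots,
    ridge penalty on the knot coefficients only (the line part is unpenalized)."""
    knots = np.quantile(x, np.linspace(0, 1, n_knots + 2)[1:-1])
    def basis(xn):
        return np.column_stack([np.ones_like(xn), xn] +
                               [np.maximum(xn - k, 0.0) for k in knots])
    B = basis(x)
    P = np.eye(B.shape[1])
    P[0, 0] = P[1, 1] = 0.0                 # do not penalize intercept/slope
    coef = np.linalg.solve(B.T @ B + lam * P, B.T @ y)
    return lambda xn: basis(xn) @ coef

# Synthetic piecewise-linear dose-response (log relative risk vs exposure),
# echoing the "small rise then steeper rise" shape; units/values invented.
rng = np.random.default_rng(3)
exposure = rng.uniform(0, 10, 400)
log_rr_true = np.where(exposure < 4, 0.05 * exposure, 0.2 + 0.15 * (exposure - 4))
log_rr = log_rr_true + 0.05 * rng.standard_normal(400)
smooth = pspline_fit(exposure, log_rr)
```

The fitted smooth recovers both segments of the response without the analyst having to specify where the slope changes, which is the appeal of penalized splines in dose-response modeling.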

  6. [Genetic expertise and the penal process].

    PubMed

    Choclán Montalvo, J A

    1998-01-01

    The author reflects on the major forensic biology issues related to human genome analysis. He also discusses, from the comparative law perspective, the extent to which genetic test evidence is binding on judges. He concludes with a discussion of the influence of genetic research on people's fundamental rights.

  7. Literacy Training in Penal Institutions.

    ERIC Educational Resources Information Center

    Gold, Patricia Cohen

Most existing literacy training programs for inmates in America's prisons are inadequate. Before program planners and developers can remedy this situation, they must be able to obtain accurate information on the numbers of illiterate inmates and the numbers of inmates currently receiving literacy instruction in America's prisons. The…

  8. Intelligent Intelligence Testing.

    ERIC Educational Resources Information Center

    Pedrini, Bonnie; Pedrini, D. T.

    Intelligence tests should be used to help persons; they should not be used to penalize persons. Furthermore, our focus should be on treatment; it should not be on labeling. IQ testers often stigmatize young children and poor persons (children, adolescents, adults). Large groups of Black Americans, Spanish Americans, and Indian Americans are…

  9. Penal Innovation in New Zealand: He Ara Hou.

    ERIC Educational Resources Information Center

    Newbold, Greg; Eskridge, Chris

    1994-01-01

    Explores prison history/development in New Zealand, focusing on recent implementation of progressive prison operation/management program, He Ara Hou. Notes extremely positive results of program, such as higher administrative efficiency; greatly decreased levels of internal disorder; competent, stable workforce; and human product whose senses of…

  10. Functional Impairment and Hospital Readmission in Medicare Seniors

    PubMed Central

    Greysen, S. Ryan; Cenzer, Irena Stijacic; Auerbach, Andrew D.; Covinsky, Kenneth E.

    2015-01-01

Importance Medicare currently penalizes hospitals for high rates of readmission for seniors but does not account for common age-related syndromes such as functional impairment. Objectives Given the high prevalence of functional impairments in community-dwelling seniors, we assessed effects of functional impairment on Medicare hospital readmissions. Study Design, Participants, and Setting We created a nationally representative cohort of 7,854 community-dwelling seniors in the Health and Retirement Study (HRS) with 22,289 Medicare hospitalizations from 2000–2010. Main Outcome and Measurements Outcome was 30-day readmission, assessed by Medicare claims. Main predictor was functional impairment determined from the HRS interview preceding hospitalization, stratified into 5 levels: no functional impairments, difficulty with ≥1 instrumental activity of daily living (IADL), difficulty with ≥1 activity of daily living (ADL), dependency (need for help) in 1–2 ADLs, and dependency in ≥3 ADLs. Adjustment variables included age, race, gender, income, net worth, comorbid conditions (Elixhauser score from Medicare claims), and prior admission. We performed multivariable logistic regression adjusted for clustering at the patient level to characterize the association of functional impairment and readmission. Results Mean age was 79 (±8; range 65–105); 58% were female, 85% White, 90% reported ≥3 comorbidities, and 86% had ≥1 hospitalization in the previous year. Overall, 48% had some level of functional impairment prior to admission and 15% experienced a 30-day readmission. We found a progressive increase in adjusted risk of readmission as the degree of functional impairment increased: 13.5% with no functional impairment, 14.3% with ≥1 IADL difficulty (OR 1.06; 95% CI 0.94–1.20), 14.4% with ≥1 ADL difficulty (OR 1.08; 0.96–1.21), 16.5% with dependency in 1–2 ADLs (OR 1.26; 1.11–1.44), and 18.2% with dependency in ≥3 ADLs (OR 1.42; 1.20–1.69). 
Sub-analysis restricted to patients admitted with conditions targeted by Medicare (heart failure, myocardial infarction, and pneumonia) revealed a parallel trend with larger effects for the most-impaired (16.9% readmission rate for no impairment vs. 25.7% for dependency in ≥3 ADLs, OR 1.70; 1.04–2.78). Conclusions Functional impairment is associated with increased risk of 30-day, all-cause hospital readmission in Medicare seniors, especially those admitted for heart failure, myocardial infarction or pneumonia. Functional impairment on admission may be an overlooked but highly suitable target for interventions to reduce Medicare hospital readmissions. Relevance Functional impairment may be an important but under-addressed factor in preventing readmissions for Medicare seniors. PMID:25642907

  11. An optimization-based approach for high-order accurate discretization of conservation laws with discontinuous solutions

    NASA Astrophysics Data System (ADS)

    Zahr, M. J.; Persson, P.-O.

    2018-07-01

This work introduces a novel discontinuity-tracking framework for resolving discontinuous solutions of conservation laws with high-order numerical discretizations that support inter-element solution discontinuities, such as discontinuous Galerkin or finite volume methods. The proposed method aims to align inter-element boundaries with discontinuities in the solution by deforming the computational mesh. A discontinuity-aligned mesh ensures the discontinuity is represented through inter-element jumps, while the smooth basis functions interior to elements approximate only smooth regions of the solution, thereby avoiding the Gibbs phenomena that create well-known stability issues. Therefore, very coarse high-order discretizations accurately resolve the piecewise smooth solution throughout the domain, provided the discontinuity is tracked. Central to the proposed discontinuity-tracking framework is a discrete PDE-constrained optimization formulation that simultaneously aligns the computational mesh with discontinuities in the solution and solves the discretized conservation law on this mesh. The optimization objective is taken as a combination of the deviation of the finite-dimensional solution from its element-wise average and a mesh distortion metric, simultaneously penalizing Gibbs phenomena and distorted meshes. It is shown that this objective function satisfies two critical properties required for the discontinuity-tracking framework to be practical: (1) it possesses a local minimum at a discontinuity-aligned mesh, and (2) it decreases monotonically to this minimum in a neighborhood of radius approximately h/2, whereas other popular discontinuity indicators fail to satisfy the latter. 
Another important contribution of this work is the observation that traditional reduced-space PDE-constrained optimization solvers that repeatedly solve the conservation law at various mesh configurations are not viable in this context, since severe overshoot and undershoot in the solution, i.e., Gibbs phenomena, may make it impossible to solve the discrete conservation law on non-aligned meshes. Therefore, we advocate a gradient-based, full-space solver where the mesh and conservation law solution converge to their optimal values simultaneously and therefore never require the solution of the discrete conservation law on a non-aligned mesh. The merit of the proposed method is demonstrated on a number of one- and two-dimensional model problems including the L2 projection of discontinuous functions, Burgers' equation with a discontinuous source term, transonic flow through a nozzle, and supersonic flow around a bluff body. We demonstrate optimal O(h^(p+1)) convergence rates in the L1 norm for up to polynomial order p = 6 and show that accurate solutions can be obtained on extremely coarse meshes.
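The first term of the optimization objective can be sketched directly: on a 1D mesh, the deviation of the solution from its element-wise average is near zero when every element sees only a smooth piece of the solution and large when an element straddles the discontinuity. The code below is an illustrative indicator plus a simple mesh-distortion proxy, not the authors' solver; all names are assumptions:

```python
import numpy as np

def avg_deviation(u, elem_ids):
    """Sum over elements of the squared deviation of u from its element-wise mean."""
    total = 0.0
    for e in np.unique(elem_ids):
        ue = u[elem_ids == e]
        total += np.sum((ue - ue.mean()) ** 2)
    return total

def tracking_objective(u, elem_ids, nodes, nodes_ref, kappa=0.1):
    """Combined objective: oscillation indicator + quadratic mesh-distortion proxy."""
    return avg_deviation(u, elem_ids) + kappa * np.sum((nodes - nodes_ref) ** 2)
```

Minimizing such an objective over node positions drives element boundaries toward the discontinuity (where the first term vanishes) while the second term discourages collapsing or badly distorted elements.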

  12. [Abortion and rights. Legal thinking about abortion].

    PubMed

    Perez Duarte, A E

    1991-01-01

Analysis of abortion in Mexico from a juridical perspective requires recognition that Mexico as a national community participates in a double system of values. Politically it is defined as a liberal, democratic, and secular state, but culturally the Judeo-Christian ideology is dominant in all social strata. This duality complicates all juridical-penal decisions regarding abortion. Public opinion on abortion is influenced on the one hand by extremely conservative groups who condemn abortion as homicide, and on the other hand by groups who demand legislative reform in congruence with the characteristics that define the state: an attitude of tolerance toward the different ideological-moral positions that coexist in the country. The discussion concerns the rights of women to voluntary maternity, to protection of health, and to make their own decisions regarding their bodies vs. the right of the fetus to life. This type of analysis is not objective, and conclusions depend on the ideology of the analyst. Other elements must be examined for an objective consideration of the social problem of abortion. For example, aspects related to maternal morbidity and mortality and the demographic, economic, and physical and mental health of the population would all seem to support the democratic juridical doctrine that sees the clandestine nature of abortion as the principal problem. It is also observed that the illegality of abortion does not guarantee its elimination. Desperate women will seek abortion under any circumstances. The illegality of abortion also impedes health and educational policies that would lower abortion mortality. There are various problems from a strictly juridical perspective. A correct definition of the term abortion is needed that would coincide with the medical definition. The discussion must be clearly centered on the protected juridical right and on the definition of women's reproductive and health rights and rights to their own bodies. 
The experiences of other countries with decriminalization of abortion should also be assessed. Factors considered should include the true impunity of abortion, the public health and socioeconomic problems generated by the state through criminalization of abortion, and the psychological and economic implications for women of the criminal status of abortion. Systems of decriminalization should be examined to decide which would be appropriate for Mexico. These systems include authorizing complete freedom of choice for the first trimester and permitting abortion only for specific indications. All penal codes in Mexico now use the system of abortion for specific indications. Few cases are accepted for legal pregnancy termination.

  13. Efficient robust doubly adaptive regularized regression with applications.

    PubMed

    Karunamuni, Rohana J; Kong, Linglong; Tu, Wei

    2018-01-01

    We consider the problem of estimation and variable selection for general linear regression models. Regularized regression procedures have been widely used for variable selection, but most existing methods perform poorly in the presence of outliers. We construct a new penalized procedure that simultaneously attains full efficiency and maximum robustness. Furthermore, the proposed procedure satisfies the oracle properties. The new procedure is designed to achieve sparse and robust solutions by imposing adaptive weights on both the decision loss and the penalty function. The proposed method of estimation and variable selection attains full efficiency when the model is correct and, at the same time, achieves maximum robustness when outliers are present. We examine the robustness properties using the finite-sample breakdown point and an influence function. We show that the proposed estimator attains the maximum breakdown point. Furthermore, there is no loss in efficiency when there are no outliers or the error distribution is normal. For practical implementation of the proposed method, we present a computational algorithm. We examine the finite-sample and robustness properties using Monte Carlo studies. Two datasets are also analyzed.
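The idea of imposing adaptive weights on both the loss and the penalty can be sketched with a Huber loss (bounded influence of outlier residuals) and an adaptive L1 penalty weighted by a pilot fit, solved by proximal gradient descent. This is an illustrative sketch of the general technique, not the authors' estimator, and all names and tuning values are assumptions:

```python
import numpy as np

def soft(z, t):
    """Soft-thresholding: proximal operator of the (weighted) L1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def robust_adaptive_lasso(X, y, lam=5.0, delta=1.345, iters=2000):
    """Huber loss + adaptive L1 penalty via ISTA (proximal gradient)."""
    n, p = X.shape
    pilot = np.linalg.lstsq(X, y, rcond=None)[0]
    w = 1.0 / (np.abs(pilot) + 1e-3)          # adaptive weights: large coefs penalized less
    step = 1.0 / np.linalg.norm(X, 2) ** 2    # safe step for the quadratic region
    b = np.zeros(p)
    for _ in range(iters):
        r = X @ b - y
        psi = np.clip(r, -delta, delta)       # Huber influence: caps outlier residuals
        b = soft(b - step * (X.T @ psi), step * lam * w)
    return b
```

Clipping the residuals bounds each observation's pull on the fit (robustness), while the pilot-based weights shrink truly-zero coefficients harder than large ones, which is the mechanism behind oracle-type variable selection.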

  14. A practical globalization of one-shot optimization for optimal design of tokamak divertors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blommaert, Maarten, E-mail: maarten.blommaert@kuleuven.be; Dekeyser, Wouter; Baelmans, Martine

In past studies, nested optimization methods were successfully applied to the design of the magnetic divertor configuration in nuclear fusion reactors. In this paper, so-called one-shot optimization methods are pursued. Due to convergence issues, a globalization strategy for the one-shot solver is sought. Whereas Griewank introduced a globalization strategy using a doubly augmented Lagrangian function that includes primal and adjoint residuals, its practical usability is limited by the necessity of second-order derivatives and expensive line-search iterations. In this paper, a practical alternative is offered that avoids these drawbacks by using a regular augmented Lagrangian merit function that penalizes only state residuals. Additionally, robust rank-two Hessian estimation is achieved by adaptation of Powell's damped BFGS update rule. The application of the novel one-shot approach to magnetic divertor design is considered in detail. For this purpose, the approach is adapted to be compatible with practical in-parts adjoint sensitivities. Using the globalization strategy, stable convergence of the one-shot approach is achieved.
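Powell's damped BFGS rule mentioned above modifies the curvature pair before the standard rank-two update so the Hessian estimate stays positive definite even when the raw curvature condition s·y > 0 fails. A standalone sketch of the update (the standard textbook form, not code from the paper):

```python
import numpy as np

def damped_bfgs_update(H, s, yv):
    """Powell-damped BFGS update of a Hessian approximation H.

    s  = step (x_new - x_old), yv = gradient difference (g_new - g_old).
    Damping interpolates yv toward H@s so that s . r >= 0.2 * s . H . s > 0,
    which keeps the updated matrix positive definite.
    """
    sHs = s @ H @ s
    sy = s @ yv
    theta = 1.0 if sy >= 0.2 * sHs else (0.8 * sHs) / (sHs - sy)
    r = theta * yv + (1.0 - theta) * (H @ s)   # damped gradient difference
    Hs = H @ s
    return H - np.outer(Hs, Hs) / sHs + np.outer(r, r) / (s @ r)
```

When `sy` is already safely positive, `theta = 1` and this reduces to the plain BFGS update; the damping only activates on steps with negative or tiny curvature.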

  15. [Amblyopia].

    PubMed

    Orssaud, C

    2014-06-01

    Amblyopia is a developmental disorder of the entire visual system, including the extra-striate cortex. It manifests mainly by impaired visual acuity in the amblyopic eye. However, other abnormalities of visual function can be observed, such as decreased contrast sensitivity and stereoscopic vision, and some abnormalities can be found in the "good" eye. Amblyopia occurs during the critical period of brain development. It may be due to organic pathology of the visual pathways, visual deprivation or functional abnormalities, mainly anisometropia or strabismus. The diagnosis of amblyopia must be confirmed prior to treatment. Confirmation is based on cycloplegic refraction, visual acuity measurement and orthoptic assessment. However, screening for amblyopia and associated risk factors permits earlier diagnosis and treatment. The younger the child, the more effective the treatment, and it can only be achieved during the critical period. It requires parental cooperation in order to be effective and is based on occlusion or penalization of the healthy eye. The amblyopic eye may then develop better vision. Maintenance therapy must be performed until the end of the critical period to avoid recurrence. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  16. Correction of patient motion in cone-beam CT using 3D-2D registration

    NASA Astrophysics Data System (ADS)

    Ouadah, S.; Jacobson, M.; Stayman, J. W.; Ehtiati, T.; Weiss, C.; Siewerdsen, J. H.

    2017-12-01

    Cone-beam CT (CBCT) is increasingly common in guidance of interventional procedures, but can be subject to artifacts arising from patient motion during fairly long (~5-60 s) scan times. We present a fiducial-free method to mitigate motion artifacts using 3D-2D image registration that simultaneously corrects residual errors in the intrinsic and extrinsic parameters of geometric calibration. The 3D-2D registration process registers each projection to a prior 3D image by maximizing gradient orientation using the covariance matrix adaptation-evolution strategy optimizer. The resulting rigid transforms are applied to the system projection matrices, and a 3D image is reconstructed via model-based iterative reconstruction. Phantom experiments were conducted using a Zeego robotic C-arm to image a head phantom undergoing 5-15 cm translations and 5-15° rotations. To further test the algorithm, clinical images were acquired with a CBCT head scanner in which long scan times were susceptible to significant patient motion. CBCT images were reconstructed using a penalized likelihood objective function. For phantom studies the structural similarity (SSIM) between motion-free and motion-corrected images was  >0.995, with significant improvement (p  <  0.001) compared to the SSIM values of uncorrected images. Additionally, motion-corrected images exhibited a point-spread function with full-width at half maximum comparable to that of the motion-free reference image. Qualitative comparison of the motion-corrupted and motion-corrected clinical images demonstrated a significant improvement in image quality after motion correction. This indicates that the 3D-2D registration method could provide a useful approach to motion artifact correction under assumptions of local rigidity, as in the head, pelvis, and extremities. The method is highly parallelizable, and the automatic correction of residual geometric calibration errors provides added benefit that could be valuable in routine use.
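The geometric core of this correction, folding each estimated rigid transform into the corresponding 3x4 system projection matrix, is compact. An illustrative sketch follows; the rotation convention, function names, and parameterization are assumptions, not taken from the paper:

```python
import numpy as np

def rigid_transform(rx, ry, rz, t):
    """4x4 homogeneous rigid transform: Rz @ Ry @ Rx rotation plus translation t."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = t
    return T

def correct_projection(P, T):
    """Apply a per-view motion estimate to a 3x4 projection matrix."""
    return P @ T
```

By associativity, projecting a point with the corrected matrix `P @ T` is identical to first moving the point by `T` and projecting with the original `P`, which is why a rigid motion estimate can be absorbed entirely into the calibration geometry.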

  17. Motion Compensation in Extremity Cone-Beam CT Using a Penalized Image Sharpness Criterion

    PubMed Central

    Sisniega, A.; Stayman, J. W.; Yorkston, J.; Siewerdsen, J. H.; Zbijewski, W.

    2017-01-01

Cone-beam CT (CBCT) for musculoskeletal imaging would benefit from a method to reduce the effects of involuntary patient motion. In particular, the continuing improvement in spatial resolution of CBCT may enable tasks such as quantitative assessment of bone microarchitecture (0.1 mm – 0.2 mm detail size), where even subtle, sub-mm motion blur might be detrimental. We propose a purely image-based motion compensation method that requires no fiducials, tracking hardware, or prior images. A statistical optimization algorithm (CMA-ES) is used to estimate a motion trajectory that optimizes an objective function consisting of an image sharpness criterion augmented by a regularization term that encourages smooth motion trajectories. The objective function is evaluated using a volume of interest (VOI, e.g. a single bone and surrounding area) where the motion can be assumed to be rigid. More complex motions can be addressed by using multiple VOIs. Gradient variance was found to be a suitable sharpness metric for this application. The performance of the compensation algorithm was evaluated in simulated and experimental CBCT data, and in a clinical dataset. Motion-induced artifacts and blurring were significantly reduced across a broad range of motion amplitudes, from 0.5 mm to 10 mm. The Structural Similarity Index (SSIM) against a static volume was used in the simulation studies to quantify the performance of the motion compensation. In studies with translational motion, the SSIM improved from 0.86 before compensation to 0.97 after compensation for 0.5 mm motion, from 0.8 to 0.94 for 2 mm motion, and from 0.52 to 0.87 for 10 mm motion (~70% increase). Similar reduction of artifacts was observed in a benchtop experiment with controlled translational motion of an anthropomorphic hand phantom, where SSIM (against a reconstruction of a static phantom) improved from 0.3 to 0.8 for 10 mm motion. 
Application to a clinical dataset of a lower extremity showed dramatic reduction of streaks and improvement in delineation of tissue boundaries and trabecular structures throughout the whole volume. The proposed method will support new applications of extremity CBCT in areas where patient motion may not be sufficiently managed by immobilization, such as imaging under load and quantitative assessment of subchondral bone architecture. PMID:28327471
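The sharpness criterion adopted here, gradient variance, is essentially one line of array code: a blurred image has smaller, more uniform gradients and therefore a lower score. A minimal sketch (the paper's VOI handling and motion-smoothness regularization are omitted):

```python
import numpy as np

def gradient_variance(img):
    """Sharpness metric: variance of the gradient magnitude (higher = sharper)."""
    gx, gy = np.gradient(img.astype(float))
    return np.sqrt(gx**2 + gy**2).var()
```

Inside the compensation loop, a candidate motion trajectory is scored by reconstructing (or re-sampling) the VOI under that trajectory and evaluating this metric; CMA-ES then searches for the trajectory that maximizes it.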

  18. The Iran Sanctions Act (ISA)

    DTIC Science & Technology

    2008-08-26

    its partners, Gazprom of Russia and Petronas of Malaysia to develop phases 2 and 3 of the 25-phase South Pars gas field. The EU pledged to increase...would not be penalized. Total and Petronas subsequently negotiated to develop a liquified natural gas (LNG) export capability at Phase 11 of South

  19. The Iran Sanctions Act (ISA)

    DTIC Science & Technology

    2008-07-23

    France and its partners, Gazprom of Russia and Petronas of Malaysia to develop phases 2 and 3 of the 25-phase South Pars gas field. The EU pledged...EU firms in Iran would not be penalized. Total and Petronas subsequently negotiated to develop a liquified natural gas (LNG) export capability at

  20. The Many Sides of Academic Dishonesty Sanctions

    ERIC Educational Resources Information Center

    Beasley, Eric Matthew

    2012-01-01

    In the fall of 2009, Michigan State University (MSU) implemented a new policy regarding reports of undergraduate academic dishonesty. Under the new system, instructors are required to submit an academic dishonesty report for any student that they penalize for violations of academic integrity, and these students are placed into a remediation class…

  1. 75 FR 4635 - Risk-Based Capital Guidelines; Capital Adequacy Guidelines; Capital Maintenance: Regulatory...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-28

    ... phase-in would unfairly penalize banking organizations given their already established businesses..., will aid banking organizations with capital planning as they implement FAS 166 and FAS 167 and adjust... assets a banking organization consolidates as a result of changes to U.S. generally accepted accounting...

  2. Military Policy toward Homosexuals: Scientific, Historic, and Legal Perspectives

    DTIC Science & Technology

    1990-04-01

    International Trends During the 1950s, the American Law Institute recommended that states adopt a Model Penal Code that decriminalized all non-violent...personalities. For much of our history, the military’s fear of racial tension kept black soldiers segregated from whites. Fear of sexual tensions, until

  3. Chronicle of Higher Education. Volume 50, Number 18, January 9, 2004

    ERIC Educational Resources Information Center

    Chronicle of Higher Education, 2004

    2004-01-01

    "Chronicle of Higher Education" presents an abundant source of news and information for college and university faculty members and administrators. This January 9, 2004 issue of "Chronicle of Higher Education" includes the following articles: (1) "Regional Accreditors Penalize 13 Institutions in New England and the…

  4. Teaching Writing in Graduate School

    ERIC Educational Resources Information Center

    Sallee, Margaret; Hallett, Ronald; Tierney, William

    2011-01-01

    Graduate students are typically expected to know how to write. Those who write poorly are occasionally penalized, but little in-class attention is given to help students continue to develop and refine their writing skills. More often than not, writing courses at the graduate level are remedial programs designed for international students and…

  5. 20 CFR 404.1022 - American Samoa, Guam, or the Commonwealth of the Northern Mariana Islands.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the Northern Mariana Islands. 404.1022 Section 404.1022 Employees' Benefits SOCIAL SECURITY... patient or inmate of the hospital or penal institution. (d) Medicare qualified government employment. If your work is not covered under Social Security, it may be covered as Medicare qualified government...

  6. Federal Law Enforcement in Bi-National Perspective: The United States FBI and the Mexican PFM

    DTIC Science & Technology

    2014-09-01

    de Ciencias Penales INCLE International Narcotics Control and Law Enforcement IT information technology LISSSTE Ley del Instituto de Seguridad y...Instituto Nacional de Ciencias Penales—INACIPE).195 However, if the video on Youtube.com is an indication of the seriousness with which ministerial

  7. 33 CFR 138.80 - Financial responsibility, how established.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... issuance of Federal bonds in the maximum penal sum of each bond to be issued under this subpart. (3) Self... which it applies. (ii) Semiannual self-insurance submissions. When the self-insuring applicant's or.... (iii) Additional self-insurance submissions. A self-insuring applicant or certificant— (A) Must, upon...

  8. 33 CFR 138.80 - Financial responsibility, how established.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... issuance of Federal bonds in the maximum penal sum of each bond to be issued under this subpart. (3) Self... which it applies. (ii) Semiannual self-insurance submissions. When the self-insuring applicant's or.... (iii) Additional self-insurance submissions. A self-insuring applicant or certificant— (A) Must, upon...

  9. 24 CFR 983.53 - Prohibition of assistance for ineligible units.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... grounds of a penal, reformatory, medical, mental, or similar public or private institution; (3) Nursing... facility that provides home health care services such as nursing and therapy for residents of the housing... designated for occupancy by students of the institution; (5) Manufactured homes; (6) Cooperative housing; and...

  10. 10 CFR 820.40 - Purpose and scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Purpose and scope. 820.40 Section 820.40 Energy DEPARTMENT OF ENERGY PROCEDURAL RULES FOR DOE NUCLEAR ACTIVITIES Compliance Orders § 820.40 Purpose and scope. This subpart provides for the issuance of Compliance Orders to prevent, rectify or penalize violations...

  11. 10 CFR 820.40 - Purpose and scope.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Purpose and scope. 820.40 Section 820.40 Energy DEPARTMENT OF ENERGY PROCEDURAL RULES FOR DOE NUCLEAR ACTIVITIES Compliance Orders § 820.40 Purpose and scope. This subpart provides for the issuance of Compliance Orders to prevent, rectify or penalize violations...

  12. Alternative Funding Options for Post-Secondary Correctional Education (Part Two)

    ERIC Educational Resources Information Center

    Taylor, Jon Marc

    2005-01-01

Post-Secondary Correctional Education (PSCE) programs have been offered in United States penal facilities for half a century. The primary determinant of these program opportunities has been funding availability. With the exclusion of prisoner-students from participating in the Pell Grant financial aid program, approximately half of the existing…

  13. Florida Community College Finance: Update. Report 2.

    ERIC Educational Resources Information Center

    Florida State Postsecondary Education Planning Commission, Tallahassee.

    In Florida, community college enrollments have increased by 41% over the past 5 years. The use of a 3-year average to fund enrollment has penalized those colleges which have grown dramatically. This report analyzes the range of community college expenditures by instructional program; the relationship of instructional expenditures to support…

  14. 27 CFR 26.69 - Strengthening bonds.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Strengthening bonds. 26.69... Liquors and Articles in Puerto Rico Bonds § 26.69 Strengthening bonds. In all cases where the penal sum of any bond becomes insufficient, the principal shall either give a strengthening bond with the same...

  15. 27 CFR 26.69 - Strengthening bonds.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2011-04-01 2011-04-01 false Strengthening bonds. 26.69... Liquors and Articles in Puerto Rico Bonds § 26.69 Strengthening bonds. In all cases where the penal sum of any bond becomes insufficient, the principal shall either give a strengthening bond with the same...

  16. 27 CFR 26.69 - Strengthening bonds.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2014-04-01 2014-04-01 false Strengthening bonds. 26.69... Liquors and Articles in Puerto Rico Bonds § 26.69 Strengthening bonds. In all cases where the penal sum of any bond becomes insufficient, the principal shall either give a strengthening bond with the same...

  17. 27 CFR 26.69 - Strengthening bonds.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2013-04-01 2013-04-01 false Strengthening bonds. 26.69... Liquors and Articles in Puerto Rico Bonds § 26.69 Strengthening bonds. In all cases where the penal sum of any bond becomes insufficient, the principal shall either give a strengthening bond with the same...

  18. Escaping Devil's Island: Confronting Racism, Learning History

    ERIC Educational Resources Information Center

    Grant, Carl A.

    2011-01-01

    This article argues that African Americans, especially males living in urban areas, are physically and mentally trapped on a Devil's Island. The penal colony on the coast of French Guiana is a metaphor for the boundaries and constraints that close off opportunities and constrain African American historical knowledge. The article argues that…

  19. Avoiding Degeneracy in Multidimensional Unfolding by Penalizing on the Coefficient of Variation

    ERIC Educational Resources Information Center

    Busing, Frank M. T. A.; Groenen, Patrick J. K.; Heiser, Willem J.

    2005-01-01

    Multidimensional unfolding methods suffer from the degeneracy problem in almost all circumstances. Most degeneracies are easily recognized: the solutions are perfect but trivial, characterized by approximately equal distances between points from different sets. A definition of an absolutely degenerate solution is proposed, which makes clear that…

  20. 27 CFR 25.98 - Surety or security.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... coverage. Bonds required by this part will be given with corporate surety or collateral security. (b... limitations set forth for corporate security by the Secretary which are set forth in the current revision of... penal sum of the bond. (e) Deposit of collateral securities in lieu of corporate surety. Bonds or notes...

  1. 27 CFR 25.98 - Surety or security.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... coverage. Bonds required by this part will be given with corporate surety or collateral security. (b... limitations set forth for corporate security by the Secretary which are set forth in the current revision of... penal sum of the bond. (e) Deposit of collateral securities in lieu of corporate surety. Bonds or notes...

  2. 27 CFR 25.98 - Surety or security.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... coverage. Bonds required by this part will be given with corporate surety or collateral security. (b... limitations set forth for corporate security by the Secretary which are set forth in the current revision of... penal sum of the bond. (e) Deposit of collateral securities in lieu of corporate surety. Bonds or notes...

  3. 27 CFR 25.98 - Surety or security.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... coverage. Bonds required by this part will be given with corporate surety or collateral security. (b... limitations set forth for corporate security by the Secretary which are set forth in the current revision of... penal sum of the bond. (e) Deposit of collateral securities in lieu of corporate surety. Bonds or notes...

  4. 27 CFR 25.98 - Surety or security.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... coverage. Bonds required by this part will be given with corporate surety or collateral security. (b... limitations set forth for corporate security by the Secretary which are set forth in the current revision of... penal sum of the bond. (e) Deposit of collateral securities in lieu of corporate surety. Bonds or notes...

  5. Porn in Prison: How Does It Get in? Who Receives It?

    ERIC Educational Resources Information Center

    Tewksbury, Richard; DeMichele, Matthew

    2005-01-01

    Prison administrators are faced with the arduous task of maintaining order in an environment that is often characterized as chaotic. This task is made increasingly more difficult as administrators must observe individual rights, operate within rapidly diminishing budgets, and satisfy shifting philosophical penal goals--oscillating between…

  6. 32 CFR 763.5 - Entry procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... GOVERNING PUBLIC ACCESS Entry Regulations for Kaho'olawe Island, Hawaii § 763.5 Entry procedures. (a) It is... Harbor, Hawaii 96860, at least 15 days prior to the access requested, providing therein confirmed access... proscribed by either Federal law or the State of Hawaii Penal Code, as incorporated under the Federal...

  7. Create a College Access Contract

    ERIC Educational Resources Information Center

    Dannenberg, Michael

    2007-01-01

    America's financial aid system provides too much taxpayer support to banks making college loans, demands too little of students assuming them, and burdens families with too much debt. The system fails to reward rigorous college-preparatory work in high school and penalizes students who hold jobs while in college. Lenders make extraordinary…

  8. Space station control moment gyro control

    NASA Technical Reports Server (NTRS)

    Bordano, Aldo

    1987-01-01

    The potential large center-of-pressure to center-of-gravity offset of the space station makes the short term, within an orbit, variations in density of primary importance. The large range of uncertainty in the prediction of solar activity will penalize the design, developments, and operation of the space station.

  9. 45 CFR 261.13 - May an individual be penalized for not following an individual responsibility plan?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN AND FAMILIES, DEPARTMENT OF HEALTH AND HUMAN SERVICES ENSURING THAT RECIPIENTS WORK What Are the Provisions... responsibility plan? Yes. If an individual fails without good cause to comply with an individual responsibility...

  10. [Harassment in the public sector].

    PubMed

    Puech, Paloma; Pitcho, Benjamin

    2013-01-01

    The French Labour Code, which provides full protection against moral and sexual harassment, is not applicable to public sector workers. The public hospital is however not exempt from such behaviour, which could go unpunished. Public sector workers are therefore protected by the French General Civil Service Regulations and the penal code.

  11. Our Brother's Keeper: The Indian in White America.

    ERIC Educational Resources Information Center

    Cahn, Edgar S., Ed.; Hearne, David W., Ed.

The text describes the American Indian's frustrations with his closed world, which thwarts and penalizes individual and tribal self-realization, which rewards and perpetuates dependency, and which demands alienation from one's heritage as a price for survival. American society is described as arrogant and as attempting to insure that by systematic…

  12. Brief Report: No Increase in Criminal Convictions in Hans Asperger's Original Cohort

    ERIC Educational Resources Information Center

    Hippler, Kathrin; Viding, Essi; Klicpera, Christian; Happe, Francesca

    2010-01-01

    Hans Asperger originally used the term "autistic psychopathy" to describe his patients on the autism spectrum, leading to a possible confusion with psychopathic disorder and delinquent behaviour. We conducted a penal register search for 177 former patients of Asperger's clinic with a childhood diagnosis of "autistic…

  13. Optical Peaking Enhancement in High-Speed Ring Modulators

    PubMed Central

    Müller, J.; Merget, F.; Azadeh, S. Sharif; Hauck, J.; García, S. Romero; Shen, B.; Witzens, J.

    2014-01-01

    Ring resonator modulators (RRM) combine extreme compactness, low power consumption and wavelength division multiplexing functionality, making them a frontrunner for addressing the scalability requirements of short distance optical links. To extend data rates beyond the classically assumed bandwidth capability, we derive and experimentally verify closed form equations of the electro-optic response and asymmetric side band generation resulting from inherent transient time dynamics and leverage these to significantly improve device performance. An equivalent circuit description with a commonly used peaking amplifier model allows straightforward assessment of the effect on existing communication system architectures. A small signal analytical expression of peaking in the electro-optic response of RRMs is derived and used to extend the electro-optic bandwidth of the device above 40 GHz as well as to open eye diagrams penalized by intersymbol interference at 32, 40 and 44 Gbps. Predicted peaking and asymmetric side band generation are in excellent agreement with experiments. PMID:25209255
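The "commonly used peaking amplifier model" mentioned in this abstract can be illustrated with a generic one-zero, one-pole transfer function, H(f) = (1 + jf/f_z)/(1 + jf/f_p): when the zero frequency lies below the pole frequency, the magnitude response rises above unity before rolling off, which is how peaking extends usable bandwidth. This is a minimal sketch of the general concept, not the paper's actual equivalent-circuit model, and the frequencies used are placeholders:

```python
import numpy as np

def peaking_response(f, f_zero, f_pole):
    """Magnitude of a one-zero, one-pole peaking model
    H(f) = (1 + j f/f_zero) / (1 + j f/f_pole).
    With f_zero < f_pole, |H| > 1 over a band of frequencies (peaking)."""
    return np.abs((1 + 1j * f / f_zero) / (1 + 1j * f / f_pole))

# At DC the response is flat (|H| = 1); between the zero and the pole it peaks.
print(peaking_response(0.0, 10.0, 40.0))   # unity at DC
print(peaking_response(20.0, 10.0, 40.0))  # peaked above unity
```
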

  14. Numerical and experimental validation of a particle Galerkin method for metal grinding simulation

    NASA Astrophysics Data System (ADS)

    Wu, C. T.; Bui, Tinh Quoc; Wu, Youcai; Luo, Tzui-Liang; Wang, Morris; Liao, Chien-Chih; Chen, Pei-Yin; Lai, Yu-Sheng

    2018-03-01

In this paper, a numerical approach with an experimental validation is introduced for modelling high-speed metal grinding processes in 6061-T6 aluminum alloys. The derivation of the present numerical method starts with an establishment of a stabilized particle Galerkin approximation. A non-residual penalty term from strain smoothing is introduced as a means of stabilizing the particle Galerkin method. Additionally, second-order strain gradients are introduced to the penalized functional for the regularization of the damage-induced strain localization problem. To handle the severe deformation in metal grinding simulation, an adaptive anisotropic Lagrangian kernel is employed. Finally, the formulation incorporates a bond-based failure criterion to bypass prospective spurious damage growth in material failure and cutting-debris simulation. A three-dimensional metal grinding problem is analyzed and compared with the experimental results to demonstrate the effectiveness and accuracy of the proposed numerical approach.

  15. Consistent model identification of varying coefficient quantile regression with BIC tuning parameter selection

    PubMed Central

    Zheng, Qi; Peng, Limin

    2016-01-01

    Quantile regression provides a flexible platform for evaluating covariate effects on different segments of the conditional distribution of response. As the effects of covariates may change with quantile level, contemporaneously examining a spectrum of quantiles is expected to have a better capacity to identify variables with either partial or full effects on the response distribution, as compared to focusing on a single quantile. Under this motivation, we study a general adaptively weighted LASSO penalization strategy in the quantile regression setting, where a continuum of quantile index is considered and coefficients are allowed to vary with quantile index. We establish the oracle properties of the resulting estimator of coefficient function. Furthermore, we formally investigate a BIC-type uniform tuning parameter selector and show that it can ensure consistent model selection. Our numerical studies confirm the theoretical findings and illustrate an application of the new variable selection procedure. PMID:28008212
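The adaptively weighted LASSO penalization described above combines the quantile (pinball) check loss with an L1 penalty whose per-coefficient weights come from an initial unpenalized fit, so coefficients that start small are shrunk harder. The sketch below illustrates the objective only; the function names, the weight formula, and the epsilon guard are illustrative choices, not the paper's implementation:

```python
import numpy as np

def pinball_loss(y, pred, tau):
    """Quantile (check) loss at level tau:
    tau * r for positive residuals r, (tau - 1) * r for negative ones."""
    r = y - pred
    return float(np.mean(np.maximum(tau * r, (tau - 1) * r)))

def adaptive_weights(beta_init, eps=1e-6):
    """Adaptive LASSO weights: inversely proportional to an initial
    (unpenalized) estimate, so near-zero coefficients are penalized more."""
    return 1.0 / (np.abs(beta_init) + eps)

def penalized_objective(y, X, beta, tau, lam, weights):
    """Weighted-LASSO-penalized quantile regression objective."""
    return pinball_loss(y, X @ beta, tau) + lam * float(np.sum(weights * np.abs(beta)))
```

For the median (tau = 0.5) the check loss reduces to half the mean absolute error, which gives a quick sanity check on the implementation.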

  16. Optimizing energy functions for protein-protein interface design.

    PubMed

    Sharabi, Oz; Yanover, Chen; Dekel, Ayelet; Shifman, Julia M

    2011-01-15

Protein design methods were originally developed for the design of monomeric proteins. When applied to the more challenging task of protein–protein complex design, these methods yield suboptimal results. In particular, they often fail to recapitulate favorable hydrogen bonds and electrostatic interactions across the interface. In this work, we aim to improve the energy function of the protein design program ORBIT to better account for binding interactions between proteins. By using the advanced machine learning framework of conditional random fields, we optimize the relative importance of all the terms in the energy function, attempting to reproduce the native side-chain conformations in protein–protein interfaces. We evaluate the performance of several optimized energy functions, each of which describes the van der Waals interactions using a different potential. In comparison with the original energy function, our best energy function (a) incorporates a much “softer” repulsive van der Waals potential, suitable for the discrete rotameric representation of amino acid side chains; (b) does not penalize burial of polar atoms, reflecting the frequent occurrence of polar buried residues in protein–protein interfaces; and (c) significantly up-weights the electrostatic term, attesting to the high importance of these interactions for protein–protein complex formation. Using this energy function considerably improves side chain placement accuracy for interface residues in a large test set of protein–protein complexes. Moreover, the optimized energy function recovers the native sequences of protein–protein interfaces at a higher rate than the default function and performs substantially better in predicting changes in free energy of binding due to mutations.
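The two central ideas in this abstract, a "softened" repulsive van der Waals term and a weighted sum of energy terms with learned weights, can be sketched generically. This is an illustration of the concepts only, not ORBIT's actual potential; the capping scheme, parameter values, and function names are placeholders:

```python
def lj_energy(r, sigma=3.5, eps=0.1):
    """Standard 12-6 Lennard-Jones energy (zero at r = sigma)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6 ** 2 - sr6)

def soft_lj_energy(r, sigma=3.5, eps=0.1, cap=10.0):
    """'Softened' repulsion: cap the steeply rising short-range term so that
    slight rotamer clashes are tolerated rather than dominating the score."""
    return min(lj_energy(r, sigma, eps), cap)

def weighted_energy(terms, weights):
    """Total design energy as a weighted sum of terms; in the paper the
    relative weights are optimized (via conditional random fields)
    rather than hand-set."""
    return sum(w * t for w, t in zip(weights, terms))
```

Capping (rather than rescaling) is just one simple way to soften a repulsive potential; it keeps the long-range attractive well unchanged while bounding clash penalties.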

  17. The Role of Penal Quarantine in Reducing Violent Crime

    ERIC Educational Resources Information Center

    Johnson, Perry M.

    1978-01-01

    An examination of the limits of quarantine's potential effect under actual and ideal circumstances leads to the conclusion that current proposals for increasing the use of quarantine would reduce serious violent crime by no more than 10 percent at a staggering cost for prison construction and operation. Two alternative proposals are presented.…

  18. [Penal bone of American mink (Mustela vison Brisson, 1756)].

    PubMed

    Gościcka, D; Gielecki, J

    1990-01-01

30 male minks were examined anatomically, histologically, and radiologically. The penis bone was found to be an invariably present component of the penis. Both the X-ray and histological examinations showed a connective-tissue basis for osteogenesis at the distal end of the penis bone.

  19. Using Tutors to Improve Educational Games: A Cognitive Game for Policy Argument

    ERIC Educational Resources Information Center

    Easterday, Matthew W.; Aleven, Vincent; Scheines, Richard; Carver, Sharon M.

    2017-01-01

How might we balance assistance and penalties in intelligent tutors and educational games to increase learning and interest? We created two versions of an educational game for learning policy argumentation called Policy World. The game (only) version provided minimal feedback and penalized students for errors whereas the game+tutor version…

  20. 75 FR 48276 - Defense Federal Acquisition Regulation Supplement; Management of Unpriced Change Orders

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-10

    ...D to retain the uniquely Government right to issue unilateral change orders and then penalize... Regulatory Flexibility Act, 5 U.S.C. 601, et seq., because the change is to internal Government operating... Subjects in 48 CFR Parts 215, 217, and 243 Government procurement. Ynette R. Shelkin, Editor, Defense...

  1. 20 CFR 404.1021 - Work for the District of Columbia.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Section 404.1021 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND...; or (b) You are— (1) A patient or inmate of a hospital or penal institution and your work is for that... employment. If your work is not covered under Social Security, it may be covered as Medicare qualified...

  2. From Punishment to Education: The International Debate on Juvenile Penal Reform before World War I

    ERIC Educational Resources Information Center

    Fuchs, Eckhardt

    2015-01-01

    The article addresses international efforts at child protection, emphasizing the criminal law on juveniles before 1914, and focuses on key international organizations and their various conferences and congresses. Although there was an institutional divide between welfare in general, child protection and youth crime, the organizations covered…

  3. 48 CFR 970.5244-1 - Contractor purchasing system.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... forth in paragraphs (b) through (y) of this clause. (b) Acquisition of utility services. Utility... obligees. The penal amounts shall be determined in accordance with 48 CFR 28.102-2(b). (3) For fixed-price... $100,000, the Contractor shall select two or more of the payment protections at 48 CFR 28.102-1(b...

  4. 27 CFR 26.68a - Bond account.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Section 26.68a Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE BUREAU, DEPARTMENT OF THE TREASURY LIQUORS LIQUORS AND ARTICLES FROM PUERTO RICO AND THE VIRGIN ISLANDS Taxpayment of... subpart shall keep an account of the charges against and credits to the bond if the penal sum of his bond...

  5. Beyond School-to-Prison Pipeline and toward an Educational and Penal Realism

    ERIC Educational Resources Information Center

    Fasching-Varner, Kenneth J.; Mitchell, Roland W.; Martin, Lori L.; Bennett-Haron, Karen P.

    2014-01-01

    Much scholarly attention has been paid to the school-to-prison pipeline and the sanitized discourse of "death by education," called the achievement gap. Additionally, there exists a longstanding discourse surrounding the alleged crisis of educational failure. This article offers no solutions to the crisis and suggests instead that the…

  6. 27 CFR 28.66 - Strengthening bonds.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2011-04-01 2011-04-01 false Strengthening bonds. 28.66... OF THE TREASURY LIQUORS EXPORTATION OF ALCOHOL Bonds and Consents of Surety § 28.66 Strengthening... give a strengthening bond with the same surety to attain a sufficient penal sum, or give a new bond to...

  7. 27 CFR 25.94 - Strengthening bonds.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Strengthening bonds. 25.94... OF THE TREASURY LIQUORS BEER Bonds and Consents of Surety § 25.94 Strengthening bonds. (a... strengthening bond in sufficient penal sum if the surety is the same as on the bond in effect. If the surety is...

  8. 27 CFR 25.94 - Strengthening bonds.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2011-04-01 2011-04-01 false Strengthening bonds. 25.94... OF THE TREASURY LIQUORS BEER Bonds and Consents of Surety § 25.94 Strengthening bonds. (a... strengthening bond in sufficient penal sum if the surety is the same as on the bond in effect. If the surety is...

  9. 27 CFR 28.66 - Strengthening bonds.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2013-04-01 2013-04-01 false Strengthening bonds. 28.66... OF THE TREASURY ALCOHOL EXPORTATION OF ALCOHOL Bonds and Consents of Surety § 28.66 Strengthening... give a strengthening bond with the same surety to attain a sufficient penal sum, or give a new bond to...

  10. 27 CFR 28.66 - Strengthening bonds.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Strengthening bonds. 28.66... OF THE TREASURY LIQUORS EXPORTATION OF ALCOHOL Bonds and Consents of Surety § 28.66 Strengthening... give a strengthening bond with the same surety to attain a sufficient penal sum, or give a new bond to...

  11. 27 CFR 28.66 - Strengthening bonds.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2014-04-01 2014-04-01 false Strengthening bonds. 28.66... OF THE TREASURY ALCOHOL EXPORTATION OF ALCOHOL Bonds and Consents of Surety § 28.66 Strengthening... give a strengthening bond with the same surety to attain a sufficient penal sum, or give a new bond to...

  12. 27 CFR 25.94 - Strengthening bonds.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2013-04-01 2013-04-01 false Strengthening bonds. 25.94... OF THE TREASURY ALCOHOL BEER Bonds and Consents of Surety § 25.94 Strengthening bonds. (a... strengthening bond in sufficient penal sum if the surety is the same as on the bond in effect. If the surety is...

  13. 27 CFR 28.66 - Strengthening bonds.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2012-04-01 2012-04-01 false Strengthening bonds. 28.66... OF THE TREASURY LIQUORS EXPORTATION OF ALCOHOL Bonds and Consents of Surety § 28.66 Strengthening... give a strengthening bond with the same surety to attain a sufficient penal sum, or give a new bond to...

  14. 27 CFR 25.94 - Strengthening bonds.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2014-04-01 2014-04-01 false Strengthening bonds. 25.94... OF THE TREASURY ALCOHOL BEER Bonds and Consents of Surety § 25.94 Strengthening bonds. (a... strengthening bond in sufficient penal sum if the surety is the same as on the bond in effect. If the surety is...

  15. 27 CFR 25.94 - Strengthening bonds.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2012-04-01 2012-04-01 false Strengthening bonds. 25.94... OF THE TREASURY LIQUORS BEER Bonds and Consents of Surety § 25.94 Strengthening bonds. (a... strengthening bond in sufficient penal sum if the surety is the same as on the bond in effect. If the surety is...

  16. All Net: A Meaningful Way to Look at College Prices

    ERIC Educational Resources Information Center

    Baum, Sandy

    2004-01-01

    Recent headlines about spiraling college prices combined with congressional proposals to penalize colleges and universities that increase their tuitions much faster than the rate of inflation could lead one to conclude that America faces an unprecedented crisis in college affordability. Closer examination of what students actually pay for college,…

  17. Social Networking Goes to School

    ERIC Educational Resources Information Center

    Davis, Michelle R.

    2010-01-01

    Just a few years ago, social networking meant little more to educators than the headache of determining whether to penalize students for inappropriate activities captured on Facebook or MySpace. Now, teachers and students have an array of social-networking sites and tools--from Ning to VoiceThread and Second Life--to draw on for such serious uses…

  18. Ready, Set, Algebra?

    ERIC Educational Resources Information Center

    Levy, Alissa Beth

    2012-01-01

The California Department of Education (CDE) has long asserted that success in Algebra I by Grade 8 is the goal for all California public school students. In fact, the state's accountability system penalizes schools that do not require all of their students to take the Algebra I end-of-course examination by Grade 8 (CDE, 2009). In this dissertation,…

  19. Assessing Multiple Choice Question (MCQ) Tests--A Mathematical Perspective

    ERIC Educational Resources Information Center

    Scharf, Eric M.; Baldwin, Lynne P.

    2007-01-01

    The reasoning behind popular methods for analysing the raw data generated by multiple choice question (MCQ) tests is not always appreciated, occasionally with disastrous results. This article discusses and analyses three options for processing the raw data produced by MCQ tests. The article shows that one extreme option is not to penalize a…
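One option for penalizing guessing in MCQ scoring, alluded to above, is the classic formula-scoring (correction-for-guessing) rule: each wrong answer on a k-option item costs 1/(k-1) marks, so pure random guessing has an expected score of zero. The sketch below shows this well-known rule as a minimal example; it is not claimed to be one of the three options the article analyses:

```python
def formula_score(correct, wrong, options_per_question):
    """Correction-for-guessing score: R - W/(k-1) for k-option items.
    Blank answers cost nothing; random guessing has expected score zero."""
    if options_per_question < 2:
        raise ValueError("need at least two options per question")
    return correct - wrong / (options_per_question - 1)

# A 4-option test: 10 right, 6 wrong, rest blank -> 10 - 6/3 = 8.0
print(formula_score(10, 6, 4))
```

Under this rule a student who guesses blindly on every item of a k-option test gets, on average, 1 right for every k-1 wrong, which nets out to zero.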

  20. Do Students' Religion and School Absences Moderate the Effect of Ethnic Stereotypes on School-Placement Recommendations?

    ERIC Educational Resources Information Center

    Klapproth, Florian; Kärchner, Henrike; Glock, Sabine

    2018-01-01

    The results of two experiments demonstrate that preservice teachers made biased school-placement recommendations depending on student's ethnicity, which on average penalized students from an ethnic minority. Moreover, additional information that was supposed to disconfirm ethnic stereotypes (religious affiliation in Experiment 1, number of missed…
